Off-topic but I wish 16:10 hadn't died with the move to 4k. I have an old 1200p dell ultrasharp in 16:10 and it's sublime compared to my other 16:9. You'd think it doesn't make much of a difference, but it really does. Especially when you have it vertically oriented, but otherwise too.
It's special. Just because people overload it with spurious claims doesn't alter its mathematical elegance. And seeing as how it falls out naturally from geometric figures like pentagons and pentagrams, people are likely to stay interested in it for as long as humans have 5 fingers.
Nor does it make sense for a monitor. I can't remember ever wanting to arrange a bunch of windows as a square, a smaller square, a smaller square, etc.
Not quite. ISO 216 paper sizes are designed so that if you split them in half you get the same ratio (1:sqrt(2), or 1:1.414). The golden ratio is defined so that if you remove a square from one side, the remainder has the same ratio (1:1.618).
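Both self-similarity properties are easy to verify numerically; a small sketch:

```python
import math

# ISO 216: halving a 1 x sqrt(2) sheet yields a (sqrt(2)/2) x 1 sheet
# with the same long-to-short ratio.
r = math.sqrt(2)
assert abs(1 / (r / 2) - r) < 1e-12

# Golden ratio: removing a 1 x 1 square from a 1 x phi rectangle leaves
# a (phi - 1) x 1 rectangle with the same ratio, since phi - 1 = 1/phi.
phi = (1 + math.sqrt(5)) / 2
assert abs(1 / (phi - 1) - phi) < 1e-12
```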
Hopefully historians in the future will be sufficiently advanced that, when they look back on us, 16:10 ratio displays won't rank among the top 500 things they are horrified by.
It's less of a problem with 2560xN than 1920xN, in my view. I also prefer 16:10 to 16:9, but 2560x1440 has enough pixels vertically not to be disagreeable, and 1440x2560 is wide enough not to be cramped. I'd never buy a 1080p monitor, but I've found 1440p a most worthwhile upgrade from 1200p.
I can't stand 2560x1440 -- I swear by my 30" 2560x1600 (16:10) displays at home and at the office. The 16:9 ratio is just too skinny for anything but movies.
Yes, I've found 30" 2560x1600 (16:10) displays, specifically Dell U3014 are the current ideal. I expect in a few years I will cave in and purchase a pair of Eizo FlexScan EV3237-BK 32" 4K screens, but for now 30" 2560x1600 is a good compromise between cost and practicality.
Well, it definitely takes all sorts, because I use 2 side by side in portrait mode and I'm quite happy with it ;)
You're right about the length of the longer axis, though... I've rarely found it all that useful to have one view fill the entire screen. So I generally either have two 1440x1280 windows/panels/etc. on a screen, or one 1440x~1700 and one 1440x~860. That works pretty well. Many window layout tools will let you quickly set up this sort of arrangement.
I can relate. Open up a page like this [0] in a maximized window on a very wide screen and try to read. I ended up writing a Chrome Extension [1] to narrow/center such pages.
I don't like to see the noisy background. I also find it fiddly to do it manually. After I click unmaximize I will probably have to adjust each side of the window, which takes much longer and isn't really convenient (IMO).
I did exactly this, upgraded from 24-inch 1920x1200 to 27-inch 2560x1440. The aspect ratio change hasn't bothered me, precisely because of the increase in space. Much easier to have two files side-by-side now, for example.
So the thing that a lot of people miss is that unless you basically 4x the pixel density and use anti-aliasing, increasing the density doesn't actually help with reading.
The thing I've learned is that what I really want is a specific real-world pixels-per-inch, or a specific real-world font size in inches. I sit comfortably back, and the Dell 30 lets me easily set my text at about 1/8" or 1/4" tall, which is super easy to read relaxed in my chair. If I went for an increased-resolution display, the font would just get smaller until I got to 4x the density -- and I'm not really gaining anything until then, when AA smoothing becomes a win.
But also, since I jack my font up even more than usual for my distance, I'm already at about 4x AA, so a 4x resolution bump would put me at 16x AA, which I don't think is needed. I'd rather blanket my field of view with lots of large text than have one very nice high-density display in the center that makes me lean forward and strain my eyes.
Comparing with a super-fancy Apple display doesn't seem that fair. Even today, 1080p panels are not uncommon -- for example, the popular Dell XPS 13 defaults to a 1080p screen. Also worth noting that 2560x1600 (the resolution of that monitor) is identical to the original Retina display.
If people really care about 16:10, it's available with reasonably good specs and they can buy it. If they don't, it will probably die permanently, but we can't exactly blame the market.
I think a "modern" 16:10 should be 3840x2400 (like the IBM T220, released in 2001) or 5120x3200. Neither of these exists in any display currently made.
This. Right now I'm sitting in front of 13,312,000 non-UHD/4k pixels, all in beautiful 16:10 aspect ratio. I expect that I will keep this setup until I die.
4k got me to stop complaining about the loss of 16:10. A 4k screen at 16:9 is still not as wide as even two 4:3 monitors side by side -- and you still get the screen real estate of four HD screens in one. Is the aspect ratio my favorite? No, but with this resolution my attention is almost always limited to a subset of what's on screen. The aspect ratio of the whole thing becomes increasingly irrelevant.
I apologize if this is an incredibly stupid question, but why do these monitors cost so much? As I understand it, you can buy a good 4k TV for considerably less, so what features of this monitor make it a better deal?
I've been using a Samsung 40" 4k TV as my main monitor for development work for about 8 months and it's been great for me.
A couple of caveats: because I don't game or do graphics work, I don't know or care about color reproduction. I made sure to get a TV, graphics card, and cable that supported 4k@60hz with 4:4:4 chroma, so the text is perfectly sharp and the refresh is not obtrusive. It's not hard to find a Samsung TV around $350 that supports that, but as far as I could tell, I had to go up to a GTX 1050 graphics card to get it.

I sit between 24 and 28" away from the monitor, so the text is readable to me at full resolution. At that distance, I do have to turn my head to comfortably read text in the corners of the TV, especially the upper corners. In practice, that means I keep monitoring-type applications such as CPU charts, Datadog graphs, etc., in one of those corners for quick reference.

While I still have two 1920x1080 monitors on either side of the TV, it's quite nice to be able to open a huge IDE window on my main monitor and, when necessary, put 3-4 normal-sized windows next to each other for comparison work.
Same here. I bought a Philips BDM4065UC monitor in 2015 (40 inch, 3840x2160 resolution) for 740 € (in Germany) and I'm loving it. The only caveat with that model is that the response time is a bit slow (I would guess 5 ms between full black and full white).
When I replace it, the replacement will be the same size and resolution, but I'll probably go for OLED and HDR-10 or better.
I happened to buy the same, and import it from Germany to the UK.
I was very excited for it, as it featured a DPI very similar to a 27" 1440p Korean import. Coming from 3+ 1080p monitors originally, the lack of bezels was incredible.
It's not built for gaming at all, though; extreme screen tearing and poor response times have me now looking to replace the 27" 1440p Korean import with something >144Hz with G-Sync, before replacing the Philips BDM40.
Yeah, I got that Philips too. For games, 40 inches at 4K resolution is really a lot more practical than 38 inches with an odd resolution, and it's cheaper too.
I think the fact that the 27" monitors next to it are closer to the camera in that photo makes the TV look artificially smaller. It absolutely dwarfs the 27" monitors. If you're close to any store that carries 40" 4k TVs, though, just go there and stand 28" away from one. I think you'll see what I'm saying about having to turn your head to comfortably read text in the corners, something that has never been the case for me with smaller monitors.
I just bought (today) a 32" 1080p monitor and I'm lovin' it.
I think it's more productive to use a single big screen than to set up two monitors.
My old setup consisted of two monitors, one 23" and one 21". One day I realized that my 23" monitor could not reproduce colors accurately. I needed to Photoshop a picture; I sent the retouched version to my phone, but because I couldn't see the full color depth on the 23", the image looked ugly on the better screen. That's when I learned not to buy the cheapest monitor based only on its size and resolution. So I sold that monitor and bought a 32" 1080p monitor at a bargain price.
Of course it would be better if I could buy a 1440p monitor at this size, (or even 4K OMG :)). But considering my budget this was all I could purchase.
I had serious doubts about the pixel density, but there was nothing to be afraid of. In usual tasks (e.g. browsing, writing code in Emacs, the terminal) nothing disturbs my eye; it's beautiful. However, if I open a 1080p YouTube video fullscreen, it looks as if it were a high-quality SD video, because of the pixel density. I sit close to the monitor, and although I cannot see individual pixels, I can notice the difference.
I bought my monitor for 900 Turkish lira, equivalent to 254 dollars, and I think it was a great investment.
I detect no lag when moving windows around. It might not (or might -- I simply don't know) be good enough for gamers, but it's definitely smooth enough for development, document, or web browsing (aka productivity) tasks.
144Hz monitors do produce a very noticeable difference for gaming if your in-game FPS can match the refresh rate; outside of gaming, I've only found it noticeable when moving windows about.
For productivity I'd go for the 40" 2160p over a lower resolution higher refresh panel.
Mine is a 40" Samsung UN40KU6300 though Samsung appears to rev its model lineup frequently so they may have already released a successor to this model. It was a little difficult to confirm that it supported 4k@60hz with 4:4:4 chroma, but I relied on the experiences of some folks on an audio/video forum who confirmed that they had been able to run with those settings. Unfortunately, I don't remember the name of the forum or have it in my browsing history. I do wish manufacturers would make that information easier to come by though it's certainly a tiny minority of consumers that know or care about those kinds of specs.
rtings.com is a TV review site that is superb for finding such details; they test for 4:4:4 chroma specifically, along with input lag in each mode, and one of their sub-scores is "use as a PC monitor". A valuable resource.
It's designed for graphic design and other display-critical tasks, so it is calibrated to 99% sRGB color space. What it looks like on this monitor will be what it should look like in print and on the best of every other display. Plus, it's the UltraSharp top-end model, so all the mechanical construction will be top notch.
TVs, on the other hand, are designed to show the most oversaturated, "vibrant" colors on the demo loop on the show-room wall. And mechanically, they're designed to hang on the wall and never be touched.
> so it is calibrated to 99% sRGB color space. What it looks like on this monitor will be what it should look like in print and on the best of every other display.
That’s completely useless for graphic design. sRGB is defined as the lowest common denominator of CRTs on the market in the early 90s.
Actual monitors for graphic design purposes, every TV released in 2017, and most new gaming monitors instead use the Adobe RGB or DCI-P3 color space, or even Rec. 2020, which is also what is used in cinemas.
I have an UltraSharp and it's fucking rubbish. Half of the screen has a yellow tint, and it has dark corners. Obviously support says it's "within spec".
It's not odd. The size is what makes this expensive. It's 10 inches more screen, and curved to boot (which you may not care about, but it does affect cost).
This screen is much bigger (37.5" vs 27"), has a better refresh rate (75Hz vs 60Hz), and is curved. All of those things correspond with higher price for both the panel itself and the resulting product that contains it.
* removed mention of FreeSync as this display lacks it
I have a 5k iMac myself, but for my work, which is Linux based, I would rather have that wide-screen. The iMac for sure has the way better font rendering, but the real estate is that of a 27" screen. With Linux, HiDpi support isn't quite there yet - so for now the 38" screen gives you the larger desktop.
This is a good display, and price has gone down.
But is there any way to connect it to a Windows 10 / Nvidia PC and get the full 5K resolution? No GPU that I’m familiar with has support for USB-C.
I cannot speak for 4K TVs, but as I was doing research for my own purchase I found the following differences between monitors and TVs.
TVs are said to be configured to display the best-looking colors, while monitors try to stay true to the real color.
TVs have an immensely higher refresh rate; 50Hz/60Hz usually suffices for a TV. But on monitors we speak of milliseconds. The refresh rate makes a lot of difference if you intend to game on your computer/monitor. An e-sports professional Redditor claims he improved his playing performance multiple-fold after switching to a monitor instead of a TV. (https://www.reddit.com/r/FIFA/comments/5whb9v/did_the_change...)
Other than these differences and the occasional overscan/underscan problems on older televisions, I personally see no reason to prefer a monitor over a TV if there is a huge price difference. If the price difference is minor, I'd opt for a monitor.
Refresh rate and response time are different. Refresh rate is how often a new frame is displayed, i.e. 50/60/144Hz. Response time is how long a pixel takes to change from one color to another, i.e. 1/2/4/16ms.
>>Over a decade has passed since the LCD monitor unceremoniously ousted the boxy CRT monitor into obsolescence, but with that ousting came a small problem: CRT monitors redrew every frame from scratch, and this was baked into the fundamentals of how PCs sent information to the screen. Monitors redrew the screen with a refresh rate of 100 Hz, (100 times a second), and they were silky smooth.
>>LCD monitors don’t have this problem, because the pixels don’t need to be refreshed.
Instead, they update individual pixels with new colors, and each of those updates takes a certain amount of time depending on what the change is.
The response time refers to how long this change takes. LCDs started with a 60-Hz refresh rate but a response time of about 15 milliseconds (ms).
There are also factors like input lag, separate from display refresh, that can be a problem for people playing action games. Even casual players can notice high input delay from time to time (we're in the 100ms+ range), so it's a good idea to check the display's input lag if games are anywhere near a possibility. While I like my 34UM95P for professional purposes, it's really not that great for games that are much more than interactive movies. In contrast, nothing stops me from using an Acer X34 monitor for coding 95% of the time.
And it double-sucks, because here in Italy we have to pay "TV tax", even if you don't have the TV service hooked up. Modern LCD/OLED TVs are perfectly fine for most tasks, except maybe high-end FPS gaming. The colors are good, the refresh rates are decent.
That's an ownership tax on TVs, just as there's an ownership tax on cars. The fact that the proceeds of the TV ownership tax pay for the public (national) TV service is just a side factor. As very few people were actually paying it, the TV ownership tax has been embedded in the electricity bill. _Italians, good people_
I was just saying that above: it's an ownership tax. A lot more people are paying it now that it's embedded in the electricity bill, so it's pointless to recommend paying it, as it's now basically enforced.
I believe you should comment in English if you wish to be understood.
It's like how megapixels are a bad measure of camera quality. This is a pro monitor with accurate color reproduction, low fading on the edges, minimal dead pixels, etc.
I know someone who uses a 1080p TV as a monitor for his gaming PC. The picture quality is poor; it's too blurry to be tolerable for work. That's okay in this particular case, since he uses a different PC for work, but that one needs a proper monitor.
Granted that's 1080p, but it wouldn't surprise me if the same thing is true of 4k.
Related musing: why is it seemingly easier to make very high-resolution small screens (phones) than large screens of the same resolution? Instinctively I would think smaller LEDs are harder to make, but it doesn't seem to be the case.
There is a greater failure rate the larger you make a panel; a great many are discarded from production lines. A smaller screen is less likely to contain a defect.
Also, that cost is hidden in a phone; I'm not sure what the screen components alone would be worth.
A large defective panel can be cut down into smaller panels most of which will have no defects. You can sell most of your faulty large panels as small panels, but there is no extra source of large panels.
Right, and a phone could spend $50 to build that screen on a $700 device (not that they do). Just a raw 75 × $50 is near $4,000, which would explain a high price.
That is if it is the pixels per unit that lead to cost, vs the size of the display itself.
Obviously it is harder to make a 1 inch x 1 inch display with 1 million pixels vs 1000. But maybe the difference from 600ppi to 200ppi is not enough to matter...
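The production-yield argument upthread can be made concrete with a toy Poisson defect model. The defect density below is made up purely for illustration; real figures are proprietary:

```python
import math

def panel_yield(area_sq_in, defects_per_sq_in):
    """Poisson model: probability a panel has zero defects, exp(-d * A)."""
    return math.exp(-defects_per_sq_in * area_sq_in)

d = 0.001  # hypothetical defects per square inch
phone = panel_yield(2.2 * 4.8, d)    # roughly a 5.3" phone panel
tv = panel_yield(34.9 * 19.6, d)     # roughly a 40" 16:9 TV panel
print(f"phone panel yield: {phone:.1%}, 40-inch panel yield: {tv:.1%}")
```

Under the same defect density, nearly every small panel survives while a large fraction of the big ones don't, which is why cutting a flawed large panel into smaller good ones (as mentioned above) recovers so much value.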
Also, incidentally, I'd happily pay a bit of premium for a TV / Monitor that has good image quality but no other features. Perhaps not 2x as much, but 20% might be doable.
High-end monitors are niche products. 40" TVs aren't.
HDMI 2.0 is a pain compared to DP, other than that, there's almost no reason not to buy a TV if you just care about office use. I've been using a Samsung UHD TV for almost two years.
Monitors are designed to be viewed at close range, from multiple near angles. TVs are designed to be viewed several meters away, at roughly the same angle.
TVs play content at a fixed frame rate, so there's no reason for them to be any more precise. Monitors can run at very high frame rates, as the content can be very fast-paced, especially on gaming monitors.
FreeSync/G-Sync also integrate with the graphics card to reduce lag. This is the most expensive part of these monitors.
- TV's are generally low framerate, as much as they'd like to claim 240FPS, it's mostly all 30FPS, with software interpolation to increase the frames.
- Bulk. TVs are sold in higher numbers, justifying the lower price.
- Distance from face. Your 60" TV can be two smaller panels "glued" together. That's not noticeable from watching distance, but with a monitor so close to your face, you're more likely to notice the millisecond tearing.
"- TV's are generally low framerate, as much as they'd like to claim 240FPS, it's mostly all 30FPS, with software interpolation to increase the frames."
That makes absolutely no sense. If the panel is incapable of 240 refreshes per second, how does software interpolation "increase the frames"? You are confusing content and panel.
Mainly it's that for more than 60fps you need dual-link DVI, DisplayPort, or HDMI 2.0, and you won't find many TVs with any of those. So even though the panel can do more than 60fps, the input and processor can't take it.
That’s not what was said in the comment. Obviously, the point of interpolation is to take a signal with few samples and increase them. What the comment said was that the hardware (panel) was not capable, but the software somehow was able to do it. That makes no sense.
More like: the panel supports 240Hz, but you can't get 240Hz content into the TV (such sources don't exist and would require high-end interfaces), so you interpolate the 30Hz you do have up to 240Hz. Why go through all this trouble for fake 240Hz? Probably because it sounds good as marketing.
"Dell has managed to increase maximum brightness of its U3818DW to 350 nits (from 300 nits on competing monitors)"
Bah, why does everyone seem to love blinding themselves? Am I the only one who doesn't like a bright screen? (Ergonomically, we're supposed to have the monitor's backlight match the brightness of a white sheet of paper in the same lighting conditions.)
I imagine it's easier for you to dim the brightness to your preference than for others to somehow tweak every last bit of brightness out of their graphics card settings after the monitor's settings tap out.
Use case: I'm in front of a window; if the sun's out and I've got the curtains open, I might as well just turn the screen off.
You would think monitor manufacturers would make the brightness control go all the way down to barely visible, but they don't. I have two monitors, an Asus and a Dell, both set to what they consider brightness level 0, and neither is quite as dim as I'd like.
I work in a pretty bright office -- usually I have the windows open and the overhead light on. My monitors (a Planar and a Philips) are both plenty bright enough for this use case, but neither of them will get as dim as I'd prefer once it's dark out and I have only the overhead light. If I would prefer to work by lamplight only (as I sometimes do), the contrast between the super-bright-on-the-dimmest-setting monitors and all the peripheral space around them makes my eyes hurt.
And, here's the thing: I like brightness, usually. As @ianai mentions, what's needed is a larger range, not the same range but brighter.
I have used f.lux and redshift in the past. They can help reduce the eyestrain a little, and I guess they have a psychological effect around sleep, but they don't actually dim the light.
Once upon a time it was normal to adjust the location of the desk to the person working there. For a left-hander, the table was placed toward the left wall with light coming from a window on the right; for a right-hander, to the right of the window. The typewriter was placed on a small table in front of the window.

When the computer showed up, it was first placed in the corner toward the window, or in front of the window, but the latter location proved hard on the eyes, so we stopped doing that.

What has changed now that we've abandoned this practice? I thought we would have the office layout perfected by now?
Not being part of a hive mind, I unfortunately did not have this wisdom built in, and apparently neither did the people who designed my house, with the window of my home office facing the sun.
I could rotate the layout 90 degrees and only have sun glare half of the day, but it's just easier to either close the curtain a bit or crank the brightness.
My problem with the brightness is that it pulls up the minimum brightness available. Most of my devices are too bright for me at their minimum settings at night time.
Can't you always lower the brightness in software, at the OS/driver level? Unless you're saying that this compresses the dynamic range of brightnesses too much, to the point where you can't distinguish shades that you need to be able to for your work.
I'd rather stick with 2x 25" Dell UltraSharp U2515H 2560x1440 monitors[0].
2 of those are nearly 3x cheaper than the 1x 38" curved display and for tasks that require a lot of horizontal space (video editing, etc.) 5120 is WAY wider than 3840. Also a curved display is pretty questionable for professional image editing (it distorts pixels).
I run the 25" here for development and video editing. Probably the best development environment upgrade I've made in 5 years.
If anyone wants to read a deep dive on how to pick a solid monitor for development, I put together a detailed blog post[1] that covers why I picked that 25" Dell specifically.
> Also a curved display is pretty questionable for professional image editing (it distorts pixels).
Isn't the point of a curved display that, at the designed viewing distance, each point (at the same vertical position, at least) is equidistant from the viewer, eliminating the distortion that occurs when viewing a large flat screen?
It would seem to me that for professional image editing where you often need to zoom in to a higher magnification than an image would usually be viewed at by the target audience, a curved display would be very useful.
You probably also want to verify work on a display and conditions (viewing distance, etc.) that match the target environment.
I'd have to try this myself but I use straight guidelines all the time and I can't imagine having them look skewed if I turned my head from the precise targeted distance.
>> 2 of those are nearly 3x cheaper than the 1x 38" curved display
For about the same price as the 38", you can get a pair of 27" Dell P2715Q 4K monitors, which will give you more than double the resolution of that 38" monitor.
I have a pair of P2715Q's mounted side by side in a "v" shape (not as good as curved, but does the trick) and it works fairly well. Of course, having no bezels in the middle where the monitors touch would be nice. The main drawback of the P2715Q's is that they don't have HDMI 2.0 inputs, which means your system needs two DP outs that can support 4k/60.
To me, it's the ability to put my current "focus window" exactly in the center that makes the curved screens so attractive, with the 32-38" size range always feeling just about right to me. My neck starts aching if I have two screens, which makes me end up using one as the focus screen and the other as support. Three 27"s in a row seems like slightly too much of a "screen wall" to me, but some like that...
I guess it really depends on your workflow. My own preference is to use my left monitor for source code and shell windows and my right monitor for browser testing. That does mean I tend to spend a significant amount of time looking to my left.
I have two of those on arms on my desk. Another great advantage of multiple monitors, I can just switch between devices (Thinkpad, MBP and my gaming rig).
The only thing I hate about the U2515H: it could be more robust. The plastic shell is not so great.
And here I am with a Vizio 40" 4k[1] that I got for less than $500. It's not perfect, and the refresh rate is only 60Hz for 4k (which is the same as this Dell offering), but it's served me well for the last six months at a fraction of the cost and with more pixels.
I've done the TV as a monitor thing, and the value is good, but the picture will be nowhere near as nice as a real monitor. Color reproduction is usually really inaccurate and even when you turn all the sharpening and post-processing off that they will let you, you still have an inferior picture.
That said, for gaming they are a solid choice and $500 is obviously a whole lot cheaper.
I would not recommend a TV for gaming unless one happens to find latency benchmarks for the display. TVs tend to add enormous amounts of delay in their buffering and pre-processing layers -- latencies that really don't matter if you're displaying a movie.
rtings.com has latency benchmarks for most that I saw, including separate ratings per display mode (1080p, 4k, 1080p game mode, etc). It was invaluable when I was researching what to buy.
I found the site I linked, rtings.com, to be invaluable in getting the specifics of television lines and their capabilities and qualities. Different TVs will have varying levels of picture quality, and making sure you turn off (and can turn off) various settings that help a TV experience but hurt a computer monitor experience (motion blur, etc.) is important. The one I chose had fairly low display lag at 4k and supported a "game mode" (which many do). It takes a bit of research to figure out what you should be looking for, but there are some good resources you can find by googling (articles, subreddits) that cover the most important attributes in general, and you have to supplement that with your own specific needs.
I don't have one (a coworker does), but for coding it shouldn't be too bad. Browsing usually means more scrolling, so you will actually get tearing, but if you don't care, it should be fine.
Your other reply talked entirely about games so I responded to your actual question.
I wasn't talking about games at all. I was talking about input lag, which for TVs as displays manifests in some negative effects, most notably mouse lag.
Many TVs have a mode to reduce this lag and turn off other processing effects that are undesirable when playing games, and that tends to map well to the settings that make for a good computer display experience.
Agreed. I bought a Samsung (UN40JU6700) 4k60p curved monitor for ~$800 and it works reasonably well. There are some quirks using a TV as a monitor and I was a little nervous about the screen quality, but it's worked well so far. Also, it's a TV.
edit: I especially like that the DPI is the same as a 27" 2560x1440 monitor.
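That DPI match is easy to check with a little Pythagoras; a quick sketch:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'40" 4K TV:      {ppi(3840, 2160, 40):.1f} PPI')
print(f'27" 2560x1440:  {ppi(2560, 1440, 27):.1f} PPI')
```

Both land near 110 PPI, so UI elements render at essentially the same physical size on either screen.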
Have you been extremely annoyed by software that detects a 4k resolution and scales everything to HUGE? I had trouble with this when I got my 40"/4k, and in the end did unspeakable things to the registry to force Windows to always scale 1:1.
No, actually. I don't think I've experienced that at all. The biggest problem I have is that it's actually a docking station setup, and when I pull the laptop out, I have to scale Chrome's zoom back down from 130% and change PuTTY's font. I may have googled and turned that off in the beginning, but I definitely didn't need a registry hack, at least on Windows 10.
In my experience (routinely using both a 4K iMac and high resolution dell xps with win 10) the mac is still better at doing the "right thing" on its own, resolution-correction wise.
Only 60 Hz. Sigh -- I hope the mainstream moves to 100+Hz monitors soon. I love the fluidity of my far-too-expensive 'gaming' 144Hz IPS monitor even though I don't play that much. Just the fact that web pages and mouse pointers don't lag is a noticeable improvement.
I'd rather we invest in PPI than refresh rate. Not that the two are mutually exclusive, but looking at my use cases, most of the time the screen is static, so a higher refresh rate would be wasted, whereas higher-PPI screens improve readability and legibility (when they work on Windows).
4k / 28" is 157 PPI, which only becomes indistinguishable at around 2 feet of viewing distance on average. Better eyesight pushes that distance back further.
8k at 28" is "retina" at just under a foot.
So depending on how far you sit from your screen PPI can matter more or less to you, but when we finally get 8k panels we should also finally be getting people in retina range most of the time.
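The arithmetic above can be sketched quickly, using the common one-arcminute visual acuity rule of thumb for "retina" (an approximation, not a hard limit of human vision or any vendor's official definition):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def retina_distance_in(ppi_value, arcmin=1.0):
    """Distance at which one pixel subtends `arcmin` minutes of arc --
    beyond this, individual pixels blur together for typical vision."""
    theta = math.radians(arcmin / 60)   # one arcminute in radians
    return (1 / ppi_value) / theta      # pixel pitch / angle ~= distance

for name, w, h in [("4K", 3840, 2160), ("8K", 7680, 4320)]:
    p = ppi(w, h, 28)
    print(f'{name} at 28": {p:.0f} PPI, "retina" beyond ~{retina_distance_in(p):.0f} in')
```

This lands at roughly 22 inches for 4K and 11 inches for 8K at 28", matching the "around 2 feet" and "just under a foot" figures above.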
Not for a 27” panel and above. 5120x2880/3200 for a 27” is good, but for larger panels (29”, 30”, 32”), an even higher logical resolution is needed.
That’s for 2x scaling, of course, but I don’t think fractional scaling creates a good end result, unless everything is vector, which, decidedly, very little is.
This isn't 4K or even 4x HD. There's not enough vertical resolution. Consumer 4K is 2160 vertical pixels. This is only 1600. If you write software for a living, vertical space is precious!
No, it's not. Because a 4K "retina" (aka HiDPI) display is effectively only 1080p. 1080p on anything larger than a 24" screen is a non-starter for me (thanks but no thanks to huge on-screen icons). I'd ideally like a 27-30" 8K "retina" monitor (4K usable) with DCI-P3 color gamut. For refresh rate I'm OK with 60Hz, but wouldn't object to higher, obviously.
I agree. 60Hz is fine, but having tried 100+Hz CRT monitors back in 199X, I am still waiting for them to become mainstream. Perhaps Apple will take the iPad's "ProMotion" 120Hz to the mainstream crowd in a not-too-expensive display package.
Yeah, 60Hz looks like a slideshow after using 144Hz for a while.
It does get rapidly better after 60, though. I think I stop being able to tell a difference at around 80. So there is some hope that we can have 4k@better-than-60 in the near future.
For text, I am fine with 5fps as long as it's not flickering like a CRT did. My Kindle, I think, effectively does 0fps when not changing pages, and it is 100x better than a monitor. I find the 300fps push a bit silly for computers meant for text (which is my use case; I'm not claiming it is the ONLY use case).
I bought the U3415W when it first came out. At first it was for games (coming from 3x U2412Ms), but I quickly realised how incredibly good the 21:9 3440x1440 resolution is for programming. No DPI scaling needs to be involved, so I get a Visual Studio experience where, even with the NCrunch unit test runner and the Solution overview open, I have plenty of room for two main code editing windows. Brilliant.
I ended up buying an extra Acer X34 for home (surrounded by 2x U2412Ms on an Ergotech stand) and brought the U3415W to work as a personal device.
The 38" could potentially be even better, however I'm rather happy with the 34" as is. It's a bit of a shame they didn't add Freesync to it.
I've had the Dell U3415W (3440x1440) since it was released two years ago, and it is the best monitor I've ever used. The resolution is great, and avoiding the DPI scaling needed for practical use of most 4K screens is key. As a bonus, it is absolutely perfect for movies in 21:9.
If you haven't seen one, a more familiar comparison might be that the U3415W is essentially the same as a 27" IPS panel, only wider. It has 1440 pixels of height at roughly the same physical height as the 27" IPS panels.
Many people argue that curved screens are unnecessary or a gimmick, which I would agree is true for TV screens with multiple off-center viewers. From my own comparison of several 32"+ monitors, I do think the curve is very useful in reducing neck/eye fatigue.
The new 38" does sound appealing, but at 2-3x the cost of the U3415W on sale I would probably still recommend the 34" to most people.
>> The resolution is great, and avoiding the DPI scaling needed for practical use of most 4K screens is key.
While ymmv, I have been using a pair of 27" 4K Dell P2715Q's as my working monitors running with no scaling (1:1). I have a standing setup (although lately I've been using a drafting stool because of plantar fasciitis, but that's another story) so I do have the monitors closer to my face than most people, but I find it's more than tolerable.
Out of curiosity what font sizes do you use for your editor? I have a 27" 4K monitor around 20" from my face and find it too small without scaling, 1.5-1.75x is around perfect for me (although support for that under Windows and Linux is hit or miss).
In Sublime for Windows, I am using the default font with a size of 10.5. I have always tended to use higher dpi screens on laptops (i.e. 1680 and 1080 @ 15" running at 1:1) so I have years of conditioning to small text on screens.
An added piece of information - I got presbyopia (in addition to my existing myopia and astigmatism) about 5 or so years ago, so I now use reading glasses, which also helps a little.
I had a special pair of single vision glasses prescribed to me from my optometrist where its optimal range of focus is between 21" and 27". I got those distances from measuring the closest and furthest points between my eyes and monitors. I see everything fine. Before I got the single vision glasses, I was using a flip up reading glass attachment that I wore over my normal progressives to see the screen, which a - looked goofy, b - was unwieldy, and c - not as good as the single vision glasses.
I had a colleague who did the same. Size 10 on a 27" 1440p monitor. Whenever I had to pair with him I always had to ask him to increase the font size.
I also have myopia and astigmatism, but if I use a smaller font eventually my eyes get tired and I get headaches. My optician suggested I wear slightly weaker than my prescription glasses for computer work which helped a bit, but still I find it more comfortable to stick to my size 14 at 1.75x scaling :-)
I ended up getting 2 Dell 43" monitors and am very happy with them for coding. I actually have 3, but the 3rd is too much (I never thought I'd see the day).
They aren't perfect, and the "old" version is buggy, so if you want these, get them straight from Dell. Resellers still have the old version that drops connection.
Also, the HDMI is only 30Hz while the DP and mDP are both 60.
Finally, they stretch a few pixels past the border, so you will need to shrink the picture slightly with the video card driver.
If you can deal with all that, they are incredible. Dell has a window manager that allows you to assign a grid and windows will snap to their respective cells once dropped into it.
Thank you for posting this. I've recently been considering buying one (Dell 43 Ultra HD 4k Multi Client Monitor P4317Q, right?) However, this negative review is holding me back. Is there any merit to its claim?
> This monitor has glorious resolution made totally useless because the backlight uses PWM (pulse-width modulation) to adjust the backlight brightness. This leads to eye fatigue and, in some cases, severe headaches. One would be hard pressed to find other monitors in Dell's lineup with this outdated technology, since it has almost entirely been removed from all modern monitors. As a business monitor, it is expected to have "Comfort View" aka "low blue light" mode to reduce eye stress during a long day of use. Fix these problems and you will sell more of these than you could make.
Yes, that is the model. I'm sure different people have different sensitivities to different things. I work in front of them all day. The only complaint I have is that they are so tall (I'm using arms) that I tend to look up more than usual. Also, you can't daisy-chain them. Each one needs a dedicated port on your card (I'm running a single GeForce 1080).
The real estate is amazing. I have my main one set up with VS taking up about 2/3rds of the screen, then two large windows stacked next to it. The secondary monitor is divided into 6 portions, so I have Pandora, Email, Browser, Postman, Excel (for billing), Windows Exploder, SSMS, whatever. I'm typing this in one of the 6 portions with VS open on the main monitor. I rarely fill everything up. Also, I can make VS full screen and split windows within it to compare code. I think the Dell Display Manager only works on Windows, though.
Having said that, it's 4K, but it's so large, you don't get a retina PPI. You get a regular PPI but many more pixels, which, as a coder, is much more useful to me.
Like I said, I'm very happy with them. I've finally realized maximum useful resolution, and I work on them all day and many nights.
I got my P4317Q in Aug 2016, it was that early faulty A00 revision with PWM (flicker) issue. I took a flickering video on the phone camera with lowered exposure and filed a warranty ticket. Local (EU) Dell service center picked up the display, confirmed an issue and one week later I got a new screen of A01 rev. from Germany with no issues. Superb monitor, just make sure to get A01 rev.
A happy recent HP Z34c convert here. I realized it had roughly the same number of pixels as my previous two monitor setup (older HP LP series in 24" and 22" portrait mode) and made the upgrade. This screen looks better, runs cooler, has a cleaner physical footprint (except for the useless speakers on the side bezel--I wish I could cut them off), and requires half the cables. From HP's refurbished outlet at half price with PC purchase the price is tough to beat.
I did have to scale this one up by a few percent to save my eyes, but overall I found as you did that the orientation works really well for code window plus emacs/shell or browser. It's pretty impressive to flip an HD video to full screen once in a while.
Another U3415W owner here; it's the best monitor purchase I've made in years. I previously had a 27" and two 24"s and replaced the 24s with the U3415W. I do programming and gaming on it and it's excellent for both.
But isn't this a software, not a panel resolution problem? Fonts and icons could be scaled down in software just fine and could keep excellent readability.
I suspect that in a few years a hololens / google glass style device will become cheaper and work as well as a big bank of monitors, and at that point it is going to rapidly replace physical monitors - and then economy of scale and iteration of the product will do the usual to the price/performance ratio of head-mounted devices, then screens go the way of CRT monitors when flat screens came along.
I'm not saying that a head-mounted device and virtual screens will necessarily be better than a bank of monitors - in fact it's time to assess the drawbacks - but once it's cheaper and seems "just as good", businesses will want to switch over, for better or worse.
I wonder if an AR device that only renders stationary floating text could be produced with much lower GPU (and thus power) requirements than the current crop of headsets.
What I want is a computer + headset that can be fully powered by moderate exercise. From what I've read, manual labor averages about 75 watts of mechanical power throughout the work day, so with conversion losses, maybe 20 watts produced by pedaling or rowing action. Then I could "sit at the desk" all day without wasting away, and do it from the middle of a forest or the top of a mountain if I wanted to.
"Need" is a strong word, but you'll probably want to if you're trying to be productive. VR/AR won't replace the tactile feedback of a good keyboard, etc.
Then again, as someone who can't even use current VR/AR offerings (high myopia, don't use contact lenses) and is only following all of this as a fairly disinterested spectator, it looks to me like the initial craze is already blowing over and people are taking a better look at the disadvantages and limitations of the current products. I don't see VR/AR taking over monitors or much of anything else, though I'm sure AR will find productivity applications in specific fields soon enough.
There isn't a voice input in existence that I'd be interested in using while writing code or doing photo/video editing.
Even dictating code to another human who is as fluent as you are in the language of choice is a non-starter, so I don't see any machine based way of doing it catching up soon.
The only solutions I've seen that even "come close" in the speed / accuracy comparison are unfair comparisons.
An example was posted by someone in another comment of a guy doing Python coding with Emacs using Dragon Dictation. In that example, the guy used over 2,000 custom "macros" to get to something close to his pre-RSI speeds. This sounds fantastic, but the speeds of a typer who has learned to leverage 2,000 custom macros would be an order of magnitude faster (imho) than someone dictating using similar macros.
Neither gestures nor voice input will ever eliminate keyboards for the simple reason that keyboards are vastly more efficient. No technological innovation can ever change that because the limitation is on the human side of things.
Futuristic solution to keyboards? I've tried lots of keyboards. Probably going to try lots more; there are frequent times when you have to use the only keyboard that's there.
> so saying that you must use a keyboard seems a little premature
What part of "I for one much prefer a physical keyboard" implies "you must use a keyboard" ? I expressed a personal preference, that's all. I don't expect my preference to change, but sure, anything is technically possible though unlikely.
The part where you say you prefer a keyboard to some futuristic solution that doesn't yet exist. You can't have an opinion about something that doesn't yet exist.
Before someone dreamed up the automobile, you would be telling us how you prefer the horse, perhaps just made faster.
No, but you'll want to be at a desk or a chair with a keyboard mount.
Typing is always going to be more efficient for precision data input until we get to the third or fourth generation of direct brain interface... and nobody wants to be a beta tester for that.
PSA - don't get tempted by UltraSharp reviews and recommendations, like I did. Just pick an Eizo instead.
UltraSharps are widely recommended for coding, with glorious reviews and endorsements. I got one, and no matter how I adjusted it, it was still too... eye-piercing, if you will, for longer coding sessions. I got mild headaches, tired eyes and a general feeling of discomfort when working on them even for shorter periods. Then I switched to a FlexScan and it's a completely different ballgame: softer, more gentle feel, incomparably more comfortable. The best monitor I've had the pleasure of staring at in my 20 years of programming.
However, the interesting part here is that both monitors use the same panel (!), so the panel itself is only part of the recipe, which is something that many reviews tend to either downplay or not mention at all.
Definitely. I made the mistake of buying an UltraSharp U2713HM which actually buzzed with lots of text on the screen (apparently it's a feature, just google the model). I tried swapping my new display and they sent me a refurbished one with dead pixels that buzzed also. I'm never buying Dell again.
This time around I did the right thing and went the more expensive route and got an Eizo and it's just absolutely perfect.
Sadly, outside of macOS, the support for high PPI display is dodgy at best. On Linux, it is becoming better, but still a lot more to improve. On Windows, it's a real shitshow. Windows 10, for the most part, is OK (still a lot of inconsistencies and lack of polish when scale > 100%), but application support is just terrible, be it blur, tiny scale, inconsistent elements (some small, some large) or plain crashing. Really, any vendor that makes such display is taking a risk that people will complain and a sizeable portion of the customers that purchase such a display will not understand the technical difficulties.
>> On Linux, it is becoming better, but still a lot more to improve.
Even on Fedora 26, I have to manually patch and recompile Mutter to get HiDPI support on Wayland. Mutter is hardcoded [1] to use 2x scaling on displays that are 192 PPI or more. My Dell p2415q is 188.2 PPI, so you get 1x 'scaling' by default and everything is tiny. This problem has been known for a while [2].
And even though GNOME applications then generally work well, you have to start Chromium with a special flag to let it run with scaling. There are a lot of glitches throughout the system, e.g. the mouse cursor has the right size, but when you go to a window corner to resize a window, it becomes tiny. A lot of icons are blurry, because they have a low resolution and are scaled up.
macOS is basically plug & play. All applications are in full HiDPI glory. The only exception are websites that use low-resolution images. There is one catch: the (terribly expensive) Apple USB-C Digital AV Multiport Adapter only supports 4k at 30Hz. This refresh rate is very tiring and annoying. However, a USB-C -> DisplayPort ALT mode connector works great @ 60Hz.
[1] IIRC there is work in the master branch to solve this.
Chrome/Chromium is notoriously bad at adopting any technology. On Windows, it took them years to support scaling other than 1x, color Emoji was added very recently, etc. I think on Linux it is even worse. Have they finally fixed ligature support in Chromium?
With Firefox rapidly improving in performance and memory usage, there are fewer and fewer reasons to use Chrome.
I set Xft.dpi: 192 in ~/.Xresources (gnome-settings-daemon and others will perform the equivalent setting).
Chromium respects this (for many major versions by now) and shouldn’t need a separate flag. Are you certain the flag is necessary in your setup? Can you check what Xft.dpi is on your system? xrdb -q | grep '^Xft.dpi'
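For reference, the mechanics look like this (assuming an X11 session; 192 targets a ~2x panel, per the parent comment):

```shell
# ~/.Xresources -- request 2x scaling for Xft-aware toolkits:
#   Xft.dpi: 192

# Reload the resource file and confirm what's actually set:
xrdb -merge ~/.Xresources
xrdb -query | grep '^Xft.dpi'
```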
On Linux, it just depends on your desktop environment.
If you use KDE or any other Qt-based one, HiDPI works perfectly at any scale, even in heterogeneous environments – a landscape 4K 144ppi and a portrait 70ppi monitor right next to each other will work fine, and when you move applications they automatically rescale. This works on both X11 and Wayland. The scale is set via the environment variable
It’s GNOME and GTK that have almost no HiDPI support at all – they only support 96 and 192dpi (integer scaling only), and just added a feature that handles all other resolutions by scaling the windows down with blur. GTK2 still supported proper HiDPI, but with GTK3 they decided to abandon that and instead build apps that rely on pixels always being the same size. Considering how young GTK3 is, this decision may sound a bit short-sighted, but the GNOME devs always point to macOS as an example where this worked (while ignoring Android and Windows 10, which use fractional scaling, and ignoring the fact that with GNOME's solution a game such as Minecraft on a 4K 144dpi screen is rendered at 6K and then half the resolution gets thrown away during downscaling, because there is no option to avoid this).
That's not exactly true. On Linux, it's a real shit show because only KDE (Qt) supports proper fractional scaling, GTK apps don't. GTK supports 100%, 200%, and so on. You can play with xrandr, but that'll cause serious other issues (not even speaking about multi-monitor setups).
On Win10, however, all of the programs I'm using have flawless hidpi support - browsers, IntelliJ, console windows (cmder), Spotify, etc. W10 supports per-display DPI as well.
I don't know which applications are "crashing", that sounds like FUD. I know about two notable exceptions that don't have good hidpi support: Adobe tools and Hyper-V.
I'm not sure what "not exactly true" means, it's my personal experience of trying to run Windows 10 on my MBP. Many of the video tools I use regularly (for encoding, subtitle creation and editing, authoring, muxing, etc) are either broken or look bad. I understand that may not be mainstream use, but it certainly isn't esoteric either. You say browsers, but last I remember, Chrome didn't really support it. Now I am on Firefox, so it's proper, but that's not what most people use. Adobe software, as you said, is notorious at supporting scale. That's not esoteric either.
Chrome's hidpi support has been acceptable on Windows since about 2015. It's still not absolutely perfect (there are issues with 1px black bars when resizing Windows), but it's completely usable.
I think when two monitors are connected, Windows still has issues with correctly transitioning software from one scale to another. But that's at the system level and can be solved. The main issue is legacy software that would require a complete rewrite to support scaling. A bigger issue in those cases is that the community is completely unwilling to support such displays, with silly claims such as "you can't see the difference".
Rescaling between two monitors that extend each other works well, even with different scales; it's when one gets disconnected and the window is forced to move to another screen that issues arise.
As to legacy software, most software works fine. Software that never cared about scaling is handled by the system easily enough. It's those rare cases where legacy software claims to support scaling (to work around issues from not using the Windows APIs) but doesn't actually do anything to support it, such as GIMP.
Those are rare though, and usually software with its own rendering stack that doesn't use the Windows APIs.
I meant when moving an application window from one screen to another. I saw some issues that didn’t exist when I worked only on the MBP screen, but when I connected my 27” external display, they were suddenly issues noticeable when dragging windows.
This is simply wrong. We switched from Macs to Windows 10 because of the better monitor options available for professional color grading work. And true "Cinema" and consumer 4K resolutions are supported and supported well ONLY on Windows 10.
I respectfully completely disagree as do most professional photo/video folks I know. Color grading support on Macs is superb. There's also no commonly used resolutions supported on Windows that I know of that aren't available on Mac, so I'm not sure what you mean by the last part.
That's how I use a 4K @ 32" (16:9). It's just on the side of being comfortably usable. If the density were any greater, I would have to use some HighDPI mode and sacrifice real estate.
That's awesome. Would love to try one of these one day; alas, I haven't seen them in the flesh yet. I sit pretty close to the screens, so I have been wondering how a large 4K at 100% would work for me.
In some ways, I imagine the Ultrawides (especially the curved ones) are probably more usable, because quite often you could get by nicely with a purely horizontal array of windows, and not have to be thinking about how to most effectively make use of vertical partitioning too.
When spending the money, I didn't feel like hitching myself to a "weird" aspect ratio, but maybe Ultrawides are here to stay.
It's interesting to see the variety of monitors people are using - and especially the variety of pixel densities, all the way down to 69 pixels per inch on a 32" 1080p panel.
For anyone who is curious to compare the pixel density of a variety of monitors and televisions, I have a spreadsheet with about 120 different display sizes and resolutions and their pixel densities:
Email me at the address in my profile or use the chat box in the spreadsheet (to avoid cluttering this thread) if you have any other displays you'd like to add.
Ugh, I bought the 34" a few months ago and replaced it with a 42" 16:9 yesterday. The ultra wide format is really awkward for everything except gaming and media, and those have tons of software issues with the aspect ratio.
Low DPI, and certainly Dell's traditional low-quality anti-glare filter (3H). When will they wake up and build monitors for people who work with letters and numbers (vs. video or images) and favor quality over price? For a few more years I think I will still have to switch every program to a dark theme as a workaround to hide the poor quality and defaults of these monitors... sad. :(
Beware: if you're somewhat messy, like me, you're going to have hundreds of (terminal) windows and hundreds of browser tabs open simultaneously on these, until you can't find your current stuff on your desktop anymore, at which point you have to spend half an hour sorting out what to close. I actually sold my fat 27" and went back to 24" (and to 13" notebooks).
A special window manager is a prerequisite on a big screen, but which one? I like the original Exposé feature on Mac OS <= 10.6, while the remade 10.7 version just doesn't blend from/into full desktop view as spatially recognizably as the original for me. Ubuntu's Unity does an OK job too, but it's going away ([1]).
>> Beware if you're somewhat of a messy, like me, you're going to have hundreds of (terminal) windows and another hundreds of browser tabs open simultaneously on these, until you can't find your current stuff on your desktop anymore at which point you have to spend half an hour to sort out what you want to close.
I have this problem, sort of. I have a dual 27" 4K running at 1:1, and it's gotten to the point where I want two more monitors for more real estate for my mess of windows. Given the added weight and stability issues (I have a standing desk), I think I'm going to wait until ~50" 8K curved monitors are available and affordable.
For macOS window control via keyboard I recommend Spectacle (https://github.com/eczarny/spectacle). Among other options, the app can position windows by thirds which really shines for this class of screen - think three windows side-by-side each using 1280px.
The built in Win+Tab command works well for the closing windows problem on Windows 10 on a 34" screen. I'm using a combination of built in commands and WindowSpace commands I learned with two monitors. Keyboard shortcuts to throw a window to the left or right half of the screen go a long way, but I could use a small step up in tiling features.
I wish it were available in HiDPI version, i.e. twice the resolution - it would make it way better to look at, as seeing pixels while editing video or producing audio in Live is no longer acceptable.
I have 3x 4k displays, one of them 30" DCI 4k, one 55" UHD, plus a rMBP and an ultrabook with 3200x1800. I simply can't go back to see pixels again. Once you get used to better, you dislike a downgrade.
I get that, but it's a general point. You specifically called out producing audio, and I was curious if there were specific visual features that are necessary in producing audio. I get it for video editing. I also understand the general "I like high resolution" case.
I am wondering if there is something I am unaware of about the combination of high resolution and audio that made you call that out as a specific use case.
Besides nicer fonts, you get "sub-pixel" resolution in e.g. your piano roll/channel list, so you can zoom out a bit more while still being able to distinguish things there. That is handy if you work with a very large number of channels.
This is based off a monitor from LG that is over a year old and is not 4K.
I have a 34 inch flat LG thunderbolt screen right now on my 3 year old MacBook Pro. Great screen
I plan to swap the computer to the MacBook and get a LG 32ud99-w when they finally ship (4K, USB-C from MacBook). I fly 100K a year so I want the smaller laptop. I hate giving up on the thunderbolt. I have been waiting for a decent USB-C monitor.
This is a difficult time to commit $1500 to a monitor. For that kind of money I would really want better support for new color standards.
But the standards are still settling, and you can actually end up worse off with better color if the stars are not aligned with your operating system and applications.
It is not. It is a very wide screen: 24:10 aspect ratio.
A UHD/4k pixel density that you would find in a 16:9 display would mean a lot more horizontal pixels.
Though I would like to see one that is 5184x2160 or 5760x2160 pixels. I expect that is just too expensive and niche atm, and would push graphics cards and display ports to their limits.
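As a sanity check on "pushing display ports to their limits", a rough uncompressed-bandwidth estimate (the ~20% blanking overhead is a typical assumption, not an exact CVT figure) puts 5760x2160@60Hz around 21.5 Gbit/s, beyond DisplayPort 1.2's ~17.3 Gbit/s effective payload but within DP 1.3/1.4's ~25.9 Gbit/s:

```python
def video_bandwidth_gbps(w_px, h_px, hz, bits_per_px=24, blanking=1.2):
    """Rough uncompressed video bandwidth in Gbit/s, assuming
    24-bit color and ~20% blanking overhead."""
    return w_px * h_px * hz * bits_per_px * blanking / 1e9

print(round(video_bandwidth_gbps(5760, 2160, 60), 1))  # ~21.5 Gbit/s
print(round(video_bandwidth_gbps(3840, 2160, 60), 1))  # plain 4K60 for comparison, ~14.3
```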
At the office we have a dozen QNIX 4K 32-inch monitors. They are not quite IPS, although the listing says they are, and they are $399 USD on eBay. Hard to beat that price. They are bright and have DP & HDMI 2.0 ports.
Pretty interested in this, as I'm not a fan of 21:9, and I'm wondering how I'll feel about looking at 24:10. As stated, I wish it had a much higher refresh rate (120Hz should be the new norm by now, I feel).
I'm guessing you don't like 21:9 because it's too wide relative to its height, or put another way it's not tall enough relative to its width.
If that's the case, 24:10 would be even worse.
Don't be misled by the :9 vs. :10 part - that is meaningless. Reduce them both to a single number for a fair comparison: 21:9 and 24:10 are 2.333:1 and 2.4:1 respectively.