
Why do all these OLED displays have to be 4K nowadays? I was shopping for TVs recently and, looking across three countries, found exactly one 2K (1920 pixels wide) OLED screen, at about 700 euros. That was slightly above budget, but I was considering it; if there had been more than this one choice, just to have some OSes and features to pick between, I might have gone for it. But 4K OLED costs thousands at no added benefit, so now we have a crappy (girlfriend-selected) 4K backlit LCD whose local-dimming backlight you can actually watch switching on behind the non-black parts of the screen.

It's not as if I can see individual pixels from across the room. The same goes for my laptop (I don't see pixels on a 16" 2K screen), my desktop monitor (I don't see pixels on a 23" 2K screen), and my phone (I don't see pixels on a 5.5" 720p screen). I'm pretty sure that if you can see individual pixels at those sizes, you're sitting too close.
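For what it's worth, you can put rough numbers on "sitting too close". A quick sketch of the arithmetic, assuming the common 20/20 rule of thumb of about one arcminute of visual acuity (real eyes vary): the distance beyond which neighbouring pixels blend together is just the pixel pitch divided by tan(1'). In Python:

    import math

    # Rough sketch: distance beyond which a ~1 arcminute eye
    # (the assumed 20/20 rule of thumb) stops resolving single pixels.
    def min_distance_cm(diagonal_in, long_edge_px, aspect=(16, 9)):
        w, h = aspect
        long_edge_in = diagonal_in * w / math.hypot(w, h)
        pitch_cm = long_edge_in / long_edge_px * 2.54  # pixel pitch in cm
        return pitch_cm / math.tan(math.radians(1 / 60))

    for label, diag, px in [('23" 1080p monitor', 23, 1920),
                            ('55" 4K TV', 55, 3840),
                            ('5.5" 720p phone', 5.5, 1280)]:
        print(f'{label}: pixels blend beyond ~{min_distance_cm(diag, px):.0f} cm')

That comes out to roughly 90 cm for a 23" 1080p monitor (closer than most desks put you, which is why some people do see pixels there) and a bit over a metre for a 55" 4K TV, so both camps can be right depending on viewing distance.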

On my phone I really just want my 720p OLED display back: there, too, I have a crappy 2K LCD now, because OLED is too expensive at the resolutions they make them in nowadays. And I was hoping to see OLED laptops soon, but it looks like we first have to get every single application rewritten to support more-pixels-than-you-can-actually-see zoom modes, then bring the price of the panels down, and then I can buy one.



I'm not sure about the TV across the room, but I can certainly see the pixels on a 23" 2K monitor. I'm using two right now, and every second I'm itching to get back to my MacBook's Retina display. Everything else about these monitors is fine... except the pixel density.

On Windows this is better (for me) because I personally like the subpixel rendering there: it aligns text to pixel boundaries more often and looks clearer to my eyes.


I've got a 27" 5K iMac. At work we have these 21.5" 1080p displays, and I just can't use them: the text looks so ridiculously blurry. Once you've seen HiDPI rendering, you can't go back.

I doubt the GP has ever worked with HiDPI content for an extended period of time.


A few things:

1. TVs keep getting bigger. You might not notice the pixels at your current TV's size, but by the standard of today's growing screens, your current TV might already count as small.

2. You might not be able to see the pixels on your 16" 2K screen or on a 4K TV, but your eyes aren't the only ones on the planet. To my eyes, 2K is old news; my 13" laptop has 1600 pixels vertically. My left eye can barely see anything thanks to keratoconus, yet with both eyes open I can still tell 720p, 1080p, and full 4K content apart on my TV, mainly in games and Blu-ray movies.

3. You don't need to rewrite every single application to support HiDPI displays. Windows has had scaling baked into the OS for years, and the majority of apps support it without issue; macOS has had Retina display support since late 2012, and only apps ported from Linux exhibit any weirdness. (A rough sketch of the Windows side follows this list.)

4. All of the Galaxy phones have had OLED displays as standard for years, long after 720p stopped being considered a high resolution for a phone (that was back in 2011). Even some of the non-flagship models have HiDPI OLEDs and aren't particularly expensive.
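Regarding point 3, a minimal sketch of what "baked into the OS" means on Windows: a process declares its DPI awareness, and the OS either hands it real pixels to scale itself or transparently bitmap-stretches the whole window. These are real Win32 calls (Windows 8.1+ and Windows 10 1607+ respectively), but take the snippet as illustrative rather than production-grade:

    import ctypes

    # Opt in to per-monitor DPI awareness (shcore.dll, Windows 8.1+).
    # Without this, Windows scales the app by bitmap-stretching it.
    PROCESS_PER_MONITOR_DPI_AWARE = 2
    ctypes.windll.shcore.SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE)

    # GetDpiForSystem (user32.dll, Windows 10 1607+) returns the effective
    # DPI; 96 is the 100% baseline, so 192 means 200% scaling.
    dpi = ctypes.windll.user32.GetDpiForSystem()
    print(f"System DPI: {dpi} ({dpi / 96:.0%} scaling)")

Apps that never call anything like this still run; Windows just bitmap-stretches them, which is the blurriness people complain about.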



