20 May 2017

Who needs 4K...?

(Image credit: Jedi787plus, from Wikipedia)

While I have mentioned that I think it's possible that 4K resolution TVs and monitors might be prevalent in the future (say around 2026-2028), they're a long way from mainstream use in the average person's home at this point in 2017. Worse still, content providers still haven't really settled on a good way to get 4K content to consumers - Blu-ray discs are too small and streaming 4K is made difficult by the greedy policies of ISPs worldwide, who want to charge both the consumer and the provider for passing information over their networks. The UHD Blu-ray format is now available but I've never seen a player or disc for sale - the standalone players are incredibly expensive (think €300+) when you could buy a game console for the same price, none of which support UHD discs. The Xbox Scorpio will have a UHD disc player when it's released but there's no word on the cost of the device as yet and I don't expect it to be cheap* given the specs...

While video content was being filmed at a quality that would have allowed for 4K-resolution transfers even before digital cameras were commonly used, the high-quality copies were discarded - probably to save storage space, though the article linked claims that the people they heard from "didn't realise the format was coming to the public space".

The heart of the issue is "do we need it?" and, if so, "why do we need it?"


I've been a relative resolution Luddite for many years. In theory, I resisted the HD push in the 2000s, 3D displays in the late 2000s/early 2010s and am resisting the 4K push now. In practice, it comes down to the fact that the displays I've owned have been good enough that they did not need to be replaced and I'm not one to waste or junk items just because they're a bit old.

Not only that, but the content we're viewing isn't improved by watching it at a higher resolution. Does the drama increase when moving from 480p to 720p to 1080p to 2160p? Does the contrast or colour depth? Do the filming or editing techniques get better? The answer is a resounding "No!". People say they like the clearer picture and better detail but, quite honestly, most people won't be able to pick out a lot of that detail in a fast-moving series of images... not that the incidental detail is important to the quality of the content either... In many use cases, a bigger screen would be better than one of a higher resolution, though I will concede that at certain distances, higher resolution will make more sense on a bigger screen than on a smaller one.

However, because my partner and I are two extremes of user, I think I have a pretty good, unbiased understanding of how different resolutions, screen sizes and viewing distances work out. For reference, one of us has perfect vision and the other is almost (figuratively) blind. The one with perfect vision can pick out detail at 1080p on a 32" screen from 10 feet away whilst the other won't see that detail from 4-5 feet away. It's quite an interesting experience!

Most people won't have perfect vision, so increased detail through resolution won't help them. Most people won't have great peripheral vision either, so sitting near a screen that is too large will result in lots of lost information that cannot be processed without moving the eyes to focus on that part of the screen. I really believe that it's a case of 'social status' coming to the fore - a high-end, large TV is a far more affordable status symbol than an equivalent car.
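
As a rough sanity check on the viewing-distance point, here's a back-of-the-envelope Python sketch using the common one-arcminute rule of thumb for 20/20 vision (real eyes and real content vary, so treat the numbers as ballpark figures, not anyone's official spec). It estimates the distance beyond which the extra pixels of a given resolution stop being distinguishable on a 32" screen:

    import math

    # Assumes a 16:9 panel and that 20/20 vision resolves roughly one
    # arcminute - a common rule of thumb, not a hard physiological limit.

    def max_useful_distance_ft(diagonal_in, horizontal_pixels, acuity_arcmin=1.0):
        width_in = diagonal_in * 16 / math.hypot(16, 9)   # panel width in inches
        pixel_pitch_in = width_in / horizontal_pixels     # width of one pixel
        theta_rad = math.radians(acuity_arcmin / 60)      # acuity as an angle
        return (pixel_pitch_in / theta_rad) / 12          # small-angle approx., in feet

    for label, px in [("1080p", 1920), ("1440p", 2560), ("4K", 3840)]:
        d = max_useful_distance_ft(32, px)
        print(f'32" {label}: extra pixels stop being distinguishable beyond ~{d:.1f} ft')

On that rough estimate, moving from 1080p (useful out to roughly 4 feet) to 4K (roughly 2 feet) on a 32" screen only adds visible detail if you're sitting practically on top of it.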

This is also one of those cases where cultural differences come into play because most European, Korean, Taiwanese and Japanese homes are very different from those in the USA and Canada. I can't even begin to speculate on the average 'distance to couch' for countries in South America, Russia, Eastern Europe and China, but the point is that this is where people will come in and say, "55" TV? You're crazy!" or, "32" TV?! How close do you sit? It must look like a postage stamp!". And that's ignoring the trend in Asia of people not having a TV in their homes at all.

For my setup, a 32" TV is about the maximum I can stomach and, PC-wise, a 22-24" monitor is the desktop equivalent - mostly because the vertical real estate is more important there.

Getting on to the gaming side of things, a mid-range PC outfitted with a GTX 1060 or RX 480 will struggle to hit 1080p/60fps at max settings in all modern games and will have to have settings turned down in many titles at 1440p/60fps. That's still an outlay of €800-1000 for the whole PC case/innards - not including the monitor and keyboard/mouse - so not cheap. Consoles are much less powerful, and both the Xbox One and PS4 struggle to output 1080p/30fps on many titles with far fewer graphical bells and whistles than are usually available on the PC ports of those titles.

The reason I'm laying all this groundwork is that gaming has been technologically constrained for the last 10 years. I've seen many calls for better AI and world simulations - those require more CPU power, but every time we get an advance in that area, higher resolutions and new graphical processing techniques are simultaneously pushed on the consumer, negating those advances in the process. Sure, the graphics cards of today take on a large portion of those duties, but the CPU is still required to push the data to them.

As a result, developers do have more to work with now than in the past (it would be silly to argue that they don't) but they have much less than they would have if manufacturers weren't chasing that 4K resolution. If the PS4 Pro had come out and targeted 1080p/60fps as a minimum, I doubt that many consumers would have been unhappy. If it had also allowed 1440p/30fps, that would have been a good compromise as well. Unfortunately, TV manufacturers decided to skip the 1440p format for whatever reason (I like to imagine they were simply unaware of it, since it would have been another surefire way to extract money from consumers by touting yet another 'upgrade step' ;) ) and we're left with a huge, almost insurmountable chasm to cross in terms of performance per dollar - the rough sums below give a sense of its size.
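
To put some numbers on that chasm, here's a quick Python sketch of the raw pixel throughput each resolution/framerate target demands, relative to the 1080p/30fps level the base consoles already struggle with. Rendering cost doesn't scale perfectly linearly with pixel count, so take it as a first-order guide only:

    # Raw pixels per second at each resolution/framerate target, relative to
    # 1080p/30fps.  A rough first-order guide, not a benchmark.

    RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
    BASELINE = 1920 * 1080 * 30  # the 1080p/30fps level consoles already struggle with

    for name, (w, h) in RESOLUTIONS.items():
        for fps in (30, 60):
            load = w * h * fps
            print(f"{name}/{fps}fps: {load / 1e6:6.1f} Mpix/s ({load / BASELINE:.1f}x baseline)")

4K/60fps works out at eight times that baseline pixel load, and even 4K/30fps is four times it, which is why the jump feels out of reach at console prices.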

Now, some people may argue that "4K's not that big of a deal" but the fact that you need a £/€/$700 GPU to achieve a consistent(ish) 60fps at that resolution kinda underscores the point, doesn't it? And** that's not even considering the rest of the system that you need to feed that GPU... probably another £/€/$/whatevs 700 to boot! You simply cannot expect this performance in a 'normally priced', consumer-orientated console.


So, who is "4K" really for?

- It's not for the gaming tech companies, who are struggling to reach that metric at all (although I have a feeling that if NVidia had competition at the high end, the GTX 1080Ti would have been the GTX 1080), because the majority of their market is low-end. Jen-Hsun Huang, CEO of NVidia, recently stated the following in their Investor Q&A:

The average selling price of the NVIDIA GeForce is about a third of a game console. That’s the way to think about it. That’s the simple math. People are willing to spend $200, $300, $400, $500 for a new game console, and the NVIDIA GeForce GPU PC gaming card is on average far less.

That puts the average selling price in the region of $100-120, but that average is very deceptive because the higher-end graphics cards are exponentially more expensive. The list below shows the general price you'll find each card at as of today. You'll be able to find individual cards both cheaper and more expensive, but these are typical prices for each model.


  • The GT 1030 is $80
  • The GTX 1050 is $110
  • The GTX 1060 3GB is $180
  • The GTX 1060 6GB is $230
  • The GTX 1070 is $370
  • The GTX 1080 is $500
  • The GTX 1080Ti is $700
  • The Titan X (Pascal) is $1200

The geometric and harmonic means of these prices are roughly $294 & $208 respectively - both well above Mr Huang's figure, which tells you that the higher-priced cards are over-represented in a simple average of list prices compared to how many of them actually sell. In actual fact, what Mr Huang is saying is that the majority of NVidia's sales come from the lowest price range and from their older, less expensive cards. The lowest-priced vendor-sold graphics card available on Amazon is around $25-30; including a $25 card in the means above gives us roughly $224 & $115 respectively. Including all the other cards priced between the GT 1030 and the $25 GeForce 8400 GS (released in 2007) or the $30 GeForce 210 (released in 2009) would skew the means even lower, more closely reflecting Mr Huang's numbers.
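
For anyone who wants to check the arithmetic, here's the quick Python calculation behind those figures, assuming $25 for that cheapest Amazon card:

    from statistics import geometric_mean, harmonic_mean

    # Ballpark street prices from the list above (needs Python 3.8+ for
    # geometric_mean).
    pascal_prices = [80, 110, 180, 230, 370, 500, 700, 1200]

    print(round(geometric_mean(pascal_prices)))   # 294
    print(round(harmonic_mean(pascal_prices)))    # 208

    # Add the cheapest vendor-sold card on Amazon at roughly $25.
    with_budget_card = pascal_prices + [25]
    print(round(geometric_mean(with_budget_card)))  # 224
    print(round(harmonic_mean(with_budget_card)))   # 115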

The point of this is that 1080p is still the most common resolution worldwide and the majority of graphics hardware isn't even powerful enough to drive the latest games at that resolution at 60fps... The hardware required to power those games is already light years beyond the average person's reach, so why are we moving on to a resolution that is largely unwanted and out of reach of most consumers' hardware?

- It's not for the developers, because it's much more expensive to produce 4K-ready assets than ones that look 'good enough' at 1080p and 1440p.

- It's not for the consumers because (as pointed out above) most consumers won't benefit from it and those that do are edge cases (seriously, a 100" screen viewed from a mile away?***). Plus, consumers are forced into sub-par experiences because their hardware never got to the stage of easily supporting the 1080p/60fps standard. Now their hardware can't even manage 4K at 30fps without large amounts of trickery such as upscaling and dynamic resolution. Up/down-scaling is always a hack: it is better to run content at the exact resolution it was designed for.

The only people benefitting from this are the TV companies and the screen makers. They're pushing out a new 'standard' that has everyone else scrambling when the last 'standard' was never even perfected or fully reached - and has only been mainstream for around 10 years. Hell, a lot of TV content around the world is still not even delivered in 1080p! YouTube still defaults to 720p or 480p and may go lower depending on the quality of your connection!!! A lot of streamers are still at 720p or 900p resolutions...

Worse still, many good features like DisplayPort, HDR, FreeSync and 120Hz refresh rates are commonly left out of 1080p TVs in favour of sticking them into the 4K displays (if you're lucky!) and, currently, HDR isn't even catered for by PC monitors****. There are plenty of ways the industry could have made money on the current resolutions before moving on to the 4K standard but, for whatever reason, they didn't.

As a result, we are all being forced into a situation that is sub-optimal.




*Cheap as in €300-400...
**I know it's a big faux pas! I'm writing in conversational English so give me a break - I really tried to avoid it, if that's any consolation...
***This is hyperbole!
****Apparently this is coming later in the year but, again, it will be tied to 4K resolutions for the most part.
