But if there are only double the number of lines, why is it called 4K? Surely that would mean 4,000 lines?
What UHD allows for is an increased amount of detail. A 1080p picture is 1,920 pixels wide by 1,080 lines tall; UHD is 3,840 by 2,160. Double the lines and double the pixels per line means four times the pixels overall – and the "4K" name comes from that near-4,000-pixel width, not from the line count.
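The arithmetic above can be sketched in a few lines of Python (the resolutions are the standard 1080p and UHD figures; the variable names are just for illustration):

```python
# Comparing Full HD (1080p) and UHD ("4K") pixel counts.
full_hd = (1920, 1080)  # width x lines for 1080p
uhd = (3840, 2160)      # width x lines for UHD

full_hd_pixels = full_hd[0] * full_hd[1]  # 2,073,600 pixels
uhd_pixels = uhd[0] * uhd[1]              # 8,294,400 pixels

# Double the lines x double the pixels per line = four times the pixels.
print(uhd_pixels // full_hd_pixels)  # 4
```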
All that extra detail allows for a truer colour palette, giving us a sharper, more vibrant image. Take a look at this side-by-side comparison for an example of what we mean.
As this technology is still fairly new to the market, the biggest problem at the moment is the lack of 4K programming. Some movies, shows and even the World Cup are now being filmed in UHD, but they are rarely released at that quality. Netflix streams series and films in HD and upscales some, but these are not true 4K. That said, over the next year we can expect the amount of 4K content on our televisions to grow as broadband speeds improve and become better able to cope with the data.
Another problem is motion blurring, which is especially evident in gaming. Rendering games at 4K is hard work for today's graphics hardware, so frame rates drop, the gaming experience becomes stuttery and the graphics blurry.
Talking to CreativeCOW.net, John Galt of Panavision said: “I subscribe to [celebrated director] Jim Cameron’s argument, which is we would get much better image quality by doubling the frame rate than by adding more pixel resolution.”
There is, however, an answer: NVIDIA G-Sync. Rather than the monitor simply receiving whatever the graphics card sends, a G-Sync monitor contains hardware that lets it communicate with the graphics card and sync its refresh rate to the frames being rendered – delivering the smoother motion that Jim Cameron and John Galt are asking for.
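A minimal sketch of why syncing the refresh rate to the frame rate helps: on a fixed-refresh display, a frame that misses a refresh tick must wait for the next one, so evenly rendered frames appear at uneven intervals (judder). The function names and timings below are purely illustrative, not NVIDIA's API:

```python
import math

def display_times_fixed(frame_ready, refresh_hz=60):
    """On a fixed-refresh display, each frame waits for the next refresh tick."""
    period = 1000 / refresh_hz  # milliseconds per refresh
    return [math.ceil(t / period) * period for t in frame_ready]

def display_times_adaptive(frame_ready):
    """With adaptive sync, the display refreshes the moment a frame is ready."""
    return list(frame_ready)

# A game rendering evenly at ~45 fps: a frame is ready every 22 ms.
ready = [0.0, 22.0, 44.0, 66.0, 88.0]
print(display_times_fixed(ready))     # uneven gaps between frames -> judder
print(display_times_adaptive(ready))  # even 22 ms gaps -> smooth motion
```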
Gaming aside, 4K comes into its own with movies and photographs, provided they are filmed or shot using 4K equipment. As content begins to catch up to demand, and super-fast broadband reaches towns and cities across the UK and Europe, these new screens will make your old HD screen look like that old analogue box you had to throw out for the digital switch-over.
Of course then it will be time for Panasonic and NHK to broadcast the 2020 Tokyo Olympics in 8K…
27 Oct 2020
NVIDIA's 30 Series launch has been an interesting one to say the least, and with the RTX 3070 soon to arrive, we thought it was worth discussing whether prospective buyers should opt for an RTX 3070 or an RTX 3080. As always, the answer isn't obvious at first, and depends on a few different things. Nevertheless, it's a debate worth having, with plenty of people on either side of the fence. So what exactly should we be considering, and what does it mean when deciding which way to go?