What are the differences between HD, UHD, 4K and 8K?

Friday, 17 January, 2020

Until the turn of the millennium, picture quality wasn’t really spoken about.

Television content was broadcast at a standard definition of 576 lines, in a 4:3 aspect ratio, and any TV set would display comparable picture quality.

Then came widescreen TVs, with a superior 16:9 aspect ratio – now the industry standard for recording, broadcasting and playback.

Widescreen was soon joined by high definition content, commonly abbreviated to HD. Next came 4K, while some companies are now experimenting with 8K content.

Each generational change improves picture quality immensely. The jump from HD to 4K is just as noticeable as the improvements between standard definition and HD.

Yet the differences between these picture quality standards often become confusing, especially with 4K and 8K grabbing headlines at last week’s CES 2020 in Las Vegas.

As a result, many people are unclear whether they should be buying 4K and 8K-compatible devices.

In terms of streaming media, do the benefits in picture quality outweigh the much bigger file sizes which have to be sent along the UK’s creaking broadband infrastructure?

Is it worth investing in a 4K TV to enjoy The Grand Tour in all its globetrotting glory, or will the cinematography look equally dynamic in standard HD?

To answer this, we need to understand how pictures are displayed on our screens.

Making sense of scale

Pictures on smartphones, monitors, tablets and TV screens are made up of individual pixels, arranged in horizontal lines with a specific number of pixels along each row.

SD content is broadcast across 576 horizontal lines, while Full HD content is generally displayed across 1,080 lines.

This figure is often followed by a lower-case p, indicating progressive scan, where every line of the image is drawn in each frame rather than the alternating lines of older interlaced (i) broadcasts.

Most modern devices offer the Full HD resolution of 1920 x 1080. That means there are 1,920 pixels in each row, and 1,080 rows of pixels from top to bottom.

Rather like the 720-line resolution adopted by some budget televisions around the turn of the millennium, there is a halfway house between Full HD and 4K. It’s known as QHD, or Quad HD.

This uses 1440 lines, and is often deployed on high-end smartphones and gaming monitors. However, relatively little content is broadcast in this format.

The next full step up in picture quality is 4K, which displays at a resolution of 3840 x 2160. Confusingly, it’s also referred to as Ultra HD, or UHD for short.

The name ‘4K’ comes from the 3,840 pixels in each row (almost 4,000), just as 8K takes its name from the 7,680 pixels in each of its 4,320 rows.
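
To put those figures in context, here is a minimal Python sketch that multiplies the pixels per row by the number of rows for each format. The 4K and 8K figures are the ones quoted above; the widescreen SD and QHD widths are standard values assumed here for illustration.

# Rough pixel-count comparison for the resolutions discussed above.
# Each entry is (pixels per row, number of rows).
resolutions = {
    "SD (576p widescreen)": (1024, 576),   # width assumed for illustration
    "Full HD (1080p)": (1920, 1080),
    "QHD (1440p)": (2560, 1440),           # width assumed for illustration
    "4K / UHD": (3840, 2160),
    "8K": (7680, 4320),
}

full_hd_pixels = 1920 * 1080

for name, (width, height) in resolutions.items():
    total = width * height
    print(f"{name}: {total:,} pixels ({total / full_hd_pixels:.1f}x Full HD)")

Running it shows why each jump feels so dramatic: 4K packs in four times as many pixels as Full HD, and 8K sixteen times as many.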

So what’s the most practical format?

While standard definition content (like terrestrial TV channels and the lowest-resolution Netflix/Amazon Prime files) is perfectly watchable, Full HD is far superior.

It does away with the ghosting and blurriness SD footage suffers from during fast-moving events like live sports.

You only need a 5Mbps line speed to stream HD files on Netflix, BBC iPlayer and Amazon Prime Video. Most domestic broadband connections will comfortably support this.
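
As a rough guide to what that 5Mbps figure means for data usage, the sketch below converts a sustained streaming bitrate into gigabytes per hour. The 25Mbps figure used for 4K is an assumption for comparison rather than a figure quoted above, and real services adjust their bitrates constantly.

# Convert a sustained streaming bitrate (in megabits per second)
# into approximate data consumed per hour of viewing.
def data_per_hour_gb(bitrate_mbps: float) -> float:
    megabits_per_hour = bitrate_mbps * 60 * 60
    return megabits_per_hour / 8 / 1000   # megabits -> megabytes -> gigabytes

# 5Mbps is the HD figure quoted above; 25Mbps for 4K is an assumption.
for label, mbps in [("HD at 5Mbps", 5), ("4K at 25Mbps", 25)]:
    print(f"{label}: roughly {data_per_hour_gb(mbps):.2f}GB per hour")

On those numbers, an hour of HD streaming works out at roughly 2.25GB of data, while 4K at the assumed bitrate would use around five times as much.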

While 4K looks gorgeous, there’s relatively little content available, and compatible hardware remains expensive. These points are even more valid for the ultra-rare 8K format.

It’s worth noting that you can watch content of any quality on any device; a programme recorded in 8K could be viewed on a Full HD monitor. You just wouldn’t see the full benefit.

Finally, don’t be tempted to pay a premium for a 4K phone screen. A phone’s display is so small that anything beyond a 1920 x 1080 resolution is unnecessary.

By: Neil Cumins

Neil is our resident tech expert. He's written guides on loads of broadband head-scratchers and is determined to solve all your technology problems!