4K HD Television Requirements

Once upon a time, in a galaxy far, far away.... Sounds like the intro to a George Lucas blockbuster, doesn't it?

Our broadcast standard was originally developed in the late 30's and was based on the venerable CRT, or Cathode Ray Tube.  Color broadcast television emerged in the 50's and went mainstream in the 60's, and it too demanded some rather significant design compromises due to the physical limitations of phosphors.  Turns out that real-world performance varied considerably from the mathematics.  Regardless, the USA's TV broadcast standard is essentially based on technology that is 75+ years old, and that, my friends, is a problem for modern-day TV displays.

We have all been witness to the slow and painful demise of analogue TV.  That process took multiple years to fully implement.  Essentially, analogue broadcasts were shut down on or about June 12, 2009 (with some exceptions) by our FCC.  Today we are enjoying a purely digital broadcast system on high definition TV displays (HDTVs).  Moreover, these same TVs are incredibly thin...and getting thinner and lighter!  The current crop of HDTVs is generally LED based and provides a typical maximum resolution of 1080P.  The 1080P format provides slightly more than 2 million pixels, or "Picture Elements".

This is where things start to get exciting or perhaps ugly, depending on your perspective.  Our most bestest friends in Hollywood were bouncing off the walls about the possibility of the common folk pilfering their HD content.  Intel Corp. was enlisted to design a cipher system to preclude that possibility.  This new cipher is referred to by the acronym HDCP, or High-bandwidth Digital Content Protection.  This copy protection protocol is part of the HDMI standard.  Today, Component Video and S-Video have been supplanted by the newer and more rigorous digital HDMI interface.

All modern HD Displays, STBs (Set Top Boxes) and AVRs (Audio/Video Receivers) have HDMI ports to accommodate the de facto HDMI connection requirements.  The initial HDMI 1.0 rollout was truly a beastly deployment.  It turns out that many of its provisions were "optional to implement" by the manufacturers.  What a concept!  Can you imagine the consequences if driving on the right side of the road were optional?

Successive iterations of the HDMI standard soon followed, and it was like a game of leapfrog from the initial 1.0 to 1.1, 1.2, 1.3 and 1.4.  Each new iteration added new feature sets and greater bandwidth/speed.  HDMI, to be sure, is a very complex technology.
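
For a rough sense of that leapfrog, here is a quick lookup sketch of the approximate maximum throughput commonly quoted for each major HDMI revision.  These are round numbers and my own summary, mind you, not chapter and verse from the spec:

    # Approximate maximum throughput commonly quoted for each major
    # HDMI revision (round numbers; my own summary, not spec text).
    HDMI_MAX_GBPS = {
        "1.0": 4.95,
        "1.1": 4.95,
        "1.2": 4.95,
        "1.3": 10.2,
        "1.4": 10.2,
        "2.0": 18.0,
    }

    for version, gbps in HDMI_MAX_GBPS.items():
        print(f"HDMI {version}: up to ~{gbps} Gbps")

Note how the big jumps (1.3 and 2.0) are exactly where the new feature sets demanded more raw speed.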

4K TVs, AVRs and Blu-ray Devices:

The latest TV Display resolution technology is referred to as 4K.  A 4K TV Display is capable of delivering 4 times the 1080P resolution, which we now know to be about 2 million pixels.  Employing our 3rd grade math skills to this equation, we quickly arrive at the tidy sum of approximately 8 million pixels.
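
If you'd like to check my 3rd grade math, here is a quick back-of-the-envelope calculation.  I'm assuming the common 3840 x 2160 "UHD" flavor of 4K, which is what consumer TV Displays use:

    # Rough pixel-count arithmetic for common TV resolutions.
    # Assumption: consumer "4K" sets are 3840x2160 (UHD).
    resolutions = {
        "720P":   (1280, 720),
        "1080P":  (1920, 1080),
        "4K UHD": (3840, 2160),
    }

    for name, (width, height) in resolutions.items():
        pixels = width * height
        print(f"{name}: {width} x {height} = {pixels:,} pixels")

    # 1080P:  1920 x 1080 = 2,073,600 pixels (slightly more than 2 million)
    # 4K UHD: 3840 x 2160 = 8,294,400 pixels (exactly 4x the 1080P count)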

WOW!

All good so far?  Not really.  You see, along with this quantitative jump in resolution comes an equally problematic jump in the HDMI/HDCP requirements.  The latest HDMI 2.0 protocol demands that a 4K product support HDCP 2.2 to be fully compliant with the soon-to-be-released 4K Blu-ray Players.  One would naturally expect that a TV Display carrying the "4K" banner would have all of the required "giddy-up" necessary to actually watch true 4K content.

Regrettably, this isn't always the case with 4K Displays and their associated connected devices.  To ensure complete compatibility, a 4K system must include both HDMI 2.0 and the most recent release of the HDCP protocol, which is HDCP 2.2.  The HDCP 2.2 provision requires a new chipset.  To be clear, this is a "hardware" related issue and not one that can be addressed via a "firmware" update.  What this all means to the average consumer is simple but a wee bit Machiavellian in nature.  Older non-HDCP 2.2 AVRs are not fully compliant and (with some minor exceptions) cannot be upgraded.
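
The practical upshot is a "weakest link" rule: every device in the HDMI chain, from the source through the AVR to the display, must support HDCP 2.2, or true 4K playback gets blocked (or downgraded).  Here is a minimal sketch of that rule; the device list and version numbers are purely illustrative, not a real API:

    # Hypothetical "weakest link" check for an HDMI signal chain.
    # Every device between the 4K source and the display must support
    # HDMI 2.0 and HDCP 2.2, or true 4K playback is blocked/downgraded.

    def chain_supports_4k(devices):
        """Return True only if every device supports HDMI 2.0 + HDCP 2.2."""
        return all(d["hdmi"] >= 2.0 and d["hdcp"] >= 2.2 for d in devices)

    # Illustrative system: a new 4K Blu-ray player and 4K display,
    # but an older AVR in the middle with HDCP 1.4 silicon.
    system = [
        {"name": "4K Blu-ray Player", "hdmi": 2.0, "hdcp": 2.2},
        {"name": "Older AVR",         "hdmi": 1.4, "hdcp": 1.4},
        {"name": "4K Display",        "hdmi": 2.0, "hdcp": 2.2},
    ]

    print(chain_supports_4k(system))  # False -- the AVR breaks the chain

That middle device is exactly the Machiavellian bit: a brand-new player and display still won't deliver true 4K through an older AVR.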

To help ease some confusion, every TV display has a built-in piece of hardware known as a "Video Scaler".  Video Scalers use sophisticated processes to convert one resolution to a higher (or lower) resolution, depending on the native resolution of the TV display.  Accordingly, a 1080P Display would normally up-convert 720P or 480P program content to the more desirable 1080P resolution.  4K Displays utilize this very process to "Up-Convert" lower resolution content (720P/1080i/1080P) to approximate 4K content.  However, viewing an up-converted lower resolution file is NOT the same thing as viewing content mastered in true 4K resolution.
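
To see why, consider this toy nearest-neighbor up-convert.  Real Video Scalers use far more sophisticated interpolation, but the principle holds: the scaler can only stretch the pixels it was given, it cannot invent detail that was never captured:

    # Toy nearest-neighbor up-conversion: each source pixel is simply
    # repeated to fill the larger grid. Real scalers interpolate far
    # more cleverly, but no new detail is created either way.

    def upscale_nearest(frame, factor):
        """Upscale a 2D list of pixel values by an integer factor."""
        return [
            [row[x // factor] for x in range(len(row) * factor)]
            for row in frame
            for _ in range(factor)
        ]

    # A tiny 2x2 "frame" up-converted 2x becomes 4x4, with every
    # pixel duplicated rather than newly captured detail.
    frame = [[10, 20],
             [30, 40]]
    for row in upscale_nearest(frame, 2):
        print(row)
    # [10, 10, 20, 20]
    # [10, 10, 20, 20]
    # [30, 30, 40, 40]
    # [30, 30, 40, 40]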

To be fair, there really isn't a lot of true 4K content available at this time.  Content providers are hard at work creating true 4K content.  Most notably, Netflix filmed and released "House of Cards" in 4K this past season.  4K is coming to a theater near you, but the actual timing, I suspect, will mimic the rollout of the venerable CD.

Streaming and Wi-Fi:

There is a misguided notion that everything is Wi-Fi compliant these days.  While one could reasonably argue that most new AV devices have embedded Wi-Fi capabilities, it would be a huge mistake to assume that you will be able to successfully stream 4K content over most home networks.  The demands of 1080P streaming are difficult enough; the demands of 4K will only serve to exacerbate these bandwidth related issues.  Message:  If you can connect via an Ethernet cable, please do so.  You will greatly appreciate the stability, security and enhanced speed afforded by a hardwired connection.
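
To put rough numbers on it, here is a quick sanity check.  The bitrates are my own assumptions, based on the streaming recommendations commonly published at the time (roughly 5 Mbps for 1080P and 25 Mbps for 4K); your service and codec will vary:

    # Rough streaming bandwidth check. The bitrates below are assumptions
    # based on commonly published recommendations (~5 Mbps for 1080P,
    # ~25 Mbps for 4K); actual requirements vary by codec and service.

    RECOMMENDED_MBPS = {"1080P": 5, "4K": 25}

    def can_stream(resolution, available_mbps, headroom=1.5):
        """True if the link provides the recommended bitrate plus some
        headroom for Wi-Fi overhead, interference and household traffic."""
        return available_mbps >= RECOMMENDED_MBPS[resolution] * headroom

    # A typical congested Wi-Fi link might deliver ~20 Mbps of real throughput:
    print(can_stream("1080P", 20))  # True  -- HD streams comfortably
    print(can_stream("4K", 20))     # False -- 4K wants a faster (wired) link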

That is all for today, folks, but do stay tuned.

PS:  Friday's quiz will cover this material.  Be sure to bring two sharpened #2 pencils to class!

Thanks for your interest.......

The Supreme Stereomeister