Haswell graphics – just in time for 4K UHD TV
While AMD's Trinity was technically the first to allow it, Haswell seems to be the first CPU with an integrated GPU to support proper 30 fps 4K playback for most video content. Here are the first impressions…
2013, besides finally bringing some major PC CPU refreshes, will also be remembered as the year when 3840×2160 TV sets (call it Quad Full HD, UHD, or 4K, though not exactly 4K with only 3840 horizontal pixels) started proliferating at affordable prices. Yes, affordable: 50 inch UHD models now cost less than USD 1,300 in retail plus tax, and that's the situation in China right now.
The problem for these TVs right now is, of course, the lack of content, which vendors like Sony are trying to solve with 4K content streaming and such. With a PC, however, the full usage of the UHD TV is there for you to tap into, as long as the machine and its GPU drivers support the resolution: at least most HDMI 1.4 setups can handle 30 fps at UHD level. And a PC lets you download whatever UHD / 4K content you can find, more easily than a locked-down TV.
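A quick back-of-the-envelope calculation shows why HDMI 1.4 stops at 30 fps for UHD. This is a sketch assuming the standard CEA-861 4K timing (4400×2250 total pixels per frame including blanking) and HDMI 1.4's 340 MHz maximum TMDS/pixel clock, both published figures:

```python
# Why HDMI 1.4 tops out at 30 fps for UHD: the required pixel clock.
# Assumes CEA-861 4K timing (4400x2250 total pixels incl. blanking)
# and HDMI 1.4's 340 MHz TMDS clock ceiling.

TOTAL_H, TOTAL_V = 4400, 2250      # active 3840x2160 plus blanking intervals
HDMI14_MAX_CLOCK = 340e6           # Hz, HDMI 1.4 limit

def pixel_clock(fps):
    """Pixel clock (Hz) needed to scan the full frame fps times per second."""
    return TOTAL_H * TOTAL_V * fps

print(f"UHD @ 30 fps needs {pixel_clock(30) / 1e6:.0f} MHz")  # 297 MHz -> fits
print(f"UHD @ 60 fps needs {pixel_clock(60) / 1e6:.0f} MHz")  # 594 MHz -> over the limit
```

So 4K at 30 fps squeezes in at 297 MHz, while 60 fps would need 594 MHz, which is why UHD over HDMI 1.4 is capped at 30 fps in the first place.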
AMD's Trinity was the first integrated CPU/GPU to support native UHD / 4K resolution, although the mobile Trinity I tested had quite a few frame rate problems when playing, say, the Hobbit 4K trailer or the LG UHD demo. The same goes for Ivy Bridge, which gained 4K support in its latest driver incarnation. Haswell's more powerful video engine, however, promised to be the first to offer flawless 30 fps at UHD right out of the processor chip, a first considering that, up to now, you needed at least a midrange AMD or NVIDIA GPU for the job.
I took some time to test Haswell's UHD video handling on a configuration based on the i7-4770K and Intel's Z87 mainboard, running at default CPU clocks and a 1.3 GHz GPU clock, with three different memory speeds from a Kingston XMP DDR3-2400 2×4 GB kit. The 50 inch Skyworth UHD TV was the monitor, over a single HDMI 1.4a cable.
I ran four 4K videos under both Windows Media Player and Total Media Theatre, the latter optimized for Haswell. The videos were the Hobbit 4K trailer, the Lupe movie and the LG UHD TV demo, all publicly downloadable on YouTube, plus Intel's own 4K test MP4 video. The Hobbit video wasn't exactly sharper than the usual FullHD Hobbit video, but the LG one and Lupe were clearly true 4K videos with stunning sharpness and detail. The Intel one seemed to be upscaled FullHD as well, judging by the edge sharpness, but I could be wrong.
And yes, the person driving that car in the video looks very much like Intel's own Francois Piednoel, by the way… so he isn't benchmarking just the CPUs, but also cars on the road to the Santa Cruz Mountains, it seems.
At the default DDR3-1600 memory setting, playing the videos in Windows Media Player was OK, but the Hobbit and Lupe videos did have some jerking and obvious dropped frames. Total Media Theatre played everything fine, except for one case of obvious dropped frames in the Hobbit clip. Once I set the memory to the DDR3-2133 XMP settings, with no further manual optimizations, both players handled all the video files fine, although subjectively Total Media Theatre felt somewhat smoother. With the memory at DDR3-2400 XMP, again at BIOS defaults without further tweaks, both played everything smoothly enough.
The outcome? Haswell definitely can be a good 4K movie playback device for your 4K TV, but, just in case, you may want to up the memory speed a bit, as running the frame buffer out of main memory, at this resolution and refresh rate, does have its price. Remember, in this case, every notch up in memory speed was essentially providing extra frame buffer bandwidth for the ultra hi-res TV refresh… Now how about a miniITX Haswell 4K home theatre box?
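To put the memory-speed sensitivity in rough perspective, here is a sketch of what UHD scan-out alone pulls from main memory versus theoretical dual-channel DDR3 peak bandwidth. The assumptions (32-bit pixels, 30 Hz refresh, 8-byte channels at ideal efficiency) are mine, and actual controller efficiency will be lower:

```python
# Rough estimate: frame-buffer scan-out bandwidth at UHD vs. theoretical
# dual-channel DDR3 peak. Assumptions: 32-bit pixels, 30 Hz refresh,
# two 8-byte channels at ideal efficiency -- a sketch, not a measurement.

WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4                 # 32-bit frame buffer
REFRESH_HZ = 30

def scanout_bandwidth():
    """Bytes/s just to read the frame buffer once per display refresh."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * REFRESH_HZ

def ddr3_peak(mt_per_s, channels=2, bytes_per_channel=8):
    """Theoretical peak bytes/s for dual-channel DDR3 at a given transfer rate."""
    return mt_per_s * 1e6 * channels * bytes_per_channel

scan = scanout_bandwidth()          # ~1.0 GB/s for bare UHD @ 30 Hz refresh
for speed in (1600, 2133, 2400):
    peak = ddr3_peak(speed)
    print(f"DDR3-{speed}: {peak / 1e9:.1f} GB/s peak, "
          f"scan-out alone takes {100 * scan / peak:.1f}%")
```

Bare scan-out is only a few percent of the theoretical peak, but video decode, composition, and CPU traffic all share the same channels, which would explain why the extra headroom from DDR3-2133 and up made the dropped frames go away.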