Since the ASUS has a pair of HDMI inputs but there is effectively no 4K HDMI content available right now, the performance of the internal scaler is essential to know. To test it, I use an Oppo BDP-105 Blu-ray player and the Spears & Munsil HD Benchmark, Version 2. The Oppo has its own 4K scaler, so I can easily compare the two and see how the ASUS performs.

First off, the ASUS is poor when it comes to video processing. The common film and video cadences of 3:2 and 2:2 are not picked up and deinterlaced correctly: the wedge patterns are full of artifacts and never lock on. Strangely, the ASUS passed the scrolling-text (video over film) test even though it fails the wedges. It also does a poor job with diagonals, applying little if any filtering to them and producing lots of jaggies.
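To see what "locking on" to a cadence involves, here is a toy sketch (not ASUS's or anyone's actual algorithm; `find_32_cadence` and `repeat_flags` are illustrative names): in 3:2 pulldown, every fifth field is a repeat, so a deinterlacer can lock by finding the phase of that periodic repeat pattern.

```python
# Toy illustration of 3:2 cadence detection. In 3:2 pulldown, every
# fifth field duplicates the field two positions earlier, so the
# stream of "repeat" flags is periodic with period 5.

def find_32_cadence(repeat_flags, period=5):
    """repeat_flags[i] is True when field i duplicates field i-2.
    Returns the phase of a clean 3:2 cadence, or None if no lock."""
    n = len(repeat_flags)
    for phase in range(period):
        if all(repeat_flags[i] == (i % period == phase) for i in range(n)):
            return phase
    return None

# A clean 3:2 stream repeats a field at positions 2, 7, 12, ...
flags = [(i % 5 == 2) for i in range(20)]
print(find_32_cadence(flags))  # locks at phase 2
```

A real deinterlacer works from noisy field-difference metrics rather than clean boolean flags, which is exactly where the ASUS's wedge-pattern failures suggest it falls down.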

Spears & Munsil also has a 1080p scaling pattern to test 4K and higher-resolution devices. The ASUS scaler showed a bit more ringing than the Oppo, but the two were pretty comparable. This becomes very important for watching films or playing video games, because HDMI 1.4 limits a 4K signal to 30 Hz, so you'll need to send a 1080p signal to get a 60p frame rate. 24p films will be fine, but concerts, some TV shows, and some documentaries are 60i and would appear choppy if sent at 4K over HDMI.
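The 30 Hz ceiling falls straight out of the HDMI 1.4 bandwidth budget. A quick sanity check, using the standard CTA-861 total timings (active pixels plus blanking) and HDMI 1.4's 340 MHz maximum TMDS pixel clock:

```python
# Why HDMI 1.4 caps UHD at 30 Hz: the pixel clock for a mode is the
# total pixels per frame (active + blanking) times the refresh rate,
# and HDMI 1.4 tops out at a 340 MHz TMDS clock.

HDMI_14_MAX_CLOCK_MHZ = 340.0

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a mode with the given total timings."""
    return h_total * v_total * refresh_hz / 1e6

modes = {
    "1080p60": (2200, 1125, 60),
    "UHD 30p": (4400, 2250, 30),
    "UHD 60p": (4400, 2250, 60),
}

for name, (h, v, hz) in modes.items():
    clk = pixel_clock_mhz(h, v, hz)
    verdict = "fits" if clk <= HDMI_14_MAX_CLOCK_MHZ else "exceeds"
    print(f"{name}: {clk:.1f} MHz ({verdict} HDMI 1.4)")
```

UHD at 30 Hz needs 297 MHz and fits; at 60 Hz it needs 594 MHz and does not, which is why 60i/60p material has to go over the link at 1080p.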

Brightness and Contrast

In our preview of the PQ321Q, we looked at how it performed out of the box with the default settings. What we saw is that the PQ321Q can get really, really bright: cranked up to the maximum, I measured 408 cd/m2 of light from it. That is plenty no matter how bright an office environment you might work in. At the very bottom of the brightness setting you still get 57 cd/m2, which is low enough that the brightness won't overwhelm you if you are doing print work or something else in a darkened room.

White Level - XR Pro, Xrite i1D2 and XR i1DPro

The change to IGZO caused me to wonder how the black levels would behave on the ASUS. If current flows far more freely, would a slight bit of leakage lead to a higher black level? Or would the overall current be scaled down so that the contrast ratio remains constant?

I’m not certain what the reason is, but the black level of the PQ321Q is a bit higher than I’d like to see. It is 0.0756 cd/m2 at the lowest backlight setting and 0.5326 cd/m2 at the highest. Even with the massive light output of the ASUS, that is a bit high.

Black Level - XR Pro, Xrite i1D2 and XR i1DPro

Because of this higher black level, we see contrast ratios of 755:1 and 766:1 on the ASUS PQ321Q. These are decent, middle-of-the-pack numbers. I really like to see 1,000:1 or higher, especially when we are being asked to spend $3,500 on a display. Without another IGZO display or 4K display to compare the ASUS to, I can’t be certain whether the panel technology, the resolution, the backlighting system, or something else entirely is the cause. I just think we could see improvements in the black level and contrast ratio here.
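Contrast ratio is simply the white luminance divided by the black luminance at the same setting. Plugging in the measured values from this review reproduces the quoted figures to within measurement rounding:

```python
# Contrast ratio = white luminance / black luminance (both in cd/m2).

def contrast_ratio(white, black):
    return white / black

# Measured values at the minimum and maximum backlight settings:
print(f"{contrast_ratio(57.0, 0.0756):.0f}:1")    # minimum, ~755:1
print(f"{contrast_ratio(408.0, 0.5326):.0f}:1")   # maximum, ~766:1
```

This also shows why the black level matters so much at the top end: hitting 1,000:1 at 408 cd/m2 would require a black level of about 0.41 cd/m2 or lower.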

Contrast Ratio - XR Pro, Xrite i1D2 and XR i1DPro

Setup and Daily Use dE2000 Data, 200 cd/m2 Calibration

  • ninjaburger - Tuesday, July 23, 2013 - link

    I feel like this is an easy position to take with very few 4k TVs in the wild, very little content delivered in 4k, and, maybe most importantly, *even less* content being finished at 4k (as opposed to upscaled 2.5k or 3k).

    When you see native 4k content distributed at 4k on a good 4k display, you can see the difference at normal viewing distances.

    I don't think it's worth adopting until 2015 or 2016, but it will make a difference.
  • Hrel - Tuesday, July 23, 2013 - link

    I see no reason to upgrade until 1080p becomes 10800p. Kept the CRT for 30 years, inherited the damn thing. That's how long I intend to keep my 1080p TV, whether TV makers like it or not.
  • Sivar - Tuesday, July 23, 2013 - link

    30 years? I hope you don't have a Samsung TV.
  • althaz - Tuesday, July 23, 2013 - link

    It doesn't matter what brand the TV is; good TVs last up to about seven years. Cheaper TVs last even less time.
  • DanNeely - Tuesday, July 23, 2013 - link

    My parents' no-name 19" CRT TV lasted from the early '80s to ~2000; the no-name ~30" CRT TV they replaced it with was still working fine ~3 years ago when they got a used ~35-40" 720p LCD for free from someone else. I'm not quite sure how old that TV is; IIRC it was from shortly after prices in that size dropped enough to make them mass market.

    Maybe you just abuse your idiotboxes.
  • bigboxes - Wednesday, July 24, 2013 - link

    You must be trolling. My top-of-the-line Mitsubishi CRT started having issues in 2006, in year seven. I replaced it with an NEC LCD panel that I'm still using today. It could go at any time and I'd update to the latest technology. I'm picky about image quality and couldn't care less about screen thinness, but there are always options if you are looking for quality. I'm sure your 1080p TV won't make it 30 years. Of course, I don't believe your CRT made it 30 years without degradation issues. It's just not possible. Maybe you are just a cheap ass. At least man up about it. I want my 1080p TV to last at least ten years. Technology will have long passed it by at that time.
  • bigboxes - Wednesday, July 24, 2013 - link

    Of course, this is coming from a man who replaced his bedroom CRT TV after almost 25 years. Even so, the tube was much dimmer before the "green" stopped working, and the tuner had long given up the ghost. This TV had migrated from the living room, where it was the primary set, to the bedroom until it finally died. I miss it, but I'm not going to kid myself into believing that 1988 tech is the same as 2012. It's night and day.
  • cheinonen - Tuesday, July 23, 2013 - link

    I've expanded upon his chart, built a calculator, and written up some more about it for other situations, like a desktop LCD, here:

    http://referencehometheater.com/2013/commentary/im...

    Basically, your living room TV is the main place where you don't see a benefit from 4K. And I've seen all the 4K demos with actual 4K content in person. I did like how, at CES this year, companies got creative and arranged their booths so you could sometimes be only 5-6' away from a 4K set, as if most people would ever watch from that distance.
  • psuedonymous - Tuesday, July 23, 2013 - link

    That site sadly perpetuates (by inference) the old myth of 1 arcminute/pixel being the limit of human acuity. This is totally false. "Capability of the Human Visual System" (http://www.itcexperts.net/library/Capability%20of%... is a report from the AFRL that nicely summarises how we are nowhere even CLOSE to what could actually be called a 'retina' display.
  • patrickjchase - Tuesday, July 23, 2013 - link

    A lot of people seem to confuse cycles/deg with pixels/deg. The commonly accepted value for the practical limit of human visual acuity is 60 cycles/deg, or 1 cycle/min. The paper you posted supports this limit by the way: Section 3.1 states "When tested, the highest sinusoidal grating that can be resolved in normal viewing lies between 50 and 60 cy/deg..."

    To represent a 60 cycle/deg modulation you need an absolute minimum of 120 pixels/deg (Nyquist's sampling theorem). Assuming unassisted viewing and normal vision (not near-sighted) this leads to an overall resolution limit of 500 dpi or so.

    With that said, the limit of acuity is actually not as relevant to our subjective perception of "sharpness" as many believe. There have been several studies arguing that our subjective perception is largely driven by modulations on the order of 20 line pairs/degree (i.e. we consider a scene "sharp" if it has strong modulations in that range). Grinding through the math again we get an optimal resolution of 200-300 dpi, which is right about where the current crop of retina displays is clustered.
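    The Nyquist arithmetic in the comment above is easy to sketch (function names here are illustrative): double the cycles/degree to get pixels/degree, then divide by how many inches one degree of visual angle spans at the chosen viewing distance.

    ```python
    import math

    def pixels_per_degree(cycles_per_degree):
        # Nyquist: at least two pixels per cycle to represent the modulation.
        return 2 * cycles_per_degree

    def required_ppi(cycles_per_degree, viewing_distance_in):
        # One degree of visual angle spans 2*d*tan(0.5 deg) inches at distance d.
        inches_per_degree = 2 * viewing_distance_in * math.tan(math.radians(0.5))
        return pixels_per_degree(cycles_per_degree) / inches_per_degree

    for d in (12, 14, 18, 24):
        print(f'{d}": {required_ppi(60, d):.0f} ppi for 60 cy/deg')
    ```

    At a ~14" viewing distance this gives roughly 490 ppi for the 60 cy/deg acuity limit, in line with the ~500 dpi figure above, and substituting 20-30 cy/deg reproduces the 200-300 dpi "sharpness" range.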
