One of the most difficult aspects of shopping for a projector is trying to compare specifications. Does Projector A's 5,000:1 contrast ratio actually look better than Projector B's 4,500:1? How much brighter is Projector C's 1,000 lumens compared with Projector D's 800? Let me tell you a little secret: These specs are largely meaningless.
In broad strokes, sure, a 3,000-lumen projector is going to be significantly brighter than a 500-lumen projector. But if you're comparing projectors with similar technologies and price ranges, in most cases you'll see specifications that are a lot closer to one another. And the bigger issue is that even with similar measurements for color, brightness and contrast, projectors can look different in person.
In my years of reviewing projectors I've learned to pay less attention to spec sheets and more attention to how a projector actually measures and looks in person. That's why I test every projector I review with objective and subjective methods using my own eyes, my own instruments and side-by-side comparisons. Here's how that works.
Warning: This info can get a bit "into the weeds," but hopefully it will give you an idea about the behind-the-scenes work that goes into my reviews.
Initial setup and picture modes
First-time setup is important for any TV or projector. The out-of-the-box settings almost never let the display look as good as it can. With projectors, the ability to tweak is especially crucial since there's a picture element the manufacturer can't control: the screen.
One of the first things I do after warming up a new projector is adjust the contrast and brightness using test patterns. Sometimes color, too, but that usually tends to be correct out of the box. I start in the movie or cinema picture mode, although with some projectors changing anything flips you automatically to the "user" mode. The color temperature is usually the most accurate in movie mode as well, but if the image is noticeably cool or warm, I'll adjust that too.
Once the projector is set up, I'll watch a variety of content to see if I notice any issues that I should further check with test patterns.
Measuring brightness, aka light output
A projector's brightness, generally measured in lumens, is one of the most important aspects of its overall performance. Unfortunately, as mentioned above, the specs claimed by a manufacturer are rarely remotely accurate.
The issue is how projectors create light. It's easier for a projector to be bright if its color temperature -- the color of white and gray -- is way off. If grays are actually bluish or greenish, the image is probably a lot brighter than in a mode where the colors are more accurate, such as movie mode or the medium or warm color temperature. You lose light with accurate colors, but in my opinion that's a worthy trade-off for better color overall.
If a projector is capable of some extreme light outputs but its colors look wonky, I'll note that. For comparison purposes, I measure and compare projectors in their most accurate modes.
When I say "measure," I'm not talking about a ruler. I use Minolta and Photo Research test equipment to objectively measure a projector's output. In this case, the Minolta LS-100 meter gives me the projector's luminance in candelas per square meter (cd/m2). Then, if you know the size and gain of the screen, you can do a bit of math to find the estimated lumens. Since I always use the same screen for testing, this is easy.
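That "bit of math" can be sketched out. This is a simplified illustration, not the exact formula or screen dimensions used in the reviews: for a diffuse screen, luminance (cd/m2) times pi, divided by screen gain, gives illuminance in lux, and illuminance times screen area gives lumens. All numbers below are made up for the example.

```python
import math

def estimated_lumens(luminance_cd_m2, screen_width_m, screen_height_m, gain=1.0):
    """Estimate projector light output (lumens) from a screen luminance reading.

    For a diffuse screen, luminance L (cd/m^2) relates to illuminance E (lux)
    by L = E * gain / pi, so E = L * pi / gain.
    Total lumens = illuminance (lux) * screen area (m^2).
    """
    illuminance_lux = luminance_cd_m2 * math.pi / gain
    area_m2 = screen_width_m * screen_height_m
    return illuminance_lux * area_m2

# Illustrative: a 50 cd/m^2 reading on a 2.66 m x 1.5 m, 1.0-gain screen
print(round(estimated_lumens(50, 2.66, 1.5)))  # about 627 lumens
```

Because the same screen is used for every test, the screen-dependent terms cancel out when comparing one projector's number against another's.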
The number listed in a projector's Geek Box is the brightest image the projector can produce given the settings and methodology listed above. This is almost always lower than a manufacturer's rating since their rating is usually in an extremely inaccurate, but brighter, mode. There's no regulatory body that oversees projector luminance claims. The American National Standards Institute has a standard for measuring luminance, but not every manufacturer complies with it.
Contrast ratio methodology
Just like with TVs, contrast ratio is easily the most important aspect of a projector's overall image quality. A projector with low contrast will look washed out, with grayish blacks and/or dimmer whites. Contrast can be challenging to measure correctly, so I'll explain my methodology and then why I do it that way, since it can seem convoluted at first.
Using the settings listed above and a Minolta LS-100 light meter, I measure a full black image and then a white window (100% white, but just in a small portion of the screen). I do this using whatever lamp and iris modes are available, though not with auto-iris or lamp-adjusting modes (more on those in a moment). Then, in whatever mode seems best, I measure again using an AEMC CA813 illuminance meter. I average all these measurements together for the overall contrast ratio.
This method is a slightly modified version of the one I learned at the Display Metrology Course at the National Institute of Standards and Technology, a course that is sadly no longer offered. The main issue with contrast ratio measurements is that small variations can drastically change the overall result. For instance, if I measure 0.002 cd/m2 instead of 0.001, the contrast ratio gets cut in half. What causes small changes like that? Reflections in the room, for instance: light bouncing from the ceiling or furniture, back off the screen, and then into the light meter. So for consistency I keep everything the same. The CA813, which measures the light directly from the lens, eliminates the room from the measurement and acts as a check against the luminance measurements from the LS-100.
There are other methods to improve accuracy, like subtracting the room's ambient light from the measurements. Most of these additional methods are extremely time intensive. In my testing, averaging multiple measurements and using the two types of meters, the final result ends up being extremely close. More importantly, it's internally consistent.
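To make that concrete, here's a minimal sketch of averaging multiple white and black readings into a contrast ratio, and of how sensitive the result is to tiny errors in the black measurement. The readings here are hypothetical, not data from any specific projector.

```python
def contrast_ratio(white_readings, black_readings):
    """Average several luminance readings (cd/m^2), then return white/black."""
    avg_white = sum(white_readings) / len(white_readings)
    avg_black = sum(black_readings) / len(black_readings)
    return avg_white / avg_black

# Hypothetical readings taken across different lamp and iris modes
whites = [105.2, 98.7, 101.4]
blacks = [0.021, 0.019, 0.020]
print(f"{contrast_ratio(whites, blacks):,.0f}:1")  # about 5,088:1

# Why small variations matter: the same white level measured over a
# 0.002 cd/m^2 black yields exactly half the contrast of 0.001 cd/m^2.
print(round(100 / 0.001))  # 100000
print(round(100 / 0.002))  # 50000
```

Averaging smooths out one-off anomalies in any single reading, which is part of what makes the final number internally consistent from review to review.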
That consistency is key since I want you to be able to compare the different projectors I've reviewed with as much accuracy as possible. As I've mentioned, you can't do this with any accuracy using manufacturers' specs alone. It'd be great if you were able to do this across different websites, too, but getting video reviewers to agree on a standard of reviewing is far beyond my purview. So I aim to be as internally consistent as possible.
The method listed above gets us the projector's "native" contrast ratio, which is what you see at any given moment on-screen. Many projectors can also adjust the contrast ratio dynamically, using an iris in the lens or an adjustable lamp that reacts to the incoming video signal, decreasing the projector's light output during darker scenes. The result is darker, better black levels at the expense of making the whole image, including bright areas, dimmer overall.
Done well, dynamic contrast can help improve the projector's overall image quality, but it's less helpful than native contrast ratio for comparisons. I still measure both, however.
Color and color temperature
Compared to brightness and contrast ratio, measuring color and color temperature is relatively easy. Using a Photo Research spectroradiometer, I measure the exact colors produced by the projector: how red is the red, how green is the green, and so on. This is more accurate, in my opinion, than just saying "well, the grass looks very grassy."
Beyond the red, green and blue primary colors -- and the cyan, magenta and yellow secondary colors -- the Portrait Displays Calman software also lets me test a variety of in-between colors and shades, to get a broader idea of how well the projector creates color.
For the most part, modern home theater projectors in their movie or cinema mode are able to produce fairly accurate colors. Portable projectors tend to be more of a mixed bag, usually in an attempt to squeeze out as much light as possible.
Objective measurements go a long way to telling me about a projector, but they have limits. Many projectors use the same internal components and could measure similarly, yet look different from each other in person. This can be due to a variety of factors, including specific settings chosen by the manufacturer, their video processing choices and more. That's why I side-by-side compare every projector I review for CNET with other, similar projectors.
To do this, I connect two or more projectors to an HDMI splitter, which takes a single HDMI source and splits it into multiple, identical signals. I then view the projectors side by side on my 12-foot-wide, 2.35:1, 1.0-gain screen. Depending on the projectors, this might be a full image, shrunk so each fits on one screen, or I might block off part of each projector's image so I can look at one "sliver" of each projector's image adjacent to the other.
I then watch a mix of content, but always a few key selections that I watch on everything. For years I used the opening of The Fifth Element on DVD (Aziz, LIGHT!), which should amuse anyone who remembers me from Home Theater Magazine. These days my go-to clip is Thor and Loki meeting with Odin on the cliffside in Thor: Ragnarok. There are lots of real and fantastical colors in this and the following scenes. I also like the test clips on the Spears & Munsil UHD HDR Benchmark. Because what would a video test be without slow-moving clips of nature?
Input lag for gaming
This is an easy test thanks to the Leo Bodnar Video Signal Input Lag Tester. This handy device measures how long it takes the projector to create an image, in milliseconds. This measurement matters most to gamers, since high input lag makes games feel sluggish and unresponsive.
What about calibration?
One thing you might notice is missing from my tests: calibrating the projector. Calibration is the process of fine-tuning the color and color temperature to get the projector looking as good as possible. It goes far beyond the simple user-menu setup and requires specialized gear. I certainly have that gear, as well as the know-how: I'm ISF trained and have been calibrating displays for more than 20 years.
While calibrating a display can definitely improve how it looks, its use in a review is limited. If I find out a $1,000 projector looks better if you spend $400 or more on a calibration, what value is that? I can't assume most people would be willing to spend that money. Also, the only things calibration can improve are color and color temperature. While those are definitely important factors in a display's overall performance, they're not nearly as important as brightness and contrast ratio.
Since the vast majority of people reading my reviews will never get their projectors calibrated, it's far more useful to judge projectors as you'll see them: out of the box, with the kind of setup you can do by eye or, ideally, with a setup disc.
That said, calibration can improve the image, and nearly every modern home theater projector can be calibrated. It's not going to make a $1,000 projector look like a $3,000 projector, however. If you want to eke out every drop of performance and accuracy from your projector, and you don't mind paying for it, it's worth considering.
Geek Box info
Most of what I learn via objective measurements ends up in the Geek Box at the end of the article. Here's a bit more info about some of those specific numbers:
Average grayscale error: The average color temperature across the grayscale range. Correct is 6500K.
Dark gray/bright gray error: How far off dark gray images (20% of maximum brightness) and bright gray images (70% of maximum) are. Correct is 6500K.
Average color error: a rating of how accurate/inaccurate colors are. Lower is better.
Average saturations error: a separate test in ColorFacts, how accurate/inaccurate different saturations of colors are. How pink is pink, basically. Lower is better.
Average color checker error: similar to above, just with specific colors. These are predominantly shades of beige and brown, similar to a variety of skin tones.
In sum, I measure and look at a lot of different aspects of picture quality to figure out which projectors perform best and why. If you're interested in finding out more, check out my advice on how to shop for a projector, or go straight to my lists of the best projectors.