Author: Preston St. Pierre
Looks are everything
To begin with, understand that all display calibration has one goal: appearing correct to your eye. An image — whether it’s a JPEG file or a live video feed — is a big matrix of pixel values. The job of the computer monitor is to map those pixels to glowing dots that look right to you. That means, for one thing, that if your monitor resides in a brightly lit neon lamp showroom, calibrating it will result in different settings than the same model used in an unlit basement.
That said, calibration isn’t completely individualistic either. Basically, a calibrated display should map an absolute black pixel to the blackest color that it can produce, an absolute white pixel to the whitest color that it can produce, and smoothly scale the shades in between. This task is complicated slightly by the fact that not everybody agrees on what color white is. It is further complicated by the fact that CRTs and LCDs don’t generate a linear increase in brightness from a linear increase in voltage, so some math is required to make them paint their signals to the screen correctly.
But calibration really begins before you even touch the monitor. Remember the neon-light showroom? However well you calibrate that monitor, it would look better if you improved the viewing conditions; red and blue ambient light bouncing off the glass interferes with your vision. If you have control over your computing environment, reducing screen glare and using white light bulbs will give you an improvement. I don’t generally recommend replacing your lights with special daylight-balanced bulbs, but doing so would help if you must color-match images on your display with other items, such as printouts.
One last word: calibration will not make a bad monitor look like a good one. Calibration is getting optimal performance from a piece of hardware. Finding the optimal hardware is a different issue, and some monitors are just better than others. Almost every monitor has built-in hardware brightness and contrast controls, and most have color temperature control as well. If your hardware doesn’t have these controls or they are malfunctioning, calibration won’t make up the difference.
Step One: Learning colors with Lord Kelvin
In order to have a good, objective definition of color, physicists quantify it by something called color temperature — the temperature to which you would have to heat a black-body radiator for it to glow in a particular shade.
The general standard for normal daylight is 6,500K (kelvin). Hopefully you have a hardware “color” control on your monitor. Most monitors have at least two presets, 5,000K and 9,300K, which translate respectively to mustard yellow and an irritating blue. If you’re lucky enough to have a 6,500K preset, that is the one you want. If you don’t, check your monitor’s manual and see what your options are. In most cases you can adjust a slider to any value in between the presets. Fire up your calculator and find the correct position. If you don’t have that kind of control, you can eyeball the color temperature setting by switching off all artificial lights in the room and comparing the display’s white to sunlight hitting a white piece of paper.
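If we assume the slider interpolates linearly between the two presets (check your manual; this is only a guess), a back-of-the-envelope calculation gives the position to aim for:

    # Hypothetical linear slider running from the 5,000K preset to the 9,300K preset:
    echo "scale=3; (6500 - 5000) / (9300 - 5000)" | bc
    # prints .348 -- 6,500K sits roughly 35% of the way up the slider's travel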
If you have no idea what the presets are or what the color control does and you have no outside windows, you could try the same with a white LED light, but beware that “white” LEDs are not standardized. But honestly, if your situation is that dire, your time is better spent writing angry letters to your monitor’s manufacturer.
Step Two: Black is black, white is white
Next we set the minimum and maximum values the monitor can produce to absolute black and absolute white, respectively. The front of your monitor has buttons or dials labeled Contrast and Brightness. Here’s the first of many potential surprises: these names are firmly entrenched in monitor and television manufacturing, but they are incorrect.
The control labeled Contrast actually controls the brightness of the display. The control labeled Brightness controls the black level. I will try to refer to the controls as “the Contrast control” and “the Brightness control” rather than just contrast and brightness, so as not to make this more confusing than it already is.
We are interested in getting the brightest white our display can produce, so set the Contrast control to 100% or whatever the maximum is labeled.
We are also interested in getting the darkest black our display can produce, so we will adjust that with the Brightness control. Turn off all room lighting, and you will see that the “black” produced by the monitor is actually glowing ever so slightly — compare a black image to the unlit glass at the edges of your display.
To set the correct black level for your display, download Norman Koren’s combined gamma/black-level chart and open it in an image viewer. Norman Koren’s site is the best resource for detailed information on monitor calibration, and he provides these charts free of charge. The left-hand column is a graduated gamma scale; the columns on the right are black bars of increasing luminance.
Lower the Brightness control on your display, then raise it bit by bit until you can just see black bar “A” against the background, level with the 2.2 mark on the gamma scale. When you have done this, you should be able to see black bar “B” much more clearly. If you can’t, don’t worry; you will come back to double-check this step after setting the gamma correction in the next step.
Step Three: Gray is gray. Sort of.
We have now affixed the black and white “endpoints” of the monitor’s tonal scale where they belong, so all that’s left is to fit a straight line between them. Unfortunately reality throws us another curve at this point — a power curve. The trouble is that a CRT does not produce a 50% luminance pixel when given a 50% voltage. The actual luminance is generated according to a power function, approximately
Luminance = C * value^gamma + B
Here C is the brightness as determined by the Contrast control, value is the pixel value normalized to the [0,1] interval, and B is the black level as set by the Brightness control. We already maximized C and minimized B. The gamma exponent most directly controls the shape of the resulting curve: what we want is a gamma that gives us a nice smooth gradient over the grays between black and white.
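To see how far off an uncorrected display is, consider a 50% gray pixel at a native gamma of 2.5 (a quick check with standard shell tools):

    # A 50% pixel at gamma 2.5 produces far less than 50% luminance:
    awk 'BEGIN { print 0.5 ^ 2.5 }'
    # prints 0.176777 -- only about 18% of full brightness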
On a modern CRT display, the native hardware gamma is approximately 2.5. This is a property of the hardware, and you cannot change it. The sRGB colorspace is an abstract color model designed for computer use. When doing their research, its creators determined that a gamma of 2.2 best matched the human eye’s response to light (Macs, for historical reasons, do not adhere to this standard, but you should do the right thing, regardless of Steve Jobs’ example). So to get a computer monitor to show your eye all the grays that it can see, you will have to apply a software gamma correction that results in an overall gamma of 2.2.
The correction works by applying an exponent of 1/correction to each pixel value before it reaches the tube, so the overall gamma becomes the hardware gamma divided by the correction factor. Mathematica tells me 2.5 divided by 2.2 is around 1.136. Under ideal circumstances, this is the gamma correction factor that we would set with our software. Open Koren’s combined gamma/black-level test image again and look at the gamma column, either squinting, blurring your eyes, or standing back from the screen. Find the point on the chart where the horizontal lines blend in perfectly with the gray background; the number on the scale is your system’s current gamma. If it’s 2.2, you are done.
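No Mathematica handy? The same arithmetic works in a terminal:

    # Correction factor = native gamma / target gamma
    echo "scale=3; 2.5 / 2.2" | bc
    # prints 1.136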
But it’s probably not. Open a terminal window so that you can see both it and the chart. Type the command xgamma -gamma 1.136. Your display should change ever so slightly. Look for the new gamma on the chart, and re-run the xgamma command, tweaking the value until you match 2.2 on the chart. These gamma charts work on a simple principle: at the right gamma, a 50% gray image will have the same average luminance as a test pattern composed of 50% black and 50% white pixels. Koren’s images are more complicated than the classic “box within a box” model because they are scaled to show you a range of possible gamma values.
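If you would rather not retype the command at every step, a small shell loop (a convenience sketch; pick your own candidate values and delay) lets you watch the chart while the values cycle:

    # Step through candidate corrections, pausing so you can read the chart
    for g in 1.10 1.136 1.17; do
        xgamma -gamma $g
        sleep 5
    done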
The xgamma command — a component of the standard XFree86/X.org X server distributions — takes a gamma correction value as its argument, which is why we started with 1.136. You can specify different gamma correction values for the red, green, and blue signals with the -rgamma, -ggamma, and -bgamma arguments respectively, precise to three decimals. To decide if you need to tweak these gamma correction values separately, look for aberrant coloration in neutral grays. Any will do, but Koren provides a nice multi-value chart on his Web site. This is better than attempting to use separate color “box within a box” images because the pure signals are very hard to read. My own monitor is balanced at Red 1.05, Green 1.15, Blue 1.136.
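As an example, the per-channel invocation using my own monitor’s values looks like this:

    # Per-channel corrections (these values are from my monitor; yours will differ)
    xgamma -rgamma 1.05 -ggamma 1.15 -bgamma 1.136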
That’s it. Your display is properly calibrated. I wish it were more complicated, but it isn’t.
Well, there is one thing left to do: automate the gamma-correction step. There are two ways to do this: have the xgamma command executed every time you log in, or set it system-wide in your XF86Config file.
On a system where you cannot edit XF86Config, put the proper xgamma command in a .login or .xsession file — but add the -quiet flag to suppress the output. If you use GNOME or KDE, let the desktop environment’s session manager handle this instead, as they may have complicated start-up code that overlooks .xsession and the like. I run GNOME, so I would simply launch gnome-session-properties and add my xgamma command to the Startup Programs tab.
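For the .xsession route, the line in question would look something like this (using my correction values):

    # ~/.xsession -- apply gamma correction silently at login
    xgamma -quiet -rgamma 1.05 -ggamma 1.15 -bgamma 1.136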
But since I have root access to my desktop machine, I add a gamma-correcting line to my /etc/X11/XF86Config file, so it is applied for everyone at system startup. It goes in the Monitor section:
Section "Monitor" Identifier "Monitor0" VendorName "Vendor0" ModelName "P96" DisplaySize 330 240 HorizSync 30.0 - 94.0 VertRefresh 48.0 - 120.0 Gamma 1.05 1.15 1.136 EndSection
Other options and a warning
Mac and PC users have a variety of available display-calibration programs, ranging from freeware to extremely expensive. But as we have seen, there isn’t a lot to the calibration process (profiling for a color-managed workflow is another story, and really a different task). Most of the calibration software out there relies on the same steps we have just gone through: showing you a test image and asking you to tweak a setting until you determine it is correct. But the test pattern displayed by one of these programs is no better than a test pattern from a calibration Web site.
There are a handful of semi-useful GUI tools for changing the X server’s gamma-correction: tkgamma, an aging Tcl/Tk program; KGamma, which is now built into KDE’s control center; and Nvidia’s nvidia-settings tool, which is good only for Nvidia graphics cards.
But I wouldn’t bother with any of them; they are all merely front-ends to the xgamma command, and they make it less useful. For one thing, the test images they provide are less helpful than Koren’s because they provide no marked scales. Furthermore, tkgamma and KGamma have large numerical steps hardcoded into their control “sliders,” so you have far less precision than when using xgamma directly. KGamma adds a further interface annoyance by putting the gamma-correction values in what appear to be editable text boxes without making them editable! And, obviously, if you don’t have an Nvidia card, that company’s tool is useless to you.
A word about over-correction: some high-end displays have a “Gamma” control built into the monitor’s on-screen controls. But remember: the monitor’s gamma is a physical property of the device, and it cannot be altered. What these controls do is apply a gamma-correction factor, just as we would do in software. But if you apply one gamma correction on the monitor’s controls and another in your X server, the two factors will multiply and overshoot the correct value.
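A quick sanity check makes the point. Suppose the monitor’s menu and the X server each apply the 1.136 correction from our example; the combined factor is 1.136 squared, and the overall gamma lands well short of the 2.2 target:

    # Two stacked 1.136 corrections drag the overall gamma below 2.2
    echo "scale=3; 2.5 / (1.136 * 1.136)" | bc
    # prints 1.937 -- visibly washed-out midtones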
I suspect that a high-end display doing its own gamma correction can be more precise than xgamma can, so if you have this option, try it and do no gamma correction with your X server. Or if you want to purchase a high-end display for me, I’d be happy to try it both ways and tell you which looks better.
The future’s so bright, I gotta adjust my gamma-correction
Sticking with this hardware-is-better-than-software theme, there are a few companies that make hardware display calibration tools; they look like three-legged spiders that stick to the surface of your monitor and feed info back through USB cables. To my knowledge, none of these devices is supported under Linux or other “alternative” operating systems. If I am wrong and you know of one, please contact me and let me know; I would very much like to see it. Manufacturers of these devices include LaCie, ColorVision and Eye-One Color, if you wish to make polite inquiries.
Despite what I said about today’s woeful crop of GUI calibration programs, I would really like to see a more full-fledged graphical tool for X users, so that all of the relevant information is in one easy-to-use place. As desktop Linux makes further inroads into professional-level graphics, there will be more people who need such software. There may be only a handful of applications today that support 16-bit-depth pixels and color management, but every day there are more. If you’re interested in undertaking such a project, send me an email and I’ll tell you everything I know. It won’t take long, promise.
In the meantime, today there are more than enough resources at your disposal to properly calibrate your display. For further study I recommend that you begin at Norman Koren’s Monitor Calibration and Gamma site; in addition to the calibration charts I have already talked about, he explains in great detail the process involved and why some things (like individual-color gamma charts) just don’t work, and he has good, up-to-date links to other sites.