The typical VGA signal from a video card to the monitor is an analog signal, ranging from 0 V for zero brightness to 0.7 V for full brightness, for each of the three colors: red, green, and blue. If the voltages produced by the video card are too high, the lighter colors get washed out to pure white. If there is no way to adjust the analog signal levels (often there is not), then there is no way to see the lighter colors that are just slightly darker than pure white. Here is a test image to check whether your computer and monitor can currently display subtle differences in colors near white. Click on it to see it full size. If you look at the image from an angle on an LCD monitor, it should clearly show horizontal bands that get progressively darker down the screen.
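A banded near-white test image like the one described can be generated programmatically. Here is a minimal sketch that writes a plain-text PGM grayscale file; the image size, band count, and brightness step are illustrative choices, not the values used in the original image.

```python
# Sketch: generate a near-white banded test image as an ASCII PGM (P2) file.
# Dimensions, band count, and step size are assumed values for illustration.

def band_rows(width=256, height=256, bands=16, step=4):
    """Rows of 0-255 gray values: horizontal bands stepping down from white."""
    rows = []
    for y in range(height):
        band = y * bands // height          # which band this row falls in
        rows.append([255 - band * step] * width)
    return rows

def write_pgm(path, rows):
    """Write rows of gray values in the plain (P2) PGM format."""
    h, w = len(rows), len(rows[0])
    with open(path, "w") as f:
        f.write(f"P2\n{w} {h}\n255\n")
        for row in rows:
            f.write(" ".join(map(str, row)) + "\n")

rows = band_rows()
write_pgm("near_white_bands.pgm", rows)
```

On a correctly adjusted system, every band should be distinguishable from pure white; on one with clipped signal levels, the topmost bands merge into a single white region.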
In order to reduce the analog signals so that the dynamic range is restored, an attenuator can be placed in the video cable path. This is simply a device that reduces the signals so that the maximum voltage is 0.7 V. A simple passive attenuator can be made with three resistors and a male and a female VGA connector. The resistors should have as high a resistance as possible while still allowing the lightest shades of gray to be discernible; on my computer I had to use 470-ohm resistors. There is no fear of overloading the circuit, because it is a low-impedance circuit relative to the resistor values: typical output and input impedances are in the 50-to-75-ohm range, so the 470-ohm resistors will not load it significantly.
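The effect of each series resistor can be estimated as a voltage divider formed with the monitor's input termination. This sketch uses a 75-ohm termination (a typical VGA input impedance) as an assumed value and ignores the video card's own output impedance, so it is a rough model rather than an exact figure:

```python
# Sketch: attenuation of a series resistor driving a terminated VGA input,
# modeled as a simple voltage divider. The 75-ohm termination is a typical
# value, assumed here; the source impedance is ignored for simplicity.

def divider_ratio(r_series, z_load=75.0):
    """Fraction of the source voltage that reaches the terminated input."""
    return z_load / (r_series + z_load)

ratio = divider_ratio(470)       # the 470-ohm resistors used above
peak = 0.7 * ratio               # resulting peak if the source swings 0.7 V
```

Larger resistor values give stronger attenuation, which is why the text suggests using the highest value that still leaves the lightest grays visible.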
Here is the schematic and pinout. The device should be as small as possible and needs to be shielded with a full metal case to prevent RF and magnetic interference. Because an attenuator introduces an impedance change, there may be some reflection artifacts, but they should be minimal and may be reduced by placing the attenuator at different distances from the video card along the cable path. I found it satisfactory, however, to simply install it at the back of the computer where the VGA cable connects.
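The impedance-change remark can be quantified with the standard reflection coefficient for a transmission line, Γ = (Z − Z0) / (Z + Z0). The sketch below assumes a nominal 75-ohm cable impedance and models the impedance looking into the attenuator as the series resistor in front of a matched 75-ohm line; both are illustrative assumptions.

```python
# Sketch: reflection coefficient at an impedance discontinuity in the cable.
# Z0 = 75 ohms is the nominal VGA cable impedance (assumed). The impedance
# seen at the attenuator is modeled as the series resistor plus the matched
# 75-ohm line beyond it.

def reflection_coefficient(z_seen, z0=75.0):
    """Gamma = (Z - Z0) / (Z + Z0): fraction of wave amplitude reflected."""
    return (z_seen - z0) / (z_seen + z0)

gamma = reflection_coefficient(470 + 75)   # 470-ohm resistor + 75-ohm line
```

A nonzero Γ means part of each signal edge bounces back toward the source, which is the origin of the faint ghosting or ringing artifacts mentioned above; moving the attenuator changes where along the cable those reflections occur.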