Friday, February 22, 2013

How Is Image Quality Measured On A Digital Camera

Digital cameras began as an interesting but severely compromised alternative to traditional film. The limitations of early digital hardware meant the cameras were valued mainly for their convenience and easily uploaded images rather than for image quality. A lot has changed since then. Digital cameras are now at the forefront of the market, in everything from ultraportables to full-sized single-lens reflex cameras (SLRs), and their integration into cell phones and other electronics is nearly universal. That said, image quality still varies greatly from camera to camera.


Pixel Count


While not necessarily the most important measure of image quality, pixel count is the main benchmark many camera companies use to compare their cameras against others. A pixel is the smallest unit of the image, typically a small square that gathers a single piece of color data. Pixel counts for cameras are typically given in megapixels; one megapixel equals one million pixels. Thus a digital camera with a sensor 2000 pixels tall and 3000 pixels wide would be a six megapixel camera.
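To make the arithmetic concrete, here is a minimal Python sketch of that calculation; the sensor dimensions are the hypothetical 2000 x 3000 figures from the example above, not any real camera's.

# Back-of-the-envelope megapixel math for a hypothetical sensor.
sensor_height_px = 2000   # pixels tall
sensor_width_px = 3000    # pixels wide

total_pixels = sensor_height_px * sensor_width_px
megapixels = total_pixels / 1_000_000

print(f"{total_pixels:,} pixels = {megapixels:.0f} megapixels")
# prints: 6,000,000 pixels = 6 megapixels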


Lenses


Less tangible, but equally important to image quality, is the type of lens used in the camera. This is why a digital SLR, with its large and precisely engineered lens, takes higher quality photos than a smaller camera with the same megapixel count. Researching individual lens brand names will give you a good sense of the quality of the glass in your new camera.


Sensor


While megapixel counts matter, they are constrained by the ability of the camera's sensor to read the light accurately and convert it into data. There are two common types of sensor: CCD and CMOS. A charge-coupled device (CCD) converts incoming light into an electrical charge, which is then read out, digitized, and stored on the memory card. Complementary metal-oxide-semiconductor (CMOS) sensors combine light detection with image processing on the same chip, performing multiple functions in a single circuit, which generally makes them faster and more power-efficient than CCDs.
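Both sensor types ultimately feed an analog-to-digital converter that turns each pixel's light reading into a stored number. The short Python sketch below illustrates that quantization step; the 12-bit depth and the full-scale voltage are illustrative assumptions, not any particular camera's specification.

# Sketch of the analog-to-digital step that turns a pixel's light
# reading into a stored number. The bit depth and full-scale voltage
# are assumed values for illustration only.
BIT_DEPTH = 12                  # many cameras digitize 12 or 14 bits per pixel
LEVELS = 2 ** BIT_DEPTH         # 4096 distinct brightness levels
FULL_SCALE_VOLTS = 1.0          # hypothetical maximum pixel output

def quantize(voltage: float) -> int:
    """Map an analog pixel voltage to an integer brightness level."""
    voltage = min(max(voltage, 0.0), FULL_SCALE_VOLTS)   # clamp to the valid range
    return min(int(voltage / FULL_SCALE_VOLTS * LEVELS), LEVELS - 1)

print(quantize(0.5))   # 2048 -- a mid-gray pixel
print(quantize(1.0))   # 4095 -- a fully saturated pixel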


Color Interpretation


While the overall resolution of an image is constrained by the megapixel count and the sensor, it is a camera's ability to interpret color that gives a picture depth and character. This can be achieved in several different ways. One of the most prevalent is the Bayer filter, which overlays the light-sensing pixels with a mosaic of red, green, and blue filters, with twice as many green elements to match the eye's greater sensitivity to green. Because each pixel is then dedicated to measuring a single color channel, the other two channels must be interpolated from neighboring pixels, which limits the final image quality by reducing the effective resolution of the sensor. More expensive cameras sometimes use three separate image sensors instead, each one capturing a different primary color. This method of capturing images allows for greater color fidelity and the full use of the camera's resolution.
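The Python sketch below shows why a Bayer mosaic costs resolution: each pixel records only one of the three color values, and the other two must be interpolated ("demosaiced") afterward. The RGGB layout is the common Bayer arrangement; the scene data is random and purely illustrative.

import numpy as np

h, w = 4, 4
rng = np.random.default_rng(0)
scene = rng.integers(0, 256, size=(h, w, 3))   # "true" RGB scene, 8 bits per channel

# Bayer channel index per pixel: 0=R, 1=G, 2=B, tiled in the RGGB pattern.
# Note that half the pixels are green, matching the eye's sensitivity.
bayer_channel = np.tile([[0, 1], [1, 2]], (h // 2, w // 2))

# The sensor keeps only one channel's value at each pixel.
raw = np.take_along_axis(scene, bayer_channel[..., None], axis=2)[..., 0]

print("Channel captured at each pixel (0=R, 1=G, 2=B):")
print(bayer_channel)
print("Raw mosaic the sensor actually records:")
print(raw)

A real camera's demosaicing algorithm then estimates the two missing channels at every pixel from its neighbors, which is exactly the interpolation step that costs some effective resolution.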


Raw Images


A camera that can shoot in a raw image format can produce higher quality photographs. In most cameras, an onboard processor compresses each image (typically to JPEG) into a format better suited to storage on the memory card, which allows a great number of images to fit even on a small card. Raw formats, typically available on more expensive cameras, write the sensor's full data set to the card without any compression performed inside the camera body. Each image then takes up a great deal of storage space, but it can be manipulated later without any of the artifacts and limitations of compression.
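Some rough storage arithmetic makes the trade-off clear. The figures below (a 24-megapixel sensor, 14 bits per pixel, roughly 10:1 JPEG compression) are illustrative assumptions, not measurements of any specific camera.

# Rough, illustrative file-size comparison; all figures are assumptions.
megapixels = 24
raw_bits_per_pixel = 14        # a common raw bit depth
jpeg_compression_ratio = 10    # rough; varies with scene and quality setting

raw_bytes = megapixels * 1_000_000 * raw_bits_per_pixel / 8
jpeg_bytes = megapixels * 1_000_000 * 24 / 8 / jpeg_compression_ratio   # from 24-bit RGB

print(f"Uncompressed raw: ~{raw_bytes / 1e6:.0f} MB")   # ~42 MB
print(f"Typical JPEG:     ~{jpeg_bytes / 1e6:.0f} MB")  # ~7 MB

Even with these rough numbers, the difference shows why raw shooting calls for larger and faster memory cards.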

Tags: image quality, Bayer filter, camera sensor, color data, digital camera