One of the most important concepts to understand when working with digital images is the difference between the resolution at which an image is captured or created and the resolution at which it will be displayed or used. The resolution of the images you use is critical to their quality and to their suitability for your purposes.
Resolution can be a difficult topic to grasp because the same term is used in different contexts. To understand the resolution you are working with, you need to understand the context you are in (for example, working on the desktop versus working with a printer). Differences in resolution among devices (scanners, cameras, printers, monitors, and even different operating systems) greatly impact how specific images look on those devices.
Basically, resolution is a measure of the number of dots a device (a monitor, an image-capturing device, or an output device) can work with. On a computer monitor, scanner, digital camera, or other digital imaging device, these dots are called picture elements, or more commonly, pixels. On printers and other output devices, dots are called, well, dots.
The confusing part of this is that a pixel has no defined physical size. A pixel on one device can appear larger or smaller than a pixel on another device. This is because every device works with its own range of pixel counts that it can display. Some devices display many pixels in a given area, whereas others display fewer. And almost all devices enable you to change the number of pixels that they use (up to some maximum number of pixels, that is, the maximum resolution of the device).
For a monitor or image-capturing device, resolution is measured by the total number of pixels that can be displayed or captured. This is indicated by the number of pixels that can be displayed or captured in both the horizontal direction and the vertical direction. For example, a common monitor resolution is 800 pixels in the horizontal direction by 600 pixels in the vertical direction; in shorthand, this is written as 800x600.
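The total pixel count of a display mode follows directly from this horizontal-by-vertical notation. A purely illustrative Python sketch, using the display modes mentioned in this section:

```python
# Illustrative arithmetic: a display mode's total pixel count is
# simply its horizontal pixel count times its vertical pixel count.
resolutions = [(800, 600), (1024, 768), (1280, 960)]

for width, height in resolutions:
    print(f"{width}x{height} = {width * height:,} pixels total")
# 800x600 works out to 480,000 pixels; 1280x960 to 1,228,800.
```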
Understanding that many different resolutions are in use is important if you prepare images that will be viewed by other people. For example, if you design a graphic assuming that your audience has a 21-inch monitor set to 1,280x960, users who have a 15-inch monitor set to 800x600 won't be able to see all of your image at its full magnification, because your image is larger than their display can show.
The resolution at which certain types of images are captured also greatly affects how well those images can be scaled or resized onscreen. The resolution at which a bitmap image is captured limits how much that image can be enlarged onscreen and still look sharp. For example, if you capture a screenshot (which has a low resolution of 72 dpi), in some cases that image will look sharp at the size at which it was captured, but when you magnify it, it will become pixelated, meaning that its edges take on the blocky appearance you have no doubt seen. This happens because the image contains only so much information (a total number of pixels). When the Mac attempts to add information (pixels) to it, it can't fill in the new pixels smoothly, so the image begins to look blocky. However, the same image will look fine when it is made smaller (the Mac doesn't have to make up and add information; it merely has to discard some of the pixels in the image).
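You can see why enlargement looks blocky with a tiny sketch of nearest-neighbor scaling, the simplest way a computer can enlarge a bitmap (the function name and the toy 2x2 "image" are illustrative, not any real imaging API). Each original pixel is simply duplicated, so no new detail appears:

```python
def upscale_nearest(image, factor):
    """Enlarge a 2D grid of pixel values by an integer factor.

    Every source pixel becomes a factor-by-factor block of identical
    values -- the computer adds pixels but no new information.
    """
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in image
        for _ in range(factor)
    ]

original = [[1, 2],
            [3, 4]]
enlarged = upscale_nearest(original, 2)
# Each source pixel now covers a 2x2 block of the same value:
# [[1, 1, 2, 2],
#  [1, 1, 2, 2],
#  [3, 3, 4, 4],
#  [3, 3, 4, 4]]
```

Those uniform blocks are exactly the "blocky edges" you see when a low-resolution screenshot is magnified.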
The last general point you need to understand about resolution is that every device has a maximum resolution at which it can work. For example, a 21-inch monitor might be able to display resolutions up to 1,280x1,024, but a smaller monitor might be able to display only up to 1,024x768. Similarly, one digital camera might be able to capture images only at resolutions up to 640x480 pixels. Another camera that can capture images at 1,024x768 will produce more detailed images because it can capture more information: more pixels.
Every graphic you create is developed at a specific resolution, which is independent of the monitor's resolution. The resolution of an image is sometimes measured in pixels per inch rather than as a horizontal-by-vertical pixel count, as a monitor's resolution is. In a way, this measurement is fictitious: you can resize an image by changing its resolution, or change an image's resolution by changing its size, without otherwise affecting the image's contents. It's best to think of image resolution as a measure of how much information the image contains: how detailed it is, or how many pixels are used to represent a small portion (an inch) of the image.
The higher an image's resolution, the more detail it contains and the better it will look, especially when it is enlarged. For example, an image created at 120 pixels per inch will look better when a user zooms in on part of it than one created at 72 pixels per inch, because it contains more information and thus more detail.
You can use Table 15.1 as a rough guide for the proper resolution to use for common imaging devices. The recommendations are for images at 100% size, and do not apply to images that have been scaled in a page layout program or by other means.
| Output Device | Recommended Resolution |
| --- | --- |
| Monitor (for Web pages and onscreen presentations) | 72 pixels/inch |
| Laser and inkjet printers | 100-150 pixels/inch |
| Imagesetters and direct-to-plate presses | 300 pixels/inch |
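The guideline figures above translate directly into required pixel dimensions for an image of a given physical size. A small sketch (the device keys and helper function are made up for illustration, not part of any real API):

```python
# Guideline resolutions from Table 15.1, in pixels per inch.
GUIDELINE_PPI = {
    "monitor": 72,        # Web pages and onscreen presentations
    "laser/inkjet": 150,  # upper end of the 100-150 range
    "imagesetter": 300,   # imagesetters and direct-to-plate presses
}

def pixels_needed(width_in, height_in, device):
    """Pixel dimensions required for an image used at 100% size."""
    ppi = GUIDELINE_PPI[device]
    return (round(width_in * ppi), round(height_in * ppi))

print(pixels_needed(4, 6, "monitor"))      # a 4x6-inch onscreen image: (288, 432)
print(pixels_needed(4, 6, "imagesetter"))  # the same image for press: (1200, 1800)
```

The press version needs roughly 17 times as many pixels as the onscreen version, which is why images prepared for print are so much larger than images prepared for the Web.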