This is one of the fundamentals of graphic design and a very important part of maintaining a great relationship with your printer: resolution.
Resolution is the degree of sharpness of a computer-generated image, as measured by the number of dots per linear inch in a hard-copy printout or the number of pixels across and down on a display screen. That's the textbook definition. Basically, resolution means how sharp or blurry your picture is. If you have a high-resolution picture, it will be sharp and clear. If you have a low-resolution picture, it will be blurrier and pixelated. In order to understand how resolution works, we have to define what makes up a picture.

Dots and pixels

Pictures are made up of dots or pixels. A pixel is a small dot or square of color. Every single picture that you see on your computer is made up of pixels. The more pixels a picture has, the sharper and clearer the image. The opposite is also true: the fewer pixels, the fuzzier the image.

We said before that resolution is based on dots per inch. Since dots (or pixels) make up the image itself, resolution is measured in dpi, or dots per inch (ppi, or pixels per inch, means the same thing). If you take 1 inch of space, any number of pixels can be lined up within that inch. Imagine a picture of a flower. One inch of that flower can contain a petal, a leaf and many different shadows and highlights. If within that 1 inch there are only five dots, you can imagine that five dots will not show the depth of the petals and leaves. If within that 1 inch there were 300 dots, that would be enough to show each variation of color and shadow in the picture. That is why a picture with a higher number of dots or pixels per inch will be sharper than a picture with fewer.

Print vs. web resolution

The standard resolution for images on the Internet is 72 dots or pixels per inch. When web designers create websites or anything that is primarily to be used on the Internet, they must save their images for the Internet. This usually means taking the resolution down to 72 dpi.
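The dots-in-an-inch idea can be put into numbers. Here is a small illustrative sketch in Python; the function name and values are made up for this example and are not part of any design tool:

```python
def pixels_in_image(width_in, height_in, dpi):
    """Total pixels in an image of a given physical size at a given resolution."""
    return round(width_in * dpi) * round(height_in * dpi)

# One square inch of the flower picture at two different resolutions:
coarse = pixels_in_image(1, 1, 5)    # 5 dpi: only 25 dots, no petal detail survives
sharp = pixels_in_image(1, 1, 300)   # 300 dpi: 90,000 dots, enough for shading
print(coarse, sharp)  # 25 90000
```

The jump from 25 to 90,000 samples in the same square inch is why the higher-resolution picture can capture every variation of color and shadow.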
Saving at 72 dpi makes the image appear at the height and width, in inches, that the designer wants others to see on their computer screen. On the other hand, the correct resolution for print is 300 dpi. When you print an image, more dots or pixels are required to make the image clear and sharp enough for a printer to replicate it. Now we are starting to see why images on the Internet will not print out at the same height and width that they appear on your computer screen.

If you have an image that is 4 inches high by 6 inches wide on your computer screen, it is 72 pixels or dots per inch. Let's do the math: 6 x 72 = 432 and 4 x 72 = 288, so this image is 432 x 288, or 124,416 pixels total. In order for that same picture to print out at 4 inches high by 6 inches wide, let's see how many pixels it would have to be. Remember that in print, the standard resolution is 300 dpi: 6 x 300 = 1,800 and 4 x 300 = 1,200, so the image would have to be 1,800 x 1,200, or 2.16 million pixels total, to come out at full size. As you can see, 124,416 and 2.16 million are two very different numbers. The computer image that shows as 4 inches by 6 inches on the screen is going to print out at about 1 inch by 1.5 inches on paper (288 / 300 = 0.96 inches and 432 / 300 = 1.44 inches). Do not take pictures off of the Internet without realizing this!

Bitmap image vs. vector image

Lastly, because we're talking about resolution and enlarging images, we need to talk about the difference between a bitmap image and a vector image. A bitmap (or raster) image is an image made up of pixels, like we've been talking about: a method of storing information that maps the image pixel by pixel. Photographs are bitmap images; anything scanned into your computer is a bitmap image. A vector image is an image created from mathematical equations: a graphic made up of mathematically defined curves and line segments called vectors. If you see a picture of a logo or drawing that looks like it was created on a computer, such as clip art,
it was probably made with a vector program. When a bitmap image is made bigger or smaller, the dots or pixels per inch grow or shrink, and if the image is stretched too much, it can "pixelate" or distort and lose quality. When a vector image is made bigger or smaller, a series of equations calculates how the image should look at the new size and adjusts it accordingly. There is no quality loss either way, because there are no pixels: as long as the program can handle it, the equations can scale the image without distortion forever.

What does this mean to you? If you have access to a vector program (such as Adobe Illustrator), you can easily blow your images up to larger sizes and lose absolutely no quality. But with a bitmap or raster program, you'll lose quality if you stretch the picture. That's why turning a vector image into a bitmap image is called "rasterizing" the image. It is also important to understand that although a vector image can be turned into a bitmap image, a bitmap image cannot be turned into a vector image and handled the way an original vector image would be.
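The bitmap-versus-vector contrast can be sketched as a toy example in Python. Everything here is hypothetical and hugely simplified (real programs like Illustrator and Photoshop do far more), but it shows why one approach runs out of detail and the other never does:

```python
# Raster: a tiny 2x2 "image". Enlarging can only repeat the existing pixels
# (nearest-neighbor scaling here) -- no new detail appears, so edges get blocky.
bitmap = [[0, 255],
          [255, 0]]

def enlarge_bitmap(img, factor):
    """Scale a pixel grid up by an integer factor, duplicating pixels."""
    return [[img[r // factor][c // factor]
             for c in range(len(img[0]) * factor)]
            for r in range(len(img) * factor)]

big = enlarge_bitmap(bitmap, 2)  # 4x4 grid, each original pixel repeated 4 times

# Vector: the shape is stored as math (here, "is this point inside a circle?"),
# so it can be redrawn crisply at any output size.
def in_circle(x, y):
    return x * x + y * y <= 1.0

def rasterize_circle(size):
    """Render the same mathematical circle at any pixel size."""
    return [[255 if in_circle((2 * c + 1) / size - 1,
                              (2 * r + 1) / size - 1) else 0
             for c in range(size)]
            for r in range(size)]

small = rasterize_circle(8)    # coarse render
large = rasterize_circle(800)  # same equation, smooth edges at 800 x 800
```

Enlarging the bitmap just copies pixels, while the circle equation can be "rasterized" at 8 pixels or 800 with no stored detail lost.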
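Finally, the screen-versus-print arithmetic from the 4-inch by 6-inch example can be checked in a few lines. This is a hypothetical Python sketch; the function names are invented for illustration, and the 72 and 300 dpi figures are the standard values discussed above:

```python
SCREEN_DPI = 72   # standard web/screen resolution
PRINT_DPI = 300   # standard print resolution

def pixel_size(width_in, height_in, dpi):
    """Pixel dimensions of an image with a given physical size and resolution."""
    return round(width_in * dpi), round(height_in * dpi)

def printed_size(width_px, height_px, dpi=PRINT_DPI):
    """Physical size in inches when printed at the given dpi."""
    return width_px / dpi, height_px / dpi

# The 6 x 4 inch image as it appears on screen:
w_px, h_px = pixel_size(6, 4, SCREEN_DPI)
print(w_px, h_px, w_px * h_px)    # 432 288 124416

# Pixels needed to PRINT at a full 6 x 4 inches:
pw_px, ph_px = pixel_size(6, 4, PRINT_DPI)
print(pw_px * ph_px)              # 2160000 (2.16 million)

# And the screen image's actual size on paper:
print(printed_size(w_px, h_px))   # (1.44, 0.96) -- about 1.5 x 1 inch
```

The last line is the whole story in one number: a picture that fills 6 x 4 inches of screen shrinks to roughly 1.5 x 1 inch the moment a printer demands 300 dots per inch.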