Image resolution is the single biggest factor in determining how good your images will look.
Therefore, everyone who produces, manipulates or prints digital images will find it helpful to have a basic grasp of what resolution is and how it works.
Technically, the concept of resolution applies to all images, but in the real world, it only matters with digital images. So, for the rest of this Q&A we’ll just be talking about digital images.
Resolution refers to the amount of data an image file contains. It is normally measured in pixels.
It absolutely is not a measure of the quality or clarity of an image. Anyone who tells you it is, even if they are a large national photographic chain or have a PhD, is wrong. It is perfectly possible to take a low quality, unclear image which has a high resolution. Just set your Carlos Fandango Superwide 248 Megapixel camera to its highest possible image resolution, focus on an object 150 feet away and then take a picture of a flower from 5mm away. The resulting image file will have millions upon millions of bytes of data in it, but it will all be useless.
Now, read those two paragraphs again. If you understand that there is a distinction between resolution (which is the amount of data) and quality (which is the amount of useful data), carry on.
The higher an image’s resolution, the more data it contains. It therefore follows that the higher an image’s resolution the greater the potential there is to capture detail and clarity within that image.
Read that paragraph again. If you understand that higher resolution images only have a greater potential to capture detail and clarity than low resolution ones do, carry on.
Resolution, therefore, is an effective measure of the ability to capture, or if you prefer, resolve, data. A high resolution image will, assuming all other conditions are equivalent, contain more data (and more useful data) than a low resolution one will.
A pixel is a single unit or block of colour. A megapixel is one million pixels. The abbreviation for megapixel is MP.
All digital images are composed of lots of individual pixels. The more pixels an image contains, the higher its resolution.
Q. Therefore, a 5 megapixel image contains 2 million more pixels than a 3 megapixel image, so it must be better, right?
Whoa, there pumpkin. You’re jumping the gun.
Pixels aren’t a numbers game. More isn’t, necessarily, better. You can’t say with any degree of precision that an image with more pixels will be better than one with fewer. All you can say is that the one with fewer pixels will, necessarily, contain less information.
As a broad generalisation, particularly for printing purposes, there is a relationship between the minimum number of pixels in an image and an “acceptable” level of quality. More about this in a mo.
For now, don’t over think it. Just set your camera to take pictures on its highest possible resolution setting (the one with the highest total number of pixels) and forget about it. Forever.
You don’t really need to know anything else. You can go back to the Hob-nobs, if you like… it’s about to get geeky.
Q. I understand that pixels = data = resolution; but I’m not sure I’ve got the whole megapixel thing clear. How do you calculate resolution?
OK. If you take the number of pixels an image is wide and multiply this by the number of pixels an image is tall, you get the total number of pixels in the whole image: and this is its resolution.
Conventionally, an image’s resolution in pixels is described with the horizontal number coming first. So, an image which is described as “2592px x 1944px” would be a landscape image which is 2,592 pixels wide and 1,944 pixels tall; and an image described as “200px x 320px” would be a portrait image which is 200 pixels wide and 320 pixels tall.
Our large image, therefore, has a resolution of 5,038,848 pixels (being 2,592 x 1,944) or approximately 5MP; whereas our small image only has a resolution of 64,000 pixels (being 200 x 320) or approximately 0.064MP. To put it another way, the large image has over 78 times more data in it, than the small one does!
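If you want to check the arithmetic, it can be sketched in a few lines of Python (the `megapixels` helper is my own name, not a standard function; the dimensions are the two examples above):

```python
def megapixels(width_px: int, height_px: int) -> float:
    """Total number of pixels in an image, expressed in millions (MP)."""
    return (width_px * height_px) / 1_000_000

large = megapixels(2592, 1944)  # the large image from the example
small = megapixels(200, 320)    # the small one

print(f"Large image: {large:.2f}MP")   # 5.04MP, i.e. roughly 5MP
print(f"Small image: {small:.3f}MP")   # 0.064MP
print(f"The large image holds {large / small:.0f}x more data")  # ~79x
```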
Because a pixel is the smallest single element of a digital image and can only ever be one colour at any one time, the number of pixels in an image dictates the maximum amount of detail an image can capture or resolve. Theoretically, therefore, our large image could potentially contain 78 times as much detail as the small one does.
Note that I said theoretically and potentially. However, the relationship between resolution and “detail” isn’t a straightforwardly proportional one and, therefore, in practice, it doesn’t quite work like this.
Q. I noticed that you’ve been talking about megapixels and MP. I thought digital files were measured in megabytes and MB, what gives?
These are two different things. MP is a measure of image resolution. MB is a measure of memory or disk space.
When an image is stored as a file in memory or on a disk, data about each pixel in the image, e.g. the colour information, requires its own amount of memory. This data (rather than the image resolution) is what gives rise to the file size.
For ordinary colour images (24 bit RGB), the amount of memory required for each pixel is always 3 bytes. That is, simply, how big 24 bit data is. Accordingly, an image of 2000 x 1500 pixels is 3MP and the amount of memory required by the image is 9MB (being 3 million pixels x 3 bytes per pixel).
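As a sketch (assuming plain 24-bit RGB with no alpha channel; the function name is mine), the sum is just pixels times three bytes:

```python
BYTES_PER_PIXEL = 3  # 24-bit RGB: one byte each for red, green and blue

def uncompressed_size_mb(width_px: int, height_px: int) -> float:
    """Memory needed to hold a 24-bit RGB image, in (decimal) megabytes."""
    return width_px * height_px * BYTES_PER_PIXEL / 1_000_000

print(uncompressed_size_mb(2000, 1500))  # 9.0 - our 3MP image needs 9MB
```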
However, it will only occupy that amount of space, when it is open in memory. When the image is not open in memory, i.e. when it is stored as a file on a disk, the amount of disk space used is dictated by the file format.
Q. So, does a higher resolution image always produce a bigger file?
Probably, but not necessarily.
Generally, the higher the resolution, the larger the resulting image file. However, file size is determined as much by file format as it is by the amount of data that is in it.
A typical 3 megapixel photo saved as a (compressed) *.JPG file will create a file of about 1MB. The same 3 megapixel photo saved as an uncompressed *.TIF file will create a file of about 9MB. The resolution of both files is the same, but the amount of disk space used up by the two files varies by a ratio of 9:1.
For this reason, it is mathematically possible for a low resolution image in a “memory expensive” file format, to use up more disk space than a higher resolution image in a “memory cheap” format.
Q. Yeah, but if you printed them both, the 9MB *.tif would print nicer because it’s a bigger file and it has more data in it, doesn’t it?
You’re right it is a bigger file, but no, it won’t produce a better print.
Trust me. It really doesn’t work like that. I know it seems like it should, but it doesn’t.
One image is a 3MP image in a 1MB file. The other is a 3MP image in a 9MB file. (Note: this illustrates the important distinction between megapixels (MP), the measure of resolution, and megabytes (MB), the measure of disk space.)
The two images contained in the files both have exactly the same resolution. They are the same image. The TIF file format just uses up more disk space to hold the data that makes up the image.
(Aside: if you’re absolutely, positively sure this isn’t right, read this blog post: a little wager)
Q. Right, got it. Higher resolution equals bigger file, probably… but bigger file size does not necessarily equal a higher resolution or better print?
Yep, bang on.
However, whilst the resolution of the image (i.e. the number of pixels in it) is fixed, the size of an image depends on how much physical space each individual pixel takes up.
This is obvious, if you think about it.
Imagine a very simple image made up of 64 pixels in an 8x8 grid. If each pixel was 1cm x 1cm, the size of the image would be 8cm x 8cm. If each pixel was the size of a square on a chessboard, the size of the image would be, erm, the size of a chessboard. If each pixel was 100m x 100m, the size of the image would be 800m x 800m. And you’d need a go in a hot air balloon to see it. But, crucially, each image would have the same resolution.
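The chessboard sum, sketched in Python (the helper name is mine):

```python
def physical_side(pixels_across: int, pixel_size: float) -> float:
    """Physical side length of a square image: the pixel count times
    the physical size of one pixel (in whatever unit pixel_size uses)."""
    return pixels_across * pixel_size

print(physical_side(8, 1))    # 8   - 1cm pixels give an 8cm x 8cm image
print(physical_side(8, 100))  # 800 - 100m pixels give an 800m x 800m image
```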
Read that paragraph again. If you understand that scale is distinct from resolution and that images of different sizes can have the same resolution, carry on.
The relationship between resolution and image size is, therefore, not fixed.
This is quite hard to wrap your head around. But the relationship between resolution and image size depends on how, and in what medium, the image is displayed.
Confusingly, the amount of space each pixel takes up is also, usually, described as “resolution”.
So, an image where each pixel is printed at 1/100 inch is described as having a resolution of 100 dpi and an image where each pixel is printed at 1/300 inch would be described as having a resolution of 300 dpi.
Q. Hang on, that's not what you said a moment ago. How can the term "resolution" have two different meanings?
This is because when you render an image, you have two different resolutions, and their interaction, to consider:
- The resolution of the image itself (“Image resolution”) and
- The resolution of the output medium for the image (“Output resolution”)
We’ve already discussed Image resolution. So, we’ll focus now on Output resolution.
“Output resolution” refers (somewhat inaccurately) to the ability of the output medium to resolve data.
Output medium is a rather clunky, but useful way, to describe any thing which produces the image you see. In practice, this is usually either a monitor or a printer.
However (and just to add a bit more complexity into the mix), computer monitors and printers resolve data in totally different ways. So, it’s best to realise at the outset that what is true for monitors is, by and large, not true for printers, and we’ll treat them separately.
Just to make things fun, we also have to deal with some confusing acronyms.
Just down there. On your right.
Printers print dots. The output resolution of a printer is, therefore, measured in DPI (dots per inch).
Monitors display pixels. The output resolution of a monitor should, therefore, be measured in PPI (pixels per inch).
However, the terms DPI and PPI are often used interchangeably (even though this is incorrect). Scanner and monitor adverts and manuals are particularly guilty of this. Strictly speaking, neither a scanner’s nor a monitor’s resolution can be measured in dpi. They don’t produce dots. Scanners produce files, which describe pixels. Monitors display pixels.
The crucial difference is that while PPI affects how big an image appears on a monitor, it does not affect the actual quality of the image itself.
On the other hand, the DPI on a printer does affect the quality of the printed image. Printers use dots of ink to render images. The more dots a printer uses per square inch to render the image, the better the quality of the print.
Q. Dots, pixels… they’re basically the same thing though, aren’t they?
Yeah, of course, you’re right… and this is why the terms are used, interchangeably.
However, the semantics are helpful, when you’re trying to understand:
- why DPI affects image quality, but PPI does not, and
- the fundamental difference between how monitors size images and how printers size images.
Q. I'm probably going to regret this, but what is the difference between how monitors and printers size images?
On a monitor, the size of an image depends on how many pixels are in the image. A larger (higher resolution) image will always appear larger than a smaller (lower resolution) one, assuming they are both viewed at full size.
The size of a print depends on both how many pixels are in it and how the printer is told to render it. Furthermore, the printer’s own capabilities will determine how a print can be rendered.
How about an example? Let’s assume the following:
1) We have a low resolution picture of a dog. The image resolution of the dog is 600px x 400px.
2) We have a high resolution picture of a horse. The image resolution of the horse is 3000px x 2400px.
3) Our monitor displays 1024px x 768px.
4) Our printer is capable of printing at up to 300 dpi.
If you look at the dog picture on our monitor, you can see the whole of the image on screen. If you look at the horse picture on our monitor, you can only see about a third of its width and height at any one time. If you want to see the whole of the horse picture on the monitor at the same time, you have to use software to zoom out to about 33%… (which is another way of saying each screen pixel has to represent three image pixels). However, zooming doesn’t change (the quality of) the horse image in any way.
These three facts arise because both the number and, crucially, the size of the individual pixels on our monitor are fixed.
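The zoom-out arithmetic can be sketched like this (the monitor and image dimensions are the assumptions from the example; `zoom_to_fit` is my own helper, not a standard function):

```python
def zoom_to_fit(img_w: int, img_h: int,
                screen_w: int = 1024, screen_h: int = 768) -> float:
    """Largest zoom level (1.0 = 100%) at which the whole image fits
    on screen: the screen/image ratio of the tighter dimension."""
    return min(screen_w / img_w, screen_h / img_h, 1.0)

print(f"Dog (600x400):     fits at {zoom_to_fit(600, 400):.0%}")    # 100% - no zooming needed
print(f"Horse (3000x2400): fits at {zoom_to_fit(3000, 2400):.0%}")  # 32% - about a third
```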
Further assumption:
5) Both pictures have been set to render at 200 dpi.
If you print the image of the dog, the resulting print will be 3 inches x 2 inches. If you print the image of the horse, the resulting print will be 15 inches x 12 inches.
Now, let’s replace assumption 5 with:
6) The horse picture has been set to render at 300 dpi.
7) The dog picture has been set to render at 100 dpi.
If you print the image of the horse again, the resulting print will be 10 inches x 8 inches, i.e. smaller. If you print the image of the dog again, the resulting print will be 6 inches x 4 inches, i.e. larger.
From this you can see that, because the resolution of the dog image remains constant (600px x 400px), when the dpi is halved, the size of the print doubles. There is a direct, inverse correlation between printer output resolution (dpi) and the resulting print size.
It should also be obvious that the 3x2 print of the dog has 200 dots per inch; but the 6x4 print only has 100 dots per inch. The print quality is therefore only half as good in the 6x4 print.
Accordingly, the relationship between pixels, print size and dpi can be expressed as a formula:
No. of pixels (along each edge) / Print size in inches = Print resolution (dpi)
e.g. 1200px / 6 inches = 200 dpi, and 1200px / 12 inches = 100 dpi. So a 1200px x 800px image printed at 6 x 4 inches has a resolution of 200 dpi; printed at 12 x 8 inches, 100 dpi.
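In Python, per edge, with the numbers from the worked example (function names are mine):

```python
def print_dpi(pixels: int, print_inches: float) -> float:
    """Output resolution along one edge: pixels divided by inches."""
    return pixels / print_inches

def print_size(pixels: int, dpi: float) -> float:
    """The same formula turned round: print size in inches along one edge."""
    return pixels / dpi

print(print_dpi(1200, 6))    # 200.0 dpi for a 1200px edge printed at 6"
print(print_dpi(1200, 12))   # 100.0 dpi for the same edge printed at 12"
print(print_size(600, 200))  # 3.0" - the dog print's long edge at 200 dpi
```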
Further assumption:
8) The minimum acceptable number of dots per inch in a photo print is 100.
Having established that when the print size increases, the dpi decreases (or vice versa), we can conclude something about minimum image resolutions for particular print sizes.
We already know our dog picture, being 600px x 400px, will, if printed at 6x4, have a DPI of 100… and, if you’ve followed the maths, if we printed it at 12x8, it would have a DPI of 50… and at 24x16, a DPI of 25… etc. We also know that the number of dots per inch in a print determines its quality.
Accordingly, once we establish what the minimum acceptable number of dots per inch in a print is, we can, with mathematical certainty, determine the minimum image resolution for a particular print size.
Therefore, if you accept that assumption 8 is valid and that the minimum acceptable dpi for a photo print is 100 (actually, this is an arbitrary number intended to keep the maths easy), the minimum resolution for our 6x4 prints would be 600px x 400px, and the minimum resolution for a 12x8 print would be 1200px x 800px…
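The minimum-resolution sum, sketched (`min_resolution` is my own helper; the 100 dpi default is assumption 8’s arbitrary threshold):

```python
def min_resolution(print_w_in: float, print_h_in: float,
                   min_dpi: int = 100) -> tuple:
    """Minimum (width_px, height_px) for an acceptable print at min_dpi."""
    return (round(print_w_in * min_dpi), round(print_h_in * min_dpi))

print(min_resolution(6, 4))    # (600, 400)  - exactly our dog picture
print(min_resolution(12, 8))   # (1200, 800)
print(min_resolution(30, 24))  # (3000, 2400) - exactly our horse picture
```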
…and we could make an acceptable print from the horse picture at any size up to 30” x 24”.
Further assumption:
9) The dog picture has been set to render at 400 dpi.
At a nominal 400 dpi, the resulting print will be 1.5 inches x 1 inch in size.
So far, so normal. However, our assumption 4 was that our printer is “only” capable of printing at up to 300 dpi. This means the printer cannot render the print at its nominal 400 dpi, because our printer can’t physically do dots that small!
Our resulting print is therefore 1.5 inches x 1 inch in size, but it is constructed of “only” 300 dots per inch. The higher nominal resolution instruction is simply disregarded.
Therefore, there is no increase in print quality for our dog picture at any size less than 2 inches x 1.33 inches because, at this size, the output resolution (dpi) of the print is greater than or equal to the maximum resolution the printer is capable of.
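The capping behaviour can be sketched like this (the 300 dpi ceiling is assumption 4; `effective_dpi` is my own name for it):

```python
PRINTER_MAX_DPI = 300  # assumption 4: our printer tops out at 300 dpi

def effective_dpi(pixels: int, print_inches: float) -> float:
    """The dpi a print is actually rendered at: the nominal figure,
    capped at what the printer can physically manage."""
    nominal = pixels / print_inches
    return min(nominal, PRINTER_MAX_DPI)

print(effective_dpi(600, 3.0))  # 200.0 - within the printer's range
print(effective_dpi(600, 1.5))  # 300.0 - the nominal 400 dpi gets capped
```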
The quality of a print is therefore, ultimately determined, not by print size, but by the maximum output resolution of the printer, itself.
Q. So, what output resolution should I be aiming for when I print?
When you print stuff, the higher the resolution, the better.
Ideally, the image resolution and the print size should be sufficient to achieve an output resolution of 300 dpi.
Q. 300 dpi! Hang on, if I’ve understood the maths, the image resolution of an 18x12 print at 300 dpi would need to be, erm, massive. Something like 20MP (being 5400px x 3600px). My camera won’t produce images that big. What do I do?
Well the good news is, you’ve understood the maths. The bad news is… well, there is no bad news. Getting hung up on numbers and output resolutions is something best left to other people.
Image resolution is important but (despite having written all of the above) it’s only really important, when your image resolution is too low.
Printing images with a resolution of less than 1MP is unlikely to produce a satisfactory result at any size.
Once you get to an image resolution of about 3MP, resolution itself is unlikely to be the determining factor for the quality of conventional photo sized prints i.e. 10x8 or less.
Ultra high resolutions do not always bring tangible benefits. For example, if you had a 10MP camera, took a shot at 10MP, then downsampled this shot to 6MP and 3MP and printed all three, most people would find it extremely difficult to tell the three prints apart. I’m not saying there won’t be any difference between the three prints, just that most people won’t, actually, be able to see any. Furthermore, for smaller prints, the maximum output resolution of the printer may well mean that there actually isn’t any difference in the prints.
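The downsampling arithmetic, if you’re curious: to hit a target megapixel count while keeping the aspect ratio, scale each edge by the square root of target/original. A sketch (the 10MP frame dimensions here are an assumption, not from the text):

```python
from math import sqrt

def downsample_dims(width: int, height: int, target_mp: float) -> tuple:
    """New (width, height) after scaling both edges by the same factor
    so the total pixel count lands on roughly target_mp million pixels."""
    scale = sqrt(target_mp * 1_000_000 / (width * height))
    return (round(width * scale), round(height * scale))

w, h = 3872, 2592  # a typical ~10MP frame (assumed dimensions)
for target in (6, 3):
    print(f"{target}MP ->", downsample_dims(w, h, target))
```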
In practice, beyond about 6MP, the importance of optics, CCD sensitivity and response, and the technical skill employed in the shot itself all become more significant than pure resolution in determining print quality.
Q. So, there’s no point in buying a fancy camera, then… any old point & shoot will do provided it can produce 3MP+ images?
No, actually quite the contrary. The lesson should be that, even though resolution is important, More Megapixels <> Better Camera.
The Ad-men use MegaPixels to try to give consumers a simple way to compare the quality of a camera, but they are false prophets.
Even if we ignore the fundamental issue of optics… (which you can’t: a “good” camera with bad optics is always a bad camera, and good optics are intrinsically expensive)… simply comparing the “magic” number of megapixels that a camera has doesn’t really tell us anything.
The ultimate purpose of a digital camera (and its sensor) is to capture light and convert it into a digital image. Therefore, whenever you look at the number of pixels in a camera, you need to consider the size and quality of those pixels. A large number of small pixels in a small sensor is invariably worse than having fewer, but larger, pixels in a large sensor. The large sensor’s larger pixels, despite being fewer in number, are intrinsically capable of capturing more light. They are, therefore, more likely to convert the captured light into a set of values which accurately reproduce colour and brightness. A larger sensor also improves the signal-to-noise ratio, reducing the amount of noise in the raw captured image and the amount of post-processing that the camera needs to do to compensate for it.
This is why my (relatively) ancient Sony DSC909, which has great optics and a large sensor but only takes 2.2MP images, still takes much better photos than my all singing, all dancing Nokia N82, despite its Carl Zeiss lens and 5MP resolution… and before you go, “So what? The N82 is a phone”, that’s the whole point… comparing megapixel numbers doesn’t really tell you anything conclusive. The old Sony is/was a good camera, but camera technology has moved on. The N82 takes good pictures for a phone.
Q. So why would anyone splash out on a top of the range camera, then?
Well, boys do like their toys.
A top of the range digital SLR will not only offer you the potential of 15MP image resolution, but it’ll also offer you good optics, “proper” lens-based optical zoom, a larger light sensor, faster focusing, less shutter lag, a wider effective ISO range, etc.
It will also offer you a wider range of control and creative opportunities.
Q. I’ve heard that computers can only display a maximum of 72 DPI, so there’s no point in saving an image at more than 72 DPI, is that true?
No. The whole 72 ppi myth is a total red herring. (…and you’re using the term DPI incorrectly.)
As we now know, for images which will only be displayed on a monitor, DPI is completely irrelevant. The only thing that matters is how many pixels are in it.
For images which will be printed, output resolution (dpi) does matter, but image resolution is more important.
Resolution is a complicated subject and some of this is pretty taxing. So, you might want to go and get yourself a cup of hot Ribena and a packet of Hob-nobs before launching into it.