Selecting Image Format and Quality

The RAW vs. JPG Debate

Spend a few moments searching the Internet for information about digital photography and you're sure to run across the great debate about whether it's better to shoot RAW or JPG (a.k.a. JPEG) images. While an overwhelming majority of websites and photo-forumites have declared RAW to be the clear winner for top-notch photography, their views are rarely qualified against the all-important criterion of the individual user's needs. Here, I'll try to weigh in objectively on the matter, discussing the relative merits of both formats as I summarize and clarify a flood of opinions from the World Wide Web.

Image Format: RAW or JPG?

When you capture an image in JPG, the result is a complete image that has been fully processed "in-camera" and is ready to be printed. Sure, you can still do plenty of post-processing editing, but you don't have to. The camera has already performed several basic, built-in processes (sharpening, saturation, noise suppression, etc.) and compressed the data to yield a "finished" image that will look virtually identical no matter which computer you view it on. It will also eat up far less memory than it would if the image data were left in an uncompressed state. The quality of the final image will be dictated partly by the manual quality setting selected on the camera ("fine," "normal," or "basic"), partly by photographic skill (exposure and composition), and partly by how much (or how little) the image is manipulated in post-processing. Regardless, assuming you expose the shot properly to begin with, you should end up with a sharp, vibrant, accurate rendition of the scene, captured in one of the world's most compatible image formats, and capable of standing up to an impressive amount of post-processing manipulation. Simply put, the JPG format is intended to economize file space, reduce or eliminate the need for post-processing, and deliver superb "straight out of the camera" prints that should look virtually identical when printed from any color-calibrated printer.

When you capture an image in RAW, you are not actually capturing an image at all, but rather the unprocessed "raw" data which the camera sees and records at the sensor—without any (or very few) interpretive adjustments and file compression being applied. To convert this data into a viewable image, it must be processed through a RAW converter, at which time it is possible to make a number of manual adjustments (exposure, brightness, sharpness, saturation, etc.) which are normally performed in-camera for JPG images. As a result, you are somewhat more limited in your choice of post-processing tools (a RAW converter is mandatory, but many of the best photo-editing packages have them built-in), yet you have much greater control over the precise "look" and "feel" of the resulting image. In fact, you can adjust exposure, contrast, saturation, and other settings almost as if you were still staring through the lens of the camera, composing the original shot. You can also control many of the subtler processing decisions that the camera normally makes for you, such as sharpness, color temperature, shadow brightness, chromatic aberration, and vignetting. In short, RAW affords the greatest amount of post-processing control.

That said, RAW is not a standard format like JPG, BMP, or TIFF. Its data and encoding are subject to proprietary variations which, in turn, cause RAW images to look slightly different or to afford varying degrees of control depending upon which RAW converter you use to process them. (Allegedly, some manufacturers also encode or encrypt portions of the RAW data to render certain values unreadable to non-proprietary RAW converters, though I've never encountered such problems myself.) RAW is also an uncompressed or mostly uncompressed format, allowing it to capture slightly greater detail in a complex scene, but also making it far more "memory-hungry" than even a very high quality JPG.

JPG Myths & Misperceptions

Contrary to widespread misconception, the quality of a JPG image does not degrade or change every time you open the file. It only changes and degrades if you edit and re-save the file, at which time more image compression is applied and consequently, more image quality is lost. The effect is not unlike the degradation that occurs when you make a photocopy of a photocopy of a photocopy, or a fax of a fax of a fax. Realistically, however, the degradation from re-saving JPG files is not nearly so dramatic or destructive as many folks claim—nor does it occur from re-saving alone. In Adobe Photoshop, for example, if you maintain the same "quality" setting between re-saves, image degradation only occurs on the first save (when the selected "quality" level of file compression is initially applied). Thereafter, as long as you do not change the "quality" setting or edit the image, no further degradation of image quality will occur, even if you re-save the file dozens of times.
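The "no further loss at the same quality setting" behavior is easy to model, because lossy compression is, at heart, quantization, and quantizing already-quantized values a second time changes nothing. Here is a toy sketch in Python (pure illustration, not real JPEG code, which quantizes DCT coefficients per 8x8 block and can still drift by a pixel value here and there due to rounding in the color transforms):

```python
def lossy_quantize(values, step):
    """Toy stand-in for lossy compression: snap each value to the
    nearest multiple of `step` (a coarser step = a lower "quality")."""
    return [round(v / step) * step for v in values]

original = [3, 7, 12, 18, 25, 31]

first_save = lossy_quantize(original, step=5)     # detail is lost here...
second_save = lossy_quantize(first_save, step=5)  # ...but not here

assert first_save != original     # the first save degrades the data
assert second_save == first_save  # re-saving at the same setting loses nothing
```

Multiples of 5 map back onto themselves, so once the first save has done its damage, every subsequent save at the same "quality" is a no-op. This is the core idempotence that the Photoshop behavior described above exploits.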

That said, if you make changes (edits) to the image between each re-save, JPG errors (blocking, artifacts, etc.) will begin to accumulate, so good editing practice dictates that you should avoid re-saves whenever possible by always working from the "master" (original) image. If you archive the original JPG "straight out of the camera" and always begin new edits from that master copy (or as some call it, your "digital negative"), there will be no degradation except for a variable amount of initial compression which is applied in-camera (as determined by the in-camera quality settings: "fine," "normal," or "basic") and any incidental degradation that may occur during the editing process (as image values and settings are manually adjusted, or the save "quality" setting is reduced). Thus, if you adhere to this sensible practice, image degradation due to JPG file compression should never pose a significant problem.

Don't fear the JPG format or its bad reputation for using "lossy" compression. Plenty of competent photographers prefer to shoot and work in the JPG format. So do many graphic artists. Yes, the JPG format offers comparatively less control over image adjustments in post-processing (compared to RAW), but a knowledgeable photographer using proper exposure settings can still achieve razor-sharp, outstanding results with JPG images which easily rival the quality of an uncompressed, "lossless" RAW image. More importantly, "lossy" compression does not normally or necessarily cause dramatic losses in image quality. Every now and then, I come across the absurd claim that by compressing file size to less than half the size of a RAW image, JPG images sacrifice at least half the image detail. (Or worse, the inconceivable belief that "lossy" JPG compression means the image will slowly degrade and "bleed" data simply by sitting too long in storage on the hard drive!) Such claims show absolutely no understanding of how file compression works. By the same logic, software programs compressed into ZIP files should cease to work (or lose at least half their functionality), or hard-drives that use file compression should regularly lose huge chunks of data.

The Raw Truth

In reality, the level of detail offered by a large, high quality JPG is very often indistinguishable from that offered by a comparable RAW image, except under very specific, predictable conditions. Certainly, RAW offers greater creative control in post-processing, but given the substantial difference in file size, you could opt to bracket difficult shots several times in JPG mode and still consume far less space than you would consume shooting RAW images. On a 6-megapixel camera, a typical "fine" JPG image (2.03MB) consumes less than half the space of a RAW file (4.86MB), while "normal" (1.51MB) or "basic" (688k) JPG images consume only about one-third or one-seventh as much space, respectively. (File sizes vary from scene to scene, but the numbers cited reflect actual file sizes taken from a sample image captured on a Nikon D50 at each of the four quality settings, all at "large" size.)
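The space savings quoted above are easy to verify with a little arithmetic (the file sizes are the sample D50 figures cited in the text):

```python
# Sample file sizes (MB) from one D50 scene at each quality setting
raw_mb    = 4.86
fine_mb   = 2.03
normal_mb = 1.51
basic_mb  = 0.688  # 688k

print(f"fine   is {fine_mb / raw_mb:.0%} of RAW")    # under half
print(f"normal is {normal_mb / raw_mb:.0%} of RAW")  # about one-third
print(f"basic  is {basic_mb / raw_mb:.0%} of RAW")   # about one-seventh

# Bracketing a tricky shot three times at each setting:
print(f"3 x fine JPG = {3 * fine_mb:.2f} MB vs 3 x RAW = {3 * raw_mb:.2f} MB")
```

Three bracketed "fine" JPGs still consume less space than two RAW frames, which is the logistical point: bracketing in JPG is cheap insurance.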

Furthermore, while it's true that RAW offers terrific flexibility, it is not quite the miracle format it's cracked up to be by a highly vocal majority of amateur and semi-professional photo-forumites. One argument I keep hearing for why you "must" shoot RAW is so that if you screw up the initial exposure badly, you can still adjust it later and save an oh-so-valuable shot from [cue dramatic music] being lost forever to oblivion. Occasionally, these arguments are accompanied by ridiculous testimonials, such as: "I went on vacation and didn't realize my camera's exposure was set to -3.00EV the whole trip! Thank God I was shooting RAW!" If that's really the case, I say better you lose your pictures and learn a valuable lesson, than rescue the pictures and continue your thoroughly oblivious approach to digital photography. Undoing catastrophic exposure mistakes may sound like an attractive benefit, but assuming that your DSLR includes an LCD screen to review each image right after you shoot it, I find it extremely hard to take these catastrophic exposure scenarios seriously. Even a very poor photographer should have no trouble diagnosing a very bad exposure with only a cursory glance at the LCD—even without using the tell-tale "Histogram" or "Highlight" features. (Sorry if I'm hurting someone's feelings here, but if these hyperbolic stories of RAW-rescued vacation photos are remotely true, perhaps those folks should keep their DSLRs set to the virtually idiot-proof "Auto" mode.) Of course, if you're even close to the right ballpark, most reasonable and likely exposure errors (within one full stop of a proper exposure) can be fixed quite effectively in post-processing (with decent photo-editing software) whether you're shooting RAW or not.

Compatibility is another issue that deserves mention concerning RAW—though here, the issue becomes speculative. Since RAW is not a standard format and is subject to proprietary variations in the way some camera information is encoded into the data, it may also run aground on compatibility issues that JPG shooters need not fear. For example, there have already been several reports that the White Balance information for RAW files captured with a Nikon D2X, D2Hs, or D50 will not decode properly in Adobe Photoshop. (My own experience with D50 RAW files in Adobe Photoshop CS2 has revealed no such problem.) Nonetheless, compatibility issues are a very real danger of working in a non-standard format, as many graphic artists will tell you. Down the road, if you leave your images in RAW format, you may find yourself unable to open or edit old RAW files because they are no longer recognized or supported by the latest software—meaning that you could waste days or weeks of your time (depending on the number of files) converting archived RAW files to a more compatible format before you can open or edit them properly with future photo-editing software.

Folks who tout RAW as the most "future proof" version of digital images—offering a level of protection or preservation comparable to an original "negative"—may be operating on a profoundly mistaken assumption. Certainly, I hope they're right, but if the history of file formats is any indication, there's good reason to suspect that many of today's RAW images will not be readable in tomorrow's software, or if so, they will (1) be subject to a more limited range of adjustments, (2) require you to shell out more money to purchase a special "plug-in" or "decoder" to make them work, (3) require you to use only select RAW converters, or (4) restrict you to a proprietary RAW converter only (for full image control). It's happened before; it'll very likely happen again. That's the way of business and technology. Of course, as long as you currently own photo-editing software that can read and convert your RAW files properly, there's no immediate cause for concern, so to some people, this will be a moot point. As for me, I archive the original RAW files, but I also try to keep in the habit of converting my favorite images to TIFF or PSD format (the latter is an Adobe Photoshop file format; if you use it, be sure to select "Maximum Compatibility").

Finally, some folks make much ado about the fact that JPG uses 8 bits per color pixel, whereas RAW uses 12 bits. In some cases, this leads to the decidedly erroneous assumption that RAW is 50% "better" or "more accurate" or "more sophisticated" in its reproduction of color and detail. As Ken Rockwell points out in his semi-controversial article "RAW vs. JPG: Get it Right the First Time":

Raw is 12 bit linear, and JPG is 8 bit log, gamma corrected or some other non-linear transform derived from the 12 bit linear data. Thus in the shadows where this might matter the two are the same, since the full 12 bit resolution in the dark areas is preserved by the non-linear coding. Even if the two formats differed in dark resolution the sensor noise is still greater than one LSB anyway making it a moot point. (Rockwell, July 11, 2006)

While I can't verify whether or not the bit difference is completely moot as Rockwell claims, the Microsoft and Adobe knowledge bases do appear to support the gist of his argument: that the difference in bit depth is virtually, if not entirely, insignificant from a visual imaging standpoint. An article in DigitalPro Shooter confirms the same conclusion:

Sure there is some small loss of mathematical detail, but if the tone curve is the right one for the image and you don't need to move the exposure after the fact you'll be hard pressed to see the difference. And since essentially every output device in common use is 8-bits or less per color, your software is going to need to convert the image to 8-bits per color at some point in any case. (Vol. 2, Issue 7, March 24, 2003)

All data and technical jargon aside, I certainly can't see the difference in any of the comparisons I've made for myself.
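Rockwell's point can be sketched numerically. Gamma encoding spends its 8-bit code values disproportionately on the shadows, so the dark end of the 12-bit linear data keeps nearly all of its resolution. The sketch below assumes a simplified power-law curve with gamma 1/2.2 (roughly sRGB-like), not any particular camera's actual tone curve:

```python
def linear12_to_gamma8(v):
    """Map a 12-bit linear value (0..4095) to an 8-bit gamma-encoded
    value (0..255) using a simple power-law curve (gamma = 1/2.2)."""
    return round(255 * (v / 4095) ** (1 / 2.2))

# How many distinct 8-bit codes cover the darkest 1/16 of the linear range...
dark_codes = {linear12_to_gamma8(v) for v in range(0, 256)}
# ...versus the brightest 1/16?
bright_codes = {linear12_to_gamma8(v) for v in range(4095 - 255, 4096)}

print(len(dark_codes), "codes span the deepest shadows")
print(len(bright_codes), "codes span the brightest highlights")
```

The same-sized slice of linear sensor data gets many times more 8-bit codes in the shadows than in the highlights, which is why the shadow detail that matters survives the conversion largely intact.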

One noteworthy exception might be those cases in which the photographic subject has a fine gradation of colors distributed across a wide area, such as the graduated blues or reds that sprawl across a dramatic sky or the subtle tones which cover the surface of a spherical object in dim light, in which case JPG sometimes presents "color banding" problems (abrupt transitions which show up as hard "edges" between colors, instead of a smooth gradation) which RAW may not. In my experience, these cases are extremely rare, and I'm not certain they are caused by the difference in bit depth. Regardless, if the banding is subtle, I can fix it quite easily with some clever editing in Adobe Photoshop. If the banding is significant, it's easy enough to diagnose the problem on the camera's LCD screen, and to see if RAW preserves a smoother gradation for that particular shot.
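Banding in smooth gradients is straightforward to picture: spread a very gentle brightness ramp across many pixels, and storage at whole-number levels turns it into a staircase of flat plateaus. A toy sketch (the 1000-pixel ramp and its tonal range are hypothetical, chosen to mimic a pale sky fading toward the horizon):

```python
# A subtle gradient: brightness rises by only 10 levels (out of 255)
# across 1000 pixels.
width = 1000
gradient = [100 + 10 * x / (width - 1) for x in range(width)]  # smooth, fractional

banded = [round(v) for v in gradient]  # 8-bit storage forces whole levels

# Count the hard "edges" where one flat band steps to the next:
edges = sum(1 for a, b in zip(banded, banded[1:]) if a != b)
plateau = width // (edges + 1)

print(edges, "abrupt steps; plateaus roughly", plateau, "pixels wide")
```

Eleven flat bands of roughly 90 pixels each replace the smooth ramp, and those wide, hard-edged plateaus are exactly the "banding" that shows up in skies and dim spherical surfaces. Finer quantization (or dithering) shrinks the plateaus below visibility.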

To be clear, I'm not trying to denigrate RAW or to champion JPG. I like RAW as much as the next guy, but I don't hate JPG as much as the growing majority of photo-forumites. My goal, therefore, is to qualify widespread claims which grossly exaggerate the differences in quality and workability between the two formats. The recent tendency has been to dismiss JPG as if it were a laughably inadequate format for high-quality digital photography, or to proclaim RAW to be superior in every way. That is simply not the reality. The truth depends on your particular needs and preferences. Let's take a closer look at how the two formats compare.

Image Quality: RAW, Fine, Normal, or Basic?

After carefully inspecting 34 different shots taken at each of the four different quality settings (RAW, "fine" JPG, "normal" JPG, and "basic" JPG—136 images total) under a wide variety of imaging conditions (backlighting, extreme contrast, landscape, portrait, complex detail, etc.), I found no visible difference in image quality between any two adjacent quality settings (RAW vs. "fine," "fine" vs. "normal," or "normal" vs. "basic") except when the images included very finely detailed patterns (the labyrinthine twigs and branches of a tree, the individual quills of a bird feather, or the complex reflections on a storefront window) or very subtle color variations (a pale sky with faint wispy clouds, the pale gray bark of a weathered tree, or the soft pinkish hues cast by a mild sunset) and were subsequently magnified beyond (usually well beyond) 100% magnification. Even then, discernible differences were often extremely small—I dare say "negligible" in most cases.

In many cases, RAW yielded slightly brighter and richer coloration than "fine," "fine" yielded slightly richer coloration than "normal," and "normal" yielded slightly richer coloration than "basic," even though all camera settings (focal length, shutter speed, aperture, ISO, white balance, etc.) had remained unchanged. However, in the vast majority of cases it was extremely difficult to detect any difference in color, sharpness, or contrast without magnifying the images greatly—often as high as 300% or 600%—and comparing them pixel-by-pixel. At 100% magnification, and when comparing normal prints as large as 8 x 10, it was practically impossible to distinguish differences with the naked eye. In fact, with smaller 5 x 7 prints, it was often difficult to distinguish RAW from "basic" JPG!

Technically speaking, RAW surpassed "fine," "fine" surpassed "normal," and "normal" surpassed "basic," but the visible quality difference between an enormous RAW file (4.86MB) and a "basic" JPG (688k) was not one-tenth so dramatic as impassioned discussions on the Internet would lead rational folks to believe—not unless you plan to print these images to, say, 12" x 20" or beyond. Certainly, the significantly greater compression of the "basic" JPG file does result in more conflated pixels (the most noticeable losses occur, as expected, in bright highlights and dark shadows), but the compression is quite deftly applied to preserve good detail. At some sizes, "basic" JPG can actually appear sharper and more pleasing to the eye (with less "noise") than the comparable RAW image. Mind you, these observations were made before any post-processing was applied, comparing image quality straight out of the camera.

Realistically, then, the comparative "advantages" offered by RAW or the "fine," "normal," and "basic" JPG settings are mainly logistical: lower quality JPG images process faster and consume less memory than higher quality JPG or RAW images, while RAW images afford greater control in post-processing. With my D50 set to continuous shooting (2.5 fps), the LCD promises 4 continuous shots in RAW, 9 at "fine" JPG, 12 at "normal" JPG, or 19 at "basic" JPG before the buffer overflows and the camera needs to pause for processing. Equipped with a 2GB Sandisk Ultra II SD card, my D50 pauses noticeably after 4 continuous RAW shots (as expected), but easily exceeds the promised JPG buffer values: At "fine" quality, for example, I fired off 18 continuous shots (twice as many as promised) before the camera slowed. At "normal" quality, I fired off 40 continuous shots and could have kept right on going with no hint of hesitation on the camera's part. If you use a slower SD card, the buffer values may prove more restrictive during continuous shooting, but the general trend will hold: the camera should process lower quality JPG images much faster than those of higher quality, yielding better "continuous shooting" performance.

The four-shot limit (before camera lag) in RAW mode is quite restrictive even if you're only trying to capture a short sequence of events (about 2 seconds), requiring you to release the shutter at exactly the right moment. The "fine" JPG setting, on the other hand, delivers enough continuous shots that you could confidently release the shutter just before the "ideal" photographic moment and keep firing away steadily (about 7 seconds) until well after the action has ended. For longer sequences of events, dropping down to "normal" or "basic" JPG ensures truly continuous shooting until card capacity is full—a huge benefit for action-photography. Moreover, "fine," "normal," and "basic" JPG images also consume far less memory than RAW images, allowing you to do quite a bit more "continuous" shooting before your memory card fills up. Using a 2GB Sandisk Ultra II SD card, my D50 holds an estimated 270 images in RAW, 568 images at "fine" JPG (large), 1,100 images at "normal" JPG (large), and 2,000 images at "basic" JPG (large). Actual capacity varies according to the complexity and detail of the recorded subject (which, in turn, affects file size), but in my experience, these estimated capacities are fairly accurate.
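The burst durations quoted above follow directly from the frame rate (the numbers are the D50 figures from the text):

```python
fps = 2.5  # D50 continuous-shooting rate (frames per second)

def burst_seconds(shots, fps=fps):
    """How long a burst of N shots lasts at a given frame rate."""
    return shots / fps

print(f"RAW buffer (4 shots):          {burst_seconds(4):.1f} s")   # the ~2-second window
print(f"Observed 'fine' burst (18):    {burst_seconds(18):.1f} s")  # the ~7-second window
print(f"Observed 'normal' burst (40):  {burst_seconds(40):.1f} s")
```

Four frames at 2.5 fps is only a 1.6-second window, which is why RAW demands such precise shutter timing, while 18 "fine" frames stretch the window past 7 seconds.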

Context Dictates the "Right" Choice

For action photography, the "fine" or "normal" JPG settings ("large" size) allow you to fire away steadily in continuous shooting mode, producing extremely good results with little to no slowing of the camera. If your SD card is small in size or slow in speed, don't be afraid to use the "basic" JPG setting. The quality is still very good, and you won't notice the difference in 5 x 7 prints. For very large images or images which include a lot of fine detail, shoot "fine" JPG or RAW. For tricky contrast situations, subtle color variations, large pale subjects, night or low light photography, and moments when you have serious doubts about the proper exposure, "fine" JPG can still deliver superb results, but RAW becomes much more attractive due to the greater control and flexibility it offers in post-processing.

Based on my experience with post-processing, once you depart from the basic realm of capturing and editing single shots (specifically, those taken under very tricky conditions), RAW's post-processing advantages are not nearly so substantial as Internet hype would lead you to believe. In fact, I'm convinced that many of the folks who espouse RAW's superiority with such hyperbolic fervor either (1) do not own a capable JPG editing tool like the amazingly capable Adobe Photoshop CS2, (2) do not understand how to manipulate JPG images effectively, or (3) do not understand the basic principles and limitations of digital exposure. I don't mean that as a put-down. But out of all the post-processing experiments I could concoct, I was hard-pressed to find any situation in which I couldn't achieve strikingly similar results with RAW and JPG images. Moreover, neither format was consistently easier to work with than the other: Sometimes I found myself achieving desirable results faster by working with JPG; other times, RAW was faster. It all depended on the nature of the image, the specific results I wanted to achieve, and my luck at choosing the right adjustments on the first try.

Arguably, the most demanding, yet rewarding task of post-processing is that of balancing shadows and highlights in a "high dynamic range" scene which does not lend itself to a single, "correct" exposure (or to the quick fix of a grad ND filter), but requires multiple exposures spanning 4 full stops or more. In such cases, if you have a working familiarity with Adobe Photoshop (specifically, a good knowledge of layers and layer-based adjustments) there is not much you can do with RAW that you can't do with JPG. Best results are still obtained by shooting from a tripod and blending together the dynamic range of multiple shots of the same scene (taken at multiple exposures, with a fixed aperture) into a single, well-balanced, "high dynamic range" image that better approximates the color and light sensitivity of the human eye—in which case it seems to make little or no difference whether the exposures were captured in RAW or JPG mode, as long as an adequate range of exposures is recorded. Forced to choose, I would recommend RAW only because it is more forgiving if you fail to capture an adequate range of exposures to begin with. Realistically, if you know what you're trying to achieve (and what exactly is needed to achieve it), either format will get you there.
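The blending idea described above reduces to a simple per-pixel rule: favor whichever exposure recorded each pixel in mid-tone, where detail is richest. Here is a minimal sketch on toy 0-255 grayscale values, using a hypothetical triangular weighting of my own (not any specific Photoshop layer workflow):

```python
def midtone_weight(v):
    """Weight a 0-255 value by its distance from the extremes:
    mid-tones get high weight, clipped shadows/highlights get ~0."""
    return 1 - abs(v - 127.5) / 127.5

def blend_exposures(dark_shot, bright_shot):
    """Blend two exposures pixel-by-pixel, trusting whichever
    exposure rendered each pixel closer to mid-tone."""
    out = []
    for d, b in zip(dark_shot, bright_shot):
        wd, wb = midtone_weight(d), midtone_weight(b)
        total = wd + wb
        out.append((d * wd + b * wb) / total if total else (d + b) / 2)
    return out

# A blown-out highlight (255 in the bright frame) keeps detail from the
# darker frame; a crushed shadow (0 in the dark frame) borrows from the
# brighter one; in-between pixels mix the two.
dark   = [120,   0,  40]  # underexposed frame
bright = [255, 130, 200]  # overexposed frame
print([round(v) for v in blend_exposures(dark, bright)])
```

Whether the source frames were RAW or JPG makes no difference to this arithmetic, which is the point: the dynamic range comes from the spread of exposures, not the file format.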

Note: Andreas Tofahrn has posted a quick overview of this technique, called "Dynamic Range Increase". Michael Reichmann describes three other digital blending techniques which can be used to produce similar results. Adobe Photoshop CS2 also includes an automated "High Dynamic Range" function (File --> Automate --> Merge to HDR) designed to accomplish the same task, but it's still a bit crude. You retain much more precision and control by performing the task manually, in layers.

Also worth mentioning: Some JPG advocates (yes, they exist!) complain that RAW files are slow to open and convert in post-processing. Technically speaking, RAW files do require more time to open and convert than JPG files (which not only open faster, but don't require conversion). Actual time varies depending on the converter you use, but running Photoshop CS2 on a 1.73 GHz Pentium-M laptop (1.5GB RAM), RAW files take about 4 seconds to open (to load into the converter) and up to an additional 10 seconds to convert (7 seconds is more typical), for a total of up to 14 seconds—not counting time spent tweaking exposure, white balance, or other converter settings beforehand. (Reportedly, some other RAW converters may take as long as 30 or 40 seconds.) In comparison, JPG files open near-instantaneously (2 seconds or less). The time difference becomes much more pronounced, however, if you compare the time it takes to open multiple JPG files vs. multiple RAW files.
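A little multiplication shows why the per-file difference compounds. The per-file times are my CS2 measurements from above; the 300-shot outing is a hypothetical round number:

```python
shots = 300  # a hypothetical day's shooting, all to be opened

jpg_open_s = 2       # seconds to open one JPG (no conversion needed)
raw_open_s = 4 + 7   # open into the converter (4 s) plus a typical convert (7 s)

jpg_total_min = shots * jpg_open_s / 60
raw_total_min = shots * raw_open_s / 60

print(f"JPG: {jpg_total_min:.0f} min;  RAW: {raw_total_min:.0f} min "
      f"({raw_total_min - jpg_total_min:.0f} min extra)")
```

On those assumptions, RAW costs three quarters of an hour of pure waiting before any actual editing begins, and that's with a fast converter and no per-image tweaking.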

If you typically print your pictures "straight out of the camera," JPG wins the "speed test" (or rather, the "convenience test") simply because JPG images become printer-ready the moment you release the shutter. On a shot-by-shot basis, it seems rather nit-picky to complain of RAW processing times. (Perhaps those folks are running older computers with slower processors, insufficient RAM, or a poorer RAW converter.) That said, if you're shooting hundreds of images at a time (all of which you plan to share or print), RAW will cost you a lot of extra post-processing time because, unlike JPG, you must convert every image—not just those which require correction. Believe me, the time adds up a lot more than you would think. RAW will also cost you a lot of archiving time, as RAW files load much slower from removable, recordable storage media (CDs and DVDs) than JPG files do. The costs of archiving are also higher, as fewer RAW files will fit on each storage disc.

Final Words

By now, I hope you can tell the choice to shoot RAW or JPG is not so easy or one-sided as it appears on many photography-oriented forums and websites. Personally, when shooting under normal and familiar conditions, I think the post-processing advantages of RAW are outweighed by the disadvantages of its aggressive consumption of storage space, its slower write times, its much more limited buffer performance, its requirement of RAW-capable software, and its lack of any glaring quality advantage over "fine" JPG images (assuming a knowledgeable photographer and proper exposure settings). On the other hand, if you have a large SD card (or you only take a few images at a time) and you enjoy experimenting with images in post-production, RAW may be the format of choice. For night and low light photography, when exposure becomes much more finicky, I think RAW is the smarter choice; otherwise, you should bracket your images religiously in JPG.

Available photo-editing software may also influence which format you prefer. User-friendly editing programs like Nikon Capture 4 make it very quick and easy to "tweak" RAW settings to achieve fantastic results—even if you're not very technologically or artistically savvy. Since many of Nikon Capture's most useful editing options are unavailable for JPG images, that fact alone may dispose many users to shoot RAW. In a more sophisticated editing program like Adobe Photoshop CS2, however, the superior "control" of RAW vs. JPG becomes less significant under normal shooting conditions, and can be obviated even under the trickiest conditions by a knowledgeable photographer who is reasonably adept at photo-editing. Ultimately, the "right" image format depends entirely on you. There are many paths to "stunning" digital photography. Experiment with the various image settings and post-processing results to find out which choice gives you the best blend of quality, control, and convenience. You may find, as I have, that neither choice is ideal for all conditions.

If you're curious why I haven't posted dozens of side-by-side image comparisons to illustrate my claims on a pixel-to-pixel basis, it's for two reasons: First, it would be ridiculous to waste server space posting full-sized, uncompressed RAW and JPG images (at various magnifications, of course) just to demonstrate that there is little or no visible difference between them. (Try shooting and comparing a few images for yourself if you're still unconvinced.) Second, plenty of folks have already posted extensive online "comparison tests," including these three examples from UCIrvine, Luminous Landscape, and Photodoto. Notice that all three of these independent tests conclude that RAW is "superior" to JPG in some ways, yet all of them also concede (grudgingly, perhaps) that the differences are very slight and confined to a particular set of conditions, proving that folks who tout the "vast superiority" or "grave importance" of shooting RAW are creating a false sense of urgency about a relatively minor issue. Don't buy into it. Shoot what you prefer and what circumstances dictate.

A Brief Example

After all this talk about image quality, your head is probably spinning. No worries. To convince you that JPG is a capable format, here is a completely unedited JPG image (except re-sized smaller), captured at the "fine JPG" setting on a Nikon D50 SLR camera, using the 18-55mm kit lens:

And here is a 100% crop of the bee from the original, full-size JPG file to show the level of detail. Keep in mind, I didn't use a macro lens or a tripod, no sharpening was applied, no image properties were altered, and moderate file compression was applied to reduce the file size for the Internet. Not bad for a bare naked JPG! It looks even better in print, and it will stand up to a lot of editing before the dreaded "JPG artifacts" become even mildly problematic.

Would RAW offer finer detail and better color depth? Hypothetically, perhaps, but it would be hard to imagine a scenario in which the differences could be distinguished without scrutinizing enormous prints under a magnifying glass. In short, be skeptical of claims about significant quality differences between the two formats. RAW's "advantages" have mostly to do with post-processing flexibility, and even then, the advantages are small if you get the exposure right to begin with.

Terms of Use

© 2006, Wesley Kisting
