
The Grumpy Editor's guide to HDR with Linux

This article is part of the LWN Grumpy Editor series.
Your editor has long enjoyed photography. As a high school student, he even pondered, briefly, the idea of pursuing photography as a career; for better or for worse, common sense won out and your editor went to engineering school instead. But taking pictures has remained an active hobby, even if it has tended to degrade to the creation of a stream of snapshots of the kids for grandparent consumption in recent years. The advent of digital photography has brought a couple of your editor's passions back together, with only one thing - free time - missing. But, your editor has discovered, one of the keys to the finding of free time is to take an activity of interest and redefine it as "work." Thus, this article.

A while back, your editor stumbled across the Flickr HDR pool; some of the photos in that pool were sufficiently amazing to inspire an immediate "I wanna do that" reaction. The better part of a year later, it finally became possible to learn a bit more about the process behind those pictures. HDR (or high dynamic range) photography is a set of techniques for overcoming the limitations of contemporary hardware and, in the process, generating images which better represent a scene as viewed by the human eye - or which appear to come from a work of fantasy art.

The sensors in today's digital cameras have gotten good, but they still fall short of the human eye in a few ways. In particular, the range of light levels which can be captured by the sensor is not yet up to what film can handle, and is far from what the eye can do. Anybody who has spent any time taking pictures is familiar with this problem: one can take a beautiful landscape picture, but, in the end result, the wild cloud formations are washed out completely and the shadows just go black. Being unable to capture a scene that one can see quite well can be most frustrating.

The idea behind HDR, as it is used with photography, is to extend the available dynamic range by taking multiple shots at different exposure levels. For a given exposure, there will be a range of light levels which will be captured with good resolution by the sensor; everything else gets compressed at one end or the other. If one has a series of images at different exposures and a reasonable model of the camera's response curve, one can generate a composite image by using the parts of each source image which are in the good part of the curve. So, in that landscape picture, a very dark exposure can be used for the bright parts of the scene - clouds, for example - while a bright exposure yields low-light details. By mixing them together, the HDR algorithm can produce an image with full sensor resolution across a much wider dynamic range.
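The merge described above can be sketched in a few lines of code. This is an illustrative toy, not what cinepaint or PFScalibration actually do: it assumes a linear sensor response (real tools derive response curves first) and uses a simple "hat" weighting that favors well-exposed pixel values over clipped ones.

```python
def hat_weight(value, lo=0.05, hi=0.95):
    """Weight in [0, 1]; near zero for under- or over-exposed values."""
    if value <= lo or value >= hi:
        return 0.0
    # Triangle function peaking at mid-range values.
    return 1.0 - abs(2.0 * value - (lo + hi)) / (hi - lo)

def merge_hdr(exposures):
    """exposures: list of (exposure_time, pixels), pixel values in [0, 1].
    Returns a list of estimated scene radiances, one per pixel.

    Each exposure contributes a radiance estimate (pixel / exposure_time,
    assuming linear response), weighted by how well-exposed the pixel is."""
    n = len(exposures[0][1])
    out = []
    for i in range(n):
        num = den = 0.0
        for t, pixels in exposures:
            v = pixels[i]
            w = hat_weight(v)
            num += w * (v / t)   # radiance estimate from this exposure
            den += w
        out.append(num / den if den > 0 else 0.0)
    return out
```

A dark (short) exposure thus dominates the estimate for bright regions like clouds, while a long exposure supplies the shadow detail, exactly as in the landscape example.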

As an example, consider the photograph below, taken from your editor's dining room:

[Original] [HDR]

(Larger versions of the images are available.) In the original, parts of the plant in the foreground are entirely lost in the shadows. Meanwhile, the breathtaking view of Colorado suburbia (with mountains in the distance) is washed out entirely. The HDR version brings all of that detail back.

HDR is not applicable to all situations. It has a tendency to turn people into cartoon characters. Beyond that, the need for multiple exposures generally implies setting up a tripod and taking some time for the entire process. It is thus not well suited to changing scenes, sports photography (though baseball, perhaps, can be expected to stand still for the requisite time), etc. It can work well for relatively static scenes: landscapes, buildings, the SCO case, and so on.

Most of the people playing with HDR seem to be using proprietary plugins for a proprietary image manipulation program running under a proprietary operating system. That is, needless to say, not your editor's preferred mode of operation. Thus your editor began a search for tools which would perform HDR processing under Linux. It turns out that there are a few such tools around; there is no need to use proprietary software for this task.

HDR generation

The first step is to look for a way to represent HDR images - normal image formats are not up to that task. Linux.com ran a reasonable article on HDR formats late last year; the end result appears to be that the OpenEXR format is the way to go. The OpenEXR package comes with the libraries needed by other applications and the deeply painful "exrdisplay" image viewer. The pfstools package adds a set of pipeline-oriented tools for working with HDR images; it is a necessary part of any HDR hobbyist's toolkit.

Next, one should come up with a set of source images. Ideally, these images are taken with a tripod-mounted camera and cover a range of at least two f-stops above and below the nominally "correct" exposure. Varying the exposure time is preferred over changing the aperture; if nothing else, this ensures that all of the images will have the same depth of field. One can start with images taken without a tripod, but it will be necessary to register them before continuing. Your editor did not get into that aspect of the task; tools like hugin and hdrprep can be used for this job. These tools may be a good topic for your editor's attention in a future article. One can also apply HDR techniques to a single image, especially if it is in the camera's raw format, but multiple exposures give much better results.

With the images in place, one can look at combining them into an HDR image. This is a two-stage process (two user-visible stages, at least): creating a set of response curves and using them to map the images together into a single dynamic range space. The response curves are a mapping between some sort of real-world light levels and the resulting sensor values on all three color channels. When combined with information on the relative exposure times of two (or more) images, the response curves allow the HDR program to map pixels from all of the images into the same space. The response curves can be generated directly from the source images; they don't normally change, so they can be saved and reused later.

[Cinepaint windows] The first HDR-generation tool to look at is cinepaint, once known as "Film Gimp." This tool is a fork of the GIMP which is aimed at use by movie studios; its floating-point image data support makes it useful for HDR processing as well. The generation of HDR is done with the "bracketing to HDR" plugin which is, happily, packaged with the cinepaint source distribution. There is a detailed explanation of what this plugin does and how to use it. Be warned that it makes for somewhat difficult reading - and it would even if it weren't originally written in German.

The good news is that actually using this plugin is easy. One selects "bracketing for HDR" from the File->New from menu, then selects the set of source images from a simple dialog. The plugin will then import them. There is no provision for obtaining the relative exposure information from the image files themselves; instead, the plugin sorts the images by brightness and applies an assumed (adjustable) exposure difference between them. It attempts to feed each image to dcraw for decoding, but your editor was not able to get raw images to work despite the fact that dcraw supports his camera just fine; it looks like the raw import plugin was written for an older version of dcraw. That problem is likely to be easily overcome; your editor just didn't want to spend much time on it. So TIFF files were used instead.

Once the images are in, the user can check the exposure values, then hit the "compute response" button. That yields the two plots shown in the screenshot. By messing around with the buttons, one can look for the reference image which yields the smoothest set of response curves - or one can just accept what the plugin does by default. Then a click on the "generate HDR" button creates the final product, which can then be saved out in the OpenEXR format.

Your editor set out to take some amazing pictures for this article. The area in which your editor resides is widely held to be beautiful, but, frankly, Colorado is not at its best in early March; perhaps this article should have been written in June. Nonetheless, the effort was made. Below is a rather mediocre shot of the Boulder foothills in original and HDR (with cinepaint) forms (larger versions available).

[Original] [HDR]

The HDR image above shows a halo effect (the bright sky above the mountain) which is characteristic of some tone mapping algorithms; we'll get into tone mapping shortly.

An alternative approach is PFScalibration, a set of command-line HDR generation tools based on pfstools. These tools work as a netpbm-like pipeline; their use requires a fair amount of typing, though much of the work can be scripted. The steps are the following:

  • Run jpeg2hdrgen to generate a description file for the source images. It reads the EXIF information from the source files to get the relative exposures and outputs it in a simple file. There is a dcraw2hdrgen tool as well, but the subsequent stages in the pipeline are not able to work with raw files. Your editor suspects that TIFF files could be used by creating the hdrgen file by hand, but the whole process seems to be intended for use with JPEG files. A lossy file format is not the most auspicious starting point for somebody interested in high dynamic range imagery, but that's how it is.

  • The pfshdrcalibrate utility can then be used to create a set of response curves; gnuplot can be used to visualize them. This process can take some time (it's significantly slower than cinepaint), but the resulting file can be saved and reused with different images in the future.

  • Another pfshdrcalibrate run then uses the response curves to create the HDR image. Piping the output into pfsoutexr generates an OpenEXR file.
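The three steps above lend themselves to a small wrapper script. The sketch below only builds the command strings; the tool names come from the text, but the exact flags (-s to save response curves, -f to load them, pfsinhdrgen to read the description file) may vary between PFScalibration versions, so treat it as illustrative.

```python
import shlex

def hdr_pipeline(jpegs, curve="camera.response", out="result.exr"):
    """Build the shell commands for the three PFScalibration steps."""
    srcs = " ".join(shlex.quote(j) for j in jpegs)
    return [
        # 1. Describe the source images (exposure info comes from EXIF).
        "jpeg2hdrgen %s > scene.hdrgen" % srcs,
        # 2. Derive the response curves (slow, but the file is reusable).
        "pfsinhdrgen scene.hdrgen | pfshdrcalibrate -s %s" % shlex.quote(curve),
        # 3. Apply the curves and write the result as an OpenEXR file.
        "pfsinhdrgen scene.hdrgen | pfshdrcalibrate -f %s | pfsoutexr %s"
            % (shlex.quote(curve), shlex.quote(out)),
    ]
```

Feeding the resulting strings to a shell (or os.system) turns the whole pipeline into a single command.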

Here's an example generated from a series of pictures of your editor's dungeon office (larger versions):

[Original] [HDR]

As a general rule, HDR images generated with cinepaint and PFScalibration tend to look very similar. The generation of HDR is not where the real magic lies, so it is no surprise that the results are close.

[qtpfsgui] For those who don't like command-line HDR processing, the qtpfsgui utility may be worth a look. It is a graphical wrapper around PFScalibration based on Qt4; it handles both HDR generation and tone mapping. On the HDR side, it puts up a file selection dialog for the source images followed by the "HDR creation wizard." The user is asked to select a "creation configuration" from a list of configurations helpfully named "Configuration 1" through "Configuration 6". The advice to stick with Configuration 1 was hard for your editor to ignore; simply hitting "next" generated the image.

Said image appeared in a display window; like exrdisplay, this window can only show the image in full resolution. Your editor, lacking a 7 megapixel monitor, was thus unable to view the entire image at once. Even worse, qtpfsgui is one of the family of (generally KDE-based) graphical tools which feels the need to implement its own window manager. The display window lives within the larger qtpfsgui window; it cannot be resized with the usual shortcut your editor is used to. In summary, qtpfsgui gets the job done, but writing a simple script around PFScalibration seems like an easier way to go.

Tone mapping

While the tools above will generate a fine HDR image, one problem remains: the dynamic range in that HDR image far exceeds the range of your editor's monitor (or printer). Turning that image into something which can be displayed requires a step called tone mapping. This is where the serious magic comes in: somehow the vast amount of information in the HDR image must be scaled back in a way which does not compromise the image quality that was the whole point of this exercise in the first place. Several tone mapping algorithms exist, and most of them have a number of mysterious knobs to tweak. While the generation of HDR can be mostly automated, tone mapping inherently requires experimentation and human judgment.
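To make the compression step concrete, here is a minimal global tone-mapping sketch. It is not one of pfstmo's operators (those are considerably more sophisticated, and several work locally rather than globally); it is a simple Reinhard-style L/(1+L) curve, which maps radiance from [0, ∞) into the displayable range [0, 1) while preserving shadow contrast and rolling off the highlights.

```python
def tone_map(radiance, key=0.18):
    """Scale the image by a 'key' value relative to its average
    luminance, then compress each value with L/(1+L)."""
    avg = sum(radiance) / len(radiance)
    out = []
    for L in radiance:
        Ls = key * L / avg           # scaled luminance
        out.append(Ls / (1.0 + Ls))  # compress into [0, 1)
    return out
```

Even this toy operator has a knob (the "key," i.e. how bright the scene should render overall); real operators have many more, which is why interactive experimentation matters so much.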

The bulk of the action appears to be in the pfstmo package, which implements several tone mapping algorithms as separate, standalone filters. One can use pfstmo with the rest of the pfstools package to construct pipelines which generate tone-mapped images. Given the iterative nature of the task, however, it would be nice if there were a better way.

[qpfstmo] That better way is qpfstmo, a Qt-based graphical interface to pfstmo. The interface feels a little clunky at times, and it would sure be nice to have some online documentation on what the various parameters do, but qpfstmo does what is really needed: it lets the user play with tone mapping algorithms and compare the results. A small image size can be used for trying out algorithms and parameters - a real time saver, since some of the algorithms can take a long time on a full-size image - and multiple versions of the image can be on the screen at once. When a final configuration is found for a given image, it can be generated in a larger size and saved in any of the usual image formats. When applied to a large image file, this step can be rather hard on the hardware; your editor discovered that 1GB of memory was not really enough.

[qtpfsgui] The qtpfsgui tool mentioned above has the ability to drive pfstmo as well. It is, in fact, clear that this tool shares a lot of code with pfstmo. The interface is far less friendly, however: everything happens within the One Big Window and it does not appear to be possible to see the results from more than one algorithm at the same time. It resets the display image size every time the user changes algorithm. One assumes that this (fairly new) tool will improve over time. For now, though, qpfstmo seems like a much better way to go for tone mapping control.

A different set of tone mapping operators is supplied with the exrtools distribution. Your editor tried them all; each one is a cumbersome, multi-step process. It can take a long time to process an image, only to find that the parameters need quite a bit of tweaking. The tools seem like they will do quality transformations, but they just cry out for a qpfstmo-like interface which allows experimentation with smaller-size images and comparison of results. For what it's worth, here's a shot taken from the hill above your editor's house mapped with the exrtools non-linear masking method:

[Original] [HDR]

See the larger versions for more detail. Doubtless one could get good results from these tools with enough effort, but your editor found it easier to get quality images with pfstmo.

Conclusion

For the generation of HDR images, your editor found cinepaint to be faster and simpler to work with. This does not count, however, the long and frustrating experience of building the HDR plugin on a Fedora Rawhide system; one gets the sense that the plugin's author uses a rather older, less picky version of g++. Longer-term, however, the PFScalibration suite may prove to be the way to go. It is far more compact and easy to install on a new system; why lug the weight of cinepaint if one is not going to use its other features? A bit of scripting will easily turn PFScalibration into a single-command HDR generation tool.

It's worth noting that there are a couple of other HDR generators for Linux out there. MakeHDR is where a lot of it started; one of its authors is Paul Debevec, who did much of the early research in this area. The code was last touched in 1999, however, and it comes with an "educational purposes only" license. One can also look at HDRgen, but it is a binary-only, free-beer tool. Your editor did not actually try either one of them; given that the free tools do the job so well, there didn't seem to be any point.

For tone mapping, pfstmo (and qpfstmo) are the best tools at this point. It is hard to be entirely satisfied with the state of the art in this area, though. Tone mapping will always be an exercise in compromises, so it's not surprising that the results are rarely perfect. There is likely to be room for improvement - in both the algorithms and the interface to them - for some time to come.

As is the case in many areas, Linux has the tools one needs to play with high dynamic range imagery. One just has to work a little harder to get started than on some other systems. HDR has found its way into your editor's photographic toolkit; look for the results in the reporting from some conference in some exotic part of the world. When playing with this stuff, your editor is far from grumpy.




The Grumpy Editor's guide to HDR with Linux

Posted Mar 14, 2007 18:35 UTC (Wed) by johnkarp (guest, #39285) [Link]

I was not expecting the Grumpy Editor to cover digital photography! Quite
a pleasant surprise. It seems there is little he can't do....

(I personally don't like the output of tone-mapping algorithms, it's always
garish, especially the ones popular on Flickr. I think we'll have to wait
for LED-based or other next-generation large-gamut displays in order to
see the full benefits of HDR.)

The Grumpy Editor's guide to HDR with Linux

Posted Mar 14, 2007 18:41 UTC (Wed) by k8to (guest, #15413) [Link]

It would be really useful if someone familiar with the various tools could comment on the tone mapping in pfstmo versus other available tools. Even proprietary ones.

For example the default Adobe Photoshop tool seems to produce very fakey-looking (painting-like) results almost all the time, and is not prone to creating "realistic photo" type images. Photomatix is much further in this direction with almost completely automated creation of completely silly results.

One semi-pro photographer I know (not a nonsense term, he is actively retained and paid for his work, sometimes) has developed a happy relationship with FDRtools, although I haven't grilled him on the details.

Personally I would like to know if recommending windows and mac based photographers look at pfstools is a good idea.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 14, 2007 22:21 UTC (Wed) by dhess (guest, #7827) [Link]

For what it's worth, pfstools is sponsored by the Max Planck Institute, who've done lots of HDRI-related research and presented many times at SIGGRAPH and Eurographics. pfstmo, specifically, is maintained by Grzegorz Krawczyk of MPI. The pfstmo homepage explains exactly which tone-mapping algorithms it implements, along with a convenient URL that provides more information about each technique.

If you'd like to get a feel for what pfstmo's operators can produce, there's a comparative gallery here:

http://www.mpi-inf.mpg.de/resources/tmo/NewExperiment/TmoOverview.html

As far as I can tell, all of the operators that produced those images are available in pfstmo, except for Greg Ward's, which is probably available in the Radiance package (free for non-commercial use), if not in pfstmo. The gallery also links to a short paper which describes the parameters used to produce the images.

As for recommending pfstools to photographers, I guess that depends on whether they're comfortable with command-line tools and possibly building their own packages (pfstools is not available in MacPorts, for example). If they are, then the availability of high-quality gratis tools is always a good thing, right?

Kudos to MPI for making both their HDR research and their code freely available, by the way. It's a great set of resources for people interested in HDRI.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 14, 2007 22:28 UTC (Wed) by dhess (guest, #7827) [Link]

There's another (incomplete) gallery comparing various TMOs, with links to more info for some of them, here:
http://www.cgg.cvut.cz/~cadikm/tmo/
It looks like a work in progress, but it already has some useful information.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 15, 2007 5:06 UTC (Thu) by k8to (guest, #15413) [Link]

Thanks for the pointers. Probably as close as I'll get without doing the experimentation.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 14, 2007 18:41 UTC (Wed) by tjc (guest, #137) [Link]

Ideally, these images are taken with a tripod-mounted camera and cover a range of at least two f-stops above and below the nominally "correct" exposure.
This seems like a good application for autobracketing: http://en.wikipedia.org/wiki/Autobracketing

Once upon a time I had a Canon EOS that did this, although at the time it didn't seem to be very useful.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 14, 2007 18:47 UTC (Wed) by k8to (guest, #15413) [Link]

Eventually, I suspect camera makers will figure out how to take special HDR digital photos, where the CMOS activation numbers are simply allowed to spread across a very wide range over time. Of course the processing is the bigger hassle, but it may allow capturing such images with a lower net exposure time.

Autobracketing

Posted Mar 14, 2007 18:48 UTC (Wed) by corbet (editor, #1) [Link]

My camera (a Sony DSC-V3) will do three-exposure bracketing. For HDR, though, I've found that it helps to have a wider range of exposures than the camera provides.

What would be nice is if the camera could do the entire HDR technique internally, giving a raw file with the full range.

Autobracketing

Posted Mar 15, 2007 0:08 UTC (Thu) by dhess (guest, #7827) [Link]

My camera (a Sony DSC-V3) will do three-exposure bracketing. For HDR, though, I've found that it helps to have a wider range of exposures than the camera provides.

I'm no expert, but here's what I do for HDR shooting with a Nikon D200.

The D200 has an exposure autobracketing feature which supports, among other, less useful things, 3-, 5-, 7- or 9-exposure brackets. In this mode, the largest exposure increment that the camera provides is +/- 1 stop, so you can capture at most 9 stops of range in a single exposure autobracket. When I want to capture an HDR sequence, I use the feature to take either 5 or 9 exposures with a +/- 1 stop exposure increment. In my experience, 3 exposures at -1, 0 and +1EV are too few to get good HDR results.

I use 5 exposures for quick-and-dirty shots in daylight when the subject matter isn't worth breaking out a tripod, but there's some nice light, and the final image will benefit from a little extra dynamic range. I take these shots by hand, but I've learned to remain pretty still, and the results are usually decent. If there's too much shake, I've still got at least 3 exposures to choose from as an LDR image (sometimes the +1 and/or +2 exposures are too blurry to be considered). This approach doesn't work too well for nighttime shooting :)

I use 9 exposures for "money shots." I almost always use a tripod for these sequences. Sometimes, e.g., for strong daylight scenes or well-lit nighttime scenes, I'd prefer to go beyond +/- 4EV, but that's as far as I can go with the D200's autobracketing feature. I'm not patient enough to shoot, adjust the exposure manually and repeat, nor am I skilled enough to keep the camera from moving while I'm adjusting the controls, even on a tripod. I recently bought a Nikon MC-36 remote control for the D200 under the assumption that I could adjust the exposure from the remote, but it doesn't work in a way that's useful for HDR shooting: for some bizarre reason, the MC-36's exposure time resolution is in whole seconds! I guess I could try building my own. I haven't done any research about the 10-pin remote connector on the D200, but I'd be surprised if Nikon publishes any information about the protocol. Maybe it's time to start a GNU movement for digital cameras.

What would be nice is if the camera could do the entire HDR technique internally, giving a raw file with the full range.

Don't hold your breath. The OpenEXR team has tried on numerous occasions to engage various camera vendors about supporting HDRI, but each time we've gotten nowhere. We're small and pretty niche, so we're easy to ignore -- no real surprise there -- but I've heard that Adobe and even Microsoft have trouble penetrating the monolithic corporate giants that are the major camera manufacturers. It's hard to know where to begin talking to them about the idea.

I think it's going to be a slow road. With the exception of Foveon and their X3 CMOS device, which hasn't gained any traction in the market, nobody appears to care about LDR color quality, let alone HDR. Unfortunately, the current trend amongst camera manufacturers is completely focused on more megapixels. More pixels means more heat and longer read-out times. More heat means more noise, i.e., lower color fidelity, especially with long exposures. Longer read-out times means fewer exposures in a given amount of time, which increases the chances of camera shake or moving subject material. We're going the wrong direction.

But articles like this one help by getting the word out! I hope that most people, once introduced to HDR, would agree that it makes a bigger difference in photo quality than more pixels, in today's post-10 megapixel world, anyway. Getting "pro-sumers" to clamor for it is the fastest way to getting HDR support in our cameras.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 14, 2007 18:54 UTC (Wed) by kolloid (guest, #25282) [Link]

Sorry, but I hate HDR images. They look so artificial, like bad screenshots from a 3D game.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 14, 2007 19:58 UTC (Wed) by jwb (guest, #15467) [Link]

Many HDR images look stupid, but that is true of nearly any photograph. HDR is a good tool in the hands of a good photographer.

Randomly selected example:

http://flickr.com/photos/darrenstone/419760365/

The Grumpy Editor's guide to HDR with Linux

Posted Mar 15, 2007 5:13 UTC (Thu) by k8to (guest, #15413) [Link]

Hmm, I think even that one goes a bit over the taste line for me. There's something about the wide angle composition and bright flat cloud line that says "texture mapped" about the clouds to me.

I think it will take a while for people to establish a taste baseline in HDR. I really like the results my friend Dan produces, but they're not at all consistent. That is, he uses HDR in different ways for different purposes at different times. They're sometimes quite manipulated, and sometimes quite naturalistic, but I never get that "what videogame is this from" feeling.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 14, 2007 20:40 UTC (Wed) by tjc (guest, #137) [Link]

Sorry, but I hate HDR images. They look so artificial.
They are somewhat artificial, but ordinary photographs are also a poor representation of the physical world. It's just that we've been looking at them for so long that they seem normal.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 14, 2007 23:18 UTC (Wed) by allesfresser (guest, #216) [Link]

That's exactly correct. HDR has the potential (and this is why I want to try it) to enable the output to resemble what the human eye sees in a scene, since the eye has a far greater contrast ratio than any camera. But as you said, people tend not to notice this, and prefer photos which are comfortably within the limitations of the popular cameras of the day.

Pro photographers have been doing this sort of manipulation for years--it's called dodging and burning. Ansel Adams was an expert at it; that's why his images look so dramatic (and "artificial", sometimes, as the poster said above) because he intentionally tried to make the print look as *he saw* the scene, rather than what the camera captured. I love his famous line about the negative being the "orchestral score" of the photograph, and the print being the actual "live performance"--one comes from the other, but the life is breathed into the score by the performing artist. (Adams was also a trained classical pianist, btw.) So it is with the negative and the print; the camera's action of capturing light and affecting film or sensor is only the beginning of the piece of art, not the end. I get the feeling that Adams would have just adored the tools we have these days--so much easier than all that nasty mucking about with chemicals in the darkroom. :)

The Grumpy Editor's guide to HDR with Linux

Posted Mar 15, 2007 14:48 UTC (Thu) by tjc (guest, #137) [Link]

Pro photographers have been doing this sort of manipulation for years--it's called dodging and burning.
I've done a lot of that! Unlike our editor, I had a 5-year career in the photo industry before I made it to engineering school.

In some cases it is similar to HDR, but it's tricky because everything happens in realtime, and it's hard to get the same results on multiple exposures.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 15, 2007 13:19 UTC (Thu) by jond (subscriber, #37669) [Link]

I agree with you, for most of the HDR stuff. I was quite amazed when I first saw it, but that wore off quickly. I think that, when applied in a way that you don't immediately think "this was done with HDR", it can be very effective: in the same way you don't notice the best CGI.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 14, 2007 19:20 UTC (Wed) by orospakr (guest, #40684) [Link]

So *that's* how professional photographers make their photos look like that.

I have always wondered what the secret was. :D

professionals

Posted Mar 15, 2007 19:18 UTC (Thu) by qu1j0t3 (guest, #25786) [Link]

Uh no, real professionals shoot with film. (Flames to /dev/null :)

I don't see the point of this article. Is it to make mediocre digital shots look worse? There's a reason why highlights are blown out. Tone mapping is a Flickr freakshow, as other commenters also opine. This article would have been better devoted to: 1) buying a good lightmeter and learning how to use it; 2) photographic exposure from first principles through to Ansel Adams and what HDR is supposed to be used for, also see Debevec's original '90s research. Then you can write an article like this without looking foolish.

The Grumpy Editor should, in this Grumpy Commenter's opinion, avoid photographic topics in future.

professionals

Posted Mar 18, 2007 20:19 UTC (Sun) by ekj (guest, #1524) [Link]

There's a reason why highlights are blown out.

Sure. Because film and CMOS sensors share a weakness: they are incapable of capturing a high dynamic range.

Your eye can easily look at the face of your loved one, standing in a shadowed room, in front of a beautiful sunlit panorama, and enjoy the entire scene.

No camera, film or digital, can capture details *both* in the shadowed face of a person *and* on the sunlit snow-covered mountains outside.

So yes, there's a reason. What is your solution?

professionals

Posted Apr 7, 2007 2:59 UTC (Sat) by ringerc (subscriber, #3071) [Link]

In my view the real solution goes way beyond sensors.

We need devices that can display decent dynamic range, and wider adoption of file formats whose colour representation is a better match for human vision (e.g. exponential colour). Without that, we're going to have a very hard time producing, working with, and displaying true HDR images (as opposed to the flattened-to-low-DR images in the article).

The Grumpy Editor's guide to HDR with Linux

Posted Mar 14, 2007 20:13 UTC (Wed) by dhess (guest, #7827) [Link]

The OpenEXR package comes with the libraries needed by other applications and the deeply painful "exrdisplay" image viewer.

I'm one of the OpenEXR developers. I have to apologize for exrdisplay. Yes, it's crap. On the other hand, we never intended it to be the canonical method for displaying OpenEXR images. Its main purpose is to provide source code for a "real" application which demonstrates how to use the OpenEXR libraries.

I've been waiting for someone to write a better viewer for years, but so far nobody's been up for it -- not a free software version, anyway.

While I'm on the subject, another thing that OpenEXR needs is a browser plugin. It would be really cool to upload OpenEXR images to Flickr (or your own Gallery installation, or whatever) and dial the tonemapping to taste. Any volunteers?
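[The "dial the tonemapping to taste" idea can be sketched with a global operator. The snippet below applies Reinhard's simple L/(1+L) curve to synthetic luminance values; this is illustrative numpy only, not OpenEXR's API, and the `exposure` knob is the hypothetical "dial":]

```python
import numpy as np

def reinhard_tonemap(hdr, exposure=1.0):
    """Map unbounded HDR luminance into [0, 1) with Reinhard's
    global operator L/(1+L); 'exposure' is the dial-to-taste knob."""
    scaled = hdr * exposure
    return scaled / (1.0 + scaled)

# Synthetic HDR luminances spanning four orders of magnitude.
hdr = np.array([0.01, 0.1, 1.0, 10.0, 100.0])
print(reinhard_tonemap(hdr))                # every value lands in [0, 1)
print(reinhard_tonemap(hdr, exposure=4.0))  # a brighter rendering of the same data
```

A viewer or browser plugin would re-run something like this each time the user moves the slider, since the full-range data is preserved in the file.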

hugin and panotools

Posted Mar 14, 2007 20:34 UTC (Wed) by ehovland (subscriber, #2284) [Link]

Hugin would probably make a good section's worth of articles. The Hugin toolchain includes autopano-sift, panotools, and enblend. Autopano-sift is probably one of the most useful applications written in Mono/C#. Panotools is a suite of code that, because it is GPL'ed, has been able to live on in spite of software patents. Hugin and enblend have been able to thrive because the developers are panorama nuts themselves and seriously need to scratch the itch.

hugin and panotools

Posted Mar 15, 2007 8:29 UTC (Thu) by mcfrisk (guest, #40131) [Link]

Yes, hugin and others are worthy of a number of articles to show just a few basic usage patterns. Here's one I scratched together, sport sequence panoramas:

http://kapsi.fi/~mcfrisk/skiing/kuvia/2006/isosyote/

-Mikko

hugin-addict

Posted Mar 15, 2007 8:37 UTC (Thu) by bkoz (guest, #4027) [Link]

I now find myself taking 25%-overlap pictures all the time. HELP!!!

;)

The Grumpy Editor's guide to HDR with Linux

Posted Mar 15, 2007 9:05 UTC (Thu) by stevan (guest, #4342) [Link]

Thanks for the article, and for a suggestion to expand a hobby which, like
yours, has recently been revived by digital options. I find Free tools for
digital photography more than adequate for my needs, and my wife, recently
recovered from her Mac devotion to become a Kubuntu user, finds digikam
and other tools work better for her than iPhoto. I also find that photo
magazines assume the "only" tool for working with images is Photoshop,
with virtually no mention of Free tools at all. How do we educate them?

S

The Grumpy Editor's guide to HDR with Linux

Posted Mar 15, 2007 9:55 UTC (Thu) by modernjazz (guest, #4185) [Link]

> It can work well for relatively static scenes: landscapes,
> buildings, the SCO case,

LOL!

The Grumpy Editor's guide to HDR with Linux

Posted Mar 15, 2007 12:21 UTC (Thu) by MortenSickel (subscriber, #3238) [Link]

Nearly killed me as well!

great little gem!

M.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 16, 2007 10:18 UTC (Fri) by sitaram (guest, #5959) [Link]

Not being familiar with baseball, I can't make out if the comment on baseball in the previous paragraph was also tongue-in-cheek.

But it definitely carries the stamp of "our editor", bless him!

:-)

Baseball is a little bit like cricket.

Posted Mar 17, 2007 13:49 UTC (Sat) by xoddam (subscriber, #2322) [Link]

> not being familiar with baseball, I can't make out if the comment on
> baseball in the previous paragraph to this one was also tongue-in-cheek

Baseball is to baseball-playing countries what cricket is to cricket-playing countries. It is marginally faster (games tend to start and end on the same day), but there is a comparable amount of careful rearrangement of fielders between balls.

If cricket is also unfamiliar, you are blessed.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 17, 2007 15:59 UTC (Sat) by hingo (guest, #14792) [Link]

There is a Finnish flavor of baseball (pesäpallo, http://en.wikipedia.org/wiki/Pes%C3%A4pallo) which is the number-two summer sport in Finland; I enjoy playing and watching it occasionally myself. With this background, when I visited the US for the first time seven years ago, I was excited to see some of your baseball too. Luckily for me, there was a huge game on TV that night, NY vs Boston or something like that.

Apparently this was a great game, too. The commentators were wild about the fact that the pitcher had thrown so many balls without a single batter hitting once, and that he was about to break his personal best. Halfway through the game I fell asleep. The next day I read in the papers that the game had ended 1-0. (OK, so jet lag may have played a part too.) I must say, it did not quite live up to my expectations :-)

The next day my Finnish friend confirmed: yup, this was a typical game, and he has no idea why Americans are so excited about it.

[OT] baseball

Posted Mar 17, 2007 19:47 UTC (Sat) by roelofs (guest, #2599) [Link]

Next day my Finnish friend confirmed: Yup, this was a typical game and he has no idea why Americans are so excited about it.

Some of us Americans have no idea, either. ;-)

But it is pretty fun to play the game. There's nothing quite like nailing a line drive into the gap in left field...or, conversely, being the outfielder who snags it out of that gap (on the run) to save the game.

Greg

The Grumpy Editor's guide to HDR with Linux

Posted Mar 15, 2007 11:15 UTC (Thu) by scarabaeus (guest, #7142) [Link]

With regard to making cameras create HDR images: It would be interesting to see a camera that "inverts" the way pictures are taken. Instead of this way:
  • Expose the sensor to light for n milliseconds, then measure the amount of light
..the camera ought to work this way:
  • Expose the sensor to light. For each pixel, measure the time in milliseconds until the pixel is fully saturated
I'm aware that a sensor like this would be, er, "not straightforward" to build. Still: no blooming, automatic HDR, and exposure could be corrected later without quality loss... *sigh* (The above wouldn't really work as stated; you need an upper bound on the exposure time, so you have to measure both the time and the saturation => complex per-pixel logic => unacceptable resolution :-( )
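[The commenter's idea can be sketched numerically. The well depth and time bound below are made-up numbers; the point is that the recorded time is the reciprocal of brightness, and that pixels too dim to saturate within the bounded exposure window pile up against the bound:]

```python
# A sketch of the "inverted" sensor: each pixel reports the time it takes
# to saturate, which is inversely proportional to brightness.
# WELL_DEPTH and T_MAX are made-up numbers for illustration.

WELL_DEPTH = 1000.0   # photons needed to saturate a pixel
T_MAX = 100.0         # longest time the counter can record, in ms

def time_to_fill(photons_per_ms):
    """Milliseconds until saturation, clipped at the exposure bound."""
    if photons_per_ms <= 0:
        return T_MAX                       # never fills: report the bound
    return min(WELL_DEPTH / photons_per_ms, T_MAX)

def brightness(fill_time_ms):
    """Recover brightness from the recorded time (the inversion)."""
    return WELL_DEPTH / fill_time_ms

for rate in (1.0, 10.0, 500.0):
    t = time_to_fill(rate)
    print(rate, t, brightness(t))
# The rate-1.0 pixel reads back as 10: squashed against the bound,
# which is exactly the dark-subject problem raised below.
```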

The Grumpy Editor's guide to HDR with Linux

Posted Mar 15, 2007 15:08 UTC (Thu) by smurf (subscriber, #17840) [Link]

That method would also give you rather interesting effects when you photograph moving objects -- motion blur would be smaller for the brighter parts.

Still, at four transistors per bit, a minimum of 12 bits per pixel, and three megapixels, that's a whole damn lot of transistors. :-/

(That's the minimum level I'd personally like to have if I wanted to experiment with that kind of technology. 12 bits are sufficient, by the way, because nobody forces you to keep the clock rate constant.)
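[The commenter's back-of-the-envelope count works out like this:]

```python
# Four transistors per bit of counter, twelve bits per pixel,
# three megapixels -- just for the per-pixel counters.
transistors = 4 * 12 * 3_000_000
print(transistors)   # 144,000,000 transistors
```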

The Grumpy Editor's guide to HDR with Linux

Posted Mar 16, 2007 8:22 UTC (Fri) by wstfgl (guest, #42907) [Link]

> Expose the sensor to light. For each pixel, measure the time in
> milliseconds until the pixel is fully saturated

There's quite a bit of research currently in progress to develop cameras
that do just that (Google for "Address Event Representation Camera") - I
remember a couple of my engineering friends did their honours thesis on
the topic.

AFAIK, the technology is better than regular CMOS imagers in terms of
power consumption, but tends to have a lower resolution and dynamic
range - good for embedded devices, not so good for photography. Also since
the time available for each frame needs to be bounded, you tend to get a
lot of pixels 'squashed' against either the upper or lower bound of the
intensity range.

The Grumpy Editor's guide to HDR with Linux

Posted Jul 8, 2007 3:20 UTC (Sun) by ralatalo (guest, #46131) [Link]

Well... two big problems:

1) How do you turn these numbers into a picture? Essentially you are still measuring the brightness of each pixel, so you have the following:

Light level | Photons in exposure | Milliseconds to fill
          1 |                   1 |                  100
          2 |                   2 |                   50
          3 |                   3 |                   33
          4 |                   4 |                   25
          5 |                   5 |                   20
        ...
         10 |                  10 |                   10
         20 |                  20 |                    5
         30 |                  30 |                    3 (really 3.3)
         40 |                  40 |                    2 (really 2.5)
         50 |                  50 |                    2
        ...
        100 |                 100 |                    1

So all you are really doing is inverting the brightness, and you still have the same problems: how do you record a subject bright enough that it fills the well in less than your measurable time, and how do you deal with a subject so dark that it takes more time than you can record in your counter?

JPEG and other formats use 8 bits to record brightness, which gives 256 different levels, so no matter what else you do you will have trouble when you try to record the 257th level. Nikon's raw format uses 11 bits, and I assume other raw formats use more than 8 as well. HDR formats use 16 or even 32 bits, which lets them record a much wider range. Camera manufacturers could work on (and are working on) sensors which require less light, which means they will be able to measure more at the lower end and in turn record a wider range.
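[The 257th-level problem can be made concrete with a hypothetical quantizer for a linear encoding; the function below is purely illustrative:]

```python
def quantize(level, bits):
    """Clamp a linear light level into the integer code range of 'bits'."""
    max_code = 2 ** bits - 1          # 255 for JPEG's 8 bits
    return min(int(level), max_code)

print(quantize(256, 8))    # the "257th level" clips to 255
print(quantize(256, 11))   # an 11-bit raw file still has headroom
```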

The second big problem is that your scheme imposes its own exposure timing on pictures. There would be no more fast shutter speeds to freeze action, or slow shutter speeds to blur movement.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 15, 2007 13:56 UTC (Thu) by louie (guest, #3285) [Link]

Thankyouthankyouthankyou, grumpy editor! I've been wanting to do this for a while but not had the time to investigate doing it under Linux. I really look forward to playing with this to get some sunrise pictures of my local cathedral.

Some folks may also find the discussion on HDR with open source tools @ flickr to be useful. It focuses on pfstools.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 15, 2007 20:04 UTC (Thu) by cjb (guest, #40354) [Link]

Here's an example of doing HDR in Gimp, with no other tools:

http://madprime.org/articles/2006/08/06/hdr-photos

- Chris.

The Grumpy Editor's guide to HDR with Linux

Posted Mar 17, 2007 12:28 UTC (Sat) by sstein (guest, #15028) [Link]

> Here's an example of doing HDR in Gimp, with no other tools:
>
> http://madprime.org/articles/2006/08/06/hdr-photos

This somehow works, but the results cannot be compared to what is possible using cinepaint -> qpfstmo. There is a Gimp plugin called "Exposure Blend" (http://turtle.as.arizona.edu/jdsmith/exposure_blend.php) to automate what is described in the article you cited. However, it seems that the Gimp approach just produces a slightly lighter image compared to the normal image taken by the camera.

Sebastian
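[The basic exposure-blend idea (weight each bracketed frame by how well-exposed each pixel is, then average) can be sketched like this. The Gaussian weighting around mid-grey is a toy choice, not the plugin's actual algorithm:]

```python
import numpy as np

def blend(frames):
    """Toy exposure blend: per-pixel weighted average of bracketed
    frames, favouring pixels close to mid-grey (0.5)."""
    frames = np.stack(frames)                        # (n, H, W), values in 0..1
    weights = np.exp(-((frames - 0.5) ** 2) / 0.08)  # well-exposedness weight
    weights /= weights.sum(axis=0)                   # normalise across frames
    return (weights * frames).sum(axis=0)

dark  = np.array([[0.05, 0.40]])  # underexposed frame keeps highlight detail
light = np.array([[0.50, 0.95]])  # overexposed frame keeps shadow detail
print(blend([dark, light]))       # each pixel pulled toward its better exposure
```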

The Grumpy Editor's guide to HDR with Linux

Posted Mar 16, 2007 16:38 UTC (Fri) by vondo (guest, #256) [Link]

While I'm always interested in seeing these digital manipulation techniques, there are two very simple ways to get shots with darker, dramatic skies without any post-processing:

1) Use a circular polarizer
2) Use a graduated neutral density filter

Both are done in the field and have got to take less time than this HDR stuff.

Cinepaint HDR plugin

Posted Mar 17, 2007 13:49 UTC (Sat) by csamuel (✭ supporter ✭, #2624) [Link]

The build of Cinepaint in Ubuntu Edgy (6.10) has the HDR plugin included,
for what it's worth.

Just remember to make sure you've enabled the universe repository *and*
its updates (to get the installable version).

The Grumpy Editor's guide to HDR with Linux

Posted Jul 23, 2009 0:25 UTC (Thu) by DomWatson (guest, #59744) [Link]

Great info, thanks - really useful. On Ubuntu (Jaunty), I found that I just needed to install qtpfsgui (it was in the repos), and this covered the HDR and tonemap creation; I guess the project has moved on since this article was written.

To all the 'hdr looks fake' comments: I strongly disagree. It is easy to produce horrid-looking HDR images, and its abuse gives it a bad name - just as early (and even modern) use of green screens in movies was often horribly done.

Here's an example photo I edited using qtpfsgui and then gedit for further tweaking. I don't think it's a stunning photo, but I do think it shows a subtle use of HDR:

http://www.panoramio.com/photo/24776040

Made from a single shot in RAW format.


Copyright © 2007, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds