* Unfortunately this post will provide more questions than answers, but they need to be asked
On Sunday I went to the beach without intending to shoot, but it was such an absolutely perfect day, with blue skies, white puffy clouds, and intense emerald-colored water, that I knew I just had to shoot even if it was mid-day. It was that pretty.
Normally I wouldn’t shoot mid-day, for all the reasons we photographers know. But I could see with my own eyes how beautiful it was, and it was surely something my camera could easily capture. To cut down on the glare on the beautiful emerald water, I put a B+W circular polarizer on my 17-40 lens to keep the sun’s glare on the water out of the shot, which also helps tame the dynamic range by eliminating those specular highlights. One look through the viewfinder and I saw pure magic. The scene in front of my camera was gorgeous.
Normally I don’t go right home and download my images; that may wait until late that night, or the next day. But I had the time, and I was anxious to review images that had looked so beautiful through the viewfinder.
So I downloaded and opened the images and… What? Are you @#$%ing kidding me? That’s what I got?! REALLY?!
I mean, it’s OK, it’s visually and compositionally interesting… but that’s NOT what I saw.
The clouds are dingy and dull, and the sky is a gray/blue mix. Where is that emerald water? Why do the rocks look almost monotone and desaturated? I hate it. I would never show this shot to anyone.
So what DID the scene look like? Well… this:
Now, on this day I didn’t set up to shoot HDRs. I assessed the conditions and saw no need to, and if there isn’t a need, I won’t. Was it a full-dynamic-range scene? Absolutely. But was it beyond what a single exposure should have been able to capture? Absolutely not. So I shot one single exposure of the scene. How, then, did I end up with the second shot above? Using a little-talked-about feature of Photomatix Pro: single-exposure pseudo-HDR with tone mapping. In the past I haven’t been that fond of pseudo-HDRs, whether made by combining different RAW conversions of one file or by any of the pseudo-HDR programs out there (Topaz, Lucis, etc.). I believed that if you wanted to do it, you should do it right and take the bracketed exposures.
But in this case I didn’t want to increase the dynamic range at all. The camera captured it fine, as the histogram confirms. The exposure is fine too, within less than 1/3 of a stop of perfect. It’s not the range from bright to shadow that is wrong; it’s what’s in between. By tone mapping the single image in Photomatix Pro I was able to correct for what the camera just could not get right. But WHY do I have to?
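For readers curious what “tone mapping without changing the range” can mean in practice, here is a minimal sketch in Python with NumPy. The curve is a simple smoothstep S-curve of my own choosing, NOT Photomatix’s actual (proprietary) algorithm; it just illustrates the idea that the black point, white point, and middle gray can stay exactly where they were while the contrast among the midtones changes.

```python
import numpy as np

def midtone_contrast(img):
    """Smoothstep S-curve: 0, 0.5 and 1 map to themselves, but contrast
    between the midtones increases. A crude stand-in for tone mapping
    that reshapes what's in between without widening the range."""
    x = np.clip(img, 0.0, 1.0)
    return 3 * x**2 - 2 * x**3

# Five sample tones from a normalized (0..1) image
flat = np.array([0.0, 0.3, 0.5, 0.7, 1.0])
mapped = midtone_contrast(flat)
print(mapped)  # endpoints and middle gray unchanged; 0.3 darkens, 0.7 lightens
```

The histogram’s two ends stay put, which matches what I saw in my files: same range, different “in between.”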
This is something I have seen for a good portion of my 40-plus years of shooting. In fact, in the mid-’90s I stopped shooting for a full year because I could never capture with my camera what I saw. But by the end of that year I realized what the problem had become: the popularity of one-hour photo labs. With their automated processing, no real person looked at the images to determine the correct exposure for the print, and that automation broke the link between what was shot and what the final print looked like. This became evident to me when I shot a scene of Christmas lights right at dusk and the prints I got back looked like I had shot it mid-day. Was I that far off? Fortunately for me, the lab that developed them printed on the back of each print the adjustments they had made; the light bulb went on when I saw the huge amount of exposure compensation they had applied. I had a perfect exposure. They messed it all up. So I returned to shooting.
But even with that, and even with better labs staffed by more hands-on, quality technicians, there were still problems. I still often saw beautiful things through the viewfinder and could never get them on film. I had, and still have, high hopes for digital. Eliminating the person at the photo lab who knew nothing about the shot was hugely freeing for me: I could make sure that what I wanted was what got printed. But still, what I shot was not what I got straight out of the camera. And I blame the engineers who designed digital cameras, because I believe they used film, and not our eyes, as the model for how a digital sensor should respond.
Now, I do realize that the above statement is not wholly true. We know that part of sensor design was based on the fact that the human eye is more sensitive to green than to red or blue; that is why there are twice as many green photosites on a sensor as there are red or blue ones. I also fully realize that no camera sensor, and no film, handles as much dynamic range as the human eye. I get that; I know that. But there is something inherently wrong in between, in the way our eyes place brightness and color values on everything in the middle of the range. Is it because we try to make sensors linear when our eyes are anything but?
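Both of those points can be sketched in a few lines of Python with NumPy. The Bayer tile below and the cube-root curve are my own illustrative stand-ins: the RGGB layout really does carry two green photosites per red or blue one, and the cube root is only a rough approximation of the shape behind perceived lightness (CIE lightness is close to a cube root of linear light), not an exact eye model.

```python
import numpy as np

# Bayer layout: each 2x2 RGGB block has two green photosites for every
# red or blue one, mirroring the eye's greater sensitivity to green.
tile = np.array([["R", "G"],
                 ["G", "B"]])
mosaic = np.tile(tile, (2, 3))  # a toy 4x6 "sensor"
green_fraction = np.count_nonzero(mosaic == "G") / mosaic.size

# Linearity: a sensor counts light roughly linearly, while perceived
# lightness follows something closer to a cube root. A 20% linear
# signal already looks more than half as bright to us.
perceived = float(np.cbrt(0.2))
print(green_fraction, round(perceived, 2))  # 0.5 and ~0.58
```

So half the photosites are green, and the eye’s response to the middle of the range is dramatically non-linear, which is exactly where my files fall apart.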
I don’t know the answers to these questions. I’m neither a scientist nor an engineer. I don’t know much about how our eyes physically work, nor do I know a lot about optical design, sensor science, or film science. I just know something isn’t right.
As I sat on my couch this morning contemplating this article, I looked at the morning light coming into my house, lighting up my family room and my living room. My camera was beside me, as was my laptop. Could I capture an image at just that moment, immediately load it onto my calibrated laptop, and have it look like what was before my eyes?
Here, let’s find out.
Was this what I saw?
No. There is barely any color in the walls, and they are a sandstone tan.
The red artwork on the far wall is dark and doesn’t look anything like what I am looking at.
Is there a red pillow on that couch? I clearly see it with my eyes. Where did it go?
To my eyes, the sun is really bringing out the color of the wood in the TV stand. In the shot it looks almost entirely black except for the foot.
Where did everything I see right now go? There’s nothing wrong with the exposure; the dynamic range was again captured, except for some slight blowout on the floor molding from the sunlight. There was plenty of light coming into the room. Why is it NOT what I see?
And yet, after a quick (and I mean very quick) run through Photomatix tone mapping, I got this:
This looks EXACTLY like what I see before me. Why does my camera get this so wrong, yet Photomatix gets it so right? And again, I don’t think it’s a dynamic-range problem. The histograms for these two images are very similar at each end of the spectrum; it’s what’s in between that is handled differently.
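That “same ends, different middle” claim is easy to demonstrate numerically. Here is a toy Python/NumPy comparison; the gamma lift standing in for the tone map and the synthetic random “exposure” are my own assumptions, not my actual files or Photomatix’s math.

```python
import numpy as np

# A synthetic flat-looking "capture" and a midtone-lifted version of it.
rng = np.random.default_rng(0)
capture = rng.random(10_000)     # uniform 0..1, stands in for the dull file
tone_mapped = capture ** 0.6     # lifts midtones; endpoints stay put

# Compare 10-bin histograms: the outermost bins cover the same range,
# but the counts in the middle bins shift noticeably.
hist_capture, _ = np.histogram(capture, bins=10, range=(0.0, 1.0))
hist_mapped, _ = np.histogram(tone_mapped, bins=10, range=(0.0, 1.0))
print("capture:    ", hist_capture)
print("tone-mapped:", hist_mapped)
```

Both signals still run from (nearly) 0 to (nearly) 1, so neither has more dynamic range; the pixels have simply been redistributed in the middle, which is all I’m asking my camera to get right.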
So, like I said in the beginning, this post doesn’t provide many answers, if any; it is mostly questions. The only thing I know is that I WILL use single-image processing in Photomatix much more often. If what I got just isn’t what was there, at least I have some recourse for the end result.