
Pinholes

The most hyped solar eclipse of the century passed over the U.S. mainland today. I viewed it through a homemade pinhole camera. Pinhole cameras, made of nothing but a hole, a length of empty space, and an imaging surface, have only two adjustable parameters: the diameter of the hole, and the distance between the hole and the surface, which acts as a focal length of sorts. How does tuning these two parameters affect the image?

Image Size

The angle subtended by the sun is about 0.5 degrees in the sky.  It’s simple to see that the size of the image of the sun (L) does not depend on the diameter of the pinhole, but rather only on the distance f in the image below:

Figure 1. Pinhole geometry

For a focal length of 1 m, L should be about 9 mm.
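Spelled out with the small-angle approximation, using the 0.5-degree angular size and f = 1 m from above:

    L \approx f \tan\theta \approx f\,\theta = (1\ \mathrm{m}) \times \left( 0.5^{\circ} \times \frac{\pi}{180^{\circ}} \right) \approx 8.7\ \mathrm{mm}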

Angular Resolution

A more interesting question is how the image resolution might be affected by tuning these parameters.  It might seem that a smaller pinhole should create a sharper image (Figure 2), but there is a limit.

Figure 2. Ray optics picture. A larger aperture yields a blurrier image, since it allows rays from different parts of the object to be mapped onto the same spot on the image.

The smallest spot that can be formed on the imaging plane is ultimately limited by diffraction. If the aperture is made too small, diffraction rapidly blurs the resulting image. That effect is illustrated below.

Figure 3. Diffraction introduces uncertainty in the measurement of the ray's initial direction.

The resolution of our system is determined by the more dominant of these two effects, so,

\theta_{\min} \approx \max\!\left( \frac{D}{f},\ \frac{1.22\,\lambda}{D} \right), \qquad \lambda = \text{wavelength of light}

which has the following dependence on D and f:

 

Figure 4a. Making the hole diameter too small is much more detrimental to image quality than making it too large.

Figure 4b. Resolution improves as the focal length increases.

A contour map showing the resolution as a function of D and f gives a clearer picture:

Figure 5. Contour map of resolution as a function of D and f. Blue is better angular discrimination. Plotted as log(theta in degrees).

Based on the calculations above, the pinhole should have a diameter of around one millimeter, and the focal length should be made as large as allowable.
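To put numbers on that, here is a small Python sketch of my own (not the code behind the plots above). It assumes green light at λ ≈ 550 nm, the Rayleigh factor of 1.22, and the max-of-the-two-effects model from the previous section:

    import numpy as np

    WAVELENGTH = 550e-9   # assumed: green light, in meters

    def blur_angle(D, f, wavelength=WAVELENGTH):
        """Angular blur in radians: the larger of the ray-optics and diffraction contributions."""
        geometric = D / f                    # ray-optics blur (Figure 2)
        diffraction = 1.22 * wavelength / D  # diffraction blur (Figure 3, Rayleigh criterion)
        return np.maximum(geometric, diffraction)

    def best_diameter(f, wavelength=WAVELENGTH):
        """Diameter at which the two contributions balance: D/f = 1.22*lambda/D."""
        return np.sqrt(1.22 * wavelength * f)

    f = 1.0  # focal length in meters (the moving box)
    D_opt = best_diameter(f)
    print(f"optimal diameter: {D_opt * 1e3:.2f} mm")                        # about 0.8 mm
    print(f"blur angle there: {np.degrees(blur_angle(D_opt, f)):.3f} deg")  # about 0.05 deg

For f = 1 m the balance point lands right around 0.8 mm, consistent with the millimeter-scale recommendation above.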

Eclipse Test

I punched three pinholes, approximately 0.8 mm, 0.2 mm, and 0.4 mm in diameter, in that order, into a sheet of aluminum foil, then light-sealed one of the moving boxes (f ~ 1 m) left over from our recent move. Two cut-outs for the eyes allowed me to look inside (and take photos).

 

The different light levels for the full sun images can be seen in the right panel.

As the eclipse started I whipped out a slightly better camera to document its progress.

Figure 6. Mirror image of the sun beginning its eclipse.

Zoomed-in series of images of the eclipse's progress in Boston, MA. The maximum coverage we had was around 65%.

 

 

There are several design improvements I would make in the next iteration. First, I underestimated the importance of being able to get very close to the projected image. I would need to place the eye holes closer, or baffle the box differently, so that the viewer can approach the imaging surface. Relatedly, since pinhole cameras suffer from minimal tilt distortion, I would tilt the focal plane so that the image can be viewed more head-on.

See you in 7 years.

 


Valencia

Last summer I started a series of Photoshop tutorials in which I reproduced Instagram filter effects in Adobe Photoshop. The idea was that if you want to master the most important tools in Photoshop, completing the exercises here would be enough to do it. At some point I ran out of steam. Or used Instagram less. Or took a break from photography. Or something. But I'm back!

Today: Valencia. Honestly, I only use about a third of all the Instagram filters, and this is probably my favorite. I don't like the ones that add texture, or that try to do too much: contrast, levels, saturation, and so on. I like this filter: it's soft, subtle, bright.

Here's the picture. It's my jasmine plant, and it just started blooming like crazy:


Here’s the Instagram version:


Here are the layers I used:


And here’s my version:


Commentary:

Yeah, the crop is different; I went away from the square crop. And my picture is slightly less yellow in the highlights. Nothing too fancy here: slight contrast enhancement, a bit of desaturation, and a screen layer, once again purple, but a bit more reddish this time. Set the shadows toward magenta, midtones toward green and blue, and highlights toward yellow. Added a yellow photo filter for the highlights. And a vignette, just for the hell of it.
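For anyone who would rather read that recipe as code than as a layers palette, here is a very rough numpy sketch of the steps above. Every number is a guess, the color-balance and photo-filter steps are left out, and the input is assumed to be a float RGB array in [0, 1]:

    import numpy as np

    def valencia_ish(img):
        """Loose approximation of the recipe above; all constants are guesses."""
        out = np.clip((img - 0.5) * 1.1 + 0.5, 0.0, 1.0)   # slight contrast enhancement
        gray = out.mean(axis=-1, keepdims=True)
        out = 0.85 * out + 0.15 * gray                      # a bit of desaturation
        purple = np.array([0.30, 0.15, 0.35])               # reddish purple
        out = 1.0 - (1.0 - out) * (1.0 - 0.25 * purple)     # screen layer at ~25% opacity
        h, w = out.shape[:2]
        ys, xs = np.mgrid[0:h, 0:w]
        r = np.hypot(xs / w - 0.5, ys / h - 0.5)
        vignette = 1.0 - 0.3 * np.clip(r / 0.7, 0.0, 1.0)   # darken toward the corners
        return np.clip(out * vignette[..., None], 0.0, 1.0)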

Cool. By the way, I started doing some commissioned work for the Graduate School at Harvard.

In Boston

They didn’t lock down Somerville. It’s one town away from where they set up the perimeter. So, although Suspect 2’s apartment was just to the south, in Inman Square, on the border of Somerville and Cambridge, here, half a mile north, people are walking their dogs. People are sitting on porches. The trash got picked up. Though about seven hours late.

Around 11pm last night, I got an alert about a potential situation at MIT. It was very similar to the alert about the gunman on campus a while back that put the school into lockdown mode. That time, it turned out to be a hoax. But pretty soon, this started looking like something different. "Shots fired," this alert reported. Then, "officer down." The next piece of information was a robbery at the 7-11 in Kendall Square. Gunman on the loose, the advisory said, has fled. Avoid the Stata Center. Avoid Vassar St. About an hour later, Harvard sent out its alerts. They were of a different tone, reflective of the feeling of distance and relative safety at the time, perhaps. An incident at MIT, injuries reported, avoid the area, it said, no threat to Harvard. At this point, it was still an isolated incident. Posts started appearing on Facebook about "Oy this week."

Then, a few minutes after midnight, word came that the officer died. The mood changed, then. Death, now that was something permanent. Members of the MIT community started changing their Facebook pictures to this:

Then, something like 15 police cars flew by my house, in a mad bevy, sirens and lights on. They were going south and west on Washington Ave. They were dispatching from the police station next door as well. Sirens continued.

All I thought I had to worry about this weekend was finishing my talk for the MIT group, and how to weave that in before (during?) and after my camping trip. The original weekend plan was this: I would leave for the MITOC trip to New Hampshire Saturday morning at 4am, so that I could see a concert Friday night at Symphony Hall. So Friday into Saturday, I wasn't expecting much sleep. I wanted to stock up on some sleep first, Thursday night. So I was thinking about going to bed when my friend writes me that there's some crazy stuff going down in Watertown.

Grenades. Explosives. A carjacked vehicle.

Finally, that’s where it started coming together. What carjacker carries around IEDs?

The rest is well-trodden territory. A 20 x 20 block area in Watertown has just now been fully combed by law enforcement and special forces, 15 hours later. One suspect is dead, apparently from gunshot wounds and a bomb vest. No sign of the escaped suspect, though. He has an automatic weapon, maybe grenades, but probably no bombs on him, except for what he's wearing. Around the time the police decided to hold their perimeter and wait until daylight to begin their search, I fell asleep on my open laptop.

My mom called me at 7:30am to alert me to the developing situation. I discovered my school's closing, and the multi-city lockdown, when I opened my email. Last night I slept two hours. Today I drank three cups of coffee.

Lovely day though. Looks like rain.

—–

View downtown from the castle on Prospect Hill (Somerville, MA) – 3pm

 
 

View towards Cambridge from the castle on Prospect Hill (Somerville, MA) – 3pm

A snowstorm in nauseating sepia

Some favorite things from the snowstorm of the past few days:

  1. My neighbors standing outside with their shovels in the blizzard chatting away. 8pm
  2. People wandering in the street like it was the apocalypse. 8pm
  3. But somehow The Independent was open. And packed. 8pm
  4. Hometown Hamden, CT putting everyone else away in the snow totals?! 9pm
  5. Night looking like the day. 11pm
  6. Sidewalk trenches. 11am
  7. Union Square street dance party and snowball fight. 3pm
  8. (Photos.)
  9. Snow tunnel? 3pm
  10. More post-apocalyptic street-wandering and on skis. 3pm
  11. Outside El Potro. 5pm

Don’t much miss California. Honest.

Amaro

I had some fun recreating the Amaro Instagram filter. This was my first time using the "screen" layer blending mode in a long time, and also the first time I've ever used a gradient mask on my curves/contrast adjustment layers, though that's an easy enough leap to make. In this post I'll talk a little more about the mechanics of Photoshop adjustment layers, and maybe about some of the choices that go into picking one filter over another for the same kind of basic effect.

This picture, taken yesterday between sporadic and alternating bouts of rain and sun, is the result of applying the Amaro filter to this original (cropped) iPhone photo:

First thing to notice is that Amaro is a cooling filter.  Though large portions of the image are allowed to be weakly yellow-tinted, it’s a yellow which is not added to the photo but rather what remains after the more vibrant green is subtracted.  Another thing you see immediately is that the center of the photo is very nearly overexposed, and highlights and midtones are flattened so that they are at almost the same brightness level.  This gives the photo a “washed-out” look which does not extend to the outer edges of the photo.  There, the opposite is true, the pixels are darker than in the original image.  This already gives us a lot to do:

The bottom Curves adjustment layer takes care of brightening and flattening the center of the image. It looks like this:

The layer mask (to the right of the Curves icon) is there to ensure that only the center region of the image is affected. The layer mask is a drawable layer, meaning you can edit it with the paint brush, paint bucket, and so on, and its pixel darkness values are interpreted as the degree to which the adjustment layer's parameters are applied to the pixels in the layers below. Black means completely masked (these pixels will not see this adjustment layer); white means completely unmasked (these pixels will be affected); shades of grey are in between, as you would expect. I used the radial gradient tool to draw a white-to-black mask.
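Here is a rough numpy sketch of that idea (not Photoshop's internals, just my own approximation of a brightening curve applied through a radial white-to-black mask):

    import numpy as np

    def radial_mask(h, w, cx=0.5, cy=0.5, radius=0.6):
        """White (1.0) at the center, fading to black (0.0) by `radius`."""
        ys, xs = np.mgrid[0:h, 0:w]
        r = np.hypot(xs / w - cx, ys / h - cy)
        return np.clip(1.0 - r / radius, 0.0, 1.0)

    def brighten_curve(img):
        """A simple brightening/flattening 'curve': gamma < 1 lifts midtones toward the highlights."""
        return img ** 0.7

    def apply_masked_adjustment(img, adjust, mask):
        """Blend the adjusted image back in, weighted per pixel by the mask (0 = masked, 1 = unmasked)."""
        m = mask[..., None]                 # broadcast the mask over the RGB channels
        return adjust(img) * m + img * (1.0 - m)

    # img: float RGB array in [0, 1]
    # result = apply_masked_adjustment(img, brighten_curve, radial_mask(*img.shape[:2]))

The second, darkening Curves layer described next is the same idea with the mask inverted (1.0 - mask).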

To take care of the darker pixels near the photo’s edge, I used a second Curves adjustment layer, this time with a more or less inverted layer mask (white outside; black inside).  The adjustment is an overall darken.

With the overall brightness adjustments across the RGB channels out of the way, I moved on to manipulating the image color. I started by guessing a dark bluish-purple and filling an entire drawing layer with it. This time, instead of exclusion, as I did in the X-Pro II manipulation, I chose the blending mode "screen". The "photo filter" tool actually produces a very similar effect to a screening layer, with some subtle differences. The "photo filter" tool mimics a physical filter placed over the lens of the camera: when photons from a scene pass through a physical filter, some are reflected or absorbed, and only photons of the "allowed" color pass through. That is to say, in our case, the Photoshop filter works by subtracting non-purple, not by adding purple. Photoshop lets you adjust the opacity of this filter, but no matter what you try, there is no way to add color to the darkest shadows this way, since those are regions with very little light to begin with. The screen blend works differently. It's like a mesh of color placed over the image, and the shadows actually end up brighter and tinted. Since this was the behavior I preferred, I used screen instead of the photo filter.
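A tiny numpy sketch of why screen won out for me. The "physical filter" is modeled here as a simple multiply, which is only a rough stand-in for the Photo Filter tool, and the purple value is made up:

    import numpy as np

    purple = np.array([0.35, 0.20, 0.45])     # made-up stand-in for the bluish-purple fill

    def screen(base, color):
        """Screen blend: invert, multiply, invert back. It never darkens a pixel."""
        return 1.0 - (1.0 - base) * (1.0 - color)

    def multiply_filter(base, color):
        """Multiply blend, roughly what a physical lens filter does: it only removes light."""
        return base * color

    shadow = np.array([0.02, 0.02, 0.02])     # a nearly black pixel
    print(screen(shadow, purple))             # brightened and tinted: ~[0.36, 0.22, 0.46]
    print(multiply_filter(shadow, purple))    # still essentially black: ~[0.007, 0.004, 0.009]

Because screen can only ever raise values, even the deepest shadows pick up some of the purple tint, which is exactly the behavior described above.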

All the layers together, plus a hue/saturation layer to slightly desaturate the image:

And the product of all that:

Side-by-side comparison: