# Geometry and The Moon

Please do not run away. We are about to use adult language here. For example, we will be using the word trigonometry. Still here? Good. Here is a very pedestrian-looking lunar eclipse photo taken with a 280mm lens*, cropped.

Very Ordinary Photo of the Lunar Eclipse with the planet Uranus in the lower left.

This past lunar eclipse several of us put our heads together to try to come up with a more creative photo than the one above. We had a trigonometry problem, however. On the West Coast the last moment of totality occurred at 4:24 AM PDT. We were brave enough to be out at any time of night, even if it meant extreme sleepiness in our day jobs, but our problem was that at the last bit of totality the moon would be no lower than 32.6 degrees above the horizon (we determined that angle using Stellarium, by the way). Unfortunately there is pretty much nowhere to go to get a nice large moon near an interesting object when the moon is almost 33 degrees high.

Wait: Why do we want the moon and the object to be similarly sized? Here is why… we want the moon to be noticeable like the Fantasy version below, not merely “present” like the real photo on the right. Even bigger would be better, right!?

Notice above right (Reality) and below how tiny the moon is compared to the building in the foreground? Indeed, if you see a photo taken from anywhere on the West Coast where the eclipsed moon is significantly lower in the sky, or larger relative to the foreground, than shown here, you know it has been “photoshopped“.

Plan C: San Jose City Hall Eclipse Sequence

In short, it is nigh impossible to get the large moon effect at an altitude (angle) of 32 degrees. Here is why:

Calculating the Angles

Just how far away do we need to be in order to make the moon the same size as an object of interest? The answer:

114.6 x object size

In other words, an object that is one foot tall requires us to stand 114.6 feet away for the moon's half-degree angular size to match the angular size of that 1-foot-tall object. The number “114.6” comes from this calculation:

1 / TAN (0.5 degrees)
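If you would rather let a computer do the trigonometry, here is a quick sanity check of that number (a Python sketch; the 0.5 degree figure is the average moon size discussed above):

```python
import math

# Average angular diameter of the full moon, in degrees
MOON_DIAMETER_DEG = 0.5

# How many object-heights away you must stand so the object
# subtends the same angle as the moon: 1 / tan(0.5 degrees)
multiplier = 1 / math.tan(math.radians(MOON_DIAMETER_DEG))

print(round(multiplier, 1))  # 114.6
```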

Yeah, that is trigonometry. Using still more trigonometry it is possible to calculate how high above the horizon a 9-inch-tall object has to be so that it is “moon sized”. We did that for you in the “Calculating the Angles” diagram above. Once you calculate the camera-to-object distance of 85.9 feet, you can multiply it by the sine of the altitude angle to get a height of about 46 feet! Here is the trigonometry:

Height = 85.9′ * SIN (32 deg)

You can go one step farther and calculate the distance from the object with ‘distance = 85.9 * COS(32 deg)’.
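Those formulas combine into one small script. This is our own sketch; the 9-inch object and 32-degree altitude are the numbers from the example above:

```python
import math

object_height_ft = 9 / 12   # the 9-inch-tall object from the example, in feet
altitude_deg = 32           # altitude of the moon above the horizon

# Camera-to-object distance needed for the object to match
# the moon's 0.5-degree angular size (about 85.9 feet)
slant_ft = object_height_ft / math.tan(math.radians(0.5))

# Split that slant distance into height above the camera and
# horizontal distance along the ground
height_ft = slant_ft * math.sin(math.radians(altitude_deg))  # about 46 ft
ground_ft = slant_ft * math.cos(math.radians(altitude_deg))  # about 73 ft
```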

Of course after all that calculating you will still need to find a location, have contingency plans for weather and so on. At StarCircleAcademy we have built some tools and put together materials to help in all these endeavors.  We teach these things in our NP111 Catching the Moon Webinar.

# The Road To The Temple

Below is where we ended up. This image is from our friend and co-conspirator Andy Morris.

Lunar Eclipse over Temple by Andy Morris of PhotoshopScaresMe

Four of us plotted and schemed to get an interesting shot. Above is Andy Morris’ result. Click the image to read a great article about how he created the shot using his Photoshop skills at his site: PhotoshopScaresMe.com. We strongly encourage you to read it; you’ll learn how he composited the images together in Photoshop as layers.

### The Long Conversation to Pick a Location

Andy has more details including how alcohol played a part in the process. Mostly I, Steven, was the wet blanket explaining why the geometry was all wrong.

• The Stanford (Hoover) Tower looks like it is shrouded in trees from the needed angle
• Bank of Italy (formerly BofA) in SJC doesn’t work
• The main problem with the wind turbines is that the angle to the top of them is something around 12 degrees above the horizon, which is about 40 moon diameters below the eclipse.
• Here is why the GG Bridge doesn’t work…
• This seems to be the best solution I could find: the Coit Tower…
• Darn. It would appear the coast is out. Forecast calls for Fog from SF to HMB
• This might make an interesting foreground (see below)… Somebody want to check if they will mind us being on their property in the wee hours?
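The wind turbine objection above is easy to verify with a one-liner, using the altitudes quoted in the list (the 12-degree figure was our rough estimate):

```python
# Gap between the eclipsed moon and the turbine tops, in moon diameters
moon_altitude_deg = 32.6   # moon's altitude at the last bit of totality
turbine_top_deg = 12       # rough angle to the top of the turbines
moon_diameter_deg = 0.5    # average angular size of the moon

gap = (moon_altitude_deg - turbine_top_deg) / moon_diameter_deg
print(gap)  # about 41 moon diameters -- "something around 40"
```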

*Ok, we lied, it was actually a 70-200mm lens with a 1.4x TC on a full frame camera, but the net is the same: 280mm effective focal length.

Where did you go and what did you get in your planning efforts?  Post a comment and link below… we’d love to see what you came up with!

# What is so Super about a Super Moon?

August 10, 2014 just passed. It was the most recent super moon. The term “super moon” was coined by astrologers, not astronomers, and refers to a moon which is both full and within 4 hours of its closest approach to earth.

The media gleefully report the super moon and show pictures of huge moons (many of which have been photo manipulated).  Here is the straight scoop on the subject. If you’re wondering whether that photo you’ve seen of the “too big to be true moon” has been doctored, we have an article on that.

The most “Extreme” Supermoon of the Century occurred in 2012. Here it is, photographed in Yosemite approximately 15 minutes after it reached perigee.

## What Makes the Moon Larger or Smaller When Seen from Earth?

Because the orbit of the moon around the earth is not circular, the distance from earth to the moon varies, and thus the apparent (angular) size of the moon changes. Every lunar cycle the earth-moon distance varies between its closest approach, called perigee, and its farthest distance, called apogee. How big is the difference? The closest approach is 363,104 km (225,622 miles) and the farthest is 406,696 km (252,088 miles).

What is the difference in apparent size? At apogee the moon is 22,293 km farther away than average, making it about 5.8% smaller than an average moon. At perigee the moon is 5.54% larger than the average moon. Comparing apogee and perigee moons, the maximum angular size difference is about 12%. The average angular size of the moon, by the way, is half a degree, or 30 minutes of arc. That angle is slightly smaller than the size of the nail on your little finger held at arm’s length. Those of you with significantly mis-sized pinky nails or unusual arm length might want to find another object to measure with at arm’s length.
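Those percentages can be recomputed from the distances as a check. The 384,400 km mean earth-moon distance is our assumption here; angular size scales inversely with distance:

```python
perigee_km = 363_104   # closest approach
apogee_km = 406_696    # farthest distance
mean_km = 384_400      # commonly quoted mean earth-moon distance (assumption)

# Angular size is inversely proportional to distance
perigee_vs_apogee = apogee_km / perigee_km - 1        # ~12% bigger at perigee
farther_at_apogee = (apogee_km - mean_km) / mean_km   # ~5.8% farther (smaller)
closer_at_perigee = (mean_km - perigee_km) / mean_km  # ~5.5% closer (larger)
```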

In short: You’d have to be a very keen observer to notice a 12% difference in size between a super moon and a “wimpy” (apogee) moon.

Because the moon is slowly spiraling away from earth, eventually the perigee moon will grow smaller and smaller in apparent size until one day we will no longer experience total solar eclipses. The perigee moon will be too small to cover the angular disk of the sun, which also happens to be almost exactly one half of a degree. From that point on, all solar eclipses will be “annular” like this one in May, 2012. Had the moon been closer to the earth, this would have been a total solar eclipse.

## How is a Full Moon Determined?

A full moon is defined as the moment in time when the sun, earth and moon are in syzygy. Syzygy is not only an interesting Scrabble(tm) word, but it defines when three bodies are in alignment. When the sun and moon are 180 degrees opposite one another relative to the earth, we have syzygy, which is the instant at which the moon is full. Many of us think of a “full moon” as that period during the month when the moon appears to be fully lit. That period lasts almost 70 hours, so we understand how reckoning a full moon as a moment in time is a bit confusing.

If you didn’t observe the August moon within 4 hours either side of when it was full, you did not see the super moon.  On the United States West Coast the super moon was not visible. Why? The moon set at 6:10 AM almost 5 hours before the moon was full. Those in Hawaii could just catch the super moon setting.  Those on the East Coast of the US had no chance at all. The whole super moon window occurred during the time the moon was not visible on the East Coast.

## The Last (and Next) Visible Super Moons

If you missed the May, 2012 Extreme Super Moon (my term, photo above), you’ve missed the largest possible full moon for more than a century into the future. On May 5, 2012 fullness and perigee occurred within less than two minutes of one another. But don’t fret. The difference in size between the extreme super moon and the average super moon is too small to notice unless you measure carefully. If you paid close attention you probably also noticed that the May 20, 2012 annular solar eclipse followed nearly half a lunar cycle after the May 5, 2012 super moon. That is not a coincidence! The moon was closest to us on May 5th, so half a lunar cycle later it had to be farthest from us!

On August 10, 2014, full moon and perigee occurred within about 1 hour of each other. The next super moon is on September 8, 2014. The moon will not be as close to perigee at the moment it becomes full, but the moment of full moon occurs at 9:38 PM PDT, just as the moon rises. It will be a true super moon!

### Catch One Yourself

We plan to schedule a “Catching the Moon” Webinar well in advance. Stay tuned. One complication is that the wonderful Photographer’s Ephemeris tool will cease to work in desktop mode soon; it is being replaced with a browser version. While the tool is excellent, and we highly recommend it (and that you donate if you use it!), TPE still leaves some important bits of the puzzle unresolved. We will fill those in for you and give you a crack at our tool(s).

The moon caught between El Capitan and Half Dome, Yosemite National Park – Actual size, no manipulation

# Photo Pills: Ultimate App for Photography

Originally Published Nov 29, 2013
Last Updated April 18, 2016

For me this app is why I have a smartphone. It has a lot of features, which makes it one of the most inclusive apps out there for photography. Even just one of its modules would make up the entire functionality of other apps. You are essentially buying many apps in one, since it has a plethora of functions and shortcuts. It is going to take a long time to master, so do yourself a favor: sit down with it for a while, read up, and explore. First go to the section on learning and learn! Tapping and swiping allows you to switch dates, times, modes, and more. Getting the feel of the app before trying some critical calculation will put you in a better frame of mind.

# Accuracy

Steven likes to point out that the app is only as good as the hardware it runs on, and I can attest to this. Steven’s iPhone 4 compass and accelerometer must be off: the sun and moon locations are wrong by up to 10 degrees (more than 16 moon diameters). Inaccuracy also seems to be a problem with the 4s and the 5s; there have been many complaints. The iPhone 5 seems to have better sensors. See the Macworld article: Six Phones Can’t Agree on Magnetic North.

While the Macworld article is the result of poorly conducted calibration, the important takeaway is that the app can only be as accurate as the environment you run it in and the hardware you run it on. It can’t be said enough: trust but verify. Then recalibrate and try again. Bring your own GPS and compass to verify the accuracy. We are not suggesting you use the iPhone as a navigation device, just as an aid. Apple Maps didn’t work out so well, remember? We are suggesting that this can be a useful device for visualizing photos or getting an idea of your compositions, and then bringing your most accurate tools to bear. If you’re fanatical about accuracy like Steven, you can also bring your compass, maps, GPS, planisphere, and sextant. Ok, the sextant was a joke, but I wouldn’t be surprised if he has one. Steven is crazy about accuracy in predictions.

Navigating the app is easy to get started with. Start swiping and you will be unlocking all sorts of functionality. At first you will be surprised by all of the hidden things you are doing. The first time I opened it up I was like, Wow, what was that? What did I just do? Once you get a little more advanced you will start to realize you may not remember the proper tap, drag, swipe, handshake combination to get where you want to go. It will take some practice, but let me give you some tips to help.

More content dots – The dots in the image below are a symbol that shows there is more content available on this topic; just swipe left or right in the correct place.

The next page dots are sneaky because they blend into the background.  However, they can be found in the same general location so just look to see if they are there.

Transition between right and left pages by swiping, paying attention to the more-content dots.

Previous page button – Found in the upper left. This button brings you back to the prior menu, usually. It can be helpful for getting around the app, so don’t forget about it. Even when it says something strange, it usually is a “back” button – except when it is not there and is instead a “Done” button in the upper right. Other important buttons also appear in the upper right, so when you are finished look up there. Like what, you ask? The save button, for one, often appears there.

Photo Pills back button on upper left.

Changing your location or a value generally requires just a tap; however, in some cases it requires a tap-and-hold or a double tap. My solution: try them all. In the planner, the map has some icons you can tap (sometimes by accident). Or save yourself a bit of hunting by finding the Learn page and reviewing the options described there.

## The Photo Pills Menu in all of its glory

The three main menus of Photo Pills.

# What Does Every Module Do?

Here is a short summary:

## My Stuff

Plans – Where your Plans are saved.

Points of Interest – Local points of interest. Over 10,500 all over the world! Also has a search functionality.

Settings – Calculations are based on the units of measure (Imperial or Metric) and the camera body you select. I would suggest starting here.

## Pills

Planner – The Photographer’s Ephemeris (TPE)-like functionality, used for sun and moon alignment planning similar to what we cover in our Catching the Moon Webinar. Mostly used for planning, as well as scouting, but it has some nice sharing functionality which will help you organize scouted locations on the fly.

Planner looks a lot like another program I know

Sun – Detailed information about sun rise and set, time to set, azimuth, elevation, distance, shadow ratio, start of the different twilights (civil, nautical, astronomical), magic hours (blue, golden), calendar, augmented reality, seasons, and sharing (Facebook, Twitter, email, or save as an image).

Moon – Detailed information about moon rise and set, time to set, azimuth, elevation, shadow ratio, magic hours (blue, golden), calendar (phases each day), augmented reality, distance (perigees and apogees), and sharing (Facebook, Twitter, email, or save as an image).

Exposure – Allows you to determine equivalent exposures, that is, exposures of equivalent brightness using different settings. Determine equivalent exposures by changing shutter speed, aperture, or ISO. It will also help you understand how an ND (Neutral Density) filter will affect the exposure, and it calculates the change in EV value.
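For the curious, the exposure value math behind such a calculator is standard and easy to sketch. This is the textbook EV formula, not necessarily what Photo Pills uses internally:

```python
import math

def exposure_value(aperture: float, shutter_s: float, iso: int = 100) -> float:
    """Standard exposure value: EV = log2(N^2 / t), shifted by the ISO."""
    return math.log2(aperture ** 2 / shutter_s) - math.log2(iso / 100)

# Two settings one stop apart in aperture and shutter speed are
# (nearly) equivalent exposures -- nominal f-numbers are rounded
ev_a = exposure_value(8.0, 1 / 125)   # f/8 at 1/125 s
ev_b = exposure_value(5.6, 1 / 250)   # f/5.6 at 1/250 s
```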

DoF – A DoF (Depth of Field) calculator using your current exposure settings (camera, lens, aperture, distance to subject, teleconverter status), with a DoF table, augmented reality, and sharing (Facebook, Twitter, email, or save as an image).

Hyperfocal – A table which shows, for a given focal length (e.g. 14mm) and aperture (e.g. f/1.4), the focus distance (15 ft 4 in) at which everything out to infinity will appear in focus. We talk about it a lot, so now you have no excuse for not looking it up.
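The table is built from a simple formula. Here is a sketch using the standard hyperfocal equation; the 0.03 mm circle of confusion is a common full-frame assumption, and it reproduces the 15 ft 4 in figure quoted above:

```python
def hyperfocal_mm(focal_mm: float, aperture: float, coc_mm: float = 0.03) -> float:
    """Hyperfocal distance: focus here and everything from roughly half
    this distance to infinity appears acceptably sharp."""
    return focal_mm ** 2 / (aperture * coc_mm) + focal_mm

# 14mm lens at f/1.4 on a full-frame body
h_ft = hyperfocal_mm(14, 1.4) / 304.8  # mm -> feet; about 15.4 ft (15 ft 4 in)
```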

FoV – A Field of View calculator using your current camera, lens, distance to subject, and camera orientation (landscape or portrait) to give field of view information. You can also invert this so you know where to stand, or use augmented reality to see it in the phone’s camera. Oh, and you guessed it: share (Facebook, Twitter, email, or save as an image). It also allows you to find equivalent FoV settings between cropped sensor cameras. One application Steven recently used was to determine what focal length to use to fill his field of view with Mercury, Saturn, Comet ISON and Comet Encke (136mm).

Night – Includes 3 main features: Night AR, Star Trails, and Spot Stars. Night AR allows you to see the location of the Milky Way, the rise of the moon, and the direction stars will rotate. Star Trails aids you in pre-visualizing how long your star trails will be; inversely, it allows you to calculate how long it would take for the stars to form a specified arc in the sky. Finally, Spot Stars calculates the shutter speed necessary to make stars appear as points without the aid of an equatorial mount. For an in-depth article on this subject, see here.
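As a rough stand-in for the Spot Stars calculation, the classic “rule of 500” gives a ballpark shutter speed. Photo Pills’ own math is more sophisticated, so treat this only as a back-of-the-envelope sketch:

```python
def max_shutter_s(focal_mm: float, crop_factor: float = 1.0) -> float:
    """'Rule of 500' approximation: longest exposure (seconds) before
    stars begin to visibly streak at this effective focal length."""
    return 500 / (focal_mm * crop_factor)

t_full = max_shutter_s(24)        # 24mm on full frame: about 21 s
t_crop = max_shutter_s(24, 1.6)   # same lens on a 1.6x crop body: about 13 s
```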

Time Lapse – Allows you to calculate data about your timelapse before you start: information such as event duration in real time and as a final product, FPS, total number of photos, and the file storage necessary to capture the sequence on your memory card.
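The bookkeeping behind such a calculator looks something like this sketch (the 25 MB-per-photo figure is our guess for a raw file; substitute your camera’s actual file size):

```python
def timelapse_stats(event_minutes: float, interval_s: float,
                    fps: int = 24, mb_per_photo: float = 25.0):
    """Return (number of shots, clip length in seconds, storage in GB)."""
    shots = int(event_minutes * 60 / interval_s)
    clip_seconds = shots / fps
    storage_gb = shots * mb_per_photo / 1024
    return shots, clip_seconds, storage_gb

# One hour of shooting, one frame every 5 seconds, played back at 24 fps
shots, clip_s, gb = timelapse_stats(60, 5)  # 720 shots, 30 s clip, ~17.6 GB
```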

## Learn

Help – In-depth help on the app: tips on the menus, how to navigate, and what buttons do what. There is a lot. After a quick stop at the settings, head over here and check it out.

About – Learn about the developers, contact the support staff, rate the app, applaud the team.

## The Awesome Parts

The app is just so big that I am not going to be able to cover all of it in detail.  There are a lot of parts of the app I want to touch on.

The data – The data on every tab is amazing and detailed. You want data on what phase the moon is in? Data on when the phases are that month? The data just does not stop, and it is not just on the Sun or Moon tabs; it is on the FoV and DoF tabs too. For a guy who does a lot of panos, all of the hyperfocal and FoV info is impressive. I LOVE having all of this data presented in a logical, clear and concise way. But wait, we want more data.

But I digress. The great part about AR (augmented reality) is that it can help you see what might be possible, e.g. when the moon might be near that object. You may still have to do the work over again, because we have seen the predictions be as far off as 10 degrees if you rely on the phone hardware for compass direction.

The moon is only 0.5 degrees across, so if you are running Photo Pills with a bad compass (bad hardware or just a bad calibration) the moon could be 20 moon diameters away from where it is shown (10 degrees / 0.5 degrees = 20 moon diameters). The moon would be out of this frame.

Planner – It is not as good as I would have liked. My main gripe is that the small screen is difficult to navigate and makes it hard to pinpoint the exact point where the alignment is going to be. There is an elevation profiler which will allow you to determine the height of objects on the horizon. Useful, yes, but useless if there is a huge building in the way. It also currently doesn’t have ready access to a topo map, so determining whether there might be a hill in the way is not simple – unless, of course, you are on site. I like the share options: in the Points of Interest tab you can export all of your points to a KMZ file you can open in a map editor (like Google Earth). The win for me was the portability. I always have my phone, so I can scout whenever I see something interesting and come back later.

I can plan a shot wherever I am – unless I need the map and am not able to get a data connection, that is. If I see something interesting, I can check for an alignment right on the spot. I can figure out if it is possible and then bring out the big guns for double checking. I think this is one of the biggest advantages.

One thing that tripped us up… there are two AR modes in the Planner: the outer one, which is for “getting an idea”, and the inner one that appears after you start a Find operation. The AR choice after selecting Find allows you to use AR to set the location of your desired target. Point the display and tap it to place the moon or sun where you want to capture it. Remember, though, that the moon or sun will be shown about 12 times larger than actual size.

To close: this app is the most comprehensive and feature-inclusive we have seen, and some planned usability enhancements will definitely kick it up a notch.

## Enhancements We’d Like to See

What we would like to see: additions to the alignment prediction tools. Photo Pills has so much; now that we have seen what it is capable of, we want more. Seriously, that is an understatement because, honestly, we want a whole LOT more. That’s not to say we hate the app – not at all. It still packs more punch than everything else we’ve looked at. However, as noted, we would like to see more data in the AR and in the planner. We would like to be able to take a photo of the scene and have all the relevant data overlaid on that photo. Further, we would like to have access to the metadata burned into the photo. When we share a plan, it doesn’t seem to include the elevation (altitude), azimuth (compass direction) and tolerance information. We are geeks: we want to write scripts to sort and map that data, and track our exploits much like Spyglass.

There are a couple of little niggles in the interface that are annoying: lines that don’t get drawn on the planner map, accidentally resetting the observer location by dragging a finger over the “set location here” icon while scrolling the map, and others.

If possible, we’d like to see an “enhanced accuracy mode” so that you can be 90% confident that the AR alignment that you are shown will indeed be within 0.5 degrees.

Even so, while we’d like to see the visual interface improved, that doesn’t mean it doesn’t work well as is.

# How Accurate Is The Application? Participate and Find Out!

We want to collect data from testing in many environments on many devices, not just from our own half dozen devices.  Please do the following.

1. Play with PhotoPills a bit to be sure you understand it.  Go to the help menus and learn about the AR and other functions.
2. Kill your compass app and Photo Pills (all apps, preferably). This is to ensure that you have a chance to calibrate.
3. Start PhotoPills. Go to “Moon” and make sure that “Info” shows the current time (double tap the center of the Moon).
4. Click on AR.
5. Verify that the current date and time are correct in the upper left. If not, go back to Info and double tap the Moon or go into settings.
6. If/when the phone asks, perform a calibration. On iOS 7.0 and later you “roll a ball around”; older iOS versions may ask you to wave the phone in a figure 8. Hold your phone normally (portrait mode).
7. Point the camera at the Moon. Obviously the moon must already be up in the sky for this test. You CAN try this using the sun instead, but that wouldn’t be good for your eyes or the phone unless the sun is just rising or just about to set.
Once your target is in view, select the “Action” button (lower right).

• If you have a Twitter account choose “Twitter” and send your photo to @starcircleacade be sure to include #photopillstest and your iPhone version e.g. #iphone4 or #iphone5s and your ios version e.g. #ios703.
• If you don’t have email available, you can save the image or post it to Facebook – just be sure to share it with us!
8. Next turn your device 90 degrees to Landscape mode, spin yourself around a full 360 degrees (trying not to get dizzy), point the phone back at the target and repeat step 7.
9. Super Extra Credit would be to take a photo of multiple iphones on a table all set to compass mode.

Please send only two photos (one in portrait, one in landscape mode) per device you have Photo Pills on. Thank you for your help! Also, please note that large metal objects (your car, for example), computers, electronics and what-have-you will affect the accuracy of the compass. If you can move away from such things to do your calibration and take your measurements, that will help.

We will publish an update to this material once we get enough data to make some calculations.

# Bending Reality

Many of those who follow my work and my webinars already know that I’m passionate about all things astronomical and night photography.  I’m the kind of guy that will go 10 times to the same location over a 4 year span to capture a shot that requires the elements to all be in place… the moon, a soon-to-rise sun, a coastal lighthouse, and clear weather.

Why do I do it? Because it involves embracing challenges from several disciplines – mathematics, astronomy, and some hairy technical aspects of photography. I don’t know what to make, however, of the super cheesy way to get the moon where you want it using photo editing techniques.  Take a nice clear picture of the moon with a 200 mm lens, and put it in a landscape taken at 20mm.  You get a moon 10 times larger than it should be, but will people notice?  Not very often, it turns out! I submitted an “e-ticket” for a charity. The E-ticket allowed the bidder to select any of the items in my image library for an 18 x 12 print. To my chagrin the image chosen was this one – a complete fabrication.

The patron knew it was a fabrication, but loves Yosemite and likes having the moon, a moon bow and a waterfall co-mingled. Making compelling composite images certainly falls in the realm of art, which I highly value. But sadly, I created this image to educate people about forgeries.

One distressing trend that I see is photographers who run workshops and draw in participants by exhibiting photographs that are composites – not reality. I know how I would feel if I booked a particular hotel because a lovely photo of the property showed a pristine beach just yards away, but upon arrival discovered that the pristine beach was actually four blocks and a freeway away. Angry.

And what about those aspiring photographers who wonder how they too can get a photograph of a fantastic huge moon behind Yosemite Falls when you can’t – it’s impossible!

I have spent substantial effort finding, researching, learning about and writing tools to aid in achieving alignments of the moon (and sun) with various landmarks – but it’s quite unfair to compare the days, weeks and months of calculation and waiting for the right date and weather to those photographers who recycle their stock moon, lightning, or cloud photos into whichever photos they think they will look best in.

And it’s not a moon-only phenomenon. I’m amused when I see folks exhibiting impossible star trails, improbable eclipses, and many other manufactured phenomena. So I am seeking your help. I want to learn from you what you think the boundary between fantasy creation and photography should be. And I’ll also give you some tools to help spot forgeries.

Here is what I think must be true for a photo to be an Authentic Photograph. Some editors are even more stringent about what they allow.

1. Photo(s) used in the final image must be taken at the same focal length and using the same equipment.
2. And within the same 24 hour period.
3. With no introduction or removal of elements except those that are “small distractions”, e.g. noise, an overly bright or overly dark element such as shiny trash or a tripod shadow, or cloning out an object that moved between exposures, as in HDR.

I find these acceptable:

• Cropping – any amount.
• Sharpening, or Blurring (smoothing)
• Color correction, saturation or desaturation (but not color change. Green eyes should not become blue ones – though “red eye” correction is certainly ok).
• Selective coloration, including black and white, duo toning, etc.
• Perspective or lens aberration corrections.
• Vignetting
• Framing
• HDR, or bracketed exposure combinations together with tonal compensation.
• Contrast enhancement

For me the following cross the line from Authentic Photographs into Composites:

• Use of any elements taken at different focal lengths or with different equipment, unless those elements are resized proportionately and placed in their correct and actual location.
• Using elements taken on different dates or from different directions (e.g. combining a photo of lightning with anything that the lightning did not actually strike).
• Moving elements in a single image to other locations (except incidentally to clone out or cover over issues).

When people violate my personal ethical boundaries, they usually create physical conundrums that are easily spotted by a trained eye. For example, in this photo the moon is impossibly large and intuitively feels wrong. The moon is too large in this photo, too. The impossibility can be determined either from experience or through some mathematics (which I’ll show later). Sometimes the moon placement just doesn’t match physics. For example, in this photo the moon is illuminated on the wrong side. The moon is always illuminated by the sun, so if the sun is setting at the left and the moon is illuminated on the right – well, that’s impossible. Another common mistake is when people put a full moon anywhere near a sunset or sunrise. The full (or nearly full) moon is always located on the opposite side of the sky from the sun. Any photo showing otherwise is doctored. Sometimes the doctoring is laughably obvious, as in this photo.

## Math Reveals Forgeries

As I teach in the “Catching the Moon (and Sun) Webinar”, the moon is a well known and almost invariant size, and its presence can be used to measure distances in a photo. Specifically, the moon is 1/2 of a degree in angular size (the superest of super moons is 0.57 degrees). In an image with an angle of view of 50 degrees – as might be achieved with a 35mm lens on a 35mm camera – the moon will be exactly 1/100th the width of the image. Since the field of view of an image isn’t always obvious, especially in unfamiliar locations, finding something in the scene of identifiable size helps. For example, in this image:

Some googling will tell you the height of Upper Yosemite Fall (675 feet). By eyeball, Upper Yosemite Fall appears to be about 2.5 times the moon’s diameter. That means the falls span about 1.25 degrees (0.5 degrees times 2.5). Throwing some trigonometry in, we can conclude that to get the moon sized as in the photo, the photographer had to be 30,934 feet (5.8 miles) away from the fall. Yosemite Valley is less than a mile wide in this direction. Did we have to do the trigonometry? Not really! The rainbow gave us another huge clue. The arc of a rainbow is about 2 degrees wide from the top color (red) to the bottom color (violet), so the rainbow height here SHOULD be about 4 times the moon’s diameter. The moon, however, is much too large – just as we suspected.

The trigonometry: for an object to be the same angular size as the moon, you must be 114.6 times the object's height away from it. So, for example, a one-foot-tall object requires a 114.6-foot distance for the one-foot object and the moon to appear the same size. For a super moon, the multiplier is 100.5.

The formula:   distance = height / tan(0.5 degrees)

What if the only identifiably sized object in a photo is a sand dollar? No problem. If the sand dollar and the moon are nearly the same size in the image, it’s easy to calculate how far away the photographer was from the sand dollar: multiply the size of the sand dollar by 114.6. Assuming a 4-inch sand dollar and an equally sized moon are in the same scene, a simple calculation reveals that the camera was 38 feet away from the sand dollar. Do many photographers get 38 feet away from the sand dollar they put in their foreground? No, they don’t! Suppose we were wrong and the sand dollar is really only 2 inches in diameter; it would be just as unlikely for a photographer to be 19 feet away! And if you find more than one thing of identifiable size relative to the moon, you have a second point of reference.
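The sand dollar arithmetic, spelled out (the 4-inch diameter is the assumption discussed above):

```python
MULTIPLIER = 114.6   # object-sizes away for an object to match the moon's size

sand_dollar_in = 4.0  # assumed diameter of the sand dollar, in inches
distance_ft = sand_dollar_in * MULTIPLIER / 12   # about 38 feet

smaller_in = 2.0      # suppose it is really only 2 inches across
smaller_ft = smaller_in * MULTIPLIER / 12        # still about 19 feet
```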

For comparison here is an undoctored photo featuring the moon and a guy who is about 6 feet tall.  See how tiny the moon looks in this 20mm focal length photo? Did you even SPOT the moon? It’s at chin level on the left of the post.

The moon, even the crescent moon, is very bright. Any exposure showing a detailed full or half moon together with stars is immediately suspect, because the roughly 1/100 of a second exposure needed to keep detail in the moon will rule out the capture of any stars. The problem is a limitation in the dynamic range of the camera. Our eyes can see stars near a fully featured moon, but no contemporary camera can do so except when the moon is well veiled by clouds or hanging very near the thick part of the atmosphere at the horizon. The presence of stars in the moon bow photograph, oops, I mean COMPOSITE, screams inconsistency – or at the very least some super-duper HDR processing. Click the image and check out the observations that other people have made about what is wrong. E.g. how can the moon create a moon bow, illuminate the face of the fall AND be behind all those things?

## A Parting Puzzle

This image accurately depicts an Annular Solar Eclipse as captured by a series of images taken with the same camera pointed in the same direction and all at the same focal length. I wrote about it in my previous column.

But the following image I fabricated to look much like the many forged eclipse photos on Flickr, and it does not jibe with reality. Can you spot why? By the way, I used a REAL image of an annular eclipse to create this photo, but I combined it with another photo which had nothing to do with an eclipse.

If you can come up with a reasonable refutation (or two) you may win a free Catching the Moon webinar.

## I’d Like to Hear From You!

Do you take exception to my exceptions? Do you resonate with my concerns? Did you spot something you’re pretty sure is faked, but can’t quite tell why? Please leave a comment!

Interested in more about faked photos?

See here or the Scientific American article on spotting fakes in general photos.