Establishing the No Parallax Point (NPP) of a Lens

Figure 1. Mount Rainier as seen from the Moraine Trail near Paradise. Comprised of 16 vertical shots stitched together to form a high-resolution giga-panorama.  80mm, f/5.6, 1/400 sec., ISO 100.  Correctly capturing large panoramas requires preparation and proper execution of technique.  © Beau Liddell, all rights reserved.


Technology over the past 10-15 years has advanced, if not revolutionized, photographic creativity to previously unimaginable levels. One product of this transformation is that modern-day photographers often take multiple, overlapping images to capture a larger scene and later stitch and crop them into a final composition (Figures 1 & 2). The process is often considered synonymous with creating panoramas, although the technique can be used to create a composition of any desired crop ratio (Figure 3).

Figure 2. Screen shot of the panorama in Figure 1 showing each overlapping frame before final stitching.

Figure 3. The Milky Way shining over a campsite at Paradise Beach along Lake Superior’s North Shore near Covill, Minnesota. This shot is comprised of 3 vertically stitched horizontal images to eliminate aberrations along the edges and corners of the sky, and cropped back to original size. As this image illustrates, stitched photos don’t have to be panoramas.  © Beau Liddell, all rights reserved.

I’ll cover the process of capturing images for stitching in a separate tutorial. But one important aspect to consistently getting good results is to avoid major parallax errors in the final, stitched image. If parallax can’t be avoided, then the software used may not be able to stitch the photos successfully, or very well. By rotating the camera-lens combination around the lens’ no-parallax-point (NPP) we can effectively eliminate parallax errors. The NPP is equivalent to the lens’ entrance pupil, which is the location of the optical image of the physical aperture as seen through the front of the lens. In this tutorial I’ll try my best to explain how to establish the NPP for a lens.
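To get a feel for why rotating off the entrance pupil causes trouble, here is a rough small-angle sketch (my own illustrative approximation, not part of the tutorial's method) estimating how far a near object appears to shift relative to a far one when panning moves the pupil sideways:

```python
def parallax_shift_px(pupil_shift_mm, near_mm, far_mm, focal_mm, pixel_pitch_um=5.0):
    """Approximate relative image shift (pixels) between a near and far object.

    pupil_shift_mm : lateral distance the entrance pupil translates during the
                     pan (roughly the axis-to-pupil offset times sin of the
                     pan angle)
    near_mm, far_mm : object distances from the lens
    focal_mm        : lens focal length
    pixel_pitch_um  : sensor pixel pitch in micrometres
    """
    # A lateral pupil shift t makes an object at distance d appear to move by
    # roughly t/d radians; the *relative* shift between the two objects is the
    # difference of those two angles.
    angle_rad = pupil_shift_mm * (1.0 / near_mm - 1.0 / far_mm)
    # Small-angle projection onto the sensor, converted from mm to pixels.
    shift_mm = focal_mm * angle_rad
    return shift_mm / (pixel_pitch_um / 1000.0)
```

For example, a 20mm lateral pupil shift with objects at 0.5m and 1.5m on a 50mm lens works out to a relative shift of a couple hundred pixels on a typical sensor, which is more than enough to confuse stitching software.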

Ask the Manufacturer or Consult the User Manual

First, you might be able to get data from the manufacturer on the distance of the optical center of their lenses from the sensor plane or camera-lens interface in millimeters (mm). For example, Zeiss reports the entrance pupil distance from the focal plane in the lens’ specification sheet.  Armed with this information, attach the lens to the camera and measure the distance provided so you know where it occurs along the lens barrel (but remember that the actual NPP is in the center of the lens!).

If it’s a prime lens, I recommend marking the NPP on the barrel, or if possible apply gaffers tape to the barrel and mark the NPP position on the tape for future reference. Since the NPP will vary with magnification, a cheat sheet is needed if a zoom lens is involved (Figures 4 & 5; I like to record NPPs for focal lengths printed on the zoom ring). Then mount the camera on a pano slide (also referred to as a NPP or nodal slide) that has a distance scale marked in millimeters, making note of where along the scale the NPP occurs (Figure 5). This is the mark you’ll use to align the slide on the tripod head to control for parallax.

If you’re lucky, the tripod mounting plate or L-bracket that you purchased for your camera will have been designed to mount on the base of the camera or accessory grip so that its center coincides precisely with the camera’s sensor plane.  If that’s the case, just mount the camera on a pano slide and position the slide on the tripod at the point where the slide’s scale equals the NPP/entrance pupil distance provided by the manufacturer.  The camera will then be correctly positioned on the NPP of the lens unless you’re using a ballhead and tilt it off level (see Axis of Rotation and Tripod Heads near the end of this tutorial for how to compensate when using a ballhead or multi-way panning head; this is where marking the NPP on the lens is handy).

Incidentally, if you own the same camera and lenses I use, you can’t necessarily use the data shown in Figure 4 to establish the NPP unless you also use the same camera L-bracket I do (Really Right Stuff BGE11-LB).  You’ll also notice that the NPP for the same lens and focal length varies with the camera’s orientation.  That’s because for this particular L-bracket, the side and bottom mounting plates are not aligned in the same location relative to the camera’s sensor plane.  Finally, if you purchase a new camera and tripod mounting bracket, the NPP distances you previously used may no longer work (even though you might be using the same lenses, and regardless of how you initially established the NPP for your lenses) unless the brackets for both cameras are aligned with the sensor plane.  This is another reason to mark the NPP on your lenses: you’ll have an easier time re-establishing it with new gear.

Figure 4. Example of a no parallax point table for lenses I frequently use to make stitched images.  This is especially handy if you use many lenses or use zoom lenses.

Figure 5. Various styles of pano slides I’ve used to align the NPP of my lenses with the tripod head when taking photos for making stitched images.  Documenting NPP and other pertinent data on post-it note tape and affixing that to a pano slide is a convenient way to ensure you always know what NPP distance to use for your lenses in case you forget or misplace your cheat sheets.

Rough Method for Determining the NPP

Unfortunately, lens manufacturers don’t often report the optical center point of their lenses, particularly at different focal lengths for a zoom lens. So now what? Although it’s not difficult to precisely determine the NPP yourself, you can roughly approximate it by viewing the front of the lens with the aperture engaged, visually estimating where you see the aperture along the barrel, and measuring the distance (mm) from that point back toward the sensor plane. Mark this point/plane on the lens.  This is the plane you will use when positioning the camera on a pano slide that in turn is mounted on the tripod head.

Unless your depth perception is quite poor, it’s surprising how close to the actual NPP you can get by using this crude, visual method.

Precisely Estimating the NPP

There are many tutorials available on the Internet if you want to more precisely measure the NPP yourself.  Stitching software has gotten very good in recent years, thus it’s not critical that you be absolutely perfect.  But the closer you can position the camera relative to the lens’ true NPP the less risk there is that the software won’t be able to perform the stitch, and the less rotating, cropping or warping you’ll have to do on the final stitched result.  This is especially true if stitching images taken using wide-angle or ultra wide-angle lenses.

The process to precisely determine the NPP involves the following considerations and steps:

1)  Use a tripod and head that’s perfectly level and locked down to prevent any vertical movement. I highly recommend using a fully adjustable gimbal or pano-gimbal head (Figure 6) for doing stitched images and for determining a lens’ NPP, but a multi-way panning head (Figure 7) or ballhead (Figure 8) will work if you take the added precautions noted below.

Figure 6. Camera mounted on a pano-gimbal tripod head. This is the ultimate type of head to use for creating stitched images as it enables quick and accurate setup and enables rotation around the NPP along all axes.

Figure 7. Example of a multi-way panning head. These heads enable precise positioning, but unfortunately don’t allow for rotation around a lens’ NPP over all axes, requiring certain precautions to get consistent results when taking shots for stitched images.

Figure 8. Camera mounted on a pano slide and ballhead. Extra precaution is needed to ensure alignment with the lens’ NPP after positioning the horizon when using this type of tripod head.  The camera-lens combination in this photo is set to the lens’ NPP at 93mm, and will remain at the NPP distance so long as the camera remains level on this type of tripod head.

2)  Mount the camera-lens combination on a pano slide marked with a distance scale (mm), and position the slide on the tripod head such that the center of the lens is roughly aligned with the horizontal axis of rotation (Figure 8).

3)  On a table or similar flat platform, place a couple of relatively thin, straight objects that you can stand on-end, positioning them in-line as viewed through the camera, but at least a couple of feet apart (Figure 9). You also don’t want the front object to completely cover up the back object as viewed through the camera, and want the objects close enough to the lens so you can view them easily. Adjust the tripod distance and/or height, or the platform the objects are on so that the top of each object can be seen through the viewfinder or on the LCD, with the top of the front object preferably near the center of the frame. You might have to re-level the tripod, or re-position the objects so that everything is level and lined up.

Figure 9.  Photo showing a simple set-up and objects used to establish the no parallax point of my lenses.  I used two thin LED flashlights (shorter one closer to camera), positioned about 3 feet apart, aligned with the camera’s line of sight and positioned such that the top of the nearest flashlight was in the center of the frame.  Camera and lens were mounted on a level tripod with a gimbal head.

Another option is to position the tripod near a window, and draw a thin vertical line with a grease pen on the glass in front of a well-defined background reference object outside.

4)  Once your reference objects are set and aligned with the center of the frame, slowly pan the horizontal axis of the tripod head back and forth, viewing the objects on the LCD or through the viewfinder. Unless by luck you positioned the lens on the tripod head exactly at the NPP (unlikely), you’ll notice either the rear or front object move relative to the other object as you pan. That’s parallax (Figure 10, also click the video link below showing parallax in real-time).

Figure 10.  Example of parallax.  Center image shows objects aligned before beginning to pan the camera.  Left image shows the left side of the frame after panning to the right, and the right image shows the right side of the frame after panning fully to the left.  The camera was not being rotated around the lens’ no parallax point (NPP) since the two flashlights moved relative to one another while panning.  The camera was in fact being rotated in front of the lens’ NPP, and additional incremental movements backward were required before the correct NPP distance could be determined.

5)  Next, re-center the objects in the frame, move the slide fore or aft 5-10mm, and repeat the panning process. If the apparent parallax worsens (more relative movement among the objects), stop, re-position, move the slide in the opposite direction, and continue panning the camera. But, if the parallax seems to improve (less relative movement between the objects), stop, re-position, and move the slide in the same direction using smaller increments, repeating the process until the two objects remain aligned in the same relative position to each other throughout the field of view as the camera is panned (Figure 11; or click the video link below showing what it looks like when the NPP has been attained). Once this has been achieved, record the distance on the slide’s scale that’s aligned with the center of rotation, and you’ve established the NPP for the lens at that focal length.
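The search in steps 4 and 5 can be sketched as a simple step-halving loop. Here `measure_parallax` is a hypothetical stand-in for the visual check: in practice you judge the sign and size of the relative movement by eye, but the logic of "keep going if it improves, reverse and halve the step if it worsens" is the same:

```python
def find_npp(measure_parallax, start_mm=0.0, step_mm=10.0, tol_mm=0.5):
    """Step-halving search for the slide position that eliminates parallax.

    measure_parallax(pos) stands in for the visual check in step 5: it should
    return a signed value -- positive when the near object leads the far one
    while panning, negative when it lags, and near zero at the NPP.
    """
    pos = start_mm
    err = measure_parallax(pos)
    while step_mm > tol_mm:
        # Step toward the side that should reduce the parallax sign.
        trial = pos + step_mm if err < 0 else pos - step_mm
        trial_err = measure_parallax(trial)
        if abs(trial_err) < abs(err):   # parallax improved: keep the move
            pos, err = trial, trial_err
        else:                           # parallax worsened: use smaller steps
            step_mm /= 2.0
    return pos

# Toy stand-in: true NPP at 93 mm; parallax grows linearly with the offset.
result = find_npp(lambda p: p - 93.0)
```

Run against this toy measurement, the search converges to within about a millimetre of the 93mm mark, mirroring the manual re-position-and-pan iterations described above.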

Figure 11.  After several iterations of re-positioning the pano slide on the tripod head and panning the camera to determine whether parallax was still present I was able to arrive at the NPP for the lens as shown here where the two flashlights were consistently aligned relative to one another as I panned across the field of view.  The left image shows the left side of the frame after panning to the far right, and the right image shows the right side of the frame after panning fully to the left.

Regardless of how you established the NPP of the lens, in the future all you have to do to control for parallax when shooting images for stitching is to position the camera-lens-slide combination on the tripod head at that distance. The axis of rotation is now set at the NPP for that lens and focal length, provided the platform is level. If you might be creating stitched images with multiple lenses or focal lengths (in the case of a zoom), it’s helpful to create a tabular cheat sheet, or jot the data on post-it tape and stick it on your pano slide component (Figures 4 & 5).

Axis of Rotation and Tripod Heads

One final precaution is needed before you start taking photos for stitching.  Make sure the tripod is level so that the horizon is appropriately aligned; otherwise you may get a poor stitch, or a skewed orientation that might require you to rotate the image as well as crop out much of the composition (Figure 12). Once the camera/lens combination is set at the NPP on the slide and attached to an appropriately prepared tripod, you need to be able to rotate the lens precisely around that point as you pan among frames.

Figure 12.  When making stitched images it’s important to have the horizon level.  Failure to do so, as with this 15-image vertical panorama, will result in either a poor stitch, or at a minimum require image rotation, potentially custom warping, and possibly cropping out significant portions of the composition.

If you use a 3- to 5-way panning head or a ballhead (Figures 7 & 8), after positioning the horizon where you want it (which will result in a vertical adjustment), the camera will no longer be level, nor rotating around the correct axis of rotation to prevent parallax. Instead it will be rotating in front of the NPP (if you pushed the horizon toward the top of the frame) or behind the NPP (if you positioned the horizon toward the bottom of the frame). Depending on the lens, the axis of rotation could well be off the NPP by several inches if you positioned the horizon near the top or bottom edge of the frame (Figure 13).

Figure 13.  Although the camera was initially positioned to rotate around the lens’ NPP in Figure 8, when using a multi-way panning head or ballhead as shown here, as soon as I compose the shot by positioning the horizon above or below center, the camera is no longer level, and will no longer rotate around the NPP as I pan.

As a result, you will need to adjust the pano slide fore or aft as needed to re-establish rotation of the horizontal axis at the NPP to compensate for the angle created when re-positioning the horizon (Figure 14). If the lens involved is a prime and you marked the NPP on the barrel, it should be easy to make the necessary adjustment. Assuming the horizon is still level after all this, you are now ready to capture a series of overlapping, parallax-free frames that will form a row in the final stitched image.
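As a rough idea of how large that fore/aft correction can be, here is an idealized trig sketch. It assumes the NPP sits directly above the head's tilt pivot when the camera is level; real head geometries vary, so treat this as illustrative only:

```python
import math

def slide_correction_mm(npp_height_mm, tilt_deg):
    """Approximate fore/aft slide adjustment needed after tilting.

    Idealized geometry: with the camera level, the NPP is assumed to sit
    directly above the head's tilt pivot at height npp_height_mm. Tilting by
    tilt_deg swings the NPP off the vertical pan axis by roughly
    h * sin(tilt), so the slide must shift by about that amount to
    re-centre the NPP over the axis.
    """
    return npp_height_mm * math.sin(math.radians(tilt_deg))
```

For an NPP 100mm above the pivot, a 30-degree tilt moves the NPP about 50mm (roughly two inches) off the pan axis, consistent with the "several inches" of error noted above for horizons composed near the frame edge.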

Figure 14.  Since the camera in Figure 13 is no longer rotating around the lens’ NPP, I need to take steps to re-establish the NPP before I begin to take overlapping shots for a stitched image.  If the horizon is composed toward the extreme lower or upper edge of the frame (e.g. common when taking star photo landscapes) as simulated here, you can see that the camera was rotating several inches behind the NPP and required shifting the pano slide significantly forward.  Knowing where the NPP is on your lens barrel helps make these adjustments, and is made even easier if you mark the lens barrel (for a prime lens only).

If using a fully adjustable gimbal or pano-gimbal tripod head (Figure 6), once leveled, attach the camera-lens-pano slide combination to the head at the NPP marking, and shift the vertical riser of the head to the left or right as needed to center the lens over the horizontal axis of rotation.   Now the lens is centered along both axes at the NPP and you won’t have to make any further adjustments to the pano slide after you’ve positioned the horizon where you want it.

What’s more, if you need to capture multiple rows for the final stitched image, by using a pano-gimbal head you won’t have to worry about re-establishing the NPP when you rotate the lens in the vertical plane (Figure 6; or click on the video below demonstrating how these heads precisely rotate a lens around the NPP). This efficiency is one of the big advantages of these types of tripod heads over multi-way panning heads or ballheads when capturing images for stitching.

I hope you found some of this information useful for capturing images meant for stitched compositions. If you have any questions regarding the information provided in this tutorial, please leave a comment or contact me at

© Beau Liddell, All rights reserved.


Planning Milky Way Landscapes

Figure 1.  Ancient starlight illuminates an old growth mixed pine hardwood forest at Itasca State Park near Lake Alice, Minnesota.  © Beau Liddell, All rights reserved.


Over the past few years digital camera sensors have advanced to the point where the resolution, dynamic range, and overall image quality of the final output exceeds what could be captured on film. For no other genre is this more true than astrophotography.  Combined with advancements in post-processing software, now we can create relatively noise-free imagery while revealing subtle starlight that was simply impossible to capture well with film.

As for myself, I began dabbling in star photography after investing in a full-frame sensor in 2013. Due to my formal science background I’ve always been keenly interested in the universe beyond our solar system, and as an artist have always been inspired by various renditions of galaxies, particularly our own Milky Way.

Rather than deep sky star photography, I’m more drawn to nightscapes that include a large portion of the night sky with the Milky Way as the main subject. Capturing Milky Way landscapes has become one of my passions, and for up to a week each month from March through September I’ll rearrange my schedule and sleep cycle in pursuit of a nice Milky Way photo at my favorite locations… my kind of night life!

If you’ve tried your hand at star photography you know there’s a lot involved, and appropriate planning helps me better achieve my artistic vision within this genre. In this tutorial I’ll describe the information, tools, and workflow I use to plan my Milky Way Landscape star photos.

Please know that I don’t work for and am not being paid to promote any product mentioned in this post or on my video blogs.  Since the initial learning curve associated with taking Milky Way landscapes can be steep, my students often ask for tips on planning these shots. So, I’m simply sharing information and my experience as of 2016 with what I consider to be some of the better tools available on the market to help plan these types of shots. If you prefer to view a video of this material, please visit my YouTube page.

As shown in the image above, when shooting the Milky Way I like to keep the stars sharp and round, and include as much of the galactic center as possible since it’s the brightest, most colorful, and most compelling part of our galaxy. When shooting large panoramas of the full arc of the Milky Way (Figure 2), I also need to assess the elevation of the galactic disc relative to the horizon. If the disc is too high in the sky it will be more difficult to capture the scene, and it may even be impossible depending on the lens I’m using. So, ensuring that the Milky Way is positioned exactly how I want it in a scene requires some planning.

Figure 2.  Gigapanorama (13 vertical photos blended together) of the Milky Way from Splitrock Lighthouse State Park near Beaver Bay, Minnesota. Confidently capturing shots like this requires predicting the location of the galaxy in the night sky.  © Beau Liddell, all rights reserved.

There are two general approaches to using the tools I’ll discuss. First, they can be used as planning aids before visiting a location, with the goal of finding the final composition once you’re on-site; in this case you’re simply establishing the general plausibility of the shot in advance. Second, you can use them after you’ve developed a vision for your composition by previously visiting a site and establishing where in the sky you want the Milky Way; here the tools determine whether it’s even possible to achieve your vision and, if so, the best time of night and time of year to capture it.

Of course these aids can also be used for planning other types of night sky photos such as those involving constellations, star trails, or more complicated multi-image panoramic compositions (Figures 3-5).

Figure 3.  The big dipper and aurora borealis over a bur oak savanna in central Minnesota. Four-image stitched panorama.  © Beau Liddell, all rights reserved.

Figure 4.  Star trails through the ecliptic over Dorace Lake at Lake Itasca State Park near Lake Alice, Minnesota. A time-lapse composite, comprised of 28 long exposures and one foreground exposure. © Beau Liddell, all rights reserved.

Figure 5.  Northern lights and the Milky Way shine through the atmosphere above a wildlife management area in central Minnesota during the 2015 St. Patrick’s Day solar storm. Comprised of 16 vertical images.  © Beau Liddell, all rights reserved.

There are many tools available for planning and timing these types of shots, and it seems more resources are being developed all the time. The ones I’ll reference below are simply those I’ve used since 2014 to plan out my Milky Way photos. Of course, you might find other tools on the market or resources available on the Internet that work just as well as the ones I’ll introduce.

All of the tools I’ll discuss are computer-based. To be fair though, good hard-copy topographic maps, air photos, star charts and planispheres (Figure 6) can be almost as effective in planning Milky Way landscapes, especially if you have experience observing the night sky and are familiar with identifying certain key stars and constellations. However, hard-copy tools aren’t nearly as convenient and thorough as the computer-based apps I’ll introduce; indeed, this is a case where technological advancements shine brightly.


Figure 6.  With some experience you can effectively plan Milky Way landscapes using topographic maps, air photos, star charts and planispheres, but Internet & computer-based aids are more convenient and precise in estimating the location of the Milky Way when you plan on venturing out under the stars.

If you’re new to Milky Way landscape photography, you’ll benefit from an in-depth planning approach at first. But after developing some familiarity with the night sky, how it changes overnight and throughout the year, and the various aids available, you’ll find you can plan these shots with ease.

STEP 1 – Find the Darkest Locations

Aside from overcoming any fears or anxieties you might have about working at night, and ensuring you’re fully prepared with the proper outdoor wear and survival gear, the first thing you need to do when planning a Milky Way landscape photo is to find a dark enough location, since even slight light pollution can obscure the Milky Way too much to capture it well.

In addition, light pollution drastically influences the color balance of the sky (Figure 7) so the less urban lighting the better.

Figure 7.  Our home galaxy shining through the early-morning sky over a wild rice lake in central Minnesota.  Light pollution from urban areas within 70 miles significantly influences the white balance of the night sky, and can make it tricky to correct during post-processing.  © Beau Liddell, all rights reserved.

If there’s any urban light pollution in the direction I’m viewing the Milky Way, I’ll try to find a spot that’s at least 70 miles (and preferably more) from major cities to get the best results.

Unfortunately, it can be hard to find a good, dark location free of light pollution, so being able to assess the darkness of a given location in advance is quite helpful, particularly if you’re unfamiliar with it.

Fortunately, nighttime satellite imagery captured over the past decade by NASA’s Earth Observatory makes it easy to find dark locations and to determine whether any sites you may already have in mind will work.

I can view this imagery on a couple of free websites, including Night Earth and Blue Marble (Figures 8 & 9).


Figure 8.  Night Earth website is one of two Internet browser-based resources I recommend for finding dark locations for shooting the Milky Way.



Figure 9.  Blue Marble is another Internet browser-based resource for finding dark skies that I highly recommend.  This site and Night Earth use nighttime satellite imagery provided by NASA’s Earth Observatory.

Star Walk for mobile devices (Figure 10) will also provide nighttime imagery of earth, but it doesn’t provide views as good as some apps, and it doesn’t include good navigation aids such as roads, place names, or topography.


Figure 10.  This screen shot shows the locator map within StarWalk for mobile devices.  If you don’t already have latitude/longitude coordinates to enter manually, you can still pan around to find dark locations, but it’s not as detailed as the imagery and search engines you’ll find in some of the web browser-based resources on the Internet.

I tend to use Night Earth and Blue Marble the most because those sites provide more detailed imagery, and include information on roads, parks and other place names. I can also overlay daytime air photos and maps, all of which make it easier to navigate.

Within both of these sites I can pan and zoom to regions I’m interested in exploring; the darkest areas are the most devoid of light pollution, and hence the best for viewing the night sky. If your computer and Internet connection are fast enough, you can perform tasks very quickly on both sites.

Both Night Earth and Blue Marble include powerful search engines that will not only find practically any city and town, but also major parks, refuges, large public forest lands and other landmarks as well, thereby expediting assessment of the overnight darkness of areas I’m interested in exploring.

For example, while I expect most search engines to find Mount Rainier National Park, I don’t expect them to know where Spray Park is within the national park. After all, such locations aren’t major landmarks, have no unique identifiable features or infrastructure, and aren’t part of any incorporated municipality. Yet each website found this location very quickly, and there have been very few instances where an area I input wasn’t found using their integrated search engines. In that regard they surpassed my expectations and perform better than most other location databases I’ve come across.

Both of these websites are great, but each has one minor advantage over the other. As of early 2016, Blue Marble displays better baseline map labeling and road functionality in Night Mode than the Night Earth website (Figure 11). If I need to access road information within Night Earth to help navigate, I must disable the night mode and refer to another map.


Figure 11.  Screen shot of the Blue Marble website zoomed to Washington State, showing road information that greatly aid navigating the nighttime satellite imagery and determining general directions to a shooting location.

However, one aspect of Night Earth that I feel is superior to Blue Marble, and one that I particularly like, is that it displays detailed latitude and longitude coordinates in standard degree-decimal format for the site I’ve found (Figure 12), so I can quickly copy and paste them into other site planning apps, a trip itinerary, or a list of my favorite photo-shoot locations. This format is accepted by most map-based apps and, importantly, is considered the GPS standard in the aviation community, which could be important if you ever require emergency search-and-rescue services in a remote area where you’re shooting overnight.
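If a map or GPS reference gives coordinates in degrees/minutes/seconds instead, converting to degree-decimal format is simple arithmetic; a minimal sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to degree-decimal format
    (negative values for southern latitudes and western longitudes)."""
    dec = abs(degrees) + minutes / 60.0 + seconds / 3600.0
    return -dec if hemisphere in ("S", "W") else dec
```

For example, 46°47'24" N converts to 46.79, and 92°30'00" W converts to -92.5.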


Figure 12.  Screen shot of Night Earth showing the coordinate information provided for a dark site you’ve found.  This is a particularly nice feature, enabling you to copy and paste the data into another app or a list of your favorite night sky shooting locations.

I’ve also found a couple of additional sites that display light pollution like a weather radar thunderstorm map, namely Dark Sky Finder (also available for iOS devices) and Dark Site Finder (Figures 13 & 14).  I don’t prefer light pollution displayed in this way, and the search engines on these sites don’t appear to be quite as good as those in Blue Marble and Night Earth. But they are still functional, and Dark Sky Finder also provides some built-in locations with additional information you won’t find on Night Earth or Blue Marble, such as on-site lodging or camping options, seasons for best access, and whether there’s a fee to access a site.


Figure 13.  Screen shot of the Dark Sky Finder website, a version of which is also available for mobile devices.  One feature this site contains is a small database of existing dark sky locations with additional information not found in other apps.


Figure 14.  Screen shot showing the Dark Site Finder website, very similar to Dark Sky Finder, but with a slightly better graphic interface and customization to suit your viewing preferences.

STEP 2 – Find the Darkest Time of the Month – Know the Phases of the Moon

It doesn’t take much moonlight to influence the color balance of the night sky and obscure the faint light and colors of the Milky Way. So once I’ve found a dark enough location to shoot from, I need to target the best (or darkest) days of the month relative to the moon’s phases.

Fortunately there are a number of resources we photographers have at our disposal to track the moon’s phases.

To determine the best night lacking moonlight, I find out what day the new moon phase occurs during the month I want to shoot using information on the Internet such as the Star Date website, by using a mobile app such as VelaClock, or a desktop app such as Deluxe Moon Pro (Figures 15-17).


Figure 15.  Screen shot of the StarDate website where you can access a moon phase calendar to assess the best evenings to attempt viewing the Milky Way.

Figure 16.  Screen shots of VelaClock for iOS devices; it’s also available as a widget for Mac OS X.  Tabular and graphic data are presented on the moon’s phases and rise/set times.  Using the app doesn’t require access to the Internet so long as you’ve entered your location into the app.


Figure 17.  Screen shot of Deluxe Moon Pro for Mac OSX and mobile devices.  This is a very information-packed app with various graphic and tabular data to track the Moon’s phases.

A calendar display of the moon’s phases clearly shows there are 3-4 days on either side of a new moon when too little of the moon will be lit by the sun, and when the moon will be below the horizon during most of the night (Figure 18). These are the nights when conditions relative to the moon’s phases are optimal for photographing the Milky Way, and when I should target my efforts.


Figure 18.  Screen shot of the moon phase calendar within Deluxe Moon Pro, showing the darkest nights of the month when it should be sufficient to view and photograph the Milky Way.
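That “window around the new moon” rule of thumb is easy to compute yourself. Here’s a minimal sketch in Python that flags whether a night falls within a few days of a new moon, given elapsed days since any known new moon; the reference epoch and the 4-day window are my own assumptions, not values taken from any of the apps above:

```python
SYNODIC_MONTH = 29.530589  # mean days between successive new moons

def days_from_new_moon(days_since_reference_new_moon):
    """Offset in days from the nearest new moon.

    The argument is elapsed days since any known new moon,
    e.g. the new moon of 2000-01-06 18:14 UT.
    """
    phase = days_since_reference_new_moon % SYNODIC_MONTH
    return min(phase, SYNODIC_MONTH - phase)

def dark_enough(days_since_reference_new_moon, window_days=4):
    """True when the night falls within the dark window around a new moon."""
    return days_from_new_moon(days_since_reference_new_moon) <= window_days
```

Any moon-phase app effectively presents this same calculation graphically, along with the rise/set times the simple version above ignores.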


Many planetarium and weather apps provide a handy means of tracking the moon’s phases as well as rise and set times, and fortunately many can function without access to the Internet. Other popular apps not already mentioned include VelaTerra, The Photographer’s Ephemeris, PhotoPills, Moon Calendar, and Phases of the Moon. Many are available for both iOS and Android mobile devices, and some are also available for Mac OS X and Windows desktop operating systems.

Deluxe Moon Pro is the app I use the most on my computer to assess monthly shooting windows relative to the moon’s phases, and if you’re into shooting or observing the moon, it’s packed with a ton of good information.

When I don’t have access to my desktop machine, I like to use the planning features in PhotoPills (available for iOS, with an Android release scheduled for late-2016), as well as VelaClock.

STEP 3 – Find the Best (Darkest) Time of Night – Understand Twilight

Once I’ve established which days of the month I can best view the Milky Way around the new moon, I need to determine the best time during the night(s) that I plan on venturing out.

This requires an understanding of twilight so I can restrict my Milky Way shooting to the darkest times of the night.

Figure 19 illustrates twilight, the glow in the night sky that occurs while the Sun is below the horizon, between sundown and complete darkness in the evening and between complete darkness and sunrise in the morning.  Landscape photographers are always aware of twilight, at least near dawn and dusk, but it’s just as important to be aware of the other periods of twilight to make the best Milky Way landscapes.


Figure 19.  A schematic depicting the daily periods of twilight and the difference between sunrise/sunset and dawn/dusk.  The angles and size of the sun are not shown to scale.  Illustration courtesy of T.W. Carson.

Twilight is caused by refraction and scattering of the sun’s rays in Earth’s upper atmosphere, which illuminates the lower atmosphere. It is classified into three periods or types: civil, nautical, and astronomical, as shown in Figure 20.


Figure 20.  The three types of twilight, civil, nautical, and astronomical, and their measurements.  The sun is drawn to scale.  Illustration courtesy of T.W. Carson.

As illustrated in Figure 21, the length of twilight at a location you might be interested in shooting depends on its latitude (with equatorial and tropical regions experiencing shorter twilight than regions at higher latitudes), and also depends on the time of year (with longer twilight experienced during the summer compared to winter months). So, it’s important to be able to predict when the darkest times of the night will occur for your shooting location and time of year.


Figure 21.  Approximate length of daylight, in hours, as a function of latitude and time of year.  Illustration courtesy of Thesevenseas.

Obviously, all other things being equal, the farther below the horizon the Sun is, the dimmer the twilight (Figure 20).

Civil twilight is the brightest form of twilight, and only the brightest celestial and space objects can be observed at this time. This period of twilight occurs in the morning and evening when the sun is between 0 and 6 degrees below the horizon.

Nautical twilight is the next brightest form of twilight and the term dates back to when sailors used the stars to navigate the seas. During this time we can easily see the brightest stars, and it occurs in the morning and evening when the sun is between 6 and 12 degrees below the horizon. However, during nautical twilight there is usually too much atmospheric light to see the faintest night sky objects, including many individual stars.

Astronomical twilight is the darkest type of twilight, occurring in the morning and evening when the Sun is between 12 and 18 degrees below the horizon.
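Those three bands are defined purely by the Sun’s altitude, so the classification is simple to express in code. A minimal sketch using the standard threshold angles given above (the function name is my own):

```python
def twilight_phase(sun_altitude_deg):
    """Classify the sky state from the Sun's altitude in degrees above the horizon."""
    if sun_altitude_deg >= 0:
        return "day"
    elif sun_altitude_deg >= -6:      # Sun 0-6 degrees below the horizon
        return "civil twilight"
    elif sun_altitude_deg >= -12:     # Sun 6-12 degrees below the horizon
        return "nautical twilight"
    elif sun_altitude_deg >= -18:     # Sun 12-18 degrees below the horizon
        return "astronomical twilight"
    else:
        return "night"                # the sky is as dark as it will get
```

The apps discussed below report the times at which the Sun crosses these same altitude thresholds for your location.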

While all the brightest stars are visible during astronomical twilight, and the Milky Way first becomes visible to the naked eye as well as to a good camera sensor during this period, there is still significant twilight obscuring the light and color of faint stars, nebulae and galaxies, including much of the Milky Way.

Astronomical twilight also produces a noticeable blue tint to the sky that is difficult to overcome in post-processing. This is important, as the brightest parts of the Milky Way contain interesting color that can’t be recovered if you shoot during astronomical twilight.

The image in Figure 22 was taken almost 20 minutes after astronomical twilight dawn in June. Normally I want much more contrast as well as color variation, particularly in the upper parts of the sky and the dust lanes in the Milky Way. But, since I was well within twilight, the photo has a noticeable blue cast, even after correcting the white balance quite a bit, and overall the contrast in the sky is too low with washed-out, more monotone darks than I’d prefer. Furthermore, much of the light and dark detail in the Milky Way has been lost and won’t be recoverable. I’d have to spend a lot of time further tweaking the contrast to get anything more out of this photo, and it still wouldn’t provide as good a result as if the shot had been taken just 15-20 minutes earlier.


Figure 22.  Photo of the Milky Way above Ellingson Island and Little Two Harbors along Minnesota’s north shore of Lake Superior.  The sky in this shot has a noticeable blue color cast (with lower contrast and washed-out detail in the Milky Way) caused by shooting the scene during astronomical twilight.  © Beau Liddell, all rights reserved.

Notwithstanding cloud cover, moonlight, solar storms, or urban light pollution, once astronomical twilight ends in the evening, the night sky is as dark as it will get (although not completely dark due to airglow in the lower atmosphere as shown in Figure 23), and any celestial bodies that are viewable by the naked eye can then be observed. The night sky will remain dark enough to view the Milky Way and other faint celestial bodies until astronomical twilight begins in the morning. So, the period of time overnight between astronomical twilight dusk and dawn is when I restrict shooting my Milky Way landscapes to achieve the best results.


Figure 23.  Nineteen vertical images were stitched together to create this large giga-panorama of the Milky Way over Minnesota’s north shore of Lake Superior.  The light cyan tint to the sky is caused by airglow in the lower atmosphere between 0 and 12 degrees above the horizon.  © Beau Liddell, all rights reserved.

My favorite aids for estimating shooting times relative to twilight are the VelaClock and PhotoPills mobile apps mentioned in the previous section. Certainly there are other weather, astronomy and photography apps that will provide this information, but I’ve found these two apps to be among the best.

VelaClock provides both a color-coded graphic of twilight periods reminiscent of The Photographer’s Ephemeris (TPE) that you might already be familiar with, as well as a tabular readout of twilight time periods for any saved locations (see Figure 16).

PhotoPills also provides twilight information in graphic and tabular form, but is only available for mobile devices (Figure 24). Like VelaClock and TPE, PhotoPills displays a color-coded graphic on the timeline pane, but unlike VelaClock, I can manually scroll forward or backward through the twilight intervals on the timeline. The map changes hue & color as I advance from daylight through the various periods of twilight. When used with some of the other planning aids in PhotoPills that I’ll describe next, I find that it’s one of the best all-around apps for planning my Milky Way photos.


Figure 24.  Screen shot of the PhotoPills app for iOS mobile devices (Android release planned for late-2016) showing the twilight information pane, map and timeline pane within the app’s planner.

STEP 4 – Locate the Milky Way

So, at this point I’m almost ready to go out and shoot the Milky Way at a site. But the stars won’t appear in the same location in the night sky throughout the year due to the Earth’s orbit around the Sun (which is oriented at about 63 degrees relative to the galactic plane), and obviously won’t remain in the same location throughout any given night due to the Earth’s rotation.  For a nice, concise introduction to the Milky Way for photographers, see Andrew Rhode’s blog “A Photographer’s Guide to the Milky Way.”

Although it’s impossible to notice star movement at any given instant, if you create a time-lapse from successive photos taken at a site (see my YouTube video beginning at 16m:25s), or if you reference certain stars or constellations relative to landmarks on the horizon over an hour or so, it becomes quite apparent that the stars constantly change their position in the sky. Or, to be more precise, we change our position relative to the stars. In fact, objects in the night sky will appear to move from east to west by about 2.5 degrees every 10 minutes, roughly a couple of finger-widths on your outstretched arm.  Over an hour, the movement is about 15 degrees!
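That drift rate falls straight out of Earth’s rotation: 360 degrees per sidereal day of roughly 23.93 hours. A quick sketch converting minutes of waiting into degrees of apparent sky movement (the helper name is mine):

```python
SIDEREAL_DAY_HOURS = 23.9345  # Earth rotates a full 360 degrees in one sidereal day

def sky_drift_degrees(minutes):
    """Approximate apparent east-to-west drift of the stars, in degrees."""
    return 360.0 / (SIDEREAL_DAY_HOURS * 60.0) * minutes

# sky_drift_degrees(10) is about 2.5 degrees; sky_drift_degrees(60) about 15 degrees
```

This is also why untracked exposures must stay short to keep stars from trailing, a topic for the capture tutorial.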

Therefore, shooting a particular Milky Way landscape with the galactic center exactly where I want it requires working knowledge of how the night sky changes throughout the year, whether it’s even possible to observe the Milky Way at a particular location, and, if it is possible, the ability to predict with some confidence when it will appear where I want it.

Fortunately there are graphic-based planetarium and photography apps that can do this for me.

I’ve found numerous apps that I like, but some of my favorites, in no order of importance, include StarWalk, SkyGuide, StarChart, SkySafari, Stellarium and PhotoPills.

All these apps precisely predict apparent star movement through time, and for any given location on Earth show me the position of objects of interest (including the Milky Way) relative to the horizon, as well as relative to the basic cardinal and intercardinal directions. Armed with this information, I can then scout any particular site during the day with a compass and clinometer, as well as using mapping aids in studio to predict if the Milky Way will occur roughly where I want it in a scene at any time during the year. After just a little time playing around with these apps you can develop a very good understanding of how the night sky changes over time.

StarWalk was the first app I stumbled across for determining the location of the Milky Way (Figure 25). I still use it from time to time, and it’s available for both iOS and Android mobile devices. Frankly it has a lot of fancy, unnecessary bells and whistles, but it’s still a really cool learning resource, as are all planetarium-type apps.


Figure 25.  Screenshot of StarWalk for mobile devices.  StarWalk 2 is now available, but is really the same app with slightly different graphics.

StarWalk’s graphics quality and overall responsiveness are great, and I love that it can be used without an Internet connection. It’s easy to pan, zoom, and otherwise orient the view in the direction you’ll be viewing the sky at the location of interest.

Once I’ve entered a location, oriented the view in the proper direction, and placed the horizon similar to where I’d place it in the frame of my camera, my favorite feature is that I can easily set up a time-lapse by minute, day or month and quickly assess the approximate elevation, angle and position of the galactic disc and galactic center in relation to the horizon and the cardinal directions, and determine how that changes through time.

It’s also a very handy app for identifying individual constellations, stars and other celestial objects (Figure 26), and I find it much easier to use for those purposes than referencing a hard-copy star chart or planisphere.


Figure 26.  Screenshot of detailed information provided by StarWalk when needing to identify particular celestial objects.

Stellarium is another app I use to predict the location of the Milky Way (Figure 27), and is available for mobile devices as well as Mac OS and Windows desktops. It’s very popular with astronomers as well as many star photographers, and the best part is that it’s free. It excels over StarWalk in overall capability, and I use it often for planning shots when I’m sitting at my studio computer.


Figure 27.  Screenshot of Stellarium for desktop computers, a popular, free application for astronomers and photographers alike.

One of the things I like best about this app is the coordinate grids I can enable, including both equatorial and azimuthal grids, which let me obtain coordinates of the galactic center and use them with a compass and clinometer on-site to plan and visualize a composition very accurately.  Stellarium for desktop is a tremendously powerful visual tool for planning Milky Way landscape photos, but it has a steeper learning curve than most other apps. Should you decide to try it out, the online user’s guide will greatly assist with learning the app.

Since first using StarWalk I’ve stumbled across other planetarium apps to help plan Milky Way landscapes, including Sky Guide for iOS, Star Chart for iOS & Android devices, and SkySafari for iOS devices.

Sky Guide seems to have the best graphics of all the planetarium apps I’ve experimented with (Figure 28), and it also makes it easy to enter custom location information and perform a virtual time-lapse. Otherwise its functionality is on par with StarWalk, although it is a little cheaper. Star Chart and SkySafari also provide similar functionality and have excellent graphics (Figure 29). So you can’t go wrong with any of these relatively cheap apps in lieu of StarWalk.


Figure 28.  Screenshot of SkyGuide for mobile devices.


Figure 29.  Screenshot of StarChart for mobile devices.

The mobile versions of Stellarium have a much gentler learning curve than the desktop version. While the graphics aren’t as good as Sky Guide, SkySafari or Star Chart, they’re about as good as StarWalk, and like the desktop version they can display a grid with elevation and azimuth data, which I find very handy for planning shots, especially if I want that type of information in graphic rather than tabular form (Figure 30).


Figure 30.  Screenshot of Stellarium for mobile devices.

The final app I’ve used to predict the location of the Milky Way throughout the year and over any given night, and the one I’ve used the most over the past four years to plan my Milky Way photos, is PhotoPills. Figures 31-33 show the November 2015 version of the app on my iPad. The graphic interface and ground-level views of the night sky in PhotoPills aren’t as fancy as in the previous apps I’ve discussed, but they provide all the crucial information I need to plan just about any Milky Way photo. In many respects I feel PhotoPills is superior to the apps mentioned above. That it works on mobile devices, is quite useful for many other photographic applications, and functions without an Internet connection are all nice features.


Figure 31.  The intro screen of PhotoPills for iOS mobile devices.  The tools available in this one app are amazing.  Once you’ve entered your shooting location, tap the Planner to access all the tools you’ll need to predict the Milky Way’s location in the night sky.

Figure 32.  PhotoPills also includes all the data you’ll need to track the Moon’s phases, rise/set times and location in the night sky.


Figure 33.  Screenshot of the Planner within PhotoPills showing the timeline, two Milky Way locator tables, and an air photo with a graphic representation of the plane and direction of the Milky Way and galactic center relative to the viewing location.

Information on the Milky Way’s position in the night sky is accessed using the same planner button mentioned earlier when discussing estimation of twilight (Figure 33).

The second-to-last table in the information pane provides data on when the galactic center will be visible overnight, along with its azimuth (in degrees clockwise from your choice of true or magnetic north) and its elevation/altitude in degrees above the horizon. These are the same data I can get from the grid overlay in Stellarium, but they are more accurate and quite helpful for planning a Milky Way composition. The elevation data in particular help me determine whether the galactic center might be obscured by topography, or dimmed by atmospheric extinction within 5-7 degrees of the horizon.
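If you’re curious how those azimuth and elevation readouts arise, the conversion from the galactic center’s fixed equatorial coordinates (roughly RA 266.4°, Dec −29.0°) to local altitude and azimuth is a short spherical-trig calculation. Below is a rough sketch using a low-precision sidereal-time approximation; the function and constants are my own simplification, accurate to about a degree, and not anything PhotoPills itself exposes:

```python
import math

GC_RA_DEG, GC_DEC_DEG = 266.4, -29.0   # approx. galactic center (Sgr A*), J2000

def alt_az(ra_deg, dec_deg, lat_deg, lon_deg, days_since_j2000):
    """Rough altitude and compass azimuth of a sky object for an observer.

    lon_deg is east-positive; days_since_j2000 is fractional days from
    2000-01-01 12:00 UT.  Good to about a degree -- fine for planning.
    """
    # Low-precision Greenwich mean sidereal time, in degrees
    gmst = (280.46061837 + 360.98564736629 * days_since_j2000) % 360.0
    ha = math.radians((gmst + lon_deg - ra_deg) % 360.0)   # local hour angle
    lat, dec = math.radians(lat_deg), math.radians(dec_deg)

    alt = math.asin(math.sin(dec) * math.sin(lat)
                    + math.cos(dec) * math.cos(lat) * math.cos(ha))
    # Azimuth measured from south, then shifted to clockwise-from-north
    az = math.atan2(math.sin(ha),
                    math.cos(ha) * math.sin(lat) - math.tan(dec) * math.cos(lat))
    return math.degrees(alt), (math.degrees(az) + 180.0) % 360.0
```

One property this makes obvious: from a latitude of 47°N the galactic center can never climb higher than about 90° − 47° − 29° ≈ 14° above the horizon, which is why elevation data matter so much when scouting northern sites.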

By tapping the Milky Way button on the left side of the table, the map displays a thick grey line indicating the azimuth at which the galactic center will first become visible at the site, and a dark grey line marking the azimuth at which it will disappear. The same display overlays a radial grid centered on the site (representing what’s called the celestial sphere), showing the location of the galactic center (denoted by the largest white dots and a thick white line), as well as the plane and overhead elevation of the Milky Way relative to the celestial sphere. If I advance the timeline, the plane of the Milky Way rotates around the sphere and the elevation changes too. The graphics and associated data give me an excellent preview of what the sky will look like on-site.

The last table in the information pane provides instantaneous readouts of the azimuth and elevation of the galactic center and the maximum overhead elevation of the Milky Way for the site, and these change as I shift the timeline.  The graphic on the Milky Way button to the left of the info pane’s table also rotates with the timeline, giving another way to visualize the angle of the Milky Way relative to the horizon. The blue indicator bar to the left of the table shows how visible the Milky Way will be during the darkest hours of the night for that date; more bars represent the darkest nights with little to no moonlight (e.g. on or immediately around a new moon). Tapping once on the Milky Way button to the left of the indicator advances the date to the next new moon, and tapping twice moves back to the previous new moon date.

If that weren’t enough, the Night AR (augmented reality) button in PhotoPills will show you an estimate of the position of the Milky Way as if you were looking at it at the time, date & location specified (Figure 34). This feature only works on mobile devices with built-in gyros and compasses. As cool as it is, since the other locator features of the app provide all the data needed to assess the location of the Milky Way with confidence, I find that I rarely need it.  However, the AR feature does mean you can also assess on-site whether terrain or vegetation will obscure the galactic core.


Figure 34.  Screenshot of the Night AR (Augmented Reality) feature in PhotoPills, a more visual tool to assess the elevation and azimuth of the Milky Way from your viewing location.

The graphics and tabular data in PhotoPills and other apps are very helpful to assess the angle of the Milky Way as it elevates from the horizon to plan compositions like the one shown in Figure 35, and to plan out stitched panoramas of the full arc of our galaxy from horizon to horizon (Figures 2, 23, 36 & 40).

Figure 35. The planning features within PhotoPills made it much easier for me to determine the best time of year and night to capture this shot of the Milky Way rising above the St. Louis River Valley near Carlton, Minnesota. © Beau Liddell, all rights reserved.

Milky Way Savanna

Figure 36.  Milky Way above an oak savanna in central Minnesota. Giga-panorama comprised of 15 stitched vertical photos.  PhotoPills makes it very easy to confidently plan these types of shots.  © Beau Liddell, all rights reserved.

If you’re not experienced viewing the night sky, it’s to your benefit to learn at least a few key constellations, stars and deep-sky objects that will help you locate the center of the Milky Way in case you don’t invest in one of the above-mentioned apps, or in case you lose power to your mobile device.

Although under the best conditions, at the right time of year and night, the core of the Milky Way should be readily apparent to our eyes, it can still be challenging to precisely position certain parts of it within the viewfinder under some circumstances.  This is particularly true of the darker portions of the galactic core, and is particularly important when doing deep-sky photography of the Milky Way at higher magnifications.

But, if I learn some of the key constellations within and on either side of the Milky Way, it becomes very easy for me to position my viewfinder precisely without any aids (Figure 37).  For example, many of the stars in the constellations Sagittarius & Scorpius become visible during twilight before the Milky Way does (see Figures 25, 28 & 30 showing the location of Sagittarius).  If I’m able to find those constellations, then I know the center of the Milky Way is between them, and I can place it accurately when setting up a shot or predict its location in the coming hour and begin scouting multiple shots on location prior to the best viewing times.

Soupy mixture of stars, gas and dust at the Milky Way's galactic center as viewed from Lake Itasca in Minnesota's first state park. After waiting over a month for the right conditions, this was my first attempt at capturing a long-exposure star photo with the aid of an equatorial mount.

Figure 37.  Knowledge of the stars in the constellation Sagittarius and associated deep-sky objects, including the Lagoon Nebula, enabled me to precisely position and compose this telephoto shot of the galactic center without any aids.  © Beau Liddell, all rights reserved.

STEP 5 – Terrain & Skyline Assessment

So far the various planning aids have helped me identify a sufficiently dark location, determine the best times to take the photos, and precisely determine where the Milky Way will be when I go out.  But other than the AR feature in some mobile apps, few of the apps discussed so far will help me assess the visible skyline at my chosen location. It could well be that on-site terrain or vegetation obscures the Milky Way from the spot I plan to shoot from. Of course on-site scouting the day before or the day of the shoot is always the most sure-fire approach, but fortunately it’s possible to do a reasonable job assessing the skyline at a site long before venturing out.

One of the most powerful, and definitely best known, free apps can tell you whether the terrain relative to the celestial horizon might obscure the heart of the Milky Way at a location.

Google Earth (Figure 38) is the software I’m referring to. It’s available for Windows & Mac OS operating systems as well as mobile devices, but I find it much easier to use on a desktop computer. Many folks use Google Earth for a variety of purposes, but I’ve found it to be superb for planning some of my landscapes, including those involving the Milky Way.


Figure 38.  Screen shot of Google Earth zoomed to the United States.  This is a tremendous planning aid for assessing the skyline you’ll see from ground view, particularly in mountainous terrain.

Google Earth won’t address issues associated with vegetation such as tall trees that might affect the skyline, so for that I still need to scrutinize aerial photos, although the ones available within the software usually give me a reasonably good idea of the type of vegetation I’ll encounter. But, Google Earth excels at accurately portraying topography, even when simulating ground-level views.

Now, if I’m shooting the Milky Way from a location in the Upper Midwest, I usually don’t need to worry about terrain. But, this app is indispensable if I’m making Milky Way landscapes in areas with a lot of topography, especially in mountain ranges where it’s very difficult to assess the skyline, and enables me to scout out potential shooting locations well in advance, thereby maximizing my productivity once I’m on-site.

As an example, the screen shot of Google Earth in Figure 39 shows a view of Mount Rainier from Spray Park. After assessing timelines and the location of the Milky Way using other tools, I was able to use Google Earth several months in advance of my trip to home in on a location that gave me the skyline composition I wanted, with the volcano in a good position relative to the Milky Way. The only unknown going into the actual shot was the height of the subalpine trees that were clearly visible in the air photo.


Figure 39.  Ground level screenshot within Google Earth from Spray Park at Mt. Rainier National Park showing the skyline (excepting vegetation) I would encounter upon visiting the site.  The app can also visualize the night sky similar to various planetarium apps.

Figure 40 shows the resulting image I got from that exact site in July 2014. The app did a great job assessing the skyline I encountered on-site, excepting the vegetation.

Ancient Ice & Light

Figure 40.  Ancient light from the Milky Way shines through the clear mountain air over Mount Rainier and its ancient glaciers, as viewed from Spray Park in the northwestern quadrant of Mt. Rainier National Park.  © Beau Liddell, all rights reserved.

While planning this shot in Google Earth, I also panned the ground-level view past 180 degrees, and combined with information obtained from PhotoPills was able to confidently plan the capture of a large multi-image panorama (Figure 41) taken from the same location, the first time to my knowledge such a shot had been taken from that part of the National Park. Having established the location’s coordinates in Google Earth, I was able to efficiently execute my hiking trip and spent only a few minutes setting up the final compositions for these shots so that the vegetation wasn’t obscuring the volcano or the Milky Way.


Figure 41.  Information from Google Earth and PhotoPills helped me plan out this stitched panorama of 19 vertical images four months prior to visiting the site.  © Beau Liddell, all rights reserved.

Google Earth can also simulate the night sky (Figure 39), but I generally don’t use it for locating the Milky Way since I’ve found it to be more cumbersome than using PhotoPills or any of the planetarium-type apps. The take home message is that Google Earth is a terrific aid to assess topography and skylines where all other apps are lacking, and thereby minimizes one more unknown prior to visiting a location.

Since originally writing this tutorial, The Photographer’s Ephemeris (TPE) 3D app has been released, which provides accurate ground-level views of the skyline from a site, allows you to simulate the field of view for a given focal length, and also includes an AR feature similar to PhotoPills.  However, as mobile apps go, the TPE apps are pretty expensive.

STEP 6 – Watch the Weather – Seek Clear Skies

Now, everything is coming together. But, one obvious last-minute factor that can still thwart my ability to get a shot of the Milky Way is the overnight weather conditions at my location of interest.

Simply put, clouds are bad if I want to view the Milky Way. The last thing I want is to risk spending the time, effort, and money to reach a remote location only to be thwarted by the weather. Yes, the skies can always turn unexpectedly since forecasts aren’t completely accurate. That was the case when I went to make the panorama shown in Figure 42: the forecast predicted clear skies throughout the night, but instead clouds started moving in at the prime viewing time. Fortunately they were sparse enough for an acceptable result. In general though, by routinely consulting the weather forecast, I’ve rarely had unexpected cloud cover spoil a Milky Way photo shoot.

Splitrock Starlight

Figure 42.  During a crisp April morning, Starlight, the Earth’s original navigation chart, glows brightly over Lake Superior as seen from the shore of Splitrock Lighthouse State Park near Beaver Bay, Minnesota.

Therefore, a weather app or weather radio is a crucial tool I use to plan shots of the Milky Way. If I’ll have access to the Internet I can get a more detailed 24-hour forecast than if I use a weather radio, and I’ve used a number of mobile weather apps and websites to assist my planning. By far the best ones are those that provide forecasts at least every few hours, such as the free Weather+ app for iOS and Android mobile devices (Figure 43).


Figure 43.  Screenshot of the Weather+ app for mobile devices that I use for tracking the short-term weather forecast prior to departing for my shooting locations.

But the single best resource I’ve found, at least for the United States, is the National Oceanic and Atmospheric Administration’s (NOAA) National Weather Service 7-day forecast website. The site enables me to obtain a spot forecast for the exact site I’ll be shooting at by using the map on the right side of the page (Figure 44).


Figure 44.  Detailed 7-day NOAA weather forecast for Spray Park, Washington in Mt. Rainier National Park.

There’s a lot of good information in the 7-day forecast synopsis, and it gives me a good idea of how clear the night skies might be when I go out.  In the case shown in Figure 44, if it were the right time of the month and year to capture the Milky Way in the southern sky, Thursday through Saturday evening look promising.  But it’s the hourly weather data on this site that I consult most frequently since it provides a wealth of information useful in keeping up-to-date with the near-term weather (Figure 45).


Figure 45.  NOAA’s hourly weather forecast graphs are the best source of information regarding the short-term conditions in your location of interest.  For Milky Way landscapes, the cloud cover data denoted by the blue line on the bottom graph is most useful in predicting if the conditions will be good enough to view the stars.

Obviously the amount of cloud-cover is the most important weather variable I’m interested in, but there are also other parameters I find useful. Generally I want to get a completely clear night, but I might be able to get acceptable results if the sky cover is predicted to be in the 15-20% range. Overnight temperatures and winds will tell me what type of clothing I’ll need, and relative humidity, dew point, and wind speed will help me assess whether frost or dew formation on my lens might be an issue to contend with, among other things.

In general I find the short-term NOAA weather forecasts are accurate enough to give me confidence about whether or not the night(s) I want to go out will provide acceptable viewing conditions and hence worth making a trip.  But, these forecasts aren’t available outside the United States, so you’ll have to look around for similar sites if you’ll be shooting in another country.

If you don’t want to consult detailed weather data, you can also check out the Clear Sky Chart website that provides a 48-hour sky forecast for relatively large regions (Figure 46).  In my experience the chart forecasts aren’t as accurate as hourly weather forecasts, but the site does provide a fair number of charts across the globe.  It doesn’t take much time to learn how to read the charts, and the page for each chart also links to a pollution map like those provided by Dark Sky Finder and Dark Site Finder.

Figure 46.  Screenshots of the Clear Sky Chart website showing a sky chart (left) and linked light pollution map (right) for Mt. St. Helens, Washington.

That pretty well covers the basic planning approach and tools I routinely use to capture Milky Way landscapes. By using these or other convenient planetarium & photography apps, and by consulting appropriate Internet resources, you’ll improve your odds of being in the right place at the right time to capture compelling nightscapes of the Milky Way.

I hope you found some of this information useful in planning your own star photos. If you have any questions regarding planning Milky Way landscapes, please leave a comment or contact me at

Now, go shoot for the stars!

© Beau Liddell, All rights reserved.