Focus Ramping now supported with the Fuji X-T2

I've been doing testing this week with the Fuji X-T2 and have focus control working very well with it! 

The way focus is controlled via USB with Fuji differs a bit from Canon and Nikon.  With both Canon and Nikon, there's no way to query the current focus position, and there's no feedback if it hits a limit.  There are simply relative move commands that tell the camera to move the focus farther or nearer, and nothing else.  To work with this, the VIEW internally tracks the focus position and sends the camera relative commands, which works well as long as you don't hit the limits during setup (the VIEW has no way to know the focus didn't move, so the internal position becomes incorrect and you'll need to start the setup again).
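The Canon/Nikon-style tracking described above might be sketched like this. The class and callback names are hypothetical, purely for illustration of the open-loop bookkeeping involved; they are not the VIEW's actual code.

```python
class RelativeFocusTracker:
    """Tracks an assumed focus position when the camera only accepts
    relative move commands (Canon/Nikon-style).  The camera gives no
    feedback, so this estimate is only valid if the lens never hits
    its focus limits during setup."""

    def __init__(self, send_relative_move):
        self._move = send_relative_move  # callback that drives the lens
        self.position = 0                # arbitrary origin chosen at setup

    def move(self, steps):
        # Positive = farther, negative = nearer (convention is assumed)
        self._move(steps)
        self.position += steps  # trusted blindly -- wrong if a limit was hit

    def reset(self):
        # Must be called if the lens hit a limit and setup restarts
        self.position = 0
```

Because the position is only an assumption, hitting an end stop silently invalidates it, which is exactly the failure mode described above.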

Fuji is different -- in fact, it doesn't even have relative move commands.  Instead, the current position can be queried, and the focus move command sends it to an absolute position.  This sounds wonderful, but it has its own set of quirks.  For one, when sent to an absolute position, the focus may not reach it exactly -- it will often overshoot, especially when moving a very small or very large amount.  To work around this, I query the position right after sending the move command, and if it's not on target, I send the same absolute position command again until it hits it.
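The send-then-verify loop might look something like this. The two callbacks stand in for the underlying PTP calls and are illustrative names, not the actual API; the retry cap is an assumption to avoid looping forever.

```python
def fuji_move_absolute(query_position, send_absolute, target, max_retries=10):
    """Drive a Fuji-style absolute focus to `target`, re-sending the
    same command until the queried position matches, to work around
    overshoot on very small or very large moves."""
    for _ in range(max_retries):
        send_absolute(target)
        if query_position() == target:
            return True   # landed exactly on target
    return False          # gave up; caller should re-sync its position
```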

Since the VIEW internally expects the focus to be moved by relative amounts, for Fuji I query the current position and then add the relative move to that to determine the absolute target.  This ended up revealing another oddity: sometimes after taking a picture, the camera's reported focus position is slightly different.  I don't know why, but about one out of three times, the queried focus position before and after a capture would differ by up to 5 steps.  This was an issue since relative moves would compound the incorrect post-capture reading.  I work around this by caching the last absolute position from before the capture and basing the next relative move on that.  This works very well now and holds the position very accurately.  These are the kinds of things I never expect to take so much time!
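The caching workaround might be sketched as follows. The names are illustrative stand-ins, not the VIEW's actual API: the camera is queried once up front, and afterwards each relative move is based on the last commanded target rather than a fresh (possibly drifted) post-capture reading.

```python
class FujiFocusController:
    """Bases each relative move on a cached absolute position, since
    the queried position can drift by a few steps after a capture."""

    def __init__(self, query_position, move_absolute):
        self._query = query_position
        self._move_abs = move_absolute
        self._cached = None  # last known-good absolute position

    def move_relative(self, steps):
        # Query the camera only the first time; afterwards trust the
        # last commanded target instead of the post-capture reading
        base = self._cached if self._cached is not None else self._query()
        target = base + steps
        self._move_abs(target)
        self._cached = target
        return target
```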

Below is a demo of how this is used and set up.  Note that this also works with most Canon and Nikon cameras.

And here's the resulting clip showing the motion and focus ramping:

Firmware with X-T2 support will be released in the next day or two as v1.8-beta11.

Posted on October 6, 2017 .

Fuji X mirrorless cameras now supported by the VIEW Intervalometer

I'm excited to finally announce Fuji X support for the Timelapse+ VIEW Intervalometer!  Fuji X has been the most requested camera family of those not previously supported.

Here's a sunset to night auto ramping time-lapse made with the VIEW Intervalometer and the X-T1:

And here's a day-night-day that ended up getting some nice aurora (surprising here in Southern MN!):

For a while this year, the gphoto2 team and others had been working on it, and then in the last couple of weeks someone graciously loaned me a Fuji X-T1 that I've been using for testing and further reverse-engineering of the protocol.  There were more challenges than I expected, and Fuji is right up there with Sony as far as odd protocol design and breaking standards go, but now that I've worked through the quirks it's actually quite reliable and robust.  I've got full control of shutter, ISO, aperture, and live view, as well as the ability to save to the camera or to the VIEW's SD.

I'm currently renting a Fuji X-T2 for further testing; it's working great and can even do focus ramping!  I'll write more about this soon.

To see some of the discussion on Fuji, even before I got involved, check out the thread here:

Here are a couple more test clips done with the VIEW + X-T1, as well as a video of the setup:

Support for Fuji X is first introduced in VIEW firmware v1.8-beta10 and will be part of the official v1.8 release soon.

Release notes:

Posted on September 27, 2017 .

Total Solar Eclipse -- it worked!

I'm happy to say the VIEW Intervalometer performed very well for the eclipse, allowing me to enjoy the experience with my family while it managed the pre-planned exposures and motion tracking on the telephoto.

I traveled with my wife and three boys (ages 5, 3 and 1) to Emigrant, MT (since everywhere else was too expensive).  We logged a total of 2945 miles over the 6-day trip, which was probably too much, but we made lots of stops and had a fun adventure.

We left at 2am Monday morning to drive down to the path of totality.  I was planning to meet up with Ron Risman's workshop at an awesome spot East of Jackson, but the main road through Yellowstone in our area was closed overnight for construction.  After a bit of offline research the night before (I didn't have data or wifi available!), we decided to go up and around through Bozeman and ended up in the pleasant little town of Driggs, Idaho, which turned out to be a wonderful spot that wasn't too crowded.

I set up two cameras: a wide-angle fixed on a tripod (Canon 5DIII with a 16-35 f/2.8L) and a telephoto (Canon 7D with a 300mm f/4L, 480mm equivalent) on the Sapphire pan/tilt head by Dynamic Perception.  For the telephoto, I used the sun tracking feature I had just added to the VIEW, since without it the sun would leave the frame within a few minutes.  This worked surprisingly well, keeping the sun nearly centered over the entire two and a half hours, so I didn't even need to use the correction feature I had added.

I had designed a filter holder for the telephoto, but the week before leaving on the trip my 3D printer's hot end broke.  I ordered a new one but was too busy to replace it, so I ended up making a holder out of cardboard and gaffer's tape.  I was worried the sun would heat the tape and cause it to detach, but it worked great and made a nice shade to help keep the camera cool as well.  Still, I was surprised by the amount of sensor noise I had, even at ISO 100.

Since I had expected to have more internet access during the trip, I hadn't completely planned my exposure settings beforehand and was planning to do more research.  I ended up mostly guessing, and in the end my exposure choices were not quite ideal, specifically for totality.  I thought totality would be a bit darker than it was -- what surprised me was the dynamic range involved: even while parts of the sky were very dark, the corona as well as the horizon (where the shadow didn't reach) were relatively very bright.

For the telephoto during totality, I did a 5-frame bracket covering a total of 10 stops, but I didn't quite center it right: the brightest image was too bright, so I could have moved the center of the bracket 3-4 stops lower.  Still, since it was a wide range, I managed to capture a lot.  Photoshop wouldn't automatically align the images, and once I manually aligned them as layers, I found it wouldn't perform the HDR conversion on them, so I manually used this technique instead.

With the wide angle, I did a 2-frame bracket during the partial phases, 2 stops apart, which was actually not nearly enough.  I should have done the darker frame 6 stops under in order to at least capture some of what was happening with the sun.  During totality, I used a fixed exposure with a short interval to capture the movement of the shadow, which worked pretty well although it was somewhat overexposed (I should have exposed lower and brought back the shadows in post, since the most interesting things were happening at the top end of the dynamic range).  Additionally, I could have done better on the timing of Baily's Beads (I was distracted and didn't remove the solar filter in time).

While there are things I could have done better, I'm very happy overall and have a wonderful set of footage for my first total eclipse.  Here are the resulting time-lapses: one from just the telephoto setup, and the other mixing wide and telephoto.  I edited them in Lightroom with the Timelapse Workflow plugin, put them together and stabilized them with Apple Motion, rendered with TLDF (for frame blending and noise reduction), and finally assembled the video, music and text in Final Cut Pro.

Telephoto-only version

Mixed wide angle + telephoto edit

And thanks to the calculations provided by Xavier Jubier, the VIEW was able to fully automate the exposures and timing for the eclipse, so the only intervention needed was removing the solar filter during totality.  This allowed me to enjoy the experience with my family rather than just focusing on the camera during the event.  Now that it's been tested with the real thing, I'll improve the eclipse interface on the VIEW so that it's ready and even easier to use for future eclipses.

Observations from the eclipse

It was awesome!  The speed at which totality approaches is incredible and surreal.

It wasn't as dark during totality as I expected, but the change was still very significant, and there was an impressive temperature drop (I went from being uncomfortably warm to too cold).

Shadow bands were fascinating and much more apparent than I had expected, clearly visible across the dirt road and moving rapidly.  I had my iPhone video camera running on a tripod to capture this, but for some reason it stopped right before totality and I didn't realize it.

Now that I'm back I've got a lot to catch up on -- being without internet for a while really set me back on email.  Next up, I've got a few more orders to ship out, firmware version 1.8 to wrap up, and more documentation and videos on motion control to create.

While completely exhausting, it was a fun family trip and very memorable, especially for our 5-year-old, who loved the eclipse. After the eclipse, we drove over the pass to Jackson and on up through the Tetons and Yellowstone, wrapping up a very long day.  I'm very thankful for the support of my wife on this crazy and exhausting trip!  Here we are at a stop on the way back up after the eclipse.

Thanks again for your support!


If you're interested in purchasing prints of the eclipse images, check out the gallery here.



Posted on August 27, 2017 .

Notes on eclipse preparation

Well, I'm now on the road in Rapid City, and the final firmware release for the eclipse is v1.8-beta7.  Hopefully it works!  This past week has been insane with packing orders and testing for the eclipse; it all just feels like a blur.  Please test the firmware first to make sure it works with your setup and you're familiar with the process.

Important!  If using HDR, make sure you're saving to the camera (recommended anyway).  HDR saving to the VIEW's SD card is NOT ready yet.


Pre-eclipse checklist  

  1. Make sure you have v1.8-beta7 installed (enable Settings->Developer Mode to access beta versions, or install via SD card)
  2. Plan your exposure and interval settings for each eclipse event. Use the links provided in the last blog post to help with this
  3. Test it! Disable the GPS and enter the time and location for testing in Settings. The lat/lon are entered in degrees; the time is 24-hour UTC. More info on testing can be found in the previous post.


Eclipse Day

  1. Make sure to enable the GPS or enter the exact coordinates for your final location
  2. Confirm the time is the current UTC time in Settings -> Set UTC Time and Settings -> Set UTC Date
  3. Confirm your setup -- each time you switch to eclipse mode, the settings are reset to the defaults, so be sure to review them all

UPDATE: it worked! Results blog post coming soon!

Posted on August 19, 2017 .

Photographing the 8/21 US eclipse using the Timelapse+ VIEW Intervalometer

Note: this document is a work-in-progress currently based on VIEW firmware version v1.8-beta5, and will be updated as testing continues and more resources are added.

The Timelapse+ VIEW Intervalometer is a powerful auto-ramping intervalometer, and the auto-ramping algorithm is carefully designed not to respond to rapid changes, such as those from car headlights.  During the eclipse, however, the light will change very rapidly during totality, making the ramping algorithm non-ideal if you're in the path of totality (though it would be fine for a partial eclipse).

So over the last few months I've been doing a bit of research and finally put together a special program specifically for the eclipse (actually a lot of the underlying code will be re-used for scheduled programming and long-term features in the near future).

The eclipse program doesn't do auto ramping during the eclipse; it just runs pre-planned sequences at the right times as the eclipse events (circumstances) take place.  This has the advantage of being deterministic, and you can preview the exact plan beforehand.

The VIEW uses Xavier M. Jubier's eclipse circumstances calculator for planning, with permission graciously provided by Xavier.  He also makes the incredibly extensive and powerful Eclipse Maestro software, as well as an exposure calculator you'll want to refer to when planning the presets:

Here's another exposure chart:

This is also a good resource:

And be sure to check out Syrp's guide here:

The following circumstances are supported in the eclipse mode:

Pre-eclipse (before C1)
This is the time leading up to first contact.  I recommend a preset fixed exposure with a solar filter on the lens (if you're using a telephoto lens).  You'll be able to set up and test this exposure in the camera before starting the program.  A 10-second interval or even longer is probably fine here (not much is happening unless there are clouds).

Partial (C1-C2)
This is the beginning of the eclipse.  I recommend keeping the exposure the same as pre-eclipse and maintain an interval of around 10 seconds.

Baily's Beads (C2)
This is a very short period of about 15 seconds where the final bits of the sun are disappearing behind the moon.  I recommend a short interval of about 2 seconds, and no solar filter -- this is where you need to be watching so you can remove the filter at the right moment.

Totality (C2-C3)
Having removed the filter during the Baily's Beads period right before this, the best setting here is to do HDR to capture the corona.  Totality is very short -- only around 2 minutes and 20 seconds depending on where you are.  You could do a couple long HDR sets of 11 exposures 1 stop apart, or several sets of 3 exposures 2 stops apart.
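As a sketch of the bracket arithmetic (each stop doubles or halves the exposure time; the helper below is purely illustrative, not a VIEW feature):

```python
def bracket_shutter_speeds(center_seconds, frames, stops_apart):
    """Return the shutter speeds (in seconds) of an HDR bracket
    centered on `center_seconds`, with `frames` exposures spaced
    `stops_apart` stops apart.  One stop = a factor of two in time."""
    half = frames // 2
    return [center_seconds * 2 ** (stops_apart * i)
            for i in range(-half, half + 1)]
```

An 11-frame, 1-stop bracket spans 10 stops total; a 3-frame, 2-stop bracket centered on 1/60 s runs 1/240, 1/60, 1/15.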

Baily's Beads (C3)
Now we start doing everything in reverse -- same as before, this is a short period of about 15 seconds where you will want to be taking images in rapid succession, then put the solar filter back on as the sun emerges again.

Partial (C3-C4)
Set this to the same exposure as the first partial phase and make sure the solar filter is back on.

Once the eclipse is over, you can set it to auto ramp or just keep going at a fixed exposure.

Note that if you're using a telephoto lens, the range of light is much more extreme than if you're shooting a landscape.  With a landscape you should be able to do alright without a solar filter, but you won't get the detail of what's happening with the sun.

I plan to run two cameras, one with a telephoto tracking the sun with a solar filter, and another wide-angle landscape that switches to auto-ramping after the eclipse and continues until night.

Here's how to test this feature before the eclipse:

  1. Disable GPS (Settings->GPS Module) so you can enter manual coordinates
  2. Find the coordinates on Google Maps of where you plan to be, and enter them in Settings->Set GPS Latitude/Longitude
  3. Change the date to 21 Aug 2017 in Settings->Set UTC Date
  4. Set the time to a little before first contact: 16:00:00 in Settings->Set UTC Time (then check Information->Eclipse Info to verify)
  5. Go to Time-lapse and set Timelapse Mode to 'eclipse'
  6. Configure your settings for each part of the eclipse in Eclipse Circumstances
  7. Check Review Program to see the planned events and settings
  8. START the time-lapse and see how it goes!

One way to verify the timing is to use Eclipse Maestro (Mac) or Eclipse Orchestrator (Windows) preset to the same location and time as the VIEW, and then point the camera at the computer screen to simulate the eclipse.

Note: if you're using time or location settings other than the actual, current ones, the sun tracking feature will not work.  That will need to be tested separately with the current time and actual location set properly.

If you don't have the VIEW Intervalometer but still want to automate photographing the eclipse, try Eclipse Maestro (Mac) or Eclipse Orchestrator (Windows) from a laptop.

Tracking the sun

If you're using a telephoto lens, you'll need some way to keep the sun in the frame. There are a few options for this:

  1. Reposition the camera every couple minutes
  2. Use a polar-aligned astronomical tracker (this is ideal, but hard to align during the day)
  3. Use a pan head aligned with the north star (hard to do during daytime) and set it to turn at 15°/hour
  4. Use a 2-axis pan/tilt NMX or Genie Mini system and use the VIEW's sun tracking feature.

Obviously, of the above options, I'll be covering #4 here: tracking the sun with the VIEW.  A couple of things to note about a pan/tilt head vs. a polar-aligned system: first, the pan/tilt head only needs to be level; you don't need to worry about polar alignment.  Second, the camera frame remains level during tracking, in contrast to a polar-aligned axis, where the frame tilts along with the stars (which is what you want if you're shooting stars).  The level frame can be nice if the shot ends up including the horizon at some point, since the horizon then appears at the bottom instead of in a corner of the frame.  Also, at present this method only supports shoot-move-shoot, so long exposures with a telephoto aren't recommended (up to 1 second is probably OK during totality).

The VIEW calculates the position of the sun based on the current coordinates and time, so as long as the sun is in the frame when it starts, it will continue moving the pan and tilt according to the relative change in the azimuth and altitude of the sun.
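The relative-change tracking described above can be sketched as follows; the function and calibration names are hypothetical stand-ins, not the VIEW's actual code.

```python
def tracking_update(sun_azalt, start_azalt, steps_per_degree):
    """Convert the change in the sun's (azimuth, altitude) since
    tracking began into (pan, tilt) motor steps.  Assumes a level pan
    axis with the tilt axis 90 degrees off it; steps_per_degree is a
    (pan, tilt) calibration pair, made up for illustration."""
    d_az = sun_azalt[0] - start_azalt[0]    # degrees of pan needed
    d_alt = sun_azalt[1] - start_azalt[1]   # degrees of tilt needed
    return (d_az * steps_per_degree[0], d_alt * steps_per_degree[1])
```

Because only the relative change matters, the sun just has to be in the frame when tracking starts; absolute aiming accuracy then depends on the time, coordinates, and how level the head is.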

To enable solar tracking, make sure you either have a VIEW with a GPS or manually enter the exact coordinates, time and date in Settings; connect the NMX or Genie Minis, and you'll find the Tracking option available in the Time-lapse menu.  You'll also need to select a motor for each axis (-R means reversed).

Or, you can use the app via wifi to set it up from your phone, using the joystick mode and live view to find the sun initially.  Then just select "sun" in the tracking section of the time-lapse setup screen.  Be sure to configure the motors so that the left/right and up/down of the joystick are correct.

Tracking isn't perfect.  There are many factors in proper tracking, including an exact GPS position and accurate time, as well as a level tripod (the pan axis should be perfectly level and the tilt axis exactly 90° off the pan axis).  Heavy cameras can cause the head to flex as the weight shifts around, adding to the inaccuracy.  Additionally, backlash can cause a significant amount of error on the tilt axis: if your shot spans solar noon, the tilt will reverse direction and lose some steps due to the play between the gears.  With a 200mm lens none of these things should be too significant, but at 400mm or more it can be hard to keep the sun in the frame for more than an hour.  Be sure to test this beforehand!

Disclaimer and word of caution

This is my first total eclipse and I'm quite excited about it, but I obviously don't have first-hand experience.  If you take my advice on any of this and plan to use the VIEW, please thoroughly test it beforehand so you can be confident in it yourself and not just take my word for it.  I'm just doing what I can from research, putting this plan together for myself, and sharing it with others in the hope that it helps; you'll need to test it and be confident in the process on your own.  I can't be responsible for ruining someone's once-in-a-lifetime event!

Also, I strongly recommend manually running one camera in case something does go wrong.

Another thing to consider is the compounding probability of failure: the more things you depend on, the higher the likelihood of something going wrong.  For example, if the camera, the VIEW, the motion controller, and the power supply each have a 3% chance of failure (a made-up number), then together the setup has about an 11% chance of something failing (1-(1-0.03)^4).
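The arithmetic, assuming the failures are independent:

```python
def setup_failure_probability(component_failure_rates):
    """Chance that at least one of several independent components
    fails: one minus the probability that every component works."""
    p_all_ok = 1.0
    for p_fail in component_failure_rates:
        p_all_ok *= (1.0 - p_fail)
    return 1.0 - p_all_ok
```

Four components at 3% each gives `setup_failure_probability([0.03] * 4)`, about 0.115, i.e. roughly the 11% quoted above.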


Posted on August 16, 2017 .

Turn Adobe Photoshop Lightroom into a powerful time-lapse editor with this Plugin

Turn Adobe Photoshop Lightroom into a powerful time-lapse editor. The Timelapse Workflow plugin is a suite of four essential tools for time-lapse post-processing in Lightroom.

Automated but not controlling: all tools show confirmation dialogs before applying changes, explaining what each is doing and why.  Each step is optional and can be used alone as well.

Streamlined and intuitive, all without leaving Lightroom.

Posted on August 9, 2017 .

Introducing Live View for Sony Alpha with the VIEW Intervalometer

Live view demo: see the realtime image from the camera, VIEW, and phone all at once!  (The scene is a canvas print set up in the office.)

Sony support in the world of intervalometers has always been a little hard to find.  To help solve this, I've been putting a lot of research lately into improved support for Sony with the VIEW Intervalometer.  This started with solving some key reliability issues earlier this year, which are now part of the libgphoto2 camera support library.

Since then, time-lapse with Sony has been quite reliable.  There were just three things missing to bring it to the level of Canon and Nikon:

  1. Live view for remote setup
  2. Save to the camera instead of having to download every image
  3. Focus control/ramping

Well, I'm excited to announce I've now got live view working with newer Sony Alpha cameras, completing item #1 above!  Item #2 is an annoying limitation of USB control for Sony: all images must be downloaded rather than saved to the camera's card.  This works fine with the VIEW, since it has a full-size SD card slot where the images are stored; it just requires longer intervals to keep up with transferring the large files.  This can be overcome via Wifi control, which is planned to be added soon.  For #3, it looks like unless Sony improves the firmware, there's no hope at this time.

Anyway, back to the good news: live view is working!  This is nice because you can stream live view over the app and more conveniently set up motion keyframes, or set up the shot from somewhere else altogether over the internet.

I had previously believed live view wasn't possible over USB with Sony, since there didn't seem to be any hidden commands in the protocol for it and Sony's software doesn't support it.  Sony's USB protocol is quite limited and, in my opinion, terribly designed.  For example, to change ISO, you can only tell it to step up or down, then read what it is.  To find the range of options, you step it up until the value stops changing, then run it all the way down, one step at a time, until it again stops changing.  Compare this with Nikon, where you can request the list of valid ISOs, choose one, then set it directly; no stepping blindly through a list.  Canon is slightly more cumbersome: the lists of everything are sent all at once on first connect, then again whenever they change.  They can't simply be requested at any time, but it still works just fine.
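The up/down discovery dance described above might look like this in sketch form; the three callbacks are hypothetical wrappers around the step and read operations, not real library calls.

```python
def enumerate_stepped_values(step_up, step_down, read_value):
    """Discover a Sony-style stepped property's full range when the
    protocol only offers up/down commands: step up until the value
    stops changing, then walk back down one step at a time, recording
    each value until it stops changing again."""
    # Walk to the top of the range
    while True:
        before = read_value()
        step_up()
        if read_value() == before:
            break
    # Walk back down, collecting every value along the way
    values = [read_value()]
    while True:
        step_down()
        v = read_value()
        if v == values[-1]:
            break
        values.append(v)
    return list(reversed(values))  # lowest-to-highest
```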

Then someone mentioned that live view does indeed work, so I researched it further, downloading a program claiming to support it and monitoring the USB line.  It worked!

And it's actually a very simple and nice design; there are no special commands for it -- instead, an image is requested from a specific address, one space higher than captured images.  The live view image is always current at that address and it can be requested at anytime using the PTP protocol standard commands (not vendor-specific commands like for live view with Nikon and Canon).

Captured images (RAW or JPEG) are retrieved from the 0xffffc001 address, and the live view JPEG image is found at 0xffffc002.  The only challenge here was that the object_info command for the live view image didn't return anything useful (like the size), so I had to do a little work to extract the JPEG from the downloaded data (reusing some code from the libgphoto2 live view section for Nikon).  Now I don't even use object_info -- just the standard get_object, then parse the data.  Here's the pull request adding this to libgphoto2:
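The general idea of pulling the JPEG out of a raw transfer can be sketched by scanning for the JPEG start/end markers; this is an illustration of the approach, not the actual libgphoto2 C code.

```python
def extract_jpeg(data: bytes) -> bytes:
    """Extract the JPEG from a raw live-view transfer whose size
    isn't reported, by locating the SOI (0xFFD8) and EOI (0xFFD9)
    markers and returning everything between them inclusive."""
    start = data.find(b'\xff\xd8\xff')   # JPEG start-of-image marker
    end = data.rfind(b'\xff\xd9')        # JPEG end-of-image marker
    if start < 0 or end < 0 or end <= start:
        raise ValueError("no JPEG found in live view data")
    return data[start:end + 2]
```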

Now, as of firmware version v1.5.0, live view is supported on most newer Sony Alpha cameras, including the A6300, A6500, A7II, A7RII and A7SII. 

Thanks for your support of the VIEW Intervalometer -- I hope to continue to push the limits and further what we can do in time-lapse, and your support and feedback make this possible.



The VIEW is a breakthrough intervalometer for cinematographers and photographers that allows for automatic bulb ramping, live time lapse preview -- watched either on the VIEW or a smartphone -- and touch-free gesture controls.

Order Now

Posted on January 13, 2017 .

Big firmware update!

Version 20150304 is here, along with some exciting new features...

Keyframe Editor

The new keyframe editor presents a new graphical interactive way to edit time/value keyframes for precise bulb ramping setup and more, specifically motion & focus, as seen below.


NMX Stepper Controller Integration

While the Timelapse+ already works well with many motion control systems, including the MX2, MX3, eMotimo TB3 and others, the new Dynamic Perception NMX integration takes it much further, with total control of everything through the Timelapse+ as the central controller.  Built on top of the new keyframe editor, up to 10 time/position keyframes can be added for each of the 3 axes.  The timing can thereby be coordinated with keyframe bramping (or auto bramping, even with a variable interval) as well as focus ramping, as seen next...


Focus Ramping

This is a really exciting feature, especially when combined with motion.  And this time Nikon users aren't left out either: it's fully working with my test D5100, and hopefully that means others too; we'll have to see.  Focus stacking should also work with Nikon now as well.


Ready to get started?

Start by downloading the latest firmware here:   Among other improvements, a lot of time has been spent perfecting the timing and reliability of bulb ramping with a PC-sync cable, and the auto-config program in the Timelapse+ will calculate the best timing parameters for each camera (and remember them per camera, so you don't need to re-run the program every time).  Here's a video to help you get started with this:

Beyond what's mentioned here, there are also some exciting new features for bulb ramping that allow coordinating manual ISO changes for cameras without USB support, which will be covered in additional videos and documentation in the coming weeks.   Thanks for your patience as I now focus on getting the documentation up to date for this new release.



P.S. Join me in Moab, Utah for a hands-on workshop with Ron Risman.  I will be there along with Ryan from Dynamic Perception.  (It just sold out, but you can get on the waiting list just in case.)

Posted on March 9, 2015 .

Partial Solar Eclipse!

Eclipse at maximum coverage - check out those sunspots!

A crescent sun setting

A quick time-lapse of the crescent sun setting!  I wanted to start it earlier, but the sky was so clear it was hard not to have everything black except the sun.

Posted on October 23, 2014 .

Firmware 20141010 released!

Finally!  This firmware update has been in the works for a long time, and includes big improvements again for auto bulb ramping with the Timelapse+.  It's been tested extensively and includes many bug fixes as well. 

New auto bulb ramping algorithm for more accurate tracking

The new algorithm is based on PID control and results in much better tracking of the exposure through the day-to-night transition while still ignoring momentary changes.  Last year the auto bramp algorithm was updated to have the Timelapse+ control the rate of change rather than the exposure value directly.  This made for much smoother results and also made the "night target" set point possible.  However, it was prone to getting a bit dark during the transition from day to night.  The new algorithm keeps the rate-of-change method of control, but does a much better job of tracking the change of light and is more tunable.  More details, tuning instructions, source code, and a simulation chart can be found here:
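A generic PID loop whose output is an exposure rate of change (rather than an absolute exposure) can be sketched as below. This is only an illustration of the rate-of-change idea, not the actual firmware; the gains are made-up placeholders.

```python
class RateOfChangePID:
    """PID controller whose output is a ramp rate in stops per
    interval, driven by how far the measured exposure value is from
    the target.  Gains here are arbitrary illustrative defaults."""

    def __init__(self, kp=0.05, ki=0.01, kd=0.02):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measured_ev, target_ev):
        error = target_ev - measured_ev      # stops away from target
        self.integral += error
        derivative = error - self.prev_error
        self.prev_error = error
        # The output adjusts the running exposure by this amount each
        # interval; because it's a rate, momentary light changes only
        # nudge the ramp instead of jerking the exposure directly.
        return (self.kp * error + self.ki * self.integral
                + self.kd * derivative)
```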

PC-sync cable feedback +  auto calibration

The Timelapse+ can use the camera's flash sync to reduce flicker and auto-calibrate the camera's timing parameters.  Set Settings->Auxiliary->AUX Port to 'PC Sync In' to enable it.  The result is less flicker, faster shutter speeds in bulb mode (up to 1/60th for Canon), and auto-configuration of timing parameters.  Just connect the camera via USB and it will prompt to auto-configure if that hasn't been done already for that camera.

Remembers camera-specific settings for multiple cameras

Easily switch between cameras without having to change camera-specific settings when using USB.  Additionally, the Timelapse+ will automatically update the Camera Make setting according to the connected camera, so it's easy to switch between Canon and Nikon while keeping the optimal settings for each.  Everything in Settings->Camera is linked to a profile for the currently connected camera.  Changing those settings without a camera connected will change the defaults for new cameras; preconfigured cameras will always use the saved profile.

More videos are in the works -- thanks for your patience!

Posted on October 15, 2014 .