Let’s Make a Planetarium Show: Part 3 – Digital Sky Animation


The first time I ever went to a planetarium was when I was in second grade.  It was on a field trip, and I remember the show featured a private detective investigating a mystery involving the stars.  I remember this because I had always wanted a job like Sherlock Holmes's.  So, when I got a job at the planetarium, I assumed that all the shows would revolve around a static star field and maybe a half-hour presentation on some of the planets you can find from your backyard.  How wrong I was.


Today, planetariums have evolved from star shows accompanied by slide projectors into multimedia theaters suitable for live concerts, large-format films, and anything else you can imagine.  There is a whole field of animation and photography geared strictly toward planetarium projection and production.  And that’s what we’re going to go over now: how animation for planetariums is accomplished.

Now, there’s a ton of ways to animate a scene depending on what look you’re going for.  To keep things simple, I mainly use three programs to create content for the dome: Digital Sky, Blender, and After Effects (along with Photoshop).

Digital Sky is the main workhorse for me in that it can accomplish the majority of what I need done for a planetarium show.  It’s the software I use to run the day-to-day shows we put on here at the planetarium, and it does everything you’d expect planetarium software to do.  It puts stars on the dome, it can show you the planets and moons, and it can put pictures on the dome.  However, it can do far more than that.  It can take you inside a nebula, visit the edge of the observable universe, let you build newly discovered planetary systems, and so much more.


Kenner Planetarium and MegaDome Cinema


When you think of a planetarium, you probably imagine a dome theater with a huge sci-fi-looking contraption in the middle.  That’s the star ball.  Many planetariums still use these optical star projectors.  And, honestly, they’re still the best devices out there for putting pinpoint-sharp dots of light (stars) onto the dome while keeping the theater as dark as possible.  But, in my opinion, they’re only good for looking at a star field while on Earth.  Once we leave the Earth to go visit something like the Orion Nebula, you have to resort to something like Digital Sky.

We used to have a Konica Minolta optical star ball in our theater but got rid of it to make room for more seats and to upgrade to a 4K projection system.  With the 4K projection system, we can put up a standard star field just like an optical system.  The only drawback is that it does put out some ambient light.  Today, I use Digital Sky for almost everything.

So, let’s say you want to show your friends something in the planetarium.  You want to show them a trip leaving the Earth and going out to Saturn.  From Saturn you want to rotate around to show how the Orion constellation looks the same from Saturn’s orbit as it does from Earth.  And then you want to not only zoom in on the location of the Orion Nebula but also travel out to it.

This is a very simple Digital Sky operation that can be done either live or rendered out for an animation to be used as part of an automated show.  Here’s what it would look like:


Doing this trip from Earth to Saturn live would be fairly easy.  You would just pull up some plugins and manipulate some manual controls.  You’d load up your Planets plugin and your constellations plugin, plot a path to Saturn, and bring up the Orion constellation once you arrive.

However, if you want to make this flight path part of your automated Sky Tonight, you have to program a button to do all the functions you would do live, time it out to run on its own, and then render it out as a frame sequence to be used in something like After Effects.

Digital Sky is a script-based program, so in order to achieve the look of leaving Earth and venturing out to Saturn, I have to write something like the script in the image below.  This image shows just the beginning portion of a space flight plan.


The grey text tells you what each line of code is going to do; the white and red text is the actual code.  Almost everything featured in these lines of code has already been loaded, though.  I have a separate button which loads images, data, etc.  That way, when it comes time to render the animation, the computer doesn’t have to load anything mid-render; everything is already in memory, and all it has to do is execute the commands.
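To give a rough sense of that structure, here is a Python-style sketch of how a cue list like this is organized.  It is not actual Digital Sky syntax, and the command names, targets, and durations are just placeholders, but it captures the idea of a preloaded show stepping through timed commands in order:

```python
# A rough sketch of a flight-plan cue list, written as Python rather than
# actual Digital Sky syntax. Every name and duration below is a placeholder.

flight_plan = [
    # Everything referenced here gets preloaded by a separate button,
    # so the renderer only has to execute commands, not load assets.
    ("fade_in_stars",      {"duration_s": 3}),
    ("leave_earth",        {"duration_s": 10}),
    ("fly_to",             {"target": "Saturn", "duration_s": 20}),
    ("show_constellation", {"name": "Orion", "duration_s": 5}),
    ("highlight_nebula",   {"name": "Orion Nebula", "duration_s": 5}),
    ("fly_to",             {"target": "Orion Nebula", "duration_s": 25}),
    ("stop",               {}),  # swapped for a real wait cue later
]

def run(plan):
    """Walk the cue list in order, the way the show button would."""
    for command, params in plan:
        print(f"{command}: {params}")

run(flight_plan)
```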

Digital Sky definitely has a learning curve to it and takes a fair amount of getting used to.  It’s basically a 3D map of the observable universe, with all the stars, planets, deep-sky objects, etc. placed at their correct distances.  So, if you want to travel to the Andromeda Galaxy some 2.5 million light-years away, you can’t exactly dial in your camera flight at everyday speeds.  You have to set your space travel not in miles-per-hour but in increments of astronomical units, parsecs, megaparsecs, etc.
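To put some rough numbers on that, here is a quick back-of-the-envelope calculation in Python.  The unit values are standard; the 30-second flight time is just an example:

```python
# Why camera speeds end up in astronomical units, parsecs, or megaparsecs
# per second rather than mph. Distances are standard values; the 30-second
# flight is just an example.

AU_M  = 1.495978707e11          # one astronomical unit in meters
LY_M  = 9.4607e15               # one light-year in meters
PC_M  = 3.0857e16               # one parsec in meters
MPC_M = 1e6 * PC_M              # one megaparsec in meters

andromeda_ly = 2.5e6            # roughly 2.5 million light-years away
andromeda_m  = andromeda_ly * LY_M

print(f"Andromeda: {andromeda_m / AU_M:.2e} AU")    # ~1.6e11 AU
print(f"Andromeda: {andromeda_m / MPC_M:.2f} Mpc")  # ~0.77 Mpc

# To cover that distance in a 30-second camera flight, the average speed is:
speed_m_per_s = andromeda_m / 30
print(f"Required speed: {speed_m_per_s / PC_M:.0f} parsecs per second")
```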

There are no online tutorials or classes you can take to learn Digital Sky.  The developers do offer a DVD set of tutorials covering the basics, a series of manuals, and the occasional Digital Sky Academy.  There’s also a Digital Sky community with a message board and plenty of customer support if you need to ask somebody something.  However, it’s not like Blender, which has a huge community of online support and video tutorials around every corner.

Blender is a free-to-use 3D modeling platform that has really come into its own over the years.  We’re seeing plenty of legitimate movies, animations, and professional-level content being made with Blender.  I used to use Autodesk 3ds Max but ran into issues with rendering and overall ease of use.  Whenever I hit a problem in Max, I had a hard time finding a solution online, but with Blender I can almost always find the answer to whatever problem pops up right away.  It also comes with a fisheye camera meant for planetarium dome projection, so it’s super easy to build a scene the way I want and render it out with Blender’s own rendering engine and dome camera right out of the box.
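For anyone curious, setting up that dome camera only takes a few lines of Blender’s Python API.  This is a minimal sketch assuming the Cycles renderer and a 2.8/2.9-era API (the panorama properties have moved in newer Blender releases); the square 4K resolution and straight-up camera angle are example choices, not requirements:

```python
import bpy
import math

scene = bpy.context.scene

# Render full-dome frames with Cycles, which provides the fisheye camera.
scene.render.engine = 'CYCLES'

# Domemasters are square; 4096 x 4096 here is just an example resolution.
scene.render.resolution_x = 4096
scene.render.resolution_y = 4096

# Aim the scene camera straight up at the dome (an assumed orientation).
cam_obj = scene.camera
cam_obj.rotation_euler = (math.pi, 0.0, 0.0)

# Switch it to an equidistant fisheye panorama covering the full 180 degrees.
cam = cam_obj.data
cam.type = 'PANO'
cam.cycles.panorama_type = 'FISHEYE_EQUIDISTANT'
cam.cycles.fisheye_fov = math.pi  # 180 degrees
```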

I use Blender to make stuff that isn’t available with Digital Sky.  With Blender I can make Martian landscapes, alien landscapes, exoplanets, characters, star clusters, nebulae, etc.

Using Blender is quite different from using Digital Sky.  In Blender you’re molding meshes and shapes into what you want, as well as animating their movement on a timeline.

The picture below is from a very simple animation I did of a Mercury orbit.  I took a recent terrain map of Mercury’s surface, made a bump map out of the information provided, and wrapped it onto a sphere.  I then put a camera above the surface of the planet and had it orbit the sphere.  I later combined this animation with a Digital Sky star field moving in the same direction as the orbiting camera.
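Here’s a minimal sketch of that setup using Blender’s Python API.  The texture path, orbit distance, and frame count are placeholders rather than the values from the real scene:

```python
import bpy
import math

MAP_PATH = "//mercury_surface_map.png"   # placeholder for the surface map

# Planet: a smooth-shaded UV sphere wrapped with the surface map.
bpy.ops.mesh.primitive_uv_sphere_add(radius=1.0)
planet = bpy.context.active_object
bpy.ops.object.shade_smooth()

mat = bpy.data.materials.new("MercurySurface")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
bsdf = nodes["Principled BSDF"]

tex = nodes.new("ShaderNodeTexImage")
tex.image = bpy.data.images.load(MAP_PATH)

# The same map drives both the color and a bump node that fakes surface relief.
bump = nodes.new("ShaderNodeBump")
links.new(tex.outputs["Color"], bsdf.inputs["Base Color"])
links.new(tex.outputs["Color"], bump.inputs["Height"])
links.new(bump.outputs["Normal"], bsdf.inputs["Normal"])
planet.data.materials.append(mat)

# Camera parented to an empty at the planet's center; rotating the empty
# swings the camera around in a low orbit while it tracks the planet.
pivot = bpy.data.objects.new("OrbitPivot", None)
bpy.context.collection.objects.link(pivot)

cam = bpy.data.objects.new("OrbitCam", bpy.data.cameras.new("OrbitCam"))
bpy.context.collection.objects.link(cam)
cam.parent = pivot
cam.location = (0.0, -1.5, 0.3)
track = cam.constraints.new('TRACK_TO')
track.target = planet
track.track_axis = 'TRACK_NEGATIVE_Z'
track.up_axis = 'UP_Y'

# One full orbit over 300 frames (placeholder length).
pivot.rotation_euler = (0.0, 0.0, 0.0)
pivot.keyframe_insert(data_path="rotation_euler", frame=1)
pivot.rotation_euler = (0.0, 0.0, math.radians(360))
pivot.keyframe_insert(data_path="rotation_euler", frame=300)
```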


Of course, adding video, pictures, and text directly in Blender and Digital Sky is possible, but I find it’s better to add all of that while compositing my frames in After Effects.

After Effects offers its own brand of animation possibilities, but it’s really meant to be used after all your scenes are done.  I’ve created water effects, title card sequences, star clusters, grid overlays, etc. all in After Effects.  It’s a pretty powerful tool if you know what you’re doing, and, like Blender, it has a large pool of online users willing to help out with any questions.

For me, After Effects is the last step in the process, once all my scenes are rendered into their own frame-sequence folders.  I bring those scenes into After Effects and stitch them together.  Once in AE, I can add photos, text, and any additional video, along with cool scene transitions, particle effects, and whatever other glossy finishing touches the show needs before rendering out my final frame sequence.

I’ll go into more detail later about these other software packages.  But while the voice-over is being recorded, I mainly work in Digital Sky to get the bulk of the show mapped out.  All the “Stop” commands you see in the image below will be changed to actual wait cues once I get the voice-over in and edited.  Once I know exactly how long a scene takes from beat to beat, I’ll be able to program the commands to accomplish certain things in the time required.

Until I get that final voice-over track, I can only guess how long it takes to fully discuss the Saturn segment before moving on to Uranus.  And when it comes in, I’ll have to edit it, pace it out, and add music.
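Once those beat timings do come in, turning them into wait cues is mostly arithmetic.  Here’s a quick Python sketch with made-up beat times and an assumed 30 frames per second:

```python
# Turn narration beat timestamps into wait-cue durations and frame counts.
# The beat times and the 30 fps frame rate are hypothetical placeholders.

FPS = 30

beats = {
    "leave_earth":    0.0,   # seconds into the voice-over
    "arrive_saturn": 42.5,
    "show_orion":    61.0,
    "start_uranus":  95.0,
}

times = sorted(beats.items(), key=lambda kv: kv[1])
for (name, start), (_, next_start) in zip(times, times[1:]):
    length = next_start - start
    print(f"{name}: wait {length:.1f} s  ({round(length * FPS)} frames)")
```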
