one of the most fun projects i have worked on is using data from the mars color imager (marci), a camera that is flying on the mars reconnaissance orbiter. this camera has a fisheye lens that covers a 180 degree field of view.
i'm using imaging data from the camera to create little tours of mars. each video is rendered at 'real-time' speed, taking about 45 minutes to travel from pole to pole. it is also rendered to show a field of view and angular size similar to what a person would see looking out a spacecraft window.
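for a rough sense of that framing, here's a quick back-of-the-envelope for the 35 mm focal length mentioned further down. the 36 mm sensor width is an assumption on my part (it happens to be blender's full-frame-style default), not something baked into the project:

```python
import math

# horizontal field of view for a 35 mm lens on a 36 mm-wide sensor
# (the sensor width is an assumption for illustration)
focal_mm, sensor_mm = 35.0, 36.0
fov_deg = math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))
print(f"horizontal field of view ~ {fov_deg:.0f} degrees")   # ~54 degrees
```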
this video uses data that was collected at the same time the perseverance rover was landing in 2021. if you'd like a tour guide for the landscapes out the window, i have included some optional closed-caption comments for that purpose. it's very fun to full-screen it at 4k (in both sober and altered states!!) and pretend you're in orbit around another planet.
if you're interested in how it got made, click through to read more!
marci takes photos through a slightly non-traditional process described as "pushframe" imaging. this is intermediate between a framing camera (the type of photography we're most familiar with, where the full image is formed on the sensor at once), and a pushbroom camera, which acts like a scanner. marci has a roughly 1000 px square sensor with color filter strips masking off five 16-pixel-tall bands in the visible. (if you're wondering where the rest of the sensor space went, marci also images in the ultraviolet, where mars is much darker, so those filter strips are much wider.) to build up the image, marci takes thousands of single frames (like a framing camera), and each individual color strip is extracted and stitched together with its neighbors into a continuous strip (like a pushbroom camera).
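to make that strip-building step a little more concrete, here's a minimal numpy sketch of the idea. the band names and filter row offsets are made up for illustration (the real layout comes from the instrument documentation), and the real processing also has to map-project and blend each framelet rather than just concatenating them:

```python
import numpy as np

# hypothetical filter layout: row offset of each visible-band strip on the
# sensor (illustrative numbers, not the real instrument geometry)
FILTER_ROWS = {"blue": 200, "green": 232, "orange": 264, "red": 296, "nir": 328}
STRIP_HEIGHT = 16   # each visible filter strip is 16 pixels tall

def assemble_band(frames, row_offset, strip_height=STRIP_HEIGHT):
    """cut the same 16-pixel-tall filter strip out of every exposure and
    stack the cuts top to bottom into one long pushbroom-style strip."""
    strips = [frame[row_offset:row_offset + strip_height, :] for frame in frames]
    return np.vstack(strips)

# fake data standing in for a sequence of exposures along the ground track
frames = [np.random.randint(0, 255, (1000, 1000), dtype=np.uint8) for _ in range(100)]
bands = {name: assemble_band(frames, row) for name, row in FILTER_ROWS.items()}
print(bands["red"].shape)   # (1600, 1000): 100 exposures x 16 rows each
```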
during normal operation, marci collects the data for a pole-to-pole imaging strip on every orbit. mars rotates once under the orbiter's ground track about every 12.5 orbits, so about 13 imaging strips complete a global daily view of mars, capturing events like developing dust storms or polar weather systems. mars reconnaissance orbiter has been at mars since 2006, so tens of thousands of these imaging strips are available.
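if you want to sanity-check those numbers, here's a quick back-of-the-envelope. the ~112 minute orbital period is an approximation i'm assuming for illustration, not an official mission figure:

```python
# rough check of the orbit cadence quoted above (approximate numbers)
orbit_minutes = 112                        # mro's orbit is roughly two hours
sol_minutes = 24 * 60 + 39.6               # one martian day (sol)
orbits_per_sol = sol_minutes / orbit_minutes
print(f"{orbits_per_sol:.1f} orbits (and imaging strips) per sol")    # ~13

strips_per_earth_year = orbits_per_sol * 365.25 * 24 * 60 / sol_minutes
print(f"~{strips_per_earth_year:.0f} strips per earth year")          # ~4700
# over 15+ years of operations that easily adds up to tens of thousands
```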
these images can be re-rendered onto a globe to simulate flying over the martian surface! there are a few steps involved. the first, and maybe largest, hurdle for an image processing newbie is creating the imaging strip in the first place. the marci dataset is publicly accessible (nasa provides it for free through the planetary data system), but the data itself is sort of in an "assembly required" format - the individual color strips are extracted, but they have to be map-projected and merged into a color image first. the processing is pretty straightforward, but a little specialized and computationally hungry. to do it, i need to use the united states geological survey's isis3 software (linux only) to handle the data processing and map-making. once i have the map in hand i can do some slight cosmetic corrections and touch-ups (fine-tuning color alignment, contrast adjustments, etc). then i have a very simple blender setup that wraps the map onto a sphere and has an orbiting camera to render the view at a 35 mm focal length.
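for the isis3 step, the pipeline has roughly this shape. the program names below (marci2isis, spiceinit, marcical, cam2map) are real isis programs, but the parameters are simplified from memory, so treat this as a sketch of the workflow rather than a copy-paste recipe, and check the isis documentation before running anything:

```python
import subprocess

def run(cmd):
    """run one isis program and stop if it fails"""
    print(">", " ".join(cmd))
    subprocess.run(cmd, check=True)

raw = "example_marci_product.IMG"   # placeholder filename for a pds product

run(["marci2isis", f"from={raw}", "to=marci.cub"])       # pds image -> isis cube
run(["spiceinit", "from=marci.cub"])                     # attach pointing/ephemeris data
run(["marcical", "from=marci.cub", "to=marci.cal.cub"])  # radiometric calibration
run(["cam2map", "from=marci.cal.cub", "to=marci.map.cub",
     "map=equirect.map"])                                # project onto a map grid
```

and the blender side is something like this, written with the bpy api. the texture path, orbit radius, and frame count are placeholders, not the values from my actual setup:

```python
import bpy
import math

MARS_RADIUS = 1.0      # sphere radius in blender units
ORBIT_RADIUS = 1.09    # ~300 km altitude if 1 unit = one mars radius (~3390 km)

# the globe, with the processed marci map wrapped around it
bpy.ops.mesh.primitive_uv_sphere_add(radius=MARS_RADIUS, segments=128, ring_count=64)
mars = bpy.context.active_object
mat = bpy.data.materials.new("marci_map")
mat.use_nodes = True
tex = mat.node_tree.nodes.new("ShaderNodeTexImage")
tex.image = bpy.data.images.load("//marci_global_map.png")   # placeholder path
mat.node_tree.links.new(tex.outputs["Color"],
                        mat.node_tree.nodes["Principled BSDF"].inputs["Base Color"])
mars.data.materials.append(mat)

# a 35 mm camera on a pole-to-pole orbit, always looking straight down
bpy.ops.object.camera_add()
cam = bpy.context.active_object
cam.data.lens = 35.0
bpy.context.scene.camera = cam

frames = 250   # placeholder animation length
for f in range(frames):
    angle = 2 * math.pi * f / frames
    cam.location = (0.0, ORBIT_RADIUS * math.cos(angle), ORBIT_RADIUS * math.sin(angle))
    look = mars.location - cam.location
    cam.rotation_euler = look.to_track_quat('-Z', 'Y').to_euler()
    cam.keyframe_insert(data_path="location", frame=f)
    cam.keyframe_insert(data_path="rotation_euler", frame=f)
```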
all told, when everything is working well, it takes about 2 hours from download to being ready to render in blender (15 minutes for making the map, the rest fine-tuning the appearance in photoshop), and then about a day for the final video to render out. i hope you enjoy the results!
