• She/Her

24 | Dagn | Engineering Student & All-Around Nerd | 🔞 Sometimes! Extra-Spicy over at @StarraSnack



amydentata
@amydentata

There are several reasons why this is the case, which you can suss out from basic principles. We've evolved to judge scenes based on the physics of light at small scales in an atmosphere. In space, however, there's...

  • no atmospheric scattering (fog/haze), so no distance cues at all
  • a single light source (the sun) that functions like an idealized point light, with no backlighting from a sky, causing harsh, purely black shadows
  • no obvious bounced light between celestial objects, due to the distances between them
  • near-spherical shapes whose fine details (craters, even mountains) aren't large enough to alter the object's silhouette, so everything looks unnaturally geometric
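the points above boil down to a missing "sky" term in the lighting. a minimal sketch of that (all names and numbers made up for illustration, this isn't from any real renderer):

```python
import numpy as np

# Illustrative sketch: Lambertian shading of a surface point under a single
# distant light, with and without a flat ambient "sky" fill term.

def shade(normal, light_dir, ambient):
    """Lambert diffuse term clamped at zero, plus a flat ambient fill."""
    diffuse = max(np.dot(normal, light_dir), 0.0)
    return min(diffuse + ambient, 1.0)

light = np.array([0.0, 0.0, 1.0])          # the sun
back_facing = np.array([0.0, 0.0, -1.0])   # a surface point facing away

# On Earth: scattered skylight fills in the shadow side.
earth = shade(back_facing, light, ambient=0.2)

# In space: no sky, no fill -- the shadow side reads as pure black.
space = shade(back_facing, light, ambient=0.0)

print(earth, space)  # 0.2 0.0
```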

amydentata
@amydentata

Obviously this is how the universe keeps frame rates so high once you're out in space


lupi
@lupi

i think galileo might've still used vidicon tubes??? so it has a certain quality to it.
no, it's an early CCD.

even so, it almost adds that fuzziness and warmth back in through the quality of its sensor.

image source: galileo mission gallery

also, a DSCOVR moon transit for reference: the fully-illuminated moon passes in front of the earth, a deep muddy grey against the earth's vibrant blues and whites



in reply to @amydentata's post:

you can get true color with that; it's how color photography was originally done, just with only as much sensitivity as the sensors themselves allow (which isn't that bad anymore, though spacecraft imaging tends to lag about 5 years behind what we see in consumer products)
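the filter-wheel approach amounts to stacking monochrome exposures into color channels. a toy sketch of the idea (the "scene" and the ideal filters here are invented for the example):

```python
import numpy as np

# Illustrative sketch of filter-wheel color imaging: a monochrome sensor
# takes one exposure per color filter, and the frames are stacked into RGB.

rng = np.random.default_rng(0)
scene = rng.random((4, 4, 3))  # a tiny made-up true-color scene

# Each exposure sees only one channel of the scene (idealized filters).
red_frame = scene[..., 0]
green_frame = scene[..., 1]
blue_frame = scene[..., 2]

# Stacking the three monochrome exposures recovers the color image.
composite = np.stack([red_frame, green_frame, blue_frame], axis=-1)
print(np.allclose(composite, scene))  # True
```

real filters overlap and sensors have their own spectral response, so actual composites need per-channel calibration rather than a straight stack.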