By Gopi Krishna Tummala
The Core Principle
Imagine looking at the world with eyes that can see far more than visible light: heat, infrared, even radio waves. A satellite is like that, but better. Photogrammetry is simply the science of making measurements from photographs. The “photo” part is key: we aren’t just taking pretty pictures; we’re collecting data on how light reflects off the Earth.
The biggest misconception is that a satellite sees the world the way we do. It doesn’t. Our eyes see the tiny sliver of the Electromagnetic Spectrum (EMS) we call visible light. A satellite sensor can capture light outside that visible range (like Near Infrared), giving it superpowers to see things we can’t, like plant health or subtle temperature shifts.
💡 The Math Hook: The Light Fingerprint
How do we know how hot a star is, even light-years away? We look at its color. The math behind this is Wien’s Displacement Law, which links the peak wavelength of light an object emits to its temperature:

$$\lambda_{\max} = \frac{b}{T}$$

Where:
- $\lambda_{\max}$ is the wavelength at which the object emits the most radiation
- $b$ is Wien’s displacement constant (approximately $2.898 \times 10^{-3}$ m·K)
- $T$ is the temperature in Kelvin
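A quick sanity check in code (a minimal sketch; the two temperatures below are just familiar reference points):

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, in m*K

def peak_wavelength_um(temperature_k: float) -> float:
    """Peak emission wavelength of a blackbody at the given temperature,
    from Wien's law: lambda_max = b / T. Returned in micrometers."""
    return WIEN_B / temperature_k * 1e6  # meters -> micrometers

# The Sun (~5778 K) peaks in the visible; Earth (~288 K) peaks in the thermal IR.
print(f"Sun:   {peak_wavelength_um(5778):.2f} um")  # ~0.50 um (visible, green)
print(f"Earth: {peak_wavelength_um(288):.2f} um")   # ~10.1 um (thermal infrared)
```

This is exactly why thermal sensors are built around the 8–14 μm window: that is where Earth-temperature objects emit most strongly.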
This same principle allows a satellite to measure energy signatures. The ultimate goal is converting the Digital Number (DN)—the raw value recorded by the satellite sensor—into something physically meaningful, like radiance or reflectance. We are turning a pixel value into a quantitative measurement of light energy.
Key Topics
Photogrammetry vs. Remote Sensing
- Photogrammetry: The science of making measurements from photographs
- Remote Sensing: The broader field of acquiring information about objects without physical contact
- Photogrammetry is a subset of remote sensing focused on geometric measurements
The Electromagnetic Spectrum (EMS)
Satellites capture electromagnetic radiation across different wavelengths:
- Visible: 0.4-0.7 μm (what human eyes see)
- Near-Infrared (NIR): 0.7-1.3 μm (vegetation health, water content; see the sketch after this list)
- Shortwave Infrared (SWIR): 1.3-3.0 μm (mineral identification)
- Thermal Infrared: 8-14 μm (temperature measurement)
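This band structure is what makes vegetation analysis possible: healthy plants reflect strongly in NIR and absorb red light, so the normalized difference of the two bands (NDVI) tracks plant health. A minimal sketch, assuming the red and NIR bands are already loaded as NumPy arrays (the sample values are illustrative, not real data):

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Roughly: near +1 = dense healthy vegetation, near 0 = bare soil,
    negative = water."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / (nir + red + 1e-10)  # epsilon guards against 0/0

# Illustrative reflectance values only
red = np.array([[0.05, 0.20], [0.10, 0.03]])
nir = np.array([[0.50, 0.25], [0.40, 0.02]])
print(ndvi(red, nir))  # high where NIR >> Red (vegetation), negative for water
```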
Spatial vs. Spectral Resolution
Spatial Resolution:
- Size of the smallest object that can be detected
- Measured in meters per pixel
- Trade-off: Higher resolution = smaller coverage area
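You can see the trade-off with simple arithmetic: a sensor has a fixed number of detectors, so finer pixels mean a narrower ground swath. The detector count below is illustrative:

```python
def swath_width_km(detectors_across_track: int, gsd_m: float) -> float:
    """Ground swath imaged in one row: detector count x ground sample distance."""
    return detectors_across_track * gsd_m / 1000.0

# The same 10,000-detector line array at two ground sample distances:
print(swath_width_km(10_000, 0.5))   # 5 km swath at 0.5 m/pixel (fine detail, small area)
print(swath_width_km(10_000, 30.0))  # 300 km swath at 30 m/pixel (coarse detail, wide area)
```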
Spectral Resolution:
- Number and width of spectral bands
- More bands = more information, but narrower bands collect fewer photons each, so signal-to-noise ratio drops
Satellite Orbits
Low Earth Orbit (LEO) - 160-2000 km:
- Closer to Earth = higher resolution
- Orbits Earth multiple times per day
- Examples: Landsat, Sentinel, WorldView
Geostationary Orbit (GEO) - 35,786 km:
- Stays fixed over one location
- Lower resolution but continuous monitoring
- Examples: Weather satellites
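The GEO altitude is not arbitrary: it is the altitude at which a circular orbit’s period matches one sidereal day, so the satellite keeps pace with Earth’s rotation. A quick check with Kepler’s third law:

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6          # mean Earth radius, m

def orbital_period_hours(altitude_km: float) -> float:
    """Period of a circular orbit: T = 2*pi*sqrt(a^3 / mu)."""
    a = R_EARTH + altitude_km * 1e3  # orbit radius (semi-major axis), m
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 3600.0

print(f"{orbital_period_hours(700):.2f} h")     # LEO at 700 km: ~1.6 h (~14-15 orbits/day)
print(f"{orbital_period_hours(35_786):.2f} h")  # GEO: ~23.9 h, one sidereal day
```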
Sensor Types
Panchromatic (Pan):
- Single broad band, black and white
- High spatial resolution (0.3-1 m)
- Captures fine details
Multi-spectral:
- Multiple narrow bands (Red, Green, Blue, NIR)
- Lower spatial resolution (2-30 m)
- Enables color and spectral analysis
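These two sensor types are complementary, which is why their outputs are often fused (“pan-sharpening”): the pan band supplies spatial detail, the multispectral bands supply color. Below is a minimal sketch of one classic approach, the Brovey transform, assuming all bands are float arrays already resampled to the same pixel grid:

```python
import numpy as np

def brovey_pansharpen(red, green, blue, pan):
    """Brovey-transform pan-sharpening: scale each color band by the ratio of
    the high-resolution pan band to the mean multispectral intensity."""
    intensity = (red + green + blue) / 3.0 + 1e-10  # epsilon avoids divide-by-zero
    ratio = pan / intensity
    return red * ratio, green * ratio, blue * ratio
```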
Converting Digital Numbers to Physical Units
The calibration chain transforms raw sensor data into meaningful measurements:
DN → Radiance (L):

$$L = \text{gain} \times DN + \text{offset}$$

- Radiance: Energy per unit area per unit solid angle (W/m²/sr/μm)
- Gain and offset are provided in the image metadata

Radiance → Reflectance (ρ):

$$\rho = \frac{\pi \cdot L \cdot d^2}{ESUN \cdot \cos\theta_s}$$

Where:
- $d$: Earth-Sun distance (astronomical units)
- $ESUN$: Exoatmospheric solar irradiance
- $\theta_s$: Solar zenith angle
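Here is the full chain as code (a minimal sketch; the gain, offset, ESUN, Earth-Sun distance, and sun angle below are placeholders; in a real workflow you read them from the image metadata):

```python
import numpy as np

def dn_to_radiance(dn: np.ndarray, gain: float, offset: float) -> np.ndarray:
    """Step 1: raw Digital Numbers -> at-sensor radiance (W/m^2/sr/um)."""
    return gain * dn.astype(np.float64) + offset

def radiance_to_reflectance(radiance: np.ndarray, d_au: float,
                            esun: float, sun_zenith_deg: float) -> np.ndarray:
    """Step 2: radiance -> top-of-atmosphere reflectance (unitless, 0-1)."""
    theta = np.radians(sun_zenith_deg)
    return (np.pi * radiance * d_au**2) / (esun * np.cos(theta))

# Placeholder values for illustration only
dn = np.array([[4021, 5310], [3987, 6120]], dtype=np.uint16)
L = dn_to_radiance(dn, gain=0.01, offset=-0.1)
rho = radiance_to_reflectance(L, d_au=1.0146, esun=1536.0, sun_zenith_deg=35.0)
print(rho)  # values in roughly 0-1, comparable across scenes and dates
```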
Why Reflectance?
- Reflectance is a property of the surface (0-1 or 0-100%)
- Independent of illumination conditions
- Enables comparison across different images and dates
This module provides the foundation for understanding how satellites capture Earth’s imagery. In the next module, we’ll explore the geometric principles that allow us to convert these 2D images into accurate 3D maps.