


An unmanned aerial vehicle (UAV) flies over a field or forest. Its camera captures hundreds of high-resolution photos, and it also records light at wavelengths invisible to the human eye. What can such observation reveal? Information such as chemical changes in plant leaves, soil moisture, or the material of a building's roof can only be uncovered by processing photogrammetric (3D modeling) and hyperspectral (narrow-band spectral) data together. In this article, we'll examine what photogrammetry (3D mapping) and hyperspectral imaging (spectral analysis) are, how they work, and why they're valuable.
Photogrammetry is the science of creating three-dimensional models from two-dimensional photographs. Photographs taken by a UAV or aircraft are processed with specialized software to produce a digital surface model (DSM) or point cloud of the Earth's surface. The drone or aircraft captures numerous overlapping photographs to cover an area. The software identifies common points across these photographs (feature matching), calculates the camera position and orientation for each shot, and from these geometric relationships locates objects in 3D space. This process relies on Structure-from-Motion (SfM) and multi-view geometry methods.
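To make the geometry concrete, here is a minimal sketch of the core operation: triangulating one matched point from two known camera poses, using OpenCV. The intrinsics, poses, and pixel coordinates below are invented for illustration; a real SfM pipeline estimates them from the image matches themselves.

```python
import numpy as np
import cv2

# Hypothetical UAV camera intrinsics (focal length in pixels, principal point).
K = np.array([[2000.0,    0.0, 960.0],
              [   0.0, 2000.0, 540.0],
              [   0.0,    0.0,   1.0]])

# Two camera poses: the second camera is translated 10 m along the flight line.
R1, t1 = np.eye(3), np.zeros((3, 1))
R2, t2 = np.eye(3), np.array([[-10.0], [0.0], [0.0]])  # t = -R @ camera_center

# Projection matrices P = K [R | t].
P1 = K @ np.hstack([R1, t1])
P2 = K @ np.hstack([R2, t2])

# The same ground point matched in both images (illustrative pixel coordinates).
pt1 = np.array([[1000.0], [600.0]])
pt2 = np.array([[ 800.0], [600.0]])

# Triangulate to homogeneous 3D coordinates, then normalize.
X_h = cv2.triangulatePoints(P1, P2, pt1, pt2)
X = (X_h[:3] / X_h[3]).ravel()
print("Reconstructed 3D point:", X)  # approx. (2, 3, 100): 100 m below the camera
```

Repeating this for thousands of matched points, while jointly refining the camera poses, is what yields the dense point clouds and DSMs described above.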

Hyperspectral imaging refers to cameras that capture hundreds of narrow spectral bands for each pixel. While the human eye sees three channels (red, green, blue), hyperspectral sensors record many (e.g., 200–300) narrow-wavelength reflectance values from each point in the image. This data can be obtained from satellite imagery or UAV-borne cameras. The resulting dataset forms a hyperspectral cube: two spatial axes and one spectral axis. The spectrum of reflected sunlight encodes many characteristics, from plant chlorophyll to soil moisture. For example, disease symptoms that aren't obvious in ordinary photographs can show up in leaf reflectance at specific wavelengths.
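A hyperspectral cube is easiest to picture as a three-dimensional array. The sketch below uses synthetic NumPy data; the 224-band count and 400–2500 nm range are illustrative assumptions, loosely modeled on airborne sensors of that class.

```python
import numpy as np

# A hyperspectral cube: two spatial axes (rows, cols) and one spectral axis (bands).
rows, cols, bands = 100, 100, 224
cube = np.random.rand(rows, cols, bands).astype(np.float32)  # synthetic reflectance

# Wavelength of each band, e.g. 400-2500 nm on an even grid (assumed, not universal).
wavelengths = np.linspace(400, 2500, bands)

# The full reflectance spectrum of a single pixel:
spectrum = cube[50, 50, :]          # shape: (224,)

# A single-band grayscale "slice" of the scene near 860 nm (near-infrared):
nir_band = np.argmin(np.abs(wavelengths - 860))
nir_image = cube[:, :, nir_band]    # shape: (100, 100)
```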

Aircraft- or satellite-based hyperspectral systems capture images in pushbroom (line-by-line scanning) or snapshot (instantaneous capture) modes. Hybrid systems can collect both 3D and spectral information by capturing sequential, video-like frames. An aircraft-mounted pushbroom camera records a full set of spectral bands for one line of pixels at each instant; as the platform moves, these lines build up a complete image, much like a satellite scanner. The result is a detailed reflectance spectrum for every location. This spectral imagery is then processed to investigate factors such as plant health, mineral distribution, or pollution. For example, various studies have successfully classified field crops, fruit trees, and tropical forest samples using hyperspectral scans. Today, miniaturized hyperspectral cameras are being integrated into IoT and UAV solutions thanks to new optical technologies such as Fourier-transform spectroscopy.
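The pushbroom idea, one across-track line of pixels per instant, each pixel with a full spectrum, maps naturally onto array stacking. A minimal sketch, with frame dimensions chosen purely for illustration:

```python
import numpy as np

def assemble_pushbroom_cube(scan_lines):
    """Stack successive pushbroom frames into a hyperspectral cube.

    Each frame covers one across-track line: shape (cross_track_pixels, bands).
    Stacking along the flight direction yields (lines, pixels, bands).
    """
    return np.stack(scan_lines, axis=0)

# Simulated acquisition: 500 scan lines, 640 across-track pixels, 120 bands.
frames = [np.random.rand(640, 120).astype(np.float32) for _ in range(500)]
cube = assemble_pushbroom_cube(frames)
print(cube.shape)  # (500, 640, 120)
```

In practice each line also has to be geocoded using the platform's GPS/IMU track before the cube is usable, which is exactly where the co-registration step discussed later comes in.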

Hyperspectral imagery carries far more detail than traditional multispectral imagery. While multispectral systems operate with 5–10 broad bands, hyperspectral systems capture hundreds of narrow bands and can distinguish even very similar spectral signatures. For example, two plant species that appear the same color can be separated by small differences in reflectance at specific wavelengths. This wealth of information plays a critical role in material identification, chemical-composition analysis, and plant-health assessment. Hyperspectral cameras integrated into UAVs open new opportunities in agriculture, forestry, and urban monitoring by capturing early chemical changes in leaves.
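One standard way to quantify such "small differences in reflectance" is the Spectral Angle Mapper (SAM), which compares the shape of two spectra independently of overall brightness. The sketch below uses synthetic spectra; the wavelength grid and the near-infrared offset are illustrative assumptions.

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral Angle Mapper: the angle (radians) between two reflectance spectra.

    Small angles mean spectrally similar materials. Because the angle ignores
    vector length, SAM compares spectral *shape* rather than brightness.
    """
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Two plants that look identical in RGB can still differ in narrow bands:
bands = np.linspace(400, 1000, 200)               # wavelengths in nm (synthetic)
species_a = 0.5 + 0.1 * np.sin(bands / 80.0)      # invented reflectance curve
species_b = species_a + 0.02 * (bands > 700)      # slight near-infrared offset
print(spectral_angle(species_a, species_b))       # small but nonzero angle
```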
There are three related but distinct categories of sensors in remote sensing: multispectral, hyperspectral, and thermal. The differences between them come down to the part of the electromagnetic spectrum they monitor, the number of bands, and their intended use.

Multispectral Sensors: These capture imagery in a limited number of broad bands (usually 3–12). For example, satellite platforms include near-infrared bands in addition to the RGB (red-green-blue) channels. Satellites such as Landsat and Sentinel-2 provide multispectral data as standard. Their output is relatively simple, the sensors are small, and the processing load is low. They are used for tasks such as land-cover mapping, vegetation classification, and general land-use analysis. Their spatial resolution (typically 3–30 m) and wide coverage make them well suited to scanning large areas.
Hyperspectral Sensors: These operate with a large number (100–1,000) of narrow bands. Each band is only a few nanometers wide, and the bands are usually contiguous, so each pixel provides a full reflectance spectrum, much like a spectrometer would. Hyperspectral data is preferred in applications requiring chemical information, such as material characterization, agricultural disease detection, and mineralogy. The downsides are the sheer volume of data collected and the high processing cost; that volume also makes real-time transmission over a network difficult.
Thermal Sensors: These measure the long-wave infrared radiation (approximately 8–14 µm) emitted by objects and generate a temperature map of a scene invisible to the human eye. The basic principle is that warmer objects emit more thermal radiation, so each pixel in a thermal image carries a temperature value. These sensors are used in temperature-based applications such as water-stress assessment, fire detection, and nighttime monitoring. For example, a thermal camera can spot hot spots on damaged solar panels or measure the canopy temperature of plants under water scarcity (see the sketch after this list).
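As a minimal illustration of the thermal case, the sketch below flags statistically hot pixels in a synthetic temperature map. The scene statistics and the 5-sigma threshold are arbitrary choices for demonstration, not a calibrated detection rule.

```python
import numpy as np

# A thermal frame: each pixel holds a temperature in degrees Celsius.
# Synthetic 480 x 640 scene around 25 C with one injected hot anomaly.
rng = np.random.default_rng(0)
temp = rng.normal(25.0, 1.5, size=(480, 640))
temp[200:210, 300:310] += 40.0        # simulated hot spot (e.g. a faulty panel)

# Flag pixels more than 5 standard deviations above the scene mean.
threshold = temp.mean() + 5.0 * temp.std()
hot_mask = temp > threshold
print("hot pixels:", int(hot_mask.sum()),
      "max temp:", round(float(temp.max()), 1), "C")
```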

Using These Three Sensor Types Together: The three sensor types complement each other. For example, a field can be mapped in 3D using a photogrammetric model, plant health can be analyzed using multispectral or hyperspectral imagery, and irrigation needs can be determined using thermal imagery. In short, multispectral provides overall color and texture, hyperspectral provides detailed chemical signatures, and thermal provides temperature information. By combining this rich data, tasks like yield optimization in agriculture, healthy tree selection in forestry, and green-space planning in urban design can be performed far more effectively.
Co-registration (image matching) is essential for aligning images from multiple sensors to the same geographic coordinates. Data with different resolutions and viewing geometries must be brought into alignment; GPS/IMU metadata provides a rough fit, but high accuracy requires additional software refinement. For example, when aligning HSI with a DSM, common objects such as building edges are used to estimate the transformation parameters.
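A common way to implement this refinement is feature-based registration. The sketch below uses OpenCV's ORB features and a RANSAC-fitted homography as a stand-in for matching common objects like building edges; a production pipeline would typically work on orthorectified rasters and might use a different transform model.

```python
import cv2
import numpy as np

def coregister(reference_gray, moving_gray):
    """Warp `moving_gray` onto `reference_gray` (both 8-bit grayscale images).

    ORB keypoints stand in for common objects such as building edges; the
    fitted homography is the transformation between the two sensor views.
    """
    orb = cv2.ORB_create(4000)
    k1, d1 = orb.detectAndCompute(reference_gray, None)
    k2, d2 = orb.detectAndCompute(moving_gray, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)[:200]

    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC rejects the outlier matches that GPS/IMU alone cannot.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = reference_gray.shape
    return cv2.warpPerspective(moving_gray, H, (w, h))
```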
After this stage, data fusion comes into play: the low-resolution but spectrally rich HSI is combined with high-resolution multispectral imagery to produce high-resolution HSI (MS/HS fusion).
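A deliberately naive sketch of the MS/HS fusion idea: upsample the hyperspectral cube to the multispectral grid, then inject the high-frequency spatial detail from the sharper image into every band. Real fusion methods are considerably more careful about spectral distortion; this only shows the mechanics.

```python
import numpy as np
import cv2

def ms_hs_fusion(hsi_low, ms_high):
    """Naive detail-injection MS/HS fusion (a sketch, not a production method).

    hsi_low : (h, w, B)  low-resolution hyperspectral cube
    ms_high : (H, W)     high-resolution panchromatic/multispectral intensity
    Returns a (H, W, B) cube: upsampled HSI plus high-frequency spatial detail.
    """
    H, W = ms_high.shape
    # 1) Upsample every hyperspectral band to the multispectral grid.
    hsi_up = np.dstack([
        cv2.resize(hsi_low[:, :, b], (W, H), interpolation=cv2.INTER_CUBIC)
        for b in range(hsi_low.shape[2])
    ])
    # 2) High-frequency detail = sharp image minus its blurred version.
    detail = ms_high - cv2.GaussianBlur(ms_high, (0, 0), sigmaX=3.0)
    # 3) Inject the same spatial detail into each spectral band.
    return hsi_up + detail[:, :, None]

hsi = np.random.rand(50, 50, 120).astype(np.float32)   # synthetic low-res HSI
ms = np.random.rand(200, 200).astype(np.float32)       # synthetic high-res MS
print(ms_hs_fusion(hsi, ms).shape)                     # (200, 200, 120)
```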
In recent years, CNN-based deep learning has outperformed classical methods by jointly reconstructing the disparate sensor data, combining textural, spectral, and structural information into a single output.
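A toy PyTorch model in the same spirit: concatenate the upsampled HSI with the high-resolution bands and learn the residual detail. The layer sizes and band counts are illustrative and not taken from any particular published network.

```python
import torch
import torch.nn as nn

class FusionCNN(nn.Module):
    """Minimal CNN fusion sketch: fuse upsampled HSI with high-resolution MS
    bands and regress the high-resolution hyperspectral cube."""

    def __init__(self, hsi_bands=120, ms_bands=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(hsi_bands + ms_bands, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, hsi_bands, kernel_size=3, padding=1),
        )

    def forward(self, hsi_low, ms_high):
        # Upsample the coarse HSI to the MS grid, then fuse both inputs.
        hsi_up = nn.functional.interpolate(
            hsi_low, size=ms_high.shape[-2:], mode="bilinear", align_corners=False)
        x = torch.cat([hsi_up, ms_high], dim=1)
        # Residual connection: the network only learns the missing detail.
        return hsi_up + self.net(x)

model = FusionCNN()
hsi = torch.rand(1, 120, 50, 50)     # low-res hyperspectral (batch of 1)
ms = torch.rand(1, 4, 200, 200)      # high-res multispectral
print(model(hsi, ms).shape)          # torch.Size([1, 120, 200, 200])
```

Trained against reference high-resolution cubes with a pixel-wise loss, such a network learns the texture-spectrum relationships that hand-crafted fusion rules miss.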

The payoff, in short: spatially sharp, spectrally rich imagery and far more powerful analytics. The system's strength comes not from any single link, but from the entire chain.
Agriculture: Demand for remote sensing in smart agriculture is high. A UAV can record the color and temperature of each plant along with the shape of the field. When hyperspectral and thermal sensors are used together, plant health, irrigation needs, and disease can be assessed simultaneously. For example, hyperspectral data combined with a CNN has classified corn seed varieties with 96.65% accuracy, and pest monitoring has been carried out in olive and citrus orchards. Soil nutrient deficiencies, water stress, and disease symptoms can be detected early through small changes in reflectance. Multispectral is ideal for rapid scanning of large areas, while hyperspectral and thermal excel at pinpointing specific problems. The result is higher yields and reduced chemical use.
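Many of these early-detection workflows start from simple band-ratio indices. The classic example is NDVI, computed here from narrow hyperspectral bands near 670 nm and 800 nm; the 0.4 stress threshold is purely illustrative.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from red and near-infrared bands.

    Healthy vegetation reflects strongly in NIR and absorbs red light, so
    stressed or diseased plants show lower NDVI before damage is visible.
    """
    red = red.astype(np.float32)
    nir = nir.astype(np.float32)
    return (nir - red) / (nir + red + 1e-8)   # epsilon avoids division by zero

# With a hyperspectral cube, pick narrow bands near 670 nm (red) and 800 nm (NIR):
cube = np.random.rand(100, 100, 224).astype(np.float32)  # synthetic reflectance
wavelengths = np.linspace(400, 2500, 224)
red = cube[:, :, np.argmin(np.abs(wavelengths - 670))]
nir = cube[:, :, np.argmin(np.abs(wavelengths - 800))]
stress_mask = ndvi(red, nir) < 0.4            # illustrative stress threshold
```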
Forestry: Photogrammetry and spectral analysis are used together for species and health inventories in forests. Species and diseases can be identified using DSMs and hyperspectral imagery acquired from UAVs. In one study, classification accuracy rose from 62% with hyperspectral imagery alone to 90.5% with photogrammetry plus HSI and 96% with LiDAR plus HSI; in other words, combining spectral information with 3D structure improves results. Pests such as the pine processionary moth can be detected from spectral signals even before leaf drying is visible, and early yellowing phases show up at specific wavelengths, shortening response time. This protects the forest and enables accurate stock planning.
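At its simplest, "spectral plus 3D structure" fusion can mean stacking per-crown spectra with height features from the DSM before classification. The sketch below uses synthetic data and scikit-learn's random forest purely to show the mechanics; the studies cited above use far richer features and real labels.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic dataset: one row per tree crown.
n_trees = 500
spectra = np.random.rand(n_trees, 120)            # per-crown mean HSI spectrum
heights = np.random.rand(n_trees, 1) * 30.0       # crown height from the DSM (m)
labels = np.random.randint(0, 4, size=n_trees)    # four hypothetical species

# Spectral-only features vs. spectral + structural features:
X_spec = spectra
X_fused = np.hstack([spectra, heights])

for name, X in [("spectral only", X_spec), ("spectral + height", X_fused)]:
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X[:400], labels[:400])                 # simple train/test split
    print(name, "accuracy:", clf.score(X[400:], labels[400:]))
```

With real data, the fused feature set is where the reported jump from 62% to 90%+ accuracy comes from: structure disambiguates crowns that spectra alone cannot.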

Urban Planning: Photogrammetry is used to create 3D city models (e.g., CityGML), while hyperspectral and multispectral imagery distinguish roofs, facades, and vegetation. Thermal data reveals energy leakage, asphalt heat accumulation, and green roofs, while hyperspectral data identifies plant species and surface materials. A study in Germany produced a tree inventory with 96% accuracy using LiDAR+HSI and 90% using photogrammetry+HSI. This data feeds into urban agriculture, infrastructure, pollution monitoring, and fire-risk analysis. Urban planners make more sustainable decisions by seeing form (3D) and substance (spectral) together.
Photogrammetry and hyperspectral imaging are becoming increasingly intertwined thanks to advancing processing power and artificial intelligence. Combining 3D maps with the full color spectrum of each pixel makes higher agricultural yields, early-warning systems in forests, and smart urban infrastructure possible. Data integration makes it possible not just to see what is happening but to understand why. For example, leaf color changes, 3D drought maps, and thermal temperature measurements can be interpreted together to reveal the chemistry of a stressed plant.
In the future, smaller, cheaper sensors will become widespread, and systems providing multimodal imagery from space down to UAVs will proliferate. Real-time data streams and cloud-based analysis will deliver more relevant information than ever to everyone from farmers to forestry officials. For all these advances to materialize, however, data integration and fusion will be crucial. Intelligent algorithms will reveal details invisible to the human eye. And the ability to detect even the smallest changes in soil, leaves, and buildings will help humanity protect natural resources far more effectively.


