

Oct 7
Today, remote sensing is one of the most effective tools for observing and understanding the world. Imagery obtained from aircraft or from orbit allows us to collect data quickly across vast areas, often without fieldwork. This data can be processed and used in a wide range of fields, from engineering to environmental management. In this article, we will explore the fundamental definition of remote sensing, image processing, and classification methods. We will also examine its application steps and its wide range of uses, from urban planning to disaster management, with examples from Türkiye.
Remote sensing is the science of obtaining the physical properties of objects and areas on the ground without direct contact. In other words, we obtain information about the Earth by detecting electromagnetic energy reflected or emitted from distant objects through sensors. This allows us to quickly collect data across large areas, often without having to go into the field. For example, TÜBİTAK UZAY's BİLSAT satellite monitors urbanization, disasters, the environment, and pollution with imagery at resolutions between 12.6 m and 27.6 m from an altitude of 686 km. Remote sensing data is used in many areas, including environmental change monitoring, agricultural crop monitoring, forest fire prevention, and water resources management.
Different platforms are used to collect remote sensing data. Satellites, aircraft/helicopters, and UAVs (drones) are the most common. The differences between these platforms can be summarized as follows:
Satellite-Based Sensors: Orbiting satellites cover large areas. For example, TÜBİTAK UZAY's RASAT Earth Observation Satellite operates in a sun-synchronous orbit at an altitude of 700 km and offers a resolution of 7.5 m for panchromatic (black-and-white) imagery and 15 m for multispectral mode. Satellites are suitable for monitoring geographical changes (urbanization, forest degradation, etc.) on a global scale, but their revisit times (the interval between successive images of the same area) are limited.
Aircraft (Airplane/Helicopter): Aircraft sensors are at a level between satellites and drones. They can scan medium-sized areas with high resolution. For example, aerial photography and optical cameras are frequently used in urban planning or engineering mapping.
Unmanned Aerial Vehicles (UAVs/Drones): They provide the highest spatial resolution in small areas. They offer centimeter-level detail for plant health mapping in agriculture or construction site surveys. However, their range and flight time are limited.
Each of these sensors has its advantages and disadvantages. Satellites are expensive but cover a large area, while UAVs are low-cost but cover a small area. Which platform you choose depends on your objectives and scope.
There are two basic groups in remote sensing based on the energy source:
Passive Sensors: They detect light from a natural energy source like the Sun. For example, the Landsat, Sentinel, and BİLSAT satellite sensors create images by recording sunlight that strikes the ground and is reflected back. Passive systems do not generate their own energy; they only collect reflected or emitted energy. They typically analyze the visible and near-infrared bands, distinguishing vegetation, water, and buildings by their color and reflectance.
Active Sensors: They generate their own energy (radar signals, lasers, etc.) and send it to the target. When the transmitted energy hits the object and returns, they detect it and create an image. Examples include SAR (Synthetic Aperture Radar) satellites like Radarsat-2 and LiDAR (Light Detection and Ranging) sensors. Active systems operate by sending radar or laser pulses, recording their return time and intensity to obtain information such as elevation, surface moisture, and structure.
Ultimately, both types of sensors are complementary in remote sensing. For example, radar can operate under cloud cover, while optical sensors provide more vivid detail. Active or passive systems are typically selected or used together depending on the project's needs.
The power of remote sensing lies in its use of wavelengths invisible to the human eye.
Visible light (approximately 380–740 nm) is the part of the electromagnetic spectrum that we can see, but the near-infrared (NIR) , infrared (IR) , and microwave (radar) bands are also used in remote sensing.
Every material reflects and absorbs light differently at different wavelengths. This difference is called a spectral signature. For example, healthy plants reflect some green light in the visible range but reflect very strongly in the near-infrared. Satellites and cameras can distinguish objects by recording these wavelengths in separate bands.
Remote sensing also distinguishes between multispectral and hyperspectral imagery.
Multispectral imagery typically contains several (3–10) broad bands. For example, Landsat-8 provides color (RGB), NIR, SWIR, and thermal information in 11 bands.
Hyperspectral imagery, on the other hand, contains hundreds of narrow bands. NASA's Hyperion sensor collects data with 220 narrow bands at a resolution of 30 meters. As the number of bands increases, so does the ability to distinguish objects. For example, hyperspectral data allows for much more precise classification of soil type, mineral composition, or different plant species.
Spectrally based indices can also be used. For example, NDVI (Normalized Difference Vegetation Index) measures plant health numerically from the Red and Near-Infrared (NIR) bands. By calculating such indices from remote sensing images, you can infer many physical variables such as soil moisture, vegetation health, pollution, and more.
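The NDVI calculation above can be sketched in a few lines of NumPy. The reflectance values below are hypothetical toy data, not from a real scene; in practice the Red and NIR arrays would be read from the corresponding satellite bands:

```python
import numpy as np

# Hypothetical 2x2 reflectance values for the Red and NIR bands.
red = np.array([[0.10, 0.30],
                [0.05, 0.25]])
nir = np.array([[0.50, 0.35],
                [0.60, 0.28]])

# NDVI = (NIR - Red) / (NIR + Red).
# Values near +1 indicate dense, healthy vegetation; values near 0
# indicate bare soil or built-up areas; negative values often indicate water.
ndvi = (nir - red) / (nir + red)
print(ndvi)
```

Pixels with high NIR reflectance relative to Red (typical of healthy vegetation) produce NDVI values close to 1, which is why the index works as a per-pixel health measure.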
Image classification is used to generate engineering-relevant maps from the collected images. In remote sensing, each object has a different spectral signature (a reflectance pattern based on wavelengths). For example, sand, plants, water, and concrete each have their own unique reflectance characteristics. Software classifies pixels into different classes based on these signatures.
Image classification is generally done by three methods:
Supervised Classification: The user trains a model on predetermined class examples (training samples of known land cover), and the algorithm then classifies the entire image.
Unsupervised Classification: The algorithm groups pixels with similar spectral properties in the image and creates classes based on their similarities; the user then makes sense of these classes.
Object-Based Analysis (OBIA): In high-resolution images, neighboring pixels are grouped to create “objects”, and the shape, texture and spectral properties of these objects are evaluated together.
The resulting classes include water, forest, agricultural land, and urban structures. Classification accuracy increases as the number of spectral bands increases (as is the case with hyperspectral data), allowing for more precise separation of different tree species, various agricultural products, or minerals. In short, remotely sensed images are converted into digital maps through analyses based on spectral signatures.
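The unsupervised approach described above can be illustrated with a tiny hand-rolled k-means on hypothetical two-band pixel values. This is only a sketch: real remote sensing software runs k-means or ISODATA on full multiband images, and the deterministic center seeding below stands in for the random initialization such tools actually use:

```python
import numpy as np

# Hypothetical six-pixel "image", two bands per pixel (e.g. Red and NIR).
pixels = np.array([[0.10, 0.60], [0.12, 0.55], [0.11, 0.62],   # vegetation-like (high NIR)
                   [0.30, 0.05], [0.28, 0.08], [0.32, 0.04]])  # water-like (low NIR)

def kmeans(data, k, iters=10):
    # For a deterministic sketch, seed the k=2 centers with one pixel from
    # each spectral group; real k-means picks initial centers randomly.
    centers = data[[0, 3]].copy()
    for _ in range(iters):
        # Assign each pixel to its nearest spectral center (squared Euclidean distance).
        dists = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of the pixels assigned to it.
        centers = np.array([data[labels == c].mean(axis=0) for c in range(k)])
    return labels

labels = kmeans(pixels, k=2)
print(labels)  # → [0 0 0 1 1 1]
```

The algorithm groups spectrally similar pixels into classes without any training data; as the article notes, it is then up to the analyst to interpret which class is "vegetation" and which is "water".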
The general workflow in remote sensing projects is as follows:
First, the purpose for which data will be collected is planned and the appropriate sensor/platform (satellite, aircraft, drone) is selected.
The image (photo, radar data, etc.) is collected with the selected sensor.
The data is subjected to pre-processing (geometric correction, atmospheric effect removal, blur removal, etc.).
Classification, object detection or quantitative analysis is performed from the processed image.
The information obtained is turned into maps and used for engineering decisions (e.g., infrastructure planning, risk analysis). GIS and remote sensing software tools are used at each step.
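One of the pre-processing steps in the workflow above, atmospheric correction, can be sketched with a simple dark-object subtraction: the darkest pixel in a scene is assumed to be nearly zero reflectance, so any remaining signal there is attributed to atmospheric haze and subtracted everywhere. The digital numbers below are hypothetical, and real correction methods are considerably more sophisticated:

```python
import numpy as np

# Hypothetical raw digital numbers for one band of a small scene.
band = np.array([[52.0, 120.0, 200.0],
                 [48.0, 130.0, 210.0],
                 [50.0, 125.0, 205.0]])

# Dark-object subtraction: treat the scene minimum as the haze offset
# added by atmospheric scattering, and remove it from every pixel.
haze = band.min()
corrected = np.clip(band - haze, 0, None)
print(corrected)  # the darkest pixel becomes 0; relative contrast is preserved
```

After this step the image is better suited to the classification and quantitative analysis stages, because pixel values reflect the ground rather than the atmosphere between it and the sensor.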
Areas of Use: Remote sensing is used in a wide range of areas, from urban planning to climate monitoring.
Urban Planning and Infrastructure: Monitoring urban sprawl and creating land slope models for road/infrastructure projects.
Environment and Forestry: Monitoring forest fires, erosion, water pollution or air quality; protecting natural resources.
Disaster Management: Damage assessment and rapid response planning after disasters such as earthquakes and floods. For example, satellite imagery is used to identify collapsed buildings after an earthquake.
Agriculture and Water Management: Crop health (using vegetation indices), soil moisture, or irrigation planning. In Turkey, companies like Farmonaut use satellite and drone imagery to offer farmers solutions focused on productivity and sustainability.
Defense and Security: Military reconnaissance and border security, wide area scanning.
Geology and Mining: Mineral exploration, underground structure analysis.
Weather and Climate: Meteorological satellites make global atmospheric measurements, providing data for climate change models.
Remote sensing applications are rapidly expanding in Türkiye. For example, BİLSAT and RASAT, designed by TÜBİTAK UZAY, are used for Earth observation. BİLSAT, the first Turkish Earth observation satellite, launched in 2003, provides imagery with a resolution of 12.6–27.6 m and serves mapping, disaster and environmental monitoring, and planning purposes.
In the near future, microsatellites like LAGARI will further strengthen our security, forest monitoring, agriculture, and disaster monitoring efforts. These examples demonstrate how useful remote sensing is as a data source for surveyors and related experts.