Horizon Precision Systems
Horizon Precision Systems has decades of experience developing unmanned aerial systems for all types of applications. More importantly, we have the facilities and people in place to properly service and support them so your investment stays on the job.
Thursday, March 17, 2016
Wednesday, December 30, 2015
We're Hiring!
Horizon Precision Systems is looking for an experienced Staff Engineer for our development team. Full details can be found at this link:
Horizon Precision Systems -- Staff Engineer
Come join our growing team!
Monday, August 24, 2015
An Introduction to Agricultural UAV Imaging Systems
For decades, farmer-pilots (farmers who have their own airplanes) have been able to inspect their crops from above and, as a result, react faster to changing conditions. Anecdotal evidence suggests that farmer-pilots have consistently produced better yields, primarily because they can see what is going on in all areas of their fields.
The advantages of inspecting your crop from above are becoming more widely accessible thanks to small UAVs (unmanned aerial vehicles). These small, battery-powered multi-rotor and fixed-wing aircraft give growers a bird's-eye view of their fields, providing much of the value of owning a full-size aircraft at a fraction of the cost. The trained eye of a seasoned farmer can often identify issues quickly, and the airframes provide a view of the entire field rather than just the outer few rows. The results should mirror the impact of owning a full-size airplane, and in some cases may be better, thanks to the high-resolution cameras and low operating altitudes of the UAVs.
Near-infrared imaging creates additional value by identifying stressed plants early. Near-infrared (the light just beyond the visible light spectrum that we see) reflectivity of a plant, when compared to visible light reflectivity, can indicate the general health of the plant. When two plants are compared in visible light, they may look identical. When photographed with a NIR (near-infrared) sensor, we can tell if one of the plants is stressed.
Plants rely on the photons in visible light to perform photosynthesis, and therefore absorb visible light. Additionally, the cell structure of a healthy plant reflects NIR light: the energy of NIR photons is lower, and absorbing those wavelengths would likely overheat the plant without significantly benefiting photosynthesis. Using these two facts of nature, we can derive the overall health of a plant by comparing visible light reflectivity with NIR reflectivity.
A healthy plant will reflect a small amount of visible light and high amount of NIR light. By comparison, a stressed plant will reflect a higher amount of visible light, and a lower amount of NIR light. Since our sensors (a standard camera and a NIR camera) measure reflectivity, they can be used to identify the areas of a crop that are stressed by comparing the two views pixel-by-pixel.
One of the more common expressions of the difference between the two measurements is “NDVI”, the Normalized Difference Vegetation Index. Developed in the early 1970s as a way to evaluate imagery from the Landsat 1 satellite, NDVI divides the difference between NIR and red reflectance by their sum, NDVI = (NIR - Red) / (NIR + Red), which forces the result into a range between -1 and 1. Limiting the range of possible values makes it much easier to create a visual representation of the health of a field, since each increment in the range can be assigned a color. In many NDVI images, green represents healthy plant life, and red represents either areas without vegetation or areas of stressed vegetation.
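The NDVI calculation described above takes only a few lines. This is a minimal sketch assuming NumPy; the sample reflectance values are hypothetical, not measurements from our sensors:

```python
import numpy as np

def ndvi(nir, red):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red), bounded to [-1, 1]."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    # Guard against division by zero where both bands read zero.
    return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1.0, denom))

# Toy 2x2 patch: healthy vegetation reflects far more NIR than red,
# so those pixels score high; the bottom row (soil-like) scores low.
nir = np.array([[0.50, 0.45], [0.30, 0.05]])
red = np.array([[0.08, 0.10], [0.20, 0.04]])
print(np.round(ndvi(nir, red), 2))
```

Values near 1 indicate dense, healthy vegetation; values near 0 or below indicate bare soil, water, or stressed plants. A color map applied to this array produces the familiar green-to-red field image.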
NIR sensors fall into two categories: purpose-built, engineered, fully calibrated NIR sensors, and converted point-and-shoot cameras. Point-and-shoot cameras are converted by opening the body, removing the infrared-blocking filter (the “hot mirror”) that sits in front of the image sensor, and replacing it with a blue filter of some type. The blue filter blocks as much visible light as possible, allowing the sensor to respond to NIR wavelengths.
While prototyping airframes and qualifying sensors, our technical team has tested both purpose-built, fully calibrated NIR sensors and converted point-and-shoot NIR cameras. In our experience, the differences between the two categories are significant. While testing the converted point-and-shoot cameras, we found significant variation in what should have been comparable images. In one example, a field was mapped with a converted Canon SX260 at 10:00 AM, and the images were stitched and converted to an NDVI map. The field was mapped again at 2:00 PM, and the stitched, converted image from that flight differed significantly from the 10:00 AM image.
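The drift between two flights can be quantified by differencing the NDVI maps pixel by pixel. The rasters below are synthetic stand-ins for the 10:00 AM and 2:00 PM flights, not our actual field data; the systematic shift mimics what an uncalibrated converted camera shows as the sun angle changes:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic morning NDVI map, plus an afternoon map that adds a
# systematic illumination-driven shift on top of the same field.
ndvi_am = rng.uniform(0.2, 0.7, size=(100, 100))
ndvi_pm = np.clip(ndvi_am + rng.normal(0.15, 0.05, size=(100, 100)), -1.0, 1.0)

# For an unchanged field and a stable, calibrated sensor this number
# should sit near zero; a large value means the maps are not comparable.
drift = float(np.mean(np.abs(ndvi_pm - ndvi_am)))
print(f"mean absolute NDVI drift between flights: {drift:.2f}")
```

A simple summary statistic like this makes it easy to tell whether two maps of the same field actually agree before drawing agronomic conclusions from them.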
We call the imagery from a converted point-and-shoot NIR camera “qualitative”. It may be helpful in some cases and provide some insight into the health of a field, but it does not appear to be consistent and comparable over time. For instance, it may be difficult to judge the effects of a field treatment from qualitative imagery, as variations in the sun's position or cloud cover can throw off the calculation and prevent real comparisons.
Purpose-built NIR sensors, while more expensive, provide quantitative data. They are calibrated at the factory, and often again for each flight. Some carry an ILS (incident light sensor) that lets them adjust for changing light conditions. The software included with purpose-built sensor systems corrects for varying illumination to produce consistent, comparable data. So if you map a field, ground-truth the results, apply a treatment, and fly again in a week or so, a purpose-built, calibrated NIR system should give a solid idea of what the treatment accomplished. (Note: ground-truthing is simply walking out to the stressed areas of the crop to determine why the plants are stressed and what to do about it.)
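The idea behind incident-light correction is to divide the light the camera measures by the light the ILS records falling on the scene at capture time. This is a minimal sketch of that ratio; the digital numbers, irradiance values, and the placeholder calibration gain are all hypothetical, not the values any particular sensor reports:

```python
import numpy as np

def reflectance(dn, ils_irradiance, calib_gain=1.0):
    """Approximate reflectance: scale raw digital numbers (dn) by a
    placeholder factory calibration gain, then divide out the incident
    light the ILS measured when the frame was captured."""
    return (np.asarray(dn, dtype=float) * calib_gain) / ils_irradiance

# The same target photographed under full sun and under cloud cover:
# the raw pixel values halve along with the light, but the ratio of
# reflected to incident light stays the same.
sunny = reflectance(dn=[800.0], ils_irradiance=1000.0)
cloudy = reflectance(dn=[400.0], ils_irradiance=500.0)
print(sunny[0], cloudy[0])
```

Because both flights yield the same reflectance for an unchanged target, maps captured under different sky conditions remain directly comparable, which is what makes before-and-after treatment comparisons meaningful.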
In summary, the three common types of imaging sensors appearing in the agricultural UAV market are visible light, qualitative NIR, and quantitative NIR. It isn't enough to choose a system based on airframe capabilities; those should likely take a back seat to the types of sensors available on the platform.