
Ask The Experts — An Agriculture Drone Q&A

At the end of senseFly’s recent Ag Drone Insights webinar, we reserved time for an open Q&A session. The questions were submitted by viewers from around the globe; the answers were provided by the event’s three expert presenters: Norm Lamothe of Deveron UAS, Erick Lebrun of AIRINOV and Nathan Stein, senseFly’s ag solutions manager.

This post details their responses in full, covering topics such as: flight heights, achievable accuracy, what makes for good data, in-field targeting and more.

Our presenters answered the following questions:

Norm Lamothe (Deveron UAS), Erick Lebrun (AIRINOV), and Nathan Stein (senseFly)

Isn’t accuracy enhanced with a VTOL (vertical take-off and landing) aircraft?

Nathan Stein, senseFly: Each one is different and has its benefits. If you’re doing research and want to get up close, really close, then a VTOL makes sense, but I think in most cases the accuracy of our aircraft is very well proven, and overlap settings are critical here: it’s just making sure you have adequate overlap. Other than that, I have full faith and confidence in the results. We’ve proven to several genetics and seed companies and research institutions that our product can deliver very accurate information. I encourage you to go out and do some research yourself on that and learn a bit more [read case studies]. If you have further questions, feel free to reach out to us and ask, so we can explain it in more depth.

Norm [Deveron UAS], what height were you flying your aircraft? Did you find the accuracy was good enough for precision ag?

Norm Lamothe, Deveron UAS: That’s a great question. The normal altitudes we fly at with our system are between 300 and 400 feet above ground level. One of the things we get caught up in is resolution and how precise it needs to be for most applications in agriculture. Remember, when we’re talking about equipment in the field we start at 20 feet and go all the way out to a 60, 90, or 120-foot sprayer. That equipment has to be able to adjust to any prescription you give it. To have a three- or four-centimetre resolution is not necessary to make a decision in agriculture at this point in time. We’re not looking at individual plants yet; we’re still looking at zones or areas that are much larger.

To have a three- or four-centimetre resolution is not necessary to make a decision

Are there any applications that are possible for orchards?

Norm Lamothe, Deveron UAS: Absolutely. Orchards have a chlorophyll signature and there is variability across an orchard as well. A lot of the orchard work we’ve done has been around water management. We’ve done some work with almonds and pistachios in the California area, dealing with things such as salination and using water more efficiently to establish where there might be problems with irrigation systems.

Erick Lebrun, AIRINOV: I would add that we are also able to provide NDVI really simply for every kind of vegetation. You can use our platform for that. If you’re interested in developing some agronomic models, for fertilisation for example, we can help you. We are doing that with pineapples, for example, in the Philippines. It’s a worldwide application.
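For readers curious about what an NDVI map is built from, here is a minimal sketch in Python. The band values and the helper name ndvi are hypothetical; it simply applies the standard (NIR - Red) / (NIR + Red) formula to per-pixel reflectance values, such as those a multispectral camera’s near-infrared and red bands provide.

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    `nir` and `red` are per-pixel reflectance values from the near-infrared
    and red bands (hypothetical numbers below, not real sensor output).
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    # Guard against division by zero on pixels where both bands are dark.
    return np.divide(nir - red, denom, out=np.zeros_like(denom), where=denom != 0)

# Dense canopy reflects strongly in NIR, bare soil much less so.
print(ndvi([0.45, 0.50, 0.30], [0.08, 0.10, 0.20]))  # ~[0.70, 0.67, 0.20]
```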

How long does it take to process the data from a 15-minute flight?

Erick Lebrun, AIRINOV: Processing the data from a fifteen-minute flight is really fast: maybe one or two hours. It depends on the internet connection, because uploading the data takes the most time, but once it’s on the servers it’s a really fast application.

What do you consider good data? What are the parameters you use?

Norm Lamothe, Deveron UAS: For us, good data is the ability to collect consistent data across both lighting and outdoor temperature conditions, and also across geographic areas. If we fly a field one day then, all else being equal except the lighting conditions, we should be able to fly that field the next day and get very similar results. The ability to gather consistent data day after day is what I would consider good data. That, and data that stitches well, which comes down to the planning phase of setting up the UAV and ensuring you have proper overlap, so you do get good stitches and good end results.

Good data is the ability to collect consistent data across both lighting and outdoor temperature conditions, and also across geographic areas.

What algorithm should I use with my Parrot Sequoia sensor to calculate yield predictions?

Norm Lamothe, Deveron UAS: There are many algorithms in the marketplace that provide outputs for yield estimation. A lot of them still require ground truthing and putting boots on the ground: taking that image and then going out into the field.

We have a system used here in Ontario that uses a 30-foot rope: you determine the yield in that space, calibrate it to the image and multiply it out across the field. That gives you results accurate to within five to seven percent of the yield potential.

In the future, you need to remember that weather conditions and circumstances can change that crop, along with diseases and pests. From a strict yield perspective, all else being equal, it’s pretty accurate.

Erick Lebrun, AIRINOV: Actually, we have done some research on yield prediction. What’s important is to consider that we need relative data over time, and to fly often during the plant’s growth cycle to see the evolution of biomass, for example, over the crop’s life. Then we can predict something. But from just one flight’s data? It’s really complicated [difficult] to know anything.
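To make the rope-sample arithmetic Norm describes concrete, here is a minimal sketch in Python. The function and variable names are hypothetical, and the approach is a simple ratio extrapolation, assuming yield scales roughly linearly with a vegetation index: measure yield by hand in the sampled strip, derive a yield-per-index ratio there, then apply that ratio to every cell of the field.

```python
import numpy as np

def estimate_field_yield(index_map, sample_mask, sample_yield_t_per_ha, cell_area_ha):
    """Rough ratio extrapolation from a ground-truthed strip to the whole field.

    index_map             - 2D array of a vegetation index (e.g. NDVI) per grid cell
    sample_mask           - boolean array marking the cells under the sampled strip
                            (the '30-foot rope' in the example above)
    sample_yield_t_per_ha - yield measured by hand inside that strip, tonnes/ha
    cell_area_ha          - area of one grid cell in hectares
    """
    index_map = np.asarray(index_map, dtype=float)
    sample_mask = np.asarray(sample_mask, dtype=bool)

    # Calibration: tonnes per hectare per unit of index, from the sampled strip.
    yield_per_index = sample_yield_t_per_ha / index_map[sample_mask].mean()

    per_cell_yield = index_map * yield_per_index          # tonnes/ha in each cell
    return float((per_cell_yield * cell_area_ha).sum())   # whole-field tonnes

# Hypothetical 2 x 3 field grid; the two top-left cells were hand-sampled.
index_map = [[0.60, 0.55, 0.40],
             [0.62, 0.50, 0.45]]
sample_mask = [[True, True, False],
               [False, False, False]]
print(estimate_field_yield(index_map, sample_mask,
                           sample_yield_t_per_ha=9.2, cell_area_ha=0.25))
```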

How much of a difference in measured chlorophyll level is caused by differences in sunlight (UV) reflection? Would it be relevant to compare data captured in the morning to that collected in the afternoon, and how can you standardise this data?

Erick Lebrun, AIRINOV: Actually, there is no real difference. The Sequoia’s sunshine sensor calibrates every image, so whether you fly in the morning or the afternoon there is no problem. You just have to be careful of the height of the sun in the sky: not too early in the morning or too late in the evening. Over a whole day our operators fly more than 600 hectares, so there is no specific problem.
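As a simplified illustration of why that compensation works (this is only the principle, not the Sequoia’s actual processing chain, and the numbers are hypothetical): dividing the radiance the camera records by the incident irradiance the sunshine sensor measures at the same instant gives reflectance-like values, so brighter and dimmer light conditions produce comparable numbers.

```python
import numpy as np

def normalise_by_irradiance(band_radiance, irradiance):
    """Illustrative sunshine-sensor compensation: radiance / incident irradiance.

    This shows the principle only; a real calibration pipeline is more involved.
    """
    return np.asarray(band_radiance, dtype=float) / float(irradiance)

# The same crop under weaker morning light and stronger afternoon light:
# the raw radiance differs, the normalised (reflectance-like) values do not.
print(normalise_by_irradiance([40.0, 60.0], irradiance=200.0))   # [0.2, 0.3]
print(normalise_by_irradiance([80.0, 120.0], irradiance=400.0))  # [0.2, 0.3]
```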


What’s the accuracy of the eBee Ag or the eBee SQ?

Nathan Stein, senseFly: The geographical accuracy is fairly decent. I’ve been able to fly fields and overlay my imagery, or my data, in SMS or any other farm management information system quite well. If you need a boost in accuracy, you can always go with the RTK version, and that will give you a very accurate geographical position.

On the spectral side, the camera side, with the Sequoia we should be looking at around 5–10 percent accuracy, and I think that should be very consistent.

Do you use any sort of targeting in the field?

Nathan Stein, senseFly: The Sequoia has been released with target calibration. When we sell it, there is a procedure of taking pictures of a target, and that’s going to improve as time goes on. Stay tuned for that! We’re basing it off a standard; these cameras are based off standards and calibrated when they’re produced, so that allows us to have great calibration.
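As a generic illustration of what a reflectance-target procedure provides (the names and numbers below are hypothetical, and this is not senseFly’s actual workflow), photographing a panel of known reflectance lets you derive a per-band scale factor that converts raw camera values into reflectance:

```python
import numpy as np

def panel_scale_factor(panel_pixels, panel_reflectance):
    """Scale factor from an image of a calibration target of known reflectance.

    panel_pixels      - raw digital numbers sampled over the target in one band
    panel_reflectance - the target's certified reflectance for that band (0-1)
    """
    return panel_reflectance / np.asarray(panel_pixels, dtype=float).mean()

def to_reflectance(raw_band, scale):
    """Convert raw band values into reflectance using the panel-derived scale."""
    return np.asarray(raw_band, dtype=float) * scale

# Hypothetical 50%-reflectance panel reads ~10000 counts in this band.
scale = panel_scale_factor([9800, 10200, 10000], panel_reflectance=0.5)
print(to_reflectance([4000, 12000], scale))  # ~[0.2, 0.6]
```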

Watch senseFly’s full Ag Drone Insights webinar on demand.
