In the middle of the UF/IFAS Beef Research Unit north of Gainesville, ECE PhD student Dylan Stewart unpacks an unmanned aerial vehicle from its flight case. The UAV (drone) is fitted with a hyperspectral camera that can ‘see’ wavelengths far beyond the visible spectrum available to human eyes. It’s being programmed to fly autonomously over a cow pasture, taking snapshots as it flies. OK, cool, but a cow pasture in the middle of nowhere? What does this have to do with electrical engineering?
What’s happening here is collaboration. The story: Assistant Professor Chris Wilson of the UF Agronomy Department is studying the ecosystem impacts of the management practices recommended by IFAS/Agronomy, specifically the impact of introducing perennial peanut to grazing land for beef cattle. He and Agronomy PhD student Hunter Smith have been seeking new ways to measure plant traits such as carbon and nitrogen content, as well as root growth and proliferation. One of the motivations behind the work is climate change—perennial peanut is quite effective at sequestering carbon.
Current methods to measure and characterize plant canopies and root systems are slow and often destructive—plants must be manually removed from the soil in order to be analyzed. Since plants are the engines that drive carbon uptake and storage in ecosystems, this represents a significant challenge for our ability to monitor and manage agroecosystems for services like carbon sequestration, which could help mitigate climate change. Traditional fieldwork is labor-intensive and often fairly coarse (for example, destructively harvesting shoots and roots from a given area to get biomass measurements, while missing more subtle features of their architecture or biochemistry).
And that’s where ECE Associate Professor Alina Zare gets involved. Her Machine Learning and Sensing Lab has extensive experience analyzing UAV-collected hyperspectral imagery with machine learning algorithms, and she has developed an extensive codebase designed specifically to help process the masses of data generated by hyperspectral flyovers. The hope is that combining her expertise in UAVs, hyperspectral imagery, and machine learning with Dr. Wilson’s expertise in ecosystem services and pasture management will yield pastures that are more resilient and better able to contribute to climate change mitigation.
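To give a flavor of what analyzing hyperspectral pixels can look like, here is a minimal, illustrative sketch: computing the classic Normalized Difference Vegetation Index (NDVI) from near-infrared and red reflectance bands. The band values below are hypothetical, and this is a textbook index rather than the lab’s actual machine learning pipeline.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red).

    Healthy vegetation reflects strongly in the near-infrared and absorbs
    red light, so NDVI values closer to 1 indicate denser green canopy.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# Hypothetical reflectance values for three pixels
# (e.g., dense canopy, moderate canopy, bare soil)
nir_band = np.array([0.50, 0.45, 0.10])
red_band = np.array([0.08, 0.10, 0.09])
print(ndvi(nir_band, red_band))
```

A real hyperspectral workflow involves hundreds of narrow bands per pixel rather than two, which is precisely why machine learning methods are needed to extract plant traits from the data.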
And so it is that Dylan connects to the drone via a laptop, selects the correct program, and tells it to go. The drone launches, buzzing raucously into the air, then settles into a regular path, methodically moving above the field in overlapping straight lines. The team relaxes for a bit, hopeful that their hard work and planning will yield useful, non-“mushy” data. Today’s flight is a practice run to ensure that the data looks right and that the drone behaves as expected. If everything goes according to plan, future flights will be easier and more frequent. However, as Dr. Wilson points out, this sort of thing almost never goes right the first time.
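Those overlapping straight lines are a standard “lawnmower” survey pattern, which a mission planner generates from the field’s dimensions and the spacing between passes. A minimal sketch of the idea (the function name and parameters are hypothetical, not taken from any particular flight-planning software):

```python
def lawnmower_waypoints(width, height, spacing):
    """Generate (x, y) waypoints for a back-and-forth survey.

    Covers a width-by-height rectangular field with parallel passes
    `spacing` meters apart, alternating direction on each pass so the
    drone sweeps the field like a lawnmower.
    """
    waypoints = []
    x = 0.0
    heading_up = True
    while x <= width:
        if heading_up:
            waypoints.append((x, 0.0))
            waypoints.append((x, height))
        else:
            waypoints.append((x, height))
            waypoints.append((x, 0.0))
        heading_up = not heading_up
        x += spacing
    return waypoints

# A 10 m x 20 m field with 5 m between passes: three passes, six waypoints
print(lawnmower_waypoints(10, 20, 5))
```

In practice the pass spacing is chosen from the camera’s footprint on the ground so that adjacent image strips overlap, which is what allows the snapshots to be stitched into a single map of the pasture.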