December 01, 2021
Propellered Protectors
New aerial recon and precision application tech could protect crops soon.
A collaboration between entomologists and engineers is creating a buzz around the University of California, Davis campus—and it’s more than just the hum of drone propellers overhead.
Led by professors Christian Nansen of the entomology department and Zhaodan Kong of mechanical and aerospace engineering, the multi-talented team is pushing the boundaries of technology, using drones to autonomously detect stress in plants, identify the cause, and deliver beneficials with exacting precision to the infested area. Detecting changes in crop status too subtle for the naked eye and dropping drift-prone persimilis mites on target requires painstaking attention to tiny details, as well as a willingness to tackle huge questions: the nature of light, plants’ daily rhythms, and how to predict the behavior of wind.
Take the challenge of detecting spider mites in a crop. Entomologists, including Nansen, have demonstrated that plants suffering from sucking pests such as mites or aphids reflect light differently than healthy plants do.
“The plant is supposed to be efficient, it’s supposed to be absorbing the sunlight, but it can’t do that as efficiently when it’s stressed,” Nansen explains. “So the extra energy is going to bounce off the surface of the plant and we can pick that up as a change of reflectance.”
Subtle changes. Anil Mantri, a researcher in Nansen’s lab, points to the features of the team’s hyperspectral camera, which can detect 150 wavelengths of visible and invisible light. (My camera picks up three of them: red, green, and blue.)
That’s great—at 11:32 and 41 seconds on Tuesday morning, a snapshot can reveal telltale differences in certain wavelengths that signal trouble in some plants. But, notes Nansen, 30 minutes later, a cloud could pass across the sky for a moment, changing the color and intensity of the light. And by then, the sun and shadows are at a different angle. In short, light changes constantly, scrambling the reflectance data.
So Nansen, Mantri, and Kong (an expert in using hyperspectral imaging in human medicine) added a downwelling sensor, which registers the precise quality of light hitting the plants as the camera snaps each image. They’re also exploring a different approach—imaging at night. Mounting a halogen light bar beside the camera on the greenhouse rail, they can create consistent lighting for every image.
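In principle, the correction Nansen describes is simple arithmetic: divide what the camera records in each wavelength band by what the downwelling sensor says was hitting the plants at that same instant, so a passing cloud or a lower sun cancels out of the comparison. Here is a minimal sketch of that idea in Python; the function and variable names are illustrative, not the team’s code, and a real pipeline would also handle calibration and geometry.

```python
import numpy as np

def apparent_reflectance(radiance_cube, downwelling_irradiance):
    """Normalize a hyperspectral frame by the illumination logged at capture time.

    radiance_cube: (rows, cols, bands) array of radiance recorded by the camera.
    downwelling_irradiance: (bands,) array of incident light per band from the
        downwelling sensor, taken at the same moment as the frame.

    Dividing band by band cancels changes in the color and intensity of the
    light, so frames captured under different skies stay comparable.
    """
    eps = 1e-9  # guard against division by zero in nearly dark bands
    return radiance_cube / (downwelling_irradiance + eps)

# Toy example: a 2 x 2 pixel frame with 150 bands of made-up numbers.
rng = np.random.default_rng(0)
frame = rng.uniform(0.0, 1.0, size=(2, 2, 150))
irradiance = rng.uniform(0.5, 1.5, size=150)
reflectance = apparent_reflectance(frame, irradiance)
print(reflectance.shape)  # (2, 2, 150)
```

The nighttime light bar attacks the same problem from the other direction: instead of measuring the changing light, it makes the light constant.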
There may be other advantages.
“There’s even the possibility that as the plant gets into its respiratory mode when it’s not doing photosynthesis, it could be that it is reacting more strongly to a stressor,” Nansen notes. Nighttime monitoring could be extremely convenient, too. Greenhouse or farm managers could activate a rail-mounted or drone-mounted camera/light setup before they go home, then come back the next morning to a fresh crop health report.
Prop power. With quick turns on just two bolts, Mantri demonstrates how the camera setup can be affixed to a greenhouse rail or to one of the team’s DJI S1000 octocopter drones. Once they zero in on the most relevant wavelengths, the group can build an even lighter camera and fine-tune the light bar, too. In the meantime, aeronautical engineer Na Ma is readying the other S1000 for flight. Her drone carries the team’s “Bugbot,” a 3-D printed drum with a remote-controlled gate designed to drop a mix of vermiculite and beneficials, such as predatory mites or wasps, onto an infested crop. If Ma can deliver those beneficials precisely to the infested plants, applying predators could become more cost-competitive with conventional insecticides, which is likely why the California Department of Pesticide Regulation funded the work.
It’s no easy task. Vermiculite and mites are extremely lightweight, so even a 1 mph wind could push them off course by a yard at the height she’s working: 11.5 to 13 feet, high enough to take advantage of the downwash of the drone’s eight propellers without bounceback off the ground.
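A back-of-envelope estimate shows why so small a breeze matters. Light particles quickly settle to a roughly constant fall speed, so time aloft is about release height divided by settling speed, and crosswind drift is wind speed multiplied by that time. The settling speed below is an assumed, illustrative figure, not a measurement from the project.

```python
# Rough drift estimate under illustrative assumptions (not the team's model).
release_height_m = 12 * 0.3048       # ~12 ft release height, mid-range of 11.5 to 13 ft
settling_velocity_mps = 1.8          # assumed settling speed for a vermiculite/mite mix
wind_speed_mps = 1 * 0.447           # 1 mph crosswind

time_aloft_s = release_height_m / settling_velocity_mps   # ~2 s in the air
drift_m = wind_speed_mps * time_aloft_s                    # ~0.9 m, about a yard
print(f"time aloft ~{time_aloft_s:.1f} s, drift ~{drift_m:.2f} m")
```

With those numbers the payload lands roughly 0.9 meters downwind, about the yard of error described above, and propeller downwash only complicates the picture further.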
Her goal is to drop her predaceous payload within half a meter—1.6 feet—of her target. To do that, she has rewritten the drone’s software to account for wind. Now she’s tapping into machine learning to teach it how to predict what the wind will do next so the drone can fight the wind or ride it to save battery, and line up precisely for release. “We’re constantly monitoring the wind and we’re making predictions several seconds into the future,” says Ma. “Based on that, we can make the controller send a command to the drone so it’s more precise to control.”
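Ma’s actual predictor and controller aren’t spelled out here, so the sketch below stands in with the simplest version of the idea: extrapolate the crosswind a couple of seconds ahead from recent readings, then shift the release point upwind by the drift expected while the payload falls. Every name and number in it is illustrative.

```python
import numpy as np

def predict_wind(samples_mps, horizon_s, dt_s):
    """Stand-in for a learned wind model: fit a line to recent wind samples
    and extrapolate a few seconds into the future."""
    t = np.arange(len(samples_mps)) * dt_s
    slope, intercept = np.polyfit(t, samples_mps, 1)
    return slope * (t[-1] + horizon_s) + intercept

def release_offset(predicted_wind_mps, release_height_m, settling_velocity_mps=1.8):
    """Shift the release point upwind by the drift expected during the fall."""
    time_aloft_s = release_height_m / settling_velocity_mps
    return -predicted_wind_mps * time_aloft_s  # negative = move upwind

# Crosswind readings (m/s) sampled every 0.5 s, trending upward.
wind_history = [0.30, 0.35, 0.40, 0.42, 0.45]
wind_ahead = predict_wind(wind_history, horizon_s=2.0, dt_s=0.5)
offset_m = release_offset(wind_ahead, release_height_m=3.7)
print(f"predicted wind {wind_ahead:.2f} m/s, shift release {abs(offset_m):.2f} m upwind")
```

A real controller would fold this correction into the flight plan continuously rather than computing it once, but the offset captures the fight-the-wind-or-ride-it logic Ma describes.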
Huge strides. Basically, Ma is creating the pest control equivalent of the Norden bombsight, which helped turn the tide of World War II. In the 1940s, Norden’s technology was a giant step toward the computer age. Today, the UC Davis team’s diagnostic and delivery work promises huge strides toward the next era in integrated pest management (IPM) and farm automation. ‡
Note: This research is sponsored by the U.S. Department of Agriculture’s (USDA) Agricultural Marketing Service and the California Department of Pesticide Regulation, with partial funding from the American Floral Endowment, the Gloeckner Foundation, and the USDA-ARS Floriculture and Nursery Research Initiative.