Researchers led by Professor Salah Sukkarieh at the Australian Centre for Field Robotics at the University of Sydney in Australia have developed an autonomous vision-based robotic system that vegetable farmers could use to optimize the yield of their crops. The novel design can conduct farm surveillance and mapping as well as detect and classify a variety of vegetables.
The development of the so-called Ladybird robotic system was the result of joint funding the researchers received from Horticulture Innovation Australia, a research, development and marketing organization for the horticulture industry, and AUSVEG, an organization that represents the interests of Australian growers to the country’s government.
Starting from scratch
In the early stages of the development process, Professor Sukkarieh and his team considered a number of approaches to automating the farming process. Initially, they considered retrofitting an existing piece of farm equipment, such as a tractor, with more intelligent systems. However, this approach was eventually discarded in favor of developing a robotic system from scratch, which could provide farmers with a lower-cost platform that could then be adapted to meet their individual needs.
“To create a robot that would be ecologically friendly, we rejected the idea of using a diesel engine to power the vehicle, deciding instead to build a battery powered vehicle that could be recharged in the field from a set of curved solar panels mounted on ‘wings’ on each side of the robot,” said Professor Sukkarieh.
The curved solar panels enable the Ladybird robot to capture sunlight effectively as the location of the sun in the sky varies and changes with the seasons. Aside from charging the bank of lithium iron phosphate (LiFePO4) batteries on the vehicle, the wing-mounted solar panels can also be raised and lowered to accommodate crops of different heights. The wings also provide a degree of shading under the system, so that the vision systems mounted on the underside of the robot can capture images of the crops without being affected by changes in ambient lighting conditions.
The drive train on the Ladybird robot features four identical, modular electric drive units. Each has two mechanically decoupled axes to orient and drive the wheels. According to Professor Sukkarieh, this not only provides the robot with a great deal of maneuverability but also enables the developers to quickly redeploy the same drive train architecture in vehicles with more than four wheels should the need arise. The efficiency of the drive trains and the wheel-mounted electric motors enables the Ladybird to move autonomously through crops at about three miles per hour for between seven and nine hours on battery power alone, depending on the tasks it is performing.
“On a sunny day, the drive train is efficient enough so that the solar energy that is captured is enough to recharge the batteries, enabling the robot to operate indefinitely. In addition, the batteries will provide another seven hours of nighttime operation if needed, providing the ability for the system to work almost 24 hours a day,” said Professor Sukkarieh.
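The “operate indefinitely” claim is an energy-balance statement: the solar power harvested by the wings must meet or exceed the drive-train draw. The sketch below illustrates that balance with invented numbers; the panel area, cell efficiency and power draw are assumptions for illustration, not published Ladybird specifications.

```python
# Energy-balance sketch for a solar-recharged field robot.
# All numeric values below are illustrative assumptions.

def solar_input_w(panel_area_m2, irradiance_w_m2, panel_efficiency):
    """Electrical power harvested by the wing-mounted panels, in watts."""
    return panel_area_m2 * irradiance_w_m2 * panel_efficiency

def can_run_indefinitely(solar_w, drive_draw_w):
    """True when harvested power meets or exceeds the drive-train draw."""
    return solar_w >= drive_draw_w

# Assumed values: ~4 m^2 of panels, midday sun (~1000 W/m^2), 20% cells.
harvest = solar_input_w(4.0, 1000.0, 0.20)   # 800 W harvested
print(can_run_indefinitely(harvest, drive_draw_w=600.0))  # prints True
```

With the assumed numbers, the 800 W harvested exceeds a 600 W draw, so the batteries recharge while the robot works; at night the same draw runs down the stored charge instead.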
To enable the Ladybird to navigate autonomously across a field, the robot was fitted with a NovAtel GPS inertial navigation system. As the Ladybird moves, forward- and rear-facing lidar units, together with a Point Grey Ladybug 3 spherical camera, also capture data on the surroundings, enabling a Neousys Nuvo-3005E on-board processor to identify and avoid obstacles as well as detect the rows of crops.
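One simple way to picture how lidar returns could reveal crop rows is to histogram the lateral offsets of the returns: bins with many hits mark row centerlines. This is only an illustrative sketch under that assumption; the article does not describe the actual on-board row-detection pipeline.

```python
# Toy crop-row detection from 2-D lidar returns, given as
# (forward x, lateral y) points in meters.
from collections import Counter

def detect_rows(points, bin_width=0.25, min_hits=3):
    """Histogram the lateral (y) offsets of lidar returns; bins with
    at least min_hits returns are taken as candidate row centerlines."""
    bins = Counter(round(y / bin_width) for _, y in points)
    return sorted(b * bin_width for b, n in bins.items() if n >= min_hits)

# Returns clustered near y = 0.0 m and y = 0.75 m suggest two rows.
scan = [(1.0, 0.02), (1.5, -0.03), (2.0, 0.01),
        (1.0, 0.74), (1.5, 0.76), (2.0, 0.73)]
print(detect_rows(scan))  # prints [0.0, 0.75]
```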
Capturing spectral fingerprints
Under the hood of the Ladybird, a set of three sensors capture data as the robot traverses a field. While a Point Grey Bumblebee 2 camera is used to capture RGB images of the crops, a Resonon Pika II hyperspectral imaging camera captures both infrared and ultraviolet data. The variety of spectral information from the two cameras enables the system not only to identify the shape and color of the crops, but also to capture their spectral fingerprints, which can provide valuable data on their health. Lastly, a Sick LD-MRS laser sensor enables the system to determine the height of the crops above the ground, delivering yet another measure of their physical condition.
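A standard health index that can be computed from exactly this kind of spectral data is the Normalized Difference Vegetation Index (NDVI), which exploits the fact that healthy vegetation reflects strongly in the near-infrared and absorbs red light. The article does not say which index the team uses; this is an illustrative example of a spectral health measure, with invented reflectance values.

```python
# NDVI: a common vegetation-health index derived from red and
# near-infrared reflectances (each between 0 and 1).

def ndvi(nir, red):
    """Normalized Difference Vegetation Index in [-1, 1]; higher
    values indicate denser, healthier vegetation."""
    return (nir - red) / (nir + red)

print(ndvi(0.6, 0.1))    # vigorous plant: NDVI well above 0.5
print(ndvi(0.3, 0.25))   # stressed plant: NDVI near zero
```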
To enable the on-board computer to classify the types of plants in the crop, as well as provide data on their health, the system must first be trained using sets of data previously collected in the field. According to Professor Sukkarieh, this is achieved using one of a number of machine learning algorithms, such as neural networks, support vector machines and Gaussian mixture models. Which algorithm is most effective depends heavily on the nature of the crop itself.
“The machine learning algorithm chosen to identify a particular crop type must first be presented off-line with the combined data from the imaging systems. Once the algorithm has been given a set of training examples, the training process builds a model that assigns new examples it is presented with into categories. The system can then be deployed in the field, where it will carry out the classification of the crop autonomously on the Ladybird robot,” said Professor Sukkarieh.
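That train-offline-then-classify-in-the-field workflow can be sketched with a toy nearest-centroid classifier standing in for the neural networks, SVMs and Gaussian mixture models the team evaluates. The feature vectors here are invented for illustration (e.g. a spectral index and a plant height in meters); the real system learns from the combined imaging data described above.

```python
# Offline training and in-field classification, sketched with a
# nearest-centroid model. Labels and feature values are invented.

def train(examples):
    """Offline step: build a per-class mean feature vector (centroid)
    from labeled (label, features) training examples."""
    grouped = {}
    for label, feats in examples:
        grouped.setdefault(label, []).append(feats)
    return {label: [sum(col) / len(col) for col in zip(*rows)]
            for label, rows in grouped.items()}

def classify(model, feats):
    """In-field step: assign the class whose centroid is nearest
    (squared Euclidean distance) to the new feature vector."""
    return min(model, key=lambda label: sum(
        (a - b) ** 2 for a, b in zip(model[label], feats)))

training = [("spinach", [0.7, 0.10]), ("spinach", [0.6, 0.12]),
            ("weed",    [0.4, 0.05]), ("weed",    [0.5, 0.04])]
model = train(training)
print(classify(model, [0.65, 0.11]))  # prints spinach
```

The same two-function shape holds for the heavier models the article names: the expensive fitting happens off the robot, and only the cheap, fixed model runs on board.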
Individual plant control
To control the weeds in the field and to fertilize the crops, the Ladybird robot was fitted with a spraying end effector attached to a Universal Robots UR5 six-axis robot arm. When the machine learning algorithm has identified a weed, the coordinates of the weed are transferred to the robotic system, which then positions itself directly over it. Once in position, a small, controllable volume of herbicide is sprayed exactly where it is required. Alternatively, if the machine learning software identifies a plant in poor health through an analysis of its spectral characteristics, the robotic arm can deliver a dose of fertilizer directly to the plant.
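The detect-then-spray handoff amounts to mapping each detection from image coordinates into the arm's workspace and attaching a metered dose. The sketch below uses a hypothetical scale-and-offset calibration (`scale_m_per_px`, `origin`) as a simple stand-in for a full camera-to-arm calibration, and an assumed 0.5 ml dose; none of these values come from the article.

```python
# Sketch of turning weed detections (image pixels) into spray
# targets (arm-frame meters plus a dose). Calibration values are
# hypothetical stand-ins for a real camera-to-arm calibration.

def pixel_to_arm(px, py, scale_m_per_px=0.001, origin=(0.2, -0.1)):
    """Map image pixel coordinates to arm-frame meters using a
    calibrated scale and offset."""
    return (origin[0] + px * scale_m_per_px,
            origin[1] + py * scale_m_per_px)

def spray_plan(detections, dose_ml=0.5):
    """Build (x_m, y_m, dose_ml) targets, one per detected weed."""
    return [(*pixel_to_arm(px, py), dose_ml) for px, py in detections]

# Two weeds detected at pixel positions (100, 50) and (420, 310):
print(spray_plan([(100, 50), (420, 310)]))
```

In the real system the UR5 would then be commanded to each (x, y) target in turn before the nozzle fires.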
This targeted means of delivering both herbicide and fertilizer allows the volume of both chemicals to be reduced significantly. Indeed, estimates suggest that less than one hundredth of the chemicals would be required compared with conventional blanket spraying, benefiting the environment and significantly reducing costs for the farmer.
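As back-of-envelope arithmetic, the claimed factor-of-100 reduction scales directly with field size. The per-hectare blanket rate below is an assumed figure for illustration; only the 1/100 factor comes from the article.

```python
# Back-of-envelope chemical savings from targeted spraying.
# The blanket application rate is an illustrative assumption.

def targeted_volume(blanket_volume_l, reduction_factor=100):
    """Volume needed when spraying only detected targets, given the
    blanket-spray volume and the claimed reduction factor."""
    return blanket_volume_l / reduction_factor

# A 50-hectare field blanket-sprayed at an assumed 300 L/ha:
blanket = 50 * 300.0                 # 15000 L under blanket spraying
print(targeted_volume(blanket))      # prints 150.0
```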
Moving toward commercialization
Since the development of the Ladybird system, the researchers have applied its key design concepts to two new robots: RIPPA (Robot for Intelligent Perception and Precision Application) and VIIPA (Variable Injection Intelligent Precision Applicator), bringing the ideas first demonstrated in the Ladybird one step closer to commercialization.
From a research perspective, Professor Sukkarieh said that his team is now developing new algorithms that will enable the robots to identify the key features of a variety of crop types and determine what action to take to keep the plants healthy in the field.
Written by David Wilson, Senior Editor, Novus Light Technologies Today