RoboWeedMaPS

How deep learning can help farmers get rid of weeds

Researchers can already take high-resolution images of weeds, even when driving at 50 km/h in the field. The aim is to make the entire spraying process as autonomous as possible.


By using a number of digital technologies, it is now possible to map which types of weeds farmers are struggling with and exactly where they are located in the fields. This can mean a significant reduction in the amount of herbicides used.

To combat weeds, farmers have been spraying their fields with herbicides for decades. Until now, however, they have used a standard mixture of sprays spread over the entire field. This is not a smart approach, financially, environmentally or biologically: herbicides are expensive, can affect water quality in streams and groundwater, and drive resistance in the weeds. Not to mention that a standard herbicide mixture may work well against some weed species but far less well against others. And with a new EU directive stating that farmers should not spray before first inspecting their fields and determining the types of weeds present, in accordance with the principles of integrated plant protection, there is a pressing need for a smart solution.

But is it at all possible to make the spraying process smart? This is what researchers at Aarhus University have been studying since 2012, and they have succeeded by combining cameras, Big Data and machine learning with an existing decision support system (PlanteVærn Online).

“Currently, farmers don’t know what sort of weeds they have in their fields, and this knowledge is necessary, not only for choosing the right herbicide but also in relation to the regulations stipulated by the EU. But farmers don’t have time to go round the fields and find out which weeds are growing there. This is where we can help with our technique, which can map out the problem and thereby provide the basis for saving a considerable amount of spray,” says Senior Researcher Rasmus Nyholm Jørgensen.

Purple or green weeds

The technique involves attaching a row of high-resolution cameras to a wide bar on a tractor, or mounting them directly on the farmer’s implements in the field. Alternatively, a consumer drone can be used. As the farmer drives along the vehicle tracks in the field, each camera takes a large number of photos. Using the recorded positions of the images, the photos can be pieced together into a comprehensive map of the crops and weeds in the field.
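A minimal Python sketch of that mapping step, assuming each photo is tagged with its position in field coordinates; the Frame structure, the species names and the 2 m grid size are illustrative assumptions, not the project’s actual data model:

    from dataclasses import dataclass

    @dataclass
    class Frame:
        x_m: float          # position along the field in metres (e.g. from GNSS)
        y_m: float          # position across the field in metres
        weed_counts: dict   # e.g. {"Chenopodium album": 3, "Poa annua": 12}

    def build_weed_map(frames, cell_size_m=2.0):
        """Aggregate per-frame weed counts into a coarse field grid."""
        grid = {}
        for f in frames:
            cell = (int(f.x_m // cell_size_m), int(f.y_m // cell_size_m))
            cell_counts = grid.setdefault(cell, {})
            for species, n in f.weed_counts.items():
                cell_counts[species] = cell_counts.get(species, 0) + n
        return grid

    # Two frames that fall in the same 2 m grid cell are merged into one entry.
    frames = [Frame(10.3, 4.1, {"Chenopodium album": 2}),
              Frame(11.0, 4.8, {"Chenopodium album": 1, "Poa annua": 5})]
    print(build_weed_map(frames))   # {(5, 2): {'Chenopodium album': 3, 'Poa annua': 5}}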

The images are automatically uploaded to the cloud, where several specialised algorithms analyse the composition of the weeds relative to the competitive ability of the crops. This is where Big Data comes into the picture, because the whole system relies on an enormous weed database that the researchers use, via deep learning, to teach the computer to recognise the different types of weeds.
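As a rough illustration of the analysis step, the sketch below classifies uploaded photos with an already-trained network and tallies the weed composition. The weights file, the species list and the ResNet-18 backbone are assumptions for illustration only, not the project’s actual system:

    import torch
    from torchvision import models, transforms
    from PIL import Image

    SPECIES = ["Chenopodium album", "Poa annua", "Stellaria media"]   # example labels only

    model = models.resnet18(weights=None)
    model.fc = torch.nn.Linear(model.fc.in_features, len(SPECIES))
    model.load_state_dict(torch.load("weed_classifier.pt"))           # hypothetical weights file
    model.eval()

    tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])

    def weed_composition(image_paths):
        """Classify each uploaded photo and tally how often each species is seen."""
        counts = {s: 0 for s in SPECIES}
        with torch.no_grad():
            for path in image_paths:
                x = tfm(Image.open(path).convert("RGB")).unsqueeze(0)
                pred = model(x).argmax(dim=1).item()
                counts[SPECIES[pred]] += 1
        return counts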

Although it might sound simple, automating the process is actually quite difficult, particularly because weeds in the field hardly ever resemble the specimens in botanical reference books.

“The problem with recognising weeds is that they change shape. It only takes a small beetle to eat a leaf and the plant doesn’t look like the one in the image at all. Or the stems can be so thin that, in the image, it looks as though the leaves aren’t connected. And if it’s cold in spring, some weeds turn completely purple even though they’re normally green,” says Jørgensen, pointing out that the computer therefore has to chew through a considerable amount of data in order to distinguish each individual type of weed from the others.
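One common way to expose a network to this kind of variation, with damaged leaves, thin disconnected-looking stems and unusual colouring, is heavy data augmentation of the training images. The article does not say this is how the project handles it, so the snippet below is only a generic torchvision illustration:

    from torchvision import transforms

    augment = transforms.Compose([
        transforms.RandomResizedCrop(224, scale=(0.6, 1.0)),   # partially visible plants
        transforms.RandomHorizontalFlip(),
        transforms.ColorJitter(brightness=0.3, contrast=0.3,
                               saturation=0.4, hue=0.1),        # e.g. cold-weather purple tints
        transforms.ToTensor(),
        transforms.RandomErasing(p=0.5, scale=(0.02, 0.15)),    # leaves partly eaten by insects
    ])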

He compares this with learning to drive a car. “Using the pedals and gears doesn’t come automatically and subconsciously, so until you’ve been out driving a lot, you feel uncertain. But if you’ve tried a car simulator beforehand, these basic things become automatic and subconscious, just like for pilots. And that’s what we’re doing in our project: we train the neural network on artificial images in the simulator before releasing it into the real world,” he says.
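A hedged sketch of that “simulator first” idea: pre-train the network on synthetic weed images, then fine-tune the same network on real field photos. The folder names, the ResNet-18 backbone and the epoch counts are illustrative assumptions, not the project’s actual training recipe:

    import torch
    from torch import nn
    from torchvision import datasets, models, transforms

    def make_loader(root):
        tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
        return torch.utils.data.DataLoader(
            datasets.ImageFolder(root, transform=tfm), batch_size=32, shuffle=True)

    def train(model, loader, epochs, lr):
        optimiser = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        model.train()
        for _ in range(epochs):
            for images, labels in loader:
                optimiser.zero_grad()
                loss_fn(model(images), labels).backward()
                optimiser.step()

    model = models.resnet18(weights=None)
    model.fc = nn.Linear(model.fc.in_features, 27)   # the 27 weed types mentioned in the article

    train(model, make_loader("synthetic_weeds"), epochs=10, lr=1e-3)    # the "car simulator" stage
    train(model, make_loader("real_field_photos"), epochs=3, lr=1e-4)   # real-world fine-tuning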

Making life easier

The project has now reached the stage where the cameras can take high-resolution images with a resolution of 4 pixels per millimetre, even when driving at 50 km/h in the field, and where the computer has satisfactorily learned to recognise 27 types of weeds from a database containing thousands of images. The computer is being trained on far more weed species, but for these the number of training images is still too small for reliable recognition.
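Those figures imply a very short exposure time: at 50 km/h the camera moves one pixel width (0.25 mm at 4 pixels per millimetre) in roughly 18 microseconds, so the shutter or flash has to be extremely fast to keep motion blur below a pixel. A back-of-the-envelope check, not taken from the article:

    speed_mm_per_s = 50 / 3.6 * 1000          # 50 km/h ≈ 13 889 mm/s
    pixel_size_mm = 1 / 4                     # 4 pixels per millimetre → 0.25 mm per pixel
    max_exposure_s = pixel_size_mm / speed_mm_per_s
    print(f"{max_exposure_s * 1e6:.0f} microseconds")   # ≈ 18 microseconds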

The aim is to make everyday life easier for farmers: the computer itself finds where the weeds are located, identifies what type they are, and determines which herbicide should be used at precisely that spot in the field. The computer will thus control the dosage when farmers spray their fields, and can even switch between different herbicides and dosages depending on the type of weed, an important part of future smart farming.
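A simplified sketch of how such a weed map could be turned into a spraying plan: choose a herbicide and scale the dose per grid cell according to the dominant weed and the weed pressure. The product names, thresholds and doses below are placeholders; in the project this decision is guided by the decision support system, not by a toy lookup table:

    HERBICIDE_FOR = {                                   # placeholder lookup table
        "Chenopodium album": ("Product A", 0.5),        # (product, litres/ha at full pressure)
        "Poa annua":         ("Product B", 1.0),
    }

    def prescription(weed_map, threshold=5):
        """Return a per-cell (product, dose) plan; skip cells with too few weeds to treat."""
        plan = {}
        for cell, counts in weed_map.items():
            total = sum(counts.values())
            if total < threshold:
                continue                                 # leave this cell unsprayed
            dominant = max(counts, key=counts.get)
            product, full_dose = HERBICIDE_FOR.get(dominant, ("Standard mix", 1.0))
            pressure = min(total / 20, 1.0)              # scale dose with observed weed pressure
            plan[cell] = (product, round(full_dose * pressure, 2))
        return plan

    # Example, reusing the grid produced by build_weed_map above:
    print(prescription({(5, 2): {"Chenopodium album": 3, "Poa annua": 5}}))
    # {(5, 2): ('Product B', 0.4)}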

 

 

PROJECT FACTS

Project title
RoboWeedMaPS - Automated Weed Detection, Mapping and Variable Precision Control of Weeds

Schedule
2017–2020

Financial Framework
DKK 34.6 million
Innovation Fund Denmark

Project partners
AgroIntelli ApS
IPM Consult ApS
Datalogisk A/S
Danfoil A/S
I•GIS ApS

Image caption: A survey of weeds in a field. Each individual dot represents an image; put together, the images provide a very precise indication of the types of weeds causing havoc in particular parts of the field.