Spoilt for data but can drones really help grape and wine production?

Dr Sigfredo Fuentes, Senior Lecturer in Digital Agriculture, Food and Wine, School of Agriculture and Food, Faculty of Veterinary and Agricultural Sciences

‘Drones, yeah right,’ you might groan. For so long they have promised much to viticulture and delivered dubious benefits. But at the dawn of a new IT revolution for agriculture, one of the field’s leading researchers is confident that’s all about to change, particularly as machine learning smartens up how vineyard managers and consultants can apply the data captured.

We invited University of Melbourne-based Senior Lecturer in Digital Agriculture, Food and Wine, Dr Sigfredo Fuentes, to share his insights.

Firstly, tell us about yourself. What are some of the differences between the industry in Australia and your native Chile?

I completed my Bachelor of Agricultural Sciences in Chile at The University of Talca, located in the 7th Region (Mediterranean climate). The 7th Region is predominantly agricultural, and is where 70% of Chilean wine exports originate.

I came to Australia in early 2000 and worked for two years ‘out the back of Bourke’ at Irrigation Technology, installing automatic meteorological stations and soil moisture networks. The 10-year drought had just started there, which dramatically affected agriculture and the town itself.

In 2002, I was granted a PhD scholarship from The University of Western Sydney (now Western Sydney University), specifically in Plant Physiology and Irrigation Science. I graduated in 2005 and worked as a postdoctoral fellow at The University of Technology, Sydney for two years. For my postdoctoral research, I studied the effects of climate change on forests, using whole-tree chambers (Richmond, NSW).

In 2008, I moved to Adelaide (The University of Adelaide, Waite Campus) for postdoctoral research on the effects of climate change on grapevines and wine. I started The Vineyard of the Future initiative to unite common research on these topics and the implementation of digital technologies. Since 2012, I have been working at The University of Melbourne, Faculty of Veterinary and Agricultural Sciences, as Senior Lecturer in Digital Agriculture, Food and Wine.


Is there a particular machine learning research project you are involved with? Where’s the research up to?

We have a few exciting projects and models developed that can be readily used in the industry. One of these tackles the problem of bushfires and smoke taint in wines. At the moment, growers do not have any practical real-time tools to assess levels of contamination in grapevine canopies and berries. The only control measures that can be applied are leaf plucking around bunches to avoid further contamination, and washing canopies, which has minimal effect. Furthermore, the only way to quantify contaminant levels in grapes is to collect samples and send them to a certified lab, which can be very costly and cannot capture the spatial variability present in the field.

We have determined, using remote sensing techniques (infrared thermography and near-infrared spectroscopy), that the pattern of transpiration within canopies changes when they are affected by smoke. This pattern can be recognised using machine learning algorithms (which learn from the data to assess a specific target) and deep learning models, which mimic the human eye to identify changes of pattern in imagery (thermography). With these techniques we have created models that can predict whether or not canopies have been contaminated by smoke, with accuracies between 85% and 91%, depending on the cultivar.
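The idea of labelling canopies from remote-sensing features can be sketched as a toy classifier. The features (mean canopy temperature from thermography and a near-infrared reflectance value), the numbers, and the nearest-centroid approach below are all hypothetical illustrations of the concept, not the published model:

```python
# Toy sketch: label a canopy as smoke-affected or control from two
# hypothetical remote-sensing features. Reduced transpiration under
# smoke leaves canopies warmer, which is the pattern exploited here.
import math

# Hypothetical training observations:
# (mean canopy temperature in deg C, near-infrared reflectance, label)
TRAINING = [
    (26.1, 0.82, "control"),
    (25.8, 0.80, "control"),
    (29.4, 0.71, "smoke"),
    (30.2, 0.69, "smoke"),
]

def centroids(rows):
    """Average each class's feature vector."""
    sums = {}
    for temp, nir, label in rows:
        s = sums.setdefault(label, [0.0, 0.0, 0])
        s[0] += temp; s[1] += nir; s[2] += 1
    return {lab: (s[0] / s[2], s[1] / s[2]) for lab, s in sums.items()}

def classify(temp, nir, cents):
    """Assign the class whose centroid is nearest in feature space."""
    return min(cents, key=lambda lab: math.dist((temp, nir), cents[lab]))

cents = centroids(TRAINING)
print(classify(29.8, 0.70, cents))  # a warm, low-NIR canopy -> "smoke"
```

A real pipeline would extract such features per canopy from thermal and spectral imagery and use a trained deep learning model rather than hand-set centroids.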


A recent experiment that we conducted in China (Ningxia) has resulted in models with 90% accuracy in detecting four different levels of contamination: i) no contamination, ii) low contamination, iii) medium and iv) high contamination. These models, implemented with UAV and remote sensing technology, can offer growers maps of contamination levels across their fields within one hour of a bushfire. Growers can then make informed decisions about differential harvesting and winemaking. Similar models were constructed using near-infrared spectroscopy on bunches to assess the levels of glycoconjugates in berries and wines (depending on winemaking techniques), with similarly high accuracies. These models can offer rapid discrimination of berry contamination on a harvesting line, and the implementation of artificial intelligence to obtain specific wine aromas and intensities.
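Turning per-cell predictions into a field map for differential harvest can be illustrated with a minimal sketch. The 0–1 contamination index, its thresholds, and the grid values below are hypothetical; the real models classify imagery directly:

```python
# Sketch: bin a hypothetical per-cell contamination index into the four
# classes named in the text, producing a grid map a grower could use
# to plan differential harvest.
LEVELS = ["none", "low", "medium", "high"]

def level(index):
    """Bin a hypothetical 0-1 contamination index into four classes."""
    if index < 0.25:
        return "none"
    if index < 0.50:
        return "low"
    if index < 0.75:
        return "medium"
    return "high"

def field_map(grid):
    """Classify every cell of a 2-D index grid."""
    return [[level(v) for v in row] for row in grid]

grid = [
    [0.05, 0.30, 0.80],
    [0.10, 0.60, 0.90],
]
for row in field_map(grid):
    print(row)
```

In practice each cell would correspond to a georeferenced patch of the orthomosaic, so the map aligns with vineyard rows.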

There is another project in which we developed machine learning models for Cabernet Sauvignon and Chardonnay to predict stem water potential (Ψs, in megapascals) at a plant-by-plant level with 85% accuracy. Stem water potential is usually measured with the pressure bomb, which, since its invention by Scholander in 1965, has been shown to be the most accurate method of assessing plant water status. However, it requires heavy instrumentation and gas cylinders. Furthermore, its operation requires specialised personnel, and measurements can be made only on sentinel plants, from which entire blocks and fields are then extrapolated. This does not represent at all the spatial variability that can be found in any vineyard, especially in Australia, due to microclimates or changes in soil physical and chemical characteristics. The machine learning models we developed require UAV flights with multispectral cameras, at a current cost of around AUD $9,000–10,000 plus GST. A flight of 15 minutes (the normal battery duration per flight) can cover 50 hectares. A fixed-wing UAV with a multispectral camera can fly for two hours, covering around 1,500 hectares. These are very rough figures that can change according to weather conditions (wind velocity) and the equipment used.
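The coverage figures quoted above lend themselves to a quick back-of-the-envelope flight planner. The 220 ha vineyard size is an arbitrary example, and the per-flight coverage is the rough figure from the text (which varies with wind and equipment):

```python
# Back-of-the-envelope planner using the rough figures in the text:
# a multirotor covers ~50 ha per 15-minute battery, while a fixed-wing
# UAV covers ~1,500 ha in a two-hour flight.
import math

def flights_needed(area_ha, ha_per_flight):
    """Number of flights (battery cycles) needed to cover an area."""
    return math.ceil(area_ha / ha_per_flight)

area = 220  # hypothetical vineyard size in hectares
print(flights_needed(area, 50))    # multirotor battery cycles -> 5
print(flights_needed(area, 1500))  # a fixed wing does it in one flight
```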

Dr. Sigfredo Fuentes has been particularly interested in the applications for UAVs when predicting smoke taint.

A third application we developed is the cheapest and easiest, but has big implications for operational logistics. It uses a cheap UAV with a normal high-definition camera, flying over the same areas mentioned before, to assess canopy architecture and the vigour of individual plants. We generated this model using machine learning to answer a common question for growers: how to locate and quantify missing plants, or plants affected by a pest or disease attack or a frost event. The model identifies single plants and quantifies the canopy architecture of each in terms of leaf area index, canopy porosity, canopy cover and clumping index. It generates maps of the results and a list with the row and location of target plants, which can be missing plants, plants with vigour reduced by a certain percentage, and so on.
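Two of the metrics named above, canopy cover and canopy porosity, can be sketched from a binary image mask. The tiny mask and the exact metric definitions below are illustrative assumptions; the real model works on UAV imagery and also estimates leaf area index and clumping index:

```python
# Sketch of two canopy-architecture metrics from a binary mask
# (1 = canopy pixel, 0 = gap/sky pixel).

def canopy_cover(mask):
    """Fraction of all pixels in the image that are canopy."""
    pixels = [p for row in mask for p in row]
    return sum(pixels) / len(pixels)

def canopy_porosity(mask):
    """Fraction of gap pixels inside the canopy's bounding box."""
    rows = [i for i, row in enumerate(mask) if any(row)]
    cols = [j for j in range(len(mask[0])) if any(row[j] for row in mask)]
    region = [mask[i][j]
              for i in range(rows[0], rows[-1] + 1)
              for j in range(cols[0], cols[-1] + 1)]
    return 1 - sum(region) / len(region)

mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
print(canopy_cover(mask))     # 3 canopy pixels of 16 -> 0.1875
print(canopy_porosity(mask))  # 1 gap of 4 bounding-box pixels -> 0.25
```

An all-zero mask at a plant's expected location is how a missing plant would show up in such a scheme.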

We are now working on a project funded by Wine Australia, a collaboration between The University of Melbourne and The University of Adelaide (A/Prof. Cassandra Collins and Dr Roberta De Bei), building on the capabilities mentioned to relate all the canopy architecture parameters to the quality traits of grapes. Hence, in the future we can generate maps of Brix, pH, TA, maturity based on berry cell death, and other important quality traits that can support harvest decisions and target specific wine styles. This project started this year and will end in 2020. In parallel, we will also test the capabilities of the VitiCanopy App, developed in a previous project, which can be used on smartphones and tablet PCs.

Who is going to adopt this technology?

As touched on earlier, some growers have had bad experiences with these technologies in the past, but I think that younger generations of growers are very open to these new and emerging technologies. I recently wrote an article entitled “Technology of bringing youth back to agriculture” in the Weekly Times. In this article I discussed how sensor technology, the Internet of Things (IoT), machine learning, robotics and artificial intelligence applied to agriculture are attracting the interest of new generations, who practically grew up surrounded by these technologies in the form of personal computers, digital games, smartphones and tablet PCs. In our plant physiology and agronomy classes at UoM, we are implementing the same models and apps developed for agriculture to train new generations of agronomists. This has potentially contributed to a 30% increase in enrolments in the Bachelor of Agricultural Science in 2017. The trend is the same in other countries offering agricultural science degrees that have implemented these technologies.

What’s likely the tipping point for widespread adoption of these technologies? Is it going to be for everyone?

I think that we are in the middle of this agricultural IT revolution, and we are at the tipping point for the implementation of UAVs, remote sensing, sensor networks and IoT using machine learning and artificial intelligence. There are already various ways that growers can access these services. Nowadays, even teenagers can easily fly drones in agricultural areas, provided the UAV weighs less than 2 kg, to comply with Civil Aviation Safety Authority (CASA) regulations. UAVs and the required instrumentation (cameras) are getting cheaper and more miniaturised. We have implemented a cloud system to which growers can upload their data, where it is processed and passed through the models to produce meaningful orthomosaics for different applications. This goes through the five stages of UAV remote sensing analysis discussed earlier. One company, HortEye, is doing exactly that in association with the UoM.

So, as you can see, these are exciting times to be working in agriculture. Despite the many constraints of climate change and shifting markets, we are working on state-of-the-art technologies to maintain the competitiveness of agricultural industries (small, medium and large) in national and international markets.

All associated articles from different trials, experiments and models can be found and downloaded from:

To read more about Dr Fuentes and his work visit