

CURATOR
A pinboard by
Alex Olsen

PhD student, College of Science and Engineering, James Cook University

PINBOARD SUMMARY

Giving weed control robots sharper eyes

Environmental weeds are plants that invade native ecosystems and adversely affect the survival of indigenous flora and fauna. These include foreign plants introduced accidentally or intentionally, as well as native plants that have become invasive through inappropriate management or by spreading into habitats where they do not naturally occur.

In pastoral lands, weeds invade crops, smother pastures and occasionally poison livestock. In a 2012 survey conducted by Landcare Australia, weed and pest control was ranked as the most significant land management problem by nearly half of Australia's primary producers.

Weed species recognition remains a major obstacle to the development and industry acceptance of robotic weed control technology. All weed control robots need to find weeds in order to kill them. The focus of my research is to enhance the effectiveness of weed spraying robots by developing new image recognition algorithms and technologies to improve their ability to detect weeds under realistic rangeland conditions.

Detecting weeds using machine vision is simple in the highly controlled environment of intensive cropping, where the land is flat, the vegetation is homogeneous, and the light conditions can be controlled with external lighting, shading or by restricting the time of operation. However, for rangeland and rough pastures the problem is far more difficult. Many different species of weeds and native plants may be present in the same scene, all at varying distances from the camera, all experiencing different levels of lighting and shading, and with some weeds occluded by other vegetation. This presents a number of issues for imaging and identification, including the depth-of-field and dynamic-range limitations of camera systems.

Past experience in these difficult environments has indicated that conventional image analysis techniques based on leaf colour, shape or texture are not sufficient, and new systems are required. The goal of my research is to develop fully tested recognition systems, using a range of imaging and spectrometric properties, which can be applied to any robotic platform.
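To make concrete what "conventional image analysis" means here, the sketch below shows a classic colour-index baseline: Excess Green (ExG) segmentation followed by Otsu thresholding. This is only an illustrative example of the kind of technique the summary describes, not the author's method; the input file name is hypothetical, and OpenCV/NumPy are assumed to be available.

```python
# Illustrative conventional baseline: Excess Green (ExG = 2g - r - b)
# colour-index segmentation with an Otsu threshold. This kind of approach
# separates green vegetation from soil, but it cannot distinguish weed
# species from desirable plants under variable rangeland lighting.
import cv2
import numpy as np

def excess_green_mask(bgr: np.ndarray) -> np.ndarray:
    """Return a binary vegetation mask for a BGR image via the ExG index."""
    b, g, r = cv2.split(bgr.astype(np.float32) / 255.0)
    total = b + g + r + 1e-6                      # avoid division by zero
    exg = 2.0 * (g / total) - (r / total) - (b / total)
    # Rescale to 8-bit and let Otsu pick the vegetation/background split.
    exg_u8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(exg_u8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask

image = cv2.imread("pasture_scene.jpg")           # hypothetical input image
vegetation = excess_green_mask(image)
```

A mask like this answers "where is the green material?" but not "which plant is it?", which is why the research above looks to more discriminative recognition systems.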

The main contributions of this research will be: the publication of methods to reliably detect significant Australian weeds, adaptable to any agricultural vehicle or terrain; and the creation of the first public image dataset of some important Australian weed species for testing new detection methods in the future.

6 ITEMS PINNED

Convolutional Neural Network-Based Robot Navigation Using Uncalibrated Spherical Images.

Abstract: Vision-based mobile robot navigation is a vibrant area of research with numerous algorithms having been developed, the vast majority of which either belong to the scene-oriented simultaneous localization and mapping (SLAM) or fall into the category of robot-oriented lane-detection/trajectory tracking. These methods suffer from high computational cost and require stringent labelling and calibration efforts. To address these challenges, this paper proposes a lightweight robot navigation framework based purely on uncalibrated spherical images. To simplify the orientation estimation, path prediction and improve computational efficiency, the navigation problem is decomposed into a series of classification tasks. To mitigate the adverse effects of insufficient negative samples in the "navigation via classification" task, we introduce the spherical camera for scene capturing, which enables 360° fisheye panorama as training samples and generation of sufficient positive and negative heading directions. The classification is implemented as an end-to-end Convolutional Neural Network (CNN), trained on our proposed Spherical-Navi image dataset, whose category labels can be efficiently collected. This CNN is capable of predicting potential path directions with high confidence levels based on a single, uncalibrated spherical image. Experimental results demonstrate that the proposed framework outperforms competing ones in realistic applications.
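The core idea in this abstract, recasting navigation as classification of a single panorama into discrete heading directions, can be sketched as a small CNN classifier. The sketch below assumes PyTorch; the layer sizes, input resolution and number of heading classes (8) are hypothetical placeholders, not details taken from the paper.

```python
# Minimal sketch of "navigation via classification": a CNN maps one
# uncalibrated spherical panorama to logits over discrete heading directions.
import torch
import torch.nn as nn

class HeadingClassifier(nn.Module):
    def __init__(self, num_headings: int = 8):      # hypothetical class count
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                 # pool to one vector per image
        )
        self.classifier = nn.Linear(64, num_headings)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: batch of panorama images, shape (N, 3, H, W)
        h = self.features(x).flatten(1)
        return self.classifier(h)                    # per-heading logits

# Example forward pass on a dummy panorama-shaped input.
model = HeadingClassifier()
logits = model(torch.randn(1, 3, 128, 256))
probs = torch.softmax(logits, dim=1)                 # confidence per heading
```

Framing the problem this way trades continuous trajectory estimation for a cheap classification step, which is the computational saving the abstract emphasises.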

Pub.: 13 Jun '17, Pinned: 31 Jul '17