Staff Publications

    'Staff publications' is the digital repository of Wageningen University & Research

    'Staff publications' contains references to publications authored by Wageningen University staff from 1976 onward.

    Publications authored by the staff of the Research Institutes are available from 1995 onwards.

Full text documents are added when available. The database is updated daily and currently holds about 240,000 items, of which 72,000 are open access.

We have a manual that explains all the features.

Records 1 - 20 / 143

Alert query: metisnummer==1016794
Cutting Hedge Technology
Hemming, Jochen - 2019
Angle estimation between plant parts for grasp optimisation in harvest robots
Barth, Ruud ; Hemming, Jochen ; Henten, Eldert J. Van - 2019
Biosystems Engineering 183 (2019). - ISSN 1537-5110 - p. 26 - 46.
Agriculture - Angle estimation - Computer vision - Robotics - Semantic segmentation

For many robotic harvesting applications, the position and angle between plant parts are required to optimally position the end-effector before attempting to approach, grasp and cut the product. A method for estimating the angle between plant parts, e.g. stem and fruit, is presented to support the optimisation of grasp pose for harvest robots. The hypothesis is that this angle in the horizontal plane can be accurately derived from colour images under unmodified greenhouse conditions. It was further hypothesised that the locations of a fruit and stem could be inferred in the image plane from sparse semantic segmentations. The paper focussed on 4 sub-tasks for a sweet-pepper harvesting robot. Each task was evaluated under 3 conditions: laboratory, simplified greenhouse and unmodified greenhouse. The requirements for each task were based on the end-effector design, which required a 25° positioning accuracy. In Task I, colour image segmentation into the classes background, fruit, and stem plus wire was performed, meeting the requirement of an intersection-over-union > 0.58. In Task II, the stem pose was estimated from the segmentations. In Task III, the centres of the fruit and stem were estimated from the output of the previous tasks. Both centre estimations, in Tasks II and III, met the requirement of 25 pixel accuracy on average. In Task IV, the centres were used to estimate the angle between the fruit and stem, meeting the accuracy requirement of 25° in 73% of the cases. The work improved harvest performance by increasing the success rate from a theoretical 14% to 52% in practice under unmodified conditions.
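The geometric core of the abstract, deriving an angle in the image plane from two estimated 2D centres and checking it against the 25° tolerance, can be sketched as follows. This is an illustrative simplification, not the authors' implementation; the function names and the centre-point inputs are assumptions.

```python
import math

def grasp_angle(stem_centre, fruit_centre):
    """Angle (degrees) of the stem-to-fruit vector in the image plane.

    Hypothetical helper: takes two (x, y) centres, e.g. estimated from
    semantic segmentations, and returns the in-plane angle between them.
    """
    dx = fruit_centre[0] - stem_centre[0]
    dy = fruit_centre[1] - stem_centre[1]
    return math.degrees(math.atan2(dy, dx))

def within_tolerance(estimated, ground_truth, tol=25.0):
    """Check the 25-degree positioning requirement, wrapping around 360."""
    diff = abs(estimated - ground_truth) % 360.0
    return min(diff, 360.0 - diff) <= tol
```

For example, a fruit directly to the right of the stem gives `grasp_angle((0, 0), (1, 0)) == 0.0`, and an estimate of 80° against a ground truth of 90° passes the 25° check.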

Paprika oogstrobot in de race voor Techtransfer award
Hemming, J. - 2019
Onder Glas 16 (2019)4. - p. 35 - 35.
Protocol for semi-automatic identification of whiteflies Bemisia tabaci and Trialeurodes vaporariorum on yellow sticky traps
Moerkens, Rob ; Brenard, Nathalie ; Bosmans, Lien ; Reybroeck, Eva ; Janssen, Dirk ; Hemming, Jochen ; Sluydts, Vincent - 2019
Journal of Applied Entomology 143 (2019)6. - ISSN 0931-2048 - p. 652 - 658.
Greenhouse crops - image analysis - integrated pest management - pest monitoring

Yellow sticky traps (YSTs) are commonly used in greenhouse crops to monitor flying pest species. Whiteflies like Trialeurodes vaporariorum (Westwood) (Hemiptera: Aleyrodidae) and Bemisia tabaci (Gennadius) (Hemiptera: Aleyrodidae) are typically monitored using YSTs in tomato and sweet pepper crops. By counting the whiteflies on a YST, growers get an idea of the pest density in space and time in the greenhouse and can take pest control measures accordingly. The downside is that manually counting whiteflies on a YST is very time-consuming and thus costly. A protocol to semi-automate the counting and identification of whiteflies on YSTs using image analysis software was developed to speed up the monitoring process. Bemisia tabaci is on average smaller than T. vaporariorum, and by discriminating by size based on the number of pixels in digital images, the ratios of both species in a mixed population on YSTs could be estimated accurately. At low densities, the counts from different YSTs should be pooled until a threshold of 200 insects is reached in order to obtain accurate ratio estimates of both species. This study provides a protocol to reliably count and identify whiteflies semi-automatically on standardised pictures. More research is required to develop alternative techniques for making standardised pictures in the field (e.g., with a smartphone).
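The size-based discrimination step of the protocol can be sketched as below. The `size_threshold` parameter and the blob-size inputs are assumptions, not values from the paper; only the 200-insect pooling threshold follows the abstract.

```python
def species_ratio(blob_sizes, size_threshold):
    """Estimate the fraction of B. tabaci (the smaller species) on pooled traps.

    blob_sizes: pixel counts of detected whitefly blobs from one or more
    pooled yellow sticky traps. size_threshold: assumed pixel-area cut-off
    separating the two species (the paper derives this from its own data).
    Returns None until at least 200 insects are pooled, mirroring the
    protocol's density threshold for reliable ratio estimates.
    """
    if len(blob_sizes) < 200:
        return None  # pool more traps before estimating
    small = sum(1 for s in blob_sizes if s < size_threshold)
    return small / len(blob_sizes)
```

With 150 small and 150 large blobs pooled, the helper returns a 0.5 ratio; with only 50 blobs it declines to estimate.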

Raw data from Yellow Sticky Traps with insects for training of deep learning Convolutional Neural Network for object detection
Nieuwenhuizen, A.T. ; Hemming, J. ; Janssen, Dirk ; Suh, H.K. ; Bosmans, L. ; Sluydts, V. ; Brenard, N. ; Rodríguez, E. ; Tellez, M.D.M. - 2019
Convolutional Neural Network (CNN) - deep learning - greenhouse whitefly - insects - Macrolophus - Macrolophus pygmaeus - Nesidiocoris - object detection - sticky trap - Trialeurodes vaporariorum - whitefly - yellow sticky trap
Insects were collected on yellow sticky traps hanging in commercial greenhouses. The following insects were annotated on the yellow sticky traps: whitefly, Macrolophus and Nesidiocoris. The dataset is suited to training deep learning convolutional neural networks for object detection.
Tomato plant spider mite damage images
Nieuwenhuizen, A.T. ; Kool, Janne ; Hemming, J. ; Bosmans, L. ; Brenard, N. ; Janssen, D. ; Sluydts, V. ; Suh, H.K. - 2019
deep learning - disease detection - spider mite - tomato plant
Image dataset of tomato plants in a Belgian greenhouse. The tomato plants were infested with spider mites, and some of the leaves show spider mite damage. Image locations with spider mite damage were annotated by human experts.
Synthetic bootstrapping of convolutional neural networks for semantic plant part segmentation
Barth, R. ; IJsselmuiden, J. ; Hemming, J. ; Henten, E.J. Van - 2019
Computers and Electronics in Agriculture 161 (2019). - ISSN 0168-1699 - p. 291 - 304.
Big data - Bootstrapping - Computer vision - Semantic segmentation - Synthetic dataset
A current bottleneck of state-of-the-art machine learning methods for image segmentation in agriculture, e.g. convolutional neural networks (CNNs), is the requirement of large manually annotated datasets on a per-pixel level. In this paper, we investigated how related synthetic images can be used to bootstrap CNNs for successful learning as compared to other learning strategies. We hypothesise that a small manually annotated empirical dataset is sufficient for fine-tuning a synthetically bootstrapped CNN. Furthermore we investigated (i) multiple deep learning architectures, (ii) the correlation between synthetic and empirical dataset size on part segmentation performance, (iii) the effect of post-processing using conditional random fields (CRF) and (iv) the generalisation performance on other related datasets. For this we have performed 7 experiments using the Capsicum annuum (bell or sweet pepper) dataset containing 50 empirical and 10,500 synthetic images with 7 pixel-level annotated part classes. Results confirmed our hypothesis that only 30 empirical images were required to obtain the highest performance on all 7 classes (mean IOU = 0.40) when a CNN was bootstrapped on related synthetic data. Furthermore we found optimal empirical performance when a VGG-16 network was modified to include à trous spatial pyramid pooling. Adding CRF only improved performance on the synthetic data. Training binary classifiers did not improve results. We have found a positive correlation between dataset size and performance. For the synthetic dataset, learning stabilises around 3000 images. Generalisation to other related datasets proved possible.
Trimbot cutting tools and manipulator
Hemming, Jochen ; Tuijl, Bart Van ; Tielen, Toon ; Kaljaca, Dejan ; IJsselmuiden, Joris ; Henten, Eldert Van ; Mencarelli, Angelo ; Visser, Pieter De - 2018
In: Applications of Intelligent Systems - Proceedings of the 1st International APPIS Conference 2018, APPIS 2018. - IOS Press (Frontiers in Artificial Intelligence and Applications) - ISBN 9781614999287 - p. 89 - 93.
Agriculture - End-effector - Path planning - Pruning - Robot - Trimming

This article describes the tasks and first results of the work package "Manipulator and Control" of the EU project Trimbot2020. This project develops a mobile robot for outdoor hedge, rose and bush trimming. The Kinova Jaco 2 robotic arm was selected as manipulator. Two different types of robotic end-effectors have been developed. The tool for trimming topiaries uses two custom designed circular contra-rotating blades. The tool for single stem cutting is based on a commercial electrical pruner. The arm and the tools can all be controlled by using the Robot Operating System (ROS). The motion planning algorithm of the arm for the bush trimming action is divided into the planning setup module, the coverage planning module and the trajectory planning module. The path planning is modelled as a traveling salesman problem. In the first phase of the project the trimming control is performed open loop. A positioning genetic algorithm was developed that minimizes the needed number of vehicle poses for one target object. In the next phase of the project a vision feedback mechanism will be implemented.
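The ordering of trimming waypoints, which the project models as a travelling salesman problem, could be approximated with a nearest-neighbour heuristic like the sketch below. This is only an illustrative baseline, not the Trimbot2020 planner, which also uses a genetic algorithm to minimise vehicle poses.

```python
import math

def nearest_neighbour_tour(points, start=0):
    """Greedy nearest-neighbour ordering of 2D trimming waypoints.

    points: list of (x, y) waypoints on the bush surface (assumed input).
    Returns a list of indices visiting each waypoint once, always moving
    to the closest unvisited point - a classic TSP heuristic, cheap but
    not optimal.
    """
    unvisited = set(range(len(points)))
    tour = [start]
    unvisited.remove(start)
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour
```

For waypoints `[(0, 0), (5, 0), (1, 0), (2, 0)]` the heuristic visits them in order `[0, 2, 3, 1]`, sweeping left to right instead of jumping back and forth.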

Sweeper plukt paprika's: (maar nog niet snel genoeg)
Balendonck, Jos ; Hemming, Jochen - 2018
Automation and robotics in the protected environment, current developments and challenges for the future
Hemming, J. - 2018
Wageningen : Wageningen University & Research - 6 p.
Trimbot2020 : An outdoor robot for automatic gardening
Strisciuglio, Nicola ; Tylecek, Radim ; Blaich, Michael ; Petkov, Nicolai ; Biber, Peter ; Hemming, Jochen ; Henten, Eldert van ; Sattler, Torsten ; Pollefeys, Marc ; Gevers, Theo ; Brox, Thomas ; Fisher, Robert B. - 2018
In: 50th International Symposium on Robotics, ISR 2018. - VDE Verlag GmbH - ISBN 9781510870314 - 1 p.

Robots are increasingly present in modern industry and also in everyday life. Their applications range from health-related situations, such as assistance to elderly people or surgical operations, to automatic and driverless vehicles (on wheels or flying) and driving assistance. Recently, interest in robotics applied to agriculture and gardening has arisen, with applications to automatic seeding and cropping or to plant disease control. Autonomous lawn mowers are successful market applications of gardening robotics. In this paper, we present a novel robot that is developed within the TrimBot2020 project, funded by the EU H2020 programme. The project aims at prototyping the first outdoor robot for automatic bush trimming and rose pruning.

Volledig geautomatiseerde oogst van paprika’s is op dit moment een toekomstdroom: mens moet plukrobot nog handje helpen
Hemming, Jochen - 2018
Automation and robotics in the protected environment, current developments and challenges for the future
Hemming, Jochen - 2018
Aktuelle Entwicklungen der Robotik und Automatisierung im Gewächshaus
Hemming, Jochen - 2018
Robotereinsatz zur automatischen Ernte von Paprika
Hemming, Jochen - 2018
Detection and classification of insects on stick-traps in a tomato crop using Faster R-CNN
Nieuwenhuizen, A.T. ; Hemming, J. ; Suh, H.K. - 2018
- 4 p.
In this paper we present the method and performance for detecting tomato whitefly and its predatory bugs on yellow sticky traps. The traps were imaged under controlled light conditions with a digital single-lens reflex camera and in an uncontrolled environment with a smartphone camera. The method consists of the following steps: first, image subsetting and data labelling by manual annotation; second, training a deep learning convolutional neural network; third, classification of the images; and finally, comparison with hand-counted insect data. The weighted average accuracy for deep-learning-detected insects was 87.4%. The correlation of hand-counted insects with deep-learning-counted insects was over 0.95 for the smartphone images. The methods show that training data from controlled conditions could be transferred to uncontrolled smartphone imaging conditions for the data provided.
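The final evaluation step, comparing detector counts against hand counts via correlation, amounts to a plain Pearson correlation. The dependency-free helper below is illustrative only; the paper does not describe its exact computation.

```python
def pearson_r(xs, ys):
    """Pearson correlation between hand counts and detector counts per trap.

    xs, ys: equal-length sequences of insect counts (assumed inputs).
    Returns a value in [-1, 1]; values near 1 indicate the detector's
    counts track the manual counts closely.
    """
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

A detector that consistently counts twice the manual total still correlates perfectly (r = 1.0), which is why correlation is reported alongside, not instead of, accuracy.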
Monitoringsysteem voor spint en uitlezen gele vangplaten
Hemming, J. - 2018
Onder Glas 15 (2018)3. - p. 43 - 43.
Monitoring system for spider mite damage and yellow sticky traps: PeMaTo-EuroPep Project
Hemming, J. ; Suh, H.K. - 2018
Precisietechnologie Tuinbouw: PPS Autonoom onkruid verwijderen : D2.4 Literatuurstudie spectrale reflectie-eigenschappen van planten en onkruiden; D2.5 Lab en veldexperimenten spectrale reflectie-eigenschappen van planten en onkruiden
Blok, Pieter ; Hemming, Jochen ; Holterman, Henk-Jan ; Michielsen, Jean-Marie ; Ruizendaal, Jos - 2018
Bleiswijk : Wageningen Plant Research, Business unit Glastuinbouw (Rapport WPR 751) - 118 p.
This report contains the two deliverables of the research project “autonomous weed removal” that deal with hyper- and multispectral weed detection. The literature of the past 10 to 15 years gives sufficient indications that various plant species can be distinguished well on the basis of their spectral characteristics. For the hyperspectral lab measurements, various crops and weeds were cultivated. The reflection spectrum of all plants was measured between 400 and 1000 nm and between 900 and 1700 nm at different growing stages in the laboratory. In particular, the reflection in the chlorophyll range (650-670 nm), the green range (around 550 nm), the red edge (700 nm) and the near-infrared (800 nm) shows discriminative power between the crops and weeds studied. With field measurements it was investigated whether green weeds can be detected in a green lettuce crop using hyperspectral camera images. On the two measurement days, the classification accuracy fell 6.9% and 9.9%, respectively, below the previously set target value of 90%. In order to exclude the effect of using different cameras on the test result, a comparative follow-up study is recommended.
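The band selection described above can be sketched as a feature extractor that samples a measured reflectance spectrum at the wavelengths the report found most discriminative. The dictionary representation and nearest-wavelength lookup are assumptions for illustration, not the report's processing pipeline.

```python
def band_features(spectrum, wavelengths):
    """Sample reflectance at the discriminative bands named in the report.

    spectrum: dict mapping wavelength (nm) to measured reflectance
    (assumed data format). Returns reflectances nearest to the green
    (550 nm), chlorophyll absorption (660 nm), red-edge (700 nm) and
    near-infrared (800 nm) bands, ready for use as classifier features.
    """
    def nearest(target):
        # pick the measured wavelength closest to the target band
        w = min(wavelengths, key=lambda x: abs(x - target))
        return spectrum[w]

    return {band: nearest(band) for band in (550, 660, 700, 800)}
```

A downstream classifier would then compare these four features between crop and weed spectra, exploiting e.g. the strong chlorophyll absorption dip around 660 nm.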