新疆农业科学 (Xinjiang Agricultural Sciences), 2018, Vol. 55, Issue 3: 548-555, 8. DOI: 10.6048/j.issn.1001-4330.2018.03.018
Research on Area Information Extraction of Cotton Field Based on UAV Visible Light Remote Sensing
Abstract
[Objective] This study applies an object-oriented image classification method to extract cotton planting information from UAV visible light remote sensing imagery, in the hope of providing a new method for extracting large-scale farmland information and improving both the speed and the accuracy of classification. [Method] Visible light images of the 135th Regiment farm of the Eighth Division, Xinjiang Production and Construction Corps, were acquired with a fixed-wing UAV equipped with a camera. On the eCognition software platform, the object-oriented method was used to extract cotton planting information in the study area. [Result] The cotton planting area extracted by visual interpretation was 0.35 km², and that extracted by the object-oriented approach was 0.33 km². The results showed that this method could effectively extract the cotton planting area in the study area, with a classification accuracy of 94.29% and an error of 5.71%. [Conclusion] Compared with traditional pixel-based classification methods, object-oriented classification of UAV visible light imagery achieves higher extraction accuracy and results much closer to visual interpretation.
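The reported accuracy figures follow directly from the two area estimates. A minimal sketch, assuming "classification accuracy" here is defined as one minus the relative area error against the visual-interpretation reference (a definition consistent with the numbers in the abstract; the function name is illustrative):

```python
def area_accuracy(extracted_km2: float, reference_km2: float) -> tuple[float, float]:
    """Return (accuracy %, error %) of an extracted area against a reference area.

    Assumed definition: error is the absolute area difference relative to the
    reference (here, visual interpretation); accuracy is its complement.
    """
    error = abs(extracted_km2 - reference_km2) / reference_km2
    return round((1 - error) * 100, 2), round(error * 100, 2)

# Object-oriented extraction (0.33 km^2) vs. visual interpretation (0.35 km^2)
accuracy, error = area_accuracy(0.33, 0.35)
print(accuracy, error)  # 94.29 5.71
```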
Keywords: eCognition / object-oriented / UAV visible light remote sensing / cotton / planting information
Category: Agricultural Science and Technology
Citation: 李路曼, 郭鹏, 张国顺, 周倩, 吴锁智. Research on Area Information Extraction of Cotton Field Based on UAV Visible Light Remote Sensing [J]. 新疆农业科学, 2018, 55(3): 548-555, 8.
Funding
International S&T Cooperation Program of China (2015DFA11660)
National Undergraduate Training Program for Innovation and Entrepreneurship (201710759064)
Student Research Training Project (SRP2017212)