Advances in Water Science (水科学进展), 2023, 34(5): 766-775. DOI: 10.14042/j.cnki.32.1309.2023.05.011
Mountain river bathymetry inversion method based on remote sensing data and its application
Abstract
River bathymetry (RB) is a fundamental dataset in the field of river research. However, mountainous regions often lack comprehensive data due to topographical and transportation challenges. Remote sensing technology provides an innovative way to estimate RB. In this study, the theoretical relationship between water level and river width was established by generalizing the channel cross-section shape, and a novel RB estimation method was proposed that integrates the Hydroweb dataset and Sentinel-1 images. The impacts of exposure rate, reach-average length, and remote sensing observation errors on estimation accuracy were systematically analyzed. The method was applied to the Upper Yangtze River to evaluate its potential for estimating river discharge. Results reveal that: ① The estimation error of the riverbed elevation ranges from 4.00 m to 4.06 m, and the estimated cross-section represents 73.69% to 80.29% of the actual area, indicating accurate RB estimation. ② The exposure rate emerges as the primary factor that significantly enhances estimation accuracy. An appropriate reach-average length improves estimation precision, and an optimal length of 10 km is advised for the Upper Yangtze River. Furthermore, the accuracy of RB estimation is more susceptible to water level errors in remote sensing observations than to river width errors. ③ The method demonstrates the potential to estimate river discharge, achieving a Nash-Sutcliffe efficiency coefficient of 0.92. The research outcome provides a novel approach to RB monitoring in data-scarce regions.
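To make the idea concrete, the sketch below illustrates one way the water level-width relationship could be used to infer riverbed elevation. It is a minimal illustration, not the paper's exact formulation: it assumes a generalized power-law cross-section W = a(h - z_b)^b, fits it to paired water-level and width observations (water levels standing in for Hydroweb values, widths for Sentinel-1-derived values), reads the riverbed elevation z_b off the fit, and integrates the fitted profile to approximate the cross-section area. The function name, the power-law form, and the sample numbers are illustrative assumptions.

```python
# Minimal sketch (assumed power-law cross-section, not the paper's exact method):
# fit W = a * (h - z_b)**b to paired water-level / width observations and
# recover the riverbed elevation z_b from the fitted parameters.
import numpy as np
from scipy.optimize import curve_fit

def width_from_level(h, a, b, z_b):
    """Generalized cross-section: width as a power law of flow depth (h - z_b)."""
    return a * np.clip(h - z_b, 1e-6, None) ** b

# Hypothetical reach-averaged observations: water levels (m, e.g. from Hydroweb)
# paired with river widths (m, e.g. extracted from Sentinel-1 images).
h_obs = np.array([158.2, 159.1, 160.5, 162.0, 163.8, 165.4])
w_obs = np.array([310.0, 335.0, 370.0, 402.0, 438.0, 466.0])

# Initial guess: bed elevation a few metres below the lowest observed level.
p0 = [60.0, 0.5, h_obs.min() - 5.0]
(a, b, z_b), _ = curve_fit(width_from_level, h_obs, w_obs, p0=p0, maxfev=10000)
print(f"estimated riverbed elevation z_b = {z_b:.2f} m")

# Cross-section area below the highest observed level: the integral of a*d**b
# over depth d from 0 to D has the closed form a * D**(b+1) / (b+1).
D = h_obs.max() - z_b
area = a * D ** (b + 1) / (b + 1)
print(f"estimated cross-section area up to level {h_obs.max():.1f} m: {area:.0f} m^2")
```

The closed-form depth integral avoids numerical quadrature and makes explicit how the fitted exponent controls the estimated cross-section area; with real data, the fit would be repeated per reach and the reach-average length chosen as discussed in the abstract.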
Key words
river bathymetry / multisource remote sensing / water level-width relationship / discharge estimation / the Upper Yangtze River
Classification
Architecture and Water Conservancy
Citation
吴剑平, 杜洪波, 李文杰, 万宇, 肖毅, 杨胜发. Mountain river bathymetry inversion method based on remote sensing data and its application[J]. Advances in Water Science (水科学进展), 2023, 34(5): 766-775.
Funding
National Natural Science Foundation of China (52079013)
Chongqing Outstanding Youth Science Foundation (cstc2021jcyj-jqX0009)
The study is financially supported by the National Natural Science Foundation of China (No. 52079013) and the Natural Science Foundation of Chongqing, China (No. cstc2021jcyj-jqX0009).