omidashkriz7@mail.com
Abstract:
The growth of settlements and human activities in floodplains, especially along river banks and in flood-prone areas, has increased the assets exposed to this hazard. Determining the extent of flooded areas is therefore essential for risk-reduction planning, preparedness, response, and recovery. The present study applies common machine learning algorithms to the classification of Sentinel-2 images to produce land cover maps, delineate flooded areas, and identify the land uses affected by the March 2018 flood in the city of Aqqala. To evaluate and improve the accuracy of the algorithms, three spectral indices derived from the images were also used: vegetation (NDVI), water (MNDWI), and built-up land (NDBI). Different parameter settings of each algorithm were evaluated by cross-validation to assess their effect on the accuracy of the results and to prevent optimistic bias arising from spatial correlation between the training and test samples. The results show that combining the different indices increases the overall accuracy of the algorithms; for producing land cover maps, the random forest algorithm, owing to its ensemble approach, achieved higher accuracy (83.08%) and better generalizability than the support vector machine (79.11%) and the neural network (75.44%). After identifying the most accurate algorithm, the flood-zone map was produced with the random forest algorithm in two classes, inundated and non-inundated land; the overall accuracy of the algorithm in the most optimal model, obtained by including the water index (MNDWI), was 93.40%. Finally, by overlaying the land cover and flood-zone maps, the flooded area of built-up land was estimated at 4.2008 km², and that of agricultural land and green space at 41.0772 km².
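The index-based classification workflow summarized above can be sketched as follows. This is a minimal illustration only: the reflectance arrays are synthetic stand-ins for Sentinel-2 bands (B3 green, B4 red, B8 NIR, B11 SWIR), the land-cover labels are placeholders, and the hyperparameters are assumptions, so the sketch shows the mechanics of index computation and cross-validated random forest classification rather than the study's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-pixel Sentinel-2 surface reflectance
# (a real workflow would read these from the imagery).
n = 500
green = rng.uniform(0.01, 0.4, n)  # B3
red   = rng.uniform(0.01, 0.4, n)  # B4
nir   = rng.uniform(0.01, 0.6, n)  # B8
swir  = rng.uniform(0.01, 0.5, n)  # B11

# Spectral indices named in the abstract.
ndvi  = (nir - red) / (nir + red)        # vegetation
mndwi = (green - swir) / (green + swir)  # water
ndbi  = (swir - nir) / (swir + nir)      # built-up land

# Stack raw bands and indices as features; placeholder labels
# stand in for the land-cover training samples.
X = np.column_stack([green, red, nir, swir, ndvi, mndwi, ndbi])
y = rng.integers(0, 3, n)

# Cross-validation of a random forest, as in the abstract's
# accuracy assessment (fold count and tree count are assumptions).
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```

In practice, spatially blocked cross-validation folds (rather than random folds) are what prevents the optimistic bias from spatial correlation that the abstract mentions.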
Type of Study: Research | Subject: Special
Received: 2023/02/27 | Accepted: 2024/11/2