
Introduction to Raster Analysis

Introduction

This lab utilized ArcGIS Pro and concepts from a GIS fundamentals textbook to explore raster-based analysis techniques, including resampling, filtering, and cost surface modeling. The datasets consisted of digital elevation models (DEMs) and related spatial data from southeastern Minnesota and northern California, all projected in NAD83 UTM coordinates with elevation values in meters. The primary objective was to understand how to integrate datasets of differing resolutions, improve data quality through filtering, and apply raster analysis tools to solve real-world problems such as terrain visualization and optimal site selection based on cost factors.

Resources Used

ArcGIS Pro
GIS Fundamentals Textbook

Process

For this lab we were provided a series of datasets: stream valleys and DEMs of a portion of southeastern Minnesota, raster elevation grids, a DEM of northern California, and a layer of the roads in Duluth, Minnesota. All of these datasets were in NAD83 UTM coordinates with Z units in meters; the Minnesota data were in zone 15 and the California data in zone 11.

Raster Resampling, Combination, and Filtering

Our first task was learning how to combine DEMs of different cell sizes. One DEM had a 3-meter cell size and the other a 9-meter cell size. The first step was to create hillshades for both datasets. After creating these hillshades, we could see that the 3-meter DEM had noticeably greater definition than the 9-meter DEM.
To join these DEMs, we needed to convert the datasets to a common resolution. We brought the 9-meter data to a 3-meter cell size using the Resample tool in ArcGIS Pro. We then used the Raster Calculator to combine the two datasets with the IsNull and Con raster functions. Once the rasters were joined, 'noisy' data appeared, visible as white points or pits.
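The resample-and-combine step can be sketched with a NumPy equivalent of the IsNull/Con logic. The array values and the nearest-neighbor resampling below are illustrative assumptions, not the lab's actual data or tool settings:

```python
import numpy as np

# Hypothetical stand-ins: a 3 m DEM with one NoData gap (NaN) and a
# 9 m DEM covering the same footprint with a single coarse cell.
fine = np.array([[100.0, 101.0, 102.0],
                 [103.0, np.nan, 104.0],
                 [105.0, 106.0, 107.0]])
coarse_9m = np.array([[110.0]])

# Resample the 9 m grid to the 3 m cell size (nearest-neighbor, factor 3).
coarse_3m = np.repeat(np.repeat(coarse_9m, 3, axis=0), 3, axis=1)

# Con(IsNull(fine), coarse, fine): where the fine DEM is NoData,
# take the resampled coarse value; otherwise keep the fine value.
combined = np.where(np.isnan(fine), coarse_3m, fine)
```

The same conditional pattern is what the Raster Calculator expression expresses cell by cell: fill gaps from the coarser surface while preserving the 3-meter detail everywhere else.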
To fix this, we applied a low-pass filter to the DEM using the raster Filter function. After running the filter, we created new hillshades for the layer and compared the filtered and unfiltered layers. The comparison showed that after filtering, the layers lose fine detail.
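A low-pass filter of this kind is essentially a 3x3 neighborhood mean. The sketch below is a generic mean filter, not the exact ArcGIS Pro implementation, and the elevation values are made up to show how a single noisy spike gets smoothed:

```python
import numpy as np

def low_pass(dem):
    """3x3 mean (low-pass) filter with edge padding; a sketch of the
    Filter tool's LOW option, not the exact ArcGIS algorithm."""
    p = np.pad(dem, 1, mode="edge")
    out = np.zeros_like(dem, dtype=float)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out += p[1 + di : 1 + di + dem.shape[0],
                     1 + dj : 1 + dj + dem.shape[1]]
    return out / 9.0

dem = np.array([[100.0, 100.0, 100.0],
                [100.0, 190.0, 100.0],   # one noisy spike/pit cell
                [100.0, 100.0, 100.0]])
smoothed = low_pass(dem)  # spike pulled from 190 toward its neighbors
```

Averaging suppresses the isolated outlier, but it also blurs legitimate sharp features, which is exactly the loss of fine detail visible in the filtered hillshade.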
Next, using the Raster Calculator, we calculated the difference between the two layers and produced another hillshade of this difference. We then replaced cells in the original image wherever they differed from the filtered surface by more than 15 meters. This removed much of the noisy data while still maintaining detail. Finally, we created one last hillshade with a sun altitude of 35 degrees and modeled shadows, and set its transparency to 50% over the new smoothed elevation data.
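The selective-replacement step, where only cells differing from the filtered surface by more than 15 meters are swapped out, can be sketched as a single conditional. The arrays here are hypothetical; only the 15-meter threshold comes from the lab:

```python
import numpy as np

# Hypothetical surfaces: one cell in `original` is a noisy outlier.
original = np.array([[100.0, 160.0],
                     [102.0, 101.0]])
filtered = np.array([[100.5, 110.0],
                     [101.0, 100.5]])

# Keep the original value unless it differs from the low-pass surface
# by more than 15 m, in which case take the filtered value instead.
diff = original - filtered
cleaned = np.where(np.abs(diff) > 15.0, filtered, original)
```

This hybrid approach is why the result keeps its detail: most cells retain their original values, and only the large-magnitude outliers are smoothed.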

Cost Surface

For this project, we created a cost surface for siting a building. The cost surface depended on slope and distance to existing roads, and we assigned construction costs per meter of road required. We then calculated a distance cost and a total cost to ensure the project stays within budget.
Using the Slope function, we computed percent slope from a DEM and then used the Raster Calculator to apply an exponential function to each cell value. We also used the Con function to place a ceiling on values over 300. Finally, we used a min-max stretch to rescale the slope values into a $0–$25,000 slope-cost range.
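The slope-cost chain can be sketched as follows. The exponential's exact form and the sample slope values are assumptions; only the ceiling of 300 and the $0–$25,000 range come from the lab write-up:

```python
import numpy as np

# Hypothetical percent-slope values for four cells.
slope_pct = np.array([0.0, 5.0, 20.0, 60.0])

# Exponential penalty on slope (assumed functional form).
cost = np.exp(slope_pct / 10.0)

# Con(): cap runaway values at a ceiling of 300.
cost = np.where(cost > 300.0, 300.0, cost)

# Min-max stretch into the $0-$25,000 slope-cost range.
slope_cost = 25000.0 * (cost - cost.min()) / (cost.max() - cost.min())
```

The exponential makes steep cells disproportionately expensive, the ceiling keeps extreme slopes from dominating the stretch, and the min-max rescale puts everything on the dollar scale used for the rest of the cost model.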
Using Euclidean Distance, we created a distance raster with a cell size of 3 and a planar distance method. We then calculated the distance cost as $25 per meter using the Raster Calculator. Next, we added the distance and slope costs to obtain the total surface cost. We then used permanent reclassification to convert this range of values into a new raster, which resulted in a mask layer.
To get the final cost, we used the raster calculator to multiply the mask by the total costs.
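The distance-cost, total-cost, and masking steps can be combined in one small sketch. The grid, road positions, flat placeholder slope cost, and budget threshold below are all hypothetical; the 3-meter cell size and the $25-per-meter rate come from the lab:

```python
import numpy as np

CELL = 3.0                           # 3 m cell size (from the lab)
roads = {(0, 0), (0, 1), (0, 2)}     # hypothetical road cells, top row

rows, cols = 3, 3
dist = np.empty((rows, cols))
for i in range(rows):
    for j in range(cols):
        # Planar Euclidean distance to the nearest road cell, in meters.
        dist[i, j] = CELL * min(np.hypot(i - ri, j - rj)
                                for ri, rj in roads)

dist_cost = 25.0 * dist                       # $25 per meter from a road
slope_cost = np.full((rows, cols), 1000.0)    # placeholder slope-cost surface
total_cost = dist_cost + slope_cost

# Reclassify into a 0/1 mask (cells under a hypothetical budget), then
# multiply back so costs survive only where building is feasible.
mask = (total_cost <= 1100.0).astype(float)
final_cost = mask * total_cost
```

Multiplying by the 0/1 mask is a common trick for the final step: over-budget cells drop to zero while feasible cells keep their dollar values, leaving a surface that directly highlights candidate building sites.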

Summary

Throughout the lab, DEMs of varying resolutions (3-meter and 9-meter) were resampled and combined to create a unified elevation surface. This process introduced noise, which was mitigated using low-pass filtering and conditional raster operations to balance detail and smoothness. Hillshades were used to visually assess differences between datasets at each stage. Additionally, a cost surface analysis was conducted to determine optimal building locations based on slope and distance to roads. By applying raster functions such as slope, Euclidean distance, and raster calculator operations, a total cost surface was generated and refined using reclassification and masking techniques. Overall, the lab demonstrated how raster processing methods can be used to manage data resolution tradeoffs, improve surface quality, and support spatial decision-making.
