Wednesday, December 20, 2017

Processing Pix4D Imagery with GCPs

Introduction:
The purpose of this activity was to compare the accuracy of UAS imagery processed in Pix4D with Ground Control Points (GCPs) against the imagery from the previous week, which was processed without them.  A GCP is, according to Pix4D, "a characteristic point whose coordinates are known.  GCPs are used to georeference a project and reduce the noise." Pix4D recommends incorporating GCPs because tying images to known ground coordinates allows the software to adjust the image block properly and project the data accurately in a 3D model.
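Pix4D's quality report expresses GCP accuracy as a root-mean-square of the residuals between each point's surveyed coordinates and the coordinates the reconstructed model assigns to it.  A minimal Python sketch of that calculation (the coordinate values below are invented for illustration, not taken from this project):

```python
import math

def gcp_rmse(surveyed, estimated):
    """Root-mean-square error (meters) between surveyed GCP coordinates
    and the coordinates the photogrammetric model places them at."""
    sq = [(sx - ex) ** 2 + (sy - ey) ** 2 + (sz - ez) ** 2
          for (sx, sy, sz), (ex, ey, ez) in zip(surveyed, estimated)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical UTM easting/northing/elevation triples for three checkpoints
surveyed  = [(617708.8, 4958257.8, 247.0), (617930.7, 4957946.9, 241.5), (617619.8, 4958049.2, 244.2)]
estimated = [(617708.9, 4958257.7, 247.2), (617930.6, 4957946.8, 241.3), (617619.9, 4958049.3, 244.1)]
print(round(gcp_rmse(surveyed, estimated), 3))  # → 0.224
```

A sub-meter RMSE like this is what well-marked GCPs typically buy you; without them, the absolute position of the whole model can drift by meters.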

Methods: 
To start out, the imagery gathered from the DJI Phantom 4 drone was uploaded to Pix4D.  When brought in, the shutter method for imagery capture was set to "rolling shutter".  After the images were all imported, users were asked to navigate to the "GCP/MTP Manager" to import 16 individual GCPs collected at the Litchfield mine.  Once these were brought in, they were displayed as blue crosses atop the red circles that represent image locations, displayed below in figure 1.
Figure 1
After the GCPs were imported, the Basic GCP/MTP Editor was used to mark the exact center of each GCP location in order to ensure the highest degree of location accuracy and representation.  Figure 2 below shows an example of a GCP being previewed for marking. 
Figure 2
This process was repeated for the next 15 GCPs.  Following that, the point cloud, mesh, index, DSM, and orthomosaic steps were all run.  Upon completion, a quality report was generated, shown in figure 3 below.
Figure 3
After the processing was completed, the pinpoint accuracy of the GCP marking can be seen in the point cloud imagery in figure 4 below.
Figure 4
Results:
Once processed, the GeoTIFF file created in Pix4D can be brought into ArcMap.  Figure 5 below shows a hillshade model of the GCP UAS imagery.
Figure 5
It was then compared to the previous imagery that didn't utilize GCPs to assess overall accuracy (shown in figure 6 below).  The biggest distinguishing factor between the two was elevation: notice how the max value reads 247 meters for the GCP map, while the non-GCP map reads 108 meters.
  

Sunday, December 10, 2017

Processing Pix4D Imagery

Introduction:
Pix4D is professional drone mapping software that uses images to create orthomosaics, point clouds, and 3D models.  In this exercise, students processed aerial images shot from a DJI Phantom 4 drone at 80 meters altitude over the Litchfield Mine located in Eau Claire, WI (taken on 9/30/16).  Pix4D was used to create a DSM and an orthomosaic of the mine site.

Methods:
The overlap required for this imagery varies with the terrain: areas of higher relief need less overlap than flat areas, which need more. The recommended overlap is a minimum of 75% frontal overlap and at least 60% side overlap. If the user is flying over sand, snow, or uniform fields, there will be limited visual content because of the large uniform areas. Since the Litchfield mine is mostly uniform, a high overlap was selected (Figure 1).
Figure 1: Flight path, images captured, and photos processed in Pix4D
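Those overlap percentages translate directly into capture spacing: the ground footprint of a single image at a given altitude, multiplied by one minus the overlap fraction, gives the distance between exposures (frontal) or between flight lines (side).  A rough sketch, using assumed nominal Phantom 4 sensor dimensions (6.17 x 4.55 mm) and focal length (3.61 mm) rather than values from the lab:

```python
def capture_spacing(altitude_m, sensor_dim_mm, focal_mm, overlap):
    """Spacing between exposures (or flight lines) that yields the
    requested overlap fraction at a given flight altitude."""
    footprint_m = altitude_m * sensor_dim_mm / focal_mm  # ground coverage of one image
    return footprint_m * (1.0 - overlap)

along_track = capture_spacing(80, 4.55, 3.61, 0.75)  # 75% frontal overlap
cross_track = capture_spacing(80, 6.17, 3.61, 0.60)  # 60% side overlap
print(round(along_track, 1), round(cross_track, 1))  # → 25.2 54.7
```

At 80 m, then, the drone would trigger roughly every 25 m and fly lines roughly 55 m apart; a more uniform surface argues for tightening both numbers, as noted above.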


Rapid check is an alternative processing method that is faster but less accurate.  In order to process multiple flights together, the pilot needs to maintain a consistent flight height.  GCPs aren't essential for Pix4D, but they are strongly recommended if one wants to measure elevation points with high precision and accuracy.

The first step involves processing the data with the Pix4DMapper Pro application.  Students went to Create a New Project to carry this out, making a new folder location.  This was followed by taking the provided data from Professor Hupy and importing it into Pix4D from the created folder.  Figure 2 below shows the resulting pop-up screen once the destination folder is selected.  The Phantom 4 camera is also switched from a global shutter to a rolling shutter. To properly display the DSM, 3D Maps was also chosen.
Figure 2 - Image Properties of Litchfield Mine Survey
       
After the camera properties were set and the images to be processed were selected, the initial processing step was run before the point cloud/mesh and DSM/orthomosaic/index steps. This ensured the data would process correctly without having to wait for the processing time of the other functions. A quality report was then generated to preview outputs and visualize the image overlap and the images used along the flight path (Figure 3).

Figure 3 - Previews of the orthomosaic and DSM outputs generated in the quality report
After this process was completed, the point cloud and mesh process and DSM, orthomosaic, and index process were performed. The results were then imported into ArcMap to be displayed as maps.

Results:
Figure 4 shows the orthomosaic output of the 197 photos taken along the UAS flight path. This is a high-resolution product because of the flight elevation and the high photo overlap chosen for the uniform texture of the Litchfield mine.
Figure 4 - Orthomosaic imagery from UAS of the Litchfield mine
Figure 5 then shows the digital surface model, using a stretched color scheme to symbolize elevation. The UAS sensor records elevation in the image EXIF data as height above the ellipsoid, rather than the height above mean sea level used by most other software that symbolizes elevation.
Figure 5 - DSM elevation overlay of the imagery mosaic
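The ellipsoid-versus-MSL difference is a per-location offset: orthometric height H (above mean sea level) equals ellipsoidal height h minus the geoid undulation N.  A sketch, where the undulation value is an assumed round figure for the Eau Claire area rather than a looked-up one:

```python
def orthometric_height(ellipsoidal_h_m, geoid_undulation_m):
    """Convert ellipsoidal height to height above mean sea level: H = h - N."""
    return ellipsoidal_h_m - geoid_undulation_m

# Assumed: the geoid sits roughly 32 m below the ellipsoid near Eau Claire (N ≈ -32 m)
print(orthometric_height(247.0, -32.0))  # → 279.0
```

This is why DSMs built from raw EXIF heights can disagree by tens of meters with MSL-referenced datasets until the geoid correction is applied.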
Conclusion:
Point cloud imagery allows cost-effective 3D modelling and is gaining popularity across many sectors because UAS is highly customizable to achieve a wide variety of goals. For this project, the UAS point cloud data was used for volumetric analysis to support asset inventory. In order for the modelling to be accurate, attention to collection methods such as flight elevation and overlap is essential. These should be planned according to the objective before data collection. The methods and platform specifications should be recorded as metadata to assure accurate results. Flight elevation, shutter format, platform used, date recorded, and other factors all play into the model produced and should be considered in project design and accounted for during processing.

Monday, December 4, 2017

Lab 10: Visualizing and Refining Terrain Survey

Introduction:
This lab builds on a previous exercise performed earlier in the semester, in which students molded a 115 x 115 cm sandbox to produce a hill, a ridge, a plain, a depression, and a valley.  After the terrain was molded, students created a grid system using pins and strings in an effort to normalize the data.  576 sample points of the elevation model were recorded so the data could be normalized, which was essential to successfully project it.  Data normalization, according to Esri, can be defined as "the process of organizing, and cleaning data to increase efficiency for data use and sharing."  The data points gathered from the sandbox were entered into a table in x,y,z (elevation) format so they could be used in ArcMap, where they were converted into a grid to display the elevation of the individual points gathered within the sandbox.

The sampling method chosen for these points was a systematic sampling technique, which was found to be the most accurate and effective way to gather individual points on the grid: it records measurements at consistent intervals at specific sampling points.  The group created equal intersections with the string, using equal intervals of five centimeters along the x and y axes of the sandbox.  The interpolation procedures in this lab help visualize this data by displaying 3D models of the sandbox.  This series of maps represents the entire surface of the sandbox based solely on the plots of each recorded point.  Figure 1 below displays a section of the x, y, z data used for the data points.
Figure 1: Segment of normalized data on Excel 
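The systematic scheme is simple to reproduce in code: a 5 cm interval along a 115 cm axis gives 24 string positions per axis (0, 5, ..., 115 cm), and the full grid recovers the 576 sample points.  A minimal sketch:

```python
def systematic_grid(extent_cm, interval_cm):
    """Sample locations for a systematic survey: one point at every
    string intersection, starting from the (0, 0) reference corner."""
    axis = range(0, extent_cm + 1, interval_cm)
    return [(x, y) for x in axis for y in axis]

points = systematic_grid(115, 5)
print(len(points))  # → 576
```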
Methods:
Once the data was normalized and a geodatabase was created, the x,y,z data points were imported into ArcMap by navigating to File-->Add XY Data.  This was exported as a feature class within the geodatabase.  Since the points were established in relation to a specific reference point at (0,0), a cadastral coordinate system was utilized without projecting the data.  A grid was then established and run through a series of different interpolation methods in order to determine the advantages/disadvantages of each, as well as how realistic a representation of the sandbox terrain each method produced.  These methods (defined by ESRI) included Inverse Distance Weighted (IDW), Kriging, Natural Neighbor, Spline, and Triangular Irregular Network (TIN).
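The x,y,z table that Add XY Data consumes is just comma-separated rows, which can be parsed outside ArcMap as well; a sketch (the sample rows below are made up):

```python
import csv, io

def read_xyz(text):
    """Parse comma-separated x,y,z rows (centimeters) into float triples."""
    return [(float(x), float(y), float(z)) for x, y, z in csv.reader(io.StringIO(text))]

sample = "0,0,6.5\n5,0,7.0\n10,0,7.2\n"
points = read_xyz(sample)
print(len(points), points[0])  # → 3 (0.0, 0.0, 6.5)
```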

The IDW tool uses a method of interpolation which calculates cell value by taking the average values of data points in the vicinity of each processing cell.  The closer a point is to the center of a cell being estimated, the more influence it has in the averaging process.   
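That weighting scheme can be sketched in a few lines of Python.  This is a generic IDW estimator, not ArcMap's implementation (which adds search radii, barriers, and other options):

```python
def idw(samples, x, y, power=2):
    """Inverse Distance Weighted estimate at (x, y) from (px, py, z) samples.
    A sample exactly at the query location determines the value outright."""
    num = den = 0.0
    for px, py, z in samples:
        d2 = (px - x) ** 2 + (py - y) ** 2
        if d2 == 0:
            return z
        w = 1.0 / d2 ** (power / 2)  # influence decays with distance^power
        num += w * z
        den += w
    return num / den

samples = [(0, 0, 10.0), (10, 0, 20.0), (0, 10, 20.0), (10, 10, 30.0)]
print(round(idw(samples, 5, 5), 6))  # → 20.0 (equidistant neighbors average out)
```

The `power` parameter controls how quickly influence falls off: a higher power makes the surface hug nearby points more tightly, producing the bumpier look noted below.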

The IDW tool seemed to provide a basic picture of what the original sandbox looked like, but was lacking specific definition.  This is likely because there were only 100 total input units available.

Natural Neighbor interpolation finds the closest subset of input samples to a query point and applies weights to them based on proportionate areas to interpolate a value.  Since it only uses neighboring points, it is better suited for compact datasets and terrain that has higher elevation variability.  

The Natural Neighbor surface appeared somewhat distorted in its elevation variance, though it served well in revealing the peaks and valleys.

Kriging is an advanced geostatistical procedure that generates an estimated surface from a scattered set of points with z-values.  More so than with other interpolation methods, a thorough investigation of the spatial behavior of the phenomenon represented by the z-values should be done before selecting the best estimation method for generating the output surface.

Besides a couple spikes in the surface model, the Kriging created a smooth and accurate representation of the original sandbox. 

Spline interpolation method estimates values using a mathematical function that minimizes overall surface curvature, resulting in a smooth surface that passes exactly through the input points.

The spline tool created the smoothest surface out of all the models.  During processing, the Regularized option was selected in order to create the most accurate replica of the recorded terrain.       
TIN is a vector data structure that partitions geographic space into contiguous, non-overlapping triangles.  The vertices of each triangle are sample data points with x-, y-, and z-values.  These sample points are connected by lines to form Delaunay triangles.  TINs are used to store and display surface models.

The TIN model strongly illustrated the slopes in the terrain, but its jagged appearance poorly reflected what the sandbox actually looked like.

Once all 5 interpolation methods were completed, the resulting output rasters were imported into ArcScene in order to produce a floating 3D view of elevation change in the sandbox.  When a raster is initially brought in, it's displayed as a flat surface.  It can be made 3D by selecting "floating on a custom surface" under layer properties. The 3D surface was then exported as a JPEG and brought into ArcMap to be used as a visual aid for the maps produced in the results section.  A scale bar was established by navigating to data frame properties and selecting "centimeters".

Results/Discussion:
Figure 2 displays the surface of the original sandbox.
Figure 2
Figures 3-7 below show the resulting maps of each interpolation method.

IDW
Figure 3 below shows a map utilizing the IDW interpolation method.  This map had a fair representation of surface elevation, but did not have a very smooth surface.  This was uncharacteristic of the actual terrain of the sandbox, which wasn't nearly as bumpy.
Figure 3
Natural Neighbor
Figure 4 below shows a map utilizing the Natural Neighbor interpolation method.  The peaks of each of the "hills" appear jagged, which is unrepresentative of the actual sandbox that was measured.
Figure 4
Kriging
Figure 5 below shows a map utilizing the Kriging interpolation method.  Elevation changes are not as strongly pronounced, but the overall surface is considerably smoother than Natural Neighbor and IDW, having less pronounced variability from point to point.
Figure 5
Spline
Figure 6 below shows a map utilizing the Spline interpolation method.  It is obvious that the surface of this model is considerably smoother than any of the other previous 3D representations.  This is likely due to the fact that Spline utilizes a mathematical function which minimizes overall surface curvature.
Figure 6
TIN
Figure 7 below shows a map utilizing the TIN interpolation method.  It is very geometric and pointy by nature due to the triangles generated, unlike the actual surface of the sand.  Despite that distortion, it still represents the sandbox elevation well.
Figure 7
For this particular survey, the Spline interpolation method appeared to be the best survey technique for producing the most accurate representation of the sandbox.  The mathematical function utilized to minimize surface curvature proved to be very effective.

Conclusion:
This survey is related to other field surveys in that it collects elevation data over many points.  What makes it unique is that the recordings were made only inches apart.  It is not always realistic to perform a highly detailed grid-based survey, nor is it always necessary; difficult terrain or private property may be factors that interfere with this.  Interpolation can be used for much more than just elevation models.  Factors like temperature, wind speed, and wind chill could also be collected for an interpolation dataset, being multiple factors that are all inter-related.

Tuesday, November 28, 2017

Arc Collector Project

Introduction
The purpose of this assignment was to give students the freedom to create their own project using ArcCollector.  For this assignment in particular, the goal was to understand how the Eau Claire and Chippewa Rivers affect air temperature in relation to proximity.  This student gathered 65 individual data points concentrated in the Phoenix Park area.  Four domains were recorded for each point: temperature, wind chill, wind speed, and land cover.

Study Area
The measurements were conducted in Phoenix Park, covering everything between the shoreline and Riverfront Terrace.


Figure 1: Satellite View of Phoenix Park
Methods
Before recording the data in the field, a geodatabase was constructed in ArcCatalog.  Four domains were chosen in this database: land cover, temperature, wind chill, and wind speed.  For land cover, coded values were assigned to classify different types of land surfaces: shoreline, grass, concrete (walking trails), and trees.  Shoreline was a significant factor because temperatures at the shore of water bodies tend to be cooler than temperatures farther inland (at least when the water is cooler than the air).  Being in close proximity to trees can also have an impact on temperature, as well as increased wind friction.  Concrete tends to store heat from solar radiation and could potentially affect surface temperature.  As for the other domains, wind chill and wind speed are correlated with one another when temperatures are below 50 degrees Fahrenheit and wind speeds are over 3 mph.  The student looked to see if wind speed had any influence on wind chill.
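Those thresholds come from the NWS wind chill index, which is only defined for temperatures at or below 50 °F with winds above 3 mph; outside that range the wind chill is simply the air temperature.  A sketch of the standard formula:

```python
def wind_chill_f(temp_f, wind_mph):
    """NWS wind chill index (Fahrenheit). Defined only for temps <= 50 F
    and winds > 3 mph; otherwise the air temperature is returned."""
    if temp_f > 50 or wind_mph <= 3:
        return temp_f
    v = wind_mph ** 0.16
    return 35.74 + 0.6215 * temp_f - 35.75 * v + 0.4275 * temp_f * v

print(round(wind_chill_f(30, 10), 1))  # → 21.2
print(wind_chill_f(55, 10))            # → 55: too warm, wind chill equals temperature
```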

Figure 2: Text fields where domains were established 
Once the domains were created, they were almost ready to be published to ArcGIS Online to be used in the ArcCollector app.  From Publish New Service, the UWEC Geography server is connected, and the service editor requires a brief summary of the project at hand before allowing one to publish.

Figure 3: Publishing in progress after filling out required item description
Once the geodatabase was successfully uploaded to ArcCollector, the student was ready to go out into the field to start collecting.  Accompanying him were a Bad Elf GPS and a Kestrel 3000, pictured below.  The Bad Elf was linked to an iPhone via Bluetooth to ensure the highest spatial accuracy for data collection.  The Kestrel was used to measure wind speed, wind chill, and temperature.
Figure 4: Kestrel 3000


Figure 5: Bad Elf GPS unit


Figure 6 shows what ArcCollector looks like in action when punching in the four attributes.


Figure 6
Results/Discussion
Figure 7 below shows a map of the temperature distribution.  65 points were recorded in total, and it can be observed from the temperature map that the southern tip of Phoenix Park is a few degrees cooler than the trail leading up to the bridge.  The areas leading up to the bridge and beyond also sit much higher above the river than the southern side, which slopes toward the Chippewa.  Despite these temperature variances, the deviation was not as high as expected.  To see a wider range of temperatures, the student probably could have plotted points farther inland.  Wind chill was also equal to the air temperature, perhaps because it was over 50 degrees Fahrenheit and wind speeds never exceeded 3 mph, the basic criteria for wind chill measurements.
Figure 7. Temperature map
The next map (figure 8) displays the distribution of wind chill throughout Phoenix Park.  The results are essentially identical to temperature, something that was not originally anticipated in the planning of this project.  The high temperatures on this occasion were likely the reason there was a lack of deviation between wind chill and temperature.

Figure 8. Wind Chill map.
The interactive map below shows every individual domain value.  By clicking on each point, one can observe wind speed, wind chill, temperature, and land cover type.
Conclusion
It is important to have proper research questions when undertaking a project such as this one.  This study was unique in that one of the factors (wind chill) was simply irrelevant because temperatures were too high (surprising for November).  The idea of this project was to show that the river has a significant cooling effect on air temperature and wind chill.  On this occasion, the resulting data didn't quite meet the student's expectations.  It was a very warm November evening, but on a more typical November night the data would likely paint a more vivid picture of how the river impacts wind chill and temperature.  Overall, ArcCollector proves to be a trustworthy app for plotting meaningful data.

Monday, November 13, 2017

Lab 8: Navigation with GPS, Map & Compass

Introduction:
The purpose of this lab was to have students navigate a study area using two different methods: a Bad Elf GPS device with coordinates, and a UTM map with a compass.  Using these two different methods gave students multiple skills for navigating through challenging environments.  The area traversed was the Children's Nature Academy at the UWEC Priory, a wooded area on the southwest side of Eau Claire, WI.  Figure 1 below displays an aerial map of the study area.
Figure 1: Site Map of UWEC Priory & Children's Academy
Methods:
There were two different methods performed for navigation during this exercise.  The first involved using a Bad Elf GPS unit to find specified UTM coordinates.  Below are the coordinates provided for one of the groups.

Group Three:

1)      617708.815999999640000, 4958257.839600000500000
2)      617930.692499999890000, 4957946.946799999100000
3)      617619.799700000320000, 4958049.249099999700000
4)      617852.304999999700000, 4958136.936799999300000
5)      617695.530000000260000, 4958123.650800000900000
Each group tracked their path to the given coordinates using a Bad Elf GPS unit, which recorded the route students walked in pursuit of each point.  At the given locations, trees were marked, displayed in figure 2 below.

Figure 2

Aiding the students during their tracking was the Bad Elf iOS app.  The GPS device was linked to an iPhone via Bluetooth, and live coordinate updates were accessible on the iPhone, displayed in figure 3 below.


Figure 3: Coordinate point displayed on Bad Elf iOS app

Once all 5 points were recorded, students moved on to the next sequence of the exercise, which involved navigating to 3 points using only a navigation map and compass.  A GPS was used only to track their location in order to see how close they came to each actual coordinate point.  Each group had a pace counter, an azimuth control, and a leap frogger.  The pace counter's job was to stay in a straight line while keeping count of his/her paces in order to measure distance.  The azimuth control was there to guide and ensure that the pace counter stayed in a straight line.  The leap frogger stood at landmarks that were passed, enabling the azimuth control to navigate in a straight shot from one landmark to the next.
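The bearing the azimuth control holds and the distance the pace counter measures can both be derived from the UTM coordinates themselves: the grid azimuth is the arctangent of the easting difference over the northing difference, and the distance is the hypotenuse.  A sketch using the first two of group three's points:

```python
import math

def azimuth_and_distance(e1, n1, e2, n2):
    """Grid azimuth (degrees clockwise from grid north) and straight-line
    distance (meters) between two UTM easting/northing pairs."""
    de, dn = e2 - e1, n2 - n1
    azimuth = math.degrees(math.atan2(de, dn)) % 360
    return azimuth, math.hypot(de, dn)

az, dist = azimuth_and_distance(617708.816, 4958257.840, 617930.692, 4957946.947)
print(round(az, 1), round(dist, 1))  # roughly 144.5 degrees, 382 m
```

Dividing that distance by a walker's measured pace length gives the pace count for the leg.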


Results: 
Figure 4 below displays group three's path in pursuit of each coordinate point.  The squiggly lines show the slight struggle in both navigating the terrain and finding the direct path to the coordinate points.  Despite the slight difficulty this particular group had, they were still able to come within 15 meters of each location point.  The blue trail represents navigation with aid of the Bad Elf GPS, and the orange represents navigation with use of just a navigation map and compass. 
Figure 4

Figure 5 represents all the groups' collective routes during this field outing.  Notice that each group had success in navigating to their given coordinate points.  The paths that are more "off beat" were typically the ones navigated with the GPS rather than the compass.  Utilizing the compass provided groups with a straighter path, especially with the aid of a leap frogger and azimuth control.
Figure 5

Conclusion:
The Bad Elf GPS is obviously much more precise than a conventional map and compass for location accuracy, but knowing how to use a compass is still important as a backup when technology fails.  The above maps show that each navigation technique was quite successful overall.  The most challenging part of this exercise seemed to be navigating through the terrain; as you can tell, group 3 took strange paths to various coordinate points as they struggled through the thick brush.  One thing that helped them determine their path as they continued on was the use of the contour lines on the UTM maps they were provided, which enabled them to see where areas of steep terrain could be avoided.

Sources:

 https://nhtramper.wordpress.com/2013/03/31/wilderness-compass-navigation-primer/


https://education.usgs.gov/lessons/compass.html

Monday, October 30, 2017

Lab 7: Using Arc Collector for Microclimate data

Introduction:
The purpose of this lab was to engage students with the Arc Collector app on their mobile phones to gather data points.  Such devices typically have significantly more computing power than a standard GPS unit and are able to access online data, making data collection quick, easy, and real-time.  High degrees of spatial accuracy can also be achieved by pairing a Bluetooth GPS with Arc Collector.  Students downloaded Arc Collector on their phones and headed out into the field to test the software.

Study Area: 
Students were assigned zones on the University of Wisconsin-Eau Claire campus to record micro-climate data points.  On Tuesday, October 21st between 3:30 and 4:45 students ventured out to make their recordings.  Figure 1 below displays a map of the study area.  Zone 7 was assigned to this student, who recorded 19 total points with another member of the class.
Figure 1 - Study area divided into 7 zones
Methods:
After downloading Arc Collector to their smartphones, students signed into their enterprise accounts to join the class group.  This gave them access to the geodatabase for this particular assignment to update data points in real time.  Students were given a compass to measure wind direction and a Kestrel 3000 weather meter (shown in figure 2) to measure wind speed, wind chill, dew point, and temperature.  Points were added by selecting the "add point" tab within the app.  In addition to the geographic coordinates of the points, the other attributes collected were Group (1-7), Temperature (F), Dew Point, Wind Chill, Wind Speed (mph), Wind Direction (Azimuth), Time, and Notes.
Figure 2: Kestrel 3000
Once all the points were established, the data was ready to be transferred to ArcGIS Online for mapping.  Proportional symbol maps and IDW interpolations were the two primary functions used to produce the maps.

Results:
The first map produced (figure 3) emphasizes the temperature recordings made throughout the study area.  The lowest temps on this map are near the river, which is not surprising, as water bodies oftentimes create a pocket of cool temperatures.  Warm areas seem to be near buildings and in parking lots, which might be influenced by solar absorption and heat emitted by the buildings.
Figure 3
The next map (figure 4) focuses on the distribution of the dew point throughout the study area, revealing the relative humidity depending on where one was on campus.

Figure 4
Figure 5 below shows a multi-variable map of wind chill, speed and direction.  Graduated symbols were used for wind speed and were rotated according to the angle of the wind direction.  High wind speeds seem to be associated with low wind chill, especially on the bridge over the Chippewa River.
Figure 5
Conclusion:
This lab showed the simplicity and power Arc Collector has when coupled with a smartphone.  It is a great method of real-time data collection in the field; such technology has never been so freely accessible to geographers.  There were a couple issues with recording errors during data collection, but those points were deleted before producing the maps.  Overall, this application is quite easy to use and highly effective.

Monday, October 23, 2017

Lab 6: Using Survey 123 to Gather Survey Data Using Your Smart Phone

Introduction 
The purpose of this activity was to create a survey with Survey123 for ArcGIS, then analyze and share the survey data.  Survey123 is a form-centric data-gathering service for creating, publishing, and analyzing surveys.  Surveys can be completed on both mobile and desktop platforms, and the responses, along with their attached spatial information, can be downloaded and mapped.  This post covers how to use the software to perform these tasks through an ArcGIS Online tutorial.

Methods

Create a Survey:

The first step in creating a survey is selecting "create a new survey" on the Survey123 website. The next step requires you to insert details provided by the tutorial (name, tags, summary).  Questions are then added to the survey through the "Add" button, with options including single choice, drop-down, multiple choice, etc.  Figure 1 below shows the list of options available for question types.
Figure 1
The survey the tutorial required users to create was composed of many different types of questions.  The first set asks about background details of the participant and their location, the second set is composed of "safety checks", and the third asks about equipment a homeowner has, such as fire extinguishers and whether smoke alarms are up to date.  Once these survey questions were added, the list was previewed to ensure that all the questions were correct.  Figure 2 shows a mobile preview of the survey.

Figure 2
Once satisfied with the survey content, it is published by selecting "Publish" in the bottom right corner.  Once the survey is published, it can no longer be edited.  It is now shared among the other members of the class within the shared database, and can be found in the gallery under "my surveys", which is displayed in the iOS app in figure 3.
Figure 3
After the survey is created, the second part of the tutorial requires users to complete it both in a desktop browser and in the Survey123 mobile app.  The user is required to fill out the survey a total of 6 times, mixing up the answers in order to get a variety of results to further analyze and project.  As figures 4-6 below show, many different variables can be mapped once survey results are produced.
Figure 4 - A color themed column graph showing ages of people living in a household

Figure 5 - Age of houses owned by survey residents

Figure 6 - Pie chart of people living in a household
Figure 7 below shows a heat map produced from the surveys.  This was done by choosing location as the variable and heat map as the drawing style.  The yellow in Los Angeles represents a higher concentration of survey takers, as two of the mock survey takers resided in LA.
Figure 7
After performing some data analysis, the next step in the tutorial involves creating a map with custom pop-ups within ArcGIS Online. This was completed by opening the map viewer and navigating to Configure Pop-Up.  Figure 8 below displays the window.  

Figure 8 - Pop up window of one of the mock-survey takers
Lastly, a web app is created by selecting the Share button and then Create a Web App; figure 9 below shows the results. Color themes are also available for the headers and text.
Figure 9 - Survey viewer map
Conclusion
Survey123 is a useful app capable of collecting meaningful data for spatially related questions.  It seems well suited for applications such as urban and general infrastructure planning.  Even though the data collected for this assignment was fictitious, it still shows the great potential and flexibility this application has for presenting various forms of data on a very accessible platform.

Sources:

https://learn.arcgis.com/en/projects/get-started-with-survey123/
