Research and Teaching Applications of Remote Sensing Integrated with GIS: Examples from the Field

Abstract

Remote sensing is used in the Spatial Sciences Institute (SSI) across the full spectrum of the organization’s teaching and research initiatives. From undergraduate classes that utilize unmanned aerial systems (UAS) for data collection and graduate courses that incorporate remote sensing for a variety of applications, including Earth observation, to applied research via the Human Security and Geospatial Intelligence (HSGI) Lab projects and the One World Terrain (OWT) work at the Institute for Creative Technologies (ICT) to build a fully geo-referenced 3D planetary model, SSI recognizes the need to educate students about remote sensing techniques. This paper discusses faculty involvement in a pilot study for the Catalina Island Conservancy (CIC) that used UAS to survey local bison and deer populations. The team utilized an autonomous fixed-wing UAS with a thermal payload to collect data for a semi-automated detection workflow integrated within a GIS, successfully identifying both deer and bison. Additionally, graduate students participate in a weeklong experiential learning opportunity on Catalina Island, CA, during which they develop and conduct a research project integrating UAS and other remotely sensed data with primary data collection in a Geographic Information System (GIS). By extension, the Institute reinforces that these educational opportunities, focused primarily on data acquisition, are instrumental in supporting the application of geographic information systems, science, and technology in many diverse fields, including (but not limited to) human security, humanitarian relief, sustainable urban and rural planning, and public health.


1. Introduction

Within spatial sciences, the University of Southern California (USC) Dornsife College of Letters, Arts and Sciences (Dornsife) Spatial Sciences Institute (SSI) recognizes the importance of remotely sensed data as an integral component of a Geographic Information System (GIS) and of spatial analysis to a variety of disciplines. As is evident from this special issue alone, remotely sensed data integrated into a GIS can be applied to everything from environmental analysis to human rights monitoring in hard-to-reach locales. We focus this paper on two research projects contributed by faculty and affiliate faculty that incorporate remotely sensed data, acquired both from satellite imagery and unmanned aerial systems, into GIS and virtual reality/augmented reality (VR/AR) to promote the well-being of military personnel and the monitoring of wildlife. These vastly differing projects underscore the variety of applications for remotely sensed data and the range of research that the Spatial Sciences Institute undertakes, which also can be used in teaching and promoting the next generation of data acquisition and analysis specialists. Reinforcing concepts and scientific theories is best accomplished through active learning, in which the creation of new knowledge occurs through the transformation of experience [1] [2]. We discuss these projects not only within the context of the integration of remote sensing with GIS, but also within the broader context of integrating remote sensing into curricular advances within the SSI at both graduate and undergraduate levels. Situated in the heart of Los Angeles, CA, the online graduate programs in Geographic Information Sciences & Technology (GIST) at USC are currently the only US programs that conform to UNIGIS standards of design and delivery for distance learning in GIS and GISci. This places USC Dornsife SSI in a unique position at the forefront of curricular development.

Below, we briefly describe some of the work that Ryan McAlinden, Director of Modeling, Simulation, & Training at the Institute for Creative Technologies (ICT), oversees, and work that Professors Jason Knowles and Andrew Marx have initiated with the Catalina Island Conservancy (CIC), a non-profit organization that privately holds and manages over 88% of the land on Catalina Island. The USC-ICT research utilizes novel geospatial techniques and advances in the areas of collection, processing, storage, and distribution of geospatial data, while the pilot for wildlife monitoring of local bison and deer populations via UAS-based high-resolution visible (RGB) and thermal imaging had not previously been conducted on Catalina Island. We also describe how projects such as these have been incorporated into specific spatial sciences courses over the years to provide support for research and platforms for enhanced student learning. Aspects of the pedagogical approaches were previously presented at the GIS-Pro & Cal-GIS 2018 conference [3]; therefore, here we focus on newer research and faculty capacity building. We also provide the workflow that we have undertaken to increase faculty capacity in operating UAS and working with remotely sensed data. We acknowledge this may not work for all academic units, but encourage academic departments to pursue similar curricular advances.

The next section of this paper presents background, methodology, and results of the wildlife monitoring pilot study on Catalina Island from 2018. Section three describes the innovative research and application of remote sensing integrated with GIS for AR/VR products and 3D planetary modeling done at USC-ICT. Both research studies are presented within the context of a Dornsife Spatial Sciences Institute graduate spatial data acquisition course and the potential for building faculty capacity in integrating remote sensing with GIS in this course. The fourth section focuses on novel integration of remote sensing into the undergraduate and graduate curriculum within spatial sciences and presents a potential workflow for other organizations to build faculty capacity in this domain.

2. Wildlife Monitoring on Catalina Island, Pilot 2018

2.1. Background

The Dornsife Spatial Sciences Institute has long run an online spatial data acquisition course that affords students the opportunity to experience a weeklong field data excursion based at the USC Wrigley Institute for Environmental Studies (WIES) on Catalina Island. This course has evolved from working solely with handheld GPS units to formally including unmanned platforms. This evolution was bolstered when trained remote pilots holding FAA Remote Pilot Licenses (RPL), Professors Jason Knowles and Andrew Marx, joined the faculty of SSI in late 2017. Additionally, spatial sciences students and faculty have previously worked intermittently with the Catalina Island Conservancy on a variety of projects during the week-long excursion, mostly small-scale projects focused on areas near the USC WIES campus or just beyond, with results remaining internal.

Having practical experience from previous work, both Knowles and Marx were eager to continue their remote sensing work through SSI; the already forged contacts at the CIC and the spatial data acquisition course provided the opportunity to formally link their research with the student curriculum and, vice versa, to link the students with this practical application of remote sensing. In April 2018, Knowles and Marx worked with the CIC to develop and execute a pilot study for conducting wildlife surveys of local bison and deer populations utilizing UAS-based high-resolution visible (RGB) and thermal imaging. The study examined not only the feasibility of utilizing fixed-wing UAS-based imagery to identify wildlife, but also workflow optimization: could gains be made in the efficiency and efficacy of airborne counting and cataloging of wildlife in comparison to more traditional on-the-ground field survey methods [4]? In addition, it was anticipated that this methodology would be less obtrusive and invasive to the observed species due to the collection altitude of the UAS. This is a vital component of wildlife monitoring and an especially high priority for ethical organizations such as the CIC, which are the sole guardians of wildlife in an area. The pilot study was flown with a fixed-wing senseFly eBee Plus carrying S.O.D.A. (RGB) and thermoMAP payloads (Figure 1), resulting in both ultra-high resolution aerial photography (at 1.12 in/2.84 cm ground sample distance) and thermal data capture (at 9.49 in/24.11 cm ground sample distance). The CIC Director and two field biologists joined the collection team for the duration of the project.

2.2. Methodology

The study area selected by the CIC was Middle Ranch Meadows on Catalina Island (Figure 2), located near the center of the island in a valley surrounded by rich topography, making on-the-ground field visual observations difficult. Two days of flying were completed on April 4 and 5, with both high-resolution RGB and thermal cameras, for a total of five flights. Both pilots (Knowles and Marx) were holders of a Certificate of Waiver or Authorization (COA) for commercial operations via Federal Aviation Administration (FAA) Code of Federal Regulations (14 CFR) Part 107. The April 4 flights were for orientation and equipment shakedown/calibration, while the April 5 flights were for wildlife data capture. On April 5, the first flight was completed with the thermal package, taking off 30 minutes before sunrise (the earliest allowed by the FAA). This early flight was conducted to maximize the temperature differential between the cold evening ground and the wildlife. Immediately after the first, one-hour flight, a second flight was performed with an RGB sensor over the same study area. This, along with ground truth performed by the CIC personnel using binoculars from a vantage point to identify wildlife, was used to confirm the presence of wildlife and validate potential signatures identified in the thermal imagery capture, which has been shown to be the most effective approach [5] [6]. Immediately following the flights, datasets were preprocessed in the field and saved to secondary backup systems. Once back from the field, datasets were processed overnight via commercial photogrammetry software (Pix4DMapper v4.2), and the following datasets were produced within 48 hours of capture and integrated within a GIS (ArcMap v10.5):

· Ultra-high resolution RGB (visible) aerial photography (at 1.12 in/2.84 cm ground sample distance);

· LAS Point cloud (Figure 3);

· Thermal surface model (at 9.49 in/24.11 cm ground sample distance);

· Digital surface model (DSM); and

· 3D Textured mesh (Figure 4).

After processing, the thermal imagery was added to a GIS for manual analysis. Warm areas, or literal “hotspots,” from the thermal imagery capture were identified and correlated to the visible imagery and to on-the-ground wildlife observations from the CIC field biologists.

Figure 1. SenseFly eBee Plus, field set up prior to data collection.

Figure 2. Study area: flight coverage at Middle Ranch, Catalina Island, CA and Catalina Island with California coastline.

Figure 3. LAS point cloud processing in photogrammetry software.

Figure 4. UAS collected products: Orthoimage integrated with a Streets basemap (top left) and DSM integrated with Imagery basemap (bottom right) in GIS.

2.3. Results

The preliminary results indicate that a fixed-wing UAS appears to be a good platform for wildlife surveys, both for its ability to cover large areas and for its capacity to host both RGB and thermal payloads. Initial surveys were a success, with both bison and mule deer identified in the thermal imagery captured by the UAS survey (Figure 5 and Figure 6). Four mule deer and one bison were ultimately counted over the 1.41 km2 study area.

The field crew and the RGB imagery flown immediately following the thermal capture verified these signatures. The aerial collection methodology was unquestionably more efficient in terms of being able to cover more area (the eBee fixed wing has a flight time in excess of approximately 60 minutes and, depending on the collection altitude, can cover vast areas in a single flight), and visual observations of the wildlife during the flights indicated that there was no disturbance, with the animals seemingly unaware of the UAS high above them. In addition to the survey data, the derivative geospatial products produced by the photogrammetric processing (Figure 4 above) were found to be extremely useful for the CIC field biologists and GIS staff, as such datasets would normally not be available to them.

2.4. Summary

While we consider this initial pilot study a success, we believe that there can be further improvements to the workflow and collection methodology. Collections and the resultant thermal data capture would be significantly improved by flying predawn (or at night) in order to obtain a larger temperature differential between the environment and the wildlife. Even at sunrise, the heat distribution was already much greater on east-facing slopes, making detection of thermal signatures more difficult. Future studies will see the submission of an FAA waiver to allow for predawn (or night) flying to maximize the temperature differential.

Figure 5. Thermal imagery from ThermoMap camera of bison (left) and RGB imagery (right) with movement of bison from time 0615 to 0715.

Figure 6. Thermal imagery from ThermoMap camera of four mule deer (circled).

Additionally, manual analysis of the thermal imagery, while doable, is time consuming and cumbersome. This process would benefit from the use of a scripted automation or semi-automated routine for entity detection [7]. A detection algorithm that identifies areas within the scenes with large temperature differences would enable the user to more rapidly identify and catalog the pertinent data from a large coverage area, as sketched below.
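As an illustration of what such a semi-automated routine could look like, the following Python sketch thresholds a thermal raster against its local background and reports candidate “hotspot” centroids in map coordinates for review in a GIS. This is not the workflow used in the pilot study; the file name, the 3-sigma threshold, and the minimum blob size are assumptions for demonstration only.

# Minimal sketch of a semi-automated thermal "hotspot" detector, assuming the
# thermal surface model has been exported as a single-band GeoTIFF (file name
# and threshold values are hypothetical).
import numpy as np
import rasterio
from rasterio.transform import xy
from scipy import ndimage

with rasterio.open("middle_ranch_thermal.tif") as src:
    thermal = src.read(1, masked=True).astype("float64")
    transform = src.transform

# Flag pixels that are unusually warm relative to the scene background.
background = np.ma.median(thermal)
spread = np.ma.std(thermal)
hot = (thermal - background) > 3.0 * spread  # 3-sigma threshold (assumed)

# Group contiguous hot pixels into candidate targets and drop tiny specks that
# are more likely sensor noise than deer- or bison-sized animals.
labels, n = ndimage.label(hot.filled(False))
min_pixels = 6  # assumed minimum blob size at ~24 cm ground sample distance
for blob_id in range(1, n + 1):
    rows, cols = np.nonzero(labels == blob_id)
    if rows.size < min_pixels:
        continue
    # Report the blob centroid in map coordinates for cataloging in the GIS.
    x, y = xy(transform, rows.mean(), cols.mean())
    print(f"candidate {blob_id}: {rows.size} px at ({x:.1f}, {y:.1f})")

Candidate locations produced this way would still be validated against the RGB imagery and field observations, as was done manually in the pilot study.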

3. Autonomous Terrain Modeling

Prior to fully incorporating UAS data collection and integration into the graduate level spatial data acquisition course, the Spatial Sciences Institute collaborated with the Institute for Creative Technologies to provide opportunities for UAS pilot training in difficult terrain and pilot workflow presentations that mutually benefited ICT pilots and students on Catalina Island. As such, we also describe the integration of aerial imagery performed by USC-ICT.

3.1. Background

The USC-ICT One World Terrain (OWT) efforts focus on researching and prototyping capabilities that support a fully geo-referenced 3D planetary model for use in the Army’s next-generation training and simulation environments. USC-ICT research exploits new techniques and advances in the focus areas of collection, processing, storage, and distribution of geospatial data to various runtime applications.

3.2. Generalized Methodology

USC-ICT collects aerial images with Commercial off the Shelf (COTS) UAS using the ICT autonomous UAV path planning and imagery collection system. The software provides a user-friendly interface that encodes photogrammetry best practices. Unlike other commercially available UAV remote control software, the ICT solution was designed for collecting aerial images that cover a large area of interest with multiple flights. Parameters required for data collection include a bounding box of the area of interest, flight altitude, the desired overlap between images, and camera orientation. An optimized flight path is then computed with these parameters, and the imaging task can be accomplished automatically. With the acquired images, 3D point clouds are reconstructed using commercial photogrammetry software.
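The ICT planner itself is not reproduced here, but the following sketch illustrates how the listed parameters (area of interest, altitude, and overlap) can drive a simple back-and-forth coverage plan; the camera values and bounding box are hypothetical, and the computation is our own simplified approximation rather than the optimized ICT routine.

# Minimal sketch of overlap-driven flight planning: given an area of interest,
# altitude, desired overlap, and assumed camera parameters, compute a simple
# back-and-forth ("lawnmower") grid of photo centers in local meters.
from dataclasses import dataclass

@dataclass
class Camera:
    focal_mm: float      # focal length
    sensor_w_mm: float   # sensor width
    sensor_h_mm: float   # sensor height

def footprint_m(cam: Camera, altitude_m: float) -> tuple[float, float]:
    """Ground footprint (width, height) of a single nadir image."""
    w = altitude_m * cam.sensor_w_mm / cam.focal_mm
    h = altitude_m * cam.sensor_h_mm / cam.focal_mm
    return w, h

def lawnmower(xmin, ymin, xmax, ymax, cam, altitude_m,
              side_overlap=0.7, forward_overlap=0.8):
    """Yield (x, y) photo centers covering the bounding box."""
    fw, fh = footprint_m(cam, altitude_m)
    line_spacing = fw * (1.0 - side_overlap)     # distance between flight lines
    shot_spacing = fh * (1.0 - forward_overlap)  # distance between exposures
    x, reverse = xmin, False
    while x <= xmax:
        ys = [ymin + i * shot_spacing
              for i in range(int((ymax - ymin) / shot_spacing) + 1)]
        for y in (reversed(ys) if reverse else ys):
            yield x, y
        x += line_spacing
        reverse = not reverse  # alternate direction on each flight line

# Example with hypothetical camera and bounding-box values:
cam = Camera(focal_mm=8.8, sensor_w_mm=13.2, sensor_h_mm=8.8)
waypoints = list(lawnmower(0, 0, 500, 300, cam, altitude_m=100))
print(len(waypoints), "photo centers")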

The photogrammetrically generated point clouds/meshes (Figure 7) from a collection stage do not support user-level or system-level interaction, as they do not contain the semantic information needed to distinguish between objects. Workflows in previous works require either manually labeling the points into different parts or re-training a new model for each new scene. USC-ICT has designed a fully automated process that utilizes deep learning to automatically extract relevant features and segmentations from point clouds. To train the feature attribution model, the points are first manually labeled with the following labels: ground, man-made objects, vegetation, etc. These point clouds are then converted to 3D voxel grids to produce a representation suitable for deep neural networks. ICT designed a simple yet effective 3D encoding and decoding network architecture based on 3D U-Net for point cloud segmentation. During training, a straightforward 3D data augmentation strategy was designed to perform rotation, translation, and cropping on the input data at the same time. This expands the amount of data and allows for better generalization capabilities of the model. The resulting pipeline is able to extract buildings, ground, and vegetation in the raw point clouds automatically with high accuracy and produce accurate 3D models (Figure 8).
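To make the voxelization and augmentation steps concrete, the sketch below (our own illustration, not the ICT code or network) bins a labeled point cloud into a dense 3D grid suitable as input to a 3D segmentation network and applies a simple rotation-plus-translation augmentation; the voxel size, grid dimensions, and label scheme are assumed.

# Minimal sketch of preparing point-cloud training data for a 3D segmentation
# network: bin labeled points into a dense voxel grid (occupancy plus a label
# volume). Grid and voxel sizes are assumed values.
import numpy as np

def voxelize(points, labels, voxel_size=0.5, grid=(128, 128, 64)):
    """points: (N, 3) xyz in meters; labels: (N,) ints (e.g. 0=ground,
    1=man-made, 2=vegetation). Returns occupancy and label volumes."""
    origin = points.min(axis=0)
    idx = np.floor((points - origin) / voxel_size).astype(int)
    keep = np.all((idx >= 0) & (idx < np.array(grid)), axis=1)
    idx, labels = idx[keep], labels[keep]

    occupancy = np.zeros(grid, dtype=np.float32)
    label_vol = np.zeros(grid, dtype=np.int64)
    occupancy[idx[:, 0], idx[:, 1], idx[:, 2]] = 1.0
    # A per-voxel majority vote is approximated by last-writer-wins for brevity.
    label_vol[idx[:, 0], idx[:, 1], idx[:, 2]] = labels
    return occupancy, label_vol

def augment(points, max_shift=2.0, rng=np.random.default_rng()):
    """Simple rotation-about-z plus translation augmentation of the xyz points."""
    theta = rng.uniform(0, 2 * np.pi)
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
    return points @ rot.T + rng.uniform(-max_shift, max_shift, size=3)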

3.3. Connection with Curriculum

With this technology and workflow in mind, Ryan McAlinden and pilot trainees demonstrated mission planning and other considerations to the data acquisition classes. Pilot trainees were afforded the opportunity of non-traditional, low-stakes preparation, training, and flight time at WIES on Catalina Island, while graduate students in the spatial data acquisition course experienced the mission planning and data collection and were able to utilize the aerial imagery collected and modeled in their project work. This endeavor, while beneficial to USC-ICT and SSI, was subsequently taken over by the course instructors in SSI and is discussed below.

Figure 7. Photogrammetric point cloud segmentation, University of Southern California site.

Figure 8. 4-layer transition image of the USC campus.

4. Integrating Remote Sensing and UAS into Curriculum

As an academic unit, the SSI continues to grow and develop curriculum at the graduate and undergraduate levels. This is evident in our changing course offerings, such as Spatial Data Collection Using Drones, an introductory undergraduate course that provides students with technical and practical experience with UAS, and in program development, such as our Graduate Certificate in Remote Sensing for Earth Observation (RSEO). These courses and programs have undergone curricular review by our respective internal curriculum committees and the appropriate Curriculum Offices at the College and University levels.

4.1. Student Programs and Curriculum Development

The Spatial Sciences Institute initiated curricular updates concurrently for both undergraduate and graduate education that incorporate remote sensing to a greater degree than was previously encompassed. Some changes were considered minor, such as modifications to the content of existing courses to meet learning objectives with regard to remote sensing and UAS data collection, while other changes involved the creation of new courses and programs. As discussed for this paper, students enrolled in the graduate-level Spatial Data Acquisition course who are new to integrating remotely sensed data, collected via instructor-conducted UAS flights, gain experience in pre-flight mission planning and post-flight processing of the data. They also troubleshoot problems that may arise when integrating these data with other field-collected data; we encourage students to work through the processes to solve issues such as mismatched coordinate systems when projecting UAS imagery (collected in UTM Zone 11N) together with data from high-accuracy receivers (collected in WGS 1984 and projected in Web Mercator) under guidance from faculty. The students can then relate this practical experience to real-world situations in which they will be utilizing these data and processes.
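A minimal sketch of the kind of reprojection students work through is shown below, using pyproj to transform points from Web Mercator into the UTM Zone 11N frame of the UAS imagery; the EPSG codes are the commonly used identifiers for these coordinate systems, and the coordinates are hypothetical.

# Minimal sketch of reconciling coordinate systems: reproject field-collected
# points (Web Mercator) into the WGS 84 / UTM zone 11N frame of the UAS imagery.
from pyproj import Transformer

# Web Mercator (EPSG:3857) -> WGS 84 / UTM zone 11N (EPSG:32611)
to_utm = Transformer.from_crs("EPSG:3857", "EPSG:32611", always_xy=True)

web_mercator_pts = [(-13154000.0, 3987000.0)]  # hypothetical survey point
utm_pts = [to_utm.transform(x, y) for x, y in web_mercator_pts]
print(utm_pts)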

Additional curricular development culminated in the creation of a non-major undergraduate course in spatial data collection utilizing drones, in which students will, for the first time, work with the UAS to plan, collect, and process imagery. This course revolves around applied and active learning experiences, built on germane examples, in which students can develop and demonstrate a deeper knowledge and understanding of the technological sciences behind UAS-based collection, processing, and visualization. Additionally, this is a standalone course, meaning students across all disciplines are welcome to enroll, no prior experience with GIS is required, and enrollment is not limited to majors or minors. We anticipate that this course will likely draw students from diverse majors and academic disciplines. Lastly, we have developed a new graduate certificate program in Remote Sensing for Earth Observation (RSEO), which leverages remotely sensed data from a multitude of sources, such as Location Based Services (LBS), social media, and Internet-of-Things (IoT) devices, for a variety of applications, from weather and environmental observations to disaster management and recovery efforts. The program focuses on the acquisition, management, and integration of these data for the purpose of advanced trend analysis, with the aim that students and professionals develop proficiency working with these data and are able to apply them in decision-making processes.

4.2. Capacity Building of Faculty for Improved Student Outcomes

In order to create a sustainable program that incorporates UAS and remote sensing technology at a greater level, SSI invested in building the capacity of current faculty (Figure 9) beyond the abilities of Knowles and Marx. This entailed developing technical training and pilot certification plans for the selected faculty who are, or would be, responsible for spatial data acquisition via UAS. Trainee faculty members worked in collaboration with USC-ICT and experienced pilot faculty to develop study plans and to review materials for the FAA RPL Part 107 exam. Faculty then created individual learning plans and study schedules. Due to fiscal considerations, faculty were required to sit for the Part 107 exam prior to the close of fiscal year 2017.

In consultation with USC-ICT, Knowles, and Marx, SSI also invested in the necessary equipment and photogrammetry software for educational purposes. This equipment included a quadcopter UAS (DJI Phantom 4 Pro) with an RGB payload; no thermal or multispectral payloads are currently mounted, which was a budgetary consideration.

Figure 9. Workflow for capacity building and training of faculty for attaining FAA RPL, Part 107.

We recommend that an initial investment also include a multispectral payload. This will greatly increase the data collection possibilities to include vegetation and landscape analysis, such as the Normalized Difference Vegetation Index (NDVI) or others, which can be applied to a variety of domains ranging from crop management in agriculture to tracking people and objects for human security, search and rescue, and other military operations. Additional equipment and accessories included extra Intelligent Flight batteries, back-up propellers, a carrying case, and one tablet. Educational licensing for photogrammetry software (Pix4D) was purchased and is renewed annually. Additional mission planning software (AirMap, DJI GO, etc.) is available free of charge and downloadable to any mobile device.
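For instance, once a multispectral payload is available, an NDVI raster can be derived from the red and near-infrared bands in only a few lines of code; the sketch below is illustrative only, and the file name and band order are assumptions.

# Minimal NDVI sketch, assuming a multispectral orthomosaic exported as a
# multi-band GeoTIFF (file name and band order are hypothetical).
import numpy as np
import rasterio

with rasterio.open("multispectral_ortho.tif") as src:
    red = src.read(3).astype("float64")  # assumed band 3 = red
    nir = src.read(4).astype("float64")  # assumed band 4 = near-infrared
    profile = src.profile

# NDVI = (NIR - Red) / (NIR + Red); guard against division by zero.
ndvi = (nir - red) / np.where((nir + red) == 0, np.nan, nir + red)

profile.update(count=1, dtype="float64")
with rasterio.open("ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi, 1)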

Upon successful completion of the exam, all faculty pilots engaged in practical flight training with Marx and Knowles. A pre-flight checklist, in-flight protocol, and post-flight image processing workflow were standardized for use with the current graduate spatial data acquisition course and are available for additional courses, such as the aforementioned undergraduate course that specializes in spatial data collection using drones. Test flights, under the direction of Marx and Knowles, were conducted in open, unpopulated parks, in accordance with FAA regulations and recommendations that limit flights over people in public spaces. Faculty also conducted test data collections, processing, and integration of outputs (DSM, 3D mesh) into a GIS platform to visualize georeferenced data and hone integration abilities.

5. Discussion

Ideally, students will develop projects that test not only the utility of remotely sensed imagery collected via UAS, but also the efficiency and efficacy of their workflows. The weeklong Catalina experience can be used as a testing ground for the implementation of this technology in new domains on a small scale, prior to large-scale implementation. Our goal is to provide students with opportunities to engage in active experimentation and improve learning experiences and outcomes. This is accomplished through active collection of data, detailed data analysis, and product generation via some geo-visualization outcome.

Having experienced faculty who can train additional faculty and act as a resource throughout the development of the programs and/or new student projects and novel flight paths is key to the success of our programs. Successful pilot studies that apply remote sensing to wildlife tracking and monitoring, such as the first study highlighted above, and 3D models of the natural and built environment derived from UAS-collected imagery and an automated computing process, such as that accomplished by USC-ICT, are exemplars of the advantages of integrating remote sensing in a GIS. Additionally, faculty who can effectively communicate and demonstrate the possibilities of remotely sensed data acquired via UAS are vital to improved student experiences and outcomes. SSI does not aim to train pilots, and in fact some of the students may already have experience working with UAS and product visualization through their current jobs. Rather, courses and programs focus on the utility of remotely sensed data within a GIS, the science of photogrammetry and the production of geo-referenced 3D models, and the variety of geospatial analyses that can be run for the diverse applications of remotely sensed data.

Lastly, while the case studies presented here have focused on remotely sensed data collected mainly via unmanned aerial systems, our students interact with a variety of remotely sensed data (LiDAR, multi- and hyper-spectral satellite imagery, etc.) within a GIS during these courses and in the course of research projects that span the humanities and physical sciences. Through this work, SSI reinforces that these educational opportunities, focused primarily on data acquisition, are instrumental in supporting the application of geographic information systems, science, and technology in many diverse fields, ranging from human security and humanitarian relief to sustainable urban and rural planning and public health.

6. Conclusion

We have presented two innovative research projects that integrate remotely sensed data collected via UAS into GIS for distinct purposes, but that share the common element of further integrating remote sensing into the curriculum of spatial sciences courses. We successfully demonstrated the potential for wildlife monitoring on Catalina Island utilizing UAS and made tangible recommendations for future work in this domain. We also presented the work of USC-ICT and the development of an automated process to build a fully geo-referenced 3D model of the Earth for training and simulation needs as the impetus and model for faculty development at SSI. In order to achieve the curricular renovations referenced above, faculty must be properly equipped to guide student data acquisition, processing, and analysis; we presented the workflow through which the Spatial Sciences Institute achieved this and its importance for our students' development.

Acknowledgements

We would like to thank the Catalina Island Conservancy for engaging with the pilot study described here and permitting the public dissemination of the results. We also thank the staff of the Wrigley Institute for Environmental Studies on Catalina Island for their continued support of the field course.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] Kolb, D.A. (2015) Experiential Learning: Experience as the Source of Learning and Development. 2nd Edition, Pearson Education, Inc.
[2] McLeod, S.A. (2017) Kolb-Learning Styles and Experiential Learning Cycle. Simply Psychology.
https://www.simplypsychology.org/learning-kolb.html
[3] Loyola, L.C., Marx, A.J. and Fleming, S.D. (2018) Combining Teaching, Partnerships, and Research in the Field: Lessons from the Spatial Data Acquisition Course (on Catalina Island). GIS-Pro & CalGIS 2018, Palm Springs, 8-12 October 2018.
[4] Kays, R., Sheppard, J., Mclean, K., Welch, C., Paunescu, C., Wang, V., Crofoot, M., et al. (2019) Hot Monkey, Cold Reality: Surveying Rainforest Canopy Mammals Using Drone-Mounted Thermal Infrared Sensors. International Journal of Remote Sensing, 40, 407-419.
https://doi.org/10.1080/01431161.2018.1523580
[5] Hodgson, J.C., Mott, R., Baylis, S.M., Pham, T.T., Wotherspoon, S., Kilpatrick, A.D., Koh, L.P., et al. (2018) Drones Count Wildlife More Accurately and Precisely than Humans. Methods in Ecology and Evolution, 9, 1160-1167.
https://doi.org/10.1111/2041-210X.12974
[6] Chrétien, L.P., Théau, J. and Ménard, P. (2016) Visible and Thermal Infrared Remote Sensing for the Detection of White-Tailed Deer Using an Unmanned Aerial System. Wildlife Society Bulletin, 40, 181-191.
https://doi.org/10.1002/wsb.629
[7] Lhoest, S., Linchant, J., Quevauvillers, S., Vermeulen, C. and Lejeune, P. (2015) How Many Hippos (HOMHIP): Algorithm for Automatic Counts of Animals with Infra-Red Thermal Imagery from UAV. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 40, 355-362.
https://doi.org/10.5194/isprsarchives-XL-3-W3-355-2015
