Geology & Geophysics Meeting Report

Using Lidar to Advance Critical Zone Science

The Next Generation of LiDAR Analysis for Critical Zone Research;
Boulder, Colorado, 12–14 May 2014

By Adrian A. Harpold, Steve W. Lyon, and Jill A. Marshall

Critical Zone (CZ) scientists study the interactions among soil, water, air, and living organisms that shape the Earth’s surface. Lidar (light detection and ranging) has transformative potential to advance CZ science because the technology simultaneously measures geomorphic, hydrologic, and ecologic properties at high resolution (<10 centimeters) and over large extents (>100 square kilometers).

While lidar has led to major disciplinary breakthroughs, such as quantifying landscape patterns in erosion, snow water, and above-ground biomass, interdisciplinary challenges remain. To further the use of lidar in CZ science, an international group of more than 30 scientists gathered for a 3-day workshop, held at the University of Colorado in Boulder, that was cosponsored by the U.S. National Science Foundation and the Swedish Foundation for International Cooperation in Research and Higher Education.

The workshop goals were to characterize the use of lidar data and technology across CZ science and to suggest ways of maximizing future use. The group identified only a small number of examples that successfully leveraged co-located lidar-derived data sets. These seminal examples, combining topographic, hydrologic, and vegetation data, demonstrate lidar's utility in understanding CZ architecture and function across disciplines and scales. Transdisciplinary studies provide crucial scientific roadmaps for CZ scientists, as forthcoming advances in lidar systems (e.g., hyperspectral), processing (e.g., full waveform rectification), and collection platforms (e.g., unmanned aerial vehicles) are rapidly increasing information content and refining temporal resolution.

The workshop participants identified opportunities and challenges facing three key lidar-based applications: change detection over time, parameterization and verification of physical models, and development of scaling relationships and interrelationships. Emerging technologies present new opportunities to link lidar data to complementary measurements, particularly finer-scale in situ and coarser-scale remote sensing observations. A common challenge across applications is that lidar processing to date has required expensive, proprietary software to develop and classify data sets. The reliance on proprietary software is slowly changing as new open-source processing techniques are emerging. To address these crosscutting opportunities and challenges, the CZ community will need to better articulate its current processing resources and future needs.
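To make the change-detection application concrete, the following is a minimal sketch of the standard workflow: grid two point-cloud survey epochs into digital elevation models (DEMs) and difference them to produce a "DEM of difference." The synthetic point clouds and the `grid_dem` helper are illustrative assumptions, not tools discussed at the workshop; real analyses would read classified ground returns from LAS/LAZ files with open-source libraries.

```python
import numpy as np

def grid_dem(points, cell=1.0, extent=10.0):
    """Bin (x, y, z) points into a regular grid, keeping the mean
    elevation per cell; cells with no returns become NaN."""
    n = int(extent / cell)
    sums = np.zeros((n, n))
    counts = np.zeros((n, n))
    ix = np.clip((points[:, 0] / cell).astype(int), 0, n - 1)
    iy = np.clip((points[:, 1] / cell).astype(int), 0, n - 1)
    np.add.at(sums, (iy, ix), points[:, 2])   # accumulate z per cell
    np.add.at(counts, (iy, ix), 1)            # count returns per cell
    with np.errstate(invalid="ignore"):
        dem = sums / counts
    dem[counts == 0] = np.nan
    return dem

# Two synthetic "survey epochs": a gently sloping surface, then the
# same surface lowered by 0.5 m of uniform erosion between flights.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, size=(5000, 2))
z1 = 0.1 * xy[:, 0] + rng.normal(0, 0.01, 5000)
z2 = z1 - 0.5
dem1 = grid_dem(np.column_stack([xy, z1]))
dem2 = grid_dem(np.column_stack([xy, z2]))

dod = dem2 - dem1               # DEM of difference
mean_change = np.nanmean(dod)   # recovers the imposed -0.5 m of erosion
```

In practice, the same differencing logic underlies estimates of erosion, snow depth (snow-on minus snow-off surveys), and canopy growth; the hard parts are point classification, co-registration between epochs, and propagating per-cell uncertainty.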

A summary of workshop recommendations to advance CZ science over the next 5 years through transdisciplinary lidar applications is presented below:

Develop open lines of communication within and among groups, including individual CZ disciplines, remote sensing scientists, computer scientists, and funding agencies. Workshops have the potential to increase communication between data users and data creators.

Advocate for new and existing lidar repositories, as well as open-source and community-centric processing resources.

Advocate for new acquisition technologies that lower the cost of lidar collection and increase its availability, such as unmanned platforms and institutional acquisitions of lidar systems.

Support the development of new lidar technologies that are useful for linking disciplinary observations (e.g., hyperspectral laser technologies).

Consider other remote sensing observations that may be complementary to lidar (e.g., hyperspectral imagery), as well as competing lidar-like technologies that may prove more suitable for specific CZ applications.

—Adrian A. Harpold, University of Colorado, Boulder; email: [email protected]; Steve W. Lyon, Stockholm University, Stockholm, Sweden; and Jill A. Marshall, University of Oregon, Eugene

© 2014. American Geophysical Union. All rights reserved.