Rockfalls, in which gravel- to boulder-sized rocks break away from slopes, fall at the small end of the landslide size spectrum. Nevertheless, they remain hazardous: rockfalls have blocked roads, crushed infrastructure, and taken lives. Reports from around the world suggest that they may be becoming more common in some mountainous regions as temperatures rise, so scientists are justifiably interested in better understanding how often these geologic hazards occur.
Advances in computing power and remote sensing tools such as lidar have allowed researchers to detect and monitor rockfalls more closely in recent years. A new study shows, however, that the monitoring interval, the amount of time between successive data collections at a site, can significantly influence estimates of rockfall rates. Researchers typically survey for rockfalls on a monthly basis, but such long gaps between surveys can cause small rockfalls to be undercounted and large ones to be overcounted: when the time between surveys exceeds the return period of small rockfalls, multiple small events can be mistaken for a single, larger event.
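This merging effect can be sketched with a toy example (all times and volumes below are hypothetical, not from the study): if several small rockfalls strip the same patch of cliff between two surveys, comparing the two scans registers them as one larger event.

```python
# Toy model (hypothetical numbers): rockfalls on a single patch of cliff face.
# Each event is (time in hours, volume in cubic meters).
events = [(2, 0.5), (9, 0.25), (20, 0.75)]

def detected(events, interval_hours):
    """Bin events into survey windows; all change accumulated within one
    window shows up as a single detection with the summed volume."""
    windows = {}
    for t, v in events:
        w = t // interval_hours
        windows[w] = windows.get(w, 0.0) + v
    return [windows[w] for w in sorted(windows)]

print(detected(events, 1))   # hourly surveys: three separate small events
print(detected(events, 24))  # one daily survey: a single apparent 1.5 m^3 event
```

With hourly surveys all three rockfalls are resolved; with a daily survey the same activity looks like one event three times larger, which is exactly the bias described above.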
Williams et al. monitored a coastal cliff in the seaside town of Whitby in the United Kingdom, where Jurassic shales, sandstones, and mudstones are eroding into the sea. Here, as in most locations prone to rockfalls, the size distribution of events follows a power law, with small rockfalls occurring far more frequently than large ones. The researchers used terrestrial lidar surveys, collected hourly over 10 months in 2015, and an algorithm to detect changes in the rock face due to rockfalls.
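A power-law volume distribution of this kind can be illustrated with synthetic data (the exponent and volume units here are arbitrary choices for illustration, not values fitted in the study):

```python
import random

random.seed(0)
B = 1.0  # hypothetical power-law exponent (the study fits its own value)

# Draw 10,000 rockfall volumes from a Pareto distribution with minimum
# volume 1 (arbitrary units); the survivor function is P(V > v) = v**-B.
volumes = [random.paretovariate(B) for _ in range(10_000)]

small = sum(v < 10 for v in volumes)   # events within 10x the minimum volume
large = len(volumes) - small
print(small, large)  # small events dominate the counts, roughly 9 to 1
```

Because small events so heavily dominate the counts, any survey scheme that misses or merges them distorts the measured distribution far more than missing an occasional large event would.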
To find out how the monitoring interval influences rockfall rate estimates, the team modeled the frequency distribution of rockfall volumes at intervals ranging from 1 hour to 30 days. Changing the interval significantly influenced both the rockfall frequency and the volumes measured: observed rockfall rates increased as the monitoring interval shortened below about 12 hours, whereas rates were nearly identical across intervals longer than about 12 hours. Specifically, the mean rockfall rate under hourly monitoring was 61 events per day, an order of magnitude larger than the six rockfalls per day seen with monthly surveys. Monitoring at the shortest interval of 1 hour also led to a threefold decrease in the average recorded rockfall volume compared with a 30-day interval.
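The qualitative effect of coarsening the survey interval can be reproduced with a small simulation (all parameters here are hypothetical, not taken from the study): synthetic rockfalls scattered across patches of a cliff face are "surveyed" at different intervals, and events sharing a patch within one window merge into a single detection.

```python
import random

random.seed(1)
HOURS = 720    # 30 days of hourly scans
PATCHES = 50   # hypothetical number of discrete patches on the cliff face

# Each event: (hour, patch, volume); roughly one small rockfall per hour,
# with power-law volumes (exponent chosen arbitrarily for illustration).
events = [(h, random.randrange(PATCHES), random.paretovariate(1.5))
          for h in range(HOURS)
          for _ in range(random.randrange(3))]

def survey(events, interval_hours):
    """Merge all change in the same (survey window, patch) cell into one
    detection whose volume is the sum of the merged events."""
    cells = {}
    for h, p, v in events:
        key = (h // interval_hours, p)
        cells[key] = cells.get(key, 0.0) + v
    return list(cells.values())

for interval in (1, 24, 720):
    det = survey(events, interval)
    rate = len(det) / (HOURS / 24)  # apparent detections per day
    print(f"{interval:3d} h interval: {rate:6.1f}/day, "
          f"mean volume {sum(det) / len(det):.2f}")
```

Because the total eroded volume is conserved no matter how it is binned, fewer apparent detections necessarily means a larger mean volume per detection: coarser intervals depress the measured rate and inflate the measured event size, the same direction of bias the study reports.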
Given the risks posed by rockfalls of all sizes, this large increase in rockfall rate has major implications for rockfall hazard models, according to the authors. (Journal of Geophysical Research: Earth Surface, https://doi.org/10.1029/2019JF005225, 2019)
—Kate Wheeling, Freelance Writer