Presentation type:
GI – Geosciences Instrumentation & Data Systems

EGU23-13597 | Orals | GI1.1 | Highlight | Christiaan Huygens Medal Lecture

Solving the ambiguity in the potential field exploration of complex sources 

Maurizio Fedi

It is theoretically demonstrated that, even with perfectly complete and perfectly accurate data, there is a fundamental ambiguity in the analysis of potential field data. The ambiguity may be easily illustrated by computing some of the various kinds of structures that can give rise to the same anomaly field. To solve the ambiguity and yield reasonable geophysical models, we must therefore supply a priori information. In gravimetry, the ambiguity comes from the fact that only the excess mass is uniquely determined by the anomaly, not the density or the source volume. The excess mass is not, however, the only quantity that can be uniquely estimated: examples are the center of a uniformly dense (or magnetized) sphere or the top of a deeply extended, homogeneously dense cylinder. A priori information may consist of direct information (e.g., depth, shape) and/or of assuming that the source distribution has some specified properties (e.g., compactness, positivity). If one tries to classify physical source distributions in terms of their complexity, we may use two different scaling laws, based on homogeneity and self-similarity, which allow modeling of the Earth in its complex heterogeneity. While monofractals or homogeneous functions are scaling functions, that is, they do not have a specific scale of interest, multifractal and multi-homogeneous models need to be described within a multiscale dataset. Thus, specific techniques are needed to manage the information contained in the whole multiscale dataset. In particular, any potential field generated by a complex source may be modeled as a multi-homogeneous field, which typically presents a fractional and spatially varying homogeneity degree. For a source of irregular shape, it may be convenient to invert not the field but a related quantity, the scaling function, which is a multiscale function having the advantage of not involving the density among the unknown parameters.
For density or magnetic susceptibility tomographies, the degree of spatially variable homogeneity can be incorporated in the model weighting function, which, in this way, does not require prior assumptions because it is entirely deducible from the data. We find that difficult quantities, such as the bottom of the sources or multiple source systems, are reasonably well estimated by abandoning the analysis at a single scale and unraveling the scale-related complexity of geophysical signals. The inherent self-consistency of these new multiscale tools is a significant step forward, especially in the analysis of areas where there is scarce other information about the sources.
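The ambiguity described above can be illustrated with a minimal numerical sketch (not from the abstract): two buried uniform spheres with different density contrast and radius, but equal excess mass, produce identical external gravity anomalies, so the anomaly alone cannot distinguish them.

```python
# Illustration of non-uniqueness in gravimetry: a uniform sphere is externally
# equivalent to a point mass, so any two spheres with the same excess mass and
# center produce the same anomaly, whatever their density/radius combination.
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sphere_anomaly(x, depth, radius, density_contrast):
    """Vertical gravity anomaly (m/s^2) at horizontal offset x from the
    center of a uniform buried sphere."""
    mass = density_contrast * (4.0 / 3.0) * math.pi * radius**3  # excess mass
    return G * mass * depth / (x**2 + depth**2) ** 1.5

xs = [i * 10.0 for i in range(-50, 51)]  # profile, m
# density_contrast * radius^3 is equal for both sources -> same excess mass
g1 = [sphere_anomaly(x, 500.0, 100.0, 400.0) for x in xs]  # small, dense
g2 = [sphere_anomaly(x, 500.0, 200.0, 50.0) for x in xs]   # large, light
print(max(abs(a - b) for a, b in zip(g1, g2)))  # ~0: anomalies coincide
```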

How to cite: Fedi, M.: Solving the ambiguity in the potential field exploration of complex sources, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13597, https://doi.org/10.5194/egusphere-egu23-13597, 2023.

EGU23-1641 | ECS | Orals | GI1.1 | GI Division Outstanding Early Career Scientist Award Lecture

Towards sustainable road transport infrastructure: Insights from GPR performance indicator development and enhancement of data quality 

Mezgeen Rasol

The wide use of NDT technologies produces large databases that must be transmitted, collected, processed, managed and well presented for the monitoring of road transport infrastructure. Ground Penetrating Radar (GPR) is one of the most efficient non-destructive testing methods for road transport monitoring. The database outcomes produced through on-site monitoring are essential inputs to decision-making tools that support sound engineering judgment in the field. Moreover, the accuracy and precision of such decision-making tools depend strongly on the quality of the data generated from GPR images. Establishing performance indicators could avoid errors in the dataset and unfavorable decisions in pavement management systems. Consequently, managing GPR data and transforming it into local indicators is crucial to increase quality control of the dataset. This remains an ongoing, challenging task for GPR-supported knowledge.

Establishing indicators is based on different criteria, including intuitive outcomes, empirical outputs, and analytical results. Different GPR signal parameters, such as electromagnetic wave velocity, amplitude, centre frequency and signal attenuation, can be correlated with subsurface material changes and deterioration and mapped to local indicators. The central open question is how GPR data can be converted into indicators based on the common defects in road transport infrastructure. Therefore, establishing metrics to evaluate GPR-related indicators, in both qualitative and quantitative terms, is crucial to better understand defects and their propagation in road pavements.
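As a hypothetical illustration of turning a GPR signal parameter into a local indicator, the sketch below derives relative permittivity from the electromagnetic wave velocity via the standard low-loss relation v = c/sqrt(εr), then maps it onto a 0–1 moisture-related index. The dry/wet bounds are illustrative values, not thresholds from this work.

```python
# Hypothetical sketch: GPR signal parameter -> local indicator.
# v = c / sqrt(eps_r) is the standard velocity-permittivity relation for
# low-loss media; the eps_dry/eps_wet bounds below are illustrative only.
C = 0.2998  # speed of light, m/ns

def relative_permittivity(velocity_m_per_ns):
    """Estimate relative dielectric permittivity from EM wave velocity."""
    return (C / velocity_m_per_ns) ** 2

def moisture_indicator(velocity_m_per_ns, eps_dry=6.0, eps_wet=12.0):
    """Map permittivity onto a 0-1 indicator; higher suggests more moisture."""
    eps = relative_permittivity(velocity_m_per_ns)
    t = (eps - eps_dry) / (eps_wet - eps_dry)
    return min(1.0, max(0.0, t))  # clip to the indicator range

v = 0.1  # m/ns, a plausible velocity in a damp pavement layer
print(relative_permittivity(v))  # ~9
print(moisture_indicator(v))     # ~0.5 on this illustrative scale
```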

How to cite: Rasol, M.: Towards sustainable road transport infrastructure: Insights from GPR performance indicator development and enhancement of data quality, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1641, https://doi.org/10.5194/egusphere-egu23-1641, 2023.

GI1 – General sessions on geoscience instrumentation

EGU23-237 | ECS | Posters virtual | GI1.1

Scheduling System for Remote Control of Instruments used for Atmospheric Observation 

Martin Schumann, Johannes Munke, Stephan Hachinger, Patrick Hannawald, Inga Beck, Alexander Götz, Oleg Goussev, Jana Handschuh, Helmut Heller, Roland Mair, Till Rehm, Bianca Wittmann, Sabine Wüst, Michael Bittner, Jan Schmidt, and Dieter Kranzlmüller

In this poster contribution, we present a scheduling system for automated remote operation of instruments at high-altitude research facilities and similar remote sites. Via web-based interfaces, the system allows instrument owners as well as authorized third-party scientists to schedule and execute measurements and observations.

The system has been developed as a thesis project in the context of the AlpEnDAC-II ("Alpine Environmental Data Analysis Centre", www.alpendac.eu) collaboration (funded by the Bavarian State Ministry of the Environment and Consumer Protection). Consequently, the scheduler and interfaces have been integrated with the AlpEnDAC Operating-on-Demand functionalities. A first use case for the framework has been the operation of an airglow imager (FAIM) in Oberpfaffenhofen (DE).

We describe the design and implementation of our system for scheduling and execution of multi-user observations on instruments, including scheduling-data transfers and data retrieval. Our core implementation uses an optimization-based scheduler (Google's OR-Tools) to ensure maximum instrument use and to minimize idle times. Results show that the scheduler is reliable, fast, and consistently able to provide optimal observation plans. The extensibility of the system is guaranteed by the use of modern software in the core of the system, including well-defined and specified communication through REST APIs. Thus, it can easily be adapted to other settings and instruments, which is also facilitated by a modern deployment strategy using Docker and Kubernetes.
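The scheduling idea can be sketched in a few lines. Note that the actual AlpEnDAC scheduler is optimization-based and built on Google's OR-Tools; the pure-Python greedy packer below, with hypothetical user names and durations, only illustrates how observation requests might be packed into an instrument's available time to reduce idle periods.

```python
# Minimal illustrative sketch (NOT the AlpEnDAC implementation): pack
# observation requests into a fixed time horizon, longest first, skipping
# requests that no longer fit, to reduce instrument idle time.

def schedule(requests, horizon):
    """requests: list of (user, duration_minutes).
    Returns (plan, idle): plan is a list of (user, start, end) tuples
    packed back-to-back; idle is the unused time at the end."""
    plan, t = [], 0
    for user, duration in sorted(requests, key=lambda r: -r[1]):
        if t + duration <= horizon:      # drop requests that do not fit
            plan.append((user, t, t + duration))
            t += duration
    return plan, horizon - t

# Hypothetical requests for a 90-minute observation window:
plan, idle = schedule([("alice", 30), ("bob", 45), ("carol", 20)], horizon=90)
print(plan, idle)
```

A real scheduler would instead solve this as a constraint-optimization problem (interval variables with no-overlap constraints), which is what OR-Tools provides.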

How to cite: Schumann, M., Munke, J., Hachinger, S., Hannawald, P., Beck, I., Götz, A., Goussev, O., Handschuh, J., Heller, H., Mair, R., Rehm, T., Wittmann, B., Wüst, S., Bittner, M., Schmidt, J., and Kranzlmüller, D.: Scheduling System for Remote Control of Instruments used for Atmospheric Observation, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-237, https://doi.org/10.5194/egusphere-egu23-237, 2023.

EGU23-1220 | ECS | Posters on site | GI1.1

Analytical developments at the Potsdam SIMS user facility and the metrological limits on in situ isotope ratio data 

Maria Rosa Scicchitano, Michael Wiedenbeck, Frederic Couffignal, Sarah Glynn, Alicja Wudarska, Robert Trumbull, and Alexander Rocholl

The German Research Centre for Geosciences (GFZ) in Potsdam hosts a CAMECA 1280-HR large geometry secondary ion mass spectrometer (SIMS) with a web-based user node at the University of the Witwatersrand, South Africa. A major theme of our facility is high-precision, high-accuracy, high-spatial resolution analyses of light isotope ratios in a variety of natural and experimental materials.

The latest analytical developments from the GFZ SIMS laboratory focus on the development, assessment and use of new reference materials for stable isotope analysis. Particularly for oxygen, our repeatability from 15-µm diameter domains is now typically better than ±0.15‰ (1s). However, the total uncertainty on such analyses is commonly larger because of significant differences (in some cases more than one ‰) among the isotope ratios of reference materials reported by multiple, highly regarded gas source mass spectrometry laboratories. This issue of interlaboratory bias during reference material characterization inevitably impacts all in situ data employing such materials and must be duly considered.
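One common way to see why the total uncertainty exceeds the repeatability is to combine the independent terms in quadrature. This is a generic sketch: the 0.5‰ reference-material characterization term below is an illustrative value standing in for the interlaboratory bias, not a number from the abstract.

```python
# Hedged sketch: combine within-session repeatability with the reference-
# material characterization uncertainty in quadrature (assuming independence).
# The 0.5 permil RM term is illustrative, not a value from this work.
import math

def total_uncertainty(u_repeatability, u_reference_material):
    """Combined standard uncertainty of independent terms (quadrature)."""
    return math.hypot(u_repeatability, u_reference_material)

u = total_uncertainty(0.15, 0.5)  # permil (1s)
print(round(u, 3))  # the reference-material term dominates the budget
```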

How to cite: Scicchitano, M. R., Wiedenbeck, M., Couffignal, F., Glynn, S., Wudarska, A., Trumbull, R., and Rocholl, A.: Analytical developments at the Potsdam SIMS user facility and the metrological limits on in situ isotope ratio data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1220, https://doi.org/10.5194/egusphere-egu23-1220, 2023.

EGU23-1725 | ECS | Posters virtual | GI1.1

Analysis on the Uncertainty of the Coil Sensitivity based on the Principle of Scalar Calibration Method 

Manming Chen, Yiren Li, Xinjun Hao, Kai Liu, Zonghao Pan, Xin Li, and Tielong Zhang

The scalar calibration method is convenient and widely used for coil sensitivity calibration of a given coil system. The uncertainties of the results are critical in evaluating the accuracy of the coil sensitivity. Based on the calibration principle, a mathematical description of the coil sensitivity uncertainty is given, which shows that the current, the environmental magnetic field and their uncertainties are the main factors contributing to the coil sensitivity uncertainty. A series of tests was conducted under different currents and stable environmental magnetic fields. The results show very good agreement with the analytical description. With increasing current, the coil sensitivity uncertainty becomes smaller and approaches a limit set by the current uncertainty, while the influence of the environmental magnetic field is comparatively insignificant.
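The limiting behaviour described above can be made concrete with a simple propagation model. The functional form below is an assumption for illustration (the sensitivity estimated as induced field over applied current, with a fixed relative current uncertainty and a fixed absolute environmental-field uncertainty); the numerical values are likewise illustrative, not the abstract's test conditions.

```python
# Assumed propagation model for illustration: S = (B_meas - B_env) / I, with
# independent uncertainty terms from the current source and the ambient field.
# All numerical values below are illustrative.
import math

def relative_sensitivity_uncertainty(current, rel_u_current, sensitivity, u_field):
    """Relative standard uncertainty of the coil sensitivity S."""
    term_current = rel_u_current                    # current-source term
    term_field = u_field / (sensitivity * current)  # ambient-field term
    return math.sqrt(term_current**2 + term_field**2)

S = 1e-4        # T/A, nominal coil sensitivity
rel_u_I = 1e-3  # relative uncertainty of the applied current
u_B = 1e-9      # T, uncertainty of the environmental field
for I in (0.01, 0.1, 1.0, 10.0):
    u = relative_sensitivity_uncertainty(I, rel_u_I, S, u_B)
    print(f"I = {I:5.2f} A -> u(S)/S = {u:.2e}")
# As I grows, u(S)/S approaches rel_u_I: the floor set by the current
# uncertainty, while the field term becomes insignificant.
```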

How to cite: Chen, M., Li, Y., Hao, X., Liu, K., Pan, Z., Li, X., and Zhang, T.: Analysis on the Uncertainty of the Coil Sensitivity based on the Principle of Scalar Calibration Method, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1725, https://doi.org/10.5194/egusphere-egu23-1725, 2023.

EGU23-2690 | Orals | GI1.1

A truly Very Broad Band (VBB) borehole seismometer with flat response over 5 decades of frequency 

Cansun Guralp, Horst Rademacher, Paul Minchinton, Robert Kirrage, Maksim Alimov, Rebekah Jones, and Nichola Boustead

Decades ago, new opportunities in seismology were opened by the development of broadband seismic sensors with feedback. The three defining characteristics of these instruments were the bandwidth extension to longer periods, a much lower intrinsic noise and a higher dynamic range. However, the goal of further extending their bandwidth to frequencies above 100 Hz has proven elusive, because these sensors are plagued by parasitic resonances leading to modes not controllable by the feedback system.

Here we present a new low noise seismic borehole sensor with a truly VBB flat response over five frequency decades from 2.7 mHz (360 sec) to 270 Hz. The instrument has no mechanical resonances below 400 Hz. We achieved the bandwidth extension to high frequencies with improvements of the mechanical design, i.e. the arrangement of the pivots and the geometry of the spring.
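As a quick arithmetic check, the quoted band from 2.7 mHz to 270 Hz indeed spans five decades of frequency:

```python
# Quick check: decades of frequency spanned by the stated flat band.
import math

f_low, f_high = 2.7e-3, 270.0        # Hz, band limits quoted above
decades = math.log10(f_high / f_low)
print(f"{decades:.3f}")              # ~5 decades
```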

The design is realized in a borehole arrangement, where three sensors are stacked at 90-degree angles to each other. Including a single-jaw holelock as a clamping mechanism, the complete stack has a diameter of 89 mm, is 625 mm long and weighs about 24.5 kg. We show test results from three co-located complete borehole sensors with identical frequency responses.

How to cite: Guralp, C., Rademacher, H., Minchinton, P., Kirrage, R., Alimov, M., Jones, R., and Boustead, N.: A truly Very Broad Band (VBB) borehole seismometer with flat response over 5 decades of frequency, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2690, https://doi.org/10.5194/egusphere-egu23-2690, 2023.

EGU23-3288 | GI1.1

Advances in cosmic-ray muography for Earth sciences and geophysical applications

László Oláh

Muography is a passive and non-destructive imaging technique that utilizes cosmic-ray muons to visualize and monitor the interior of geological structures and human-made objects. Rapid development of muographic observation technologies in recent years has made it possible to resolve Earth's shallow subsurface with a resolution of a few meters and to conduct long-term muon monitoring in harsh and varying environments. Muography can be applied as a complementary technique in Earth sciences and related engineering fields, e.g., for studying active volcanism, characterizing the overburden above underground sites, monitoring the structural health of infrastructure, and exploring hidden cultural heritage. We overview the recent progress in the development of muography and discuss case studies, from volcanology to mining engineering, from the Americas, Asia and Europe.

How to cite: Oláh, L.: Advances in cosmic-ray muography for Earth sciences and geophysical applications, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3288, https://doi.org/10.5194/egusphere-egu23-3288, 2023.

EGU23-3732 | ECS | Posters on site | GI1.1

Improving single particle ICP-TOFMS using a desolvation sample introduction system and collision cell technology 

Geunwoo Lee, Tobias Erhardt, Lyndsey Hendriks, Martin Tanner, Barbara Delmonte, and Hubertus Fischer

Inductively coupled plasma time-of-flight mass spectrometry (ICP-TOFMS) is increasingly used in various disciplines, especially for the characterization of single particles, because it allows truly simultaneous determination of isotopes over the full mass range without sacrificing analytical sensitivity (Baalousha et al., 2021; Erhardt et al., 2019; Goodman et al., 2022). In particular, the extremely high time resolution of TOFMS allows us to detect individual mineral dust particles in the water stream obtained from Continuous Flow Analysis of Greenland ice cores. Even though collision cell technology (CCT) and high-sensitivity sample introduction have been applied to ICP-MS systems to overcome analytical limitations in spectral interferences and sensitivity (Burger et al., 2019; Lin et al., 2019), the impact of CCT and high-sensitivity desolvation sample introduction on the analysis of single particles, unlike on bulk analysis, is still relatively poorly understood. We investigated the effects of CCT and high-sensitivity desolvation sample introduction (individually and in combination) on the capabilities of single-particle (sp) ICP-TOFMS, including sensitivity and transport/transmission efficiency. To do so, we systematically investigated differences in the sensitivity of total Au, the transport efficiency of Au nanoparticles, and the signal amplitude above the background of these nanoparticles. Application of the desolvation unit without CCT led to a significant improvement of the transport efficiency (number of particles introduced into the plasma) by a factor of about 5, but to a reduction of sensitivity (counts per particle) by about 30 percent. When CCT was used for sp-ICP-TOFMS without the high-sensitivity sample introduction system, the sensitivity for gold ions in particle signals increased by only about six percent. This is similar to the sensitivity improvement for gold ions in dissolved background signals from the ion-focusing effect after the collision cell. However, when CCT was used in combination with the high-sensitivity sample introduction system, the sensitivity for gold ions in particle signals increased by up to a further 33 percent or so compared to the desolvation system without CCT. This could be because collisional ion focusing is more effective under the high sample transport conditions of the high-sensitivity sample introduction system. In conclusion, the high-sensitivity sample introduction system increased the number of detected particles by about 5 times, while drying the sample and applying CCT enhanced the sensitivity of analyte ions in the ion optics of sp-ICP-TOFMS by about a third for gold particle signals and a factor of 2 for multi-elemental dissolved background signals. These enhancements will help us to analyze trace isotopes in ice-core mineral dust particle analysis and to characterize the chemical composition of detected particles by sp-ICP-TOFMS.
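Taken together, the reported factors imply a rough overall gain, sketched below. The "about" and "up to" figures from the text are treated as point values here, so this is only indicative arithmetic, not a result from the study.

```python
# Indicative arithmetic on the approximate factors reported above.
particles_gain = 5.0       # desolvation: ~5x more particles reach the plasma
sensitivity_desolv = 0.70  # ~30 % fewer counts per particle with desolvation
cct_recovery = 1.33        # CCT adds up to ~33 % back on top of desolvation

per_particle = sensitivity_desolv * cct_recovery      # ~0.93 of baseline
total_ion_throughput = particles_gain * per_particle  # ~4.7x baseline
print(per_particle, total_ion_throughput)
```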

How to cite: Lee, G., Erhardt, T., Hendriks, L., Tanner, M., Delmonte, B., and Fischer, H.: Improving single particle ICP-TOFMS using a desolvation sample introduction system and collision cell technology, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3732, https://doi.org/10.5194/egusphere-egu23-3732, 2023.

EGU23-7349 | Orals | GI1.1

A Rugged, Portable and Intelligent Analogue Seismometer for Future and Pre-Existing Arrays – Güralp Certis 

Dan Whealing, James Lindsey, Neil Watkiss, and Will Reis

Seismic networks often face logistical and financial challenges that require portability, longevity and interoperability with existing equipment.

Güralp have combined proven ocean bottom, borehole and digitiser technology to produce an analogue seismometer with intelligence that benefits networks of all sizes. The Güralp Certis is a broadband analogue instrument that incorporates specific aspects of its sister digital instrument (Certimus) while still remaining compatible with third-party digitisers.

Each Certis stores its own serial number, calibration and response parameters internally and will automatically communicate these to a connected Minimus digitiser. This allows seismometer-digitiser pairings to be changed without manual entry of new parameters. If using GDI-link streaming protocol with the Minimus, these metadata parameters are transmitted within (and therefore inseparable from) the datastream itself. Therefore, this small piece of intelligence in the analogue sensor removes the need for any manual re-entry of response parameters anywhere along the sensor-digitiser-client chain.

Certis enables users to install in locations with poor horizontal stability (e.g., glaciers, dynamic landslide scarps, water-saturated soils), without the need for cement bases or precise levelling, as the sensor can be deployed at any angle regardless of which model digitiser is connected. Due to its small size, low weight and ultra-low power consumption, Certis significantly reduces logistical efforts and makes short term temporary deployments far easier.

Certis addresses many challenges of traditional seismometer deployments, including cost, and provides a flexible and simple solution for seismic monitoring applications across all disciplines.

How to cite: Whealing, D., Lindsey, J., Watkiss, N., and Reis, W.: A Rugged, Portable and Intelligent Analogue Seismometer for Future and Pre-Existing Arrays – Güralp Certis, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7349, https://doi.org/10.5194/egusphere-egu23-7349, 2023.

EGU23-10047 | Orals | GI1.1

Stromboli volcano monitoring with airborne SAR systems 

Riccardo Lanari, Carmen Esposito, Paolo Berardino, Antonio Natale, Gianfranco Palmese, and Stefano Perna

Synthetic Aperture Radar (SAR) is an active sensor that can be mounted onboard satellite or airborne platforms to observe the Earth's surface in any weather condition, even at night [1]. In recent years, it has been shown that the interferometric SAR (InSAR) technique [2] allows the generation of high-quality Digital Elevation Models (DEMs) [1] from spaceborne [3] and airborne [4–7] SAR data.

Airborne SAR systems, unlike satellite ones, are particularly suitable for environmental monitoring during emergencies, thanks to their capability to maintain very tight revisit times and to acquire data practically without orbital constraints. This work fits within this context: we show the results obtained from data collected during acquisition campaigns carried out with the AXIS [5] and MIPS [8] airborne X-band interferometric SAR systems over the island of Stromboli (Italy). In particular, starting from multiple single-pass interferometric SAR surveys, we present the differences between the generated DEMs, with the aim of measuring the topographic changes induced by the eruptive activity over the whole island during the July 2019 – October 2022 time interval. The work is supported by an agreement between IREA-CNR and the Civil Protection Department of Italy.
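At the core of the change measurement is the differencing of co-registered DEMs from successive surveys. The sketch below uses small synthetic arrays; real processing would also involve co-registration, interpolation to a common grid, and error masking.

```python
# Minimal sketch of DEM differencing to map topographic change between two
# co-registered single-pass InSAR DEMs (synthetic arrays for illustration).
import numpy as np

dem_2019 = np.zeros((100, 100))        # pre-event surface heights, m
dem_2022 = dem_2019.copy()
dem_2022[40:60, 40:60] += 12.0         # synthetic deposit (eruptive material)
dem_2022[10:20, 10:20] -= 5.0          # synthetic collapse / erosion patch

change = dem_2022 - dem_2019           # elevation-change map, m
cell_area = 1.0                        # m^2 per grid cell in this sketch
volume_gain = change[change > 0].sum() * cell_area
volume_loss = -change[change < 0].sum() * cell_area
print(volume_gain, volume_loss)        # 4800.0 m^3 gained, 500.0 m^3 lost
```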

 

References

1. Franceschetti, G.; Lanari, R. Synthetic Aperture Radar Processing; CRC Press, 1999.

2. Moreira, A.; Prats-Iraola, P.; Younis, M.; Krieger, G.; Hajnsek, I.; Papathanassiou, K.P. A tutorial on synthetic aperture radar. IEEE Geosci. Remote Sens. Mag. 2013.

3. Rabus, B.; Eineder, M.; Roth, A.; Bamler, R. The shuttle radar topography mission - A new class of digital elevation models acquired by spaceborne radar. ISPRS J. Photogramm. Remote Sens. 2003.

4. Perna, S.; Esposito, C.; Amaral, T.; Berardino, P.; Jackson, G.; Moreira, J.; Pauciullo, A.; Junior, E.V.; Wimmer, C.; Lanari, R. The InSAeS4 airborne X-band interferometric SAR system: A first assessment on its imaging and topographic mapping capabilities. Remote Sens. 2016, 8.

5. Esposito, C.; Natale, A.; Palmese, G.; Berardino, P.; Lanari, R.; Perna, S. On the Capabilities of the Italian Airborne FMCW AXIS InSAR System. Remote Sens. 2020, 12.

6. Pinheiro, M.; Reigber, A.; Scheiber, R.; Prats-Iraola, P.; Moreira, A. Generation of highly accurate DEMs over flat areas by means of dual-frequency and dual-baseline airborne SAR interferometry. IEEE Trans. Geosci. Remote Sens. 2018.

7. Wimmer, C.; Siegmund, R.; Schwäbisch, M.; Moreira, J. Generation of high precision DEMs of the Wadden Sea with airborne interferometric SAR. IEEE Trans. Geosci. Remote Sens. 2000.

8. Natale, A.; Berardino, P.; Esposito, C.; Palmese, G.; Lanari, R.; Perna, S. The New Italian Airborne Multiband Interferometric and Polarimetric SAR (MIPS) System: First Flight Test Results. Int. Geosci. Remote Sens. Symp. 2022, 2022-July, 4506–4509.

How to cite: Lanari, R., Esposito, C., Berardino, P., Natale, A., Palmese, G., and Perna, S.: Stromboli volcano monitoring with airborne SAR systems, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10047, https://doi.org/10.5194/egusphere-egu23-10047, 2023.

EGU23-11295 | ECS | Orals | GI1.1

Analytical performance assessment of LMS-GT, a newly-developed laboratory-scale Laser Ablation Ionization Mass Spectrometry instrument, in the context of geo- and planetary sciences 

Coenraad de Koning, Salome Gruchola, Rustam Lukmanov, Peter Keresztes Schmidt, Nikita Boeren, Andreas Riedo, Marek Tulej, and Peter Wurz

Direct analysis of micrometer-scale features embedded in solid samples of a wide chemical variety is an integral part of many fields in geo-, geobio-, and planetary sciences. Examples range from microscopic mineral inclusions in meteoritic material, (e.g., zircons, CAIs, chondrules, etc.) to biological inclusions in a variety of mineralogical host materials, e.g., (putative) fossils of microbial species. In many of these use-cases, the determination of the element and/or isotope composition, specifically those of minor or trace abundance, is of prime interest, meaning mass spectrometry is typically the preferred analysis technique.

As a result, a growing group of instruments has been (and is being) developed specifically for element and/or isotope analysis of microscale features in solid hosts, each with its specific advantages and limitations. Perhaps the most well-known technique is (nano)SIMS, which boasts analysis spot sizes down to the nanometer level as well as ppm to ppb detection limits, but struggles with quantification and with capital and operating expenses. Among laser-based solid-sampling mass spectrometric techniques, LA-ICP-MS has become well established, mainly due to its reproducibility and ease of operation. However, due to the necessity of transporting particles from the ablation plume to the ICP, this technique is inherently limited by fractionation effects and isobaric interferences with the plasma and carrier gas. Furthermore, sample dilution in the plasma and the subsequent loss of sample at the ICP-MS interface result in diminished limits of detection.

Another member of the laser-based solid sampling techniques is Laser Ablation Ionization Mass Spectrometry (LIMS), in which the ions present in the ablation plume are directly introduced into the mass spectrometer. This direct sampling of the ablation plume results in both a significant advantage over LA-ICP-MS (high sensitivity) and a challenge (mass resolution). The limited mass resolution of typical LIMS instruments often makes (quantitative) analysis challenging due to isobaric interferences, especially when applied to more complex materials. To solve this issue, the Laser Mass Spectrometer – Gran Turismo (LMS-GT) was developed at the University of Bern with the aim of achieving mass resolutions sufficient to resolve the most common isobaric interferences (M/ΔM = 10,000).
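To put M/ΔM = 10,000 in perspective, the resolving power needed to separate an isobaric pair is roughly the mean mass divided by the mass difference. The sketch below uses the classic 56Fe vs 40Ar16O pair (an example of the Ar-plasma interferences mentioned for LA-ICP-MS, chosen here only for illustration); the masses are standard atomic-mass values.

```python
# Required resolving power to separate an isobaric pair: M / |m1 - m2|.
# Example: 56Fe vs the 40Ar16O polyatomic, a classic ICP interference.
m_fe56 = 55.93494               # u, 56Fe atomic mass
m_ar40_o16 = 39.96238 + 15.99491  # u, 40Ar + 16O

def required_resolution(m1, m2):
    """Mean mass divided by the mass difference of the pair."""
    return ((m1 + m2) / 2.0) / abs(m1 - m2)

r = required_resolution(m_fe56, m_ar40_o16)
print(round(r))  # ~2500: comfortably within reach of M/dM = 10,000
```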

Over the last years, commissioning and continuous improvement of the instrument have been ongoing, leading to a set of analytical performance characteristics that highlight the potential complementary value of LMS-GT. In this talk, we will discuss the latest technological developments [1], the latest analytical performance metrics (mass resolution, mass accuracy, limits of detection, etc.), and element and isotope ratio accuracies [2,3]. We will also discuss a case study in which LMS-GT was used to study fossilized microbial inclusions in Gunflint chert [4], highlighting both the potential strengths and challenges for LMS-GT in a geo- and geobiosciences context.

1. Gruchola, S. et al., Int. J. Mass Spectrom. 474, 116803 (2022).

2. Wiesendanger, R. et al.,  J. Anal. At. Spectrom. 34, 2061–2073 (2019).

3. de Koning, C. P. et al.,  Int. J. Mass Spectrom. 470, 116662 (2021).

4. Lukmanov, R. A. et al., Front. Space Technol. 3, (2022).

How to cite: de Koning, C., Gruchola, S., Lukmanov, R., Keresztes Schmidt, P., Boeren, N., Riedo, A., Tulej, M., and Wurz, P.: Analytical performance assessment of LMS-GT, a newly-developed laboratory-scale Laser Ablation Ionization Mass Spectrometry instrument, in the context of geo- and planetary sciences, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11295, https://doi.org/10.5194/egusphere-egu23-11295, 2023.

The detection of vessels is considered an attractive byproduct of satellite radar altimetry, because it may complement conventional tracking systems with the possibility of building long-term global statistics of ship traffic based on relatively small and manageable datasets of freely available data. Satellite radar altimetry was initially conceived and applied for the observation of ocean topography, and was later extended to the coastal zone and to the observation of inland water.

The potential of SAR altimetry for the detection of ships has already been demonstrated with CryoSat-2, and today Sentinel-3 is the first operational mission offering global SAR coverage with a constellation of two satellites.

Thanks to the enhanced azimuth (along-track) resolution available in the synthetic aperture radar (SAR) mode, the radar altimeter on board the Sentinel-3 satellite could benefit applications other than ocean topography. In particular, this work studies the performance of algorithms for the automatic detection of ship targets from SAR-mode data. In addition, the pre-processing of altimeter data by reliable detection algorithms, filtering out signal outliers from the sea surface response, contributes substantially to enhancing geophysical products that are typical in ocean topography studies (e.g. mean sea level). Thus, today's altimeter data can be regarded as an additional non-cooperative source for vessel traffic monitoring or for mapping global traffic patterns over long periods of time.

This work proposes a processing chain based on mathematical morphology filtering and robust statistics to estimate the structured background and detect target signatures in radargrams. The detection stage is followed by an additional binary morphological filtering phase that is useful for estimating target characteristics, such as height. The study shows that robust statistics outperform non-robust ones, in terms of target signal-to-background ratio and rejection of false alarms. The study finally provides a first attempt to validate the analysis by comparing detected target contacts with automatic identification system (AIS) data.

How to cite: Scozzari, A. and Grasso, R.: Radar altimetry for the detection of ship traffic: an improved byproduct of satellite radar altimetry, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13618, https://doi.org/10.5194/egusphere-egu23-13618, 2023.

EGU23-16127 | Orals | GI1.1

SBUDNIC: lessons learned in implementing a CubeSat mission 

Lorenzo Bigagli and Rick Fleeter

The SBUDNIC project was a collaboration between the National Research Council of Italy and Brown University’s School of Engineering. The project also received support from D-Orbit, AMSAT-Italy, Sapienza University of Rome and the NASA Rhode Island Space Grant.

This scientific cooperation started in January 2021 and aimed to improve techniques and skills in Earth observation and its applications, with particular focus on teaching, research and knowledge transfer, as well as to promote open access to documentation, software and other information and resources for developing Earth observation capabilities.

The project was led by a team of students, professors and researchers and involved the construction of a 3U CubeSat according to an open and agile approach, using mostly commercial components commonly used on Earth, including an Arduino processor and AA Energizer batteries. The satellite was designed to allow the download of low resolution images in the amateur radio band.

SBUDNIC was launched on May 25, 2022 by a SpaceX Falcon 9 rocket and was released into orbit at an altitude of about 525 kilometers by the ION Satellite Carrier platform of the space logistics company D-Orbit.

As a New Space experiment, SBUDNIC has provided useful insights on several organizational, technological and regulatory aspects of the implementation of a CubeSat mission and confirmed the increasingly rapid and affordable accessibility of Space to the scientific and academic community.

We hope that the SBUDNIC project may be able to inspire some of the future engineers in universities, research and industry, in their effort to advance Space exploration, Earth observation techniques and innovative satellite technologies, tools that may prove of fundamental importance in addressing global challenges.

How to cite: Bigagli, L. and Fleeter, R.: SBUDNIC: lessons learned in implementing a CubeSat mission, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16127, https://doi.org/10.5194/egusphere-egu23-16127, 2023.

EGU23-16665 | Posters virtual | GI1.1

Case study for the application of micro-computed tomography on a Miocene sample of Holstein Erratics to identify and assess included molluscs and foraminifera 

Ulrich Kotthoff, Karolin Engelkes, Muofhe Tshibalanganda, Andre Beerlink, Michael Hesemann, Yvonne Milker, and Gerhard Schmiedl

The assessment of fossils in sediments and sedimentary rocks often involves the destruction of the sedimentary matrix and even of parts of the fossil assemblage (e.g. via removal and/or dissolution). Therefore, the destruction-free assessment of fossils in sediments (e.g. sediment cores) and sedimentary rocks is of great interest to the geoscience community. In addition, the three-dimensional examination of fossils is becoming increasingly important for evaluating morphological features and improving morphometrical analyses.

The "Holsteiner Gestein" is a sandstone occurring as a glacial erratic, frequently found at certain outcrops in northern Germany. While the material was transported during the Pleistocene, the original deposition of this sediment took place during the Miocene, perhaps also the upper Oligocene (Schallreuter et al. 1984).

Its fossil content and paleoecology have not been investigated in detail, and since the 1980s, scientific publications on this sediment have been rare. This type of material, if analysed at all, is generally subjected to destructive methods to isolate fossils such as marine snails or foraminifers (marine protists), which both comprise taxa with calcareous shells. These fossils support the reconstruction of the paleo-ecosystem and depositional environment.

In the framework of a case study, a piece of the glacial erratic “Holsteiner Gestein” was scanned with a Comet Yxlon FF35 CT system employing the directional beam tube: First, an overview scan of the whole sample (210 kV, 160 µA, 1.0 mm Cu filter, 50.00 µm iso-voxel size) was used to identify a region with a high fossil count and potentially interesting fossils. The region of interest was then scanned at higher resolution (210 kV, 160 µA, 1.0 mm Cu filter, 7.23 µm iso-voxel size) using a scan trajectory with a flexible rotation center, which allowed for maximal resolution by positioning the sample as close as possible to the X-ray tube while preventing a collision. In addition, a laminography scan (180 kV, 70 µA, 0.5 mm Cu filter, 5.02x5.02x9.58 µm voxel size) was performed to achieve the maximum possible sharpness and resolution in cross-sectional images. Data were visualized and analysed in the software Amira (version 6.0.1).

Our approach enabled us to three-dimensionally assess not only relatively large snail shells but also foraminifera less than 1 mm in size. The scans additionally allow quantifying the number of microfossils inside a certain part of the sample.

The foraminiferal taxa comprise agglutinating foraminifers closely resembling the genus Entzia, which implies a former salt marsh environment.

This work is distributed under the Creative Commons Attribution 4.0 License. This licence does not affect the Crown copyright work, which is re-usable under the Open Government Licence (OGL). The Creative Commons Attribution 4.0 License and the OGL are interoperable and do not conflict with, reduce or limit each other.

References:

Schallreuter, R., Vinx, R., Lierl, H.J. (1984): Geschiebe in Südholstein. In Degens et al. (eds.): Exkursionsführer Erdgeschichte des Nordsee- und Ostseeraumes, Geologisch-Paläontologisches Institut der Universität Hamburg.

How to cite: Kotthoff, U., Engelkes, K., Tshibalanganda, M., Beerlink, A., Hesemann, M., Milker, Y., and Schmiedl, G.: Case study for the application of micro-computed tomography on a Miocene sample of Holstein Erratics to identify and assess included molluscs and foraminifera, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16665, https://doi.org/10.5194/egusphere-egu23-16665, 2023.

EGU23-16722 | Orals | GI1.1

Cost Efficient Station Monitoring and Remote Data Retrieval for Portable Seismic Stations 

Valarie Hamilton, Sylvain Pigeon, Tim Parker, Michael Perlin, and Michael Laporte

Remote monitoring as well as remote waveform data retrieval is an enabling capability for portable stations used in seismic hazard studies. Remote monitoring provides intra-deployment visibility of system performance, allowing prompt detection, and subsequent resolution, of faults which may otherwise go undetected for the duration of the deployment and jeopardize a seismic campaign's successful outcome. Waveform data retrieval allows intra-deployment quality control (QC) and can enable faster science. These benefits must be weighed against the associated power consumption and telemetry bandwidth costs. These tradeoffs may drive a system implementation that limits remote retrieval to low-resolution data or, in other use cases, restricts the retrieval of full-resolution waveform data to specific periods of interest only. In the case of deployments in harsh environments, such as polar or ocean-bottom environments where field visits are cost-prohibitive or may not be possible, even high-cost remote retrieval of a complete data set may be preferable if it offers savings and/or a reduction in operational risk compared to a station visit.

As part of this session, we discuss how advancements in low-power portable geoscience instrumentation, combined with low-power communication technology, deliver these new capabilities with flexible implementations balancing function and operational cost.

How to cite: Hamilton, V., Pigeon, S., Parker, T., Perlin, M., and Laporte, M.: Cost Efficient Station Monitoring and Remote Data Retrieval for Portable Seismic Stations, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16722, https://doi.org/10.5194/egusphere-egu23-16722, 2023.

The deposition of heavy metals on water bodies and soil has adverse consequences for human health. Elevated coal-based energy production and increased industrial emissions have also prompted us to study heavy metals and reactive nitrogen species in the atmosphere. In the present work, rainwater samples were collected from a residential site in south-west Delhi. The samples were analyzed for selected heavy metals using ICP-OES and by voltammetry with a 797 VA Computrace (Metrohm, Switzerland) instrument. Total Nitrogen (TN) and dissolved organic carbon (DOC) were analyzed using a chemiluminescence-based TN/TOC analyzer (Shimadzu model TOC-LCPH E200 ROHS). The mean values of Cu, Mn, Zn, Al, As and Hg were 34.5 mg/l, 19.5 mg/l, 52.7 mg/l, 392.3 mg/l, 9.8 mg/l and 1.6 mg/l, respectively. The mean values for TN and DOC were 12.7 mg/l and 2.8 mg/l, respectively. Detailed results will be discussed at the EGU General Assembly Meeting.

Keywords: Total Nitrogen, wet deposition, ICP-OES, voltammetry, agricultural area.

How to cite: Sunaina, S.: Wet deposition of heavy metals, reactive nitrogen species and dissolved organic carbon at a residential site in Delhi region, India, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-345, https://doi.org/10.5194/egusphere-egu23-345, 2023.

EGU23-1075 | ECS | Posters on site | GI1.3

What can we learn from nested IoT low-cost sensor networks for air quality?  A case study of PM2.5 in Birmingham UK. 

Nicole Cowell, Clarissa Baldo, William Bloss, and Lee Chapman

Birmingham is a city within the West Midlands region of the United Kingdom. In June 2021, coinciding with the introduction of the Clean Air Zone by Birmingham City Council (BCC), multiple low-cost IoT sensor networks for air pollution were deployed across the city by both the University of Birmingham and BCC. Low-cost sensor networks are growing in popularity: their lower costs compared to regulatory instruments (£10s–£1000s per unit compared to £10,000+ per unit) and the reduced need for specialised staff allow for deployments at greater spatial scales (1-3). Although such low-cost sensing is often associated with uncertainty, optical particle counters measuring PM2.5 have generally been shown to perform well, giving indicative insight into concentrations following calibrations and corrections for external influences such as humidity (4-7).

One common problem with sensor networks is that they tend to be isolated, closed deployments, set up and maintained by an interested party focused on its own monitoring goal. To tackle this, the Birmingham Urban Observatory, an online platform, was created and used by researchers at the University of Birmingham to host and share open-access meteorological and air pollution data from low-cost sensor deployments. Whilst hosting and displaying data from two of their own deployments of air quality sensors (Zephyrs by EarthSense and AltasensePM: an in-house designed PM sensor), the platform also pulled data from the DEFRA AURN sites and collaborated with local government to pull data from their low-cost sensor network. The result was a real-time view of environmental data produced from a series of nested arrays of sensors.

This poster presents findings from this combined low-cost network, considering the successes and pitfalls of the low-cost monitoring network alongside insight into regional and local PM2.5 concentrations. Colocations against reference instruments within the network demonstrate good performance of the low-cost sensors after calibration and data validation, but the project experienced challenges in network deployment and sensor reliability. Low-cost sensor data generally give novel insight into the spatial analysis of PM2.5 across the city, and this is presented alongside other experiences of deploying and using sensor networks for air quality.
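The humidity corrections referenced above (ref. 5) can be illustrated with a minimal sketch of the κ-Köhler-style growth correction proposed by Crilley et al. (2018) for low-cost optical particle counters. The κ value and the RH cap are illustrative assumptions for demonstration, not the calibration actually applied in this study.

```python
# Hedged sketch of the kappa-Koehler humidity correction for low-cost
# optical PM sensors, after Crilley et al. (2018), ref. 5.
# The kappa value (0.4) is an assumed illustrative figure.

def rh_correction_factor(rh_percent, kappa=0.4):
    """Hygroscopic growth correction factor C; corrected PM = raw PM / C."""
    if rh_percent <= 0:
        return 1.0  # dry air: no growth, no correction
    aw = min(rh_percent, 99.0) / 100.0  # water activity, capped near saturation
    return 1.0 + (kappa / 1.65) / (-1.0 + 1.0 / aw)

def correct_pm(raw_pm, rh_percent, kappa=0.4):
    """Remove the aerosol water contribution from a raw optical PM reading."""
    return raw_pm / rh_correction_factor(rh_percent, kappa)
```

At high relative humidity the factor grows well above 1, so a raw reading of 20 µg m-3 at 80% RH is corrected substantially downward, while readings in dry air are left essentially unchanged.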

1 Lewis et al., (2016) https://doi.org/10.1039/C5FD00201J

2 Chong and Kumar. (2003) doi: 10.1109/JPROC.2003.814918

3 Snyder et al., (2013) https://doi.org/10.1021/es4022602

4 Magi et al., (2020) https://doi.org/10.1080/02786826.2019.1619915

5 Crilley et al., (2018) https://doi.org/10.5194/amt-11-709-2018

6 Cowell et al., (2022) https://doi.org/10.3389/fenvs.2021.798485

7 Cowell et al., (2022) https://doi.org/10.1039/D2EA00124A

How to cite: Cowell, N., Baldo, C., Bloss, W., and Chapman, L.: What can we learn from nested IoT low-cost sensor networks for air quality?  A case study of PM2.5 in Birmingham UK., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1075, https://doi.org/10.5194/egusphere-egu23-1075, 2023.

EGU23-2847 | Posters on site | GI1.3

Atmospheric ammonia in-situ long-term monitoring: review worldwide strategies and recommendations for implementation 

Aude Bourin, Pablo Espina-Martin, Anna Font, Sabine Crunaire, and Stéphane Sauvage

Ammonia (NH3) is the major alkaline gas in the atmosphere and the third most abundant N-containing species, after N2 and N2O. It plays an important role in N deposition processes, responsible for various damages to ecosystems, and it is also a precursor of fine particulate matter, known to cause numerous impacts on human health. Despite this, not many countries have implemented long-term monitoring of NH3 in their air quality programs, due to the lack of consensus on limit values for ambient levels and of a reference measurement method for this gas. In the context of climate change, governments and health organizations are increasingly concerned about NH3 and its effects. As evidence, the proposed revision of the EU air quality directives includes NH3 as a mandatory pollutant at several urban and rural supersites for all member states.

Currently, there are only 12 long-term programs worldwide dedicated specifically to measuring NH3 or including gas-phase measurements of NH3. The longest NH3 time series come from the UK and Africa, where measurements started in the mid-1990s. Measurements at the remaining locations started after 2000 and have shorter temporal coverage, between 5 and 22 years. The objectives pursued by these networks are to follow long-term spatio-temporal trends, assess N deposition on sensitive ecosystems, validate emission and/or chemistry transport models, and help understand the effectiveness of air pollution control and mitigation policies. Most of these networks operate using a combination of low-cost samplers at high spatial density with a few collocated sites hosting high-time-resolution instrumentation to help calibrate the passive samplers and better monitor the fine temporal variability of NH3. This combined approach has proven successful for most of the proposed objectives.

However, there are several differences that may make it difficult to harmonize the information at both the technical and scientific levels. At the technical level, these include the type and number of passive samplers per site, calibration protocol, data control and quality analysis, exposure duration, and type of high-time-resolution sampling method. At the scientific level, difficulty in understanding the operational parameters and scientific results may come from language barriers (non-English reports), availability of the data (whether it is public or not), and gaps in the knowledge of NH3 levels on a spatial scale due to differences in the implementation of monitoring strategies within the same country.

This work aims to synthetically review the current long-term NH3 networks worldwide and to provide insight and recommendations for other countries and supranational programs aiming to establish long-term NH3 monitoring networks, based on cost-effectiveness and technical and operational criteria.

How to cite: Bourin, A., Espina-Martin, P., Font, A., Crunaire, S., and Sauvage, S.: Atmospheric ammonia in-situ long-term monitoring: review worldwide strategies and recommendations for implementation, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2847, https://doi.org/10.5194/egusphere-egu23-2847, 2023.

EGU23-2984 | ECS | Posters virtual | GI1.3

The COllaborative Carbon Column Observing Network COCCON: Showcasing GHG observations at the COCCON Tsukuba site 

Matthias Max Frey, Isamu Morino, Hirofumi Ohyama, Akihiro Hori, Darko Dubravica, and Frank Hase

Greenhouse gases (GHGs) play a crucial role in global warming. Therefore, precise and accurate observations of anthropogenic GHGs, especially carbon dioxide and methane, are of utmost importance for estimating their emission strengths and flux changes and for long-term monitoring. Satellite observations are well suited for this task as they provide global coverage. However, like all measurements, these need to be validated.

The COllaborative Carbon Column Observing Network (COCCON) performs ground-based observations to retrieve column-averaged dry air mole fractions of GHGs (XGAS) with reference precision. The instrument used by the network is the EM27/SUN, a solar-viewing Fourier Transform infrared (FTIR) spectrometer. COCCON data are of high accuracy as COCCON uses species-dependent airmass-independent and airmass-dependent adjustments for tying the XGAS products to TCCON (Total Carbon Column Observing Network) and thereby to the World Meteorological Organization (WMO) reference scale. Moreover, instrument-specific characteristics are measured for each COCCON spectrometer and taken into account in the data analysis.

Here we first introduce the COCCON network in general and summarize its capabilities for various challenges, including satellite and model validation, long-term observation of GHGs, and local and regional GHG source emission strength estimation. Using the COCCON Tsukuba station as an example, we highlight in detail its usefulness for the above-mentioned applications.

How to cite: Frey, M. M., Morino, I., Ohyama, H., Hori, A., Dubravica, D., and Hase, F.: The COllaborative Carbon Column Observing Network COCCON: Showcasing GHG observations at the COCCON Tsukuba site, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2984, https://doi.org/10.5194/egusphere-egu23-2984, 2023.

EGU23-7462 | ECS | Orals | GI1.3

Data infrastructure for nitrogen compound emissions monitoring 

Daniel Bertocci, Burcu Celikkol, Shaojie Zhuang, and Jasper Fabius

Emissions of nitrogen compounds, including nitrogen dioxide (NO2) and ammonia (NH3), have significant impacts on air quality and the environment. To effectively monitor the spatial and temporal variability of these emissions and the efficacy of emission mitigation measures, OnePlanet Research Center is developing a low-cost sensor system to monitor outdoor NO2 and NH3 concentrations. This sensor system is designed to be deployable in fine-grained networks to accurately capture the dispersion from an emitting source. The deployment of multitudes of such sensor systems will result in large volumes of data. For this purpose, we developed a data infrastructure using the OGC SensorThings API and TimescaleDB, a time-series database extending PostgreSQL. This infrastructure allows for the efficient storage, management, and analysis of large volumes of spatiotemporal data from various sources, such as air quality monitoring networks, meteorological data, and agricultural practices. We demonstrate the potential of this infrastructure by using it in the citizen science project COMPAIR, combining data from various sensors to gain insights on the air quality impact of urban circulation policies. The resulting data platform will facilitate the development of decision support tools and the implementation of targeted emission reduction strategies.
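To illustrate the data model behind such an infrastructure, the sketch below shapes a single NO2 reading as an OGC SensorThings API Observation entity, ready to be POSTed to a SensorThings service (and from there persisted in the TimescaleDB backend). The Datastream id, reading value, and timestamp are hypothetical; the actual OnePlanet schema is not described in the abstract.

```python
import json
from datetime import datetime, timezone

# Hedged sketch: packaging one NO2 reading as a SensorThings Observation.
# In the SensorThings data model, an Observation carries only the time and
# result; sensor and observed-property metadata live on the linked Datastream.

def make_observation(datastream_id, value, phenomenon_time=None):
    """Build an Observation entity dict linked to an existing Datastream."""
    t = phenomenon_time or datetime.now(timezone.utc).isoformat()
    return {
        "phenomenonTime": t,
        "result": value,                         # e.g. NO2 in ug/m3
        "Datastream": {"@iot.id": datastream_id} # link by entity id
    }

# Hypothetical reading from Datastream 42:
obs = make_observation(42, 31.7, "2023-04-23T10:00:00Z")
payload = json.dumps(obs)
# POST payload to <service-base-url>/v1.1/Observations
```

A usage note: because the payload references the Datastream by id, the station's descriptive metadata is registered once, while high-volume Observation inserts stay small, which suits a time-series store such as TimescaleDB.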

How to cite: Bertocci, D., Celikkol, B., Zhuang, S., and Fabius, J.: Data infrastructure for nitrogen compound emissions monitoring, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7462, https://doi.org/10.5194/egusphere-egu23-7462, 2023.

Even in the presence of more reliable air quality tools, low-cost sensors have the benefit of recording data on highly localized spatial and temporal scales, allowing for multiple measurements within a single satellite pixel and on pixel boundaries. However, they are less accurate than their regulatory-grade counterparts, requiring regular co-locations with accepted instruments to ensure their validity. Thus, the addition of low-cost sensors to a field campaign – where reference-grade air quality instruments are abundant – not only provides ample opportunities for low-cost sensor co-location and calibration, but also allows the low-cost instruments to be used for sub-pixel validation, covering more surface area than the regulatory instruments alone with a network of sensors. During the summer of 2014, our low-cost sensor network was deployed during the Front Range Air Pollution and Photochemistry Éxperiment (FRAPPÉ) campaign, conducted to sample the composition of air at and above ground level in northeastern Colorado, USA. The low-cost sensor platform included a suite of gas-phase sensors, notably NO2 and two generalized volatile organic compound (VOC) sensors, which were leveraged together to quantify speciated compounds such as formaldehyde. These key pollutants were chosen for their impacts on human health and climate change, as well as their inclusion in the measurement suite of the TEMPO satellite launching this year. Airborne campaign measurements included slant column optical observations of formaldehyde (HCHO), nitrogen dioxide (NO2), and ozone (O3). Myriad additional in-situ instruments described chemical composition up to approximately 5 km above surface level. Ground-based instrumentation included both stationary and mobile regulatory-grade instruments, which were used for sensor calibration. Machine learning techniques such as artificial neural networks (ANNs) were used to match the low-cost signals to those of the reference-grade instruments. 
Here, we compare calibrated low-cost sensor data collected at ground level in a variety of locations along Colorado’s Front Range with various data sources from the FRAPPÉ campaign to better understand how well airborne and regulatory ground-based measurements can be extrapolated to other locations. Further, as the slant column measurements act as satellite simulators, we explore how low-cost instruments can be used for satellite validation purposes. Comparisons among these different data types also have important implications for data fusion.
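The co-location calibration step described above can be sketched minimally as follows. The abstract uses artificial neural networks; this ordinary-least-squares fit is a deliberately simplified stand-in showing the same workflow (match raw low-cost readings to a co-located reference), and all data here are synthetic.

```python
# Hedged sketch: linear co-location calibration of a low-cost gas sensor
# against a regulatory-grade reference. A simplified stand-in for the ANN
# calibration described in the abstract; data are synthetic.

def fit_linear_calibration(sensor, reference):
    """Return (gain, offset) so that gain * sensor + offset approximates reference."""
    n = len(sensor)
    mx = sum(sensor) / n
    my = sum(reference) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(sensor, reference))
    sxx = sum((x - mx) ** 2 for x in sensor)
    gain = sxy / sxx
    return gain, my - gain * mx

# Synthetic co-location: the raw sensor reads half-scale with a +3 offset.
reference = [10.0, 20.0, 30.0, 40.0, 50.0]       # reference monitor (e.g. ppb)
sensor = [0.5 * r + 3.0 for r in reference]      # raw low-cost signal
gain, offset = fit_linear_calibration(sensor, reference)
calibrated = [gain * s + offset for s in sensor] # matches the reference
```

In practice an ANN replaces the linear map so that cross-sensitivities (temperature, humidity, co-pollutants) can enter as additional inputs, but the co-location fit-then-apply structure is the same.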

How to cite: Okorn, K., Iraci, L., and Hannigan, M.: Comparing Low-Cost Sensors with Ground-Based and Airborne In-Situ and Column Observations of NO2 and HCHO during the FRAPPE Field Campaign in Colorado, USA, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7839, https://doi.org/10.5194/egusphere-egu23-7839, 2023.

EGU23-8631 | Posters on site | GI1.3 | Highlight

Ambient conditions and infrared sky brightness in the Chilean Atacama Desert 

Wolfgang Kausch, Stefan Kimeswenger, Stefan Noll, and Roland Holzlöhner

The Atacama Desert in the Chilean Andes region is one of the driest areas in the world. Due to its unique location, with stable subtropical meteorological conditions and high mountains, it is an ideal site for the astronomical telescope facilities of the European Southern Observatory (ESO). The special meteorological conditions are continuously monitored at Cerro Paranal (the location of the Very Large Telescope) by measuring various parameters such as temperature, pressure, humidity, precipitable water vapour (PWV), wind speed and direction, sky radiance, and bolometric sky temperature, the latter being crucial for astronomical observations in the thermal infrared regime. ESO operates several site monitoring systems for that purpose, e.g. the ESO MeteoMonitor, the Differential Image Motion Monitor (DIMM) and a Low Humidity And Temperature PROfiler (L-HATPRO) microwave radiometer providing detailed water vapour and temperature profiles up to a height of 12 km in various directions. 


We have assembled all available data for a period of 4.5 years (2015-07-01 through 2019-12-31) and created a unique data set from it. This period also covers the strong El Niño event at the end of 2015. In this poster we present statistical results on the overall conditions and trends, and compare our measurements of the nocturnal sky brightness with an empirical model as a function of the ambient temperature, PWV and zenith distance.

How to cite: Kausch, W., Kimeswenger, S., Noll, S., and Holzlöhner, R.: Ambient conditions and infrared sky brightness in the Chilean Atacama Desert, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8631, https://doi.org/10.5194/egusphere-egu23-8631, 2023.

Air quality monitoring networks provide invaluable data for studying human health, environmental impacts, and the effects of policy changes,  but obtaining high quality data can be costly, with each site in a monitoring network requiring instrumentation and skilled operator time. It is therefore important to ensure that each monitor in the network is providing unique data to maximize the value of the entire network.  Differences in measurement approaches for the same chemical between monitoring stations may also result in discontinuities in the network data.  Both of these factors suggest the need for objective, machine-learning methodologies for monitoring network data analysis.   

Air quality models are another valuable tool to augment monitoring networks. The models simulate air quality over a large region where monitoring may be sparse. The gridded output from air-quality models thus contains inherent information on the similarity of sources, chemical oxidation pathways and removal processes for chemicals of interest, provided appropriate tools are available to identify these similarities on a gridded basis. The output from these models can be immense, again requiring the use of special, highly optimized tools for post-processing analysis.

Spatiotemporal clustering is a family of techniques that have seen widespread use in air quality, whereby time-series taken at different locations are grouped based on the level of similarity between time-series data within the dataset. Hierarchical clustering is one such algorithm, which has the advantage of not requiring an a priori assumption about how many clusters there might be (unlike K-means). However, traditional approaches to hierarchical clustering become computationally expensive as the number of time-series increases, resulting in prohibitive computational costs when the total number of time-series to be compared rises above 30,000, even on a supercomputer. Similarly, the comparison and clustering of large numbers of discrete data (such as multiple mass spectrometer data sampled at high time resolution from a moving laboratory platform) becomes computationally prohibitive using conventional methods.

In this study we present a high-performance hierarchical clustering algorithm which is able to run in parallel over many nodes on massively parallel computer systems, thus allowing for efficient clustering of very large monitoring network and model output datasets. The new high-performance program is able to cluster 290,000 annual time series (from either monitoring network data or gridded model output) in 13 hours on 800 nodes. We present here some example results showing how the algorithm can be used to analyse very large datasets, providing new insights into “airsheds” depicting regions of similar chemical origin and history, and into different spatial regimes for nitrogen, sulphur, and base cation deposition. These analyses show how different processes control each species at different potential monitoring site locations, via cluster-generated airshed maps for each species. The efficiency and flexibility of the algorithm allow for extremely large datasets to be analysed in hours of wall-clock time instead of weeks or months. The new algorithm is being used as the numerical engine for a new tool for the analysis of EU monitoring network data. 
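The serial baseline that such a parallel algorithm replaces can be sketched as follows: a minimal pure-Python average-linkage agglomerative clustering of time series, the roughly O(n³) approach that becomes prohibitive beyond ~30,000 series. The series values and merge cut-off are illustrative, not data from the study.

```python
# Hedged sketch: serial average-linkage hierarchical clustering of
# time series, the conventional baseline the abstract's parallel
# implementation is designed to accelerate. Synthetic data only.

def euclidean(a, b):
    """Distance between two equal-length time series."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def average_linkage(c1, c2, series):
    """Mean pairwise distance between the members of two clusters."""
    return sum(euclidean(series[i], series[j])
               for i in c1 for j in c2) / (len(c1) * len(c2))

def hierarchical_cluster(series, cutoff):
    """Agglomerate singletons until the closest pair exceeds cutoff."""
    clusters = [[i] for i in range(len(series))]
    while len(clusters) > 1:
        d, ia, ib = min((average_linkage(a, b, series), ia, ib)
                        for ia, a in enumerate(clusters)
                        for ib, b in enumerate(clusters) if ia < ib)
        if d > cutoff:
            break
        clusters[ia] = clusters[ia] + clusters[ib]
        del clusters[ib]
    return clusters

# Four synthetic "monitoring sites": two pairs with similar patterns.
series = [[1, 2, 3], [1.1, 2.1, 3.1], [10, 9, 8], [10.2, 9.1, 8.3]]
groups = hierarchical_cluster(series, cutoff=2.0)  # -> two clusters
```

Every merge step rescans all cluster pairs, which is what drives the cubic cost; distributing that pair scan across nodes is the essence of the high-performance approach described above.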

How to cite: Lee, C., Makar, P., and Soares, J.: Spatio-temporal clustering on a high-performance computing platform for high-resolution monitoring network analysis, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8841, https://doi.org/10.5194/egusphere-egu23-8841, 2023.

Twenty-two cost-efficient (aka ‘low-cost’) commercially available particulate matter (PM) measurement devices were installed in a diverse urban area in Leipzig, Germany. The instruments mostly measure PM2.5, some additionally PM10, and are equipped with methods for quality assurance such as conditioning to a defined temperature and regular internal calibration. In order to investigate the spread between the instruments and to enable a pre-campaign calibration, all instruments were set up together in the laboratory and in outside air and compared against the same reference measurements.

The measurement network has been installed since July 2022. It covers roughly 2x2 km2 and encompasses different urban features such as residential and commercial buildings, important main roads, city parks, and small open building gaps. Within the network there is an official air quality monitoring station located directly at a main road. In addition, instruments were installed at two further official monitoring stations as well as at observation stations of the Leibniz Institute for Tropospheric Research to study the long-term performance, the dependence on meteorological conditions, and the comparison to reference measurements. The measurements will take place until the end of 2023.

After calibration, the cost-efficient instruments generally perform quite well. In particular for higher PM loads > 10 µg m-3, the agreement with the references is mostly satisfying. However, under very high relative humidity and cold temperatures, some instruments failed to condition the sampled air sufficiently. Despite these difficulties, the chosen instruments have the potential for application in monitoring air quality limit values, i.e., answering the question of how often certain limits are exceeded.
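Counting such exceedances from calibrated daily means is straightforward; a minimal pandas sketch (the values are illustrative; the 50 µg m-3 daily PM10 limit with at most 35 exceedance days per year follows the EU Ambient Air Quality Directive):

```python
import pandas as pd

# Illustrative daily-mean PM10 values (µg/m³); real values would come from the network
days = pd.date_range("2022-01-01", periods=10, freq="D")
daily_mean = pd.Series([30, 55, 48, 62, 20, 75, 49, 51, 40, 33], index=days)

# EU AAQD: the 50 µg/m³ daily limit may be exceeded on at most 35 days per year
exceedances = int((daily_mean > 50).sum())
print(f"{exceedances} of {len(daily_mean)} days exceed the 50 µg/m³ limit")  # → 4 of 10 days
```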

Furthermore, differences between local features in the observation area could be observed, e.g., in the diurnal cycle as well as in peak and mean concentrations.

This work is co-financed with tax funds on the basis of the budget passed by the Saxon State Parliament (funding number 100582357).

How to cite: Schrödner, R., Alas, H., and Voigtländer, J.: Application of cost-efficient particulate matter measurement devices in an urban network and comparison to state-of-the-art air quality monitoring, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-9356, https://doi.org/10.5194/egusphere-egu23-9356, 2023.

EGU23-9537 | Posters on site | GI1.3

The Global Environmental Measurement and Monitoring Initiative – An International Network for Local Impact 

Daniel Klingenberg, D. Michelle Bailey, David Lang, and Mark Shimamoto

The Global Environmental Measurement and Monitoring (GEMM) Initiative is an international project of Optica and the American Geophysical Union seeking to provide precise and usable environmental data for local impact. The Initiative brings together science, technology, and policy stakeholders to address critical environmental challenges and provide solutions to inform policy decisions on greenhouse gases (GHGs) and air and water quality. GEMM Centers are currently established in Scotland, Canada, New Zealand, and the United States. These Centers represent partnerships with leading institutions that are actively working toward developing or deploying new measurement technology and improved climate models. Additional Centers are under development in India and Australia with plans to expand to Asia and Africa.

In addition to establishing monitoring centers worldwide, GEMM actively engages with other sectors (including industry, standards organizations, and regional or national governments) to support the incorporation or adoption of these evidence-based approaches into decision making processes. For example, Glasgow, Scotland is piloting the GEMM Urban Air Project, deploying a low-cost, real-time, ground-based network of devices that continuously monitors GHGs and air pollutants at a neighborhood scale. The sensor network in Glasgow is increasing the precision of local models that can provide the city with information to assess current policies and support future action. Here we will share the progress and outputs of the GEMM Initiative to date and highlight paths forward to grow the network.

How to cite: Klingenberg, D., Bailey, D. M., Lang, D., and Shimamoto, M.: The Global Environmental Measurement and Monitoring Initiative – An International Network for Local Impact, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-9537, https://doi.org/10.5194/egusphere-egu23-9537, 2023.

Since the discovery of the implication of chlorofluorocarbons (CFCs) in stratospheric ozone destruction, the Montreal Protocol (1987) has aimed at controlling the production of CFCs and other ozone-depleting substances (ODS) in order to protect and eventually recover the ozone layer. Consequently, temporary substitutes for CFCs have been developed and produced by industry. The first substitute molecules were hydrochlorofluorocarbons (HCFCs), which have smaller ozone depletion potentials (ODPs) than CFCs since their atmospheric lifetimes are shorter. Nevertheless, HCFCs still contain chlorine atoms and hence also deplete stratospheric ozone, requiring them to be banned in turn. Thus, chlorine-free molecules, i.e. hydrofluorocarbons (HFCs) such as CH2FCF3 (HFC-134a), were introduced to replace both CFCs and HCFCs. Although HFCs do not contribute to ozone depletion, they are very powerful greenhouse gases with high global warming potentials (GWPs). Consequently, the Kigali Amendment (2016) to the Montreal Protocol aims for their phase-out.

The atmospheric concentrations of CFCs have decreased in response to the phase-out and ban of their production by the Montreal Protocol and its subsequent amendments, while the HCFCs burden is now leveling off. In contrast, the atmospheric concentrations of HFCs have increased notably in the last two decades.

We present the first retrievals of HFC-134a from Fourier Transform Infra-Red (FTIR) solar spectra obtained at a remote site of the Network for the Detection of Atmospheric Composition Change (NDACC.org): the Jungfraujoch station (Swiss Alps). We discuss the applicability of our retrieval strategy to other NDACC sites, for future quasi-global monitoring from ground-based observations. We further perform first comparisons with other datasets such as ACE-FTS satellite observations.

 

How to cite: Pardo Cantos, I. and Mahieu, E.: First HFC-134a retrievals and analysis of long-term trends from FTIR solar spectra above NDACC network stations: the Jungfraujoch case, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11033, https://doi.org/10.5194/egusphere-egu23-11033, 2023.

Monitoring networks able to effectively provide high-frequency geochemical data for characterizing the behavior of the main greenhouse gases (i.e., CO2 and CH4) and pollutants (e.g., heavy metals) are crucial tools for assessing air quality and its role in climate change. However, deploying measurement stations dedicated to monitoring gas species and particulate matter in polluted areas is complicated by the high cost of their set-up and maintenance. In the last decade, traditional instruments have tentatively been coupled with low-cost sensors to improve the spatial coverage and temporal resolution of air quality surveys. The main concerns of this new approach regard the in-field accuracy of the low-cost sensors, which depends significantly on: (i) cross-sensitivities to other atmospheric pollutants, (ii) environmental parameters (e.g., relative humidity and temperature), and (iii) detector signal degradation over time.

This study presents the results of a geochemical survey carried out in the Greve River Basin (Chianti territory, Central Italy) from May to September 2022 by adopting two measuring strategies: (i) deployment of a mobile station along predefined transects within the Greve valley, equipped with a Picarro G2201-i analyzer to measure CO2 and CH4 concentrations and δ13C-CO2 and δ13C-CH4 values (‰ vs. V-PDB) by Wavelength-Scanned Cavity Ring-Down Spectroscopy (WS-CRDS); (ii) continuous monitoring of CO2 and CH4 concentrations at five fixed sites positioned at different altitudes through prototype low-cost stations, coupled with atmospheric deposition and rain samplers to collect particulate samples for chemical lab analysis. The low-cost monitoring stations housed (i) a non-dispersive infrared (NDIR) sensor for CO2 concentrations, (ii) a solid-state metal oxide sensor (MOS) for CH4 concentrations, (iii) a laser light scattering sensor (LSP) for PM2.5 and PM10 concentrations, and (iv) a sensor for air temperature and relative humidity. The CO2 and CH4 sensors were calibrated in-field based on parallel measurements with the Picarro G2201-i, elaborating the calibration data with a Random Forest machine-learning algorithm.
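The abstract does not detail the Random Forest setup; a minimal scikit-learn sketch of such an in-field calibration is given below, with synthetic co-located data and illustrative feature choices (raw sensor signal, temperature, relative humidity), not the study's actual implementation:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500

# Illustrative co-located data: raw low-cost NDIR signal plus the environmental
# covariates known to bias it (temperature, relative humidity)
raw_co2 = rng.uniform(400, 800, n)   # raw sensor reading (ppm)
temp = rng.uniform(5, 35, n)         # air temperature (°C)
rh = rng.uniform(20, 95, n)          # relative humidity (%)
# Synthetic "reference" (Picarro-like) values with a sensor bias to be learned
ref_co2 = raw_co2 * 0.9 + 0.5 * temp - 0.2 * rh + rng.normal(0, 2, n)

X = np.column_stack([raw_co2, temp, rh])
X_train, X_test, y_train, y_test = train_test_split(X, ref_co2, random_state=0)

# Fit the calibration model on the parallel-measurement period,
# then score it on held-out data
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
r2 = model.score(X_test, y_test)
print(f"held-out R^2 of the calibration model: {r2:.3f}")
```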

The measurements carried out along the transects showed that the downstream areas next to the metropolitan city of Florence were affected by the highest concentrations of CO2 and CH4, marked by isotopic signatures revealing a clear anthropogenic origin, mainly ascribed to vehicular traffic. The distribution of these carbon species reflected the evolution of the atmospheric boundary layer, displaying higher concentrations during the early morning, when gas accumulation occurred due to stable atmospheric conditions, and lower concentrations during daytime, when the heating of the surface favored the dilution of air pollutants through the establishment of convective turbulence. These observations were confirmed by the network of low-cost stations, which allowed the distribution of atmospheric pollutants to be monitored simultaneously at different altitudes in the valley. The distribution of particulate matter was consistent with that of the gaseous species, and the main sources were clearly distinguished based on the chemical composition of the atmospheric deposition at the collection sites. The promising results of the present study could translate into an affordable approach to effectively improve air quality monitoring strategies and support data-driven policy actions to reduce carbon emissions.

How to cite: Biagi, R., Ferrari, M., Tassi, F., and Venturi, S.: Multi-instrumental approach for air quality monitoring: characterization and distribution of greenhouse gases and atmospheric metal deposition in the Greve River Basin (Chianti territory, Central Italy)., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11385, https://doi.org/10.5194/egusphere-egu23-11385, 2023.

EGU23-13997 | ECS | Posters on site | GI1.3

Correction, gap filling and homogenization on daily level of the historical DMI station network temperature data 

Dina Rapp, Bo Møllesøe Vinther, Jacob L. Høyer, and Eigil Kaas

As climate change is amplified in the Arctic, it is crucial to have temperature records of high temporal resolution and quality in this area. This will improve understanding of the physical mechanisms involved, the assessment of past changes, and predictions of future temperature development in the Arctic. In this study, temperature measurements from the DMI Greenland station network spanning 1784 to the present day are corrected, gap-filled and homogenized at daily resolution. Currently, homogenized data are only available at monthly resolution, and the more recent data have not been homogenized. The data are used for purposes such as assessment and prediction of the surface mass balance of the Greenland Ice Sheet, temperature/climate reanalyses, and validation of proxy data.

This study presents a method for improving the calculation of daily average temperatures over the current practice of averaging the available measurements without considering what time of day they are from or how they are distributed. The method is based on a moving average that takes into account time of day, time of year, and the latitude/longitude of the station in question. An estimate of the related uncertainty is also calculated. Following the generation of daily average temperatures, different gap-filling methods are tested and compared: simple gap filling by linear interpolation with other stations, single-station temporal linear interpolation, and the Maximum Entropy Method (MEM). Finally, homogenization at daily level is performed. These steps will in turn also improve the monthly and annual average temperatures for the DMI Greenland station network.
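As an illustration of the first gap-filling approach (linear interpolation against another station), a minimal numpy sketch with synthetic data; the regression form, gap placement and noise levels are illustrative assumptions, not the study's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 365

# Illustrative daily mean temperatures at two nearby stations (°C);
# the target tracks its neighbour with a scale, an offset, and noise
neighbour = 10 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 1, n)
target = 0.95 * neighbour - 1.5 + rng.normal(0, 0.5, n)

# Knock out a 30-day gap in the target record
gap = slice(100, 130)
observed = target.copy()
observed[gap] = np.nan

# Fit target = a * neighbour + b on the overlapping (non-gap) days...
mask = ~np.isnan(observed)
a, b = np.polyfit(neighbour[mask], observed[mask], 1)

# ...and fill the gap from the neighbour's values
filled = observed.copy()
filled[gap] = a * neighbour[gap] + b

rmse = np.sqrt(np.mean((filled[gap] - target[gap]) ** 2))
print(f"gap-fill RMSE against the withheld truth: {rmse:.2f} °C")
```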

How to cite: Rapp, D., Møllesøe Vinther, B., L. Høyer, J., and Kaas, E.: Correction, gap filling and homogenization on daily level of the historical DMI station network temperature data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13997, https://doi.org/10.5194/egusphere-egu23-13997, 2023.

The Global Atmosphere Watch (GAW) Programme was established in 1989 in recognition of the need for improved scientific understanding of the increasing influence of human activities on atmospheric composition and the subsequent societal impacts. It is implemented as an activity of the World Meteorological Organization, a specialized agency of the United Nations system, and is funded by the organization's member countries.

As an international programme, GAW supports a broad spectrum of applications, from atmospheric composition-related services to contributions to environmental policy. Examples of the latter include the provision of a comprehensive set of high-quality, long-term, globally harmonized observations and analyses of atmospheric composition for the United Nations Framework Convention on Climate Change (UNFCCC), the Montreal Protocol on Substances that Deplete the Ozone Layer and its follow-up amendments, and the Convention on Long-Range Transboundary Air Pollution (CLRTAP).

The programme includes six focal areas: Greenhouse Gases, Ozone, Aerosols, Reactive Gases, Total Atmospheric Deposition and Solar Ultraviolet Radiation.

The surface-based observational network of the programme includes Global (31 stations) and Regional (about 400 stations) stations where observations of various GAW parameters take place. These stations are complemented by regular ship cruises and various contributing networks. All observations are linked to common reference standards, and the observational data are made available at seven designated World Data Centres (WDCs).

Surface-based observations are complemented by airborne and space-based observations that help to characterize the upper troposphere and lower stratosphere, with regards to ozone, solar radiation, aerosols, and certain trace gases.

Requirements to become a GAW station are detailed in the GAW Implementation Plan 2016-2023 (WMO, 2017). A new IP is in preparation; its four strategic objectives will be presented.

GAW quality management comprises Data Quality Objectives, Measurement Guidelines, Standard Operating Procedures and Data Quality Indicators. Common quality assurance principles apply throughout the programme, including requirements for the long-term sustainability of the observations, the use of one network standard for each variable, and the implementation of measurement practices that satisfy the set data quality objectives. GAW implements an open data policy and requires that observational data be made available in the dedicated data centres operated by WMO Member countries.

The programme relies on different types of central facilities: Central Calibration Laboratories, Quality Assurance/Science Activity Centres, and World and Regional Calibration Centres, which are directly supported and implemented by individual Member countries for the global services.

The majority of the recommendations regarding measurement and quality assurance procedures are developed by expert and advisory groups within the programme; these often rely on expertise within the contributing networks and collaborating organizations, such as the Aerosol, Clouds and Trace Gases Research Infrastructure (ACTRIS) or the Integrated Carbon Observation System (ICOS).

One of the GAW priorities is to expand and strengthen partnerships with contributing networks through the development of statements and strategies that articulate the mutual benefits of the collaborations and streamline the processes of data reporting and the exchange of QA standards and metadata. This involves collaboration with national and regional environmental protection agencies and the development of harmonized metadata, data exchange, and quality information.

How to cite: Moreno, S.: The WMO Global Atmosphere Watch Programme new implementation plan and strategic objectives, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14442, https://doi.org/10.5194/egusphere-egu23-14442, 2023.

EGU23-14459 | ECS | Orals | GI1.3

Developing and testing a validation procedure to successfully use on-the-move sensors in urban environments 

Francesco Barbano, Erika Brattich, Carlo Cintolesi, Juri Iurato, Vincenzo Mazzarella, Massimo Milelli, Abdul Ghafoor Nizamani, Maryam Sarfraz, Antonio Parodi, and Silvana Di Sabatino

With the increasing attempt to empower citizens and civil society in promoting virtuous behaviours and relevant climate actions, novel user-friendly and low-cost tools and sensors are nowadays being developed and distributed on the market. Most of these sensors are typically easy to install with a ready-to-use system, while measured data are automatically uploaded to a mobile application or a web dashboard, which also guarantees secure and open access to measurements gathered by other users. However, data quality and the calibration of these sensors are often ensured against research-grade instrumentation only in the laboratory and rarely in real-world measurements. The discrepancies arising between these low-cost sensors and research-grade instrumentation are such that the former might be impossible to use unless a validation (and re-calibration, if needed) under environmental conditions is performed. Here we propose a validation procedure applied to the MeteoTracker, a recently developed portable sensor for monitoring atmospheric quantities on the move. The ultimate scope is to develop and implement a general procedure to test and validate the quality of MeteoTracker data and to compile user guidelines tailored to on-the-move sensors. The result will establish the feasibility of using the MeteoTracker (and potentially other on-the-move sensors) to complement the existing monitoring networks on the territory, improve local atmospheric data coverage, and support informed decisions by the authorities. The procedure includes multi-sensor testing of all the sensor functionalities, validation of data simultaneously acquired by several sensors under similar conditions, and methods and applications of comparisons with research-grade instruments.
The first usage of the MeteoTracker will also be presented for different geographical contexts where the sensors will be used for citizen science activities and to develop a monitoring network of selected Essential Variables within the HORIZON-EU project I-CHANGE (Individual Change of HAbits Needed for Green European transition).

How to cite: Barbano, F., Brattich, E., Cintolesi, C., Iurato, J., Mazzarella, V., Milelli, M., Nizamani, A. G., Sarfraz, M., Parodi, A., and Di Sabatino, S.: Developing and testing a validation procedure to successfully use on-the-move sensors in urban environments, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14459, https://doi.org/10.5194/egusphere-egu23-14459, 2023.

EGU23-15087 | Posters on site | GI1.3

Applications of an advanced clustering tool for EU AQ monitoring network data analysis 

Joana Soares, Christoffer Stoll, Islen Vallejo, Colin Lee, Paul Makar, and Leonor Tarrasón

Air quality monitoring networks provide invaluable data for studying human health, environmental impacts, and the effects of policy changes. In a European legislative context, the data collected constitute the basis for reporting air quality status and exceedances under the Ambient Air Quality Directives (AAQD), following specific requirements. Consequently, the network's representativity and its ability to accurately assess the air pollution situation in European countries become a key issue. The combined use of models and measurements is currently understood as the most robust way to map the status of air pollution in an area, allowing quantification of both the spatial and temporal distribution of pollution. This spatial-temporal information can be used to evaluate the representativeness of the monitoring network and to support air quality monitoring design using hierarchical clustering techniques.

The hierarchical clustering methodology applied in this context can be used as a screening tool to analyse the level of similarity or dissimilarity of the air concentration data (time series) within a monitoring network. Hierarchical clustering assumes that the data contain a level of (dis)similarity and groups the station records based on the characteristics of the actual data. The advantage of this type of clustering is that it does not require an a priori assumption about how many clusters there might be, but it can become computationally expensive as the number of time series increases. Three dissimilarity metrics are used to establish the level of similarity (or dissimilarity) of the different air quality measurements across the monitoring network: (1) 1-R, where R is the Pearson linear correlation coefficient; (2) the Euclidean distance (EuD); and (3) the multiplication of metrics (1) and (2). The metric based on correlation assesses dissimilarities associated with changes in the temporal variation of concentration. The metric based on the EuD assesses dissimilarities based on the magnitude of the concentration over the period analysed. The multiplication of these two metrics, (1-R) x EuD, assesses both temporal variation and pollution levels, and has been demonstrated to be the most useful metric for monitoring network optimization.
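The combined metric lends itself to standard hierarchical clustering routines; a minimal numpy/scipy sketch (synthetic station records, average linkage and the cluster count are illustrative choices, not the tool's actual implementation):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
t = np.arange(240)

# Synthetic station records: two temporal regimes, the second at higher levels
group_a = [np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.2, t.size) for _ in range(5)]
group_b = [np.cos(2 * np.pi * t / 24) + 5 + rng.normal(0, 0.2, t.size) for _ in range(5)]
series = np.array(group_a + group_b)

# Metric (3): (1 - R) x EuD, combining temporal-shape and magnitude dissimilarity
r = np.corrcoef(series)  # pairwise Pearson correlation between station records
eud = np.sqrt(((series[:, None, :] - series[None, :, :]) ** 2).sum(axis=2))
dist = (1.0 - r) * eud
np.fill_diagonal(dist, 0.0)

# Agglomerative (hierarchical) clustering on the condensed distance matrix
labels = fcluster(linkage(squareform(dist, checks=False), method="average"),
                  t=2, criterion="maxclust")
print(labels)  # stations 0-4 and 5-9 fall into two separate clusters
```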

This study presents the MoNET webtool, developed on the basis of the hierarchical clustering methodology. The webtool aims to provide an easy solution for member states to quality-control the data reported, as a tier-2 level check, and to evaluate the representativeness of the air quality networks reporting under the AAQD. Some examples from the ongoing evaluation of the monitoring site classification, carried out as a joint exercise under the Forum for Air Quality Modeling (FAIRMODE) and the National Air Quality Reference Laboratories Network (AQUILA), are presented to show the usability of the tool. MoNET should be able to identify outliers, i.e., issues with the data or data series with very specific temporal-magnitude profiles, and to distinguish, e.g., pollution regimes within a country (and whether they resemble the air quality zones required by the AAQD and set by the member states), stations monitoring high-emitting sources, and background regimes vs. local-source-driven pollution regimes in cities.

How to cite: Soares, J., Stoll, C., Vallejo, I., Lee, C., Makar, P., and Tarrasón, L.: Applications of an advanced clustering tool for EU AQ monitoring network data analysis, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15087, https://doi.org/10.5194/egusphere-egu23-15087, 2023.

EGU23-15609 | ECS | Posters on site | GI1.3

A compact and customisable street-level sensor system for real-time weather monitoring and outreach in Freiburg, Germany 

Gregor Feigel, Marvin Plein, Matthias Zeeman, Ferdinand Briegel, and Andreas Christen

Climate adaptation and emergency management are major challenges in cities that benefit from the incorporation of real-time weather, air quality, differential exposure and vulnerability data. We therefore need systems that allow us to map, for example, localised thermal heat stress, heavy precipitation events or air quality across cities, spatially resolved and at high temporal resolution. Key to the assessment of average conditions and weather extremes in cities are systems capable of resolving intra-urban variabilities and microclimates at the level of people, hence in the urban canopy layer at street level. Placing sensors at street level, however, is challenging: sensors need to be small, rugged and safe, and they must measure a number of quantities within limited space. Such systems should ideally require little or no external power, provide remote accessibility, sensor interoperability and real-time data transfer, and must be cost-effective for mass deployment. Since these characteristics, together with a wide spectrum of observed variables, are not available in current commercial sensor network solutions, we designed and implemented a custom, partly in-house developed, two-tiered sensor system for mounting at 3 m height on city-owned street lights in Freiburg, Germany.

Our two-tiered sensor network, consisting of fifteen fully self-developed, cost-effective “Tier-I stations” and 35 commercial “Tier-II stations” (LoRAIN, Pessl Instruments GmbH), aims to fill these gaps and to provide a modular, user-friendly wireless sensor network (WSN) with high spatial density and temporal resolution for research, practical applications and the general public. The Tier-I stations were designed and optimised from the ground up, including the printed circuit board (PCB), for temporally high-resolution WSNs; they support a wide range of sensors and are expandable. The core of the system is a low-power embedded computer (Raspberry Pi Zero) running custom multithreaded generic logging and remote-control software that stores the data locally and transmits it via GSM to a custom Vapor-based TCP server. The software also features system monitoring and error detection functions, as well as remote logging. The setup can easily be expanded on the fly by adding predefined sensors to a configuration file. For better modularity, each station registers itself on the server and is automatically integrated into all further processes, and vice versa. Custom frontends as well as bidirectional communication and task distribution protocols enable remote access and across-node interaction, resulting in an easier-to-maintain system.

In addition to the air temperature, humidity and precipitation measured by the Tier-II stations, the Tier-I stations feature a ClimaVUE 50 all-in-one weather sensor and a BlackGlobe (Campbell Scientific, Inc.) that provide data on wind, pressure, lightning, solar radiation and black-globe temperature. This allows thermal comfort indices to be calculated in real time. A webpage and the self-developed “uniWeather” iOS app and API offer near-real-time data access and data interpretation for stakeholders and public outreach.
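One common building block for such indices is the mean radiant temperature, which can be estimated from black-globe and air temperature plus wind speed via the ISO 7726 forced-convection formula. A minimal sketch, assuming the standard 0.15 m globe diameter and an emissivity of 0.95 (the abstract does not state which indices or constants the stations actually use):

```python
def mean_radiant_temperature(t_globe, t_air, wind, emissivity=0.95, diameter=0.15):
    """Mean radiant temperature (°C) from a black globe, using the ISO 7726
    forced-convection formulation; diameter in m, wind speed in m/s."""
    return ((t_globe + 273.0) ** 4
            + 1.1e8 * wind ** 0.6 / (emissivity * diameter ** 0.4)
            * (t_globe - t_air)) ** 0.25 - 273.0

# With no globe-air difference the globe temperature equals the radiant temperature
print(round(mean_radiant_temperature(30.0, 30.0, 1.0), 1))  # → 30.0
# A sunlit globe warmer than the air implies a higher mean radiant temperature
print(round(mean_radiant_temperature(40.0, 30.0, 1.0), 1))
```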

How to cite: Feigel, G., Plein, M., Zeeman, M., Briegel, F., and Christen, A.: A compact and customisable street-level sensor system for real-time weather monitoring and outreach in Freiburg, Germany, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15609, https://doi.org/10.5194/egusphere-egu23-15609, 2023.

EGU23-16779 | Orals | GI1.3 | Highlight

An integrated meteorological forecasting system for emergency response 

Alexander Haefele, Maxime Hervo, Philipp Bättig, Daniel Leuenberger, Claire Merker, Daniel Regenass, Pirmin Kaufmann, and Marco Arpagaus

EMER-Met is the new meteorological forecasting system for the protection of the population in Switzerland. It provides the meteorological basis for coping with all types of emergencies, especially nuclear and chemical accidents. EMER-Met consists of a dedicated upper-air measurement network and a high-resolution numerical weather prediction model. The measurement network is composed of state-of-the-art remote sensing instruments measuring accurate wind and temperature profiles in the boundary layer. At three sites, a PCL1300 radar wind profiler, a Windcube-200s Doppler lidar and a Hatpro-G5 microwave radiometer are installed. The data from the measurement network are assimilated into the operational 1-km ensemble numerical weather prediction (NWP) system. In the case of the microwave radiometers, we assimilate the brightness temperatures using an adapted version of the RTTOV observation operator. To ensure the best impact on the NWP results, the data quality of the measurements is of high importance and is monitored closely on a daily and monthly basis against radiosondes and the NWP model itself. EMER-Met has been operational since 2022 and, to the best of our knowledge, it is the first time that brightness temperatures measured by surface-based microwave radiometers are assimilated operationally. This presentation will focus on the upper-air network performance and its impact on NWP.

How to cite: Haefele, A., Hervo, M., Bättig, P., Leuenberger, D., Merker, C., Regenass, D., Kaufmann, P., and Arpagaus, M.: An integrated meteorological forecasting system for emergency response, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16779, https://doi.org/10.5194/egusphere-egu23-16779, 2023.

EGU23-17535 | ECS | Orals | GI1.3

ACTRIS - CiGas side-by-side interlaboratory comparison of new and classical techniques for formaldehyde measurement in the nmol/mol range 

Therese Salameh, Emmanuel Tison, Evdokia Stratigou, Sébastien Dusanter, Vincent Gaudion, Marina Jamar, Ralf Tillmann, Franz Rohrer, Benjamin Winter, Teresa Verea, Amalia Muñoz, Fanny Bachelier, Véronique Daele, and Audrey Grandjean

Formaldehyde is an important hazardous air pollutant, classified as carcinogenic to humans by the International Agency for Research on Cancer (IARC). It is emitted directly by many anthropogenic and natural sources and formed as a secondary product of the photo-oxidation of volatile organic compounds (VOCs). Formaldehyde is also a significant source of radicals in the atmosphere, contributing to the formation of ozone and secondary organic aerosols. Routine measurements of formaldehyde in regulatory networks within Europe (EMEP) and the USA (EPA Compendium Method TO-11A) rely on sampling with DNPH (2,4-dinitrophenylhydrazine)-impregnated silica cartridges, followed by analysis with high-performance liquid chromatography (HPLC).

In the framework of the EURAMET-EMPIR project « MetClimVOC » (Metrology for climate-relevant volatile organic compounds: http://www.metclimvoc.eu/), the IMT Nord Europe unit (France) of the European ACTRIS (Aerosol, Cloud and Trace Gases Research InfraStructure: https://www.actris.eu/) Topical Centre for Reactive Trace Gases in-situ Measurements (CiGas) organized a side-by-side intercomparison campaign in Douai, France, dedicated to formaldehyde measurement in a low amount-fraction range of 2 - 20 nmol/mol, from 30 May to 8 June 2022. The objectives of the intercomparison were to evaluate the instruments' metrological performance under the same challenging conditions and to build best practices and instrumental knowledge.

Here, we present the results from the intercomparison, where ten instruments belonging to seven different techniques were challenged with the same formaldehyde gas mixture generated either from a cylinder or from a permeation system, in different conditions (amount fractions, relative humidity, interference, blanks, etc.), flowing through a high-flow (up to 50 L/min) Silcosteel-coated manifold. The advantages/drawbacks of the techniques will be discussed.

How to cite: Salameh, T., Tison, E., Stratigou, E., Dusanter, S., Gaudion, V., Jamar, M., Tillmann, R., Rohrer, F., Winter, B., Verea, T., Muñoz, A., Bachelier, F., Daele, V., and Grandjean, A.: ACTRIS - CiGas side-by-side interlaboratory comparison of new and classical techniques for formaldehyde measurement in the nmol/mol range, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-17535, https://doi.org/10.5194/egusphere-egu23-17535, 2023.

EGU23-229 | ECS | Posters virtual | ESSI3.5

A web-based strategy to reuse grids in geographic modeling 

Yuanqing He, Min Chen, Yongning Wen, and Songshan Yue

Integrated application of geo-analysis models is critical for geo-process research. Because the real world is continuous, a geo-analysis model cannot be applied directly over the entire space. To date, the approach of regarding space as a collection of computing units (i.e., grids) has been widely used in geographic research. However, models differ in their division algorithms, resulting in distinct grid data structures. First, researchers must install and configure various software packages to generate the structure-specific grid data required by the models. This localized processing is inconvenient and inefficient. Second, in order to integrate models that use grids of different structures, researchers need to design a specific conversion method for each integration scenario. Owing to differences in researchers' development habits, it is difficult to reuse such conversion methods in other runtime environments. The open and cross-platform character of web services enables users to generate data without installing software programs. It has the potential to replace the present time-consuming process of grid generation and conversion, hence increasing efficiency.

Based on the standardized model encapsulation technology proposed by the OpenGMS group, this paper presents a grid-service method tailored to the specific requirements of open geographic model integration applications. The research work is carried out in the following three areas:

  • The basic strategy of grid servitization. The heterogeneity of grid generation methods is a major factor that prevents them from being invoked in a unified way by web services. To reduce this heterogeneity, this study proposes a standardized description method based on the Model Description Language (MDL).
  • Method for constructing a grid data generation service. A unified representation approach for grid data is proposed in order to standardize the description of heterogeneous grid data; an encapsulation method for grid generation algorithms is proposed; and the grid service is realized by combining these with the main idea of grid servitization.
  • Method for constructing a grid data conversion service. A box-type grid indexing approach is provided to facilitate the retrieval of grid cells from large data volumes; two conversion types, topologically similar and topologically inaccessible grid data conversion, are summarized, along with the related conversion procedures. On this foundation, a grid conversion engine is built, using the grid-service strategy as a theoretical guide and integrating the grid conversion strategy.
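The box-type indexing idea can be sketched in a few lines; this is a minimal illustration under assumed cell and box layouts, not the authors' implementation:

```python
from collections import defaultdict

class BoxGridIndex:
    """Coarse spatial index: grid cells are binned into fixed-size boxes
    so that candidate cells for a query point can be retrieved quickly."""

    def __init__(self, box_size):
        self.box_size = box_size
        self.boxes = defaultdict(list)  # (i, j) box key -> list of cell ids

    def _key(self, x, y):
        return (int(x // self.box_size), int(y // self.box_size))

    def add_cell(self, cell_id, xmin, ymin, xmax, ymax):
        # Register the cell in every box its bounding box overlaps.
        i0, j0 = self._key(xmin, ymin)
        i1, j1 = self._key(xmax, ymax)
        for i in range(i0, i1 + 1):
            for j in range(j0, j1 + 1):
                self.boxes[(i, j)].append(cell_id)

    def candidates(self, x, y):
        # Only cells in the box containing (x, y) need exact testing.
        return self.boxes.get(self._key(x, y), [])
```

A conversion engine can then restrict exact point-in-cell tests to the few cells returned by `candidates`, instead of scanning the whole grid.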

Based on the grid-service approach proposed in this paper, researchers can generate and convert grid data without tedious steps for downloading and installing programs. More time can thus be spent on solving geographic problems, increasing efficiency.

How to cite: He, Y., Chen, M., Wen, Y., and Yue, S.: A web-based strategy to reuse grids in geographic modeling, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-229, https://doi.org/10.5194/egusphere-egu23-229, 2023.

EGU23-2744 | Orals | ESSI3.5

Provenance powered microservices: a flexible and generic approach fostering reproducible research in Earth Science 

Alessandro Spinuso, Ian van der Neut, Mats Veldhuizen, Christian Pagé, and Daniele Bailo

Scientific progress requires research outputs to be reproducible, or at least persistently traceable and analysable for defects through time. This can be facilitated by coupling analysis tools that are already familiar to scientists, with reproducibility controls designed around common containerisation technologies and formats to represent metadata and provenance. Moreover, modern interactive tools for data analysis and visualisation, such as computational notebooks and visual analytics systems, are built to expose their functionalities through the Web. This facilitates the development of integrated solutions that are designed to support computational research with reproducibility in mind, and that, once deployed onto a Cloud infrastructure, benefit from operations that are securely managed and perform reliably. Such systems should be able to easily accommodate specific requirements concerning, for instance, the deployment of particular scientific software and the collection of tailored, yet comprehensive, provenance recordings about data and processes. By decoupling and generalising the description of the environment where a particular research took place from the underlying implementation, which may become obsolete through time, we obtain better chances to recollect relevant information for the retrospective analysis of a scientific product in the long term, enhancing preservation and reproducibility of results.

In this contribution we illustrate how this is achievable via the adoption of microservice architectures combined with a provenance model that supports metadata standards and templating. We aim at empowering scientific data portals with Virtual Research Environments (VREs) and provenance services, that are programmatically controlled via high-level functions over the internet. Our system SWIRRL deals, on behalf of the clients, with the complexity of allocating the interactive services for the VREs on a Cloud platform. It runs staging and preprocessing workflows to gather and organise remote datasets, making them accessible collaboratively. We show how Provenance Services manage provenance records about the underlying environment, datasets and analysis workflows, and how these are exploited by researchers to control different reproducibility use cases. Our solutions are currently being implemented in more contexts in Earth Science. We will provide an overview on the progress of these efforts for the EPOS and IS-ENES research infrastructures, addressing solid earth and climate studies, respectively.

Finally, although the reproducibility challenges can be tackled to a large extent by modern technology, this will be further consolidated and made interoperable via the implementation and uptake of FAIR Digital Objects (FDOs). To achieve this goal, it is fundamental to establish the conversation between engineers, data stewards and researchers early in the process of delivering a scientific product. This fosters the definition and implementation of suitable best practices to be adopted by a particular research group. Scientific tools and repositories built around modern FAIR enabling resources can be incrementally refined thanks to this mediated exchange. We will briefly introduce success stories towards this goal in the context of the IPCC Assessment Reports.

How to cite: Spinuso, A., van der Neut, I., Veldhuizen, M., Pagé, C., and Bailo, D.: Provenance powered microservices: a flexible and generic approach fostering reproducible research in Earth Science, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2744, https://doi.org/10.5194/egusphere-egu23-2744, 2023.

EGU23-3006 | ESSI3.5

How AuScope 3D Geomodels Portal integrates relatively metadata poor geological models into its metadata infrastructure

The AuScope 3D Geomodels Portal is a website designed to display a variety of geological models and associated datasets and information from all over the Australian continent. The models are imported from publicly available sources, namely Australian government geological surveys and research organisations. Often the models come in the form of downloadable file packages designed to be viewed in specialised geological software applications. They usually contain enough information to view the model's structural geometry and datasets, and a minimal amount of geological textual information. Seldom do they contain substantial metadata; often they were created before the term 'FAIR' was coined or the importance of metadata had dawned upon many of us. This creates challenges for data providers and aggregators trying to maintain a certain standard of FAIR compliance across all their offerings. How can the standard of FAIR compliance of metadata extracted from these models be improved? How can these models be integrated into existing metadata infrastructure? For the Geomodels portal, these concerns are addressed within the automated model transformation software. This software transforms the source file packages into a format suitable for display in a modern WebGL-compliant browser. Owing to the nature of the model source files, only a very modest amount of metadata can be extracted. Hence other sources of metadata must be introduced. For example, the dataset provider will often publish a downloadable PDF report file or a description on a web page associated with the model. Automated textual analysis is used to extract more information from these sources. At the end of the transformation process, an ISO-compliant metadata record is created for import into a GeoNetwork catalogue. The GeoNetwork catalogue record can be used for integration with other applications. For example, AuScope's flagship portal, the AuScope Portal, displays information, download links and a geospatial footprint of models on a map. The metadata can also be displayed in the Geomodels Portal.
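As a rough illustration of this kind of automated textual analysis, the sketch below builds a minimal metadata record from free text; the field names, controlled vocabulary, and heuristics are hypothetical placeholders, not those of the Geomodels pipeline:

```python
import re

def extract_metadata(report_text, model_id):
    """Build a minimal ISO-style metadata record from free text.
    Field names, vocabulary, and heuristics are illustrative only."""
    record = {"identifier": model_id, "keywords": [], "abstract": None}
    # Take the first sentence as a stand-in abstract.
    m = re.search(r"[^.]+\.", report_text)
    if m:
        record["abstract"] = m.group(0).strip()
    # Naive keyword spotting against a small controlled vocabulary.
    vocabulary = ["gravity", "magnetics", "seismic", "borehole", "fault"]
    lowered = report_text.lower()
    record["keywords"] = [w for w in vocabulary if w in lowered]
    return record
```

In practice, the resulting dictionary would be serialised into an ISO-compliant record before import into the catalogue.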

How to cite: Fazio, V.: How AuScope 3D Geomodels Portal integrates relatively metadata poor geological models into its metadata infrastructure, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3006, https://doi.org/10.5194/egusphere-egu23-3006, 2023.

EGU23-3711 | Orals | ESSI3.5

Lessons in FAIR software from the Community Surface Dynamics Modeling System 

Gregory Tucker, Albert Kettner, Eric Hutton, Mark Piper, Tian Gan, Benjamin Campforts, Irina Overeem, and Matthew Rossi

The Community Surface Dynamics Modeling System (CSDMS) is a US-based science facility that supports computational modeling of diverse Earth and planetary surface processes, ranging from natural hazards and contemporary environmental change to geologic applications. The facility promotes open, interoperable, and shared software. Here we review approaches and lessons learned in advancing FAIR principles for geoscience modeling. To promote sharing and accessibility, CSDMS maintains an online Model Repository that catalogs over 400 shared codes, ranging from individual subroutines to large and sophisticated integrated models. Thanks to semi-automated search tools, the Repository now includes ~20,000 references to literature describing these models and their applications, giving prospective model users efficient access to information about how various codes have been developed and used. To promote interoperability, CSDMS develops and promotes the Basic Model Interface (BMI): a lightweight, language-agnostic API standard that provides control, query, and data-modification functions. BMI has been adopted by a number of academic, government, and quasi-private institutions for coupled-modeling applications. BMI specifications are provided for common scientific languages, including Python, C, C++, Fortran, and Java. One challenge lies in broader awareness and adoption; for example, self-taught code developers may be unaware of the concept of an API standard, or may not perceive value in designing around such a standard. One way to address this challenge is to provide open-source programming libraries. One such library that CSDMS curates is the Landlab Toolkit: a Python package that includes building blocks for model development (such as grid data structures and I/O functions) while also providing a framework for assembling integrated models from component parts.
We find that Landlab can greatly speed model development, while giving user-developers an incentive to follow common patterns and contribute new components to the library. However, libraries by themselves do not solve the reproducibility challenge. Rather than reinventing the wheel, the CSDMS facility has approached reproducibility by partnering with the Whole Tale initiative, which provides tools and protocols to create reproducible archives of computational research. Finally, we have found that a central challenge to FAIR modeling lies in the level of community knowledge. FAIR is a two-way street that depends in part on the technical skills of the user. Are they fluent in a particular programming language? How familiar are they with the numerical methods used by a given model? How familiar are they with underlying scientific concepts and simplifying assumptions? Are they conversant with modern version control and collaborative-development technology and practices? Although scientists should not need to become software engineers, in our experience there is a basic level of knowledge that can substantially raise the quality and sustainability of research software. To address this, CSDMS offers training programs, self-paced learning materials, and online help resources for community members. The vision is to foster a thriving community of practice in computational geoscience research, equipped with ever-improving modeling tools written by and for the community as a whole.
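To give a flavour of the BMI approach, the sketch below wraps a toy model behind a simplified subset of BMI-style control and query functions. The real specification defines a larger, fixed function set with specific signatures (e.g. values are copied into caller-supplied arrays), and the toy cooling model here is purely illustrative:

```python
class HeatBmi:
    """A toy model behind a simplified subset of the Basic Model
    Interface (BMI). Real BMI implementations expose a larger, fixed
    set of control, query, and data-modification functions."""

    def initialize(self, config=None):
        # Set up model state; real BMI takes a configuration file path.
        self.time = 0.0
        self.dt = 1.0
        self.temperature = 20.0

    def update(self):
        # Advance the model one time step (toy cooling law).
        self.temperature -= 0.5
        self.time += self.dt

    def get_current_time(self):
        return self.time

    def get_value(self, name):
        # Real BMI copies values into a caller-supplied array.
        if name == "plate_surface__temperature":
            return self.temperature
        raise KeyError(name)

    def finalize(self):
        pass

# A coupling framework can drive any BMI model with the same loop:
model = HeatBmi()
model.initialize()
while model.get_current_time() < 3.0:
    model.update()
print(model.get_value("plate_surface__temperature"))
```

Because every wrapped model answers the same calls, a framework can couple, query, and step heterogeneous models without knowing their internals.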

How to cite: Tucker, G., Kettner, A., Hutton, E., Piper, M., Gan, T., Campforts, B., Overeem, I., and Rossi, M.: Lessons in FAIR software from the Community Surface Dynamics Modeling System, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3711, https://doi.org/10.5194/egusphere-egu23-3711, 2023.

EGU23-4354 | ESSI3.5

Going FAIR by the book: Accelerating the adoption of PID-enabled good practices in software communities through reference publication

This is a report, from the chapter editor's perspective, of a high-visibility publication effort to foster the adoption of the FAIR principles (Findable, Accessible, Interoperable, Reusable) by encouraging the adoption of Persistent Identifiers (PIDs) and repository-based workflows in geospatial open source software communities as good practices. Lessons learned are detailed about how to communicate the benefits of PID adoption to software project communities focused on professional software development and meritocracy. Communication bottleneck patterns encountered, the significance of cross-project multiplicators, remaining challenges, and emerging opportunities for publishers and repository infrastructures are also reported. For the second edition of the Springer Handbook of Geographic Information, a team of scientific domain experts from several software communities was tasked to rewrite a chapter about Open Source Geographic Information Systems (DOI: 10.1007/978-3-030-53125-6_30). For this, a sample of representative geospatial open source projects was selected, based on the range of projects integrated in the OSGeo live umbrella project (DOI: 10.5281/zenodo.5884859). The chapter's authors worked in close contact with the respective open source software project communities. Since the editing and production process for the Handbook of Geographic Information was delayed due to the pandemic, this provided the opportunity to explore, improve and implement good practices for state-of-the-art PID-based citation of software projects and versions, but also of project communities, data and related scientific video resources. This was a learning process for all stakeholders involved in the publication project. At the completion of the project, the majority of the involved software projects had minted Digital Object Identifiers (DOIs) for their codebases. While the adoption level of software versioning with automated PID generation and metadata quality remains heterogeneous, the insights gained from this process can simplify and accelerate the adoption of PID-based best software community practices for other open geospatial projects according to the FAIR principles.

How to cite: Löwe, P.: Going FAIR by the book: Accelerating the adoption of PID-enabled good practices in software communities through reference publication., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-4354, https://doi.org/10.5194/egusphere-egu23-4354, 2023.

EGU23-4525 | ECS | Posters on site | ESSI3.5

GCIMS – Integration: Reproducible, robust, and scalable workflows for interoperable human-Earth system modeling 

Zarrar Khan, Chris Vernon, Isaac Thompson, and Pralit Patel

The number of models, as well as of data inputs and outputs, is continuously growing as scientists push the boundaries of the spatial, temporal, and sectoral detail being captured. This study presents the framework being developed to manage the Global Change Intersectoral Modeling System (GCIMS) ecosystem of human-Earth system models. We discuss the challenges of ensuring continuous deployment and integration, reproducibility, interoperability, containerization, and data management for the growing suite of GCIMS models. We investigate the challenges of model version control and of interoperability between models that use different software, operate on different temporal and spatial scales, and focus on different sectors. We also discuss managing transparency and accessibility of models and their corresponding data products throughout our integrated modeling lifecycle.

How to cite: Khan, Z., Vernon, C., Thompson, I., and Patel, P.: GCIMS – Integration: Reproducible, robust, and scalable workflows for interoperable human-Earth system modeling, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-4525, https://doi.org/10.5194/egusphere-egu23-4525, 2023.

EGU23-4939 * | Orals | ESSI3.5 | Highlight

Open Science: How Open is Open? 

Shelley Stall and Kristina Vrouwenvelder

Open science is transformative, removing barriers to sharing science and increasing reproducibility and transparency. The benefits of open science are maximized when its principles are incorporated throughout the research process: working collaboratively with community members and sharing data, software, workflows, samples, and other aspects of scientific research openly, where they can be reused, distributed, and reproduced. However, the paths toward Open Science are not always apparent, and there are many concepts, approaches, and tools to learn along the way.

Open Science practices lie along a continuum where researchers can make incremental adjustments to their research practices that may seem small but can have valuable benefits. Here we will share the first steps in a researcher's open science journey and how to lead your own research team in adopting Open Science practices.

How to cite: Stall, S. and Vrouwenvelder, K.: Open Science: How Open is Open?, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-4939, https://doi.org/10.5194/egusphere-egu23-4939, 2023.

EGU23-6375 | Posters on site | ESSI3.5

A machine-actionable workflow for the publication of climate impact data of the ISIMIP project 

Jochen Klar and Matthias Mengel

The Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) is a community-driven climate impact modeling initiative that aims to contribute to a quantitative and cross-sectoral synthesis of the various impacts of climate change, including associated uncertainties. ISIMIP is organized into simulation rounds, for which a simulation protocol defines a set of common scenarios. Participating modeling groups run their simulations according to these scenarios and with a common set of climatic and socioeconomic input data. The model output data are collected by the ISIMIP team at the Potsdam Institute for Climate Impact Research (PIK) and made publicly available in the ISIMIP repository. Currently, the ISIMIP repository at data.isimip.org includes data from over 150 impact models spanning 13 different sectors and comprises over 100 TB of data.

As the world's largest data archive of model-based climate impact data, ISIMIP output data is used by a very diverse audience inside and outside of academia, for all kinds of research and analyses. Special care is taken to enable persistent identification, provenance, and citability. A set of workflows and tools ensures the conformity of the model output data with the protocol and the transparent management of caveats and updates to already published data. Datasets are referenced using unique internal IDs, and hash values are stored for each file in the database.
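Per-file hash values of this kind can be computed with a short streaming routine; the sketch below is a generic stdlib illustration, and the choice of SHA-512 is an assumption rather than necessarily the algorithm ISIMIP uses:

```python
import hashlib

def file_sha512(path, chunk_size=1 << 20):
    """Compute a SHA-512 checksum by streaming the file in chunks,
    so arbitrarily large model-output files fit in constant memory."""
    digest = hashlib.sha512()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Storing such a checksum alongside each dataset record allows later downloads to be verified against the published version.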

In recent years, this process has been significantly improved by introducing a machine-readable protocol, which is version-controlled on GitHub and can be accessed over the internet. A set of software tools for quality control and data publication accesses this protocol to enforce a consistent data quality and to extract metadata. Some of the tools can be used independently by the modelling groups even before submitting the data. After the data is published in the ISIMIP repository, it can be accessed via the web or through an API (e.g. for access from Jupyter notebooks) using the same controlled vocabularies from the protocol. In order to make the data citable, DOIs for each output sector are registered with DataCite. For each DOI, a precise list of the contained datasets is maintained. If data for a sector is added or replaced, a new, updated DOI is created.

While the specific implementation is highly optimized to the peculiarities of ISIMIP, the general ideas should be transferable to other projects. In our presentation, we will discuss the various tools and how they interact to create an integrated curation and publishing workflow.

How to cite: Klar, J. and Mengel, M.: A machine-actionable workflow for the publication of climate impact data of the ISIMIP project, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6375, https://doi.org/10.5194/egusphere-egu23-6375, 2023.

EGU23-6726 | ECS | Posters on site | ESSI3.5

Data compilations for enriched reuse of sea ice data sets 

Anna Simson, Anil Yildiz, and Julia Kowalski

A vast amount of in situ cryospheric data has been collected during publicly funded field campaigns to the polar regions over the past decades. Each individual data set yields important insights into local thermo-physical processes, but they need to be assembled into informative data compilations to unlock their full potential for producing regional or global outcomes for climate-change-related research. The efficient and sustainable interdisciplinary reuse of such data compilations is of great interest to the scientific community. Yet the creation of such compilations is often challenging, as they have to be composed of often heterogeneous data sets from various data repositories. In this contribution, we focus on the reuse of data sets while generating extendible data compilations with enhanced reusability.

Data reuse is typically conducted by researchers other than the original data producers, and it is therefore often limited by the metadata and provenance information available. Reuse scenarios include the validation of physics-based process models, the training of data-driven models, or data-integrated predictive simulations. All these use cases heavily rely on a diverse data foundation in the form of a data compilation, which depends on high-quality information. In addition to metadata, provenance, and licensing conditions, the data set itself must be checked for reusability. Individual data sets containing the same metrics often differ in structure, content, and metadata, which complicates data compilation.

In order to generate data compilations for a specific reuse scenario, we propose to break down the workflow into four steps:
1) Search and selection: Searching, assessing, optimizing search, and selecting data sets.
2) Validation: Understanding and representing data sets in terms of the data collectors including structure, terms used, metadata, and relations between different metrics or data sets.
3) Specification: Defining the format, structure, and content of the data compilation based on the scope of the data sets.
4) Implementation: Integrating the selected data sets into the compilation.
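Steps 2 to 4 can be sketched as a small normalisation routine; the source names, field mappings, and unit conversions below are hypothetical placeholders for the kind of per-repository handling the workflow requires:

```python
# Each source repository uses its own column names and units; a per-source
# mapping (steps 2 and 3) normalises records into one compilation schema
# before integration (step 4). All names here are illustrative.
FIELD_MAPS = {
    "repo_a": {"depth_cm": "depth_m", "sal_psu": "salinity_psu"},
    "repo_b": {"depth": "depth_m", "salinity": "salinity_psu"},
}

UNIT_SCALES = {("repo_a", "depth_cm"): 0.01}  # cm -> m

def normalise(record, source):
    out = {"source": source}
    for field, value in record.items():
        target = FIELD_MAPS[source].get(field)
        if target is None:
            continue  # field outside the compilation's specification
        scale = UNIT_SCALES.get((source, field), 1)
        out[target] = value * scale
    return out

def compile_datasets(datasets):
    """datasets: mapping of source name -> list of raw records."""
    return [normalise(r, src) for src, records in datasets.items() for r in records]
```

Keeping the source name on every normalised record preserves a minimal provenance link back to the original data set.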

We present herein a workflow to create a data compilation from heterogeneous sea ice core data sets following the previously introduced structure. We report on obstacles encountered in the validation of data sets, mainly due to missing or ambiguous metadata. This leaves the (re)user room for subjective interpretation and thus increases the uncertainty of the compilation. Examples are challenges in relating different data repositories associated with the same location or the same campaign, the accuracy of measurement methods, and the processing stage of the data, all of which often require a bilateral iteration with the data acquisition team. Our study shows that enriching data reusability with data compilations requires quality-ensured metadata at the individual data set level.

How to cite: Simson, A., Yildiz, A., and Kowalski, J.: Data compilations for enriched reuse of sea ice data sets, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6726, https://doi.org/10.5194/egusphere-egu23-6726, 2023.

EGU23-7417 | ECS | Posters on site | ESSI3.5 | Highlight

Data-integrated executable publications for reproducible geohazards research 

Anil Yildiz and Julia Kowalski

Investigating the mechanics of physical processes involved in various geohazards, e.g. gravitational, flow-like mass movements, shallow landslides or flash floods, predicting their temporal or spatial occurrence, and analysing the associated risks clearly benefit from advanced computational process-based or data-driven models. Reproducibility is needed not only for the integrity of the scientific results, but also as a trust-building element in practical geohazards engineering. Various complex numerical models or pre-trained machine learning algorithms exist in the literature, for example, to determine landslide susceptibility in a region or to predict the run-out of torrential flows in a catchment. These use FAIR datasets with increasing frequency, for example DEM data to set up the simulation, or open-access landslide databases for training and validation purposes. However, we maintain that workflow reproducibility is not ensured simply by the FAIRness of input or output datasets. The underlying computational or machine learning model needs to be (re)structured to enable the reproducibility and replicability of every step in the workflow, so that the model can be (re)built to reproduce the same results, or (re)used to elaborate on new cases or new applications. We propose a data-integrated, platform-independent scientific model publication approach combining self-developed Python packages, Jupyter notebooks, version control, FAIR data repositories and high-quality metadata. Developing the model as a Python package guarantees that it can be run by any end-user, and defining submodules for analysis or visualisation within the package helps users build their own models upon the one presented. Publishing the manuscript as a data- and model-integrated Jupyter notebook creates a transparent application of the model, and the user can reproduce any result presented in the manuscript or in the datasets. We demonstrate our workflow with two applications from geohazards research, highlighting the shortcomings of existing frameworks and suggesting improvements for future applications.

How to cite: Yildiz, A. and Kowalski, J.: Data-integrated executable publications for reproducible geohazards research, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7417, https://doi.org/10.5194/egusphere-egu23-7417, 2023.

EGU23-7427 | Posters on site | ESSI3.5

Integrating sample management and semantic research-data management in glaciology 

Florian Spreckelsen, Henrik tom Wörden, Daniel Hornung, Timm Fitschen, Alexander Schlemmer, and Johannes Freitag

The flexible open-source research data management toolkit CaosDB is used in a diversity of fields such as turbulence physics, legal research, maritime research and glaciology. It links research data, makes it findable and retrievable, and keeps it consistent, even if the data model changes.

CaosDB is used in the glaciology department at the Alfred Wegener Institute in Bremerhaven for the management of ice core samples and related measurements and analyses. Researchers can use the system to query for ice samples linked to, e.g., specific measurements, which they can then request to borrow for further analyses. This facilitates inter-laboratory collaborative research on the same samples. The system helped to address a number of the researchers' needs, such as:

  • A revision system which intrinsically keeps track of changes to the data and of the state samples were in when certain analyses were performed.
  • Automated gathering of information for publication in a metadata repository (PANGAEA).
  • Tools for storing, displaying and querying geospatial information, and graphical summaries of all the measurements and analyses performed on an ice core.
  • Automatic data extraction and refinement into data records in CaosDB, so that users do not need to enter the data manually.
  • A state machine which guarantees certain workflows, simplifies development, and can be extended to trigger additional actions upon transitions.
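A workflow-guaranteeing state machine of this kind can be sketched as follows; the states and transitions are illustrative only, not AWI's actual sample workflow:

```python
class SampleWorkflow:
    """Minimal state machine for a sample loan; only transitions listed
    in TRANSITIONS are allowed, which guarantees the workflow order."""

    TRANSITIONS = {
        ("stored", "request"): "requested",
        ("requested", "approve"): "on_loan",
        ("requested", "reject"): "stored",
        ("on_loan", "return"): "stored",
    }

    def __init__(self):
        self.state = "stored"
        self.history = []  # recorded transitions double as an audit trail

    def trigger(self, action):
        nxt = self.TRANSITIONS.get((self.state, action))
        if nxt is None:
            raise ValueError(f"'{action}' not allowed from '{self.state}'")
        self.history.append((self.state, action, nxt))
        self.state = nxt
```

In a real system, each transition could additionally trigger side effects such as notifications or revision records.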

We demonstrate how CaosDB enables researchers to create and work with semantic data objects. We further show how CaosDB's semantic data structure enables researchers to publish their data as FAIR Digital Objects.

How to cite: Spreckelsen, F., tom Wörden, H., Hornung, D., Fitschen, T., Schlemmer, A., and Freitag, J.: Integrating sample management and semantic research-data management in glaciology, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7427, https://doi.org/10.5194/egusphere-egu23-7427, 2023.

EGU23-7532 | Posters on site | ESSI3.5

Virtual Earth Cloud: a multi-cloud framework for improving replicability of scientific models 

Mattia Santoro, Paolo Mazzetti, and Stefano Nativi

Humankind is facing unprecedented global environmental and social challenges in terms of food, water and energy security, resilience to natural hazards, etc. To address these challenges, international organizations have defined a list of policy actions to be achieved over a relatively short to medium-term timespan (e.g., the UN SDGs). The development and use of knowledge platforms is key to helping decision-makers take significant decisions and avoid potentially negative impacts on society and the environment.

Scientific models are key tools for transforming the huge amount of data currently available online into information and knowledge. Executing a scientific model (implemented as analytical software) commonly requires the discovery and use of different types of digital resources (i.e. data, services, and infrastructural resources). In the present geoscience technological landscape, these resources are generally provided by different systems (working independently from one another) utilizing Web technologies (e.g. Internet APIs, Web Services, etc.). In addition, a given scientific model is often designed and developed for execution in a specific computing environment. These are important barriers to the reproducibility, replicability, and reusability of scientific models, which are becoming key interoperability requirements for a transparent decision-making process.

This presentation introduces the Virtual Earth Cloud concept, a multi-cloud framework for the generation of information and knowledge from Big Earth Data analytics. The Virtual Earth Cloud allows the execution of computational models to process and extract knowledge from Big Earth Data in a multi-cloud environment, thus improving their reproducibility, replicability and reusability.

The development and prototyping of the Virtual Earth Cloud is carried out in the context of the GEOSS Platform Plus (GPP) project, funded by the European Union's Horizon 2020 Framework Programme, which aims to contribute to the implementation of the Global Earth Observation System of Systems (GEOSS) by evolving the European GEOSS Platform components to allow access to tailor-made information and actionable knowledge.

How to cite: Santoro, M., Mazzetti, P., and Nativi, S.: Virtual Earth Cloud: a multi-cloud framework for improving replicability of scientific models, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7532, https://doi.org/10.5194/egusphere-egu23-7532, 2023.

EGU23-8321 | Orals | ESSI3.5

Facilitating provenance documentation with a model-driven-engineering approach. 

Lucy Bastin, Owen Reynolds, Antonio Garcia-Dominguez, and James Sprinks

Evaluating the quality of data is a major concern within the scientific community: before using any dataset for study, a careful judgement of its suitability must be conducted. This requires that the steps followed to acquire, select, and process the data have been thoroughly documented in a methodical manner, in a way that can be clearly communicated to the rest of the community. This is particularly important in the field of citizen science, where a project that can clearly demonstrate its protocols, transformation steps, and quality assurance procedures has a much better chance of achieving social and scientific impact through the use and re-use of its data.

A number of specifications have been created to provide a common set of concepts and terminology, such as ISO 19115-3 or W3C PROV. These define a set of interchange formats, but in themselves, they do not provide tooling to create high-quality dataset descriptions. The existing tools built on these standards (e.g. GeoNetwork, USGS metadata wizard, CKAN) are overly complex for some users (for example, many citizen science project managers) who, despite being experts in their own fields, may be unfamiliar with the structure and context of metadata standards or with semantic modelling. 

In this presentation, we will describe a prototype authoring tool that was created using a model-driven engineering (MDE) software development methodology. The tool was authored using JetBrains Meta Programming System (MPS) to implement a modelling language based on the ISO 19115-3 model. A user is provided with a “text-like” editing environment, which assists with the formal structures needed to produce a machine-parsable document.

This allows a user to easily describe data lineage and generic processing steps while reusing recognised external vocabularies with automated validation, autocompletion, and transformation to external formats (e.g. the XML format 19115-3 or JSON-LD). We will report on the results of user testing aimed at making the tool accessible to citizen scientists (through dedicated projections with simplified structures and dialogue-driven model creation) and evaluating with those users any new possibilities that comprehensive and machine-parsable provenance information may create for data integration and sharing. The prototype will also serve as a test pilot of the integration between ISO 19115-3 and existing/upcoming third-party vocabularies (such as the upcoming ISO data quality measures registry).
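As an illustration of the kind of output such a tool can target, a minimal W3C PROV-JSON fragment describing one processing step of a citizen-science dataset might look as follows (all identifiers and labels are invented examples, not output of the prototype):

```json
{
  "prefix": {"ex": "http://example.org/"},
  "entity": {
    "ex:raw-observations": {"prov:label": "Raw species sightings (CSV)"},
    "ex:cleaned-dataset": {"prov:label": "Quality-checked dataset"}
  },
  "activity": {
    "ex:qa-filtering": {"prov:label": "Outlier removal and expert validation"}
  },
  "agent": {
    "ex:project-manager": {"prov:type": "prov:Person"}
  },
  "used": {
    "_:u1": {"prov:activity": "ex:qa-filtering", "prov:entity": "ex:raw-observations"}
  },
  "wasGeneratedBy": {
    "_:g1": {"prov:entity": "ex:cleaned-dataset", "prov:activity": "ex:qa-filtering"}
  },
  "wasAssociatedWith": {
    "_:a1": {"prov:activity": "ex:qa-filtering", "prov:agent": "ex:project-manager"}
  }
}
```

Authoring such records by hand is exactly the burden the tool removes: the editor guides the user through the structure and emits the interchange format automatically.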

How to cite: Bastin, L., Reynolds, O., Garcia-Dominguez, A., and Sprinks, J.: Facilitating provenance documentation with a model-driven-engineering approach., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8321, https://doi.org/10.5194/egusphere-egu23-8321, 2023.

EGU23-8526 | ECS | Orals | ESSI3.5

openEO Platform – showcasing a federated, accessible platform for reproducible large-scale Earth Observation analysis 

Benjamin Schumacher, Patrick Griffiths, Edzer Pebesma, Jeroen Dries, Alexander Jacob, Daniel Thiex, Matthias Mohr, and Christian Briese

openEO Platform holds a large amount of free and open as well as commercial Earth Observation (EO) data, which can be accessed and analysed with openEO, an open API that enables cloud computing and EO data access in a unified and reproducible way. Client libraries are available in R, Python and JavaScript. A JupyterLab environment and the Web Editor, a graphical interface, allow direct and interactive development of processing workflows. The platform is developed with a strong user focus, and various use cases have been implemented to illustrate the platform's capabilities. Currently, three federated backends support the analysis of EO data from pixel to continental scale.

The use cases implemented during the platform’s main development phase include dynamic landcover mapping, on-demand analysis-ready-data creation for Sentinel-1 GRD, Sentinel-2 MSI and Landsat data, time-series-based forest dynamics analysis with prediction functionalities, feature engineering for crop type mapping, and large-scale fractional canopy mapping. Additionally, three new use cases are being developed by platform users: large-scale vessel detection based on Sentinel-1 and Sentinel-2 data, surface water indicators using the ESA World Water toolbox for a user-defined area of interest, and monitoring of air quality parameters using Sentinel-5P data.

The future evolution of openEO Platform in terms of data availability and processing capabilities is closely linked to community requirements, facilitated by feature requests from users who design their workflows for environmental monitoring and reproducible research purposes. This presentation provides an overview of the completed use cases, newly added functionalities such as user code sharing, and user interface updates based on the new use cases and user requests. openEO Platform exemplifies how processing and analysing large amounts of EO data into meaningful information products is becoming easier and largely compliant with FAIR data principles, supporting the EO community at large.
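For readers unfamiliar with the API, an openEO workflow is expressed as a JSON process graph that any federated backend can execute, which is what makes the workflows portable and reproducible. A minimal, illustrative graph computing a temporal mean of two Sentinel-2 bands could look like this (collection id, extents and values are examples only):

```json
{
  "process_graph": {
    "load1": {
      "process_id": "load_collection",
      "arguments": {
        "id": "SENTINEL2_L2A",
        "spatial_extent": {"west": 16.1, "south": 48.1, "east": 16.6, "north": 48.6},
        "temporal_extent": ["2022-06-01", "2022-06-30"],
        "bands": ["B04", "B08"]
      }
    },
    "reduce1": {
      "process_id": "reduce_dimension",
      "arguments": {
        "data": {"from_node": "load1"},
        "dimension": "t",
        "reducer": {
          "process_graph": {
            "mean1": {
              "process_id": "mean",
              "arguments": {"data": {"from_parameter": "data"}},
              "result": true
            }
          }
        }
      }
    },
    "save1": {
      "process_id": "save_result",
      "arguments": {"data": {"from_node": "reduce1"}, "format": "GTiff"},
      "result": true
    }
  }
}
```

The same graph can be built programmatically from the R, Python or JavaScript clients or assembled visually in the Web Editor.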

How to cite: Schumacher, B., Griffiths, P., Pebesma, E., Dries, J., Jacob, A., Thiex, D., Mohr, M., and Briese, C.: openEO Platform – showcasing a federated, accessible platform for reproducible large-scale Earth Observation analysis, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8526, https://doi.org/10.5194/egusphere-egu23-8526, 2023.

EGU23-9852 | Posters on site | ESSI3.5

Proposal of a simple procedure to derive a more FAIR open data archive than a spreadsheet or a set of CSV files 

Filippo Giadrossich, Ilenia Murgia, and Roberto Scotti

NuoroForestrySchool (a study center of the Department of Agriculture, University of Sassari, Italy) has developed and published a ‘data documentation procedure’ (link to NFS-DDP) that improves the FAIRness of any dataset a data collector wishes to share as open data. Datasets are frequently shared as spreadsheet files. While this tool is very handy for data preparation and preliminary analysis, its structure and composition are not very effective for storing and sharing consolidated data, unless data structures are extremely simple. NFS-DDP takes as input a spreadsheet in which data are organized as relational tables, one per sheet, while four additional sheets contain metadata standardized according to the Dublin Core specifications. The procedure outputs an SQLite relational database (including data and metadata) and a PDF file documenting the database structure and contents. A first example application of the proposed procedure was shared by Giadrossich et al. (2022) on the PANGAEA repository, concerning experimental data of erosion in forest soil measured under artificial rainfall. The zip-archive that can be downloaded contains the experiment data and metadata processed by NFS-DDP. A test document is available at the following link, in which basic statistics are computed to show how the NFS-DDP facilitates the understanding and correct processing of the shared dataset.

The NFS-DataDocumentationProcedure provides a simple solution for organizing and archiving data, aiming to i) achieve a more FAIR archive, ii) exploit data consistency and the comprehensibility of semantic connections in the relational database, and iii) produce a report documenting the collection and organization of the data, providing an effective and concise overview of the whole with all details at hand.
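The core idea, relational tables plus a Dublin Core metadata table in a single SQLite file, can be sketched in a few lines of Python (table names and values are invented for illustration; this is not the NFS-DDP code):

```python
import sqlite3

# One relational table per spreadsheet sheet, plus Dublin Core metadata.
sheets = {
    "plot": [("p1", "Nuoro"), ("p2", "Sassari")],
    "measurement": [("p1", 12.4), ("p2", 9.8)],
}
dublin_core = {"dc:title": "Soil erosion experiment", "dc:creator": "Example Author"}

con = sqlite3.connect(":memory:")  # a real archive would use a file path
con.execute("CREATE TABLE plot (plot_id TEXT PRIMARY KEY, site TEXT)")
con.execute("CREATE TABLE measurement (plot_id TEXT REFERENCES plot, runoff_mm REAL)")
con.execute("CREATE TABLE metadata (element TEXT, value TEXT)")
con.executemany("INSERT INTO plot VALUES (?, ?)", sheets["plot"])
con.executemany("INSERT INTO measurement VALUES (?, ?)", sheets["measurement"])
con.executemany("INSERT INTO metadata VALUES (?, ?)", dublin_core.items())
con.commit()

# Data and metadata can now be queried together with ordinary SQL.
rows = con.execute(
    "SELECT site, runoff_mm FROM plot JOIN measurement USING (plot_id) ORDER BY plot_id"
).fetchall()
```

Unlike a spreadsheet, the resulting file enforces the relational structure (keys, types) and carries its Dublin Core description inside the same artefact.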

Giadrossich, F., Murgia, I., Scotti, R. (2022). Experiment of water runoff and soil erosion with and without forest canopy coverage under intense artificial rainfall. PANGAEA. DOI:10.1594/PANGAEA.943451



How to cite: Giadrossich, F., Murgia, I., and Scotti, R.: Proposal of a simple procedure to derive a more FAIR open data archive than a spreadsheet or a set of CSV files, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-9852, https://doi.org/10.5194/egusphere-egu23-9852, 2023.

EGU23-12443 | Posters on site | ESSI3.5 | Highlight

Landlab: a modeling platform that promotes the building of FAIR research software 

Eric Hutton and Gregory Tucker

Landlab is an open-source Python package designed to facilitate creating, combining, and reusing 2D numerical models. As a core component of the Community Surface Dynamics Modeling System (CSDMS) Workbench, Landlab can be used to build and couple models from a wide range of domains. We present how Landlab provides a platform that fosters a community of model developers and aids them in creating sustainable and FAIR (Findable, Accessible, Interoperable, Reusable) research software.

Landlab’s core functionality can be split into two main categories: infrastructural tools and community-contributed components. Infrastructural tools address the common needs of building new models (e.g. a gridding engine, and numerical utilities for common tasks). Landlab’s library of community-contributed components consists of several dozen components that each model a separate physical process (e.g. routing of shallow water flow across a landscape, calculating groundwater flow, or biologic evolution over a landscape). As these user-contributed components are incorporated into Landlab, they are able to attach to the Landlab infrastructure so that they also become both findable and accessible (through, for example, standardized metadata and versioning) and are maintained by the core Landlab developers.

One key aspect of Landlab’s design is its use of a standard programming interface for all components. This ensures that all Landlab components are interoperable with one another and with other software tools, allowing researchers to incorporate Landlab's components into their own workflows and analyses. By separating processes into individual components, they become reusable and allow researchers to combine components in new ways without having to write new components from scratch.
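The component convention can be sketched in plain Python. The classes below are illustrative stand-ins, not actual Landlab components, although the `run_one_step` entry point and the `topographic__elevation` field name do follow Landlab's conventions:

```python
class Uplift:
    """Raises every node's elevation at a constant rate."""
    def __init__(self, grid, rate=0.001):
        self.grid, self.rate = grid, rate

    def run_one_step(self, dt):
        z = self.grid["topographic__elevation"]
        self.grid["topographic__elevation"] = [v + self.rate * dt for v in z]


class Erosion:
    """Lowers elevation in proportion to its current value."""
    def __init__(self, grid, k=0.0005):
        self.grid, self.k = grid, k

    def run_one_step(self, dt):
        z = self.grid["topographic__elevation"]
        self.grid["topographic__elevation"] = [v - self.k * v * dt for v in z]


# Because both components share one grid and expose the same interface,
# coupling them needs no glue code: just call run_one_step in sequence.
grid = {"topographic__elevation": [0.0, 10.0]}
for _ in range(100):
    for component in (Uplift(grid), Erosion(grid)):
        component.run_one_step(dt=1.0)
```

The design choice this illustrates is that interoperability comes from the shared grid fields and the uniform `run_one_step(dt)` signature, so new process components slot into existing model loops without modification.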

Overall, Landlab's design and development practices support the principles of FAIR research software, promoting the ability for scientific research to be easily shared and built upon. This design also provides a platform onto which model developers are able to attach their model components and take advantage of Landlab’s development practices and infrastructure and ensure their components also follow FAIR principles.

How to cite: Hutton, E. and Tucker, G.: Landlab: a modeling platform that promotes the building of FAIR research software, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12443, https://doi.org/10.5194/egusphere-egu23-12443, 2023.

EGU23-12864 | Orals | ESSI3.5 | Highlight

Who Done It? Reproducibility of Data Products Also Requires Lineage to Determine Impact and Give Credit Where Credit is Due. 

Lesley Wyborn, Nigel Rees, Jens Klump, Ben Evans, Rebecca Farrington, and Tim Rawling

Reproducible research necessitates full transparency and integrity in data collection (e.g. from observations) or generation of data, and further data processing and analysis to generate research products. However, Earth and environmental science data are growing in complexity, volume and variety and today, particularly for large-volume Earth observation and geophysics datasets, achieving this transparency is not easy. It is rare for a published data product to be created in a single processing event by a single author or individual research group. Modern research data processing pipelines/workflows can have quite complex lineages, and it is more likely that an individual research product is generated through multiple levels of processing, starting from raw instrument data at full resolution (L0) followed by successive levels of processing (L1-L4), which progressively convert raw instrument data into more useful parameters and formats. Each individual level of processing can be undertaken by different research groups using a variety of funding sources: rarely are those involved in the early stages of processing/funding properly cited.

The lower levels of processing are where observational data essentially remain at full resolution and are calibrated, georeferenced and processed to sensor units (L1), after which geophysical variables are derived (L2). Historically, particularly where the volumes of the L0-L2 datasets are measured in Terabytes to Petabytes, processing could only be undertaken by a minority of specialised scientific research groups and data providers, as few had the expertise, resources or infrastructures to process them on-premise. Wider availability of co-located data assets and HPC/cloud processing means that the full-resolution, less processed forms of observational data can now be processed remotely in realistic timeframes by multiple researchers to their specific processing requirements, and also enables greater exploration of parameter space, allowing multiple values for the same inputs to be trialled. The advantage is that better-targeted research products can now be rapidly produced. However, the downside is that far greater care needs to be taken to ensure that there is sufficient machine-readable metadata and provenance information to enable any user to determine what processing steps and input parameters were used in each part of the lineage of any released dataset/data product, as well as to reference exactly who undertook any part of the acquisition/processing and to identify sources of funding (including instruments/field campaigns that collected the data).

The use of Persistent Identifiers (PIDs) for any component objects (observational data, synthetic data, software, model inputs, people, instruments, grants, organisations, etc.) will be critical. Global and interdisciplinary research teams of the future will be reliant on software engineers to develop community-driven software environments that aid and enhance the transparency and reproducibility of their scientific workflows and ensure recognition. The advantage of the PID approach is that not only will reproducibility and transparency be enhanced, but through the use of Knowledge Graphs it will also be possible to trace the input of any researcher at any level of processing, whilst funders will be able to determine the impact of each stage, from the raw data capture through to any derivative high-level data product.
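As a sketch of what such PID-based lineage tracing enables, consider a toy provenance graph (all identifiers invented) in which each product records the PIDs of its inputs and of the responsible agent; crediting everyone who contributed to a high-level product is then a simple graph traversal:

```python
# Hypothetical lineage records keyed by PID: each product points at the
# PIDs of its inputs and of the agent responsible for that level.
records = {
    "pid:raw-L0": {"inputs": [], "agent": "pid:instrument-team"},
    "pid:cal-L1": {"inputs": ["pid:raw-L0"], "agent": "pid:calibration-group"},
    "pid:grid-L2": {"inputs": ["pid:cal-L1"], "agent": "pid:uni-lab"},
    "pid:product-L3": {"inputs": ["pid:grid-L2", "pid:cal-L1"], "agent": "pid:research-group"},
}

def contributors(pid, records):
    """Walk the lineage graph upstream and collect every agent that
    contributed to any ancestor of the given product."""
    agents, stack, seen = set(), [pid], set()
    while stack:
        current = stack.pop()
        if current in seen:
            continue
        seen.add(current)
        record = records[current]
        agents.add(record["agent"])
        stack.extend(record["inputs"])
    return agents
```

In practice the same traversal over a Knowledge Graph lets a funder attribute impact to the raw data capture even when it sits several processing levels below the published product.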

 

How to cite: Wyborn, L., Rees, N., Klump, J., Evans, B., Farrington, R., and Rawling, T.: Who Done It? Reproducibility of Data Products Also Requires Lineage to Determine Impact and Give Credit Where Credit is Due., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12864, https://doi.org/10.5194/egusphere-egu23-12864, 2023.

EGU23-12971 | Posters on site | ESSI3.5

Reproducible quality control of time series data with SaQC 

David Schäfer, Bert Palm, Peter Lünenschloß, Lennart Schmidt, and Jan Bumberger

Environmental sensor networks produce ever-growing volumes of time series data with great potential to broaden the understanding of complex spatiotemporal environmental processes. However, this growth also imposes its own set of new challenges. Especially the error-prone nature of sensor data acquisition is likely to introduce disturbances and anomalies into the actual environmental signal. Most applications of such data, whether it is used in data analysis, as input to numerical models or modern data science approaches, usually rely on data that complies with some definition of quality.

To move towards high-standard data products, a thorough assessment of a dataset's quality, i.e., its quality control, is of crucial importance. A common approach when working with time series data is the annotation of single observations with a quality label to convey information such as their reliability. Downstream users and applications are hence able to make informed decisions as to whether a dataset as a whole, or at least parts of it, is appropriate for the intended use.

Unfortunately, quality control of time series data is a non-trivial, time-consuming, scientifically undervalued endeavor and is often neglected or executed with insufficient rigor. The presented software, the System for automated Quality Control (SaQC), provides all the basic and many advanced building blocks to bridge the gap between data as it is acquired (usually faulty) and data as it is expected to be (correct), in an accessible, consistent, objective and reproducible way. Its user interfaces address different audiences, ranging from the scientific practitioner with little access to the possibilities of modern software development to the trained programmer. SaQC delivers a growing set of generic algorithms to detect a multitude of anomalies and to process data using resampling, aggregation, and data modeling techniques. However, one defining component of SaQC is its innovative approach to storing runtime process information. In combination with a flexible quality annotation mechanism, SaQC allows quality labels to be extended with fine-grained provenance information sufficient to fully reproduce the system's output.
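The principle of attaching reproducibility-grade provenance to each quality label can be illustrated with a small sketch (generic Python, not the SaQC API): a range check that, for every flagged value, records which routine produced the flag and with which parameters:

```python
import datetime

def flag_range(values, lower, upper, routine="flag_range"):
    """Label each observation and record, per flag, the provenance
    needed to reproduce it: routine name, parameters, and timestamp."""
    flags, provenance = [], []
    for i, value in enumerate(values):
        bad = not (lower <= value <= upper)
        flags.append("BAD" if bad else "OK")
        if bad:
            provenance.append({
                "index": i,
                "routine": routine,
                "parameters": {"lower": lower, "upper": upper},
                "checked_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            })
    return flags, provenance

flags, prov = flag_range([1.2, 99.0, 3.4], lower=0.0, upper=50.0)
```

Given such records, a downstream user can re-run the exact check that produced any flag, which is the reproducibility property the abstract describes.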

SaQC is proving its usefulness on a daily basis in a range of fully automated data flows for large environmental observatories. We highlight use cases from the TERENO Network, showcasing how reproducible automated quality control can be implemented into real-world, large-scale data processing workflows to provide environmental sensor data in near real-time to data users, stakeholders and decision-makers.

 

How to cite: Schäfer, D., Palm, B., Lünenschloß, P., Schmidt, L., and Bumberger, J.: Reproducible quality control of time series data with SaQC, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12971, https://doi.org/10.5194/egusphere-egu23-12971, 2023.

EGU23-13108 | Orals | ESSI3.5 | Highlight

The reality of implementing FAIR principles in the IPCC context to support open science and provide a citable platform to acknowledge the work of authors. 

Charlotte Pascoe, Lina Sitz, Diego Cammarano, Anna Pirani, Martina Stockhause, Molly MacRae, and Emily Anderson

A new paradigm for Intergovernmental Panel on Climate Change (IPCC) Working Group I (WGI) data publication has been implemented.  IPCC Data Distribution Centre (DDC) partners at the Centre for Environmental Data Analysis (CEDA), the German Climate Computing Centre (DKRZ) and the Spanish Research Council (CSIC) have worked with the IPCC Technical Support Unit (TSU) for WGI to publish figure data from the Sixth Assessment Report (AR6). The work was guided by the IPCC Task Group on Data Support for Climate Change Assessments (TG-Data) recommendations for Open Science and FAIR data (making data Findable, Accessible, Interoperable, and Reusable) with a general aim to enhance the transparency and accessibility of AR6 outcomes.  We highlight the achievement of implementing FAIR for AR6 figure data and discuss the lessons learned on the road to FAIRness in the unique context of the IPCC.

  • Findable - The CEDA catalogue record for each figure dataset enhances findability. Keywords can be easily searched. Records are organised into collections for each AR6 chapter. There is a two-way link between the catalogue record and the figure on the AR6 website. CEDA catalogue records are duplicated on the IPCC-DDC. 
  • Accessible - Scientific language is understandable, acronyms and specific terminology are fully explained. CEDA services provide tools to access and download the data. 
  • Interoperable - Where possible data variables follow standard file format conventions such as CF-netCDF and have standard names, where this is not feasible readme files describe the file structure and content. 
  • Reusable - The data can be reused, shared and adapted elsewhere, with credit, under a Creative Commons Attribution 4.0 licence (CC BY 4.0). Catalogue records link to relevant documentation such as the Digital Object Identifier (DOI) for the code and other supplementary information. The code used to create the figures allows users to reproduce the figures from the report independently. 

CEDA catalogue records provide a platform to acknowledge the specific work of IPCC authors and dataset creators whose work supports the scientific basis of AR6. 

Catalogue records for figure datasets were created at CEDA with data archived in the CEDA repository and the corresponding code stored on GitHub and referenced via Zenodo.  For instances where the data and code were blended in a processing chain that could not be easily separated, we developed criteria to categorise the different blends of data and code and created a decision tree to decide how best to archive them. Key intermediate datasets were also archived at CEDA.

Careful definition of metadata requirements at the beginning of the archival process is important for handling the diversity of IPCC figure data which includes data derived from climate model simulations, historical observations and other sources of climate information. The reality of the implementation meant that processes for gathering data and information from authors were specified later in the preparation of AR6. This presented challenges with data management workflows and the separation of figure datasets from the intermediate data and code that generated them. 

We present recommendations for AR7 and scaling up this work in a feasible way.

How to cite: Pascoe, C., Sitz, L., Cammarano, D., Pirani, A., Stockhause, M., MacRae, M., and Anderson, E.: The reality of implementing FAIR principles in the IPCC context to support open science and provide a citable platform to acknowledge the work of authors., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13108, https://doi.org/10.5194/egusphere-egu23-13108, 2023.

In our project we are employing semantic data management with the Open Source research data management system (RDMS) CaosDB [1] to link empirical data and simulation output from Earth System Models [2]. The combined management of these data structures allows us to perform complex queries and facilitates the integration of data and meta data into data analysis workflows.

One particular challenge for analyses of model output is to keep track of all necessary meta data of each simulation during the whole digital workflow. Especially for open science approaches it is of great importance to properly document - in human- and computer-readable form - all the information necessary to completely reproduce obtained results. Furthermore, we want to be able to feed all relevant data from data analysis back into our data management system, so that we are able to perform complex queries also on data sets and parameters stemming from data analysis workflows.

A specific aim of this project is to re-analyse existing sets of simulations under different research questions. This endeavour can become very time consuming without proper documentation in an RDMS.

We implemented a workflow, combining semantic research data management with CaosDB and Jupyter notebooks, that keeps track of data loaded into an analysis workspace. Procedures are provided that create snapshots of specific states of the analysis. These snapshots can automatically be interpreted by the CaosDB crawler that is able to insert and update records in the system accordingly. The snapshots include links to the input data, parameter information, the source code and results and therefore provide a high-level interface to the full chain of data processing, from empirical and simulated raw data to the results. For example, input parameters of complex Earth System Models can be extracted automatically and related to model performance. In our use case, not only automated analyses are feasible, but also interactive approaches are supported.
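A minimal sketch of such a snapshot record (plain Python, not the CaosDB crawler itself; all field names are illustrative) might bundle input references, parameters, code version and results under a content-derived identifier:

```python
import hashlib
import json

def snapshot(input_paths, parameters, code_version, results):
    """Assemble a machine-readable snapshot of one analysis state,
    ready for ingestion by a research data management crawler."""
    def digest(payload):
        return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

    record = {
        # A real workflow would hash the file contents; here we hash
        # only the path strings to keep the sketch self-contained.
        "inputs": {path: digest(path) for path in input_paths},
        "parameters": parameters,
        "code_version": code_version,
        "results": results,
    }
    record["snapshot_id"] = digest(record)  # deterministic id of this state
    return record

record = snapshot(["sim/run1.nc"], {"dt": 0.1}, "v1.3.0", {"mean_temp": 14.2})
```

Because the identifier is derived from the record's content, re-running the same analysis with the same inputs and parameters yields the same snapshot id, while any change produces a new one, which is what lets the crawler decide between inserting and updating records.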

  • [1] Fitschen, T.; Schlemmer, A.; Hornung, D.; tom Wörden, H.; Parlitz, U.; Luther, S. CaosDB—Research Data Management for Complex, Changing, and Automated Research Workflows. Data 2019, 4, 83. https://doi.org/10.3390/data4020083
  • [2] Schlemmer, A., Merder, J., Dittmar, T., Feudel, U., Blasius, B., Luther, S., Parlitz, U., Freund, J., and Lennartz, S. T.: Implementing semantic data management for bridging empirical and simulative approaches in marine biogeochemistry, EGU General Assembly 2022, Vienna, Austria, 23–27 May 2022, EGU22-11766, https://doi.org/10.5194/egusphere-egu22-11766, 2022.

How to cite: Schlemmer, A. and Lennartz, S.: Transparent and reproducible data analysis workflows in Earth System Modelling combining interactive notebooks and semantic data management, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13347, https://doi.org/10.5194/egusphere-egu23-13347, 2023.

EGU23-14845 | Orals | ESSI3.5

Open geospatial standards and reproducible research 

Massimiliano Cannata, Gregory Giuliani, Jens Ingensand, Olivier Ertz, and Maxime Collombin

In the era of cloud computing, big data and the Internet of Things, research is very often data-driven: based on the analysis of data, increasingly available in large quantities and collected by experiments, observations or simulations. These data are very often characterized as being dynamic in space and time and as continuously expanding (monitoring) or changing (data quality management or survey). Modern Spatial Data Infrastructures (e.g. swisstopo or INSPIRE) are based on interoperable Web services which expose and serve large quantities of data on the Internet using widely accepted open standards defined by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO). These standards mostly comply with FAIR principles but do not offer any capability to retrieve a dataset as it was at a given instant, to refer to its state at that specific instant, or to guarantee its immutability. These three gaps hinder the replicability of research based on such services. We discuss this issue and the state of the art, and propose a possible solution to fill the gap, using or extending existing standards where needed and adopting best practices in the fields of sensor data, satellite data and vector data.
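One possible building block for such immutability guarantees, sketched here as an assumption rather than the solution the abstract proposes, is a content-derived version identifier: any change to the served data yields a new identifier, so a citation can pin an exact dataset state:

```python
import hashlib
import json

def dataset_version_id(features):
    """Derive an immutable identifier from dataset content.

    The features are serialised canonically (sorted keys) so that the
    same content always yields the same identifier, regardless of how
    the service happens to order its output.
    """
    canonical = json.dumps(features, sort_keys=True).encode()
    return "sha256:" + hashlib.sha256(canonical).hexdigest()

v1 = dataset_version_id([{"id": 1, "temp": 21.3}])
v2 = dataset_version_id([{"id": 1, "temp": 21.4}])  # one edited value
```

A Web service exposing such identifiers would let a publication reference the precise state of a continuously updated dataset, addressing the retrieval and referencing gaps described above.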

How to cite: Cannata, M., Giuliani, G., Ingensand, J., Ertz, O., and Collombin, M.: Open geospatial standards and reproducible research, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14845, https://doi.org/10.5194/egusphere-egu23-14845, 2023.

EGU23-15384 | Orals | ESSI3.5 | Highlight

A peer review process for higher reproducibility of publications in GIScience can also work for Earth System Sciences 

Daniel Nüst, Frank O. Ostermann, and Carlos Granell

The Reproducible AGILE initiative (https://reproducible-agile.github.io/) successfully established a code execution procedure following the CODECHECK principles (https://doi.org/10.12688/f1000research.51738.2) at the AGILE conference series (https://agile-online.org/conference). The AGILE conference is a medium-sized community-led conference in the domains of Geographic Information Science (GIScience), geoinformatics, and related fields. The conference is organised under the umbrella of the Association of Geographic Information Laboratories in Europe (AGILE).

Starting with a series of workshops on reproducibility from 2017 to 2019, a group of Open Science enthusiasts with the support of the AGILE Council (https://agile-online.org/agile-actions/current-initiatives/reproducible-publications-at-agile-conferences) was able to introduce guidelines for sharing reproducible workflows (https://doi.org/10.17605/OSF.IO/CB7Z8) and establish a reproducibility committee that conducts code executions for all accepted full papers.
In this presentation, we provide details of the steps taken and the obstacles encountered on the way to the current state. We revisit the process and abstract a series of actions that similar events, or even journals, may take to introduce a shift towards higher reproducibility of research publications in a specific community of practice.

We discuss the approach taken in light of the challenges for reproducibility in Earth System Sciences (ESS) around four main ideas.
First, Reproducible AGILE’s human-centered process is able to handle the increasingly complex, large and varying data-based workflows in ESS because of the clear guidance on responsibilities (What should the author provide? How far does the reproducibility reviewer need to go?).
Second, the communicative focus of the process is very well suited to, over time, helping to establish a shared practice based on current technical developments, such as FAIR Digital Objects, and to reforming attitudes towards openness, transparency and sharing. A code execution following the CODECHECK principles is a learning experience that may sustainably change researcher behaviour and practice. At the same time, Reproducible AGILE's approach avoids playing catch-up with technology and neither limits researcher freedom nor requires researchers to standardise their workflows beyond providing instructions suitable for a human evaluator, much as in academic peer review.
Third, while being agnostic of technology and infrastructures, a supportive framework of tools and infrastructure can of course increase the efficiency of conducting a code execution. We outline how existing infrastructures may serve this need and what is still missing.
Fourth, we list potential candidate event series and journals that could introduce a code-checking procedure, given their organisational setup or steps already taken towards more open scholarship.

How to cite: Nüst, D., Ostermann, F. O., and Granell, C.: A peer review process for higher reproducibility of publications in GIScience can also work for Earth System Sciences, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15384, https://doi.org/10.5194/egusphere-egu23-15384, 2023.

EGU23-15391 | Posters on site | ESSI3.5

Data Management for PalMod-II – data workflow and re-use strategy 

Swati Gehlot, Karsten Peters-von Gehlen, Andrea Lammert, and Hannes Thiemann

The German climate research initiative PalMod phase II (www.palmod.de) is presented here as a distinctive example in which the project end product is itself unique scientific paleo-climate data. PalMod-II data products include output from three state-of-the-art coupled climate models of varying complexity and spatial resolution simulating the climate of the past 130,000 years. In addition to the long time series of modeling data, a comprehensive compilation of paleo-observation data has been prepared to facilitate model-model and model-proxy intercomparison and evaluation. Being a large multidisciplinary project, a dedicated RDM (Research Data Management) approach is applied within the cross-cutting working group of PalMod-II. The DMP (Data Management Plan), as a living document, is used for documenting the data-workflow framework that defines the details of the paleo-climate data life cycle. The workflow covering the organisation, storage, preservation, sharing and long-term curation of the data has been defined and tested. In order to make the modeling data inter-comparable across the PalMod-II models and easily analyzable by the global paleo-climate community, model data standardization (CMORization) workflows are defined for the individual PalMod models and their sub-models. The CMORization workflows contain the setup, definition, and quality assurance testing of CMIP6 (1) based standardization processes adapted to PalMod-II model simulation output requirements, with the final aim of data publication via ESGF (2). PalMod-II data publication via ESGF makes the paleo-climate data an asset that is (re-)usable beyond the project lifetime.

The PalMod-II RDM infrastructure enables common research data management according to the FAIR (3) data principles across all working groups of PalMod-II, using common workflows for the exchange of data and information along the process chain. Applying data management planning within PalMod-II made sure that all data-related workflows were defined, continuously updated as needed, and made available to the project stakeholders. The end products of PalMod-II, which consist of unique long-term scientific paleo-climate data (model as well as paleo-proxy data), are made available for re-use by the paleo-climate research community as well as other research disciplines (e.g., land-use and socio-economic studies).
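The renaming step at the heart of CMORization can be sketched as a simple mapping from model-native variable names onto CF/CMIP6 standard names and units (the mapping entries below are generic examples, not the PalMod-II tables):

```python
# Example mapping table: model-native names on the left, CMIP6-style
# target names, CF standard names and canonical units on the right.
STANDARD = {
    "temp2": {"name": "tas", "standard_name": "air_temperature", "units": "K"},
    "precip": {"name": "pr", "standard_name": "precipitation_flux", "units": "kg m-2 s-1"},
}

def cmorize_variable(native_name, attributes):
    """Return the standardized variable name and merged attributes."""
    target = STANDARD[native_name]
    merged = dict(attributes)  # keep model-specific attributes
    merged.update(standard_name=target["standard_name"], units=target["units"])
    return target["name"], merged

name, attrs = cmorize_variable("temp2", {"grid": "gaussian"})
```

Real CMORization additionally rewrites coordinates, file layout and global attributes and runs conformance checks, but the mapping step shown here is what makes output from three different models directly comparable.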

1. Coupled Model Intercomparison Project phase 6 (https://www.wcrp-climate.org/wgcm-cmip/wgcm-cmip6)

2. Earth System Grid Federation (https://esgf.llnl.gov)

3. Findable, Accessible, Interoperable, Reusable

How to cite: Gehlot, S., Peters-von Gehlen, K., Lammert, A., and Thiemann, H.: Data Management for PalMod-II – data workflow and re-use strategy, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15391, https://doi.org/10.5194/egusphere-egu23-15391, 2023.

EGU23-16288 | Orals | ESSI3.5 | Highlight

The UK’s NCAS Data Project: establishing transparent observational data workflows from field to user 

Graham Parton, Barbara Brooks, Ag Stephens, and Wendy Garland

Within the UK, the National Centre for Atmospheric Science (NCAS) operates a suite of observational instruments for atmospheric dynamics, chemistry and composition studies. These are principally made available through two facilities: the Atmospheric Measurement and Observations Facility (AMOF) and the Facility for Airborne Atmospheric Measurements (FAAM). Between these two facilities, instrumentation can be deployed on either campaign or long-term bases in diverse environments (from polar to maritime; surface to high altitude), on a range of platforms (aircraft, ships) or at dedicated atmospheric observatories.

The wide range of instruments, spanning an operational time period from the mid-1990s to present, has traditionally been orientated to specific communities, resulting in a plethora of different operational practices, data standards and workflows. The resulting data management and usage challenges have been further exacerbated over time by changes of staff, instruments and end-user communities and their requirements. This has been accompanied by the wider end-user community seeking greater access to and improved use of the data, with necessary associated improvements in data production to ensure transparency, quality, veracity and, thus, overall reproducibility. Additionally, these enhanced workflows further ensure FAIR data outputs, widening long-term re-use of the data.

Seeking to address these challenges in a more harmonised way across the range of AMOF and FAAM facilities, NCAS established the NCAS Data Project in 2018, bringing together key players in the data workflows to break down barriers and establish common standards and procedures through improved dialogue. The resulting NCAS ‘Data Pyramid’ approach brings together representatives from the data provider, data archive and end-user communities, alongside supporting software engineers, within a common framework that enables cross-working between all partners. This has led to new data standards and workflows being established to ensure three key objectives: 1) capture and flow of the necessary metadata to automate data flows and quality control as much as possible, in a timely fashion, ‘from field to end-user’; 2) enhanced transparency and traceability in data production via linked, externally visible documentation, calibration and code repositories; and 3) data products meeting end-user requirements in terms of their content and established quality control. Finally, data workflows are further enhanced thanks to scriptable conformance checking throughout the data production lifecycle, built on the controlled data product and metadata standards.
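The scriptable conformance checking described above can be illustrated with a minimal sketch: a checker that validates file-level metadata against a controlled standard. The attribute names and types below are hypothetical examples, not the actual NCAS standard.

```python
# Illustrative sketch (not the NCAS implementation): a minimal conformance
# check that validates file-level metadata against a controlled standard.
# The attribute names below are hypothetical examples.

REQUIRED_ATTRIBUTES = {
    "source": str,           # instrument name
    "institution": str,      # operating facility, e.g. AMOF or FAAM
    "processing_level": int, # position in the data-product hierarchy
}

def check_conformance(metadata):
    """Return a list of human-readable problems; an empty list means compliant."""
    problems = []
    for name, expected_type in REQUIRED_ATTRIBUTES.items():
        if name not in metadata:
            problems.append(f"missing attribute: {name}")
        elif not isinstance(metadata[name], expected_type):
            problems.append(f"attribute {name} should be {expected_type.__name__}")
    return problems
```

A check like this can run at every stage of the production lifecycle, so a file that drifts from the standard is caught before it reaches the archive.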

Thus, through the established workflows of the NCAS Data Project, the necessary details are captured and conveyed by both internal file-level and catalogue-level metadata, ensuring that all three corners of the triangle of reproducibility, quality information and provenance can be achieved in combination.

How to cite: Parton, G., Brooks, B., Stephens, A., and Garland, W.: The UK’s NCAS Data Project: establishing transparent observational data workflows from field to user, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16288, https://doi.org/10.5194/egusphere-egu23-16288, 2023.

EGU23-17263 | Posters on site | ESSI3.5

Towards reproducible workflows in simulation based Earth System Science 

Ivonne Anders, Hannes Thiemann, Martin Bergemann, Christopher Kadow, and Etor Lucio-Eceiza

Some disciplines, e.g. astrophysics or the Earth system sciences, work with large to very large amounts of data. Storing this data, but also processing it, is a challenge for researchers because novel concepts for processing data and workflows have not developed as quickly. This problem will only become more pronounced with the ever-increasing performance of High Performance Computing (HPC) systems.

At the German Climate Computing Center (DKRZ), we analysed the users, their goals and their working methods. DKRZ provides the climate science community with resources such as high-performance computing (HPC), data storage and specialised services, and hosts the World Data Center for Climate (WDCC). In analysing users, we distinguish between two main groups: those who need the HPC system to run resource-intensive simulations and then analyse them, and those who reuse, build on and analyse existing data. Each group subdivides into subgroups. We analysed the workflows of each identified user group, found identical parts in abstracted form, and derived Canonical Workflow Modules from them. In the process, we critically examined the possible use of so-called FAIR Digital Objects (FDOs) and checked to what extent the derived workflows and workflow modules are actually future-proof.

We will show the analysis of the different users, the canonical workflows and the vision of the FDOs. Furthermore, we will present the Freva framework and further developments and implementations at DKRZ with respect to the reproducibility of simulation-based research in the ESS.

How to cite: Anders, I., Thiemann, H., Bergemann, M., Kadow, C., and Lucio-Eceiza, E.: Towards reproducible workflows in simulation based Earth System Science, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-17263, https://doi.org/10.5194/egusphere-egu23-17263, 2023.

As deep learning (DL) gathers remarkable attention for its capacity to achieve accurate predictions in various fields, numerous applications of DL in the geosciences have also emerged. Most studies focus on the high accuracy of DL models through model selection and hyperparameter tuning. However, the interpretability of DL models, which can be loosely defined as comprehending what a model did, is also important but comparatively less discussed. To this end, we select thin section photomicrographs of five types of sedimentary rocks, including quartz arenite, feldspathic arenite, lithic arenite, dolomite, and oolitic packstone. The distinguishing features of these rocks are their characteristic framework grains. For example, the oolitic packstone contains rounded or oval ooids. A regular classification model using ResNet-50 is trained on these photomicrographs and appears accurate, as its accuracy reaches 0.97. However, this regular DL model bases its classifications on the cracks, cements, or even scale bars in the photomicrographs, and these features are incapable of distinguishing sedimentary rocks in real work. To rectify the model’s focus, we propose an attention-based dual network incorporating the photomicrographs’ global features (the whole photomicrograph) and local features (the distinguishing framework grains). The proposed model not only has high accuracy (0.99) but also presents interpretable feature extraction. Our study indicates that high accuracy should not be the only metric of DL models; interpretability, and models incorporating geological information, require more attention.

How to cite: Zheng, D., Cao, Z., Hou, L., Ma, C., and Hou, M.: High accuracy doesn’t prove that a deep learning model is accurate: a case study from automatic rock classification of thin section photomicrographs, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-244, https://doi.org/10.5194/egusphere-egu23-244, 2023.

EGU23-1183 | ECS | Orals | ITS1.5/GI1.5 | Highlight

Detection of anomalous NO2 emitting ships using AutoML on TROPOMI satellite data 

Solomiia Kurchaba, Jasper van Vliet, Fons J. Verbeek, and Cor J. Veenman

Starting from 2021, the International Maritime Organization (IMO) introduced more demanding NOx emission restrictions for ships operating in the waters of the North and Baltic Seas. All methods currently used for ship compliance monitoring are costly and time-consuming. Thus, it is important to prioritize the inspection of ships that have a high chance of being non-compliant.


The TROPOMI/S5P instrument allows, for the first time, the distinction of NO2 plumes from individual ships. Here, we present a method for the selection of potentially non-compliant ships using automated machine learning (AutoML) on TROPOMI/S5P satellite data. The study is based on the analysis of 20 months of data in the Mediterranean Sea region. To each ship, we assign a Region of Interest (RoI), where we expect the ship plume to be located. We then train a regression model to predict the amount of NO2 that is expected to be produced by a ship with specific properties operating in the given atmospheric conditions. We use a genetic algorithm-based AutoML for the automatic selection and configuration of a machine-learning pipeline that maximizes prediction accuracy. The difference between the predicted and actual amount of produced NO2 is a measure of inspection worthiness. We rank the analyzed ships accordingly.
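The ranking step can be sketched in a few lines: a higher positive residual (observed emission above the model's expectation) means a higher inspection priority. The data layout below (tuples of ship id, observed and predicted NO2) and the sign convention are illustrative assumptions, not the study's actual format.

```python
def rank_by_residual(ships):
    """Rank ships by how much their observed NO2 exceeds the model prediction.

    `ships` is a list of (ship_id, observed_no2, predicted_no2) tuples;
    names and layout are illustrative. A large positive residual suggests
    the ship emits more than a compliant ship with its properties should.
    """
    scored = [(sid, obs - pred) for sid, obs, pred in ships]
    return sorted(scored, key=lambda x: x[1], reverse=True)
```

For example, `rank_by_residual([("A", 10, 8), ("B", 5, 7), ("C", 9, 3)])` puts ship C first, since its observed emission exceeds the prediction the most.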


We cross-check the obtained ranks using a previously developed method for supervised ship plume segmentation.  We quantify the amount of NO2 produced by a given ship by summing up concentrations within the pixels identified as a “plume”. We rank the ships based on the difference between the obtained concentrations and the ship emission proxy.


Ships that are also ranked as highly deviating by the segmentation method need further attention, for example by checking their data for other explanations. If none are found, these ships are recommended as candidates for fuel inspection.

How to cite: Kurchaba, S., van Vliet, J., Verbeek, F. J., and Veenman, C. J.: Detection of anomalous NO2 emitting ships using AutoML on TROPOMI satellite data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1183, https://doi.org/10.5194/egusphere-egu23-1183, 2023.

Compaction of agricultural soil negatively affects its hydraulic properties, leading to water erosion and other negative effects on environmental quality. This study focused on the effect of compaction on soil hydrodynamic properties under unsaturated and saturated conditions using the Hydraulic Property Analyzer (HYPROP) system. We studied the impact of five levels of compaction on loamy sand soils collected in a potato crop field in northern Québec, Canada. Soil samples were collected, and artificially compacted samples were prepared by increasing the bulk density by 0% (C0), 30% (C30), 40% (C40), 50% (C50), and 70% (C70). First, the saturated hydraulic conductivity of each column was measured using the constant-head method. Soil water retention curve (SWRC) dry-end data and unsaturated hydraulic conductivities were obtained via the HYPROP evaporation measurement system and WP4-T Dew Point PotentioMeter equipment (METER Group, Munich, Germany). Second, soil microporosity was imaged and quantified using the micro-CT-measured pore-size distribution to visualize and quantify soil pore structures. The imaged soil microporosity was related to the saturated hydraulic conductivity, air permeability, porosity and tortuosity measured on the same samples. Our results supported the application of the Peters–Durner–Iden (PDI) variant of the bimodal unconstrained van Genuchten model (VGm-b-PDI) for complete SWRC estimation, based on the root mean square error (RMSE). The unsaturated hydraulic conductivity matched the PDI variant of the unconstrained van Genuchten model (VGm-PDI) well. Finally, the preliminary results indicated that soil compaction can strongly influence the hydraulic properties of soil in different ways: the saturated conductivity decreased with increasing soil compaction, and the unsaturated hydraulic conductivity changed very rapidly with the water-to-soil ratio. Overall, the HYPROP methodology performed very well in characterizing the hydraulic behavior of compacted soils.
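For reference, the basic unimodal van Genuchten retention curve underlying the VGm-PDI variants mentioned above can be sketched as follows; the PDI and bimodal extensions add further terms on top of this form, and the parameter values in the usage example are illustrative, not fitted to the study's soils.

```python
def van_genuchten(h, theta_r, theta_s, alpha, n):
    """Volumetric water content at suction h (h in the same length units as 1/alpha).

    Basic unimodal van Genuchten form with the common constraint m = 1 - 1/n:
        theta(h) = theta_r + (theta_s - theta_r) / (1 + (alpha*h)^n)^m
    The PDI and bimodal variants used in the study extend this expression.
    """
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m
```

At zero suction the curve returns the saturated water content `theta_s`, and it decreases monotonically as suction increases, e.g. `van_genuchten(0.0, 0.05, 0.40, 0.03, 2.0)` gives 0.40.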

How to cite: Mbarki, Y. and Gumiere, S. J.: Study of the effect of compaction on the hydrodynamic properties of a loamy sand soil for precision agriculture, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1583, https://doi.org/10.5194/egusphere-egu23-1583, 2023.

EGU23-1902 | Posters on site | ITS1.5/GI1.5

TACTICIAN: AI-based applications knowledge extraction from ESA’s mission scientific publications 

Omiros Giannakis, Iason Demiros, Konstantinos Koutroumbas, Athanasios Rontogiannis, Vassilis Antonopoulos, Guido De Marchi, Christophe Arviset, George Balasis, Athanasios Daglis, George Vasalos, Zoe Boutsi, Jan Tauber, Marcos Lopez-Caniego, Mark Kidger, Arnaud Masson, and Philippe Escoubet

Scientific publications in space science contain valuable and extensive information regarding the links and relationships between the data interpreted by the authors and the associated observational elements (e.g., instruments or experiments names, observing times, etc.). In this reality of scientific information overload, researchers are often overwhelmed by an enormous and continuously growing number of articles to access in their daily activities. The exploration of recent advances concerning specific topics, methods and techniques, the review and evaluation of research proposals and in general any action that requires a cautious and comprehensive assessment of scientific literature has turned into an extremely complex and time-consuming task.

The availability of Natural Language Processing (NLP) tools able to extract information from unstructured scientific text and to turn it into highly organized and interconnected knowledge is fundamental to the use of scientific information. Exploiting the knowledge contained in scientific publications necessitates state-of-the-art NLP. The semantic interpretation of scientific texts can support the development of a varied set of applications, such as information retrieval from the texts, linking to existing knowledge repositories, topic classification, semi-automatic assessment of publications and research proposals, tracking of scientific and technological advances, scientific intelligence-assisted reporting, review writing, and question answering.

The main objectives of TACTICIAN are to introduce Artificial Intelligence (AI) techniques to the textual analysis of the publications of all ESA Space Science missions, to monitor and evaluate the scientific productivity of the science missions, and to integrate the scientific publications’ metadata into the ESA Space Science Archive. Through TACTICIAN, we extract lexical, syntactic, and semantic information from the scientific publications by applying NLP and Machine Learning (ML) algorithms and techniques. Utilizing the wealth of publications, we have created valuable scientific language resources, such as labeled datasets and word embeddings, which were used to train Deep Learning models that assist us in most of the language understanding tasks. In the context of TACTICIAN, we have devised methodologies and developed algorithms that can assign scientific publications to the Mars Express, Herschel, and Cluster ESA science missions and identify selected named entities and observations in these scientific publications. We also introduced a new unsupervised ML technique, based on Nonnegative Matrix Factorization (NMF), for classifying the Planck mission scientific publications to categories according to the use of the Planck data products.

These methodologies can be applied to any other mission. The combination of NLP and ML constitutes a general basis which has proved able to assist in establishing links between the missions’ observations and the scientific publications, and in classifying them into categories, with high accuracy.
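As a toy illustration of assigning a publication to a mission from its text (the actual system uses labelled datasets, word embeddings and deep models, as described above), a bag-of-words cosine similarity against per-mission keyword profiles might look like this; the keyword profiles are invented for the example.

```python
import math
from collections import Counter

def tf_vector(text):
    """Simple term-frequency vector of a lower-cased, whitespace-split text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-frequency Counters."""
    shared = set(a) & set(b)
    num = sum(a[t] * b[t] for t in shared)
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

# Hypothetical mission keyword profiles; the real system learns its
# representations from the publications themselves.
MISSION_PROFILES = {
    "Mars Express": tf_vector("mars atmosphere surface express orbiter"),
    "Herschel": tf_vector("infrared herschel far-infrared dust galaxy"),
    "Cluster": tf_vector("magnetosphere cluster plasma solar wind"),
}

def assign_mission(abstract):
    """Return the mission whose keyword profile is most similar to the text."""
    scores = {m: cosine(tf_vector(abstract), p) for m, p in MISSION_PROFILES.items()}
    return max(scores, key=scores.get)
```

Even this crude sketch assigns "plasma measurements in the magnetosphere by cluster" to Cluster; the deep-learning pipeline replaces the hand-made profiles with learned representations.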

This work has received funding from the European Space Agency under the "ArTificiAl intelligenCe To lInk publiCations wIth observAtioNs (TACTICIAN)" activity under ESA Contract No 4000128429/19/ES/JD.

How to cite: Giannakis, O., Demiros, I., Koutroumbas, K., Rontogiannis, A., Antonopoulos, V., De Marchi, G., Arviset, C., Balasis, G., Daglis, A., Vasalos, G., Boutsi, Z., Tauber, J., Lopez-Caniego, M., Kidger, M., Masson, A., and Escoubet, P.: TACTICIAN: AI-based applications knowledge extraction from ESA’s mission scientific publications, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1902, https://doi.org/10.5194/egusphere-egu23-1902, 2023.

EGU23-2388 | ECS | Orals | ITS1.5/GI1.5

Deep learning based identification of carbonate rock components in core images 

Harriet Dawson and Cédric John

Identification of constituent grains in carbonate rocks is primarily a qualitative skill requiring specialist experience. A carbonate sedimentologist must be able to distinguish between various grains of different ages, preserved in differing alteration stages, and cut in random orientations across core sections. Recent studies have demonstrated the effectiveness of machine learning in classifying lithofacies from thin section, core and seismic images, with faster analysis times and a reduction of natural biases. In this study, we explore the application and limitations of convolutional neural network (CNN) based object detection frameworks to identify and quantify multiple types of carbonate grains within close-up core images. We compiled nearly 400 high-resolution core images from three ODP and IODP expeditions. Over 9,000 individual carbonate components of 11 different classes were manually labelled from this dataset. Using transfer learning, we evaluate one-stage (YOLO v3) and two-stage (Faster R-CNN) detectors with different feature extractors (Darknet and Inception-ResNet-v2). Despite the current popularity of one-stage detectors, our results show that Faster R-CNN with an Inception-ResNet-v2 backbone provides the most robust performance, achieving nearly 0.8 mean average precision (mAP). Furthermore, we extend the approach by deploying the trained model to ODP Leg 194 Sites 1196 and 1190, developing a performance comparison with human interpretation.
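The mAP figure quoted above is the mean, over classes, of the average precision (AP) of ranked detections. A simplified single-class AP computation (greedy IoU matching at a 0.5 threshold, area under the stepwise precision-recall curve; not the exact evaluation protocol of the study) can be sketched as:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def average_precision(detections, truths, iou_thr=0.5):
    """AP for one class: detections = [(score, box)], truths = [box]."""
    detections = sorted(detections, key=lambda d: d[0], reverse=True)
    matched = set()
    tp, fp, curve = 0, 0, []
    for score, box in detections:
        # greedily match the detection to the best unmatched ground truth
        best, best_i = 0.0, None
        for i, t in enumerate(truths):
            if i not in matched and iou(box, t) > best:
                best, best_i = iou(box, t), i
        if best >= iou_thr:
            matched.add(best_i)
            tp += 1
        else:
            fp += 1
        curve.append((tp / (tp + fp), tp / len(truths)))  # (precision, recall)
    # integrate the stepwise precision-recall curve (no interpolation)
    ap, prev_recall = 0.0, 0.0
    for precision, recall in curve:
        if recall > prev_recall:
            ap += precision * (recall - prev_recall)
            prev_recall = recall
    return ap
```

mAP is then the mean of this value across the 11 grain classes; a single perfectly matched detection of a single ground-truth box yields an AP of 1.0.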

How to cite: Dawson, H. and John, C.: Deep learning based identification of carbonate rock components in core images, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2388, https://doi.org/10.5194/egusphere-egu23-2388, 2023.

EGU23-3997 | ECS | Orals | ITS1.5/GI1.5

Artificial Intelligence Models for Detecting Spatiotemporal Crop Water Stress in schedule Irrigation: A review 

Elham Koohi, Silvio Jose Gumiere, and Hossein Bonakdari

Water used in agricultural crops can be managed by irrigation scheduling based on plant water stress thresholds. Automated irrigation scheduling limits crop physiological damage and yield reduction. Knowledge of crop water stress monitoring approaches can be effective in optimizing the use of agricultural water. Understanding the physiological mechanisms by which crops respond and adapt to water deficit ensures sustainable agricultural management and food supply. This aim can be achieved by analyzing stomatal conductance, growth rate, leaf water potential, and stem water potential. Calculating thresholds of soil matric potential and available water content improves the precision of irrigation management by preventing water limitations between irrigations. Crop monitoring and irrigation management make informed decisions using geospatial technologies, the Internet of Things, big data analysis, and artificial intelligence. Remote sensing (RS) can be applied whenever in situ data are not available. High-resolution crop mapping extracts information through index-based methods fed by the multitemporal and multi-sensor data used in detection and classification. Precision Agriculture (PA) means applying farm inputs in the right amount, at the right time, and in the right place. RS in PA captures images at different spatial and spectral resolutions through in-field, satellite, and aerial sensors, such as those carried by unmanned aerial vehicles (UAVs), as well as handheld or tractor-mounted sensors. RS sensors receive the electromagnetic signals of plant responses in different spectral domains. Optical satellite data, including narrow-band multispectral remote sensing techniques and thermal imagery, are used for water stress detection. To process and analyse RS data, cloud storage and computing platforms simplify the complex mathematics of incorporating various datasets for irrigation scheduling. Machine learning (ML) algorithms construct models for the regression and classification of multivariate and non-linear crop mapping. Web-based software that gathers all these different datasets yields a reliable product that reinforces farmers’ ability to make appropriate decisions in irrigating agricultural crops.
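As a concrete example of the thermal-imagery-based water stress detection mentioned above, the classical Crop Water Stress Index (CWSI) normalises canopy temperature between wet (non-stressed) and dry (non-transpiring) baselines. The standard formula is offered here as background; the review itself does not prescribe a specific index.

```python
def cwsi(canopy_t, wet_baseline_t, dry_baseline_t):
    """Crop Water Stress Index: 0 = well-watered, 1 = fully stressed.

    All temperatures in the same units (e.g. deg C); the baselines come
    from a non-stressed (wet) and a non-transpiring (dry) reference.
    """
    if dry_baseline_t <= wet_baseline_t:
        raise ValueError("dry baseline must exceed wet baseline")
    index = (canopy_t - wet_baseline_t) / (dry_baseline_t - wet_baseline_t)
    return min(1.0, max(0.0, index))  # clamp measurement noise to [0, 1]
```

A canopy at 28 °C between a 24 °C wet and a 34 °C dry baseline gives a CWSI of 0.4; an irrigation threshold can then be set on this index.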

Keywords: Agricultural crops; Crop water stress detection; Irrigation scheduling; Precision agriculture; Remote Sensing.

How to cite: Koohi, E., Gumiere, S. J., and Bonakdari, H.: Artificial Intelligence Models for Detecting Spatiotemporal Crop Water Stress in schedule Irrigation: A review, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3997, https://doi.org/10.5194/egusphere-egu23-3997, 2023.

EGU23-6696 | ECS | Orals | ITS1.5/GI1.5 | Highlight

Satellite-based continental-scale inventory of European wetland types at 10m spatial resolution 

Gyula Mate Kovács, Stefan Oehmcke, Stéphanie Horion, Dimitri Gominski, Xiaoye Tong, and Rasmus Fensholt

Wetlands provide invaluable services for ecosystems and society and are a crucial instrument in our fight against climate change. Although Earth Observation satellites offer cost-effective and accurate information about wetland status at the continental scale, to date there is no universally accepted, standardized, and regularly updated inventory of European wetlands at a resolution finer than 100 m. Moreover, previous satellite-based global land cover products seldom account for wetland diversity, which often impairs their mapping performance. Here, we mapped major wetland types (i.e., peatland, marshland, and coastal wetlands) across Europe for 2018, based on high-resolution (10 m) optical and radar time series satellite data as well as field-collected land cover information (LUCAS), using an ensemble model combining traditional machine learning and deep learning approaches. Our results show with high accuracy (>85%) that a substantial extent of European peatlands was previously classified as grassland and other land cover types. In addition, our map highlights cultivated areas (e.g., river floodplains) that can potentially be rewetted. Such accurate and consistent mapping of different wetland types at a continental scale offers a baseline for future wetland monitoring and trend assessment, supports the detailed reporting of European carbon budgets, and lays the foundation towards a global wetland inventory.
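The late-fusion step of an ensemble like the one described can be illustrated with a simple per-pixel majority vote; this is a generic stand-in, since the abstract does not specify the actual combination rule used for the machine learning and deep learning members.

```python
from collections import Counter

def ensemble_vote(per_model_predictions):
    """Majority vote across models for each pixel/sample.

    `per_model_predictions` is a list of equal-length label sequences,
    one per ensemble member (e.g. a random forest and a deep network).
    Ties resolve to the label seen first, a deliberate simplification.
    """
    return [Counter(votes).most_common(1)[0][0]
            for votes in zip(*per_model_predictions)]
```

For two pixels and three models, `ensemble_vote([["peat", "marsh"], ["peat", "coast"], ["grass", "coast"]])` returns `["peat", "coast"]`.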

How to cite: Kovács, G. M., Oehmcke, S., Horion, S., Gominski, D., Tong, X., and Fensholt, R.: Satellite-based continental-scale inventory of European wetland types at 10m spatial resolution, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6696, https://doi.org/10.5194/egusphere-egu23-6696, 2023.

EGU23-8409 | ECS | Orals | ITS1.5/GI1.5

Evaluation of lagoon eutrophication potential under climate change conditions: A novel water quality machine learning and biogeochemical-based framework. 

Federica Zennaro, Elisa Furlan, Donata Melaku Canu, Leslie Aveytua Alcazar, Ginevra Rosati, Sinem Aslan, Cosimo Solidoro, and Andrea Critto

Lagoons are highly valued coastal environments providing unique ecosystem services. However, they are fragile and vulnerable to natural processes and anthropogenic activities. Concurrently, climate change pressures are likely to lead to severe ecological impacts on lagoon ecosystems. Among these, direct effects act mainly through changes in temperature and the associated physico-chemical alterations, whereas indirect ones, mediated through processes such as extreme weather events in the catchment, include, among others, the alteration of nutrient loading patterns, which can in turn modify trophic states, leading to depletion or to eutrophication. This phenomenon can lead, under certain circumstances, to harmful algal bloom events, anoxia, and mortality of aquatic flora and fauna, or to the reduction of primary production, with cascading effects on the whole trophic web and dramatic consequences for aquaculture, fishery, and recreational activities. The complexity of eutrophication processes, characterized by compounding and interconnected pressures, highlights the need for adequately sophisticated methods to estimate future ecological impacts on fragile lagoon environments. In this context, a novel framework combining Machine Learning (ML) and biogeochemical models is proposed, leveraging the potential offered by both approaches to unravel and model environmental systems characterized by compounding pressures. Multi-Layer Perceptron (MLP) and Random Forest (RF) models are trained, validated, and tested within the Venice Lagoon case study to assimilate heterogeneous historical WQ data (i.e., water temperature, salinity, and dissolved oxygen) and spatio-temporal information (i.e., monitoring station location and month), and to predict changes in chlorophyll-a (Chl-a) conditions.
Then, projections from the biogeochemical model SHYFEM-BFM for the 2049 and 2099 timeframes under RCP 8.5 are integrated to evaluate Chl-a variations under future biogeochemical conditions forced by climate change projections. Annual and seasonal Chl-a predictions were performed by class, based on two classification modes established from descriptive statistics of the baseline data: i) binary classification of Chl-a values under and over the median value; ii) multi-class classification defined by Chl-a quartiles. Results from the case study show that the RF successfully classifies Chl-a under the baseline scenario, with an overall model accuracy of about 80% for the median classification mode and 61% for the quartile classification mode. Overall, a decreasing trend can be observed for the lowest Chl-a values (below the first quartile, i.e. 0.85 µg/l), with an opposite, rising trend for the highest Chl-a values (above the third quartile, i.e. 2.78 µg/l). At the seasonal level, summer remains the season with the highest Chl-a values in all scenarios, although in 2099 a strong increase in Chl-a is also expected during spring. The proposed novel framework represents a valuable approach to strengthening both eutrophication modelling and scenario analysis by placing artificial intelligence-based models alongside biogeochemical models.
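The quartile classification mode described above amounts to binning each Chl-a value by thresholds computed from the baseline data. A minimal sketch, with illustrative class labels 1 to 4 rather than the study's exact implementation:

```python
import bisect
import statistics

def quartile_classes(baseline, values):
    """Assign each value to a class 1-4 using the baseline Chl-a quartiles.

    The thresholds are the quartiles of the baseline observations, mirroring
    the multi-class mode; class labels 1-4 are illustrative.
    """
    q1, q2, q3 = statistics.quantiles(baseline, n=4)
    return [bisect.bisect_right([q1, q2, q3], v) + 1 for v in values]

def overall_accuracy(predicted, observed):
    """Fraction of class predictions matching the observed classes."""
    hits = sum(p == o for p, o in zip(predicted, observed))
    return hits / len(observed)
```

With a baseline of `[1, 2, 3, 4, 5, 6, 7, 8]`, the values `[1, 5, 7]` fall into classes 1, 3 and 4; `overall_accuracy` then gives the kind of class-level accuracy reported for the RF model.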

How to cite: Zennaro, F., Furlan, E., Melaku Canu, D., Aveytua Alcazar, L., Rosati, G., Aslan, S., Solidoro, C., and Critto, A.: Evaluation of lagoon eutrophication potential under climate change conditions: A novel water quality machine learning and biogeochemical-based framework., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8409, https://doi.org/10.5194/egusphere-egu23-8409, 2023.

EGU23-8702 | ECS | Orals | ITS1.5/GI1.5 | Highlight

Evaluating the risk of cumulative impacts in the Mediterranean Sea using a Random Forest model 

Angelica Bianconi, Elisa Furlan, Christian Simeoni, Vuong Pham, Sebastiano Vascon, Andrea Critto, and Antonio Marcomini

Marine coastal ecosystems (MCEs) are of vital importance for human health and well-being. However, their ecological condition is increasingly threatened by multiple risks induced by the complex interplay between endogenic (e.g. coastal development, shipping traffic) and exogenic (e.g. changes in sea surface temperature, waves, sea level) pressures. Assessing the cumulative impacts resulting from this dynamic interplay is a major challenge for achieving the Sustainable Development Goals and biodiversity targets, as well as for driving ecosystem-based management in marine coastal areas. To this aim, a Machine Learning model (i.e. Random Forest, RF), integrating heterogeneous data on multiple pressures and on ecosystem health and biodiversity, was developed to support the evaluation of risk scenarios affecting seagrass condition and services capacity within the Mediterranean Sea. The RF model was trained, validated and tested by exploiting data collected from different open-source data platforms (e.g. Copernicus Services) for the 2017 baseline. Moreover, based on the designed RF model, a future scenario analysis was performed by integrating projections from numerical climate models for sea surface temperature and salinity for the 2050 and 2100 timeframes. Under the baseline scenario, the model achieved an overall accuracy of about 82%. Overall, the results of the analysis showed that the ecological condition and services capacity of seagrass meadows (i.e. spatial distribution, Shannon index, carbon sequestration) are mainly threatened by human-related pressures linked to coastal development (e.g. distance from main urban centres), as well as by changes in nutrient concentration and sea surface temperature. This also emerged from the scenario analysis, which highlighted a decrease in seagrass coverage and related services capacity in both the 2050 and 2100 timeframes.
The developed model provides useful predictive insight into possible future ecosystem conditions in response to multiple pressures, supporting marine managers and planners towards more effective ecosystem-based adaptation and management measures in MCEs.

How to cite: Bianconi, A., Furlan, E., Simeoni, C., Pham, V., Vascon, S., Critto, A., and Marcomini, A.: Evaluating the risk of cumulative impacts in the Mediterranean Sea using a Random Forest model, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8702, https://doi.org/10.5194/egusphere-egu23-8702, 2023.

EGU23-10681 | Orals | ITS1.5/GI1.5

EarthQA: A Question Answering Engine for Earth Observation Data Archives 

Dharmen Punjani, Eleni Tsalapati, and Manolis Koubarakis

The standard way for earth observation experts or users to retrieve images from image archives (e.g., ESA's Copernicus Open Access Hub) is to use a graphical user interface, where they can select the geographical area of the image they are interested in and can additionally specify other metadata, such as sensing period, satellite platform and cloud cover.

In this work, we are developing the question-answering engine EarthQA, which takes as input a question expressed in natural language (English) asking for satellite images that satisfy certain criteria, and returns links to such datasets, which can then be downloaded from the CREODIAS cloud platform. To answer user questions, EarthQA queries two interlinked knowledge graphs: a knowledge graph encoding metadata of satellite images from the CREODIAS cloud platform (the SPARQL endpoint of CREODIAS) and the well-known knowledge graph DBpedia. Hence, questions can refer to image metadata (e.g., satellite platform, sensing period, cloud cover), but also to more generic entities appearing in the DBpedia knowledge graph (e.g., lake, Greece). In this way, users can ask questions like “Find all Sentinel-1 GRD images taken during October 2021 that show large lakes in Greece having an area greater than 100 square kilometers”.

EarthQA follows a template-based approach to translate natural language questions into formal queries (SPARQL). Initially, it decomposes the user question by generating its dependency parse tree and then automatically disambiguates the components appearing in the question to elements of the two knowledge graphs.  In particular, it automatically identifies the spatial or temporal entities (e.g., “Greece”, “October 2021”), concepts (e.g., “lake”), spatial or temporal relations (e.g., “in”, “during”), properties (e.g., “area”) and product types (e.g., “Sentinel-1 GRD”) and other metadata (e.g., “cloud cover below 10%”) mentioned in the question and maps them to the respective elements appearing in the two knowledge graphs (dbr:Greece, dbo:Lake, dbp:area, etc). After this, the SPARQL query is automatically generated.
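The final step of the template-based approach can be sketched as slotting the disambiguated elements into a SPARQL template. The property names and the template below are simplified stand-ins invented for illustration, not the actual CREODIAS/DBpedia vocabulary used by EarthQA.

```python
# Illustrative sketch: once entities and constraints have been disambiguated,
# they are inserted into a SPARQL template. All property/class names here
# are hypothetical placeholders.
TEMPLATE = """SELECT ?image WHERE {{
  ?image a {product_type} ;
         :sensingStart ?t ;
         :intersects {place} .
  FILTER (?t >= "{start}"^^xsd:dateTime && ?t < "{end}"^^xsd:dateTime)
}}"""

def build_query(product_type, place, start, end):
    """Fill the SPARQL template with disambiguated question elements."""
    return TEMPLATE.format(product_type=product_type, place=place,
                           start=start, end=end)
```

For the example question above, the disambiguated elements would be passed as something like `build_query(":Sentinel1GRD", "dbr:Greece", "2021-10-01T00:00:00", "2021-11-01T00:00:00")`, producing a runnable SPARQL query string.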

How to cite: Punjani, D., Tsalapati, E., and Koubarakis, M.: EarthQA: A Question Answering Engine for Earth Observation Data Archives, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10681, https://doi.org/10.5194/egusphere-egu23-10681, 2023.

EGU23-11527 | ECS | Posters on site | ITS1.5/GI1.5

Global Layer——An integrated, fully online, cloud based platform 

Xingchen Yang, Yang Song, Zhenhan Wu, and Chaowei Wu

In the current stage of scientific research, it is necessary to break the barriers between traditional disciplines and promote the cross-integration of related disciplines. Maps, as one of the important carriers of the research achievements of various disciplines, can be superimposed and integrated to display the results of multidisciplinary integration more intuitively, promote the integration of disciplines, and reveal new scientific problems. Traditional geological mapping has often been carried out at a single scale, aimed at paper-printed products, which makes it difficult to read maps across different scales at the same time. To solve this problem, an integrated platform named Global Layer is being built with the support of Deep-time Digital Earth (DDE). Global Layer embeds several core databases, such as the Geological Map of the World at a scale of 1:5M and the Global Geothermal Database. These databases are presented as electronic maps, enabling results at different scales to be displayed and browsed through one-stop hierarchical navigation. In addition, users can upload data in four ways: local files, database connections, cloud files, and ArcGIS data services; data and mapping results can be shared to Facebook, Twitter and other platforms in the form of links, widgets, etc. The construction of Global Layer provides experience and a foundation for integrating global geological map databases and building data platforms.

How to cite: Yang, X., Song, Y., Wu, Z., and Wu, C.: Global Layer——An integrated, fully online, cloud based platform, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11527, https://doi.org/10.5194/egusphere-egu23-11527, 2023.

EGU23-12373 | ECS | Posters on site | ITS1.5/GI1.5

Mapping streams and ditches using Aerial Laser Scanning 

Mariana Dos Santos Toledo Busarello, Anneli Ågren, and William Lidberg

Streams and ditches are seldom identified on current maps due to their small dimensions and sometimes intermittent nature. Estimates suggest that only 9% of all ditches are currently mapped, and the underestimation of natural streams is a global issue. Ditches have been dug in European boreal forests and parts of North America to drain wetlands and increase forest production, boosting the availability of cultivable land and amounting to a national-scale landscape modification. Target 6.6 of the 2030 Agenda highlights the importance of protecting and restoring water-related ecosystems. Wetlands are a substantial part of this, with a high carbon storage capability and the capacity to mitigate floods and purify water. Taken together, a reversal of these anthropogenic alterations may be on the horizon, all the more so because ditches are also strong emitters of methane and other greenhouse gases due to their anoxic water and sediment accumulation. However, streams and ditches that are missing from maps and databases are difficult to manage.

The main focus of this study was to develop a method to map channels by combining deep learning with national Aerial Laser Scans (ALS). The performance of different topographical indices derived from the ALS data was evaluated, and two Digital Elevation Model (DEM) resolutions were compared. Ditch channels and natural streams were manually digitized from ten regions across Sweden, totalling 1,923 km of ditch channels and 248 km of natural streams. The topographical indices used were: high-pass median filter, slope, sky-view factor and hillshade (with azimuths of 0°, 45°, 90° and 135°); the DEM resolutions analysed were 0.5 m and 1 m. A U-net model was trained to segment images into ditch and stream channels: every pixel in each image was labelled, so that pixels of the same class display similar attributes.
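
To illustrate one of the terrain indices listed above: a high-pass median filter subtracts the neighbourhood median from each cell, so incised channels stand out as negative values. This pure-Python sketch (window size and the toy DEM are made up for illustration) shows the idea:

```python
from statistics import median

def high_pass_median(dem, radius=1):
    """High-pass median filter: cell elevation minus the median of its
    neighbourhood. Incised channels (local depressions) come out negative."""
    rows, cols = len(dem), len(dem[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            window = [
                dem[a][b]
                for a in range(max(0, i - radius), min(rows, i + radius + 1))
                for b in range(max(0, j - radius), min(cols, j + radius + 1))
            ]
            out[i][j] = dem[i][j] - median(window)
    return out

# A flat 1 m DEM with a 0.5 m-deep ditch running down the middle column:
dem = [[1.0, 1.0, 0.5, 1.0, 1.0] for _ in range(5)]
filtered = high_pass_median(dem)  # ditch cells become negative
```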

Results showed that ditches can be successfully mapped with this method, and that it can be applied almost anywhere, since only local terrain indices are required. Additionally, when natural streams were present in the dataset, the model underperformed in predicting the location of ditches, while a higher DEM resolution had the opposite effect and improved predictions. Streams were more challenging to map, and the model only indicated the channels, not whether they contained water. Further research will be required to combine hydrological modelling and deep learning.

How to cite: Dos Santos Toledo Busarello, M., Ågren, A., and Lidberg, W.: Mapping streams and ditches using Aerial Laser Scanning, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12373, https://doi.org/10.5194/egusphere-egu23-12373, 2023.

EGU23-13099 | ECS | Posters virtual | ITS1.5/GI1.5

Mapping Swedish Soils with High-resolution DEM-derived Indices and Machine Learning 

Yiqi Lin, William Lidberg, Cecilia Karlsson, and Anneli Ågren

There is a soaring demand for up-to-date and spatially explicit soil information to address various environmental challenges. One of the most basic pieces of information, essential for research and decision-making in multiple disciplines, is soil classification. Conventional soil maps are often low in spatial resolution and lack the complexity to be practical for hands-on use. Digital Soil Mapping (DSM) has emerged as an efficient alternative thanks to its reproducibility, updatability, accuracy and cost-effectiveness, as well as its ability to quantify uncertainties.

Despite DSM’s growing popularity and increasingly wider areas of application, soil information is still rare in forested areas and remote regions, and its integration with high-resolution data on a country scale remains limited. In Sweden, quaternary deposit maps created by the Geological Survey of Sweden (SGU) have been the main reference input for soil-related research and operation, though most parts of the country still warrant higher-quality representation. This study utilizes machine learning to produce a high-resolution surficial deposits map with nationwide coverage, capable of supporting research and decision-making. More specifically, it: i) compares the performance of two tree-based ensemble machine learning models, Extreme Gradient Boosting and Random Forest, in predictive mapping of soils across the entire country of Sweden; ii) determines the best model for spatial prediction of soil classes and estimates the associated uncertainty of the inferred map; iii) discusses the advantages and limitations of this approach; and iv) outputs a map product of soil classes at 2-m resolution. Similar attempts around the globe have shown promising results, though at coarser resolutions and/or of smaller geographical extent. The main assumptions behind this study are: i) terrain indices derived from a digital elevation model (DEM) are useful predictors of soil type, though different classification algorithms differ in their effectiveness; ii) machine learning can capture the major soil classes that cover most of Sweden, but expert geological and pedological knowledge is required when identifying rare soil types.

To achieve this, approximately 850,000 labeled soil points extracted from the most accurate SGU maps will be combined with a stack of 12 LiDAR DEM-derived topographic and hydrological indices and 4 environmental datasets. Uncertainty estimates of the overall model and for each soil class will be presented. An independent dataset obtained from the Swedish National Forest Soil Inventory will be used to assess the accuracy of the machine learning model. The presentation will cover the method, data handling, and some promising preliminary results.
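
One common way to derive a per-pixel uncertainty from a tree ensemble (the study's exact formulation is not specified here, so treat this as an illustrative assumption) is the normalized entropy of the class-vote distribution:

```python
from math import log

def vote_entropy(votes):
    """Normalized Shannon entropy of ensemble class votes.
    0 = all trees agree (certain); 1 = votes evenly split (maximally uncertain).
    `votes` maps class name -> number of trees voting for it."""
    total = sum(votes.values())
    probs = [v / total for v in votes.values() if v > 0]
    if len(probs) <= 1:
        return 0.0
    h = -sum(p * log(p) for p in probs)
    return h / log(len(votes))  # normalize by max entropy over all classes

# Votes from a hypothetical 100-tree ensemble over three soil classes:
certain = vote_entropy({"till": 98, "peat": 1, "sand": 1})
uncertain = vote_entropy({"till": 34, "peat": 33, "sand": 33})
```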

How to cite: Lin, Y., Lidberg, W., Karlsson, C., and Ågren, A.: Mapping Swedish Soils with High-resolution DEM-derived Indices and Machine Learning, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13099, https://doi.org/10.5194/egusphere-egu23-13099, 2023.

As human activities continue to expand and evolve, their impact on the planet is becoming more evident. In recent years, Murmuration has been studying one of the most destructive trends to have taken off recently: mass tourism. In Malta, tourism had been on the rise since before the Covid-19 pandemic, and now that travel restrictions are lifting, this trend is likely to resume in the coming years. Since Malta’s economy is largely based on tourism, it is essential that this activity does not degrade the areas in which it takes place. To address these issues and ensure sustainable development, governments and organizations follow the Sustainable Development Goals (SDGs): a set of 17 goals adopted by the United Nations in 2015 to provide a framework to help countries pursue sustainable economic, social and environmental development. They include objectives for mitigating climate change, preventing water pollution and biodiversity degradation, and providing economic benefits to local communities.

To help territories like the islands of Malta cope with these environmental issues, Murmuration carries out studies on various ecological, human and economic indicators. Using Earth-imagery data from the Sentinel satellites of the European Copernicus programme makes it possible to collect geolocated, hourly values of air quality indicators such as NO2, CO and other pollutants, as well as of water quality and vegetation health. Other data sources give access to land cover at metre resolution, the locations of tourism infrastructure and many more human-activity variables. This information is processed into understandable indicators: aggregated indexes that take international standards and the SDGs into account in their design and usage. An example of such standards is the WHO air quality guidelines, which provide thresholds quantifying the health impact of air pollution in the area of interest. The last step is to gather all the computed data, maps and correlations and design understandable visualizations usable by territorial management bodies, enabling efficient decision-making and risk management. The goal is to link satellite imagery, internationally agreed political commitments and ground-level decision-making.
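
A threshold-based indicator of the kind described can be sketched as follows; the guideline value used (the WHO 2021 annual NO2 guideline level of 10 µg/m³) and the aggregation rule are illustrative assumptions, not Murmuration's actual indicator design:

```python
# Illustrative sketch: turn NO2 observations (µg/m³) into a simple
# exceedance indicator against a guideline threshold.
NO2_GUIDELINE = 10.0  # µg/m³ (WHO 2021 annual guideline level, used here
                      # as an illustrative threshold)

def exceedance_fraction(values, threshold=NO2_GUIDELINE):
    """Fraction of observations above the guideline threshold."""
    above = sum(1 for v in values if v > threshold)
    return above / len(values)

# Hypothetical hourly readings for one grid cell:
readings = [8.0, 12.5, 9.9, 15.0, 7.2, 11.1, 9.0, 10.5]
frac = exceedance_fraction(readings)
```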

This meaningful aggregation comes in the shape of operational dashboards. A dashboard is an up-to-date, interactive, evolving online tool hosting temporally and geographically linked visualizations of various indicators. This kind of tool allows for a better understanding of the dynamics of a territory in terms of environmental state, human impact and ecological potential.

How to cite: Plantec, M. and Castel, F.: From satellite data and Sustainable Development Goals to interactive tools and better territorial decision making, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14519, https://doi.org/10.5194/egusphere-egu23-14519, 2023.

EGU23-15656 | Posters virtual | ITS1.5/GI1.5

Karst integration into groundwater recharge simulation in WaterGAP 

Wenhua Wan and Petra Döll

Karst aquifers provide a significant portion of the global water supply. However, a proper representation of groundwater recharge in karst areas is absent from state-of-the-art global hydrological models. This study, based on the new version of the global hydrological model WaterGAP, (1) presents the first modelling of diffuse groundwater recharge (GWR) in all karst regions using the global map of karstifiable rocks, and (2) adjusts the current GWR algorithm with up-to-date databases of slope and soil. A large number of ground-based recharge estimates on 818 half-degree cells, including 75 in karst areas, were compared to model results. Assuming GWR in karst landscapes to be equal to the runoff from soil leads to unbiased estimation. The majority of simulated mean annual recharge ranges from 0.6 mm/yr (10th percentile) to 326.9 mm/yr (90th percentile) in non-karst regions, and from 7.5 mm/yr (10th) to 740.2 mm/yr (90th) in karst regions. According to ground-based estimates, the recharge rate in karst regions ranges from 2% to 66% of precipitation; the simulated GWR produces recharge fractions between 4% (10th) and 68% (90th) in karst areas, while in non-karst areas it rarely exceeds 25%. Unlike previous studies, which claimed that global hydrological models consistently underestimate recharge, we observed underestimation only in very humid regions where recharge exceeds 300 mm/yr. These very high recharge estimates are likely to include preferential flow and are derived at a finer spatial and temporal scale than the global model. In karst landscapes and arid regions, we demonstrate that WaterGAP with the karst algorithm performs well.
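
The karst adjustment described above can be sketched as follows; the non-karst recharge factor and cap are simplified stand-ins for WaterGAP's actual slope- and soil-dependent algorithm:

```python
def groundwater_recharge(soil_runoff, is_karst, recharge_factor=0.25,
                         gwr_max=None):
    """Diffuse groundwater recharge for one grid cell (mm/yr).
    In karst cells, GWR is set equal to the runoff from soil; elsewhere
    only a fraction of runoff recharges the aquifer. The factor and cap
    here are illustrative placeholders, not WaterGAP's calibrated values."""
    if is_karst:
        return soil_runoff
    gwr = recharge_factor * soil_runoff
    if gwr_max is not None:
        gwr = min(gwr, gwr_max)
    return gwr

karst = groundwater_recharge(400.0, is_karst=True)      # all runoff recharges
nonkarst = groundwater_recharge(400.0, is_karst=False)  # only a fraction does
```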


How to cite: Wan, W. and Döll, P.: Karst integration into groundwater recharge simulation in WaterGAP, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15656, https://doi.org/10.5194/egusphere-egu23-15656, 2023.

EGU23-16252 | ECS | Posters on site | ITS1.5/GI1.5

GEOTEK: Extracting Marine Geological Data from Publications 

Muhammad Asif Suryani, Christian Beth, Klaus Wallmann, and Matthias Renz

In Marine Geology, scientists continually perform extensive experiments to measure diverse features across the globe in order to estimate environmental changes. For example, the Mass Accumulation Rate (MAR) and Sedimentation Rate (SR) are measured by marine geologists at various oceanographic locations and are largely reported in research publications, but have not been compiled in any central database. Furthermore, every MAR and SR observation normally carries i) exact locational information (longitude and latitude), ii) the method of measurement (e.g., stratigraphy, 210Pb), iii) a numerical value and units (e.g., 2.4 g/m2/yr), and iv) a temporal feature (e.g., a hundred years ago). The contextual information attached to MAR and SR observations is heterogeneous, and manual approaches for extracting it from text are infeasible. It is also worth mentioning that MAR and SR are not reported in standard international (SI) units.

We propose GEOTEK (Geological Text to Knowledge), a comprehensive end-to-end framework to extract targeted information from marine geology publications. The framework comprises three modules. The first module combines a document relevance model, which filters relevant sources using metadata, with a PDF extractor that extracts text, tables and metadata. The second module comprises two information extractors, Geo-Quantities and Geo-Spacy, trained on text from the Marine Geology domain. Geo-Quantities extracts relevant numerical information from the text and covers more than 100 unit variants for MAR and SR, while Geo-Spacy extracts a set of relevant named entities, including locational entities, which are further processed to obtain the respective geocode boundaries. The third module, the Heterogeneous Information Linking (HIL) module, processes exact spatial information from tables and captions and links it to the previously extracted measurements. Finally, all linked information is presented in an interactive map view.
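
The core idea behind a quantity extractor of this kind can be sketched with a regular expression; the pattern and the unit-conversion table below are illustrative assumptions (the real Geo-Quantities extractor covers more than 100 unit variants):

```python
import re

# Illustrative sketch: pull a measurement value and unit out of free text
# and normalize a couple of unit spellings to SI (kg/m2/yr). The pattern
# and conversion table are assumptions, not GEOTEK's actual extractor.
PATTERN = re.compile(r"(\d+(?:\.\d+)?)\s*(g|kg)\s*/?\s*m2\s*/?\s*(?:yr|a)")
TO_KG = {"g": 1e-3, "kg": 1.0}

def extract_mar(text):
    """Return (value_in_kg_per_m2_per_yr, matched_text) pairs."""
    results = []
    for m in PATTERN.finditer(text):
        value, unit = float(m.group(1)), m.group(2)
        results.append((value * TO_KG[unit], m.group(0)))
    return results

hits = extract_mar("A MAR of 2.4 g/m2/yr was observed near the core site.")
```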

How to cite: Suryani, M. A., Beth, C., Wallmann, K., and Renz, M.: GEOTEK: Extracting Marine Geological Data from Publications, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16252, https://doi.org/10.5194/egusphere-egu23-16252, 2023.

EGU23-16813 | ECS | Posters on site | ITS1.5/GI1.5 | Highlight

The Use of Artificial Intelligence in ESA’s Climate Change Initiative 

Anna Jungbluth, Ed Pechorro, Clement Albergel, and Susanne Mecklenburg

Climate change is arguably the greatest environmental challenge facing humankind in the twenty-first century. The United Nations Framework Convention on Climate Change (UNFCCC) facilitates multilateral action to combat climate change and its impacts on humanity and ecosystems. To make decisions on climate change mitigation and adaptation, the UNFCCC requires systematic observations of the global climate system.

The objective of ESA’s climate programme, currently delivered via the Climate Change Initiative (CCI), is to realise the full potential of the long-term, global-scale, satellite Earth observation archive that ESA and its Member States have established over the last 35 years, as a significant and timely contribution to the climate data record required by the UNFCCC.

Since 2010, the programme has contributed to a rapidly expanding body of scientific knowledge on >22 Essential Climate Variables (ECVs), through the production of Climate Data Records (CDRs). Although varying across geophysical parameters, ESA CDRs follow community-driven data standards, facilitating inter- and cross-ECV research of the climate system.

In this work, we highlight the use of artificial intelligence (AI) in the context of the ESA CCI. AI has played a pivotal role in the production and analysis of these Climate Data Records. Eleven CCI projects - Greenhouse Gases (GHG), Aerosols, Clouds, Fire, Ocean Colour, Sea Level, Soil Moisture, High Resolution Landcover, Biomass, Permafrost, and Sea Surface Salinity - have applied AI in their data record production and research or have identified specific AI usage for their research roadmaps.

The use of AI in these CCI projects is varied: for example, GHG CCI algorithms use random forest machine learning techniques, Aerosol CCI algorithms retrieve dust aerosol optical depth from thermal infrared spectra, and Fire CCI algorithms detect burned areas. Moreover, the ESA climate community has identified climate science gaps related to ECVs with the potential for meaningful advancement through AI.

We specifically focus on showcasing the use of AI for data homogenization and super-resolution of ESA CCI datasets. For instance, both the land cover and fire CCI datasets were generated globally at low resolution, while high-resolution data exist only for specific geographical regions. By adapting super-resolution algorithms to the specific science use cases, we can accelerate the generation of global, high-resolution datasets with the required temporal coverage to support long-term climate studies.
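
For context, the trivial baseline that any learned super-resolution model must beat is plain nearest-neighbour upsampling, where each coarse cell is simply replicated; this sketch (purely illustrative, not part of the CCI pipeline) shows that baseline:

```python
def nn_upsample(grid, factor):
    """Nearest-neighbour upsampling: replicate each coarse cell
    factor x factor times. No new information is created, which is why
    learned super-resolution is needed to recover fine-scale structure."""
    out = []
    for row in grid:
        wide = [v for v in row for _ in range(factor)]
        out.extend([list(wide) for _ in range(factor)])
    return out

# A 2x2 coarse land-cover grid upsampled to 4x4 (class labels are made up):
coarse = [[1, 2],
          [3, 4]]
fine = nn_upsample(coarse, 2)
```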

How to cite: Jungbluth, A., Pechorro, E., Albergel, C., and Mecklenburg, S.: The Use of Artificial Intelligence in ESA’s Climate Change Initiative, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16813, https://doi.org/10.5194/egusphere-egu23-16813, 2023.

In most places on the planet vegetation thrives, a phenomenon known as the “greening Earth”. However, in certain regions, especially in the Arctic, some areas exhibit a browning trend. This phenomenon is well known but not yet fully understood, and grasping its impact on local ecosystems requires the involvement of scientists from different disciplines, including the social sciences and humanities, as well as local populations. Here we focus on the Troms and Finnmark counties in northern Norway to assess the extent of the problem, its links with local environmental conditions, and its potential impacts.

We have chosen to adopt an open and collaborative process and take advantage of the services offered by RELIANCE on the European Open Science Cloud (EOSC). RELIANCE delivers a suite of innovative and interconnected services that extend the capabilities of EOSC to support the management of the research lifecycle within Earth science communities and among Copernicus users. The RELIANCE project has delivered three complementary technologies: Research Objects (ROs), Data Cubes and AI-based Text Mining. RoHub is a Research Object management platform that implements these three technologies and enables researchers to collaboratively manage, share and preserve their research work.

We will show how we are using these technologies along with EGI notebooks to work openly and share an executable Jupyter Notebook that is fully reproducible and reusable. We use a number of Python libraries from the Pangeo software stack, such as Xarray, Dask and Zarr. Our Jupyter Notebook is bundled with its computational environment, data cubes and related bibliographic resources in an executable Research Object. We believe that this approach can significantly speed up the research process and lead to more exploitable results.

Up to now, we have used indices derived from satellite data (in particular Sentinel-2) to assess how the vegetation cover in Troms and Finnmark counties has changed. To go further, we are investigating how to relate this information to relevant local parameters obtained from meteorological reanalysis data (ERA5 and ERA5-Land from ECMWF). This should give a good basis for training and testing an artificial intelligence algorithm, with the objective of assessing the possibility of “predicting” what is likely to happen in the near future to certain types of vegetation, such as the mosses and lichens that are essential for local populations and animals.
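
One vegetation index commonly derived from Sentinel-2 for this kind of assessment is NDVI (whether it is the exact index used here is an assumption); a minimal sketch with made-up band values:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red
    reflectance (for Sentinel-2, bands B8 and B4). A sustained multi-year
    decrease in NDVI is one way a browning trend shows up."""
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

# Hypothetical surface reflectances:
healthy = ndvi(0.45, 0.05)   # dense green vegetation: high NDVI
browning = ndvi(0.25, 0.15)  # stressed vegetation: lower NDVI
```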

How to cite: Iaquinta, J. and Fouilloux, A.: Using FAIR and Open Science practices to better understand vegetation browning in Troms and Finnmark (Norway), EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2579, https://doi.org/10.5194/egusphere-egu23-2579, 2023.

EGU23-3639 | Orals | ESSI2.8

Data Proximate Computation; Multi-cloud approach on European Weather Cloud and Amazon Web Services  

Armagan Karatosun, Michael Grant, Vasileios Baousis, Duncan McGregor, Richard Care, John Nolan, and Roope Tervo

Although using cloud infrastructure for big-data processing is increasingly common, the challenges of utilizing cloud infrastructures efficiently and effectively are often underestimated. This is especially true in multi-cloud scenarios where data are available only on a subset of the participating clouds. In this study, we have iteratively developed a solution enabling efficient access to ECMWF’s Numerical Weather Prediction (NWP) and EUMETSAT’s satellite data on the European Weather Cloud [1], in combination with UK Met Office assets in Amazon Web Services (AWS), in order to provide a common template for multi-cloud processing solutions in meteorological application development and operations in Europe.

Dask [2] was chosen as the computing framework due to its widespread use in the meteorological community, its ability to automatically spread processing, and its flexibility in changing how workloads are distributed across physical or virtualized infrastructures while maintaining scalability. However, the techniques used here are generally applicable to other frameworks. The primary requirement in using Dask is that all nodes be able to intercommunicate freely, which is a serious limitation when nodes are distributed over multiple clouds. Although it is possible to route between multiple cloud environments over the Internet, this introduces considerable administrative work (firewalls, security) as well as networking complexities (e.g., due to extensive use of potentially clashing private IP ranges and NAT in clouds, or the cost of public IPs). Virtual Private Networks (VPNs) can hide these issues, but many use a hub-and-spoke model, meaning that communications between workers pass through a central hub. By using a mesh network VPN (WireGuard) between clusters with IPv6 private addressing, all these difficulties can be avoided, in addition to providing a simplified network addressing scheme with extremely high scalability. Another challenge was to ensure the Dask worker nodes were aware of data locality, both in terms of placing work near data and in terms of minimizing transfers. Here, the UK Met Office’s work on labeling resource pools (in this case, data) and linking scheduling decisions to labels was the key.
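
A mesh of this kind can be sketched with a per-node WireGuard configuration in which every worker lists every other worker as a peer, each with a unique IPv6 unique-local address; all keys, addresses and hostnames below are placeholders, not the deployed configuration:

```ini
# /etc/wireguard/wg0.conf on one worker (all values are placeholders).
[Interface]
# This node's private key and its unique IPv6 ULA address in the mesh.
PrivateKey = <this-node-private-key>
Address = fd00:db8::1/128
ListenPort = 51820

# One [Peer] section per other node in the mesh; traffic flows directly
# worker-to-worker, with no central hub.
[Peer]
PublicKey = <peer-public-key>
Endpoint = peer.example.org:51820
AllowedIPs = fd00:db8::2/128
```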

In summary, by adapting Dask's concept of resourcing [3] into resource pools [4], building an automated start-up process, and effectively utilizing self-configuring IPv6 VPN mesh networks, we managed to provide a “cloud-native” transient model where all resources can be easily created and disposed of as needed. The resulting “throwaway” multi-cloud Dask framework is able to efficiently place processing on workers proximate to the data while minimizing necessary data traffic between clouds, thus achieving results more quickly and cheaply than naïve implementations, and with a simple, automated setup suitable for meteorological developers. The technical basis of this work was published on the Dask blog [5] but is covered more holistically here, particularly regarding the application side and the challenges of developing cloud-native applications that can effectively utilize modern multi-cloud environments, with future applicability to distributed (e.g., Kubernetes) and serverless computing models.

References: 

[1] https://www.europeanweather.cloud 
[2] https://www.dask.org 
[3] https://distributed.dask.org/en/stable/resources.html
[4] https://github.com/gjoseph92/dask-worker-pools  
[5] https://blog.dask.org/2022/07/19/dask-multi-cloud  

How to cite: Karatosun, A., Grant, M., Baousis, V., McGregor, D., Care, R., Nolan, J., and Tervo, R.: Data Proximate Computation; Multi-cloud approach on European Weather Cloud and Amazon Web Services , EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3639, https://doi.org/10.5194/egusphere-egu23-3639, 2023.

The National Oceanic and Atmospheric Administration (NOAA) established the Earth Prediction Innovation Center (EPIC) to be the catalyst for community research and modeling focused on informing and accelerating advances in our nation’s operational NWP forecast modeling systems. The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. The UFS numerical applications span local to global domains and predictive time scales from sub-hourly analyses to seasonal predictions. It is designed to support the Weather Enterprise and to be the source system for NOAA‘s operational numerical weather prediction applications. EPIC applies an open-innovation and open-development framework that embraces open-source code repositories integrated with automated Continuous Integration/Continuous Deployment (CI/CD) pipelines on cloud and on-prem HPCs. EPIC also supports UFS public releases, tutorials and training opportunities (e.g., student workshops, hackathons, and code sprints), and advanced user support via a virtual community portal (epic.noaa.gov). This framework allows community developers to track the status of their contributions, and facilitates rapid incorporation of innovation by implementing consistent, transparent, standardized and community-driven validation and verification tests. In this presentation, I will demonstrate capabilities of the EPIC framework, using the UFS Short-Range Weather (SRW) Application as an example, in the following aspects:

  • Public releases of a cloud-ready UFS SRW Application with a scalable container, following a modernized continuous-release paradigm
  • Test cases for challenging forecast environments, released with datasets
  • Training and tutorials for users and developers
  • Baselines for benchmarking skill and computation on cloud HPCs, and
  • An automated CI/CD pipeline to enable a seamless transition to operations
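
A CI pipeline of the kind described might look like the following hedged GitHub Actions fragment; the job names, runner and test script are illustrative assumptions, not EPIC's actual pipeline:

```yaml
# Illustrative CI fragment (not EPIC's actual pipeline): build the
# containerized application and run the regression test suite on every
# pull request, so community contributions are validated automatically.
name: srw-ci
on: [pull_request]
jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build container
        run: docker build -t srw-app .
      - name: Run regression tests
        run: docker run --rm srw-app ./run_regression_tests.sh
```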

How to cite: Huang, M.: An Open-innovation and Open-development Framework for the Unified Forecast System Powered by the Earth Prediction Innovation Center, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3738, https://doi.org/10.5194/egusphere-egu23-3738, 2023.

EGU23-4298 | Orals | ESSI2.8

BUILDSPACE: Enabling Innovative Space-driven Services for Energy Efficient Buildings and Climate Resilient Cities 

Stamatia Rizou, Vaggelis Marinakis, Gema Hernández Moral, Carmen Sánchez-Guevara, Luis Javier Sánchez-Aparicio, Ioannis Brilakis, Vasileios Baousis, Tijs Maes, Vassileios Tsetsos, Marco Boaria, Piotr Dymarski, Michail Bourmpos, Petra Pergar, and Inga Brieze

BUILDSPACE aims to couple terrestrial data from buildings (collected by IoT platforms, BIM solutions and others) with aerial imaging from drones equipped with thermal cameras and location-annotated data from satellite services (i.e., EGNSS and Copernicus) to deliver innovative services for building and urban stakeholders and to support informed decision-making towards energy-efficient buildings and climate-resilient cities. The platform will allow the integration of these heterogeneous data and will offer services at building scale, enabling the generation of high-fidelity multi-modal digital twins, and at city scale, providing decision support services for energy demand prediction, urban heat and urban flood analysis. The services will enable the identification of environmental hotspots that increase pressure on local city ecosystems and raise the probability of natural disasters (such as flooding), and will issue alerts and recommendations for action to local governments and regions (such as support for building-renovation policies in specific vulnerable areas). BUILDSPACE services will be validated and assessed in four European cities with different climate profiles. The digital twin services at building level will be tested during the construction of a new building in Poland, and the city services validating the link to building digital twins will be tested in three cities (Piraeus, Riga, Ljubljana) across the EU. BUILDSPACE will create a set of replication guidelines and blueprints for the adoption of the proposed applications in building resilient cities at large.

How to cite: Rizou, S., Marinakis, V., Hernández Moral, G., Sánchez-Guevara, C., Sánchez-Aparicio, L. J., Brilakis, I., Baousis, V., Maes, T., Tsetsos, V., Boaria, M., Dymarski, P., Bourmpos, M., Pergar, P., and Brieze, I.: BUILDSPACE: Enabling Innovative Space-driven Services for Energy Efficient Buildings and Climate Resilient Cities, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-4298, https://doi.org/10.5194/egusphere-egu23-4298, 2023.

EGU23-5807 | Orals | ESSI2.8

The EuroHPC Center of Excellence for Exascale in Solid Earth 

Arnau Folch, Josep DelaPuente, Antonio Costa, Benedikt Halldórson, Jose Gracia, Piero Lanucara, Michael Bader, Alice-Agnes Gabriel, Jorge Macías, Finn Lovholt, Vadim Montellier, Alexandre Fournier, Erwan Raffin, Thomas Zwinger, Clea Denamiel, Boris Kaus, and Laetitia le Pourhiet

The second phase (2023-2026) of the Center of Excellence for Exascale in Solid Earth (ChEESE-2P), funded by HORIZON-EUROHPC-JU-2021-COE-01 under the Grant Agreement No 101093038, will prepare 11 European flagship codes from different geoscience domains (computational seismology, magnetohydrodynamics, physical volcanology, tsunamis, geodynamics, and glacier hazards). Codes will be optimised in terms of performance on different types of accelerators, scalability, containerisation, and continuous deployment and portability across tier-0/tier-1 European systems as well as on novel hardware architectures emerging from the EuroHPC Pilots (EuPEX/OpenSequana and EuPilot/RISC-V) by co-designing with mini-apps. Flagship codes and workflows will be combined to form a new generation of 9 Pilot Demonstrators (PDs) and 15 related Simulation Cases (SCs) representing capability and capacity computational challenges selected based on their scientific importance, social relevance, or urgency. The SCs will produce relevant EOSC-enabled datasets and enable services on aspects of geohazards like urgent computing, early warning forecast, hazard assessment, or fostering an emergency access mode in EuroHPC systems for geohazardous events, including access policy recommendations. Finally, ChEESE-2P will liaise, align, and synergise with other domain-specific European projects on digital twins and longer-term mission-like initiatives like Destination Earth.

How to cite: Folch, A., DelaPuente, J., Costa, A., Halldórson, B., Gracia, J., Lanucara, P., Bader, M., Gabriel, A.-A., Macías, J., Lovholt, F., Montellier, V., Fournier, A., Raffin, E., Zwinger, T., Denamiel, C., Kaus, B., and le Pourhiet, L.: The EuroHPC Center of Excellence for Exascale in Solid Earth, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-5807, https://doi.org/10.5194/egusphere-egu23-5807, 2023.

EGU23-6768 | ECS | Orals | ESSI2.8

SarXarray: an Xarray extension for SLC SAR data processing 

Ou Ku, Francesco Nattino, Meiert Grootes, Pranav Chandramouli, and Freek van Leijen

Satellite-based Interferometric Synthetic Aperture Radar (InSAR) plays a significant role in numerous surface-motion monitoring applications, e.g., civil-infrastructure stability and hydrocarbon extraction. InSAR monitoring is based on a coregistered stack of Single Look Complex (SLC) SAR images. Due to the long temporal coverage, broad spatial coverage and high spatio-temporal resolution of an SLC SAR stack, handling it efficiently is a common challenge within the community. To meet this need, we present SarXarray: an open-source Xarray extension for SLC SAR stack processing. SarXarray provides a Python interface to read and write a coregistered stack of SLC SAR data, together with basic SAR processing functions. It utilizes Xarray’s support for labeled multi-dimensional datasets to express the space-time character of an SLC SAR stack, and leverages Dask to perform lazy evaluation of the operations. SarXarray can be integrated into existing Python workflows in a flexible way. We provide a case study creating a SAR mean reflectivity map to demonstrate its functionality.
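
The mean reflectivity map mentioned in the case study amounts to averaging the amplitude of the complex SLC values over time, per pixel; this pure-Python sketch is a stand-in for SarXarray's Xarray/Dask implementation, with made-up SLC values:

```python
# Minimal sketch of a mean reflectivity map (a stand-in for SarXarray's
# lazy Xarray/Dask version): average the amplitude of the complex SLC
# values over the time dimension, per pixel.
def mean_reflectivity(stack):
    """stack: list over time of 2-D lists of complex SLC values."""
    n_time = len(stack)
    rows, cols = len(stack[0]), len(stack[0][0])
    return [
        [sum(abs(stack[t][i][j]) for t in range(n_time)) / n_time
         for j in range(cols)]
        for i in range(rows)
    ]

# Two 2x2 SLC acquisitions (values are made up):
stack = [
    [[3 + 4j, 1 + 0j], [0 + 2j, 1 + 1j]],
    [[3 - 4j, 1 + 0j], [0 - 2j, 1 - 1j]],
]
mrm = mean_reflectivity(stack)
```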

How to cite: Ku, O., Nattino, F., Grootes, M., Chandramouli, P., and van Leijen, F.: SarXarray: an Xarray extension for SLC SAR data processing, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6768, https://doi.org/10.5194/egusphere-egu23-6768, 2023.

EGU23-6857 | ECS | Posters on site | ESSI2.8

Convergence of HPC, Big Data and Machine Learning for Earth System workflows 

Donatello Elia, Sonia Scardigno, Alessandro D'Anca, Gabriele Accarino, Jorge Ejarque, Francesco Immorlano, Daniele Peano, Enrico Scoccimarro, Rosa M. Badia, and Giovanni Aloisio

Typical end-to-end Earth System Modelling (ESM) workflows rely on several steps, including data pre-processing, numerical simulation and output post-processing, as well as data analytics and visualization. The approaches currently available for implementing scientific workflows in the climate context do not properly integrate the entire set of components into a single workflow in a transparent manner. The increasing usage of High Performance Data Analytics (HPDA) and Machine Learning (ML) in climate applications further exacerbates these issues. A more integrated approach would support next-generation ESM and improve workflows in terms of execution and energy consumption.

Moreover, a seamless integration of HPDA and ML components into the ESM workflow will pave the way for novel applications and support larger-scale pre- and post-processing. However, these components typically have different deployment requirements, spanning from HPC (for ESM simulation) to Cloud computing (for HPDA and ML). It is paramount to provide scientists with solutions capable of hiding the technical details of the underlying infrastructure and improving workflow portability.

In the context of the eFlows4HPC project, we are exploring the use of innovative workflow solutions integrating approaches from HPC, HPDA and ML to support end-to-end ESM simulations and post-processing, with a focus on extreme-event analysis (e.g., heat waves and tropical cyclones). In particular, the envisioned solution exploits PyCOMPSs for the management of parallel pipelines, task orchestration and synchronization, PyOphidia for climate data analytics, and ML frameworks (e.g., TensorFlow) for data-driven event detection models. This contribution presents the approaches being explored in the frame of the project to address the convergence of HPC, Big Data and ML into a single end-to-end ESM workflow.

How to cite: Elia, D., Scardigno, S., D'Anca, A., Accarino, G., Ejarque, J., Immorlano, F., Peano, D., Scoccimarro, E., Badia, R. M., and Aloisio, G.: Convergence of HPC, Big Data and Machine Learning for Earth System workflows, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6857, https://doi.org/10.5194/egusphere-egu23-6857, 2023.

EGU23-6960 | Orals | ESSI2.8

Remote Sensing Deployable Analysis environmenT 

Pranav Chandramouli, Francesco Nattino, Meiert Grootes, Ou Ku, Fakhereh Alidoost, and Yifat Dzigan

Remote-sensing (RS) and Earth observation (EO) data have become crucial in areas ranging from science to policy, with their use expanding beyond the ‘usual’ fields of geosciences to encompass ‘green’ life sciences, agriculture, and even social sciences. Within this context, the RS-DAT project has developed and made available a readily deployable framework enabling researchers to scale their analysis of EO and RS data on HPC systems and associated storage resources. Building on and expanding the established tool stack of the Pangeo community, the framework integrates tools to access, retrieve, explore, and process geospatial data, addressing common needs identified in the EO domain. On the computing side, RS-DAT leverages Jupyter (Python), which provides users with a web-based interface to access (remote) computational resources, and Dask, which enables scaling analyses and workflows to large computing systems. Both Jupyter and Dask are well-established tools in the Pangeo community and can be deployed in several ways and on different infrastructures. RS-DAT provides an easy-to-use deployment framework for two targets: the generic case of SLURM-based HPC systems (for example, the Dutch supercomputers Snellius/Spider), which offer flexibility in computational resources; and the special case of an Ansible-based cloud-computing infrastructure (SURF Research Cloud (SRC)), which is more straightforward for the user but less flexible. Both frameworks enable the easy scale-up of workflows on HPC systems to access, manipulate and process large-scale datasets as commonly found in EO. On the data access and storage side, RS-DAT integrates two Python packages, STAC2dCache and dCacheFS, which were developed to facilitate data retrieval from online STAC catalogs (STAC2dCache) and its storage on the HPC system or local mass storage, specifically dCache. This ensures efficient computation for large-scale analyses where data retrieval and handling can cause significant bottlenecks.
User-defined input/output to the Zarr file format is also supported within the framework. We present an application of the tools developed to the calculation of leaf-spring indices for North America using the Daymet dataset at 1 km resolution for 42 years (~940 GiB, completed in under 5 hours using 60 cores on the Dutch supercomputing system), and we look forward to ongoing work integrating both deployment targets in the case of the Dutch HPC ecosystem.

How to cite: Chandramouli, P., Nattino, F., Grootes, M., Ku, O., Alidoost, F., and Dzigan, Y.: Remote Sensing Deployable Analysis environmenT, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6960, https://doi.org/10.5194/egusphere-egu23-6960, 2023.

With the amount of high-resolution Earth observation data available, it is no longer feasible to do all analysis on local computers or even local cluster systems. To achieve high performance for out-of-memory datasets we develop the YAXArrays.jl package in the Julia programming language. YAXArrays.jl provides both an abstraction over chunked n-dimensional arrays with labelled axes and efficient multi-threaded and multi-process computation on these arrays.
In this contribution we present the lessons we learned from scaling an analysis of high-resolution Sentinel-1 time series data. By bringing a Sentinel-1 change detection use case, which had been performed on a small local area of interest, to a whole region, we test the ease and performance of distributed computing on the European Open Science Cloud (EOSC) in Julia.
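The chunked-computation pattern described above can be illustrated with a small Python sketch (a language-swapped stand-in, not the authors' Julia code): a simple log-ratio change indicator applied one spatial tile at a time, so that only one tile needs to be in memory at once:

```python
import numpy as np

def log_ratio(pre, post, eps=1e-6):
    """Per-pixel log-ratio change indicator between two backscatter images."""
    return np.log((post + eps) / (pre + eps))

def process_in_chunks(pre, post, chunk=128):
    """Apply the detector one spatial tile at a time, the way a chunked
    array framework schedules work over an out-of-memory dataset."""
    out = np.empty(pre.shape, dtype=float)
    for i in range(0, pre.shape[0], chunk):
        for j in range(0, pre.shape[1], chunk):
            tile = (slice(i, i + chunk), slice(j, j + chunk))
            out[tile] = log_ratio(pre[tile], post[tile])
    return out

rng = np.random.default_rng(0)
pre = rng.uniform(0.1, 1.0, size=(256, 300))
post = pre.copy()
post[:50, :50] *= 5.0  # simulate a changed region
change = process_in_chunks(pre, post)
```

In YAXArrays.jl (as in Dask) the per-tile loop is handled by the framework, which also distributes tiles across threads or worker processes.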

How to cite: Gans, F. and Cremer, F.: Scaling up a Sentinel 1 change detection pipeline using the Julia programming language, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7825, https://doi.org/10.5194/egusphere-egu23-7825, 2023.

EGU23-8096 | ECS | Posters on site | ESSI2.8

Spatio-Temporal Asset Catalog (STAC) for in-situ data 

Justus Magin and Tina Odaka

To make the use of a collection of datasets – for example, scenes from a SAR satellite – more efficient, it is important to be able to search for the datasets relevant to a specific application. In particular, one might want to search for a specific period in time or a spatial extent, or perform searches over multiple collections together.

For SAR data or data obtained from optical satellites, Spatio-Temporal Asset Catalogs (STAC) have become increasingly popular in the past few years. Defined as JSON and backed by databases with geospatial extensions, STAC servers (endpoints) have the advantage of being efficient, language-agnostic, and compliant with a standardized API.

Just like satellite scenes, in-situ data is growing in size very quickly and would thus benefit from being catalogued. However, the sequential nature of in-situ data and its sparse distribution in space make it difficult to fit into STAC's standard model.

In this session, we present an experimental STAC extension that defines the most common properties of in-situ data, as identified from Argo float and biologging data.
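As a sketch of what such an extension could look like (the `insitu:*` property names below are hypothetical placeholders, not the actual schema of the proposed extension), an Argo float profile might be described as a STAC Item carrying a temporal extent and platform metadata that searches can filter on:

```python
# Sketch of a STAC Item for an in-situ observation (an Argo float
# profile).  The "insitu:*" property names are illustrative
# placeholders, not the extension's actual schema.
item = {
    "type": "Feature",
    "stac_version": "1.0.0",
    "id": "argo-profile-0042",
    "geometry": {"type": "Point", "coordinates": [-30.5, 45.2]},
    "properties": {
        # temporal and platform metadata that a STAC search could filter on
        "start_datetime": "2023-01-15T06:00:00Z",
        "end_datetime": "2023-01-15T09:30:00Z",
        "insitu:platform": "argo-float-0042",
        "insitu:trajectory": True,
    },
    "assets": {
        "profile": {
            "href": "https://example.org/argo/0042.nc",
            "type": "application/x-netcdf",
        }
    },
}
```

Because the observation is a moving point series rather than a footprinted scene, the choice of geometry (a representative point versus a trajectory line) is exactly the kind of modelling question the extension has to settle.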

How to cite: Magin, J. and Odaka, T.: Spatio-Temporal Asset Catalog (STAC) for in-situ data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8096, https://doi.org/10.5194/egusphere-egu23-8096, 2023.

EGU23-8756 | Posters on site | ESSI2.8

Pangeo framework for training: experience with FOSS4G, the CLIVAR bootcamp and the eScience course 

Anne Fouilloux, Pier Lorenzo Marasco, Tina Odaka, Ruth Mottram, Paul Zieger, Michael Schulz, Alejandro Coca-Castro, Jean Iaquinta, and Guillaume Eynard Bontemps

The ever-increasing number of scientific datasets made available by authoritative data providers (NASA, Copernicus, etc.) and by the scientific community opens new possibilities for advancing the state of the art in many areas of the natural sciences. As a result, researchers, innovators, companies and citizens need to acquire computational and data analysis skills to optimally exploit these datasets. Several educational programs offer basic courses to students, and initiatives such as “The Carpentries” (https://carpentries.org/) complement this offering but also reach out to established researchers to fill the skill gap, thereby empowering them to perform their own data analysis. However, most researchers find it challenging to go beyond these training sessions and face difficulties when trying to apply their newly acquired knowledge to their own research projects. In this regard, hackathons have proven to be an efficient way to support researchers in becoming competent practitioners, but organising good hackathons is difficult and time consuming. In addition, the need for large amounts of computational and storage resources during the training and hackathons requires a flexible solution. Here, we propose an approach where researchers work on realistic, large and complex data analysis problems similar to, or directly part of, their research work. Researchers access an infrastructure deployed on the European Open Science Cloud (EOSC) that supports intensive data analysis (large compute and storage resources). EOSC is a European Commission initiative for providing a federated and open multi-disciplinary environment where data, tools and services can be shared, published, found and re-used. We used Jupyter Book for delivering a collection of FAIR training materials for data analysis relying on Pangeo EOSC deployments as its primary computing platform.
The training material (https://pangeo-data.github.io/foss4g-2022/intro.html, https://pangeo-data.github.io/clivar-2022/intro.html, https://pangeo-data.github.io/escience-2022/intro.html) is customised (different datasets with similar analysis) for different target communities, and participants are taught the usage of Xarray, Dask and, more generally, how to efficiently access and analyse large online datasets. The training can be complemented by group work where attendees work on larger-scale scientific datasets: the classroom is split into several groups, each working on different scientific questions and possibly using different datasets. The Pangeo (http://pangeo.io) ecosystem is not always new to all attendees, but applying Xarray (http://xarray.pydata.org) and Dask (https://www.dask.org/) to actual scientific “mini-projects” is often a stumbling block for many researchers. With this approach, attendees have the opportunity to ask questions, collaborate with other researchers as well as Research Software Engineers, and apply Open Science practices without the burden of trying and failing alone. We find the direct involvement of scientific computing research engineers in the training crucial for the success of the hackathon approach. Feedback from attendees shows that it provides a solid foundation for big data geoscience and helps attendees quickly become competent practitioners. It also gives infrastructure providers and EOSC useful feedback on the current and future needs of researchers for making their research FAIR and open. In this presentation, we will provide examples of achievements from attendees and present the feedback EOSC providers have received.

How to cite: Fouilloux, A., Marasco, P. L., Odaka, T., Mottram, R., Zieger, P., Schulz, M., Coca-Castro, A., Iaquinta, J., and Eynard Bontemps, G.: Pangeo framework for training: experience with FOSS4G, the CLIVAR bootcamp and the eScience course, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8756, https://doi.org/10.5194/egusphere-egu23-8756, 2023.

EGU23-9095 | Posters on site | ESSI2.8

Pangeo@EOSC: deployment of PANGEO ecosystem on the European Open Science Cloud 

Guillaume Eynard-Bontemps, Jean Iaquinta, Sebastian Luna-Valero, Miguel Caballer, Frederic Paul, Anne Fouilloux, Benjamin Ragan-Kelley, Pier Lorenzo Marasco, and Tina Odaka

Research projects heavily rely on the exchange and processing of data, and in this context Pangeo (https://pangeo.io/), a worldwide community of scientists and developers, strives to facilitate the deployment of ready-to-use and community-driven platforms for big data geoscience. The European Open Science Cloud (EOSC) is the main initiative in Europe for providing a federated and open multi-disciplinary environment where European researchers, innovators, companies and citizens can share, publish, find and re-use data, tools and services for research, innovation and educational purposes. While a number of services based on Jupyter Notebooks were already available, no public Pangeo deployments providing fast access to large amounts of data and compute resources were accessible on EOSC. Most existing cloud-based Pangeo deployments are USA-based, and members of the Pangeo community in Europe did not have a shared platform where scientists or technologists could exchange know-how. Pangeo therefore teamed up with two EOSC projects, namely EGI-ACE (https://www.egi.eu/project/egi-ace/) and C-SCALE (https://c-scale.eu/), to demonstrate how to deploy and use Pangeo on EOSC and emphasise the benefits for the European community.

The Pangeo Europe Community together with EGI deployed a DaskHub, composed of Dask Gateway (https://gateway.dask.org/) and JupyterHub (https://jupyter.org/hub), with a Kubernetes cluster backend on EOSC, using the infrastructure of the EGI Federation (https://www.egi.eu/egi-federation/). The Pangeo EOSC JupyterHub deployment makes use of 1) the EGI Check-In to enable user registration and thereby authenticated and authorised access to the Pangeo JupyterHub portal and to the underlying distributed compute infrastructure; and 2) the EGI Cloud Compute and the cloud-based EGI Online Storage to distribute the computational tasks to a scalable compute platform and to store intermediate results produced by the user jobs.

To facilitate future Pangeo deployments on top of a wide range of cloud providers (AWS, Google Cloud, Microsoft Azure, EGI Cloud Computing, OpenNebula, OpenStack, and more), the Pangeo EOSC JupyterHub deployment is now possible through the Infrastructure Manager (IM) Dashboard (https://im.egi.eu/im-dashboard/login). All the computing and storage resources are currently supplied by CESNET (https://www.cesnet.cz/?lang=en) in the frame of EGI-ACE project (https://im.egi.eu/). Several deployments have been made to serve the geoscience community, both for teaching and for research work. To date, more than 100 researchers have been trained on Pangeo@EOSC deployments and more are expected to join, in particular with easy access to large amounts of Copernicus data through a recent collaboration established with the C-SCALE project. In this presentation, we will provide details on the different deployments, how to get access to JupyterHub deployments and more generally how to contribute to Pangeo@EOSC.



How to cite: Eynard-Bontemps, G., Iaquinta, J., Luna-Valero, S., Caballer, M., Paul, F., Fouilloux, A., Ragan-Kelley, B., Marasco, P. L., and Odaka, T.: Pangeo@EOSC: deployment of PANGEO ecosystem on the European Open Science Cloud, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-9095, https://doi.org/10.5194/egusphere-egu23-9095, 2023.

EGU23-10697 | Orals | ESSI2.8 | Highlight

The Joint Effort for Data Assimilation Integration (JEDI): A unified data assimilation framework for Earth system prediction supported by NOAA, NASA, U.S. Navy, U.S. Air Force, and UK Met Office 

Dom Heinzeller, Maryam Abdi-Oskouei, Stephen Herbener, Eric Lingerfelt, Yannick Trémolet, and Tom Auligné

The Joint Effort for Data assimilation Integration (JEDI) is an innovative data assimilation system for Earth system prediction, spearheaded by the Joint Center for Satellite Data Assimilation (JCSDA) and slated for implementation in major operational modeling systems across the globe in the coming years. Funded as an inter-agency development by NOAA, NASA, the U.S. Navy and Air Force, and with contributions from the UK Met Office, JEDI must operate on a wide range of computing platforms. The recent move towards cloud computing systems puts portability, adaptability and performance across systems, from dedicated High Performance Computing systems to commercial clouds and workstations, in the critical path for the success of JEDI.

JEDI is a highly complex application that relies on a large number of third-party software packages to build and run. These packages can include I/O libraries, workflow engines, Python modules for data manipulation and plotting, several ECMWF libraries for complex arithmetics and grid manipulations, and forecast models such as the Unified Forecast System (UFS), the Goddard Earth Observing System (GEOS), the Modular Ocean Model (MOM6), the Model for Prediction across Scales (MPAS), the Navy Environmental Prediction sysTem Utilizing the NUMA corE (NEPTUNE), and the Met Office Unified Model (UM).

With more than 100 contributors and rapid code development, it is critical to perform thorough automated testing, from basic unit tests to comprehensive end-to-end tests. This presentation summarizes recent efforts to leverage cloud computing environments for research, development, and near-real-time applications of JEDI, as well as for developing a Continuous Integration/Continuous Delivery (CI/CD) pipeline. These efforts rest on a newly developed software stack called spack-stack, a joint effort of JCSDA, the NOAA Environmental Modeling Center (EMC) and the U.S. Earth Prediction Innovation Center (EPIC). Automatic testing in JEDI is implemented with modern software development tools such as GitHub, Docker containers, various Amazon Web Services (AWS), and CodeCov for testing and evaluation of code performance. End-to-end testing is realized in JCSDA’s newly developed Skylab Earth system data assimilation application, which combines JEDI with the Research Repository for Data and Diagnostics (R2D2) and the Experiments and Workflow Orchestration Kit (EWOK), and which leverages the AWS Elastic Compute Cloud (EC2) for testing, research, development and production.

How to cite: Heinzeller, D., Abdi-Oskouei, M., Herbener, S., Lingerfelt, E., Trémolet, Y., and Auligné, T.: The Joint Effort for Data Assimilation Integration (JEDI): A unified data assimilation framework for Earth system prediction supported by NOAA, NASA, U.S. Navy, U.S. Air Force, and UK Met Office, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10697, https://doi.org/10.5194/egusphere-egu23-10697, 2023.

EGU23-11117 | Orals | ESSI2.8

Modeling the Earth System on Modular Supercomputing Architectures: coupled atmosphere-ocean simulations with ICON 

Olaf Stein, Abhiraj Bishnoi, Luis Kornblueh, Lars Hoffmann, Norbert Eicker, Estela Suarez, and Catrin I. Meyer

Significant progress has been made in recent years in developing km-scale versions of global Earth System Models (ESMs), combining the opportunity to replace uncertain model parameterizations with direct treatment and an improved representation of orographic and land surface features (Schär et al., 2020; Hohenegger et al., 2022). However, adapting climate codes to new hardware while maintaining performance portability remains a major issue. Given the long development cycles, the varying maturity of ESM modules and their large code bases, it is not expected that all code parts can be brought to the same level of exascale readiness in the near future. Instead, short-term model adaptation strategies need to focus on software capabilities as well as hardware availability. Moreover, energy-use efficiency is of growing importance to both supercomputer providers and scientific projects employing climate simulations.

Here, we present results from first simulations of the coupled atmosphere-ocean modelling system ICON-v2.6.6-rc on the supercomputing system JUWELS at the Jülich Supercomputing Centre (JSC), with a global resolution of 5 km, using significant parts of the HPC system. While the atmosphere part of ICON (ICON-A) is capable of running on GPUs, model I/O currently performs better on a CPU cluster, and the ocean module (ICON-O) has not yet been ported to modern accelerators. Thus, we make use of the modular supercomputing architecture (MSA) of JUWELS and its novel batch job options for the coupled ICON model, with ICON-A running on the NVIDIA A100 GPUs of the JUWELS Booster, while ICON-O and the model I/O run simultaneously on the CPUs of the JUWELS Cluster partition. As expected, ICON performance is limited by ICON-A. We therefore chose the performance-optimal Booster-node configuration for ICON-A, also considering memory requirements (84 nodes), and adapted the ICON-O configuration to achieve minimum waiting times for simultaneous time-step execution and data exchange (63 Cluster nodes). We compared runtime and energy efficiency to cluster-only simulations (on up to 760 Cluster nodes) and found only small improvements in runtime for the MSA case, but energy consumption was already reduced by 26% without further improvements in vector length applied in ICON. When switching to even higher ICON resolutions, cluster-only simulations will not fit on most current HPC systems, and upcoming exascale systems will rely to a large extent on GPU acceleration. Exploiting MSA capabilities is thus an important step towards performance-portable and energy-efficient use of km-scale climate models.

References:

Hohenegger et al., ICON-Sapphire: simulating the components of the Earth System and their interactions at kilometer and subkilometer scales, https://doi.org/10.5194/gmd-2022-171, in review, 2022.

Schär et al., Kilometer-Scale Climate Models: Prospects and Challenges, https://doi.org/10.1175/BAMS-D-18-0167.1, 2020.

 

How to cite: Stein, O., Bishnoi, A., Kornblueh, L., Hoffmann, L., Eicker, N., Suarez, E., and Meyer, C. I.: Modeling the Earth System on Modular Supercomputing Architectures: coupled atmosphere-ocean simulations with ICON, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11117, https://doi.org/10.5194/egusphere-egu23-11117, 2023.

EGU23-12539 | Orals | ESSI2.8

European Weather Cloud: A community cloud tailored for big Earth modelling and EO data processing 

Roberto Cuccu, Vasileios Baousis, Umberto Modigliani, Charalampos Kominos, Xavier Abellan, and Roope Tervo

The European Centre for Medium-Range Weather Forecasts (ECMWF) and the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) have worked together to offer their Member States a new paradigm for accessing and consuming weather data and services. The “European Weather Cloud (EWC)” (https://www.europeanweather.cloud/) concluded its pilot phase and is expected to become operational during the first months of 2023.

This initiative aims to offer a community cloud infrastructure on which Member and Co‐operating States of both organizations can create on-demand virtual compute (including GPUs) and storage resources to gain easy and high-throughput access to ECMWF’s Numerical Weather Prediction (NWP) and EUMETSAT’s satellite data in a timely and configurable fashion. Moreover, one of the main goals is to involve more National Meteorological Services to jointly form a federation of clouds/data offered by their Member States, for the maximum benefit of the European Meteorological Infrastructure (EMI). During the pilot phase of the project, both organizations have jointly hosted user and technical workshops to actively engage with the meteorological community and align the evolution of the EWC to reflect and satisfy their operational goals and needs.

During its pilot phase, the EWC hosted several use cases, mostly aimed at users in the developers’ own organisations. The broad categories of these cases are:

  • Web services to explore hosted datasets
  • Data processing applications
  • Platforms to support the training of machine learning models on archive datasets
  • Workshops and training courses (e.g., ICON model training, ECMWF training, etc.)
  • Research in collaboration with external partners
  • World Meteorological Organization (WMO) support with pilots and proofs of concept (PoCs).

Some examples of the use cases currently developed at the EWC are:

  • The German weather service DWD, which is already feeding maps generated by a server it deployed on the cloud into its public GeoPortal service.
  • A joint EUMETSAT and ECMWF use case assessing bias correction schemes for the assimilation of radiance data, based on several satellite data time series.
  • The Royal Netherlands Meteorological Institute (KNMI), which hosts a climate explorer web application based on KNMI climate explorer data and ECMWF weather and climate reanalyses.
  • The Royal Meteorological Institute of Belgium, which prepares ECMWF forecast data for use in a local atmospheric dispersion model.
  • NordSat, a collaboration of northern European countries which is developing and testing imagery generation tools in preparation for the Meteosat Third Generation (MTG) satellite products.
  • The UK Met Office with the DataProximateCompute use case, which distributes compute workloads close to data, with the automatic creation and disposal of Dask clusters, as well as the data-plane VPN network, on demand and in heterogeneous cloud environments.

In this presentation, the status of the project, the services offered, and how these are accessed by end users will be analysed, along with examples of the existing use cases. The plans and next steps for the evolution of the EWC, and its relationship with other projects and initiatives (like DestinE), will conclude the presentation.

How to cite: Cuccu, R., Baousis, V., Modigliani, U., Kominos, C., Abellan, X., and Tervo, R.: European Weather Cloud: A community cloud tailored for big Earth modelling and EO data processing, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12539, https://doi.org/10.5194/egusphere-egu23-12539, 2023.

EGU23-12785 | Orals | ESSI2.8

A Scalable Near Line Storage Solution for Very Big Data 

Neil Massey, Jack Leland, and Bryan Lawrence

Managing huge volumes of data is a problem now, and will only become worse with the advent of exascale computing and next-generation observational systems. An important recognition is that data needs to be more easily migrated between storage tiers. Here we present a new solution, the Near-Line Data Store (NLDS), for managing data migration between user-facing storage systems and tape by using an object storage cache. NLDS builds on lessons learned from previous experience developing the ESIWACE-funded Joint Data Migration App (JDMA) and deploying it at the Centre for Environmental Data Analysis (CEDA).
 
CEDA currently has over 50 PB of data stored on a range of disk-based storage systems. These systems are chosen based on cost, power usage and network accessibility, and include three different types of POSIX disk and object storage. Tens of PB of additional data are also stored on tape. Each of these systems has different workflows, interfaces and latencies, causing difficulties for users.

NLDS, developed with ESIWACE2 and other funding, is a multi-tiered storage solution using object storage as a front end to a tape library. Users interact with NLDS via an HTTP API, with a Python library and command-line client provided to support both programmatic and interactive use. Files transferred to NLDS are first written to the object storage, and a backup is made to tape. When the object storage approaches capacity, a set of policies is interrogated to determine which files will be removed from it. Upon retrieving a file, NLDS may first have to transfer the file from tape back to the object storage, if it has been deleted by the policies. This implements a multi-tier system of hot (disk), warm (object storage) and cold (tape) storage via a single interface. While systems like this are not novel, NLDS is open source, designed for ease of redeployment elsewhere, and for use from both local storage and remote sites.
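As an illustrative sketch of how such an eviction policy could work (hypothetical rules and names, not NLDS's actual implementation), a least-recently-used policy might free object-storage space by removing only files that already have a tape backup:

```python
from dataclasses import dataclass

@dataclass
class CachedFile:
    name: str
    size: int            # bytes held on object storage
    last_access: float   # Unix timestamp of last access
    on_tape: bool        # a tape backup exists, so eviction is safe

def files_to_evict(files, bytes_needed):
    """Pick least-recently-used, tape-backed files until enough
    object-storage space would be freed."""
    evictable = sorted((f for f in files if f.on_tape),
                       key=lambda f: f.last_access)
    chosen, freed = [], 0
    for f in evictable:
        if freed >= bytes_needed:
            break
        chosen.append(f)
        freed += f.size
    return chosen

files = [
    CachedFile("a.nc", 100, last_access=1.0, on_tape=True),
    CachedFile("b.nc", 200, last_access=3.0, on_tape=True),
    CachedFile("c.nc", 300, last_access=2.0, on_tape=False),  # not yet backed up
]
print([f.name for f in files_to_evict(files, 150)])  # oldest tape-backed files first
```

A retrieval of an evicted file would then trigger the reverse path: stage the file from tape back onto object storage before serving it.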

NLDS is based on a microservice architecture, with a message exchange brokering communication between the microservices, the HTTP API and the storage solutions. The system is deployed via Kubernetes, with each microservice in its own Docker container, allowing the number of services to be scaled up or down depending on the current load of NLDS. This provides a scalable, power-efficient system while ensuring that no messages between microservices are lost. OAuth is used to authenticate and authorise users via a pluggable authentication layer. The use of object storage as the front end to the tape allows both local and remote cloud-based services to access the data via a URL, as long as the user has the required credentials.

NLDS is a scalable solution for storing very large data for many users, with a user-friendly front end that is easily accessed via cloud computing. This talk will detail the architecture and discuss how the design meets the identified use cases.

How to cite: Massey, N., Leland, J., and Lawrence, B.: A Scalable Near Line Storage Solution for Very Big Data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12785, https://doi.org/10.5194/egusphere-egu23-12785, 2023.

EGU23-12851 | Orals | ESSI2.8

From the Copernicus satellite data to an environmentally aware field decision 

Fabien Castel and Emma Rizzi

Tackling complex environmental issues requires accessing and processing a wide range of voluminous data. Copernicus spatial data is a very complete and valuable source for many Earth science domains, in particular thanks to its Core Services (Land, Atmosphere, Marine, etc.). For almost five years now, Copernicus DIAS platforms have provided broad access to the Core Services products through the cloud. Among these platforms, the WEkEO platform operated by EUMETSAT, Mercator Ocean, ECMWF and EEA provides wide access to Copernicus Core Service data.

However, Copernicus data needs an additional layer of processing and preparation to be presented to and understood by the general public and decision makers. Murmuration has developed data processing pipelines to produce environmental indicators from Copernicus data, constituting powerful tools to put environmental issues at the centre of decision-making processes.

Throughout its use, limitations of the DIAS platforms were observed. Firstly, the cloud service offerings are basic in comparison to the market leaders (such as AWS and GCP); in particular, there is no built-in solution for automating and managing data processing pipelines, which must be set up at the user's expense. Secondly, the cost of resources is higher than the market price; limiting the activities on DIAS to edge data processing and relying on a cheaper offering for applications that do not require direct access to raw Copernicus data is a cost-effective choice. Finally, the performance and reliability requirements for accessing the data can sometimes not be met when relying on a single DIAS platform; implementing a multi-DIAS approach ensures backup data sources. This raises the question of the automation and orchestration of such a multi-cloud system.

We propose an approach combining the wide data offer of the DIAS platforms, the automation features provided by the Prefect platform and the usage of efficient cloud technologies to build a repository of environmental indicators. Prefect is a hybrid orchestration platform dedicated to the automation of data processing flows. It does not host any data processing flows itself; rather, it connects in a cloud-agnostic way to any cloud environment, where periodic and triggered flow executions can be scheduled. Prefect centrally controls flows that run on different cloud environments through a single platform.

The technologies used to build the system enable efficient production and dissemination of the environmental indicators: firstly, containerisation and clustering (using Docker and Kubernetes) to manage processing resources; secondly, object storage combined with cloud-native access (Zarr data format); and finally, the Python scientific software stack (including pandas, scikit-learn, etc.) complemented by the powerful Xarray library. Data processing pipelines ensure a path from the NetCDF Copernicus Core Services products to cloud-native Zarr products. The Zarr format allows windowed read/write operations, avoiding unnecessary data transfers. This efficient data access makes it possible to plug fast data dissemination services, following well-established OGC standards, into the data repository and to feed interactive dashboards for decision makers. The cycle is complete, from the Copernicus satellite data to an environmentally aware field decision.

How to cite: Castel, F. and Rizzi, E.: From the Copernicus satellite data to an environmentally aware field decision, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12851, https://doi.org/10.5194/egusphere-egu23-12851, 2023.

EGU23-13768 | ECS | Posters on site | ESSI2.8

FAIR Notebooks: opportunities and challenges for the geoscience community 

Alejandro Coca-Castro, Anne Fouilloux, J. Scott Hosking, and Environmental Data Science Book community

Making assets in scientific research Findable, Accessible, Interoperable and Reusable (FAIR) is still overwhelming for many scientists. When considered as an afterthought, FAIR research is indeed challenging, and we argue that its implementation is far easier when considered at an early stage, with a focus on improving researchers' day-to-day work practices. One key aspect is to bundle all the research artefacts in a FAIR Research Object (RO) using RoHub (https://reliance.rohub.org/), a Research Object management platform that enables researchers to collaboratively manage, share and preserve their research work (data, software, workflows, models, presentations, videos, articles, etc.). RoHub implements the full RO model and paradigm: resources associated with a particular research work are aggregated into a single FAIR digital object, and metadata relevant for understanding and interpreting the content is represented as semantic metadata that is both human- and machine-readable. This approach provides the technical basis for implementing FAIR executable notebooks: the data and the computational environment can be “linked” to one or several FAIR notebooks that can then be executed via the EGI Binder Service with scalable compute and storage capabilities. However, the need to define clear practices for writing and publishing FAIR notebooks that can be reused to build new research has quickly arisen. This is where a community of practice is required. The Environmental Data Science Book (or EDS Book) is a pan-European community-driven resource hosted on GitHub and powered by Jupyter Book. EDS Book provides practical guidelines and templates that help translate research outputs into curated, interactive, shareable and reproducible executable notebooks. The quality of the FAIR notebooks is ensured by a collaborative and transparent reviewing process supported by GitHub-related technologies.
This approach provides immediate benefits for those who adopt it and can feed fruitful discussions to better define a reward system that would benefit science and scientific communities. All the resources needed for understanding and executing a notebook are gathered into an executable Research Object in RoHub. To date, the community has successfully published ten FAIR notebooks covering a wide range of topics in environmental data science. The notebooks use open-source Python libraries (e.g., intake, iris, xarray, hvplot) for fetching, processing and interactively visualising environmental research data. While these notebooks are currently Python-based, EDS Book supports other programming languages such as R and Julia, and we aim to engage with similar computational-notebook communities to improve research practices in environmental science.

How to cite: Coca-Castro, A., Fouilloux, A., Hosking, J. S., and community, E. D. S. B.: FAIR Notebooks: opportunities and challenges for the geoscience community, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13768, https://doi.org/10.5194/egusphere-egu23-13768, 2023.

EGU23-14507 | Orals | ESSI2.8

geokube: A Python Package for Data Analysis and Visualization in Geoscience 

Marco Mancini, Mirko Stojiljkovic, and Jakub Walczak

geokube is a Python package for data analysis and visualisation in geoscience that provides high-level abstractions in terms of both a Data Model, inspired by the Climate and Forecast (CF) and Unidata Common Data Models, and an Application Programming Interface (API), inspired by xarray. Key features of geokube are the capabilities to: (i) perform georeferenced axis-based indexing on data structures and specialised geospatial operations according to different types of geoscientific datasets such as structured grids, point observations, profiles, etc. (e.g. extracting a bounding box or a multipolygon of variable values defined on a rotated-pole grid), (ii) perform operations on variables that are either instantaneous or defined over intervals, and (iii) convert to/from xarray data structures and read/write CF-compliant netCDF datasets.
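Since the API is inspired by xarray, the flavour of axis-based bounding-box extraction can be illustrated with plain xarray (a synthetic regular grid; geokube itself additionally handles rotated-pole grids, multipolygons and other geoscientific dataset types):

```python
import numpy as np
import xarray as xr

# Synthetic 1-degree global temperature field
ds = xr.Dataset(
    {"t2m": (("lat", "lon"), np.random.rand(181, 360))},
    coords={"lat": np.arange(-90, 91), "lon": np.arange(0, 360)},
)

# Axis-based indexing: select by coordinate values, not array positions
box = ds.sel(lat=slice(30, 60), lon=slice(10, 40))
print(box.t2m.shape)  # (31, 31)
```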

How to cite: Mancini, M., Stojiljkovic, M., and Walczak, J.: geokube: A Python Package for Data Analysis and Visualization in Geoscience, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14507, https://doi.org/10.5194/egusphere-egu23-14507, 2023.

EGU23-14515 | ECS | Orals | ESSI2.8

Intaking DKRZ ESM data collections 

Fabian Wachsmann

In this showcase, we present how Intake and its plugin Intake-ESM are utilized at DKRZ to provide highly FAIR data collections from different projects, stored on different types of storage in different formats.

The Intake plugin Intake-ESM allows users not only to find the data of interest, but also to load them as analysis-ready Xarray datasets. We utilize this tool to provide users with access to many of the data collections available at our institution from a single access point, the main DKRZ Intake catalog at www.dkrz.de/s/intake. The functionality of this package works independently of data standards and formats and therefore enables fully metadata-driven data access, including data processing. Intake-ESM catalogs increase the FAIRness of the data collections in all aspects, but especially in terms of Accessibility and Interoperability.
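Under the hood, an Intake-ESM catalog is a table of dataset attributes plus file paths, and a search is a filter over that table. A minimal stand-in using pandas (hypothetical columns and paths; the real tool then opens the matching files as Xarray datasets):

```python
import pandas as pd

# Hypothetical catalog table; real Intake-ESM catalogs are CSV files
# described by a JSON collection specification
catalog = pd.DataFrame([
    {"variable": "tas", "experiment": "historical", "path": "/pool/a.nc"},
    {"variable": "tas", "experiment": "ssp585",     "path": "/pool/b.nc"},
    {"variable": "pr",  "experiment": "historical", "path": "/pool/c.nc"},
])

# The equivalent of cat.search(variable="tas", experiment="historical")
hits = catalog.query("variable == 'tas' and experiment == 'historical'")
print(hits["path"].tolist())  # ['/pool/a.nc']
```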

Starting with a catalog of DKRZ’s CMIP6 Data Pool, DKRZ now hosts catalogs for more than 10 PB of data on different local storage systems. The Intake-ESM package has been well integrated into ESM data provisioning workflows.

  • Early sharing and access: the co-developed in-house ICON model generates an Intake-ESM catalog on each run.
  • Uptake by other technologies: e.g., Intake-ESM catalogs serve as templates for the more advanced DKRZ STAC catalogs.
  • Making all storage types accessible: the tools used for writing data to the local institutional cloud allow users to create Intake-ESM catalogs for the written data.
  • Data archiving: catalogs for projects in the archive can be created from its metadata database.

For future activities, we plan to make use of new functionalities like the support for kerchunked data and the derived variable registry.

The DKRZ data management team develops and maintains local services around intake-esm for a positive user experience. In this showcase, we will present excerpts of seminars, workflows and integrations.

How to cite: Wachsmann, F.: Intaking DKRZ ESM data collections, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14515, https://doi.org/10.5194/egusphere-egu23-14515, 2023.

EGU23-14547 | Orals | ESSI2.8

PANGEO multidisciplinary test case for Earth and Environment Big data analysis in FAIR-EASE Infra-EOSC project 

Marine Vernet, Erwan Bodere, Jérôme Detoc, Christelle Pierkot, Alessandro Rizzo, and Thierry Carval

Earth observation and modelling are a major challenge for research and a necessity for environmental and socio-economic applications. They require voluminous and heterogeneous data from distributed and domain-dependent data sources, managed separately by various national and European infrastructures.

In a context of unprecedented data wealth and growth, new challenges emerge to enable inter-comparison, inter-calibration and comprehensive studies and uses of earth system and environmental data.

To this end, the FAIR-EASE project aims to provide integrated and interoperable services through the European Open Science Cloud to facilitate the discovery, access and analysis of large volumes of heterogeneous data from distributed sources and from different domains and disciplines of Earth system science.

This presentation will explain how the PANGEO stack will be used within FAIR EASE to improve data access, interpolation and analysis, but will also explore its integration with existing services (e.g. Galaxy) and underlying IT infrastructure to serve multidisciplinary research uses.

How to cite: Vernet, M., Bodere, E., Detoc, J., Pierkot, C., Rizzo, A., and Carval, T.: PANGEO multidisciplinary test case for Earth and Environment Big data analysis in FAIR-EASE Infra-EOSC project, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14547, https://doi.org/10.5194/egusphere-egu23-14547, 2023.

Observational meteorological data is central to understanding atmospheric processes, and is thus a key requirement for the calibration and validation of atmospheric and numerical weather prediction models. While recent decades have seen the development of well-known platforms that make satellite data easily accessible, observational meteorological data mostly remains scattered across the sites of regional and national meteorological services, each potentially offering different variables, temporal coverage and data formats.

To overcome these shortcomings, we propose meteostations-geopy, a Pythonic library to access data from meteorological stations. The central objective is to provide a common interface for retrieving observational meteorological data, thereby reducing the time required to process and wrangle the data. The library interacts with APIs from different weather services, handling authentication where needed and transforming the requested information into geopandas data frames of geolocated, timestamped observations that are homogeneously structured regardless of the provider.
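The harmonization step described above can be sketched as follows (the provider payloads, field names and common schema are hypothetical illustrations, not the library's actual API):

```python
import pandas as pd

# Two providers returning differently shaped records for the same quantity
provider_a = [{"station": "S1", "ts": "2023-01-01T00:00", "temp_c": 4.2}]
provider_b = [{"id": "S2", "time": "2023-01-01T00:00", "temperature": 3.1}]

def harmonize(records, mapping):
    """Rename provider-specific fields into a common schema."""
    df = pd.DataFrame(records).rename(columns=mapping)
    df["time"] = pd.to_datetime(df["time"])
    return df[["station_id", "time", "air_temperature"]]

# Observations from both providers end up homogeneously structured
common = pd.concat([
    harmonize(provider_a, {"station": "station_id", "ts": "time",
                           "temp_c": "air_temperature"}),
    harmonize(provider_b, {"id": "station_id",
                           "temperature": "air_temperature"}),
], ignore_index=True)
print(len(common))  # 2
```

In the real library the resulting frames are geopandas GeoDataFrames, so each row also carries the station geometry.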

The project is currently in an early development stage with support for two providers only. Current and future work is organized in three interrelated main axes, namely integration of further providers, implementation of native support of distributed data structures and organization of the library into the intake technical structure with drivers, catalogs, metadata sharing and plugin packages that are provider specific.

How to cite: Bosch, M.: meteostations-geopy: a Pythonic interface to access data from meteorological stations, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14774, https://doi.org/10.5194/egusphere-egu23-14774, 2023.

EGU23-15964 | ECS | Orals | ESSI2.8

A novel data ecosystem for coastal analyses 

Floris Calkoen, Fedor Baart, Etiënne Kras, and Arjen Luijendijk

The coastal community widely anticipates that in the coming years data-driven studies will make essential contributions to bringing about long-term coastal adaptation and mitigation strategies at continental scale. This view is also supported by CoCliCo, a Horizon 2020 project in which coastal data form the fundamental building block for an open web portal that aims to improve decision making on coastal risk management and adaptation. The promise of data has likely been triggered by several coastal analyses that showed how the coastal zone can be monitored at unprecedented spatial scales using geospatial cloud platforms. However, we note that when analyses become more complex, i.e., require specific algorithms, pre- and post-processing, or include data that are not hosted by the cloud provider, the cloud-native processing workflows are often broken, which makes analyses at continental scale impractical.

We believe that the next generation of data-driven coastal models that target continental scales can only be built when: 1) processing workflows are scalable; 2) computations are run in proximity to the data; 3) data are available in cloud-optimized formats; 4) and, data are described following standardized metadata specifications. In this study, we introduce these practices to the coastal research community by showcasing the advantages of cloud-native workflows by two case studies.

In the first example we map building footprints in areas prone to coastal flooding and estimate the assets at risk. For this analysis we chunk a coastal flood-risk map into several tiles and incorporate those into a coastal SpatioTemporal Asset Catalog (STAC). The second example benchmarks instantaneous shoreline mapping using cloud-native workflows against conventional methods. With data-proximate computing, processing time is reduced from the order of hours to seconds per shoreline km, which means that a highly-specialized coastal mapping expedition can be upscaled from regional to global level.
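A SpatioTemporal Asset Catalog item is plain JSON, so one tile of the chunked flood-risk map can be described with the standard library alone (the IDs, bounding box and asset path below are illustrative, not the study's actual catalog):

```python
import json

# Minimal STAC Item (v1.0.0) for one tile of a chunked flood-risk map
item = {
    "type": "Feature",
    "stac_version": "1.0.0",
    "id": "flood-risk-tile-042",
    "bbox": [4.0, 52.0, 4.5, 52.5],
    "geometry": {"type": "Polygon", "coordinates": [[
        [4.0, 52.0], [4.5, 52.0], [4.5, 52.5], [4.0, 52.5], [4.0, 52.0]]]},
    "properties": {"datetime": "2023-01-01T00:00:00Z"},
    "assets": {"risk": {
        "href": "tiles/tile-042.tif",
        "type": "image/tiff; application=geotiff; profile=cloud-optimized"}},
    "links": [],
}

# The item round-trips through JSON, which is what makes STAC portable
print(json.dumps(item)[:40])
```

Because each tile is a self-describing item, downstream tools can fetch only the tiles intersecting an area of interest instead of the whole map.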

The analyses mostly rely on "core packages" from the Pangeo project, with some additional support for scalable geospatial data analysis and cloud I/O, although they can essentially be run on a standard Python Planetary Computer instance. We publish our code, including self-explanatory Jupyter notebooks, at https://github.com/floriscalkoen/egu2023.

To conclude, we foresee that in the next years several coastal data products will be published, some of which may be considered "big data". To incorporate these data products into the next generation of coastal models, it is urgently required to agree upon protocols for coastal data stewardship. With this study we not only want to show the advantages of scalable coastal data analysis; above all, we want to encourage the coastal research community to adopt FAIR data management principles and workflows in an era of exponential data growth.

How to cite: Calkoen, F., Baart, F., Kras, E., and Luijendijk, A.: A novel data ecosystem for coastal analyses, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15964, https://doi.org/10.5194/egusphere-egu23-15964, 2023.

EGU23-16117 | ECS | Orals | ESSI2.8

Virtual aggregations to improve scientific ETL and data analysis for datasets from the Earth System Grid Federation 

Ezequiel Cimadevilla, Maialen Iturbide, and Antonio S. Cofiño

The ESGF Virtual Aggregation (EVA) is a new data workflow approach that aims to advance the sharing and reuse of scientific climate data stored in the Earth System Grid Federation (ESGF). The ESGF is a global infrastructure and network of internationally distributed research centers that together work as a federated data archive, supporting the distribution of global climate model simulations of the past, current and future climate. The ESGF provides modeling groups with nodes for publishing and archiving their model outputs to make them accessible to the climate community at any time. The standardization of the model output in a specified format, and the collection, archival and access of the model output through the ESGF data replication centers have facilitated multi-model analyses. Thus, ESGF has been established as the most relevant distributed data archive for climate data, hosting the data for international projects such as CMIP and CORDEX. As of 2022 it includes more than 30 PB of data distributed across research institutes all around the globe and it is the reference archive for Assessment Reports (AR) on Climate Change produced by the Intergovernmental Panel on Climate Change (IPCC). However, explosive data growth has confronted the climate community with a scientific scalability issue. Conceived as a distributed data store, the ESGF infrastructure is designed to keep file sizes manageable for both sysadmins and end users. However, use cases in scientific research often involve calculations on datasets spanning multiple variables, over the whole time period and multiple model ensembles. In this sense, the ESGF Virtual Aggregation extends the federation capabilities, beyond file search and download, by providing out of the box remote climate data analysis capabilities over data analysis ready, virtually aggregated, climate datasets, on top of the existing software stack of the federation. 
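The aggregation concept (many per-period files presented as one logical dataset) can be mimicked locally with xarray (synthetic data; the EVA itself builds such virtual views on top of ESGF's existing remote-access services):

```python
import numpy as np
import xarray as xr

# Two "files" covering consecutive periods of the same variable
part1 = xr.Dataset({"tas": ("time", np.full(12, 280.0))},
                   coords={"time": np.arange(0, 12)})
part2 = xr.Dataset({"tas": ("time", np.full(12, 281.0))},
                   coords={"time": np.arange(12, 24)})

# The aggregated view spans the whole period without copying or
# reorganizing the underlying storage
virtual = xr.concat([part1, part2], dim="time")
print(virtual.tas.size)  # 24
```

The point of the EVA is to provide this single-dataset view server-side, so users are spared the file-by-file search-and-download step.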
In this work we show an analysis that serves as a test case for the viability of the data workflow and provides the basis for discussions on the future of the ESGF infrastructure, contributing to the debate on the set of reliable core services upon which the federation should be built.

Acknowledgements

This work has been developed with support from IS-ENES3, which is funded by the European Union’s Horizon 2020 research and innovation programme under grant agreement No 824084.

This work has been developed with support from CORDyS (PID2020-116595RB-I00), funded by MCIN/AEI/10.13039/501100011033.

How to cite: Cimadevilla, E., Iturbide, M., and Cofiño, A. S.: Virtual aggregations to improve scientific ETL and data analysis for datasets from the Earth System Grid Federation, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16117, https://doi.org/10.5194/egusphere-egu23-16117, 2023.

EGU23-17029 | Orals | ESSI2.8

Establishing a Geospatial Discovery Network with efficient discovery and modeling services in multi-cloud environments 

Campbell Watson, Hendrik Hamann, Kommy Weldemariam, Thomas Brunschwiler, Blair Edwards, Anne Jones, and Johannes Schmude

The ballooning volume and complexity of geospatial data is one of the main inhibitors for advancements in climate & sustainability research. Oftentimes, researchers need to create bespoke and time-consuming workflows to harmonize datasets, build/deploy AI and simulation models, and perform statistical analysis. It is increasingly evident that these workflows and the underlying infrastructure are failing to scale and exploit the massive amounts of data (Peta and Exa-scale) which reside across multiple data centers and continents. While there have been attempts to consolidate relevant geospatial data and tooling into single cloud infrastructures, we argue that the future of climate & sustainability research relies on networked/federated systems. Here we present recent progress towards multi-cloud technologies that can scale federated geospatial discovery and modeling services across a network of nodes. We demonstrate how the system architecture and associated tooling can simplify the discovery and modeling process in multi-cloud environments via examples of federated analytics for AI-based flood detection and efficient data dissemination inspired by AI foundation models.

How to cite: Watson, C., Hamann, H., Weldemariam, K., Brunschwiler, T., Edwards, B., Jones, A., and Schmude, J.: Establishing a Geospatial Discovery Network with efficient discovery and modeling services in multi-cloud environments, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-17029, https://doi.org/10.5194/egusphere-egu23-17029, 2023.

EGU23-17494 | Orals | ESSI2.8

Enabling simple access to a data lake both from HPC and Cloud using Kerchunk and Intake 

Thierry Carval, Erwan Bodere, Julien Meillon, Mathiew Woillez, Jean Francois Le Roux, Justus Magin, and Tina Odaka

We are experimenting with hybrid access from Cloud and HPC environments using the Pangeo platform to make use of a data lake in an HPC infrastructure, “DATARMOR”. DATARMOR is an HPC infrastructure hosting ODATIS services (https://www.odatis-ocean.fr), situated at the “Pôle de Calcul et de Données pour la Mer” at IFREMER. Its parallel file system has disk space dedicated to shared data, called “dataref”. Users of DATARMOR can access these data, and some of them are cataloged by the Sextant service (https://sextant.ifremer.fr/Ressources/Liste-des-catalogues-thematiques/Datarmor-Donnees-de-reference) and are open and accessible from the internet, without duplicating the data.

In the cloud environment, the ability to access files in a parallel manner is essential for improving the speed of calculations. The Zarr format (https://zarr.readthedocs.io) enables parallel access to datasets, as a Zarr store consists of numerous chunked “object data” files and a few “metadata” files. Although it comprises many files, it is simple to use, since an entire collection of data stored in Zarr format is accessible through a single access point.

For HPC centers, the numerous “object data” files create a lot of metadata on parallel file systems, slowing data access. Recent progress in the development of Kerchunk (https://fsspec.github.io/kerchunk/), which exposes the chunks in a file (e.g. NetCDF/HDF5) as Zarr chunks and can present a series of files as one Zarr store, is solving these technical difficulties in our Pangeo use cases at DATARMOR. Thanks to Kerchunk and Intake (https://intake.readthedocs.io/) it is now possible to use different sets of data stored on DATARMOR in an efficient and simple manner.
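A Kerchunk reference set is itself a small JSON document mapping Zarr chunk keys to byte ranges inside the original files; a hand-built sketch (the file paths, offsets and array layout are hypothetical):

```python
import json

# Version-1 Kerchunk reference format: each chunk key maps either to
# inline data or to [url, offset, length] inside an existing file
refs = {
    "version": 1,
    "refs": {
        ".zgroup": '{"zarr_format": 2}',
        "tas/.zarray": json.dumps({
            "shape": [24], "chunks": [12], "dtype": "<f4",
            "compressor": None, "filters": None, "fill_value": None,
            "order": "C", "zarr_format": 2}),
        # Chunk 0: 12 float32 values = 48 bytes inside the first NetCDF file
        "tas/0": ["dataref/model_run_1.nc", 8192, 48],
        "tas/1": ["dataref/model_run_2.nc", 8192, 48],
    },
}
print(len(refs["refs"]))  # 4
```

Because the reference file is tiny, the parallel file system only ever sees the original NetCDF files plus one JSON document, instead of thousands of small Zarr objects.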

We are further experimenting with this workflow using the same use cases on the Pangeo-EOSC cloud. We make use of the same data stored in the data lake at DATARMOR, accessed through ODATIS via Kerchunk and Intake catalogs, without duplicating the source data. In the presentation we will share our recent experiences from these experiments.

How to cite: Carval, T., Bodere, E., Meillon, J., Woillez, M., Le Roux, J. F., Magin, J., and Odaka, T.: Enabling simple access to a data lake both from HPC and Cloud using Kerchunk and Intake, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-17494, https://doi.org/10.5194/egusphere-egu23-17494, 2023.

EGU23-515 | ECS | Posters on site | HS1.2.1

Estimating sheet flow velocities using quinine as a fluorescent tracer in low luminosity conditions: laboratory and field experiments 

Soheil Zehsaz, João L. M. P. de Lima, M. Isabel P. de Lima, Jorge M. G. P. Isidoro, and Ricardo Martins

This study presents a technique based on the use of quinine as a fluorescent tracer, to estimate sheet flow velocities over various surface coverings (e.g., bare; mulched; vegetated; paved) in low luminosity conditions (e.g., night; twilight; shielded environments). Quinine glows when exposed to UVA light and, in the concentrations used, is not harmful to the environment. Experimental work was conducted to study sheet flows in the i) laboratory (using a soil flume), over bare and mulched surfaces, and ii) field, over vegetated and paved surfaces. Flow velocities were estimated based on the injection of a quinine solution into the water flow. In these experiments, dye and thermal tracer techniques were used as benchmarks for assessing the performance of the quinine tracer. Optical and infrared cameras were used to record the movement of the tracers’ plumes in the flow. The surface velocity of the flow was estimated by tracking the leading edge of each tracer plume and calculating its travel distance over a certain time lapse. Overall, the visibility of the quinine tracer was better than that of the dye tracer, although under some circumstances it was lower than that of the thermal tracer. Nonetheless, the results show that all three tracers yielded similar estimations of the flow velocities. Therefore, when exposed to UVA light, the quinine tracer can be useful for estimating sheet flow velocities over a wide variety of soil and urban surfaces in low luminosity conditions. Despite some inherent limitations of this technique (e.g., invisibility under bright light conditions or heavy mulch/vegetation cover; the need for a UVA lamp), its main advantage is the high visibility of the quinine fluorescent tracer under UVA light in low-light conditions (e.g., night; twilight; shielded environments such as closed conduits), which creates new opportunities for tracer-based surface flow velocity measurements in surface hydrology studies.

How to cite: Zehsaz, S., de Lima, J. L. M. P., de Lima, M. I. P., Isidoro, J. M. G. P., and Martins, R.: Estimating sheet flow velocities using quinine as a fluorescent tracer in low luminosity conditions: laboratory and field experiments, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-515, https://doi.org/10.5194/egusphere-egu23-515, 2023.

EGU23-649 | ECS | Posters on site | HS1.2.1

Near Real-Time Depth Change Monitoring on Inland Water Bodies Using Sentinel-1 and Dynamic World Data 

Utku Berkalp Ünalan, Onur Yüzügüllü, and Ayşegül Aksoy

Monitoring the depth changes in lakes is crucial to understanding hydrological dynamics and water quality changes. In developed countries, the authorities monitor lake depths regularly; however, this is often not the case in developing and underdeveloped countries. In this study, we aim to develop a near-real-time SAR-based depth change monitoring system for lakes by focusing on shoreline pixels. For this purpose, we developed a framework using the Sentinel-1 GRD and Sentinel-2 Dynamic World land cover datasets available on Google Earth Engine. Sentinel-1 data provides the temporal resolution necessary for frequent monitoring. For the initial development phase, we consider five ground monitoring stations in Sweden and one in Turkey. The approach starts by detecting water bodies within a selected area of interest using Sentinel-1. It then extracts shoreline pixels to calculate the change in the VV and VH sigma naught values and the VV-VH and VV+VH Pauli vectors. The extracted differences are further classified according to the temporally closest Dynamic World data to handle the temporal difference for each land cover type. Next, we eliminate outlier values based on percentiles, and from the remaining data we sample each land cover class for modeling. Among the many frameworks tested, we obtained an R² of 0.79 with Gaussian Process Regression. Currently, this framework underestimates higher values and overestimates lower values within a range of ±0.4 cm. Furthermore, for the six chosen lakes, we observed a negative correlation between depth change and the polarimetric features obtained from samples taken from the grass and flooded-vegetation land covers, which is typical for natural lakes. In the second step of the development, we will increase the number of samples by including lakes from Switzerland and further develop the model.
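A Gaussian Process Regression fit of the kind reported can be sketched with scikit-learn (synthetic features standing in for the shoreline polarimetric quantities; the R² of 0.79 in the abstract comes from the authors' real data, not from this toy):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic stand-in for per-shoreline features (VV, VH, VV-VH, VV+VH)
X = rng.normal(size=(200, 4))
# Synthetic depth change driven by two features plus noise
y = 0.5 * X[:, 0] - 0.3 * X[:, 1] + 0.05 * rng.normal(size=200)

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X, y)
# In-sample score; a real evaluation would hold out gauge stations
print(round(gpr.score(X, y), 2))
```

The WhiteKernel term lets the model absorb observation noise rather than overfitting it, which matters when the targets are noisy gauge readings.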

How to cite: Ünalan, U. B., Yüzügüllü, O., and Aksoy, A.: Near Real-Time Depth Change Monitoring on Inland Water Bodies Using Sentinel-1 and Dynamic World Data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-649, https://doi.org/10.5194/egusphere-egu23-649, 2023.

Monitoring dissolved methane in aquatic ecosystems contributes significantly to advancing our understanding of the carbon cycle in these habitats and capturing their impact on methane emissions. Low-cost metal oxide semiconductor (MOS) gas sensors are becoming an increasingly attractive tool for such measurements, especially at the air-water interface. However, the performance of MOS sensors in aquatic environmental sciences has come under scrutiny because of their cross-sensitivity to temperature, moisture, and sulfide interference. In this study, we evaluated the performance and limitations of a MOS methane sensor when measuring dissolved methane in waters. A MOS sensor was encapsulated in a hydrophobic ePTFE membrane to impede contact with water but allow gas perfusion. The membrane thus enabled us to submerge the sensor in water and overcome the cross-sensitivity to humidity. A simple portable, low-energy, flow-through cell system was assembled that included an encapsulated MOS sensor and a temperature sensor. Waters (with or without methane) were injected into the flow cell at a constant rate by a peristaltic pump. The signals from the two sensors were recorded continuously with a cost-efficient Arduino UNO microcontroller. Our experiments revealed that the lower limit of the sensor was in the range of 0.1–0.2 µM and that it provided a stable response at water temperatures in the range of 18.5–28 °C. More information in Butturini, A., & Fonollosa, J. (2022). Use of metal oxide semiconductor sensors to measure methane in aquatic ecosystems in the presence of cross-interfering compounds. Limnology and Oceanography: Methods, 20(11), 710–720.

How to cite: Butturini, A. and Fonollosa, J.: Metal oxide semiconductor (MOS) sensors to measure methane in aquatic ecosystems. An eficient DIY low  cost application., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1221, https://doi.org/10.5194/egusphere-egu23-1221, 2023.

EGU23-1636 | Posters virtual | HS1.2.1

Using the hydrological model for filling the missing discharge data by using multi-site calibration 

Ankit Singh, Hemant Kumar Dhaka, Pragati Prajapati, and Sanjeev Kumar Jha

River discharge data is one of the most important pieces of information for managing various water resources applications, including flood frequency analysis, drought and flood prediction, etc. Missing observed discharge data, even a short gap, can influence the whole analysis and lead to very different results. Filling data gaps in streamflow records is thus a critical step in any hydrological study. Interpolation, regression-based analysis, artificial neural networks, and modeling are all methods for filling in missing data. When using a hydrological model to generate the data, we first need to calibrate the model. Single-site calibration of a hydrological model has its own limitations, due to which it does not correctly predict streamflow at intermediate gauge locations. This is because, while calibrating the model for the final outlet, we tune the parameters that affect the results at the final outlet only and neglect the output at intermediate sites. In this study, we demonstrate the importance of multi-site calibration and use the calibrated hydrological model to generate the missing data at intermediate sites.

For this study, we selected the Godavari River basin and calibrated the model at the final outlet (single-site calibration) and at 18 + 1 outlets (multi-site calibration). The whole basin is divided into 103 subbasins, and the Soil and Water Assessment Tool (SWAT) hydrological model is used. After successful multi-site calibration, we generated the missing data at 25 different gauging locations. The initial results from single-site calibration (NSE = 0.57, R² = 0.61) show good agreement between observed and simulated discharge at the final outlet. The multi-site calibration analysis is in progress, and full results will be presented at the conference.
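The two goodness-of-fit measures quoted are standard and straightforward to compute with numpy (the discharge values below are toy numbers, not the Godavari results):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r2(obs, sim):
    """Squared Pearson correlation between observed and simulated flows."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Toy discharge series (m3/s); real values come from gauge records
obs = [120.0, 95.0, 210.0, 180.0, 60.0]
sim = [110.0, 100.0, 190.0, 185.0, 70.0]
print(round(nse(obs, sim), 2), round(r2(obs, sim), 2))
```

NSE penalises bias as well as scatter, which is why it is typically lower than R² for the same simulation; an NSE of 1 means a perfect match, while values below 0 mean the model is worse than using the observed mean.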

How to cite: Singh, A., Dhaka, H. K., Prajapati, P., and Jha, S. K.: Using the hydrological model for filling the missing discharge data by using multi-site calibration, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1636, https://doi.org/10.5194/egusphere-egu23-1636, 2023.

EGU23-2681 | Posters on site | HS1.2.1

A low cost real-time kinematic dGPS system for measuring glacier movement 

Kirk Martinez, Jane Hart, Sherif Attia, Graeme Bragg, Marcus Corbin, Michael Jones, Christian Kuhlmann, Elliot Weaver, Richard Wells, Ioannis Christou, and Emily James

Glacier movement has been measured over the years using commercial units such as those from Leica. The aim is to measure point movements on the glacier surface in order to capture fine-grained data about its movement. This can also help to calibrate satellite-based approaches, which have much lower resolution. Commercial dGPS recorders cost thousands of Euros, so our project is creating a solution using new, lower-cost dGPS boards, which could enable their use by more Earth scientists.

The u-blox ZED-F9P-based boards from SparkFun can be used as a base station to send dGPS corrections to “rover” units on the glacier via a radio link. Each measurement is accurate to about 2 cm, depending on conditions. In our design, the radio is also used by the rovers to forward good fixes back to the base station, which then uses off-site communications to send the data home. Two types of internet link have been enabled: a nano-satellite board (by Swarm) and a more traditional GSM mobile phone board (for locations with coverage). Both of these boards are also available from SparkFun, making most of the modules off-the-shelf. Our power supply, however, is optimised to save power and to charge the lithium-ion battery from a solar panel. A real-time clock chip wakes the system up to take readings and transmit data, so the sleep power is only 0.03 mW, enabling a year-long lifetime. The whole system is controlled by a SparkFun Thing Plus SAMD51, which provides the required four serial connections and a CircuitPython environment. The full system will be installed in Iceland in the summer of 2023, replacing the previous prototype based on Swift Piksi Multi units, which had shown the measurement principle to be sound.
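
The year-long lifetime follows from a simple duty-cycle power budget. The sketch below shows the arithmetic; only the 0.03 mW sleep power comes from the text, while the active power, daily active time, and battery capacity are illustrative assumptions:

```python
# Duty-cycle power budget for a sleep/wake logger.
SLEEP_MW = 0.03            # sleep power (mW), as quoted in the text
ACTIVE_MW = 300.0          # assumed draw while measuring + transmitting (mW)
ACTIVE_S_PER_DAY = 120.0   # assumed two minutes of activity per day

duty = ACTIVE_S_PER_DAY / 86400.0
avg_mw = ACTIVE_MW * duty + SLEEP_MW * (1.0 - duty)

battery_mwh = 3.7 * 2000.0            # e.g. one 2000 mAh Li-ion cell at 3.7 V
lifetime_days = battery_mwh / (avg_mw * 24.0)
print(f"average draw {avg_mw:.3f} mW -> ~{lifetime_days:.0f} days")
```

With these assumed figures the average draw is dominated by the brief active periods, yet the battery still lasts well over a year, consistent with the lifetime claimed above.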

How to cite: Martinez, K., Hart, J., Attia, S., Bragg, G., Corbin, M., Jones, M., Kuhlmann, C., Weaver, E., Wells, R., Christou, I., and James, E.: A low cost real-time kinematic dGPS system for measuring glacier movement, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2681, https://doi.org/10.5194/egusphere-egu23-2681, 2023.

EGU23-4844 | Posters on site | HS1.2.1

Quality control of stream water-stage using Hilbert-Huang Transform 

Yen- Chang Chen and Wu-Hsien Hsiao

Hydrological data, especially water stage and discharge, are very important for water resources planning and development, hydraulic structure design, and water resources management. Thus, hydrological data must be observed and collected regularly and continuously. These data can be affected by many factors, such as people, instruments, and climate. Therefore, the collected data must still undergo quality control and inspection to eliminate unreasonable values and ensure accuracy and reliability. Traditionally, quality control of stream water-stage data is mainly manual: verification requires experienced hydrologists to judge the correctness of the data and cannot be processed automatically. Manual quality control of stream water stage is time-consuming, costly, and labor-intensive. It is therefore necessary to develop a feasible model that automatically checks stream water-stage data in order to provide reliable and accurate hydrological records.

This study applies the Hilbert-Huang Transform (HHT) to stream water-stage records. The HHT is composed of Ensemble Empirical Mode Decomposition (EEMD) and the Hilbert transform (HT). The EEMD decomposes the stream water-stage series into intrinsic mode functions (IMFs) and a residual. The first IMF component is passed through the Hilbert transform to obtain a time-amplitude-energy diagram. From the amplitude fluctuations of this component of the stream water-stage, the amplitude values of outliers can be revealed: when the amplitude is larger than usual, an outlier is likely present, and vice versa. A threshold established in this study serves as the basis for filtering out incorrect water-stage values, so that the water-stage data can be inspected automatically. The automatic inspection procedure developed in this study will greatly reduce manual quality control, not only shortening checking time and saving manpower but also providing reliable and correct river water-stage data.
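
The final filtering step can be sketched without the full EEMD/HT pipeline. Assuming the instantaneous-amplitude series of the first IMF is already available, a mean-plus-k-standard-deviations rule (our illustrative threshold, with hypothetical values, not the study's fitted one) flags the suspect samples:

```python
import statistics

def flag_outliers(amplitude, k=2.0):
    """Return indices whose instantaneous amplitude exceeds mean + k*stdev.

    `amplitude` stands in for the Hilbert amplitude of the first IMF; the
    threshold rule and k are illustrative.
    """
    mu = statistics.fmean(amplitude)
    sigma = statistics.stdev(amplitude)
    return [i for i, a in enumerate(amplitude) if a > mu + k * sigma]

# Hypothetical amplitude series with one spike (e.g. a recording glitch)
amp = [0.2, 0.3, 0.25, 0.28, 5.0, 0.22, 0.27, 0.24, 0.26, 0.23]
print(flag_outliers(amp))
```

The flagged indices would then be removed or re-examined before the stage record is released.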

How to cite: Chen, Y.-C. and Hsiao, W.-H.: Quality control of stream water-stage using Hilbert-Huang Transform, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-4844, https://doi.org/10.5194/egusphere-egu23-4844, 2023.

EGU23-4878 | Posters on site | HS1.2.1

Trials and design iterations experienced developing a low-cost depth trawl to sample macroplastic through the water column of a tidal river. 

David Higgins, Renata Correia, Hooi Siang Kang, Lee Kee Quen, Tan Lit Ken, Andre Vollering, Stijn Pinson, Thaine H. Assumpção, and Thomas Mani

Understanding of the transport behaviour of mismanaged plastic waste in riverine and estuarine environments is growing. However, most studies to date focus on surface-layer transport, while only a limited number measure the vertical distribution of plastic waste within these systems. Factors such as density, shape, wind, and flow velocity can determine the vertical distribution of plastic waste in a river, but many knowledge gaps remain. As technology developers move to create innovative interception solutions that extract plastic waste at the river surface, a greater understanding of the transport behaviour of sub-surface plastic debris is therefore required. Here, we present a comprehensive overview of the development stages required to build and deploy a low-cost depth-trawl tool designed to sample plastic waste at depths of up to 5 m in a heavily polluted river in Malaysia. Topics covered include tool design concepts, manufacturing methods, onsite testing, lessons from river deployment, and sampling results. Field data are compiled from over 60 sampling surveys conducted over 14 days at several locations along the Klang River, Malaysia. The depth trawl is mounted on a locally available fishing boat (sampan), consists of two horizontal steel arms, a steel frame, two winches, cables, weights, and five nets, and is operated manually with the assistance of a solar-powered motor. The dimensions of each net are 30 cm (W) x 50 cm (H) x 100 cm (L), with a mesh size of 30 mm x 30 mm. To ensure that the nets remain vertically aligned during deployment, a 15 kg weight is tied to the bottom of the net system on both sides. Samples were collected every 1 metre to a depth of 5 metres. Each sampling run lasted 15 minutes and was conducted six times per day with an interval of 1 hour between samples to allow for changes in the tide and river flow direction.
An ADCP was deployed in parallel to the depth trawl to provide measurements of flow velocity variation at the river surface and with depth. In addition, this paper reviews the depth trawl system’s capabilities and recommendations for further studies and applications in the field.

How to cite: Higgins, D., Correia, R., Kang, H. S., Quen, L. K., Ken, T. L., Vollering, A., Pinson, S., H. Assumpção, T., and Mani, T.: Trials and design iterations experienced developing a low-cost depth trawl to sample macroplastic through the water column of a tidal river., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-4878, https://doi.org/10.5194/egusphere-egu23-4878, 2023.

EGU23-5922 | Posters on site | HS1.2.1

Effectiveness-assessment of nature-based flood mitigation using networked, low cost DIY environmental monitoring from FreeStation 

Sophia Burke, Arnout van Soesbergen, and Mark Mulligan

FreeStations are mature, low-cost, networked, DIY environmental sensors and data loggers, developed since 2014 and now deployed around the world. Build instructions are open source at www.freestation.org and are based on highly available, low-cost yet accurate and robust components (builds typically cost 3% of the parts cost of an equivalent proprietary monitoring system). This allows investment in a network of environmental loggers for the cost of a single proprietary logger.

FreeStations have been widely deployed in the DEFRA Natural Flood Management (NFM) national trials in the UK, and analytical methods have been developed to examine the performance of leaky dams, retention ponds, regenerative agricultural practices, and other nature-based solutions in mitigating flood risk at downstream assets.

These deployments usually consist of FreeStation weather stations recording rainfall volume, rainfall intensity, air temperature, humidity, and pressure, as well as solar radiation, wind speed, and wind direction. Rainfall volume and instantaneous intensity are the most important for NFM studies. Alongside the weather stations, FreeStation sonar-based stage sensors, combined with a river profile scan from a FreeStation LIDAR, are used to monitor change in river discharge due to an NFM intervention, relative to discharge at a downstream asset at risk. Readings are taken at 10-minute intervals over multiple years.
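
One common way to derive discharge from stage readings of this kind is a power-law rating curve fitted to the monitored cross-section. The sketch below illustrates the form; the coefficients are hypothetical, and the abstract does not state which rating model the project uses:

```python
def rating_curve(stage_m, a=4.0, b=1.6, h0=0.10):
    """Power-law rating curve Q = a * (h - h0)**b, with h0 the stage of zero
    flow. The coefficients here are hypothetical; real curves are fitted
    per site from paired stage/discharge measurements."""
    if stage_m <= h0:
        return 0.0
    return a * (stage_m - h0) ** b

# Convert a series of sonar stage readings (m) into discharge (m3/s)
stages = [0.08, 0.35, 0.60, 0.95]
print([round(rating_curve(h), 3) for h in stages])
```

Comparing such discharge series upstream and downstream of an intervention is what reveals the attenuation attributable to the NFM measure.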

A series of web-based methods has been built as part of the FreeStation //Smart: platform to monitor and manage data from deployments and to analyse the data to better understand flood mitigation by the key types of intervention. In testing at more than 10 sites in the UK over a period of 2-3 years per site, large volumes of data have been collected at low cost and in support of local stakeholders during the H2020 NAIAD and H2020 ReSET projects.

The data indicate the importance of careful design in leaky debris dams, the limited impact of inline retention ponds and the significant capacity of low-till farming methods to mitigate downstream flooding.  The effectiveness of NFM depends upon the number and scale of interventions, the proportion of the discharge at the downstream asset at risk which they affect (i.e. the downstream proximity of the asset at risk) and the capital and maintenance costs of the interventions. 

Low-cost approaches to environmental monitoring will be critical for developing the evidence base needed to better understand which nature-based solutions work, and where, for water. Low-cost, internet-connected devices are easy to monitor and maintain, low-risk, and capable of extensive deployment to address the challenge of geographical variability, which means that the impacts of specific NFM interventions are highly site-specific.

How to cite: Burke, S., van Soesbergen, A., and Mulligan, M.: Effectiveness-assessment of nature-based flood mitigation using networked, low cost DIY environmental monitoring from FreeStation, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-5922, https://doi.org/10.5194/egusphere-egu23-5922, 2023.

EGU23-8165 | Posters on site | HS1.2.1

Developing a smart sensor network for soil moisture monitoring in forests 

Nikita Aigner, Christine Moos, and Estelle Noyer

Forests play a crucial role in regulating the water content of soils and thus influence runoff formation, as well as susceptibility to drought and forest fires. However, the extent to which forests influence soil moisture is difficult to quantify and depends on several parameters, such as precipitation intensity and duration, and terrain or soil properties. To capture the temporal and spatial variability of soil moisture in forests, large-scale and long-term measurements are necessary. Currently, such measurements are relatively expensive and complex, and are thus generally lacking or restricted to agricultural areas.

Our current work focuses on the development of a low-cost soil moisture sensor that uses off-the-shelf parts and can be deployed at scale to provide continuous long-term measurements. To increase adoption and ensure the digital sustainability of our concept, the project will be released as open source to the general public.

The sensor design is based around an ESP32 microcontroller that manages measurements with capacitive soil moisture sensors. For communication, we leverage the LoRa protocol and use infrastructure provided by The Things Network (TTN). Here, we present the soft- and hardware architecture of a sensor prototype and the results obtained from a proof-of-concept deployment. In addition, we discuss the calibration procedure and the evaluation of the capacitive soil moisture sensors against time-domain reflectometry (TDR) sensors. Finally, we provide an outlook on future developments of our measurement system. The final goal of this project is to deploy sensors in several areas of interest, gathering data for a better understanding of the interaction between forests and soil moisture content.

How to cite: Aigner, N., Moos, C., and Noyer, E.: Developing a smart sensor network for soil moisture monitoring in forests, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8165, https://doi.org/10.5194/egusphere-egu23-8165, 2023.

EGU23-10497 | ECS | Posters virtual | HS1.2.1

Synchronized mapping of water quantity and quality of a reservoir through an unmanned surface vehicle: A case study of the Daljeon reservoir, South Korea 

Kwang-Hun Lee, Shahid Ali, Yena Kim, Ki-Taek Lee, Sae Yun Kwon, and Jonghun Kam

This study developed a synchronized mapping technique for water quantity and quality via an unmanned surface vehicle (USV). A USV equipped with an acoustic Doppler current profiler (ADCP) and a multiparameter sonde of water quality sensors (YSI EXO2) was used to identify spatial and seasonal patterns of the Daljeon reservoir in South Korea. With this technique, we measured bathymetry and nitrate concentration from August 2021 through July 2022 at high spatial resolution and tested the sensitivity of estimated nitrate loads to spatial variations of the input variables (water volumes and nitrate concentrations). Results showed that measured bathymetry and nitrate concentration vary over the water surface of the reservoir and over time, in association with seasonal variations of temperature and precipitation. Despite weak spatial variations of the nitrate concentration, the water level of the reservoir showed strong spatiotemporal variations depending on the topography of the reservoir and the occurrence of rainfall. Furthermore, we found that using the mean concentration underestimated the nitrate load by about 20% relative to estimates that account for spatial variation. High-resolution bathymetry measurements play a key role in estimating nitrate loads, with a minor impact from spatial variations of the measured nitrate concentrations. We found that rainfall occurrences are more likely to increase estimated nitrate loads when the estimate accounts for spatial variations of the input variables, particularly water volumes. This study proved the potential utility of USVs in simultaneously monitoring water quantity and quality for integrative water resource management and the sustainable development of our communities.
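
The sensitivity of load estimates to spatial variation can be illustrated with a toy calculation: when higher nitrate concentrations coincide with larger water volumes, applying a single mean concentration to the total volume underestimates the load. All numbers below are hypothetical:

```python
# Hypothetical reservoir grid cells: (water volume in m3, nitrate in mg/L).
# High concentrations co-occur with large volumes, the situation in which a
# single mean concentration biases the load estimate low.
cells = [
    (20_000, 2.0), (12_000, 1.5), (8_000, 0.8),
    (5_000, 0.5), (3_000, 0.4),
]

# Spatially resolved load: per-cell volume x concentration (1 m3 * 1 mg/L = 1 g)
load_true = sum(v * c for v, c in cells) / 1000.0          # kg

# Mean-based load: mean concentration applied to the total volume
mean_c = sum(c for _, c in cells) / len(cells)
total_v = sum(v for v, _ in cells)
load_mean = total_v * mean_c / 1000.0                      # kg

bias_pct = (load_mean - load_true) / load_true * 100.0
print(round(load_true, 1), round(load_mean, 2), round(bias_pct, 1))
```

With these illustrative values the mean-based estimate is biased low by roughly a quarter, the same direction as the ~20% underestimation reported above.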

How to cite: Lee, K.-H., Ali, S., Kim, Y., Lee, K.-T., Kwon, S. Y., and Kam, J.: Synchronized mapping of water quantity and quality of a reservoir through an unmanned surface vehicle: A case study of the Daljeon reservoir, South Korea, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10497, https://doi.org/10.5194/egusphere-egu23-10497, 2023.

EGU23-11411 | Posters on site | HS1.2.1

Automated ablation stakes to constrain temperature-index melt models 

Andrew D. Wickert, Katherine R. Barnhart, William H. Armstrong, Matías Romero, Bobby Schulz, Gene-Hua Crystal Ng, Chad T. Sandell, Jeff D. La Frenierre, Shanti B. Penprase, Maximillian Van Wyk de Vries, and Kelly R. MacGregor

We developed automated ablation stakes to measure colocated in-situ changes in ice-surface elevation and climatological drivers of ablation. The designs implement open-source hardware, including the Margay data logger, which records information from a MaxBotix ultrasonic rangefinder as well as a sensor to detect atmospheric temperature and relative humidity. The stakes and sensor mounts are assembled using commonly available building materials, including electrical conduit and plastic pipe. The frequent (typically 1–15 minute) measurement intervals permit an integral approach to estimating temperature-index melt factors for ablation. Regressions of ablation vs. climatological drivers improve when relative humidity is included alongside temperature. We present all materials required to construct an automated ablation stake, alongside examples of their deployment and use in Alaska (USA), Ecuador, Patagonia (Argentina), and the Antarctic archipelago.
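
A temperature-index melt model of the kind these stakes constrain reduces, in its simplest form, to a positive-degree sum over short intervals. A sketch with a hypothetical melt factor and temperature series (the regressions described above also include relative humidity, which is omitted here):

```python
def temperature_index_melt(temps_c, ddf=4.0, t0=0.0, dt_hours=1.0):
    """Positive-degree melt sum: ddf is the melt factor in mm w.e. per
    degC-day (a hypothetical value); each reading covers dt_hours."""
    return sum(ddf * max(t - t0, 0.0) * dt_hours / 24.0 for t in temps_c)

# One hypothetical day of hourly air temperature (degC)
temps = [-2, -1, 0, 1, 3, 5, 6, 7, 8, 8, 7, 6,
         5, 4, 3, 2, 1, 0, -1, -1, -2, -2, -3, -3]
print(round(temperature_index_melt(temps), 2), "mm w.e.")
```

The integral approach mentioned above amounts to exactly this summation at the stakes' 1-15 minute measurement intervals, with the melt factor fitted so that the summed melt matches the rangefinder-measured lowering of the ice surface.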

 

[Figure: example deployments. a: Alaska, 2012; b: Alaska, 2013; c: Ecuador, 2016; d: Argentina, 2020; e: Antarctica, 2021]

How to cite: Wickert, A. D., Barnhart, K. R., Armstrong, W. H., Romero, M., Schulz, B., Ng, G.-H. C., Sandell, C. T., La Frenierre, J. D., Penprase, S. B., Van Wyk de Vries, M., and MacGregor, K. R.: Automated ablation stakes to constrain temperature-index melt models, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11411, https://doi.org/10.5194/egusphere-egu23-11411, 2023.

EGU23-11777 | Posters on site | HS1.2.1

A low cost multi-chamber system (“Greenhouse Coffins”) to monitor CO2 and ET fluxes under semi-controlled conditions: Design and first results 

Mathias Hoffmann, Wael Al Hamwi, Matthias Lück, Marten Schmidt, and Maren Dubbert

Determining greenhouse gas (GHG) fluxes, water (ET) fluxes, and their interconnectivity within the soil-plant-atmosphere continuum is crucial, not only for finding ways for current agricultural systems to mitigate the global climate crisis, but also for adapting them to related challenges ahead, such as more frequent and severe droughts. As a first step toward better understanding, laboratory and/or greenhouse pot experiments are often performed, in which gas exchange is predominantly measured using manual closed-chamber systems. Commercially available systems to determine gas exchange in terms of CO2 and ET are, however, costly, and the measurements themselves are labour-intensive. This limits the number of variables that can be studied, as well as the possible repetitions during a study. It has additionally led to a long-term focus on agroecosystems of the northern hemisphere, while agroecosystems of sub-Saharan Africa and Southeast Asia remain underrepresented.

We present an inexpensive (<1,000 Euro), Arduino-based, multi-chamber system to semi-automatically measure 1) CO2 and 2) ET fluxes. The system consists of multiple self-sufficient, closet-shaped PVC “coffins”. The “coffins” are closed by a frontal door and periodically ventilated through a sliding window. Relays connected to the microcontroller are used to steer closure/opening (linear actuator) and ventilation (axial fans). CO2 and ET fluxes are determined from the respective concentration increase during closure using a low-cost NDIR CO2 sensor (K30FR; 0-10,000 ppm, ±30 ppm accuracy) and an rH sensor (SHT-41). Parallel measurements of relevant environmental parameters inside and outside the “coffins” are conducted with DS18B20 (temperature) and BMP280 (air pressure) sensors. Sensor control, data visualization and storage, as well as the steering of closure/opening and ventilation, are implemented in a WiFi- and Bluetooth-enabled, socket-powered (9 V), compact microcontroller-based (D1 RS32) logger unit. Here, we present the design and first results of the developed low-cost multi-chamber system. Results were validated against customized CO2 and ET measurement systems using regular scientific sensors (LI-COR 850) and data-logger components (CR1000), connected to each “coffin” by a multiplexer. Flow meters were used for measurement synchronization.
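
A closed-chamber CO2 flux is obtained from the slope of the concentration rise during closure, scaled by the chamber's volume-to-area ratio and the molar density of air. A sketch with hypothetical closure data and chamber dimensions (not the geometry of the system described above):

```python
def slope(xs, ys):
    """Least-squares slope of y against x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Hypothetical 5-minute closure: CO2 rising by about 0.5 ppm per second
t_s = [0, 60, 120, 180, 240, 300]
co2_ppm = [420.0, 450.0, 481.0, 510.0, 540.5, 570.0]
dcdt = slope(t_s, co2_ppm)                 # ppm s-1

# Ideal-gas conversion: ppm s-1 -> umol m-2 s-1, scaled by chamber V/A
P, R, T = 101325.0, 8.314, 293.15          # Pa, J mol-1 K-1, K (20 degC assumed)
V, A = 0.125, 0.25                         # chamber volume (m3) and footprint (m2), hypothetical
flux = dcdt * 1e-6 * P / (R * T) * (V / A) * 1e6
print(round(flux, 2), "umol CO2 m-2 s-1")
```

The same slope-based calculation applies to ET, with the water vapour concentration derived from the rH and temperature readings in place of the NDIR CO2 signal.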

How to cite: Hoffmann, M., Al Hamwi, W., Lück, M., Schmidt, M., and Dubbert, M.: A low cost multi-chamber system (“Greenhouse Coffins”) to monitor CO2 and ET fluxes under semi-controlled conditions: Design and first results, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11777, https://doi.org/10.5194/egusphere-egu23-11777, 2023.

EGU23-12622 | ECS | Posters on site | HS1.2.1

Water user Fab Labs: co-design of low-tech sensors for irrigated systems 

Paul Vandôme, Crystele Leauthaud, Simon Moinard, Insaf Mekki, Abdelaziz Zairi, and Gilles Belaud

Mediterranean agriculture is facing the challenge of producing sustainably with a water resource under pressure. As irrigated areas expand in response to increasing vulnerability to drought, it is essential to support water users towards better agricultural water management. We set up two Fab Labs on the shores of the Mediterranean (France and Tunisia) to bring together water users around a collective project: co-constructing innovations to address local water management issues. A range of low-tech, low-cost, and open-source IoT-based sensors emerged from this process. The technologies were tested with users during the 2022 irrigation season. The aim of this study is to provide feedback on this participatory method as a facilitator for creating and sharing innovation in rural territories, and to discuss the opportunities, benefits, and limitations related to the use of these new technologies. We believe that this work contributes to making the measurement of water flows, and thus their understanding and better management, more accessible to the agricultural sector.

How to cite: Vandôme, P., Leauthaud, C., Moinard, S., Mekki, I., Zairi, A., and Belaud, G.: Water user Fab Labs: co-design of low-tech sensors for irrigated systems, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12622, https://doi.org/10.5194/egusphere-egu23-12622, 2023.

EGU23-13072 | ECS | Posters on site | HS1.2.1

Precipitation Measurement from Raindrops’ Sound and Touch Signals 

Seunghyun Hwang, Jinwook Lee, Jeemi Sung, Hyochan Kim, Beomseo Kim, and Changhyun Jun

This study proposes a novel method for estimating rainfall intensity from acoustic and vibration data collected with low-cost sensors. First, a precipitation measurement device was developed to collect sound and touch signals from raindrops, composed of a Raspberry Pi, a condenser microphone, and an accelerometer with 6 degrees of freedom. To determine whether rainfall occurred, a binary classification model based on the XGBoost algorithm was applied to long-term time series of the vibration data. High-resolution acoustic data were then used to investigate the main characteristics of rainfall patterns in the frequency domain for the periods classified as rainfall. From the Short-Time Fourier Transform (STFT), the highest frequency and the mean and standard deviation of the amplitudes were selected as representative values for each minute of data. Finally, different types of regression models were applied to develop the rainfall intensity estimation method through comparative analysis with other precipitation measurement devices (e.g., PARSIVEL). Notably, the new device with the proposed method functions reliably under extreme environmental conditions, as shown by comparing the estimated rainfall intensity with measured data from ground-based precipitation devices. This shows that low-cost sensors capturing the sound and touch signals of raindrops can be used effectively for rainfall intensity estimation, with easy installation and maintenance, indicating strong potential across a wide range of applications for high-resolution, accurate precipitation measurement.
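
The per-minute spectral features can be reconstructed from a discrete Fourier transform of each audio frame. A minimal stdlib-only sketch on a synthetic tone, interpreting "highest frequency" as the frequency of the strongest spectral bin (our assumption, as are the frame length and sampling rate):

```python
import cmath
import math
import statistics

def frame_features(samples, fs):
    """Per-frame features: (frequency of the strongest bin in Hz,
    mean amplitude, standard deviation of amplitudes) from a DFT."""
    n = len(samples)
    mags = []
    for k in range(n // 2):  # non-redundant half of the spectrum
        s = sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, x in enumerate(samples))
        mags.append(abs(s) / n)
    peak = max(range(1, len(mags)), key=lambda k: mags[k])  # skip the DC bin
    return peak * fs / n, statistics.fmean(mags), statistics.stdev(mags)

# Synthetic frame: a 2 kHz tone sampled at 8 kHz (2 cycles in 8 samples)
frame = [math.sin(2 * math.pi * 2 * i / 8) for i in range(8)]
freq, mean_amp, std_amp = frame_features(frame, fs=8000)
print(freq, round(mean_amp, 3), round(std_amp, 3))
```

In practice these three values per minute would form the feature vector fed to the regression models, with an FFT library replacing the naive DFT loop for real frame lengths.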

Acknowledgement

This work was supported by the National Research Foundation of Korea(NRF) grant funded by the Korea government(MSIT) (No. NRF-2022R1A4A3032838).

How to cite: Hwang, S., Lee, J., Sung, J., Kim, H., Kim, B., and Jun, C.: Precipitation Measurement from Raindrops’ Sound and Touch Signals, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13072, https://doi.org/10.5194/egusphere-egu23-13072, 2023.

EGU23-14370 | Posters on site | HS1.2.1

Monitoring an ephemeral stream with a Teensy 3.2 + audio shield to determine water level only from the noise of a stream 

Linus Fässler, Natalie Ceperley, Peter Leiser, and Bettina Schaefli

River networks in the Alps are very complex and hold many unanswered research questions. For example, various assumptions must be made when studying tributaries and small rivers. Notably, there is no widely accepted tool for measuring streamflow in small mountain streams that can overcome their specific challenges affordably and without large installations. For example, alternation between extremely high discharge and no discharge is characteristic of intermittent rivers and ephemeral streams (IRES). Conventional measuring devices all require streambed installation, which exposes them to displacement or destruction by abruptly rising water levels. One solution, then, is to remove the sensor from the streambed and measure from a distance. We have experimented with an acoustic sound recorder mounted above the stream as an alternative tool to assess water level. We designed a low-cost audio sensor powered by a microcontroller with an audio shield specifically for recording IRES. To ensure reproducibility, we used Arduino for programming the Teensy 3.2. Images of the water level in an IRES were captured simultaneously when possible (daylight) and used for calibration. The water level visible in the images correlated well with that determined from the recordings of our self-developed audio sensor (R2 = 0.95). Based exclusively on the audio recording of an IRES, we can thus obtain a time series of the water level, at least when water is present. We are currently unable to determine consistently whether water is present, nor to state with certainty when the streambed is dry, based solely on acoustic data. Nevertheless, this new sensor allows us to measure an alpine channel network at more locations and over longer time periods than previously feasible.
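
The calibration of audio against image-derived stage amounts to a loudness metric plus a fitted line. A sketch with hypothetical RMS/stage pairs (the actual regression form used in the study is not stated in the abstract):

```python
import math

def rms(window):
    """Root-mean-square loudness of one audio window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def fit_linear(xs, ys):
    """Ordinary least squares y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical calibration pairs: per-window RMS loudness vs. stage read
# from the daylight images (cm)
loudness = [0.02, 0.05, 0.08, 0.12, 0.15]
stage_cm = [2.0, 5.5, 9.0, 13.5, 17.0]
a, b = fit_linear(loudness, stage_cm)
print(round(a * 0.10 + b, 1))   # stage estimate for a new window with RMS 0.10
```

Once fitted, the line converts every audio window into a stage estimate, extending the record to hours when no usable image exists.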

How to cite: Fässler, L., Ceperley, N., Leiser, P., and Schaefli, B.: Monitoring an ephemeral stream with a Teensy 3.2 + audio shield to determine water level only from the noise of a stream, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14370, https://doi.org/10.5194/egusphere-egu23-14370, 2023.

The development of artificial reservoirs plays a considerable role in regulating the spatial and temporal distribution of water for irrigation and in guaranteeing sustainable agricultural development. Many studies have used the area-storage relationship to obtain the storage capacity of on-farm reservoirs (OFRs), but it does not work for OFRs with a persistent water surface area. In this study, we propose an effective method to estimate the water storage of irrigated OFRs by combining multi-source remote sensing data and ground observation. We rapidly derive the locations of irrigated OFRs using their seasonal characteristics and obtain high-precision water surface areas using object-oriented segmentation. We estimate the water storage of irrigated OFRs in three different ways: Lidar-based, ground-observation-based (photos), and surface-area-based. The method performs well in all three respects, i.e., identifying on-farm reservoirs, extracting water surface area, and calculating water storage. The identification accuracy reaches 94.1%, and the derived water area agrees well with the surveyed results, with an overall accuracy of 97.8%; the root mean square error (RMSE) and mean absolute error (MAE) are 962 m2 and 766 m2, respectively. The obtained water storage is reliable for all three approaches (area-storage, Lidar-based, and photo-observation-based), with accuracies of 98.8%, 95.2%, and 94.1%, respectively. The proposed method enables monitoring of the storage of multiple types of irrigated OFRs; in particular, the photo-observation-based method can handle OFRs with persistent water areas, showing great potential for improving the efficiency of irrigation water resource utilization.
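
The area-storage idea can be illustrated by integrating a stage-area table over depth, for instance one derived from Lidar bathymetry. The table below is hypothetical:

```python
def storage_from_hypsometry(levels_m, areas_m2):
    """Trapezoid-rule integration of water surface area over depth, giving
    the stored volume in m3 up to the highest listed level."""
    v = 0.0
    for i in range(len(levels_m) - 1):
        dh = levels_m[i + 1] - levels_m[i]
        v += 0.5 * (areas_m2[i] + areas_m2[i + 1]) * dh
    return v

levels = [0.0, 0.5, 1.0, 1.5, 2.0]            # m above the reservoir bottom
areas = [0.0, 400.0, 900.0, 1400.0, 1800.0]   # water surface area (m2) at each level
print(storage_from_hypsometry(levels, areas), "m3")
```

Evaluating this integral at the currently observed water level is what ties a remotely sensed surface area back to a storage volume.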

How to cite: wang, Y.: Monitoring water storage of on-farm reservoirs using remote sensing and ground observation, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15372, https://doi.org/10.5194/egusphere-egu23-15372, 2023.

Hydrology is still one of the most data-scarce natural sciences. The large number of variables to measure, their extreme spatiotemporal gradients, and the often harsh and hostile environmental conditions all contribute to this issue. The challenge is even more pronounced in remote and extreme environments such as the tropics and mountain regions, where the need for robust data is most acute.

Many new and emergent technologies can help with building more cost-effective, robust, and versatile hydrological monitoring systems. However, the speed at which these new technologies are being incorporated in commercially available systems is slow and dictated by commercial interests and bottlenecks.

An alternative solution is for scientists to build their own systems using off the shelf components. Open-source hardware and software, such as the Arduino and Raspberry Pi ecosystems, make this increasingly feasible. As a result, a plethora of global initiatives for open-source sensing and logging solutions have emerged.

But despite these new technologies, it remains a major challenge to build open-source solutions that equal the reliability and robustness of the high-end commercial systems that are available on the market. Sharing experiences, best practices, and evidence on the real-world performance of different designs may help with overcoming this bottleneck.

In this contribution, I summarize the experience gained from developing and operating over 300 open-source data loggers, built around the Riverlabs platform. This platform is mostly a compilation of existing open-source hardware and software components and solutions, which were refined further and tweaked for robustness and reliability in extreme environments. Our loggers have been installed in locations as diverse as Arctic Norway, the high Andes of Peru and Chile, the Nepalese and Indian Himalayas, the Somali desert, and the Malaysian rainforest, providing a wide range of real-world test-cases and performances.

How to cite: Buytaert, W.: Towards a robust, open-source logging platform for environmental monitoring in challenging environments: the Riverlabs toolbox, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15989, https://doi.org/10.5194/egusphere-egu23-15989, 2023.

EGU23-17527 | ECS | Posters on site | HS1.2.1

Design of an affordable and highly flexible IoT station for multiple gas concentration monitoring 

Francesco Renzi, Flavio Cammillozzi, Giancarlo Cecchini, Alessandro Filippi, and Riccardo Valentini

Air quality monitoring is a core topic of European environmental policies and worldwide. At the same time, technologies such as electrochemical and NDIR gas sensors have become affordable and easy to implement in a customized design. A highly flexible monitoring station has been designed and built to obtain a customizable and affordable device. It is composed of two boards: one in charge of connectivity and processing, while the other accepts up to 11 gas sensors. This number is achieved through three multiplexers, which spare input pins of the processor. At the moment, flexibility is achieved by using sensors with the same form factor, but adapters are under development to increase the adaptability of the system in both hardware and software. An Arduino MKR Zero runs the application, which can operate in three different modes: single measurement, time-driven, or position-driven. The last mode uses an optional on-board u-blox GNSS module that georeferences the measurements; it is mainly used when the measurement cell is mounted on a moving object, such as a drone. The system sends the collected data and receives commands using the MQTT protocol (HiveMQ broker) over an NB-IoT connection, and interacts with the user through an online dashboard created with ThingsBoard. The use of MQTT allows the data to be sent to multiple endpoints if they must also be provided to third parties. The data and some parameters are also saved on an SD card. The whole system is built on stand-alone boards for easy maintenance and to allow rapid changes in the technology used (a plug-and-play LoRaWAN module is under development). Being a multi-application platform, the price of the device is highly dependent on the chosen set of sensors and thus, in the end, on the application itself (e.g., air pollution, or gas emission in barns).
To sum up, the described device is a possible solution for an affordable gas concentration measurement system that, by combining software and hardware solutions, can be adapted to fit a large variety of use cases.
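
The control flow described above can be sketched in Python (the real firmware runs in C++ on the Arduino MKR Zero; function names, the 4-input multiplexer layout and record fields below are illustrative assumptions, not the authors' implementation):

```python
# Hypothetical sketch: addressing up to 11 sensors through three
# multiplexers and dispatching the three acquisition modes.

def mux_address(channel):
    """Map a logical sensor channel (0-10) to (multiplexer index, mux input)."""
    if not 0 <= channel < 11:
        raise ValueError("the station supports up to 11 sensors")
    return divmod(channel, 4)          # assumed: three muxes, four inputs each

def acquire(mode, read_sensor, now=None, position=None):
    """Return one measurement record for the requested mode."""
    record = {"values": [read_sensor(ch) for ch in range(11)]}
    if mode == "time":
        record["timestamp"] = now      # time-driven logging
    elif mode == "position":
        record["gnss"] = position      # from the optional U-blox GNSS module
    elif mode != "single":
        raise ValueError("unknown mode")
    return record

# position-driven example with a dummy sensor reader
record = acquire("position", lambda ch: 0.1 * ch, position=(45.07, 7.69))
```

A record like this would then be serialized and published over MQTT to the broker and on to the dashboard.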

How to cite: Renzi, F., Cammillozzi, F., Cecchini, G., Filippi, A., and Valentini, R.: Design of an affordable and highly flexible IoT station for multiple gas concentration monitoring, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-17527, https://doi.org/10.5194/egusphere-egu23-17527, 2023.

GI2 – Data networks and analysis

EGU23-1562 | ECS | Posters virtual | GI2.1

A new finite-difference stress modeling method governed by elastic wave equations 

Zhuo Fan, Fei Cheng, and Jiangping Liu

Numerical stress or strain modeling has been a subject of focus in many fields, especially in assessing the stability of key engineering structures and in better understanding local or tectonic stress patterns and seismicity. Here we propose a new stress modeling method governed by elastic wave equations using a finite-difference scheme. Based on the modeling scheme of wave propagation, the proposed method is able to solve both the dynamic stress evolution and the static stress state of equilibrium by introducing an artificial damping factor on the particle velocity. We validate the proposed method on three geophysical benchmarks: (a) a layered earth model under gravitational load, (b) a rock mass model under nonuniform loads on its exterior boundaries and (c) a fault zone with strain localization driven by regional tectonic loading as measured by a GPS velocity field. Because the governing equations of the proposed method are wave equations instead of equilibrium equations, we are able to use perfectly matched layers as artificial boundary conditions for models in unbounded domains, which substantially improves their accuracy. Also, the proposed scheme maps the physical model onto simple computational grids and is therefore more memory efficient, since grid point positions need not be stored. Besides, the efficient parallel computing of the finite-difference method guarantees the proposed method's advantage in computational speed. As a minor modification to a wave modeling scheme, the proposed stress modeling method is not only accurate for geological models across different scales, but also physically reasonable and easy for geophysicists to implement.
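
The core idea, damping the particle velocity of a standard velocity-stress scheme until the wavefield relaxes to static equilibrium, can be illustrated with a 1D toy example (a gravity-loaded elastic column, free at the top and fixed at the base; all parameters are illustrative, not the authors' implementation):

```python
import math

# Staggered-grid velocity-stress finite differences with an artificial
# damping term on the particle velocity; the transient waves die out and
# the stress relaxes to the static lithostatic state s(z) = -rho*g*z.

rho, E, g, H, n = 2000.0, 1e8, 9.8, 100.0, 50
dz = H / n
c = math.sqrt(E / rho)            # elastic wave speed
dt = 0.5 * dz / c                 # CFL-stable time step
damp = 8.0                        # artificial damping factor (1/s)

v = [0.0] * (n + 1)               # particle velocity at grid nodes
s = [0.0] * n                     # stress at cell centres (tension positive)

for _ in range(8000):
    # free surface at node 0 (mirrored ghost stress); node n stays fixed
    v[0] += dt * ((2.0 * s[0] / dz) / rho + g - damp * v[0])
    for i in range(1, n):
        v[i] += dt * (((s[i] - s[i - 1]) / dz) / rho + g - damp * v[i])
    for i in range(n):
        s[i] += dt * E * (v[i + 1] - v[i]) / dz
```

At convergence the discrete stress matches the lithostatic profile at the cell centres and the residual velocity is negligible.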

How to cite: Fan, Z., Cheng, F., and Liu, J.: A new finite-difference stress modeling method governed by elastic wave equations, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1562, https://doi.org/10.5194/egusphere-egu23-1562, 2023.

EGU23-2228 | ECS | Posters on site | GI2.1

Non-destructive geophysical damage analysis of medieval plaster in the cloister of the St. Petri Cathedral Schleswig (Germany) 

Yunus Esel, Ercan Erkul, Detlef Schulte-Kortnack, Christian Leonhardt, Julika Heller, and Thomas Meier

Buildings that have existed for centuries undergo structural changes over time due to variations in use. In addition, many structures are severely damaged, for example by moisture intrusion. To determine the distribution of moisture within a structure, it is often examined pointwise by core sampling. In addition to such invasive methods, non-destructive methods may be applied to obtain three-dimensional indications of the moisture distribution within structures of interest.
The purpose of this paper is to show that non-destructive determination of moisture distribution is possible by using and combining geophysical measurement methods such as infrared thermography (IR), ultrasound (US) and ground penetrating radar (GPR). There are examples of combining these methods for non-destructive examination, but the approach is not yet commonly applied in the restoration and conservation of historic buildings.
We present results of geophysical investigations of medieval wall paintings in the cloister of the cathedral in Schleswig (Federal State Schleswig-Holstein, Northern Germany) in the framework of a project funded by the German Federal Foundation for the Environment (Deutsche Bundesstiftung Umwelt - DBU). In the cloister, large-scale alterations of the medieval red-line paintings occurred due to gypsum deposits and a shellac coating. In order to quantify the material properties of a vault section (yoke) in the cloister during the restoration ultrasound surface wave measurements, passive and active thermography and ground penetrating radar measurements were carried out.
Repeating measurements at intervals of several months made it possible to evaluate the effectiveness of the test treatments by different solvents to remove the shellac as well as the gypsum deposits. In addition, our results from the passive thermography measurements show that in one section a defect in the horizontal barrier could be responsible for moisture ingress and associated damage. The radargrams recorded in this area confirm that a significant change in reflection amplitudes is present in the areas of increased moisture.

How to cite: Esel, Y., Erkul, E., Schulte-Kortnack, D., Leonhardt, C., Heller, J., and Meier, T.: Non-destructive geophysical damage analysis of medieval plaster in the cloister of the St. Petri Cathedral Schleswig (Germany), EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2228, https://doi.org/10.5194/egusphere-egu23-2228, 2023.

EGU23-2347 | ECS | Posters on site | GI2.1

Non-destructive testing methods and numerical development for enhancing airfield pavement management 

Konstantinos Gkyrtis, Christina Plati, and Andreas Loizos

Pavements are an essential component of airport facilities. Airport infrastructures serve to safely transport people and goods on a day-to-day basis. They promote economic development, both regionally and internationally, by also boosting tourist flows. In times of crisis, they can be used for societal emergencies, such as managing migration flows. Therefore, airports need pavements in good physical condition to ensure uninterrupted operations. However, interventions on airfield pavements are costly and labor intensive. Aspects of pavement structural performance related to bearing capacity and damage potential remain of paramount importance as the service life of a pavement extends beyond its design life. Therefore, structural condition evaluation is required to ensure the long-term bearing capacity of the pavement. 

The design and evaluation of flexible airfield pavements are generally based on the Multi-Layered Elastic Theory (MLET) in accordance with Federal Aviation Administration (FAA) principles. The most informative tool for structural evaluation is the Falling Weight Deflectometer (FWD), which senses pavement surfaces using geophones that record load-induced deflections at various locations. Additional geophysical inspection data from Ground Penetrating Radar (GPR) are processed to estimate the stratigraphy of the pavement. The integration of the above data provides an estimate of the pavement's performance and potential for damage. However, GPR is not always readily applicable.
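
The back-calculation idea behind FWD testing can be illustrated with the simplest possible case, a one-layer elastic half-space (Boussinesq), rather than the full MLET analysis used in the study; the load, plate radius and deflection values below are illustrative assumptions:

```python
# One-layer back-calculation sketch: composite surface modulus from the
# centre deflection of a uniform, flexible circular load on a half-space.

def halfspace_modulus(q, a, w0, nu=0.35):
    """Elastic modulus from centre deflection w0 (m) under a flexible
    circular load of contact pressure q (Pa) and radius a (m)."""
    return 2.0 * q * a * (1.0 - nu ** 2) / w0

# FWD-style drop: 707 kPa on a 150 mm-radius plate, 0.50 mm centre deflection
E_eff = halfspace_modulus(707e3, 0.15, 0.50e-3)
```

Real FWD interpretation fits the whole deflection basin recorded by all geophones against a multi-layered model; this one-number version only conveys the principle.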

In addition, the most important concern in pavement evaluation is the mechanical characterization of pavement materials. At the top of pavement structures, asphalt mixtures behave as a function of temperature and loading frequency. This viscoelastic behavior deviates from MLET and this issue needs further investigation. Therefore, this study integrates measured NDT data and sample data from cores taken in-situ. The pavement under study is an existing asphalt pavement of a runway at a regional airport in Southern Europe. A comparative evaluation of the strain state within the pavement body is performed both at critical locations and at the pavement surface, taking into account elastic and viscoelastic behaviors. Strains are an important input to models of long-term pavement performance, which has a critical influence on aircraft maneuverability. In turn, the significant discrepancies found highlight the need for more mechanistic considerations in predicting the damage and stress potential of airfield pavements so that maintenance and/or rehabilitation needs can be better managed and planned.

Overall, this study highlights the sensing capabilities of NDT data towards a structural health monitoring of airfield pavements. Ground-truth data from limited destructive testing enrich pavement evaluation processes and enhance conventional FAA evaluation procedures. The study proposes a numerical development for accurate field inspections and improved monitoring protocols for the benefit of airfield pavement management and rehabilitation planning. 

How to cite: Gkyrtis, K., Plati, C., and Loizos, A.: Non-destructive testing methods and numerical development for enhancing airfield pavement management, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2347, https://doi.org/10.5194/egusphere-egu23-2347, 2023.

The Laacher See Event- (LSE-) volcanism isochrone of 12.850 yrs BP (Bujatti-Narbeshuber, 1997), proxy for P/H boundary KISS (Bujatti-Narbeshuber, 1996), was improved from Gerzensee varves to 13.034 cal yrs BP (Van Raden, 2019).

    This LSE date now separates the end-Pleistocene, first, mainly oceanic-water KISS from the second, Holocene-Younger Dryas Onset (YDO), continental-ice impact, as predicted by the KISS hypothesis, separating: "a continental Koefels-comet ice-impact, from the mainly oceanic KISS, at the Pleistocene/Holocene boundary, associated with global warming, dendro C14 spikes, faunal mass extinction..." (Bujatti-Narbeshuber, 1996; Max, 2022).

    Oceanic-water LSE-KISS (13.034 cal yrs BP, varves) of the end-Alleroed temperature maximum is separated by 157 yrs from continental-ice YDO-KISS (12.877 cal yrs BP, varve date). A larger gap of 184 yrs results taking the C14-dated YD-KISS (12.850 cal yrs BP), approaching the 200 yrs of earlier varve studies (Bujatti-Narbeshuber, 1997).

    The LSE-KISS varve date differs by 47 yrs from the geomagnetic Gothenburg Excursion Onset (GEO) isochrone of 13.081 cal yrs BP (Chen, 2020), suggesting a geomagnetic reversal, True Polar Wander (TPW), GEO-TPW-KISS from 2 Koefels-comet (Taurid) fragments. This considers end-paleolithic Magdalenian Impact Sequelae Symbolisations (MISS).

    Questioning P/H isostatic-unloading volcanism (Zielinski, 1996), LSE-KISS volcanism is from the Mid Atlantic Ridge & Mid Atlantic Plateau (MAR&MAP) impact (Bujatti-Narbeshuber, 1997, 2022), as further corroborated by Greenland (NGRIP) ice-core sulfate monitoring: from LSE-KISS volcanism (12.978 cal yrs BP) to the YDO (12.867 cal yrs BP), within 110 yrs, an unprecedented bipolar-volcanic-eruption quadruplet resulted (Lin, 2022).

    The first Taurid LSE-KISS (varve date: 13.034 cal yrs BP, GEO date: 13.084 cal yrs BP) into oceanic water is evident from two 700 km Mid Atlantic Ridge & Plateau Lowering Events (MARPLES) releasing two separate tsunamis (Bujatti-Narbeshuber, 2022): resulting in submarine explosive-magmatism silicates, seafloor carbonates, volcanic ash and sea water in a huge strato-mesospheric overheated steam plume moving eastward by eolian transport, descending in drowning rain-flood, largely contributing to the Eurasian loess sediment layer (Muck, 1976).

    This is stratigraphically verified, e.g., by the relative stratigraphic positions at Geldrop-Aalsterhut, the Netherlands, with bleached (!) Younger Coversand I (AMS 13.080-12.915 cal yrs BP) underlying the intercalated (!), charcoal-rich (AMS 12.785-12.650 cal yrs BP) Usselo Horizon (Andronikov, 2016). It corresponds to the US Black Mats stratigraphy from the second Taurid, continental-ice YD-KISS (12.850 cal yrs BP, C14) plus Carolina Bays (CB), with: 1. soft, white, loess sediment from the first oceanic LSE-KISS; 2. the YD-KISS proxies-stratum; 3. e.g. Carolina-Florida coast sand-disturbances within a 1.500 km radius of the continental-ice YD-KISS ice-ejecta impact-curtain of 500.000 CB (LIDAR); 4. Black Mats after YD-KISS.

    After visiting the Koefels crater, a "below continental-glacier-ice, circular geomagnetic-anomaly with paleoseismic Koefels-corridor of twelve Holocene rockfalls", Eugene Shoemaker (Vienna, May 5th 1997), when asked about the causation of the Carolina Bays, is quoted: "Eugene spoke of a late Pleistocene origin of the Bays and as glaciological features while I preferred the paleoseismic interpretation. I interpret them as paleoseismic impact-seismic liquefaction features. They … are the first evidence for a late Pleistocene impact event. Dated by me … 12.850 BP (1950) in calendar years." (Bujatti-Narbeshuber, NHM letter to John Grant III, Sept. 22nd 1997).

    Both P/H-impacts break&make, Pleistocene criticality&Holocene damped flow, through 700 km geomorphological threshold (GLOVES) submersion & through (GTT) water, CO2 Greenhouse-gas-production, beyond glaciation threshold for hot climate prediction.

How to cite: Bujatti-Narbeshuber, M.: Pleistocene/Holocene (P/H) boundary oceanic Koefels-comet Impact Series Scenario (KISS) of 12.850 yr BP Global-warming Threshold Triad (GTT)-Part III, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2869, https://doi.org/10.5194/egusphere-egu23-2869, 2023.

To evaluate the feasibility of CO2 sequestration offshore South Korea, we performed numerical modelling with an elastic velocity model. The CO2 storage candidate is a brine-saturated aquifer formation overlain by a basalt caprock in the Southern Continental Shelf of Korea. A basalt formation without joints and fractures can seal a storage volume, preventing leakage of the injected CO2. A preliminary two-dimensional seismic exploration estimated the storage potential at 42.07 to 143.79 Mt of CO2. The input model includes the P- and S-wave velocities and densities of the shallow sediment and basalt layers. To simulate CO2 injection, we assumed an area of CO2 plume in the interval beneath the basalt formation and artificially decreased the P- and S-wave velocities and density values there. The synthesized seismic records are comparable with the survey gathers in terms of direct arrivals and primary reflections. The ongoing work can be extended to a quantitative verification concerning several cases of varying velocities and densities.
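
The perturbation step described above can be sketched as follows; the reduction fractions are illustrative assumptions, since the abstract does not state the actual percentages:

```python
# Inside the assumed CO2 plume, reduce P/S velocities and density by
# small fractions; outside the plume the model is left untouched.

def apply_plume(model, plume_mask, dvp=0.10, dvs=0.05, drho=0.03):
    """Return (vp, vs, rho) copies with plume cells reduced by the fractions."""
    out = []
    for grid, frac in zip(model, (dvp, dvs, drho)):
        out.append([v * (1.0 - frac) if m else v
                    for v, m in zip(grid, plume_mask)])
    return tuple(out)

vp, vs, rho = [2000.0] * 6, [900.0] * 6, [2100.0] * 6
mask = [False, False, True, True, False, False]   # plume cells
vp2, vs2, rho2 = apply_plume((vp, vs, rho), mask)
```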

How to cite: Cheong, S., Kang, M., and Kim, K. J.: Numerical modelling of seismic field record with elastic velocity construction for CO2 sequestration in offshore, South Korea, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2980, https://doi.org/10.5194/egusphere-egu23-2980, 2023.

EGU23-4861 | Orals | GI2.1

Decay diagnosis of tree trunks using 3D point cloud and reverse time migration of GPR data 

Zhijie Chen, Hai Liu, Meng Xu, Yunpeng Yue, and Bin Zhang

Health monitoring and disease mitigation of trees are essential to ensure the sustainability of the wood industry, the safety of ecosystems, and the maintenance of climatic conditions. Several non-destructive testing methods have been applied to monitor and detect decay inside trunks. Among them, ground penetrating radar (GPR) has gained recognition due to its high efficiency and good resolution. However, due to the wide beam width of the antenna pattern and the complicated scattering caused by the trunk structure, the recorded GPR profile is far from the actual geometry of the tree trunk. Moreover, the irregular contour of the tree trunk makes traditional data processing algorithms difficult to apply. Therefore, an efficient, high-resolution migration algorithm, as well as a highly accurate survey-line positioning method for the curved contour of the trunk, should be developed.

In this paper, a combined approach is proposed to image the inner structures of irregularly shaped trunks. In the first step, the 3D contour of the targeted tree trunk is built up by a 3D point cloud technique via photographing the trunk from various angles. Subsequently, the 2D irregular contour of the trunk cross-section at the position of the GPR survey line is extracted by the Canny edge detection method to locate the accurate position of each GPR A-scan [1]. Thirdly, the raw GPR profile is pre-processed to suppress undesired noise and clutter. Then, a reverse time migration (RTM) algorithm based on the zero-time imaging condition is applied for image reconstruction using the extracted 2D contour [2]. Lastly, a denoising method based on total variation (TV) regularization is applied for artifact suppression in the reconstructed images [3].
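
The total-variation denoising step can be sketched in 1D with a smoothed-TV gradient descent (the ROF formulation cited above operates on 2D images, where the same idea applies per pixel; all parameters here are illustrative):

```python
import random

# Gradient descent on 0.5*sum((u-f)^2) + lam*sum(sqrt((u[i+1]-u[i])^2 + eps):
# the eps term smooths the non-differentiable TV penalty.

def tv_denoise(f, lam=1.0, step=0.02, iters=3000, eps=1e-2):
    u = list(f)
    n = len(u)
    for _ in range(iters):
        g = [u[i] - f[i] for i in range(n)]          # data-fidelity gradient
        for i in range(n - 1):
            d = u[i + 1] - u[i]
            t = lam * d / (d * d + eps) ** 0.5       # smoothed-TV gradient
            g[i] -= t
            g[i + 1] += t
        for i in range(n):
            u[i] -= step * g[i]
    return u

# demo: denoise a noisy step while keeping the jump itself
random.seed(0)
noisy = [(0.0 if i < 20 else 5.0) + random.gauss(0, 0.5) for i in range(40)]
smooth = tv_denoise(noisy)
```

TV regularization suppresses small oscillations (artifacts) while preserving sharp edges, which is why it suits migrated radar images.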

Numerical, laboratory and field experiments are carried out to validate the applicability of the proposed approach. Both numerical and laboratory experimental results show that RTM can yield more accurate and higher-resolution images of the inner structures of the tree cross-section than the BP algorithm. The proposed approach is further applied to a diseased camphor tree, and an elliptical decay defect is found in the migrated GPR image. The results are validated by a visual inspection after the tree trunk was sawed down.

Fig. 1 Field experiment. (a) Geometric reconstruction result using point cloud data, (b) migrated result by the RTM algorithm and (c) bottom view of the tree trunk after sawing down. The red and yellow ellipses indicate the cavity and the decay region in the trunk, respectively.

References:

[1] J. Canny, "A Computational Approach to Edge Detection," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. PAMI-8, no. 6, pp. 679-698, 1986, doi: 10.1109/TPAMI.1986.4767851.

[2] S. Chattopadhyay and G. A. McMechan, "Imaging conditions for prestack reverse-time migration," Geophysics, vol. 73, no. 3, pp. S81-S89, 2008, doi: 10.1190/1.2903822.

[3] L. I. Rudin, S. Osher, and E. Fatemi, "Nonlinear total variation based noise removal algorithms," Physica D, vol. 60, pp. 259-268, 1992, doi: 10.1016/0167-2789(92)90242-F.

How to cite: Chen, Z., Liu, H., Xu, M., Yue, Y., and Zhang, B.: Decay diagnosis of tree trunks using 3D point cloud and reverse time migration of GPR data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-4861, https://doi.org/10.5194/egusphere-egu23-4861, 2023.

EGU23-6795 | ECS | Orals | GI2.1

Relaxing requirements for spatio-temporal data fusion 

Harkaitz Goyena, Unai Pérez-Goya, Manuel Montesino-San Martín, Ana F. Militino, Peter M. Atkinson, and M. Dolores Ugarte

Satellite sensors must trade off revisit frequency against spatial resolution. This work presents a spatio-temporal image fusion method called Unpaired Spatio-Temporal Fusion of Image Patches (USTFIP). The method combines data from different multispectral sensors and creates images with the best of each satellite in terms of frequency and resolution. It generates synthetic images and selects optimal information from cloud-contaminated images, avoiding the need for cloud-free matching pairs of satellite images. Removing this restriction makes it easier to run the fusion algorithm even in the presence of clouds, which are frequent in time series of satellite images. The increasing demand for larger datasets makes computationally optimized methods necessary; the method is therefore programmed to run in parallel, reducing the run time with respect to other methods. USTFIP is tested in an experimental scenario with procedures similar to Fit-FC, STARFM and FSDAF. USTFIP proves the most robust, since its prediction accuracy degrades at a much lower rate as classical requirements become progressively difficult to meet.
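
USTFIP itself is more elaborate, but the classical building block that spatio-temporal fusion methods such as Fit-FC, STARFM and FSDAF share, temporal differencing with a cloud fallback, can be sketched as:

```python
# Predict the fine-resolution image at t2 from the fine image at t1 plus
# the coarse-resolution temporal change; cloudy pixels fall back to the
# t1 value. This is only the baseline idea, not the USTFIP algorithm.

def fuse(fine_t1, coarse_t1, coarse_t2, cloud_mask):
    """Per-pixel temporal differencing with a cloud fallback."""
    return [f1 if cloudy else f1 + (c2 - c1)
            for f1, c1, c2, cloudy in zip(fine_t1, coarse_t1,
                                          coarse_t2, cloud_mask)]

# two pixels: the second is cloud-contaminated at t2
pred = fuse([0.2, 0.3], [0.25, 0.35], [0.30, 0.45], [False, True])
```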

How to cite: Goyena, H., Pérez-Goya, U., Montesino-San Martín, M., F. Militino, A., Atkinson, P. M., and Ugarte, M. D.: Relaxing requirements for spatio-temporal data fusion, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6795, https://doi.org/10.5194/egusphere-egu23-6795, 2023.

Continual monitoring of tree roots, which is essential when considering tree health and safety, is possible using a digital model. Non-destructive techniques, for instance, laser scanning, acoustics, and Ground Penetrating Radar (GPR) have been used in the past to study both the external and internal physical dimensions of objects and structures [1], including trees [2,3]. Recent studies have shown that GPR is effective in mapping the root system's network in street trees [3]. Light Detection and Ranging (LiDAR) technology has also been employed in infrastructure management to generate 3D data and to detect surface displacements with millimeter accuracy [4]. However, scanning such structures using current state-of-the-art technologies can be expensive and time consuming. Further, continual monitoring of tree roots requires multiple visits to tree sites and, oftentimes, repeated excavations of soil.

This work proposes a Virtual Reality (VR) system using smartphone-based LiDAR and GPR data to capture ground surface and subsurface information to monitor the location of tree roots. Both datasets can be visualized in 3D in a VR environment for future assessment. LiDAR technology has recently become available in smartphones (for instance, the Apple iPhone 12+) and can scan a surface, e.g., the base of a tree, and export the data to a 3D modelling and visualization application. Using GPR data, we combined subsurface information on the location of tree roots with the LiDAR scan to provide a holistic digital model of the physical site. The system can provide a relatively low-cost environmental modelling and assessment solution, which will allow researchers and environmental professionals to a) create digital 3D snapshots of a physical site for later assessment, b) track positional data on existing tree roots, and c) inform the decision-making process regarding locations for potential future excavations.
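
A minimal sketch of how the two datasets might be merged into one local 3D frame: LiDAR points sit at or above the ground surface, while GPR root picks (converted from two-way travel time to depth) are placed below it. The function names and the soil wave speed are illustrative assumptions, not the authors' pipeline:

```python
# Hypothetical merge of surface (LiDAR) and subsurface (GPR) points.

def gpr_pick_to_point(x, y, twt_ns, ground_z=0.0, v_m_per_ns=0.1):
    """Convert a GPR pick (two-way travel time in ns) to a 3D point below
    ground; v_m_per_ns is an assumed soil wave speed (~0.1 m/ns, dry soil)."""
    depth = v_m_per_ns * twt_ns / 2.0          # one-way depth
    return (x, y, ground_z - depth)

scene = [(0.0, 0.0, 0.4), (0.1, 0.0, 0.8)]        # LiDAR surface points (m)
scene.append(gpr_pick_to_point(0.05, 0.0, 10.0))  # root pick at 10 ns
```

The combined point set can then be exported to the VR environment for inspection.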

Acknowledgments: Sincere thanks to the following for their support: Lord Faringdon Charitable Trust, The Schroder Foundation, Cazenove Charitable Trust, Ernest Cook Trust, Sir Henry Keswick, Ian Bond, P. F. Charitable Trust, Prospect Investment Management Limited, The Adrian Swire Charitable Trust, The John Swire 1989 Charitable Trust, The Sackler Trust, The Tanlaw Foundation, and The Wyfold Charitable Trust. The Authors would also like to thank Mr Dale Mortimer (representing the Ealing Council) and the Walpole Park for facilitating this research.

References

[1] Alani A. M. et al., Non-destructive assessment of a historic masonry arch bridge using ground penetrating radar and 3D laser scanner. IMEKO International Conference on Metrology for Archaeology and Cultural Heritage Lecce, Italy, October 23-25, 2017.

[2] Ježová, J., Mertens, L., Lambot, S., 2016. “Ground-penetrating radar for observing tree trunks and other cylindrical objects,” Construction and Building Materials (123), 214-225.

[3] Lantini, L., Alani, A. M., Giannakis, I., Benedetto, A. and Tosti, F., 2020. "Application of ground penetrating radar for mapping tree root system architecture and mass density of street trees," Advances in Transportation Studies (3), 51-62.

[4] Lee, J. et al., Long-term displacement measurement of bridges using a LiDAR system. Struct Control Health Monit. 2019; 26:e2428.

How to cite: Uzor, S., Lantini, L., and Tosti, F.: Low-cost assessment and visualization of tree roots using smartphone LiDAR, Ground-Penetrating Radar (GPR) data and virtual reality, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6908, https://doi.org/10.5194/egusphere-egu23-6908, 2023.

EGU23-8384 | ECS | Orals | GI2.1

A Study on the Effect of Target Orientation on the GPR Detection of Tree Roots Using a Deep Learning Approach 

Livia Lantini, Federica Massimi, Saeed Sotoudeh, Dale Mortimer, Francesco Benedetto, and Fabio Tosti

Monitoring and protection of natural resources have grown increasingly important in recent years, since the effect of emerging diseases has caused serious concern among environmentalists and communities. In this regard, tree roots are one of the most crucial and fragile plant organs, as well as one of the most difficult to assess [1].

Within this context, ground penetrating radar (GPR) applications have proven precise and effective for investigating and mapping tree roots [2]. Furthermore, in order to overcome limitations arising from natural soil heterogeneity, a recent study has proven the feasibility of deep learning image-based detection and classification methods applied to the GPR investigation of tree roots [3].

The present research proposes an analysis of the effect of root orientation on the GPR detection of tree root systems. To this end, a dedicated survey methodology was developed to compile a database of isolated roots. A set of GPR data was collected at different incidence angles with respect to each investigated root. The GPR signal is then processed in both the time and frequency domains to filter out noise-related information and obtain spectrograms (i.e., visual representations of a signal's frequency spectrum over time). Subsequently, an image-based deep learning framework is implemented, and its performance in recognising outputs at different incidence angles is compared to traditional machine learning classifiers. The preliminary results of this research demonstrate the potential of the proposed approach and pave the way for novel ways to enhance the interpretation of tree root systems.
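
The spectrogram step can be sketched with a short-time DFT; this is a generic illustration of the time-frequency transform, not the authors' processing chain, and the window and hop sizes are arbitrary:

```python
import cmath, math

# Short-time DFT spectrogram of a 1D trace (e.g. a GPR A-scan):
# Hann-windowed frames, one-sided magnitude spectra per frame.

def spectrogram(x, win=32, hop=16):
    hann = [0.5 - 0.5 * math.cos(2 * math.pi * i / (win - 1))
            for i in range(win)]
    frames = []
    for start in range(0, len(x) - win + 1, hop):
        seg = [x[start + i] * hann[i] for i in range(win)]
        row = []
        for k in range(win // 2 + 1):          # one-sided spectrum
            s = sum(seg[i] * cmath.exp(-2j * math.pi * k * i / win)
                    for i in range(win))
            row.append(abs(s))
        frames.append(row)
    return frames                              # frames x frequency bins

# demo: a pure tone at 1/8 cycles per sample concentrates in bin 4
tone = [math.sin(2 * math.pi * 0.125 * i) for i in range(128)]
spec = spectrogram(tone)
```

The resulting time-frequency matrix can be rendered as an image and fed to an image-based classifier.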

 

Acknowledgements

The Authors would like to express their sincere thanks and gratitude to the following trusts, charities, organisations and individuals for their generosity in supporting this project: Lord Faringdon Charitable Trust, The Schroder Foundation, Cazenove Charitable Trust, Ernest Cook Trust, Sir Henry Keswick, Ian Bond, P. F. Charitable Trust, Prospect Investment Management Limited, The Adrian Swire Charitable Trust, The John Swire 1989 Charitable Trust, The Sackler Trust, The Tanlaw Foundation, and The Wyfold Charitable Trust. The Authors would also like to thank the Ealing Council and the Walpole Park for facilitating this research.

 

References

[1] Innes, J. L., 1993. Forest health: its assessment and status. CAB International.

[2] Lantini, L., Tosti, F., Giannakis, I., Zou, L., Benedetto, A. and Alani, A. M., 2020. "An Enhanced Data Processing Framework for Mapping Tree Root Systems Using Ground Penetrating Radar," Remote Sensing 12(20), 3417.

[3] Lantini, L., Massimi, F., Tosti, F., Alani, A. M. and Benedetto, F. "A Deep Learning Approach for Tree Root Detection using GPR Spectrogram Imagery," 2022 45th International Conference on Telecommunications and Signal Processing (TSP), 2022, pp. 391-394.

How to cite: Lantini, L., Massimi, F., Sotoudeh, S., Mortimer, D., Benedetto, F., and Tosti, F.: A Study on the Effect of Target Orientation on the GPR Detection of Tree Roots Using a Deep Learning Approach, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8384, https://doi.org/10.5194/egusphere-egu23-8384, 2023.

EGU23-8667 | ECS | Posters on site | GI2.1

An Investigation into the Acquisition Parameters for GB-SAR Assessment of Bridge Structural Components 

Saeed Sotoudeh, Livia Lantini, Kevin Jagadissen Munisami, Amir M. Alani, and Fabio Tosti

Structural health monitoring (SHM) is a necessary measure to keep bridge infrastructure safe. To this purpose, remote sensing has proven effective in acquiring data with high accuracy in a relatively short time. Amongst the available methods, the ground-based synthetic aperture radar (GB-SAR) can detect deflections as small as 0.01 mm generated by moving vehicles or by the environmental excitation of bridges [1]. Interferometric radars are also capable of data collection regardless of weather and day or night conditions [2]. However, the available literature lacks studies and methods focusing on the actual capability of the GB-SAR to target specific structural elements and components of a bridge, which makes it difficult to associate the measured deflection with the actual bridge section. Depending on the antenna type, the footprint of the radar signal widens with distance, encompassing more elements, and the presence of multiple targets in the same resolution cell adds uncertainty to the acquired data [3]. To this effect, the purpose of the present research is to introduce a methodology for pinpointing targets using GB-SAR and to aid data interpretation. An experimental procedure is devised to control the acquisition parameters and targets, and to analyse the returned outputs under more clinical conditions. The outcome of this research will add to the existing literature in terms of collecting data with enhanced precision and certainty.
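
The resolution-cell issue can be quantified with back-of-envelope formulas: range resolution is set by the sweep bandwidth, while the real-aperture azimuth footprint widens linearly with range, so distant bridge elements merge into one cell. The bandwidth and beamwidth values below are illustrative assumptions, not those of a specific radar:

```python
import math

C = 3e8  # speed of light, m/s

def range_resolution(bandwidth_hz):
    """Slant-range cell size for a swept-frequency radar."""
    return C / (2.0 * bandwidth_hz)

def azimuth_footprint(range_m, beamwidth_deg):
    """Cross-range width illuminated by a real-aperture antenna beam."""
    return 2.0 * range_m * math.tan(math.radians(beamwidth_deg) / 2.0)

dr = range_resolution(200e6)          # 200 MHz sweep -> 0.75 m range cells
w50 = azimuth_footprint(50.0, 20.0)   # ~17.6 m wide footprint at 50 m range
```

At 50 m every element within a 17.6 m wide, 0.75 m deep cell contributes to the same interferometric phase, which is exactly the ambiguity the experimental procedure aims to control.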

 

Keywords

Structural Health Monitoring (SHM), GB-SAR, Remote Sensing, Interferometric Radar

 

Acknowledgements

This research was funded by the Vice-Chancellor’s PhD Scholarship at the University of West London.

 

References

[1] Benedettini, F., & Gentile, C. (2011). Operational modal testing and FE model tuning of a cable-stayed bridge. Engineering Structures, 33(6), 2063-2073.

[2] Alba, M., Bernardini, G., Giussani, A., Ricci, P. P., Roncoroni, F., Scaioni, M., Valgoi, P., & Zhang, K. (2008). Measurement of dam deformations by terrestrial interferometric techniques. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 37(B1), 133-139.

[3] Michel, C., & Keller, S. (2021). Advancing ground-based radar processing for bridge infrastructure monitoring. Sensors, 21(6), 2172.

How to cite: Sotoudeh, S., Lantini, L., Munisami, K. J., Alani, A. M., and Tosti, F.: An Investigation into the Acquisition Parameters for GB-SAR Assessment of Bridge Structural Components, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8667, https://doi.org/10.5194/egusphere-egu23-8667, 2023.

EGU23-8762 | ECS | Orals | GI2.1

Joint Interpretation of Multi-Frequency Ground Penetrating Radar and Ultrasound Data for Mapping Cracks and Cavities in Tree Trunks 

Saeed Parnow, Livia Lantini, Stephen Uzor, Amir M. Alani, and Fabio Tosti

As the Earth's lungs, trees are a natural resource that provides, amongst others, food, lumber, and oxygen. Therefore, monitoring these wooden structures with non-destructive testing (NDT) techniques such as ground penetrating radar (GPR) and ultrasound can provide valuable information about inner flaws and decay, which is an essential step for tree conservation.

In recent years, GPR and ultrasound have been used to delineate the interior architecture of tree trunks [1-3]. However, more research is required to improve the results and, consequently, obtain a more reliable interpretation. Due to limitations in penetration depth and signal-to-noise ratio [4], these approaches have a limited capacity for resolving features. The use of gain functions and of higher frequencies to compensate for wave attenuation may exaggerate events and reduce penetration, respectively.

In this context, an integration of multi-frequency GPR and ultrasound data can be used to address this issue. Data were collected on a tree trunk log at the Faringdon Centre for Non-Destructive Testing and Remote Sensing using two high-frequency GPR systems (2 GHz and 4 GHz central frequencies) and ultrasound testing equipment (supporting a wide range of transducers from 24 kHz up to 500 kHz). Internal features of interest, in terms of extended perimetric air gaps at the bark-wood interface, natural cracks and small artificial cavities, were investigated through electromagnetic and mechanical waves. After compilation of the data, a joint interpretation strategy for data analysis was developed. The processed data were mapped against the cut sections of the tree for validation purposes.
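
A rough feel for what the two GPR centre frequencies can resolve comes from the quarter-wavelength rule of thumb; the relative permittivity used here for moist wood is an assumed illustrative figure, not a measured one:

```python
# Quarter-wavelength vertical-resolution estimate in a dielectric medium.

C = 3e8  # speed of light in vacuum, m/s

def quarter_wavelength(freq_hz, eps_r):
    v = C / eps_r ** 0.5              # wave speed in the medium
    return v / freq_hz / 4.0

res_2ghz = quarter_wavelength(2e9, 10.0)   # ~1.2 cm vertical resolution
res_4ghz = quarter_wavelength(4e9, 10.0)   # ~0.6 cm, at shallower reach
```

Doubling the frequency halves the resolvable feature size, but the stronger attenuation at 4 GHz is what motivates pairing it with the deeper-reaching 2 GHz system and the ultrasound data.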

Although the study of standing tree trunks would be more challenging, the findings of this research may be applied to wood timbers and pave the way for future research on living tree trunks.

 

Acknowledgements

This research was funded by the Vice-Chancellor’s PhD Scholarship at the University of West London.

 

References

[1] Arciniegas, A., et al., Literature review of acoustic and ultrasonic tomography in standing trees. Trees, 2014. 28(6): p. 1559-1567. 

[2] Giannakis, I., et al., Health monitoring of tree trunks using ground penetrating radar. IEEE Transactions on Geoscience and Remote Sensing, 2019. 57(10): p. 8317-8326.

[3] Espinosa, L., et al., Ultrasound computed tomography on standing trees: accounting for wood anisotropy permits a more accurate detection of defects. Annals of Forest Science, 2020. 77(3): p. 1-13.

[4] Tosti, F., et al., The use of GPR and microwave tomography for the assessment of the internal structure of hollow trees. IEEE Transactions on Geoscience and Remote Sensing, 2021. 60: p. 1-14.

 

How to cite: Parnow, S., Lantini, L., Uzor, S., Alani, A. M., and Tosti, F.: Joint Interpretation of Multi-Frequency Ground Penetrating Radar and Ultrasound Data for Mapping Cracks and Cavities in Tree Trunks, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8762, https://doi.org/10.5194/egusphere-egu23-8762, 2023.

EGU23-10874 | ECS | Orals | GI2.1

Ground subsidence risk mapping and assessment along Shanghai metro lines by PS-InSAR and LightGBM 

Long Chai, Xiongyao Xie, Biao Zhou, and Li Zeng

Ground subsidence is a typical geological hazard in urban areas, endangering the safety of infrastructure such as subways. In this study, the ground subsidence risk along Shanghai metro lines was mapped and assessed. First, PS-InSAR was used for the ground subsidence survey, and subsidence intensity was divided into five classes according to subsidence velocity. Ten subsidence causal factors were collected, and the frequency ratio method was applied to analyze the correlation between subsidence and these factors. A LightGBM model was then used to generate a ground subsidence susceptibility map, which was assessed with the receiver operating characteristic curve and the area under the curve (AUC); the AUC of 0.904 indicates excellent model performance. Finally, a risk matrix was introduced to combine the intensity and susceptibility of ground subsidence, and the risk was mapped and classified into five levels: R1 (very low), R2 (low), R3 (medium), R4 (high), and R5 (very high). The results showed that subway ground subsidence risk exhibited a regional character: metro lines located in areas with higher ground subsidence risk also had higher risk levels themselves. Moreover, the statistics of subway ground subsidence risk levels showed that subway stations were safer than the sections between them.
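Two of the steps above lend themselves to a compact illustration: the frequency ratio used to correlate subsidence with its causal factors, and a risk matrix combining intensity and susceptibility classes. The sketch below is a minimal, hypothetical version; the counts and the matrix rule are invented for illustration and are not the authors' values.

```python
# Frequency ratio (FR) for one causal-factor class, and a simple risk
# matrix combining 1-5 intensity and susceptibility classes into R1-R5.
# All numbers and the matrix rule are illustrative assumptions.

def frequency_ratio(subs_in_class, subs_total, pixels_in_class, pixels_total):
    """FR > 1 indicates the factor class is positively correlated
    with the observed subsidence."""
    return (subs_in_class / subs_total) / (pixels_in_class / pixels_total)

def risk_level(intensity, susceptibility):
    """Map intensity and susceptibility classes (1-5 each) to a risk
    level R1 (very low) .. R5 (very high) via a ceiling-average rule."""
    return f"R{(intensity + susceptibility + 1) // 2}"

# Example: a class covering 10% of the area but hosting 30% of the
# subsidence points is strongly correlated with subsidence (FR = 3).
fr = frequency_ratio(30, 100, 1000, 10000)
```

A real risk matrix would be calibrated against the observed subsidence velocity classes rather than a fixed arithmetic rule.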

How to cite: Chai, L., Xie, X., Zhou, B., and Zeng, L.: Ground subsidence risk mapping and assessment along Shanghai metro lines by PS-InSAR and LightGBM, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10874, https://doi.org/10.5194/egusphere-egu23-10874, 2023.

EGU23-12226 | ECS | Orals | GI2.1

Evaluation of Spectral Mixing Techniques for Geological Mixture in a Laboratory Setup: Insights on the nature of mixing 

Maitreya Mohan Sahoo, Kalimuthu Rajendran, Arun Pattathal Vijayakumar, Shibu K. Mathew, and Alok Porwal

Geological mixtures having endmembers mixed at a fine scale pose a challenge to estimating their fractional abundances. Light incident on these mixtures interacts at both multilayered and surface levels, resulting in volumetric and albedo scattering, respectively. Accounting for these effects calls for a nonlinear spectral mixing model rather than conventional linear mixing. In this study, we evaluate the performance of linear and various nonlinear spectral mixing models for an intimately mixed geological mixture, namely a banded hematite quartzite (BHQ) sample. The BHQ sample, with distinct endmembers of hematite and quartzite, facilitated our study of the behavior of light in two-component nonlinear mixtures. In a laboratory-based experimental setup, we used a full-range spectroradiometer covering the visible to shortwave infrared regions (350 to 2500 nm) to acquire a hyperspectral image of the BHQ sample. This was followed by the identification of nonlinearly mixed regions and the inference of changes in their spectral features. The nonlinearity induced in these regions was attributed to two main causes: (1) the fine scale of spectral mixing and (2) the spectroradiometer sensor's limited ability to spatially distinguish between focused and neighboring points, thereby producing a point spread effect. We observed the effects of nonlinear spectral mixing for our sample by changing the sensor's height from 1 mm to 5 mm, simulating fine- and coarse-resolution images, respectively. The spectral mixing was modeled using the mapped ground-truth fractional abundances and library endmember spectra, applying linear mixing and the established nonlinear techniques of the generalized bilinear model (GBM), the polynomial post-nonlinear model (PPNM) and kernel-based support vector machines (k-SVMs). The reconstruction error metric, together with statistical tests and the nonlinearity parameters of these models, revealed the nonlinearity effect in the image pixels. It was further observed that the associated nonlinearity increases from fine- to coarse-resolution images. The minimum image reconstruction error was observed for the polynomial post-nonlinear model, with a single nonlinearity parameter and an average reconstruction error (ARE) of 0.05. Our study provides insights into the nature of nonlinear mixing in relation to endmember composition and particle sizes.
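As a point of reference for the models compared above, the linear mixing model and the average reconstruction error (ARE) can be sketched in a few lines. The endmember spectra and abundances below are invented toy values, not measurements from the BHQ sample.

```python
# Linear mixing model (LMM) baseline and reconstruction error, the
# quantities against which the nonlinear models (GBM, PPNM, k-SVM)
# are compared. Spectra and abundances are toy 3-band values.

def mix_linear(endmembers, abundances):
    """Reconstruct a pixel spectrum as the abundance-weighted sum of
    endmember spectra (equal-length lists of reflectances)."""
    n_bands = len(endmembers[0])
    return [sum(a * e[b] for a, e in zip(abundances, endmembers))
            for b in range(n_bands)]

def are(observed, reconstructed):
    """Average reconstruction error: root-mean-square residual per band."""
    n = len(observed)
    return (sum((o - r) ** 2 for o, r in zip(observed, reconstructed)) / n) ** 0.5

hematite = [0.2, 0.3, 0.5]   # toy endmember spectrum
quartz   = [0.8, 0.7, 0.6]   # toy endmember spectrum
pixel = mix_linear([hematite, quartz], [0.4, 0.6])
```

Nonlinear models such as the PPNM add low-order cross terms on top of this linear reconstruction, governed by the nonlinearity parameter mentioned above.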

How to cite: Sahoo, M. M., Rajendran, K., Pattathal Vijayakumar, A., Mathew, S. K., and Porwal, A.: Evaluation of Spectral Mixing Techniques for Geological Mixture in a Laboratory Setup: Insights on the nature of mixing, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12226, https://doi.org/10.5194/egusphere-egu23-12226, 2023.

EGU23-13163 | ECS | Orals | GI2.1

High-resolution grain-size analysis and non-destructive hyperspectral imaging of sediments from the Gaoping canyon levee to establish past typhoon and monsoon activities affecting Taiwan during the late Holocene 

Joffrey Bertaz, Kévin Jacq, Christophe Colin, Zhifei Liu, Maxime debret, Hongchao Zhao, and Andrew Tien-Shun Lin

Non-destructive, high-resolution hyperspectral analyses are widely used in planetary and environmental sciences and in mining exploration. In recent years, this scanning method has been applied to lacustrine sediment cores as a complement to XRF core scanning; however, it has rarely been applied to marine sediments. The Gaoping canyon, located south of Taiwan, is connected to the Gaoping River and is a very active canyon with a large sediment transfer capacity. In particular, about four typhoon-driven hyperpycnal flows per year have been recorded by mooring systems in recent years. Studying how their frequency and intensity responded to past climate and environmental changes is key to understanding future tropical storm frequency and related climate variability. Core MD18-3574 was collected on the western levee of the Gaoping canyon and displays numerous fine laminations (millimetric to centimetric) recording the deposition of gravity flows occurring in the canyon and on the slope. In this study, we combined non-destructive analyses, such as XRF core scanning and hyperspectral imaging, with high-resolution grain-size and XRD bulk mineralogy analyses to understand the sedimentological and geochemical variations at the scale of the laminae. Core MD18-3574 sediments consist mainly of fine silt, presenting an alternation of fine-grained and coarse-grained laminations. The average mean grain size is 13.4 µm, ranging from 9 to 20.5 µm. Thick, coarser-grained laminations show the grain-size distributions and asymmetric sorting typical of turbidite sequences. Grain size and bulk mineralogy display strong visual and statistical correlation with XRF (Fe/Ca, Si/Al) and hyperspectral proxies (sediment darkness (Rmean), Clay_R2200). Principal component analysis (PCA) demonstrates that darker laminae are composed of coarser sediments with high Si/Al (quartz- and feldspar-rich) and Clay_R2200 values and low Fe/Ca (calcite-rich), resulting from gravity flows. Conversely, lighter laminae consist of finer sediments with low Si/Al (muscovite- and illite-rich) and Clay_R2200 and high Fe/Ca, resulting from hemipelagic deposition. This interpretation was then extended to the core scale to identify gravity-flow deposit layers. The frequency of moderate-intensity tropical storms has decreased over the last 4 ka in response to the decrease in sea surface temperature (SST) and the enhanced East Asian winter monsoon since the middle Holocene. Tropical storm intensity increased after 2 ka during La Niña-like periods, indicating that the surge of super-typhoons hitting Taiwan could be triggered by the El Niño-Southern Oscillation (ENSO) state and variability. We therefore conclude that tropical storm activity is controlled by SST, the monsoon system and ENSO conditions. This study brings new insights into the prediction of the impacts of ongoing climate change on storm activity in the western Pacific Ocean.

How to cite: Bertaz, J., Jacq, K., Colin, C., Liu, Z., debret, M., Zhao, H., and Lin, A. T.-S.: High-resolution grain-size analysis and non-destructive hyperspectral imaging of sediments from the Gaoping canyon levee to establish past typhoon and monsoon activities affecting Taiwan during the late Holocene, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13163, https://doi.org/10.5194/egusphere-egu23-13163, 2023.

EGU23-13329 | ECS | Orals | GI2.1

Combined use of NDT methods for steel rebar corrosion monitoring 

Giacomo Fornasari, Federica Zanotto, Andrea Balbo, Vincenzo Grassi, and Enzo Rizzo

This paper describes laboratory tests performed with NDT geophysical methods, namely Ground Penetrating Radar (GPR), Self Potential (SP) and Direct Current (DC) resistivity, in order to monitor the corrosion of a rebar embedded in concrete. While GPR is a common geophysical method for reinforced concrete structures, the SP and DC techniques are not widely used. Rebar corrosion is one of the main causes of deterioration of engineering reinforced structures, and this degradation phenomenon reduces their service life and durability. Non-destructive testing and evaluation of rebar corrosion is therefore a major issue for predicting the service life of reinforced concrete structures.

Several new experiments were performed at the Applied Geophysics laboratory of the University of Ferrara, building on previous tests (Fornasari et al., 2022) in which two reinforced concrete samples of about 50 cm x 30 cm, each with a central ribbed steel rebar 10 mm in diameter and 35 cm long, were cast and partially immersed in plastic boxes containing salty and distilled water. In this experiment, a new protocol was applied, in which an epoxy resin was used to confine the corrosion to the exposed part of the rebar. The steel rebar was partially coated with a waterproof resin, leaving only the central 8 cm uncovered. The same waterproof epoxy resin was applied to part of the concrete sample, so that chloride diffusion occurred only across a free zone of about 10 cm x 8 cm directly below the exposed rebar.

The experiments were carried out on two identically constructed reinforced concrete samples, one exposed to distilled water (sample "A") and the other to salty water containing chlorides (sample "B"). Both samples were immersed to a depth of only 1 cm from the lower surface. Sample B was placed in a plastic box with salty water of increasing NaCl concentration: an initial concentration of 0.1% was adopted for 7 days, then increased to 1% and finally to 3.5% for a further 7 days. The experiment was set up in two phases. In the first phase, we monitored the "natural" corrosion that occurred on sample B due to the diffusion of chlorides towards the steel rebar, comparing the data obtained with those of sample A exposed to distilled water. In the second phase, accelerated corrosion was applied to sample B in order to intensify the corrosion process. The accelerated corrosion was designed to reach different theoretical levels of mass loss in the steel rebar, namely 2%, 5%, 10% and 20%. During the experiments, a 2 GHz C-Thrue GPR antenna, a multivoltmeter with a non-polarizing calomel reference electrode for SP, and an ABEM Terrameter LS for resistivity data were used to monitor the rebar corrosion. The collected data were jointly interpreted to track the evolution of the corrosion phenomenon on the reinforcement steel rebar and to provide a quantitative analysis of the process.
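Theoretical mass-loss targets in accelerated corrosion tests are commonly designed with Faraday's law, which links the impressed current and its duration to the dissolved metal mass. The sketch below illustrates that relation under the assumption of 100% current efficiency; the current and mass values are illustrative, not the settings used in this study.

```python
# Faraday's law sizing of an accelerated corrosion test: how much steel
# mass a given impressed current removes, and how long a current must
# run to reach a target mass loss. Assumes 100% current efficiency.

F = 96485.0    # Faraday constant, C/mol
M_FE = 55.85   # molar mass of iron, g/mol
Z = 2          # electrons exchanged (Fe -> Fe2+)

def mass_loss_g(current_a, time_s):
    """Theoretical steel mass loss (g) for a constant impressed current."""
    return current_a * time_s * M_FE / (Z * F)

def time_for_loss(target_g, current_a):
    """Duration (s) needed to reach a target mass loss at constant current."""
    return target_g * Z * F / (M_FE * current_a)
```

In practice the target is expressed as a percentage of the rebar's initial mass (the 2%, 5%, 10% and 20% levels above), converted to grams before applying the relation.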

 

How to cite: Fornasari, G., Zanotto, F., Balbo, A., Grassi, V., and Rizzo, E.: Combined use of NDT methods for steel rebar corrosion monitoring, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13329, https://doi.org/10.5194/egusphere-egu23-13329, 2023.

EGU23-13720 | ECS | Posters on site | GI2.1

A fully customizable data management system for Built Cultural Heritage surveys through NDT 

Irene Centauro, Teresa Salvatici, Sara Calandra, and Carlo Alberto Garzonio

The diagnosis of Built Cultural Heritage using non-invasive methods is useful for deepening the understanding of building characteristics, assessing the state of conservation of materials, and monitoring the effectiveness of restoration interventions over time.

Ultrasonic and sonic tests are Non-Destructive Techniques widely used to evaluate the consistency of historic masonry and stone elements and to identify internal defects on site, such as voids, detachments and fractures. Besides being suitable for Cultural Heritage because they are non-invasive, these tests provide a fundamental preliminary screening that helps target further analysis.

Ultrasonic and sonic velocity tests performed on monuments produce a large amount of heterogeneous information from many surveys. It is therefore important to organize the data collected during both the documentation and the diagnostic phase, making them easily accessible and meaningful for analysis and monitoring. In addition, the investigation set-up should follow a standard methodology that is repeatable over time, suitable for different types of artifacts, and prepared for comparison with other techniques.

An integrated data management system is also useful to support the decision-making processes behind maintenance actions.

This work proposes the development of a complete IT management solution for ultrasonic and sonic measurements of different types of masonry and stone artifacts. The system consists of a browser-based collaboration and document management platform, a mobile/desktop application for data entry, and a data visualization and reporting tool. This set of tools enables the complete processing of data, from the on-site survey to analysis and visualization.

The proposed methodology standardizes the data entry workflow and is scalable, so it can be adapted to different types of masonry and artifacts. Moreover, the system provides real-time verification of data, optimizes survey and analysis times, and reduces errors. The platform can be integrated with machine learning models, useful for gaining insight from the data.

This solution, aimed at improving the approach to the diagnostics of Cultural Heritage, has been successfully applied by the LAM Laboratory of the Department of Earth Sciences (University of Florence) to several case studies (e.g., ashlar, frescoed walls, plastered masonries, stone columns, coats-of-arms, etc.) belonging to many important monuments.

How to cite: Centauro, I., Salvatici, T., Calandra, S., and Garzonio, C. A.: A fully customizable data management system for Built Cultural Heritage surveys through NDT, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13720, https://doi.org/10.5194/egusphere-egu23-13720, 2023.

EGU23-13934 | Orals | GI2.1

Pavements Layered Media Characterizations using deep learning-based GPR full-wave inversion 

Li Zeng, Biao Zhou, Xiongyao Xie, and Sébastien Lambot

The accuracy with which the subsurface electric properties of pavements can be estimated from ground-penetrating radar (GPR) signals using inverse modeling depends on the appropriateness of the forward model describing the GPR-subsurface system. In this presentation, we improve on the recently developed approach of Lambot et al., whose success relies on a stepped-frequency continuous-wave (SFCW) radar combined with an off-ground monostatic transverse electromagnetic horn antenna. A deep-learning-based method was adopted to train an intelligent model of the waveform of the Green's functions. The method was applied and validated under laboratory conditions on a tank filled with two-layered sand subject to different water contents. The results showed agreement between the Green's functions predicted by the deep-learning model and the measured ones. Model inversions for the dielectric permittivity and the antenna height further demonstrated the performance of the presented method.

How to cite: Zeng, L., Zhou, B., Xie, X., and Lambot, S.: Pavements Layered Media Characterizations using deep learning-based GPR full-wave inversion, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13934, https://doi.org/10.5194/egusphere-egu23-13934, 2023.

EGU23-14658 | Orals | GI2.1

Influence of tectonic deformation on the mechanical properties of calcareous rocks: drawbacks of the non-destructive techniques  

Elisa Mammoliti, Veronica Gironelli, Danica Jablonska, Stefano Mazzoli, Antonio Ferretti, Michele Morici, and Mirko Francioni

Discontinuity surfaces are well known to influence the mechanical behaviour of rocks under compression. Non-destructive techniques, such as ultrasonic pulse velocity and sclerometers, are increasingly used to estimate the uniaxial compressive strength of rocks. In this study, several core samples derived from the doubling works of the railway network near Genga (Marche Region, Central Italy) were analysed to assess the influence of the structural geological context (proximity to folds, faults, etc.) and tectonic deformation on rock strength. Tests were conducted on rock specimens through: (i) conventional uniaxial compression experiments, (ii) non-destructive rebound-based methods such as the Schmidt hammer and Equotip, and (iii) ultrasound. In this way, it was possible to make a critical analysis of the use of these techniques in estimating uniaxial compressive strength (also taking into account information about discontinuity type, orientation and the nature of the filling). Finally, a petrographic analysis using an optical microscope was undertaken to support the observations derived from the analysis at the sample scale. The results indicate two main factors influencing strength at the specimen scale. The first and most decisive factor is the presence of natural pre-existing fractures. The second is the degree of tectonic deformation: the greater the deformation, the lower the strength. Furthermore, the combined use of uniaxial compression experiments, non-destructive rebound-based methods and ultrasound made it possible to highlight the advantages and limitations of each technique and to propose new guidelines for their use.

How to cite: Mammoliti, E., Gironelli, V., Jablonska, D., Mazzoli, S., Ferretti, A., Morici, M., and Francioni, M.: Influence of tectonic deformation on the mechanical properties of calcareous rocks: drawbacks of the non-destructive techniques , EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14658, https://doi.org/10.5194/egusphere-egu23-14658, 2023.

EGU23-14846 | ECS | Orals | GI2.1

Combined NDT data for road management through BIM models 

Luca Bertolini, Fabrizio D'Amico, Antonio Napolitano, Jhon Romer Diezmos Manalo, and Luca Bianchini Ciampoli

One of the main priorities for road administrations and stakeholders is the management and monitoring of critical infrastructures, especially transportation infrastructures. In this context, Building Information Modeling (BIM) can be one of the most effective methodologies for optimizing the management process. In Italy, several laws and regulations have been issued making the use of BIM procedures mandatory for the design of new infrastructures and emphasizing its role in the management of existing civil works [1, 2].

Monitoring operations on transportation infrastructures are generally conducted through on-site surveys. Non-Destructive Testing methods (e.g., GPR, LiDAR, laser profilometer, InSAR) have been used to perform these inspections, as their outputs have proven effective in determining the condition of the infrastructure and its assets [3]. Moreover, the BIM methodology can be a valuable tool for managing the data provided by these surveys, as it consists in creating digital models that contain information related to the objects they represent. These models can store the different information obtained from NDT surveys over time, enabling integrated analyses of the condition of the infrastructure [4].

This study aims to analyze a potential BIM process capable of integrating the outputs of different NDT surveys to generate an informative digital model of an infrastructure and its assets. The proposed methodology is able to merge the data provided by the inspections, which are typically obtained by different operators and come in different file formats, into a single BIM model. The main goal of the research is to provide a process for optimizing the management procedures of transportation infrastructures by creating digital models capable of reducing the problems typically associated with the monitoring and maintenance of these critical civil works. By merging different information into a single environment and relying on survey data that are commonly analyzed separately, an integrated analysis of the infrastructure can be carried out and data loss can be reduced.

The study was developed using real data obtained from on-site surveys carried out on Italian infrastructures. As different outputs were collected, BIM models of different assets of the analyzed infrastructures were defined. Preliminary results have shown that the proposed methodology can be a viable tool for optimizing the management process of these critical civil works.

Acknowledgements

The research is supported by the Italian Ministry of Education, University and Research under the National Project “Extended resilience analysis of transport networks (EXTRA TN): Towards a simultaneously space, aerial and ground sensed infrastructure for risks prevention”, PRIN 2017. Prot. 20179BP4SM.

References

[1] MIT, 2018. Ministero delle Infrastrutture e dei Trasporti, D. Lgs 109/2018

[2] MIT, 2021. Ministero delle Infrastrutture e dei Trasporti, D.M. 312/2021

[3] D’Amico F. et al., 2020. Integration of InSAR and GPR Techniques for Monitoring Transition Areas in Railway Bridges. NDT&E Int

[4] D’Amico, F. et al., 2022. Integrating Non-Destructive Surveys into a Preliminary BIM-Oriented Digital Model for Possible Future Application in Road Pavements Management. Infrastructures 7, no. 1: 10

How to cite: Bertolini, L., D'Amico, F., Napolitano, A., Manalo, J. R. D., and Bianchini Ciampoli, L.: Combined NDT data for road management through BIM models, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14846, https://doi.org/10.5194/egusphere-egu23-14846, 2023.

EGU23-14899 | ECS | Orals | GI2.1

Fusion of in-situ and spaceborne sensing for environmental monitoring 

Konstantinos Karyotis, Nikolaos Tsakiridis, and George Zalidis

Measuring soil reflectance in the field, rather than in a laboratory setting, can be very useful for numerous applications, such as mapping the distribution of various soil properties, especially when prompt estimations are needed. Recent advances in spectroscopy, and specifically in the development of low-cost spectrometers based on Micro-Electro-Mechanical Systems (MEMS), pave the way for real-time applications in agriculture and environmental monitoring. Compared to high-end spectrometers, whose spectral range extends from the visible (VIS) and near-infrared (NIR) to the shortwave infrared (SWIR), MEMS sensors cover limited parts of the electromagnetic spectrum, so important information is missed. In parallel, new space missions such as Planet Fusion are operationally ready and provide optical imagery (RGB and NIR) with high spatial (3 m) and temporal (daily) resolution. To this end, we assessed the potential of augmenting the bands captured by a commercial MEMS sensor (Spectral Engines Nirone S2.2, 1750-2150 nm) with the Planet Fusion bands at the exact sampling date and location from which the in-situ scans originate.

Employing the above, a set of portable MEMS sensors was used in a pilot area in Cyprus (Agia Varvara, Nicosia district) to develop a regional in-situ Soil Spectral Library (SSL). Sixty distinct locations were selected for capturing in-situ spectral reflectance after stratification of the Planet Fusion pixels of the pilot area, and a physical soil sample from each location was analyzed in the laboratory to determine Soil Organic Carbon (SOC) content. Topsoil moisture was also measured during the visits.

The resulting SSL, containing the in-situ spectra, SOC and moisture content, was further augmented with the four bands of the Planet Fusion imagery acquired on the exact date of the field visit. Three Random Forest models for SOC content estimation were then fitted, using as explanatory variables (i) only the MEMS data with moisture content, (ii) only the Planet Fusion bands, and (iii) all three available inputs.

The results showed a clear decrease in the RMSE of the SOC content estimates when fusing in-situ with spaceborne data, highlighting the importance of the information contained in the VIS-NIR range when modeling SOC. The synergy of the two sensors is mutually beneficial: SOC absorption bands can also be found in the SWIR region and are hard to detect by remote sensing means, since they fall within the strong water absorption region (around 1950 nm). MEMS-based systems operating in the SWIR can support this process and, if combined with ancillary environmental measurements such as soil moisture, can provide a cost-effective solution for measuring SOC and other soil-related parameters. To reduce the need for laboratory analysis, it is necessary to establish protocols and guidelines for spectral data collection and management, ensuring that the collected data are consistent and of high quality, and to develop representative SSLs that can serve different modeling scenarios.
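The model comparison described above reduces to concatenating the per-sample feature vectors and scoring each model with RMSE. The sketch below illustrates both steps with invented reflectance values; it is not the authors' pipeline.

```python
# Feature fusion (MEMS SWIR bands + Planet Fusion bands for the same
# sampling point) and the RMSE metric used to compare the three Random
# Forest variants. All reflectance values here are invented.

def fuse_features(mems_bands, planet_bands):
    """Concatenate in-situ MEMS reflectances with the spaceborne
    Planet Fusion bands for one sampling location."""
    return list(mems_bands) + list(planet_bands)

def rmse(observed, predicted):
    """Root-mean-square error between measured and estimated SOC."""
    n = len(observed)
    return (sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n) ** 0.5

# One fused sample: 2 hypothetical MEMS bands + 4 Planet Fusion bands.
x = fuse_features([0.31, 0.28], [0.12, 0.20, 0.25, 0.40])
```

The fused vectors (optionally extended with soil moisture) would then form the design matrix for the third Random Forest model.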

How to cite: Karyotis, K., Tsakiridis, N., and Zalidis, G.: Fusion of in-situ and spaceborne sensing for environmental monitoring, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14899, https://doi.org/10.5194/egusphere-egu23-14899, 2023.

EGU23-14981 | ECS | Orals | GI2.1

Implementation of a Digital Twin integrating remote sensing information for network-level infrastructure monitoring 

Antonio Napolitano, Valerio Gagliardi, Luca Bertolini, Jhon Romer Diezmos Manalo, Alessandro Calvi, and Andrea Benedetto

Nowadays, there is an emerging demand from public authorities and managing bodies to evaluate the overall health of infrastructures and identify the most critical transport assets. At the national scale, thousands of transport infrastructures are in critical condition and require urgent maintenance. Currently, most available Digital Twins (DTs) allow data to be explored and visualized but include only limited kinds of information. This issue still limits their operative and practical use by infrastructure owners, who require fast solutions for managing large amounts of data. Moreover, this idea is in line with European and national actions related to the development of a DT of the Earth's systems, including the "DestinE" programme of the European Commission with EUSPA and the European Space Agency (ESA). For this purpose, a dynamic DT model of a critical infrastructure is developed, using the available design information, historical maintenance records and monitoring surveys based on satellite imagery.

In this context, this study presents an innovative concept of Digital Twin that integrates the details coming from NDT surveys, on-site inspections and satellite-based information to store, manage and visualize valuable information. This is made possible by analysing the main gaps and limitations of existing platforms and providing a viable integrated solution in the form of an upgradable strategic analysis tool. To this purpose, remote sensing methods are identified as viable technologies for continuous monitoring operations. More specifically, satellite data and processing techniques such as the Multi-Temporal SAR Interferometry approach are strategic for the continuous monitoring of the displacements associated with transport infrastructures. An advantage of these techniques is the lighter data processing required for the assessment of displacements and the detection of critical areas [1, 2].

The study introduces two main levels of innovation. The first is associated with the integrated approach to transportation planning, which incorporates quantitative data from multiple sources into more traditional territorial analysis models. The second is related to the technological engineering discipline and consists of fusing multi-source observation data with last-generation dynamic data connected to the environment.

Acknowledgements

This research is supported by the Project “M.LAZIO”, accepted and funded by the Lazio Region, Italy.

References

[1] D'Amico, F. et al., “Implementation of an interoperable BIM platform integrating ground based and remote sensing information for network-level infrastructures monitoring”, Spie Remote Sensing 2022.

[2] Gagliardi, V. et al., “Bridge monitoring and assessment by high-resolution satellite remote sensing technologies”, Spie Future Sensing Technologies 2020.

How to cite: Napolitano, A., Gagliardi, V., Bertolini, L., Manalo, J. R. D., Calvi, A., and Benedetto, A.: Implementation of a Digital Twin integrating remote sensing information for network-level infrastructure monitoring, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14981, https://doi.org/10.5194/egusphere-egu23-14981, 2023.

EGU23-15542 | ECS | Orals | GI2.1

Novel perspectives in transport infrastructure management: Data-Fusion, integrated monitoring and augmented reality 

Valerio Gagliardi, Luca Bianchini Ciampoli, Fabrizio D'Amico, Alessandro Calvi, and Andrea Benedetto

Infrastructure networks are crucial to ensuring the sustainability of the current development model, in which the movement of people and goods is essential. At the same time, transport assets are increasingly exposed to several issues, including changing climatic conditions and vulnerability and exposure to natural hazards such as hydraulic, geomorphological, landslide and seismic phenomena, which can affect structural integrity and cause damage and deterioration. The situation is made even more serious by the degradation of materials and the progressive ageing of infrastructure, often accelerated by environmental conditions and by inadequate, or not always effective, maintenance actions. This calls for novel methods offering both large-scale coverage, for network-scale linear infrastructures, and fine detail, to diagnose causes and set priorities for the most effective countermeasures.

The proposed solution is based on a Data-Fusion approach, merging multi-source and multi-scale data to enhance the interpretation process in a holistic sense. The information comes from spaceborne Multi-temporal SAR Interferometry, complemented by more detailed aerial data collected by UAVs and by Ground Based Non-Destructive Testing methods, including laser scanner surveys for resolution and digital integrability, high-resolution camera measurements assisted by artificial intelligence for surface degradation assessment, and prospecting data collected by Ground Penetrating Radar technology. All these data can be analyzed simultaneously within a comprehensive digital platform, providing a useful tool to help operators and public bodies prioritize maintenance actions.

The digital platform can also be explored using augmented reality tools capable of generating and reproducing the Digital Twin of the inspected infrastructure in a real environment. This enables monitoring evaluations through a diagnostic approach that integrates spaceborne, aerial, ground-based and geophysical surveys, allowing navigation within the infrastructure. Potential applications are numerous, ranging from the mapping of wide areas affected by potential criticalities to the definition of the main vulnerabilities related to seismic and hydraulic risks, the analysis of land changes surrounding the assets following extreme natural events, and the reconstruction of historical deformation trends of roads, railways and bridges through the interpretation of SAR data.

Acknowledgments

This research is supported by the Italian Ministry of Education, University, and Research under the National Project “EXTRA TN”, PRIN2017, Prot. 20179BP4SM. In addition, this research is supported by the Project “MLAZIO” funded by Lazio Region (Italy).

How to cite: Gagliardi, V., Bianchini Ciampoli, L., D'Amico, F., Calvi, A., and Benedetto, A.: Novel perspectives in transport infrastructure management: Data-Fusion, integrated monitoring and augmented reality, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15542, https://doi.org/10.5194/egusphere-egu23-15542, 2023.

EGU23-16471 | ECS | Orals | GI2.1

Hydrogen isotope fractionation between leaf wax compounds and source water in tropical angiosperms 

Amrita Saishree, Shreyas Managave, and Vijayananda Sarangi

The hydrogen isotope fractionation between leaf wax compounds and source water, the apparent fractionation (εapp), necessary for the reconstruction of the hydrogen isotopic composition (δD) of precipitation, is mainly assessed through field and transect studies. The current εapp dataset, however, exhibits a bias toward mid-latitude regions of the Northern Hemisphere. Here we report the results of an outdoor experiment wherein four evergreen and three deciduous species were grown with water of known δD value (-1.8‰) in a tropical semi-arid monsoon region. This allowed us to estimate εapp more accurately and also to quantify εapp variability within a species and among different species. Among-species εapp values were -119 ± 23‰ (for the n-alkane of chain length n-C31) and -126 ± 27‰ (for the n-alkanoic acid of chain length n-C30). The similarity between the among-species variability in εapp reported here and that observed in field and transect studies suggests that the species effect, rather than uncertainty in the δD of source water, controls the uncertainty in community-averaged εapp. The δD fractionation between the n-C29 alkane and n-C30 alkanoic acid (ε29/30) and between the n-C31 alkane and n-C32 alkanoic acid (ε31/32) was 7 ± 25‰ and 6 ± 15‰, respectively, suggesting minimal fractionation of hydrogen isotopes during decarboxylation. Further, we did not observe a systematic difference between the εapp of deciduous and evergreen species; changes in the relative proportion of these vegetation types in a community might therefore not affect its εapp value.
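The apparent fractionation values above follow the standard per-mil definition, εapp = ((1000 + δDwax) / (1000 + δDwater) − 1) × 1000. A minimal sketch of that arithmetic (the wax δD value below is illustrative, not a measurement from the study):

```python
def epsilon_app(delta_wax, delta_water):
    """Apparent fractionation (in per mil) between leaf wax and source water:
    eps = ((1000 + dD_wax) / (1000 + dD_water) - 1) * 1000."""
    return ((1000.0 + delta_wax) / (1000.0 + delta_water) - 1.0) * 1000.0

# source water in the experiment had dD = -1.8 per mil; the wax value
# of -120 per mil is purely illustrative
print(epsilon_app(-120.0, -1.8))   # about -118.4 per mil
```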

How to cite: Saishree, A., Managave, S., and Sarangi, V.: Hydrogen isotope fractionation between leaf wax compounds and source water in tropical angiosperms, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16471, https://doi.org/10.5194/egusphere-egu23-16471, 2023.

EGU23-16632 | ECS | Orals | GI2.1

Development of a flexible 2D DC Resistivity modelling technique for use in space domain 

Deepak Suryavanshi and Rahul Dehiya

Geoelectric non-destructive imaging and monitoring of the earth's subsurface require robust and adaptable numerical methods to solve the governing differential equation. DC data are usually acquired along a straight line, so the DC problem is solved for the 2D case; the source for the DC method, however, exhibits a 3D nature. To account for the source's 3D nature, 2D DC resistivity modeling is often carried out in the wavenumber domain. Several studies have suggested ways to select optimum wavenumbers and weights, but none guarantees a universal choice. The chosen wavenumbers and related weights strongly influence the precision of the resulting solution in the space domain. Many forward modeling studies demonstrate that selecting effective wavenumbers is challenging, especially for complicated models with topography, anisotropy, and significant resistivity contrasts. Moreover, forward modeling requires many wavenumbers as the models become more complex.

This study focuses on developing a method that can completely omit wavenumbers in 2D DC resistivity modeling. The present work finds its motivation in a numerical experiment on a simple half-space model. Since the analytical response for such a model can be easily calculated, we matched the analytical solution against the responses obtained from the various wavenumbers and weights used in the literature. All the responses deviated from the analytical solution beyond a certain distance, and none was found to be accurate at large offsets. Thorough testing of the numerical scheme revealed that the wavenumbers selected for the forward modeling significantly affect how practical the approach is at large offsets.

To overcome this problem, a new boundary condition is derived and implemented in the existing numerical scheme. The numerical scheme chosen to perform the forward modeling is the Mimetic Finite Difference Method (MFDM). We consider the source to be placed at the origin of the coordinate system. This removes the dependency of the source term, expressed in the Fourier domain, on the wavenumber. The solution obtained by solving the resulting equation is an even function of the wavenumber and real-valued. This ensures that the potential in the space domain for the 2D model is also a real-valued even function, symmetric about the plane perpendicular to the strike direction and passing through the origin. Because the first-order derivative of an even function vanishes at the plane of symmetry, this can be expressed mathematically as a Neumann boundary condition on that plane. We therefore propose a scheme to solve the 2D resistivity problem in the space domain using this boundary condition.
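The symmetry argument can be checked on the simplest case with a known solution: for a homogeneous half-space, the potential of a point current source at the origin is V = ρI/(2πr). This is only a toy verification of the even-symmetry/Neumann argument, not of the MFDM scheme itself; the values of ρ and I are arbitrary:

```python
import math

rho, I = 100.0, 1.0   # arbitrary resistivity (ohm-m) and current (A)

def V(x, y, z):
    """Potential of a point current source at the origin of a homogeneous
    half-space: V = rho * I / (2 * pi * r)."""
    r = math.sqrt(x * x + y * y + z * z)
    return rho * I / (2.0 * math.pi * r)

# V is even in the strike coordinate x, so its x-derivative vanishes on
# the symmetry plane x = 0 -- the Neumann condition described above
h = 1e-4
dVdx = (V(h, 3.0, 2.0) - V(-h, 3.0, 2.0)) / (2.0 * h)
print(dVdx)   # 0.0 by symmetry
```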

The developed algorithm is tested on isotropic and anisotropic two-layer models with large contrasts. The numerical solutions obtained using the modified boundary condition described above show considerable accuracy even at large offsets when compared with the analytical solution, whereas results obtained using wavenumbers available in the literature deviate considerably from the analytical solution at large offsets.

How to cite: Suryavanshi, D. and Dehiya, R.: Development of a flexible 2D DC Resistivity modelling technique for use in space domain, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16632, https://doi.org/10.5194/egusphere-egu23-16632, 2023.

Approximately eight years ago, after a research activity that I started in the nineties on the application of GPR and, later, of NDTs to civil engineering, I realized that no technology can be considered self-standing. This is a consequence of the high complexity of civil engineering works and of the highly unpredictable impacts of ordinary processes and exceptional natural events. At the beginning of this century it was clear that reliable and comprehensive monitoring of a phenomenon affecting bridges, tunnels, structures, or any civil engineering work is possible only by integrating data from different sources.

GPR was at that time a very promising technology, and many researchers investigated this field, measuring e.g. pavement deformation, asphalt moisture, ballast degradation, and the mechanical properties of materials. The accurate outcomes represented a great step forward for science in this sector, but the final results proved to be partial, because the approach failed from a holistic perspective.

So, in the second decade of the 2000s, the need for a novel investigation paradigm arose, in order not only to identify and quantify a problem, but also to diagnose its causes.

This was the stimulus to fuse data from different NDTs, under the assumption that information A and B together give much more than A+B: one piece of information (A) can explain one or more characteristics contained in a second (B) that cannot be inferred from either piece of information on its own.

Based on this I decided, together with international colleagues of the highest level, to establish a new session at EGU. That was in 2018; today marks the sixth edition!

Over these years, between 80 and 120 researchers took part in each session. The number of countries involved is also impressive, ranging from 10 to 17 per session, and the number of institutions from 36 to 50.

The number of contributions presented in the five editions is 141.

Since 2018, several special issues of prominent journals have been dedicated to data fusion. Recently, beyond typical technologies such as GPR, UT and ERT, great attention has been given to LiDAR, satellite and UAV platforms.

Data fusion has also been directed to other interesting and promising fields such as archaeology, agriculture and urban planning, to cite only a few.

I would like to underline that this great interest started in Europe and in the USA, but the geographical coverage is now much wider, equally including Asian and emerging countries.

There is now a new frontier to be explored. My vision is that this holistic approach can be used to develop an innovative immersive environment through integration in augmented reality platforms, on which a digital twin can be generated and dynamically updated through an adaptive interface, as well as by using AI and machine learning paradigms.

How to cite: Benedetto, A.: Data fusion in civil engineering: personal experience, vision and historical considerations, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16864, https://doi.org/10.5194/egusphere-egu23-16864, 2023.

Building Information Modeling is a software-based parametric design approach that allows full interoperability between the various actors involved in a design or management process. Although it was specifically created for building projects, its use has been adapted to a wide range of applications, including transport infrastructure design and, more recently, cultural heritage. In this field, it has mainly been applied to improve the accuracy and effectiveness of restoration and stabilization activities for historical architecture.
The present study aims at demonstrating how the use of BIM may return remarkable outcomes in improving the current quality of digital valorisation and virtual reconstruction of historical structures, especially when their state of conservation is limited. Indeed, even though current digital reconstruction models are usually verified from an archaeological perspective, their structural consistency is never tested. This means that many virtual reconstruction models may represent structures that are historically accurate but structurally implausible, as, given their geometric features and construction materials/techniques, they would not bear their own weight.
In this perspective, this study proposes a novel BIM-based methodology capable of both driving archaeological reconstruction hypotheses and testing them on a structural basis. The methodology can be schematically represented by the following process:
1- Survey of the emerging: acquisition of data from surface archaeological surveys (topographic data, laser scanner, aerial photogrammetry, satellite images);
2- Survey of the hidden: acquisition of data from hypogeal surveys (georadar, electrical tomography, magnetometry);
3- Mechanical characterization: gathering of information on the materials of the find, whose mechanical properties are verified also through load stress tests;
4- Virtual reconstruction: proposal of a possible virtual reconstruction hypothesis linked to structural and morphological features known to be present in the relevant historical periods;
5- Structural test: engineering and structural verification of the proposed hypothesis by means of finite element algorithms.
The proposed methodology was tested on the archaeological area of the Villa and Circus of Maxentius along the Ancient Appian Way in Rome; all the planned activities have been shared and authorized by the Sovrintendenza Capitolina ai Beni Culturali, within the context of the Project BIMHERIT, funded by Regione Lazio (DTC Lazio Call, Prot. 305-2020-35609).

How to cite: Santarelli, R. and Ten, A.: Integration of non-destructive surveys for BIM-based and structural-verified digital reconstruction of archaeological sites, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-17489, https://doi.org/10.5194/egusphere-egu23-17489, 2023.

In the wake of the Chernobyl and Fukushima accidents, radiocesium has become the radionuclide of greatest environmental concern. The ease with which this radionuclide moves through the environment and is taken up by plants and animals is governed by its chemical forms and by site-specific environmental characteristics. Differences in climate and geomorphology, as well as in 137Cs speciation in the fallout, result in different migration rates of 137Cs in the environment and different rates of its natural attenuation. In Fukushima-affected areas 137Cs was found to be strongly bound to soil and sediment particles, its bioavailability being reduced as a result. Up to 80% of the 137Cs deposited on the soil was reported to be incorporated in hot glassy particles (CsMPs) insoluble in water. Disintegration of these particles in the environment is much slower than that of Chernobyl-derived fuel particles. The higher annual precipitation and steep slopes in the contaminated areas of Fukushima are conducive to higher erosion and higher total radiocesium wash-off. Typhoons Etau in 2015 and Hagibis in 2019 demonstrated the pronounced redistribution of 137Cs on river watersheds and floodplains, and in some cases natural self-decontamination occurred. Among the common features of 137Cs behavior in Chernobyl and Fukushima are the slow decrease in 137Cs activity concentration in small, closed, and semi-closed lakes and its particular seasonal variations: an increase in summer and a decrease in winter.

How to cite: Konoplev, A.: Fukushima and Chernobyl: similarities and differences of radiocesium behavior in the soil-water environment, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1081, https://doi.org/10.5194/egusphere-egu23-1081, 2023.

After the Fukushima nuclear accident, atmospheric 134Cs and 137Cs measurements were taken in Fukushima city for 8 years, from March 2011 to March 2019. The airborne surface concentrations and deposition of radiocesium (radio-Cs) were high in winter and low in summer; these trends are the opposite of those observed in a contaminated forest area. The effective half-lives of 137Cs in the concentrations and deposition before 2015 (0.754 and 1.30 years, respectively) were significantly shorter than those after 2015 (2.07 and 4.69 years, respectively), which was likely because the dissolved radio-Cs was discharged from the local terrestrial ecosystems more rapidly than the particulate radio-Cs. In fact, the dissolved fractions of precipitation were larger than the particulate fractions before 2015, but the particulate fractions were larger after 2016. X-ray fluorescence analysis suggested that biotite may have played a key role in the environmental behavior of particulate forms of radio-Cs after 2014. 
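The effective half-lives quoted above are obtained, in the usual way, from log-linear fits of the concentration time series. A minimal sketch of such a fit on synthetic, exactly exponential data (the function and values are illustrative, not the authors' processing code):

```python
import math

def effective_half_life(t, conc):
    """Effective half-life from a least-squares fit of
    ln(C) = a - (ln 2 / T_eff) * t."""
    n = len(t)
    y = [math.log(c) for c in conc]
    tm, ym = sum(t) / n, sum(y) / n
    slope = (sum((ti - tm) * (yi - ym) for ti, yi in zip(t, y))
             / sum((ti - tm) ** 2 for ti in t))
    return -math.log(2.0) / slope

# synthetic series decaying with a 2.07-year half-life (illustrative)
t = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
c = [100.0 * 0.5 ** (ti / 2.07) for ti in t]
print(effective_half_life(t, c))   # recovers 2.07
```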

Resuspension of 137Cs from the contaminated ground surface to the atmosphere is essential for understanding the long-term environmental behaviors of 137Cs. We assessed the 137Cs resuspension flux from bare soil and forest ecosystems in eastern Japan in 2013 using a numerical simulation constrained by surface air concentration and deposition measurements. In the estimation, the total areal annual resuspension of 137Cs is 25.7 TBq, which is equivalent to 0.96% of the initial deposition (2.68 PBq). The current simulation underestimated the 137Cs deposition in Fukushima city in winter by more than an order of magnitude, indicating the presence of additional resuspension sources. The site of Fukushima city is surrounded by major roads. Heavy traffic on wet and muddy roads after snow removal operations could generate superlarge (approximately 100 μm in diameter) road dust or road salt particles, which are not included in the model but might contribute to the observed 137Cs at the site.

The current presentation is based on two published papers: Watanabe et al., ACP, https://doi.org/10.5194/acp-22-675-2022 (2022) and Kajino et al., ACP, https://doi.org/10.5194/acp-22-783-2022 (2022). The presenters would like to thank all the co-authors of the two papers for their significant contributions.

How to cite: Kajino, M. and Watanabe, A.: Eight-year variations in atmospheric radiocesium in Fukushima city and simulated resuspension from contaminated ground surfaces in eastern Japan, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1607, https://doi.org/10.5194/egusphere-egu23-1607, 2023.

EGU23-2540 | Posters on site | GI2.2

Hydrological setting control 137Cs and 90Sr concentration at headwater catchments in the Chornobyl Exclusion Zone 

Yasunori Igarashi, Yuichi Onda, Koki Matsushita, Hikaru Sato, Yoshifumi Wakiyama, Hlib Lisovyi, Gennady Laptev, Dmitry Samoilov, Serhii Kirieiev, and Alexei Konoplev

Concentration-discharge relationships are widely used to understand the hydrologic processes controlling river water chemistry. We investigated how hydrological processes affect radionuclide concentrations (137Cs and 90Sr) in surface water in headwater catchments of the Chornobyl Exclusion Zone in Ukraine. In the flat wetland catchment, the depth of the saturated soil layer changed little throughout the year, but changes in the saturated soil surface area during snowmelt and immediately after rainfall affected water chemistry by changing the opportunities for contact between surface water and the soil surface. In slope catchments with few wetlands, on the other hand, river water chemistry is shaped by the changing contributions of "shallow water" and "deep water" as the water pathways supplying the river change. Dissolved and suspended 137Cs concentrations did not correlate with discharge rate or competitive cations, but the solid/liquid ratio of 137Cs showed a significant negative relationship with water temperature, and further studies are needed regarding sorption/desorption reactions. 90Sr concentrations in surface water were strongly related to the water pathways in each catchment. The contact between surface water and the soil surface, and the change in the contributions of shallow and deep water to stream water, could change 90Sr concentrations in surface water in the wetland and slope catchments, respectively. In this study, we revealed that radionuclide concentrations in the rivers of Chornobyl are strongly affected by the water pathways in headwater catchments.

How to cite: Igarashi, Y., Onda, Y., Matsushita, K., Sato, H., Wakiyama, Y., Lisovyi, H., Laptev, G., Samoilov, D., Kirieiev, S., and Konoplev, A.: Hydrological setting control 137Cs and 90Sr concentration at headwater catchments in the Chornobyl Exclusion Zone, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2540, https://doi.org/10.5194/egusphere-egu23-2540, 2023.

EGU23-2561 | Posters on site | GI2.2

Dispersion of particle-reactive elements caused by the phase transitions in scavenging 

Kyeong Ok Kim, Vladimir Maderich, Igor Brovchenko, Kyung Tae Jung, Sergey Kivva, Katherine Kovalets, and Haejin Kim

A generalized model of scavenging of the reactive radionuclide 239,240Pu was developed, in which the sorption-desorption processes of oxidized and reduced forms on multifraction suspended particulate matter are described by first-order kinetics. One-dimensional transport-diffusion-reaction equations were solved analytically and numerically. In the idealized case of an instantaneous release of 239,240Pu at the ocean surface, the concentration profile asymptotically tends to a symmetric spreading bulge in the form of a Gaussian moving downward with constant velocity. The corresponding diffusion coefficient is the sum of the physical diffusivity and the apparent diffusivity caused by the reversible phase transitions between the dissolved and particulate states. Using the method of moments, we analytically obtained formulas for both the velocity of the center of mass and the apparent diffusivity. It was found that for ocean waters with oxygen present at great depths, a simplified problem for the mixture of forms with a single effective distribution coefficient can be considered as a first approximation, instead of the complete problem. This conclusion was confirmed by the modeling results for the well-ventilated Eastern Mediterranean. In agreement with the measurements, the calculations demonstrate the presence of a slowly descending maximum for all forms of concentration. The ratio of the reduced form to the oxidized form was approximately 0.22-0.24. At the same time, 239,240Pu scavenging calculations for the anoxic deep water of the Black Sea reproduced the transition from the oxidized to the reduced form of 239,240Pu with depth, in accordance with the measurement data.
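The center-of-mass result can be illustrated with a toy one-dimensional two-phase calculation: under fast reversible exchange between a dissolved phase (which does not settle) and a particulate phase settling at speed w, the total concentration drifts downward at the equilibrium particulate fraction times w while spreading like a Gaussian. The crude explicit scheme and all parameter values below are illustrative assumptions, not the authors' model:

```python
import numpy as np

w, k12, k21 = 1.0, 2.0, 2.0   # settling speed and exchange rates (illustrative)
dz, dt = 0.05, 0.01
z = np.arange(0.0, 40.0, dz)

cd = np.exp(-0.5 * ((z - 2.0) / 0.2) ** 2)   # dissolved pulse near the surface
cp = np.zeros_like(cd)

def mean_depth():
    tot = cd + cp
    return float(np.sum(z * tot) / np.sum(tot))

means = {}
for step in range(1501):                      # 15 time units at dt = 0.01
    if step in (500, 1500):                   # record at t = 5 and t = 15
        means[step] = mean_depth()
    ex = k12 * cd - k21 * cp                  # net dissolved -> particulate flux
    adv = np.zeros_like(cp)
    adv[1:] = -w * (cp[1:] - cp[:-1]) / dz    # first-order upwind settling
    cd = cd + dt * (-ex)
    cp = cp + dt * (ex + adv)

v_est = (means[1500] - means[500]) / (1000 * dt)
f_p = k12 / (k12 + k21)                       # equilibrium particulate fraction
print(v_est, f_p * w)                         # center-of-mass speed vs f_p * w
```

The estimated drift speed matches the moment prediction f_p · w; the additional Gaussian spreading corresponds to the apparent diffusivity discussed above (plus the scheme's numerical diffusion, which a toy check like this does not separate out).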

How to cite: Kim, K. O., Maderich, V., Brovchenko, I., Jung, K. T., Kivva, S., Kovalets, K., and Kim, H.: Dispersion of particle-reactive elements caused by the phase transitions in scavenging, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2561, https://doi.org/10.5194/egusphere-egu23-2561, 2023.

EGU23-3049 | ECS | Posters on site | GI2.2

Changes in Air Dose Rates due to Soil Water Content in Forests in Fukushima Prefecture, Japan 

Miyu Nakanishi, Yuichi Onda, Hiroaki Kato, Junko Takahashi, Hikaru Iida, and Momo Takada

Radionuclides released and deposited by the 2011 Fukushima Daiichi Nuclear Power Plant accident caused an increase in air dose rates in forests in Fukushima Prefecture. Although it has been reported that air dose rates increase during rainfall, we found that air dose rates decreased during rainfall in forests in Fukushima, which is attributed to the shielding effect of soil moisture. This study aimed to develop a method for estimating changes in air dose rates due to rainfall even in the absence of soil moisture data. We therefore used the preceding rainfall (Rw), an indicator that also takes past rainfall into account; we calculated Rw in Namie Town, Futaba District, Fukushima Prefecture, from May to July 2020 and estimated air dose rates. In this area, air dose rates decreased with increasing soil moisture. Air dose rates could be estimated by combining Rw indices with half-lives of 2 hours and 7 days and by considering hysteresis in the absorption and drainage processes. The coefficient of determination (R2) exceeded 0.70 for the estimation of soil water content, and good agreement was also observed in the estimation of air dose rates from Rw (R2 > 0.65). The same method was used to estimate air dose rates at the Kawauchi site from May to July 2019. Because of the high water repellency of the Kawauchi site, the increase in soil water content was very small, and the change in air dose rate was almost negligible when soil water content was below 15% and rainfall was below 10 mm. This study enabled the estimation of soil water content and air dose rates from rainfall and captured the effect of rainfall on the decreasing trend of air dose rates. It can therefore be used in the future as an indicator to determine whether temporary changes in air dose rates are caused by factors other than rainfall.
This study also contributes to the improvement of methods for estimating external dose rates for humans and for terrestrial animals and plants in forests.
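An antecedent rainfall index of the kind described, in which each past rainfall increment decays with a chosen half-life, can be sketched as follows (the function name, the recursive form, and the values are illustrative assumptions; the study's exact weighting may differ):

```python
def antecedent_index(rain_mm, dt_hours, half_life_hours):
    """Exponentially weighted antecedent rainfall:
    Rw(t) = sum_i P_i * 0.5 ** ((t - t_i) / half_life),
    updated recursively as Rw_k = Rw_{k-1} * decay + P_k."""
    decay = 0.5 ** (dt_hours / half_life_hours)
    rw, out = 0.0, []
    for p in rain_mm:
        rw = rw * decay + p
        out.append(rw)
    return out

# hourly rainfall record (mm): one 10 mm event, then dry hours
rain = [0.0, 0.0, 10.0, 0.0, 0.0, 0.0, 0.0, 0.0]

fast = antecedent_index(rain, dt_hours=1.0, half_life_hours=2.0)       # 2 h memory
slow = antecedent_index(rain, dt_hours=1.0, half_life_hours=7 * 24.0)  # 7 d memory
print(fast[4], slow[4])   # fast index has halved twice; slow index barely decays
```

Two such indices with short and long half-lives, used together as regressors, capture both the immediate wetting and the slow drainage memory that the abstract invokes.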

How to cite: Nakanishi, M., Onda, Y., Kato, H., Takahashi, J., Iida, H., and Takada, M.: Changes in Air Dose Rates due to Soil Water Content in Forests in Fukushima Prefecture, Japan, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3049, https://doi.org/10.5194/egusphere-egu23-3049, 2023.

Wet scavenging modeling remains a challenge in simulating the atmospheric transport of 137Cs following the Fukushima Daiichi Nuclear Power Plant accident, and it significantly influences the detailed spatiotemporal 137Cs distribution. Numerous wet deposition schemes have been proposed for 137Cs, but it is often difficult to evaluate them consistently because of the limited resolution of meteorological field data and detailed differences in model implementations. This study evaluated the detailed behavior of 25 combinations of in-cloud and below-cloud wet scavenging models in the framework of the Weather Research and Forecasting-Chemistry model, using high-resolution (1 km × 1 km) meteorological input. This implementation enables a consistent and detailed evaluation, revealing the complex local behaviors of these combinations. The 1-km-resolution simulations were compared with simulations obtained previously using 3-km-resolution meteorological field data with respect to the rainfall pattern over eastern Japan during the accident, the atmospheric concentrations acquired at the regional SPM monitoring sites, and the total ground deposition. The capability of these models to reproduce local-scale observations was also investigated using observations at the Naraha site, which is only 17.5 km from the Fukushima Daiichi Nuclear Power Plant. The performance of the ensemble mean was evaluated as well. Results revealed that the 1-km simulations reproduce the cumulative rainfall pattern during the Fukushima accident better than the 3-km simulations, though with spatiotemporal variability in accuracy, and that rainfall below 1 mm/h is critical for simulation accuracy. Single-parameter wet deposition models that rely solely on rainfall showed improved performance in the 1-km simulations relative to the 3-km simulations because of the improved rainfall simulation in the 1-km results.
Multiparameter models that rely on both cloud and rainfall showed more robust performance in both the 3-km and 1-km simulations, and the Roselle–Mircea model presented the best performance among the 25 models considered. Besides rainfall, wind transport showed a substantial influence on the removal of atmospheric 137Cs, which was non-negligible even during periods in which wet deposition was dominant. The ensemble mean of the 1-km simulations reproduces the high-deposition area better, and its total deposition amount is closer to the observations, than the 3-km simulation. At the local scale, the 1-km-resolution simulations effectively reproduced the 137Cs concentrations observed at the Naraha site, but with deviations in peak timing, mainly because of biased wind direction. These findings indicate the necessity of a multi-parameter model for robust regional-scale wet deposition simulation and of refined wind and dispersion modeling for local-scale simulation of 137Cs concentrations.

How to cite: Zhuang, S., Dong, X., Xu, Y., and Fang, S.: Modeling and sensitivity study of wet scavenging models for the Fukushima accident using 1-km-resolution meteorological field data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-4152, https://doi.org/10.5194/egusphere-egu23-4152, 2023.

EGU23-4697 | ECS | Orals | GI2.2

Quantifying the riverine sources of sediment and associated radiocaesium deposited off the coast of Fukushima Prefecture 

Pierre-Alexis Chaboche, Wakiyama Yoshifumi, Hyoe Takata, Toshihiro Wada, Olivier Evrard, Toshiharu Misonou, Takehiko Shiribiki, and Hironori Funaki

The Fukushima-Daiichi Nuclear Power Plant (FDNPP) accident, triggered by the Great East Japan Earthquake and the subsequent tsunami in March 2011, released large quantities of radionuclides into the terrestrial and marine environments of Fukushima Prefecture. Although radiocaesium (i.e. 134Cs and 137Cs) activity in these environments has decreased since the accident, secondary inputs via the rivers draining and eroding the main terrestrial radioactive plume were shown to sustain high levels of 137Cs in riverine and coastal sediments, which are likely deposited off the coast of the Prefecture. Accordingly, identifying the sources of sediment is required to elucidate the links between terrestrial and marine radiocaesium dynamics and to anticipate the fate of persistent radionuclides in the environment.

The objective of this study is to develop an original sediment source tracing technique to quantify the riverine sources of sediment and associated radionuclides accumulated in the Pacific Ocean. Coastal sediment cores (n=6), 20 to 60 cm long, were collected during cruise campaigns between July and September 2022 at the Ota (n=2), Niida (n=1) and Ukedo (n=3) river mouths. Prior to gamma spectrometry measurements, the sediment cores were opened and cut into 2 cm increments, oven-dried at 50°C for at least 48 hours, ground and passed through a 2-mm sieve.

Preliminary results on the spatial and depth distribution of radiocaesium in these samples show strong heterogeneity, with the highest radiocaesium levels (up to 134 ± 2 and 4882 ± 11 Bq kg-1 for 134Cs and 137Cs, respectively) found in coastal sediment cores located at the Ukedo river mouth. In contrast, no trace or only low levels of Fukushima-derived radiocaesium were found at the Niida river mouth and in one sediment core from the Ota river mouth. Additional measurements will be conducted to determine the physico-chemical properties of this sediment in order to select the optimal combination of tracers, which will then be introduced into un-mixing models. This increased knowledge will undoubtedly be useful for watershed and coastal management in the FDNPP post-accidental context.

How to cite: Chaboche, P.-A., Yoshifumi, W., Takata, H., Wada, T., Evrard, O., Misonou, T., Shiribiki, T., and Funaki, H.: Quantifying the riverine sources of sediment and associated radiocaesium deposited off the coast of Fukushima Prefecture, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-4697, https://doi.org/10.5194/egusphere-egu23-4697, 2023.

EGU23-4925 | Posters on site | GI2.2

Verification of reproductivity of 137Cs activity concentration in the database by an ocean general circulation model 

Daisuke Tsumune, Frank Bryan, Keith Lindsay, Kazuhiro Misumi, Takaki Tsubono, and Michio Aoyama

Radioactive cesium (137Cs) is distributed in the global ocean due to global fallout from atmospheric nuclear tests, release from reprocessing plants in Europe, and supply to the ocean due to the Fukushima Daiichi Nuclear Power Plant accident. In order to detect future contamination by radionuclides, it is necessary to understand the global distribution of radionuclides such as 137Cs. For this purpose, the IAEA is compiling a database of observation results (MARIS). However, since the spatio-temporal densities of observed data vary widely, it is difficult to obtain a complete picture from the database alone. Comparative validation using ocean general circulation model (OGCM) simulations is useful in interpreting these observations, and global ocean general circulation model (CESM2, POP2) simulations were conducted to clarify the behavior of 137Cs in the ocean. The horizontal resolution is 1.125° longitude and 0.28° to 0.54° latitude. The minimum spacing near the sea surface is 10 m, and the spacing increases with depth to a maximum of 250 m with 60 vertical levels. Climatic values were used for driving force. As a source term for 137Cs to the ocean, atmospheric fallout from atmospheric nuclear tests was newly established based on rainfall data and other data, and was confirmed to be more reproducible than before. Furthermore, the release from reprocessing plants in Europe and the leakage due to the accident at the Fukushima Daiichi Nuclear Power Plant were taken into account. 2020 input conditions were assumed to continue after 2020, and calculations were performed from 1945 to 2030. The simulated 137Cs activities were found to be in good agreement, especially in the Atlantic and Pacific Oceans, where the observed densities are large. On the other hand, they were underestimated in the Southern Hemisphere, suggesting the need for further improvement of the fallout data. 
137Cs concentrations from the Fukushima Daiichi Nuclear Power Plant accident in March 2011 were also generally well reproduced, although reproducibility remained somewhat limited by the model resolution. In other basins, the concentration characteristics could be determined even though observations were sparse. Radioactivity concentrations of atmospheric-nuclear-test-derived 137Cs may continue to be detected in the global ocean after 2030. The results of this simulation are useful for planning future observations to fill the gaps in the database.

How to cite: Tsumune, D., Bryan, F., Lindsay, K., Misumi, K., Tsubono, T., and Aoyama, M.: Verification of reproductivity of 137Cs activity concentration in the database by an ocean general circulation model, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-4925, https://doi.org/10.5194/egusphere-egu23-4925, 2023.

EGU23-4947 | ECS | Posters on site | GI2.2

Vertical distribution of radioactive cesium-rich microparticles in forest soil of Hamadori area, Fukushima Prefecture 

Takahiro Tatsuno, Hiromichi Waki, Naoto Nihei, and Nobuhito Ohte

A lot of radionuclides were scattered after the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident. Previous studies showed that FDNPP-derived radioactive cesium-rich microparticles (CsMPs) with sizes of a few μm were present in soil and river water around Fukushima Prefecture [1]. CsMPs have a high radioactive cesium (Cs) concentration per unit mass and can therefore be one of the factors leading to overestimation of the Cs concentration in samples. Because Cs in CsMPs may not react directly with clay particles, unlike Cs ions in the liquid phase, CsMPs are considered to act as Cs carriers in soils [2]. However, unlike ionic Cs and Cs adsorbed onto clay particles, the distribution and dynamics of CsMPs in soils have not been clarified. In this study, we investigated the vertical distribution of CsMPs in forest soil and the soil properties in Fukushima Prefecture, Japan.

Soil samples were collected from a forest in the difficult-to-return zone, approximately 10 km from the FDNPP. Undisturbed soil samples were collected from 0-35 cm depth at 5 cm intervals using a core sampler to investigate soil properties. Furthermore, litter samples on the surface soil layer were collected. Using these samples, the vertical distributions of the Cs concentration in the soil and of Cs derived from CsMPs were investigated. The Cs concentration in samples placed in a 100 mL U8 container was measured using a germanium semiconductor detector. Cs derived from CsMPs was evaluated using an imaging plate, following the method for quantification of CsMPs [3].

Like Cs adsorbed on the soil, CsMPs were mostly distributed in the surface layer between 0 and 5 cm of soil depth. We consider that straining may be one of the mechanisms retaining CsMPs near the soil surface. Bradford et al. (2003) [4] showed that straining can be a significant mechanism for colloid retention when the average particle size of the porous medium is less than 200 times the colloidal particle size. In this study, assuming a CsMP size of approximately 1 µm, the average particle size of the soil collected from the 0-5 cm surface layer was less than 200 times that of the CsMPs. Moreover, the average particle size decreased in the layers deeper than 5 cm; therefore, the straining mechanism could be even stronger there.
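
The criterion described above can be sketched as a simple check; this is a minimal illustration of the grain-to-colloid size-ratio rule, and the grain diameters used here are hypothetical examples, not the measured values from this study.

```python
# Sketch of the straining criterion of Bradford et al. (2003): straining
# is expected to be significant when the ratio of the average grain
# diameter to the colloid diameter falls below ~200.
CRITICAL_RATIO = 200.0  # grain-to-colloid diameter ratio threshold

def straining_expected(avg_grain_um: float, colloid_um: float = 1.0) -> bool:
    """Return True if straining is expected to retain the colloid."""
    return avg_grain_um / colloid_um < CRITICAL_RATIO

# Hypothetical average grain sizes (micrometres) by depth interval,
# assuming a CsMP diameter of ~1 micrometre
layers = {"0-5 cm": 150.0, "5-10 cm": 90.0, "10-15 cm": 60.0}
for depth, d_avg in layers.items():
    print(depth, "straining expected:", straining_expected(d_avg))
```

With decreasing grain size at depth, the ratio shrinks and the criterion remains satisfied, consistent with the stronger straining suggested for the deeper layers.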

This work was supported by FY2022 Sumitomo Foundation and FY2022 Internal Project of Institute of Environmental Radioactivity, Fukushima University.

 

References

[1] Igarashi, Y. et al., 2019. J. Environ. Radioact. 205–206, 101–118.

[2] Tatsuno, T. et al., 2022. J. Environ. Manage. 329, 116983.

[3] Ikehara et al., 2018. Environ. Sci. Technol. 52, 6390–6398.

[4] Bradford et al., 2003. Environ. Sci. Technol. 37, 2242–2250.

How to cite: Tatsuno, T., Waki, H., Nihei, N., and Ohte, N.: Vertical distribution of radioactive cesium-rich microparticles in forest soil of Hamadori area, Fukushima Prefecture, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-4947, https://doi.org/10.5194/egusphere-egu23-4947, 2023.

EGU23-5042 | ECS | Posters on site | GI2.2

Changes in 90Sr transport dynamics in groundwater after large-scale groundwater drawdown in the vicinity of the cooling pond at the Chornobyl Nuclear Power Plant 

Hikaru Sato, Naoaki Shibasaki, Maksym Gusyev, Yuichi Onda, and Dmytro Veremenko

Migration of long-lived radioactive 90Sr introduced by nuclear accidents and radioactive waste requires long-term monitoring and protection management due to its half-life of 28.8 years and high mobility in water. Thirty-seven years have now passed since the largest worldwide release of 90Sr was deposited around the Chornobyl Nuclear Power Plant (ChNPP). In the vicinity of the ChNPP, the water level of the cooling pond (CP) has declined since May 2014, following the decommissioning phase of the Unit 3 reactor. The drawdown of the CP lowered the groundwater level over a large surrounding area (about 70 km2), and the resulting change in the groundwater system has raised concerns about possible changes in 90Sr concentrations in water and in transport dynamics to the Pripyat River. Therefore, this study evaluated how 90Sr transport dynamics were influenced by changes in the groundwater flow system from 2011 to 2020, based on observed data and results of a groundwater flow simulation in the CP vicinity.

The numerical simulation was conducted from 2011 to 2020 at a monthly time step using USGS MODFLOW with the PM11 GUI and calibrated to groundwater heads measured at monitoring wells. Between the CP and the Pripyat River, estimated pore velocities near the river were reduced compared to velocities before the CP drawdown, due to the decrease in the hydraulic gradient between the CP and the river. The decrease in groundwater velocity results in decreased groundwater discharge and delayed 90Sr transport. Therefore, the amount of 90Sr transported from the CP to the river is smaller than in the period prior to the CP drawdown. The reduced 90Sr transport is expected to have less impact on the radioactivity in the river water, even in the Pripyat River floodplain northwest of the CP, where 90Sr concentrations increased significantly after the CP drawdown. In addition, the measured and simulated changes in groundwater flow direction and velocity suggested the possibility of 90Sr accumulation in the floodplain, caused by stagnant groundwater from the reduced velocity and by additional 90Sr infiltration from surrounding ponds located on the Pripyat River floodplain. Therefore, enhanced monitoring of 90Sr concentrations near the floodplain would be needed for long-term monitoring and protection management to mitigate this risk.
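
The link between hydraulic gradient and 90Sr travel time can be illustrated with a back-of-envelope pore (seepage) velocity calculation, v = K·i/n. This is only a sketch of the mechanism; the hydraulic conductivity, gradients, and porosity below are illustrative assumptions, not the calibrated MODFLOW parameters of this study.

```python
# Pore (seepage) velocity: Darcy velocity (K * i) divided by effective
# porosity n. A smaller hydraulic gradient i after the cooling-pond
# drawdown directly reduces the velocity, and hence 90Sr transport.
def pore_velocity(k_m_per_day: float, gradient: float, porosity: float) -> float:
    return k_m_per_day * gradient / porosity

# Hypothetical pre- and post-drawdown gradients between the CP and river
v_before = pore_velocity(k_m_per_day=10.0, gradient=2e-3, porosity=0.3)
v_after = pore_velocity(k_m_per_day=10.0, gradient=8e-4, porosity=0.3)
print(f"pore velocity before: {v_before:.4f} m/day, after: {v_after:.4f} m/day")
```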

How to cite: Sato, H., Shibasaki, N., Gusyev, M., Onda, Y., and Veremenko, D.: Changes in 90Sr transport dynamics in groundwater after large-scale groundwater drawdown in the vicinity of the cooling pond at the Chornobyl Nuclear Power Plant, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-5042, https://doi.org/10.5194/egusphere-egu23-5042, 2023.

EGU23-6019 | GI2.2

Transport of H-3 and I-129 in water and their uptake by marine organisms due to the planned release of Fukushima storage water

R. Bezhenar, H. Takata, and V. Maderich

The 3D model THREETOX was applied for the long-term simulation of the planned release of radioactively contaminated water from the Fukushima storage tanks into the marine environment. Two radionuclides were considered: 3H, which has the largest activity in the tanks, and 129I, which can cause the largest radiation dose to humans. A constant release rate of 3H of 22 TBq/y, according to TEPCO estimations, and a constant release rate of 129I of 361 MBq/y, according to estimations from the current study, were used in the simulations.

The THREETOX model used monthly averaged currents from the KIOST-MOM model. A dynamic food-web model was included in THREETOX: organisms take up activity directly from water and through the food chain, which consists of phytoplankton, zooplankton, non-piscivorous (prey) fish, and piscivorous (predatory) fish. In the case of 129I, macro-algae were also considered. The modelling area covers Fukushima coastal waters and extends 1600 km eastward from the coast and 1300 km from north to south.

The model results show how contamination will spread along the coast in different seasons. For example, in summer the currents near the coast are directed northward, which leads to contamination of Sendai Bay. This means that at different points along the coast, the concentration of radionuclides can change periodically with the currents during the year. Calculated activity concentrations at several points along the coast of Japan, corresponding to the largest cities in the area of interest, were extracted from the model results. For example, the calculated concentration of 3H in water at the Tomioka point, which is quite close to the FDNPP, can at times exceed 200 Bq/m3. At the Soma point the concentration will exceed 50 Bq/m3, and at the Iwaki-Onahama point 20 Bq/m3, at some moments in time. At the other points, the calculated concentration of 3H in water will not exceed 10 Bq/m3, which is below the background concentration of 50 Bq/m3. Concerning 129I, its maximum concentration in water will be around 10-3 – 10-2 Bq/m3 at points close to the FDNPP and around 10-4 Bq/m3 at points further from the NPP, which is around 100 000 times less than the calculated concentrations of 3H.

Calculated concentrations of OBT (organically bound tritium) in predatory and prey fish are less than 0.01 Bq/kg at all points except the FDNPP point, where they are around 0.02 Bq/kg. This value is 10 times less than the OBT concentration in fish (0.2 Bq/kg) measured in 2014 in the coastal area near the damaged NPP. Calculated concentrations of 129I in predatory and prey fish are in the range 10-6 – 10-4 Bq/kg at all considered points. Concentrations of 129I in macro-algae are about 100 times higher due to the ability of iodine to accumulate in macro-algae. 

How to cite: Bezhenar, R., Takata, H., and Maderich, V.: Transport of H-3 and I-129 in water and their uptake by marine organisms due to the planned release of Fukushima storage water, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6019, https://doi.org/10.5194/egusphere-egu23-6019, 2023.

EGU23-6026 | Orals | GI2.2

Dynamic change of dissolved Cs-137 from headwaters to downstream in the Kuchibuto River catchment 

Yuichi Onda, Taichi Kawano, Keisuke Taniguchi, and Junko Takahashi

The Fukushima Daiichi Nuclear Power Plant (FDNPP) accident on March 11, 2011 resulted in the release of large amounts of radioactive cesium-137 (137Cs) into the environment. It is important to characterize 137Cs dynamics throughout a river system, from the headwaters to the downstream reaches. Previous studies have suggested that dissolved 137Cs derives mainly from organic matter in small watersheds and from suspended solids in large watersheds. Since the concentration of suspended-form 137Cs has been shown to decrease significantly after decontamination in evacuated areas (Feng et al. 2022), this rapid decrease can be used to identify the source of dissolved 137Cs. Therefore, we attempted to evaluate whether the dissolved 137Cs was derived from organic matter or from suspended solids by comparing data before and after decontamination.

The objective of this study is to compare the decreasing trends of 137Cs concentrations in decontaminated and undecontaminated areas based on long-term monitoring of 137Cs concentrations in suspended solids, the dissolved phase, and coarse organic matter since 2011. The study area includes four headwater basins and four river basins (eight sites in total) in the Kuchibuto River watershed in the Yamakiya district of Fukushima Prefecture, located approximately 35 km northwest of the FDNPP.

In the Kuchibuto River watershed, a large inflow of decontaminated soil with low 137Cs concentrations, due to the increase in bare land caused by decontamination, resulted in a rapid decrease in the concentration of suspended-form 137Cs in the decontaminated area in the headwaters and in the upper reaches of the river. However, no clear effect of decontamination was observed in the concentrations of dissolved 137Cs or of 137Cs in coarse organic matter. Comparison of the slopes of the 137Cs concentration trends in the suspended, dissolved, and coarse organic matter fractions showed that the slope of the dissolved form was similar to that of the coarse organic matter in the headwater basins and similar to that of the suspended solids in the downstream basins. These results suggest that the contribution to dissolved 137Cs from organic matter is significant in small watersheds, and that from suspended solids in large watersheds.

How to cite: Onda, Y., Kawano, T., Taniguchi, K., and Takahashi, J.: Dynamic change of dissolved Cs-137 from headwaters to downstream in the Kuchibuto River catchment, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6026, https://doi.org/10.5194/egusphere-egu23-6026, 2023.

EGU23-10093 | Posters on site | GI2.2

Riverine 137Cs dynamics and remobilization in coastal waters during high flow events 

Yoshifumi Wakiyama, Hyoe Takata, Keisuke Taniguchi, Takuya Niida, Yasunori Igarashi, and Alexei Konoplev

Understanding riverine 137Cs dynamics during high-flow events is crucial for improving the predictability of 137Cs transport and the relevant hydrological responses. It is frequently documented that the majority of 137Cs is exported during high-flow events triggered by intensive rainfall. Studies of 137Cs in coastal seawater suggest that huge high-flow events resulted in high dissolved 137Cs concentrations in seawater. Different temporal patterns of 137Cs concentrations in river water are found in the existing literature on 137Cs dynamics during high-flow events. Although such differences may reflect catchment characteristics, there has been no comprehensive analysis of these relationships. This study explores the catchment characteristics affecting 137Cs transport via rivers to the ocean, based on datasets obtained by sampling campaigns during high-flow events. 137Cs datasets obtained at 13 points in 6 river systems were subject to the analysis. The analyses explored the relationships between catchment characteristics (scale and land-use composition) and 137Cs dynamics in terms of variations in concentrations, fluxes, and potential remobilization in seawater. We did not find significant correlations between the parameters of catchment characteristics and the mean values of normalized 137Cs concentrations or apparent Kd. However, when approximating the 137Cs concentrations and Kd values as a power function of suspended solid concentration (Y = α X^β), the exponent β in the equations for dissolved 137Cs concentration and Kd showed negative and positive correlations, respectively, with the logarithm of the catchment area; positive β was found when the catchment area was on the order of 100 km2 or larger, and vice versa. This indicates that the concentration of dissolved 137Cs tends to decrease with increasing water discharge in larger catchments and to increase in smaller ones. 
These results suggest that the temporal pattern of dissolved 137Cs concentrations depends on the watershed scale. The 137Cs flux during a single event ranged from 1.9 GBq to 1.1 TBq and accounted for 0.00074% to 0.22% of the total 137Cs deposited in the relevant catchments. Particulate 137Cs flux accounted for more than 92% of the total 137Cs flux, except for the Ukedo River basin with its large dam reservoir. The R-factor, an erosivity index in the Universal Soil Loss Equation model family, is a good parameter for reproducing sediment discharge and particulate 137Cs flux. The efficiency of particulate 137Cs flux, calculated by dividing the flux by the R-factor of the event, tended to be high in catchments with relatively low forest cover. The desorption ratio of 137Cs, obtained in a 1-day experiment shaking suspended solids in seawater, ranged from 2.8 to 6.6% and was almost proportional to the ratio of exchangeable 137Cs. The estimated amounts of desorbed 137Cs, obtained by multiplying the particulate 137Cs flux by the desorption ratios, were greater than the direct flux of dissolved 137Cs. Reanalysis of riverine 137Cs datasets from high-flow events is revealing the relationship between catchment characteristics and 137Cs dynamics. Further analyses, such as evaluation of decontamination impacts and inter-catchment comparisons of 137Cs fluxes, are required for better understanding.
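
The power-function approximation described above can be sketched as a straight-line fit in log-log space. This is only an illustration of the fitting procedure; the suspended-solid concentrations and Kd values below are synthetic, not the study's data.

```python
import numpy as np

# Fit Y = alpha * X**beta by linear least squares on log-transformed data:
# ln(Y) = ln(alpha) + beta * ln(X).
ss = np.array([5.0, 20.0, 80.0, 300.0])      # suspended solids, mg/L (synthetic)
kd = np.array([1.2e5, 2.0e5, 3.3e5, 5.6e5])  # apparent Kd, L/kg (synthetic)

beta, log_alpha = np.polyfit(np.log(ss), np.log(kd), 1)
alpha = np.exp(log_alpha)
print(f"alpha = {alpha:.3g}, beta = {beta:.3f}")
# A positive beta here would correspond to Kd increasing with suspended
# solid concentration, as reported for catchments of ~100 km2 or larger.
```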

How to cite: Wakiyama, Y., Takata, H., Taniguchi, K., Niida, T., Igarashi, Y., and Konoplev, A.: Riverine 137Cs dynamics and remoralization in coastal waters during high flow events, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10093, https://doi.org/10.5194/egusphere-egu23-10093, 2023.

EGU23-10539 | Posters on site | GI2.2 | Highlight

Long-term dynamics of 137Cs accumulation at an urban pond 

Honoka Kurosawa, Kenji Nanba, Toshihiro Wada, and Yoshifumi Wakiyama

It is known that semi-enclosed water bodies such as ponds and dam reservoirs are readily subject to 137Cs accumulation because of secondary inflow from the catchment area. We present long-term monitoring data of the 137Cs concentration in bottom sediment and pond water of an urban pond located in the central area of Koriyama City, Fukushima Prefecture, to discuss the 137Cs dynamics of the urban pond. The pond was decontaminated by bottom-sediment removal in 2017. Bottom sediment cores and pond water were collected in 2015 and 2018-2021, inflow and outflow water in 2020-2021, and river water around the pond in 2021. The bottom sediment and water samples were measured for 137Cs concentration, particle size distribution, and N and C stable isotopes. Between 2015 and 2018, the 137Cs inventory and the 137Cs concentration at 0-10 cm depth in the bottom sediment at 7 points decreased by 81 % (mean 1.50 to 0.28 MBq/m2) and 85 % (mean 31.5 to 4.8 kBq/kgDW), respectively. Although the mean 137Cs inventory in the bottom sediment did not change drastically during 2018-2021, its variability became wider. Points with increased 137Cs inventory in the bottom sediment showed a year-by-year increase in the thickness of the layer with concentrations higher than 8 kBq/kgDW, the criterion considered for decontamination. The 137Cs concentration in suspended solids (SS) in the pond water was lower after decontamination, although it still remained above 8 kBq/kgDW. The 137Cs concentrations in SS of the inflow water were also high, exceeding 8 kBq/kgDW. The 137Cs concentration in SS of the river water around the pond was higher after it passed through the urban area, suggesting that the inflow of particles of urban origin maintained the high 137Cs level in the pond. 

How to cite: Kurosawa, H., Nanba, K., Wada, T., and Wakiyama, Y.: Long-term dynamics of 137Cs accumulation at an urban pond, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10539, https://doi.org/10.5194/egusphere-egu23-10539, 2023.

EGU23-10868 | Posters on site | GI2.2

Estimation of annual Cesium-137 influx from the FDNPP to the coastal water 

Shun Satoh and Hyoe Takata

Due to the accident at the Fukushima Daiichi Nuclear Power Plant (1F) in March 2011, radionuclides were introduced into the environment; one of the release pathways to the ocean is direct discharge from the 1F site (on-going release). This occurred mainly immediately after the accident, but the on-going release still continues. In this study, we first estimated the on-going release of 137Cs from 1F over the 10 years after the accident, using TEPCO's 137Cs monitoring results in the coastal area around 1F. Secondly, the monitoring data changed in relation to TEPCO's countermeasures (e.g., construction of ice walls) to reduce the introduction of contaminated water into the ocean, so the effects of these countermeasures on the on-going release estimate were also discussed. A box model comprising the inside and outside of the port was assumed for the area around 1F, and the amount of 137Cs in each box was estimated (estimated value: modeled data). Then, the difference between the estimated value and the amount of 137Cs obtained from the actually observed concentrations (measured value: monitoring data) was calculated. The result showed that the measured value was higher than the estimated value, indicating on-going release from 1F. As for the decrease in monitoring data after the countermeasures, it is implied that the estimated rate of on-going release has been reduced by the countermeasures.
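
The box-model logic described above can be sketched as a one-box mass balance: the inventory is integrated from known inputs and losses, and an observed inventory exceeding the modeled one points to an unaccounted on-going release. All numbers below are hypothetical illustrations, not TEPCO monitoring data or the study's actual model.

```python
import math

# 137Cs decay constant (1/day), from a ~30.08-year half-life
LAMBDA = math.log(2) / (30.08 * 365)

def modeled_inventory(a0, k_exchange, days, release_rate=0.0):
    """Forward-Euler integration of dA/dt = release - (k_exchange + lambda)*A.

    a0: initial 137Cs inventory in the box (Bq)
    k_exchange: water-exchange loss rate with the open ocean (1/day)
    release_rate: assumed direct release into the box (Bq/day)
    """
    a = a0
    for _ in range(days):
        a += release_rate - (k_exchange + LAMBDA) * a
    return a

# With no release term, the modeled inventory decays away quickly...
a_model = modeled_inventory(a0=1e12, k_exchange=0.2, days=365)
# ...so a hypothetical observed inventory well above it implies an
# unexplained on-going source.
a_obs = 2.5e11  # Bq, from hypothetical monitored concentrations
print(f"unexplained inventory: {a_obs - a_model:.3g} Bq")
```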

How to cite: Satoh, S. and Takata, H.: Estimation of annual Cesium-137 influx from the FDNPP to the coastal water, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10868, https://doi.org/10.5194/egusphere-egu23-10868, 2023.

EGU23-11671 | Posters on site | GI2.2

Changes in Cs-137 concentrations in river-bottom sediments and their factors in Fukushima Prefecture rivers 

Naoyuki Wada, Yuichi Onda, Xiang Gao, and Chen Tang

The Fukushima Daiichi Nuclear Power Plant (FDNPP) accident in 2011 resulted in the release of large amounts of Cs-137 into the atmosphere. Cs-137 deposited on land was mainly distributed in forests, but some of it has been discharged to the sea through rivers. The dissolved and suspended forms of Cs-137 in rivers have been the focus of attention, and it is known that the discharge mechanism and concentration formation of Cs-137 differ depending on the land use in the river basin. On the other hand, few studies have focused on the dynamics of Cs-137 in river-bottom sediments. River-bottom sediment is less likely to flow downstream than suspended sediment, so contamination in the downstream area may be long-lasting.
We aim to clarify the migration mechanism of Cs-137 in rivers, including river-bottom sediment. To this end, we analyzed data collected from 2011 to 2018 in 89 watersheds in Fukushima Prefecture. In analyzing the data, we removed sampling points with brackish water using electrical conductivity and corrected for particle size to standardize the surface area of the particles that adsorb Cs-137. As a result, it was found that, unlike the dissolved and suspended forms, the Cs concentration in river-bottom sediments can increase within the initial year. This is related to the average initial deposition in the watershed and the initial deposition at the river-bottom sediment sampling sites, with a tendency to increase where the initial deposition was relatively higher in the upstream area. It is also known that the decrease in suspended Cs concentration was more pronounced where anthropogenic activities in the watershed were more active, but there was no clear relationship between land use in the watershed and changes in river-bottom sediment Cs concentration. This indicates that suspended-sediment Cs concentrations are controlled by the initial deposition on suspended-sediment source areas, whereas river-bottom Cs concentrations are controlled by multiple factors such as sediment traction and Cs supply from river water.

How to cite: Wada, N., Onda, Y., Gao, X., and Tang, C.: Changes in Cs-137 concentrations in river-bottom sediments and their factors in Fukushima Prefecture rivers, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11671, https://doi.org/10.5194/egusphere-egu23-11671, 2023.

EGU23-12670 | ECS | Orals | GI2.2

Minimizing the loss of radioactively contaminated sediment from the Niida watershed (Fukushima, Japan) through spatially targeted afforestation. 

Floris Abrams, Lieve Sweeck, Johan Camps, Grethell Castillo-Reyes, Bin Feng, Yuichi Onda, and Jos Van Orshoven

Government-led decontamination of agricultural land in the region affected by the Fukushima accident (2011) has lowered the on-site radiation risk considerably. From 2013 to early 2017, 11.9% of the land in the disaster-affected Niida watershed in Japan was remediated through topsoil removal. However, this resulted in a 237.1% increase in suspended sediment loads in the river in 2016 compared to 2013. In contrast, sediment loads decreased by 41% from 2016 to 2017; this can be attributed to the effect of natural vegetation restoration on sediment yield and transfer patterns (Feng et al., 2022). Since radiocaesium binds firmly to the clay minerals in the soil, it is inevitably transported along with the sediments downstream to the river systems. These observations confirm that rapid, spatially targeted interventions, such as revegetation, e.g., through afforestation, have the potential to decrease the magnitude and duration of increased exports of contaminated sediments. The CAMF tool (Cellular Automata-based Heuristic for Minimizing Flow) (Vanegas et al., 2012) was originally designed to find the cells in a raster representation of a watershed for which afforestation would lead to a maximal reduction of sediment exports with minimal effort or cost, while taking sediment flow from cell to cell into account. In our research, we adapted the CAMF tool to account for the radiocaesium budgets associated with the transported sediments. We applied the approach to the Niida catchment, where land-cover changes in upstream decontaminated regions are detected using drone imagery and linked to increased sediment loads in the Niida river using long-term river monitoring systems. For example, in 2014, agricultural land (18.02 km2) was one of the major land uses in the regions where decontamination was ordered, resulting in increased sediment loads from 2014 to 2016. 
By recognizing both the on- and off-site impacts of the remediation interventions and their temporal dynamics, the modified CAMF tool offers scope for supporting the formulation of spatio-temporal schemes for the remediation of agricultural land. These schemes aim to decrease the radiation risk for downstream communities and minimize the potential recontamination of already decontaminated sites.

How to cite: Abrams, F., Sweeck, L., Camps, J., Castillo-Reyes, G., Feng, B., Onda, Y., and Van Orshoven, J.: Minimizing the loss of radioactively contaminated sediment from the Niida watershed (Fukushima, Japan) through spatially targeted afforestation., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12670, https://doi.org/10.5194/egusphere-egu23-12670, 2023.

EGU23-13366 | Orals | GI2.2

Similarity of long-term temporal decrease in atmospheric Cs-137 between Chernobyl and Fukushima 

Kentaro Akasaki, Shu Mori, Eiichi Suetomi, and Yuko Hatano

We compare the atmospheric concentrations of Cs-137 in the decade and more after the Chernobyl and Fukushima accidents. We plotted 8 datasets on log-log axes (5 cases in Chernobyl and 3 cases in Fukushima) and found that they appear to follow a single function.

The atmospheric concentration has been measured for more than 30 years after the Chernobyl accident [1]. On the other hand, several teams of Japanese researchers have made measurements in Fukushima and its vicinity for almost 10 years [2][3]. In this study, we compare 5 sites in Chernobyl (Pripyat, Chernobyl, Baryshevka, Kiev, and Polesskoe) and 3 sites in Fukushima (FDNPP O-6 and O-7, Univ. Fukushima).

We adjust the magnitude of the data because it depends on the amount of the initial deposition. After the adjustment, we plot the 8 cases on a log-log plot and find that they collapse onto a single curve with a power-law index of -1.6. Namely,

C(t) ~ t^{-1.6}.               …(1)

Incidentally, we have proposed a formula which reproduces the long-term behavior of the atmospheric concentration at a fixed location as

C(t) = A exp(-bt) t^{-4/3}    …(2)

where A is a parameter related to the amount of the initial deposition and b is the total rate of all the first-order reactions (including the radioactive decay rate, the vegetation uptake rate, the runoff rate, etc.). We will investigate the difference between the power-law indices in Eqs. (1) and (2). The parameter b is highly dependent on the environment. With a proper value of b, the apparent decrease of the concentration deviates from t^{-4/3}, and the apparent power-law index can become close to -1.6.
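
The deviation argument can be made concrete: for Eq. (2), the local log-log slope is d ln C / d ln t = -4/3 - b t, so a nonzero b steepens the apparent decay over time. The sketch below uses an illustrative value of b, not a fitted one.

```python
# Apparent (local) power-law index of C(t) = A exp(-b t) t^(-4/3):
# ln C = ln A - b*t - (4/3) ln t, so d ln C / d ln t = -4/3 - b*t.
def apparent_index(t_years: float, b_per_year: float) -> float:
    return -4.0 / 3.0 - b_per_year * t_years

b = 0.01  # illustrative total first-order removal rate, 1/year
for t in (1, 10, 27):
    print(f"t = {t:2d} y: apparent index = {apparent_index(t, b):.3f}")
# The slope starts near -4/3 and reaches about -1.6 at
# t = (1.6 - 4/3)/b, i.e. roughly 27 years for this b.
```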

 

[1] E. K. Garger, et al., J. Env. Radioact., 110 (2012) 53-58.

[2] A. Watanabe, et al., Atmos. Chem. Phys. 22 (2022) 675-692.

[3] T. Abe, K. Yoshimura, Y. Sanada, Aerosol and Air Quality Research, 21 (2021) 200636.

How to cite: Akasaki, K., Mori, S., Suetomi, E., and Hatano, Y.: Similarity of long-term temporal decrease in atmospheric Cs-137 between Chernobyl and Fukushima, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13366, https://doi.org/10.5194/egusphere-egu23-13366, 2023.

EGU23-13486 | ECS | Posters virtual | GI2.2

Distributions of tritium in the marine water and biota around Rokkasho Reprocessing Plant 

Satoru Ohtsuki, Yuhei Shirotani, and Hyoe Takata

For the decommissioning of the Fukushima Daiichi Nuclear Power Station (FDNPS), one of the biggest problems is treating the radioactively contaminated stagnant water in the buildings. It is difficult to remove H-3 from the contaminated water by the Advanced Liquid Processing System (ALPS) treatment alone. Thus, the Japanese Government announced the release of the ALPS-treated water containing H-3. To predict the change in the dose rate of marine biota caused by the change in the H-3 concentration in marine water after the release of ALPS water, it is necessary to understand the dynamics of H-3 in the marine ecosystem. In this study, we examined the behavior of H-3 in the marine environment (water and biota) off Aomori and Iwate prefectures from FY2003 to FY2012 as background data for the Pacific Ocean along the coast of northeast Japan. To clarify the dynamics of H-3 in marine biota, we compared H-3 with Cs-137. Excluding the period of the intermittent test operation of the Rokkasho Reprocessing Plant (FY2006-FY2008), the concentrations of H-3 in seawater, tissue-free water tritium (TFWT), and organically bound tritium (OBT) were 0.052-0.20 Bq/L with a mean of 0.12±0.031 Bq/L, 0.050-0.34 Bq/kg-wet with a mean of 0.11±0.039 Bq/kg-wet, and 0.0070-0.099 Bq/kg-wet with a mean of 0.042±0.019 Bq/kg-wet, respectively. Before the FDNPS accident (FY2003-FY2010), the Cs-137 concentrations in seawater and marine biota were 0.00054-0.0027 Bq/L with a mean of 0.0016±0.00041 Bq/L and 0.022-1.8 Bq/kg-wet with a mean of 0.090±0.037 Bq/kg-wet, respectively. The concentration ratio (CR), the ratio of the concentration in marine biota to that in seawater, was 0.34-2.37 for TFWT with a mean of 0.97±0.31 across all species, meaning the concentration in marine biota was almost equal to that in seawater. For Cs-137, CR was 46-78 with a mean of 56±22. We compared the CRs for TFWT of Gadus macrocephalus, Lophius litulon, and Oncorhynchus keta with those for Cs-137. 
Comparing CR-TFWT and CR-Cs-137 for these three species, Spearman's R was <0.4 and p was >0.05, indicating that the dynamics of TFWT and Cs-137 in the marine ecosystem are decoupled.
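
The concentration ratio used above is a simple quotient; the sketch below illustrates it with one hypothetical TFWT pair and one Cs-137 pair built from the mean values quoted in the abstract, not actual sample measurements.

```python
# Concentration ratio (CR) = concentration in biota / concentration in
# ambient seawater (units: (Bq/kg-wet) / (Bq/L)).
def concentration_ratio(c_biota: float, c_seawater: float) -> float:
    return c_biota / c_seawater

# Hypothetical TFWT pair: 0.12 Bq/kg-wet in fish, 0.12 Bq/L in seawater
cr_tfwt = concentration_ratio(0.12, 0.12)
# Hypothetical Cs-137 pair: 0.090 Bq/kg-wet in fish, 0.0016 Bq/L in seawater
cr_cs = concentration_ratio(0.090, 0.0016)
print(f"CR(TFWT) = {cr_tfwt:.2f}, CR(Cs-137) = {cr_cs:.1f}")
# CR near 1 for TFWT (biota ~ seawater) versus ~56 for Cs-137 mirrors
# the contrast reported in the abstract.
```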

How to cite: Ohtsuki, S., Shirotani, Y., and Takata, H.: Distributions of tritium in the marine water and biota around Rokkasho Reprocessing Plant, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13486, https://doi.org/10.5194/egusphere-egu23-13486, 2023.

EGU23-15515 | Posters on site | GI2.2

137Cs transport flux to surface water due to shallow groundwater discharge from forest hillslope 

Yuma Niwano, Hiroaki Kato, Satoru Akaiwa, Donovan Anderson, Hikaru Iida, Miyu Nakanishi, Yuichi Onda, Hikaru Sato, and Tadafumi Niizato

Groundwater systems and surface water can interact in a complex manner that influences catchment discharge, and this interaction becomes even more complex on forest slopes. A large amount of the radiocesium (137Cs) deposited on forests by the Fukushima Daiichi Nuclear Power Plant accident remains in terrestrial environments and is transported downstream in suspended or dissolved form by surface water. Generally, the concentration of dissolved 137Cs in surface water increases especially during runoff. While the leaching behavior of 137Cs from contaminated forest materials and soils to surface water has been heavily studied, the influence of the 137Cs concentration in shallow groundwater systems on forest slopes has not been investigated. Detailed hydrological observations of groundwater on a forest hillslope therefore enable quantitative analysis of the influence of groundwater flow on the formation of dissolved 137Cs concentrations in surface water during base flow and during runoff. Our results showed that the dissolved 137Cs concentration in surface water increases during discharge events. The average concentration of dissolved 137Cs in shallow groundwater was 0.64 Bq/L, higher than that in surface water (average 0.10 Bq/L). Furthermore, part of the shallow groundwater on the slope was observed to move toward the river channel during runoff. This suggests that shallow groundwater may have flowed into the surface water during runoff and contributed to the increase of 137Cs in the surface water. In this study, the contribution of groundwater on forest slopes to the dissolved 137Cs concentration in surface water was estimated using the hydraulic gradient distribution of groundwater on the slopes and the measured dissolved 137Cs concentration in groundwater.
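
One simple way to frame the groundwater contribution suggested above is a two-end-member mixing sketch: if surface water during an event is a mix of baseflow and shallow groundwater, the groundwater fraction follows from C_mix = f·C_gw + (1-f)·C_base. The end-member values are the means quoted in the abstract; the event concentration and the mixing assumption itself are hypothetical, not the study's estimation method.

```python
# Two-end-member mixing: solve C_mix = f*C_gw + (1-f)*C_base for f.
def groundwater_fraction(c_mix: float, c_gw: float = 0.64,
                         c_base: float = 0.10) -> float:
    """Fraction of shallow groundwater in surface water (dissolved 137Cs, Bq/L)."""
    return (c_mix - c_base) / (c_gw - c_base)

c_event = 0.20  # hypothetical dissolved 137Cs during a runoff event, Bq/L
f = groundwater_fraction(c_event)
print(f"inferred shallow-groundwater fraction: {f:.2f}")
```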

How to cite: Niwano, Y., Kato, H., Akaiwa, S., Anderson, D., Iida, H., Nakanishi, M., Onda, Y., Sato, H., and Niizato, T.: 137Cs transport flux to surface water due to shallow groundwater discharge from forest hillslope, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15515, https://doi.org/10.5194/egusphere-egu23-15515, 2023.

Understanding, modeling and predicting the future of the Earth system in response to global change is a challenge for the Earth system science community, but also a necessity to address pressing societal needs related to the UN Sustainable Development Goals and to risk monitoring and prediction. These “wicked” environmental problems require integrated modeling tools, which will only provide reliable answers if they integrate all existing multi-disciplinary data sources. Open science and data sharing following the FAIR (Findable, Accessible, Interoperable, Reusable) principles provide the framework for such sharing. However, when trying to put it into practice, we face a highly fragmented landscape, with different communities having developed their own data management systems, standards and tools.

When starting to work on the Theia/OZCAR Information System (IS), which aims to facilitate the discovery of, and make FAIR, in-situ data on continental surfaces collected by French research organizations and their foreign partners, we performed a “Tour de France” to understand what critical zone science users need when searching for data. The common criterion that emerged was variable names. We believe this need is common to all disciplines involved in Earth system sciences, and it is all the more important when data are searched by scientists from other disciplines who are not familiar with the vocabularies of the communities that produced them. The aim of this abstract is to share our experience in building tools that harmonize and share variable names following the FAIR principles.

In the Theia/OZCAR critical zone research community, the long-term observatories that produce the data have heterogeneous data description practices and variable names. Names may differ for the same variable (e.g., "soil moisture", "soil water content", "humidité des sols"), and it is not possible to infer similarities between these variable names automatically or semi-automatically. In order to identify these similarities and implement data discovery functionalities on these dimensions in the IS, we built the Theia/OZCAR variable thesaurus. To enable technical interoperability, the thesaurus is published on the web using the SKOS vocabulary description standard. Other thesauri used in environmental sciences in Europe and worldwide have been identified, and defining associative relationships with these vocabularies ensures the semantic interoperability of the Theia/OZCAR thesaurus. However, it is quite common that the variable names used for the search dimensions remain general (e.g., "soil moisture") and are not specific enough for the end user to interpret exactly what was measured (e.g., "soil moisture at 10 cm depth measured by TDR probe"). Therefore, to improve data reuse and interoperability, the thesaurus now follows a recommendation of the Research Data Alliance and implements the I-ADOPT framework to describe the variables more precisely. Each variable is composed of, and described by, relationships with atomic concepts whose definitions are specified. The use of these atomic concepts enhances interoperability with other catalogues and services and contributes to the reuse of the data by communities other than those who collected them.
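The synonym harmonization described above can be illustrated with SKOS-style labelled triples; this is a minimal sketch, and the concept URIs below are hypothetical rather than actual Theia/OZCAR identifiers (a production system would use an RDF library and real concept schemes).

```python
# Minimal sketch of SKOS-style triples for a variable concept with
# multilingual synonyms and an associative link to another vocabulary.
# All URIs are hypothetical placeholders.

SKOS = "http://www.w3.org/2004/02/skos/core#"
concept = "https://example.org/theia/variables/soilMoisture"  # hypothetical URI

triples = [
    (concept, SKOS + "prefLabel", ("soil moisture", "en")),
    (concept, SKOS + "altLabel", ("soil water content", "en")),
    (concept, SKOS + "altLabel", ("humidité des sols", "fr")),
    # associative mapping to a concept in another (hypothetical) vocabulary
    (concept, SKOS + "exactMatch", "https://example.org/other-vocab/soil_moisture"),
]

def labels(triples, lang):
    """Collect all preferred and alternative labels in a given language."""
    return [obj[0] for (_, pred, obj) in triples
            if isinstance(obj, tuple) and obj[1] == lang
            and pred.endswith(("prefLabel", "altLabel"))]
```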

How to cite: Braud, I., Coussot, C., Chaffard, V., and Galle, S.: Theia/OZCAR Thesaurus: a terminology service to facilitate the discovery, interoperability and reuse of data from continental surfaces and critical zone science in interdisciplinary research, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1099, https://doi.org/10.5194/egusphere-egu23-1099, 2023.

EGU23-1294 | Posters on site | GI2.3

A data integration system for ocean climate change research in the Northwest Pacific 

Sung Dae Kim, Hyuk Min Park, Young Shin Kwon, and Hyeon Gyeong Han

A data integration and processing system was established to provide long-term and real-time data to researchers interested in the long-term variation of ocean data in the Northwest Pacific (NWP) area. All available ocean data for six variables (ocean temperature, salinity, dissolved oxygen, ocean CO2, nutrients) in the NWP area (0°N - 65°N, 95°E - 175°E) were collected from Korean domestic organizations (KIOST, NFIS, KHOA, KOEM), international data systems (WOD, GTSPP, SeaDataNet, etc.), and international observation networks (Argo, GOSHIP, GLODAP, etc.). In total, over 5 million data points were collected, with observation dates from 1938 to 2022. After consulting several QC manuals and related papers, QC procedures and test criteria for the data items were determined and documented. Several Matlab programs implementing these QC procedures were developed and used to check the quality of all collected data. We excluded duplicated data from the data set and saved the results in 0.25° grid data files. The long-term average over 40 years and the standard deviation of the data at each standard depth and grid point were calculated. All quality-controlled data, QC flags, averages, and standard deviations of each ocean variable are saved in netCDF format and provided to ocean climate researchers and numerical modelers. We also have two plans for the collected data from 2023 to 2025: one is the production of a long-term gridded data set focused on the NWP area, and the other is the development of a data service system providing observation data and reanalysis data together.
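The gridding step described above can be sketched as follows: bin quality-controlled observations into 0.25° cells and compute a per-cell mean and standard deviation. The observations below are synthetic examples, not data from the system.

```python
# Sketch of binning observations into 0.25-degree grid cells and
# computing per-cell mean and (population) standard deviation.
import math
from collections import defaultdict

GRID = 0.25  # grid spacing in degrees

def cell(lat, lon):
    """Index of the 0.25-degree grid cell containing (lat, lon)."""
    return (math.floor(lat / GRID), math.floor(lon / GRID))

def grid_stats(observations):
    """observations: iterable of (lat, lon, value).
    Returns {cell_index: (mean, std)}."""
    bins = defaultdict(list)
    for lat, lon, value in observations:
        bins[cell(lat, lon)].append(value)
    out = {}
    for key, values in bins.items():
        mean = sum(values) / len(values)
        var = sum((v - mean) ** 2 for v in values) / len(values)
        out[key] = (mean, math.sqrt(var))
    return out

# Synthetic example: two temperature observations in the same cell, one elsewhere.
obs = [(35.1, 129.1, 15.0), (35.2, 129.2, 17.0), (10.0, 140.0, 28.0)]
stats = grid_stats(obs)
```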

Acknowledgement: This research was supported by the Korea Institute of Marine Science & Technology Promotion (KIMST), funded by the Ministry of Oceans and Fisheries (KIMST-20220033)

How to cite: Kim, S. D., Park, H. M., Kwon, Y. S., and Han, H. G.: A data integration system for ocean climate change research in the Northwest Pacific, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1294, https://doi.org/10.5194/egusphere-egu23-1294, 2023.

EGU23-1599 | Posters on site | GI2.3

Overview of the services provided to marine data producers by ODATIS, the French ocean data center 

Sabine Schmidt, Erwann Quimbert, Marine Vernet, Joël Sudre, Caroline Mercier, Dominique Obaton, Jean-François Piollé, Frédéric Merceur, Gérald Dibarboure, and Gilbert Maudire

The consequences of global change on the ocean are multiple: increases in temperature and sea level, stronger storms, deoxygenation, and impacts on ecosystems. But detecting these changes and impacts remains difficult because of the diversity and variability of marine environments. While there has been a clear increase in the number of marine and coastal observations, whether from in situ, laboratory or remote sensing measurements, each data point is both costly to acquire and unique. The number and variety of data acquisition techniques require efficient ways of improving data availability via interoperable portals, which facilitate data sharing according to the FAIR principles for producers and users. ODATIS, the ocean cluster of Data Terra, the French research infrastructure for Earth data, is the entry point for accessing all French ocean observation data (Ocean Data Information and Services; www.odatis-ocean.fr/en/). The first challenge of ODATIS is to get data producers to share data. To that end, ODATIS offers several services to help them define a Data Management Plan (DMP), implement the FAIR principles, make data more visible and accessible by being referenced in the ODATIS catalog, and make data better tracked and cited through a Digital Object Identifier (DOI). ODATIS also offers a service for publishing open scientific data on the sea through SEANOE (www.seanoe.org), which provides a DOI that can be cited in scientific articles in a reliable and sustainable way. In parallel to the technical development of the ocean cluster, further communication and training are needed to inform the research community of these new tools. Through technical workshops, ODATIS offers data providers practical experience and support in implementing data access, visualization and processing services.
Finally, ODATIS relies on scientific consortia to promote and develop innovative processing methods and products for remote, airborne, or in situ observations of the ocean and its interfaces (atmosphere, coastline, seafloor), together with the other clusters of the RI Data Terra.

How to cite: Schmidt, S., Quimbert, E., Vernet, M., Sudre, J., Mercier, C., Obaton, D., Piollé, J.-F., Merceur, F., Dibarboure, G., and Maudire, G.: Overview of the services provided to marine data producers by ODATIS, the French ocean data center, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1599, https://doi.org/10.5194/egusphere-egu23-1599, 2023.

EGU23-5626 | Orals | GI2.3

An integration of digital twin technology, GIS and VR for the service of environmental sustainability 

Chen Wang, David Miller, Alessandro Gimona, Maria Nijnik, and Yang Jiang

A digital twin is a digital representation of a real-world physical product, system, or process. Digital twins potentially offer a much richer capability to model and analyze real-world systems and improve environmental sustainability.

In this work, an integrated 3D GIS and VR model for scenario modelling and interactive data visualisation has been developed and implemented through digital twin technology at the Glensaugh research farm. Spatial multi-criteria analysis has been applied to decide where to plant new woodlands, recognizing a range of land-use objectives while acknowledging concerns about possible conflicts with other uses of the land. The virtual contents (e.g., forest spatial datasets, monitored climate data, analyzed carbon stocks and the natural capital asset index) have been embedded in the virtual landscape model, which helps raise public awareness of changes in rural areas.

The digital twin prototype for Glensaugh Climate-Positive Farming was used at the 2021 STFC workshop, GISRUK 2022, and the 2022 Royal Highland Show; it provides an innovative framework that integrates spatial data modelling, analytical capabilities and immersive visualization.

Audience feedback suggested that the virtual environment was very effective in providing a more realistic impression of the different land-use and woodland expansion scenarios and of environmental characteristics. This suggests considerable added value from using digital twin technology to better deal with the complexity of data analysis and scenario simulation and to enable rapid interpretation of solutions.

Findings show that this method has a potential impact on future woodland planning and enables rapid interpretation of forest and climate data, which increases the effectiveness of their use and their contribution to a more sustainable environment.

How to cite: Wang, C., Miller, D., Gimona, A., Nijnik, M., and Jiang, Y.: An integration of digital twin technology, GIS and VR for the service of environmental sustainability, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-5626, https://doi.org/10.5194/egusphere-egu23-5626, 2023.

EGU23-5866 | ECS | Posters on site | GI2.3

Mapping and Analysis of Anthrax Cases in Humans and Animals 

Tamar Chichinadze, Zaza Gulashvili, Nana Bolashvili, Lile Malania, and Nikoloz Suknidze

Anthrax is a rare but serious disease caused by the gram-positive, rod-shaped bacterium Bacillus anthracis, a toxin-producing, encapsulated, facultative anaerobic organism. Anthrax occurs naturally in the soil and mainly harms livestock and wildlife, but it can cause serious illness in both humans and animals. Anthrax, an often fatal disease of animals, is spread to humans through contact with infected animals or their products; people become infected with anthrax when spores enter the body.

The study aims to map the localization of anthrax foci on geographical maps and to identify geographical variables that are significantly associated with environmental risk factors for anthrax recurrence in Georgia (Caucasus), since the disease is affected by the geographical environment, soil, climate, etc.

We carefully analyzed a set of 1664 cases of anthrax in humans and 621 cases in animals, at up to 1430 locations in anthrax foci (animal burial sites, slaughterhouses, BP roads, construction sites, etc.) observed in Georgia over more than 70 years, compiled from the literature and from the National Center for Disease Control. We analyzed more than 30 geographical variables, such as climate, topography, soil (soil type, chemical composition, acidity) and landscape, and created several digital thematic maps of the distribution and detection of anthrax foci. The identified variables will help monitor foci of anthrax development.

How to cite: Chichinadze, T., Gulashvili, Z., Bolashvili, N., Malania, L., and Suknidze, N.: Mapping and Analysis of Anthrax Cases in Humans and Animals, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-5866, https://doi.org/10.5194/egusphere-egu23-5866, 2023.

EGU23-6357 | Posters on site | GI2.3

PANAME: a portal laboratory for city's environmental data 

Vincent Douet, Sophie Bouffiès-Cloché, Joanne Dumont, Martial Haeffelin, Jean-Charles Dupoont, Simone Kotthaus, Valéry Masson, Aude Lemonsu, Valerie Gros, Christopher Cantrell, Vincent Michoud, and Sébastien Payan

The urban environment is at the heart of many disciplinary projects covering very broad scientific areas. Acquired data and simulations, when accessible at all, are often reached via targeted thematic portals. However, transdisciplinarity has been essential for several years to answer specific scientific questions and societal demands. Cross-referencing data on the human sciences, health, air quality, land use, emission inventories, biodiversity, etc., would enable new, innovative studies connected with the city.

PANAME (PAris region urbaN Atmospheric observations and models for Multidisciplinary rEsearch), developed by AERIS, was designed as the first building block of a data portal that promotes the discovery, access, cross-referencing and representation of urban data from various sectors, with air quality and urban heat islands as a starting point. The portal and future developments will be discussed in this presentation.

How to cite: Douet, V., Bouffiès-Cloché, S., Dumont, J., Haeffelin, M., Dupoont, J.-C., Kotthaus, S., Masson, V., Lemonsu, A., Gros, V., Cantrell, C., Michoud, V., and Payan, S.: PANAME: a portal laboratory for city's environmental data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6357, https://doi.org/10.5194/egusphere-egu23-6357, 2023.

EGU23-6873 | Posters on site | GI2.3

From local to global: Community services in interdisciplinary research data management  

Hela Mehrtens, Janine Berndt, Klaus Getzlaff, Andreas Lehmann, and Sören Lorenz

GEOMAR research covers a unique range of physical, chemical, biological and geological ocean processes. The Digital Research Services department develops and provides advice and tools to support scientific data workflows, including the metadata description of expeditions, model experiments, lab experiments, and samples. Our focus lies on standardized internal data exchange in large interdisciplinary scientific projects and on citable data and software publications in discipline-specific repositories that meet the FAIR principles. GEOMAR aims to provide its services not only internally but also as a collaborative RDM platform for marine projects, as a community service. How to achieve this at the operational level is currently being worked out jointly with other research institutions in community projects, e.g. within the DAM (German Alliance of Marine Research), the DataHUB (an initiative of several research centres within the Helmholtz research field Earth and Environment), and the national research infrastructure NFDI4Earth, a network of more than 60 partners.

Our latest use cases are the inclusion of seismic data and numerical model simulations in the community portals to increase their visibility and reusability. We present the success stories and pitfalls of bringing a locally well-established system into larger communities and address the challenges we are facing.

How to cite: Mehrtens, H., Berndt, J., Getzlaff, K., Lehmann, A., and Lorenz, S.: From local to global: Community services in interdisciplinary research data management , EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6873, https://doi.org/10.5194/egusphere-egu23-6873, 2023.

EGU23-7015 | ECS | Orals | GI2.3

Evaluation of five reanalysis products over France: implications for agro-climatic studies 

Mariam Er-rondi, Magali Troin, Sylvain Coly, Emmanuel Buisson, Laurent Serlet, and Nourddine Azzaoui

Agriculture is extremely vulnerable to climate change. Rising air temperatures, alongside more frequent extreme climate events, are the main negative impacts of climate change on the yields, safety, and quality of crops. One approach to assessing the impacts of climate change on agriculture is the use of agro-climatic indicators (AgcIs). AgcIs characterize plant-climate interactions and are practical and understandable for both farmers and decision makers.

Climate and climate change impact studies on crops require long samples of reliable past and future datasets describing both spatial and temporal variability. The lack of observed historical data with an appropriate temporal resolution (i.e., 30 years of continuous daily data) and sufficient local precision (i.e., 1 km) is a major concern. To overcome this, reanalysis products (RPs) are often used as reference data for the observed climate in impact studies. However, RPs have limitations, as they contain biases and uncertainties. In addition, RP evaluations are usually conducted on climate indicators, which raises questions about their suitability for agro-climatic indicators.

This work aims to evaluate the ability of five of the most widely used RPs to reproduce observed AgcIs for three specific crops (apple, corn, and vine) over France. The five RPs selected for this study are SCOPE Climate, FYRE Climate, ERA5, ERA5-Land and the gridded dataset RFHR. They are compared to the SYNOP meteorological data provided by Météo-France, considered as the reference dataset from 1996 to 2021.

Our findings show a higher agreement between the five RPs and SYNOP for the temperature-based AgcIs than for the precipitation-based AgcIs, which the RPs tend to overestimate. We also note that, for each RP, the discrepancies between the AgcIs and the reference SYNOP dataset do not depend on geographical location or crop. This study emphasizes the need to quantify the uncertainty in climate data used in climate variability and climate change impact studies on agriculture.
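A temperature-based agro-climatic indicator of the kind evaluated above can be sketched with growing degree days (a standard indicator; the abstract does not name its specific AgcIs, so this choice and the daily temperatures below are illustrative), computed identically from a reanalysis series and a reference series so the two can be compared.

```python
# Sketch of a temperature-based agro-climatic indicator: growing degree
# days (GDD) above a base temperature, computed for both a reanalysis
# and a reference series. Temperatures and the 10 degC base are illustrative.

def growing_degree_days(daily_mean_temps, base=10.0):
    """Sum of daily exceedances of the base temperature."""
    return sum(max(t - base, 0.0) for t in daily_mean_temps)

reference = [12.0, 15.0, 9.0, 18.0]   # e.g. a SYNOP-like station series
reanalysis = [13.0, 14.5, 9.5, 17.0]  # e.g. a co-located reanalysis series

# Indicator-level bias of the reanalysis relative to the reference
bias = growing_degree_days(reanalysis) - growing_degree_days(reference)
```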

How to cite: Er-rondi, M., Troin, M., Coly, S., Buisson, E., Serlet, L., and Azzaoui, N.: Evaluation of five reanalysis products over France: implications for agro-climatic studies, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7015, https://doi.org/10.5194/egusphere-egu23-7015, 2023.

We present a method for publishing high-performance computing (HPC) code and results in a scalable, portable and ready-to-use interactive environment, in order to enable sharing, collaboration, peer review and teaching. We show how we use cloud-native elements such as Kubernetes, containerization, automation and web shells to achieve this, and we demonstrate such an OpenScienceLab for the MAGE (Multiscale Atmosphere Geospace Environment) model being developed by the recently selected NASA DRIVE Center for Geospace Storms.
We argue that a key factor in the successful design of such an environment is its (cyber)security, as these labs require non-trivial compute resources open to a vast audience. The benefits and implied costs of different hosting options are discussed, comparing public cloud, hybrid, private cloud and even large desktops.
We encourage HPC centers to test our method using our fully open-source blueprints. We hope thereby to free research staff and scientists to follow the FAIR principles and support open-source goals without needing deep knowledge of cloud computing.

How to cite: Roedig, C. and Sorathia, K.: Cloud native OpenScienceLabs for HPC : Easing the road to FAIR collaboration and OpenSource, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7703, https://doi.org/10.5194/egusphere-egu23-7703, 2023.

EGU23-8585 | Orals | GI2.3

Programmatic Update for NASA’s Commercial Smallsat Data Acquisition (CSDA) Program 

Aaron Kaulfus, Alfreda Hall, Manil Maskey, Will McCarty, and Frederick Policelli

Established in 2017 as a pilot project, the NASA Commercial Smallsat Data Acquisition (CSDA) Program evaluates and acquires commercial datasets that complement NASA Earth science research and application goals. The success of the pilot and the recognized value of commercial data for the scientific community led to the establishment of a sustained program within NASA’s Earth Science Division (ESD). Its objectives are to provide a continuous on-ramp for new commercial vendors in order to evaluate their potential to advance NASA’s Earth science research and application activities, to enable sustained use of the purchased data by the scientific community, to ensure long-term preservation of purchased data for scientific reproducibility, and to coordinate with other U.S. Government agencies and international partners on the evaluation and use of commercial data. This presentation will focus on the data made available for scientific use through the CSDA Program, especially datasets added since the conclusion of the original pilot project, describe the process for end users to access CSDA-managed datasets, and provide a status overview of ongoing and upcoming vendor evaluation activities. Recent scientific research results obtained by CSDA subject matter experts using commercial data will also be presented.

How to cite: Kaulfus, A., Hall, A., Maskey, M., McCarty, W., and Policelli, F.: Programmatic Update for NASA’s Commercial Smallsat Data Acquisition (CSDA) Program, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8585, https://doi.org/10.5194/egusphere-egu23-8585, 2023.

EGU23-12144 | Orals | GI2.3

FAIR & Open Material Samples: The IGSN ID 

Rorie Edmunds

Material samples are a vital output of the scientific endeavour. They underpin research in the Earth, Space, and Environmental Sciences, and are a necessary component of ensuring the transparency and reproducibility of such research. While there has been a lot of discussion in recent years about the openness and FAIRness of data, code, methods, and so on, material samples have been much less under the spotlight.

The lack of focus on material samples is partly due to their uniqueness as a research output, in the sense that they are inherently physical and are thus mostly transported and managed by human beings rather than machines; it is rather more straightforward to archive and share both information about an output, and the output itself, for something that is already a digital object. However, it is for this very reason that material samples must be made more FAIR and treated as first-class citizens of Open Science. To do this, one needs to connect the physical and digital worlds. IGSN IDs enable these connections to be made.

The IGSN ID is a globally unique and persistent identifier (PID) specifically for labelling material samples themselves (i.e., they are for neither images nor data about a sample). Functionally a Digital Object Identifier (DOI) registered under DataCite services, the IGSN ID can be applied to all types of material samples coming from any discipline. Not only can IGSN IDs be used to identify individual material samples that currently exist in a repository, museum, or otherwise, but they can also be registered

  • At the aggregate level for sample collections.
  • For the sites from which the samples are taken.
  • For ephemeral samples.

Importantly, in all cases, when registering an IGSN ID, one must supply metadata following the DataCite Metadata Schema, as well as create a landing page that supplies additional, disciplinary, user-focussed information about the collection, site, or (sub)sample. Hence, by registering a PID for a physical object, the object is given a permanently resolvable URI to a findable and accessible digital footprint, and the provision of rich metadata enables its interoperability and reusability. Associated data can also be shared within the metadata, and one may even include the possibility of relocating a sample itself for reuse.
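The registration step described above can be sketched as assembling a DataCite-style metadata payload for a sample. This is illustrative only: the DOI prefix, URLs, and descriptive values are hypothetical, and real registration goes through DataCite’s authenticated REST services with the full metadata schema.

```python
# Illustrative sketch of the metadata a repository might prepare when
# registering an IGSN ID as a DataCite DOI. All values are hypothetical.

def igsn_payload(suffix, landing_page, title, creators):
    """Build a minimal DataCite-style attribute dictionary."""
    return {
        "doi": f"10.00000/{suffix}",  # hypothetical prefix, not a real one
        "url": landing_page,          # landing page with sample details
        "types": {"resourceTypeGeneral": "PhysicalObject"},
        "titles": [{"title": title}],
        "creators": [{"name": name} for name in creators],
    }

def missing_fields(payload, required=("doi", "url", "titles", "creators")):
    """Fields a registration request would still need to supply."""
    return [f for f in required if not payload.get(f)]

payload = igsn_payload(
    suffix="ABC123",
    landing_page="https://example.org/samples/ABC123",
    title="Sediment core, hypothetical site",
    creators=["Example Researcher"],
)
```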

This presentation will briefly introduce the IGSN ID and the partnership between DataCite and the IGSN e.V. to transfer the IGSN PID infrastructure under DataCite DOI services. It will mainly highlight practical use cases of IGSN IDs, including what is needed to include them in the sample workflow. It will also talk about efforts to better support IGSN IDs and sample metadata within the DataCite Metadata Schema.

How to cite: Edmunds, R.: FAIR & Open Material Samples: The IGSN ID, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12144, https://doi.org/10.5194/egusphere-egu23-12144, 2023.

EGU23-12173 | Orals | GI2.3

ESPRESSO: Earth Science Problems for the Evaluation of Strategies, Solvers and Optimizers 

Andrew Valentine, Jiawen He, Juerg Hauser, and Malcolm Sambridge

Many Earth systems cannot be observed directly, or in isolation. Instead, we must infer their properties and characteristics from their signature in one or more datasets, using a variety of techniques (including those based on optimization, statistical methods, or machine learning). Development of these techniques is an area of focus for many geoscience researchers, and methodological advances can be instrumental in enhancing our understanding of the Earth.         

In our experience, progress is substantially hindered by the absence of infrastructure facilitating communication between sub-disciplines. Researchers tend to focus on one area of the earth sciences — such as seismology, hydrology or oceanography — with only slow percolation of ideas and innovations from one area to another. Indeed, silos often exist even within these subfields. Testing new ideas on new problems is challenging as it requires the acquisition of domain knowledge, an often difficult and time-consuming endeavour with uncertain returns. Key questions that arise include: What is a relevant field data set, and how has it been processed? Which simulation package is most appropriate to predict the data? What would a 'good' model look like and what should it be able to resolve? What is the current best practice?

To address this, we introduce the ESPRESSO project, a collection of Earth Science Problems for the Evaluation of Strategies, Solvers and Optimisers. It aims to provide access to a suite of ‘test problems’ spanning a wide range of inference and inversion scenarios. Each test problem defines appropriate dataset(s) and simulation routines, accessible within a standardised Python interface. This will allow researchers to rapidly test new techniques across a spectrum of problems, share domain-specific inference problems, and ultimately identify areas with potential for fruitful collaboration and development. ESPRESSO is envisaged as an open, community-sourced project, and we invite contributions from across the geosciences.
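A standardised test-problem interface of the kind described above might look like the following sketch; the class and method names are hypothetical and do not reproduce the actual ESPRESSO API.

```python
# Hedged sketch of a standardised inversion test-problem contract:
# every problem exposes observed data and a forward simulation, so any
# solver can compute a misfit the same way. Names are hypothetical.

class TestProblem:
    """Base contract: a dataset plus a forward simulation."""
    def data(self):
        raise NotImplementedError
    def forward(self, model):
        raise NotImplementedError
    def misfit(self, model):
        """Sum-of-squares mismatch between prediction and observation."""
        predicted = self.forward(model)
        observed = self.data()
        return sum((p - o) ** 2 for p, o in zip(predicted, observed))

class LinearToy(TestProblem):
    """Trivial example problem: observations are 2 * x for x = 0..4."""
    def data(self):
        return [2.0 * x for x in range(5)]
    def forward(self, model):
        slope = model[0]
        return [slope * x for x in range(5)]

problem = LinearToy()
```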

How to cite: Valentine, A., He, J., Hauser, J., and Sambridge, M.: ESPRESSO: Earth Science Problems for the Evaluation of Strategies, Solvers and Optimizers, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12173, https://doi.org/10.5194/egusphere-egu23-12173, 2023.

EGU23-12381 | ECS | Posters on site | GI2.3 | Highlight

An Exploratory Study on the Methodology for the Analysis of Urban Environmental Characteristics in Seoul City based on S-Dot Sensor Data 

Daehwan Kim, Kwanchul Kim, Dasom Lee, Jae-Hoon Yang, Seong-min Kim, and Jeong-Min Park

This paper identifies patterns in living environment elements (PM2.5, PM10, noise) throughout Seoul and the urban planning characteristics that affect them, using the big data of Seoul's S-Dot sensor network, which has recently attracted much attention. In other words, it proposes a big-data-based research methodology and research direction to examine the relationship between urban characteristics and the environmental factors that directly affect citizens. The temporal range is 2020 to 2022, the available range of the S-Dot time series data, and the spatial range is the whole of Seoul on a 500 m * 500 m grid. First, as part of analyzing specific living environment patterns, simple trends are identified through exploratory data analysis (EDA), and cluster analysis is conducted based on those trends. Then, to derive specific urban planning characteristics of each cluster, basic statistical analyses such as ANOVA and OLS, as well as MNL analysis, were conducted. As a result, cluster patterns of PM2.5, PM10 and noise, and the urban planning characteristics that affect them, are identified, including areas with long-term living environment values that are relatively high or low compared to other regions. The results are expected to serve as a reference for urban planning management measures in areas with vulnerable living environments, and as an exploratory study that can give direction to future data-driven studies in various fields related to environmental data.
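The clustering step described above can be sketched with a few k-means iterations over grid cells characterised by, say, (PM2.5, noise) values; the data points and initial centroids below are synthetic, and the abstract does not specify which clustering algorithm was used.

```python
# Minimal k-means sketch for grouping grid cells by environmental profile.
# Points are (PM2.5, noise) pairs; all values and initial centroids are synthetic.

def assign(points, centroids):
    """Index of the nearest centroid for each point (squared distance)."""
    def d2(p, c):
        return sum((a - b) ** 2 for a, b in zip(p, c))
    return [min(range(len(centroids)), key=lambda k: d2(p, centroids[k]))
            for p in points]

def update(points, labels, k):
    """New centroids as the mean of each cluster's members."""
    centroids = []
    for j in range(k):
        members = [p for p, l in zip(points, labels) if l == j]
        centroids.append(tuple(sum(c) / len(members) for c in zip(*members)))
    return centroids

points = [(10.0, 55.0), (12.0, 57.0), (40.0, 70.0), (42.0, 72.0)]
centroids = [(10.0, 55.0), (40.0, 70.0)]  # synthetic initial guesses
for _ in range(5):
    labels = assign(points, centroids)
    centroids = update(points, labels, 2)
```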

How to cite: Kim, D., Kim, K., Lee, D., Yang, J.-H., Kim, S., and Park, J.-M.: An Exploratory Study on the Methodology for the Analysis of Urban Environmental Characteristics in Seoul City based on S-Dot Sensor Data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12381, https://doi.org/10.5194/egusphere-egu23-12381, 2023.

EGU23-13420 | ECS | Orals | GI2.3

Development of interoperable web applications for paleoclimate research 

Alessandro Morichetta, Anne-Marie Lézine, Aline Govin, and Vincent Douet

Studying how the Earth’s climate changed in the past requires a joint interdisciplinary effort by scientists from different scientific domains. Paleoclimatic records are increasingly obtained from multiple archives (e.g. marine and terrestrial sediments, ice cores, speleothems, corals), and they document past changes in various climatic variables of the different components of the climatic system (e.g. ocean, atmosphere, vegetation, ice).

Most paleoclimatic records still rely on independent observations with no standard format describing their data or metadata, resulting in a progressive increase of variables and taxonomies. Therefore, despite the achievements of the last decades (e.g. NOAA, NEOTOMA and PANGAEA databases), the lack of a common language strongly limits the systematic reusability of paleoclimate data, for example for the construction of paleoclimatic data syntheses or the evaluation of climate model simulations.

The international project “Abrupt Change in Climate and Ecosystems: Data and e-infrastructure” (ACCEDE, funded by the Belmont Forum) aims at creating an ecosystem for paleoclimatic data in order to investigate the tipping points of past climatic changes. In this context, the recently formalized Linked PaleoData (LiPD) format is the core for the standardization of paleoclimate data and metadata, and it is acting as communication protocol between the different databases that compose the e-infrastructure.

Here we show two web-based solutions that are part of this effort and take advantage of the LiPD ecosystem. The African Pollen Database and the IPSL Paleoclimate Database, both hosted and developed by the Institut Pierre-Simon Laplace, France, have two objectives: (1) to give open access, respecting the FAIR principles, to a variety of paleoclimate datasets, from fossil pollen to various tracers measured in marine sediments, ice cores or tree rings; and (2) to combine and compare, using visualization tools, carefully selected and well-dated paleoclimatic records from different disciplines to address specific research questions.

The two databases are the result of data recovery from pre-existing and obsolete archives, followed by a process of data (and metadata) consolidation, enrichment and formatting, in order to respect the LiPD specification and ensure interoperability between them and the existing databases. We designed harmonised web interfaces and REST APIs to explore and export existing datasets with the help of filtering tools. Datasets are published with DOIs under an open license, allowing free access to the complete information. A LiPD upload form is embedded in the websites to encourage both users and data stewards to propose, edit and add new records, and to foster community adoption of the LiPD format. We are currently finalizing visualization tools to evaluate aggregated data for research and education purposes.

With this effort we are developing a framework in which heterogeneous paleoclimatic records are fully interoperable, allowing scientists from the whole community to take advantage of the complete set of available data and to reuse it for very different research applications.

How to cite: Morichetta, A., Lézine, A.-M., Govin, A., and Douet, V.: Development of interoperable web applications for paleoclimate research, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13420, https://doi.org/10.5194/egusphere-egu23-13420, 2023.

EGU23-13455 | Posters on site | GI2.3

The Transnational access and training in the Geo-INQUIRE EU-project, an opportunity for researchers to develop leading-edge science at selected facilities and test-beds across Europe 

Gaetano Festa, Shane Murphy, Mariusz Majdanski, Iris Christadler, Fabrice Cotton, Angelo Strollo, Marc Urvois, Volker Röhling, Stefano Lorito, Andrey Babeyko, Daniele Bailo, Jan Michalek, Otto Lange, Javier Quinteros, Mateus Prestes, and Stefanie Weege

The Geo-INQUIRE (Geosphere INfrastructure for QUestions into Integrated REsearch) project, supported by the Horizon Europe Programme, is aimed at enhancing services to make data and high-level products accessible to the broad Geoscience scientific community. Geo-INQUIRE’s goal is to encourage curiosity-driven studies into understanding the geosphere dynamics at the interface between the solid Earth, the oceans and the atmosphere using long data streams, high-performance computing and cutting-edge facilities.

In the framework of Geo-INQUIRE, Transnational Access (TA, both virtual and on-site) will be provided at six test beds across Europe: the Bedretto Laboratory, Switzerland; the Ella-Link Geolab, Portugal; the Liguria-Nice-Monaco submarine infrastructure, Italy/France; the Irpinia Near-Fault Observatory, Italy; the Eastern Sicily facility, Italy; and the Corinth Rift Laboratory, Greece. These test beds are state-of-the-art research infrastructures covering the Earth’s surface, subsurface, and marine environments over different spatial scales, from small-scale laboratory experiments to kilometric submarine fibre cables. The TA will revolve around answering key scientific questions on fundamental processes associated with geohazards and georesources, such as the preparatory phases of earthquakes, the role of fluids within the Earth’s crust, fluid-solid interaction at the seabed, and the impact of geothermal exploitation. TA will also be offered for software and workflows belonging to EPOS-ERIC and the ChEESE Centre of Excellence for Exascale in Solid Earth, to develop awarded users’ projects. These are based on the simulation of seismic waves and rupture dynamics in complex media, tsunamis, and subaerial and submarine landslides. HPC-based Probabilistic Tsunami, Seismic and Volcanic Hazard workflows are offered to assess hazard at high resolution with extensive uncertainty exploration. Support and collaboration will be offered to the awardees to facilitate the access to and usage of HPC resources for tackling geoscience problems. Geo-INQUIRE will grant TA to researchers to develop their own laboratory or numerical experiments, with the aim of advancing scientific knowledge of Earth processes while fostering cross-disciplinary research across Europe. To be granted access, researchers submit a proposal to the TA calls, which will be issued three times during the project's lifetime.
Calls will be advertised at the Geo-INQUIRE web page https://www.geo-inquire.eu/ and through the existing community channels.

To encourage cross-disciplinary research, Geo-INQUIRE will also organize a series of training courses and workshops focused on the data, data products and software delivered by research infrastructures and useful for researchers. In addition, two summer schools will be organized, dedicated to cross-disciplinary interactions between solid Earth and marine science.

The proposals, for both transnational access and training, will be evaluated by a panel that reviews the technical and scientific feasibility of the project, ensuring equal opportunities and diversity in terms of gender, geographical distribution and career stage. The first call is expected to be issued by the end of Summer 2023. The data and products generated during the TAs will be made available to the scientific community via the project’s strict adherence to FAIR principles.

How to cite: Festa, G., Murphy, S., Majdanski, M., Christadler, I., Cotton, F., Strollo, A., Urvois, M., Röhling, V., Lorito, S., Babeyko, A., Bailo, D., Michalek, J., Lange, O., Quinteros, J., Prestes, M., and Weege, S.: The Transnational access and training in the Geo-INQUIRE EU-project, an opportunity for researchers to develop leading-edge science at selected facilities and test-beds across Europe, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13455, https://doi.org/10.5194/egusphere-egu23-13455, 2023.

EGU23-14423 | Posters on site | GI2.3

EPOS-GNSS DATA GATEWAY: a portal to European GNSS Data and Metadata 

Mathilde Vergnolle and Jean-Luc Menut

EPOS-GNSS is the Thematic Core Service dedicated to GNSS data and products for the European Plate Observing System.
EPOS-GNSS provides a service to explore and download validated and quality-controlled data and metadata. This service is based on a network of 10 data nodes connected to a centralized portal, called the "EPOS-GNSS Data Gateway". The service aims to follow the FAIR principles and continues to evolve to better meet them. It currently provides more than 4 million daily files in the standardized RINEX format for 1670 European GNSS stations, together with their associated metadata.
In addition to its integration into the multi-disciplinary EPOS data portal, the service offers direct access to the data and metadata for users who need more complex or more specific queries and filtering. A GUI (web client) and a specialized command line client are provided to facilitate the exploration and download of the data and metadata.
The presentation introduces the EPOS-GNSS Data Gateway (https://gnssdata-epos.oca.eu), its clients, and its use.

How to cite: Vergnolle, M. and Menut, J.-L.: EPOS-GNSS DATA GATEWAY: a portal to European GNSS Data and Metadata, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14423, https://doi.org/10.5194/egusphere-egu23-14423, 2023.

EGU23-14605 | Posters on site | GI2.3

Towards an interoperable digital ecosystem in Earth System Science research 

Wolfgang zu Castell, Jan Bumberger, Peter Braesicke, Stephan Frickenhaus, Ulrike Kleeberg, Ralf Kunkel, and Sören Lorenz

Earth System Science (ESS) relies on the availability of data from varying resources, ranging over different disciplines. Hence, data sources are rich and diverse, including observatories, satellites, measuring campaigns, model simulations, case studies, laboratory experiments, and citizen science. At the same time, practices of professional research data management (RDM) differ significantly among disciplines. There are many well-known challenges in enabling a free flow of data in the sense of the FAIR criteria. These include data quality assurance, unique digital identifiers, and access to and integration of data repositories, to mention just a few.

The Helmholtz DataHub Earth&Environment is addressing digitalization in ESS by developing a federated data infrastructure. Existing RDM practices at seven centers of the Helmholtz Association, working together in a joint research program within the Research Field Earth and Environment (RF E&E), are harmonized and integrated in a comprehensive way. The vision is to establish a digital research ecosystem fostering digitalization in geosciences and environmental sciences. Here, common metadata standards, digital object identifiers for samples, instruments and datasets, and defined role models for data sharing play a central role. The various data-generating infrastructures are registered digitally in order to collect metadata as early as possible and enrich them along the research cycle.

Joint RDM bridging several institutions relies on professional practices of distributed software development. Apart from operating cross-center software development teams, the solutions rely on concepts of modular software design. For example, a generic framework has been developed to allow for quick development of tools for domain specific data exploration in a distributed manner. Other tools incorporate automated quality control in data streams. Software is being developed following guiding principles of open and reusable research software development.

A suite of views is being provided, allowing for varying user perspectives, monitoring data flows from sensor to archive, or publishing data in quality-assured repositories. Furthermore, high-level data products are being provided for stakeholders and knowledge transfer (for examples see https://datahub.erde-und-umwelt.de). In addition, tools for integrated data analysis, e.g. using AI approaches for marine litter detection, can be implemented on top of the existing software stack.

Of course, this initiative does not exist in isolation. It is part of a long-term strategy being embedded within national (e.g. NFDI) and international (e.g. EOSC, RDA) initiatives.

How to cite: zu Castell, W., Bumberger, J., Braesicke, P., Frickenhaus, S., Kleeberg, U., Kunkel, R., and Lorenz, S.: Towards an interoperable digital ecosystem in Earth System Science research, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14605, https://doi.org/10.5194/egusphere-egu23-14605, 2023.

EGU23-15072 | Posters virtual | GI2.3

Automated Extraction of Bioclimatic Time Series from PDF Tables 

Sabino Maggi, Silvana Fuina, and Saverio Vicario

Since the development of the original specifications in the '90s, the PDF document format has become the de facto standard for the distribution and archival of documents in electronic form, because of its ability to preserve the original layout of documents independently of the hardware, operating system and application software used to visualize them.

Unfortunately, the PDF format does not contain explicit structural or semantic information, making it very difficult to extract structured information from PDF files, in particular data presented in tabular form.
The automatic extraction of tabular data is a difficult and challenging task because tables can have extremely different formats and layouts. It involves several complex steps, from the recognition and conversion of printed text into machine-encoded characters, to the identification of logically coherent table constructs (headers, columns, rows, spanning elements), to the breaking down of these constructs into elemental objects.

Several tools have been developed to support the extraction process. In this work we survey the most interesting tools for the automatic detection and extraction of tabular data, analyzing their respective advantages and limitations. Particular emphasis is placed on programmable open-source tools because of their flexibility and long-term availability, together with the possibility of easily tweaking them to meet the peculiar needs of the problem at hand.

As a practical application, we also present a workflow based on a set of R and AWK scripts that can automatically extract daily temperature and precipitation data from the official PDF documents made available each year by Regione Puglia, in Italy. The lessons learned from the development of this workflow and the possibility to generalize the approach to different kinds of PDF documents are also discussed.
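The row-parsing step of such a workflow can be sketched in Python (here, instead of the R/AWK scripts used in the study). The column layout, the Italian header text and the use of a decimal comma below are assumptions for illustration only; the actual Regione Puglia documents will differ, and the text is presumed to have already been extracted from the PDF (e.g. with a tool such as pdftotext).

```python
import re

# Hypothetical layout: each data row holds a day number, min/max
# temperature (degrees C) and precipitation (mm), separated by whitespace.
ROW = re.compile(
    r"^\s*(\d{1,2})\s+(-?\d+(?:[.,]\d+)?)\s+(-?\d+(?:[.,]\d+)?)\s+(\d+(?:[.,]\d+)?)\s*$"
)

def parse_rows(text):
    """Turn extracted table text into (day, tmin, tmax, precip) records."""
    records = []
    for line in text.splitlines():
        m = ROW.match(line)
        if not m:  # skip headers, footers and malformed lines
            continue
        day = int(m.group(1))
        # Italian documents typically use a decimal comma
        tmin, tmax, prec = (float(g.replace(",", ".")) for g in m.groups()[1:])
        records.append((day, tmin, tmax, prec))
    return records

sample = """GENNAIO 2021
giorno  Tmin  Tmax  Pioggia
 1   3,2  11,5   0,0
 2   4,0  12,1   5,4
"""
rows = parse_rows(sample)
```

Header and title lines fall through the regular expression, so only well-formed data rows survive, which is the same filtering role the AWK scripts play in the described workflow.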

How to cite: Maggi, S., Fuina, S., and Vicario, S.: Automated Extraction of Bioclimatic Time Series from PDF Tables, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15072, https://doi.org/10.5194/egusphere-egu23-15072, 2023.

EGU23-15293 | ECS | Posters virtual | GI2.3 | Highlight

Environmental parameters as a critical factor in understanding mosquito population 

Anastasia Angelou, Sandra Gewehr, Spiros Mourelatos, and Ioannis Kioutsioukis

The transmission of West Nile Virus (WNV) is known to be affected by multiple factors related to the behavior of, and interactions between, the reservoir (birds), vector (Culex mosquitoes), and hosts (humans). Environmental parameters can play a critical role in understanding WNV epidemiology. The aim of this research was to determine the association of various climatic factors with Culex mosquito abundance in Greece during the period 2011-2022. Climate data were acquired from ERA5 (European Centre for Medium-Range Weather Forecasts), while Culex abundance data were obtained from ECODEVELOPMENT S.A., which operates the largest mosquito surveillance network in Greece. The research was conducted at the municipality level. Culex abundance depends nonlinearly on temperature (Figure 1). However, the spread of the measurements indicates that other factors also affect mosquito abundance.

Figure 1: Scatter plot of air temperature vs. Culex abundance in a municipality (Delta) with a relatively sizeable mosquito population.

Correlation heatmaps were used to visualize the correlation between vector abundance and the average monthly temperature up to 2 months before, at several municipalities in the Region of Central Macedonia. The correlations decrease with increasing temperature lag (Figure 2). Moreover, in some municipalities the correlation coefficient is considerably greater than in others. These correlations cannot be explained without considering the mosquito breeding sites found in these municipalities, which host important water resources such as rice paddies, drainage canals, wetland systems, or a combination of the above. When surface waters warm and the outside temperature rises, the mosquito life cycle is completed more quickly, resulting in more generations being produced in a shorter period of time.

Figure 2: Heatmap of the correlation coefficient between mosquito abundance (at municipality scale) and the average monthly temperature up to 2 months before.

Scatterplots and correlation heatmaps computed for Culex abundance against total precipitation, relative humidity or wind speed did not reveal similar patterns. Ongoing analysis focuses on additional factors, environmental and otherwise, that affect the abundance of the mosquitoes that transmit WNV.
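The lagged-correlation computation behind such heatmaps can be sketched as follows. This is a minimal stand-in with invented illustrative monthly series, not the actual surveillance data or the authors' code.

```python
from math import sqrt

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def lagged_correlations(abundance, temperature, max_lag=2):
    """Correlate monthly abundance with temperature lagged 0..max_lag months."""
    out = {}
    for lag in range(max_lag + 1):
        # pair each abundance value with the temperature `lag` months before
        out[lag] = pearson(abundance[lag:], temperature[:len(temperature) - lag])
    return out

# Invented monthly series for one municipality (Jan..Dec)
temp = [8, 10, 14, 19, 24, 27, 29, 28, 24, 18, 13, 9]
abund = [0, 1, 3, 10, 40, 90, 120, 110, 60, 20, 5, 1]
corrs = lagged_correlations(abund, temp)
```

One correlation value per (municipality, lag) pair is exactly the cell content of a heatmap such as Figure 2.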

Acknowledgments 
This research has been co‐financed by the European Regional Development Fund of the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship and Innovation, under the call RESEARCH – CREATE – INNOVATE (project code: Τ2ΕΔΚ-02070). 

How to cite: Angelou, A., Gewehr, S., Mourelatos, S., and Kioutsioukis, I.: Environmental parameters as a critical factor in understanding mosquito population, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15293, https://doi.org/10.5194/egusphere-egu23-15293, 2023.

EGU23-15863 | Orals | GI2.3

Building an Open Source Infrastructure for Next Generation End User Climate Services 

Benedikt Gräler, Katharina Demmich, Johannes Schnell, Merel Vogel, Stefano Bagli, and Paolo Mazzoli

Climate Services (CS) are crucial in empowering citizens, stakeholders and decision-makers to define resilient pathways for adapting to climate change and extreme weather events. Despite advances in scientific data and knowledge (e.g. Copernicus, GEOSS), current CS fail to achieve their full value proposition to end users. Challenges include the incorporation of social and behavioral factors, local needs and knowledge, and the customs of end users. In I-CISK, we put forward a co-design-based requirement analysis to develop a Spatial Data Infrastructure and Platform that empowers a next generation of end-user CS, following a socially and behaviorally informed approach to co-producing services that meet the climate information needs of the Living Labs of the European I-CISK project. Core to the project are climate extremes such as droughts, floods and heatwaves. The use cases touch upon agriculture, forestry, tourism, energy, health, and the humanitarian sectors. We will present the summarized stakeholder requirements regarding the new climate-service platform and their technical implications for the open-source spatial infrastructure. The design also includes assessing, managing and presenting the uncertainties that are an inherent component of climate models.

How to cite: Gräler, B., Demmich, K., Schnell, J., Vogel, M., Bagli, S., and Mazzoli, P.: Building an Open Source Infrastructure for Next Generation End User Climate Services, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15863, https://doi.org/10.5194/egusphere-egu23-15863, 2023.

EGU23-16416 | Posters virtual | GI2.3

The set up of the “UNO” project relational database for Stromboli volcano 

Simone Tarquini, Francesco Martinelli, Marina Bisson, Emanuela De Beni, Claudia Spinetti, and Gabriele Tarabusi

Active volcanoes are complex, poorly predictable systems that can pose a threat to humans and their infrastructure. It is therefore important to improve, as much as possible, the understanding of their behavior. Stromboli, in Italy, is one of the most active volcanoes in the world, and its almost persistent activity has been documented for centuries. This persistent background activity is sometimes interrupted by much more energetic, dangerous episodes. The Istituto Nazionale di Geofisica e Vulcanologia (Italy) set up the interdisciplinary “UNO” project, aimed at understanding when Stromboli is about to switch from ordinary to extraordinary activity. The UNO project includes an outstanding variety of research activities, such as field sampling, the modeling of Stromboli's topography from airborne laser scanning (ALS) and satellite data, 3D numerical simulations of ballistic trajectories, and the setup of an ultrasonic microphone system. Key to the success of the project is the collection of integrated data at high spatial and temporal resolution and their joint analysis in a shared relational database. We present here the simplified logical model of this database, focusing on the identification of entities and their relationships.

How to cite: Tarquini, S., Martinelli, F., Bisson, M., De Beni, E., Spinetti, C., and Tarabusi, G.: The set up of the “UNO” project relational database for Stromboli volcano, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16416, https://doi.org/10.5194/egusphere-egu23-16416, 2023.

EGU23-16605 | ECS | Orals | GI2.3

NASA’s Science Discovery Engine: An Interdisciplinary, Open Science Data and Information Discovery Service 

Kaylin Bugbee, Ashish Acharya, Carson Davis, Emily Foshee, Rahul Ramachandran, Xiang Li, and Muthukumaran Ramasubramanian

NASA’s Science Plan includes a strategy to advance discovery by leveraging cross-disciplinary opportunities between scientific disciplines. In addition, NASA is committed to building an inclusive, open science community over the next decade and is championing the new Open-Source Science Initiative (OSSI) to foster that community. The OSSI supports many activities to promote open science including the development of an empowering cyberinfrastructure to accelerate the time to actionable science. One component of the OSSI cyberinfrastructure is the Science Discovery Engine (SDE). The goal of the SDE is to enable the discovery of data, software and documentation across the five SMD divisions including Astrophysics, Biological and Physical Sciences, Earth Science, Heliophysics and Planetary Science. The SDE increases accessibility to NASA’s open science data and information and promotes interdisciplinary scientific discovery. In this presentation, we describe our work to develop the Science Discovery Engine in Sinequa, a Cognitive Search capability. We also share lessons learned about data governance, curation and information access.

How to cite: Bugbee, K., Acharya, A., Davis, C., Foshee, E., Ramachandran, R., Li, X., and Ramasubramanian, M.: NASA’s Science Discovery Engine: An Interdisciplinary, Open Science Data and Information Discovery Service, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16605, https://doi.org/10.5194/egusphere-egu23-16605, 2023.

EGU23-2843 | ECS | PICO | ESSI1.1

Geography-Aware Masked Autoencoders for Change Detection in Remote Sensing 

Lukas Kondmann, Caglar Senaras, Yuki M. Asano, Akhil Singh Rana, Annett Wania, and Xiao Xiang Zhu

Increasing coverage of commercial and public satellites allows us to monitor the pulse of the Earth with ever-increasing frequency (Zhu et al., 2017). Together with the rise of deep learning in artificial intelligence (AI) (LeCun et al., 2015), the field of AI for Earth Observation (AI4EO) is growing rapidly. However, many supervised deep learning techniques are data-hungry, meaning that annotated data in large quantities are necessary to help these algorithms reach their full potential. In many Earth Observation applications, such as change detection, this is often infeasible because high-quality annotations require manual labeling, which is time-consuming and costly.

Self-supervised learning (SSL) can help tackle the issue of limited label availability in AI4EO. In SSL, an algorithm is pretrained with tasks that only require the input data, without annotation. Notably, Masked Autoencoders (MAE) have recently shown promising performance: a Vision Transformer learns to reconstruct a full image with only 25% of it as input. We hypothesize that the success of MAEs also extends to satellite imagery and evaluate this with a change detection downstream task. In addition, we provide a multitemporal DINO baseline, another widely successful SSL method. Further, we test a second version of MAEs, which we call GeoMAE. GeoMAE incorporates the location and date of the satellite image as auxiliary information in self-supervised pretraining: the coordinates and date information are passed as additional tokens to the MAE model, similar to the positional encoding.

The pretraining dataset used is the RapidAI4EO corpus, which contains multi-temporal Planet Fusion imagery for a variety of locations across Europe. The dataset for the downstream task also uses pairs of Planet Fusion images as input data. These are provided on a 600 m × 600 m patch level, three months apart, together with a classification of whether the respective patch has changed in this period. Self-supervised pretraining is done for up to 150 epochs; we take the model with the best validation performance on the downstream task as the starting point for the test set.

We find that the regular MAE model scores best on the test set with an accuracy of 81.54%, followed by DINO with 80.63% and GeoMAE with 80.02%. Pretraining the MAE with ImageNet data instead of satellite images results in a notable performance loss, down to 71.36%. Overall, our current pretraining experiments cannot yet confirm our hypothesis that GeoMAE is advantageous compared to the regular MAE. However, in a similar spirit, Cong et al. (2022) recently introduced SatMAE, which shows that for other remote sensing applications the combination of auxiliary information and novel masking strategies is a key factor. Therefore, a combination of location and time inputs together with adapted masking may also hold the most potential for change detection. There is ample potential for future research in geo-specific applications of MAEs, and our experimental results for change detection provide a starting point.
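The random-masking bookkeeping at the heart of MAE pretraining can be sketched as follows. This is a minimal stand-in: the actual model is a Vision Transformer, and GeoMAE additionally appends coordinate and date tokens, both of which are omitted here; the function name and patch counts are illustrative.

```python
import random

def random_masking(num_patches, mask_ratio=0.75, seed=0):
    """Randomly split patch indices into kept (25%) and masked (75%) sets,
    as in MAE pretraining: the encoder sees only the kept patches and the
    decoder must reconstruct the masked ones."""
    rng = random.Random(seed)
    idx = list(range(num_patches))
    rng.shuffle(idx)
    n_keep = int(num_patches * (1 - mask_ratio))
    keep, masked = sorted(idx[:n_keep]), sorted(idx[n_keep:])
    return keep, masked

# A 224x224 image split into 16x16 patches gives 14*14 = 196 patches
keep, masked = random_masking(196)
```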

How to cite: Kondmann, L., Senaras, C., Asano, Y. M., Rana, A. S., Wania, A., and Zhu, X. X.: Geography-Aware Masked Autoencoders for Change Detection in Remote Sensing, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2843, https://doi.org/10.5194/egusphere-egu23-2843, 2023.

EGU23-3267 | ECS | PICO | ESSI1.1

Decomposition learning based on spatial heterogeneity: A case study of COVID-19 infection forecasting in Germany 

Ximeng Cheng, Jost Arndt, Emilia Marquez, and Jackie Ma

New models are emerging from Artificial Intelligence (AI) and its sub-fields, in particular Machine Learning and Deep Learning, and are being applied in different areas including geography (e.g., land cover identification and traffic volume forecasting based on spatial data). Unlike the well-known datasets often used to develop AI models (e.g., ImageNet for image classification), spatial data has an intrinsic property, spatial heterogeneity, which leads to varying relationships between the independent variables (the model input X) and dependent variables (the model output Y) across different regions. This makes it difficult to conduct large-scale studies with a single robust AI model. In this study, we draw on the idea of modular learning, i.e., decomposing large-scale tasks into sub-tasks for specific sub-regions and using multiple AI models to solve these sub-tasks. The decomposition is based on spatial characteristics, ensuring that the relationship between independent and dependent variables is similar within each sub-region. We explore this approach for forecasting COVID-19 cases in Germany using spatiotemporal data (e.g., weather data and human mobility data), and compare prediction with a single model to the proposed decomposition learning procedure in terms of accuracy and efficiency. This study is part of the DAKI-FWS project, funded by the Federal Ministry of Economic Affairs and Climate Action in Germany, to develop an early warning system to stabilize the German economy.
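A minimal sketch of the decomposition idea, with simple per-region linear models standing in for the AI sub-models and invented toy data (the region names and coefficients are purely illustrative):

```python
def fit_line(xs, ys):
    """Least-squares line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def fit_per_region(samples):
    """samples: list of (region, x, y). Fit one model per region so each
    sub-model only captures its own region's X->Y relationship."""
    by_region = {}
    for region, x, y in samples:
        xs, ys = by_region.setdefault(region, ([], []))
        xs.append(x)
        ys.append(y)
    return {r: fit_line(xs, ys) for r, (xs, ys) in by_region.items()}

def predict(models, region, x):
    a, b = models[region]
    return a * x + b

# Two regions with opposite X->Y relationships: a single global line
# would fit neither, while per-region models recover both exactly.
data = [("north", x, 2 * x + 1) for x in range(5)] + \
       [("south", x, -3 * x + 4) for x in range(5)]
models = fit_per_region(data)
```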

How to cite: Cheng, X., Arndt, J., Marquez, E., and Ma, J.: Decomposition learning based on spatial heterogeneity: A case study of COVID-19 infection forecasting in Germany, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3267, https://doi.org/10.5194/egusphere-egu23-3267, 2023.

EGU23-4929 | PICO | ESSI1.1

Using AI and ML to support marine science research 

Ilaria Fava, Peter Thijsse, Gergely Sipos, and Dick Schaap

The iMagine project is devoted to developing and delivering imaging data and services for aquatic science. Started in September 2022, the project will provide a portfolio of image data collections, high-performance image analysis tools empowered with Artificial Intelligence, and best-practice documents for scientific image analysis. These services and documents will enable better and more efficient processing and analysis of imaging data in marine and freshwater research, accelerating scientific insights into processes and measures relevant to healthy oceans, seas, and coastal and inland waters. By building on the European Open Science Cloud compute platform, iMagine delivers a generic framework for AI model development, training, and deployment, which researchers can adopt for refining their AI-based applications for water pollution mitigation, biodiversity and ecosystem studies, climate change analysis and beach monitoring, as well as for developing and optimising other AI-based applications in this field. The iMagine AI development and testing framework offers neural networks, parallel post-processing of extensive data, and analysis of massive online data streams in distributed environments. The synergies among the eight aquatic use cases in the project will lead to common solutions in data management, quality control, performance, integration, provenance, and FAIRness, and will contribute to harmonisation across research infrastructures (RIs). The resulting iMagine AI development and testing platform and the iMagine use-case applications will provide another component of the European marine data management landscape, relevant to the Digital Twin of the Ocean, EMODnet, Copernicus, and international initiatives.

How to cite: Fava, I., Thijsse, P., Sipos, G., and Schaap, D.: Using AI and ML to support marine science research, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-4929, https://doi.org/10.5194/egusphere-egu23-4929, 2023.

EGU23-6818 | ECS | PICO | ESSI1.1

Eddy identification from along-track altimeter data with multi-modal deep learning 

Adili Abulaitijiang, Eike Bolmer, Ribana Roscher, Jürgen Kusche, and Luciana Fenoglio-Marc

Eddies are circular rotating water masses, usually generated near the large ocean currents, e.g., the Gulf Stream. Monitoring eddies and gaining knowledge of eddy statistics over a large region are important for fisheries, marine biology studies, and the testing of ocean models.

At the mesoscale, eddies are observed by radar altimetry, and methods have been developed to identify, track and classify them in gridded maps of sea surface height derived from multi-mission datasets. However, this procedure has drawbacks, since much information is lost in the gridded maps: the spatial and temporal resolution of the original altimetry data inevitably degrades during gridding. Moreover, identifying eddies has so far been a post-analysis process on the gridded dataset, which is not suitable for near-real-time applications or forecasts. In the EDDY project at the University of Bonn, we aim to develop methods for identifying eddies directly from along-track altimetry data via a machine (deep) learning approach.

Since eddy signatures (the eddy boundary and the highs and lows of the sea level anomaly, SLA) cannot be extracted directly from along-track altimetry data, the gridded altimetry maps from AVISO are used to detect eddies; these serve as the reference data for machine learning. The eddy detection on 2D grid maps is produced by an open-source geometry-based approach (e.g., py-eddy-tracker; Mason et al., 2014) with additional constraints such as the Okubo-Weiss parameter. Sea Surface Temperature (SST) maps of the same region and date (also available from AVISO) are then used to manually clean the reference data. Since altimetry grid maps and SST maps have different temporal and spatial resolutions, we also use a high-resolution (~6 km) ocean model simulation dataset (e.g., FESOM, the Finite Element Sea ice Ocean Model), which provides coherent, high-resolution SLA, SST and salinity maps for the study area and is a potential test basis for developing the deep learning network.
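The Okubo-Weiss constraint mentioned above can be illustrated with a short sketch that computes the parameter W = s_n² + s_s² − ω² (normal strain, shear strain, relative vorticity) on a regular grid. The project itself relies on py-eddy-tracker for the detection; the unit grid spacing and the solid-body test flow below are illustrative only.

```python
import numpy as np

def okubo_weiss(u, v, dx=1.0, dy=1.0):
    """Okubo-Weiss parameter W = s_n^2 + s_s^2 - omega^2 on a regular grid.
    W < 0 marks vorticity-dominated regions, commonly used (with a
    threshold) to delimit eddy cores in gridded velocity fields."""
    u_y, u_x = np.gradient(u, dy, dx)   # axis 0 varies in y, axis 1 in x
    v_y, v_x = np.gradient(v, dy, dx)
    s_n = u_x - v_y          # normal strain
    s_s = v_x + u_y          # shear strain
    omega = v_x - u_y        # relative vorticity
    return s_n**2 + s_s**2 - omega**2

# Solid-body rotation u = -y, v = x is pure vorticity, so W < 0 everywhere
y, x = np.mgrid[-5:6, -5:6].astype(float)
W = okubo_weiss(-y, x)
```

For this linear flow the finite differences are exact and W = −4 at every grid point, i.e. the whole domain is classified as vorticity-dominated.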

The single-modal training via a Convolutional Neural Network (CNN) on the 2D altimetry grid maps produced an excellent Dice score of 86%, meaning the network detects almost all eddies in the Gulf Stream consistently with the reference data. For the multi-modal training, two different networks are developed, for the 1D along-track altimetry data and for the 2D SLA and SST grid maps, respectively; they are then combined to give the final classification output. A transformer model is deemed efficient for encoding the spatiotemporal information in the 1D along-track altimetry data, while a CNN is sufficient for the 2D multi-sensor grid maps.

In this presentation, we show the eddy classification results from the multi-modal deep learning approach based on along-track and gridded multi-source datasets for the Gulf Stream area for the period 2017 to 2019. Results show that multi-modal deep learning improves the classification by more than 20% compared to training the transformer model on along-track data alone.

How to cite: Abulaitijiang, A., Bolmer, E., Roscher, R., Kusche, J., and Fenoglio-Marc, L.: Eddy identification from along-track altimeter data with multi-modal deep learning, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6818, https://doi.org/10.5194/egusphere-egu23-6818, 2023.

EGU23-8479 | ECS | PICO | ESSI1.1

Model evaluation strategy impacts the interpretation and performance of machine learning models 

Lily-belle Sweet, Christoph Müller, Mohit Anand, and Jakob Zscheischler

Machine learning models are able to capture highly complex, nonlinear relationships, and have been used in recent years to accurately predict crop yields at regional and national scales. This success suggests that the use of ‘interpretable’ or ‘explainable’ machine learning (XAI) methods may facilitate improved scientific understanding of the compounding interactions between climate, crop physiology and yields. However, studies have identified implausible, contradictory or ambiguous results from the use of these methods. At the same time, researchers in fields such as ecology and remote sensing have called attention to issues with robust model evaluation on spatiotemporal datasets. This suggests that XAI methods may produce misleading results when applied to spatiotemporal datasets, but the impact of model evaluation strategy on the results of such methods has not yet been examined.

In this study, machine learning models are trained to predict simulated crop yield, and the impact of model evaluation strategy on the interpretation and performance of the resulting models is assessed. Using data from a process-based crop model allows us to then comment on the plausibility of the explanations provided by common XAI methods. Our results show that the choice of evaluation strategy has an impact on (i) the interpretations of the model using common XAI methods such as permutation feature importance and (ii) the resulting model skill on unseen years and regions. We find that use of a novel cross-validation strategy based on clustering in feature-space results in the most plausible interpretations. Additionally, we find that the use of this strategy during hyperparameter tuning and feature selection results in improved model performance on unseen years and regions. Our results provide a first step towards the establishment of best practices for model evaluation strategy in similar future studies.
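The abstract does not give implementation details, but a cross-validation strategy based on clustering in feature space, as described above, can be sketched with scikit-learn by clustering samples with k-means and holding out whole clusters via GroupKFold. All data, model choices and parameters below are illustrative stand-ins:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))                      # stand-in climate/crop features
y = X[:, 0] ** 2 + rng.normal(scale=0.1, size=300)  # stand-in simulated yield

# Cluster samples in feature space, then hold out whole clusters during CV,
# so test folds are not surrounded by near-duplicate training samples
groups = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
scores = cross_val_score(RandomForestRegressor(n_estimators=50, random_state=0),
                         X, y, groups=groups, cv=GroupKFold(n_splits=5))
print(scores.mean())
```

Compared with random k-fold splits, such feature-space folds give a harder, arguably more honest estimate of skill on genuinely unseen conditions.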

How to cite: Sweet, L., Müller, C., Anand, M., and Zscheischler, J.: Model evaluation strategy impacts the interpretation and performance of machine learning models, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8479, https://doi.org/10.5194/egusphere-egu23-8479, 2023.

EGU23-9437 | PICO | ESSI1.1

On Unsupervised Learning from Environmental Data 

Mikhail Kanevski

Predictive learning from data is usually formulated as the problem of finding the best connection between input and output spaces by optimizing well-defined cost or risk functions.

In geo-environmental studies, the input space is usually constructed from geographical coordinates and features generated from different sources of available information (feature engineering), by applying expert knowledge, using deep learning technologies and taking into account the objectives of the study. Often it is not known in advance whether the input space is complete or contains redundant features. Therefore, unsupervised learning (UL) is essential in environmental data analysis, modelling, prediction and visualization. UL also helps to better understand the data and the phenomena they describe, and supports interpreting and communicating modelling strategies and results in the decision-making process.

The main objective of the present investigation is to review some important topics in unsupervised learning from environmental data: 1) quantitative description of the input space (“monitoring network”) structure using global and local topological and fractal measures, 2) dimensionality reduction, 3) unsupervised feature selection and clustering by applying a variety of machine learning algorithms (kernel-based, ensemble learning, self-organizing maps) and visualization tools.
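As an illustration of the first listed topic, a box-counting estimate of the fractal dimension of a monitoring network can be sketched as follows. This is a generic sketch, not the author's implementation, and the point set is synthetic:

```python
import numpy as np

def box_counting_dimension(points, scales=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a 2D point set (e.g., station
    locations) by counting occupied grid boxes at several scales and
    fitting the slope of log N(eps) versus log(1/eps)."""
    lo = points.min(axis=0)
    span = np.ptp(points, axis=0) + 1e-12
    pts = (points - lo) / span                     # rescale to the unit square
    counts = []
    for n in scales:
        idx = np.minimum((pts * n).astype(int), n - 1)
        counts.append(len({tuple(ij) for ij in idx}))  # occupied boxes
    slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
    return slope

rng = np.random.default_rng(1)
network = rng.random((10000, 2))   # dense, space-filling "monitoring network"
print(round(box_counting_dimension(network), 2))
```

A dense uniform network yields a dimension close to 2, whereas clustered or preferentially sampled networks yield lower values, which is one way such measures characterize monitoring network structure.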

Major attention is paid to simulated and real spatial data (pollution, permafrost, geomorphological and wind field data). The case studies considered differ in input space dimensionality/topology and number of measurements. It is confirmed that UL should be considered an integral part of a generic methodology for environmental data analysis. Comprehensive comparisons and discussion of the results conclude the research.


How to cite: Kanevski, M.: On Unsupervised Learning from Environmental Data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-9437, https://doi.org/10.5194/egusphere-egu23-9437, 2023.

EGU23-11601 | PICO | ESSI1.1

Clustering Geodata Cubes (CGC) and Its Application to Phenological Datasets 

Francesco Nattino, Ou Ku, Meiert W. Grootes, Emma Izquierdo-Verdiguier, Serkan Girgin, and Raúl Zurita-Milla

Unsupervised classification techniques are becoming essential to extract information from the wealth of data that Earth observation satellites and other sensors currently provide. These datasets are inherently complex to analyze due to their extent across multiple dimensions (spatial, temporal, and often a spectral or band dimension), their size, and the high resolution of current sensors. Traditional one-dimensional cluster analysis approaches, which are designed to find groups of similar elements in datasets such as rasters or time series, may fall short of identifying patterns in these higher-dimensional datasets, often referred to as data cubes. In this context, we present our Clustering Geodata Cubes (CGC) software, an open-source Python package that implements a set of co- and tri-clustering algorithms to simultaneously group elements across two and three dimensions, respectively. The package includes different implementations to most efficiently tackle datasets that fit into the memory of a single machine as well as very large datasets that require cluster computing. A refining strategy to facilitate data pattern identification is also provided. We apply CGC to investigate gridded datasets representing the predicted day of the year when spring onset events (first leaf, first bloom) occur according to a well-established phenological model. Specifically, we consider spring indices computed at high spatial resolution (1 km) and continental scale (conterminous United States) for the last 40+ years and extract the main spatiotemporal patterns present in the data via the CGC co-clustering functionality.
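CGC's own API is not reproduced here, but the co-clustering idea, simultaneously grouping the rows (e.g., grid cells) and columns (e.g., years) of a data matrix, can be illustrated generically with scikit-learn's SpectralCoclustering on a synthetic block matrix:

```python
import numpy as np
from sklearn.cluster import SpectralCoclustering

rng = np.random.default_rng(0)
# Toy "data cube" slice: rows = grid cells, columns = years; two latent
# regimes in each dimension (e.g., early vs. late spring onset day-of-year)
blocks = np.kron(np.array([[10.0, 50.0], [50.0, 10.0]]), np.ones((20, 15)))
data = blocks + rng.normal(scale=1.0, size=blocks.shape)

# Co-clustering assigns a cluster label to every row AND every column
model = SpectralCoclustering(n_clusters=2, random_state=0).fit(data)
print(model.row_labels_[:5], model.column_labels_[:5])
```

CGC additionally offers tri-clustering (a third, e.g. band, dimension) and out-of-core implementations, which this small sketch does not cover.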

How to cite: Nattino, F., Ku, O., Grootes, M. W., Izquierdo-Verdiguier, E., Girgin, S., and Zurita-Milla, R.: Clustering Geodata Cubes (CGC) and Its Application to Phenological Datasets, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11601, https://doi.org/10.5194/egusphere-egu23-11601, 2023.

EGU23-12773 | PICO | ESSI1.1

Industrial Atmospheric Pollution Estimation Using Gaussian Process Regression 

Anton Sokolov, Hervé Delbarre, Daniil Boldyriev, Tetiana Bulana, Bohdan Molodets, and Dmytro Grabovets

Industrial pollution remains a major challenge in spite of recent technological developments and purification procedures. To monitor atmospheric contamination effectively, data from air quality networks should be coupled with advanced spatiotemporal statistical methods.

Our previous studies showed that standard interpolation techniques (such as inverse distance weighting, linear or spline interpolation, and kernel-based Gaussian Process Regression, GPR) are quite limited for simulating smoke-like, narrowly directed industrial pollution in the vicinity of the source (a few tens of kilometers). In this work, we apply GPR based on statistically estimated covariances. These covariances are calculated using the CALPUFF atmospheric pollution dispersion model for a one-year simulation in the Kryvyi Rih region. The application of GPR permits taking into account the high correlations between pollution values at neighboring points revealed by the modeling. The results of the covariance-based GPR technique are compared with other interpolation techniques. The method can then be used in the estimation and optimization of air quality networks.
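Covariance-based GPR of this kind amounts to conditional-Gaussian (simple-kriging) estimation with a covariance matrix estimated from the dispersion-model output. A minimal numpy sketch follows; the "fields" array is a synthetic stand-in for a year of CALPUFF output, and the station/target indices are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for a year of modelled pollution at 6 locations:
# columns 0-4 act as "monitoring stations", column 5 as an unmonitored target
latent = rng.normal(size=(2000, 3))
A = rng.normal(size=(3, 6))
fields = latent @ A + rng.normal(scale=0.05, size=(2000, 6))

mu = fields.mean(axis=0)
C = np.cov(fields, rowvar=False)   # statistically estimated covariance

def gpr_predict(obs, C, mu, obs_idx, tgt_idx):
    """Conditional-Gaussian estimate at target points given station values."""
    Css = C[np.ix_(obs_idx, obs_idx)]
    Cts = C[np.ix_(tgt_idx, obs_idx)]
    return mu[tgt_idx] + Cts @ np.linalg.solve(Css, obs - mu[obs_idx])

obs_idx, tgt_idx = [0, 1, 2, 3, 4], [5]
sample = fields[0]
est = gpr_predict(sample[obs_idx], C, mu, obs_idx, tgt_idx)
print(round(est[0], 3), round(sample[5], 3))  # estimate tracks the true value
```

Because the covariance is taken from the dispersion model rather than from an isotropic kernel, the interpolation inherits the anisotropic, plume-like correlation structure that defeats standard distance-based methods.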

How to cite: Sokolov, A., Delbarre, H., Boldyriev, D., Bulana, T., Molodets, B., and Grabovets, D.: Industrial Atmospheric Pollution Estimation Using Gaussian Process Regression, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12773, https://doi.org/10.5194/egusphere-egu23-12773, 2023.

EGU23-12933 | ECS | PICO | ESSI1.1

Estimating vegetation carbon stock components by linking ground databases with Earth observations 

Daniel Kinalczyk, Christine Wessollek, and Matthias Forkel

Land ecosystems dampen the increase of atmospheric CO2 by storing carbon in soils and vegetation. In order to estimate how long carbon stays in land ecosystems, detailed knowledge about the distribution of carbon in different vegetation components is needed. Current Earth observation products provide estimates of total above-ground biomass but do not further separate carbon stored in trees, understory vegetation, shrubs, grass, litter or woody debris. Here we present an approach in which we link several Earth observation products with a ground-based database to estimate biomass in various vegetation components. To this end, we use information about the statistical distribution of biomass components provided by the North American Wildland Fuels Database (NAWFD), which is, however, not available as geocoded data. We use ESA CCI AGB version 3 data from 2010 as a proxy to link the NAWFD data to the spatial information from Earth observation products. The biomass and corresponding uncertainty from ESA CCI AGB and a map of vegetation types are used to select the likely distribution of vegetation biomass components from the set of in-situ measurements of tree biomass. We then apply Isolation Forest outlier detection and bootstrapping for a robust comparison of both datasets and for uncertainty estimation. We use Random Forest and Gaussian Process regression to predict the biomass of trees, shrubs, snags, herbaceous vegetation, coarse and fine woody debris, duff and litter from ESA CCI AGB and land cover, GEDI canopy height, Sentinel-3 LAI and bioclimatic data. The regression models reach high predictive power and also allow extrapolation to other regions. Our derived estimates of vegetation carbon stock components provide a more detailed view of land carbon storage and contribute to an improved estimate of potential carbon emissions from respiration, disturbances and fires.
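The multi-target regression step can be sketched with scikit-learn, whose Random Forest supports multi-output regression natively. The predictors, component targets and their relationships below are synthetic stand-ins, not the NAWFD/ESA CCI data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Stand-ins for predictors of the kind named in the abstract:
# total AGB (t/ha), canopy height (m), LAI, a bioclimatic variable
X = rng.random((n, 4)) * [300.0, 40.0, 6.0, 25.0]
# Toy "component" targets (tree, shrub, litter biomass) as simple
# functions of the predictors plus noise
y = np.column_stack([
    0.70 * X[:, 0] + 2.0 * X[:, 1],
    0.10 * X[:, 0] + 3.0 * X[:, 2],
    0.05 * X[:, 0] + 0.5 * X[:, 3],
]) + rng.normal(scale=2.0, size=(n, 3))

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(Xtr, ytr)
print(rf.score(Xte, yte))  # R^2 averaged over the three components
```

One model jointly predicting all components keeps the predictions internally consistent across targets, which matters when the components are later summed into total carbon stocks.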

How to cite: Kinalczyk, D., Wessollek, C., and Forkel, M.: Estimating vegetation carbon stock components by linking ground databases with Earth observations, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12933, https://doi.org/10.5194/egusphere-egu23-12933, 2023.

EGU23-13196 | ECS | PICO | ESSI1.1

From Super-Resolution to Downscaling - An Image-Inpainting Deep Neural Network for High Resolution Weather and Climate Models 

Maximilian Witte, Danai Filippou, Étienne Plésiat, Johannes Meuer, Hannes Thiemann, David Hall, Thomas Ludwig, and Christopher Kadow

High resolution in weather and climate models has always been a common and ongoing goal of the community. In this regard, machine learning techniques have accompanied numerical and statistical methods in recent years. Here we demonstrate that artificial intelligence can skilfully downscale low-resolution climate model data when combined with numerical climate model data. We show that a recently developed image-inpainting technique performs accurate super-resolution via transfer learning using the HighResMIP experiments of CMIP6 (Coupled Model Intercomparison Project Phase 6). Their huge database offers a unique training opportunity for machine learning approaches. Transfer learning also allows downscaling of other CMIP6 experiments and models, as well as observational data such as HadCRUT5. Combined with the technology of Kadow et al. (2020) for infilling missing climate data, we obtain a neural network that reconstructs and downscales the important observational data set (IPCC AR6) at the same time. We further investigate the application of our method to downscaling quantities predicted by a numerical ocean model (ICON-O) to improve computation times. Here we focus on the ability of the model to predict eddies from low-resolution data.
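The learned inpainting network itself cannot be reproduced here, but the underlying infilling idea, reconstructing missing grid cells from the surrounding observed field, can be illustrated with a simple diffusion baseline. This is a toy stand-in for the neural approach of Kadow et al. (2020), with a synthetic field:

```python
import numpy as np

def diffusion_infill(field, observed, n_iter=500):
    """Fill missing grid cells (observed == False) by iterated 4-neighbour
    averaging: a classical diffusion baseline for the infilling idea."""
    f = np.where(observed, field, field[observed].mean())
    for _ in range(n_iter):
        nb = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
              np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4.0
        f = np.where(observed, f, nb)  # observed cells stay fixed
    return f

# Smooth toy "temperature" field with a missing 6x6 square
x = np.linspace(0, 1, 32)
truth = np.sin(2 * np.pi * x)[:, None] + np.cos(2 * np.pi * x)[None, :]
observed = np.ones_like(truth, dtype=bool)
observed[10:16, 10:16] = False

recon = diffusion_infill(truth, observed)
print(np.abs(recon - truth)[~observed].max())  # residual inside the gap
```

A trained inpainting CNN replaces this generic smoothing prior with patterns learned from the CMIP6 archive, which is what lets it recover sharp, physically plausible structure rather than a diffuse interpolation.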

An extension to:

Kadow, C., Hall, D.M. & Ulbrich, U. Artificial intelligence reconstructs missing climate information. Nature Geoscience 13, 408–413 (2020). https://doi.org/10.1038/s41561-020-0582-5

How to cite: Witte, M., Filippou, D., Plésiat, É., Meuer, J., Thiemann, H., Hall, D., Ludwig, T., and Kadow, C.: From Super-Resolution to Downscaling - An Image-Inpainting Deep Neural Network for High Resolution Weather and Climate Models, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13196, https://doi.org/10.5194/egusphere-egu23-13196, 2023.

EGU23-14716 | ECS | PICO | ESSI1.1

Spatial-temporal transferability assessment of remote sensing data models for mapping agricultural land use 

Jayan Wijesingha, Ilze Dzene, and Michael Wachendorf

To assess the impact of anthropogenic and natural causes on land use and land cover change, mapping of spatial and temporal changes is increasingly applied. Thanks to the availability of satellite image archives, remote sensing (RS) data-based machine learning models are particularly suitable for mapping and analysing land use and land cover changes. Most often, models trained with current RS data are employed to estimate past land cover and land use from available RS data, under the assumption that the trained model predicts past data with an accuracy similar to that for present data. However, machine learning models trained on RS data from particular locations and times may not transfer well to new locations and times for various reasons. This study aims to assess the spatial-temporal transferability of RS data models in the context of agricultural land use mapping. The study was designed to map agricultural land use (5 classes: maize, grasslands, summer crops, winter crops, and mixed crops) in two regions in Germany (North Hesse and Weser Ems) between 2010 and 2018 using Landsat archive data (Landsat 5, 7, and 8). Three model transferability scenarios were evaluated: a) temporal (S1), b) spatial (S2) and c) spatial-temporal (S3). Two machine learning models (random forest, RF, and Convolutional Neural Network, CNN) were trained. For each transferability scenario, class-level F1 and macro F1 values were compared between the reference and target systems. Moreover, to explain the results of the transferability scenarios, they were further explored using the dissimilarity index and area of applicability (AOA) concepts. The average macro F1 value of the trained model for the reference scenario (no transferability) was 0.75. For the assessed transferability scenarios, the average macro F1 values were 0.70, 0.65 and 0.60 for S1, S2, and S3, respectively. This shows that model performance decreases when predicting data from a different spatial-temporal context. In contrast, the average proportion of data inside the AOA did not show a clear pattern across scenarios. In RS-based model building, spatial-temporal transferability is essential because of the limited availability of labelled data. The results from this case study thus provide an understanding of how model performance changes when a model is transferred to new settings with data from different temporal and spatial domains.
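The class-level and macro F1 comparison can be sketched with scikit-learn; the labels below are toy stand-ins for the five mapped classes, not the study's predictions:

```python
from sklearn.metrics import f1_score

classes = ["maize", "grassland", "summer", "winter", "mixed"]
y_true = ["maize", "maize", "grassland", "winter", "mixed", "summer", "maize"]
y_pred = ["maize", "grassland", "grassland", "winter", "mixed", "winter", "maize"]

# Per-class F1 (average=None) and its unweighted mean (average="macro")
per_class = f1_score(y_true, y_pred, labels=classes, average=None,
                     zero_division=0)
macro = f1_score(y_true, y_pred, labels=classes, average="macro",
                 zero_division=0)
print(dict(zip(classes, per_class.round(2))), round(macro, 2))
```

Because macro F1 weights all classes equally, it penalizes a transferred model that loses skill on rare classes (e.g., mixed crops) even when overall accuracy stays high, which is why it is a natural headline metric for this comparison.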

How to cite: Wijesingha, J., Dzene, I., and Wachendorf, M.: Spatial-temporal transferability assessment of remote sensing data models for mapping agricultural land use, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14716, https://doi.org/10.5194/egusphere-egu23-14716, 2023.

EGU23-16096 | ECS | PICO | ESSI1.1

Limitations of machine learning in a spatial context 

Jens Heinke, Christoph Müller, and Dieter Gerten

Machine learning algorithms have become popular tools for the analysis of spatial data. However, a number of studies have demonstrated that the application of machine learning algorithms in a spatial context has limitations. New geographic locations may lie outside the data range for which the model was trained, and estimates of model performance may be too optimistic when the spatial autocorrelation of geographic data is not properly accounted for in cross-validation. Here we use artificially created spatial data fields to conduct a series of experiments that further investigate the potential pitfalls of random forest regression applied to spatial data. We provide new insights into previously reported limitations and identify further ones. We demonstrate that the same mechanism that leads to overoptimistic estimates of model performance (when based on ordinary random k-fold cross-validation) can also lead to a deterioration of model performance. When covariates contain sufficient information to deduce spatial coordinates, the model can reproduce any spatial pattern in the training data even if it is entirely or partly unrelated to the covariates. The presence of spatially correlated residuals in the training data changes how the model utilizes the information in the covariates and impedes the identification of the actual relationship between covariates and response. This reduces model performance when the model is applied to data with a different spatial structure. Under such conditions, machine learning methods that are sufficiently flexible to fit autocorrelated residuals (such as random forest) may not be an optimal choice. Better models may be obtained using less flexible but more transparent approaches such as generalized linear models or additive models.
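The core pitfall, overoptimistic random k-fold scores when a covariate effectively encodes location, can be reproduced in a few lines. This is a deliberately simple synthetic experiment, not the authors' setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(0)
x = rng.random(400)                  # covariate that encodes a spatial coordinate
y = x + 0.05 * rng.normal(size=400)  # smooth spatial trend plus noise
X = x.reshape(-1, 1)

rf = RandomForestRegressor(n_estimators=100, random_state=0)

# Random k-fold: every test point is surrounded by training points
random_cv = cross_val_score(rf, X, y,
                            cv=KFold(5, shuffle=True, random_state=0)).mean()

# "Spatial" hold-out: leave out an entire region (x > 0.8) and extrapolate;
# trees cannot predict beyond the range of their training responses
train, test = x <= 0.8, x > 0.8
rf.fit(X[train], y[train])
spatial_r2 = r2_score(y[test], rf.predict(X[test]))
print(round(random_cv, 2), round(spatial_r2, 2))  # random CV looks far better
```

The gap between the two scores is exactly the kind of evaluation artifact the abstract describes: the random split rewards interpolation between near-duplicate neighbours rather than genuine predictive skill in new regions.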

How to cite: Heinke, J., Müller, C., and Gerten, D.: Limitations of machine learning in a spatial context, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16096, https://doi.org/10.5194/egusphere-egu23-16096, 2023.

EGU23-16768 | PICO | ESSI1.1

Knowledge Representation of Levee Systems - an Environmental Justice Perspective 

Armita Davarpanah, Anthony.l Nguy Robertson, Monica Lipscomb, Jacob.w. McCord, and Amy Morris

Levee systems are designed to reduce the risk of water-related natural hazards (e.g., flooding) in areas behind levees. Most levees in the U.S. are designed to protect people and facilities against the impacts of 100-year floods. However, climate change is increasing the probability of 500-year flood events, which in turn increases the likelihood of economic loss, environmental damage, and fatalities that disproportionately impact communities of color and low-income groups facing socio-economic inequities in leveed areas. The increased frequency and intensity of flooding is putting extra pressure on emergency responders, who often require diverse, multi-dimensional data originating from different sources to make sound decisions. Currently, the integration of these heterogeneous data, acquired by diverse sensors and emergency agencies and covering environmental, hydrological, and demographic indicators, requires costly and complex programming and analysis that hinder rapid disaster management efforts. Our domain ‘Levee System Ontology (LSO)’ resolves these data integration and software interoperability issues by semantically modeling the static aspects, dynamic processes, and information content of levee systems, extending the well-structured, top-level Basic Formal Ontology (BFO) and mid-level Common Core Ontologies (CCO). LSO’s class and property names follow the terminology of the National Levee Database (NLD), allowing data scientists using NLD data to constrain their classifications based on the knowledge represented in LSO. In addition to modeling information on the characteristics and status of the structural components of a levee system, LSO represents the residual risk in leveed areas, economic and environmental losses, and damage to facilities in case of breaching and/or overtopping of levees.
LSO enables reasoning to infer components and places along levees and floodwalls where the system requires inspection, maintenance, and repair based on the status of system components. The ontology also represents the impact of flood management activities on different groups of people from an environmental justice perspective, based on the principles of DEI (diversity, equity, inclusion) as defined by the U.N. Sustainable Development Goals.

How to cite: Davarpanah, A., Nguy Robertson, A. L., Lipscomb, M., McCord, J. w., and Morris, A.: Knowledge Representation of Levee Systems - an Environmental Justice Perspective, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16768, https://doi.org/10.5194/egusphere-egu23-16768, 2023.

EGU23-2454 | ECS | Orals | ESSI2.2

The future of NASA Earth Science in the commercial cloud: Challenges and opportunities 

Alexey Shiklomanov, Manil Maskey, Yoseline Angel, Aimee Barciauskas, Philip Brodrick, Brian Freitag, and Jonas Sølvsteen

NASA produces a large volume and variety of data products that are used every day to support research, decision making, and education. The widespread use of NASA’s Earth Science data is enabled by NASA’s Earth Science Data System (ESDS) program, which oversees the archiving and distribution of these data and invests in the development of new data systems and tools. However, NASA’s current approach to Earth Science data distribution — based on distributed institutional archives with individual on-premises high-performance computing capabilities — faces significant challenges, including massive increases in data volume from upcoming missions, a greater need for transdisciplinary science that synthesizes many different kinds of observations, and a push to make science more open, inclusive, and accessible. To address these challenges, NASA is aggressively migrating its Earth Science data and related tools and services into the commercial cloud. Migration of data into the commercial cloud can significantly improve NASA’s existing data system capabilities by (1) providing more flexible options for storage and compute (including rapid, as-needed access to state-of-the-art capabilities); (2) centralizing and standardizing data access, which gives all of NASA’s institutional data centers access to each other’s datasets; and (3) facilitating “analysis-in-place”, whereby users can bring their own computational workflows and tools to the data rather than having to maintain their own copies of NASA datasets. However, migration to the commercial cloud also poses significant challenges, including (1) managing costs under a “pay-as-you-go” model; (2) incompatibility of existing tools and data formats with object-based storage and network access; (3) vendor lock-in; (4) data access for workflows that mix on-premises and cloud computing; and (5) standardization of the highly diverse data present in NASA’s data archive.
I conclude with two examples of recent NASA activities showcasing capabilities enabled by the commercial cloud: An interactive analysis and development platform for analyzing airborne imaging spectroscopy data, and a new collection of tools and services for data discovery, analysis, publication, and data-driven storytelling (Visualization, Exploration, and Data Analysis, VEDA).

How to cite: Shiklomanov, A., Maskey, M., Angel, Y., Barciauskas, A., Brodrick, P., Freitag, B., and Sølvsteen, J.: The future of NASA Earth Science in the commercial cloud: Challenges and opportunities, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2454, https://doi.org/10.5194/egusphere-egu23-2454, 2023.

EGU23-3657 | Orals | ESSI2.2

CADS 2.0: A FAIRest Data Store infrastructure blooming in a landscape of Data Spaces. 

Angel lopez alos, Baudouin raoult, Edward comyn-platt, and James varndell

First launched as the Climate Data Store (CDS) supporting the Climate Change Service (C3S) and later instantiated as the Atmosphere Data Store (ADS) for the Atmosphere Monitoring Service (CAMS), the shared underlying Climate & Atmosphere Data Store Infrastructure (CADS) represents the technical backbone for the implementation of the Copernicus services entrusted to ECMWF on behalf of the European Commission. The CDS additionally offers access to a selection of datasets from the Emergency Management Service (CEMS). As the flagship instance of the infrastructure, the CDS has more than 160k registered users and delivers a daily average of over 100 TB of data from a catalogue of 141 datasets.

The CADS software infrastructure is designed as a distributed system and open framework that facilitates improved access to a broad spectrum of data and information via a powerful service-oriented architecture offering seamless web-based and API-based search and retrieve capabilities. CADS also provides a generic software toolbox that allows users to make use of the available datasets and a series of state-of-the-art data tools that can be combined into more elaborate processes, and to present results graphically in the form of interactive web applications. The CADS infrastructure is hosted in an on-premises cloud physically located within the ECMWF Data Centre and implemented using a collection of virtual machines, networks and large data volumes. Fully customized instances of CADS, including dedicated virtual hardware infrastructure, software application and catalogue content, can be easily deployed thanks to automation and configuration tools and a set of configuration files managed by a distributed version control system. Tailored scripts and templates make it easy to accommodate different standards and to interoperate with external platforms.
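The API-based retrieve capability mentioned above is exposed through the public `cdsapi` Python client. The sketch below shows a typical request payload; the dataset name and request keys mirror a standard ERA5 catalogue entry, and the actual retrieve call (commented out) requires a registered account and API key:

```python
# Illustrative CDS request: ERA5 2 m temperature for one timestep.
# Keys and values follow the CDS web catalogue's API request builder.
request = {
    "product_type": "reanalysis",
    "variable": "2m_temperature",
    "year": "2022",
    "month": "01",
    "day": "01",
    "time": "12:00",
    "format": "netcdf",
}

# Submitting the request needs credentials in ~/.cdsapirc:
# import cdsapi
# c = cdsapi.Client()
# c.retrieve("reanalysis-era5-single-levels", request, "t2m.nc")
print(sorted(request))
```

The same request-dictionary pattern applies across the CDS/ADS catalogues, which is what makes the toolbox and external platforms interoperable with the data store.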

ECMWF, in partnership with EUMETSAT, ESA and EEA, also implements the Data and Information Access Services (DIAS) platform called WEkEO, a distributed cloud-computing infrastructure used to process the data generated by the Copernicus Services and make them accessible to users, together with derived products and all satellite data from the Copernicus Sentinels. Within the partnership, ECMWF is responsible for procuring the software that implements the Data Access Services, Processing and Tools, whose specifications build on the same fundamentals as CADS. Adoption of the FAIR principles has proven to be a cornerstone for maximizing synergies and interactions between CADS, WEkEO and other related platforms.

Driven by increasing demand and the evolving landscape of platforms and services, a major project to modernize the CADS infrastructure is currently underway. The coming CADS 2.0 aims to capitalize on the experience, feedback, lessons learned and know-how from the current CADS; embrace advanced technologies; engage with a broader user community; make the platform more versatile and cloud-oriented; improve workflows and methodologies; ensure compatibility with state-of-the-art solutions such as machine learning, data cubes and interactive notebooks; consolidate the adoption of FAIR principles; and strengthen synergies with related platforms.

As a complementary infrastructure, WEkEO will allow users to harness compute resources without the networking and storage costs associated with public cloud offerings. The CADS Toolbox 2.0 will be deployed and run there, allowing heavy jobs (retrieval and reduction) to be submitted to the CADS 2.0 core infrastructure as services.

How to cite: lopez alos, A., raoult, B., comyn-platt, E., and varndell, J.: CADS 2.0: A FAIRest Data Store infrastructure blooming in a landscape of Data Spaces., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3657, https://doi.org/10.5194/egusphere-egu23-3657, 2023.

EGU23-5038 | Posters on site | ESSI2.2

EO4EU - AI-augmented ecosystem for Earth Observation data accessibility with Extended reality User Interfaces for Service and data exploitation 

Vasileios Baousis, Stathes Hadjiefthymiades, Charalampos Andreou, Kakia Panagidh, and Armagan Karatosun

EO4EU is a European Commission-funded innovation project bringing forward the EO4EU Platform, which will make access to and use of EO data easier for environmental, government, and business forecasts and operations.

The EO4EU Platform, which will be accessible at www.eo4eu.eu, will link existing major EO data sources such as GEOSS, INSPIRE, Copernicus, Galileo and DestinE, among others, and will provide a number of tools and services to help users find and access the data they are interested in, as well as analyse and visualise these data. The platform will leverage machine learning to support the handling of the characteristically large volume of EO data, as well as a combination of cloud computing infrastructure and pre-exascale high-performance computing to manage processing workloads.

Specific attention is also given to developing user-friendly interfaces for EO4EU, allowing users to work with EO data intuitively, freely and easily, including through extended reality.

EO4EU objectives are:

  • Holistic DataOps ecosystem to enhance access and usability of EO information.
  • A semantic-enhanced knowledge graph that augments the FAIRness of EO data and supports sophisticated data representation and dynamics.
  • A machine learning pipeline that enables the dynamic annotation of the various EO data sources.
  • Efficient, reliable and interoperable inter- and intra-data-layer communications
  • Advance stakeholders’ knowledge capacity through informed decision-making and policy-making support.
  • A full range of use case scenarios addressing current data needs, capitalizing existing digital services and platforms, fostering their usability and practicality, and taking into account ethical aspects aiming at social impact maximization.

Technical and scientific innovation can be summarised as follows:

  • Improve compression rates for image quality and reduce data volumes.
  • Improve the quality of reconstructed compressed images while maintaining the same compression rates
  • Facilitate the design of custom services with a minimized labelled data requirement
  • Learn robust and transferable representations of EO data
  • Publishing original trained models on EO data with all relevant assisting material to support reusability in a public repository.
  • Data fusion optimized execution in HPC and GPU environment
  • Better accuracy of data representation
  • Customizable visualization tools tailored to the needs of each use case
  • Dedicated graphs for end-users with various granularities, modalities, metrics and statistics to observe the overall trends in time, correlations, and cause-and-effect relationships through a responsive web-interfaced module.

In this presentation, the status of the project, the adopted architecture and the findings from our initial user surveys on EO data access and discovery will be analysed. Finally, the next steps of the project, early access to the developed platform, and the remaining challenges and opportunities will be discussed.

How to cite: Baousis, V., Hadjiefthymiades, S., Andreou, C., Panagidh, K., and Karatosun, A.: EO4EU - AI-augmented ecosystem for Earth Observation data accessibility with Extended reality User Interfaces for Service and data exploitation, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-5038, https://doi.org/10.5194/egusphere-egu23-5038, 2023.

EGU23-5862 | Orals | ESSI2.2

The ESA Green Transition Information Factories – using Earth Observation and cloud-based analytics to address the Green Transition information needs. 

Patrick Griffiths, Stefanie Lumnitz, Christian Retscher, Frank-Martin Seifert, and Yves-Louis Desnos

In response to the global climate and sustainability crisis, many countries have expressed ambitious goals in terms of carbon neutrality and a green economy. In this context, the European Green Deal comprises several policy elements aimed at achieving carbon neutrality by 2050.

In response to these ambitions, the European Space Agency (ESA) is initiating various efforts to leverage space technologies and data in support of the Green Deal ambitions. The ESA Space for Green Future (S4GF) Accelerator will explore new mechanisms to promote the use of space technologies and advanced modelling approaches for scenario investigations on the Green Transition of economy and society.

A central element of the S4GF Accelerator is the Green Transition Information Factories (GTIF) concept. GTIF takes advantage of Earth Observation (EO) capabilities, geospatial and digital platform technologies, and cutting-edge analytics to generate actionable knowledge and decision support in the context of the Green Transition.

A first national-scale GTIF demonstrator has now been developed for Austria. It addresses the information needs and national priorities for the Green Deal in Austria, facilitated through a bottom-up consultation and co-creation process with various national stakeholders and expert entities. The resulting requirements are matched with various EO industry teams that develop the corresponding capabilities.

The current GTIF demonstrator for Austria (GTIF-AT) builds on top of federated European cloud services, providing efficient access to key EO data repositories and rich interdisciplinary datasets. GTIF-AT initially addresses five Green Transition domains: (1) Energy Transition, (2) Mobility Transition, (3) Sustainable Cities, (4) Carbon Accounting and (5) EO Adaptation Services.

For each of these domains, scientific narratives are provided and elaborated using scrollytelling technologies. The GTIF interactive explore tools allow various users to explore the domains and subdomains in more detail and to better understand the challenges, complexities, and underlying socio-economic and environmental conflicts. The GTIF interactive explore tools combine domain-specific scientific results with intuitive Graphical User Interfaces and modern frontend technologies. In the GTIF Energy Transition domain, users can interactively investigate the suitability of locations at 10 m resolution for the expansion of renewable (wind or solar) energy production. The tools also allow investigating the underlying conflicts, e.g., with existing land uses or biodiversity constraints. Satellite-based altimetry is used to dynamically monitor the water levels in hydro energy reservoirs to infer the related energy storage potentials. In the sustainable cities domain, users can investigate photovoltaic installations on rooftops and assess their suitability in terms of roof geometry and expected energy yields.

GTIF enables various users to inform themselves and interactively investigate the challenges but also opportunities related to the Green Transition ambitions. This enables, for example, citizens to engage in the discussion process for the renewable energy expansion or support energy start-ups to develop new services. The GTIF development follows an open science and open-source approach and several new GTIF instances are planned for the next years, addressing the Green Deal information needs and accelerating the Green Transition. This presentation will showcase some of the GTIF interactive explore tools and provide an outlook on future efforts.

How to cite: Griffiths, P., Lumnitz, S., Retscher, C., Seifert, F.-M., and Desnos, Y.-L.: The ESA Green Transition Information Factories – using Earth Observation and cloud-based analytics to address the Green Transition information needs., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-5862, https://doi.org/10.5194/egusphere-egu23-5862, 2023.

EGU23-5936 | Orals | ESSI2.2

What does the European Spatial Data Infrastructure INSPIRE need in order to become a Green Deal Data Space? 

Joan Masó, Alba Brobia, Ivette Serral, Ingo Simonis, Francesca Noardo, Lucy Bastin, Carlos Cob Parro, Joaquín García, Raul Palma, and Sébastien Ziegler

In May 2007, the INSPIRE directive established the path towards creating the European Spatial Data Infrastructure (ESDI). While the Joint Research Centre (JRC) defined a set of detailed implementation guidelines, the European member states determined the agencies responsible for delivering the different topics specified in the directive’s annexes. INSPIRE’s goal was - and still is - to organize and share Europe’s data supporting environmental policies and actions. However, the way that INSPIRE was defined limited contributions to the public sector, and limited topics to those specifically listed in its annexes. Technical challenges and a lack of appropriate tools have impeded INSPIRE from implementing its own guidelines, and even after 15 years, the dream of a continuous, consistent description of Europe’s environment has still not completely materialized. We should apply the lessons learnt in INSPIRE when we build the Green Deal Data Space (GDDS). To create the GDDS, we should start with ESDI, but also engage and align with the ongoing preparatory actions for data spaces (e.g., for the green deal and agriculture) as well as include actors and networks that have emerged or been organized in recent years. These include: networks of in situ observations (e.g. the Environmental Research Infrastructures (ENVRI) community); Citizen Science initiatives (such as the biodiversity observations integrated in the Global Biodiversity Information Facility (GBIF), or sensor communities for, e.g., air quality); predictive algorithms and machine learning models and simulations based on artificial intelligence (such as the ones deployed in the European Open Science Cloud, International Data Space Association and Gaia-X; services driven both by the scientific community and the private sector); and remote sensing derived products developed by the Copernicus Services. 
Most of these data providers have already embraced the FAIR principles and open data, providing many examples of best practice which can assist newer adopters on the path to open science. In the Horizon Europe project AD4GD (AllData4GreenDeal), we believe that, instead of trying to force data producers to adopt cumbersome new protocols, we should take advantage of the latest developments in geospatial standards and APIs. These allow loosely coupled but well documented and interlinked data sources and models in the GDDS while achieving scientifically robust integration and easy access to data in the resulting workflows. Another fundamental element will be the adoption of a common and extensible information model enabling the representation and exchange of Green Deal related data in an unambiguous manner, including vocabularies for Essential Variables to organize the observable measurements and increase the level of semantic interoperability. This will allow systems and components from different technology providers to seamlessly interoperate and exchange data, and to have an integrated view of and access to the full value of the available data. The project will validate the approach in three pilot cases: water quality and availability of Berlin lakes, biodiversity corridors in the metropolitan area of Barcelona, and low-cost air quality sensors in Europe. The AD4GD project is funded by the European Union under the Horizon Europe program.

How to cite: Masó, J., Brobia, A., Serral, I., Simonis, I., Noardo, F., Bastin, L., Cob Parro, C., García, J., Palma, R., and Ziegler, S.: What does the European Spatial Data Infrastructure INSPIRE need in order to become a Green Deal Data Space?, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-5936, https://doi.org/10.5194/egusphere-egu23-5936, 2023.

EGU23-7052 | Orals | ESSI2.2

FAIRiCUBE: Enabling Gridded Data Analysis for All 

Katharina Schleidt and Stefan Jetschny

Previously, collecting, storing, owning and, if necessary, digitizing data was vital for any data-driven application. Nowadays, we are swimming in data; one could even argue we are drowning in it. Downloading vast data volumes to local storage for subsequent in-house processing on dedicated hardware is inefficient and not in line with the big data processing philosophy. Even where the FAIR principles are formally fulfilled (the data is findable, accessible, and interoperable), the actual reuse of the data to gain new insights depends on the data user’s local capabilities. Scientists aware of the potentially available data and processing capabilities are still not able to easily leverage these resources as required to perform their work; while the analysis gap entailed by the information explosion is being increasingly highlighted, remediation lags.

The core objective of the FAIRiCUBE project is to enable players from beyond classic Earth Observation (EO) domains to provide, access, process, and share gridded data and algorithms in a FAIR and TRUSTable manner. To reach this objective, we are creating the FAIRiCUBE HUB, a crosscutting platform and framework for data ingestion, provision, analysis, processing, and dissemination, to unleash the potential of environmental, biodiversity and climate data through dedicated European data spaces.

In order to gain a better understanding of the various obstacles to leveraging available assets in regard to both data as well as analysis and processing modalities, several use cases have been defined addressing diverse aspects of European Green Deal (EGD) priority actions. Each of the use cases has a defined objective, approach, research question and data requirements.

The use cases selected to guide the creation of the FAIRiCUBE HUB are as follows:

  • Urban adaptation to climate change
  • Biodiversity and agriculture nexus
  • Biodiversity occurrence cubes
  • Drosophila landscape genomics
  • Spatial and temporal assessment of neighborhood building stock

Many of the issues encountered within the FAIRiCUBE project are formally considered solved: catalogues detail the available datasets, standards define how the datasets are to be structured and annotated with the relevant metainformation, and a vast array of processing functionality has emerged that can be applied to such resources. However, while all this is considered state-of-the-art in the EO community, a subtle delta blocks access for wider communities that could make good use of the available resources in their own domains of work. These barriers include, but are not limited to:

  • Identifying available data sources
  • Determining fitness for use
  • Interoperability of data with divergent spatiotemporal basis
  • Understanding access modalities
  • Scoping required resources
  • Providing non-gridded data holdings in a gridded manner
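
The last barrier above, providing non-gridded data holdings in a gridded manner, can be illustrated with a minimal sketch. This is plain Python for illustration only, not FAIRiCUBE HUB code, and the function and parameter names are invented:

```python
import math
from collections import defaultdict

def grid_point_observations(points, cell_size, origin=(0.0, 0.0)):
    """Aggregate (x, y, value) point observations onto a regular grid.

    Returns a dict mapping (col, row) cell indices to the mean of the
    values falling in that cell. Real gridding must additionally handle
    coordinate reference system transformations and temporal alignment.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for x, y, value in points:
        col = math.floor((x - origin[0]) / cell_size)
        row = math.floor((y - origin[1]) / cell_size)
        sums[(col, row)] += value
        counts[(col, row)] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

# Example: three point readings binned into 10 m cells
obs = [(3.0, 4.0, 12.0), (7.0, 2.0, 18.0), (14.0, 4.0, 21.0)]
print(grid_point_observations(obs, cell_size=10.0))  # {(0, 0): 15.0, (1, 0): 21.0}
```

The first two points fall into the same cell and are averaged; the third lands in a neighbouring cell.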

There is great potential in integrating the diverse gridded resources available from EO sources within wider research domains. However, at present, there are subtle barriers blocking this potential. Within FAIRiCUBE, these issues are being collected and evaluated, mitigation measures are being explored together with researchers not from traditional EO domains, with the goal of breaking down these barriers, and enabling powerful research and data analysis potential to a wide range of scientists.

How to cite: Schleidt, K. and Jetschny, S.: FAIRiCUBE: Enabling Gridded Data Analysis for All, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7052, https://doi.org/10.5194/egusphere-egu23-7052, 2023.

EGU23-7074 | ECS | Posters on site | ESSI2.2

An EOSC-enabled Data Space environment for the climate community 

Fabrizio Antonio, Donatello Elia, Guillaume Levavasseur, Atef Ben Nasser, Paola Nassisi, Alessandro D'Anca, Alessandra Nuzzo, Sandro Fiore, Sylvie Joussaume, and Giovanni Aloisio

The exponential increase in data volumes and complexities is causing a radical change in the scientific discovery process in several domains, including climate science. This affects the different stages of the data lifecycle, thus posing significant data management challenges in terms of data archiving, access, analysis, visualization, and sharing. The data space concept can support scientists' workflow and simplify the process towards a more FAIR use of data.

In the context of the European Open Science Cloud (EOSC) initiative launched by the European Commission, the ENES Data Space (EDS) represents a domain-specific implementation of the data space concept. The service, developed in the frame of the EGI-ACE project, aims to provide an open, scalable, cloud-enabled data science environment for climate data analysis on top of the EOSC Compute Platform. It is accessible through the EOSC Catalogue and Marketplace (https://marketplace.eosc-portal.eu/services/enes-data-space) and it also provides a web portal (https://enesdataspace.vm.fedcloud.eu) including information, tutorials and training materials on how to get started with its main features.

The EDS integrates into a single environment ready-to-use climate datasets, compute resources and tools, all made available through the Jupyter interface, with the aim of supporting the overall scientific data processing workflow.  Specifically, the data store linked to the ENES Data Space provides access to a multi-terabyte set of variable-centric collections from large-scale global climate experiments.  The data pool consists of a mirrored subset of CMIP (Coupled Model Intercomparison Project) datasets from the ESGF (Earth System Grid Federation) federated data archive, collected and kept synchronized with the remote copies by using the Synda tool developed within the scope of the IS-ENES3 H2020 project. Community-based, open source frameworks (e.g., Ophidia) and libraries from the Python ecosystem provide the capabilities for data access, analysis and visualisation. Results  and experiment definitions (i.e., Jupyter Notebooks) can be easily shared among users promoting data sharing and application re-use towards a more Open Science approach. 
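
As an illustration of the kind of analysis such an environment supports, the following sketch (plain Python standard library only; it does not use the actual ENES tooling such as Ophidia) derives a monthly climatology and anomalies from a toy temperature series:

```python
from statistics import mean

def monthly_climatology(series):
    """series: list of (month, value) pairs; returns {month: mean value}."""
    by_month = {}
    for month, value in series:
        by_month.setdefault(month, []).append(value)
    return {m: mean(values) for m, values in by_month.items()}

# Toy monthly temperature series: (month, value)
series = [(1, 2.0), (1, 4.0), (2, 5.0), (2, 7.0)]
clim = monthly_climatology(series)
anomalies = [value - clim[month] for month, value in series]
print(clim)        # {1: 3.0, 2: 6.0}
print(anomalies)   # [-1.0, 1.0, -1.0, 1.0]
```

In the actual data space, the same pattern would typically be expressed with community libraries against the mirrored CMIP collections inside a Jupyter Notebook.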

An overview of the data space capabilities along with the key aspects in terms of data management will be presented in this work.

How to cite: Antonio, F., Elia, D., Levavasseur, G., Ben Nasser, A., Nassisi, P., D'Anca, A., Nuzzo, A., Fiore, S., Joussaume, S., and Aloisio, G.: An EOSC-enabled Data Space environment for the climate community, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7074, https://doi.org/10.5194/egusphere-egu23-7074, 2023.

EGU23-7786 | Posters on site | ESSI2.2

Constructing a Searchable Knowledge Repository for FAIR Climate Data 

Mark Roantree, Branislava Lalić, Stevan Savić, Dragan Milošević, and Michael Scriney

The development of a knowledge repository for climate science data is a multidisciplinary effort between the domain experts (climate scientists), data engineers, whose skills include designing and building a knowledge repository, and machine learning researchers, who provide expertise on data preparation tasks such as gap filling and advise on different machine learning models that can exploit this data.

One of the main goals of the COST Action CA20108 is to develop a knowledge portal that is fully compliant with the FAIR principles for scientific data management. In the first year, a bespoke knowledge portal was developed to capture metadata for FAIR datasets. Its purpose is to provide detailed metadata descriptions for shareable micro-meteorological (micromet) data using the WMO standard. While storing Network, Site and Sensor metadata locally, the system passes the actual data to Zenodo and receives back the DOI, thus creating a permanent link between the Knowledge Portal and the Zenodo storage platform. When users search the Knowledge Portal (metadata), results provide both detailed descriptions and links to the data on Zenodo. Our adherence to the FAIR principles is documented below:

  • Findable. Machine-readable metadata is required for automatic discovery of datasets and services. A metadata description is supplied by the data owners for all micro-meteorological data shared on the system which subsequently drives the search engine, using keywords or network, site and sensor search terms.
  • Accessible. When suitable datasets have been identified, access details should be provided. Assuming data is freely accessible, Zenodo DOIs and links are provided for direct data access.
  • Interoperable. Data interoperability means the ability to share and integrate data from different users and sources. This can only happen if a standard (meta)data model is employed to describe data, an important concept which generally requires data engineering skills to deliver. In the knowledge portal presented here, the WMO guide provides the design and structure for metadata.    
  • Reusable. To truly deliver reusability, metadata should be expressed in as detailed a manner as possible. In this way, data can be replicated and integrated according to different scientific requirements. While the Knowledge Portal facilitates very detailed metadata descriptions, not all metadata is compulsory as it was accepted that in some cases, the overhead in providing this information can be very costly. 
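
The Findable and Accessible points above, detailed metadata held in the portal with the data itself archived on Zenodo under a DOI, can be sketched as a minimal record check. All field names and values below are illustrative assumptions, not the portal's actual WMO-based schema:

```python
# Hypothetical required fields for a micromet metadata record
REQUIRED = {"network", "site", "sensor", "variable", "doi"}

def validate_record(record):
    """Return the set of missing required fields (empty set = valid)."""
    return REQUIRED - record.keys()

record = {
    "network": "Example urban micromet network",        # hypothetical name
    "site": {"name": "City centre", "lat": 45.25, "lon": 19.85},
    "sensor": {"type": "thermohygrometer", "height_m": 2.0},
    "variable": "air_temperature",
    "doi": "10.5281/zenodo.0000000",                    # placeholder DOI
    "gap_filled": False,                                # optional metadata
}
assert validate_record(record) == set()
print("missing:", validate_record({"network": "x"}))
```

The optional/compulsory split mirrors the portal's design choice that very detailed metadata is encouraged but not always required, since providing it can be costly.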

Simple analytics are in place to monitor the volume and size of networks in the system. Current metrics include: network count; average size of network (number of sites); dates and size of datasets per network/site; numbers and types of sensors in each site, etc. The current Portal is in Beta version, meaning that the system is functional but open only to members of the COST Action who are nominated testers. This status is due to change in Q1/2023, when access will be opened to the wider climate science community.

Current plans include new Tools and Services to assess the quality of data, including the level of gaps and in some cases, machine learning tools will be provided to attempt gap filling for datasets meeting certain requirements.

 

How to cite: Roantree, M., Lalić, B., Savić, S., Milošević, D., and Scriney, M.: Constructing a Searchable Knowledge Repository for FAIR Climate Data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7786, https://doi.org/10.5194/egusphere-egu23-7786, 2023.

EGU23-7842 | Orals | ESSI2.2 | Highlight

Destination Earth - Processing Near Data and Massive Data Handling 

Danaele Puechmaille, Jordi Duatis Juarez, Miruna Stoicescu, Michael Schick, and Borys Saulyak

Destination Earth is an operational service under the lead of the European Commission being implemented jointly by ESA, ECMWF and EUMETSAT.

The presentation will provide insights into how Destination Earth provides Near Data Processing and deals with Massive Data.

The objective of the European Commission’s Destination Earth (DestinE) initiative is to deploy several highly accurate digital replicas of the Earth (Digital Twins) in order to monitor and simulate natural as well as human activities and their interactions, and to develop and test “what-if” scenarios that would enable more sustainable developments and support European environmental policies. DestinE addresses the challenge of managing and making accessible the sheer amount of data generated by the Digital Twins and of observation data located at external sites, such as the ones depicted in the figure below. This data will be made available fast enough and in a format ready to support the analysis scenarios proposed by the DestinE service users.

 

Figure 1: DestinE Data Sources (green) and Stakeholders (orange)

 

The “DestinE Data Lake” (DEDL) is one of the three Destination Earth components interacting with:

  • the Digital Twin Engine (DTE), which runs the simulation models, under ECMWF responsibility
  • the DestinE Core Service Platform (DESP), which represents the user entry point to the DestinE services and data, under ESA responsibility

The DestinE Data Lake (DEDL) fulfils the storage and access requirements for any data that is offered to DestinE users. It provides users with seamless access to the datasets, regardless of data type and location. Furthermore, the DEDL supports big data processing services, such as near-data processing, to maximize throughput and service scalability. The data lake is built, inter alia, upon existing data lakes such as the Copernicus DIAS, ESA, EUMETSAT and ECMWF, as well as complementary data from diverse sources like federated data spaces, in-situ or socio-economic data. The DT Data Warehouse is a sub-component of the DEDL which stores relevant subsets of the output from each digital twin (DT) execution, powered by ECMWF's Hyper-Cube service.

During the session, EUMETSAT’s representative will share with the community how the Destination Earth Data Lake component implements and takes advantage of Near Data Processing, and how the system handles massive data access and exchange. The Destination Earth Data Portfolio will also be presented.

Figure 2: Destination Earth Data Portfolio

How to cite: Puechmaille, D., Duatis Juarez, J., Stoicescu, M., Schick, M., and Saulyak, B.: Destination Earth - Processing Near Data and Massive Data Handling, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7842, https://doi.org/10.5194/egusphere-egu23-7842, 2023.

EGU23-8041 | Posters on site | ESSI2.2

Coastal Digital Twins: building knowledge through numerical models and IT tools 

Anabela Oliveira, André B. Fortunato, Gonçalo de Jesus, Marta Rodrigues, and Luís David

Digital Twins integrate continuously, in an interactive, two-way data connection, the real and the virtual assets. They provide a virtual representation of a physical asset, enabled through data and models, and can be used for multiple applications such as real-time forecasting, system optimization, monitoring and control, and enhanced decision making. These recent tools take advantage of the huge volume of online data streams provided by satellites, IoT sensing and many real-time surveillance platforms, and of the availability of powerful computational resources that make process-resolving, high-resolution models or AI-based models possible, to build high-accuracy replicas of the real world.
In this paper, the concept of Digital Twins is extended from the ocean to coastal zones, handling the highly non-linear physics and the complexity of monitoring these regions. The on-demand coastal forecast framework OPENCoastS (Oliveira et al., 2020; Oliveira et al., 2021) is used to build user-centered data spaces where multiple services, from early-warning tools to collaboratory platforms, are customized to meet the users' needs. Since the computational effort and data requirements for these services are high, Coastal Digital Twins are integrated in federated computational infrastructures, such as the European Open Science Cloud (EOSC) or INCD in Portugal, to guarantee the capacity to serve multiple users simultaneously.

This tool is demonstrated in the coastal area of Albufeira, in southern Portugal, within the scope of the SINERGEA innovation project. Coastal cities face growing challenges from flooding, sea water quality and energy sustainability, which increasingly require intelligent, real-time management. The urban drainage infrastructure transports all waters likely to pollute downstream beaches to the wastewater treatment plants. Real-time tools are required to support the assessment and prediction of bathing water quality, and to assess the possible need to prohibit beach water usage. During heavy rainfall events, a decentralized management system can also contribute to mitigating downstream flooding. This requires the operation of the entire system to be optimized depending on the specific environmental conditions, and the participation of, and access to all the information by, the several stakeholders. The system integrates real-time information provided by different entities, including monitoring networks, infrastructure operation data and a forecasting framework. The forecasting system includes several models covering all relevant water compartments: the atmosphere, rivers and streams, urban stormwater and wastewater infrastructure, and circulation and water quality in the receiving coastal water bodies.

References

A. Oliveira, A.B. Fortunato, M. Rodrigues, A. Azevedo, J. Rogeiro, S. Bernardo, L. Lavaud, X. Bertin, A. Nahon, G. Jesus, M. Rocha, P. Lopes, 2021. Forecasting contrasting coastal and estuarine hydrodynamics with OPENCoastS, Environmental Modelling & Software, Volume 143,105132, ISSN 1364-8152, https://doi.org/10.1016/j.envsoft.2021.105132.

A. Oliveira, A.B. Fortunato, J. Rogeiro, J. Teixeira, A. Azevedo, L. Lavaud, X. Bertin, J. Gomes, M. David, J. Pina, M. Rodrigues, P. Lopes, 2019. OPENCoastS: An open-access service for the automatic generation of coastal forecast systems, Environmental Modelling & Software, Volume 124, 104585, ISSN 1364-8152, https://doi.org/10.1016/j.envsoft.2019.104585.

How to cite: Oliveira, A., B. Fortunato, A., de Jesus, G., Rodrigues, M., and David, L.: Coastal Digital Twins: building knowledge through numerical models and IT tools, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8041, https://doi.org/10.5194/egusphere-egu23-8041, 2023.

EGU23-8394 | ESSI2.2

Spatio-Temporal Datacube Infrastructures as a Basis for Targeted Data Spaces

Peter Baumann

Data Spaces promise an innovative packaging of data and services into targeted one-stop shops of insight. A key ingredient for the fulfilment of the Data Space promise is easier, analysis-ready and fit-for-purpose access, in particular to the Big Data which EO pixels and voxels constitute. Datacubes have proven to offer suitable service concepts and today are considered an acknowledged cornerstone.

In the GAIA-X EO Expert Group, a subgroup of the Geoinformation Working Group, one of the use cases investigated is the EarthServer federation. It bridges a seeming contradiction: a decentralized approach of independent data providers, with heterogeneous offerings, paid as well as free, versus a single, common pool of datacubes where users do not need to know where data sit in order to access, analyse, mix, and match them. Currently, more than 140 Petabytes are available online.

Membership in EarthServer is open and free, with a Charter being finalized ensuring transparent and democratic governance (one data provider - one vote). EarthServer thereby presents a key building block for the forthcoming Data Spaces: not only does it allow unifying data within a given Data Space, it also acts as a natural enabler for bridging and integrating different Data Spaces. This is amplified by the fact that the technology underlying EarthServer
is both the OGC datacube reference implementation and the INSPIRE Good Practice.
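
As a hedged illustration of how a client might address such a federation, the sketch below assembles an OGC WCPS (Web Coverage Processing Service) request of the kind the EarthServer stack supports; the endpoint URL and coverage name are placeholders, not actual federation members:

```python
from urllib.parse import urlencode

# Placeholder endpoint; a real client would point at a federation node.
endpoint = "https://example.org/rasdaman/ows"

# WCPS query: extract a one-year time series at a point from a
# hypothetical coverage named AvgLandTemp, encoded as CSV.
wcps = (
    'for $c in (AvgLandTemp) '
    'return encode($c[Lat(53.08), Long(8.80), '
    'ansi("2014-01":"2014-12")], "csv")'
)

# WCPS queries travel as a ProcessCoverages request to a WCS endpoint.
url = endpoint + "?" + urlencode(
    {"service": "WCS", "version": "2.0.1",
     "request": "ProcessCoverages", "query": wcps}
)
print(url)
```

Location transparency means the same query works regardless of which federation member physically holds the coverage.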

In our talk we present concept and practice of location-transparent datacube federations, exemplified by EarthServer, and its opportunities for future-directed Data Spaces.

How to cite: Baumann, P.: Spatio-Temporal Datacube Infrastructures as a Basis for Targeted Data Spaces, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8394, https://doi.org/10.5194/egusphere-egu23-8394, 2023.

EGU23-8788 | Orals | ESSI2.2 | Highlight

Towards the European Green Deal Data Space 

Marta Gutierrez David, Mark Dietrich, Nevena Raczko, Sebastien Denvil, Mattia Santoro, Charis Chatzikyriakou, and Weronika Borejko

The European Commission has a program to accelerate the Digital Transition and is putting forward a vision based on cloud, common European Data Spaces and AI. As the data space paradigm unfolds across Europe, the Green Deal Data Space emerges. Its foundational pillars are to be built by the GREAT project. 

Data Spaces will be built over federated data infrastructures with common technical requirements (where possible) taking into account existing data sharing initiatives. Services and middleware developed to enable a federation of cloud-to-edge capacities will be at the disposal of all data spaces.

GREAT, the Green Deal Data Space Foundation and its Community of Practice, has the ambitious goal of defining how data with the potential to help combat climate and environmental related challenges, in line with the European Green Deal, can be shared more broadly among many stakeholders, sectors and boundaries, according to European values such as fair access, privacy and security. 

The project will consider and incorporate community defined requirements and use case analyses to ensure that the resulting data space infrastructure is designed and built with and for the users. 

An implementation roadmap will guide the efforts of multiple actors to converge toward a blueprint technical architecture, a data governance scheme that enables innovative business cases, and an inventory of high-value datasets, which together will enable proof of concept, implementation and scale-up of a minimum viable green deal data space. This roadmap will identify the resources and other key ingredients needed for the Green Deal Data Space to be successful. Data sharing by design and data sovereignty are some of the main principles that will apply from the early stages, ensuring cost-effective and sustainable infrastructures that will drive Europe towards a single data market and green economic growth. 

This talk will present how to engage with the project, the design methodology, progress towards the roadmap for deployment and the collaborative approach to building data spaces in conjunction with all the sectoral data spaces and the Data Space Support Centre.

How to cite: Gutierrez David, M., Dietrich, M., Raczko, N., Denvil, S., Santoro, M., Chatzikyriakou, C., and Borejko, W.: Towards the European Green Deal Data Space, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8788, https://doi.org/10.5194/egusphere-egu23-8788, 2023.

EGU23-9291 | Orals | ESSI2.2

Harmonising the sharing of marine observation data considering data quality information 

Simon Jirka, Christian Autermann, Joaquin Del Rio Fernandez, Markus Konkol, and Enoc Martínez

Marine observation data is an important source of information for scientists to investigate the state of the ocean environment. In order to use data from different sources, it is critical to understand how the data was acquired. This includes not only information about the measurement process and data processing steps, but also details on data quality and uncertainty. The latter aspect becomes especially important if data from different types of instruments are to be used. An example is the combined use of expensive high-precision instruments with lower-cost but less precise instruments in order to densify observation networks.

Within this contribution we will present the work of the European MINKE project, which aims, among other objectives, to facilitate the quality-aware and interoperable exchange of marine observation data.

For this purpose, a comprehensive review of existing interoperability standards and encodings has been performed by the project partners. This included aspects such as:

  • standards for encoding observation data
  • standards for describing sensor data (metadata)
  • Internet of Things protocols for transmitting data from sensing devices
  • interfaces for data access

From a technical perspective, the evaluation has especially considered developments such as the OGC API family of standards, lightweight data and metadata encodings, as well as developments coming from the Internet of Things community. This has been complemented by an investigation of relevant vocabularies that may be used for enabling semantic interoperability through a common terminology within data sets and corresponding metadata.
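
As an illustration of how such quality information could travel with the data, the sketch below encodes an observation in the style of the OGC SensorThings API data model; the quality-measure names and values are illustrative assumptions, not MINKE recommendations:

```python
import json

# A quality-annotated observation, loosely following the SensorThings
# Observation entity (phenomenonTime, result, resultQuality, parameters).
observation = {
    "phenomenonTime": "2023-04-25T10:00:00Z",
    "result": 12.7,                       # e.g. sea water temperature, degrees C
    "resultQuality": [
        {"nameOfMeasure": "qc_flag", "resultValue": "good"},      # assumed terms
        {"nameOfMeasure": "uncertainty", "resultValue": 0.05},    # assumed terms
    ],
    "parameters": {"instrument_class": "low-cost"},  # supports quality-aware use
}
encoded = json.dumps(observation)
print(encoded)
```

Carrying uncertainty alongside each result is what lets a consumer weigh high-precision and low-cost readings differently when combining them.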

Furthermore, specific consideration was given to the description of different properties that help to assess the quality of an observation data set. This comprises not only the description of the data itself but also quality-related aspects of the data acquisition processes. For this purpose, the MINKE project is working on recommendations on how to enhance the analysed (meta)data models and encodings with further elements that transport critical information for better interpreting data sources with regard to accuracy, uncertainty, and re-usability.

Within our contribution we will present the current state of the work within the MINKE project, the results achieved so far, and the practical implementations carried out in cooperation with the project partners.

How to cite: Jirka, S., Autermann, C., Del Rio Fernandez, J., Konkol, M., and Martínez, E.: Harmonising the sharing of marine observation data considering data quality information, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-9291, https://doi.org/10.5194/egusphere-egu23-9291, 2023.

EGU23-10223 | Orals | ESSI2.2

Identifying and Describing Billions of Objects: an Architecture to Tackle the Challenges of Volume, Variety, and Variability 

Jens Klump, Doug Fils, Anusuriya Devaraju, Sarah Ramdeen, Jesse Robertson, Lesley Wyborn, and Kerstin Lehnert

Persistent identifiers are applied to an ever-increasing diversity of research objects, including data, software, samples, models, people, instruments, grants, and projects. There is a growing need to apply identifiers at a finer and finer granularity. The systems developed over two decades ago to manage identifiers and the metadata describing the identified objects struggle with this increase in scale. Communities working with physical samples have grappled with these challenges of the increasing volume, variety, and variability of identified objects for many years. To address these challenges, the IGSN 2040 project explored how metadata and catalogues for physical samples could be shared at the scale of billions of samples across an ever-growing variety of users and disciplines. This presentation outlines how identifiers and their describing metadata can be scaled to billions of objects. In addition, it analyses who the actors involved with this system are and what their requirements are. This analysis resulted in the definition of a minimum viable product and the design of an architecture that addresses the challenges of increasing volume and variety. The system is also easy to implement because it reuses commonly used Web components. Our solution is based on a Web architectural model that utilises Schema.org, JSON-LD and sitemaps. Applying these commonly used internet architectural patterns allows us not only to handle increasing volume, variety and variability but also to enable better compliance with the FAIR Guiding Principles.
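The architectural pattern described above (Schema.org vocabulary serialised as JSON-LD, exposed for web-scale harvesting via sitemaps) can be sketched as follows. This is a minimal, hypothetical record built in Python; the identifier and all field values are invented for illustration and do not follow the actual IGSN metadata profile.

```python
import json

# Hypothetical sample record using Schema.org terms serialised as JSON-LD.
# All values are illustrative placeholders.
sample = {
    "@context": "https://schema.org/",
    "@type": "Dataset",
    "identifier": "IGSN:XX.EXAMPLE.0001",  # invented identifier
    "name": "Basalt core sample (illustrative)",
    "description": "Minimal example of a harvestable sample metadata record.",
    "spatialCoverage": {
        "@type": "Place",
        "geo": {"@type": "GeoCoordinates", "latitude": -31.95, "longitude": 115.86},
    },
}

# In the sitemap-based pattern, each such record would sit at a stable URL
# listed in a sitemap, so crawlers can harvest records at scale.
record = json.dumps(sample, indent=2)
print(record)
```

A harvester then only needs a generic JSON-LD parser and a sitemap walker, which is what makes the approach reusable across communities.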

How to cite: Klump, J., Fils, D., Devaraju, A., Ramdeen, S., Robertson, J., Wyborn, L., and Lehnert, K.: Identifying and Describing Billions of Objects: an Architecture to Tackle the Challenges of Volume, Variety, and Variability, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10223, https://doi.org/10.5194/egusphere-egu23-10223, 2023.

EGU23-13331 | Posters on site | ESSI2.2

Data challenges and opportunities from nascent kilometre-scale simulations 

Valentine Anantharaj, Samuel Hatfield, Inna Polichtchouk, and Nils Wedi

Computational experiments using Earth system models approaching kilometre-scale (k-scale) horizontal resolutions are becoming increasingly common across modeling centers. Recent advances in high-performance computing systems, along with efficient parallel algorithms capable of leveraging accelerator hardware, have made k-scale models affordable for specific purposes. Surrogate models developed using machine learning methods also promise to further reduce the computational cost while enhancing model fidelity. The “avalanche of data from k-scale models” (Slingo et al., 2022) has also posed new challenges in processing, managing, and provisioning data to the broader user community.

During recent years, a joint effort between the European Center for Medium-Range Weather Forecasts (ECMWF) and the Oak Ridge National Laboratory (ORNL) has succeeded in simulating “a baseline for weather and climate simulations at 1-km resolution” (Wedi et al., 2020) using the Summit supercomputer at the Oak Ridge Leadership Facility (OLCF). The ECMWF hydrostatic Integrated Forecasting System (IFS), with explicit deep convection on an average grid spacing of 1.4 km, was used to perform a set of experimental nature runs (XNR) spanning two seasons: a northern-hemisphere winter (NDJF) and the August–October (ASO) months corresponding to the tropical cyclone season in the North Atlantic.

We developed a bespoke workflow to process and archive over 2 PB of data from the 1-km XNR simulations (XNR1K). Furthermore, we have facilitated access to the XNR1K data via an open science data hackathon. The hackathon projects also have access to a data analytics cluster to further process and analyze the data. The OLCF data center supports high-speed data sharing via the Globus data transfer mechanism. External users are using the XNR1K data in a number of ongoing research projects, including observing system simulation experiments, designing satellite instruments for severe storms, developing surrogate models, understanding atmospheric processes, and generating high-fidelity visualizations.

During our presentation we will share our challenges, experiences and lessons learned related to the processing, provisioning and management of the large volume of data, and the stakeholder engagement and logistics of the open science data hackathon.

Slingo, J., Bates, P., Bauer, P. et al. (2022) Ambitious partnership needed for reliable climate prediction. Nat. Clim. Chang.  https://doi.org/10.1038/s41558-022-01384-8

Wedi, N., Polichtchouk, I., et al. (2020) A Baseline for Global Weather and Climate Simulations at 1 km Resolution, JAMES. https://doi.org/10.1029/2020MS002192

How to cite: Anantharaj, V., Hatfield, S., Polichtchouk, I., and Wedi, N.: Data challenges and opportunities from nascent kilometre-scale simulations, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13331, https://doi.org/10.5194/egusphere-egu23-13331, 2023.

EGU23-13501 | Posters on site | ESSI2.2

The brokering approach empowering the WMO data space for hydrology 

Enrico Boldrini, Paolo Mazzetti, Fabrizio Papeschi, Roberto Roncella, Massimiliano Olivieri, Washington Otieno, Igor Chernov, Silvano Pecora, and Stefano Nativi

WMO is coordinating the efforts to build a data space for hydrology, called the WMO Hydrological Observing System (WHOS). 
Hydrological datasets have intrinsic value and are worth the enormous human, technological and financial resources required to collect them over long periods of time. Their value is maximized when data are open, quality-assured, discoverable, accessible, interoperable, standardized, and aligned with user needs, enabling users across sectors to use and reuse the data. It is essential that hydrological data management and exchange are implemented effectively to maximize the benefits of data collection and optimize reuse.
WHOS provides a service-oriented framework that connects data providers to data consumers. It realizes a system of systems that provides registry, discovery, and access capabilities to hydrology data at different levels (local, basin, regional, global). In 2015, the World Meteorological Congress supported the full implementation of WHOS, which is currently publicly available at https://community.wmo.int/activity-areas/wmo-hydrological-observing-system-whos, along with information for both end users and data providers about how to use and join it.
End users (such as hydrologists, forecasters, decision makers, the general public, and academia) can discover, access, download and further process hydrological data available through the WHOS portal by means of their preferred clients (web applications, tools and libraries).
Data providers (such as National Meteorological and Hydrological Services - NMHSs, river basin authorities, private companies, academia) can share their data through WHOS by publishing it online by means of machine-to-machine web services.
The brokering approach, powered by the Discovery and Access Broker (DAB) technology, enables interoperability between data providers’ services and end users’ clients. A mediation layer implemented by the DAB brokering framework translates between the different standard protocols and data models used by providers and consumers, seamlessly enabling the data flow from heterogeneous data providers to each end user’s client.
In parallel, WHOS experts are working in constant collaboration with the data providers to support the implementation of the latest standards required by the international guidelines (e.g., WaterML2.0 and WIGOS Metadata Standard), optimize the data publication and improve the metadata and data quality.
The WHOS Distance Learning course has been successfully conducted; attendees from NMHSs were provided with updated information and guidelines to optimize their hydrological data sharing. The course is currently being translated into Spanish so that it can be delivered to Spanish-speaking countries in 2023.
WHOS is a hydrological component of the WMO Information System (WIS), which is currently in its pilot phase. WHOS and WIS interoperability tests are currently being piloted and are expected to end in 2023. The aim of this interoperability is to promote smooth data exchange between the hydrology community and the wider WMO community. Finally, hydrological data shared through WHOS will be accessible to general WIS users (all piloted programmes, including climate through OpenCDMS, and cryosphere), and at the same time WHOS users will make use of observations made available by WIS.

How to cite: Boldrini, E., Mazzetti, P., Papeschi, F., Roncella, R., Olivieri, M., Otieno, W., Chernov, I., Pecora, S., and Nativi, S.: The brokering approach empowering the WMO data space for hydrology, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13501, https://doi.org/10.5194/egusphere-egu23-13501, 2023.

EGU23-14237 | Orals | ESSI2.2

Environmental data value stream as traceable linked data - Iliad Digital Twin of the Ocean case 

Piotr Zaborowski, Rob Atkinson, Alejandro Villar Fernandez, Raul Palma, Ute Brönner, Arne Berre, Bente Lilja Bye, Tom Redd, and Marie-Françoise Voidrot

In distributed, heterogeneous environmental data ecosystems, the number of data sources and the volume and variety of derivatives, purposes, formats, and replicas keep growing. In theory, this can enrich the information system as a whole, enabling new data value to be revealed through the combination and fusion of several data sources and data types, and allowing relevant information hidden behind the variety of expressions, formats, replicas, and unknown reliability to be found. It has become clear how complex data alignment is, and, moreover, that it is not always justified due to capacity and business constraints. One of the most challenging but also most rewarding approaches is semantic alignment, which promises to fill the information gap in data discovery and joins. To formalise it, an essential enabler is an aligned, linked, and machine-readable data model enabling the specification of relations between data elements and the information they generate. The Iliad digital twins of the ocean are cases of this kind, where in-situ data and citizen science observations are mixed with multidimensional environmental data to enable data science and what-if model implementations, and to be integrated into even broader ecosystems like the European Digital Twin Ocean (EDITO) and the European Data Spaces. An Ocean Information Model (OIM) that enables traversals and profiles is the semantic backbone of the ecosystem. Defined as a multi-level ontology, it explains data using well-known generic (Darwin Core, WoT), spatio-temporal (SOSA/SSN, OGC Geo, W3C Time, QUDT, W3C RDF Data Cube) and domain (WoRMS, AGROVOC) ontologies. Machine readability and unambiguity allow for both automated validation and some translations.
On the other hand, efficient use of this requires yet another skill in data management and development, besides GIS, ICT and domain expertise. In addition, as the semantics used in data and metadata have not yet been stabilised at the implementation level, a few more degrees of freedom in data expression are introduced. Following the GEO data sharing and data management principles along with FAIR, CARE and TRUST, the environmental data is prepared for harmonisation. Furthermore, to ease entry and to harmonise conventions, the authors introduce a multi-touchpoint data value chain API suite with an aligned approach to semantically enrich, entail and validate data sets: from observation streams in JSON or JSON-LD based on the OIM, through storage and scientific data in NetCDF, to exposing the semantically aligned data via the newly endorsed and already successful OGC Environmental Data Retrieval API. The practical approach is supported by a ready-to-use toolbox of components: portable tools to build and validate multi-source geospatial data integrations, keeping track of the information added during mash-ups, predictions and what-if implementations.

How to cite: Zaborowski, P., Atkinson, R., Villar Fernandez, A., Palma, R., Brönner, U., Berre, A., Bye, B. L., Redd, T., and Voidrot, M.-F.: Environmental data value stream as traceable linked data - Iliad Digital Twin of the Ocean case, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14237, https://doi.org/10.5194/egusphere-egu23-14237, 2023.

EGU23-15907 | Posters on site | ESSI2.2

GSSC Now - ESA's Thematic Exploitation Platform for Navigation Science Data 

Vicente Navarro, Sara del Rio, Emilio Fraile, Luis Mendes, and Javier Ventura

Nowadays, the sheer amount of data collected from space-borne and ground-based sensors is changing past approaches to data processing and storage. In the Information Technology domain, data generation rates are growing rapidly, with 175 zettabytes expected worldwide by 2025. This landscape has led to a new golden age of Machine Learning (ML), which can extract knowledge and discover patterns between input and output variables given the sheer volume of available training data.

In space, over 120 satellites from four Global Navigation Satellite Systems (GNSS), including Galileo, will provide, already this decade, continuous, worldwide signals on several frequencies. On the ground, the professional market, represented by thousands of permanent GNSS stations, has been complemented by billions of mass-market receivers integrated in smartphones and Internet-of-Things (IoT) devices.

On their way down to Earth through the atmosphere, the precisely modulated GNSS signals are altered by multiple sources. As they pass through irregular plasma patches in the ionosphere, GNSS signals undergo delay and fading, formally known as 'scintillation'. Further down, they are modified by the amount of water vapor in the troposphere. These alterations, recorded by GNSS receivers as digital footprints in massive streams of data, represent a valuable resource for science, increasingly employed to study Earth’s atmosphere, oceans, and surface environments.

In order to realize the scientific potential of GNSS data, the GNSS Science Support Centre (GSSC), led by ESA’s Navigation Science Office at the European Space Astronomy Centre (ESAC) near Madrid, hosts ESA’s data archive for the scientific exploitation of GNSS data.

Analysis of Global Navigation Satellite System (GNSS) data has traditionally pivoted around the idea of searching for and downloading datasets from multiple repositories that act as data hubs for different types of GNSS resources generated worldwide. In this work we introduce an innovative GNSS Thematic Exploitation Platform, GSSC Now, which expands a GNSS-centric data lake with novel capabilities for discovery and high-performance computing.

We explain how this platform performs GNSS data fusion from multiple data sources, enabling the deployment of Machine Learning (ML) processors to unleash synergies across science domains.

Finally, through the presentation of several GNSS science use cases, we discuss the implementation of GSSC Now’s cyber-infrastructure, its current status, and future plans to accelerate the development of innovative applications and citizen science.

How to cite: Navarro, V., del Rio, S., Fraile, E., Mendes, L., and Ventura, J.: GSSC Now - ESA's Thematic Exploitation Platform for Navigation Science Data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15907, https://doi.org/10.5194/egusphere-egu23-15907, 2023.

EGU23-16662 | Posters on site | ESSI2.2

NLP-based Cognitive Search Engine for the GEOSS Platform data 

Yannis Kopsinis, Zisis Flokas, Pantelis Mitropoulos, Christos Petrou, Thodoris Siozos, and Giorgos Siokas

Effectively querying unstructured text in large databases is a highly demanding task. Conventional approaches, such as exact-match or fuzzy search, return valid and thorough results only when the user query adequately matches the wording within the text or when the query is included in keyword-tag lists. The GEOSS portal relies on such conventional search tools for data and service exploration and retrieval, which limits its capacity. Recent advances in Artificial Intelligence (AI)-based Natural Language Processing (NLP) attempt to overcome this challenge with enhanced information retrieval and cognitive search. Rather than relying on exact or fuzzy text matching, cognitive search detects documents that are semantically and conceptually close to the search query.

The EU-funded EIFFEL project aims to reveal the role of GEOSS as the default digital portal for building Climate Change (CC) adaptation and mitigation applications and to offer the Earth Observation community the ground-breaking capacity of exploiting existing GEOSS datasets. To this end, as a lead technological partner of the EIFFEL consortium, LIBRA AI Technologies designs and develops an end-to-end advanced cognitive search system dedicated to the GEOSS Portal that addresses these challenges.

The proposed system comprises an AI language model optimized for CC-related text and queries, a framework for collecting a sizeable CC-specific corpus used for the language model specialization, a back-end that adopts modern database technologies with advanced capabilities for embedding-based cognitive search matching, and an open Application Programming Interface (API). The cognitive search component is the backbone of the EIFFEL visualisation engine, which will allow any GEOSS user, as well as the EIFFEL Climate Change application development teams, to detect GEOSS data objects and services that are of interest for their research and applications but could not be accessed effectively with the available GEOSS Portal search engine.
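At its core, embedding-based matching reduces to comparing a query vector against document vectors. The following is a minimal sketch of that ranking step only (not the EIFFEL implementation, which uses a CC-adapted language model to produce the embeddings and a database with vector-search capabilities); the toy vectors and the `cognitive_search` helper are invented for illustration.

```python
import numpy as np

def cognitive_search(query_vec, doc_vecs, top_k=2):
    """Rank documents by cosine similarity to a query embedding."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q                       # cosine similarity per document
    order = np.argsort(-scores)[:top_k]  # best matches first
    return order, scores[order]

# Toy 2-D embeddings: documents 0 and 2 point in similar directions
# to the query, so they should rank highest.
docs = np.array([[1.0, 0.1], [0.0, 1.0], [0.9, 0.2]])
idx, scores = cognitive_search(np.array([1.0, 0.0]), docs)
```

Unlike exact or fuzzy text matching, a document with no word overlap can still rank highly if its embedding lies close to the query's, which is what enables conceptual retrieval.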

The work described in this abstract is part of the EIFFEL European project. The EIFFEL project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 101003518. We thank all partners for their valuable contributions.

How to cite: Kopsinis, Y., Flokas, Z., Mitropoulos, P., Petrou, C., Siozos, T., and Siokas, G.: NLP-based Cognitive Search Engine for the GEOSS Platform data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16662, https://doi.org/10.5194/egusphere-egu23-16662, 2023.

EGU23-16795 | Posters on site | ESSI2.2

Harmonizing Diverse Geo-Spatiotemporal Data for Event Analytics 

Michael Rilee, Kwo-Sen Kuo, Michael Bauer, Niklas Griessbaum, and Dai-Hai Ton-That

Parallelization is the only means by which it is possible to process large amounts of diverse data on reasonably short time scales. However, while parallelization is necessary for performant and scalable Big Data analysis, it is not sufficient. We observe that geo-spatiotemporal analyses integrating diverse datasets most often require spatiotemporal coincidence (i.e., at the same place and time). Therefore, for parallelization, these large volumes of diverse data must be partitioned and distributed to cluster nodes with spatiotemporal colocation, to avoid the data movement among nodes that misalignment necessitates. Such data movement devastates scalability.

The prevalent data structure for most geospatial data, e.g., simulation model output and remote sensing data products, is the (raster) array, with accompanying geolocation arrays, i.e., longitude-latitude, of the same shape and size, establishing, through the array index, a correspondence between a data array element and its geolocation. However, this array-index-to-geolocation relation changes from dataset to dataset and even within a dataset (e.g., swath data from LEO satellites). Consequently, it is impossible to use array indices for partitioning and distribution to achieve consistent spatiotemporal colocation.

A simplistic way to address this diversity is through homogenization, i.e., resampling (aka re-gridding) all data involved onto the same grid to establish a fixed array-index-to-geolocation relation. Indeed, this crude approach has become common practice. However, different applications have different requirements for resampling, influencing the choice of the interpolation algorithm (e.g., linear, spline, flux-conserving, etc.). Regardless of which algorithm is applied, large amounts of modified and redundant data are created, which not only exacerbates the Big Data Volume challenge but also obfuscates the processing and data provenance.

SpatioTemporal Adaptive-Resolution Encoding, STARE, was invented to address the scalability challenge through data harmonization, allowing efficient spatiotemporal colocation of the “native data” without re-gridding. STARE (1) ties its indices directly to space-time coordinate locations, unlike the raster array indices used in current practice, which must go indirectly through the floating-point longitude-latitude arrays to reference geolocation, and (2) embeds neighborhood information in the indices to enable highly performant numerical operations for “joins” such as intersect, union, difference, and complement. These two properties together give STARE its exceptional data-harmonizing power because, when a pair of STARE indices is associated with a data element, we know not only its spatiotemporal location but also its neighborhood, i.e., the spatiotemporal volume (2D in space plus 1D in time) that the data element represents.
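The key idea behind such hierarchical, adaptive-resolution indices can be sketched in a few lines: when an index encodes location in its prefix, containment tests and set-like "joins" become fast prefix operations. This is a toy illustration only, not the actual STARE encoding (which uses integer indices on a hierarchical mesh and dedicated libraries); the string cells and function names here are invented.

```python
# Toy hierarchical index: each extra character refines the cell,
# so cell "0123" contains cells "01230", "01231", etc.
def covers(index_a, index_b):
    """True if cell index_a contains (or equals) cell index_b."""
    return index_b.startswith(index_a)

def intersect(cells_a, cells_b):
    """Cells of dataset A that spatially overlap any cell of dataset B."""
    return {a for a in cells_a
            if any(covers(a, b) or covers(b, a) for b in cells_b)}

# Two datasets indexed at different (adaptive) resolutions.
storm  = {"0123", "0132"}            # coarse cells of one event
precip = {"01230", "01231", "2000"}  # finer cells of another dataset
overlap = intersect(storm, precip)   # → cells where both events co-occur
```

The point is that colocation is decided by comparing indices alone, with no resampling of either dataset onto a common grid.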

These capabilities of STARE-based technologies allow not only the harmonization of diverse datasets but also sophisticated event analytics. In this presentation, we will discuss the application of STARE to the integrative analysis of Extra-Tropical Cyclones and precipitation events, wherein we use STARE to identify and catalog co-occurrences of these two kinds of events so that we may study their relationships using diverse data of the best spatiotemporal resolution available.

How to cite: Rilee, M., Kuo, K.-S., Bauer, M., Griessbaum, N., and Ton-That, D.-H.: Harmonizing Diverse Geo-Spatiotemporal Data for Event Analytics, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16795, https://doi.org/10.5194/egusphere-egu23-16795, 2023.

EGU23-529 | ECS | Posters on site | HS3.8

How consistent are citizens in their observation of temporary streams? 

Mirjam Scheller, Ilja van Meerveld, Sara Blanco, and Jan Seibert

Half of the global river network dries up from time to time. However, these so-called temporary streams are not represented well in traditional gauging networks. One reason is the difficulty in measuring zero flows. Therefore, new approaches, such as low-cost sensors and citizen science, have been developed in the past few years. CrowdWater is such a citizen science project, in which citizens can submit observations of the state of temporary streams with the help of a smartphone app. The flow state of the stream is assessed visually and assigned to one of the following six classes: dry streambed, wet/damp streambed, isolated pools, standing water, trickling water, and flowing.

To determine the consistency of observations by different citizens, we asked more than 1200 people passing by temporary streams of various sizes in Switzerland and Germany questions regarding the flow state. The survey consisted of 19 multiple-choice questions (14 of them yes/no questions), three rating scale questions, two open-ended questions and five demographic questions, and was available in German and English. Most participants were interested in the topic and happy to participate. We estimate that about 80% of the people we approached participated in the survey.

Over 90% of the participants were native German speakers. When the expert assessment of the flow state was dry streambed, isolated pools or flowing water, multiple surveys (4-6) could be done for up to four streams. The other states (standing water and trickling water) were assessed at only one stream. The surveys covered all six flow state classes: dry streambed, 4 surveys with a total of 244 participants; wet/damp streambed, 3 surveys with 179 participants; isolated pools, 5 surveys with 265 participants; standing water, 3 surveys with 177 participants; trickling water, 2 surveys with 106 participants; flowing, 6 surveys with 297 participants.

The answers of the participants were consistent for the driest and wettest states (dry streambed and flowing water) but differed for the in-between states. For example, at one stream half of the participants chose the wet streambed category, while the other half decided on standing water. This suggests that visual assessment of flow states with multiple classes is more complicated than might initially be assumed, but still provides additional information beyond the flowing and dry categories. Above all, it provides information for streams that would otherwise be unmonitored.

How to cite: Scheller, M., van Meerveld, I., Blanco, S., and Seibert, J.: How consistent are citizens in their observation of temporary streams?, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-529, https://doi.org/10.5194/egusphere-egu23-529, 2023.

Floods are one of the most common and catastrophic natural events worldwide, making studies of the magnitude, severity and frequency of past events essential for risk management. In this regard, remote sensing techniques have been widely used in flood diagnosis, with Sentinel-2 images being one of the main resources employed in surface water mapping. These studies have developed single-band, spectral-index and machine-learning-based methods, which have typically been applied to large water bodies. However, one of the issues in identifying water surfaces remains their size: when water surfaces are close in size to the spatial resolution of satellite images, they are difficult to detect and map. To improve the spatial resolution of remotely sensed images, an algorithm for super-resolving imagery has been developed, giving good results, especially in areas covered by agricultural land with large uniform surfaces. Although this method has proved effective on Sentinel-2 images, it has not been tested for enhancing flood mapping. Thus, flood mapping is still considered an open research topic, as no method suitable for all datasets and all conditions has been found. Consequently, the present study has developed a methodology for flood delineation in small water bodies. The method leverages the advantages of Sentinel-2 MSI data, image preprocessing techniques, thresholding algorithms, spectral indexes and an unsupervised classification method. This framework includes a) the generation of super-resolved Sentinel-2 images, b) the application of seven spectral indexes to highlight flood surfaces and the evaluation of their effectiveness, c) the application of fifteen methods for flood extent mapping, including fourteen thresholding algorithms and one unsupervised classification method, and d) the evaluation and comparison of the performance of the flood mapping methods. The technique was applied to the Carrión River, located in the Duero basin, province of Palencia, Spain.
This river is classified as a narrow water body and presents recurrent flood events due to its morphometric characteristics, fluvial dynamics, and land uses. The results show optimal performance in highlighting flood zones when combining the AWEI spectral indices with methods such as the Huang and Wang, Li and Tam, Otsu, and moment-preserving thresholding algorithms and EM cluster classification.
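The index-plus-threshold pipeline evaluated above can be sketched generically. This is a minimal illustration using a normalized-difference water index and Otsu's thresholding algorithm on synthetic reflectances; it is not the study's exact configuration (which compares seven indexes and fifteen mapping methods), and the band values below are invented.

```python
import numpy as np

def ndwi(green, nir):
    """Normalized-difference water index: high over open water."""
    return (green - nir) / (green + nir + 1e-9)

def otsu_threshold(values, nbins=64):
    """Otsu's method: pick the cut maximising between-class variance."""
    hist, edges = np.histogram(values, bins=nbins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)            # weight of the "below-cut" class
    mu = np.cumsum(p * centers)  # cumulative mean
    mu_t, w1 = mu[-1], 1.0 - np.cumsum(p)
    valid = (w0 > 0) & (w1 > 0)
    sigma_b = np.zeros_like(w0)
    sigma_b[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(sigma_b)]

# Synthetic scene: 500 dry-land pixels (bright NIR) and 100 flooded
# pixels (NIR strongly absorbed by water).
rng = np.random.default_rng(0)
green = np.concatenate([rng.normal(0.10, 0.01, 500), rng.normal(0.12, 0.01, 100)])
nir   = np.concatenate([rng.normal(0.25, 0.02, 500), rng.normal(0.05, 0.01, 100)])
index = ndwi(green, nir)
t = otsu_threshold(index)
water_mask = index > t  # flood extent map
```

Swapping in a different index (e.g., one of the AWEI variants) or a different thresholding rule changes only the two functions, which is what makes a systematic comparison of index/threshold combinations straightforward.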

How to cite: Lombana, L.: Flood mapping in small-size water rivers: Analysis of spectral indexes using super-resolved Sentinel-2 images, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-690, https://doi.org/10.5194/egusphere-egu23-690, 2023.

EGU23-2029 | ECS | Posters on site | HS3.8

Showcasing the Potential of Crowd-sourced Observations for Flood Model Calibration 

Antara Dasgupta, Stefania Grimaldi, Raaj Ramsankaran, Valentijn Pauwels, and Jeffrey Walker

Floods are among the costliest natural disasters, having caused global economic losses of over USD 51 billion and more than 6000 fatalities in 2020 alone. Hydrodynamic modelling and forecasting of flood inundation require distributed observations of flood depth and extent to enable effective evaluation and to minimize uncertainties. Given the decline of in situ hydrological monitoring networks, Earth Observation (EO) has emerged as a valuable tool for model calibration and evaluation in data-scarce regions, as it provides synoptic observations of flood variables. However, low temporal frequencies and the (currently) instantaneous nature of EO still limit the ability to characterize fast-moving floods. The concurrent rise of smartphones, social media, and internet access has recently led to the emerging discipline of citizen sensing in hydrology, which has the potential to complement real-time EO and in situ flood observations. Despite this, methods to effectively utilise crowd-sourced flood observations to quantitatively assess model performance are yet to be developed. In this study, the potential of crowd-sourced flood observations for hydraulic model evaluation is demonstrated for the first time. The channel roughness of the hydraulic model LISFLOOD-FP was calibrated using just 32 distributed high-water marks and wrack marks collected by the community and provided by the Clarence Valley Council for the 2013 flood event. Since the acquisition times of these data points were unknown, it was assumed that they provide information on the peak flow. Maximum simulated and observed water levels were thus compared at the observation locations for each model realization, and errors were quantified through the root mean squared error (RMSE) and the mean percentage difference (MPD).
Peak flow information was also extracted from the 11 available hydrometric gauges along the Clarence River and used to constrain the roughness parameter, enabling quantification of the added value of the citizen-sensed observations. Identical calibrated parameter values were obtained for both data types, resulting in a mean RMSE of ∼50 cm for peak flow simulation across all gauges. The outcomes of this study demonstrate the utility of uncertain crowd-sourced flood observations for hydraulic flood model calibration in ungauged catchments.
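The two error metrics used above can be stated concretely. The following sketch computes RMSE and a mean percentage difference between simulated and observed peak water levels; the numbers are illustrative only (not the study's data), and MPD is written here in one common form, which may differ in detail from the authors' exact definition.

```python
import numpy as np

def rmse(sim, obs):
    """Root mean squared error between simulated and observed levels."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def mpd(sim, obs):
    """Mean percentage difference of simulated vs observed values."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(np.mean((sim - obs) / obs) * 100.0)

# Illustrative peak water levels (metres) at four high-water-mark sites.
observed  = [2.0, 2.5, 3.0, 4.0]
simulated = [2.1, 2.4, 3.3, 3.8]

err_rmse = rmse(simulated, observed)  # overall magnitude of level errors
err_mpd  = mpd(simulated, observed)   # signed bias, in percent
```

RMSE summarises the typical magnitude of the level errors, while MPD keeps the sign and so reveals systematic over- or under-prediction.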

How to cite: Dasgupta, A., Grimaldi, S., Ramsankaran, R., Pauwels, V., and Walker, J.: Showcasing the Potential of Crowd-sourced Observations for Flood Model Calibration, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2029, https://doi.org/10.5194/egusphere-egu23-2029, 2023.

Under the influence of climate change, the variability of precipitation and its regional differences have increased over the past 10 years, and local droughts are occurring more frequently. The existing water supply and demand analysis systems in Korea are maintained separately by each management department, which limits data collection and decision-making on water distribution. For efficient water management, the integration of water information should be prioritized; on that basis, technologies for monitoring and evaluating actual water use and for predicting water shortages can be developed.

In this study, a database (DB) of the water-cycle system was constructed, focusing on domestic water, and a transfer function model was developed. DB construction was organized in three stages (preparation; investigation and analysis; application and evaluation). In the first stage, water inflow, delivery, and outflow were defined from an urban perspective, and the status of the data at each point was identified. In the second stage, research directions were established through expert consultation, and unpublished data were collected in cooperation with related organizations. In the third stage, the approach was applied to the study sites of Gongju-si and Nonsan-si in Korea, and needed improvements were reviewed. A transfer function model was then developed using the constructed DB: the outflow of the Singwan sewage treatment plant was modelled from the water intake of the Hyeondo intake station, and the absence of significant autocorrelation in the residuals suggests that a useful transfer function model can be constructed for performance-index analysis.
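The abstract does not specify the form of the transfer function, so as an illustration only, a simple discrete transfer function of ARX type (outflow from lagged intake plus its own previous value, y_t = a*y_{t-1} + b*x_{t-d}) can be fitted by least squares. The data, delay, and coefficients below are synthetic assumptions.

```python
def fit_arx(y, x, delay=1):
    """Least-squares fit of y_t = a*y_{t-1} + b*x_{t-delay} via 2x2 normal equations."""
    s_aa = s_ab = s_bb = s_ay = s_by = 0.0
    for t in range(max(1, delay), len(y)):
        u, v = y[t - 1], x[t - delay]
        s_aa += u * u; s_ab += u * v; s_bb += v * v
        s_ay += u * y[t]; s_by += v * y[t]
    det = s_aa * s_bb - s_ab * s_ab
    a = (s_ay * s_bb - s_by * s_ab) / det
    b = (s_aa * s_by - s_ab * s_ay) / det
    return a, b

# Synthetic intake series x and outflow y generated with known a=0.6, b=0.3
x = [10, 12, 9, 14, 11, 13, 10, 12, 15, 11]
y = [5.0]
for t in range(1, len(x)):
    y.append(0.6 * y[-1] + 0.3 * x[t - 1])

a, b = fit_arx(y, x, delay=1)
print(round(a, 3), round(b, 3))  # recovers approximately (0.6, 0.3)
```

On real data, the residuals of such a fit would then be checked for autocorrelation, as described above.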

In the future, additional considerations (outlet location, service area, and subdivision of sewage treatment areas) are required in national reports on river basins and droughts, and precipitation should also be considered as a major input factor for outflow.

 

(This work was supported by a grant from the Korea Environmental Industry & Technology Institute (KEITI), funded by the Ministry of Environment (ME) of the Republic of Korea (2022003610003))

How to cite: Lee, S. and Lee, S.: Construction of integrated DB for domestic water-cycle system and development of transfer function model, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2201, https://doi.org/10.5194/egusphere-egu23-2201, 2023.

EGU23-2493 | ECS | Posters virtual | HS3.8

pyRCIT - A rainfall nowcasting tool based on a synthetic approach 

Ting He and Thomas Einfalt

Precise rainfall nowcasting based on weather radar observations can provide effective warnings before hydrometeorological hazards occur. A common radar-based rainfall nowcasting procedure includes: rain cell identification and tracking, spatial and temporal analysis of rain cells, rainfall nowcasting, and evaluation of the nowcasting results.

In this study, an open-source rainfall nowcasting tool, pyRCIT, is designed and developed, based purely on quality-controlled weather radar data. It has four main modules: (1) weather radar data processing; (2) spatial and temporal rainfall analysis; (3) deterministic rainfall nowcasting; and (4) ensemble rainfall nowcasting. In pyRCIT, rainfall is first derived from the weather radar data sets through a series of data quality adjustment procedures. Second, rain cells are identified and their spatial and temporal properties are analyzed by the RCIT algorithm. Third, deterministic rainfall nowcasting is performed by extrapolation, using Lagrangian persistence and semi-Lagrangian methods separately; the nowcasting results are evaluated with the object-oriented verification method SAL (Structure-Amplitude-Location). Finally, nowcasting uncertainties are analyzed using random field theory, and the quantified uncertainties are used to generate the ensemble rainfall nowcasts.
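The Lagrangian-persistence idea named above can be illustrated very simply: the latest rainfall field is advected along a motion vector while intensities are kept unchanged. This is a sketch of the principle with a fixed, assumed motion vector and a toy grid, not pyRCIT's implementation.

```python
def advect(field, dx, dy, fill=0.0):
    """Shift a 2D rainfall grid by (dx columns, dy rows); exposed cells get `fill`."""
    rows, cols = len(field), len(field[0])
    out = [[fill] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            si, sj = i - dy, j - dx  # where this cell's rain came from
            if 0 <= si < rows and 0 <= sj < cols:
                out[i][j] = field[si][sj]
    return out

# A small rain cell (mm/h) moving one grid cell east per time step
rain = [[0, 0, 0, 0],
        [0, 5, 8, 0],
        [0, 3, 6, 0],
        [0, 0, 0, 0]]

nowcast_t1 = advect(rain, dx=1, dy=0)        # +1 time step lead
nowcast_t2 = advect(nowcast_t1, dx=1, dy=0)  # +2 time steps lead
print(nowcast_t1[1])  # [0, 0, 5, 8]
```

A semi-Lagrangian scheme would instead trace each grid cell backwards along a spatially varying motion field and interpolate, which handles rotation and deformation.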

The nowcasting quality of pyRCIT is evaluated by comparison with several established rainfall nowcasting methods: TREC, SCOUT, and pySTEPS. The comparison showed that the deterministic nowcasting score of pyRCIT was higher than those of TREC and SCOUT and nearly equal to that of pySTEPS; for ensemble nowcasting, the score of pyRCIT was higher than those of all three methods for the selected cases. pyRCIT can serve as the basis for further hydro-meteorological applications such as spatial and temporal analysis of rainfall events and flash flood forecasting.

The code of pyRCIT is available at https://github.com/greensubriane/PYRCIT.git

How to cite: He, T. and Einfalt, T.: pyRCIT - A rainfall nowcasting tool based on a synthetic approach, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2493, https://doi.org/10.5194/egusphere-egu23-2493, 2023.

EGU23-5818 | Posters on site | HS3.8

Application of multimodal deep learning using radar and water level data for water level prediction 

Seongsim Yoon, Seyong Kim, and Sangmin Bae

In general, water level prediction models using deep learning techniques have been developed from time-series observations at upstream and target water level stations, even though much physical data is needed to predict water levels. Water level changes are strongly affected by rainfall in the basin, so rainfall information is needed to predict the water level more accurately. In particular, radar data has the advantage of directly capturing the rainfall occurring within a watershed. This study aims to develop multimodal deep learning models that predict the water level from 2D gridded radar rainfall data and 1D time-series water level observations. Two multimodal deep learning models with different structures are proposed. Both predict the water level by simultaneously using the water levels observed up to the present time and the radar rainfall data that will affect the water level in the future. The first proposed model links 2D average pooling (AvgPool2D), which compresses the 2D radar data to 1D, with a Long Short-Term Memory (LSTM) network that predicts the 1D water level time series. The second proposed model links Conv2DLSTM, which can reflect the characteristics of the 2D radar data without deformation, with an LSTM. The two proposed multimodal models were trained and evaluated in the upper basin of the Hantan River and compared with a single-modal LSTM using only water level data. There are three water level stations in the study area, and the objective was to predict the water level of the downstream station up to 180 minutes in advance. For training and verification of the deep learning models, 10-minute water level and radar rainfall data were collected from May 2019 to October 2021.
For the radar input, the grid cells covering the target watershed were extracted from the 1-km-resolution composite radar data operated by the Ministry of Environment. Evaluation of the trained models showed that both multimodal models had higher prediction accuracy than the single-modal model using only water level data. In particular, the second proposed model (Conv2DLSTM+LSTM) performed better than the first (AvgPool2D+LSTM) during sudden water level rises caused by rainfall.
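The data-fusion step of the first proposed model can be sketched as follows: 2D average pooling compresses each radar grid into a coarse feature vector, which is concatenated with the recent water-level series before being fed to an LSTM. Only this pooling/fusion stage is shown, in plain Python with toy values; the network training itself is omitted and the grid size is an assumption.

```python
def avg_pool2d(grid, k):
    """Average-pool a 2D grid with a k x k window and stride k."""
    rows, cols = len(grid), len(grid[0])
    pooled = []
    for i in range(0, rows - k + 1, k):
        row = []
        for j in range(0, cols - k + 1, k):
            window = [grid[i + a][j + b] for a in range(k) for b in range(k)]
            row.append(sum(window) / (k * k))
        pooled.append(row)
    return pooled

# Hypothetical 4x4 radar rainfall grid (mm/10 min) and recent water levels (m)
radar = [[0, 0, 2, 4],
         [0, 1, 3, 5],
         [1, 2, 0, 0],
         [2, 3, 0, 0]]
levels = [1.20, 1.22, 1.31]

pooled = avg_pool2d(radar, k=2)  # -> 2x2 summary of the rain field
features = [v for row in pooled for v in row] + levels  # fused input vector
print(pooled)  # [[0.25, 3.5], [2.0, 0.0]]
```

The second proposed model avoids this lossy compression by letting Conv2DLSTM layers consume the full 2D grids directly, which is consistent with its better performance during sharp rises.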

Acknowledgments

Research for this paper was carried out under the KICT Research Program (project no. 202200175-001, Development of future-leading technologies solving water crisis against to water disasters affected by climate change) funded by the Ministry of Science and ICT.

How to cite: Yoon, S., Kim, S., and Bae, S.: Application of multimodal deep learning using radar and water level data for water level prediction, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-5818, https://doi.org/10.5194/egusphere-egu23-5818, 2023.

EGU23-7700 | ECS | Orals | HS3.8

Improved flush detection and classification in combined sewer monitoring 

Markus Pichler and Dirk Muschalla

During rain events, rainwater reaches the combined sewer system and adds hydraulic and pollutant load. Due to the limited capacity of the sewer system and the wastewater treatment plant, overflow structures are built to relieve the system, which creates a potential hazard for the environment. For optimal management of these structures, it is necessary to know the runoff and pollutant load of the events and their distribution over time. When these distributions show a significant peak, they are often referred to as a flush, the best-known phenomenon being the first flush at the beginning of a rainfall event. This knowledge can be used for the design of retention facilities and the calibration of sewer models. Flush phenomena are mainly caused by the erosion of contaminants on the surface and the remobilisation of sediments in the sewer network.

Although many papers have investigated the first flush, no common pattern for the occurrence of these flushes has been identified. While the concentration of flushes in stormwater sewers can be measured directly, flushes in combined sewers are mixed with more polluted wastewater, which reduces the signal strength.

The sensor site providing the measurement data is located at a combined sewer overflow in the western part of Graz, Austria, with a catchment area of 460 ha, consisting mainly of residential areas with about 19,500 inhabitants.

This work aims to separate and classify pollution flush signals from rainfall events in combined sewer systems to better understand the relationship between these signals and rainfall event characteristics.

For this reason, the continuous hydraulic and pollution data are first analysed to determine the representative dry weather contribution. By subtracting the dry weather contribution from the combined wastewater volume and the mass flux, the stormwater contribution and thus the flushes can be estimated. In addition, combined sewer events were detected automatically.

Next, the wet weather events are classified by clustering the stormwater-runoff-induced pollutant distributions (flush signals) and the event parameters. To cluster the temporal pollutant load distributions of events of different duration, the events are normalised using mass-volume curves. To obtain the best possible clustering result, the dimensionality of the mass-volume curves is reduced by principal component analysis. Different clustering methods, such as partitioning and hierarchical methods, are applied and compared.
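The mass-volume normalisation mentioned above can be sketched briefly: cumulative pollutant mass is expressed against cumulative runoff volume, both normalised to their event totals, so that events of different duration become comparable. The event data below are synthetic and purely illustrative.

```python
def mass_volume_curve(flow, conc):
    """Normalised cumulative mass vs. normalised cumulative volume for one event."""
    masses = [q * c for q, c in zip(flow, conc)]  # mass flux per time step
    cum_v, cum_m, v, m = [], [], 0.0, 0.0
    for q, dm in zip(flow, masses):
        v += q; m += dm
        cum_v.append(v); cum_m.append(m)
    return ([x / cum_v[-1] for x in cum_v],
            [x / cum_m[-1] for x in cum_m])

# Hypothetical event: high concentrations early -> first-flush-like curve
flow = [2.0, 4.0, 4.0, 2.0]     # stormwater flow per time step
conc = [300., 200., 100., 50.]  # pollutant concentration (mg/L)
V, M = mass_volume_curve(flow, conc)
print([round(v, 3) for v in V])  # [0.167, 0.5, 0.833, 1.0]
print([round(m, 3) for m in M])  # [0.316, 0.737, 0.947, 1.0]
# A first flush appears as M rising above the 1:1 line (M > V early on)
```

Curves resampled to a common set of normalised volumes can then be stacked, reduced by PCA, and clustered as described.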

How to cite: Pichler, M. and Muschalla, D.: Improved flush detection and classification in combined sewer monitoring, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7700, https://doi.org/10.5194/egusphere-egu23-7700, 2023.

EGU23-8802 | Orals | HS3.8

Improving Early Warning System for Urban Flooding in Chinese Mega Cities using Advanced Active Phased Array Radar 

Dehua Zhu, Yunqing Xuan, Richard Body, Dongming Hu, and Xiaojun Bao

This two-year trial brings together academics and industrial partners from the UK and China to conduct a pilot study on the use of active phased array radar to provide early urban flood warnings for Chinese mega cities, which face challenging urban flood issues. It is the first cascade-modelling demonstration worldwide in which a cutting-edge active phased array radar (APRA) provides rainfall monitoring and nowcasting information for a real-time two-dimensional urban drainage model. The collaboration built up by this project, together with the first-hand experimental data, will help catalyse the uptake of state-of-the-art weather radars for urban flood risk management, drive innovation in tuning radar technology to the complex urban environment, and support advanced modelling facilities designed to link the observations and provide decision-making support to city governments. Recommendations for applying high spatio-temporal resolution precipitation data to real-time flood forecasting in an urban catchment are provided, and suggestions for further investigation are discussed.

How to cite: Zhu, D., Xuan, Y., Body, R., Hu, D., and Bao, X.: Improving Early Warning System for Urban Flooding in Chinese Mega Cities using Advanced Active Phased Array Radar, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8802, https://doi.org/10.5194/egusphere-egu23-8802, 2023.

EGU23-9494 | Orals | HS3.8

A Nonstationary Multivariate Framework for Modelling Compound Flooding 

Yunqing Xuan and Han Wang

Flooding is widely regarded as one of the most dangerous natural hazards worldwide. It often arises from various sources, individually or combined, such as extreme rainfall, storm surge, high sea level, or large river discharge. The concurrence or close succession of these different source mechanisms can lead to compound flooding, with larger damages and even more catastrophic consequences than events caused by any individual mechanism. Here, we present a modelling framework aimed at supporting risk analysis of compound flooding in the context of climate change, where the nonstationary joint probability of multiple variables and their interactions needs to be quantified. The framework uses the block-bootstrapping Mann-Kendall test to detect temporal changes in the marginals, and a correlation test combined with a rolling-window method to assess whether the correlation structure varies with time; it then evaluates various combinations of marginals and copulas under stationary and nonstationary assumptions. A Bayesian Markov chain Monte Carlo method is employed to estimate the time-varying parameters of the copulas.
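The trend-detection step of the framework can be illustrated with the plain (non-bootstrapped) Mann-Kendall test; the block bootstrapping, copula fitting, and Bayesian MCMC steps are not shown. The series below is a synthetic set of annual maxima with an imposed upward trend, purely for illustration.

```python
import math

def mann_kendall(x):
    """Return the Mann-Kendall S statistic and the normal-approximation Z score."""
    n = len(x)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            s += (x[j] > x[i]) - (x[j] < x[i])  # +1 concordant, -1 discordant
    var_s = n * (n - 1) * (2 * n + 5) / 18.0    # variance assuming no ties
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Hypothetical annual maxima (m) with an upward trend
series = [1.2, 1.4, 1.3, 1.7, 1.6, 1.9, 2.1, 2.0, 2.4, 2.5]
s, z = mann_kendall(series)
print(s, round(z, 2))  # |z| > 1.96 suggests a significant trend at the 5% level
```

A significant trend in a marginal would then motivate a nonstationary marginal distribution with time-varying parameters in the copula framework described above.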

How to cite: Xuan, Y. and Wang, H.: A Nonstationary Multivariate Framework for Modelling Compound Flooding, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-9494, https://doi.org/10.5194/egusphere-egu23-9494, 2023.

EGU23-9546 | ECS | Orals | HS3.8

DeepRain: a separable residual convolutional neural algorithm with squeeze-excitation blocks for rainfall nowcasting 

Ahmed Abdelhalim, Miguel Rico-Ramirez, and Dawei Han

Precipitation nowcasting is critical for mitigating natural disasters caused by severe weather events. State-of-the-art operational nowcasting methods are radar extrapolation techniques that calculate the motion field from sequential radar images and advect the precipitation field into the future. However, these methods assume the motion field to be invariant, and prediction is based solely on recent observations rather than historical radar sequences. To overcome these limitations, deep learning methods such as convolutional neural networks have recently been applied to radar rainfall nowcasting. Despite the promising progress of deep learning techniques in rainfall nowcasting, these methods face several challenges, including blurry predictions, inaccurate forecasting of high rainfall intensities, and degradation of prediction accuracy with increasing lead time. The aim of this study is therefore to develop a more performant deep learning model capable of overcoming these challenges and preventing information loss, in order to produce more accurate nowcasts. DeepRain is a convolutional neural network that uses a spatial and channel Squeeze & Excitation block after each convolutional layer, local importance-based pooling, and residual connections to improve model performance. The algorithm is trained and validated using the UK Met Office's radar rainfall mosaic, produced by the Nimrod system. Using verification metrics, the model's predictive skill is compared with another deep learning model and a range of extrapolation methods.
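The channel squeeze-and-excitation idea named above can be sketched generically: global average pooling "squeezes" each channel to one number, a small bottleneck with a sigmoid "excites" per-channel weights, and the feature maps are rescaled. This is a plain-Python illustration of the standard SE mechanism with assumed toy weights, not DeepRain's actual layers or parameters.

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def se_block(channels, w1, w2):
    """channels: list of 2D feature maps; w1, w2: bottleneck weight matrices."""
    # Squeeze: global average pool per channel
    z = [sum(sum(row) for row in ch) / (len(ch) * len(ch[0])) for ch in channels]
    # Excite: FC -> ReLU -> FC -> sigmoid (biases omitted for brevity)
    h = [max(0.0, sum(w * x for w, x in zip(row, z))) for row in w1]
    s = [sigmoid(sum(w * x for w, x in zip(row, h))) for row in w2]
    # Rescale each channel by its learned attention weight
    return [[[v * s[c] for v in row] for row in ch]
            for c, ch in enumerate(channels)]

# Two 2x2 feature maps and a bottleneck of size 1 (toy, untrained weights)
feats = [[[1.0, 2.0], [3.0, 4.0]],
         [[0.0, 0.0], [0.0, 4.0]]]
w1 = [[0.5, 0.5]]      # 2 channels -> 1 hidden unit
w2 = [[1.0], [-1.0]]   # 1 hidden unit -> 2 channel weights
out = se_block(feats, w1, w2)
print(round(out[0][1][1], 3))  # first channel amplified, second suppressed
```

A spatial SE variant works analogously, producing one weight per pixel instead of one per channel; DeepRain combines both.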

Keywords: deep learning; rainfall nowcasting; radar; convolutional neural networks; Squeeze-and-Excitation

How to cite: Abdelhalim, A., Rico-Ramirez, M., and Han, D.: DeepRain: a separable residual convolutional neural algorithm with squeeze-excitation blocks for rainfall nowcasting, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-9546, https://doi.org/10.5194/egusphere-egu23-9546, 2023.

EGU23-9588 | ECS | Orals | HS3.8

Comparative performance of recently introduced Deep Learning models for Rainfall-Runoff Modelling 

Yirgalem Gebremichael, Gerald Corzo Perez, and Dimitri Solomatine

Machine learning, and specifically deep learning, has been applied to numerous hydrology-related problems, and extensive research has compared the performance of different machine learning techniques on such problems. In this research, the possible reasons behind these performance variations are assessed. The performance of recently introduced deep learning techniques for rainfall-runoff modelling is evaluated by examining the modelling set-up and training procedures. Set-up and training choices such as normalization techniques, input variable (feature) selection, sampling techniques, model complexity, optimization techniques, and random initialization of weights are examined closely in order to improve the performance of different deep learning techniques for rainfall-runoff modelling. This study thus asks whether these factors have a significant effect on model accuracy.

The experiments are conducted on deep learning models (LSTMs, GRUs, and MLPs) as well as non-deep-learning models (XGBoost, Random Forest, linear regression, and naïve models). Deep learning frameworks including TensorFlow and Keras are used in Python. For better generalization, study areas from three different climatic zones were chosen: the Bagmati catchment in Nepal, the Yuna catchment in the Dominican Republic, and the Magdalena catchment in Colombia. In situ meteorological and streamflow data are used for the rainfall-runoff modelling.

Preliminary results show that model performance for the Bagmati catchment is higher than for the other catchments. The LSTMs and MLPs perform well, with NSE values of 0.71 and 0.72 respectively. Most notably, the linear regression model outperformed the other models, with an NSE of up to 0.75 when 6-day-lagged rainfall was included as input. This suggests the relationship between daily rainfall and runoff in the Bagmati catchment may not be very complex. In contrast, the 3-hourly data from the Yuna catchment yield lower performance metrics, which may indicate more complex relationships within that catchment.
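The NSE values reported above are Nash-Sutcliffe efficiencies: one minus the ratio of the model error variance to the variance of the observations, so NSE = 1 is a perfect fit and NSE ≤ 0 means no better than the observed mean. A minimal sketch with hypothetical discharge values:

```python
def nse(sim, obs):
    """Nash-Sutcliffe efficiency of simulated vs. observed values."""
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))   # model error
    den = sum((o - mean_obs) ** 2 for o in obs)         # observed variance
    return 1.0 - num / den

# Hypothetical runoff series (m3/s)
observed  = [10.0, 12.0, 20.0, 35.0, 18.0, 11.0]
simulated = [11.0, 13.0, 18.0, 32.0, 20.0, 12.0]
print(round(nse(simulated, observed), 3))
```

Because the denominator is dominated by high-flow deviations, NSE rewards models that reproduce peaks well, which is why it is the usual headline metric in rainfall-runoff benchmarking.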

This research highlights key elements of the modelling process, especially in setting up and training deep learning models for rainfall-runoff modelling. The comparative analysis performed here provides a basis for understanding performance variations across different basins and contributes to the understanding of machine learning requirements for different types of river basins.

How to cite: Gebremichael, Y., Corzo Perez, G., and Solomatine, D.: Comparative performance of recently introduced Deep Learning models for Rainfall-Runoff Modelling, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-9588, https://doi.org/10.5194/egusphere-egu23-9588, 2023.

EGU23-10491 | ECS | Posters on site | HS3.8

Addressing discoverability, trust and data quality in peer-to-peer distributed databases for citizen science 

Julien Malard-Adam, Sheeja Krishnankutty, Anandaraja Nallusamy, and Wietske Medema

Peer-to-peer distributed databases show promise for lowering the barrier to entry for citizen science projects. These databases, which do not require a centralised server to store and exchange data, instead use participants' devices (phones or computers) to store and transfer data directly between project participants. This offers concrete advantages: it avoids the usually very costly and time-consuming server maintenance for the research team, and it improves data access and sovereignty for the participating communities.

However, several technical challenges remain before distributed databases can be used routinely in citizen science projects. In particular, indexing data and discovering peers who hold data of interest or from the same project; managing safety, trust, and permissions; and ensuring data quality, all without relying on a central server, require a rethinking of the standard paradigms of database and user account management.

This presentation will give a brief overview of the Constellation software for distributed scientific databases before presenting several novel approaches (concentric recursive data search, user network-centric trust, and multiple data quality verification layers) it has adopted to respond to the above-mentioned challenges. Examples of concrete applications of Constellation for data sharing in the fields of hydrology and agronomy will be included.

How to cite: Malard-Adam, J., Krishnankutty, S., Nallusamy, A., and Medema, W.: Addressing discoverability, trust and data quality in peer-to-peer distributed databases for citizen science, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10491, https://doi.org/10.5194/egusphere-egu23-10491, 2023.

Population growth and economic development increase water demand, while human activities degrade the quality of the water resources available along adjacent rivers. The U.S. state of Alabama has suffered from floods that degrade water quality by scouring pollutants into the water. In recent decades, Alabama has also experienced persistent precipitation deficits and unusually severe droughts, limiting economic and water-based recreation activities in downstream states. Since 2020, the COVID-19 pandemic has prompted policies such as quarantines and lockdowns, which slowed economic development and reduced the chances of people going outside and witnessing water pollution incidents.

In this study, we conducted a sentiment analysis of over 9,900 water pollution complaints (2012-2020) from residents of Alabama. Overall, the complaints are dominated by negative and objective reports, regardless of the type of extreme event or environmental accident. Sentiment alteration was detected during climate extremes and the COVID period, and potential causes of this alteration in the public water pollution complaint reports were explored. More complaints occur during summer, which can be explained by the higher temperatures and more intense precipitation at that time. Complaints are concentrated in more socioeconomically developed counties, that is, counties with larger populations and higher GDP. The severity of antecedent extreme events can affect the sentiment of complaints about ongoing extreme events, owing to the limits of human judgement. Keywords extracted from the complaints point to pollution sources and locations, providing important clues for local governments to resolve problems.
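The abstract does not specify the sentiment method used, so as a minimal illustration only, complaint texts are often scored with lexicon-based polarity counting: the difference between assumed positive and negative cue words determines the label. The word lists and example texts below are hypothetical.

```python
# Assumed, illustrative cue-word lexicons (not the study's actual lexicon)
NEGATIVE = {"polluted", "dirty", "smell", "dead", "spill", "contaminated"}
POSITIVE = {"clean", "clear", "improved", "restored"}

def polarity(text):
    """Label a complaint text by counting positive vs. negative cue words."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

complaints = [
    "dead fish and a strong smell near the creek",
    "water looks clean and clear after the repair",
    "runoff entering the storm drain",
]
print([polarity(c) for c in complaints])  # ['negative', 'positive', 'neutral']
```

Aggregating such labels by month and county is what allows the seasonal and socioeconomic patterns described above to be detected.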

This study provides an example of how unstructured data such as public complaints can be used to improve water pollution and public health monitoring with the help of big data and artificial intelligence technologies. While the results of this study are based on water pollution complaints from residents of Alabama, the approach is applicable to other types of environmental pollution (such as air and soil) and to other regions with long-term textual data available.

 

How to cite: Liu, A. and Kam, J.: Observed Sentimental Alteration in the Public Water Pollution Complaints during Climatic Extremes and the COVID-19 Pandemic, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10886, https://doi.org/10.5194/egusphere-egu23-10886, 2023.

EGU23-10937 | ECS | Orals | HS3.8

Feature engineering strategies based on GIS and the SCS-CN method for improving hydrological forecasting in a complex mountain basin 

María José Merizalde, Paul Muñoz, Gerald Corzo, and Rolando Célleri

Hydrological modeling and forecasting are important tools for adequate water resources management, especially in complex systems (basins) characterized by high spatio-temporal variability of runoff driving forces, landscape heterogeneity, and insufficient hydrometeorological monitoring. Over the last decades, the use of machine learning (ML) techniques for runoff forecasting has become popular, and the current research trend focuses on feature engineering (FE) strategies aimed both at improving forecasting efficiency and at allowing model interpretation. Here, we employed three ML techniques, the Random Forest (RF) algorithm, traditional Artificial Neural Networks (ANN), and specialized Long Short-Term Memory (LSTM) networks, assisted by FE strategies, to develop short-term runoff forecasting models for a 3300-km2 complex basin representative of the tropical Andes of Ecuador. We exploited two readily available satellite products, IMERG and GSMaP, to overcome the absence of ground precipitation data, and the proposed FE strategies were based on GIS and the Soil Conservation Service Curve Number (SCS-CN) method, synthesizing land use and land cover, soil types, slope, and other hydrological concepts. To assess the forecasting improvement, we contrasted a set of efficiency metrics calculated for the specialized models and for reference models developed without FE strategies. In terms of results, we were first able to develop accurate forecasting models by exploiting satellite precipitation data powered by ML techniques of different complexity. Second, the reference models reached Nash-Sutcliffe efficiencies (NSE) varying from 0.9 (1-hour lead time) to 0.5 (11 hours), comparable across the RF, ANN, and LSTM techniques, whereas the specialized models showed a 5-20% improvement in NSE for all lead times.
The proposed methodology and the insights of this study provide hydrologists with new tools for developing short-term runoff forecasting systems in complex basins otherwise limited by data scarcity and model complexity issues.
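The SCS-CN relation underlying the feature engineering above can be stated compactly: the curve number CN gives a potential maximum retention S, and direct runoff Q follows from event rainfall P after an initial abstraction (conventionally Ia = 0.2*S). A minimal sketch in mm units, with an illustrative CN value:

```python
def scs_cn_runoff(p_mm, cn):
    """Direct runoff depth (mm) for event rainfall p_mm and curve number cn."""
    s = 25400.0 / cn - 254.0  # potential maximum retention (mm, metric form)
    ia = 0.2 * s              # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0            # all rainfall abstracted, no direct runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Runoff for a 50 mm storm over a catchment with an assumed CN of 75
print(round(scs_cn_runoff(50.0, 75), 2))
```

In an FE setting, CN values derived from land use/land cover and soil maps per sub-catchment can be turned into such runoff-potential features and offered to the ML models alongside the satellite rainfall inputs.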

How to cite: Merizalde, M. J., Muñoz, P., Corzo, G., and Célleri, R.: Feature engineering strategies based on GIS and the SCS-CN method for improving hydrological forecasting in a complex mountain basin, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10937, https://doi.org/10.5194/egusphere-egu23-10937, 2023.

EGU23-11217 | ECS | Posters on site | HS3.8

International Natural Disasters Research and Analytics (INDRA) Reporter: A multi-platform Citizen Science Framework and Tools for Disaster Risk Reduction 

Manabendra Saharia, Dhiraj Saharia, Shreya Gupta, and Satyakam Singhal

With pervasive access to mobile phones carrying powerful sensors and processors, crowdsourcing has become increasingly prominent as a means of supplementing data from traditional sensors. However, a comprehensive application programming interface (API)-based framework that can collect data from multiple sources through user-friendly workflows has been lacking. INDRA Reporter has been designed with a mobile-first approach geared towards real-time applications, with an emphasis on user interface/user experience (UI/UX) to maximize the collection of high-fidelity data. This work details a comprehensive suite of tools for active and passive crowdsensing of natural hazards such as floods, storms, lightning, and rain. Currently the framework includes mobile applications, Telegram chatbots, and a publicly available dashboard. Most citizen science applications in flooding are quantitative, which makes it difficult for non-specialists to provide accurate scientific information, and to convey their insight into the prevailing situation, within a single coherent workflow. It is imperative that workflows targeting dangerous situations emphasize speed and visual clarity while collecting critical data. The main objective of INDRA is to provide a simple and intuitive way of collecting qualitative and quantitative data from people. Since traditional data collection through ground-based sensors and satellites suffers from various limitations, measurements collected using INDRA will supplement these sources and form the basis for developing multi-sensor data products. We report the development and release of four components of the framework: a) the open INDRA API, b) the INDRA Reporter mobile application, c) a Telegram chatbot, and d) a web dashboard. The API has been kept extensible in order to expand data collection to other hydrologic and meteorological phenomena.

How to cite: Saharia, M., Saharia, D., Gupta, S., and Singhal, S.: International Natural Disasters Research and Analytics (INDRA) Reporter: A multi-platform Citizen Science Framework and Tools for Disaster Risk Reduction, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11217, https://doi.org/10.5194/egusphere-egu23-11217, 2023.

EGU23-11419 | ECS | Posters on site | HS3.8

Spatio-temporal analysis of storm surge in the Korean Peninsula 

Jung-A Yang

The Korean Peninsula (KP), located in the Northwest Pacific, has diverse topographic features. The west coast of the KP has large tidal ranges; if a storm surge occurs at high tide, the west coast is vulnerable to flooding. The south coast has a complex coastline with hundreds of islands. This complex topography can amplify the storm surge height (SSH) and also makes numerical storm surge modelling difficult. Moreover, as the KP lies in the pathways of typhoons, it is affected by an average of three typhoons every year and has suffered storm-surge-induced disasters several times in the past. To plan efficient and effective countermeasures against storm surge disasters, the vulnerability of coastal regions of the KP must be identified. This study therefore quantitatively analyzed the frequency and causes of the storm surges that have occurred along the Korean coast.

First, observed tidal data were collected at 48 tide stations installed along the coast of the KP, and a harmonic analysis was performed on the observations to build a database of SSHs along the coast of the KP from 1979 to 2021. Next, the cause of each storm surge was assessed based on the time of occurrence of high SSH: surges occurring in winter were attributed to extra-tropical cyclones, and those in summer to tropical cyclones. Lastly, storm-surge-vulnerable areas were assessed based on the frequency and level of the SSH. To this end, the coast of the KP was divided into five zones: the northwest coast, the southwest coast, Jeju Island, the southeast coast, and the northeast coast. The frequency of high SSH in each zone was calculated, and the areas where storm surges occurred most frequently were identified as vulnerable areas.
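The surge-extraction step described above can be sketched with a single tidal constituent: the astronomical tide is estimated by least-squares harmonic analysis (here reduced to a projection over whole cycles), and the storm surge height is the observed level minus the predicted tide. The record below is synthetic; a real analysis fits many constituents to the multi-year station records.

```python
import math

def fit_constituent(levels, omega, dt):
    """Cos/sin amplitudes at frequency omega, by projection over whole cycles."""
    n = len(levels)
    a = 2.0 / n * sum(x * math.cos(omega * i * dt) for i, x in enumerate(levels))
    b = 2.0 / n * sum(x * math.sin(omega * i * dt) for i, x in enumerate(levels))
    return a, b

omega = 2 * math.pi / 12.42  # M2 constituent, period 12.42 h
dt = 1.0                     # hourly sampling
n = 2484                     # 200 complete M2 cycles -> orthogonal basis

tide = [1.5 * math.cos(omega * i * dt) for i in range(n)]          # 1.5 m tide
surge_event = [0.8 if 1000 <= i < 1010 else 0.0 for i in range(n)]  # a storm
observed = [t + s for t, s in zip(tide, surge_event)]

a, b = fit_constituent(observed, omega, dt)
predicted = [a * math.cos(omega * i * dt) + b * math.sin(omega * i * dt)
             for i in range(n)]
surge = [o - p for o, p in zip(observed, predicted)]  # non-tidal residual
print(round(max(surge), 2))  # recovers the ~0.8 m surge residual
```

Classifying each high residual by its date (winter vs. summer) then yields the extra-tropical vs. tropical cyclone attribution used in the study.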

As a result of the study, it was found that high SSHs exceeding 1 m in the KP are caused by tropical cyclones in summer, and that the area most vulnerable to storm surge is the southeast coast.

However, the observed tidal data used in this study have the limitation that the record length differs between zones: the observation period is longer for the southeast coast than for the southwest coast. Given that many past typhoons passed over the west coast, the possibility that the southwest coast would have been judged more vulnerable than the southeast coast, had its tidal records been more abundant, cannot be ignored. In addition, since storm surge is affected not only by meteorological conditions but also by topographic conditions (e.g., coastline complexity, water depth), a spatio-temporal analysis of storm surge by topographic conditions will be conducted in future research.

 

Acknowledgement

This work was supported by the National Research Foundation of Korea grant funded by the Korea government(MSIT) (No. 2022R1C1C2009205).

 

How to cite: Yang, J.-A.: Spatio-temporal analysis of storm surge in the Korean Peninsula, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11419, https://doi.org/10.5194/egusphere-egu23-11419, 2023.

EGU23-12475 | Posters on site | HS3.8

Comparing different radar-raingauge precipitation merging methods for Tuscany region 

Rossano Ciampalini, Andrea Antonini, Alessandro Mazza, Samantha Melani, Alberto Ortolani, Ascanio Rosi, Samuele Segoni, and Sandro Moretti

Radar-based rainfall estimation is an effective tool for hydrological modelling. Nevertheless, this data type is subject to systematic and natural perturbations that need to be accounted for before use. To improve data quality, corrections based on raingauge observations are therefore frequently adopted. Here, we compare the efficacy of different radar-raingauge merging procedures over a regional raingauge-radar network, focusing on a selected number of rainfall events.
We adopted three methods: 1) Kriging with External Drift (KED) interpolation (Wackernagel, 1998), 2) the Probability Matching Method (PMM; Rosenfeld et al., 1994), and 3) a mixed kriging method exploiting the Conditional Merging (CM) process (Sinclair and Pegram, 2005), based on elaborations available at DPCN (Italian National Civil Protection Department). These methods were applied over the Tuscany regional territory using raingauge rainfall records at a 15-minute time step and CAPPI (Constant Altitude Plan Position Indicator) reflectivity data at 2000/3000/5000 m at 5- and 10-minute intervals.
Relationships between precipitation and radar reflectivity were integrated and elaborated as part of the development of the merging procedures, while the comparison involved the analysis of variance and diversity coefficients. Kriging-based elaborations showed different spatial patterns depending on the applied procedure, with patterns closer to the radar variability when using the DPCN product, and more reflective of the gauge data structure when adopting KED. The probabilistic method (PMM), instead, had the advantage of integrating the gauge data while preserving the spatial radar patterns, confirming interesting perspectives.
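
Of the three methods, conditional merging is the simplest to sketch. The toy below substitutes inverse-distance weighting for the kriging interpolator (an assumption made for brevity); the structure, gauge field plus the radar residual field, follows Sinclair and Pegram (2005):

```python
import numpy as np

def idw(xy_obs, values, xy_grid, power=2.0):
    """Inverse-distance weighting, standing in here for the kriging
    interpolator used in the actual conditional-merging procedure."""
    d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

def conditional_merge(xy_g, gauge, xy_grid, radar_grid, radar_at_gauge):
    """Conditional merging: merged = interpolated gauges +
    (radar field - interpolated radar-at-gauge values), so the result
    matches the gauges locally while keeping the radar's spatial pattern."""
    gauge_field = idw(xy_g, gauge, xy_grid)
    radar_residual = radar_grid - idw(xy_g, radar_at_gauge, xy_grid)
    return gauge_field + radar_residual

# toy example: 5x5 grid, a linear "radar" field, three gauges on grid nodes
gx, gy = np.meshgrid(np.arange(5.0), np.arange(5.0))
xy_grid = np.column_stack([gx.ravel(), gy.ravel()])
radar_grid = 0.5 * (xy_grid[:, 0] + xy_grid[:, 1])
gauge_idx = [0, 12, 24]
xy_g = xy_grid[gauge_idx]
gauge = np.array([1.0, 3.5, 6.0])
merged = conditional_merge(xy_g, gauge, xy_grid, radar_grid, radar_grid[gauge_idx])
```

At the gauge locations the merged field reproduces the gauge values exactly, which is the defining property of the method.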

How to cite: Ciampalini, R., Antonini, A., Mazza, A., Melani, S., Ortolani, A., Rosi, A., Segoni, S., and Moretti, S.: Comparing different radar-raingauge precipitation merging methods for Tuscany region, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12475, https://doi.org/10.5194/egusphere-egu23-12475, 2023.

EGU23-12857 | Orals | HS3.8

Surge-tide interaction along the Italian coastline 

Alessandro Antonini, Elisa Ragno, and Davide Pasquali

Storm surge events are probably one of the most studied phenomena in coastal regions since they can lead to coastal flooding, environmental damage, and sometimes loss of human life. In regions of shallow water, among other localized processes, surges occurring at high astronomical tides tend to be damped while surges occurring at rising tides are amplified affecting water level extremes. This requires accounting for tide-surge interaction when defining the coastal hazards due to extreme water levels.

Cities along the Italian coast, such as Venice, Ravenna, Bari (Adriatic sea), Genova, Livorno, Napoli, and Palermo (Tyrrhenian sea), are vulnerable to coastal flooding. Hence, a thorough understanding of the interaction between water level components, i.e., storm surge and astronomical tides, is required to define a proper framework for coastal risk assessment.

Here, we analyze water level observations in several Italian coastal locations to investigate possible correlation and interaction between astronomical tide and the storm surge. Then we study the effect that such interaction has on extreme water level statistics to support the development of flood-resilient adaptation strategies.
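
One simple screening for the interaction described above is to test whether surge magnitude varies systematically with tidal stage. The quantile-binning sketch below is a hypothetical illustration; the statistics used in the study are likely more elaborate:

```python
import numpy as np

def surge_by_tide_stage(tide, surge, n_bins=4):
    """Group surge values by tidal stage (quantile bins of tide height)
    and return the mean absolute surge per bin: a flat profile suggests
    independence, while a tilted one suggests tide-surge interaction."""
    edges = np.quantile(tide, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.digitize(tide, edges[1:-1]), 0, n_bins - 1)
    return np.array([np.abs(surge[idx == k]).mean() for k in range(n_bins)])

# synthetic check: surges deliberately damped at high tide, as observed
# in shallow water -- the binned profile should decrease with tide height
tide = np.sin(np.linspace(0, 40 * np.pi, 5000))
norm = (tide - tide.min()) / (tide.max() - tide.min())
surge = 0.3 * (1 - norm) + 0.05
profile = surge_by_tide_stage(tide, surge)
```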

How to cite: Antonini, A., Ragno, E., and Pasquali, D.: Surge-tide interaction along the Italian coastline, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12857, https://doi.org/10.5194/egusphere-egu23-12857, 2023.

EGU23-12909 | Posters on site | HS3.8

Smart Groundwater Monitoring System for Managed Aquifer Recharge Based on Enabled Real-Time Internet of Things 

Khan Zaib Jadoon, Muhammad Zeeshan Ali, Hammad Ullah Khan Yousafzai, Khalil Ur Rehman, Jawad Ali Shah, and Nadeem Ahmed Shiekh

Groundwater has long provided a reliable source of high-quality water for human use. After India, the USA, and China, Pakistan is the fourth largest groundwater user in the world, extracting around 60×10⁹ m³ of groundwater annually. The situation in Pakistan was further exacerbated when the government subsidized electricity for agricultural purposes, paving the way for the installation of myriad tube wells across the country and resulting in excessive groundwater withdrawal. The major challenges for a sustainable groundwater management system are twofold. First, increasing withdrawals to meet growing human needs have led to significant groundwater depletion, which usually goes unmonitored owing to the high cost of monitoring systems. Second, data limitations constrain the application of regional groundwater models for future prediction.

In this research, an Internet of Things (IoT) enabled smart groundwater monitoring system has been developed and tested to monitor the in-situ real-time dynamics of groundwater depletion. Each groundwater monitoring sensor is connected to an embedded module consisting of a microcontroller and a wireless transceiver based on Long Range Radio (LoRa) technology. The readings from each LoRa-enabled module are aggregated at one (or more) gateways, which are then connected to a central server, typically through an IP connection. The sensors of the smart groundwater monitoring system were calibrated in the lab by fluctuating water levels in a 3-meter water column. A network of the low-cost groundwater sensors was installed in managed aquifer recharge wells to provide real-time remote assessment of groundwater levels. The smart and resource-efficient groundwater monitoring system helps reduce the number of physical visits to the field and also enhances stakeholders' participation, yielding social benefits (promoting equity among groundwater users), economic benefits (optimized pumping, which reduces energy costs), and technical benefits (better estimates of groundwater abstraction) for sustainable groundwater management.
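
The lab calibration step, mapping raw sensor readings to water depth in the 3 m column, can be sketched as a simple linear fit. All numbers below are made-up placeholders, not the project's data:

```python
import numpy as np

# Hypothetical calibration of a level sensor against known depths in a
# 3 m water column: fit a line from raw ADC counts to depth in metres.
depth_m = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
raw_counts = np.array([512, 766, 1021, 1274, 1530, 1783, 2038])  # illustrative readings

slope, offset = np.polyfit(raw_counts, depth_m, 1)  # least-squares line

def counts_to_depth(counts):
    """Convert a raw sensor reading to water depth (m) via the fitted line."""
    return slope * counts + offset
```

In a deployed system the fitted coefficients would be stored on (or alongside) each LoRa module so that the gateway forwards calibrated depths rather than raw counts.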

How to cite: Jadoon, K. Z., Ali, M. Z., Yousafzai, H. U. K., Rehman, K. U., Shah, J. A., and Shiekh, N. A.: Smart Groundwater Monitoring System for Managed Aquifer Recharge Based on Enabled Real-Time Internet of Things, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12909, https://doi.org/10.5194/egusphere-egu23-12909, 2023.

EGU23-13505 | Posters on site | HS3.8

Water observations by the public- experiences from the CrowdWater project 

Ilja van Meerveld, Franziska Schwarzenbach, Rieke Goebel, Mirjam Scheller, Sara Blanco Ramirez, and Jan Seibert

Hydrology is a data-limited science; spatially distributed observations in particular are lacking. Citizen science observations can complement existing monitoring networks and provide useful data. Engaging the public in data collection can also increase people's interest in and awareness of water-related topics. In this PICO, we will present the CrowdWater project, in which citizen scientists share, with the help of a smartphone app, hydrological observations on stream water levels, the presence of water in temporary streams, soil moisture conditions, plastic pollution, and general information on water quality. We will highlight the type of data that are collected, our quality control procedures, and the value of the data for hydrological model calibration. We will also discuss the motivations of the citizen scientists to join the project and to continue contributing to it. Although the majority of our frequent contributors are adults, we try to engage young people in the project by giving presentations in schools and at science fairs. We will therefore end the PICO presentation with some examples of our outreach work and lessons learned.

How to cite: van Meerveld, I., Schwarzenbach, F., Goebel, R., Scheller, M., Blanco Ramirez, S., and Seibert, J.: Water observations by the public- experiences from the CrowdWater project, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13505, https://doi.org/10.5194/egusphere-egu23-13505, 2023.

EGU23-15389 | ECS | Posters on site | HS3.8

An innovative data driven approach improves drought impact analysis using earth observation data 

Ye Tuo, Xiaoxiang Zhu, and Markus Disse

Drought is a devastating natural hazard of diverse magnitude, duration, and intensity. It leads to economic and social losses and ecological imbalances. Owing to climate change, droughts have occurred more frequently and with higher intensity worldwide in recent decades, such as the striking droughts in the summer of 2022. For water resources, one direct consequence of drought is the decrease of water in rivers, which can develop into water shortages for irrigation and drinking water supply and into cargo-shipping disruption. Therefore, to make management decisions that help mitigate drought damage, it is important to monitor river water anomalies and identify the vulnerable shrinking sections along the river network. Traditional river gauging stations provide only limited observations at particular spots, so a proper use of spatially distributed remote sensing data is necessary and effective. In this work, we develop a novel framework to monitor river water shrinking anomalies based on Earth observation data, combining image processing and machine learning approaches. The Rhine, a major cargo-route river, is selected as the pilot case because its water level dropped sharply and disrupted shipping during the 2022 summer drought in Germany. The Modified Normalized Difference Water Index (MNDWI) is calculated from the green and shortwave-infrared bands of Sentinel-2 satellite images. MNDWI images of a specific non-drought period are defined as the reference dataset representing normal conditions. A new water shrinking index is then introduced to quantify the river water anomaly during drought periods. Specifically, a Python-based algorithm including image processing and machine learning clustering methods is developed to scan along the MNDWI images and compute the water shrinking index with an adjustable river section size.
With the index dataset, river sections are further grouped into categories of drought vulnerability. By parameterizing the section size, the algorithm can quantify and identify vulnerable shrinking river sections at varying scales, providing classified references of drought-affected hotspots for regional water management plans for drought mitigation. Such a scalable framework offers timely, distributed monitoring of drought impacts on water resources along the river network. As a long-term benefit, numerical connections can be identified between the river shrinking status and the economic losses from drought-induced cargo-shipping disruption, which is of great value for drought impact analysis and forecasts.
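
The core computations of the framework can be sketched as follows. The MNDWI is the standard green/SWIR normalized difference; the water shrinking index is not defined in the abstract, so the fractional pixel-loss form below is only a plausible guess:

```python
import numpy as np

def mndwi(green, swir):
    """Modified Normalized Difference Water Index (Xu, 2006): water
    pixels have MNDWI > 0 because water reflects green light but
    strongly absorbs shortwave infrared."""
    g = np.asarray(green, dtype=float)
    s = np.asarray(swir, dtype=float)
    return (g - s) / np.maximum(g + s, 1e-9)

def water_shrink(mndwi_ref, mndwi_drought, threshold=0.0):
    """Fractional loss of water-classified pixels versus the reference
    scene -- one plausible form of a 'water shrinking index'; the
    actual definition used in the study may differ."""
    ref = (mndwi_ref > threshold).sum()
    now = (mndwi_drought > threshold).sum()
    return 1.0 - now / max(ref, 1)

# toy scene: all four pixels are water in the reference, two dry out
green = np.full((2, 2), 0.3)
m_ref = mndwi(green, np.full((2, 2), 0.1))
m_dry = mndwi(green, np.array([[0.1, 0.5], [0.5, 0.1]]))
shrink = water_shrink(m_ref, m_dry)   # half the water pixels were lost
```

Scanning such an index along the river with a sliding window of adjustable size is what lets the algorithm rank sections by vulnerability.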

How to cite: Tuo, Y., Zhu, X., and Disse, M.: An innovative data driven approach improves drought impact analysis using earth observation data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15389, https://doi.org/10.5194/egusphere-egu23-15389, 2023.

EGU23-16292 | ECS | Posters on site | HS3.8

Hydrological decision-making systems using high-resolution weather radar observations –  a case study from Hungary 

Zsolt Zoltán Fehér, Erika Budayné-Bódi, Attila Nagy, Tamás Magyar, and Tamás János

According to past observations and long-term forecasts, the Carpathian Basin is characterized by two precipitation trends. The frequency, length, and severity of periods of precipitation deficit and drought are increasing. Furthermore, as small-scale convective updrafts intensify, heavy thunderstorms become more intense. Both trends pose significant risks from an anthropogenic perspective: the former increases food insecurity through intensifying droughts that damage agricultural yields, while the latter mainly increases property damage via heavy hailstorms.

The 2022 drought year demonstrated that effective use of available water is the foundation for sustainable growth, which may be supported by well-designed infrastructure investments and smart water management technologies. A rainfall radar system with a high spatial and temporal resolution that contributes to near real-time machine decision-making is one conceivable component of such a complex system.

The Furuno WR-2100 precipitation radar, which was deployed on the outskirts of Debrecen in 2020 for benchmarking purposes, is the first component of such an intelligent decision-making system in Hungary. The radar's range comprises both urban and rural areas, allowing it to continually gather high-resolution test data for both urban hydrology and agricultural irrigation system developments.

The research presented in the article was carried out within the framework of the Széchenyi Plan Plus program with the support of the RRF 2.3.1 21 2022 00008 project.

How to cite: Fehér, Z. Z., Budayné-Bódi, E., Nagy, A., Magyar, T., and János, T.: Hydrological decision-making systems using high-resolution weather radar observations –  a case study from Hungary, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16292, https://doi.org/10.5194/egusphere-egu23-16292, 2023.

The quality of mosaic QPE directly determines the accuracy of QPF products from nowcasting models. However, spatial discontinuities caused by the biases of multiple radars are common in mosaic QPE. Consistency correction, a type of multi-radar quality control method, can mitigate this spatial discontinuity, but its effect on QPF products needs to be analyzed.

For this purpose, a consistency correction method based on GPM KuPR proposed by Chu et al. (2018a) was applied to three S-band operational radars in China, and the improvements in QPE by the Z-R relationship, deterministic QPF by S-SPROG (Spectral Prognosis), and ensemble QPF by STEPS (Short-Term Ensemble Prediction System) were analyzed. The results showed that: 1) the raw reflectivity factors of the three operational radars over the same equidistant area differed significantly; after the consistency correction, the differences decreased to less than 0.5 dB and the spatial discontinuity of the mosaic products disappeared. 2) The precision of the mosaic QPE improved significantly after the correction: the average RMSE of QPE decreased by 19.5%, and the TS of heavy rainfall and above rose by 44.8%. 3) The 0-1 h deterministic QPF by S-SPROG and ensemble QPF by STEPS also improved significantly: the deterministic (ensemble) TS of moderate rain and above rose by 11.9% (10.2%), and that of heavy rain and above increased by 34.2% (38.7%). 4) Furthermore, the consistency correction improved precipitation velocity estimation, decreasing its RMSE by 25.0%. Clearly, the consistency correction method contributes significantly to multi-radar mosaic QPE and precipitation nowcasting.
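
The QPE step referred to above converts reflectivity to rain rate through a Z-R power law. A generic sketch with Marshall-Palmer coefficients follows; the operationally fitted a and b for these radars will differ:

```python
import numpy as np

def zr_rain_rate(dbz, a=200.0, b=1.6):
    """Invert the Z-R power law Z = a * R**b (Marshall-Palmer
    coefficients assumed here) to obtain rain rate R in mm/h from
    radar reflectivity given in dBZ."""
    z = 10.0 ** (np.asarray(dbz) / 10.0)   # dBZ -> linear Z (mm^6 m^-3)
    return (z / a) ** (1.0 / b)
```

With these coefficients, Z = 200 mm^6 m^-3 (about 23 dBZ) corresponds to 1 mm/h, and the rate grows monotonically with reflectivity, which is why inter-radar reflectivity biases of even a few dB translate directly into QPE discontinuities.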

How to cite: Chu, Z.: Improvement of Multi-Radar Quantitative Precipitation Nowcasting with Consistency Correction Method, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16647, https://doi.org/10.5194/egusphere-egu23-16647, 2023.

GI3 – Planetary and Earth Observation instrumentation

EGU23-320 | ECS | Orals | GI3.1 | Highlight

A comparison of Perseverance rover and HiRISE data: site interpretations in Jezero Crater 

Constantijn Vleugels, Bernard Foing, and Okta Swida

Large parts of the Martian surface have been imaged with orbiters. The High Resolution Imaging Science Experiment (HiRISE) can be used to build Digital Terrain Models (DTMs) of Mars with high horizontal and vertical resolution – distinguishing metre-size objects with a vertical error of tens of centimetres – and interpret the geologic history of a site. These maps may aid in rover landing site selection and finding science targets for these missions. However, rover-based imaging ultimately brings the most detailed view of a site and provides ‘ground-truth’ data to orbital observations on much smaller scales. Studying the differences between geologic interpretations from larger scale orbital observations and smaller scale rover images helps understand the limits of orbital maps and the added value of rover observations. We compare remote sensing data from orbit with rover panoramic camera data. The validity of geologic interpretations derived from orbital image data (such as HiRISE) in Jezero Crater is examined with ground-based, publicly available data from Mastcam-Z on the Mars 2020 Perseverance rover. Mastcam-Z can provide stereo colour images of the scene around the rover. 

The rover is currently in its Delta Campaign after landing at the Octavia E. Butler site and its subsequent trip to the Séítah formation, indicated in the figure below which shows Perseverance’s traverse near the western delta of Jezero crater (the basemap is a HiRISE DTM overlaid on a Context Camera mosaic produced by The Murray Lab).  Along the way, it has imaged the Séítah and Máaz formations and outcrops of the western delta formation. These units are expected to be volcanic (Séítah and Máaz) and deltaic (western delta) deposits. We can use the Mastcam-Z images made along the traverse to test what geologic interpretations we can reliably infer from orbital data.

How to cite: Vleugels, C., Foing, B., and Swida, O.: A comparison of Perseverance rover and HiRISE data: site interpretations in Jezero Crater, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-320, https://doi.org/10.5194/egusphere-egu23-320, 2023.

EGU23-1679 | ECS | Posters virtual | GI3.1

Improving the Accessibility of Borehole Geophysics: A Cost-Efficient, Highly Modifiable Borehole Tilt Sensor 

Ian Lee, Robert Hawley, David Collins, and Joshua Elliott

We present a cost-efficient borehole tilt sensor developed by our group at Dartmouth College to study ice deformation on Jarvis Glacier in Alaska. We first detail the entire sensor development, deployment, and data collection process, and showcase successful use cases of our sensors on Jarvis and other glaciers by both our group and other geophysical research groups. For our Jarvis work, we installed the tilt sensor system in two boreholes drilled close to the lateral shear margin of Jarvis Glacier and successfully collected over 16 months of uninterrupted borehole deformation data in a harsh polythermal glacial environment. The data included gravity and magnetic measurements that tracked the orientation of the sensors in the borehole as the ice flows, and we used the resultant kinematic measurements to compute borehole deformation that provided insights into the ice flow dynamics of polythermal glaciers. Our tilt sensors can house many types of sensors to accommodate different scientific needs (e.g., temperature, pressure, electrical conductivity) and can be adapted to different glacial thermal regimes and conditions, such as Athabasca Glacier in Canada, a temperate glacier in contrast to Jarvis' polythermal regime. There remains a high knowledge and financial barrier to entry into borehole geophysics research, covering both the development and procurement of a tilt sensor system, and our goal is to lower this barrier by supporting production of our tilt sensor system for both research and educational needs. 
With our established sensor development plan and demonstrated success in the field, our group has collaborated with Polar Research Equipment (PRE), a Dartmouth alumni-founded company specializing in the development of polar research tools, to serve as a commercial resource to help support polar researchers during the development and/or production of an effective and cost-efficient (~80% cheaper than commercial versions) tilt sensor and its associated systems.

How to cite: Lee, I., Hawley, R., Collins, D., and Elliott, J.: Improving the Accessibility of Borehole Geophysics: A Cost-Efficient, Highly Modifiable Borehole Tilt Sensor, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1679, https://doi.org/10.5194/egusphere-egu23-1679, 2023.

EGU23-2676 | Orals | GI3.1

MaQuIs - Mars Quantum Gravity Mission 

Lisa Woerner, Bart Root, Philippe Bouyer, Claus Braxmaier, Dominic Dirkx, Joao Encarnacao, Ernst Hauber, Hauke Hussmann, Ozgur Karatekin, Alexander Koch, Lee Kumanchik, Federica Migliaccio, Mirko Reguzzoni, Birgit Ritter, Manuel Schilling, Christian Schubert, Cedric Thieulot, Wolf von Klitzing, and Olivier Witasse

With MaQuIs we propose a mission to investigate the gravitational field of Mars. Observing the gravitational field over time yields information about the planet's tectonic lithosphere, mass distribution, and composition. Consequently, this mission allows the study of static and dynamic processes on and under the surface of Mars, including phenomena such as melting cycles and tectonic activity.

MaQuIs will deploy quantum-mechanical means to measure Mars' gravitational field with the highest precision yet. In addition, the nature of the proposed instrumentation achieves high sensitivities without requiring complex satellite constellations. As such, MaQuIs follows successful missions at the Earth and Moon, extending the technology to Mars.

In this presentation we will outline the expected scientific merit and explain the underlying technology and planned configuration of the mission.  

How to cite: Woerner, L., Root, B., Bouyer, P., Braxmaier, C., Dirkx, D., Encarnacao, J., Hauber, E., Hussmann, H., Karatekin, O., Koch, A., Kumanchik, L., Migliaccio, F., Reguzzoni, M., Ritter, B., Schilling, M., Schubert, C., Thieulot, C., von Klitzing, W., and Witasse, O.: MaQuIs - Mars Quantum Gravity Mission, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2676, https://doi.org/10.5194/egusphere-egu23-2676, 2023.

EGU23-2838 | ECS | Orals | GI3.1

Development of a 3D-printed ion-electron plasma spectrometer with an hemispheric field of view for microsats and planetary missions 

Gwendal Hénaff, Matthieu Berthomier, Leblanc Frédéric, Techer Jean-Denis, Degret Gabriel, and Pledel Sylvain

One of the challenges in space instrumentation is to measure the energy and 3D angular distribution of charged particles within the limited resources available on planetary missions. Current electrostatic energy analyzers allow the measurement of the energy and angular distribution of charged particles around a 2D viewing plane.

Since most planetary probes are three-axis stabilized, electrostatic scanning deflectors are needed to provide the 3D distribution of charged particles using a minimum of two sensors. However, deflections up to +/- 90° cannot be achieved at high energies (above 10-15 keV), while such higher-energy accelerated particles play a key role in the dynamics of planetary magnetospheres. In addition, electrons and positive ions have to be measured with dedicated sensors, which increases the complexity of plasma payloads and of their accommodation on planetary platforms.

We introduce a novel instrument design that would allow measurement of the energy spectrum and 3D angular distribution of charged particles on three-axis stabilized platforms without using scanning deflectors. The design is made possible by new electrostatic geometries and the capabilities of additive manufacturing technology. An innovative and compact ion/electron detection system is used to simultaneously observe both types of particles with a single sensor.

We show that we reach the performance of current reference designs while having a true 3D field of view and significantly reducing the payload needs. With a mass budget of 2 kg, our combined electron/ion instrument fits the requirements to fly aboard small satellites. It would significantly reduce the size and cost of the platform and may open new perspectives for planetary exploration by a fleet of micro/nano-satellites.

How to cite: Hénaff, G., Berthomier, M., Frédéric, L., Jean-Denis, T., Gabriel, D., and Sylvain, P.: Development of a 3D-printed ion-electron plasma spectrometer with an hemispheric field of view for microsats and planetary missions, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2838, https://doi.org/10.5194/egusphere-egu23-2838, 2023.

NASA’s Earth Surface Mineral Dust Source Investigation (EMIT) imaging spectrometer was launched to the International Space Station (ISS) on the 14th of July 2022. EMIT measures the spectral range from 380 to 2500 nm with 285 contiguous spectral channels, 60 m spatial sampling, and an 80 km swath. The EMIT imaging spectrometer is optically fast at F/1.8 to deliver high signal-to-noise ratio observations. Novel methods are used for on-orbit calibration, dark signal measurement, and geolocation. The EMIT measurement characteristics and processing results through calibration, atmospheric corrections, and surface mineralogy retrievals are reported. The EMIT science team will use these new comprehensive observations of surface mineralogy across the Earth's arid-land dust source regions to update the initial conditions of Earth System Models in order to understand and reduce uncertainties in mineral dust radiative forcing at the regional and global scale, now and in the future. EMIT's measurements, products, and results will be available to other investigators, for the broad set of science and applications they enable, through the NASA Land Processes Data Active Archive Center. The connection between EMIT, Carbon Plume Mapper, the Mapping Imaging Spectrometer for Europa, and the High-resolution Volatiles and Minerals Moon Mapper on Lunar Trailblazer is also described.

How to cite: Green, R.: Imaging Spectroscopy Observations from NASA’s Earth Surface Mineral Dust Source Investigation launched in 2022 and Connections to Imaging Spectrometers for Greenhouse Gas Measurement, Europa, the Moon, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-4510, https://doi.org/10.5194/egusphere-egu23-4510, 2023.

EGU23-7046 | ECS | Orals | GI3.1

Evolution of the oxygen escape from Earth over geological time scales 

Maria Luisa Alonso Tagle, Romain Maggiolo, Herbert Gunell, Johan De Keyser, Gael Cessateur, Giovanni Lapenta, Viviane Pierrard, and Ann Carine Vandaele

Atmospheric erosion plays a significant role in the long-term evolution of planetary atmospheres, and therefore on the development and sustainability of habitable conditions. Atmospheric escape varies over time, due to changes in planetary conditions and the evolution of the Sun. In the case of a magnetized planet like Earth, the dominant scavenging mechanisms are polar wind and polar cusp escape. Both processes are sensitive to the ion supply from the atmosphere, which depends on the solar EUV radiation and the composition of the neutral atmosphere. Moreover, they are modulated by the coupling between the solar wind and the ionosphere, which depends on the solar wind dynamic pressure and the planetary magnetic moment.

We developed a semi-empirical model of atmospheric loss to extrapolate from current measurements of oxygen escape from Earth to past conditions. This model takes into account the variations of the solar EUV/UV flux, the solar wind dynamic pressure, and the Earth’s magnetic moment. In this study, we identify the main factors and processes that control oxygen escape from Earth, considering present-day atmospheric conditions. We constrain the variation of the oxygen loss rate over time and estimate the total oxygen loss during the last ~2 billion years.
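
A semi-empirical extrapolation of this kind can be caricatured as a present-day escape rate scaled by assumed power laws and integrated over time. The rate, the exponents, and the EUV-age scaling below are generic placeholders for illustration, not the study's calibrated values:

```python
import numpy as np

RATE_NOW = 1e25   # placeholder present-day O+ escape rate, ions/s
AGE_NOW = 4.6     # solar age, Gyr

def escape_rate(age_gyr, alpha=1.0, beta=-1.2):
    """Escape rate scaled by the solar EUV flux, itself assumed to decay
    with stellar age as (t / t_now)**beta; alpha sets how strongly the
    escape responds to the EUV driver. Both exponents are illustrative."""
    euv_ratio = (np.asarray(age_gyr) / AGE_NOW) ** beta
    return RATE_NOW * euv_ratio ** alpha

def total_loss(t_start_gyr, t_end_gyr, n=10_000):
    """Trapezoid-rule integral of the escape rate (total ions lost)."""
    t = np.linspace(t_start_gyr, t_end_gyr, n)
    r = escape_rate(t)
    sec_per_gyr = 3.156e16
    return float(np.sum(0.5 * (r[1:] + r[:-1]) * np.diff(t)) * sec_per_gyr)

loss = total_loss(2.6, 4.6)   # cumulative loss over the last ~2 Gyr
```

In the actual model, additional factors for the solar wind dynamic pressure and the Earth's magnetic moment would multiply the rate in the same fashion.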

How to cite: Alonso Tagle, M. L., Maggiolo, R., Gunell, H., De Keyser, J., Cessateur, G., Lapenta, G., Pierrard, V., and Vandaele, A. C.: Evolution of the oxygen escape from Earth over geological time scales, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7046, https://doi.org/10.5194/egusphere-egu23-7046, 2023.

EGU23-7363 | Posters on site | GI3.1

On modeling of silicon detector for space applications using Geant4 

Mikhail Rashev

Silicon detectors are widely used for the analysis of particles and radiation in space. They show a good response over a wide spectrum of particles. By constructing appropriate shielding, one can select and analyze only a single species of particle or energy range and suppress the detection of all other kinds. It is difficult to find a good shielding solution by experiment alone. Modeling software such as Geant4 allows us to find a shielding solution: it calculates the interaction of particles with the shielding or detector and the resulting energy deposition.

The current work is based on modeling the aluminum shielding of the RAPID/IES instrument on board the four Cluster spacecraft. Since 2000, the Cluster mission has encountered the Earth's radiation belts and measured energetic electrons, among other particles, waves, and electromagnetic fields. Accurate modeling using Geant4 allows us to filter unwanted particles out of the results and possibly remove some artifacts from the measurements.

The Geant4 code calculates the attenuation of radiation. Out of the box, the software does not calculate the electrical signal. There is, however, the possibility to extend the code and add other functionalities. We are exploring possibilities to include signal processing in the Geant4 code for the detector and the analog and digital processing units.

How to cite: Rashev, M.: On modeling of silicon detector for space applications using Geant4, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7363, https://doi.org/10.5194/egusphere-egu23-7363, 2023.

EGU23-7966 | ECS | Posters on site | GI3.1

Faraday cup instruments for solar wind and interplanetary dust monitoring 

Oleksii Kononov, Jiří Pavlů, Tereza Ďurovcová, Jana Šafránková, Zdeněk Němeček, and Lubomír Přech

The importance of solar wind monitoring for space weather applications increases with the expansion of power networks and oil or gas pipelines to higher geomagnetic latitudes and the development of new communication networks. Instruments based on Faraday cups are an ideal solution for these purposes because they are robust, and their light weight and low power consumption facilitate their application on small spacecraft. Another important feature of Faraday cups is their capability to detect impacts of interplanetary dust. Such instruments are currently part of two planned ESA missions, which will be briefly introduced. In the core of the contribution, we describe the preliminary instrument design and concentrate on the most important technical aspects of its development, including computer modeling of the most important parts of the detectors. Among others, we present the effects of the grid geometry on the detector's capability to determine the plasma velocity vector and temperature, and we search for the optimum detector configuration for small spacecraft missions. We also discuss a data strategy allowing maximum scientific return within limited spacecraft telemetry.
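
For background, the response of an ideal retarding-grid Faraday cup to a drifting Maxwellian can be sketched as below. This is a textbook approximation (valid when the drift speed is well above the thermal speed), not the modeled detectors of this contribution:

```python
import math

E_CHARGE = 1.602e-19   # elementary charge, C
M_PROTON = 1.673e-27   # proton mass, kg
K_B = 1.381e-23        # Boltzmann constant, J/K

def cup_current(grid_volts, n=5e6, u=4e5, temp=1e5, area=1e-3):
    """Collector current (A) of an ideal Faraday cup: the retarding grid
    at `grid_volts` rejects protons slower than the cutoff speed, so the
    collected flux is the drifting-Maxwellian flux above cutoff
    (approximated by the erfc term alone, valid for u >> thermal speed)."""
    w = math.sqrt(2 * K_B * temp / M_PROTON)          # thermal speed, m/s
    v_cut = math.sqrt(2 * E_CHARGE * grid_volts / M_PROTON)
    flux = 0.5 * n * u * math.erfc((v_cut - u) / w)   # ions m^-2 s^-1
    return E_CHARGE * area * flux

# sweeping the grid voltage traces out the reduced velocity distribution,
# from which density, bulk speed, and temperature are fitted
i_full = cup_current(0.0)     # all beam ions collected
i_half = cup_current(835.0)   # grid cutoff near the beam energy (~835 eV here)
```

The steepness of the current drop around the beam energy encodes the temperature, which is why the grid geometry (and its effect on the effective cutoff) matters so much for velocity and temperature determination.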

How to cite: Kononov, O., Pavlů, J., Ďurovcová, T., Šafránková, J., Němeček, Z., and Přech, L.: Faraday cup instruments for solar wind and interplanetary dust monitoring, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7966, https://doi.org/10.5194/egusphere-egu23-7966, 2023.

EGU23-8161 | ECS | Posters on site | GI3.1

Formulation of spectral indexes from M3 cubes for lunar mineral exploration using python 

Javier Eduardo Suarez Valencia, Angelo Pio Rosi, and Giacomo Nodjourmi

Introduction

The scientific exploration of planetary bodies is enhanced by the use of spectral indexes: specific band combinations/operations that allow the interpretation of the compositional properties of planetary surfaces. The best hyperspectral sensor for the study of the Moon is M3 onboard Chandrayaan-1 (Pieters et al., 2008): it has 86 channels and covers the range from 450 to 3000 nm, a region that shows the main properties of the rock-forming minerals of the Moon. Although the data of M3 have been widely used with different techniques, there is no unified set of spectral indexes for this instrument, and the ones defined are usually produced in proprietary software. In this work, we compiled spectral indexes from several sources and recreated them in python.

Methods

We compiled spectral indexes from the literature, namely the ones defined by Zambon et al. (2020), Bretzfelder et al. (2020), and Horgan et al. (2014). Before applying the indexes, an M3 cube was processed in ISIS3 (Laura et al., 2022) and filtered in python to reduce the noise. Subsequently, the spectral indexes were replicated according to the procedures described by the authors and compared with the original results. Most of the process was done with common scientific libraries such as rioxarray (Gillies, 2013), OpenCV (Bradski, 2000), specutils (Earl et al., 2022), and NumPy (Harris et al., 2020).
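
As an illustration of the kind of band operation involved (a sketch, not the authors' code: the cube layout, wavelength grid, and shoulder positions below are assumed), a band-depth index of the type defined in these references can be computed from a calibrated reflectance cube with NumPy alone:

```python
import numpy as np

def band_depth(cube, wavelengths, left_nm, center_nm, right_nm):
    """Band depth 1 - R_center/R_continuum, with the continuum taken as a
    straight line between two shoulder wavelengths.
    cube: (bands, rows, cols) reflectance array."""
    def idx(target_nm):
        return int(np.argmin(np.abs(wavelengths - target_nm)))
    l, c, r = idx(left_nm), idx(center_nm), idx(right_nm)
    # Linear continuum evaluated at the center wavelength
    t = (wavelengths[c] - wavelengths[l]) / (wavelengths[r] - wavelengths[l])
    continuum = (1 - t) * cube[l] + t * cube[r]
    return 1.0 - cube[c] / continuum

# Toy cube: flat spectrum with a Gaussian absorption near 1000 nm
wl = np.linspace(450, 3000, 86)
cube = np.ones((86, 4, 4))
cube -= 0.3 * np.exp(-((wl - 1000) / 80) ** 2)[:, None, None]
bd1000 = band_depth(cube, wl, 750, 1000, 1550)  # deeper absorption -> larger value
```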

Results

We were able to reproduce fourteen indexes with high fidelity. All of them are formulated to highlight the spectral features around the absorptions at 1000 nm and 2000 nm, which are the locations of the major expressions of olivine and pyroxenes. Comparing our results with the ones in the literature, we found that the color ramps are similar and that the surface features showcased in both cases are consistent with each other and with their known compositions.

Discussion and conclusions

Small differences between the original indexes and the ones recreated here are expected, due to variations in the internal methods across libraries, the different ways of preprocessing and filtering, and the quality of the original cubes. Further comparison and validation of the procedures is planned.

Nevertheless, we believe that the results are consistent enough to be used as scientific inputs, thus providing an open-source alternative for the analysis of spectral indexes of the surface of the Moon. This work is in progress, and the code will be made available via the EuroPlanet GitHub organization (https://github.com/europlanet-gmap), as well as in the Space Browser of the EXPLORE platform (https://explore-platform.eu/space-browser).

Acknowledgments

This project has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No 101004214.

References

Bradski, G. (2000). The OpenCV Library.

Bretzfelder et al., (2020). Identification of Potential Mantle Rocks Around the Lunar Imbrium Basin.

Earl et al., (2022). astropy/specutils: V1.9.1 

Gillies, S. & others. (2013). Rasterio: Geospatial raster I/O for Python programmers. 

Harris et al., (2020). Array programming with NumPy.

Horgan et al., (2014). Near-infrared spectra of ferrous mineral mixtures and methods for their identification in planetary surface spectra.

Laura et al., (2022). Integrated Software for Imagers and Spectrometers 

Zambon et al., (2020). Spectral Index and RGB maps.

How to cite: Suarez Valencia, J. E., Pio Rosi, A., and Nodjourmi, G.: Formulation of spectral indexes from M3 cubes for lunar mineral exploration using python, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8161, https://doi.org/10.5194/egusphere-egu23-8161, 2023.

EGU23-8918 | ECS | Posters on site | GI3.1

Strofio: A Status Update 

Jared Schroeder, Stefano Livi, and Frederic Allegrini

Strofio is a neutral mass spectrometer designed to measure the chemical composition of Mercury’s exosphere. Neutral species enter the instrument through one of two inlets before they are ionized via electron impact. The product ions are then guided by dozens of individually programmed electrodes toward the detector. A rotating electric field determines the time-of-flight (TOF) of each particle before it collides with a microchannel plate (MCP). Upon launch, one of the system’s electrodes (D5) suffered an anomaly that broke the correspondence between the commanded value and the value reported in telemetry. This particular electrode is responsible for steering the particles into the MCP. Laboratory tests with the engineering model confirm that mission requirements are satisfied regardless of the electrode state, with the caveat of a reduced first-order mass range; however, second-order manipulation can extend the mass range to pre-anomaly standards. I will present the latest advances we have made in optimizing the instrument in its current state.

How to cite: Schroeder, J., Livi, S., and Allegrini, F.: Strofio: A Status Update, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8918, https://doi.org/10.5194/egusphere-egu23-8918, 2023.

EGU23-9318 | ECS | Orals | GI3.1

Training the future Space Entrepreneurs and Astronauts: the experience of the EuroSpaceHub Academy with the Analog Missions for validation of planetary instruments, protocols and techniques 

Serena Crotti, Jara Pascual, Bernard Foing, Agata Kołodziejczyk, Brent Reymen, Ioana Roxana Perrier, Henk Rogers, Sofia Pavanello, Celia Avila Rauch, Gabriel De La Torre, and Armin Wedler

EuroSpaceHub is a project funded by the EIT HEI initiative, led by EIT Manufacturing and Raw Materials. The main goal of the project is fostering collaborative innovation and entrepreneurship in the Space-Tech ecosystem. EuroSpaceHub includes several initiatives; among them is the EuroSpaceHub Academy: an educational programme to train young students, researchers and professionals as Analog Astronauts and Space entrepreneurs.

Thanks to the experience of one of the founding partners of EuroSpaceHub - Lunex EuroMoonMars - students have the opportunity to participate as analog astronauts in various campaigns, allowing them to learn with a hands-on approach. Analog missions are important both for carrying out investigations with a view to future Space exploration and for developing technical scientific knowledge in students. EuroMoonMars has been involved in the organization of these campaigns since 2009, starting at the MDRS (Utah). Other missions were organized at the HISEAS base on Mauna Loa (Hawaii), in Iceland (CHILL-ICE), at Etna/Vulcano in Italy, in the Atacama Desert (Chile), at the AATC in Poland, at ESTEC in the Netherlands, in the Eifel in Germany, and elsewhere [1-10]. During analog simulations, students learn to control on-board instruments and to structure their own experiments, collecting data and processing the results efficiently. EuroSpaceHub and Lunex support not only student participation in these missions and their organisation, but also a set of specific trainings under the umbrella of the ESH Academy, complementary to the missions. During the missions, PhD and Master's students can take advantage of special settings and equipment to conduct their investigations, which range from Space and planetary science, instruments, protocols, and data analysis to biology, psychology, physiology and engineering, to name but a few.

EuroSpaceHub and Lunex are also developing an innovative habitat for analog missions and outreach, ExoSpaceHab Express. Conceived on wheels for easy transportation, it is a unique contribution in the landscape of existing habitats. Thanks to ExoSpaceHab-X, an increasing number of students will have access to the missions and dedicated training. More and more data will also be collected to investigate crews’ reactions in confinement, mission protocols, planning and operations.

References: [1] Foing, B. et al (2022) LPSC 53, 2042 [2] Foing B. et al (2021) LPSC52, 2502 [3] Musilova M. et al (2020) LPSC51, 2893 [4] Perrier I.R. et al (2021) LPSC52, 2562 [5] Crotti, S. et al (2022) EGU22, 5974 [6] Foing, B. et al (2021) LPSC52, 2502 [7] Heemskerk, M. et al (2021) LPSC52, 2762 [8] Foing, B. et al (Editors, 2011) Astrobiology field Research in Moon/Mars Analogue Environments, Special Issue IJA, 10, vol. 3. 137-305; [9] Foing B. et al. (2011) Field astrobiology research at Moon-Mars analogue site: Instruments and methods, IJA 2011, 10 (3), 141 [10] Foing, B. H. et al, (2017) LPICo2041, 5073 

Acknowledgments: We thank EuroSpaceHub Consortium, collaborators, EIT HEI initiative, EIT Manufacturing and Raw Materials, VilniusTech, Collabwith, International Space University, Universidad Complutense de Madrid, Igor Sikorsky Kyiv Polytechnic Institute, Lunex Foundation and EuroMoonMars. We thank Adriano Autino and Space Renaissance International, all EMMPOL participants and the staff of AATC.

How to cite: Crotti, S., Pascual, J., Foing, B., Kołodziejczyk, A., Reymen, B., Perrier, I. R., Rogers, H., Pavanello, S., Rauch, C. A., De La Torre, G., and Wedler, A.: Training the future Space Entrepreneurs and Astronauts: the experience of the EuroSpaceHub Academy with the Analog Missions for validation of planetary instruments, protocols and techniques, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-9318, https://doi.org/10.5194/egusphere-egu23-9318, 2023.

EGU23-9912 | Orals | GI3.1

The Magnetometer on the Psyche mission 

Jose M. G. Merayo, Benjamin P. Weiss, Jodie Ream, Rona Oran, Peter Brauer, Corey J. Cochrane, Kyle D. Cloutier, Lindy Elkins-Tanton, John Leif Jørgensen, Clara Maurel, Ryan S. Park, Carol A. Polanskey, Maria De Soria-Santacruz Pich, Carol A. Raymond, Christopher Russell, Daniel Wenkert, Mark A. Wieczorek, Maria T. Zuber, and Kyle Webster

The asteroid (16) Psyche is the target of the NASA Psyche mission, and the magnetometer is one of the three science instruments on board. Its purpose is to determine whether the asteroid formed from the core of a differentiated planetesimal. The magnetometer will measure the magnetic field at different distances from the asteroid in order to detect any remanent magnetization; a magnetic moment larger than 2×10^14 Am² could imply that the body once generated a core dynamo, and therefore formed by igneous differentiation.
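
For scale, a back-of-the-envelope sketch (not from the abstract; the 100 km observation range is a hypothetical value chosen for illustration): the equatorial field of a dipole with the threshold moment quoted above follows from B = (μ0/4π)·m/r³:

```python
MU0_OVER_4PI = 1e-7  # T·m/A, i.e. mu_0 / (4*pi)

def dipole_equatorial_field(moment_Am2, r_m):
    """Magnitude of a magnetic dipole's equatorial field at distance r."""
    return MU0_OVER_4PI * moment_Am2 / r_m**3

# Threshold moment from the abstract, at a hypothetical 100 km range
b = dipole_equatorial_field(2e14, 100e3)  # ~2e-8 T, i.e. ~20 nT
```

A field of this order is comfortably within fluxgate magnetometer sensitivity, which is why the moment threshold translates into a detectability requirement.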

The Psyche spacecraft carries two three-axis fluxgate magnetometers mounted on a fixed boom at 2.15 m and 1.45 m, respectively, which provide redundancy and gradiometer capability to compensate for spacecraft-generated magnetic fields. The magnetometers will be powered on early in the initial checkout phase and will remain on throughout cruise and orbital operations, producing 50 vectors per second. The in-flight temperature of the magnetometers is expected to span a large range; therefore, an extensive calibration program has been carried out to characterize the instruments and verify their performance pre-flight.

How to cite: Merayo, J. M. G., Weiss, B. P., Ream, J., Oran, R., Brauer, P., Cochrane, C. J., Cloutier, K. D., Elkins-Tanton, L., Jørgensen, J. L., Maurel, C., Park, R. S., Polanskey, C. A., Pich, M. D. S.-S., Raymond, C. A., Russell, C., Wenkert, D., Wieczorek, M. A., Zuber, M. T., and Webster, K.: The Magnetometer on the Psyche mission, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-9912, https://doi.org/10.5194/egusphere-egu23-9912, 2023.

EGU23-10104 | Orals | GI3.1

Sense-checking the calibration of the Cluster FGM magnetometer spin-axis offsets using mirror mode waves in the magnetosheath 

Leah-Nani Alconcel, Timothy Oddy, Patrick Brown, and Chris Carr

The calibrated data from the Cluster fluxgate magnetometer instruments (FGMs) aboard the four Cluster spacecraft are accessible through the European Space Agency (ESA) Cluster Science Archive (CSA). The FGM team at Imperial College – the PI institute that built and supports operation of the magnetometers – has regularly provided validated data to the CSA since its inception. The calibration and validation pipeline is well established and provides measurements at the highest instrument resolution within an uncertainty as low as 0.1 nT. New methods for magnetic field calibration have been proposed in the many years since Cluster’s commissioning. One of these uses mirror mode waves in the Earth’s magnetosheath to determine the spin-axis offsets of an in-flight magnetometer instrument. The FGM team applied this method to the Cluster instrument data during periods when the spacecraft spend a substantive proportion of their orbits in the magnetosheath, typically May-June and October-November. The offsets determined by this method were compared to those determined by the method already integrated into the pipeline. Good agreement was found between the two methods.
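
As a toy illustration of the principle only (synthetic data; this is not the FGM pipeline or the published mirror-mode algorithm, which is considerably more elaborate): for purely compressional, mirror-mode-like fluctuations the field varies only along a fixed direction, so the spin-axis (z) component is a linear function of the spin-plane field magnitude, and the intercept of that line is the spin-axis offset:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic mirror-mode-like interval: B(t) = s(t) * b_hat, s(t) > 0,
# i.e. fluctuations purely parallel to a fixed field direction b_hat.
b_hat = np.array([0.4, 0.5, 0.768])
b_hat /= np.linalg.norm(b_hat)
s = 30.0 + 5.0 * rng.standard_normal(2000)   # field magnitude, nT
B = s[:, None] * b_hat

true_offset = 1.7                            # unknown spin-axis offset, nT
m = B.copy()
m[:, 2] += true_offset                       # offset corrupts only z
m += 0.05 * rng.standard_normal(m.shape)     # instrument noise

# Spin-plane components are assumed already calibrated (via the spin).
# Regressing m_z against the spin-plane magnitude recovers the offset
# as the intercept of the fitted line.
m_perp = np.hypot(m[:, 0], m[:, 1])
slope, intercept = np.polyfit(m_perp, m[:, 2], 1)  # intercept ~ true_offset
```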

Due to resource limitations, the substantial effort that would be required to change calibration methods and re-deliver over 20 years of FGM data, and the potential impact on already-published literature, the team does not recommend retroactive integration of the new method into the pipeline. However, the study provides a useful sense check of the pipeline and of the data already delivered, as well as of the remaining data to be delivered through to the end of the Cluster mission.

How to cite: Alconcel, L.-N., Oddy, T., Brown, P., and Carr, C.: Sense-checking the calibration of the Cluster FGM magnetometer spin-axis offsets using mirror mode waves in the magnetosheath, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10104, https://doi.org/10.5194/egusphere-egu23-10104, 2023.

EGU23-10788 | Orals | GI3.1 | Highlight

Botany on The Moon 

Heather Smith

We propose a suite of instruments, Botany on The Moon, designed to investigate the feasibility of plant growth on the Moon. Botany is composed of two single-species plant growth modules (Arabidopsis and radish) plus two environmental monitoring instruments that record (1) direct and scattered sunlight in the photosynthetically active radiation (PAR) wavelengths, and (2) the level of cosmic radiation and induced lunar neutrons. Together these four investigations contribute to our understanding of how plants can be grown on the Moon.

The core perspective in Botany is that physical experiments are needed to understand plant growth on the Moon. Little is known about plant behavior in reduced (fractional) gravity environments (less than the nominal 1g that occurs on Earth). How biology responds to partial gravity (in combination with radiation effects) remains unexplored.

Botany’s primary science goals can be achieved during the sunlit timeframe of a Lunar Day, albeit with the loss of data due to the shorter growth duration. Significantly more data and knowledge are gained by extending the growth duration window to approximately 45 Earth days; hence, Botany proposes to take advantage of the CLPS-provided Survive-the-Night service. If the CLPS provider is able to provide power for Botany to survive the night, our secondary science goal can also be achieved: to determine the feasibility of transitioning the plants from a normal growth phase (at 22 °C during the sunlit time) to a slow growth phase (at 5 °C during the nighttime) and back to a normal growth phase (at 22 °C during the second sunlit time). The Botany instrument suite, including the LPX plant chambers, is designed for a 45-Earth-day mission on the Lunar surface, including surviving the 354 hours of the Lunar night. The proposed Botany on The Moon project has a payload mass of ~12 kg and an estimated cost of ~US$11.5 million.

The 20-person Botany payload team is led by a mid-career woman scientist and involves a gender-diverse science and engineering team at various career stages from 10 institutions in three countries. The Botany team includes NASA ARC, KIPR (a long-term NASA ARC contract organization), SDL, UNC-G (a minority-serving institution (MSI)), a Canadian instrument provided by McMaster University, and a science team from various institutions. Our team combines complementary skills, mission management experience, and expertise in plant science.

How to cite: Smith, H.: Botany on The Moon, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10788, https://doi.org/10.5194/egusphere-egu23-10788, 2023.

EGU23-10917 | GI3.1

The HyTI Mission: High Spatial and Spectral Resolution Imaging from a 6U Cube Satellite 

Robert Wright and the HyTI Team

The HyTI (Hyperspectral Thermal Imager) mission, funded by NASA’s Earth Science Technology Office InVEST (In-Space Validation of Earth Science Technologies) program, will demonstrate how high spectral and spatial resolution long-wave infrared image data can be acquired from a 6U CubeSat platform. The mission will use a spatially modulated interferometric imaging technique to produce spectro-radiometrically calibrated image cubes, with 25 channels between 8 and 10.7 microns at 13 cm^-1 resolution, at a ground sample distance of ~60 m. The HyTI performance model indicates narrow-band NEDTs of <0.3 K. The small form factor of HyTI is made possible via the use of a no-moving-parts Fabry-Perot interferometer and JPL’s cryogenically-cooled HOT-BIRD FPA technology. Launch is scheduled for June 2023. The value of HyTI to Earth scientists will be demonstrated via on-board processing of the raw instrument data to generate L1 and L2 products, with a focus on rapid delivery of data regarding volcanic degassing and land surface temperature. This presentation will describe the mission and the technology, including the interferometric imaging approach, and how the CubeSat will support instrument operations and data processing.

How to cite: Wright, R. and the HyTI Team: The HyTI Mission: High Spatial and Spectral Resolution Imaging from a 6U Cube Satellite, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10917, https://doi.org/10.5194/egusphere-egu23-10917, 2023.

EGU23-11555 | Posters on site | GI3.1

BepiColombo: Operations and Data Analysis through the Quick-Look Analysis (QLA) tool 

Thomas Cornet, Alan Macfarlane, Elena Racero, Sebastien Besse, and Santa Martinez

The ESA-JAXA BepiColombo mission has been en route to Mercury since October 2018. It consists of the ESA Mercury Planetary Orbiter (MPO) and the JAXA Mercury Magnetospheric Orbiter (MMO) spacecraft which, along with the Mercury Transfer Module (MTM), are stacked together during the seven-year cruise phase. This long cruise phase is punctuated by nine planetary flybys needed to achieve Mercury orbit capture. In this configuration, most of the MPO instruments located on the nadir side are obstructed by the MTM and cannot observe. Nevertheless, a subset of “side-looking” instruments can be operated in the stacked-spacecraft configuration during the cruise and gather scientific data. These instruments, mostly dedicated to the study of the Hermean environment (magnetic field, solar wind, exosphere), are operated during the planetary flybys as well as for several cruise science observations. Such events are used to test the BepiColombo Science Ground Segment (SGS) operating systems and processes. The SGS is developing the Quick-Look Analysis (QLA) tool that will support the rapid analysis of the instruments’ operational and scientific data acquired during the mission science phase observations, starting in 2026. At present, the tool is used to support cruise and flyby operations, in addition to fostering science collaborations between the BepiColombo instrument teams through its data-sharing capabilities. We will present its current status and functionalities.

How to cite: Cornet, T., Macfarlane, A., Racero, E., Besse, S., and Martinez, S.: BepiColombo: Operations and Data Analysis through the Quick-Look Analysis (QLA) tool, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11555, https://doi.org/10.5194/egusphere-egu23-11555, 2023.

EGU23-13233 | GI3.1

Near-Earth asteroids orbit determination by DRO space-based optical observations 

S. Yezhi

In response to the problem that ground-based optical monitoring systems cannot monitor near-Earth asteroids that are too close to the Sun on the celestial sphere, we propose a method that tracks and determines the orbits of asteroids using Distant Retrograde Orbit (DRO) platforms with optical monitoring. Through data filtering by visibility analysis, and using the initial orbit information of the asteroids provided by the Jet Propulsion Laboratory (JPL), the asteroids' orbits are determined and compared with the reference orbit. Simulation results show that with a measurement accuracy of two arcseconds and an arc length of three years, the orbit determination accuracy of the DRO platform for near-Earth asteroids can reach tens of kilometers, and for asteroids with Atira orbits less than ten kilometers. In conclusion, near-Earth asteroid monitoring systems based on DRO platforms are capable of providing sufficient monitoring effectiveness, enabling precise tracking of the target asteroids and forecasting of their positions.

How to cite: Yezhi, S.: Near-Earth asteroids orbit determination by DRO space-based optical observations, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13233, https://doi.org/10.5194/egusphere-egu23-13233, 2023.

EGU23-13996 | ECS | Posters on site | GI3.1

Simulation Study for Precise Orbit Determination of a Callisto Orbiter and Geodetic Parameter Recovery 

William Desprats, Daniel Arnold, Stefano Bertone, Michel Blanc, Adrian Jäggi, Lei Li, Mingtao Li, and Olivier Witasse

Callisto, the outermost of the four Galilean satellites, is identified as a key body to answer present questions about the origin and the formation of the Jovian system. Callisto appears to be the least differentiated and the geologically least evolved of the Galilean satellites, and therefore the one best reflecting the early ages of the Jovian system.

While the ESA JUICE mission plans several flybys of Callisto, an orbiter would allow measuring geodetic parameters at much higher resolution, as suggested by several recent mission proposals, e.g., the Tianwen-4 (China National Space Administration) and MAGIC (Magnetics, Altimetry, Gravity, and Imaging of Callisto) proposals. Recovering parameters such as those describing Callisto’s gravity field, its tidal Love numbers, and its orientation in space would help to significantly constrain Callisto’s interior structure models, including the characterization of a potential subsurface ocean.

We perform a closed-loop simulation of spacecraft tracking, altimetry, and accelerometer data of a high inclination, low altitude orbiter, which we then use for the recovery of its precise orbit and of Callisto’s geodetic parameters. We compare our sensitivity and uncertainty results to previous covariance analyses. We estimate geodetic parameters, such as gravity field, rotation, and orientation parameters and the k2 tidal Love number, based on radio tracking (2-way Doppler) residuals. We consider several ways to mitigate the mismodeling of non-gravitational accelerations, such as using empirical accelerations and pseudo stochastic pulses, and we evaluate the benefits of an on-board accelerometer.

We also investigate the added value of laser altimeter measurements to enable the use of altimetry crossovers to improve orbit determination and gravity-related geodetic parameters, but also to estimate the recovery of surface tidal variations (via the h2 Love number). For our closed-loop analyses, we use both a development version of the Bernese GNSS Software and the open-source pyXover software.

How to cite: Desprats, W., Arnold, D., Bertone, S., Blanc, M., Jäggi, A., Li, L., Li, M., and Witasse, O.: Simulation Study for Precise Orbit Determination of a Callisto Orbiter and Geodetic Parameter Recovery, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13996, https://doi.org/10.5194/egusphere-egu23-13996, 2023.

EGU23-14692 | Orals | GI3.1

Project DragLiner: Harnessing plasma Coulomb drag for satellite deorbiting to keep orbits clean 

Maria Genzer, Pekka Janhunen, Harri Haukka, Antti Kestilä, Maria Hieta, Pyry Peitso, Perttu Yli-Opas, Hannah Ploskonka, Petri Toivanen, Janne Sievinen, Marco Marques, David Macieira, Ahmed El Moumen, Farzaneh Gholami, Miguel Olivares-Mendez, Baris Can Yalcin, and Carol Martinez Luna

When a high-voltage charged tether is put into streaming space plasma, the tether’s electric field disturbs the flow of plasma ions and thereby taps momentum from the plasma flow [1-4]. The effect is called electrostatic Coulomb drag. One application is the electric solar wind sail, which uses the solar wind to generate interplanetary propulsion [1, 2]. Another application is the Plasma Brake [3, 4], which uses the ionospheric ram flow to generate Coulomb drag that slowly de-orbits the satellite. Both positive and negative tether polarities work: the plasma physics is different, but the net effect is a transfer of momentum in both cases. The reasons are somewhat complicated, but there is good motivation to select positive polarity in the solar wind case and negative polarity in the ionospheric Plasma Brake case. Measurement of Coulomb drag in Low Earth Orbit and testing of tether deployment are to be carried out by the ESTCube-2 cubesat [5], scheduled for launch in spring 2023, and the forthcoming Foresail cubesat, scheduled for launch in 2023-2024.

Project DragLiner, ongoing and funded by ESA, aims to define requirements and a preliminary design for a passive Coulomb-drag-based deorbit system capable of bringing down LEO spacecraft in a time an order of magnitude shorter than the current regulatory re-entry limit (25 years). Other main requirements for the deorbiting system are low mass and independence from the spacecraft's resources. The project will also create a TRL 4 prototype of a Plasma Brake module that can be used to deorbit a satellite or launcher upper stage of a few hundred kilograms from Low Earth Orbit. The module deploys a ~5 km long tether made of four conductive wires of 25-50 micrometre diameter. In addition to the aluminium wires used previously in Cubesat projects, we will also evaluate more advanced carbon fibre composite wires. The redundant multi-wire tether structure is used so that the tether does not break even when micrometeoroids cut some of its wires. The tether is deployed from a storage reel and kept at -1 kV by an onboard high-voltage source. A ~100 m long metal-coated tape tether is used as an electron-gathering surface that closes the current loop. Alternatively, conducting parts of the debris satellite could be used for electron gathering. The power consumption is a few watts.

Project DragLiner uses basic space plasma physics to solve the practical and important problem of keeping satellite orbits clean for future generations and preventing a catastrophic Kessler-syndrome scenario.

[1] Janhunen, P., Electric sail for spacecraft propulsion, J. Prop. Power, 20, 763-764, 2004.

[2] Janhunen, P. and A. Sandroos, Simulation study of solar wind push on a charged wire: basis of solar wind electric sail propulsion, Ann. Geophys., 25, 755-767, 2007.

[3] Janhunen, P., Electrostatic plasma brake for deorbiting a satellite, J. Prop. Power, 26, 370-372, 2010.

[4] Janhunen, P., Simulation study of the plasma-brake effect, Ann. Geophys., 32, 1207-1216, 2014.

[5] Iakubivskyi, I., et al., Coulomb drag propulsion experiment of ESTCube-2 and FORESAIL-1, Acta Astronautica, 177, 771-783, 2020.

How to cite: Genzer, M., Janhunen, P., Haukka, H., Kestilä, A., Hieta, M., Peitso, P., Yli-Opas, P., Ploskonka, H., Toivanen, P., Sievinen, J., Marques, M., Macieira, D., El Moumen, A., Gholami, F., Olivares-Mendez, M., Yalcin, B. C., and Martinez Luna, C.: Project DragLiner: Harnessing plasma Coulomb drag for satellite deorbiting to keep orbits clean, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14692, https://doi.org/10.5194/egusphere-egu23-14692, 2023.

EGU23-14760 | ECS | Posters on site | GI3.1

A novel user-friendly Jupyter-based tool for analysing orbital subsurface sounding radar data. 

Giacomo Nodjoumi, Sebastian Emanuel Lauro, and Angelo Pio Rossi

Orbital radars, such as the SHAllow RADar (SHARAD) [1] or the Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS) [2] instruments on board Mars Reconnaissance Orbiter (MRO) and Mars Express (MEX) respectively, provide valuable data about the Martian subsurface [3,4].

Common analysis methodologies comprise a direct comparison between the radargram (RDR) and the corresponding Surface Clutter Simulation (SCS) to visually spot any subsurface reflector. The surface time delays, converted into the space domain, are then compared with the corresponding topographic profile to check for any discrepancies that could be mistaken for subsurface reflections. Once the subsurface reflector is confirmed to be valid, the proper picking can be performed by examining both the radargram and the simulation power intensities. Finally, it is possible to estimate the real dielectric constant ε', which is the real component of the complex permittivity ε' - iε'', using Eq. 1 [3]:

ε' = (c·Δt / 2h)²     (Eq. 1)

where Δt is the two-way travel time between the surface and the subsurface reflector, c is the speed of light in a vacuum and h is the reflector’s depth. Assuming different values for ε' and inverting Eq. 1, it is possible to estimate the depth, and thus the thickness, of the reflector’s unit. In this work, we present the first pre-release of a user-friendly interface which makes it possible to easily perform the above analysis while ensuring robustness and reproducibility. Furthermore, custom processing functions can be implemented to increase the accuracy of the results and/or expand the tool's capabilities. We started the development using SHARAD US RDR and SCS products, while MARSIS compatibility is under implementation. We also provide additional Jupyter notebooks for data download. The tool is based on the JupyterLab environment and open-source python packages, served as a Docker container.
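
The depth–permittivity relation described in the text, h = c·Δt / (2·√ε'), is simple to sketch in python (an illustration, not the tool's code; the 1 µs delay and the candidate permittivities below are assumed example values):

```python
C = 299792458.0  # speed of light in vacuum, m/s

def reflector_depth(delta_t_s, eps_r):
    """Depth of a subsurface reflector from the two-way travel time and
    an assumed real dielectric constant: h = c*dt / (2*sqrt(eps'))."""
    return C * delta_t_s / (2.0 * eps_r ** 0.5)

# Example: a 1 microsecond two-way delay under candidate permittivities
# (larger eps' -> slower wave -> shallower reflector for the same delay)
depths = {eps: reflector_depth(1e-6, eps) for eps in (3.0, 4.0, 9.0)}
```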

Open Research: The tool presented here is available on GitHub [5]

Funding: This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreements No 101004214 and No 871149.

References:

[1] Seu, R., et al., SHARAD Sounding Radar on the Mars Reconnaissance Orbiter., doi:10.1029/2006JE002745.

[2] Jordan, R., et al., The Mars Express MARSIS Sounder Instrument. doi:10.1016/j.pss.2009.09.016.

[3] Shoemaker, E.S., et al., New Insights Into Subsurface Stratigraphy Northwest of Ascraeus Mons, Mars, Using the SHARAD and MARSIS Radar Sounders. doi:10.1029/2022JE007210.

[4] Lauro, S.E., et al., Using MARSIS Signal Attenuation to Assess the Presence of South Polar Layered Deposit Subglacial Brines. doi:10.1038/s41467-022-33389-4.

[5] Nodjoumi, G. MORDOR - Mars Orbital Radar Data Open-Reader 2023.

How to cite: Nodjoumi, G., Lauro, S. E., and Rossi, A. P.: A novel user-friendly Jupyter-based tool for analysing orbital subsurface sounding radar data., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14760, https://doi.org/10.5194/egusphere-egu23-14760, 2023.

EGU23-14810 | ECS | Orals | GI3.1

Statistics of Alfvénic structures in the Solar Wind and their impact on Magnetometer Calibration 

Johannes Z. D. Mieth, Ferdinand Plaschke, Uli Auster, David Fischer, Daniel Heyner, and Werner Magnes

Exploiting the Alfvénic structures of the solar wind is an established method for calibrating spaceborne magnetometers. However, not every statistical property of Alfvén waves follows a uniform distribution, so the calibration accuracy in certain sensor directions may be significantly affected by the choice of the data set used. This work examines the statistical properties of Alfvénic disturbances and other solar wind structures over a wide range of spatial and temporal scales using data from the current BepiColombo mission, now in the inner solar system, from the lunar and Earth-bound satellites of the THEMIS and ARTEMIS missions, and from the Earth-bound MMS mission. The influence of the data selection on the calibration is characterized and quantified. We benefit from the fact that the magnetometers of the above-mentioned missions have been partially calibrated by independent methods, using the spacecraft spin or alternative observations of the total magnetic field.
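
A minimal sketch of the general idea on synthetic data (a textbook |B|-constancy offset determination, not any mission's actual pipeline): Alfvénic fluctuations rotate the field direction while preserving its magnitude, so for measurements m = B + o with |B| constant, |m - o|² is constant and the offset vector o satisfies the linear relation |m|² = 2·m·o + const:

```python
import numpy as np

rng = np.random.default_rng(0)

# Alfvénic interval: field direction rotates but |B| stays constant.
n = 3000
phi = rng.uniform(0, 2 * np.pi, n)
theta = np.arccos(rng.uniform(-1, 1, n))
B0 = 5.0                                       # constant magnitude, nT
B = B0 * np.column_stack([np.sin(theta) * np.cos(phi),
                          np.sin(theta) * np.sin(phi),
                          np.cos(theta)])

o_true = np.array([0.3, -0.2, 0.8])            # unknown offsets, nT
m = B + o_true + 0.02 * rng.standard_normal((n, 3))

# |m - o|^2 = B0^2  =>  |m|^2 = 2 m.o + (B0^2 - |o|^2): linear in o,
# so the offsets follow from an ordinary least-squares fit.
A = np.column_stack([2 * m, np.ones(n)])
x, *_ = np.linalg.lstsq(A, np.sum(m**2, axis=1), rcond=None)
o_est = x[:3]                                  # ~ o_true
```

In practice the spin calibrates the spin-plane offsets, and only the spin-axis component must come from such field-based methods; the toy above solves for all three at once purely for illustration.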

How to cite: Mieth, J. Z. D., Plaschke, F., Auster, U., Fischer, D., Heyner, D., and Magnes, W.: Statistics of Alfvénic structures in the Solar Wind and their impact on Magnetometer Calibration, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14810, https://doi.org/10.5194/egusphere-egu23-14810, 2023.

EGU23-15409 | ECS | Orals | GI3.1

Callio SpaceLab – Sustainable living, sustaining life 

Jari Joutsenvaara, Antti Tenetz, Julia Puputti, and Ossi Kotavaara

Callio SpaceLab is an initiative for international space testing and R&D for future human space exploration. The extremely confined environment of the mine and its surroundings provides testbeds to simulate human space exploration, analogue astronaut training and experience with space research and systems in extreme environments on Earth.

Many steps need to be taken here on Earth to put a (hu)man on the Moon and later on Mars. Earth-based simulation environments are called terrestrial analogue sites or space analogues. Some analogues are general, but others have characteristics similar to extraterrestrial conditions: e.g., Mt. Etna, Italy, is an analogue environment for Venus (1), the Atacama Desert, Chile, for Mars (2), and Mauna Kea, Hawaii, USA, for the Moon (3).

Space analogue research covers many topics, ranging from the testing of habitats and other constructions, fieldwork, in-situ resource utilisation and vehicles; some analogues concentrate on low gravity (simulated, e.g., in pools) or on confinement from the outside world in enclosed environments.

Callio SpaceLab is a concept being developed at the Pyhäsalmi mine, Finland, one of Europe's deepest (1.4 km) base metal mines. Underground mining ended in August 2022, but that is just the beginning: Callio, in the town of Pyhäjärvi, is developing the site for a second life, including underground pumped-hydro energy storage, a solar park, the FUTUREMINE testing environment for autonomous mining equipment, and more (4,5). Research activities are coordinated by the University of Oulu's Kerttu Saalasti Institute.

In order to survive on extraterrestrial landscapes such as the Moon and Mars, one needs to bring enough protection to sustain life and activities. The mine is a suitable terrestrial analogue test environment for confinement studies, biology, astrobiology, in-situ resource utilisation, scientific drilling, rover testing (inclines up to 1:7), communications systems testing, and space design, art and culture projects (6). The mine has extensive connectivity: deep space communications can be simulated for different missions, from spaceflights to extraterrestrial bases and activities both on the surfaces and in the depths of space objects and celestial bodies.

The site's host rock is a volcanogenic massive sulfide (VMS) deposit formed 1.9 Ga ago (7). Exploration drilling has found saline water pockets dated to at least 30 Ma. The water samples have shown traces of bacteria common to deep subsurface environments (8).

 

References

  • Gabriel V., et al. Mineralogy and Spectroscopy of Mount Etna Lava Flows as an Analogue to Venus. 2022. https://ui.adsabs.harvard.edu/abs/2022LPICo2678.2255E
  • Azua-Bustos A., et al. The Atacama Desert in Northern Chile as an Analog Model of Mars. 2022. https://doi.org/10.3389/fspas.2021.810426
  • ten Kate I.L., et al. Mauna Kea, Hawaii, as an Analog Site for Future Planetary Resource Exploration: Results from the 2010 ILSO-ISRU Field-Testing Campaign. Journal of Aerospace Engineering. https://doi.org/10.1061/(ASCE)AS.1943-5525.0000200
  • Callio - Mine for Business. 2023. https://callio.info
  • Joutsenvaara J., et al. Callio Lab - the deep underground research centre in Finland, Europe. 2021. https://doi.org/10.1088/1742-6596/2156/1/012166
  • Tenetz A. More than Planet - Deep residency and workshop, creative EU project, 2022-2025. http://www.photonorth.fi/fi/projektit/more-than-planet/
  • Imaña M., et al. 3D modeling for VMS exploration in the Pyhäsalmi district, Central Finland. In: Proceedings of the 12th Biennial SGA Meeting. 2013. p. 12-15.
  • Miettinen H., et al. Microbiome composition and geochemical characteristics of deep subsurface high-pressure environment, Pyhäsalmi mine, Finland. https://doi.org/10.3389/fmicb.2015.01203

How to cite: Joutsenvaara, J., Tenetz, A., Puputti, J., and Kotavaara, O.: Callio SpaceLab – Sustainable living, sustaining life, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15409, https://doi.org/10.5194/egusphere-egu23-15409, 2023.

EGU23-2395 | ECS | PICO | GI3.2

Performance of water indices using large-scale sentinel-2 data in Google Earth Engine Computing 

Mathias Tesfaye Abebe and Lutz Breuer

Evaluating the performance of water indices and quantifying the spatial distribution of water-related ecosystems are important for monitoring the surface water resources of our study area, since few studies have computed water indices from high-resolution, multi-temporal Sentinel-2 data at a large scale. A comparative performance analysis of water-index methods using such data at the country scale, showing their strengths and weaknesses, was also missing. To address these gaps, this paper evaluated the performance of water indices for surface water extraction in Ethiopia. For this purpose, large-scale Sentinel-2 data of high spatial and multi-temporal resolution were processed using the Google Earth Engine cloud computing system. Seven indices, namely the water index (WI), the automatic water extraction index (AWEI) with and without shadow, the normalized difference water index (NDWI), the modified normalized difference water index (MNDWI), the sentinel water index (SWI), and the land surface water index (LSWI), were evaluated using overall accuracy, producer's accuracy, user's accuracy, and the Kappa coefficient. The results revealed that WI and AWEIshadow were the most accurate at extracting surface water in both the qualitative and quantitative evaluations, with Kappa coefficients of 0.96 and 0.95, respectively, and an overall accuracy of 0.98 for both. The AWEIshadow index was also relatively better at suppressing shadow and urban areas. LSWI performed worst, with an overall accuracy of 0.82 and a Kappa coefficient of 0.31, a significant difference from the other indices. Using the best-performing indices, WI and AWEIshadow, surface water fractions of 82,650 and 86,530 km², respectively, were extracted.
Therefore, our results confirm that the WI and AWEIshadow indices generate better water extraction outputs from high spatial and multi-temporal resolution Sentinel-2 data under a wide range of environmental conditions and water body types at the country scale.
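Two of the ingredients above can be made concrete with a short sketch (the band values are illustrative, not taken from the study): the NDWI is a normalized band ratio of green and near-infrared reflectance, and the Kappa coefficient compares observed classification agreement against the agreement expected by chance, derived from a confusion matrix.

```python
import numpy as np

def ndwi(green, nir):
    # McFeeters NDWI: (Green - NIR) / (Green + NIR); water tends to be positive
    return (green - nir) / (green + nir)

def kappa(confusion):
    # Cohen's Kappa from an N x N confusion matrix (rows: reference, cols: map)
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    p_observed = np.trace(confusion) / n
    p_chance = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / n**2
    return (p_observed - p_chance) / (1 - p_chance)
```

For example, a balanced 2x2 confusion matrix [[45, 5], [5, 45]] has 90% observed agreement but only 50% chance agreement, giving Kappa = 0.8; the other indices in the study (MNDWI, AWEI, etc.) differ from NDWI only in which bands and coefficients enter the ratio.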

How to cite: Abebe, M. T. and Breuer, L.: Performance of water indices using large-scale sentinel-2 data in Google Earth Engine Computing, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2395, https://doi.org/10.5194/egusphere-egu23-2395, 2023.

EGU23-5297 | PICO | GI3.2

DC and FDEM salt wedge monitoring of the Po di Goro river (Italy). 

Enzo Rizzo, Paola Boldrin, Alessandro Bondesan, Francesco Droghetti, Luigi Capozzoli, Gregory De Martino, Enrico Ferrari, Giacomo Fornasari, Valeria Giampaolo, and Federica Neri

Global warming is driving sea-level rise, which increases saltwater contamination of the coastal zone through intrusion and penetration into delta systems. Delta systems are characterized by complex dynamics between freshwater coming from the continent and saltwater. The hydrodynamic system is strongly affected by climate change, which produces scarce recharge of the aquifers and an upstream shift of the mixing zone in the surface waters. These conditions can hinder the use of water for irrigation, leading to salinization of soils. Last summer the Po River underwent a large saltwater intrusion crisis endangering the sustainability of the freshwater resources: the saline wedge in the Po Delta area caused salinization of groundwater and soil. These phenomena allow the active ingression of seawater from the east because the hydraulic head is not sufficient to prevent water from flowing inland from the sea. Electrical conductivity (EC) is one of the chemical-physical parameters typically used to define water quality. However, a common probe provides only point measurements, making it time consuming to monitor a long river (>50 km) such as the Po di Goro, one of the Po River branches. The research group defined two fast geophysical approaches for monitoring saltwater penetration and intrusion: the FDEM method was used to detect the saline wedge in the river, and Electrical Resistivity Tomography (ERT) was applied to monitor the hydrodynamic interaction between the river and the subsoil around the riverbanks. Two geophysical field campaigns were planned, before and after the salt penetration crisis that affected the Po River last summer. In detail, two ERTs and two long FDEM profiles were carried out along the Po di Goro river. Moreover, a "moving boat" approach with a multilevel EC probe was applied to complement the acquired geophysical data set.
The ERT sections highlighted how the salty water in the river contaminated the surrounding subsoil. The FDEM data sets defined the hydrodynamics of the saltwater wedge in the river, detecting the salty plume front. These results highlight the great potential of the proposed geophysical approach for monitoring the saline plume during crisis periods.

How to cite: Rizzo, E., Boldrin, P., Bondesan, A., Droghetti, F., Capozzoli, L., De MArtino, G., Ferrari, E., Fornasari, G., Giampaolo, V., and Neri, F.: DC and FDEM salt wedge monitoring of the Po di Goro river (Italy)., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-5297, https://doi.org/10.5194/egusphere-egu23-5297, 2023.

EGU23-9667 | PICO | GI3.2

Past and present monitoring results of acid mine drainage around copper mines and smelter in Bor, Serbia 

D. Stefan

The first recorded environmental protests in Bor, Serbia, began in 1906, only 3 years after the mining and smelting of copper ores started. In 1931, some of the first results of chemical analyses of river water were published, stating that the content of free acid (as H2SO4) in the Bor River just downstream of the mine was 0.0168 %. Another report, from 1935, stated that the pH value of the Bor River was 4.5, the concentration of Fe was 81 mg/L, and the concentration of Cu was 22 mg/L. At that time, sampling and analysis of river water were initiated by the rebellious local community, who wanted compensation for the damage done to their agricultural fields. Throughout the years, the pollution of the Bor River became the norm, and researchers from Serbia and abroad investigated the pollution from the physical, chemical, mineralogical, and microbiological aspects. From 2015 to 2021, the pH value of the Bor River ranged from 2.1 to 6.3, the concentration of Fe from 66 to 355 mg/L, and the concentration of Cu from 4 to 116 mg/L, depending on the intensity of mining and smelting activities. These more recent results are not so different from those of about a century before. However, since the Bor mining and smelting complex changed ownership in 2018, the monitoring of the pollution has become more advanced, and there are more reclamation activities. Several automatic monitoring stations with inductively coupled plasma optical emission spectrometers or mass spectrometers (ICP-OES or ICP-MS) were installed in the field by the polluted rivers for monitoring purposes. Water from the largest acid mine drainage accumulation, Robule Lake, was treated and drained, and as of 2023 Robule Lake no longer exists. Additional monitoring and reclamation activities are expected, which could reduce the pollution of the Bor River in the future.

How to cite: Stefan, D.: Past and present monitoring results of acid mine drainage around copper mines and smelter in Bor, Serbia, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-9667, https://doi.org/10.5194/egusphere-egu23-9667, 2023.

EGU23-9701 | ECS | PICO | GI3.2

Applicability of remote sensing evapotranspiration products in reducing uncertainty and equifinality in hydrological model calibration of Oued El Abid watershed. 

Soufiane Taia, Lamia Erraioui, Jamal Chao, Bouabid El Mansouri, and Andrea Scozzari

Typically, hydrological models are calibrated using observed streamflow at the outlet of the watershed. This approach may fail to mimic landscape characteristics that significantly impact runoff generation, because streamflow integrates the contributions of several hydrological components. To better constrain model parameters, remotely sensed actual evapotranspiration (AET) products are therefore commonly used as additional data alongside streamflow. Several researchers have demonstrated the efficacy of AET products in reducing the degree of equifinality and the predictive uncertainty, resulting in a significant enhancement of hydrological modelling. Given the variety of publicly available AET datasets, which differ in their methods, parameterization, and spatiotemporal resolution, selecting an appropriate AET product for hydrological modelling is of great importance. The purpose of this study is to investigate the differences in simulated hydrologic responses resulting from the inclusion of different remotely sensed AET products in single- and multi-objective calibration with observed streamflow data. The GLEAM_3.6a, GLEAM_3.6b, MOD16A2, GLDAS, PML_V2, TerraClimate, FLDAS, and SSEBop datasets were downloaded and incorporated into the calibration of the SWAT hydrological model. The findings indicate that incorporating remotely sensed AET data in multi-objective calibration tends to improve model performance, decrease predictive uncertainty, and significantly improve parameter identification. Furthermore, the AET single-variable calibration results show that the model would have performed well in simulating streamflow even without streamflow data. Moreover, each dataset included in this investigation responded differently: GLEAM_3.6b and GLEAM_3.6a performed best, followed by FLDAS and PML_V2, while MOD16A2 was the worst-performing dataset. Thus, this research supports the use of remotely sensed AET in the calibration of hydrological models as a best practice.
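The multi-objective idea can be illustrated with a toy aggregation of two Nash-Sutcliffe efficiencies, one on streamflow and one on AET. This is our own simplification for clarity, not the study's actual SWAT calibration setup; the function names and the equal-weight default are assumptions.

```python
import numpy as np

def nse(sim, obs):
    # Nash-Sutcliffe efficiency: 1 is a perfect fit, < 0 is worse than
    # simply predicting the observed mean
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def multi_objective(q_sim, q_obs, aet_sim, aet_obs, w=0.5):
    # Weighted aggregation of streamflow (Q) and AET skill, negated so
    # that an optimizer can minimize it; w balances the two variables
    return -(w * nse(q_sim, q_obs) + (1.0 - w) * nse(aet_sim, aet_obs))
```

In a real calibration this scalar (or a Pareto front over the two NSE terms) is what the optimizer evaluates for each parameter set; constraining both variables at once is what narrows the set of behavioural parameter sets and reduces equifinality.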

 

How to cite: Taia, S., Erraioui, L., Chao, J., El Mansouri, B., and Scozzari, A.: Applicability of remote sensing evapotranspiration products in reducing uncertainty and equifinality in hydrological model calibration of Oued El Abid watershed., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-9701, https://doi.org/10.5194/egusphere-egu23-9701, 2023.

EGU23-10657 | PICO | GI3.2 | Highlight

Evaluating the applicability of transient electromagnetic (TEM) data to characterize aquifer geometry in urban areas 

Adrián Flores Orozco, Lukas Aigner, and Josef Ferk

Understanding subsurface properties within urban areas is critical for an adequate management of groundwater, for instance to delineate the migration of pollutants, artificial recharge systems or geothermal collectors. Information is available from extraction wells, yet it is limited to the locations where wells exist. Geophysical methods offer an alternative for gaining subsurface information. However, asphalted roads and limited accessibility may reduce the applicability of electrical methods for investigations beyond a few meters, whereas vibrations due to traffic and railroads may hinder the application of seismic methods. In this work, we investigate the use of the transient electromagnetic (TEM) method to resolve the geometry of aquifers in urban areas. We propose the use of relatively small loops to gain separation from buried structures and to increase data quality at late times, as required to reach a depth of investigation of ca. 40 m. Measurements were conducted in gardens located within cities, deploying single-loop as well as in-loop geometries using two different instruments. Additionally, we evaluated our small-loop configuration at a quasi noise-free site through comparison with larger loops and electrical methods. Analysis of the data demonstrates that relatively small loops (12.5 m x 12.5 m) may be a possible solution for gaining information in urban areas down to a depth of 30 m, yet a minimal separation of ca. 5 m from anthropogenic structures is required. Information at such depths cannot easily be gained with refraction seismic or electrical resistivity tomography measurements in such small areas. Moreover, our results reveal the possibility of gaining similar information with even smaller loops (6.25 m x 6.25 m), offering the possibility of increasing the separation from sources of noise (i.e., buried infrastructure) and improving data quality.
The inversion of TEM measurements collected along a 100 m profile allowed us to obtain vertical and lateral variations in aquifer geometry with a maximal depth of investigation of ca. 40 m, while DC-resistivity measurements along the same profile were limited to less than 10 m depth. Stochastic inversion of the data allowed us to investigate the uncertainty in the obtained model parameters (resistivity and thickness of the resolved layers, i.e., the aquifer).

How to cite: Flores Orozco, A., Aigner, L., and Ferk, J.: Evaluating the applicability of transient electromagnetic (TEM) data to characterize aquifer geometry in urban areas, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10657, https://doi.org/10.5194/egusphere-egu23-10657, 2023.

EGU23-11724 | PICO | GI3.2

The IceWorm: an improved low-cost, low-power sensor for measuring dissolved CH4 in water bodies 

Jesper Christiansen, Sarah Elise Sapper, and Christian Juncher Jørgensen

Recent studies show emissions of dissolved methane (CH4) in the meltwater from the Greenland Ice Sheet. To better understand the phenomenon and evaluate its potential significance for the Arctic CH4 budget, continuous long-term measurements of dissolved CH4 concentrations are needed. Commercially available dissolved CH4 analyzers (DGEU-UGGA (LGR), CONTROS HydroC CH4 (Kongsberg) and Mini CH4 (pro-Oceanus)) generally have high power consumption and are very costly, limiting their operation in remote off-grid locations.

Here we present calibrations and field tests of a low-cost, low-power alternative – the "IceWorm" - for long-term monitoring of dissolved CH4. The IceWorm uses a Figaro TGS2611-E00 metal oxide sensor (MOS). While MOS are cheap and power efficient, a known drawback is the sensitivity of the sensor's resistance to changes in humidity and temperature. In a previous prototype, we showed that by encasing the MOS in a hydrophobic and gas-permeable silicone membrane, a constant humidity in the headspace around the sensor can be achieved, yielding consistent results when deployed in glacial meltwater at constant temperature (0.0 – 0.1˚C)1. In this updated version, the sensor was encased in a hydrophobic and gas-permeable Teflon membrane allowing for fast (~1 min) equilibrium between the water and headspace around the sensor and hence a rapid detection of changes in dissolved CH4 concentrations.

The first calibration was performed in the laboratory by exposing the IceWorm to stepwise increasing CH4 concentrations. Two field calibrations of the sensor performance in meltwater at 0.0 ˚C were then carried out. Afterwards, the sensors remained in the subglacial meltwater stream for several weeks and were subsequently recalibrated in lab air under the same conditions to check for long-term sensor drift. Although initially field-calibrated to measure dissolved CH4 in glacial meltwater at 0.0 ˚C, the IceWorm was also tested in a freshwater surface stream at temperatures between 1.6 and 15.7 ˚C. To account for the temperature difference, we compared the laboratory and field calibrations, allowing us to correct the sensor output for temperature variations in the stream.
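The shape of such a calibration can be caricatured in a few lines. A power-law response of the resistance ratio Rs/R0 is a common way to describe MOS sensors, but every coefficient below is purely hypothetical and merely stands in for the IceWorm's actual laboratory and field calibration curves.

```python
def ch4_from_mos(r_s, r0, temp_c, a=1.0, b=-1.6, k=0.02):
    """Map a MOS resistance ratio Rs/R0 to a dissolved CH4 concentration
    via a hypothetical power-law calibration curve, then apply a crude
    first-order temperature correction anchored at the 0.0 degC field
    calibration. All coefficients (a, b, k) are illustrative only."""
    ratio = r_s / r0
    concentration = a * ratio ** b            # resistance drops as CH4 rises
    return concentration / (1.0 + k * temp_c)  # hypothetical T correction
```

The essential points the sketch captures are that the raw quantity is a resistance ratio against a reference R0, and that a separate, temperature-dependent factor (here linear in temperature) is what the comparison of lab and field calibrations supplies.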

We will present time series of long-term measurements of dissolved CH4 in two different types of water bodies and discuss the promising performance of the sensor at temperatures other than a stable 0 ˚C, as well as the usability of in-air calibrations compared to field calibrations with discrete samples.

1. Sapper et al. (2022) DOI:10.5194/egusphere-egu22-9972

How to cite: Christiansen, J., Sapper, S. E., and Juncher Jørgensen, C.: The IceWorm: an improved low-cost, low-power sensor for measuring dissolved CH4 in water bodies, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11724, https://doi.org/10.5194/egusphere-egu23-11724, 2023.

EGU23-12141 | PICO | GI3.2

A framework for cost-effective enrichment of water demand records at fine spatio-temporal scales 

Panagiotis Kossieris, Ioannis Tsoukalas, and Christos Makropoulos

Residential water demand is a key element of urban water systems, and hence its analysis, modelling and simulation are of paramount importance for feeding modelling applications. During the last decades, the advent of smart metering technologies has released new streams of high-resolution water demand data, allowing the modelling of the demand process at fine spatial (down to appliance level) and temporal (down to 1 s) scales. However, high-resolution data (i.e., finer than 1 min) remain limited, while longer series at coarser resolutions (e.g., 5 min or 15 min) do exist and are becoming increasingly available, and metering devices with such sampling capabilities have the potential for wider deployment in the near future. This work attempts to enrich the information available at fine scales, addressing the issue of data unavailability in a cost-effective way. Specifically, we present a novel framework that enables the generation of synthetic (yet statistically and stochastically consistent) water demand records at fine time scales, taking advantage of coarser-resolution measurements. The framework couples: a) lower-scale extrapolation methodologies to estimate the essential statistics (i.e., probability of no demand and second-order properties) for the model's setup at fine scales, and b) stochastic disaggregation approaches for the generation of synthetic series that reproduce the regime of the process at multiple temporal scales. The framework, and its individual modules, are demonstrated by generating 1-min synthetic water demands at the household level, using 15-min data from available smart meters.
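The disaggregation step can be sketched with a deliberately simple scheme: impose intermittency (a probability of zero demand) and random pulse weights at the 1-min scale, then rescale so the fine-scale values always add back to the coarse measurement. The intermittency probability and gamma-distributed weights below are assumptions for illustration; the actual framework derives these statistics from lower-scale extrapolation and a formal stochastic model.

```python
import numpy as np

rng = np.random.default_rng(42)

def disaggregate(volume_15min, n_sub=15, p_zero=0.6):
    """Toy stochastic disaggregation: split one 15-min demand volume into
    1-min pulses. Each sub-step is dry with probability p_zero; wet steps
    get gamma-distributed weights, rescaled to preserve the coarse total."""
    active = rng.random(n_sub) >= p_zero
    if not active.any():
        active[rng.integers(n_sub)] = True  # keep at least one wet step
    weights = np.where(active, rng.gamma(2.0, 1.0, n_sub), 0.0)
    return volume_15min * weights / weights.sum()
```

The mass-preservation constraint (the rescaling in the last line) is the defining property of any disaggregation scheme: whatever the fine-scale stochastic structure, summing the synthetic 1-min values over each 15-min window must reproduce the metered series exactly.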

How to cite: Kossieris, P., Tsoukalas, I., and Makropoulos, C.: A framework for cost-effective enrichment of water demand records at fine spatio-temporal scales, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12141, https://doi.org/10.5194/egusphere-egu23-12141, 2023.

EGU23-14097 | PICO | GI3.2

Geofluids inferences using deep electrical resistivity tomography for a sustainable energy transition 

Valeria Giampaolo, Luigi Capozzoli, Gregory De Martino, Vincenzo Lapenna, Giacomo Prosser, Fabio Olita, Paola Boldrin, and Enzo Rizzo

In recent years, the use of Deep Electrical Resistivity Tomography (DERT) has become more common for investigating areas with complex geological settings. The considerable resolution obtained with this technique makes it possible to discriminate much more effectively the resistivity contrasts existing in the shallower crustal levels, thus providing more reliable information on the physical conditions of the rocks, the presence of structural discontinuity surfaces, and the presence and geometry in the subsoil of aquifers and/or fluids of various origins.

For these reasons, DERT investigations were carried out in a structurally complex area close to the village of Tramutola, on the western side of the Agri Valley, which hosts the largest onshore hydrocarbon reservoir in western Europe.

The Tramutola site was a key sector for the early petroleum exploration and exploitation of the area. Natural oil seeps have been known in the investigated area since the 19th century, and these helped the national oil company identify the first shallow hydrocarbon traces. Furthermore, a considerable amount of sulphureous hypothermal water (~28 °C, with a flow rate of 10 l/s) with associated gases (mainly CH4 and CO2) was found during the drilling of the "Tramutola2" well (404.4 m) in 1936. From a geological point of view, the study area is characterized by a complete section of the tectonic units of the southern Apennines and by a complex structural framework, not yet fully clarified, which affects fluid circulation.

To foster the efficient and sustainable use of the geothermal resource in the Tramutola area, surface and subsurface geological, hydrogeological and new geophysical data were combined in order to deepen our knowledge of the reservoir of the hypothermal fluids and their circulation.

The municipality of Tramutola is interested in rehabilitating the abandoned oil wells, both to exploit the geothermal resource and to create a tourist "Park of Energy". The aim is to provide a wide audience with strategies, models and technical skills capable of making visitors more active and critical towards the sustainable use of energy resources. Furthermore, the possible exploitation of the geothermal resources of the Tramutola site represents a strategic action in the Basilicata region as a prototype of the energy transition from fossil fuels to more environmentally friendly energy resources. This is also essential to satisfy the increased demand for clean energy in the area (UN Sustainable Development Goal 7, affordable and clean energy) and to contribute to climate change mitigation through the reduction of CO2 emissions (SDG 13, climate action).

How to cite: Giampaolo, V., Capozzoli, L., De Martino, G., Lapenna, V., Prosser, G., Olita, F., Boldrin, P., and Rizzo, E.: Geofluids inferences using deep electrical resistivity tomography for a sustainable energy transition, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14097, https://doi.org/10.5194/egusphere-egu23-14097, 2023.

EGU23-14977 | PICO | GI3.2

A low-cost novel optical sensor for water quality monitoring 

Sean Power, Louis Free, Chloe Richards, Ciprian Briciu-Burghina, Adrian Delgado Ollero, Ruth Clinton, and Fiona Regan

With increasing environmental pressure due to global climate change, increases in global population and the need for sustainably obtained resources, water resources management is critical. In-situ sensors are fundamental to the management of water systems by providing early warning, forecasting and baseline data to stakeholders. To be fit for purpose, monitoring using in-situ sensors has to be carried out in a cost-effective way and allow implementation at larger spatial scales. If networks of sensors are to become not only a reality but commonplace, it is necessary to produce reliable, inexpensive, rugged sensors integrated with data analytics.

In this context, the aim of this project was to design and develop a low-cost, robust and reliable optical sensor capable of continuous measurement of chemical and physical parameters in aquatic environments. An iterative engineering design method, cycling between sensor design, prototyping and testing, was used for the realisation and optimisation of the sensor. The sensor can provide absorption, scatter and fluorescence readings over a broad spectral range (280 nm to 850 nm) and temperature readings in real time, using a suite of optical detectors (CMOS spectrometers and a photodiode detector), a custom-designed LED array light source and a digital temperature probe. Custom electronics and firmware were developed to control the sensor and facilitate data transmission to an external network. The sensor electronics are housed in a marine-grade watertight housing; the optical components are mounted inside a custom-designed 3D-printed optical head which joins with the sensor housing. The sensor is capable of measuring a range of optical parameters and temperature in a single measurement cycle. The sensor's analytical performance was demonstrated in the laboratory, for the detection and quantification of turbidity using analytical standards, and in the field, by comparison with a commercially available multi-parameter probe (YSI EXO 2). The laboratory and field trials demonstrate that the sensor is fit for purpose and an excellent tool for early warning monitoring: it provides high-frequency time-series data, operates unattended in-situ for extended periods, and captures pollution events.

Acknowledgement - This research is carried out with the support of Project Ireland 2040's Disruptive Technologies Innovation Fund.

How to cite: Power, S., Free, L., Richards, C., Briciu-Burghina, C., Delgado Ollero, A., Clinton, R., and Regan, F.: A low-cost novel optical sensor for water quality monitoring, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14977, https://doi.org/10.5194/egusphere-egu23-14977, 2023.

EGU23-15083 | PICO | GI3.2

Gedi Data Within Google Earth Engine: Potentials And Analysis For Inland Surface Water Monitoring 

Alireza Hamoudzadeh, Roberta Ravanelli, and Mattia Crespi

Inland surface water is the source of about 60% and a key component of the hydrological cycle. Monitoring inland surface water is fundamental to understanding the effects of climate change on this key resource and to preventing water stress. Water levels are traditionally measured by ground instruments such as gauge stations, which are expensive and have high maintenance costs. Conversely, Earth Observation technologies can nowadays collect frequent and regular data, providing continuous monitoring of water reservoirs and reducing monitoring costs.

 

With the availability of such new data, a capable computation tool is crucial. Google Earth Engine (GEE), a cloud-based computation platform that integrates a wide variety of datasets with powerful analysis tools [1], has recently added the Global Ecosystem Dynamics Investigation (GEDI) dataset [4] to its archive.

 

The GEDI instrument [2], hosted onboard the International Space Station, is a geodetic-class light detection and ranging (LiDAR) system with a 25 m spot (footprint) on the surface, over which the 3D structure is measured. The footprints are separated by 60 m along-track, with an across-track distance of about 600 m. The measurements are made over the Earth's surface nominally between the latitudes of 51.6° N and 51.6° S. GEDI was originally developed to enable radically improved quantification and understanding of the Earth's carbon cycle and biodiversity.

 

The available literature highlights that the quality of GEDI data is variable and impacted by several factors (e.g., latitude, orbit). Our preliminary analysis focuses on the accuracy assessment of the GEDI data, first addressing the problem of outlier detection and removal, and then comparing the water levels measured by GEDI with reference ground truth; to this end, we considered four lakes in Northern Italy for which level measurements from gauge stations are available.

The proposed outlier detection consists of two steps, applied to each GEDI passage over a water surface.

The first step is based on two flags embedded in the GEDI bands. Specifically, the "quality_flag" indicates whether the considered footprint has valid waveforms (1 = valid, 0 = invalid), accounting for anomalies in the energy, sensitivity and amplitude of the signals; the "degrade_flag" indicates a degraded state of the pointing (saturation of the returned photon intensity might reduce the accuracy of the measurements) and/or of the positioning information (GPS data gaps, GPS receiver clock drift).

The second step relies on a robust version of the standard 3σ test, implemented using the NMAD (Normalized Median Absolute Deviation): every GEDI measurement not within ±3·NMAD of the median is removed as an outlier.
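The two screening steps can be summarized in a short sketch; the array names are illustrative, while the flag semantics and the 3·NMAD rule follow the description above.

```python
import numpy as np

def filter_gedi(elev, quality_flag, degrade_flag):
    """Two-step GEDI screening: (1) keep footprints with valid waveforms
    (quality_flag == 1) and non-degraded pointing/positioning
    (degrade_flag == 0); (2) apply the robust 3*NMAD test about the
    median elevation, the robust counterpart of the 3-sigma rule."""
    keep = (quality_flag == 1) & (degrade_flag == 0)
    e = elev[keep]
    med = np.median(e)
    nmad = 1.4826 * np.median(np.abs(e - med))  # NMAD ~ sigma for Gaussians
    return e[np.abs(e - med) <= 3 * nmad]
```

Because both the center (median) and the spread (NMAD) are order statistics, a single grossly wrong footprint cannot inflate the acceptance band the way it would inflate a mean-and-standard-deviation 3σ test.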

To assess the outlier detection procedure and to preliminarily evaluate the accuracy of the GEDI data, we compared the water levels inferred from the median of the GEDI measurements after outlier removal with the contemporaneous water levels from hydrometric stations at four major lakes (Como, Garda, Iseo, Maggiore) in Northern Italy [3]. The comparison covers the roughly three-year period from GEDI activation until June 2022 and is ongoing.

References

[1] Cardille, et al., 2022. Cloud-Based Remote Sensing with Google Earth Engine.

[2] Dubayah, et al., 2021. GEDI L3 gridded land surface metrics, version 1.

[3] Enti Regolatori dei Grandi Laghi, 2022. Home Page - Laghi. www.laghi.net.

[4] University of Maryland, 2022. GEDI ecosystem lidar.

How to cite: Hamoudzadeh, A., Ravanelli, R., and Crespi, M.: Gedi Data Within Google Earth Engine: Potentials And Analysis For Inland Surface Water Monitoring, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15083, https://doi.org/10.5194/egusphere-egu23-15083, 2023.

EGU23-15274 | PICO | GI3.2 | Highlight

Cost-effective full monitoring system for long-term measurements in lake ecosystems 

Daniele Strigaro, Massimiliano Cannata, Camilla Capelli, and Fabio Lepori

The concomitance of climate change and the effects of human activities is a mix of co-factors that can induce unknown dynamics and feedbacks, which need to be studied and monitored. Lakes are among the most affected natural resources. Given their importance for the economy, water supply, and tourism, it is essential to safeguard their health. Unfortunately, lake monitoring is dominated by the very high cost of materials and by proprietary solutions that are a barrier to data interoperability. To this end, an integrated system which uses as much open-source-licensed technology as possible, and is open source itself, will be presented. The main idea is to create a complete pipeline that can integrate different data sources by means of processes that make the time series organized and accessible, and then serve them via standard services. Data integration allows further analysis of the data to produce new time series, either by manual or automatic processes. This proposition also includes the creation of an Automatic High-Frequency Monitoring (AHFM) system built on cost-effective principles and meeting open-design requirements. The preliminary results and applications of this solution will be described, such as the calculation of primary production and the quasi-real-time detection of algal blooms. The study area where this system has been developed and tested is Lake Lugano, in the southern part of Switzerland, a very productive lake affected by the effects of climate change. The developed system permits the integration of historical data measured with traditional campaigns on the lake with new datasets collected with innovative technologies, so that the comparison and validation of datasets can be performed more easily. In this way it is possible to detect biases and create automatic data pipelines to calculate indicators and issue alerts.

How to cite: Strigaro, D., Cannata, M., Capelli, C., and Lepori, F.: Cost-effective full monitoring system for long-term measurements in lake ecosystems, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15274, https://doi.org/10.5194/egusphere-egu23-15274, 2023.

EGU23-15814 | ECS | PICO | GI3.2 | Highlight

Low-cost in-situ sensor networks for soil moisture and water table measurements: experiences and recommendations. 

Ciprian Briciu-Burgina, Jiang Zhou, Muhammad Intizar Ali, and Fiona Regan

Soil moisture is an essential parameter for irrigation management, transport of pollutants and estimation of energy, heat, and water balances. Soil moisture is one of the most important spatio-temporal soil variables due to the highly heterogeneous nature of soils, which in turn drives water fluxes, evapotranspiration, air temperature, precipitation, and soil erosion. Recent developments have seen an increasing number of commercially available electromagnetic sensors for soil volumetric water content (θ), and their use is expanding, providing decision support and high-resolution data for models and machine-learning algorithms.

In this context, two demonstrations of in-situ LoRaWAN sensor networks are presented. The first is from a grassland site, Johnstown Castle, Wexford, Ireland, where a network of 10 low-cost soil moisture (SM) sensors has been operating for 12 months. The second network has been operating for 6 months at a peatland site (Cavemount Bog, Offaly, Ireland) which is currently undergoing rehabilitation through re-wetting. At this site, in addition to SM sensors, ultrasonic sensors are used for continuous measurement of the water table at 7 locations. For both sites, the analytical performance of the SM sensors has been determined in the laboratory, through calibrations in liquids of known dielectric permittivity, and through field validation via sample collection or time-domain reflectometry (TDR) instrumentation. Experiences and recommendations in deploying, maintaining, and servicing the sensor networks, and in data management (cleaning, validation, analysis), will be presented and discussed. Emphasis will be placed on the key learnings to date and the performance of the low-cost sensor networks in terms of collected data.

Small-scale sensor networks like these are expected to bridge the gap between the low spatial resolution provided by the satellite-derived products and the single point/field measurements. Within the project, the sensor network will provide spatial observations to complement existing fixed point measurements. It will allow researchers to investigate SM dynamics at field scale in response to different soil types, soil density, elevation, and land cover.

How to cite: Briciu-Burgina, C., Zhou, J., Ali, M. I., and Regan, F.: Low-cost in-situ sensor networks for soil moisture and water table measurements: experiences and recommendations., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15814, https://doi.org/10.5194/egusphere-egu23-15814, 2023.

EGU23-16053 | ECS | PICO | GI3.2 | Highlight

Multiparametric water quality sensor based on carbon nanotubes: Performance assessment in realistic environment 

Balakumara Vignesh M, Stéphane Laporte, Yan Ulanowski, Senthilmurugan Subbiah, and Bérengère Lebental

Good-quality water is crucial to the sustainability of most developing nations. However, there is a clear lack of affordable and reliable solutions to monitor water quality. According to the WHO 2022 Sustainable Development Goals report, about 3 billion people do not have information on their water quality. While off-line measurements are common practice, the availability of in-situ monitoring solutions is considered critical to the generalization of water monitoring, but current technologies are bulky, expensive and usually do not target a sufficient number of quality parameters [1].

To meet this challenge, the LOTUS project (https://www.lotus-india.eu/) brings forward a low-cost, compact, versatile multiparametric chemical sensor aiming at real-time monitoring of chlorine, pH, temperature and conductivity in potable water. The proposed solution – a tube 21.2 cm in length by 3.5 cm in diameter – is composed of a replaceable sensor head incorporating the sensing elements and a sensor body containing the acquisition and communication electronics. The sensor head integrates a 1 cm² silicon chip with 2 temperature sensors (serpentine-shaped thermistors), 3 conductivity sensors (parallel electrodes in a 4-probe configuration) and a 10x2 sensor array of multi-walled carbon nanotube (CNT) chemistors. The CNTs are arranged in random networks between interdigitated electrodes and are either non-functionalized or functionalized with a dedicated polymer [1].

We evaluated the performance of 7 units of this solution at the Sense-City facility (University Gustave Eiffel, France – https://sense-city.ifsttar.fr/), exploiting its 44 m potable-water loop with 93.8 mm PVC pipes. The system was operated at 25 m³/h and 1 bar, at temperatures ranging between 15 °C and 20 °C, conductivity between 870 µS/cm and 1270 µS/cm, and chlorine between 0 and 5 mg/L. Because of the high level of electromagnetic interference in Sense-City and the limited shielding of the acquisition system, the sensor signal is severely noisy and several denoising steps are required. From the initial dataset, we extracted a small number of devices and time periods with both sufficient variation in the target parameters and a manageable signal-to-noise ratio.

For chip 141, over 150 hours of testing, CNT-based chemistors showed sensitivity to pH and active chlorine (HClO), with differentiated responses between functionalized and non-functionalized devices. However, due to the high noise level, pH and chlorine can only be estimated with an MAE of 0.17 and 0.18 mg/L, respectively. Over 400 h with chip 141, the real-time water temperature can be estimated with an MAE of 0.4 °C in flowing water and 0.1 °C in static water. The chip 141 dataset did not feature enough conductivity variation to assess performance; this was achieved on chip AS001, with an MAE of 176.2 µS/cm over 80 hours.

Overall, these results provide a preliminary proof of operation of the solution in a realistic environment, with the high noise level being a major limitation. A new version of the system is being designed to reduce the noise, to be tested in Sense-City in 2023.

[1] Cousin, P. et al. (2022). Improving Water Quality and Security with Advanced Sensors and Indirect Water Sensing Methods. Springer Water. https://doi.org/10.1007/978-3-031-08262-7_11

How to cite: Vignesh M, B., Laporte, S., Ulanowski, Y., Subbiah, S., and Lebental, B.: Multiparametric water quality sensor based on carbon nanotubes: Performance assessment in realistic environment, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16053, https://doi.org/10.5194/egusphere-egu23-16053, 2023.

EGU23-16460 | PICO | GI3.2

Using hard and soft data from direct and indirect methods to develop a model for the investigation of a metamorphic aquifer 

Salvatore Straface, Francesco Chidichimo, Michele De Biase, and Francesco Muto

In Italy, despite large areas of the country being covered by metamorphic rocks, the hydrogeological properties of these formations are not yet well known. The productivity of metamorphic aquifers is generally lower than that of more common ones, such as alluvial or carbonate aquifers. However, in some Mediterranean areas, such as the Calabria region, the scarcity of water resources and the considerable extension of these aquifers (metamorphic aquifers make up 39% of the total) call for further studies of both their hydrodynamic properties and their hydraulic behaviour, in order to achieve their sustainable exploitation. Interest in these metamorphic aquifers becomes even greater if climate change is considered. The purpose of this study is to provide the geological-structural and hydrogeological modeling of a metamorphic aquifer, through the measurement of direct and indirect data and the application of a numerical model, in a large area of the Sila Piccola, in Calabria. To recognize and characterize the geometries of the aquifer in metamorphic rocks within a complex geological setting, data from springs, wells and piezometers installed in boreholes at various depths were collected. These surveys were complemented by geoelectric tomography profiles and geognostic investigations. The recognition of the geometries and, above all, of the stratigraphic relationships between the various outcropping rocks and lithological units was accompanied by macro-structural and meso-structural analyses to better evaluate the state of fracturing of the rock mass. The characterization of hydrodynamic properties in crystalline-metamorphic aquifers, which consist of granite and metamorphic rocks, is extremely complex given the lateral and vertical anisotropies.
Among the main fractures there is a network of secondary connections of different orders and degrees, which determines a continuous variation of these properties at different scales and defines the mode and direction of groundwater flow. The MODFLOW-2005 groundwater model was used to simulate flow in the aquifer, obtaining a hydraulic conductivity of 2.7 × 10⁻⁶ m/s, two orders of magnitude higher than that calculated from the slug tests inside the slope. In summary, the mathematical model was able to estimate the equivalent permeability of the aquifer and the presence of lateral recharge from a neighboring deep aquifer that provides a significant water supply.

How to cite: Straface, S., Chidichimo, F., De Biase, M., and Muto, F.: Using hard and soft data from direct and indirect methods to develop a model for the investigation of a metamorphic aquifer, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16460, https://doi.org/10.5194/egusphere-egu23-16460, 2023.

Currents transport the sediment discharge of the Amazon River as far as the Orinoco Delta (Venezuela). The combined action of waves (predominantly from the NE) and the Guiana Current creates mud banks of 30 km in width. A continuous process of mud erosion and accretion propagates the mud banks westward.

The talk demonstrates tracking the mud banks with satellite-derived bathymetry (SDB). The SDB method used here is not the familiar Lyzenga bottom-radiance-to-depth inversion, which works only in clear waters; here there is no bottom visibility. Instead, the SDB uses the interaction of ocean waves with the bottom, an approach known as wave kinematics bathymetry (WKB). Ocean waves exhibit refraction, slower celerity, and reduced wavelength as they ‘feel’ the bottom. These phenomena are observable regardless of water turbidity.
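The physics behind this kind of depth retrieval is the linear dispersion relation for surface gravity waves, ω² = g k tanh(kh): with the wavelength L (k = 2π/L) and period T (ω = 2π/T) observed in the imagery, the only unknown is the depth h. A minimal sketch of such an inversion, with illustrative values and not the author's actual processing chain:

```python
import math

def depth_from_wave(wavelength_m, period_s, g=9.81):
    """Invert the linear dispersion relation w^2 = g*k*tanh(k*h) for depth h.

    Uses simple bisection on h (tanh(k*h) is monotonic in h). Returns None
    when the waves are effectively in deep water, i.e. they do not 'feel'
    the bottom and carry no depth information.
    """
    k = 2.0 * math.pi / wavelength_m
    w2 = (2.0 * math.pi / period_s) ** 2
    target = w2 / (g * k)      # equals tanh(k*h); must be < 1 for a solution
    if target >= 1.0:
        return None            # deep-water limit: depth not recoverable
    lo, hi = 0.0, 10.0 * wavelength_m
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if math.tanh(k * mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

shallow = depth_from_wave(100.0, 10.0)   # 100 m waves slowed by the bottom
deep = depth_from_wave(100.0, 8.0)       # same wavelength, deep-water period
```

A 100 m wave with a 10 s period only satisfies the dispersion relation in roughly 12 m of water, while the same wavelength with an 8 s period sits on the deep-water branch and yields no depth, which is why the method works best where long swell crosses the shelf.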

WKB has been successfully implemented with X-band radars on coastal towers and ships (by German and UK research groups), and with the WorldView and Pleiades satellites (by this author and others). However, all these sensor modalities have small ground footprints (~10 km² to 100 km²).

The European Sentinel-2 satellites have dramatically increased WKB coverage to a regional scale. This talk presents a Sentinel-2 view of the 1500 km muddy coastline, extending up to 50 km offshore (a total area of 75,000 km²).

This leap in WKB coverage was made possible by the 220 km image swath, repeat visits every five days, and the free distribution of the images from the Copernicus portal.

How to cite: Abileah, R.: Tracking mud banks on the 1500 km coastline from the Amazon to the Orinoco Delta, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16881, https://doi.org/10.5194/egusphere-egu23-16881, 2023.

EGU23-1981 | ECS | Posters on site | GI3.3

Total stratospheric bromine inferred from balloon-borne solar occultation bromine oxide (BrO) measurements using the new TotalBrO instrument 

Karolin Voss, Philip Holzbeck, Ralph Kleinschek, Michael Höpfner, Gerald Wetzel, Björn-Martin Sinnhuber, Klaus Pfeilsticker, and André Butz

Halogenated organic and inorganic compounds, in particular those containing chlorine, bromine and iodine, are known to contribute to global ozone depletion as well as, directly and indirectly, to climate forcing. As a result of the Montreal Protocol (1987), the chlorine and bromine loadings of the stratosphere are closely monitored, while the role of iodinated compounds in stratospheric ozone photochemistry is still uncertain.

To address the questions concerning bromine and iodine compounds, a compact solar occultation instrument (TotalBrO) has been specifically designed to measure BrO, IO (iodine oxide) and other UV/Vis-absorbing gases by means of Differential Optical Absorption Spectroscopy (DOAS) from aboard a stratospheric balloon. The instrument (power consumption < 100 W) consists of an active camera-based solar tracker (LxWxH ~ 0.40 m x 0.40 m x 0.50 m, weight ~ 12 kg) and a spectrometer unit (LxWxH ~ 0.45 m x 0.40 m x 0.40 m, weight ~ 25 kg). The spectrometer unit houses two grating spectrometers which operate in vacuum and under temperature stabilization by an ice-water bath.

We discuss the performance of the TotalBrO instrument during its first two deployments on stratospheric balloons, launched from Kiruna in August 2021 and from Timmins in August 2022 within the HEMERA program. Once the balloon gondola was azimuthally stabilized, the solar tracker was able to follow the Sun with a 1σ precision better than 0.02°, up to solar zenith angles (SZAs) of 95°. The spectral retrieval (of 46 spectra acquired at SZAs between 84° and 90°) allowed us to infer the BrO mixing ratio above 32 km altitude. The total bromine in the middle stratosphere is then inferred by accounting for the BrO/Bry partitioning derived from a photochemical model.

How to cite: Voss, K., Holzbeck, P., Kleinschek, R., Höpfner, M., Wetzel, G., Sinnhuber, B.-M., Pfeilsticker, K., and Butz, A.: Total stratospheric bromine inferred from balloon-borne solar occultation bromine oxide (BrO) measurements using the new TotalBrO instrument, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1981, https://doi.org/10.5194/egusphere-egu23-1981, 2023.

EGU23-2923 | ECS | Posters on site | GI3.3

Total organic carbon measurements reveal large discrepancies in reported petrochemical emissions 

Megan He, Jenna Ditto, Lexie Gardner, Jo Machesky, Tori Hass-Mitchell, Christina Chen, Peeyush Khare, Bugra Sahin, John Fortner, Katherine Hayden, Jeremy Wentzell, Richard Mittermeier, Amy Leithead, Patrick Lee, Andrea Darlington, Junhua Zhang, Samar Moussa, Shao-Meng Li, John Liggio, and Drew Gentner

Oil sands are a prominent unconventional source of petroleum. Total organic carbon measurements via an aircraft campaign (Spring-Summer 2018) revealed emissions above Canadian oil sands exceeding reported values by 1900-6300%. The “missing” compounds were predominantly intermediate- and semi-volatile organic compounds, which are prolific precursors to secondary organic aerosol formation. 

Here we use a novel combination of aircraft-based measurements (including total carbon emissions measurements) and offline analytical instrumentation to characterize the mixtures of organic carbon and their volatility distributions above oil sands facilities. These airborne, real-time observations are supplemented by laboratory experiments identifying substantial, unintended emissions from waste management practices, emphasizing the importance of accurate facility-wide emissions monitoring and total carbon measurements to detect potentially vast missing emissions across sources.

Detailed chemical speciation confirms these observations near both surface mining and in-situ facilities were oil sands-derived, with facility-wide emissions around 1% of extracted petroleum—a comparable loss rate to natural gas extraction. Total emissions, spanning extraction through waste processing, were equivalent to total Canadian anthropogenic emissions from all sources. These results demonstrate that the full air quality and environmental impacts of oil sands operations cannot be captured without complete coverage of a wider volatility range of emissions.

How to cite: He, M., Ditto, J., Gardner, L., Machesky, J., Hass-Mitchell, T., Chen, C., Khare, P., Sahin, B., Fortner, J., Hayden, K., Wentzell, J., Mittermeier, R., Leithead, A., Lee, P., Darlington, A., Zhang, J., Moussa, S., Li, S.-M., Liggio, J., and Gentner, D.: Total organic carbon measurements reveal large discrepancies in reported petrochemical emissions, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2923, https://doi.org/10.5194/egusphere-egu23-2923, 2023.

EGU23-3473 | Posters on site | GI3.3 | Highlight

The FAAM large atmospheric research aircraft: a brief history and future upgrades 

James Lee

The UK’s large atmospheric research aircraft is a converted BAe 146 operated by the Facility for Airborne Atmospheric Measurements (FAAM). With a range of 2000 nautical miles, the FAAM aircraft is capable of operating all over the world, and it has taken part in science campaigns in over 30 different countries since 2004. The aircraft can fly as low as 50 feet over the sea and sustain flight at 100 feet; its service ceiling is nearly 11 km. Typically, flights last between one and six hours and carry up to 18 scientists onboard, who guide the mission and support the operation of up to 4 tonnes of scientific equipment. Currently, the aircraft is undergoing a £49 million mid-life upgrade (MLU) program, which will extend its lifetime to at least 2040. The three overarching objectives of the MLU are to:

Safeguard the UK’s research capability – allowing the facility to meet the needs of the research community, enhance the range of services available, and respond to environmental emergencies.

Provide frontier science capability – meeting new and existing research needs and supporting ground-breaking science discoveries, with a flexible and world-class airborne laboratory.

Reduce environmental impact – maintaining and improving the performance of the facility, and minimising emissions and resource use from aircraft operation.

Presented here will be a brief history of the aircraft operations, including example science outcomes from flights all over the world, together with details of the ongoing upgrades, in particular the new, cutting-edge measurement capabilities for gases, aerosols, clouds, radiation and meteorology. Also presented will be the expected reductions in the environmental impact of the aircraft and how these will be monitored.

How to cite: Lee, J.: The FAAM large atmospheric research aircraft: a brief history and future upgrades, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3473, https://doi.org/10.5194/egusphere-egu23-3473, 2023.

EGU23-6620 | ECS | Posters on site | GI3.3

Airborne observations over the North Atlantic Ocean reveal the first gas-phase measurements of urea in the atmosphere 

Emily Matthews, Thomas Bannan, M. Anwar Khan, Dudley Shallcross, Harald Stark, Eleanor Browne, Alexander Archibald, Stéphane Bauguitte, Chris Reed, Navaneeth Thamban, Huihui Wu, James Lee, Lucy Carpenter, Ming-xi Yang, Thomas Bell, Grant Allen, Carl Percival, Gordon McFiggans, Martin Gallagher, and Hugh Coe

Despite the reduced nitrogen (N) cycle being central to global biogeochemistry, large uncertainties surround its sources and rate of cycling. Here, we present the first observations of gas-phase urea (CO(NH₂)₂) in the atmosphere, from airborne high-resolution mass-spectrometer measurements over the North Atlantic Ocean. We show that urea was ubiquitous in the marine lower troposphere during the summer, autumn and winter flights, but was below the limit of detection during the spring flights. The observations suggest the ocean is the primary emission source, but further studies are required to understand the processes responsible for the air-sea exchange of urea. Urea was also frequently observed aloft, due to long-range transport of biomass-burning plumes. These observations, alongside global model simulations, point to urea being an important, and as yet unaccounted for, component of reduced-N input to the remote marine environment. Since we show it readily partitions between the gas and particle phases, airborne transfer of urea between nutrient-rich and nutrient-poor parts of the ocean can occur readily and could impact ecosystems and oceanic uptake of CO2, with potentially important atmospheric implications.

How to cite: Matthews, E., Bannan, T., Khan, M. A., Shallcross, D., Stark, H., Browne, E., Archibald, A., Bauguitte, S., Reed, C., Thamban, N., Wu, H., Lee, J., Carpenter, L., Yang, M., Bell, T., Allen, G., Percival, C., McFiggans, G., Gallagher, M., and Coe, H.: Airborne observations over the North Atlantic Ocean reveal the first gas-phase measurements of urea in the atmosphere, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6620, https://doi.org/10.5194/egusphere-egu23-6620, 2023.

EGU23-7804 | Posters virtual | GI3.3

In-situ trace-gas measurements from the ground to the stratosphere by an OF-CEAS balloon-borne instrument 

Valery Catoire, Chaoyang Xue, Gisèle Krysztofiak, Patrick Jacquet, Michel Chartier, and Claude Robert

Monitoring climate change and the stratospheric ozone budget requires accurate knowledge of the abundances of greenhouse gases and ozone-depleting substances from the lower troposphere to the stratosphere. An infrared laser absorption spectrometer called SPECIES (acronym for SPECtromètre Infrarouge à lasErs in Situ) has been developed for balloon-borne trace-gas measurements.

The complete instrument was validated during a flight in August 2021 in the polar region (Kiruna, Sweden), within the frame of the “KLIMAT 2021” campaign managed by CNES for the “MAGIC” project, using concomitant balloon and aircraft flights. Results of this flight concerning CH4 and CO2 will be presented.

How to cite: Catoire, V., Xue, C., Krysztofiak, G., Jacquet, P., Chartier, M., and Robert, C.: In-situ trace-gas measurements from the ground to the stratosphere by an OF-CEAS balloon-borne instrument, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7804, https://doi.org/10.5194/egusphere-egu23-7804, 2023.

EGU23-7986 | ECS | Posters on site | GI3.3

Ship emissions and apparent sulphur fuel content measured on board a large research aircraft in international waters and a Sulphur Emission Control Area 

Dominika Pasternak, James Lee, Beth Nelson, Magdalini Alexiadou, Loren Temple, Stéphane Bauguitte, Steph Batten, James Hopkins, Stephen Andrews, Emily Mathews, Thomas Bannan, Huihui Wu, Navaneeth Thamban, Nicholas Marsden, Ming-Xi Yang, Thomas Bell, Hugh Coe, and Keith Bower

On 1 January 2020, the legal sulphur content of shipping fuel outside the Sulphur Emission Control Areas (SECAs) was decreased from 3.5% to 0.5% by mass, to improve coastal air quality. A possible downside of this change is an acceleration of climate change, since sulphur is believed to be a negative climate forcer and shipping is one of its main sources. A further question is the level of compliance with the new rules, especially in open waters. Another climate-related aspect of shipping is the recent growth of the liquefied natural gas (LNG) tanker fleet. LNG is considered the greenest of the fossil fuels; however, there are few empirical studies of methane emissions from marine LNG transport.

The Atmospheric Composition and Radiative forcing changes due to UN International Ship Emissions regulations (ACRUISE) project aims to address these questions. During three field campaigns, the FAAM Airborne Laboratory's large research aircraft was deployed to target ships in coastal shipping lanes and open waters. The first measurements were performed in July 2019 (before the regulation change) in shipping lanes along the Portuguese coast, the English Channel SECA and the Celtic Sea. Two further campaigns, delayed by the COVID-19 pandemic until September 2021 and April 2022, targeted ships in the Bay of Biscay, the English Channel SECA and the Celtic Sea. Throughout the project, nearly 300 ships were measured during 30 research flights, ranging from plume-aging and cloud-interaction studies, through collecting bulk statistics in busy shipping lanes, to comparing emissions inside and outside the SECA. This work focuses on the gaseous-species measurements (SO2, CO2, CH4 and VOCs from whole-air samples). They are used to study changes in the apparent sulphur fuel content of the ships observed throughout ACRUISE, plume composition, and methane emissions from LNG tankers.

How to cite: Pasternak, D., Lee, J., Nelson, B., Alexiadou, M., Temple, L., Bauguitte, S., Batten, S., Hopkins, J., Andrews, S., Mathews, E., Bannan, T., Wu, H., Thamban, N., Marsden, N., Yang, M.-X., Bell, T., Coe, H., and Bower, K.: Ship emissions and apparent sulphur fuel content measured of board of a large research aircraft in international waters and Sulphur Emission Control Area, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7986, https://doi.org/10.5194/egusphere-egu23-7986, 2023.

EGU23-8329 | ECS | Posters on site | GI3.3 | Highlight

Airborne remote sensing research infrastructure for strengthening science, international collaboration and capacity building in the Arctic 

Shridhar Jawak, Agnar Sivertsen, William D. Harcourt, Rudolf Denkmann, Ilkka Matero, Øystein Godøy, and Heikki Lihavainen

Svalbard Integrated Arctic Earth Observing System (SIOS) is an international collaboration of 28 scientific institutions from 10 countries building a collaborative research infrastructure that will enable better estimates of future environmental and climate change in the Arctic. SIOS' mission is to develop an efficient observing system in Svalbard, share technology and data using FAIR principles, fill knowledge gaps in Earth system science and reduce the environmental footprint of science in the Arctic. This study presents SIOS' efforts to strengthen science, international collaboration and capacity building in the high-Arctic archipelago of Svalbard through its airborne research infrastructure. SIOS supports the coordinated usage of its airborne remote sensing resources, such as the Dornier aircraft and uncrewed aerial vehicles (UAVs), for improved research activities in Svalbard, complementing in situ and space-borne measurements and reducing the environmental footprint of research in Svalbard. Since 2019, SIOS, in collaboration with its member institution the Norwegian Research Centre (NORCE), has installed, tested, and operationalised optical imaging sensors in the Lufttransport Dornier (DO228) passenger aircraft stationed in Longyearbyen, under the SIOS-InfraNor project, making the aircraft compatible with research use in Svalbard. Two optical sensors are installed onboard the Dornier aircraft: (1) the PhaseOne IXU-150 RGB camera and (2) the HySpex VNIR-1800 hyperspectral sensor. The aircraft with these cameras is configured to acquire aerial RGB imagery and hyperspectral remote sensing data in addition to its regular logistics and transport operations in Svalbard. Since 2020, SIOS has supported and coordinated around 50 flight hours to acquire airborne data using the Dornier aircraft and UAVs in Svalbard, supporting around 20 scientific projects.
The use of airborne imaging sensors in these projects has enabled a variety of applications within glaciology, biology, hydrology, and other fields of Earth system science: mapping glacier crevasses, generating DEMs for glaciological applications, and mapping and characterising earth (e.g., minerals, vegetation), ice (e.g., sea ice, icebergs, glaciers and snow cover) and ocean surface features (e.g., colour, chlorophyll). The use of a passenger aircraft offers the following benefits: (1) regular logistics and research activities are optimally coordinated to reduce the flight hours spent carrying out scientific observations; (2) project proposals for the usage of aircraft-based measurements facilitate international collaboration; (3) measurements conducted during 2020-21 help fill the gaps in field-based observations caused by the Covid-19 pandemic; (4) airborne data are used to train polar scientists as part of the annual SIOS training course and the upcoming data usability contest; (5) the data are also useful for Arctic field safety, as they can be used to make products such as high-resolution maps of crevassed areas on glaciers. In short, SIOS airborne remote sensing activities represent an optimized use of infrastructure, promote capacity building and Arctic safety, and facilitate international cooperation.

How to cite: Jawak, S., Sivertsen, A., Harcourt, W. D., Denkmann, R., Matero, I., Godøy, Ø., and Lihavainen, H.: Airborne remote sensing research infrastructure for strengthening science, international collaboration and capacity building in the Arctic, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8329, https://doi.org/10.5194/egusphere-egu23-8329, 2023.

EGU23-11813 | Posters on site | GI3.3

First evaluation of a 6-months Meteodrone campaign 

Maxime Hervo, Julie Pasquier, Lukas Hammerschmidt, Tanja Weusthoff, Martin Fengler, and Alexander Haefele

From December 2021 to May 2022, MeteoSwiss conducted a proof of concept with Meteomatics to demonstrate the capability of drones to provide data of sufficient quality and reliability on a routine operational basis. Meteodrones MM-670 were operated automatically 8 times per night at Payerne, Switzerland. 864 meteorological profiles were measured and compared to co-located measurements, including radiosoundings and remote-sensing instruments. To our knowledge, it is the first time that Meteodrone measurements have been evaluated in such an intensive campaign.

The availability of the Meteodrone measurements over the whole campaign was 75.7%, with 82.2% of the flights reaching the nominal altitude of 2000 m above sea level. Using the radiosondes as a reference, the quality of the Meteodrone measurements can be quantified according to WMO requirements (WMO OSCAR, 2022). Applying this method, the temperature measurements from the Meteodrone can be classified as a “breakthrough”, meaning that they represent a significant improvement if used for high-resolution Numerical Weather Prediction. The Meteodrone's humidity and wind profiles are classified as “useful” for high-resolution numerical weather prediction, suggesting they can be assimilated in numerical models. The quality is similar to that of the temperature measured by a microwave radiometer and the humidity measured by a Raman Lidar. However, the wind measured by a Doppler Lidar was more accurate than the Meteodrone's estimate.

This campaign opens the door for operational usage of automatic drones for meteorological applications.

How to cite: Hervo, M., Pasquier, J., Hammerschmidt, L., Weusthoff, T., Fengler, M., and Haefele, A.: First evaluation of a 6-months Meteodrone campaign, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11813, https://doi.org/10.5194/egusphere-egu23-11813, 2023.

EGU23-13766 | ECS | Posters on site | GI3.3

How inlet tubing material affects the response time of water vapor concentration measurements 

Markus Miltner, Tim Stoltmann, and Erik Kerstel

Measurements involving water in the vapor phase have to deal with the stickiness of the H2O molecule: The associated adsorption and desorption processes can increase the response time of these measurements significantly. To achieve short response times in scientific instrument design, hydrophobic surface materials are used to reduce surface interactions in the tubing that guides the sample towards the analyzer. The study presented here focuses on the effects of the tubing material choice, length, humidity level, gas flow rate, and temperature on the observed response time. We use an Optical Feedback Cavity Enhanced Absorption Spectrometer (OFCEAS) designed for stable water isotope measurements at low water concentration (< 1000 ppm), which we connect to two bottles containing humidified synthetic air of different water concentration using 6.6-m tubing of different materials and surface treatments. Other parameters that are varied are the flow rate and the temperature of the tubing. With proper selection of tubing material and surface treatment, the contribution from the tubing to the overall response time for low water concentration isotopic measurements can be sufficiently suppressed for it to be neglected.
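Response time in step experiments of this kind is typically quantified as the time for the measured concentration to reach a given fraction (e.g. 95%) of the step between the two humidity levels. A minimal sketch on a synthetic exponential response; the 30 s time constant and concentration levels are invented, not results from this study:

```python
import numpy as np

def response_time(t, c, c0, c1, threshold=0.95):
    """Time at which a step response reaches `threshold` of the way
    from the initial level c0 to the final level c1 (linear interpolation)."""
    frac = (c - c0) / (c1 - c0)              # normalised response, 0 -> 1
    return float(np.interp(threshold, frac, t))

# Synthetic step response with an invented 30 s e-folding time constant
tau = 30.0
t = np.linspace(0.0, 300.0, 601)             # time, s
c = 1000.0 - 900.0 * np.exp(-t / tau)        # ppm, rising from 100 to 1000

t95 = response_time(t, c, c0=100.0, c1=1000.0)
```

For a single-exponential response the 95% time is about 3 time constants; real tubing responses often show longer tails from adsorption/desorption, which is exactly what the material comparison probes.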

How to cite: Miltner, M., Stoltmann, T., and Kerstel, E.: How inlet tubing material affects the response time of water vapor concentration measurements, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13766, https://doi.org/10.5194/egusphere-egu23-13766, 2023.

EGU23-14164 | ECS | Posters virtual | GI3.3

Multi-angular airborne thermal observations: A new hyperspectral setup for simulating thermal radiation and emissivity directionality at the satellite scale 

Mary Langsdale, Callum Middleton, Martin Wooster, Mark Grosvenor, and Dirk Schuettemeyer

Land Surface Temperature (LST) is a key parameter to the understanding and modelling of many Earth system processes. Viewing and illumination geometry are known to have significant impacts on remotely sensed retrieval of LST, particularly for heterogeneous regions with mixed components. However, it is difficult to accurately quantify these impacts, in part due to the challenges of retrieving high-quality data for the different components in a scene at a variety of different viewing and illumination geometries over a time period where the real surface temperature and sun-sensor geometries are invariant. Previous field studies have attempted this through observations with aircraft-mounted single-band thermal cameras to further understanding of real-world conditions, but these sensors have limited accuracies and cannot be used to consider the angular variability of emissivity or to simulate multi-band satellite observations.

To redress this, the National Centre for Earth Observation's Airborne Earth Observatory (NAEO) has developed and manufactured a modified mount for its state-of-the-art commercial pushbroom longwave hyperspectral airborne sensor, the Specim AisaOWL (102 narrowband channels across the 7.6 – 12.6 µm region). When mounted in standard mode, the field-of-view of the OWL sensor is 24° (± 12°); however, the modified mount enables off-nadir measurements up to 48°. This makes it possible to evaluate both thermal radiation and spectral emissivity directionality up to and beyond the view angles of most thermal satellite sensors. With LST now classified as an Essential Climate Variable, this work is particularly relevant as it will help to improve the accuracy of retrievals from current and future satellites (e.g. LSTM, SBG, TRISHNA).

In this presentation, we first present an overview of the design modifications that enable these high-angle observations and preliminary results from test flights before detailing how this setup will be used in an upcoming joint ESA-NASA campaign dedicated to quantifying and simulating thermal radiation directionality over agricultural regions at the satellite scale.

How to cite: Langsdale, M., Middleton, C., Wooster, M., Grosvenor, M., and Schuettemeyer, D.: Multi-angular airborne thermal observations: A new hyperspectral setup for simulating thermal radiation and emissivity directionality at the satellite scale, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14164, https://doi.org/10.5194/egusphere-egu23-14164, 2023.

EGU23-14187 | ECS | Posters on site | GI3.3

Aircraft observations of NH3 from agricultural sources 

Lara Noppen, Lieven Clarisse, Frederik Tack, Thomas Ruhtz, Alexis Merlaud, Martin Van Damme, Michel Van Roozendael, Dirk Schuettemeyer, and Pierre Coheur

Ammonia (NH3) is mainly emitted into the atmosphere by anthropogenic activities, especially agriculture. Excess emissions greatly disturb ecosystems, biodiversity, and air quality. Despite our awareness of these deleterious consequences, NH3 concentrations are increasing in most industrialized countries. This underlines the need for more stringent regulations and for good knowledge of the species gained through effective monitoring.

For the past decade, NH3 has been monitored from space, daily and globally, with thermal infrared sounders. However, their coarse spatial resolution (above 10 km) renders accurate quantification of NH3 sources particularly challenging. Indeed, only the largest and most isolated NH3 point sources have been identified and quantified from current observations, and often only by exploiting long-term averages. To address the urgent need for better constraining NH3 emissions, a new satellite, called Nitrosat, has been proposed in response to ESA's 11th Earth Explorer call. The mission aims at simultaneously mapping NO2 and NH3 at a spatial resolution of 500 m on a global scale. With the support of ESA, almost 30 aircraft demonstration flights took place in Europe between 2020 and 2022. These flights mapped, without gaps, areas of at least 10 by 20 km containing various sources of NO2 and NH3 using two instruments: the SWING instrument targeting NO2 in the UV-VIS and the Hyper-Cam LW measuring infrared spectra to observe NH3.

Here we present NH3 observations from campaigns performed in Italy in spring 2022. The Po Valley was the main target, as it is the largest (agricultural) hotspot of NH3 in Europe.  Despite the presence of large background concentrations in the Po Valley, we show that the infrared measurements are able to expose a multitude of local agricultural hotspots such as cattle farms. A particularly successful campaign covering the region from Vetto to Colorno demonstrates measurement sensitivity to the gradual increase of NH3 background concentrations outside and inside the Po Valley. We also discuss flights carried out further south in Italy targeting other emissions of NH3, such as those from a soda ash plant, and the emissions from a fertilizer release experiment that was organized in collaboration with a farmer. We present the measurements both at their native horizontal resolution of 4 m and downsampled at the 500 m resolution of Nitrosat.
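Downsampling the 4 m native maps to Nitrosat's 500 m grid amounts to block-averaging the fine-resolution field; a minimal sketch, where the NH3 column field and hotspot values are toy numbers (a 4 m to 500 m change corresponds to an averaging factor of 125):

```python
import numpy as np

def downsample(field, factor):
    """Block-average a 2-D field by an integer factor
    (e.g. 4 m pixels -> 500 m pixels would use factor 125)."""
    h, w = field.shape
    h2, w2 = h - h % factor, w - w % factor          # trim to a multiple
    blocks = field[:h2, :w2].reshape(h2 // factor, factor,
                                     w2 // factor, factor)
    return blocks.mean(axis=(1, 3))

# Toy 4 m NH3 column field with one strong, hypothetical point source
fine = np.zeros((250, 250))
fine[100:104, 100:104] = 50.0                        # invented hotspot
coarse = downsample(fine, 125)                       # 2 x 2 coarse pixels
```

The sketch illustrates why a compact hotspot that is bright at 4 m becomes a weak enhancement in a 500 m pixel: the signal is diluted by the area ratio.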

How to cite: Noppen, L., Clarisse, L., Tack, F., Ruhtz, T., Merlaud, A., Van Damme, M., Van Roozendael, M., Schuettemeyer, D., and Coheur, P.: Aircraft observations of NH3 from agricultural sources, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14187, https://doi.org/10.5194/egusphere-egu23-14187, 2023.

EGU23-15334 | ECS | Posters on site | GI3.3

Global measurements of cloud properties using commercial aircraft 

Gary Lloyd and Martin Gallagher

In-Service Aircraft for a Global Observing System (IAGOS) is a European research infrastructure that uses commercial aviation to make in-situ measurements of the atmosphere. We present data from the cloud sensing instrument installed on these aircraft between 2011 and 2021. This includes thousands of flights across the globe during which the concentration of cloud particles over the size range 5-75 µm was measured, providing information about the seasonal variation in cloud frequency across different parts of the globe. From these measurements we are able to estimate properties such as the Liquid/Ice Water Content (LWC/IWC), the Effective Diameter (ED) and the Mean Volume Diameter (MVD).
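Bulk properties such as LWC, ED and MVD follow from moments of the measured size distribution. A hedged sketch, using a purely synthetic droplet spectrum over the instrument's 5-75 µm range and the common moment-ratio definition of effective diameter (not necessarily the exact IAGOS processing):

```python
import numpy as np

def bulk_properties(d_um, n_cm3):
    """LWC (g m^-3), effective diameter and median volume diameter (µm)
    from a binned droplet size distribution (diameters in µm, per-bin
    concentrations in cm^-3)."""
    d_cm = d_um * 1e-4                         # µm -> cm
    vol = (np.pi / 6.0) * n_cm3 * d_cm**3      # water volume per bin, cm^3/cm^3
    lwc = vol.sum() * 1e6                      # g m^-3 (water density 1 g cm^-3)
    ed = (n_cm3 * d_um**3).sum() / (n_cm3 * d_um**2).sum()
    cum = np.cumsum(vol) / vol.sum()           # cumulative volume fraction
    mvd = float(np.interp(0.5, cum, d_um))     # diameter splitting the volume
    return lwc, ed, mvd

# Hypothetical bins spanning the instrument's 5-75 µm size range
d = np.linspace(5.0, 75.0, 15)
n = np.exp(-((d - 15.0) / 8.0) ** 2)           # arbitrary droplet spectrum, cm^-3
lwc, ed, mvd = bulk_properties(d, n)
```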

How to cite: Lloyd, G. and Gallagher, M.: Global measurements of cloud properties using commercial aircraft, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15334, https://doi.org/10.5194/egusphere-egu23-15334, 2023.

EGU23-17533 | ECS | Posters on site | GI3.3

Synergy of active and passive airborne observations for the evaluation of the radiative impacts of aerosols. Application to the AEROCLO-SA field campaign in Namibia 

Mégane Ventura, Fabien Waquet, Gerard Brobgniez, Frederic Parol, Marc Mallet, Nicolas Ferlay, Oleg Dubovic, Philippe Goloub, Cyrille Flamant, and Paola Formenti

Aerosols have important effects on both local and global climate, as well as on clouds and precipitation. We present here some original results of the AErosol RadiatiOn and CLOud in Southern Africa (AEROCLO-sA) field campaign conducted in Namibia in August and September 2017. This region shows a strong response to climate change and is associated with large uncertainties in climate models. Large amounts of biomass burning aerosols emitted by vegetation fires in Central Africa are transported far over the Namibian deserts and are also detected over the stratocumulus clouds covering the South Atlantic Ocean along the coast of Namibia. Absorbing aerosols above clouds are associated with a strong positive direct radiative forcing (warming) that is still underestimated in climate models (De Graaf et al., 2021). The absorption of solar radiation by absorbing aerosols above clouds may also cause a warming where the aerosol layer is located. This warming would alter the thermodynamic properties of the atmosphere, which would impact the vertical development of low-level clouds, affecting the cloud top height and its brightness.

The airborne field campaign consisted of ten flights performed with the French Falcon 20 aircraft in this region of interest. Several instruments were involved: the OSIRIS polarimeter, prototype of the next 3MI spaceborne instrument of ESA (Chauvigné et al., 2021), the LNG lidar, an airborne photometer called PLASMA, as well as fluxmeters and dropsondes used to measure thermodynamical quantities, supplemented with in situ measurements of the aerosol particle size distribution.

In order to quantify the aerosol radiative impact on the Namibian regional radiative budget, we use an original approach that combines polarimeter and lidar data to derive the heating rate of the aerosols. This approach is evaluated during massive transports of biomass burning particles. To calculate this parameter, we use a radiative transfer code and additional meteorological parameters provided by the dropsondes. We will focus on the flight of September 8, 2017, during which aerosol pollution was very high. Emissions and dust were carried along the Namibian coast, and an aerosol plume was observed above a stratocumulus deck. We will present vertical profiles of heating rates computed in the solar and thermal parts of the spectrum with this technique. Our results indicate particularly strong heating rate values retrieved above clouds due to aerosols, on the order of 8 K per day, which is likely to perturb the dynamics of the cloud layers below.

In order to validate and quantify this new methodology, we used the flux measurements acquired during loop descents performed during dedicated parts of the flights, which provide unique measurements of the flux distribution (upwelling and downwelling) and of heating rates as a function of altitude.
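A heating-rate retrieval from flux profiles ultimately rests on the flux-divergence relation HR = (1/(ρ c_p)) dF_net/dz, with F_net the net downward flux. A minimal numerical sketch; all profile numbers below are invented for illustration, not campaign data:

```python
import numpy as np

CP = 1004.0  # J kg^-1 K^-1, specific heat of dry air

def heating_rate(z_m, f_down, f_up, rho):
    """Layer-mean heating rate (K/day) from profiles of downwelling and
    upwelling flux (W m^-2); z_m in metres, rho in kg m^-3."""
    f_net = f_down - f_up                        # net downward flux
    dfdz = np.diff(f_net) / np.diff(z_m)         # flux divergence per layer
    rho_mid = 0.5 * (rho[:-1] + rho[1:])
    return dfdz / (rho_mid * CP) * 86400.0       # K/day

# Toy profile: an absorbing aerosol layer between 2 and 4 km
z = np.linspace(0.0, 6000.0, 13)                 # m
rho = 1.2 * np.exp(-z / 8000.0)                  # kg m^-3
f_up = np.full_like(z, 50.0)
f_down = np.where(z > 4000.0, 800.0,
                  np.where(z < 2000.0, 700.0, 700.0 + (z - 2000.0) * 0.05))
hr = heating_rate(z, f_down, f_up, rho)          # heats only inside the layer
```

With these toy numbers the absorbing layer heats at a few K per day, the same order of magnitude as the above-cloud values reported in the abstract.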

Finally, we will discuss the possibility of applying this method to available spaceborne passive and active observations in order to provide the first estimates of heating rate profiles above clouds at the global scale.

How to cite: Ventura, M., Waquet, F., Brobgniez, G., Parol, F., Mallet, M., Ferlay, N., Dubovic, O., Goloub, P., Flamant, C., and Formenti, P.: Synergy of active and passive airborne observations for the evaluation of the radiative impacts of aerosols. Application to the AEROCLO-SA field campaign in Namibia, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-17533, https://doi.org/10.5194/egusphere-egu23-17533, 2023.

EGU23-894 | Orals | ESSI4.2

Remote Sensing based Evapotranspiration Estimation and Sensitivity Analysis 

Mahesh Kumar Jat, Ankan Jana, and Mahender Choudhary

Evapotranspiration (ET) is an important factor in calculating the water loss to the atmosphere and the water demand of crops. Global and regional estimates of daily evapotranspiration are essential for our understanding of the hydrologic cycle. Remote sensing methods have many advantages in estimating daily ET for a large heterogeneous area. In the present study, the sensitivity of ET with respect to different remote sensing-derived variables has been quantified while using the Surface Energy Balance Algorithm for Land (SEBAL) method to estimate daily ET. The sensitivity of SEBAL-based ET has been determined for NDVI, LST, albedo, and SAVI using the Extended Fourier Amplitude Sensitivity Test (eFAST) method. Relative changes in ET estimates for a range of ± 20% in the important parameters, i.e., NDVI, albedo, SAVI, and LST, have been determined and the sensitivity coefficient estimated. Further, the sensitivity of SEBAL-estimated ET has been investigated for different land cover and land use classes, i.e., cropland, barren land, settlement, forest, and sparse vegetation. Results show that ET is significantly sensitive to albedo and LST, although the level of sensitivity differs between LULC classes. For cropland, ET is sensitive to NDVI. The sensitivity coefficient also indicates a significant effect of albedo and LST on the SEBAL-estimated ET. For cropland, a 20% decrease in albedo and LST resulted in a 4.24% and 4.19% reduction in ET, and a 20% increase leads to an increase in ET by 13% and 5.53%, respectively. For sparse vegetation, a 20% reduction in albedo leads to an increase in ET by 7.46%, while a 20% increase in albedo may reduce the ET by 15.70%. SAVI has an inverse relationship with ET for forest, barren land, settlement, and sparse vegetation as compared to other variables. The study concludes that SEBAL-estimated ET is significantly sensitive to albedo and LST. The study helps in understanding the scope of uncertainty in remote sensing-based ET estimation.
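A relative sensitivity coefficient of the kind used in such ±20% perturbation analyses is the fractional change in ET per fractional change in an input parameter. A hedged sketch with an invented toy ET-albedo relation (purely illustrative, not the SEBAL model):

```python
def sensitivity_coefficient(model, p0, delta=0.20):
    """Relative sensitivity coefficient: fractional change in model output
    per fractional change in the input parameter (here +/- `delta`)."""
    base = model(p0)
    s_plus = ((model(p0 * (1 + delta)) - base) / base) / delta
    s_minus = ((model(p0 * (1 - delta)) - base) / base) / (-delta)
    return s_plus, s_minus

# Hypothetical toy model: daily ET decreasing non-linearly with albedo
def toy_et(albedo):
    return 5.0 * (1.0 - albedo) ** 1.5   # mm/day, invented for illustration

s_up, s_down = sensitivity_coefficient(toy_et, p0=0.20)
```

Because the toy relation is non-linear, the +20% and -20% coefficients differ slightly, mirroring the asymmetric percentage changes reported in the abstract.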

How to cite: Jat, M. K., Jana, A., and Choudhary, M.: Remote Sensing based Evapotranspiration Estimation and Sensitivity Analysis, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-894, https://doi.org/10.5194/egusphere-egu23-894, 2023.

EGU23-2551 | ECS | Posters on site | ESSI4.2

Impact of hail events on agriculture: A remote sensing-based analysis of hail damage in the context of climate change 

Vanessa Streifeneder, Daniel Hölbling, and Zahra Dabiri

In the project HAGL (“Impact of hail events on agriculture: A remote sensing-based analysis of hail damage in the context of climate change”), we analyse the effects of hail damage on agriculture. In the context of climate change and the associated increased risk of extreme weather events to society and the economy, this project deals with a locally catastrophic natural hazard that causes high costs, namely hail. Hail, combined with severe storms, causes millions of Euros of damage to agriculture every year. The influence of climate change on local weather patterns (e.g. thunderstorms) is still relatively unexplored, but early evidence points to an increase in weather patterns causing hail and an increase in hailstone sizes. In Austria, especially southeastern Styria with its various crops is frequently affected by extreme hail events. Yield losses due to hail damage can be existence-threatening for farmers, which is why an effective damage assessment is of great interest.

We aim to develop an efficient method to determine the damage to agriculture caused by hail using various remote sensing data. Through a spatial hotspot analysis, we identify regions in southeastern Styria that are particularly affected by hailstorms to test and validate our method. We perform a combined analysis of Sentinel-2 optical and Sentinel-1 synthetic aperture radar (SAR) data using object-based image analysis (OBIA) methods and different vegetation indices derived from the multispectral data as well as radar backscatter signals to detect hail damage. Finally, we aim to create a damage categorisation that could support insurance work in the event of a disaster and make it more efficient by providing a first estimation of the damage before an on-site assessment is conducted. Especially for large agricultural fields, this would save time and resources by making it possible to prioritise areas with high damage and organise the fieldwork of insurance employees accordingly.
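One common way to derive a first damage categorisation from optical data is to bin the pre- to post-event drop in a vegetation index into classes; a hedged sketch with hypothetical NDVI values and thresholds (the project's actual OBIA and SAR-based method is considerably more involved):

```python
import numpy as np

def damage_classes(ndvi_pre, ndvi_post, bins=(0.1, 0.25, 0.4)):
    """Illustrative categorisation: the drop in NDVI between a pre- and
    post-hail acquisition, binned into damage classes 0 (none) .. 3
    (severe). Thresholds are hypothetical, not calibrated values."""
    drop = np.clip(ndvi_pre - ndvi_post, 0.0, None)   # only losses count
    return np.digitize(drop, bins)

# Invented 2 x 2 pixel example
pre = np.array([[0.80, 0.70], [0.75, 0.60]])
post = np.array([[0.78, 0.40], [0.30, 0.55]])
classes = damage_classes(pre, post)
```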

How to cite: Streifeneder, V., Hölbling, D., and Dabiri, Z.: Impact of hail events on agriculture: A remote sensing-based analysis of hail damage in the context of climate change, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2551, https://doi.org/10.5194/egusphere-egu23-2551, 2023.

Climate change has been the dominant factor behind changes in forest phenology in recent decades, while, at the same time, temperature affects development time (Barrett & Brown, 2021; X. Zhou et al., 2020; Suepa et al., 2016). Satellite image time-series data have proven their value for observing forest health and forest phenology. Continuous monitoring of plant phenology is critical for ecosystems at regional and global levels, given the high sensitivity of the vegetation life cycle to climate change; the so-called phenophases are essential biological indicators for understanding how climate change has impacted these ecosystems and how they will change in the ensuing years (Buitenwerf, Rose, and Higgins 2015; Johansson et al. 2015).

This study conducts a time-series analysis using the Breaks For Additive Season and Trend (BFAST) time-series decomposition algorithm to detect possible abrupt changes in forest seasonality and the impacts of extreme climatic events on forest health, examining Sentinel-1 and Sentinel-2 data for the period 2017-2021. The backscatter coefficient was derived from Sentinel-1, while the Normalised Difference Moisture Index (NDMI), Enhanced Vegetation Index (EVI), and Green Chlorophyll Index (GCI) were derived from Sentinel-2, and these variables were assessed to find possible correlations between them. All the satellite time-series data were derived through the Google Earth Engine platform.
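The optical indices named above have standard formulations; a small sketch computing them from Sentinel-2 band reflectances (the band values are illustrative stand-ins for a healthy canopy, not Paphos data):

```python
import numpy as np

def indices(blue, green, red, nir, swir):
    """Standard index formulas on Sentinel-2 surface reflectance
    (roughly B2, B3, B4, B8, B11 as fractional reflectances)."""
    ndmi = (nir - swir) / (nir + swir)                              # moisture
    evi = 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)  # greenness
    gci = nir / green - 1.0                                         # chlorophyll
    return ndmi, evi, gci

# Illustrative reflectances for a healthy forest canopy
ndmi, evi, gci = indices(blue=0.03, green=0.06, red=0.04, nir=0.40, swir=0.18)
```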

The study area is the Paphos Forest, managed by the Department of Forests, which can be described as a representative Mediterranean forest; it is thus vital to monitor it, because Mediterranean forests are expected to be among the first in Europe to experience the effects of climate change. More specifically, the study focuses on the Northwest, West and Southwest areas of the Paphos Forest, since the SAR images are from the ascending orbit. Moreover, the Paphos Forest has unspoiled vegetation, and very few forest wildfires have occurred there in recent years, which favours the reliability of the research's results.

Acknowledgements

The authors acknowledge the 'EXCELSIOR': ERATOSTHENES: Excellence Research Centre for Earth Surveillance and Space-Based Monitoring of the Environment H2020 Widespread Teaming project (www.excelsior2020.eu). The 'EXCELSIOR' project has received funding from the European Union's Horizon 2020 research and innovation programme under Grant Agreement No 857510, from the Government of the Republic of Cyprus through the Directorate General for the European Programmes, Coordination and Development and the Cyprus University of Technology.

How to cite: Theocharidis, C., Gitas, I., Danezis, C., and Hadjimitsis, D.: Satellite times-series analysis and assessment of the BFAST algorithm to detect possible abrupt changes in forest seasonality utilising Sentinel-1 and Sentinel-2 data. Case study: Paphos forest, Cyprus, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2620, https://doi.org/10.5194/egusphere-egu23-2620, 2023.

EGU23-3514 | ECS | Orals | ESSI4.2

England Peat Map: The challenges of using Earth observation data and machine learning approaches at the national scale 

Alex Hamer, Sam Dixon, Christoph Kratz, Craig Dornan, Chris Miller, Michael Prince, Charlie Hart, Tom Hunt, and Andrew Webb

The world’s peatlands are our largest terrestrial carbon store, whilst also providing a sustainable source of drinking water, a haven for wildlife and a record of our past. The England Peat Map aims to provide baseline maps of the extent, depth, and condition of peaty soils in England by 2024. This will enable targeting of future restoration, support nature recovery, and improve greenhouse gas emissions reporting and natural capital accounting.

The maps will be created using a combination of multi-scale Earth observation imagery (satellite and airborne), existing and new ecological field survey data and machine/deep learning. Extent and depth mapping is implemented with random forest models and uses Sentinel satellite imagery and airborne LiDAR in combination with other ancillary datasets (e.g., geology and climate) for prediction. Assessment of peatland condition requires looking at these landscapes in different ways. Land cover mapping is used as a proxy for condition by targeting reflective classes for condition (e.g., Sphagnum, heather, and bare peat). Random forest and convolutional neural network (CNN) models are used in combination with Sentinel satellite imagery, aerial photography, and airborne LiDAR to produce national outputs. Mapping erosion/drainage features (grips, gullies and haggs) across the landscape is essential in understanding the underlying hydrological condition of the peatland and promising results have been achieved using CNNs with LiDAR and aerial photography. The final aspect of assessed condition is the movement of peat, also termed bog breathing, and is measured using Sentinel-1 Interferometric Synthetic Aperture Radar (InSAR). This opportunity is a result of novel in-situ peat movement cameras being installed across pilot sites to provide ground truth data.

The final maps will be released free of charge under an open UK government license, allowing wider application and new opportunities for use compared with currently available datasets. For example, these baseline maps have the potential to contribute towards national peatland monitoring to address further decline of peatland habitats and to target restoration interventions to achieve cost-effective results. Several challenges have arisen during the initial phase of the project, such as the difficulty of licensing suitable training data and of defining what we are mapping when features lack a globally agreed definition (e.g., surface features). The talk will discuss these challenges as well as the future direction of the project and how these challenges can be overcome.
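The extent-mapping step described above (random forest models fed with satellite, LiDAR and ancillary features) can be sketched with scikit-learn; the features and labels below are synthetic stand-ins, not project data, and the feature names are hypothetical:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical per-pixel feature table: e.g. Sentinel-2 index, LiDAR
# elevation, LiDAR slope, and a climate covariate (all synthetic)
n = 500
X = rng.normal(size=(n, 4))

# Synthetic "peat" label loosely tied to low-lying, gently sloping pixels
y = ((X[:, 1] < 0.0) & (X[:, 2] < 0.5)).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
peat_prob = clf.predict_proba(X)[:, 1]   # per-pixel peat probability map
```

In practice the probability surface, rather than a hard class, is what supports downstream choices such as thresholding extent against depth observations.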

How to cite: Hamer, A., Dixon, S., Kratz, C., Dornan, C., Miller, C., Prince, M., Hart, C., Hunt, T., and Webb, A.: England Peat Map: The challenges of using Earth observation data and machine learning approaches at the national scale, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3514, https://doi.org/10.5194/egusphere-egu23-3514, 2023.

EGU23-6396 | Posters on site | ESSI4.2

Evolution of biologically active ultraviolet doses in Cyprus 

Ilias Fountoulakis, Konstantinos Fragkos, Kyriakoula Papachristopoulou, Argyro Nisantzi, Antonis Gkikas, Diofantos Hadjimitsis, and Stelios Kazadzis

Solar ultraviolet (UV) radiation is only a very small fraction of the total solar radiation reaching the Earth's surface. Nevertheless, it is of exceptional significance for life on Earth. In the last two decades, significant trends in biologically effective doses have been reported over many mid-latitude sites, due to changes in total ozone, aerosols, and cloudiness. In the present study, reanalysis and satellite information for aerosols, clouds, and total ozone for the period 2004-2021, from the Copernicus Atmospheric Monitoring Service (CAMS), the MIDAS (ModIs Dust AeroSol) dataset, the Spinning Enhanced Visible and InfraRed Imager (SEVIRI) aboard the Meteosat Second Generation (MSG) satellite, and the Ozone Monitoring Instrument (OMI) aboard the Aura satellite, respectively, are used as inputs to a radiative transfer model, and UV spectra are simulated for the island of Cyprus at fine spatial (0.05° x 0.05°) and temporal (15 min) resolution. Effective doses for the production of vitamin D in the human skin, erythema, and DNA damage are calculated from the simulated spectra. There is also an effort to attribute the changes in the UV biological doses to the corresponding changes in total ozone, aerosols, and cloudiness. The significant role of dust in the changes in UV doses over the island is also discussed.
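Effective doses are obtained by weighting each simulated spectrum with the relevant action spectrum and integrating over wavelength. A minimal sketch for the erythemal case, using the standard CIE reference erythema action spectrum; the surface spectrum below is invented for illustration:

```python
import numpy as np

def cie_erythema(wl_nm):
    """CIE reference erythema action spectrum (McKinlay & Diffey)."""
    wl = np.asarray(wl_nm, dtype=float)
    s = np.where(wl <= 298.0, 1.0,
        np.where(wl <= 328.0, 10.0 ** (0.094 * (298.0 - wl)),
                 10.0 ** (0.015 * (140.0 - wl))))
    return np.where(wl > 400.0, 0.0, s)

# Hypothetical surface spectral irradiance (W m^-2 nm^-1), 0.5 nm grid
wl = np.arange(290.0, 400.5, 0.5)
spec = 1e-3 * (wl - 290.0) ** 1.5 / 400.0

# Erythemally weighted irradiance: trapezoidal integral of spec * action
w = spec * cie_erythema(wl)
erythemal_irr = float(((w[1:] + w[:-1]) / 2 * np.diff(wl)).sum())
```

Integrating the weighted irradiance over the day then gives the daily erythemal dose; the vitamin D and DNA damage doses follow identically with their own action spectra.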

Acknowledgments: The authors acknowledge the ‘EXCELSIOR’: ERATOSTHENES: EΧcellence Research Centre for Earth Surveillance and Space-Based Monitoring of the Environment H2020 Widespread Teaming project (www.excelsior2020.eu). The ‘EXCELSIOR’ project has received funding from the European Union’s Horizon 2020 research and innovation programme under Grant Agreement No 857510, from the Government of the Republic of Cyprus through the Directorate General for the European Programmes, Coordination and Development and the Cyprus University of Technology. The Department of Meteorology of the Republic of Cyprus is acknowledged for providing the ground-based data for the validation of the modelled quantities. 

How to cite: Fountoulakis, I., Fragkos, K., Papachristopoulou, K., Nisantzi, A., Gkikas, A., Hadjimitsis, D., and Kazadzis, S.: Evolution of biologically active ultraviolet doses in Cyprus, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6396, https://doi.org/10.5194/egusphere-egu23-6396, 2023.

EGU23-7269 | Posters on site | ESSI4.2

Modelled-based Photosynthetically Active Radiation climatology for Cyprus: Validation with measurements and trends 

Konstantinos Fragkos, Ilias Fountoulakis, Argyro Nisantzi, Kyriakoula Papachristopoulou, Diofantos Hadjimitsis, and Stelios Kazadzis

The visible part of the surface downward solar radiation (400 – 700 nm), known as Photosynthetically Active Radiation (PAR), is a key parameter for many land process models and terrestrial applications. More specifically, it is a critical ecological factor affecting agricultural productivity, ecosystem-atmosphere energy and CO2 fluxes, canopy architecture in forest ecosystems, and the growth of phytoplankton, among others.

Despite its high importance, PAR measurements are rather scarce and, in contrast with other actinometric quantities (e.g., global horizontal irradiance), no relevant worldwide radiometric networks exist for this quantity. For these reasons, PAR levels are mostly estimated by satellite observations and modeling techniques.

In the current study, we present a 16-year PAR climatology over Cyprus, based on the combined use of radiative transfer (RT) models and satellite imagery. Copernicus Atmospheric Monitoring Service (CAMS) AOD and PWV, SSA and AE from the MACv3 aerosol climatology, and OMI ozone data for the period 2005 – 2021 are used as input to the RT model libRadtran to obtain the clear-sky PAR levels. Subsequently, the CAMS Cloud Modification Factor based on MSG images is used to derive the PAR under all-sky conditions. The derived climatology has a spatial resolution of 0.05 x 0.05 degrees and a temporal resolution of 15 minutes, as constrained by the availability of SEVIRI/MSG images. Finally, the quality of the retrieved climatology is assessed by comparison with ground-based PAR measurements, and with PAR retrievals obtained from GHI measurements through relevant conversion algorithms, from quantum sensors and pyranometers installed at selected stations of the Meteorological Service of Cyprus.
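The all-sky PAR construction described above (integrate clear-sky spectral irradiance over 400-700 nm, then scale by a cloud modification factor) can be sketched as follows; the flat spectrum and the CMF value are illustrative, not CAMS outputs:

```python
import numpy as np

def par_allsky(wl_nm, clear_spec, cmf):
    """PAR (W m^-2): trapezoidal integral of clear-sky spectral irradiance
    over 400-700 nm, scaled by a cloud modification factor."""
    mask = (wl_nm >= 400.0) & (wl_nm <= 700.0)
    w, f = wl_nm[mask], clear_spec[mask]
    par_clear = float(((f[1:] + f[:-1]) / 2 * np.diff(w)).sum())
    return cmf * par_clear

# Illustrative flat clear-sky spectrum of 1.4 W m^-2 nm^-1 on a 1 nm grid
wl = np.arange(300.0, 801.0, 1.0)
spec = np.full_like(wl, 1.4)
par = par_allsky(wl, spec, cmf=0.75)   # hypothetical broken-cloud scene
```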

Acknowledgments: The authors acknowledge the ‘EXCELSIOR’: ERATOSTHENES: EΧcellence Research Centre for Earth Surveillance and Space-Based Monitoring of the Environment H2020 Widespread Teaming project (www.excelsior2020.eu). The ‘EXCELSIOR’ project has received funding from the European Union’s Horizon 2020 research and innovation programme under Grant Agreement No 857510, from the Government of the Republic of Cyprus through the Directorate General for the European Programmes, Coordination and Development and the Cyprus University of Technology. The Department of Meteorology of the Republic of Cyprus is acknowledged for providing ground-based data for validating the modelled quantities.

How to cite: Fragkos, K., Fountoulakis, I., Nisantzi, A., Papachristopoulou, K., Hadjimitsis, D., and Kazadzis, S.: Modelled-based Photosynthetically Active Radiation climatology for Cyprus: Validation with measurements and trends, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7269, https://doi.org/10.5194/egusphere-egu23-7269, 2023.

EGU23-8315 | Orals | ESSI4.2

A coupled GIS-MCDA approach to map the feasibility of Managed Aquifer Recharge 

Anis Chekirbane, Constantinos F. Panagiotou, Aloui Dorsaf, and Stefan Catalin

Managed aquifer recharge (MAR) is a water resource management technique that involves the intentional recharge and storage of water into groundwater systems. MAR is considered an innovative nature-based solution for increasing water availability, improving water quality, and reducing surface water runoff. However, the feasibility of implementing MAR projects depends on several factors, for example recharge water availability, water demand, and the intrinsic site characteristics (e.g., geology, hydrogeology) of the area.

The current study proposes an adapted approach to MAR feasibility mapping through the integration of GIS and multi-criteria decision analysis (GIS-MCDA). The geospatial feasibility of MAR application is evaluated by considering the suitability maps of four thematic layers, namely intrinsic, water availability, non-physical and water demand. The applicability of this approach is demonstrated for the Enfidha plain (NE Tunisia), for which multiple types of spatial and temporal datasets have been collected. The selection of the criteria is done based on literature studies and MAR experts' opinions with respect to their relevance to MAR implementation, whereas the weights are determined using the analytical hierarchy process (AHP). Hence, an intrinsic suitability map was established via the integration of ArcGIS software and MCDA in a web-based platform called INOWAS (https://inowas.com/). The results suggest that more than 80% of the total plain area is considered intrinsically suitable for MAR implementation. The potential MAR feasibility of the demonstration site is expected to be established by overlaying the suitability maps of the remaining three thematic layers.

In addition to standardizing the process of MAR feasibility, the derived maps constitute an asset in the process of planning and implementing effective MAR projects that help to ensure the long-term sustainability of water resources in the Sahel region of Tunisia.

Acknowledgement

This work is funded by National Funding Agencies from Germany (Bundesministerium für Bildung und Forschung – BMBF), Cyprus (Research & Innovation Foundation – RIF), Portugal (Fundação para a Ciência e a Tecnologia – FCT), Spain (Ministerio de Ciencia e Innovación – MCI) and Tunisia (Ministère de l’Enseignement Supérieur et de la Recherche Scientifique – MESRS) under the Partnership for Research and Innovation in the Mediterranean Area (PRIMA). The PRIMA programme is supported under Horizon 2020 by the European Union’s Framework for Research and Innovation.

How to cite: Chekirbane, A., F. Panagiotou, C., Dorsaf, A., and Catalin, S.: A coupled GIS-MCDA approach to map the feasibility of Managed Aquifer Recharge, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8315, https://doi.org/10.5194/egusphere-egu23-8315, 2023.

EGU23-10910 | Posters on site | ESSI4.2

Spatio-temporal prediction of aerosol optical thickness using machine learning and spatial analysis techniques 

Seonghun Pyo, Kwonho Lee, and Seunghan Park

Emission sources, meteorology, and topography are the major factors that make it difficult to predict aerosols in space and time. In this study, Moderate Resolution Imaging Spectroradiometer (MODIS) aerosol optical thickness (AOT) and surface meteorology observed in Korea have been used to predict spatio-temporal AOT using machine learning combined with spatial analysis techniques. This method enables time-series-based prediction and spatial distribution modeling, and allows values to be modeled where there are no observation points. The model achieves a root mean square error (RMSE) of 0.33, smaller than the standard deviation of the observed values (0.43). Using this technique, the future trend of aerosol change was estimated, and it was found that the aerosol in the area of interest decreased by about 7.4%. The methodology will be useful for regional-scale aerosol evaluation, air quality analysis, and climate studies.
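As a minimal illustration of the skill criterion used above (a model is informative when its RMSE is smaller than the standard deviation of the observations), the sketch below fits a simple least-squares model to synthetic stand-ins for AOT and meteorological predictors; it is not the study's machine learning model or its data.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))          # e.g. wind, humidity, temperature (synthetic)
aot = 0.3 + 0.1 * X[:, 0] - 0.05 * X[:, 1] + rng.normal(0, 0.05, 500)

# Ordinary least squares as a minimal stand-in for the ML model.
A = np.column_stack([np.ones(500), X])
coef, *_ = np.linalg.lstsq(A, aot, rcond=None)
pred = A @ coef

rmse = np.sqrt(np.mean((aot - pred) ** 2))
print(rmse, aot.std())                 # a skilful model has RMSE below the std
```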

 

Acknowledgement

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2019R1I1A3A01062804).

How to cite: Pyo, S., Lee, K., and Park, S.: Spatio-temporal prediction of aerosol optical thickness using machine learning and spatial analysis techniques, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10910, https://doi.org/10.5194/egusphere-egu23-10910, 2023.

EGU23-12336 | ECS | Posters on site | ESSI4.2

Assessment of airborne remote sensing data for high-resolution mapping of invasive Prosopis spp. in a semi-arid environment in Kenya 

Ilja Vuorinne, Janne Heiskanen, Ian Ocholla, Rose Kihungu, and Petri Pellikka

Invasive alien plant species are a major global problem threatening biodiversity and livelihoods, and mapping them is needed to understand their distribution dynamics and to facilitate control and eradication measures. Prosopis spp., fast-growing woody species native to South America, have been widely introduced into the tropics to restore degraded areas, but they have spread uncontrollably. For example, in East Africa, Prosopis spp. have invaded rangelands, decreasing plant diversity and affecting the livelihoods of pastoral communities. Remote sensing instruments mounted on aircraft can be used to map such species, and a combination of different sensors in particular holds potential for accurate detection.

The objective of this study was to test how a combination of airborne light detection and ranging (LiDAR), hyperspectral, and fine-resolution multispectral data can be used to map Prosopis spp. in a semi-arid environment in Kenya. The remotely sensed spectral, structural, and textural features were used in one-class machine learning algorithms to detect these species in a complex land cover. The results provide information on the use of different airborne remote sensing instruments, and their combination, in mapping woody alien invasive species, and offer insights into the distribution of Prosopis spp. in the study area.
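A minimal sketch of one-class detection in this spirit: fit a Gaussian envelope to feature vectors of known target pixels and flag anything within a Mahalanobis-distance threshold. The two features and all values are synthetic, and the detector is a simple stand-in, not the algorithm used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic spectral/textural features of known target ("Prosopis") pixels.
target = rng.normal([0.6, 0.3], 0.05, size=(200, 2))

mu = target.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(target, rowvar=False))

def is_target(x, thresh=3.0):
    """Flag a pixel as target if it lies inside the Mahalanobis envelope."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d)) < thresh

print(is_target(np.array([0.6, 0.3])))   # inside the envelope
print(is_target(np.array([0.2, 0.8])))   # far outside the envelope
```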

How to cite: Vuorinne, I., Heiskanen, J., Ocholla, I., Kihungu, R., and Pellikka, P.: Assessment of airborne remote sensing data for high-resolution mapping of invasive Prosopis spp. in a semi-arid environment in Kenya, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12336, https://doi.org/10.5194/egusphere-egu23-12336, 2023.

EGU23-12887 | ECS | Orals | ESSI4.2

Demonstrating the enhanced research capacity of the ERATOSTHENES Centre of Excellence for detecting ground displacements in Cyprus using advanced SAR satellite image processing techniques 

Kyriaki Fotiou, Christos Theocharidis, Maria Prodromou, Stavroula Alatza, Alex Apostolakis, Athanasios V. Argyriou, Thomaida Polydorou, Constantinos Loupasakis, Charalampos Kontoes, Diofantos Hadjimitsis, and Marios Tzouvaras

In the last few years, the consequences of the active landslides that occurred in Cyprus have made it necessary to provide a systematic displacement monitoring system for different areas using satellite-based techniques. Earth Observation, and more specifically satellite remote sensing techniques using Synthetic Aperture Radar (SAR) imagery, is the way forward, exploiting the freely available Copernicus datasets that offer frequent revisit times and large spatial coverage. Moreover, Persistent Scatterer Interferometry (PSI) is among the most effective methods for monitoring ground displacements, such as landslides, and assessing their impact on residential areas. The purpose of this study is to showcase the use of advanced satellite image processing techniques, exploiting SAR satellite images to effectively identify ground displacements in different regions of Cyprus. The enhanced scientific and expertise skills of the ERATOSTHENES Centre of Excellence (ECoE) personnel in the application of PSI were acquired through a capacity building activity carried out by the National Observatory of Athens within the framework of the EXCELSIOR project. The multi-temporal InSAR analysis in Cyprus revealed several deforming sites, which were also confirmed by the responsible national authority, i.e., the Geological Survey Department of the Ministry of Agriculture, Rural Development and Environment. The villages of Pedoulas in Nicosia District and Pyrgos-Parekklisia in Limassol District are indicative deforming areas in Cyprus and were selected as test sites for further investigation. The ongoing implementation of additional InSAR techniques, fusion of remote sensing data and site visits for further validation builds a complete ground deformation monitoring system, aiming to migrate to a national-scale project and serve as a valuable tool for natural hazard monitoring and risk reduction in Cyprus.

 

Acknowledgements 

The authors acknowledge the 'EXCELSIOR': ERATOSTHENES: Excellence Research Centre for Earth Surveillance and Space-Based Monitoring of the Environment H2020 Widespread Teaming project (www.excelsior2020.eu). The 'EXCELSIOR' project has received funding from the European Union's Horizon 2020 research and innovation programme under Grant Agreement No 857510, from the Government of the Republic of Cyprus through the Directorate General for the European Programmes, Coordination and Development and the Cyprus University of Technology. 

How to cite: Fotiou, K., Theocharidis, C., Prodromou, M., Alatza, S., Apostolakis, A., Argyriou, A. V., Polydorou, T., Loupasakis, C., Kontoes, C., Hadjimitsis, D., and Tzouvaras, M.: Demonstrating the enhanced research capacity of the ERATOSTHENES Centre of Excellence for detecting ground displacements in Cyprus using advanced SAR satellite image processing techniques, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12887, https://doi.org/10.5194/egusphere-egu23-12887, 2023.

EGU23-12958 | Posters on site | ESSI4.2

Development of algorithms based on the integration of meteorological data and remote sensing indices for the identification of low-productivity agricultural areas 

Rosa Coluzzi, Francesco Di Paola, Vito Imbrenda, Maria Lanfredi, Letizia Pace, Elisabetta Ricciardelli, Caterina Samela, and Valerio Tramutoli

Agricultural areas of Mediterranean regions host an extraordinary wealth of biodiversity and represent the source of income for a large population, often living below the average economic conditions of the most advanced regions of Europe. In these areas, the semi-arid climate, the impact of climate change, the parcelization of land property, and the poor soils combine to create widespread conditions of low profitability of agricultural areas. This is likely to contribute to the increasing occurrence of land abandonment and to growing hydrogeological risk linked to the lack of land maintenance.

Estimating the productivity of these agricultural areas provides crucial information for detecting hotspots of degradation, helping policy makers take specific actions to increase productivity and reduce migration fluxes.

In this work, realized in the framework of the ODESSA (On DEmand Services for Smart Agriculture) project (financed by the European Regional Development Fund Operational Programme 2014-2020 of the Basilicata Region), the procedure adopted involves the use of climate and vegetation geospatial data, including both direct observational data (temperature, rainfall, etc.) and satellite-derived vegetation indices. For the climatic component, we exploited a database of daily temperature and rainfall data (2000-2021) acquired by the agrometeorological network of ALSIA (Lucana Agency for Development and Innovation in Agriculture) and the CHIRPS (Climate Hazards Group InfraRed Precipitation with Station data) dataset, providing rainfall data (1981-2020) at a spatial resolution of 0.05°, to produce different diagnostic indices able to capture low-productivity areas. We tested this procedure in two districts of Basilicata (Southern Italy): the Vulture-Melfese and the Metapontino, representing the core areas of regional agricultural specialization for vineyards and for intensive fruit and vegetable crops, respectively.
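One simple diagnostic of the kind described above is a standardized rainfall anomaly: the z-score of seasonal totals against the long-term record, which flags persistently dry, potentially low-productivity areas. The series below is synthetic; the study's actual diagnostic indices are not specified here.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic seasonal rainfall totals (mm) over 40 seasons for one pixel.
seasonal_totals = rng.gamma(shape=4.0, scale=100.0, size=40)

# Standardized anomaly: zero mean, unit variance by construction.
z = (seasonal_totals - seasonal_totals.mean()) / seasonal_totals.std()

# Seasons more than one standard deviation below the long-term mean.
dry_seasons = int(np.sum(z < -1.0))
print(round(float(z.mean()), 6), dry_seasons)
```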

 

How to cite: Coluzzi, R., Di Paola, F., Imbrenda, V., Lanfredi, M., Pace, L., Ricciardelli, E., Samela, C., and Tramutoli, V.: Development of algorithms based on the integration of meteorological data and remote sensing indices for the identification of low-productivity agricultural areas, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12958, https://doi.org/10.5194/egusphere-egu23-12958, 2023.

EGU23-13385 | ECS | Orals | ESSI4.2

The synergy of Sentinel missions for fire damage assessment on land surface and atmosphere: the Arakapas village case study 

Maria Prodromou, Rodanthi-Elisavet Mamouri, Argyro Nisantzi, Dragos Ene, Ioannis Gitas, Kyriacos Themistocleous, Chris Danezis, and Diofantos Hadjimitsis

Fire has been a widespread ecological factor since ancient times. It has a negative impact not only on the environment but also on the economy, society and people. A forest fire can lead to a change in the land surface, the destruction of large areas of vegetation and soil erosion. As a result, the economy is negatively affected, the balance of ecosystems is disturbed, and flora, fauna and natural beauty are destroyed. In addition, biomass burning smoke affects air quality due to the large quantities of trace gases and aerosol particles that are emitted, contributing to global climate change and playing a significant role in tropospheric chemistry. A fundamental tool for forest fire management is the science of remote sensing. Remote sensing is commonly used for mapping burnt areas as well as for studying the effects of fire incidents, and this is well supported by the literature at local, regional and global levels. This study is mainly focused on burned area mapping and damage assessment on the land surface and in the atmosphere for the case of the Arakapas fire in Cyprus. For the purposes of this study, satellite images acquired from Sentinel-2 were used for the burnt area mapping and the fire severity estimation based on the dNBR (difference Normalized Burn Ratio) spectral index, and the Corine land cover was used for the assessment of the vegetation type that was disturbed. This event, considered one of the largest in recent years, is explored using data from Sentinel-5P, whose carbon monoxide product is studied in the region affected by the fires. Furthermore, on the morning of the 5th of July, due to a change of wind direction, the smoke travelled from the centre of the island to the southwest, and it was detected by the multiwavelength Raman lidar installed in Limassol. Thus, the optical properties of the smoke plume retrieved from the lidar are presented.
The PollyXT-CYP lidar system of the ECoE observed multiple layers between 500 m and 2.5 km with a depolarization ratio of 5-8% and a lidar ratio of 75 sr for the upper layers. For the purposes of this study, the image processing was performed using custom scripts on the GEE (Google Earth Engine) platform with the JavaScript programming interface. The area affected by the fire was calculated to be ~40 km2. The spatial distribution map of the dNBR was classified according to the USGS fire severity levels, where high dNBR values indicate a more severe fire, and values near zero or negative values indicate unburned areas and/or decreased vegetation after the fire.
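The dNBR workflow can be sketched as follows: NBR = (NIR − SWIR)/(NIR + SWIR) from pre- and post-fire images, dNBR = NBR_pre − NBR_post, then classification with the commonly used USGS severity thresholds. The reflectance values below are synthetic, not Arakapas data, and the thresholds are the standard published breakpoints, not values tuned in this study.

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio from NIR and SWIR reflectance."""
    return (nir - swir) / (nir + swir)

# Two synthetic pixels: one burned, one unaffected.
pre_nir, pre_swir = np.array([0.45, 0.50]), np.array([0.20, 0.22])
post_nir, post_swir = np.array([0.18, 0.40]), np.array([0.30, 0.20])

dnbr = nbr(pre_nir, pre_swir) - nbr(post_nir, post_swir)

def severity(d):
    """Commonly used USGS dNBR severity breakpoints."""
    if d < 0.10:
        return "unburned/regrowth"
    if d < 0.27:
        return "low"
    if d < 0.44:
        return "moderate-low"
    if d < 0.66:
        return "moderate-high"
    return "high"

classes = [severity(d) for d in dnbr]
print(classes)
```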

 

Acknowledgements

The authors acknowledge the 'EXCELSIOR': ERATOSTHENES: Excellence Research Centre for Earth Surveillance and Space-Based Monitoring of the Environment H2020 Widespread Teaming project (www.excelsior2020.eu). The 'EXCELSIOR' project has received funding from the European Union's Horizon 2020 research and innovation programme under Grant Agreement No 857510, from the Government of the Republic of Cyprus through the Directorate General for the European Programmes, Coordination and Development and the Cyprus University of Technology.

How to cite: Prodromou, M., Mamouri, R.-E., Nisantzi, A., Ene, D., Gitas, I., Themistocleous, K., Danezis, C., and Hadjimitsis, D.: The synergy of Sentinel missions for fire damage assessment on land surface and atmosphere: the Arakapas village case study, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13385, https://doi.org/10.5194/egusphere-egu23-13385, 2023.

EGU23-14742 | ECS | Orals | ESSI4.2

Comparing reflectivity measurements between satellite- and ground-based radar observations: A case study for precipitation and drought monitoring in Cyprus 

Eleni Loulli, Johannes Bühl, Silas Michaelides, Athanasios Loukas, and Diofantos Hadjimitsis

Drought is a multidimensional phenomenon that is imperceptible at its early stages; it evolves slowly and cumulatively and results in adverse consequences, for example the depletion of water volumes in rivers and reservoirs and the decrease of carbon uptake in vegetation. Cyprus is characterized by semi-arid to arid climate conditions, experiencing extensive droughts that have a negative impact on the ecosystem, the economy and agricultural production.

Existing research on drought events in Cyprus is limited to the use of in-situ data, mainly temperature and precipitation measurements at meteorological stations. Polarimetric weather radars can offer more detailed information on precipitation phenomena, especially in remote areas of interest or areas with a sparse network of meteorological stations.

This study compares reflectivity measurements from the two ground-based X-band dual polarization radars of the Department of Meteorology of the Republic of Cyprus with measurements obtained from NASA’s Global Precipitation Measurement (GPM) mission.

The DPR (Dual-frequency Precipitation Radar) aboard GPM is employed to derive the radar reflectivity factor at a spatial resolution of 5-25 km over a 120 km wide swath. The ground-based radars have been operating since 2017. They scan in PPI mode at eight (8) constant elevation angles, with an azimuth resolution of 0.1° and a scan radius of 150 km. The radar stations are located in Rizoelia, Larnaca district, and Nata, Paphos district, providing full coverage of the island.

Satellite-based radar reflectivity values are used to adjust the ground-based radar measurements. Consequently, the adjusted values of the ground-based radar reflectivity are used as input to modelling expressions for estimating the ground-based radar precipitation.
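The "modelling expressions" for radar precipitation are typically Z-R power laws; a common textbook example is the Marshall-Palmer relation Z = 200 R^1.6. The sketch below inverts such a relation to turn adjusted reflectivity (in dBZ) into a rain-rate estimate; the coefficients are the textbook defaults, not values calibrated in this study.

```python
def rain_rate(dbz, a=200.0, b=1.6):
    """Invert a Z-R power law Z = a * R**b to estimate rain rate (mm/h)."""
    z = 10.0 ** (dbz / 10.0)      # dBZ -> linear reflectivity (mm^6/m^3)
    return (z / a) ** (1.0 / b)

# A 30 dBZ echo corresponds to moderate rain under Marshall-Palmer.
print(round(rain_rate(30.0), 2))
```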

To ensure that the observations are spatially coincident, we have developed a collocated grid, hereafter called the universal grid, on which both the ground- and satellite-based radar observations are interpolated at the same locations. The universal grid is a three-dimensional (3D) grid with a cell size of approximately 2500 m along both horizontal directions, whereas the vertical resolution is set equal to the height resolution of GPM, i.e. 150 m. Regarding temporal resolution, GPM overpasses Cyprus approximately once a week. For the purposes of this study, we selected overflights after the beginning of the ground-based radar operation that coincide with precipitation events.
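A minimal sketch of the collocation idea: snap ground- and satellite-radar samples onto the common 3D grid (2500 m horizontal, 150 m vertical spacing) and compare reflectivity in co-occupied cells. Coordinates and reflectivity values below are synthetic, and a single sample per cell stands in for the interpolation step.

```python
DX, DZ = 2500.0, 150.0   # universal grid spacing: horizontal (m), vertical (m)

def to_cell(x, y, z):
    """Map a Cartesian sample location onto a universal-grid cell index."""
    return (int(x // DX), int(y // DX), int(z // DZ))

# One synthetic sample per sensor, landing in the same grid cell.
ground = {to_cell(1200, 4000, 700): 31.0}   # ground-based reflectivity (dBZ)
sat = {to_cell(900, 3100, 740): 28.5}       # satellite reflectivity (dBZ)

# Compare reflectivity only where both sensors populate the same cell.
common = set(ground) & set(sat)
bias = [ground[c] - sat[c] for c in common]
print(common, bias)
```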

Additionally, statistical analysis of the reflectivity measurements has been conducted to understand the relationship between the ground-based and the satellite-based datasets and identify spatio-temporal patterns of precipitation.

Acknowledgements:

The authors acknowledge the ‘EXCELSIOR’: ERATOSTHENES: EΧcellence Research Centre for Earth Surveillance and Space-Based Monitoring of the Environment H2020 Widespread Teaming project (www.excelsior2020.eu). The ‘EXCELSIOR’ project has received funding from the European Union’s Horizon 2020 research and innovation programme under Grant Agreement No 857510, from the Government of the Republic of Cyprus through the Directorate General for the European Programmes, Coordination and Development and the Cyprus University of Technology.

The authors acknowledge also the Department of Meteorology of the Republic of Cyprus for the provision of the X-band radar data.

How to cite: Loulli, E., Bühl, J., Michaelides, S., Loukas, A., and Hadjimitsis, D.: Comparing reflectivity measurements between satellite- and ground-based radar observations: A case study for precipitation and drought monitoring in Cyprus, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14742, https://doi.org/10.5194/egusphere-egu23-14742, 2023.

EGU23-15837 | ECS | Posters on site | ESSI4.2

Exploring the benefits of building a data cube towards the efficient risk monitoring and assessment of cultural heritage assets 

Georgios Leventis, Georgios Melillos, Athanasios Argyrioy, Ioannis Varvaris, Zampella Pittaki, Kyriacos Themistocleous, and Diofantos Hadjimitsis

The Eastern Mediterranean, Middle East, and North Africa (EMMENA) region encompasses three continents (Europe, Asia, and Africa). The region is not only strategically vital for political and military forces, but it is also archaeologically and culturally significant owing to its large cultural wealth, having been an important crossroads for various civilizations in archaic times [1]. However, the cultural assets of the region are often susceptible to risks associated either with nature (land deformation, earthquakes, etc.) or with human activity (looting, war atrocities, etc.).

To protect cultural heritage in uncertain crisis scenarios, it is critical to recognize any risk situation early and to support decision-makers and cultural stakeholders with timely, accurate and relevant information, while at the same time raising public awareness of important issues pertaining to cultural destruction, alteration and/or looting. To respond properly and in due time to any threats, the ERATOSTHENES Centre of Excellence, through its two departments of Big Earth Data Analytics and Cultural Heritage, showcases in the current work its efforts in building and exploiting a cultural data cube based and building upon the open-source Open Data Cube project [2]. Taking advantage of this endeavour, the Centre's researchers are able to store, extract and analyse geospatial and satellite data which, thanks to their cube-shaped organization, can be accessed quickly, thus providing a better understanding of any critical risk situations that might affect cultural assets. As the scale and pattern of occurrence fluctuate with the type of disaster, and the extent of damage may vary from time to time depending on regional features and on the timing of the incident(s) and of the response, the proposed work encapsulates various forms of data acquired throughout an entire risk scenario (before, during and after the event), to ensure the best possible assessment of any ongoing risk(s).
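A toy illustration of why the cube layout speeds access: once observations are arranged as a (time, y, x) array, a time series at any location or a full scene at any date is a cheap index rather than a per-file search. The ECoE cube builds on the Open Data Cube software; the raw numpy array below is only a conceptual stand-in.

```python
import numpy as np

# Synthetic cube: 4 acquisition dates over a 3x3 pixel site.
cube = np.arange(4 * 3 * 3, dtype=float).reshape(4, 3, 3)

series_at_site = cube[:, 1, 2]   # full time series for one pixel
scene_at_date = cube[2]          # full spatial scene for one date
print(series_at_site.shape, scene_at_date.shape)
```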

Since damaged cultural assets cannot be restored to their former condition, it is crucial to preserve them as much as possible and to increase the resilience of cultural properties by reducing the harm brought on by disaster scenarios. Building on geospatial advances, this work aspires to become a common ground and a valuable tool for efficient incident management within the EMMENA region, starting from the field of Cultural Heritage and extending to others (e.g., marine security, agriculture, water resources management).

 

Acknowledgements

The authors acknowledge the ‘EXCELSIOR’: ERATOSTHENES: EΧcellence Research Centre for Earth Surveillance and Space-Based Monitoring of the Environment H2020 Widespread Teaming project (www.excelsior2020.eu). The ‘EXCELSIOR’ project has received funding from the European Union’s Horizon 2020 research and innovation programme under Grant Agreement No 857510, from the Government of the Republic of Cyprus through the Directorate General for the European Programmes, Coordination and Development and the Cyprus University of Technology.

 

References

[1] - Longuet, R.: Encyclopaedia of the History of Science, Technology, and Medicine in Non-Western Cultures. Springer, Netherlands, Dordrecht (2008)

[2] – Open Data Cube, open-source project, https://www.opendatacube.org/about. Last accessed on 8/01/2023.

How to cite: Leventis, G., Melillos, G., Argyrioy, A., Varvaris, I., Pittaki, Z., Themistocleous, K., and Hadjimitsis, D.: Exploring the benefits of building a data cube towards the efficient risk monitoring and assessment of cultural heritage assets, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15837, https://doi.org/10.5194/egusphere-egu23-15837, 2023.

EGU23-16436 | Orals | ESSI4.2

Monitoring natural and geo- hazards at cultural heritage sites using Earth observation: the case study of Choirokoitia, Cyprus 

Kyriacos Themistocleous, Kyriaki Fotiou, and Marios Tzouvaras

Monitoring natural hazards due to climate change at cultural heritage sites facilitates the early recognition of potential risks and enables effective conservation monitoring and planning. Landslides, earthquakes, rock falls, ground subsidence and erosion are the predominant natural hazards in Cyprus, which pose serious threats to cultural heritage sites as well as potential danger to visitors. To identify and monitor natural hazards and environmental displacements, Earth observation techniques, such as SAR, can be used in combination with in-situ methods.

The EXCELSIOR H2020 Widespread Teaming project (Grant Agreement No 857510) and the Horizon Europe TRIQUETRA project (Grant Agreement No 101094818) will study the use of Earth observation techniques for examining cultural heritage sites. The TRIQUETRA project will examine Choirokoitia, Cyprus as a pilot project using these techniques. Choirokoitia is a UNESCO World Heritage Site and one of the best-preserved Neolithic sites in the Mediterranean. The project will examine the potential risk of rockfall at the Choirokoitia site, as the topography of the site is vulnerable to movements resulting from extreme climate change as well as from daily/seasonal stressing actions. Rockfall poses a significant danger to visitor safety and can damage cultural heritage sites.

In addition, the Choirokoitia site will be used to detect and analyse natural-hazard-induced ground deformation based on InSAR ground motion data and field survey techniques for cultural heritage applications. InSAR data, satellite positioning and conventional surveying techniques will be employed to measure micromovements, while other techniques such as UAVs and photogrammetry will be used for documentation purposes and 3D modelling comparisons. In order to identify and monitor natural hazards and their severity, a permanent GNSS station and a corner reflector will be used, together with analysis of multitemporal SAR satellite data, to estimate the rate of land movement. SAR monitoring provides the opportunity to identify deformation phenomena resulting from natural hazards, and to measure and document the extent of change caused by natural and/or geo-hazards using remote sensing techniques. PSI (Persistent Scatterer Interferometry) analysis can be used in the wider area to determine potential displacements.

The study is expected to lead towards the systematic monitoring of geohazards, and more specifically those of ground deformation and rock falls to facilitate the early recognition of potential risks and enable effective conservation monitoring and planning. The methodology can be used to monitor cultural heritage sites worldwide which are vulnerable to natural hazards.

How to cite: Themistocleous, K., Fotiou, K., and Tzouvaras, M.: Monitoring natural and geo- hazards at cultural heritage sites using Earth observation: the case study of Choirokoitia, Cyprus, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16436, https://doi.org/10.5194/egusphere-egu23-16436, 2023.

EGU23-16794 | Orals | ESSI4.2

Evaluating the influence of human induced landscape alterations on ecosystem services in semiarid regions of India. 

Vinay Shivamurthy, Thokala Manoj, Kamera Arun Kumar, Maddela Harsha Vardhan, Sangannagari Lavanya, and Sankalamaddi Manasa

In the Anthropocene, with human-centric planning, landscapes are continually altered, endangering the existence of biota, triggering climate change, and affecting the ecosystem services provided by regional landscapes. In special cases, however, meticulous planning and prioritized alteration of the landscape has helped improve the regional economy and the services it provides. In the current communication we spatially evaluate the influence of irrigation projects on lentic and agrarian ecosystems at the regional scale. Karimnagar, located in Telangana State, India, together with a 25 km buffer from the city centre, was analyzed. Lying in the semiarid zone, Karimnagar experiences severe temperatures during summer. Spatiotemporal variation in the Zaid cropping pattern and in water bodies was studied using Landsat-series satellite data for the past five decades, i.e., between 1973 and 2022. Index-based methods such as the Normalized Difference Vegetation Index and the modified Normalized Difference Water Index were used, followed by segmentation, to determine the areas under Zaid cropping and the extent of water. During 1973 the area under Zaid crops was as low as 231 km2, with water bodies covering about 1.7 km2. With the commissioning of Lower Manair in 1985, regions downstream of the reservoir showed large-scale improvement: lakes were rejuvenated and the area under Zaid cropping improved significantly. The area under Zaid agriculture increased fourfold, to over 1000 km2, and water bodies increased to 53 km2. More recently, Mid Manair was commissioned in 2018, after which the area under water has increased to 113 km2 and the area under Zaid cropping to 1569 km2. Following the Lower Manair and Mid Manair projects, most of the lentic ecosystems in the study area have become perennial, catering to agrarian, domestic and environmental needs.
The agricultural ecosystem service value in the study area, particularly due to Zaid cropping, has increased from 34 million US$ in 1973 to ~128 million US$ after the commissioning of Lower Manair, and to 235 million US$ by 2022; likewise, the ecosystem service value of lentic ecosystems has increased from 0.59 million US$ in 1973 to 39.57 million US$ in 2022. The results indicate that with sensible planning and development, both society and the regional environment benefit mutually, ensuring superior wellbeing.
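The index-based segmentation described above can be sketched as follows: NDVI = (NIR − Red)/(NIR + Red) flags cropped pixels and a modified NDWI = (Green − SWIR)/(Green + SWIR) flags open water, each followed by a simple threshold. The band values and thresholds below are illustrative, not those used in the study.

```python
import numpy as np

# Three synthetic pixels: dense crop, sparse vegetation, open water.
nir = np.array([0.50, 0.30, 0.05])
red = np.array([0.10, 0.25, 0.04])
green = np.array([0.08, 0.10, 0.12])
swir = np.array([0.20, 0.22, 0.02])

ndvi = (nir - red) / (nir + red)
mndwi = (green - swir) / (green + swir)

crop = ndvi > 0.4     # likely Zaid cropping
water = mndwi > 0.3   # likely open water
print(crop.tolist(), water.tolist())
```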

How to cite: Shivamurthy, V., Manoj, T., Arun Kumar, K., Harsha Vardhan, M., Lavanya, S., and Manasa, S.: Evaluating the influence of human induced landscape alterations on ecosystem services in semiarid regions of India., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16794, https://doi.org/10.5194/egusphere-egu23-16794, 2023.

EGU23-16909 | ECS | Posters virtual | ESSI4.2

Sea Surface Temperature and Ocean Wind Speed Data in the Cyprus region from Sentinel-3 using the Sentinel Application Platform (SNAP) and ArcGIS Pro. 

Eleftheria Kalogirou, George Melillos, Diofantos Hadjimitsis, and Despoina Makri

The ability to measure sea surface temperature (SST) allows us to observe the global system and quantify ongoing weather and climate change. Several industries are particularly affected by increased SST: the shipping industry, the offshore oil and gas industry, the fishing industry, etc. Knowledge of ocean wind behaviour will enable ship masters to choose routes that avoid heavy seas or high headwinds that may slow a ship's travel, increase fuel consumption, or possibly cause damage to vessels and loss of life. This paper aims to derive sea surface temperature and ocean wind speed data for the Cyprus region. A comparison of results obtained using the Sentinel Application Platform (SNAP) and ArcGIS Pro shows that both tools can be used to derive sea surface temperature and ocean wind speed data, and both give satisfactory results.

Keywords: Sea surface temperature, Ocean Wind Speed Data, Sentinel-3, SNAP, ArcGIS Pro.

How to cite: Kalogirou, E., Melillos, G., Hadjimitsis, D., and Makri, D.: Sea Surface Temperature and Ocean Wind Speed Data in the Cyprus region from Sentinel-3 using Sentinel Application Platform (SNAP) and Arc GIS Pro., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16909, https://doi.org/10.5194/egusphere-egu23-16909, 2023.

EGU23-17144 | Orals | ESSI4.2

Comparison of three algorithms for tree crown area and available pruning biomass monitoring 

Sofia Fidani, Ioannis Maroufidis, Stavros Chlorokostas, Ioannis N. Daliakopoulos, Dimitrios Papadimitriou, Ioannis Louloudakis, Georgios Daskalakis, Betty Charalambopoulou, and Thrassyvoulos Manios

Fast and rigorous assessment of tree characteristics from earth observation products has many environmental applications, including monitoring of the canopy biomass available for pruning and utilisation as a soil amendment or energy source. Here we explore the efficiency of three supervised classification algorithms in assessing the canopy area of olive trees, the staple food crop of the Mediterranean, which annually produces an estimated 2.82 t ha-1 of residual biomass (Velázquez-Martí et al., 2011) that is currently largely unexploited and often an environmental hazard due to on-site fires. The algorithms include (a) a thresholding algorithm (Daliakopoulos et al., 2009) processing Normalized Difference Vegetation Index (NDVI) values, (b) a supervised machine learning algorithm comprising an Artificial Neural Network (ANN) with 4 hidden layers, and (c) the AdaBoost supervised deep learning algorithm. Following Yang et al. (2009), the latter two methods use image colour, texture, and entropy as inputs. Ground truth was developed by manually producing a binary mask in which pixels depicting tree crown were marked 1 and all others 0, and classification results were evaluated using the Dice similarity coefficient (DSC; Nisio et al., 2020). The three algorithms were tested on assessing olive tree crown projected surface area on a WorldView II image of 0.5 × 0.5 m resolution covering a rural area of Heraklion, Crete, Greece, acquired on November 10, 2020. Masking was performed in 42 olive tree plots containing a total of 1,080 olive trees, with on-site visual validation of the masking results. Results show that the ANN performed better than AdaBoost and NDVI thresholding, scoring 81.98%, compared to 75.06% and 70.03%, respectively.
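The Dice similarity coefficient used to evaluate the crown masks is DSC = 2|A ∩ B| / (|A| + |B|) on the binary masks. A minimal sketch on toy masks (the tiny 3×3 masks are illustrative only):

```python
import numpy as np

def dice(pred, truth):
    """Dice similarity coefficient between two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum())

truth = np.array([[1, 1, 0],
                  [0, 1, 0],
                  [0, 0, 0]])
pred = np.array([[1, 1, 0],
                 [0, 0, 1],
                 [0, 0, 0]])

print(dice(pred, truth))   # 2*2 / (3+3) = 0.666...
```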
The trained ANN is currently used to provide olive tree canopy estimates, which serve as input for assessing the canopy biomass available for pruning in the CompOlive system, an online platform that facilitates matchmaking of olive tree farms, olive mills, and mobile composting equipment to optimise on-farm compost production and utilisation.
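
The evaluation metric named above, the Dice similarity coefficient between a classifier's binary crown mask and the manually drawn ground truth, can be sketched as follows (a minimal illustration with toy 4×4 masks, not the authors' evaluation code):

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks (1 = tree crown)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * intersection / total

# Toy 4x4 masks: classifier output vs. manually drawn ground truth
pred = np.array([[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 1, 0]])
truth = np.array([[1, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 0], [0, 0, 1, 1]])
print(round(dice_coefficient(pred, truth), 3))  # -> 0.8
```

A DSC of 1 means perfect overlap and 0 means none, so the reported 81.98% corresponds to a substantial agreement between the ANN mask and the manual mask.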

Acknowledgements

This research is co-financed by the European Union and Greek national funds through the Operational Program CRETE 2014-2020, under Project “CompOlive: Integrated System for the Exploitation of Olive Cultivation Byproducts Soil Amendments” (KPHP3-0028773).

References

Daliakopoulos, I. N., Grillakis, E. G., Koutroulis, A. G., & Tsanis, I. K. (2009). Tree Crown Detection on Multispectral VHR Satellite Imagery. Photogrammetric Engineering and Remote Sensing, 75(10), 1201–1211. https://doi.org/10.14358/PERS.75.10.1201

Velázquez-Martí, B., Fernández-González, E., López-Cortés, I., & Salazar-Hernández, D. M. (2011). Quantification of the residual biomass obtained from pruning of trees in Mediterranean olive groves. Biomass and Bioenergy, 35(7), 3208–3217. https://doi.org/10.1016/J.BIOMBIOE.2011.04.042

Yang, L., Wu, X., Praun, E., & Ma, X. (2009). Tree detection from aerial imagery. GIS: Proceedings of the ACM International Symposium on Advances in Geographic Information Systems, 131–137. https://doi.org/10.1145/1653771.1653792

How to cite: Fidani, S., Maroufidis, I., Chlorokostas, S., Daliakopoulos, I. N., Papadimitriou, D., Louloudakis, I., Daskalakis, G., Charalambopoulou, B., and Manios, T.: Comparison of three algorithms for tree crown area and available pruning biomass monitoring, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-17144, https://doi.org/10.5194/egusphere-egu23-17144, 2023.

EGU23-969 | Posters on site | PS5.3

Lunar Mission Planning and Exploration using NASA’s Moon Trek Portal 

Emily Law and Brian Day and the Solar System Treks

NASA’s Moon Trek (https://trek.nasa.gov/moon/) is one of a growing number of interactive, browser-based, online portals for planetary data visualization and analysis produced by NASA’s Solar System Treks Project (SSTP). Moon Trek continues to be enhanced with new data and new capabilities enabling it to facilitate the planning and conducting of upcoming lunar missions by NASA, its commercial partners, and its international partners, as well as scientific research.

Moon Trek’s innovative visualization and analysis tools are already being used by a growing number of missions and scientists around the world. The tools deployed include interactive 2D and 3D visualization, a DEM and ortho-mosaic image production pipeline, as well as tools for distance measurement, elevation profile generation, solar altitude and azimuth calculation, 3D print file generation, virtual reality visualization generation, lighting analysis, electrostatic surface potential analysis, slope analysis, rock detection, crater detection, rockfall detection, and profiling of raster data.

Moon Trek has added a new set of visualization and analysis tools, including line-of-sight analysis (facilitating communications planning and detailed studies of solar illumination), traverse path planning, and a 3D traverse path visualization tool, among others. This presentation will highlight Moon Trek’s latest tools and demonstrate their usage for lunar mission planning and exploration in this exciting Artemis era.

How to cite: Law, E. and Day, B. and the Solar System Treks: Lunar Mission Planning and Exploration using NASA’s Moon Trek Portal, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-969, https://doi.org/10.5194/egusphere-egu23-969, 2023.

This work studies the remanent magnetization under a weak and a strong magnetic anomaly in Tranquillitatis and in Oceanus Procellarum, respectively, which show similar surface ages of 3.6 Ga and 3.3 Ga. A 3D amplitude inversion is used to reconstruct the distribution of magnetization underground. Since there is no globally measured surface magnetic field for the Moon, a crustal magnetic anomaly model with a grid resolution of 0.2° is used. The depth to the bottom of the magnetic source is fixed by the boundary identified by a relative criterion, namely 20% of the recovered maximum magnetization. The results show that the two anomalies have different depths to the bottom and different volumes of magnetic sources. The depth to the bottom of the magnetic carriers, which is possibly the Curie depth, is about 30 km and 50 km under Oceanus Procellarum and Tranquillitatis, respectively. The volumes of the two magnetic sources are at the scale of 10^4 and 10^5 km^3, respectively. The Bouguer gravity anomalies, with spherical harmonics reaching degree 1200 in the two studied regions, are also checked. The results support magma intrusions containing different abundances of metallic iron as the most probable origins of the magnetic sources in the studied regions. Moreover, the thermal states of the lunar crust under the two studied maria were probably different during the acquisition of remanent magnetization.
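
The relative criterion described above, taking the bottom of the magnetic source as the deepest level where the recovered magnetization still exceeds 20% of the model maximum, can be sketched as follows (a toy example with hypothetical numbers, not the authors' inversion output):

```python
import numpy as np

def depth_to_bottom(magnetization: np.ndarray, depths: np.ndarray,
                    frac: float = 0.2) -> float:
    """Depth to the bottom of the magnetic source, taken as the deepest
    layer whose maximum recovered magnetization exceeds `frac` of the
    model's overall maximum (the relative criterion described above)."""
    threshold = frac * magnetization.max()
    # collapse each depth layer to its maximum magnetization
    layer_max = magnetization.reshape(len(depths), -1).max(axis=1)
    significant = np.nonzero(layer_max >= threshold)[0]
    return depths[significant[-1]]

# Hypothetical recovered model: magnetization (A/m) on a depth x lat x lon grid
depths = np.array([10.0, 20.0, 30.0, 40.0, 50.0])  # km
mag = np.zeros((5, 4, 4))
mag[0:3] = 1.0   # strong magnetization down to 30 km
mag[3] = 0.1     # below 20% of the maximum -> not counted as source
print(depth_to_bottom(mag, depths))  # -> 30.0
```

In the real workflow the same thresholding would be applied to the 3D magnetization distribution recovered by the amplitude inversion, yielding the ~30 km and ~50 km bottoms quoted above.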

How to cite: Wang, H. and Yao, S.: Depths to the Bottom and Volumes of Magnetic Sources under a Weak and a Strong Lunar Magnetic Anomaly Revealed by 3D Inversion, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3164, https://doi.org/10.5194/egusphere-egu23-3164, 2023.

The planetary magnetic field, generated by convective currents in the core and linking thermal evolution to interior structure, is a fundamental probe of angular momentum exchange and secular variation in core motions and the core-mantle coupling system. However, understanding the rheological fluid flows in planetary cores at high temperature and pressure (e.g., ~5000 °C, 135-330 gigapascals) is a tremendous interdisciplinary challenge. Investigating their fine structure requires understanding the fundamental fluid dynamics of turbulence and rotation across coupled hydrodynamic, dynamo, and kinetic scales, well beyond the reach of traditional partial differential equation simulations.

The Moon is believed not to currently possess even a feeble global magnetic field, which can therefore be ignored when exploring how solar-flare and CME-induced solar storms affect the lunar surface. The prevailing hypothesis holds that the crustal magnetizations were acquired early in lunar history, when a dynamo was still operating. At that time, the dynamo magnetic fields were generated by thermochemical convection of an electrically conductive liquid metal alloy within the lunar core and waned as the core cooled. Turbulent mechanical stirring of the rheological core fluid, together with perturbations from tidal effects and orbital precession, may have contributed to sustaining the dynamo fields.

With the supporting observations from China’s lunar and deep space exploration missions in recent years, it has become possible to re-estimate the past magnetic field by combining the tidal-heating dissipation from viscous friction associated with differential precession at different angles with dynamo action (the non-ideal plasma; inner core, outer core, and mantle; warm dense matter; liquid iron alloy; chemical-geological properties; density-temperature-pressure).

In this work, based on a newly developed optimization methodology and numerical algorithm, the relativistic hybrid particle-in-cell and lattice Boltzmann code (RHPIC-LBM version 1.1.2), we establish a 3D lunar magnetic field model combining a thermally driven rheological dynamo with tidal heating of the lunar core and investigate the history of magnetic field evolution. We also quantify the effect of tidal heating in the deepest lunar mantle, offering a possible and unprecedented window on this intermediate state of rheological matter and a new virtual testing ground for dense rheological plasma theories.

How to cite: Zhu, B.: Exploration of Lunar magnetic fields with dynamo thermally and tidal heating-driven rheology model, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3206, https://doi.org/10.5194/egusphere-egu23-3206, 2023.

EGU23-3992 | Posters on site | PS5.3

Lunar secondary crater distributions and ejecta fragment size velocity distributions: implications for regolith redistribution 

Kelsi Singer, Helle Skjetne, Julie Stopar, Mikayla Huffman, Clark Chapman, Lillian Ostrach, Brad Jolliff, and William McKinnon

We have performed an extensive study of secondary craters associated with specific primary craters on the Moon.  These data can be used to understand aspects of both (1) the secondary craters themselves and (2) the ejecta fragments that formed them.  Studying ejecta and secondary craters is part of understanding the overall contribution of impacts to shaping and redistributing material across the lunar surface. 

We produced secondary crater size-range distributions for a large range of primary crater sizes (~0.8-660 km diameter primaries).  Our results can be used to make a map of estimated maximum secondary crater sizes across the Moon.  They can also be used to test whether a specific secondary crater cluster is likely related to a given primary crater.   

We also produced ejecta fragment size-velocity distributions for all our study sites.  These results can be used to understand the size and velocity of the ejecta fragments that were ejected as part of the primary impact.  This helps us understand the dynamics of the primary impact and the formation of fragments (or clusters of fragments), and how they are ejected during the passage of the shock wave through a planetary surface.  These new empirical data can be used to help constrain analytical and numerical models of dynamic fragmentation, place constraints on the largest ejecta fragments expected to be ejected at escape velocity from the Moon, and serve as inputs into models of regolith development and impact gardening. 

We will present the most current results on the above topics.  Initial results for 6 primary craters are presented in Singer et al. (2020), where we discovered a previously unrecognized trend: the size-velocity distributions depend on the size of the impact (i.e., they are scale dependent).  We now have data on 10 additional primaries and further applications of the study. 

Singer, K. N., Jolliff, B. L., & McKinnon, W. B. (2020). Lunar secondary craters and estimated ejecta block sizes reveal a scale-dependent fragmentation trend. J. Geophys. Res., 125(8), e2019JE006313. doi:10.1029/2019JE006313

How to cite: Singer, K., Skjetne, H., Stopar, J., Huffman, M., Chapman, C., Ostrach, L., Jolliff, B., and McKinnon, W.: Lunar secondary crater distributions and ejecta fragment size velocity distributions: implications for regolith redistribution, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3992, https://doi.org/10.5194/egusphere-egu23-3992, 2023.

EGU23-5040 | Orals | PS5.3

The ESA/DLR LUNA Habitat as geophysical experimentation facility 

Martin Knapmeyer, Brigitte Knapmeyer-Endrun, Michael Maibaum, Jens Biele, Cinzia Fantinati, Oliver Küchemann, Stephan Ulamec, and Jean-Pierre de Vera

Recently, NASA’s InSight mission has shown the value of geophysical landers by greatly increasing our knowledge of the interior of Mars. Correspondingly, geophysical experiments are also of great relevance to lunar exploration: a number of geophysical experiments were proposed in response to ESA’s 2020 call for ideas for the scientific utilization of the large logistics lander (Argonaut). Geophysical payloads are already planned for the Moon; e.g., the Farside Seismic Suite will land a broad-band seismometer in 2025. We here present how the LUNA Habitat training facility under construction in Cologne, Germany, can contribute to the development and testing of lunar geophysical instrumentation.

Most of the LUNA Habitat’s roughly 700 square meters will be covered by 60 cm of EAC-1 regolith simulant. On an area of 140 square meters, the regolith depth increases to 3 m along a sloping bottom (25° and 40°). This part of LUNA provides an invisible but explorable underground structure suitable for seismic profiling, ground penetrating radar, geoelectrics, geomagnetics and other techniques, as well as sufficient depth for drilling, subsurface sampling, and deployment of heat flow probes. Sculpting craters and even caves in the regolith, as well as cooling small portions of it, is envisioned. Support by the facility will include personnel with experience in geophysical measurements and data analysis, an end-to-end operational environment including a remote control center with standard communication technology, and, last but not least, training of astronauts in cooperation with robotic units to operate the equipment in lunar surface suits and under gravity offloading.

A four-element, Y-shaped array of short period seismometers, based on the layout of the Apollo 17 seismic experiment, will be deployed on the LUNA construction site before erecting the building to record seismic noise sources (car traffic on the DLR campus, the ENVIHAB short arm centrifuge, wind tunnel discharges, air traffic at the nearby CGN international airport, etc.). It will also allow for ambient noise analysis aimed at the underground structure, which is expected to consist of Rhine sediments. An active refraction seismic experiment and the deployment of 12 nodal sensors will further aid in site characterization. LUNA will have a concrete floor of up to 60 cm thickness, but with a structured underside for static reasons. The array will be re-deployed on the concrete once the hall is erected to characterize to what extent the new high-velocity layer hides the underlying sediments from seismic observation. After completion of LUNA, the effect of the regolith cover on seismic recordings will be characterized by a third array deployment. Documentation of construction details, especially steel reinforcement in the concrete, is foreseen.  A broad-band seismometer will be installed in the LUNA Habitat permanently, once construction is finished, to support the identification of artificial noise sources and local seismicity in the recordings of customer instruments, and to monitor possible changes in the background, e.g. due to new buildings or other large-scale research facilities on the DLR campus.

How to cite: Knapmeyer, M., Knapmeyer-Endrun, B., Maibaum, M., Biele, J., Fantinati, C., Küchemann, O., Ulamec, S., and de Vera, J.-P.: The ESA/DLR LUNA Habitat as geophysical experimentation facility, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-5040, https://doi.org/10.5194/egusphere-egu23-5040, 2023.

EGU23-5929 | ECS | Orals | PS5.3

Mafic Mineral Anomaly in the Ohm Crater of the Moon 

Shreekumari Patel, Animireddi V Satyakumar, Paras Solanki, and Mohamed R El-maarry

The 64 km wide Ohm crater is a complex impact crater located on the northern part of the lunar farside. In this study, we generated abundance maps for FeO and TiO2 as well as spectral parameter maps to determine the composition. According to spectral analyses of M3 data, two mafic minerals, orthopyroxene (Opx) and clinopyroxene (Cpx), are present in the Ohm crater. A geostatistical technique is used to optimize the variation trend of diagnostic characteristics across different sites. We noticed that Cpx dominates the western portion of Ohm, while Opx dominates the rest of the crater. Opx denotes sources from above and/or below the crust-mantle boundary, whereas Cpx suggests impact melt crystallization of an anorthositic target crust. NASA’s GRAIL mission, which was specifically designed to study gravity anomalies, has found negative anomalies near the Ohm crater that may indicate a thicker crust beneath it. Uneven Bouguer gravity anomalies and negative anomalies have been found in the vicinity of the Ohm crater, but they are not clearly connected to the internal morphology or to surface morphological features. In addition, the Bouguer gravity signature may be affected by pre-existing subsurface density structure and post-impact events (such as magmatism), which could account for some of the observed scatter. The regional gravity anomaly also shows low values in the Ohm crater, suggesting that the thicker crust and the source of the geochemical anomalies lie at deeper levels. Strong negative anomalies are seen in the predicted residual gravity data close to the Ohm crater, suggesting low-density bodies at the crustal level. Based on the regional and residual gravity anomalies, the compositional and mineralogical features of the Ohm crater, and the geophysical data, we propose that the pyroxenes are the end product of impact melt crystallization. 
When analyzing the distribution of mafic minerals throughout the crater, ejecta from the SPA, Orientale, and Hertzsprung (mascon) basins, which may or may not have differed from the impact melt formed during the Ohm impact event, should also be considered. The GRAIL crustal thickness model-1 for the Ohm crater indicates a thicker crust, demonstrating that mantle uplift is not the underlying cause of the geochemical anomalies in this area.

Acknowledgement: S. M. Patel and M. R. El-maarry acknowledge support for this work through an internal grant (8474000336-KU-SPSC).

How to cite: Patel, S., Satyakumar, A. V., Solanki, P., and El-maarry, M. R.: Mafic Mineral Anomaly in the Ohm Crater of the Moon, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-5929, https://doi.org/10.5194/egusphere-egu23-5929, 2023.

Because the Moon is much less flattened than the Earth, most lunar GIS applications use a spherical datum. However, nowadays, with the renaissance of lunar missions approaching, it seems worthwhile to define an ellipsoid of revolution that better fits the lunar gravity potential surface. The main long-term benefit of this might be to make the lunar adaptation of methods already implemented in terrestrial GNSS, gravimetry and GPS applications easier and somewhat more accurate.

In our work, we used a degree and order 660 potential surface, the GRGM 1200A Lunar Geoid, developed in the frame of the GRAIL project. Samples were taken from the potential surface along a mesh representing equal-area pieces of the surface. The point grid was provided by a relatively simple Fibonacci sphere. We tried Fibonacci spheres with 100, 1000, 3000, 5000, 10000 and 100000 points and, for a given number of points, also separately examined the effect of rotating the network in longitude on the estimated parameters, but these differences were only noticeable for the lower resolution networks.
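
The equal-area sampling described above can be sketched with the standard golden-angle Fibonacci lattice (a minimal illustration of the construction, not the authors' implementation):

```python
import numpy as np

def fibonacci_sphere(n: int) -> np.ndarray:
    """Return n roughly equal-area points (lat, lon in degrees) on a sphere
    using the Fibonacci / golden-angle lattice."""
    golden_angle = np.pi * (3.0 - np.sqrt(5.0))      # ~137.5 degrees
    i = np.arange(n)
    z = 1.0 - 2.0 * (i + 0.5) / n                    # uniform in z -> equal-area bands
    lat = np.degrees(np.arcsin(z))
    lon = np.degrees((golden_angle * i) % (2.0 * np.pi)) - 180.0
    return np.column_stack([lat, lon])

pts = fibonacci_sphere(3000)
print(pts.shape)  # -> (3000, 2)
```

Because the points are uniform in z, each one represents an approximately equal patch of surface area, so the geoid samples are not over-weighted toward the poles as they would be on a regular latitude-longitude grid.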

We estimated the semi-major axis and flattening of the ellipsoid of revolution that best fits the selenoid undulation values at the network points, obtaining a = 1,737,576.6 m and f = 0.000305. This parameter pair is already reached for a 10,000-point grid, while reducing the equidistant grid to 3,000 points does not change the axis data by more than 10 centimetres. As expected, the absolute values of the selenoid undulations decreased compared to the values taken with respect to the spherical reference surface, with maxima exceeding +400 m still found for Mare Serenitatis and Mare Imbrium, and the largest negative values for South Pole-Aitken and Mare Orientale.
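
The parameter estimation can be sketched as a linear least-squares fit, under the small-flattening approximation r ≈ a (1 − f sin²φ) for an ellipsoid of revolution (a simplified sketch with synthetic radii; the actual fit was performed on the GRGM 1200A undulation samples):

```python
import numpy as np

def fit_ellipsoid_of_revolution(lat_deg, radii):
    """Least-squares fit of semi-major axis a and flattening f to radius
    samples, using the small-flattening model r ~ a * (1 - f * sin(lat)^2)."""
    s2 = np.sin(np.radians(lat_deg)) ** 2
    # r = a - (a*f)*s2  -> linear in the parameters [a, a*f]
    A = np.column_stack([np.ones_like(s2), -s2])
    (a, af), *_ = np.linalg.lstsq(A, radii, rcond=None)
    return a, af / a

# Synthetic self-check with the lunar-like values quoted above
a_true, f_true = 1_737_576.6, 0.000305
lat = np.linspace(-90.0, 90.0, 3000)
r = a_true * (1.0 - f_true * np.sin(np.radians(lat)) ** 2)
a_est, f_est = fit_ellipsoid_of_revolution(lat, r)
print(a_est, f_est)  # recovers a_true and f_true to rounding
```

On real samples the right-hand side would be the selenoid radius at each Fibonacci grid point, and the residuals of this fit are the undulations with respect to the estimated ellipsoid.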

Supported by the ÚNKP-22-6 New National Excellence Program of the Ministry for Culture and Innovation from the source of the National Research, Development and Innovation Fund.

How to cite: Cziráki, K. and Timár, G.: Estimation of the parameters of a lunar ellipsoid of revolution based on GRAIL selenoid data and Fibonacci mesh, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7979, https://doi.org/10.5194/egusphere-egu23-7979, 2023.

EGU23-8116 | Orals | PS5.3

Tungsten isotopes and the early evolution of the Moon 

Thomas Kruijer, Gregory Archer, and Thorsten Kleine

Key events in the early history of the Moon include its formation by a giant impact, the solidification of the lunar magma ocean, and late accretion. The 182Hf-182W system (t1/2 ~9 Ma) constitutes a versatile tool to study each of these processes because they can all result in measurable 182W variations. Here we review the 182W record of lunar rocks and highlight key constraints on the early evolution of the Moon. Tungsten isotope studies on lunar samples demonstrate that there are no resolvable 182W variations within the Moon, implying that lunar magma ocean differentiation occurred later than ~70 Ma after Solar System formation. Nevertheless, the Moon is characterized by a uniform ~25 parts-per-million 182W excess over the present-day bulk silicate Earth (BSE). One possibility is that this 182W difference is radiogenic in origin, in which case the Hf-W system can potentially be used to date the formation of the Moon. However, this interpretation is problematic for two reasons. First, mixing processes during the giant impact very likely modified the 182W composition of the Moon and led to distinct initial 182W compositions of the Moon and Earth. Second, the pre-late accretion 182W compositions of the Moon and BSE overlap within uncertainty, and hence there is no resolved radiogenic 182W difference between the BSE and the Moon. Consequently, the Hf-W system does not provide reliable constraints on the age of the Moon. Instead, the Hf-W systematics are fully consistent with 'young' ages of the Moon, well after the effective lifetime of 182Hf.
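
The half-life arithmetic behind the "later than ~70 Ma" argument is simple: with t1/2 ~9 Ma, 182Hf is effectively extinct after a few tens of Ma, so differentiation after that time can no longer produce resolvable 182W variations (an illustrative calculation, not from the abstract):

```python
def fraction_remaining(t_ma: float, half_life_ma: float = 9.0) -> float:
    """Fraction of an initial 182Hf inventory surviving after t_ma Myr."""
    return 0.5 ** (t_ma / half_life_ma)

# ~70 Ma is nearly eight half-lives: well under 1% of the 182Hf survives
print(fraction_remaining(70.0) < 0.01)  # -> True
```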

How to cite: Kruijer, T., Archer, G., and Kleine, T.: Tungsten isotopes and the early evolution of the Moon, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8116, https://doi.org/10.5194/egusphere-egu23-8116, 2023.

EGU23-10526 | ECS | Orals | PS5.3

Lunar Vertex: A PRISM Science Investigation of the Reiner Gamma Lunar Magnetic Anomaly and Swirl 

Sarah Vines, George Ho, David Blewett, Jasper Halekas, Benjamin Greenhagen, Brian Anderson, Dany Waller, Jörg-Micha Jahn, Peter Kollmann, Brett Denevi, Heather Meyer, Rachel Klima, Joshua Cahill, Lon Hood, Sonia Tikoo, Xiao-Duan Zou, Mark Weiczorek, Myriam Lemelin, Shahab Fatemi, and Edward Cloutis

Lunar Vertex is a mission at the intersection of multiple science communities, from planetary geology to space plasma physics. As the first Payloads and Research Investigations on the Surface of the Moon (PRISM1) investigation, scheduled for delivery to the Reiner Gamma (RG) magnetic anomaly in 2024 aboard a commercial lunar lander, Lunar Vertex will unravel the nature of the RG anomaly, the connection to and origin of the associated lunar swirl surface feature, and the structure and impact of the “mini-magnetosphere” in this region. Lunar Vertex includes a suite of magnetometers (Vector Magnetometer – Lander; VML), a fixed-mounted set of cameras (Vertex Camera Array; VCA), and a low-energy ion and electron plasma analyzer (Magnetic Anomaly Plasma Spectrometer; MAPS) on the lander. In addition, a second suite of commercial fluxgate magnetometers (Vector Magnetometer – Rover; VMR) and a multispectral imager (Rover Multispectral Microscope; RMM) are mounted on a dedicated rover that will traverse a distance of at least 500 m from the lander, providing additional multi-point measurements. The combination of magnetic field measurements taken during cruise and descent by VML and during surface operations by both VML and VMR will characterize the surface magnetic field within a strong lunar magnetic anomaly. The combined magnetic field and plasma measurements from VML and MAPS will provide direct observations of plasma populations reaching the lunar surface and the associated local magnetic field configuration. Furthermore, the lunar regolith within the RG magnetic anomaly and over different regions of the associated lunar swirl will be characterized by RMM and VCA to reveal the surface texture, composition, and particle distribution around both the lander and rover locations and the correspondence to potential surface weathering processes.

How to cite: Vines, S., Ho, G., Blewett, D., Halekas, J., Greenhagen, B., Anderson, B., Waller, D., Jahn, J.-M., Kollmann, P., Denevi, B., Meyer, H., Klima, R., Cahill, J., Hood, L., Tikoo, S., Zou, X.-D., Weiczorek, M., Lemelin, M., Fatemi, S., and Cloutis, E.: Lunar Vertex: A PRISM Science Investigation of the Reiner Gamma Lunar Magnetic Anomaly and Swirl, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10526, https://doi.org/10.5194/egusphere-egu23-10526, 2023.

EGU23-10737 | ECS | Orals | PS5.3

An Ultra-Wideband Spectrometer for Lunar Heat-Flow Measurements 

Mehmet Ogut, Shannon Brown, Alan Tanner, Sidharth Misra, Chris Ruf, Chi-Chih Chen, and Matthew Siegler

The lunar heat-flow ultra-wideband spectrometer operates over an extended frequency band from 300 MHz to 6.0 GHz. It is a direct-acquisition, single-chain digital spectrometer measuring 1024 spectral channels over 6 GHz of bandwidth, with each channel about 6 MHz wide. The LHR instrument is intended to characterize the near-surface regolith thermal and dielectric properties in order to determine the local geothermal heat flux. It would also reveal subsurface thermal and dielectric property changes due to buried ice, dielectric materials like ilmenite, and bedrock. The wide spectral bandwidth is expected to provide brightness temperature measurements from penetration depths as shallow as 5 cm at the higher-frequency end of the spectrum down to about 1 m. Using the information obtained at multiple frequency bands, the subsurface temperatures and dielectric properties can be reconstructed.
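
A trivial check of the channel-width figure quoted above, assuming the 1024 channels evenly span the 6 GHz of digitized bandwidth:

```python
# Channelization arithmetic for the spectrometer described above
n_channels = 1024
total_bw_hz = 6.0e9
channel_bw_mhz = total_bw_hz / n_channels / 1e6
print(round(channel_bw_mhz, 2))  # -> 5.86, i.e. "about 6 MHz" per channel
```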

The instrument is currently being developed at the Jet Propulsion Laboratory in Pasadena, CA. The design includes a novel receiver architecture that allows a single-chain design for ultra-wideband channelized spectral operation, enabling the science objectives of the instrument. A lab-bench demonstration of the lunar spectro-radiometer has been performed, including calibration testing. Environmental testing will be conducted before proceeding with the flight model. The final flight version of the spectro-radiometer is expected to be lightweight, low-power, and compact, suitable for deployment on a lunar rover or lander.

How to cite: Ogut, M., Brown, S., Tanner, A., Misra, S., Ruf, C., Chen, C.-C., and Siegler, M.: An Ultra-Wideband Spectrometer for Lunar Heat-Flow Measurements, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10737, https://doi.org/10.5194/egusphere-egu23-10737, 2023.

EGU23-10755 | Posters on site | PS5.3

Radio Instrument Package for Lunar Ionospheric Observation: A Concept Study 

Christopher Watson, Thayyil Jayachandran, Anton Kascheyev, David Themens, Richard Langley, Richard Marchand, and Andrew Yau

The lunar ionosphere is a ~100 km thick layer of electrically charged plasma surrounding the Moon.  Despite knowledge of its existence for decades, the structure and dynamics of the lunar plasma remain a mystery due to the lack of consistent observational capacity. An enhanced observational picture of the lunar ionosphere and an improved understanding of its formation/loss mechanisms are critical for understanding the lunar environment as a whole and for assessing potential safety and economic hazards associated with lunar exploration and habitation. To address the high-priority need for observations of the electrically charged constituents near the lunar surface, we introduce a concept study for the Radio Instrument Package for Lunar Ionospheric Observation (RIPLIO). RIPLIO would consist of a multi-CubeSat constellation (at least two satellites) in lunar orbit for the purpose of conducting “crosslink” radio occultation measurements of the lunar ionosphere, with at least one satellite carrying a very high frequency (VHF) transmitter broadcasting at multiple frequencies, and at least one satellite flying a broadband receiver to monitor the transmitting satellites. Radio occultations intermittently occur when satellite-to-satellite signals cross through the lunar ionosphere, and the resulting phase perturbations of the VHF signals may be analyzed to infer the ionospheric electron content and high-resolution vertical electron density profiles. As demonstrated in this study, RIPLIO would provide a novel means for lunar observation, with the potential to provide long-term, high-resolution observations of the lunar ionosphere with unprecedented pan-lunar detail.
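
The inference of electron content from dual-frequency phase perturbations rests on standard ionospheric dispersion: the excess phase path scales as 40.3·TEC/f², so differencing two coherent frequencies isolates the TEC. A minimal sketch with illustrative frequencies and a lunar-scale TEC (not RIPLIO's actual retrieval chain):

```python
# Standard ionospheric coefficient: excess phase path = 40.3 * TEC / f^2
# (path in metres, TEC in electrons/m^2, f in Hz)
IONO_K = 40.3

def tec_from_phase_paths(dL1_m: float, dL2_m: float,
                         f1_hz: float, f2_hz: float) -> float:
    """TEC along the ray from the excess phase paths at two frequencies
    (f1 < f2, so dL1 > dL2)."""
    return (dL1_m - dL2_m) / (IONO_K * (1.0 / f1_hz**2 - 1.0 / f2_hz**2))

# Synthetic self-check: a thin lunar ionosphere of ~5e13 electrons/m^2
tec_true = 5e13
f1, f2 = 150e6, 400e6                 # illustrative VHF/UHF pair
dL1 = IONO_K * tec_true / f1**2       # ~9 cm of excess phase path
dL2 = IONO_K * tec_true / f2**2       # ~1 cm
print(tec_from_phase_paths(dL1, dL2, f1, f2))  # recovers tec_true to rounding
```

The centimetre-level differential phase paths in this example show why coherent multi-frequency transmission is needed: the lunar ionosphere is orders of magnitude more tenuous than Earth's.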

How to cite: Watson, C., Jayachandran, T., Kascheyev, A., Themens, D., Langley, R., Marchand, R., and Yau, A.: Radio Instrument Package for Lunar Ionospheric Observation: A Concept Study, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10755, https://doi.org/10.5194/egusphere-egu23-10755, 2023.

Boulders are a major surface feature on solid planets and small bodies, including asteroids and comets. Interest in these clasts ranges from applications relevant for landing site selection and the geomechanical characterization of the soil on which they rest [1] to measurements of their size-frequency distributions [2], which are relevant for understanding their formation and erosion processes. On the Moon, boulders are generally found in association with craters, hilltops, rilles, and other steep relief forms. The two main mechanisms of boulder formation are bedrock fragmentation and excavation by impacts, and the progressive exposure of pre-existing blocks and fractured bedrock by removal of regolith from steep reliefs through diffusive creep.

An important issue is the set of transport processes that can move the stones across the surface of their parent bodies. On the Moon, one group of boulders, frequently called “rolling stones”, have left tracks on the surface that can cover large distances. Two mechanisms, meteoritic impacts and moonquakes [3], have mainly been cited in the literature as drivers of boulder displacements. Much less attention has been given to the hypothesis that other processes, such as thermal solar-induced rock breakdown [4], could deliver the initial momentum that initiates the movement of metastable rocks.

From an AI-based mapping of the distribution of boulders with tracks on the lunar surface [5] we know that the majority of these boulders are found, not surprisingly, within craters. However, as the AI-based procedure strongly underestimated the number of boulders with tracks, we have conducted a new investigation to map these boulders. Such a mapping, however, is only one prerequisite for understanding whether a thermally induced breakdown could be responsible for the initial triggering of boulder movements. Boulders moving down slopes disturb the mature regolith and bring fresh lunar soil to the surface. This process should be remotely detectable through the stronger spectral features of the fresher, optically immature regolith. The number of non-decayed boulders along crater walls should therefore be correlated with the strength of the absorption bands in spectra taken from those crater walls. The spectral characteristics of the refreshed crater walls are measurable through various quantities in the VIS-NIR (e.g. color ratios).

To start addressing the question to what extent a solar-induced breakdown can trigger rock movements, we have chosen lunar craters for which we have generated new boulder maps. For these craters we determine spectral characteristics and mineralogical composition based on a nonlinear spectral mixing model using M3 hyperspectral imager data from Chandrayaan-1. We are reporting the first results of spectral feature mapping for these craters and discuss the mineralogical interpretation, as well as the existence of a correlation between the number of observable boulders inside craters and identified spectral features of the regolith.

References:

[1] Filice, A., 1967, Science, 1967-06-16 156(3781): 1486-1487. [2] Ruesch, O. et al., 2022 Icarus, 387, 115200. [3] Kumar, S. et al., 2016, J. Geophys. Res. Planets, 121, 147– 179. [4] Molaro, J.L. et al., 2017, Icarus, 294, 247-261. [5] Bickel, V.T. et al., 2020, Nat Commun 11, 2862.

How to cite: Mall, U. and Surkov, Y.: Are day-night heating cycles a trigger for launching the “stones” on tour?, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10887, https://doi.org/10.5194/egusphere-egu23-10887, 2023.

EGU23-11166 | ECS | Orals | PS5.3

Polar Ice Accumulation from Volcanically Induced Transient Atmospheres on the Moon 

Andrew Wilcoski, Paul Hayne, and Margaret Landis

Over the last few decades, observations have revealed the presence of water ice at the lunar poles and upended the notion of a completely dry lunar surface. These ice deposits hold information about the history of water on the Moon and in the Earth-Moon system, and are potential resources for future human exploration of the Moon. However, they remain relatively uncharacterized in terms of abundance, distribution, and composition. Foremost among the open questions about lunar ice are: What were the sources of ice on the Moon’s surface, and how much water could have been delivered? The three most likely sources of lunar water ice are: (1) impact delivery from asteroids and/or comets, (2) solar wind ion implantation, and (3) volcanic outgassing of volatiles from the lunar interior. Here, we assess the viability of a volcanic source for water ice accumulated at the lunar poles.

[1] first suggested the occurrence of a volcanically induced transient atmosphere on the ancient Moon that would have been dominated by CO, but with a significant amount of H2O. Further studies investigated the dynamics [2] and atmospheric escape processes [3] that would have affected such an atmosphere. [4] later suggested that a large number (30,000-100,000) of eruptions would have created less massive atmospheres during the Moon’s most volcanically active period (4-2 Ga).

We model the generation of transient atmospheres from 50,000 eruptions from 4 to 2 Ga, the subsequent escape of these atmospheres to space, and the concurrent accumulation of atmospheric water vapor as ice at the lunar poles [5]. The molecular composition of the modeled atmospheres is determined using estimates of outgassed volatile content for lunar volcanic eruptions derived from analyses of Apollo samples [4,6]. We model three atmospheric escape processes: (1) Jeans escape, (2) sputtering escape, and (3) photodissociative escape [3], and model photodissociative escape separately for both CO and H2O. We use maximum annual surface temperatures [7] measured by the Diviner Lunar Radiometer Experiment on board the Lunar Reconnaissance Orbiter [8] to calculate ice accumulation rates for each Diviner pixel within 30° latitude of the poles [5].
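As an illustration of the first escape process named above, the classical Jeans escape flux can be sketched as follows. This is a generic textbook formula, evaluated at the lunar surface radius for simplicity; it is not the authors' model, and the input values (exobase density, temperature) are placeholders:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.381e-23      # Boltzmann constant, J/K
M_MOON = 7.342e22    # lunar mass, kg
R_MOON = 1.737e6     # lunar radius, m
AMU = 1.661e-27      # atomic mass unit, kg

def jeans_escape_flux(n_exo, mass_amu, temp):
    """Classical Jeans escape flux (molecules m^-2 s^-1).
    n_exo: exobase number density (m^-3); mass_amu: molecular mass (amu);
    temp: exobase temperature (K). Exobase taken at the surface radius."""
    m = mass_amu * AMU
    lam = G * M_MOON * m / (K_B * temp * R_MOON)   # Jeans escape parameter
    v_th = math.sqrt(2.0 * K_B * temp / m)         # most probable thermal speed
    return n_exo * v_th / (2.0 * math.sqrt(math.pi)) * (1.0 + lam) * math.exp(-lam)
```

The exponential dependence on the escape parameter is why heavy molecules such as H2O are lost thermally far more slowly than light species, which is part of what makes polar cold trapping competitive with escape.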

We find that water vapor is removed from a typical transient atmosphere in about 50 years via ice accumulation and photodissociative escape. About 41% of the total water vapor mass outgassed from 4 to 2 Ga is accumulated as ice on the surface. This demonstrates that a significant amount of ice (~8×10^15 kg) could have been sourced from volcanic outgassing, though atmospheric escape processes also strongly control the efficacy of this mechanism.

 

[1] Needham, D. H. and Kring, D. A. (2017) EPSL, 478, 175-178. [2] Aleinov, I., et al. (2019) GRL, 46, 5107-5116. [3] Tucker, O. J., et al. (2021) Icarus, 359, 114304. [4] Head, J. W., et al. (2020) GRL, 47, e2020GL089509. [5] Wilcoski, A. X., et al. (2022) PSJ, 3(5), 99. [6] Rutherford, M. J., et al. (2017) Amer. Mineralogist, 102, 2045-2053. [7] Landis, M. E., et al. (2022) PSJ, 3(2), 39. [8] Paige, D. A., et al. (2010) Space Sci. Rev., 150, 125-160.

How to cite: Wilcoski, A., Hayne, P., and Landis, M.: Polar Ice Accumulation from Volcanically Induced Transient Atmospheres on the Moon, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11166, https://doi.org/10.5194/egusphere-egu23-11166, 2023.

EGU23-13387 | ECS | Orals | PS5.3

Instrumentation for laser ablation ionisation mass spectrometry on the lunar surface 

Peter Keresztes Schmidt, Matthias Blaukovitsch, Nikita J. Boeren, Marek Tulej, Andreas Riedo, and Peter Wurz

With NASA’s increased focus on exploration of our Moon within the Artemis program, new scientific goals have been formulated to expand our knowledge of the history of our Solar System, including the evolution of the Earth-Moon system. Additionally, establishing a permanent human presence on the Moon has been declared a goal of the Artemis program, the success of which will inevitably depend on in-situ resource utilization (ISRU) of lunar material. In turn, successful ISRU requires methods capable of analysing and selecting suitable materials in place. To support these tasks, sensitive instrumentation capable of determining the elemental and isotopic composition of geological samples from the lunar surface is essential. Consequently, defining and determining the technical requirements of such instrumentation, constructing it accordingly, and verifying its performance are all crucial steps in maximising the scientific return of such a mission. Furthermore, NASA’s Artemis program also aims to facilitate future human exploration of Mars, which implies that instrumentation applied successfully on the Moon might find application on the Martian surface in the future.

 

We present our progress in designing, constructing and testing a prototype miniature laser ablation ionisation mass spectrometer (LIMS) for in-situ measurements on the lunar surface. The finalised instrument will be deployed on the Commercial Lunar Payload Services (CLPS) mission CP-22, scheduled for launch in late 2026 and to land in the lunar south pole region. Our miniature reflectron-type time-of-flight mass analyser (160 mm x Ø 60 mm), designed for in-situ space applications, was coupled to a pulsed Nd:YAG microchip laser system (SB1 series, Bright Microlaser Srl, Italy) operating at 532 nm (maximum laser pulse energy of 40 µJ, pulse repetition rate of 100 Hz). The laser source and the optics were mounted in a cage system collinear with the optical axis of the instrument assembly. This construction is modelled after the envisioned flight design and is therefore used to determine the required optical and electronic performance characteristics of the future flight instrument. The current flight design will be presented as well. Furthermore, validation of the technical implementation and verification of the scientific requirements will be discussed through the results of laser ablation experiments conducted on lunar regolith simulant.

How to cite: Keresztes Schmidt, P., Blaukovitsch, M., Boeren, N. J., Tulej, M., Riedo, A., and Wurz, P.: Instrumentation for laser ablation ionisation mass spectrometry on the lunar surface, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13387, https://doi.org/10.5194/egusphere-egu23-13387, 2023.

EGU23-13471 | ECS | Posters on site | PS5.3

Statistical Analysis of Lunar 1 Hz Waves Using ARTEMIS Observations 

Yuequn Lou, Xudong Gu, Xing Cao, Mingyu Wu, Sudong Xiao, Guoqiang Wang, Binbin Ni, and Tielong Zhang

Like the 1 Hz waves occurring upstream of various celestial bodies in the solar system, 1 Hz narrowband whistler-mode waves are often observed around the Moon. However, their properties have not been thoroughly investigated, which makes it difficult to establish the generation mechanism of the waves. Using 5.5 years of wave data from ARTEMIS, we perform a detailed investigation of 1 Hz waves in near-lunar space. The amplitude of lunar 1 Hz waves is generally 0.05-0.1 nT. In GSE coordinates, the waves show no significant regional differentiation pattern but are absent inside the magnetosphere. Correspondingly, in SSE coordinates, they can occur extensively at ~1.1-12 RL, while few events are observed in the lunar wake owing to the lack of interaction with the solar wind. Furthermore, the wave distributions exhibit modest day-night and dawn-dusk asymmetries, but a less apparent north-south asymmetry. Compared with the nightside, more intense waves with lower peak wave frequency are present on the dayside. The preferential distribution of 1 Hz waves exhibits a moderate correlation with strong magnetic anomalies. The waves propagate primarily at wave normal angles < 60° with an ellipticity of [-0.8, -0.3]. For stronger wave amplitudes and lower latitudes, 1 Hz waves generally have smaller wave normal angles and become more left-hand circularly polarized. Owing to the unique interaction between the Moon and the solar wind, our statistical results might provide new insights into the generation mechanism(s) of 1 Hz waves in planetary plasma environments and promote the understanding of lunar plasma dynamics.

How to cite: Lou, Y., Gu, X., Cao, X., Wu, M., Xiao, S., Wang, G., Ni, B., and Zhang, T.: Statistical Analysis of Lunar 1 Hz Waves Using ARTEMIS Observations, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13471, https://doi.org/10.5194/egusphere-egu23-13471, 2023.

EGU23-13668 | ECS | Posters on site | PS5.3

Urban seismic noise investigation at the site of ESA/DLR’s future LUNA facility at Cologne, Germany 

Stefanie Hempel, Martin Knapmeyer, Jens Biele, and Hans-Herbert Fischer

As international efforts to return humans to the Moon increase, ESA's European Astronaut Center (EAC) and the German Aerospace Center (DLR) are expanding their facilities with the LUNA habitat, which provides a 700 m² testbed covered by 60 cm of lunar regolith simulant (EAC-1) for astronaut training, including the deployment and operation of geological and seismic regolith characterization experiments, In-Situ Resource Utilization (ISRU) technologies, and biological and chemical experiments by both telerobotic and human activity. The LUNA facility will be operated as a collaboration between ESA and DLR's Microgravity User Support Center (MUSC; see also the presentation by Knapmeyer et al. at this conference).

Geophysical experiments have proven useful for investigating the subsurface structure at the landing sites of, e.g., the Apollo and Chang'e missions on the Moon, as well as at the InSight landing site on Mars, and a seismometer experiment on the lunar far side is already scheduled (Farside Seismic Suite, in 2025). To support future geophysical investigations on the Moon, a first seismic experiment was conducted in June 2018 at the previously envisioned site of the LUNA facility, between the :envihab, a research facility of the Institute for Aerospace Medicine, and the European Astronaut Center (EAC) at Cologne-Porz. This passive seismic experiment consisted of a four-element, Y-shaped array of short-period seismometers, based on the layout of the Apollo 17 seismic experiment. It recorded regional seismicity as well as urban noise. These measurements will be repeated and expanded by an active seismic refraction experiment at the new construction site just south of the EAC: before, during and after the construction of the facility, and before and after the installation of the regolith cover, to investigate the impact of the LUNA facility on data quality and coupling to the ground.

We present details of the 2018 experiment as well as preliminary results, analyzing ambient noise to map the dominant sources of urban noise such as car traffic and airplane traffic at the nearby CGN international airport, the operational noises of the :envihab centrifuge and the wind tunnel as well as nearby construction and drilling.

How to cite: Hempel, S., Knapmeyer, M., Biele, J., and Fischer, H.-H.: Urban seismic noise investigation at the site of ESA/DLR’s future LUNA facility at Cologne, Germany, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13668, https://doi.org/10.5194/egusphere-egu23-13668, 2023.

EGU23-14255 | ECS | Posters on site | PS5.3

Photometry of rock-rich surfaces on airless bodies. 

Rachael Martina Marshal, Ottaviano Rüsch, Christian Wöhler, Kay Wohlfarth, Sergey Velichko, and Markus Patzek

Introduction and Methods: Our understanding of the response of boulders to space weathering, micrometeorite abrasion, and thermal fatigue, and consequently of their evolution into regolith, can be improved by characterizing the surface roughness of the uppermost layer of boulders. In the first phase of our study [1] we characterize the surface roughness of boulder fields photometrically by applying the phase ratio methodology to orbital image data. In the second phase of our study (in progress) we focus on characterizing the sub-mm scale topography and roughness of naturally fresh surfaces of meteorite samples. The photometric roughness of boulder fields on the lunar surface is studied by employing a normalized logarithmic phase ratio difference (NLPRD) metric, described in [1], to measure and compare the slope of the phase curve (reflectance versus phase angle) of a rock-rich field with that of a rock-free field. We compare the photometric roughness of rock-rich fields on simulated images with the photometric roughness of rock-rich fields on Lunar Reconnaissance Orbiter Narrow Angle Camera (LROC NAC) images sampled around an Unnamed crater at Hertzsprung S.
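The phase-curve slope comparison can be illustrated with a simplified stand-in for the NLPRD metric. The exact definition is given in [1]; the form below (a log phase ratio difference normalised by the rock-free reference value) and all input reflectances are only an assumed illustration:

```python
import math

def log_phase_ratio(r_low, r_high):
    """Logarithm of the phase ratio: reflectance at a small phase angle
    divided by reflectance at a large phase angle. A steeper phase curve
    (rougher surface) gives a larger value."""
    return math.log(r_low / r_high)

def nlprd(rock_low, rock_high, ref_low, ref_high):
    """Illustrative NLPRD: log phase ratio of the rock-rich field minus
    that of the rock-free reference, normalised by the reference value.
    (Hypothetical form; see [1] for the actual definition.)"""
    lpr_rock = log_phase_ratio(rock_low, rock_high)
    lpr_ref = log_phase_ratio(ref_low, ref_high)
    return (lpr_rock - lpr_ref) / lpr_ref
```

Under this sketch, a boulder field with the same phase curve as the reference gives NLPRD = 0, and a steeper (photometrically rougher) phase curve gives a positive value.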


Results and Discussion: The NLPRD is normalized to a rock-free reference surface. Assuming that the roughness of the regolith within the boulder field is comparable to the roughness of the regolith at the rock-free reference regions, the higher roughness of the boulder fields implies the presence of rocks with diverse sub-mm scale roughness and, possibly, variable single scattering albedo. In figure 1b, the spread in NLPRD values for different rock morphologies is exceeded by the spread in NLPRD of the NAC-resolved boulder fields. We find spatial clustering of photometrically smooth and rough boulder fields in the down-range and up-range, respectively, of the Unnamed crater at Hertzsprung S, reflecting ejecta asymmetry (in agreement with [2]) and possibly indicating asymmetric modification of ejecta rock surfaces during the impact excavation process. Our results imply that rock physical properties at the start of the surface exposure period are a function of petrology as well as of the (shock) effects imparted upon ejecta rock formation and excavation. Work in progress supplements these findings with an investigation of the sub-mm scale topography and roughness of meteorite and lunar samples. To study the sub-mm scale roughness of these samples we produce high-resolution DTMs at the µm scale using a non-contact optical profilometer. A sample high-resolution DTM of lunar breccia NWA11273 is shown in figure 2.



Figure 3 shows that variations of the mean slope with spatial scale exist within different meteorite types. Next, we will investigate the scale-dependent rock micro-texture of various samples (i.e., ordinary and carbonaceous chondrites, lunar basalts and breccias, as well as meteorites from the HED clan), and provide typical values of surface roughness that will inform photometric modelling of rock surfaces.

References: [1] Marshal, R. M., Rüsch, O., Wöhler, C., Wohlfarth, K., & Velichko, S. (2022). Icarus, 115419 [2] Velichko, S., Korokhin, V., Velikodsky, Y., Kaydash, V., Shkuratov, Y., & Videen, G. (2020). PSS, 193.

How to cite: Marshal, R. M., Rüsch, O., Wöhler, C., Wohlfarth, K., Velichko, S., and Patzek, M.: Photometry of rock-rich surfaces on airless bodies., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14255, https://doi.org/10.5194/egusphere-egu23-14255, 2023.

EGU23-15759 | ECS | Posters on site | PS5.3

The MoonLIGHT Pointing Actuator (MPAc) project 

Laura Rubino, Alejandro Remujo Castro, Ubaldo Denni, Marco Muccino, Lorenzo Salvatori, Mattia Tibuzzi, Matteo Petrassi, Michele Montanari, Marco Traini, Luciana Filomena, Lorenza Mauro, Luca Porcelli, and Simone Dell'Agnello

Laser Ranging is a technique used to perform high-precision distance measurements between a laser ground station and an optical target, a Cube Corner Retroreflector (CCR). Since 1969 it has been possible to perform Lunar Laser Ranging (LLR) measurements thanks to the Apollo and Luna missions, which placed arrays of CCRs on the lunar surface. LLR outputs include accurate tests of General Relativity, information on the composition of the Moon, its ephemerides and its internal structure, and the geocentric positions and motions of ground stations: research uniquely enabled by the Moon.

Although laser ground stations have improved significantly over the years, the current limitation lies with the lunar optical targets and is due to lunar librations. In order to achieve more precise LLR measurements, the MoonLIGHT project was designed by SCF_Lab together with UMD. The aim of the project is to design a next generation of retroreflectors, and to prototype, manufacture and qualify them for the Moon’s environment, moving from an array of many small CCRs to a single large 100 mm CCR, called MoonLIGHT, which is unaffected by lunar librations.

The field of view of each CCR is limited: the retroreflector needs to be pointed precisely towards the ground station. The Apollo CCR arrays were manually arranged by the astronauts. In 2018 INFN proposed to ESA the MoonLIGHT Pointing Actuators (MPAc) project, able to perform unmanned pointing of MoonLIGHT. In 2019 ESA chose MPAc among 135 eligible scientific project proposals. In 2021 ESA agreed with NASA to launch MPAc to the Reiner Gamma region of the Moon with a Commercial Lunar Payload Services (CLPS) mission, which is part of the Artemis program. The lander on which MPAc will be integrated is designed by Intuitive Machines (IM). The expected launch date is April 2024.

MPAc must be able to perform two continuous perpendicular rotations to accurately point the front face of the CCR towards the Earth. The device is continuously evolving to ensure the success of the mission, which will take place in ultra-high-vacuum space conditions over a wide operating temperature range. Terrestrial prototypes, with all the characteristics of the final structure, have been developed for the study of the mechanical and electronic components. Qualification tests for space are being planned now that the components for the Proto Flight Model (PFM) have arrived at the LNF. Payload delivery is scheduled for August 2023.

MPAc will contribute to attaining a lunar orbit range accuracy below a few mm. This will improve, in turn, the precision of the Parametrized Post-Newtonian (PPN) parameters and put more stringent observational constraints on departures from GR predictions.

How to cite: Rubino, L., Remujo Castro, A., Denni, U., Muccino, M., Salvatori, L., Tibuzzi, M., Petrassi, M., Montanari, M., Traini, M., Filomena, L., Mauro, L., Porcelli, L., and Dell'Agnello, S.: The MoonLIGHT Pointing Actuator (MPAc) project, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15759, https://doi.org/10.5194/egusphere-egu23-15759, 2023.

EGU23-15933 | ECS | Orals | PS5.3

Orbit Determination and Time Transfer for a Lunar Radio Navigation System 

Andrea Sesta, Mauro Di Benedetto, Daniele Durante, Luciano Iess, Michael Plumaris, Paolo Racioppa, Paolo Cappuccio, Ivan di Stefano, Debora Pastina, Giovanni Boscagli, Serena Molli, Fabrizio De Marchi, Gael Cascioli, Krzysztof Sosnica, Agnes Fienga, Nicola Linty, and Jacopo Belfi

Within the pre-phase A of the Moonlight project proposed and funded by the European Space Agency (ESA), the ATLAS consortium has proposed an architecture to support a Lunar Radio Navigation System (LRNS) capable of providing PNT (Positioning, Navigation, and Timing) services to various lunar users. The Moonlight LRNS will be a powerful tool in support of the lunar exploration endeavors, both human and robotic.

The ESA LRNS will consist of a small constellation of 3-4 satellites placed in Elliptical Lunar Frozen Orbits (ELFO) with the aposelene above the southern hemisphere to better cover this region, given its interest for future lunar missions. This LRNS will be supported by a ground station network of small dish antennas (~30 cm), which can establish Multiple Spacecraft Per Aperture (MSPA) tracking at K-band. Any Earth station will be capable of sending a single uplink signal to multiple spacecraft thanks to Code Division Multiplexing modulation, while in the downlink multiple carriers can share the same K-band bandwidth by implementing Code Division Multiple Access (CDMA) on the onboard transponders. This allows the implementation of the Same Beam Interferometry (SBI) technique [1], which adds to spread spectrum ranging and Doppler measurements. To disseminate accurate PNT services to end users, the constellation will also be capable of maintaining synchronization with the Earth station clocks at the ns level.
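The channel-sharing idea behind CDMA can be sketched with a toy direct-sequence example. The codes and bit values below are arbitrary placeholders, unrelated to the actual Moonlight signal design; the point is only that two spread signals sharing one channel can each be recovered by correlating with the matching code:

```python
def spread(bits, code):
    """Direct-sequence spreading: each data bit (+1/-1) is multiplied by
    the full chip code."""
    return [b * c for b in bits for c in code]

def despread(signal, code, n_bits):
    """Correlate the received signal with one user's code to recover that
    user's bits, even when several spread signals share the channel."""
    n = len(code)
    out = []
    for i in range(n_bits):
        chips = signal[i * n:(i + 1) * n]
        corr = sum(s * c for s, c in zip(chips, code))
        out.append(1 if corr > 0 else -1)
    return out

# Two users with orthogonal codes share the same channel.
code_a = [1, 1, 1, 1]
code_b = [1, -1, 1, -1]
bits_a = [1, -1]
bits_b = [-1, -1]
channel = [x + y for x, y in zip(spread(bits_a, code_a), spread(bits_b, code_b))]
```

Despreading `channel` with `code_a` recovers `bits_a`, and with `code_b` recovers `bits_b`; orthogonality of the codes is what lets the downlink carriers occupy the same K-band bandwidth.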

The performance of the proposed architecture has been validated through numerical simulations performed with the ESA GODOT software, enhanced with additional user-defined features and capabilities. For each satellite of the LRNS constellation, the attainable orbital accuracy is at the level of a few meters for most orbit mean anomalies; it has been computed considering a setup that includes a perturbed dynamical model (mainly accounting for uncertainties in the accelerations induced by solar radiation pressure and orbital maneuvers) and a realistic error model for Doppler, ranging and SBI measurements.

Reference:

[1] Gregnanin, M. et al. (2012). Same beam interferometry as a tool for the investigation of the lunar interior. Planetary and Space Science, 74, 194-201.

How to cite: Sesta, A., Di Benedetto, M., Durante, D., Iess, L., Plumaris, M., Racioppa, P., Cappuccio, P., di Stefano, I., Pastina, D., Boscagli, G., Molli, S., De Marchi, F., Cascioli, G., Sosnica, K., Fienga, A., Linty, N., and Belfi, J.: Orbit Determination and Time Transfer for a Lunar Radio Navigation System, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15933, https://doi.org/10.5194/egusphere-egu23-15933, 2023.

EGU23-16681 | Posters on site | PS5.3

Lunar plasma and electrostatic environment: numerical approach and its future prospects 

Yohei Miyake and Jin Nakazono

Mission preparation for lunar exploration using landers has been rapidly increasing, and there is strong demand for a precise understanding of the electrostatic environment. The lunar surface, which has neither a dense atmosphere nor a global magnetic field, becomes electrically charged by collecting surrounding charged particles from the solar wind or the Earth's magnetosphere. As a result of these charging processes, the surface regolith particles behave as "charged dust grains". Dust particles have been suggested to have adverse effects on exploration instruments and living organisms during lunar landing missions, and evaluating their safety is an issue to be solved for the realization of sustainable manned lunar exploration. It is necessary to develop a comprehensive and organized understanding of lunar charging phenomena and of the electrodynamic characteristics of charged dust particles.

It is widely accepted that the surface potential of the lunar dayside is, on average, positive by several volts to 10 V due to photoelectron emission in addition to the solar wind plasma precipitation. Recent studies, however, have shown that the insulating and rugged surfaces of the Moon tend to separate positive and negative charges and distribute them irregularly, so that intense and structured electric fields can form around them. This strong electric field lies in the innermost part of the photoelectron sheath and may contribute to the mobilization of charged dust particles. Since this strong electric field develops on a spatial scale smaller than the Debye length and can take various states depending on the lunar surface geometry, the research approach needs to be updated. In this paper, we discuss directions for research on the near-surface plasma, electrostatic, and dust environment for upcoming lunar landing missions.
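For scale, the Debye length mentioned above follows the standard plasma formula; a minimal sketch, with representative (assumed, order-of-magnitude) solar wind values rather than measured lunar sheath parameters:

```python
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m
K_B = 1.381e-23    # Boltzmann constant, J/K
Q_E = 1.602e-19    # elementary charge, C

def debye_length(n_e, t_e):
    """Electron Debye length (m) for electron density n_e (m^-3) and
    electron temperature t_e (K): sqrt(eps0 * kB * Te / (n_e * e^2))."""
    return math.sqrt(EPS0 * K_B * t_e / (n_e * Q_E ** 2))
```

With assumed solar wind values at 1 AU (n_e ~ 5×10^6 m^-3, T_e ~ 1.2×10^5 K) this gives a Debye length of order 10 m; in the denser photoelectron sheath near the surface it shrinks to sub-metre scales, which is why the near-surface fields require such fine spatial resolution to model.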

How to cite: Miyake, Y. and Nakazono, J.: Lunar plasma and electrostatic environment: numerical approach and its future prospects, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16681, https://doi.org/10.5194/egusphere-egu23-16681, 2023.

EGU23-17528 | Posters on site | PS5.3

ILEWG/LUNEX EuroMoonMars & EuroSpaceHub Academy: Recent Highlights 

Bernard Foing, Henk Rogers, Serena Crotti, and Jara Pascual and the ILEWG LUNEX EuroMoonMars Team and EuroSpaceHub Academy

EuroMoonMars programme in Data Analysis, Instrumentation, Field Work and Astronautics: EuroMoonMars is an ILEWG programme [1-226] run in collaboration with space agencies, academia, universities, research institutions and industry. The programme includes research activities for data analysis, instrument tests and development, field tests in MoonMars analogues, pilot projects, training and hands-on workshops, and outreach activities. Extreme environments on Earth often provide terrain conditions similar to sites on the Moon and Mars. In order to maximize scientific return, it becomes ever more important to rehearse mission operations in the field and through simulations. EuroMoonMars field campaigns have therefore been organised in specific locations of technical, scientific and exploration interest. Since 2009, Lunex EuroMoonMars has been organizing, in collaboration with ESA, NASA, and European and US universities, a programme of data analysis, instrumentation tests, field work and analogue missions for students and researchers in different locations worldwide, including Hawaii HI-SEAS, Utah MDRS, Iceland, Etna/Vulcano Italy, Atacama, AATC Poland, ESTEC Netherlands, and Eifel Germany. Analogue missions provide a practical ground in which students can test the notions learnt at university in a realistic simulation context. Over the course of these missions, students have access to special space instrumentation, laboratories, facilities, science operations, and human-robotic partnerships. In 2023, EuroMoonMars and EuroSpaceHub Academy co-sponsored a series of EMMPOL Moonbase isolation simulation campaigns in Poland.

EuroSpaceHub programme for Space Innovation Workforce Development: The EuroSpaceHub project aims to facilitate access to the aerospace sector. EuroSpaceHub is a European-led project with collaborators worldwide, funded by the EIT HEI initiative (Innovation Capacity Building for Higher Education) with Agenda 2021-2027. The project includes six core partners: Vilnius TU, ISU, U C Madrid, Sikorsky Kyiv, Collabwith and Lunex. The project was created to foster collaboration, innovation and entrepreneurship in the European aerospace sector. The EuroSpaceHub Academy develops training programmes for space researchers and entrepreneurs.

Space Engineering Workforce Development: we have also developed a semester course on Space System Design Engineering at EPFL Lausanne, taught since 2020.

Interdisciplinary Space Workforce Development: In the frame of the ISU International Space University and the EuroSpaceHub Academy, we delivered lectures and hands-on workshops, including the operation of instruments on the EuroMoonMars ExoGeoLab lander, and workshops on Moon outpost design in the frame of the MSS master and the SSP Space Studies Programme. Together with ISU, EuroSpaceHub staff co-supervised various Individual Projects (IP) of students and Master research projects.

EuroSpaceHub Participation in Congresses and Events: We also co-sponsored participation in conferences such as LPSC, EGU and IAC, and the organization of events and workshops connecting space scientists, engineers, innovators and entrepreneurs with space stakeholders. This included talks and expo booths at the IAC International Astronautical Congress and the Rome New Space Economy Forum.

How to cite: Foing, B., Rogers, H., Crotti, S., and Pascual, J. and the ILEWG LUNEX EuroMoonMars Team and EuroSpaceHub Academy: ILEWG/LUNEX EuroMoonMars & EuroSpaceHub Academy: Recent Highlights, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-17528, https://doi.org/10.5194/egusphere-egu23-17528, 2023.

GI4 – Atmosphere and ocean monitoring

EGU23-135 | ECS | Orals | GI4.2

Deep-Pathfinder: A machine learning algorithm for mixing layer height detection based on lidar remote sensing data 

Jasper Wijnands, Arnoud Apituley, Diego Alves Gouveia, and Jan Willem Noteboom

The mixing layer height (MLH) marks the transition between vertical mixing of air near the surface and less turbulent air above. MLH is important for the dispersion of air pollutants and greenhouse gases, and for assessing the performance of numerical weather prediction systems. Existing lidar-based MLH detection algorithms typically do not use the full resolution of the ceilometer, require manual feature engineering, and often do not enforce temporal consistency of the MLH profile. Given the large-scale availability of lidar remote sensing data and the high temporal and spatial resolution at which it is recorded, this domain is very suitable for machine learning approaches such as deep learning. This presentation introduces a completely new approach to estimate MLH: the Deep-Pathfinder algorithm, based on deep learning techniques for image segmentation.

The concept of Deep-Pathfinder is to represent the 24-hour MLH profile as a mask (i.e., black indicating the mixing layer, white indicating the non-turbulent atmosphere above) and directly predict the mask from an image with lidar observations. Range-corrected signal (RCS) data at 12-second temporal and 10-meter vertical resolution was obtained from Lufft CHM 15k ceilometers at five locations in the Netherlands (2020–2022). High-resolution annotations were created for 50 days, informed by a visual inspection of the RCS image, the manufacturer's layer detection algorithm, gradient fields, thermodynamic MLH estimates, and humidity profiles of the 213-meter mast at Cabauw.
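One way to picture the mask representation: once a segmentation model has predicted a binary mask over the time-height image, the MLH at each time step can be read off as the top of the contiguous mixed region above the surface. A minimal sketch of that post-processing step (a hypothetical helper, not the authors' code), assuming index 0 is the surface bin and a 10 m vertical resolution:

```python
def mask_to_mlh(mask, dz=10.0):
    """Convert a binary segmentation mask (one column of height bins per
    time step, True = mixed) into an MLH time series in metres: the top
    of the contiguous mixed region starting at the surface."""
    heights = []
    for column in mask:  # column[0] is the surface bin
        n = 0
        for mixed in column:
            if not mixed:
                break  # first non-mixed bin ends the mixing layer
            n += 1
        heights.append(n * dz)
    return heights
```

Detached turbulent layers aloft do not affect the result in this sketch, since counting stops at the first non-mixed bin above the surface.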

Our model is based on a customised U-Net architecture with MobileNetV2 encoder to ensure fast inference times. A nighttime variable indicated whether the sample occurred between sunset and sunrise and hence, whether an estimate of the stable or convective boundary layer was required. Model calibration was performed on the Dutch National Supercomputer Snellius. First, input samples were randomly cropped to 224x224 pixels, covering a 45-minute period and maximum altitude of 2240 meters. Then, the model was pre-trained on 19.4 million samples of unlabelled data. Finally, the labelled data was used to fine-tune the model for the task of mask prediction. Performance on a test set was compared to MLH estimates from ceilometer manufacturer Lufft and the STRATfinder algorithm.

Results showed that days with a clear convective boundary layer were captured well by all three methods, with minimal differences between them. The Lufft wavelet covariance transform algorithm contained a slight temporal shift in MLH estimates. Further, it had more missing data in complex atmospheric conditions. STRATfinder estimates for the nocturnal boundary layer were consistently low due to guiding restrictions in the algorithm. In contrast, Deep-Pathfinder followed short-term fluctuations in MLH more closely due to the use of high-resolution input data. Path optimisation algorithms like STRATfinder have good temporal consistency but can only be evaluated after a full day of ceilometer data has been recorded. Deep-Pathfinder retains the advantages of temporal consistency by assessing MLH evolution in 45-minute samples, while also providing real-time estimates. This makes a deep learning approach as presented here valuable for operational use, as real-time MLH detection better meets the requirements of users in aviation, weather forecasting and air quality monitoring.

How to cite: Wijnands, J., Apituley, A., Alves Gouveia, D., and Noteboom, J. W.: Deep-Pathfinder: A machine learning algorithm for mixing layer height detection based on lidar remote sensing data, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-135, https://doi.org/10.5194/egusphere-egu23-135, 2023.

EGU23-1387 | ECS | Posters on site | GI4.2

Optical properties of birch pollen using a synergy of three lidar instruments 

Maria Filioglou, Ari Leskinen, Ville Vakkari, Minttu Tuononen, Xiaoxia Shang, and Mika Komppula

Pollen has important implications for health, but also for the climate, as it can act as cloud condensation nuclei or ice nuclei in cloud processing. Active remote sensing instruments equipped with polarization capability can extend the detection of pollen from the surface up to several kilometres in the atmosphere while maintaining continuous, high-time-resolution operation. In this study, we use a synergy of three lidars, namely a multi-wavelength PollyXT lidar, a Vaisala CL61 ceilometer and a Halo Photonics StreamLine Doppler lidar, to investigate the optical properties of birch pollen particles. All three lidars are equipped with polarization channels, enabling the investigation of the wavelength dependence at 355, 532, 910 and 1565 nm. Together with pollen observations from a Hirst-type spore sampler and aerosol in situ observations, we were able to characterize the linear particle depolarization ratio (PDR) and backscatter-related Ångström exponents of the pollen particles. Both optical properties have been extensively used in aerosol classification algorithms and are therefore highly desired in the lidar community. We found that birch pollen exhibits a spectral dependence in the PDR, and that its classification is feasible when, preferably, two or more polarization wavelengths are available.

How to cite: Filioglou, M., Leskinen, A., Vakkari, V., Tuononen, M., Shang, X., and Komppula, M.: Optical properties of birch pollen using a synergy of three lidar instruments, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-1387, https://doi.org/10.5194/egusphere-egu23-1387, 2023.

Ice clouds in the Arctic are expected to have different radiative properties compared to midlatitude cirrus, because of the different humidity and temperature profiles and the prevalent aerosol loading in northern latitudes, which govern their formation. During late winter and early spring 2022, the HALO-(AC)3 campaign was conducted out of Kiruna (Sweden) to probe Arctic clouds with an airborne remote sensing payload. For this purpose, the German research aircraft HALO was equipped with a water vapor Differential Absorption Lidar (DIAL), a cloud radar, microwave radiometers, radiation measurements in the visible, near-infrared and thermal region, and a dropsonde dispenser. A total of 25 flights were performed, mainly over the sea between Svalbard and Greenland and up to nearly 90°N.

The primary observable for studying ice cloud formation is the relative humidity, which is not directly measurable by lidar but can only be computed with the aid of additional temperature information. By comparison with a large number of dropsondes launched during flight, we will show that the temperature field from ECMWF IFS analyses and short-term forecasts provides sufficient accuracy to retrieve the relative humidity for ice cloud studies. Using this method, we will analyse different scenarios of Arctic cirrus formation: under stable Arctic conditions, during a warm-air intrusion, and during a cold-air outbreak. An interesting special case is the modification of cirrus properties by the presence of an aerosol layer most probably composed of long-range-transported Saharan dust.

How to cite: Wirth, M. and Groß, S.: Characterisation of Arctic Cirrus by Airborne Water Vapor and High Spectral Resolution Lidar, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2024, https://doi.org/10.5194/egusphere-egu23-2024, 2023.

The atmospheric boundary layer responds directly to energy emitted or absorbed at the ground surface and is strongly affected by various meteorological factors, which alter the concentration of air pollutants. An inversion layer generally caps the atmospheric boundary layer, so most anthropogenic air pollutants cannot escape it and remain trapped inside. Ulsan Metropolitan City is known as the largest industrial city in Korea, and such industrial cities generally emit more air pollutants than other cities. Since these pollutants are strongly affected by the boundary layer, it is important to calculate its height accurately. In this study, we compare the atmospheric boundary layer height derived from lidar with that of the Weather Research and Forecasting numerical model, and examine how the boundary layer height affects changes in air pollutant concentrations.

How to cite: Choi, K. and Song, C.: Effect of air pollutant concentration according to the height of the Planetary boundary layer in Ulsan, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3042, https://doi.org/10.5194/egusphere-egu23-3042, 2023.

EGU23-3068 | ECS | Posters on site | GI4.2

Improvement of wind vector retrieval method for increasing data acquisition rate of the wind profiler and the wind lidar 

Yujin Kim, Byung Hyuk Kwon, Jiwoo Seo, Geon Myeong Lee, and KyungHun Lee

Representative meteorological instruments that exploit the Doppler effect include Doppler radar, wind profilers, and wind lidar. The latter two produce vertical profiles of wind at high spatio-temporal resolution in the atmospheric boundary layer. Wind lidar observes with a vertical resolution of 50 m or less and a temporal resolution of minutes, so it fills the observation gap in the lower layer where the wind profiler misses meteorological data. Wind lidar derives the wind vector using the DBS (Doppler Beam Swinging) and VAD (Velocity Azimuth Display) methods. The VAD method is known to be more accurate than the DBS method, while the DBS method has the advantage of obtaining a wind profile with a fast scan time. On the other hand, DBS requires at least two beams in addition to the vertical beam (one of the east/west pair and one of the south/north pair), which causes a decrease in the data acquisition rate when a beam is missing. We improved the VAD method to produce more wind vectors from the wind profiler as well as from the wind lidar, which generally uses five beams. First, a Fourier series was fitted to the radial velocities measured in DBS mode. Next, the wind vector was determined by setting the azimuth interval and applying the Fourier-series radial velocities to the VAD method. Wind vectors were retrieved at altitudes where the DBS method failed, and at altitudes where both the DBS and the improved VAD method succeeded, the results of the two methods were consistent. In this study, we propose a method to increase the data acquisition rate even when the vertical beam or one of the inclined beams is missing.
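The retrieval described above rests on the standard VAD relation, in which the radial velocity on a conical scan varies as a first-order Fourier series in azimuth. As a minimal illustrative sketch (the function name, the azimuth-measured-clockwise-from-north convention, and the fixed 75° elevation are assumptions, not the authors' implementation), the harmonic coefficients can be fitted by least squares and mapped to the wind components:

```python
import numpy as np

def vad_fit(azimuth_deg, v_radial, elevation_deg=75.0):
    """Retrieve (u, v, w) from radial velocities on a conical scan.

    Fits v_r(theta) = a0 + a1*cos(theta) + b1*sin(theta) and maps the
    coefficients via a0 = w*sin(phi), a1 = v*cos(phi), b1 = u*cos(phi),
    with theta the azimuth (clockwise from north) and phi the elevation.
    """
    theta = np.radians(azimuth_deg)
    phi = np.radians(elevation_deg)
    # Design matrix of the first-order harmonic series
    design = np.column_stack([np.ones_like(theta), np.cos(theta), np.sin(theta)])
    a0, a1, b1 = np.linalg.lstsq(design, v_radial, rcond=None)[0]
    return b1 / np.cos(phi), a1 / np.cos(phi), a0 / np.sin(phi)
```

Because the fit uses all available azimuths rather than fixed beam pairs, a horizontal wind estimate survives the loss of an individual beam, which is the basis of the data-acquisition-rate gain described above.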

How to cite: Kim, Y., Kwon, B. H., Seo, J., Lee, G. M., and Lee, K.: Improvement of wind vector retrieval method for increasing data acquisition rate of the wind profiler and the wind lidar, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-3068, https://doi.org/10.5194/egusphere-egu23-3068, 2023.

EGU23-4339 | Orals | GI4.2

Climatological assessment of the vertically resolved optical aerosol properties by lidar measurements and their influence on radiative budget over the last two decades at UPC Barcelona 

Simone Lolli, Adolfo Comeron, Cristina Gíl-Diaz, Tony Landi, Constantin Munoz-Porcar, Daniel Oliveira, Alejandro Rodriguez-Gomez, Michael Sïcard, Andrés Alastuey, Xavier Querol, and Cristina Reche

In the last two decades, several scientific studies have highlighted the adverse effects of urban atmospheric particulate matter from anthropogenic emissions, primarily on population health, transportation, and climate. For these reasons, aerosols have been monitored through both remote sensing and in-situ observation platforms, also to establish whether the emission-reduction policies implemented at the government level have had positive outcomes. In this study, for the first time, we assess how the vertically resolved properties of the atmospheric particulate, and consequently its radiative effect, have changed during the last twenty years in Barcelona, Spain, one of the largest metropolitan areas of the Mediterranean basin. This study is carried out in the frame of the ACTRIS project through a synergy between lidar measurements and meteorological variables, e.g. wind, temperature, and humidity at the surface. Thanks to twenty years of measurements, this research can shed some light on the meteorological processes and conditions that can lead to haze formation and can help decision-makers adopt mitigation strategies to preserve large Mediterranean marine metropolitan regions.

How to cite: Lolli, S., Comeron, A., Gíl-Diaz, C., Landi, T., Munoz-Porcar, C., Oliveira, D., Rodriguez-Gomez, A., Sïcard, M., Alastuey, A., Querol, X., and Reche, C.: Climatological assessment of the vertically resolved optical aerosol properties by lidar measurements and their influence on radiative budget over the last two decades at UPC Barcelona, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-4339, https://doi.org/10.5194/egusphere-egu23-4339, 2023.

EGU23-5753 | Posters on site | GI4.2

Latent flow measurement by Wind Lidar and Raman Lidar during WaLiNeas campaign 

Donato Summa, Paolo Di Girolamo, Noemi Franco, Ilaria Gandolfi, Marco Di Paolantonio, Marco Rosoldi, Fabio Madonna, Aldo Giunta, and Davide Dionisi

A network of water vapor Raman lidars, WaLiNeAs (Water vapor Lidar Network Assimilation), has been designed for improving heavy precipitation forecasting in the Mediterranean area, with the aim of providing water vapor measurements with high spatio-temporal resolution and accuracy to be assimilated into the AROME mesoscale model using a four-dimensional ensemble-variational approach with 15-min updates. The CONCERNING lidar from the University of Basilicata and a wind lidar from CNR-IMAA were co-located at the University of Toulon between October 2022 and January 2023 in order to reach the campaign objective. For this purpose, vertical profiles of latent heat flux were obtained from the covariance of the vertical wind component (w'), retrieved by the wind lidar, and the water vapor mixing ratio fluctuations (q'), retrieved by the UV Raman lidar.

In this way, a time series of vertical wind profiles for the selected case (31 Oct to 03 Nov) was computed, with temporal resolution Δt = 15 min and vertical resolution Δz = 90 m. The specific humidity flux <w'·q'> [g/kg · m/s] is converted into the latent heat flux (W/m2) by multiplication with the air density ρ, obtained from the radiosonde, and the latent heat of vaporization of water Lv. A flux comparison between the ground-based water vapour Raman lidar and the wind lidar shows agreement between the instruments, and the results will be presented during the conference.
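The eddy-covariance conversion described above, LE = ρ · Lv · <w'·q'>, can be sketched in a few lines; the constant ρ and Lv values below are illustrative placeholders for the radiosonde-derived quantities, not values from the campaign:

```python
import numpy as np

RHO_AIR = 1.2   # air density [kg m-3]; in practice taken from the radiosonde
LV = 2.5e6      # latent heat of vaporization of water [J kg-1]

def latent_heat_flux(w, q):
    """Eddy-covariance latent heat flux [W m-2] from collocated series.

    w: vertical wind [m s-1] (wind lidar); q: specific humidity [kg kg-1]
    (Raman lidar; divide g/kg values by 1000 first).  Primes denote
    deviations from the averaging-period mean.
    """
    w_prime = w - np.mean(w)
    q_prime = q - np.mean(q)
    return RHO_AIR * LV * np.mean(w_prime * q_prime)   # rho * Lv * <w'q'>
```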

How to cite: Summa, D., Di Girolamo, P., Franco, N., Gandolfi, I., Di Paolantonio, M., Rosoldi, M., Madonna, F., Giunta, A., and Dionisi, D.: Latent flow measurement by Wind Lidar and Raman Lidar during WaLiNeas campaign, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-5753, https://doi.org/10.5194/egusphere-egu23-5753, 2023.

EGU23-7065 | Orals | GI4.2

Nitrous Oxide, N2O: Spectroscopic Investigations for Future Lidar Applications 

Christoph Kiemle, Christian Fruck, and Andreas Fix

Nitrous Oxide, N2O, is the third most important greenhouse gas contributing to human-induced global warming, after carbon dioxide and methane. Its growth rate is constantly increasing, and its global warming potential is estimated to be 273 times higher than that of CO2 over 100 years. The major anthropogenic source is nitrogen fertilization in croplands. Soil N2O emissions are increasing due to interactions between nitrogen inputs and global warming, constituting an emerging positive N2O-climate feedback. The recent increase in global N2O emissions exceeds even the most pessimistic emission trend scenarios developed by the IPCC, underscoring the urgency of mitigating N2O emissions (Global Carbon Project, 2020). Estimating N2O emissions from agriculture is inherently complex and comes with a high degree of uncertainty, due to variability in weather and soil characteristics, in agricultural management options, and in the interaction of field management with environmental variables. Further sources of N2O are processes in the chemical industry and combustion processes. The sink of N2O in the stratosphere increases the NOx concentration, which catalytically depletes ozone. Better N2O measurements, particularly by means of remote sensing, are thus urgently needed.

Airborne or satellite based N2O lidar remote sensing combines the advantages of high measurement accuracy, large-area coverage and nighttime measurement capability. Past initial feasibility studies revealed that Integrated-Path Differential-Absorption (IPDA) lidar providing vertical column concentrations of N2O would be the method of choice. In this current study we use the latest HITRAN spectroscopic data in order to identify appropriate N2O absorption lines in the wavelength region between 2.9 and 4.6 µm. The infrared spectral region challenges both lidar transmission and detection options. On the transmitter side, the use of optical parametric conversion schemes looks promising, while HgCdTe avalanche photodiode (APD), superconducting nanowire single-photon (SNSPD) or upconversion detectors (UCD) could offer high-efficiency low-noise signal detection. These options are implemented into a lidar simulation model in order to identify the optimal lidar system configuration for measuring N2O from aircraft or satellite using state-of-the-art technology.

How to cite: Kiemle, C., Fruck, C., and Fix, A.: Nitrous Oxide, N2O: Spectroscopic Investigations for Future Lidar Applications, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7065, https://doi.org/10.5194/egusphere-egu23-7065, 2023.

EGU23-7093 | ECS | Orals | GI4.2

Ceilometer aerosol retrieval and comparison with in-situ tower-measurements 

Marcus Müller, Ulrich Löhnert, and Birger Bohn

In recent years there has been growing interest in real-time aerosol profiling, and in this context the use of automated lidars and ceilometers (ALCs) for aerosol remote sensing has increased. Ceilometers were originally developed to measure cloud-base height automatically. Apart from this, they also provide vertically resolved backscatter information. Several algorithms have been developed to calibrate this signal and to derive aerosol concentration from it, opening up new opportunities in air quality monitoring and boundary layer research.

The quality of ALCs is often evaluated by comparing the attenuated backscatter to measurements from high-power lidars. This approach is suitable for validating the backscatter signal. However, for validating the aerosol concentration, a direct comparison with an in-situ optical aerosol measurement is more meaningful.

In this work, a comparison study was performed using the Jülich Observatory for Cloud Evolution (JOYCE). Data were processed and calibrated with algorithms by E-Profile (https://www.eumetnet.eu/activities/observations-programme/current-activities/e-profile/alc-network/). The aerosol retrieval was performed using a Klett inversion algorithm. A 120 m meteorological tower is located close to the JOYCE site and served as a platform for the in-situ aerosol measurement, with an optical particle sizer mounted 100 m above the ceilometer position.
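The Klett inversion mentioned above can be sketched in its simplest backward form. This is an illustrative sketch, not the E-Profile processing code, and it assumes an aerosol-dominated return, backscatter proportional to extinction (exponent k = 1), and a guessed boundary extinction at the far range gate:

```python
import numpy as np

def klett_backward(r, p, sigma_m):
    """Klett (1981) backward inversion for the extinction profile.

    r: range gates [m]; p: elastic lidar signal; sigma_m: assumed
    extinction [m-1] at the far boundary r[-1].  Backward solution:
        sigma(r) = X(r) / ( X(r_m)/sigma_m + 2 * int_r^{r_m} X dr' )
    with X = p * r**2 the range-corrected signal.
    """
    x = p * r**2
    # Trapezoidal integral of X from each gate out to the far boundary
    seg = 0.5 * (x[1:] + x[:-1]) * np.diff(r)
    integral = np.concatenate([np.cumsum(seg[::-1])[::-1], [0.0]])
    return x / (x[-1] / sigma_m + 2.0 * integral)
```

With the extinction profile and an assumed lidar ratio and particle size distribution, an aerosol concentration can then be estimated for comparison with the tower-mounted optical particle sizer.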

We will show the setup and data processing of the in-situ measurements, as well as an approach for processing and calibrating ceilometer raw data and using them to retrieve aerosol concentration. First results of the comparison will be presented to evaluate the quality of the ALC aerosol measurements.

How to cite: Müller, M., Löhnert, U., and Bohn, B.: Ceilometer aerosol retrieval and comparison with in-situ tower-measurements, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7093, https://doi.org/10.5194/egusphere-egu23-7093, 2023.

EGU23-7775 | Posters on site | GI4.2

Several months of continuous operation of two thermodynamic Raman lidars in the frame of WaLiNeAs 

Paolo Di Girolamo, Noemi Franco, Marco Di Paolantonio, Donato Summa, Davide Dionisi, Annalisa Di Bernardino, Anna Maria Iannarelli, and Tatiana Di Iorio

The University of Basilicata, in cooperation with ISMAR-CNR, deployed two compact Raman lidars, namely the system CONCERNING and the system MARCO, in Southern France in the frame of the “Water Vapor Lidar Network Assimilation (WaLiNeAs)” experiment. WaLiNeAs, primarily funded by the “French National Research Agency” (ANR), is an international field experiment aimed at studying extreme precipitation events and improving their predictability through the assimilation of water vapour profile measurements from a network of Raman lidar systems into mesoscale numerical models. The experiment has a specific geographical focus on Southern France. The measurement strategy relies on seven Raman lidars along the Mediterranean coasts of Spain and France, capable of providing real-time measurements of water vapour mixing ratio profiles over a three-month period starting on October 1st, 2022.

CONCERNING (COmpact RamaN lidar for Atmospheric CO2 and ThERmodyNamic ProfilING), developed in the frame of a cooperation between the University of Basilicata, ISMAR-CNR and the University of Rome, is a compact and transportable Raman lidar system designed for long-term, all-weather continuous operation, capable of performing high-resolution and accurate carbon dioxide and water vapour mixing ratio profile measurements, together with temperature and multi-wavelength (355, 532 and 1064 nm) particle backscattering/extinction/depolarization profile measurements. The system relies on a 45-cm diameter Newtonian telescope and a diode-pumped Nd:YAG laser source, capable of emitting pulses at the three traditional wavelengths of this laser source (355, 532 and 1064 nm), with a single-pulse energy at 355 nm of 110 mJ and an average emitted power of 11 W at a pulse repetition frequency of 100 Hz.

MARCO (Micropulse Atmospheric Optical Radar for Climate Observations) is also a compact and easily transportable Raman lidar system, developed around a high-frequency (20 kHz) laser source, capable of performing 24/7 high-resolution and accurate CO2 and water vapour mixing ratio profile measurements, together with temperature and single-wavelength (355 nm) particle backscattering/extinction/depolarization measurements. In the frame of WaLiNeAs, as a result of the restrictions imposed by the air traffic authority on the use of visible and infrared laser radiation, only the 355 nm wavelength was exploited in CONCERNING, the temperature channel was not available in MARCO, and the CO2 channels, not needed for the purposes of WaLiNeAs, were temporarily deactivated in both systems.

Both systems have been recently designed and developed, and WaLiNeAs represents their first international field deployment. CONCERNING was deployed at the University of Toulon in La Garde (Lat.: 43.136040 N, Long.: 6.011650 E, Elev.: 65 m), with continuous measurements since 29 September 2022, i.e. over more than 100 days up to now, while MARCO was deployed at the Direction de Services Techniques in Port-Saint-Louis-du-Rhône, Camargue (Lat.: 43.392570 N, Long.: 4.813480 E, Elev.: 5 m), with continuous measurements since 19 October 2022, i.e. over more than 80 days up to now. At the time of submission of this abstract, both systems are still operational, with operations scheduled to end on 31 January 2023 (to be confirmed). Preliminary results from these two systems will be illustrated and discussed during the Conference.

How to cite: Di Girolamo, P., Franco, N., Di Paolantonio, M., Summa, D., Dionisi, D., Di Bernardino, A., Iannarelli, A. M., and Di Iorio, T.: Several months of continuous operation of two thermodynamic Raman lidars in the frame of WaLiNeAs, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-7775, https://doi.org/10.5194/egusphere-egu23-7775, 2023.

EGU23-8014 | ECS | Posters on site | GI4.2

A compact general-purpose Doppler Lidar for lidar networks 

Jan Froh, Josef Höffner, Alsu Mauer, Thorben Mense, Ronald Eixmann, Gerd Baumgarten, Franz-Josef Lübken, Alexander Munk, Sarah Scheuer, Michael Strotkamp, and Bernd Jungbluth

We present the state of the VAHCOLI (Vertical and Horizontal COverage by Lidar) project for investigating small- to large-scale processes in the atmosphere. In the future, an array of compact lidars with multiple fields of view will allow for measurements of temperatures, winds and aerosols with high temporal and vertical resolution.

Doppler lidars, in particular resonance Doppler lidars with daylight capability, are challenging systems because of the small field of view, spectral filtering and other additional subsystems required compared to observations at night. We developed a universal Doppler lidar platform (~1 m³, ~500 kg) with all the technologies required for automatic operation. The system is capable of studying Mie scattering (aerosols), Rayleigh scattering (air molecules), and resonance fluorescence on free potassium atoms in the middle atmosphere from 5 km to 100 km. Unique spectral methods and narrowband optical components allow precise wind, temperature, and aerosol measurements by analysing the Doppler shift and broadening of the scattered signals. The combination of a cost-efficient design and fast assembly allows the construction of a Doppler lidar network with identical units.

We will show the latest results and discuss the next scientific and technical steps for network operation and transferring the technology into industry.

How to cite: Froh, J., Höffner, J., Mauer, A., Mense, T., Eixmann, R., Baumgarten, G., Lübken, F.-J., Munk, A., Scheuer, S., Strotkamp, M., and Jungbluth, B.: A compact general-purpose Doppler Lidar for lidar networks, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8014, https://doi.org/10.5194/egusphere-egu23-8014, 2023.

EGU23-8149 | ECS | Orals | GI4.2

Water vapor retrieval from mini Raman lidar HORUS in the framework of the WaLiNeAs campaign 

Frédéric Laly, Patrick Chazette, Julien Totems, and Jérémy Lagarrigue

The Mediterranean Rim, and more particularly the western Mediterranean area, is one of the regions most sensitive to climate change. The associated environmental changes are already evident through periods of drought and intense rainfall. The prediction of these phenomena is a major societal issue, which led us to use lidar systems to constrain regional modelling. The Raman lidars HORUS-1 and -2 each comprise two telescopes of 15 cm diameter, with N2 and H2O channels associated with each telescope. The lidar components were specifically chosen for this task and put into operation during the international Water vapor Lidar Network Assimilation (WaLiNeAs) campaign led by French research teams. Among the three stations managed by the LSCE team, two were equipped with HORUS lidar systems, at the Port Camargue (43.52 N, 4.13 E) and Coursan (43.23 N, 3.06 E) sites. The main difference between the two HORUS lidars is the laser used. HORUS-1 uses an ULTRA laser (flash-lamp-pumped, 30 mJ at 20 Hz), which has shown good reliability since the beginning of the lidar installation. However, the MERION-C laser (diode-pumped, 30 mJ at 100 Hz) installed in HORUS-2 did not live up to our expectations, with several failures, to the point of stopping the measurements in Coursan. We will nevertheless discuss the relative merits of these two lasers with a view to future Raman lidar networks. Observations available from these two lidar systems will be presented and discussed with respect to the meteorological processes encountered during their operating periods.

We gratefully acknowledge the ANR grant #ANR-20-CE04-0001 for its contribution to the WaLiNeAs program, as well as Météo-France and the CNRS-INSU national LEFE program for their financial contributions to this project. The CEA is acknowledged for the provision of its staff and facilities.

How to cite: Laly, F., Chazette, P., Totems, J., and Lagarrigue, J.: Water vapor retrieval from mini Raman lidar HORUS in the framework of the WaLiNeAs campaign, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-8149, https://doi.org/10.5194/egusphere-egu23-8149, 2023.

Since only very few suitable remote sensing measurements exist, the thermodynamic field of the atmospheric boundary layer and lower free troposphere is still largely terra incognita. To close this gap, we developed an automated thermodynamic profiler based on the Raman lidar technique, the Atmospheric Raman Temperature and Humidity Sounder (ARTHUS) (Lange et al. 2019).

It uses only the eye-safe 355 nm radiation of an injection-seeded Nd:YAG laser as transmitter. The laser power is about 20 W at 200 Hz. The diameter of the receiving telescope is 40 cm. Four receiving channels (elastic, water vapor, two rotational Raman signals) allow for four independently measured parameters: temperature (T), water vapor mixing ratio (WVMR), particle extinction coefficient, and particle backscatter coefficient.

With these data, ARTHUS resolves, e.g., the strength of the inversion layer at the atmospheric boundary layer (ABL) top, elevated lids in the free troposphere, and turbulent fluctuations. Furthermore, in combination with Doppler lidars, sensible and latent heat flux profiles in the convective ABL and thus flux-gradient relationships can be studied (Behrendt et al. 2020). Consequently, ARTHUS can be applied for process studies of land-atmosphere feedback, weather and climate monitoring, model verification, and data assimilation.

Resolutions of the measurements are a few seconds and a few meters in the lower troposphere. From the data, the statistical uncertainties of the measured parameters are also derived. Continuous operations over long periods were achieved not only at the Land Atmosphere Feedback Observatory (LAFO) at the University of Hohenheim but also during several field campaigns elsewhere, covering a large variety of atmospheric conditions.

During the EUREC4A field campaign (Stevens et al. 2021), ARTHUS was deployed onboard the research vessel Maria S. Merian between 18 January and 18 February 2020 to study ocean-atmosphere interaction. Here, ARTHUS was collocated with two Doppler lidars: one in vertically pointing mode and one in a 6-beam scanning mode.

Between 15 July and 20 September 2021, ARTHUS was deployed at the Lindenberg Observatory of the German Weather Service (DWD). The objective of the campaign was to investigate the long-term stability of ARTHUS by comparisons with four local radiosondes. Indeed, the very high accuracy during day and night was verified.

ARTHUS participated in WaLiNeAs (Water Vapor Lidar Network Assimilation) between 15 September and 10 December 2022. For this campaign, ARTHUS was deployed at the west coast of Corsica. The objective was to implement an integrated prediction tool to enhance the forecast of heavy precipitation events in southern France, primarily demonstrating the benefit of assimilating vertically resolved WVMR lidar data in the new version of the French operational AROME numerical weather prediction system.

At the conference, highlights of ARTHUS’ measurements during WaLiNeAs will be shown.

References:

Behrendt et al. 2020, https://doi.org/10.5194/amt-13-3221-2020

Lange et al. 2019, https://doi.org/10.1029/2019GL085774

Stevens et al. 2021, https://doi.org/10.5194/essd-2021-18

How to cite: Lange, D., Behrendt, A., and Wulfmeyer, V.: The Atmospheric Raman Temperature and Humidity Sounder: Highlights of Four Years of Automated Measurements of the Atmospheric Boundary Layer and Free Troposphere, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-10606, https://doi.org/10.5194/egusphere-egu23-10606, 2023.

The increasing atmospheric carbon dioxide (CO2) is the most important factor forcing climate change. However, due to the lack of large-scale, range-resolved CO2 observations, substantial uncertainty remains in the current global atmospheric CO2 budget, which hinders insight into the CO2 cycle and the modeling of its climate forcing. Space-based range-resolved differential absorption lidar (range-resolved DIAL) is a promising and powerful means of obtaining large-scale range-resolved CO2 data, but has rarely been studied. Prior to developing a spaceborne range-resolved DIAL, a preliminary study on the optimization of the on/off-line wavelengths must be performed to ensure a high signal-to-noise ratio (SNR), high sensitivity to the near-surface region, and minimal interference from atmospheric factors. This study aims to find the optimum wavelength scenarios in terms of random errors determined by the SNR, weighting functions used to assess sensitivity to the near-surface region, and systematic errors affected by atmospheric factors. First, we identify the optimal on/off-line wavelengths at 1.57 μm and 2.05 μm, which are widely used and show good results for measuring CO2 concentration, after estimating the on-line and off-line wavenumbers separately using two dedicated evaluation indexes. Furthermore, we obtain the optimum wavelength scenario for spaceborne range-resolved DIAL by comparing the random errors, systematic errors and weighting functions of the optimal on-line and off-line wavelengths at 1.57 μm and 2.05 μm. Results show that the 2.05 μm wavelength scenario is optimal for spaceborne range-resolved CO2 detection. To satisfy the requirement that the relative random errors be smaller than 0.01 (<1%), systems in the 2.05 μm wavelength scenario with vertical resolutions of 0.5 km, 0.7 km, 0.8 km and 0.9 km require on-line SNR values at 0 km height larger than 10, 9, 8 and 7, respectively.
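The range-resolved retrieval underlying this error analysis is the standard DIAL equation; the following is a minimal illustrative sketch (not the simulation model of this study) that recovers absorber number density from on/off-line signal ratios of adjacent range gates:

```python
import numpy as np

def dial_number_density(p_on, p_off, delta_sigma, delta_r):
    """Range-resolved DIAL retrieval of absorber number density [m-3].

    Standard DIAL equation for adjacent range gates r and r + delta_r:
        n = ln( P_off(r+dr)*P_on(r) / (P_off(r)*P_on(r+dr)) )
            / (2 * delta_sigma * delta_r)
    p_on, p_off: backscattered power arrays over the range gates
    delta_sigma: differential absorption cross-section [m^2]
    delta_r:     range-gate width [m]
    """
    ratio = (p_off[1:] * p_on[:-1]) / (p_off[:-1] * p_on[1:])
    return np.log(ratio) / (2.0 * delta_sigma * delta_r)
```

Since the retrieval differentiates the logarithm of a signal ratio, its random error grows with decreasing SNR and decreasing gate width, which is why the abstract's SNR requirements tighten as the vertical resolution is refined.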

How to cite: Hu, L., Yu, Z., Huang, Y., and Ma, R.: Performance Simulation of Spaceborne Range-resolved Differential Absorption Lidar System For CO2 Profile Detection At 1.57μm and 2.05μm Wavelength Scenarios, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11104, https://doi.org/10.5194/egusphere-egu23-11104, 2023.

Cirrus clouds, forming in the cold upper troposphere, are composed of ice crystals with various sizes and nonspherical shapes. They are observed at all latitudes covering over 30% of the Earth’s surface. Studies reveal that they have a significant impact on the radiation balance of our planet and, consequently, on the climate evolution. The radiative effect of cirrus clouds is strongly determined by the cloud microphysical, thermal, and optical properties. Furthermore, global aviation affects the Earth’s radiation balance by inducing contrails and exerting an indirect effect on the microphysical properties of naturally-formed cirrus clouds. In the last decades, the Arctic surface has been warming faster than other regions of the globe, which is known as Arctic Amplification. The thin and small-coverage cirrus clouds over the Arctic are presumed to largely contribute to it. Unfortunately, however, the optical and microphysical properties of cirrus clouds over the Arctic and the exact role they play in the elevated warming of the Arctic are far from understanding. Compared with the intensive studies of cirrus clouds in the tropics and midlatitude regions, cirrus cloud measurements and model studies at high latitudes are sparse. In this study, we present the comparisons of the particle linear depolarization ratio (PLDR) and occurrence rates of cirrus clouds at midlatitudes (35–60 oN; 30 oW–30 oE) and high latitudes (60–80 oN; 30 oW–30 oE) based on the analysis of lidar measurements of CALIPSO in the years 2018–2021. The results show that cirrus clouds at high latitudes appear at lower altitudes than the midlatitude cirrus clouds. The PLDR and occurrence rates of cirrus clouds at high latitudes are smaller than the midlatitude cirrus clouds. Furthermore, air traffic over Europe was significantly reduced in 2020 (starting from March) and only moderately reduced in 2021 due to the COVID-19 pandemic. 
Under these conditions, we are able to study differences in the aviation impacts on cirrus cloud properties at high latitudes and midlatitudes.

How to cite: Li, Q. and Groß, S.: CALIPSO observations of cirrus cloud properties: investigation of latitude differences and possible aviation impacts, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-11893, https://doi.org/10.5194/egusphere-egu23-11893, 2023.

EGU23-12585 | Posters on site | GI4.2

Investigation of 2021 summer wildfires in the Eastern Mediterranean: The ERATOSTHENES Centre of Excellence capabilities for atmospheric studies 

Rodanthi-Elisavet Mamouri, Dragos Ene, Holger Baars, Ronny Engelmann, Argyro Nisantzi, Maria Prodromou, Diofantos Hadjimitsis, and Albert Ansmann

In the summer of 2021, several wildfires were reported in the south of Turkey; these fires are considered among the worst in the history of Turkey. Due to atmospheric conditions, the smoke plume travelled south between 27 July and 5 August 2021, and smoke layers arrived above Cyprus.

In this work, the capabilities of the newly established ERATOSTHENES Centre of Excellence (CoE) to study large-scale atmospheric events are presented. The study is based on the synergistic use of different remote sensing datasets from both ground and space. The EARLINET multiwavelength Raman polarization lidar PollyXT-CYP hosted by the ERATOSTHENES CoE has been running continuously since October 2020 in Limassol, and during summer 2021 the lidar observed smoke plumes from these extreme wildfires on the south coast of Turkey.

The PollyXT-CYP is a key research infrastructure of the Cyprus Atmospheric Remote Sensing Observatory (CARO) of the ERATOSTHENES CoE, established through the EXCELSIOR H2020 EU Teaming project coordinated by the Cyprus University of Technology. CARO will consist of two high-tech containers housing the PollyXT-CYP lidar and state-of-the-art Doppler lidar, cloud radar, and radiometric equipment, which will be used to measure air quality, dust transport, and cloud properties over Cyprus. CARO is a planned National Facility of the Republic of Cyprus for Aerosol and Cloud Remote Sensing Observations.

Land cover information, which shows the type of burned vegetation, is used together with satellite products to additionally capture the burned area and to investigate the carbon monoxide content of the smoke plume. The study focuses on the optical characteristics of the plume as detected by the PollyXT-CYP lidar at Limassol. An intense fresh smoke layer was detected on 28–29 July 2021 at altitudes between 2.5 and 4.0 km, with a volume depolarization ratio of ~15% at 355 nm and ~20% at 532 nm, and a lidar ratio of 75–80 sr at 355 nm and 65–70 sr at 532 nm.

 

Acknowledgements

The authors acknowledge the ‘EXCELSIOR’: ERATOSTHENES: EΧcellence Research Centre for Earth Surveillance and Space-Based Monitoring of the Environment H2020 Widespread Teaming project (www.excelsior2020.eu). The ‘EXCELSIOR’ project has received funding from the European Union’s Horizon 2020 research and innovation programme under Grant Agreement No 857510, from the Government of the Republic of Cyprus through the Directorate General for the European Programmes, Coordination and Development and the Cyprus University of Technology. The PollyXT-CYP was funded by the German Federal Ministry of Education and Research (BMBF) via the PoLiCyTa project (grant no. 01LK1603A). The study is supported by “ACCEPT” project (Prot. No: LOCALDEV-0008) co-financed by the Financial Mechanism of Norway (85%) and the Republic of Cyprus (15%) in the framework of the programming period 2014 - 2021.

How to cite: Mamouri, R.-E., Ene, D., Baars, H., Engelmann, R., Nisantzi, A., Prodromou, M., Hadjimitsis, D., and Ansmann, A.: Investigation of 2021 summer wildfires in the Eastern Mediterranean: The ERATOSTHENES Centre of Excellence capabilities for atmospheric studies, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-12585, https://doi.org/10.5194/egusphere-egu23-12585, 2023.

A newly available Raman lidar (Purple Pulse Lidar Systems) for vertical profiling of atmospheric water vapor, temperature, and aerosols was evaluated during the TEAMx pre-campaign (TEAMx-PC22) in summer 2022 in the Inn Valley (Austria). TEAMx (Multi-scale transport and exchange processes in the atmosphere over mountains – programme and experiment) is an international research programme addressing exchange processes in the atmosphere over mountains and their parametrization in numerical weather and climate models. Prior to the multidisciplinary measurement campaign planned for 2024/2025, the 2022 pre-campaign was performed primarily to test (new) instruments and measurement sites and to find synergies between instruments.

The Raman lidar system is capable of continuously profiling water vapor and temperature throughout the entire planetary boundary layer (typically 3 km to 4 km agl. on summer days) with a basic temporal resolution of 10 s and a vertical resolution of 30 m to 100 m. Depending on conditions and temporal averaging, water vapor profiles could even be obtained up to ~7.5 km agl. during nighttime. The lidar system was located at the University of Innsbruck (downtown). It was operated side by side with a vertically staring Doppler wind lidar and a nearby (50 m) scanning Doppler wind lidar on the rooftop of the university building, which provide vertical profiles of the vertical wind component at a 1-s interval and vertical profiles of the three-dimensional wind vector at a 10-min interval, respectively. During the measurement period (August to September 2022), operational radiosondes were launched in close proximity, at Innsbruck Airport, roughly 3 km to the west of the lidar site. In addition to the daily ascent at 2 UTC, radiosondes were launched at about 8, 14, and 20 UTC on selected days with potentially complex meteorological conditions. We present a first assessment of the Raman lidar measurements through comparisons with the radiosonde data. Together with data from the wind lidars, we also present an interpretation of significant meteorological situations and events, such as foehn, a passing front, a thunderstorm, and the formation of a convective boundary layer during a warm period.

How to cite: Vogelmann, H., Federer, M., Speidel, J., and Gohm, A.: Assessing the performance of a Combined Water Vapor / Temperature / Aerosol Raman Lidar within the TEAMx pre-campaign in the Inn Valley (Innsbruck, Austria) during Summer 2022, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13101, https://doi.org/10.5194/egusphere-egu23-13101, 2023.

EGU23-13218 | ECS | Orals | GI4.2

ARC and ATLAS: CARS software tools for the data analysis and quality assurance of lidar measurements performed within ACTRIS 

Nikolaos Siomos, Ioannis Binietoglou, Peristera Paschou, Mariana Adam, Giuseppe D'Amico, Benedikt Gast, Moritz Haarig, and Volker Freudenthaler

We present newly developed software for the data analysis and quality assurance of lidar systems operated in the ACTRIS (Aerosol Clouds and Trace Gases Research Infrastructure) research infrastructure. The software development is coordinated by the Meteorological Institute of Munich (MIM), which operates as one of the central facilities of the Center of Aerosol Remote Sensing (CARS) of ACTRIS. In the frame of ACTRIS, a large number of national facilities (NF) are operating lidar systems for aerosol remote sensing. In order to ensure homogeneously high data quality, CARS is developing appropriate common software tools to assist data processing, system intercomparison, and routine quality assurance of lidar data. Here, we present two such software tools, developed and tested using the long experience of the EARLINET (European Aerosol Research Lidar Network) community.

The ARC (Algorithm for Rayleigh Calculations) has been designed to calculate the cross section and depolarization ratio of molecular backscattering. The effect of rotational Raman (RR) scattering is included line by line in ARC, with particular attention to the partial blocking of the RR spectrum by transmission through narrow-band interference filters. The algorithm supports calculations under variable meteorological conditions for an atmosphere consisting of up to five major gas components (N2, O2, CO2, Ar, H2O). Such a tool is needed to properly take into account the effect of air temperature on the molecular depolarization ratio measured by the NF lidar systems. It is also crucial for designing lidars that rely on RR scattering, such as temperature and RR aerosol lidars, and can even be applied for the algorithmic correction of unwanted effects introduced by the interference filters in such systems.

The second software package developed by CARS-MIM is ATLAS (AuTomated Lidar Analysis Software). It has been designed for the operational analysis of the quality assurance tests that must be regularly performed and submitted to CARS by the NFs for the ACTRIS labeling process. ATLAS currently supports the analysis of all main CARS test procedures, that is, the Rayleigh fit, the telecover test, and the polarization calibration. It can also be used to directly compare signals from two lidar systems; it has already been applied in the first intercomparison campaign of CARS reference systems, organized in September 2022 in Magurele, Romania. The software takes raw lidar data as input, so the user can detect issues in the preprocessing steps that would otherwise remain hidden. At the time of writing, ATLAS is compatible with all ACTRIS lidar systems. Future updates will include automated syncing of the system metadata from the network's handbook of instruments, currently hosted by the Single Calculus Chain (SCC), and a graphical user interface that will facilitate adoption by NF users. Both software packages are written in Python and are open-source projects.
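As a simple illustration of the kind of check the Rayleigh fit performs (a generic sketch, not code from ATLAS; the function name, the least-squares normalization, and the fit range are assumptions), the range-corrected signal can be normalized to an attenuated molecular backscatter profile over a presumed aerosol-free region and the residuals inspected:

```python
import numpy as np

def rayleigh_fit(signal, r, beta_mol_att, fit_range):
    """Normalize the range-corrected lidar signal to an attenuated
    molecular backscatter profile over an aerosol-free fit range and
    return the normalized profile and the relative residuals there."""
    rc = signal * r**2                                    # range correction
    sel = (r >= fit_range[0]) & (r <= fit_range[1])
    # least-squares scale factor between signal and molecular profile
    c = np.sum(rc[sel] * beta_mol_att[sel]) / np.sum(beta_mol_att[sel]**2)
    norm = rc / c
    resid = (norm[sel] - beta_mol_att[sel]) / beta_mol_att[sel]
    return norm, resid
```

Large or systematic residuals in the fit range would then point to misalignment, signal distortions, or residual aerosol.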

How to cite: Siomos, N., Binietoglou, I., Paschou, P., Adam, M., D'Amico, G., Gast, B., Haarig, M., and Freudenthaler, V.: ARC and ATLAS: CARS software tools for the data analysis and quality assurance of lidar measurements performed within ACTRIS, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13218, https://doi.org/10.5194/egusphere-egu23-13218, 2023.

EGU23-13416 | Orals | GI4.2

Merging clouds retrieved from ALADIN/Aeolus and CALIOP/CALIPSO spaceborne lidars 

Artem Feofilov, Hélène Chepfer, and Vincent Noël

Clouds play an important role in the energy budget of the Earth, but their behavior in response to climate change is a major source of uncertainty when it comes to predicting the future climate. To understand and accurately predict the Earth's energy budget and climate, a thorough understanding of cloud variability is necessary, including the vertical distribution and optical properties of clouds.

Satellite observations have provided ongoing monitoring of clouds around the globe. Among them, active sounders hold a special place thanks to their capability of measuring the vertical position of clouds with an accuracy of about 100 m and with a typical horizontal sampling on the order of hundreds of meters. However, the clouds retrieved from two different spaceborne lidars differ, because the instruments use different wavelengths, pulse energies, pulse repetition frequencies, telescopes, and detectors. In addition, they do not sample the atmosphere at the same local time.

In this work, we discuss an approach to merging the clouds retrieved from the spaceborne lidar ALADIN/Aeolus, which has been orbiting the Earth since August 2018 and operates at a wavelength of 355 nm, with the clouds measured since 2006 by the CALIPSO lidar, which operates at 532 nm.

We demonstrate how to compensate for the existing instrumental differences to get an almost comparable cloud dataset and we discuss the importance of the aforementioned differences between the instruments. The method developed in this study sets the path for adding future lidars (e.g. ATLID/EarthCare) to the global climate lidar cloud record.

How to cite: Feofilov, A., Chepfer, H., and Noël, V.: Merging clouds retrieved from ALADIN/Aeolus and CALIOP/CALIPSO spaceborne lidars, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13416, https://doi.org/10.5194/egusphere-egu23-13416, 2023.

EGU23-13419 | ECS | Orals | GI4.2

Columnar heating rate and radiative effects of dust aerosols using 20 years of lidar observations 

Benedetto De Rosa, Lucia Mona, Simone Lolli, Aldo Amodeo, and Michalis Mytilinaios

The uncertainties in the Earth–atmosphere energy budget are associated with a poor understanding of direct and indirect aerosol effects. Dust is a mixture of different minerals, and its chemical and microphysical properties change during transport; the influence of dust aerosols on radiative effects is therefore characterized by great uncertainty. Due to meteorological atmospheric patterns, dust intrusions are very frequent in the Mediterranean, a climatic hot spot where climate change is much stronger than in other parts of the world. In this study, we analyzed and assessed long-term trends of the surface and columnar heating rate and the radiative effects of dust aerosols using lidar observations. These measurements were taken in the framework of the European Aerosol Research Lidar Network (EARLINET) at the Istituto di Metodologie per l'Analisi Ambientale (IMAA) with the Raman/elastic lidar MUSA (40°36′N, 15°44′E). The Fu–Liou–Gu (FLG) radiative transfer model was used to compute aerosol (cloud-free) radiative fluxes, with aerosol extinction coefficient profiles from the lidar observations as input data. All cases of dust intrusion that occurred in the last twenty years were selected to understand how they affected the Earth–atmosphere radiative budget, both at the surface and at the top of the atmosphere. In the future, these studies will be important for improving the accuracy of climate predictions.

How to cite: De Rosa, B., Mona, L., Lolli, S., Amodeo, A., and Mytilinaios, M.: Columnar heating rate and radiative effects of dust aerosols using 20 years of lidar observations, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13419, https://doi.org/10.5194/egusphere-egu23-13419, 2023.

EGU23-13643 | ECS | Orals | GI4.2

Is your aerosol backscatter retrieval afflicted by a sign error? 

Johannes Speidel and Hannes Vogelmann

Precise knowledge of the prevailing aerosol content of the atmosphere is important for several reasons: aerosols are involved in multiple important processes that not only have a direct impact on air quality but also influence cloud formation and the Earth's radiation budget. Besides that, continuous aerosol observations provide valuable information on atmospheric transport dynamics.
Aerosol backscatter coefficient measurements with elastic backscatter lidars have been conducted for several decades [1], and the implemented retrieval algorithms predominantly refer to the seminal publications by Klett (1985), Fernald (1984), and Sasano (1985) [2,3,4]. The respective inversion algorithm is often simply called the 'Klett inversion', which is a main reason why it is so widely adopted. While more sophisticated aerosol lidars (e.g., Raman lidars, HSRL) have since been developed, simple elastic backscatter lidar measurements are still conducted very frequently, often as a byproduct, as they are technically easy to implement. In most cases, the corresponding retrieval algorithms still refer to the 'Klett inversion'.
Unfortunately, the inversion algorithm of Klett (1985) is afflicted by a sign error. In his publication, the sign error is hidden within a substitution, making it very hard to recognize and representing a major pitfall. A comprehensive literature review revealed that large parts of the aerosol lidar community are aware of this problem and have tacitly corrected it or, to a much smaller extent, have referred to the erratum published by Kaestner in 1986 [5].
However, up to this date, considerable error propagation can also be found in the literature, with studies using and referring to the incorrect algorithm with the sign error included.
Therefore, we want to renew awareness of this sign error and present a corrected and slightly improved Klett inversion algorithm. In addition, we illustrate the overall implications of using the uncorrected inversion algorithm with exemplary case studies. Depending on the lidar location and prevailing atmospheric conditions, the potential errors range from marginal to major, often preventing error detection based solely on the magnitude of the calculated results. Simple a posteriori corrections are not possible, as the error magnitude depends on multiple factors.
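For context, a minimal sketch of the single-component backward inversion is shown below, in the form introduced in Klett's earlier 1981 paper, i.e. without the variable backscatter/extinction ratio treatment of Klett 1985 discussed in this abstract; function and variable names are illustrative:

```python
import numpy as np

def klett_backward(P, r, alpha_ref):
    """Single-component backward Klett inversion with far-end reference.

    P: raw lidar signal, r: range [m] (increasing), alpha_ref: assumed
    extinction at the last range bin. Returns the extinction profile.
    """
    S = np.log(P * r**2)                    # log range-corrected signal
    expo = np.exp(S - S[-1])
    # cumulative trapezoidal integral of expo from r[0] to r[i]
    seg = 0.5 * (expo[1:] + expo[:-1]) * np.diff(r)
    cum = np.concatenate(([0.0], np.cumsum(seg)))
    # backward solution; the signs in the denominator are exactly where
    # a hidden substitution can silently flip the result
    return expo / (1.0 / alpha_ref + 2.0 * (cum[-1] - cum))
```

With a synthetic homogeneous atmosphere, P(r) proportional to exp(-2*alpha*r)/r^2, this form recovers the constant extinction alpha.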

[1] T. Trickl, H. Giehl, H. Jäger, and H. Vogelmann. 35 yr of stratospheric aerosol measurements at Garmisch-Partenkirchen: From Fuego to Eyjafjallajökull, and beyond. Atmospheric Chemistry and Physics, 13(10):5205–5225, 2013.
[2] James D. Klett. Lidar inversion with variable backscatter/extinction ratios. Appl. Opt., 24(11):1638–1643, June 1985.
[3] Frederick G. Fernald. Analysis of atmospheric lidar observations: Some comments. Appl. Opt., 1984.
[4] Yasuhiro Sasano, Edward V. Browell, and Syed Ismail. Error caused by using a constant extinction/backscattering ratio in the lidar solution. Appl. Opt., 24(22):3929–3932, November 1985.
[5] Martina Kaestner. Lidar inversion with variable backscatter/extinction ratios: Comment. Appl. Opt., 25(6):833–835, March 1986.

How to cite: Speidel, J. and Vogelmann, H.: Is your aerosol backscatter retrieval afflicted by a sign error?, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13643, https://doi.org/10.5194/egusphere-egu23-13643, 2023.

EGU23-13823 | ECS | Posters on site | GI4.2

Development of a Carbon Dioxide Raman Lidar 

Moritz Schumacher, Andreas Behrendt, Diego Lange, and Volker Wulfmeyer

Carbon dioxide (CO2) is one of the most important greenhouse gases, and its detailed measurement is therefore of high interest. As the concentration varies significantly with altitude and time, it is desirable to measure vertical CO2 profiles with high temporal resolution. High-resolution profiles will improve our understanding of atmospheric systems and of the impact of the local environment, e.g., of natural and anthropogenic sources and sinks. The use of these data in data assimilation offers the potential to improve climate models.

For water vapor and temperature, the Atmospheric Raman Temperature and Humidity Sounder (ARTHUS) has proven able to provide profiles with high resolution (10–60 s in time and 7.5–100 m vertically) and accuracy in the lower troposphere. This successful system will now be extended with a CO2 Raman channel, which is currently under development. After successful integration, it will be possible to measure CO2, water vapor, and temperature profiles simultaneously. Challenges are the weak backscattered signal due to the low concentration of CO2 and its small Raman backscatter cross section.

Further information on the CO2 Raman lidar will be given at the conference.

How to cite: Schumacher, M., Behrendt, A., Lange, D., and Wulfmeyer, V.: Development of a Carbon Dioxide Raman Lidar, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-13823, https://doi.org/10.5194/egusphere-egu23-13823, 2023.

Extreme heavy precipitation events (HPEs) pose a threat to human life but, despite regular improvements, remain difficult to predict because of the lack of adequate high-frequency, high-resolution water vapor (WV) observations in the lower troposphere (below 3 km). To fill this observational gap, the Water vapor Lidar Network Assimilation (WaLiNeAs) initiative aims at implementing an integrated prediction tool (IPT) coupling network measurements of WV profiles with a numerical weather prediction system, in order to improve forecasts of the amount, timing, and location of rainfall associated with HPEs in southern France (struck by ~7 HPEs per year on average during the fall).

In the fall/winter of 2022–2023, a network of six mobile Raman WV lidars was implemented in southern France (Aude, Gard, Var, and Bouches-du-Rhône) and in Corsica. The network was complemented by two fixed Raman WV lidars in Barcelona and Valencia, with the aim of providing measurements with high vertical resolution and accuracy to be assimilated into the French Application of Research to Operations at Mesoscale (AROME-France) model, using a four-dimensional ensemble-variational approach with 15-min updates, in addition to the operationally assimilated observations (radar, satellites, …). This innovative IPT is expected to enhance the model's capability for kilometer-scale prediction of HPEs over southern France up to 48 h in advance.

The field campaign was conducted from October 2022 to January 2023 to cover the period most propitious to heavy precipitation events in southern France. A consortium of French, German, Italian, and Spanish research groups operated the Raman WV lidar network.

In this presentation, we will provide an overview of the precipitation events in southern France during the WaLiNeAs campaign, as well as an outline of the operation periods of the different Raman WV lidars and of the lidar data monitoring procedure implemented during the experiment. We will highlight the cases of interest and provide an outlook on the next steps towards lidar data assimilation in AROME.

How to cite: Flamant, C. and the WaLiNeAs Team: A network of water vapor Raman lidars for improving heavy precipitation forecasting in southern France: introducing the WaLiNeAs initiative and first highlights from the 2022 field campaign, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-14747, https://doi.org/10.5194/egusphere-egu23-14747, 2023.

Each year, during both boreal winter and summer, large amounts of Saharan mineral dust particles are carried westwards over the Atlantic Ocean towards the Caribbean. During their transport, Saharan dust particles can affect the Earth's radiation budget in different ways. They can either directly scatter, absorb, and emit radiation or have an indirect effect by modifying cloud properties through their interactions as cloud condensation nuclei or ice-nucleating particles. While during the summer months – the peak season of transatlantic mineral dust transport – the particles are mostly advected in elevated Saharan Air Layers at altitudes of up to 6 km and at latitudes around 15°N, wintertime transport takes place at lower atmospheric levels (<3 km altitude) and lower latitudes. Our recent studies have shown that, during both boreal winter and summer, transported Saharan dust layers are characterized by enhanced concentrations of water vapor compared to the surrounding atmosphere. In this way, the dust layers have the potential to modify the radiation budget not only through particle-radiation interactions, but also through the absorption and emission of radiation by water vapor. This in turn may affect the atmospheric stability and stratification in and around the aerosol layers.

In this study, the turbulent structure as well as the atmospheric stability in and around transported Saharan mineral dust layers is analyzed, and possible differences between summer and wintertime are investigated. To this end, measurements by the water vapor and aerosol lidar WALES as well as by dropsondes are studied. They were collected upstream of the Caribbean island of Barbados aboard the German research aircraft HALO (High Altitude and Long Range). To identify possible seasonal differences, not only data collected in boreal summer in the framework of the NARVAL-II campaign (August 2016), but also data collected in winter during the EUREC4A research campaign (January and February 2020) are analyzed. During both campaigns, several research flights were designed to lead over long-range-transported Saharan mineral dust, thus allowing an in-depth investigation of its properties. The analysis shows that the dust layers are highly turbulent, which helps dust particles stay airborne for a longer time. Additionally, the dust layers modify the atmospheric stability in a way that can affect the evolution of marine clouds.

In our presentation, we will give an overview of the measurements performed over long-range-transported Saharan dust layers and present the analyses of atmospheric stability and turbulence from the dropsonde measurements, together with power spectra calculated from the lidar data.

How to cite: Gutleben, M. and Groß, S.: Atmospheric turbulence and stability in and around long-range-transported Saharan dust layers as observed by airborne lidar and dropsondes, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15538, https://doi.org/10.5194/egusphere-egu23-15538, 2023.

EGU23-15605 | Posters on site | GI4.2

How good are temperature and humidity measurements with lidar? 

Andreas Behrendt, Diego Lange, and Volker Wulfmeyer

In this contribution, we will discuss the performance of state-of-the-art automatic temperature and humidity lidars (e.g., Wulfmeyer and Behrendt 2022). As an example, we will investigate ARTHUS (Lange et al., 2019), a lidar system developed at the University of Hohenheim. This automatic mobile instrument has participated in a number of field campaigns in recent years.

The technical configuration of ARTHUS is as follows: a strong diode-pumped Nd:YAG laser is used as the transmitter. It produces 200 Hz laser pulses with up to 20 W average power at 355 nm. Only this UV light is sent, after beam expansion, into the atmosphere, so that the system remains eye-safe. The atmospheric backscatter signals are collected with a 40 cm telescope. A polychromator extracts the elastic backscatter signal and three inelastic signals, namely the vibrational Raman signal of water vapor and two pure rotational Raman signals. The detection resolution of these backscatter signals is 1 to 10 s and 3.75 to 7.5 m. All four signals are simultaneously analyzed and stored in both photon-counting (PC) mode and voltage (so-called "analog") mode in order to make optimum use of the large intensity range of the backscatter signals, which covers several orders of magnitude.

From these eight primary signals measured by ARTHUS, four independent atmospheric parameters are calculated by merging the PC and analog signals: temperature, water vapor mixing ratio, particle backscatter coefficient, and particle extinction coefficient. The temporal resolution of these data is also 1 to 10 s, allowing studies of boundary layer turbulence (Behrendt et al., 2015) and, in combination with a vertically pointing Doppler lidar, of sensible and latent heat fluxes (Behrendt et al., 2020).

From the measured number of photon counts in each range bin, the statistical uncertainty of the measured data due to so-called shot noise can be calculated directly. This value, however, while accounting for the major part of the uncertainty, does not cover the total uncertainty, because the additional noise of the analog signals is not included. Thus, the shot-noise uncertainty alone underestimates the uncertainties in the near range, where the analog data are used. To address this problem, higher-order analyses of the turbulent fluctuations can be performed, which allow the total statistical uncertainty of the measurements to be determined (Behrendt et al., 2020).
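The shot-noise part of this uncertainty budget follows directly from Poisson statistics; as a minimal sketch (function names are illustrative, and the ratio case stands in for any two-channel quantity, such as a rotational Raman signal ratio):

```python
import numpy as np

def shot_noise_rel(counts):
    """Relative 1-sigma shot-noise uncertainty of Poisson photon counts:
    sigma_N / N = sqrt(N) / N = 1 / sqrt(N)."""
    return 1.0 / np.sqrt(np.asarray(counts, dtype=float))

def ratio_rel(n1, n2):
    """Relative uncertainty of the ratio of two independent Poisson
    counts: relative errors add in quadrature."""
    return np.sqrt(1.0 / n1 + 1.0 / n2)
```

As the abstract notes, this covers only the counting statistics; the additional analog-channel noise in the near range requires the higher-order turbulence analyses instead.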

Finally, to investigate the stability of the calibration and thus the accuracy of the measured data, we compare averaged ARTHUS data with local radiosondes. In order to cope with the unavoidable sampling of different air masses by these different instruments, we investigate the average of a larger number of profiles. We found that the performance of ARTHUS even meets the stringent requirements of the WMO.

The results will be presented at the conference.

 

References:

Behrendt et al. 2015, https://doi.org/10.5194/acp-15-5485-2015

Behrendt et al. 2020, https://doi.org/10.5194/amt-13-3221-2020

Lange et al. 2019, https://doi.org/10.1029/2019GL085774

Wulfmeyer and Behrendt 2022, https://doi.org/10.1007/978-3-030-52171-4_25

How to cite: Behrendt, A., Lange, D., and Wulfmeyer, V.: How good are temperature and humidity measurements with lidar?, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15605, https://doi.org/10.5194/egusphere-egu23-15605, 2023.

EGU23-15696 | Posters on site | GI4.2

A new filtering approach for multiple Doppler Lidar setups 

Kevin Wolz, Christopher Holst, Frank Beyrich, and Matthias Mauder

We compare the wind measurements of a virtual-tower triple Doppler lidar setup with those of a sonic anemometer located at a height of 90 m above ground on an instrumented tower and with those of a single Doppler lidar. The instruments were set up at the boundary-layer field site of the German Meteorological Service (DWD) in July and August 2020 during the FESST@MOL (Field Experiment on sub-mesoscale spatio-temporal variability at the Meteorological Observatory Lindenberg) 2020 campaign. The triple-lidar setup was operated in stare and step/stare modes at six heights between 90 and 500 m above ground, while the single lidar was operated in a continuous-scan velocity-azimuth display (VAD) mode with an azimuthal resolution of around 1.5° and a zenith angle of 55.5°. Overall, both lidar methods showed good agreement with the sonic anemometer over the whole study period for different averaging times and scan modes. Additionally, we developed and present a new filtering approach for the virtual-tower setup, based on a median absolute deviation (MAD) filter, and compare it to a filtering approach based on a signal-to-noise ratio (SNR) threshold. The advantage of the MAD filter is that it is not based on a fixed threshold but on the MAD of each 30-second period and can therefore better adapt to changing atmospheric conditions. In the comparison, the MAD filter leads to greater data availability while maintaining similar comparability and bias values between the triple-lidar and sonic-anemometer setups. Our results also show that a single Doppler lidar is a viable method for measuring wind speed and direction with only small disadvantages, at least for measurement heights similar to those of our investigation and for comparable heterogeneous but flat landscapes.
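The abstract does not specify the threshold factor or the exact window statistic, but a generic per-window MAD outlier test of radial velocities, using the common Gaussian consistency factor 1.4826, might look like:

```python
import numpy as np

def mad_filter(v, k=3.0):
    """Keep samples of a 30-s window of radial velocities that lie
    within k scaled MADs of the window median; the factor 1.4826 makes
    the MAD a consistent estimator of sigma for Gaussian noise."""
    v = np.asarray(v, dtype=float)
    med = np.median(v)
    mad = np.median(np.abs(v - med))
    # if mad == 0 (constant window), only exact ties with the median pass
    return np.abs(v - med) <= k * 1.4826 * mad
```

Because the acceptance band scales with the scatter of each window rather than with a fixed SNR cutoff, quiet and turbulent periods are screened with comparable selectivity.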

How to cite: Wolz, K., Holst, C., Beyrich, F., and Mauder, M.: A new filtering approach for multiple Doppler Lidar setups, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15696, https://doi.org/10.5194/egusphere-egu23-15696, 2023.

EGU23-15942 | ECS | Posters on site | GI4.2

Study of the Atmospheric Boundary Layer and Land-Atmosphere Interaction with Lidars 

Syed Abbas, Andreas Behrendt, Florian Späth, Diego Lange, Osama Alnayef, and Volker Wulfmeyer

Investigating the dynamics of the atmospheric boundary layer (ABL) is essential for studies of air quality and of the energy and water cycles, and for the improvement of weather and climate models. During daytime in convective conditions, the convective boundary layer (CBL) forms. Here, we present our approach to continuously studying CBL characteristics with an improved algorithm that includes fuzzy logic. The Land-Atmosphere Feedback Observatory (LAFO) of the University of Hohenheim consists of two Doppler lidars, a Doppler cloud radar, the Atmospheric Raman Temperature and Humidity Sounder (ARTHUS), and eddy-covariance stations. These are excellent tools for observing high-resolution atmospheric wind profiles, clouds and precipitation events, as well as thermodynamic profiles and surface fluxes. The data are collected at LAFO by continuously operating the two Doppler lidars, one in vertical-staring and one in six-beam scanning mode, to obtain vertical and horizontal wind profiles. Both Doppler lidars are operated with resolutions of 1 s and 30 m. The six-beam scanning Doppler lidar is used to obtain time series of turbulent kinetic energy (TKE), momentum flux, TKE dissipation rate, and horizontal wind profile statistics. The vertically staring Doppler lidar is used to compute statistics of higher-order moments of the vertical wind fluctuations, the CBL height, and the cloud base height. With these data, land-atmosphere coupling processes and the associated nonlinear feedbacks are investigated, as well as their impact on the turbulent structure of the CBL.

We will present analyses of two three-month periods covering different weather conditions: 1 May to 31 July in 2021 and in 2022.

How to cite: Abbas, S., Behrendt, A., Späth, F., Lange, D., Alnayef, O., and Wulfmeyer, V.: Study of the Atmospheric Boundary Layer and Land-Atmosphere Interaction with Lidars, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-15942, https://doi.org/10.5194/egusphere-egu23-15942, 2023.

EGU23-16149 | ECS | Orals | GI4.2

Performance Simulation and Preliminary Measurements of a Raman Lidar for the Retrieval of CO2 Atmospheric Profiles 

Marco Di Paolantonio, Paolo Di Girolamo, Davide Dionisi, Annalisa Di Bernardino, Tatiana Di Iorio, Noemi Franco, Giovanni Giuliano, Anna Maria Iannarelli, Gian Luigi Liberti, and Donato Summa

Within the frame of the project CONCERNING (COmpact RamaN lidar for Atmospheric CO2 and ThERmodyNamic ProfilING), we investigated the feasibility and the limits of a ground-based Raman lidar system dedicated to the measurement of CO2 profiles. The performance of the lidar system was evaluated through a set of numerical simulations. The possibility of exploiting both CO2 Raman lines of the ν1:2ν2 resonance was explored. An accurate quantification of the contribution of the Raman O2 lines to the signal and of other disturbance sources (e.g., aerosol, absorbing gases) was carried out. We evaluated the signal integration over the vertical and over time required to reach, both in day-time and night-time, the signal-to-noise ratio needed for a quantitative analysis of carbon dioxide sources and sinks. The above objectives were achieved by developing an instrument simulator consisting of a radiative transfer model able to simulate, in a spectrally resolved manner, all laser light interaction mechanisms with atmospheric constituents, a consistent background signal, and all the devices present in the considered Raman lidar experimental setup. The results indicate that the simulated lidar system, provided it has a low overlap height, could measure lower-tropospheric (<1 km) gradients (1-5 ppm) with sufficient precision both in day-time and night-time, with an integration time of 1-3 h and a vertical resolution of 75 m. The selected Raman lidar setup is currently being tested and we aim to present preliminary results during the conference.
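
The trade-off between integration time and precision mentioned above can be sketched under the usual shot-noise-limited assumption, in which the SNR of an accumulated photon-counting signal grows with the square root of the number of laser shots. The numbers in the example are purely illustrative and are not system specifications from the abstract:

```python
def integration_time_for_snr(snr_target, snr_single_shot, rep_rate_hz):
    """Integration time (s) needed to reach snr_target, assuming the SNR of
    an accumulated photon-counting Raman signal scales as sqrt(n_shots):
        SNR(n) = snr_single_shot * sqrt(n)  =>  n = (snr_target / snr_single_shot)**2
    """
    n_shots = (snr_target / snr_single_shot) ** 2
    return n_shots / rep_rate_hz

# Illustrative numbers only: a single-shot SNR of 0.05, a target SNR of 50
# and a 100 Hz laser require (50 / 0.05)**2 = 1e6 shots, i.e. 10000 s (~2.8 h),
# of the same order as the 1-3 h integration times quoted above.
```

Real performance simulations also fold in solar background, detector noise and overlap, which is what the full instrument simulator described in the abstract does.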

How to cite: Di Paolantonio, M., Di Girolamo, P., Dionisi, D., Di Bernardino, A., Di Iorio, T., Franco, N., Giuliano, G., Iannarelli, A. M., Liberti, G. L., and Summa, D.: Performance Simulation and Preliminary Measurements of a Raman Lidar for the Retrieval of CO2 Atmospheric Profiles, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16149, https://doi.org/10.5194/egusphere-egu23-16149, 2023.

EGU23-16192 | ECS | Posters on site | GI4.2

Investigation of Boundary Layer Aerosol Processes with Turbulence-Resolving Lidar 

Osama Alnayef, Andreas Behrendt, Diego Lange, Florian Späth, Volker Wulfmeyer, and Syed Abbas

Our research focuses on the vertical transport of aerosol particles and on the properties of these aerosol particles as a function of relative humidity. For this, we use the synergy of Raman and Doppler lidar systems operated during the Land-Atmosphere Feedback Experiment (LAFE) (see https://www.arm.gov/research/campaigns/sgp2017lafe).

We will present our first results on the aerosol flux. For this, we use the aerosol backscatter coefficient and the vertical wind velocity collected with the Raman lidar and the Doppler lidar, respectively.

The LAFE project was carried out at the Southern Great Plains (SGP) site of the Atmospheric Radiation Measurement (ARM) program in the USA in August 2017. In addition, data collected at the Land-Atmosphere Feedback Observatory (LAFO) at the University of Hohenheim, Germany, are used. Combining the aerosol backscatter measurements with water-vapor and temperature lidar measurements will provide detailed insights into the relative humidity dependence of aerosol growth.

How to cite: Alnayef, O., Behrendt, A., Lange, D., Späth, F., Wulfmeyer, V., and Abbas, S.: Investigation of Boundary Layer Aerosol Processes with Turbulence-Resolving Lidar, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16192, https://doi.org/10.5194/egusphere-egu23-16192, 2023.

EGU23-16695 | ECS | Orals | GI4.2

Preliminary Studies and Performance Simulations in support of the mission “CALIGOLA” 

Noemi Franco, Paolo Di Girolamo, Donato Summa, Marco Di Paolantonio, and Davide Dionisi

CALIGOLA (Cloud Aerosol Lidar for Global Scale Observations of the Ocean-Land-Atmosphere System) is a mission funded by the Italian Space Agency (ASI), aimed at the development of a space-borne Raman lidar. A Phase A study to assess the technological feasibility of the laser source and receiver system is currently underway at Leonardo S.p.A., while scientific studies in support of the mission are conducted by the University of Basilicata. Scientific and technical studies are furthermore supported by other Italian institutions (CNR-ISMAR, CNR-IMAA), with NASA also having expressed an interest in contributing to the mission.

Mission objectives include the observation of the Earth's atmosphere and surface (ocean and land). Atmospheric objectives include the characterization of the global-scale distribution of natural and anthropogenic aerosols and of their radiative properties and interactions with clouds; oceanic objectives include measurements of ocean color, suspended particulate matter and marine chlorophyll.

The expected performance of CALIGOLA has been assessed by applying an end-to-end lidar simulator. Specifically, sensitivity studies have been carried out to define the technical specifications for the laser source, the telescope, the transceiver optics, the detectors and the acquisition system. Simulations reveal that the system can measure rotational Raman echoes from nitrogen and oxygen molecules stimulated at the three wavelengths of 355, 532 and 1064 nm. Simulations also reveal that the elastic signals are strong enough to meet the requirements under different environmental conditions. Several options have been considered for the reference signal, among them a temperature-insensitive rotational Raman signal including rotational lines from nitrogen and oxygen molecules.

A careful analysis of different potential orbits is ongoing, with the goal of identifying solutions that maximize the performance and scientific impact of both atmospheric and oceanic measurements. Near noon-midnight equatorial crossing times are preferable on the ocean side for diel vertical migration and phytoplankton observations, but they significantly degrade the performance of atmospheric measurements due to the high solar background. For this reason it is essential to find an orbit in which the solar contribution is low enough to obtain acceptable atmospheric results while, at the same time, the oceanic measurements are far enough from the night-day transitions for as many days a year as possible to assure a correct interpretation of phytoplankton physiology. To counterbalance the degraded signal performance, lower orbit heights are also considered, as well as the use of polarized filters to reduce the amount of solar radiation. The estimated performance under different conditions and for different orbits will be shown during the presentation.

How to cite: Franco, N., Di Girolamo, P., Summa, D., Di Paolantonio, M., and Dionisi, D.: Preliminary Studies and Performance Simulations in support of the mission “CALIGOLA”, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-16695, https://doi.org/10.5194/egusphere-egu23-16695, 2023.

EGU23-370 | PICO | NH9.9

A climate based dengue early warning system for Pune, India 

Sophia Yacob and Roxy Mathew Koll

Dengue incidence has grown dramatically in recent decades, with about half of the world's population now at risk. Climate plays a significant role in the incidence of dengue. However, the climate-dengue association needs to be clearly understood at regional levels due to the high spatial variability in weather conditions and the non-linear relationship between climate and dengue. The current study evaluates the impacts of weather on dengue mortality in the Pune district of India for a 15-year period, from 2001 to 2015. To effectively resolve the complexity involved in the weather-dengue association, a new dengue metric is defined that includes temperature, relative humidity, and rainfall-dependent variables such as the intraseasonal variability of the monsoon (wet and dry spells), wet-week counts, flushing events, and weekly cumulative rains. We find that high dengue mortality years in Pune are comparatively dry, with fewer monsoon rains and flush events (rainfall > 150 mm), but they have more wet weeks and optimal humid days (days with relative humidity between 60–78%) than low dengue mortality years. These years also do not have heavy rains during the early monsoon days of June, and the temperatures mostly range between 27–35°C during the summer monsoon season (June–September). Further, our analysis shows that dengue mortality over Pune occurs with a 2–5 month lag following the occurrence of favourable climatic conditions. Based on these weather-dengue associations, an early warning prediction model is built using the machine learning algorithm random forest regression. It provides a reasonable forecast accuracy with root mean square error (RMSE) = 1.01. To assess the future of dengue mortality over Pune under a global warming scenario, the dengue model is used in conjunction with climate change simulations from the Coupled Model Intercomparison Project phase 6 (CMIP6).
Future projections show that dengue mortality over Pune will increase by up to 86 percent (relative to the reference period 1980–2014) by the end of the 21st century under the high emission scenario SSP5-8.5, primarily due to an increase in mean temperature (3°C increase relative to the reference period). The projected increase in dengue mortality due to climate change is a serious concern that necessitates effective prevention strategies and policy-making to control the disease spread.
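
The lagged-predictor construction underlying such a model, and the RMSE used to score it, can be sketched as follows. The lag set and the toy series are hypothetical; any regressor (e.g. a random forest) could then be trained on the resulting feature matrix:

```python
import numpy as np

def lagged_features(series, lags):
    """Build a predictor matrix whose row t holds series[t - k] for each
    lag k (in months). Rows where any lag would run off the front of the
    series are dropped, so the matrix has len(series) - max(lags) rows.
    """
    series = np.asarray(series)
    max_lag = max(lags)
    cols = [series[max_lag - k : len(series) - k] for k in lags]
    return np.column_stack(cols)

def rmse(y_true, y_pred):
    """Root mean square error, the skill score quoted in the abstract."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
```

With monthly climate predictors lagged by 2–5 months, `lagged_features(climate_series, [2, 3, 4, 5])` yields one feature column per lag, aligned with the mortality series it is meant to predict.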

How to cite: Yacob, S. and Mathew Koll, R.: A climate based dengue early warning system for Pune, India, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-370, https://doi.org/10.5194/egusphere-egu23-370, 2023.

EGU23-570 | ECS | PICO | NH9.9

Human health as an indicator of climate change. 

Moiz Usmani, Kyle Brumfield, Yusuf Jamal, Mayank Gangwar, Rita Colwell, and Antarpreet Jutla

The association of climatic conditions with human health outcomes has long been recognized; however, the understanding of how climate affects infectious agents in disease transmission is still evolving. Climate change alters regional weather, impacting the emergence, distribution, and prevalence of infectious (vector-, water- or air-borne) diseases. Over the last few decades, the world has experienced an apparent increase in the emergence and re-emergence of infectious diseases, such as Middle East respiratory syndrome coronavirus (MERS-CoV), severe acute respiratory syndrome coronavirus (SARS-CoV), Ebola virus, Zika virus, and recently SARS-CoV-2. With many health agencies recommending handwashing, clean water access, and household cleaning as prevention measures, the threat to water security looms over the world population, resulting in a significant public health burden when infectious diseases emerge. Under-resourced regions that lack adequate water supplies are on the verge of an enormous additional burden from such outbreaks. Thus, studying the anthropogenic and naturogenic factors involved in the emergence of infectious diseases is crucial to managing and mitigating inequalities. This study aims to determine the impacts of climate variability on infectious diseases, namely water-, air-, and vector-borne diseases, and their association with the distribution and transmission of infectious agents. We also discuss the advancement of built infrastructure globally and its role as a mitigation or adaptation tool when coupled with an early warning system. Our study will therefore provide a climate-based platform to adapt to and mitigate the impact of climatic variability on the transmission of infectious diseases and on water insecurity.

How to cite: Usmani, M., Brumfield, K., Jamal, Y., Gangwar, M., Colwell, R., and Jutla, A.: Human health as an indicator of climate change., EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-570, https://doi.org/10.5194/egusphere-egu23-570, 2023.

EGU23-593 | ECS | PICO | NH9.9

Variability and the odds of Total and Pathogenic Vibrio abundance in Chesapeake Bay 

Mayank Gangwar, Kyle Brumfield, Moiz Usmani, Yusuf Jamal, Antar Jutla, Anwar Huq, and Rita Colwell

Vibrio spp. are typically found in salty waters and are indigenous to coastal environments. V. vulnificus and V. parahaemolyticus frequently cause food-borne and non-food-borne infections in the United States. Vibrio spp. are sensitive to changes in environmental conditions, and various studies exploring their relationship with the environment have identified water temperature as the strongest environmental predictor, with salinity also affecting abundance in some cases. It is unclear how additional environmental factors affect the intra-seasonal variance as well as the seasonal cycle. This study investigated the intra-seasonal variations in total and pathogenic V. parahaemolyticus and V. vulnificus in oysters and surrounding waters from 2009 to 2012 at a few locations in the Chesapeake Bay. V. vulnificus is always pathogenic, but greater sample-to-sample variability was observed in pathogenic V. parahaemolyticus than in total V. parahaemolyticus. To determine the increase in the likelihood of Vibrio presence when the value of a certain environmental parameter changes, the odds ratio is examined for various values of the environmental factors. The odds ratio we employed compares the odds that the outcome of interest occurs in samples with Vibrio present to the odds that it occurs in samples without Vibrio. This technique gives the threshold value of the environmental variable above which the likelihood of Vibrio spp. presence increases drastically. With changing climate and environmental conditions, Vibrio poses increasing risks to human health. The findings of this study demonstrate the effectiveness of the odds ratio technique in estimating the likelihood that Vibrio abundance will increase when environmental conditions change, which can then be incorporated into prediction models to reduce the danger to public health.
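
A minimal sketch of such an odds-ratio threshold scan is given below. The function names and toy data are hypothetical, and a real analysis would add confidence intervals and guard against empty contingency-table cells:

```python
def odds_ratio(detected, above):
    """Odds ratio for detection in samples where an environmental variable
    exceeds a threshold versus samples where it does not, from the 2x2 table
        a: detected & above      b: detected & below
        c: not detected & above  d: not detected & below
    OR = (a/c) / (b/d) = (a*d) / (b*c).
    """
    a = sum(d and e for d, e in zip(detected, above))
    b = sum(d and not e for d, e in zip(detected, above))
    c = sum(not d and e for d, e in zip(detected, above))
    d_ = sum(not d and not e for d, e in zip(detected, above))
    return (a * d_) / (b * c)

def threshold_scan(values, detected, thresholds):
    """Odds ratio as a function of a candidate threshold on the variable."""
    return {t: odds_ratio(detected, [v > t for v in values]) for t in thresholds}
```

Scanning `thresholds` over, say, water temperature and looking for the value where the odds ratio jumps is one way to locate the threshold above which Vibrio presence becomes much more likely.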

How to cite: Gangwar, M., Brumfield, K., Usmani, M., Jamal, Y., Jutla, A., Huq, A., and Colwell, R.: Variability and the odds of Total and Pathogenic Vibrio abundance in Chesapeake Bay, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-593, https://doi.org/10.5194/egusphere-egu23-593, 2023.

EGU23-2467 | PICO | NH9.9

On the Understanding of the transmission route tied to Reported Influenza/Influenza-Like Illness Activity 

T. DeFelice

Some studies suggest atmospheric particulate matter with diameters of 2.5 micron and smaller (PM2.5) may play a role in the transmission of influenza and influenza-like illness (ILI) symptoms. Those studies were predominantly conducted under moderately to highly polluted outdoor atmospheres. We conducted our study to extend the understanding to a less polluted atmospheric environment. A relationship between PM2.5 and ILI activity that extended to lightly to moderately polluted atmospheres could imply a comparatively more complicated transmission mechanism. We obtained concurrent PM2.5 mass concentration data, meteorological data and reported ILI activity for the lightly to moderately polluted atmospheres over the Tucson, AZ region. We found no relation between PM2.5 mass concentration and ILI activity. There was an expected relation between ILI activity, temperature, and relative humidity. There was a possible relation between PM2.5 mass concentration anomalies and ILI activity. These results might be due to the small dataset size and to the technological limitations of the PM measurements. Further study is recommended since it would improve the understanding of ILI transmission and thereby improve ILI activity/outbreak forecasts and transmission model accuracies.

How to cite: DeFelice, PhD, T.: On the Understanding of the transmission route tied to Reported Influenza/Influenza-Like Illness Activity, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-2467, https://doi.org/10.5194/egusphere-egu23-2467, 2023.

EGU23-5923 | ECS | PICO | NH9.9

Impact of global warming and Greenland ice sheet melting on malaria and Rift Valley Fever 

Alizée Chemison, Dimitri Defrance, Gilles Ramstein, and Cyril Caminade

Mosquitoes are climate-sensitive disease vectors. They need an aquatic environment for the development of their immature stages (egg, larva, pupa). The presence and maintenance of these egg-laying sites depend on rainfall. The development period of mosquitoes is reduced when temperature increases, up to a lethal threshold. Global warming will impact the distribution of vectors and the diseases they transmit. The last deglaciation taught us that the melting of an ice sheet is highly non-linear and can include acceleration phases corresponding to sea level rise of more than 4 m per century. In addition, glacial instabilities such as iceberg break-ups (Heinrich events) had significant impacts on the North Atlantic Ocean circulation, causing major global climate changes. These melting processes and their feedbacks on climate are not considered in current climate models, and their detailed impacts on health have not yet been studied.

To simulate an accelerated partial melting of the Greenland ice sheet, a freshwater flux corresponding to a sea level rise of +1 and +3 m over a 50-year period is superimposed on the standard RCP8.5 radiative forcing scenario. These scenarios are then used as inputs to the IPSL-CM5A climate model to simulate global climate change for the 21st century. These simulations allow us to explore the consequences of such melting on the distribution of two vector-borne diseases that affect the African continent: malaria and Rift Valley Fever (RVF). Malaria is a parasitic disease that causes more than 200 million cases and more than 600,000 deaths annually worldwide. RVF causes deaths and high abortion rates in herds and poses health risks to humans through contact with infected blood. Previous studies have already characterised the evolution of the global distribution of malaria according to standard RCPs. Using the same malaria mathematical models, we study the impact of an accelerated Greenland melting on simulated malaria transmission risk in Africa. Future malaria transmission risk decreases over the Sahel and increases over the East African highlands. The decrease over the Sahel is stronger in our simulations than under the standard RCP8.5 scenario, while the increase over East Africa is more moderate. Malaria risk strongly increases over southern Africa due to a southward shift of the rain belt induced by Greenland ice sheet melting. For RVF, the disease model correctly simulates historical epidemics over Somalia, Kenya, Mauritania, Zambia and Senegal. However, our results show the difficulty of validating continental-scale models with available health data. It is essential to develop climate scenarios that consider climate tipping points. Assessing the impact of these tipping-point scenarios and the associated uncertainties on critical sectors, such as public health, should be a future research priority.


How to cite: Chemison, A., Defrance, D., Ramstein, G., and Caminade, C.: Impact of global warming and Greenland ice sheet melting on malaria and Rift Valley Fever, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-5923, https://doi.org/10.5194/egusphere-egu23-5923, 2023.

EGU23-6855 | PICO | NH9.9

An early warning decision support system for disease outbreaks in the livestock sector 

Paola Nassisi, Alessandro D'anca, Marco Mancini, Monia Santini, Marco Milanesi, Cinzia Caroli, Giovanni Aloisio, Giovanni Chillemi, Riccardo Valentini, Riccardo Negrini, and Paolo Ajmone Marsan

New climate regimes, variability and extreme events affect the livestock sector in many respects, including animal welfare, production, reproduction, diseases and their spread, and feed quality and availability. Heat stress, especially when combined with excess or low humidity, exacerbates the perceived temperature or the drought conditions, respectively, increasing hazards for animals. Cold extremes, extraordinarily windy conditions and altered radiation regimes are also detrimental to both animals and fodder.

In this context, the EU-funded SEBASTIEN project aims to provide stakeholders with a Decision Support System (DSS) for more efficient and sustainable management, and consequent valuation, of the livestock sector in Italy. SEBASTIEN DSS will integrate GIS, environmental and biological variables to generate updated risk maps for livestock diseases and zoonoses and their spread, alerting about the expected occurrence of stressing conditions for animals due to abiotic and biotic factors.

The presence of parasites, vectors, and outbreaks will be combined with environmental data, gathered by spatially distributed meteorological and satellite monitoring, to detect conditions that can potentially favor or trigger the spread of related diseases. Sensor-based monitoring data will be integrated with the above information to determine ranges in animal parameters potentially associated with a higher risk of critical pathogen load or a higher density of vectors that are potential carriers of diseases. Medium- to long-term climate forecasts will support the prediction of possible shifts in favorable conditions that could open up new areas to parasites and pathogens. The vast amounts of data will be integrated and summarized into user-tailored information through a range of techniques, from empirical/statistical indicators to machine learning algorithms.

How to cite: Nassisi, P., D'anca, A., Mancini, M., Santini, M., Milanesi, M., Caroli, C., Aloisio, G., Chillemi, G., Valentini, R., Negrini, R., and Ajmone Marsan, P.: An early warning decision support system for disease outbreaks in the livestock sector, EGU General Assembly 2023, Vienna, Austria, 23–28 Apr 2023, EGU23-6855, https://doi.org/10.5194/egusphere-egu23-6855, 2023.

EGU23-7652 | PICO | NH9.9

Forecasting the risk of vector-borne diseases at different time scales: an overview of the CLIMate SEnsitive DISease (CLIMSEDIS) Forecasting Tool project for the Horn of Africa 

Cyril Caminade, Andrew P. Morse, Eric M. Fevre, Siobhan Mor, Mathew Baylis, and Louise Kelly-Hope

Vector-borne diseases are transmitted by a range of arthropod insects that are climate sensitive. Arthropods are ectothermic; hence air temperature has a significant impact on their biting and development rates. In addition, higher temperatures shorten the extrinsic incubation period of pathogens, namely the time required for an insect vector to become infectious once it has been infected. Rainfall also creates suitable conditions for breeding sites. The latest IPCC-AR6 report unequivocally concluded that recent climate change already had an impact on the distribution of important human and animal diseases and their vectors. For example, dengue is now transmitted in temperate regions of Europe, and malaria vectors are now found at higher altitudes and latitudes in the Tropics. Different streams of climate forecasts, ranging from short range numerical weather prediction (NWP) models to seasonal forecasting systems, to future climate change ensembles can be used to forecast the risk posed by key vector-borne diseases at different time scales.  

This work will first introduce vector-borne disease forecasting system prototypes developed for different time scales and applications. Three examples will be presented; first a NWP driven model to forecast the risk of the animal disease Bluetongue in the UK, second the skill of the Liverpool malaria model simulations driven by seasona