Presentation type:
GI – Geosciences Instrumentation & Data Systems

EGU24-8419 | ECS | Orals | MAL20-GI | GI Division Outstanding Early Career Scientist Award Lecture

Towards Sustainable Futures in Tree Assessment using Ground Penetrating Radar: Insights, Developments and Novel Perspectives 

Livia Lantini and Fabio Tosti

The global impact of diseases and environmental pressures on trees and forests has resulted in the decay and loss of a significant portion of the Earth’s natural heritage. Responding to this challenge, Ground Penetrating Radar (GPR), a well-established and reliable non-destructive testing (NDT) method, emerges as a fundamental assessment technique with vast potential. Its efficacy spans various domains, from Earth sciences to engineering, making GPR uniquely suited for forestry applications and offering a sustainable and non-invasive alternative to destructive methods like coring.

Within forestry applications, GPR assumes a critical role in optimising economic expenditure for tree maintenance while simultaneously enhancing public safety. Swift and reliable detection of subsurface anomalies makes GPR essential in safeguarding natural heritage and facilitating early identification of tree decay, ultimately supporting effective tree disease control.

The present work explores the extension of GPR's capabilities to evaluate critical parameters in tree health, focusing on the assessment of root systems and the identification of potential structural weaknesses within tree trunks.

The study introduces a series of recent experimental and theoretical models, each contributing to the understanding and enhancement of tree assessment. These models refine the interpretation of intricate reflection patterns, providing a refined understanding of tree trunk conditions. Additionally, models for the early detection of decay and cavities in tree trunks are presented, offering valuable insights into the internal structure of trees and enhancing the sensitivity and precision of GPR for proactive tree health management.

In terms of assessing and monitoring tree roots, the study introduces methodologies designed to enhance the understanding of below-ground ecosystems. Developed algorithms for root detection and tracking, along with methodologies for estimating root mass density, offer insights into growth patterns and contribute to sustainable tree management practices. Furthermore, recent methodologies focus on understanding interconnections within tree root systems and the surrounding environment, identifying buried structures within the root system, addressing unique challenges faced by street trees in urban environments, refining the analysis of tree root systems using frequency spectrum-based processing, and integrating artificial intelligence for automatic recognition to enhance the efficiency of root system assessment.
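
For readers new to the technique, the root-detection methodologies above rest on a simple geometric fact: a point-like target such as a root produces a hyperbolic reflection in the radargram. Below is a minimal sketch of the two-way travel-time model behind such hyperbolas; the velocity value and geometry are illustrative assumptions, not the authors' algorithm:

```python
import math

def two_way_travel_time(x, x0, depth, v):
    """Two-way travel time (ns) of a GPR reflection from a point
    target (e.g. a root) buried at horizontal position x0 and given
    depth, with the antenna at position x. Distances in metres,
    velocity v in m/ns."""
    return 2.0 * math.sqrt(depth**2 + (x - x0)**2) / v

# Illustrative values: v ~ 0.1 m/ns is typical of moist soil (assumption).
v = 0.1
t_apex = two_way_travel_time(1.5, 1.5, 0.5, v)  # antenna directly above the root
t_off = two_way_travel_time(2.0, 1.5, 0.5, v)   # antenna 0.5 m to the side
```

Fitting this curve to an observed hyperbola yields the target depth and the soil velocity, which is the usual starting point for automated root detection and tracking.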

Finally, unique case studies are presented, showcasing the methodology, survey planning, and site procedures. These case studies add depth to the exploration, reflecting the practical application of the research in diverse and challenging scenarios.

How to cite: Lantini, L. and Tosti, F.: Towards Sustainable Futures in Tree Assessment using Ground Penetrating Radar: Insights, Developments and Novel Perspectives, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-8419, 2024.

EGU24-14781 | Orals | MAL20-GI | Christiaan Huygens Medal Lecture

The silent degassing of volcanoes: a useful tool for volcanic surveillance and a significant contributor to the global CO2 emission from subaerial volcanism  

Nemesio M. Pérez and the INVOLCAN/ITER Research Team

Volcanoes emit significant amounts of gases into the atmosphere through visible and non-visible degassing manifestations, regardless of whether they are active or quiescent. The latter, also known as diffuse or silent degassing, occurs across the entire volcanic edifice. Water vapour (H2O), carbon dioxide (CO2), and sulfur (S) are the three most abundant magmatic volatiles, with CO2 being the least soluble in silicate melts. Diffuse volcanic degassing alters the chemical composition of the volcanoes' ground/soil gas atmosphere, resulting in enrichment of CO2, He, and other gas species. Over the last 25 years, extensive research on diffuse CO2 degassing has been conducted at volcanic and geothermal systems, indicating that silent CO2 degassing is an important mechanism for dissipating energy at volcanoes and contributes significantly to global CO2 emissions from subaerial volcanism. As a result, diffuse CO2 degassing studies have been regarded as a powerful tool in geochemical monitoring programs for volcanic surveillance, particularly in volcanic areas lacking visible gas manifestations (plume, fumaroles, hot springs, etc.), as a valuable tool for identifying productive geothermal reservoirs, and as a potential source of large amounts of CO2 to the atmosphere via global subaerial volcanism.

Diffuse degassing investigations on volcanoes primarily involve in-situ ground CO2 efflux measurements and the collection of gases at a certain depth for later chemical and isotopic analysis. CO2 and He are the two most interesting gas species to investigate in diffuse degassing studies due to their similarly low solubility in silicate melts at low pressures and their suitability as geochemical tracers of magmatic activity. However, once exsolved from the silicate melts, their journeys through the crust to the surface are considerably different. While CO2, as a reactive gas, is influenced by interfering processes (gas scrubbing by groundwaters and interaction with rocks, decarbonation processes, biogenic production, and so on), He is chemically inert, not radioactive, non-biogenic, highly mobile, and relatively insoluble in water. These properties minimize the interaction of this noble gas with the surrounding rocks or fluids during its ascent towards the surface. These geochemical differences yield a higher He/CO2 ratio in the fumarole gases than is actually present in the magma, but the ratio decreases when the magma reservoir reaches sufficient pressure to generate incipient fracture systems as an eruption approaches, thus releasing considerably more of the magma volatiles.
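
To make the notion of a diffuse emission estimate concrete, here is a toy calculation converting a mean soil CO2 efflux over a surveyed area into a daily emission. The numbers are invented for illustration, and this is not the INVOLCAN/ITER methodology, which relies on geostatistical integration of many individual efflux measurements:

```python
def total_co2_emission_t_per_day(mean_efflux_g_m2_d, area_m2):
    """Convert a mean soil CO2 efflux (g m^-2 d^-1) over a surveyed
    area (m^2) into a total emission in tonnes per day. Real surveys
    typically use geostatistical integration (e.g. sequential Gaussian
    simulation) rather than a simple mean; this is a toy estimate."""
    return mean_efflux_g_m2_d * area_m2 / 1e6  # grams -> tonnes

# Illustrative numbers (assumptions, not measured values):
emission = total_co2_emission_t_per_day(50.0, 2_000_000)  # 50 g/m2/d over 2 km2
```

This kind of area-integrated figure is what accumulates, survey by survey, into the global diffuse-degassing budgets discussed below.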

Quantifying global volcanic CO2 emissions from subaerial volcanism is critical for gaining a better understanding of the rates and mechanisms of carbon cycling, as well as their effects on the long-term development of Earth's climate across geological timescales. Recent studies show that diffuse degassing contributes 47 to 174 Tg·yr⁻¹ of CO2 to the atmosphere, although the actual global diffuse CO2 emission from subaerial volcanism could be larger.

Several examples of diffuse degassing research on many different volcanic systems around the world, performed by our research team and collaborators during the last 25 years, will be presented during my award/medal lecture, strongly supporting the conclusion that diffuse degassing is a useful tool for volcanic surveillance and a significant contributor to the global CO2 emissions from subaerial volcanism.

How to cite: Pérez, N. M. and the INVOLCAN/ITER Research Team: The silent degassing of volcanoes: a useful tool for volcanic surveillance and a significant contributor to the global CO2 emission from subaerial volcanism, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-14781, 2024.

GI1 – General sessions on geoscience instrumentation

EGU24-201 | ECS | Orals | GI1.1

Automated Static Magnetic Cleanliness Screening for the TRACERS Small-Satellite Mission 

Cole J Dorman, Chris Piker, and David M Miles

The Tandem Reconnection and Cusp Electrodynamics Reconnaissance Satellites (TRACERS) Small Explorers mission requires high-fidelity magnetic field measurements for its magnetic reconnection science objectives and for its technology demonstration payload MAGnetometers for Innovation and Capability (MAGIC). TRACERS needs to minimize the local magnetic noise through a magnetic cleanliness program such that the stray fields from the spacecraft and its instruments do not distort the local geophysical magnetic field of interest. Here we present an automated magnetic screening apparatus and procedure to enable technicians to routinely and efficiently measure the magnetic dipole moments of potential flight parts to determine whether they are suitable for spaceflight. This procedure is simple, replicable, and accurate down to a dipole moment of 1.59 × 10⁻³ N m T⁻¹. It will be used to screen parts for the MAGIC instrument and other subsystems of the TRACERS satellite mission to help ensure magnetically clean measurements on-orbit.
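
As a hedged illustration of the physics behind such screening: if the stray field of a part is approximated as a point dipole, its moment can be inferred from the field measured on-axis at a known distance. The field value and distance below are assumptions chosen for illustration, not the TRACERS apparatus or its calibration:

```python
import math

MU0 = 4.0e-7 * math.pi  # vacuum permeability, T m / A

def dipole_moment_from_axial_field(b_tesla, r_m):
    """Equivalent magnetic dipole moment (A m^2, numerically equal to
    N m T^-1) inferred from the field b_tesla measured on the dipole
    axis at distance r_m, using B = (mu0 / 4 pi) * 2 m / r^3."""
    return 2.0 * math.pi * r_m**3 * b_tesla / MU0

# Illustrative: a 5 nT axial field measured 1 m from the part (assumption).
m = dipole_moment_from_axial_field(5e-9, 1.0)
```

In practice a screening facility measures the field at several positions and orientations and fits all three dipole components; the single-axis formula above only conveys the scaling.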

How to cite: Dorman, C. J., Piker, C., and Miles, D. M.: Automated Static Magnetic Cleanliness Screening for the TRACERS Small-Satellite Mission, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-201, 2024.

EGU24-246 | ECS | Orals | GI1.1

Fiber-optic gyroscopes with enhanced temperature adaptability for geophysical rotational sensing 

Yanjun Chen, Lanxin Zhu, Wenbo Wang, Huimin Huang, Xinyu Cao, and Zhengbin Li

Fiber-optic gyroscopes, as rotational motion sensors, have emerged as powerful candidates for rotational seismology and Earth rotation observation due to their portability and high sensitivity. However, the stability of fiber-optic gyroscopes is degraded by the environmental temperature variations to which optical fibers are sensitive. Conventional methods for suppressing the effects of temperature variation operate either at the device level or at the post-processing level. However, the former entails additional device requirements and higher costs, while the latter compensates less effectively in more general environments. In this abstract, we propose a suppression method at the device level. We find that the effect of thermally induced phase fluctuations is significantly lower in the high-frequency band than in the low-frequency band. Therefore, by upconverting the operating point of the fiber-optic gyroscope to a high-order harmonic of the eigenfrequency, the effect of thermally induced phase fluctuations on the output is greatly suppressed. This method is easy to implement and requires no additional optical or electrical components. To validate this method, we conducted a time-varying temperature experiment using a portable fiber-optic gyroscope equipped with a 20 km long, 0.3 m diameter fiber-optic coil. The implementation of this upconverted frequency modulation technology resulted in a 32-fold reduction in temperature sensitivity for the fiber-optic gyroscope. The results demonstrate that the proposed technology enhances the temperature adaptability of fiber-optic gyroscopes, which is critical in practical geophysical applications. At the same time, the self-noise is reduced from 3×10⁻⁸ rad/s/√Hz to 8×10⁻⁹ rad/s/√Hz, further improving sensitivity to geophysical rotation signals. Seismic records will be presented to demonstrate the sensor's utility in rotational seismology.
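
For orientation, the eigenfrequency referred to above is fixed by the light transit time through the fiber coil. A small sketch follows; the group index and the choice of harmonic order are assumptions for illustration, not values taken from the abstract:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def eigenfrequency_hz(fiber_length_m, n_group=1.468):
    """Proper (eigen)frequency of a fiber-optic gyroscope,
    f_e = 1 / (2 * tau), where tau = n * L / c is the transit time of
    light through the coil. n_group = 1.468 is a typical group index
    for silica fiber (assumption)."""
    tau = n_group * fiber_length_m / C
    return 1.0 / (2.0 * tau)

f_e = eigenfrequency_hz(20_000.0)  # roughly 5 kHz for a 20 km coil
f_mod = 7 * f_e                    # e.g. a 7th-harmonic operating point (assumption)
```

Moving the modulation from f_e to a high-order harmonic like f_mod is the "upconversion" the abstract describes; the formula shows why a longer coil pushes the eigenfrequency down.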

How to cite: Chen, Y., Zhu, L., Wang, W., Huang, H., Cao, X., and Li, Z.: Fiber-optic gyroscopes with enhanced temperature adaptability for geophysical rotational sensing, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-246, 2024.

EGU24-1499 | Orals | GI1.1

Global sea surface air pressure observations with V-band O2 differential absorption radar 

Bing Lin, Matthew Walker Mclinden, Xia Cai, Gerald M. Heymsfield, Nikki Privé, Steven Harrah, and Lihua Li

Observed meteorological data are essential for the initialization, adjustment, assimilation, and prediction of numerical weather prediction (NWP) models and influence the daily activities of people and society. Many key weather variables such as temperature, humidity and winds are well observed globally by combined surface weather stations and suborbital and orbital remote sensing platforms of the current Earth Observing System (EOS). However, sea surface air pressure is a significant observational gap in the current EOS: no operational remote sensing method is available for this crucial dynamic variable of the Earth’s climate and weather systems. Over open oceans, the pressure can only be observed by in-situ sensors on a very limited number of buoys, ships, and oceanic platforms. Studies find that accurate sea level pressure (SLP) measurements can significantly improve not only the dynamics but also the thermodynamics, such as temperature fields, of NWP models [1]. Weather forecasts, especially severe weather predictions including hurricanes, can also be improved considerably with pressure measurements.

This study presents the SLP retrieval with emphasis on evaluating the potential impacts of instrumental and environmental uncertainties on retrievals from V-band O2 differential absorption radar systems operating at three evenly spaced, closely separated frequency bands (65.5, 67.75 and 70.0 GHz). The study finds that precise knowledge of instrument attitude in the current design will result in negligible retrieval errors. Spectral control of the instrument and knowledge of frequency changes will provide accurate information for forward radiative transfer calculations and, in turn, the SLP retrieval. Furthermore, the retrieval algorithm combining all three channels, i.e., the 3-channel approach, can effectively mitigate major atmospheric (e.g., water vapor and cloud) and sea surface influences on sea surface air pressure retrieval.

The major uncertainty for sea surface pressure retrieval in the current design is caused by noise in the radar power returns. Analysis demonstrates the potential of global SLP observation with errors similar to those of marine in-situ measurements (about 1–2 mb), which is urgently needed for the improvement of NWP models. NASA is currently developing an airborne system for demonstration of space applications. Our presentation will provide more details on the system, the SLP retrieval and their applications.
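
The principle behind the retrieval can be illustrated with a single frequency pair: the ratio of surface-echo powers at a strongly and a weakly absorbed frequency gives the two-way differential O2 optical depth, which grows with the total air column and hence with surface pressure. The sketch below uses one hypothetical linear calibration constant; the actual 3-channel algorithm relies on forward radiative-transfer calculations rather than any such constant:

```python
import math

def differential_optical_depth(p_window, p_absorbed):
    """Two-way differential O2 optical depth from surface-echo powers
    at a weakly absorbed (window) and a strongly absorbed frequency."""
    return 0.5 * math.log(p_window / p_absorbed)

def surface_pressure_hpa(tau, k_hpa_per_tau=450.0):
    """Map optical depth to sea-level pressure with a single
    hypothetical linear calibration constant (assumption); the real
    retrieval uses forward radiative-transfer modelling."""
    return k_hpa_per_tau * tau

tau = differential_optical_depth(1.0, 0.011)  # echo powers, arbitrary units
slp = surface_pressure_hpa(tau)               # hPa
```

The toy numbers land near a plausible sea-level pressure, but only because the calibration constant was invented to make them do so; the value of the 3-channel design is precisely that it removes such ad hoc dependencies on water vapor, cloud, and surface conditions.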



[1] Prive, N., M. Mclinden, B. Lin, I. Moradi, M. Sienkiewicz, G. Heymsfield, and W. McCarty, “Impacts of marine surface pressure observations from a spaceborne differential absorption radar investigated with an observing system simulation experiment”, J. Atmos. Oceanic Tech., 40, 897–918, 2023.

How to cite: Lin, B., Mclinden, M. W., Cai, X., Heymsfield, G. M., Privé, N., Harrah, S., and Li, L.: Global sea surface air pressure observations with V-band O2 differential absorption radar, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-1499, 2024.

EGU24-2442 | ECS | Orals | GI1.1

GAIN, a Machine Learning approach for Airborne, Maritime, and Submarine Gravimeter Systems 

Lorenzo Iafolla, Massimo Chiappini, and Francesco Santoli

Precision gravimeters deployed onboard aerial, ground-based, and underwater moving platforms face significant accuracy challenges due to environmental disturbances such as non-inertial reference systems and temperature variations. The “Gravimetro Aereo INtelligente” (GAIN) concept represents a step forward in tackling this problem by using a data post-processing approach. This approach avoids cumbersome, heavy, and power-intensive active compensation systems, thus increasing the instrument's adaptability to small moving platforms.

The GAIN concept is based on three pillars that define its approach. Firstly, it incorporates a multi-sensor system within the gravimeter framework, which might include a three-axial accelerometer, a three-axial gyroscope, multiple thermometers and a barometer. This set of sensors is designed to measure both the effects of gravity and those of other disturbances. By utilizing this information, GAIN employs machine learning algorithms (the second pillar) to map the complex relationship between the measurements and the desired gravity value. However, machine learning heavily relies on the availability of high-quality training datasets, which are often scarce and challenging to obtain in operational environments. To address this bottleneck, the third pillar of GAIN utilizes a training platform that can simulate a wide range of environmental situations in a controlled laboratory setting. This platform enables the generation of labeled data that mimics real-world operational scenarios.

This contribution will present the details of the initial GAIN experimental setup, highlighting the successful integration of a multi-sensor system with the training platform. Additionally, early findings will be shared, demonstrating the potential of the GAIN technique in mitigating temperature changes in gravimeters. Finally, the progress of ongoing experiments will be showcased, as we work towards expanding the capabilities of the GAIN method to also address rotations and linear accelerations as sources of interference.

How to cite: Iafolla, L., Chiappini, M., and Santoli, F.: GAIN, a Machine Learning approach for Airborne, Maritime, and Submarine Gravimeter Systems, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-2442, 2024.

EGU24-3511 | Posters on site | GI1.1

An improved suspension system for the observatory variometer 

Vira Pronenko, Andrii Marusenkov, Igor Parylo, and Andrii Prystai

Observatory variometers have been used for long-term observation of the Earth’s magnetic field for many years. Because of this, it is necessary to exclude the influence of external factors on the sensors of these magnetometers. Among the most common are tilts of the sensors caused by seasonal tilts of the piers on which they are installed. These variometers are usually equipped with sensors installed on brass or titanium platforms, pendulously suspended for tilt compensation. Here we present a new fiber-suspended sensor that better fits observatory demands.

Usually, the tilt compensation ratio is defined as the relationship between the pendulum and base rotations in the same vertical plane (around the same horizontal axis). We found that under certain conditions a pendulum may rotate around two other axes perpendicular to the base rotation line. The first effect appears as a pendulum rotation around the horizontal line lying in the vertical plane in which the sensor base is tilted. This effect can be seen not only when the base is inclined but also when the sensor is rotated around its vertical axis. We used such a rotation in a horizontally directed magnetic field H to match the alignment of the mechanical axes of the suspension with the magnetic axes of the sensor.

The second detected effect manifested as a pendulum rotation around the vertical axis during inclination of the platform in the vertical plane in which the upper pair of fibers lies. The cause of this second effect is an imbalance of the lower part of the pendulum, due either to an uneven distribution of masses or to different lengths of the lower pair of cords.

To keep both effects at the lowest possible or negligible level, a new version of the fiber-suspended sensor was designed. This sensor has three supporting feet, a worm drive for fine adjustment of the orientation of the magnetic components, and magnetic-axis leveling support included in the firmware. The following parameters were obtained: tilt range ±4°, tilt compensation ratio (including off-axis effects) >2000, and thermal factor <0.2 nT/°C.

How to cite: Pronenko, V., Marusenkov, A., Parylo, I., and Prystai, A.: An improved suspension system for the observatory variometer, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-3511, 2024.

EGU24-5984 | Orals | GI1.1

A portable luminescence dating instrument: a new insight for in-situ Earth applications 

Alessio Di Iorio, Vincenzo Pascucci, Roberto Filippone, Giulia Di Iorio, Daniele Sechi, Stefano Andreucci, and Ilaria Di Pietro

Age determination is crucial to unravelling the evolution of, and providing a framework for, the geological, paleo-anthropological, and cultural conditions of a specific region of interest. The lack of chronological information makes it difficult to correlate the site of interest with other temporal attributes provided by stratigraphy and paleoclimatology. Luminescence is a well-established dating technique for determining the absolute age of formation (or most recent reworking event) of geological deposits/sediments and archaeological finds on Earth, with a time range that spans from the last few decades up to one million years. The most common geological targets are dust, silt, sand, cobbles, and rock outcrops originating from different environments: aeolian, fluvial, alluvial, lacustrine, marine, glaciogenic, slope deposits, karstic, soils, and tectonic activity. In the archaeological field, this technique is applied not only to the geological sediments in excavation sites but also directly to artifacts of interest, especially when made of pottery or stone.

A miniaturized luminescence dating prototype for in-situ examination has been designed by Alma Sistemi S.r.l., Guidonia, Italy, and validated by the University of Sassari, Sassari, Italy, under the European Union H2020-MSCA-RISE-2018 research programme (G.A. n. 823934). The instrument is equipped with an infrared and blue optical stimulation subsystem to perform both optically and infrared-stimulated luminescence (OSL, IRSL) and is able to measure both the paleo-dose and the dose rate. An X-ray generator (XRG) irradiates the sample, while the response luminescence signal is acquired through a photon-counting photomultiplier tube (PMT). A thermal subsystem consisting of a heater and air-cooling pumps allows the instrument to heat the sample during the SAR (single-aliquot regenerative-dose) procedure and to perform thermally stimulated luminescence. Remote control for the data analysis application and a battery power supply are implemented on the instrument for use in the field.

The development of this portable instrument is of great relevance since it finds practical use in geological and archaeological field applications. In fact, compared to current luminescence dating technologies, its reduced dimensions (11x11x18 cm excluding the electronics box and cover) and weight (currently <5 kg), the use of air cooling instead of liquid nitrogen, and the use of an X-ray generator instead of a radioactive source qualify the instrument for direct use in the field. It is also provided with an advanced and user-friendly software tool, strongly reducing the need for a skilled operator. The prototype instrument has been validated using different samples, and the results were compared with an equivalent laboratory instrument (Risø TL/OSL Reader model TL/OSL-DA-20) at the University of Sassari luminescence laboratory.

At this stage, the instrument can perform a basic SAR protocol and accurately measure the response luminescence signal after different irradiation times, and thus measure the natural dose of a sediment sample. Here, the most recent results are presented.
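
For readers unfamiliar with the SAR protocol mentioned above, the equivalent dose is obtained by projecting the natural, sensitivity-corrected signal onto the regenerated dose-response curve. Below is a minimal linear-interpolation sketch with made-up regeneration points; real analyses typically fit a saturating-exponential growth curve rather than interpolating linearly:

```python
def equivalent_dose(natural_signal, doses, signals):
    """Linearly interpolate the natural sensitivity-corrected
    luminescence signal onto the regenerated dose-response curve
    (doses in Gy, signals in sensitivity-corrected counts)."""
    points = list(zip(doses, signals))
    for (d0, s0), (d1, s1) in zip(points, points[1:]):
        if s0 <= natural_signal <= s1:
            return d0 + (natural_signal - s0) * (d1 - d0) / (s1 - s0)
    raise ValueError("natural signal lies outside the regenerated curve")

# Made-up regeneration points (assumptions, for illustration only):
doses = [0.0, 5.0, 10.0, 20.0]
signals = [0.0, 1.9, 3.6, 6.2]
de = equivalent_dose(2.75, doses, signals)  # equivalent dose, Gy
```

Dividing the equivalent dose by the independently measured environmental dose rate then yields the age, which is the quantity the portable instrument is ultimately after.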

How to cite: Di Iorio, A., Pascucci, V., Filippone, R., Di Iorio, G., Sechi, D., Andreucci, S., and Di Pietro, I.: A portable luminescence dating instrument: a new insight for in-situ Earth applications, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-5984, 2024.

EGU24-7394 | Orals | GI1.1

Automated mineralogy as a key technology toward zero-waste mining – The EXCEED project 

Sonja Lavikko, Quentin Dehaine, Fernando Prado Araujo, and Philippe Muchez

The increasing demand for critical raw materials (CRMs) linked to the energy transition, combined with Europe’s reliance on a few third countries (incl. China) for their supply and with increasing ESG issues, calls for a new mining paradigm: responsible, zero-waste, multi-metal/mineral mining. Hard-rock Li deposits (pegmatites and rare-metal granites) are perfect candidates for such an approach, where, besides lithium, numerous by-products including industrial minerals (quartz, feldspar, micas) and CRMs (Nb, Ta, and so forth) could potentially be extracted. Assessing whether these by-products can be recovered during Li production requires a mineral-centric, integrated geometallurgical approach, for which automated mineralogy is a key technology. Determining how to utilize the secondary material streams and recover the by-products relies on knowledge of the material, from its chemical composition, particle size, crystal structure and texture to grain size, liberation degree and mineral associations. In this study, four European lithium mine projects are investigated: two pegmatite projects (Keliber, Finland and Savannah, Portugal) and two rare-metal granite (RMG) projects (Beauvoir, France and St Austell, UK). Different ore types as well as process samples (concentrates, residues, and tailings) were investigated to assess the by-product potential of industrial minerals and CRMs, as well as the status and behavior of potentially harmful elements (PHEs) throughout the processing flowsheet. Gathering this information starts with Extended BSE Liberation Analysis (XBSE_STD) and Grain-Based X-ray Mapping (GXMAP) measurements with the FEI Quanta 650F, an automated scanning electron microscope equipped with a field emission gun electron source, two Energy Dispersive X-ray spectrometers (EDX) (Bruker X-Flash 6130), and FEI’s Mineral Liberation Analyzer (MLA) quantitative mineralogy software v. 3.1.4.
Additional data are collected with a Bruker M4 Tornado Plus micro-XRF with AMICS. Details are further studied with additional methods, such as X-ray powder diffraction, Inductively Coupled Plasma Optical Emission Spectroscopy, Electron Probe Micro-Analysis, laser ablation ICP-MS, and X-ray Fluorescence measurements.
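
One of the central quantities such automated-mineralogy measurements deliver is the liberation degree of the target mineral in each particle. A toy, area-based version is sketched below (the particle data are invented; MLA's actual liberation classification is considerably more sophisticated):

```python
def liberation_pct(phase_areas, target):
    """Area-based liberation of the target mineral in one particle:
    the target phase area as a percentage of total particle area."""
    total = sum(phase_areas.values())
    return 100.0 * phase_areas.get(target, 0.0) / total

# Hypothetical particle from a pegmatite ore (phase areas in um^2):
particle = {"spodumene": 640.0, "quartz": 280.0, "K-feldspar": 80.0}
lib = liberation_pct(particle, "spodumene")
```

Aggregating such per-particle values across size fractions is what feeds the flowsheet options and theoretical process-performance predictions described below.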

To define how the ore properties and the PHE/CRM deportment affect the options for usability, a comprehensive geometallurgical assessment will be conducted, starting from collecting the basic mineralogical data through creating process flowsheet options and predicting theoretical process performance. These results will then be validated at laboratory and pilot scale according to the developed process protocols.

How to cite: Lavikko, S., Dehaine, Q., Prado Araujo, F., and Muchez, P.: Automated mineralogy as a key technology toward zero-waste mining – The EXCEED project, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-7394, 2024.

EGU24-8193 | ECS | Orals | GI1.1 | Highlight

Three-dimensional reconstruction of high-resolution images of existing tunnels for geo-structural monitoring and inspection purposes. 

Saduni Melissa Dahanayaka, Adrián José Riquelme Guill, Alessia Vecchietti, and Matteo Del Soldato

Geotechnical and structural monitoring and inspection of existing linear infrastructures, with particular focus on tunnels, have gained great prominence in recent years. In Italy, a new regulation for the inspection, monitoring and maintenance of tunnels has been issued, and a massive inspection plan is being carried out to guarantee constant maintenance and safe conditions of existing tunnels. The relevance of geo-structural monitoring lies in the possibility of controlling the interaction between soil and structure and in preventing damage to infrastructure systems caused by deterioration. Monitoring also helps mitigate the effects of natural disasters, in line with the rising attention paid to hydrogeological risk in the country in recent decades. The Italian territory is vulnerable to earthquakes, floods, and landslides, the major hazards that can affect human settlements, constructions, and large infrastructures such as tunnels. An opportunity thus arose to work on huge amounts of data and to develop accurate methods for the elaboration and visualisation of the most significant information for inspection and maintenance planning purposes. For this work, innovative methods and mobile survey technologies were used to acquire linear images and 3D point cloud data of some highway tunnels in central Italy, which are characterised by relevant structural deterioration and cracks. High-resolution black and white images of the tunnel linings were captured with a mobile system composed of line cameras, lamps for correct illumination of the tunnel surface, and a positioning system. To cover the whole tunnel surface, four runs were performed: right and left walls and right and left parts of the ceiling. High-density point clouds of the tunnels were acquired by a mobile laser scanner mounted on a vehicle.
The combination of 2D high-resolution images and 3D data can have a significant impact on data visualisation and presentation, providing a comprehensive representation of the actual state of the infrastructure. The 3D representation of the tunnel from 2D linear images is accomplished by reconstructing a 3D geometrical model of the tunnel section with a tool for the automatic elaboration and management of images. This automatic algorithm provides the three-dimensional reconstruction of the infrastructure from 2D high-resolution images, giving the best representation and visualization of the elements inside the tunnel. The major advantage of the tool is the possibility of identifying and evaluating structural defects and cracks in the tunnel surface directly on the 3D model, and of better understanding the effects on the infrastructure caused by deformation events in the geological context. It can also support the comparison of subsequent surveys for monitoring. This work represents an innovative approach affecting fundamental aspects of the management and monitoring of existing tunnels, e.g., the investigation of deformation phenomena, the temporal evolution of deterioration and cracks, cause identification and prevention, soil-structure interaction studies, and geological hazard risk reduction.
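
The geometric core of mapping an unrolled linear image back onto a tunnel can be sketched with an idealised circular section. This is an assumption made for illustration: the tool described above reconstructs the actual section geometry from the survey data rather than assuming a circle:

```python
import math

def pixel_to_xyz(chainage_m, arc_m, radius_m=5.0):
    """Map a point of an unrolled linear image (distance along the
    tunnel axis, arc length around the section measured from the
    crown) onto an idealised circular tunnel of given radius
    (radius is an assumption, not a surveyed value)."""
    theta = arc_m / radius_m          # angle from the crown, radians
    x = radius_m * math.sin(theta)    # horizontal offset from the axis
    z = radius_m * math.cos(theta)    # height above the axis
    return (chainage_m, x, z)

p = pixel_to_xyz(120.0, 0.0)                  # crown point, 120 m into the tunnel
q = pixel_to_xyz(120.0, math.pi * 5.0 / 2.0)  # a quarter turn: the springline
```

In the real workflow the positioning system ties the image runs and laser point cloud into a common reference, so defects located in the 2D images can be placed on the reconstructed 3D model.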

How to cite: Dahanayaka, S. M., Riquelme Guill, A. J., Vecchietti, A., and Del Soldato, M.: Three-dimensional reconstruction of high-resolution images of existing tunnels for geo-structural monitoring and inspection purposes., EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-8193, 2024.

EGU24-8243 | Orals | GI1.1

A New Approach to Infrasound Sensor Design 

Cansun Guralp, Paul Minchinton, Horst Rademacher, and Murray McGowan

Current infrasound sensor designs have shortcomings inherent in their open-loop arrangement. Among them are a limited dynamic range, the lack of linearity of the response function over the desired frequency range and – maybe most importantly – the fact that such sensors can only be calibrated in the laboratory and not under real conditions in the field.

Here we present a novel infrasound sensor design, which overcomes these and other shortcomings. At the core of our new sensor lies a feedback loop. It is based on a proven technology already applied in many sensor and control systems, particularly relevant for Earth science in the design and manufacturing of high-fidelity broadband seismic sensors.

The new infrasound sensor uses a precision bellows, which deflects in response to pressure variations or atmospheric infrasound waves. The movement of the bellows, in a single degree of deflection, is measured with a differential capacitive displacement transducer. Its circuitry is a Blumlein bridge arrangement operating at a frequency of 45 kHz with a driver signal amplitude of 20 V. The transducer's output signal is then synchronously fed back to the linearised magnetic force transducer after passing through a Proportional, Integral and Differential (PID) controller.
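
The PID controller in the loop can be sketched generically. The gains, time step, and first-order plant model below are illustrative assumptions, not the instrument's actual controller or bellows dynamics:

```python
class PID:
    """Minimal discrete PID controller:
    u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a toy first-order plant (standing in for the bellows
# deflection) back to zero from an initial disturbance.
dt = 0.001
pid = PID(kp=2.0, ki=0.5, kd=0.05, dt=dt)
pos = 1.0
for _ in range(20000):
    force = pid.update(0.0, pos)
    pos += (force - 0.2 * pos) * dt  # hypothetical plant dynamics
```

As in force-balance seismometers, the feedback signal that nulls the deflection typically serves as the calibrated sensor output, which is what makes in-situ calibration by signal injection possible.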

This design increases the bandwidth of the sensor to five decades, from 2.7 mHz to more than 200 Hz. At the same time the response of the sensor is essentially flat over the entire frequency range, with only minor variations of less than ±0.1 dB. We measured the dynamic range of the sensor to be in excess of 155 dB, a significant increase compared to current open-loop systems.

The sensor's theoretical transfer function is compared with practical measurements, providing sensor characteristics including detection levels over the complete frequency response.

The system calibration is carried out analogously to the calibration of broadband seismic sensors. We inject a known calibration signal (either sinusoidal, square wave or broadband noise) directly into the feedback force transducer. This setup allows the calibration of the infrasound sensor in the laboratory as well as after deployment in a field station.
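The calibration step described above amounts to injecting a known signal and estimating the transfer function from the recorded response. The Python sketch below illustrates this with a simple spectral-ratio (H1) estimator; the synthetic "sensor" with a flat gain of 2 and all numbers are hypothetical, not the authors' actual calibration procedure:

```python
import numpy as np

def estimate_transfer_function(cal_signal, sensor_output, fs):
    """Estimate H(f) = output/input from a known injected calibration
    signal, using cross- and auto-spectra (H1 estimator)."""
    X = np.fft.rfft(cal_signal)
    Y = np.fft.rfft(sensor_output)
    freqs = np.fft.rfftfreq(len(cal_signal), d=1.0 / fs)
    H = (Y * np.conj(X)) / (X * np.conj(X) + 1e-20)  # regularised division
    return freqs, H

# Synthetic demonstration: broadband noise through an ideal flat sensor
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
cal = rng.standard_normal(t.size)   # injected broadband calibration noise
out = 2.0 * cal                     # hypothetical sensor: gain 2, no lag
freqs, H = estimate_transfer_function(cal, out, fs)
print(round(abs(H[100]), 3))        # recovered gain at 10 Hz -> 2.0
```

In practice the same estimator works for sinusoidal or square-wave calibration signals, which is what makes in-field calibration through the feedback force transducer possible.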

How to cite: Guralp, C., Minchinton, P., Rademacher, H., and McGowan, M.: A New Approach to Infrasound Sensor Design, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-8243, 2024.

In recent years, many companies and industries have been developing battery recycling processes for the recovery of critical elements such as Li, Ni and Co. Before hydrometallurgical processing, mechanical and/or thermal treatments are applied in order to produce a black powder, also called blackmass. This high-value powder contains the above-mentioned critical elements as well as graphite, rare earth metals and impurities such as solvents, plastics, aluminium and copper. The only data currently used to determine the quality of blackmass are chemical analyses of the major elements. However, micro-textures, the liberation of elements and phases, and the amount of impurities in various phases are important parameters for the efficiency and performance of a hydrometallurgical process.

In order to evaluate its suitability for a hydrometallurgical recycling process, it is essential to analyse the blackmass not only chemically but also with respect to the size, shape and composition of its particles. This presentation shows how these data can be acquired using a refined QEMSCAN database. This database was created from billions of point analyses on a total of several million particles. The results show that:

  • Particles can be micro-texturally characterized and classified with respect to chemical element contents.
  • Important textural and chemical particle variations exist in the blackmass of different origins showing different qualities.
  • Elements deleterious to hydrometallurgical processing (e.g. Si, Mg, K, Ca, Fe, Al, Cu and others) can be present in specific and well-liberated particles.
  • Cathode active material compositions (different types of NMC as well as LCO, NCA, LFP, NiMH, etc) that are specific for each battery type can be distinguished.
  • Digital simulation of additional physical mineral processing can optimize blackmass quality with respect to valuable elements.
  • Special attention must be given to potential health risks during recycling and the processing of blackmass as elements like Cd and Co can be present in ultrafine particles.

How to cite: Dadé, M.: The automated mineralogy: an important tool for geometallurgy studies of battery recycling, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-9867, 2024.

EGU24-10303 | Posters on site | GI1.1 | Highlight

Airborne Synthetic Aperture Radar and Electromagnetic Technologies of the Italian earth observation platform ITINERIS 

Ilaria Catapano, Andrea Barone, Paolo Berardino, Romeo Bernini, Carmen Esposito, Francesco Mercogliano, Antonio Natale, Stefano Perna, Jorge Andres Rosero Legarda, Pietro Tizzani, Riccardo Lanari, and Francesco Soldovieri

ITINERIS - Italian Integrated Environmental Research Infrastructures System is the Italian hub of research infrastructures in the environmental scientific domain, whose creation is financed by the national recovery and resilience plan (PNRR).

Among the large number of technological solutions made available by the infrastructure, the Institute for Electromagnetic Sensing of the Environment of the National Research Council (CNR-IREA), thanks to its expertise in remote sensing and electromagnetic monitoring of the environment, is in charge of the development and optimization of technologies for observation of the Soil-Subsoil System (SSS).

Specifically, CNR-IREA is carrying out activities concerning two technological assets. The first one is made up of airborne Synthetic Aperture Radar (SAR) systems and computing resources suitable for managing large amounts of data and generating SAR-derived products. The second one involves mobile (including drone-borne) and fixed in-situ sensors, consisting of magnetometers, gradiometers, multi-antenna ground penetrating radar and optical backscatter reflectometers, which are suitable for high-resolution imaging and monitoring of the shallower layers of the subsoil, including the groundwater. These activities also involve the design of innovative data processing procedures, aimed at increasing the effectiveness of each of the observation technologies, as well as the definition of measurement protocols and strategies devoted to the integration of airborne and in-situ sensors, with the final goal of performing multi-scale and multi-resolution non-invasive monitoring of the dynamic processes affecting the SSS.

A detailed summary of the performed and planned activities will be presented at the conference, together with the technical specifications of the purchased instrumentation.


Acknowledgement: The communication has been funded by EU - Next Generation EU Mission 4 “Education and Research” - Component 2: “From research to business” - Investment 3.1: “Fund for the realisation of an integrated system of research and innovation infrastructures” - Project IR0000032 – ITINERIS - Italian Integrated Environmental Research Infrastructures System - CUP B53C22002150006.

The authors acknowledge the Research Infrastructures participating in the ITINERIS project with their Italian nodes: ACTRIS, ANAEE, ATLaS, CeTRA, DANUBIUS, DISSCO, e-LTER, ECORD, EMPHASIS, EMSO, EUFAR, Euro-Argo, EuroFleets, Geoscience, IBISBA, ICOS, JERICO, LIFEWATCH, LNS, N/R Laura Bassi, SIOS, SMINO.

How to cite: Catapano, I., Barone, A., Berardino, P., Bernini, R., Esposito, C., Mercogliano, F., Natale, A., Perna, S., Rosero Legarda, J. A., Tizzani, P., Lanari, R., and Soldovieri, F.: Airborne Synthetic Aperture Radar and Electromagnetic Technologies of the Italian earth observation platform ITINERIS, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-10303, 2024.

EGU24-10868 | Posters on site | GI1.1

A comparison of next generation mid-band broadband seismometers and traditional sensor technologies 

Connor Foster, Ella Price, Neil Watkiss, Aaron Clarke, Phil Hill, James Lindsey, and Federica Restelli

Mid-band seismometer systems usually have shorter-period responses and higher noise floors than broadband seismometers. These seismometers have been hugely popular with permanent seismic networks and temporary experiments alike due to their cost-effectiveness, portability and relative ease of deployment, which allow for network densification and quick deployments. Güralp have historically led the way with such sensors with the 6T and 40T series, which have been used globally in challenging environments over the last decades for local and regional seismic monitoring applications. GSL have built on this tried and trusted platform to develop the next generation of mid-band sensor technology.

The Güralp next-generation smart sensor module is designed to operate at any angle, without the use of a mechanical gimbal system. This allows the entire sensor package to be rotated during installation and deployment without sacrificing data quality, and means that all three components of the sensor can be manufactured to the same design, eliminating inconsistencies in performance between horizontal and vertical components whilst still maintaining an orthogonal orientation for redundancy. The new generation of sensor makes use of novel materials and techniques to drastically improve the noise performance over traditional mid-band sensors.

The sensor components include digital elements in the feedback loop, allowing the sensor module to host an on-board serial server. This facilitates greater interoperability with Minimus-based digitizer platforms, including automatic pulling of the sensor serial number and sensor module SOH channels, and the ability to remotely adjust the long-period corner between options of 1 s and 120 s. This makes the sensor module incredibly easy to deploy and removes the previous requirement for multiple instruments with varying responses.

The sensor module has now been successfully developed into a number of different packages for varying deployment scenarios, including borehole (the Radian), offshore (Aquarius and Maris), vault (Certimus) and posthole (Certis) applications. All packages make use of the latest digital technologies to reduce power consumption to below 300 mW.

How to cite: Foster, C., Price, E., Watkiss, N., Clarke, A., Hill, P., Lindsey, J., and Restelli, F.: A comparison of next generation mid-band broadband seismometers and traditional sensor technologies, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-10868, 2024.

EGU24-12932 | Orals | GI1.1

Removing the memory effect from water stable isotope analysis 

Hubert Vonhof, Stefan de Graaf, Elan Levy, and Julian Schroeder

Over the past decade or so, laser spectrometric instruments have revolutionized the field of isotope analysis of water samples. These instruments do not require complex lab facilities, are easy to use and can provide hydrogen and oxygen isotope data at high precision and high throughput.

One well-known shortcoming of these laser spectrometric analyzers is that individual measurements display significant sample-to-sample memory effects. Particularly at larger isotopic differences between samples, isotopic contamination by the previous sample can offset the following measurements even after multiple injections. Therefore, it is common in many laboratories to run 7 or more replicate analyses of each sample, and discard the first 4 or so, to arrive at an accurate isotope value for that sample.

Because the single-shot precision of these instruments is rather good, sample replication is not so much necessary for obtaining better precision as it is needed to flush out the memory effect on the isotope values. Therefore, any technical adaptation that decreases the memory effect of these analyzers, and thus reduces the number of replicate analyses required to arrive at an accurate isotope ratio, would greatly improve the sample throughput of these instruments.

We here present an adapted injection interface system, coupled to a Picarro L2140i analyzer, that practically removes sample-to-sample memory effects. This effectively leads to accurate and high-precision isotope analysis of single-shot sample injections, even at large sample-to-sample isotope differences. Key to the removal of the memory effect is that the analyzer runs on a moisturized carrier gas, providing a constant water background upon which the injected samples are analyzed (de Graaf et al., 2021). We will present results of series of standard waters and natural samples (including seawaters) and discuss protocols that we developed for data calculation and quality control.



de Graaf, S., Vonhof, H.B., Levy, E.J., Markowska, M., Haug, G.H., 2021. Isotope ratio infrared spectroscopy analysis of water samples without memory effects. Rapid Communications in Mass Spectrometry 35.


How to cite: Vonhof, H., de Graaf, S., Levy, E., and Schroeder, J.: Removing the memory effect from water stable isotope analysis, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12932, 2024.

EGU24-13332 | Posters on site | GI1.1 | Highlight

Twenty five years of geochemical monitoring of the oceanic active volcanic island of El Hierro, Canary Islands 

Fátima Rodríguez, Ana Pires, Aarón Álvarez, María Asesio-Ramos, Gladys V. Melián, Eleazar Padrón, Pedro A. Hernández, Germán D. Padilla, Nemesio M. Pérez, and José Barrancos

El Hierro (278 km2), the youngest and westernmost island of the Canarian archipelago, sits on an ocean floor 3.5 km deep and reaches 1.5 km above sea level. The island was constructed by rapid constructive and destructive processes over ~1.12 Ma. A submarine eruption took place from October 2011 to March 2012 about 2 km south of the small village of La Restinga, in the southernmost part of the island. The eruptive process was the second longest, and discharged the second largest volume, in the historical volcanic activity of the Canaries (the last 500 years), and was the first to be monitored from the beginning. Since visible volcanic emissions are absent at the surface of El Hierro, one of the most useful geochemical tools to monitor its volcanic activity is diffuse degassing studies. Diffuse CO2 emissions have been monitored at El Hierro Island since 1998 on a yearly basis, with higher frequency during the pre-eruptive and eruptive period of 2011-2012. At each survey, 600 sampling sites are studied and measurements of soil CO2 efflux are performed in situ following the accumulation chamber method. During the pre-eruptive and eruptive period, the diffuse CO2 emission released by the whole island experienced significant increases before the onset of the submarine eruption and before the most energetic seismic events of the volcano-seismic unrest. In the last survey, performed in the summer of 2023, soil CO2 efflux values ranged from non-detectable up to 39 g m−2 d−1. Statistical-graphical analysis of the data shows three different geochemical populations: background (B), intermediate (I) and peak (P), represented by 97.7%, 1.6% and 0.7% of the total data, with geometric means of 1.2, 20 and 27 g m−2 d−1, respectively. To quantify the diffuse CO2 emission for the 2023 survey, 100 sequential Gaussian simulations (sGs) were performed as the interpolation method.
The estimated 2023 diffuse CO2 output released to the atmosphere by El Hierro was 528 ± 22 t d-1, a value higher than the background average CO2 emission, estimated at 410 t d-1. The data presented here demonstrate that discrete surveys of diffuse CO2 emission offer important information for optimizing early warning systems in volcano monitoring programs.
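As a rough illustration of how an island-wide emission figure with an uncertainty can be derived from point efflux measurements, the Python sketch below scales the mean efflux by the surveyed area and bootstraps over the sampling sites. This is a deliberate simplification of the authors' method, which uses 100 sequential Gaussian simulations as the interpolation step; all numbers below are hypothetical:

```python
import numpy as np

def total_co2_output(efflux_g_m2_d, area_m2, n_boot=1000, seed=0):
    """Simplified stand-in for the sGs step: estimate total CO2 output
    (t/d) as mean efflux x area, with bootstrap uncertainty over sites."""
    rng = np.random.default_rng(seed)
    efflux = np.asarray(efflux_g_m2_d, dtype=float)
    totals = np.empty(n_boot)
    for i in range(n_boot):
        sample = rng.choice(efflux, size=efflux.size, replace=True)
        totals[i] = sample.mean() * area_m2 / 1e6  # g/d -> t/d
    return totals.mean(), totals.std()

# Hypothetical survey: 600 log-normally distributed efflux values
# (g m-2 d-1) over the 278 km2 island
rng = np.random.default_rng(1)
efflux = rng.lognormal(mean=0.2, sigma=0.8, size=600)
mean_t_d, sd_t_d = total_co2_output(efflux, 278e6)
print(f"{mean_t_d:.0f} +/- {sd_t_d:.0f} t/d")
```

Geostatistical simulation improves on this by honouring the spatial correlation of the efflux field rather than treating sites as interchangeable.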

How to cite: Rodríguez, F., Pires, A., Álvarez, A., Asesio-Ramos, M., Melián, G. V., Padrón, E., Hernández, P. A., Padilla, G. D., Pérez, N. M., and Barrancos, J.: Twenty five years of geochemical monitoring of the oceanic active volcanic island of El Hierro, Canary Islands, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-13332, 2024.

EGU24-15961 | Posters on site | GI1.1

Spontaneous potential surveys for geothermal exploration in Tenerife and La Palma (Canary Islands) 

David Martínez van Dorth, Silvia Beretta, Giovanni Floridia, Audrey Yin, Aarón Álvarez Hernández, Rubén García Hernández, María Jiménez-Mejías, Víctor Ortega Ramos, Luca D’Auria, and Nemesio M. Pérez

The spontaneous-potential (SP) method is a passive geophysical technique that measures naturally occurring voltage differences on the Earth's surface. This method is capable of identifying geoelectric anomalies which can be generated by different sources. In active volcanic areas, these geoelectrical anomalies may be related to thermoelectric and electrokinetic processes caused by the circulation of hydrothermal fluids in subsurface porous materials. The sensitivity of the SP method in characterizing hydrogeologic and hydrothermal circulations, together with its simplicity and non-intrusive nature, has made this method widely used for geothermal exploration in the last decades.

In the Canary Islands, the surface geothermal manifestations are less evident than in other active volcanic systems worldwide. Thus, exploration techniques used to study the geothermal potential of the Canaries must focus on investigating the possible presence of deep-seated hydrothermal reservoirs. For this purpose, self-potential surveys were conducted on the Tenerife and La Palma islands to determine the spatial variations of the electrokinetic potential related to the geothermal and volcanic-structural characteristics of the study areas. The choice of these two islands to promote the search for geothermal resources lies mainly in their historical volcanism, characterized by five well-documented historical eruptions on Tenerife and up to 8 on La Palma, where the most recent and voluminous eruption occurred in 2021.

The SP campaigns were carried out in two volcanic areas: the NW rift zone of Tenerife and the west flank of the Cumbre Vieja rift zone of La Palma. The instrumentation consisted of several V-FullWaver devices from IRIS Instruments, equipped with Cu-CuSO4 non-polarizable electrodes and copper wire reels ranging from 60 m to 250 m. The methodology consisted of measuring the potential difference of the electric field along different profiles. These profiles are divided into sections in which the reference electrode remains at the beginning of the section while the other is moved, measuring at points spaced about 60 m apart until the maximum length of the cable is reached. Then a new reference electrode is established, and the measurements continue along the profile. To obtain continuity in the data set along each profile, a reference correction is applied to connect all sub-sections of a single SP profile.
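The reference correction described above can be sketched in a few lines: each sub-section's potentials are shifted so that its first reading ties in to the running profile at the station where the reference electrode was moved. The station values below are hypothetical, not survey data:

```python
def stitch_sp_profile(sections):
    """Join SP sub-sections into one continuous profile.
    Each section is a list of potentials (mV) relative to that section's
    own reference electrode; the first reading of a new section is taken
    at the last station of the previous one, so each section is shifted
    to match the running profile there (reference correction)."""
    profile = list(sections[0])
    for sec in sections[1:]:
        offset = profile[-1] - sec[0]   # shift to tie in at the overlap
        profile.extend(v + offset for v in sec[1:])
    return profile

# Hypothetical three-section profile (mV)
sections = [[0, -12, -35], [0, 18, 40], [0, -5]]
print(stitch_sp_profile(sections))  # -> [0, -12, -35, -17, 5, 0]
```

Without this correction, each change of reference electrode would introduce an arbitrary jump in the profile.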

Measurement points were located along several trails within the geothermal prospecting areas. Preliminary results show anomalies ranging between -281 and 198 mV in Tenerife and between -234 and 256 mV in La Palma. The main objective of the SP application is to contribute to delimiting those areas of hydrothermal interest associated with the presence of geothermal resources. Although this study is in its initial stage, it promotes a more sustainable and resilient future for the Canary Islands, in which geothermal resources could provide a reliable and renewable energy source.

How to cite: Martínez van Dorth, D., Beretta, S., Floridia, G., Yin, A., Álvarez Hernández, A., García Hernández, R., Jiménez-Mejías, M., Ortega Ramos, V., D’Auria, L., and Pérez, N. M.: Spontaneous potential surveys for geothermal exploration in Tenerife and La Palma (Canary Islands), EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-15961, 2024.

EGU24-15994 | ECS | Posters on site | GI1.1

Taking stock of the global area boom for greenhouse cultivation in the 21st century 

Xiaoye Tong, Xiaoxin Zhang, Rasmus Fensholt, Peter Rosendal Dau Jansen, Sizhuo Li, Marianne Nylandsted Larsen, Florian Reiner, Feng Tian, and Martin Brandt

Greenhouse cultivation, which boosts agricultural productivity, has boomed globally over the past decades. Yet little knowledge currently exists on its global extent and the possible drivers of this expansion. Here, we present a global assessment of greenhouse cultivation and map 1.3 million hectares of greenhouse infrastructure in 2019, including both large-scale (61%) and small-scale (39%) greenhouse infrastructure that is optimally detectable using commercial satellite data at 3 m resolution. Examining the temporal development of the 65 largest clusters (> 1500 ha), we show a recent upsurge in greenhouse cultivation in the Global South since the 2000s, primarily aimed at enhancing agricultural productivity and achieving economic prosperity. China is leading the boom in the Global South and accounts for 61% of global greenhouse cultivation. Trade and production data for five major greenhouse-cultivated vegetables suggest that China's greenhouse cultivation boom is primarily driven by domestic mechanisms rather than international ones. To investigate this hypothesis, we examined the spatial patterns of greenhouse cultivation in China and found distinct configurations around urban areas for food provision and around rural areas for poverty alleviation. Our high-resolution thematic map serves as a global baseline for future exploration of environmental and socioeconomic factors related to greenhouse cultivation. Our study also underscores the need for sub-category reporting and optimized international policies to address the measurement, reporting and verification of greenhouse cultivation.

How to cite: Tong, X., Zhang, X., Fensholt, R., Rosendal Dau Jansen, P., Li, S., Nylandsted Larsen, M., Reiner, F., Tian, F., and Brandt, M.: Taking stock of the global area boom for greenhouse cultivation in the 21st century, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-15994, 2024.

EGU24-15996 | Orals | GI1.1

Accurate characterization of graphite in ores and black mass using an innovative sample preparation method for automated mineralogy 

Hassan Bouzahzah, Laura Lenoir, Eric Pirard, and Raphaël Mermillod-Blondin

Automated mineralogy systems are widely used for the mineralogical characterization of powder samples for mineral processing purposes. This characterization method requires the mineral powder to be embedded in a resin (polished block (PB) preparation). Quite quickly, problems linked to polished block preparation arose, particularly the settlement of minerals in the liquid resin due to differences in mineral size and density. Several authors have suggested solutions to overcome the errors due to the PB preparation method, such as vertical sections, the addition of sized graphite, dynamic hardening, and the addition of black carbon (BC) to increase resin viscosity and avoid mineral settlement. Only the BC method resolved all the errors associated with PB preparation. Indeed, it eliminated the mineral settlement and provided excellent spatial dispersion of particles on the observation surface, ensuring better mineral quantification and liberation/association estimation, except for graphite-bearing samples. In fact, as graphite shows no contrast with the resin in backscattered-electron images, it cannot be characterized by automated mineralogy systems. Few studies have addressed this problem, through the addition of carnauba wax or iodoform to contrast the resin and graphite. Iodoform was easy to use and provided better contrast than carnauba wax. This work presents an innovative polished block preparation method that combines BC and iodoform both to prevent particle settlement and to contrast the resin and graphite, which was very challenging. The results obtained are highly satisfying and comparable to those of standard characterization techniques such as XRD and chemical assay. This new preparation method is highly useful for the characterization of graphite-bearing black mass (obtained from battery recycling) by automated mineralogy systems.

How to cite: Bouzahzah, H., Lenoir, L., Pirard, E., and Mermillod-Blondin, R.: Accurate characterization of graphite in ores and black mass using an innovative sample preparation method for automated mineralogy, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-15996, 2024.

The Güralp Minimus broadband digitiser introduced innovative features to the market, including easy network configuration, a compact form factor, extensive State of Health (SOH) monitoring, and low-latency digitisation. Since it was launched in 2016, technological advances in semiconductors have significantly decreased their power requirements. The latest iteration of the Minimus, the Minimus2, utilises modern microprocessors to reduce power consumption by over 50% whilst maintaining a high level of functionality. The resulting reduction in power consumption simplifies field installations for offline deployments.

The Minimus platform also provides a high level of functionality for online stations, including the industry-unique option of sending State of Health (SOH) data via the SEEDlink protocol. As well as simplifying SOH monitoring for larger networks, this facility also allows for time-series analysis of SOH data. This means that operators have the data they need to proactively manage their station network and diagnose issues before they result in data loss. The Minimus platform interfaces with Discovery software, which seamlessly integrates new stations into existing networks. The management of large numbers of real-time seismic stations is further enhanced with Güralp Data Centre (“GDC”), a cloud-based software package that is an optional add-on to the Discovery tool set.

The Minimus platform was built from the ground up to provide one of the lowest-latency digitizers available, with digitization latencies down to 40 ms, making it well suited to Earthquake Early Warning applications. This is achieved with the use of causal decimation filters, high sample rates and Güralp's proprietary GDI protocol. The Minimus platform is built as a modular digitizer platform available in a number of different packages to suit a range of applications, including as a stand-alone digitiser or built into broadband seismic instruments and force-balance accelerometer systems.

How to cite: Price, E., Watkiss, N., and Restelli, F.: The Minimus Digitizer Platform: a User-Friendly Ecosystem for Efficient Network Management and Seismic Station Configuration., EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-16231, 2024.

EGU24-16989 | Orals | GI1.1

Inversion of real gravity data from geological faults using a generative neural network model 

Henrietta Rakoczi, Gary Barnes, Abhinav Prasad, Karl Toland, Christopher Messenger, and Giles Hammond

Normalising flows are a novel class of generative neural network models that can be applied to Bayesian parameter inference. When gravity inversion is reformulated as a probabilistic inference problem, stable results can be obtained that naturally incorporate the inherent uncertainties and noise from the source background and the instrument. As opposed to some standard methods, Bayesian gravity inversion does not default to a single solution of an ill-posed problem, but informs the user about all possibilities that are consistent with the gravimetry survey of interest. It has been demonstrated that the normalising flow method can provide accurate results for a simulated data set, even when applied to high-dimensional data. Once the network is trained, results can be obtained within seconds, and it can be reused, without retraining, for multiple gravimetry surveys that are consistent with the training data set. Here, improvements on the previous work are presented, applying the method to a more realistic and complex geophysical problem: the inversion of gravity measurements to infer parameters of geological faults. The normalising flow network is trained and tested on fault models of various complexities, and finally the method is applied to the inversion of airborne gravimetry data.
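The Bayesian inference problem that a trained normalising flow solves in seconds can be illustrated with a brute-force grid posterior over a toy forward model. The sketch below uses a buried point mass instead of a fault, with a flat prior and Gaussian noise; it is an assumption-laden stand-in for the concept, not the authors' network or survey:

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gz_point_mass(x, depth, mass):
    """Vertical gravity (microGal) at surface points x (m) from a buried
    point mass -- a toy stand-in for a fault forward model."""
    r2 = x**2 + depth**2
    return G * mass * depth / r2**1.5 * 1e8  # m/s^2 -> microGal

# Simulated noisy survey over a source at 10 m depth
x = np.linspace(-50, 50, 21)
true_depth, true_mass = 10.0, 1e7
rng = np.random.default_rng(0)
obs = gz_point_mass(x, true_depth, true_mass) + rng.normal(0, 0.5, x.size)

# Grid-based posterior over depth (flat prior, Gaussian likelihood)
depths = np.linspace(2, 30, 200)
log_like = np.array([
    -0.5 * np.sum((obs - gz_point_mass(x, d, true_mass))**2) / 0.5**2
    for d in depths
])
post = np.exp(log_like - log_like.max())
post /= post.sum()
map_depth = depths[np.argmax(post)]
print(round(map_depth, 1))  # maximum-a-posteriori depth, near 10 m
```

Grid evaluation scales exponentially with the number of parameters, which is exactly why an amortised surrogate such as a normalising flow becomes attractive for high-dimensional fault models.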


How to cite: Rakoczi, H., Barnes, G., Prasad, A., Toland, K., Messenger, C., and Hammond, G.: Inversion of real gravity data from geological faults using a generative neural network model, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-16989, 2024.

Mine optimisation and anticipation of ore behaviour in mineral processing and separation circuits are major economic drivers for all mining operations. Recent methodological developments, with the inception of geometallurgy across multiple commodities, have highlighted the importance of mineralogy in addition to grades. Over several decades, many quantitative tools have been developed, mostly SEM-based such as QEMSCAN®, to provide the quantitative mineralogical composition and textural properties of ore and gangue samples. We aim to compare the more established SEM-based techniques with the Solsa combined XRD-XRF analyser to highlight their respective potential and limitations depending on the minerals and the goals of the mining operators. The combined XRF-XRD of the SOLSA analytical solution brings a new methodology able to produce quantitative mineralogical and geochemical data at a speed compatible with routine data collection, from exploration to quality control of the different mineral streams in a processing plant.

How to cite: Herbelin, M., Delchini, S., Pillière, H., Lutterotti, L., Nicco, M., Dia, M., and Riegler, T.: The pros or cons of X-ray diffraction vs electron beam techniques in the assessment of mineral assemblages: improvements of Solsa combined XRD-XRF analyses applied to the Grande Cote Operation Ti-Zr mine, Senegal., EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-17031, 2024.

EGU24-17777 | ECS | Posters on site | GI1.1 | Highlight

Microgravity surveys for geothermal exploration in Tenerife and La Palma islands (Canary Islands) 

Víctor Ortega-Ramos, Julian Benjamin Lai, Isabella Michelle Sulvarán Aguilar, Adriana Quezada-Ugalde, Aarón Álvarez Hernández, Rubén García Hernández, María Jiménez-Mejías, David Martínez van Dorth, Germán D. Padilla, Luca D’Auria, and Nemesio M. Pérez

Gravimetry is a passive geophysical technique that measures variations in the Earth's gravitational field over its surface. This method studies the gravimetric anomalies caused by the presence of heterogeneities in the subsurface, whose values vary depending on the density of the different geological bodies in the subsoil.

This technique has become fundamental in geothermal exploration, providing information on the subsurface density distribution, which allows for constraining underground geological structures. Specifically, it could enable identifying and characterizing gravitational anomalies generated by geothermal resources.

This work focuses on the islands of Tenerife and La Palma, in the Canary Islands. These two islands have been the object of different microgravity studies in recent decades. However, we aim to reach unprecedented detail in some target areas to obtain a detailed image of the subsurface density distribution. We measured gravity at 109 points in a few target areas of Tenerife and at 67 points on the Cumbre Vieja volcanic complex on the island of La Palma. The precise positioning of the measurement points was achieved with a differential GPS (Leica Viva CS10), reaching better than 0.003 m accuracy in the vertical component. Gravity measurements were made with a CG-6 Autograv™ gravity meter with a reading resolution of 1 μGal. Every gravity value was obtained as the average of at least ten measurement cycles of thirty seconds each, allowing a precision of better than 5 μGal to be reached. First, we obtained Bouguer anomaly maps of the different target areas of Tenerife and La Palma. Then, we performed inverse modelling to retrieve 3D density models of these regions. Although preliminary, the results reveal a complex geological setting, in accordance with previous geophysical studies.
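The cycle-averaging step described above follows directly from the standard error of the mean: repeated 30-second cycles with a single-cycle scatter of roughly 10 μGal average down to a few μGal. The sketch below illustrates this with hypothetical readings, not survey data:

```python
import math

def reading_precision(cycle_values_ugal):
    """Mean gravity reading and its standard error from repeated
    30-second measurement cycles (values in microGal)."""
    n = len(cycle_values_ugal)
    mean = sum(cycle_values_ugal) / n
    var = sum((v - mean) ** 2 for v in cycle_values_ugal) / (n - 1)
    return mean, math.sqrt(var / n)  # SEM = s / sqrt(n)

# Hypothetical ten cycles with several-microGal single-cycle scatter
cycles = [982034.0, 982041.0, 982030.0, 982036.0, 982044.0,
          982038.0, 982031.0, 982040.0, 982035.0, 982037.0]
mean, sem = reading_precision(cycles)
print(round(mean, 1), round(sem, 2))  # -> 982036.6 1.38
```

Because the standard error shrinks as 1/sqrt(n), ten cycles are enough here to bring the uncertainty well below the 5 μGal target.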

The gravimetric method plays a crucial role in identifying geothermal resources in the Canary Islands. This technique offers perspectives to further develop renewable energies in the Archipelago, fostering a transition towards more sustainable and environmentally friendly energy sources.

How to cite: Ortega-Ramos, V., Lai, J. B., Sulvarán Aguilar, I. M., Quezada-Ugalde, A., Álvarez Hernández, A., García Hernández, R., Jiménez-Mejías, M., Martínez van Dorth, D., Padilla, G. D., D’Auria, L., and Pérez, N. M.: Microgravity surveys for geothermal exploration in Tenerife and La Palma islands (Canary Islands), EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-17777, 2024.

EGU24-18674 | Posters on site | GI1.1

Ground CO2 monitoring at Timanfaya volcano (Lanzarote, Canary Islands) during the period 1999-2023 

Daniel Di Nardo, Silvia Paglia, Gladys V. Melián, Nemesio M. Pérez, Eleazar Padrón, Pedro A. Hernández, Fátima Rodríguez, and María Asensio-Ramos

Lanzarote Island (795 km2) is a volcanic island located in the eastern part of the Canary Islands and approximately 100 km from the NW coast of Morocco. The largest historical eruption of the Canary Islands, Timanfaya, took place during 1730-36 in this island when long-term eruptions from a NE-SW-trending fissure formed the Montañas del Fuego. Tinguaton volcano, the last eruption at Lanzarote Island, occurred in 1824 and produced a much smaller lava flow that reached the SW coast. At present, one of the most prominent phenomena at Timanfaya volcanic field is the high maintained superficial temperatures occurring in the area since the 1730 volcanic eruption. The maximum temperatures recorded in this zone are 605ºC, measured in a slightly inclined well 13 m deep. Since fumarolic activity is absent at the surface environment of Lanzarote, to study the diffuse CO2 emission becomes an ideal geochemical tool for monitoring its volcanic activity. We report herein the results of eight soil CO2 efflux surveys performed from 2006 to 2023 at Timanfaya Volcanic Field (TVF) with the aim to evaluate the temporal variations of the diffuse CO2 emission. Approximately 400 sampling sites were selected at each survey to obtain an even distribution of the sampling points over the study area. Soil CO2 efflux was measured following the accumulation chamber method. Soil temperature at 40 cm depth and soil gas samples collected at each sampling site was also measured to evaluate the chemical and isotopic composition of soil gases. Diffuse CO2 emission values have ranged between non detectable values to 34 g·m-2·d-1, with the highest values measured in September 2008. Conditional sequential Gaussian simulations (sGs) were applied to construct soil CO2 efflux distribution maps and to estimate the total CO2 output from the studied area at the TVF. Soil CO2 efflux maps showed a high spatial and temporal variability. 
Most of the study area has shown relatively low values, around the detection limit of the instrument (~0.5 g·m⁻²·d⁻¹). Higher soil CO2 diffuse emission values were observed where thermal anomalies occur, indicating convective transport of gas from depth in these areas. Diffuse CO2 emission rates ranged between 41 and 519 t·d⁻¹ during the study period (57 t·d⁻¹ for 2023). The long-term temporal variation of the total diffuse CO2 emission shows a peak in winter 2011, suggesting a seasonal control on the CO2 emission. These observations, together with the results of the eight soil gas surveys performed at TVF, indicate that the short- and long-term trends in diffuse CO2 degassing are mainly controlled by environmental factors.
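The accumulation chamber method used in these surveys derives the efflux from the initial rate of CO2 build-up inside a chamber placed on the soil. A minimal sketch of that calculation (the chamber volume, footprint and readings below are hypothetical, not the survey's instrument parameters):

```python
import numpy as np

def chamber_efflux(t_s, co2_ppm, volume_m3, area_m2, p_pa=101325.0, temp_k=298.15):
    """Estimate soil CO2 efflux (g m^-2 d^-1) from an accumulation
    chamber time series via the initial linear slope dC/dt."""
    R = 8.314          # gas constant, J mol^-1 K^-1
    M_CO2 = 44.01      # molar mass of CO2, g mol^-1
    # slope of the mixing ratio in ppm s^-1 (least-squares fit)
    slope_ppm_s = np.polyfit(t_s, co2_ppm, 1)[0]
    # convert ppm s^-1 to mol m^-3 s^-1 using the ideal gas law
    dc_dt = slope_ppm_s * 1e-6 * p_pa / (R * temp_k)
    flux_mol = dc_dt * volume_m3 / area_m2          # mol m^-2 s^-1
    return flux_mol * M_CO2 * 86400.0               # g m^-2 d^-1

# synthetic 60 s record rising 0.5 ppm per second
t = np.arange(0, 60, 1.0)
c = 400.0 + 0.5 * t
flux = chamber_efflux(t, c, volume_m3=0.003, area_m2=0.03)
```

The chamber geometry (V/A) converts the concentration build-up rate into a flux per unit soil area; fitting only the early, linear part of the record avoids the bias introduced as the chamber concentration saturates.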

How to cite: Di Nardo, D., Paglia, S., Melián, G. V., Pérez, N. M., Padrón, E., Hernández, P. A., Rodríguez, F., and Asensio-Ramos, M.: Ground CO2 monitoring at Timanfaya volcano (Lanzarote, Canary Islands) during the period 1999-2023, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-18674, 2024.

EGU24-19227 | Orals | GI1.1

Mineral Exploration of the Senegalese Grande Côte heavy minerals placer: QEMSCAN® characterisation of Fe-Ti oxides for ore modelling 

Aisha Kanzari, Sophie Graul, Arthur Delaporte, Rutt Hints, and Simon Blancher

Titanium-containing minerals serve a variety of industrial applications. The iron and titanium oxides ilmenite (Fe²⁺TiO₃), pseudorutile (Fe₂³⁺Ti₃O₉), and rutile/anatase (TiO₂) are notably used in the production of paint, plastic and paper pigments; moreover, titanium metal is considered a Critical Raw Material (CRM). Grande Côte Operation (GCO), a subsidiary of Eramet, has been operating the Senegalese Grande Côte heavy minerals (HM) placers for zircon and Fe-Ti oxides since 2014. Senegal's placer deposits extend over 100 km in length and 5 km in width and lie along the country's north coast. These Quaternary ore bodies resulted from the erosion of the Mauritanian belt and repetitive episodes of marine transgression and regression, as well as from aeolian dune formation, leading to significant heterogeneity. The associated distribution trends of impurities and heavy minerals are not yet anticipated or understood.

This study explores the mineralogical heterogeneities to investigate variations in terms of the distribution and alteration of the titanium-bearing phases. Ten drill cores were selected to investigate three synthetic profiles based on high-resolution sampling. Heavy minerals from composite samples were recovered using dense liquid. The obtained concentrates were prepared as representative thick sections for textural analysis. Semi-quantification investigations were conducted by means of QEMSCAN® analyses.

The heavy minerals content was not related to sand facies or depth; the concentration ranged from 0.1% to 4.2%, with an average of 0.9%. Within the concentrates, the Fe-Ti phases comprised 14.4% ilmenite, 57.1% pseudorutile, 1.8% anatase and 3.7% rutile. Pseudorutile was the predominant phase, indicating advanced alteration. A decrease in the ilmenite/pseudorutile ratio was observed with increasing depth in all profiles.

Based on these findings, the alteration rate in the ilmenite series was investigated by adding a finely spaced range of Fe/Ti ratios and impurity contents (mainly Al) to the QEMSCAN® database. The weathering process is initiated by the oxidation of Fe²⁺ into Fe³⁺, progressively leading to the formation of pseudorutile, marked by grains with cracking patterns due to topotaxial reactions. The following stage is driven by iron leaching and involves the appearance of hydroxylian pseudorutile due to intense hydration and hydroxylation processes. Dissolution and reprecipitation reactions lead to a final alteration stage, creating highly Ti-enriched, impurity-rich and porous grains. The evolution with depth of the coefficient of variation between the contents of the Fe-Ti phases illustrated an authigenic Ti-enrichment. A substantial drop (-40%) in unaltered ilmenite was observed at surface levels. A downward enrichment in the pseudorutile proportion (5 to 10%) was observed down to 13 m, where a sharp increase (up to 40%) in Ti-rich phases correlates with the water-table depth; below 18 m, advanced alteration led to the transformation of almost all ilmenite into pseudorutile.
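The finely spaced Fe/Ti classification described above can be mimicked with simple ratio binning. The sketch below is illustrative only: the thresholds and labels are hypothetical, not the values entered in the QEMSCAN® database (stoichiometric ilmenite has Fe/Ti ≈ 1.17 by weight, pseudorutile ≈ 0.78, and the ratio falls as iron is leached):

```python
def classify_fe_ti(fe_wt, ti_wt):
    """Toy classifier for the ilmenite alteration series based on the
    Fe/Ti weight ratio (thresholds are illustrative, not calibrated)."""
    if ti_wt <= 0:
        return "non-Ti phase"
    ratio = fe_wt / ti_wt
    if ratio > 1.0:
        return "ilmenite"                  # little to no iron loss
    if ratio > 0.5:
        return "pseudorutile"              # partial Fe leaching
    if ratio > 0.1:
        return "hydroxylian pseudorutile"  # intense Fe leaching
    return "rutile/anatase"                # nearly Fe-free end member

# hypothetical (Fe wt%, Ti wt%) pairs for four grains
grains = [(36.8, 31.6), (25.0, 40.0), (5.0, 45.0), (0.5, 59.9)]
labels = [classify_fe_ti(fe, ti) for fe, ti in grains]
```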

QEMSCAN® analyses contributed to a better understanding of the Grande Côte placer deposits, highlighting the significance of spatial variability and local water table settings for Fe-Ti oxide distribution and alteration processes, allowing a first ore body modelling and a global assessment of HM content.

How to cite: Kanzari, A., Graul, S., Delaporte, A., Hints, R., and Blancher, S.: Mineral Exploration of the Senegalese Grande Côte heavy minerals placer: QEMSCAN® characterisation of Fe-Ti oxides for ore modelling, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-19227, 2024.

The Direct Current Resistivity (DCR) method is one of the best-known geophysical methods, used in a wide range of areas such as mining geophysics, hydrogeophysics, and archaeo-geophysical investigations. DCR data are generally collected along profiles using multi-electrode, multi-channel measurement systems and interpreted using two-dimensional (2D) or three-dimensional (3D) inversion algorithms. The inverse problem of DCR data is ill-posed (nonlinear, nonunique, and unstable); therefore, a smoothing regularization inversion method is generally used for DCR data inversion. Additionally, a homogeneous resistivity model is used as the initial model in regularized inversion. Hence, we generally obtain a smooth resistivity model after 2D/3D inversion. However, some structures, such as buried archaeological targets, cavities, and faults, have sharp boundaries with their neighboring medium.


In this research, we propose enhancing 2D DCR data inversion results using a convolutional neural network (CNN), aiming for sharp boundaries. We developed a U-net-based CNN algorithm, named DCR2D_Net_Archeo. This method uses the 2D inversion result as the input, with the true resistivity model serving as the output, streamlining geophysical data interpretation for archaeological applications. We tested the DCR2D_Net_Archeo algorithm using synthetic and real data, and showed that the developed resistivity model enhancement algorithm improves smooth inversion results, so that the size and position of buried archaeological remains can be delineated from the enhanced models.
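The input/output mapping described above — a smooth inversion result in, a sharp resistivity model out — can be sketched with a minimal U-net-style encoder-decoder. This is an illustrative toy architecture in PyTorch, not the actual DCR2D_Net_Archeo network (its depth, channel counts and training details are not given in the abstract):

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Minimal U-net-style sketch of a resistivity-model enhancer
    (illustrative architecture, not the authors' DCR2D_Net_Archeo)."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU())
        self.down = nn.MaxPool2d(2)
        self.mid = nn.Sequential(nn.Conv2d(8, 16, 3, padding=1), nn.ReLU())
        self.up = nn.Upsample(scale_factor=2, mode="nearest")
        # skip connection: concatenate encoder features with upsampled ones
        self.dec = nn.Sequential(nn.Conv2d(16 + 8, 8, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(8, 1, 1))

    def forward(self, x):
        e = self.enc(x)                  # smooth inversion model in
        m = self.mid(self.down(e))
        u = self.up(m)
        return self.dec(torch.cat([e, u], dim=1))  # sharpened model out

# a dummy smooth resistivity section: (batch, channel, depth, distance)
smooth_model = torch.rand(1, 1, 32, 64)
sharp = TinyUNet()(smooth_model)
```

Training such a network against true (synthetic) resistivity models teaches it to restore the sharp boundaries that smoothing regularization blurs; the output grid matches the input grid, so the enhanced model drops directly into the existing interpretation workflow.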


KEYWORDS: DC Resistivity, 2D, Inversion, Deep Learning, archaeo-geophysics.


ACKNOWLEDGEMENT: This study is part of the Ph.D. thesis of the first author and the manuscript about this study has been submitted to Pure and Applied Geophysics. This study is also made under the Ankara University Technopolis R&D projects (STBP code: 084286).  

How to cite: Över, D. and Candansayar, M. E.: Improving 2D resistivity model obtained from DC Resistivity Data Inversion by using Convolutional Neural Network Algorithm to Find Buried Archaeological Remains, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-19337, 2024.

EGU24-19641 | Posters on site | GI1.1

Remake of the low cost carbon dioxide sensor of the carbon dioxide network deployed by INVOLCAN in the urban areas of Puerto Naos and La Bombilla, La Palma, Canary Islands 

Gabriel González Rial, Daniel Dinardo, Germán D. Padilla, José Barrancos, Pedro A. Hernández, Nemesio M. Pérez, Konradin Weber, Christian Fischer, and Detlef Amend

Anomalous CO2 degassing appeared towards the end of the Tajogaite eruption (north-west flank of the Cumbre Vieja volcanic ridge, La Palma, Canary Islands) in the neighborhoods of La Bombilla and Puerto Naos, about 6 km from the volcanic vent. These areas were not directly affected by lava flows during the eruptive period; after the eruption, however, they were included in the exclusion zone because of the strong volcanic-hydrothermal carbon dioxide emissions (CO2 > 5-20%). CO2 is an invisible, toxic and asphyxiating gas that may be lethal at concentrations higher than 14%. During the post-eruptive period, INVOLCAN deployed its own indoor and outdoor CO2 monitoring networks in collaboration with other institutions, with the aim of delimiting the anomalous CO2 degassing areas and paying particular attention to those where the CO2 air concentration exceeds hazardous thresholds. The number of monitoring stations was progressively increased to cover most of the homes, garages, basements, and local businesses. The first monitoring network was based on a LILYGO® TTGO T-SIM7000G electronic card, initially programmed with an unstable algorithm that caused problems during the measurements. After several improvements to enhance the stability of the sensor, a new algorithm was developed that acquires ambient values every 5 seconds and applies a Moving Average Filter to every measurement to suppress outliers. The SIM card integrated in the hardware allows data transmission to an MQTT broker, where the values are published every 5 minutes and collected by a single Raspberry Pi 4 Model B located at the INVOLCAN headquarters, which reads and stores the data in two databases (InfluxDB and Google Sheets). Visualization is done through Grafana Cloud, which retrieves the data from InfluxDB and displays them as tables and as a geographic map of the concentrations at the measurement points.
Compared with the previous storage scheme, the main gain is flexibility in visualizing the data, which can be transformed into the different kinds of plots mentioned above. Moreover, an API for the management of each subsystem was created using PyQt, allowing the user to calibrate the sensors remotely, execute a soft reboot, and configure further parameters such as the sensor mode (manual polling, streaming or command mode) or pressure data. Two of the 20 devices have been successfully installed and are working correctly on La Palma, while the remaining 18 are being tested at our laboratory, where they are collecting data with improved stability in the CO2 concentration measurements, and will soon be installed indoors at different locations. The reworked algorithm eliminates the previous problems of erroneous data and disconnections, yielding data that are accurate compared with commercial sensors and allowing the operator to configure and control the sensors without having to travel to the affected locations.
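The acquisition logic described above — one reading every 5 seconds, smoothed by a moving average before publication — can be sketched as follows (the window length and readings are illustrative; the deployed firmware values are not stated in the abstract):

```python
from collections import deque

class MovingAverageFilter:
    """Sliding-window mean used to damp outliers in CO2 readings
    (window length is illustrative; the deployed value is not stated)."""
    def __init__(self, window=12):       # e.g. 12 x 5 s = one minute
        self.buf = deque(maxlen=window)  # old samples drop off automatically

    def update(self, value_ppm):
        self.buf.append(value_ppm)
        return sum(self.buf) / len(self.buf)

maf = MovingAverageFilter(window=4)
readings = [450, 452, 466, 452]              # one transient spike
smoothed = [maf.update(r) for r in readings]
```

Each smoothed value could then be published to the MQTT broker on the 5-minute cadence; the running mean damps transient spikes without discarding data, at the cost of a short lag in the reported concentration.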

How to cite: González Rial, G., Dinardo, D., Padilla, G. D., Barrancos, J., Hernández, P. A., Pérez, N. M., Weber, K., Fischer, C., and Amend, D.: Remake of the low cost carbon dioxide sensor of the carbon dioxide network deployed by INVOLCAN in the urban areas of Puerto Naos and La Bombilla, La Palma, Canary Islands, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-19641, 2024.

EGU24-19666 | ECS | Orals | GI1.1

Environmental stability of MEMS gravimeters. 

Elizabeth Passey, Abhinav Prasad, Karl Toland, Kristian Anastasiou, Douglas Paul, and Giles Hammond

Wee-g is a new gravimeter that utilises a micro-electromechanical system (MEMS) sensor. MEMS-based sensors are attractive as a new gravimetry technology because the low cost of the base material, silicon, and the accessibility of manufacturing facilities will enable increased availability of gravimeters. However, the challenge of working with silicon in a gravimetry device is its thermal sensitivity: temperature changes affect the Young's modulus of the material. In the context of Wee-g's sensor design, when the flexures that support the proof mass soften because of temperature changes, the proof mass changes position under gravity. Changes in ambient pressure can also shift the proof mass position. Wee-g has a thermal control system that holds the temperature at the sensor to within 1 mK, but field observations indicate that large changes in ambient environmental conditions can couple into the sensor output. Here we present the results of environmental stability tests conducted on a Wee-g field prototype, with implications for its performance in field environments whose temperature and pressure vary significantly.
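The thermal coupling described above can be quantified to first order: the static deflection of the proof mass is x = mg/k, and since the flexure stiffness k scales with the Young's modulus E(T), a fractional modulus change maps directly onto a deflection change. A minimal sketch (the temperature coefficient used is an order-of-magnitude literature value for silicon, not a Wee-g parameter):

```python
def proof_mass_shift_ppm(delta_T_K, dE_over_E_per_K=-60e-6):
    """Fractional change (ppm) of the static proof-mass deflection
    x = m*g/k when temperature softens the silicon flexures (k ~ E).
    The Young's modulus temperature coefficient is an assumed
    order-of-magnitude value, not a measured Wee-g parameter."""
    # x is proportional to 1/E, so dx/x = -dE/E to first order
    return -dE_over_E_per_K * delta_T_K * 1e6

# with 1 mK thermal control the apparent deflection drift is tiny
shift = proof_mass_shift_ppm(1e-3)
```

Under these assumptions a 1 mK residual fluctuation changes the deflection by well under a part per million, which is why the 1 mK control loop is effective and why the residual field sensitivity must instead enter through larger ambient swings and pressure effects.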

How to cite: Passey, E., Prasad, A., Toland, K., Anastasiou, K., Paul, D., and Hammond, G.: Environmental stability of MEMS gravimeters., EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-19666, 2024.

EGU24-19870 | Orals | GI1.1

Soil H2 degassing studies: a useful geochemical tool for monitoring Cumbre Vieja volcano, La Palma, Canary Islands 

Megan Expósito, Sophia Ioli, Ileana Santangelo, Gladys V. Melián, María Asensio-Ramos, Mónica Arencibia, Sttefany Cartaya, Carla Méndez, Nemesio M. Pérez, Eleazar Padrón, Pedro A. Hernández, Fátima Rodríguez, Germán D. Padilla, and Antonio J. Álvarez

La Palma Island (708 km²), situated in the northwest of the Canarian Archipelago, is one of the youngest (~2.0 My) islands. A new volcanic eruption took place at the Cumbre Vieja volcanic system, located on the southwest flank of the island, on September 19, 2021. Cumbre Vieja is renowned as the most active basaltic volcano in the Canaries. The eruptive event, which lasted 85 days, featured various volcanic activities, including lava effusion, strombolian activity, lava fountaining, ash venting, and gas jetting, and concluded on December 13, 2021.

Regular surface geochemical studies have been conducted focusing on hydrogen (H2) emissions along Cumbre Vieja. H2, one of the most abundant trace species in volcano-hydrothermal systems, plays a pivotal role in numerous redox reactions occurring in the hydrothermal reservoir gas. This study of H2 emissions has been ongoing since 2001, with soil gas samples collected at a depth of approximately 40 cm at some 600 sites during each survey. H2 concentrations have been analyzed using a micro-gas chromatograph (Agilent 490 microGC).

Spatial distribution maps have been generated using sequential Gaussian simulation (sGs) techniques to quantify the diffuse H2 emissions from the study area. The time series of the diffuse H2 emission shows significant increases before and during the seismic swarms observed between 2017 and 2021. Furthermore, during the eruptive phase, substantial spikes in the diffuse H2 emission were observed, closely correlating with the escalation of volcanic tremor. These fluctuations in diffuse H2 emission preceded the peak of diffuse CO2 emission, in line with the anticipated behavior of these gases. Over the two years following the eruption, the values have reverted to levels similar to those observed during periods of volcanic calm, indicating restored stability in the diffuse H2 emissions.

The absence of visible volcanic gas emissions before the eruption, such as fumaroles or hot springs, on the surface of Cumbre Vieja underscores the importance of such studies in serving as a critical tool for continuous volcanic surveillance and monitoring purposes. This update represents ongoing efforts to comprehensively study and understand the behavior of hydrogen emissions within the volcanic system, providing essential insights into volcanic activity and potential precursor signals for enhanced monitoring and risk assessment.

How to cite: Expósito, M., Ioli, S., Santangelo, I., Melián, G. V., Asensio-Ramos, M., Arencibia, M., Cartaya, S., Méndez, C., Pérez, N. M., Padrón, E., Hernández, P. A., Rodríguez, F., Padilla, G. D., and Álvarez, A. J.: Soil H2 degassing studies: a useful geochemical tool for monitoring Cumbre Vieja volcano, La Palma, Canary Islands, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-19870, 2024.

Automated mineralogical analysis (quantitative scanning electron microscopy) is a powerful tool that has been used extensively to understand the occurrence and deportment of precious and base metals and critical minerals, and is used to optimize the design of extractive metallurgy methodologies. However, this data-rich product can also be used in a predictive context and as the basis for data integration across a range of scales. Automated mineralogical data include quantitative mineral abundance and textural data that can form the basis for machine learning algorithms to improve statistical subsampling strategies, data integration, and mineralogical upscaling, and to increase the value of X-ray fluorescence and hyperspectral data.

The Mineral and Materials Characterization Facility in the Department of Geology and Geological Engineering at the Colorado School of Mines in Golden, USA, houses two scanning electron microscopy-based automated mineralogy systems. These systems are used to conduct research over a broad range of disciplines including all stages of the mine life cycle, energy and petroleum resources investigations, provenance and climate studies, and environmental and biological studies.   

During this presentation, we will explore examples of how automated mineralogy can play a crucial role across the mine life cycle that spans from mineral exploration, mine planning and mining, extractive metallurgy, proactive waste rock and tailings management, to reclamation. The example use-inspired research projects, conducted through the Center to Advance the Science of Exploration to Reclamation in Mining (CASERM) using the Advanced Mineral Analysis and Characterization System (AMICS) from Bruker based on a field-emission scanning electron microscope from Hitachi, focus on the integration of diverse geoscience data types to accelerate and improve decision making across the mine life cycle.   

Quantitative scanning electron microscopy provides important mineralogical and textural data that can inform statistical, thermodynamic, and kinetic models. These data improve not only our understanding of the subsurface in the context of hard-rock mining, but can inform other disciplines such as geothermal energy exploration and extraction and understanding the carbonation potential, helping move the world towards a greener future. 

How to cite: Pfaff, K.: Automated Mineralogy – A Valuable and Data-Rich Product to Advance the Green Energy Transition, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-20638, 2024.

EGU24-22369 | Posters on site | GI1.1

A circular workflow based on Earth Observation tools for the detection and characterization of illegal waste dumping sites in a waste to energy framework and policy assessment 

Alessandro Mei, Alfonso Valerio Ragazzo, Sara Mattei, Emiliano Zampetti, Patrizio Tratzi, Alice Cuzzucoli, Giuliano Fontinovo, Giorgio Pennazza, Marco Torre, Valentina Terenzi, and Mario Grosso

The main issues in illegal Solid Waste Management (SWM) concern the detection of waste and its subsequent disposal, reuse or recycling at municipal, provincial, and regional scales. Even developed countries struggle with the management of illegal waste, and decision-making processes in this field remain underdeveloped among policymakers. This contribution aims to reduce the environmental pressure caused by the illegal disposal of solid waste through the development of a circular model that combines several approaches. A multiparametric downscaling analysis integrating satellite (Worldview-2), Unmanned Aerial Vehicle (UAV) and Unmanned Ground Vehicle (UGV) data was applied first. Waste sites were extracted from the satellite images using supervised classification techniques, while UAV and ground data were used to characterize them by means of Artificial Intelligence (AI) techniques. Furthermore, a map of volume frequencies was obtained using geospatial information, estimating the volume of waste at each sampling site. Air quality sensors mounted on the UGV were used to monitor each sampling site and reveal environmental criticalities. Based on these outputs, a Life Cycle Assessment (LCA) was set up to evaluate waste-to-energy solutions. A cost analysis was finally performed, including information on the transport of waste to the nearest municipal collectors and, subsequently, to the assigned regional recovery plants. To this end, a spatial model of the shortest paths, considering the road network and local environmental variables, was built using R scripts, the QGIS geoinformation system, and Dijkstra's algorithm. Finally, thematic maps and statistics were produced with the aim of developing methodologies to address socio-political problems such as SWM issues.
The project focuses on three municipalities in Calabria (Italy, Province of Catanzaro), within the INTESA project ("INtegrazione di sistemi di TElerilevamento e Sensoristica per l'individuazione di accumulo di materiali in Abbandono"), promoted by the National Research Council - Institute for Atmospheric Pollution (CNR-IIA) and funded by POR Calabria FESR FSE 2014/2020 of the Calabria Region (LIVING LABS). All this information will be fundamental for the development of a regional Decision Support System (DSS) for SWM issues.
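The shortest-path step in the cost analysis can be sketched with Dijkstra's algorithm over a toy road graph (the node names and distances below are hypothetical; the project uses the regional route network with environmental weights):

```python
import heapq

def dijkstra(graph, source):
    """Least-cost distances from `source` over a weighted road graph
    given as {node: [(neighbor, edge_cost), ...]}."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                     # stale heap entry, skip
        for nbr, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

# hypothetical distances (km) from a dump site via collectors to a plant
roads = {
    "dump": [("collector_A", 4.0), ("collector_B", 7.0)],
    "collector_A": [("plant", 12.0)],
    "collector_B": [("plant", 6.0)],
}
dist = dijkstra(roads, "dump")
```

In practice the edge weights would combine road length with the local environmental variables mentioned above, so the "shortest" path is really the least-cost transport route to the assigned recovery plant.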

How to cite: Mei, A., Ragazzo, A. V., Mattei, S., Zampetti, E., Tratzi, P., Cuzzucoli, A., Fontinovo, G., Pennazza, G., Torre, M., Terenzi, V., and Grosso, M.: A circular workflow based on Earth Observation tools for the detection and characterization of illegal waste dumping sites in a waste to energy framework and policy assessment, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-22369, 2024.

EGU24-22519 | Orals | GI1.1

Automated Mineralogy: from drill cores to sub-micron information 

Andrew Menzies, Alan R. Butcher, and Nigel M. Kelly

The transition towards cleaner energy and the manufacture of associated new technologies will require extraction of mineral resources at volumes much greater than at present. Consequently, the identification of new deposits is both economically and strategically important, driving a boom in mineral exploration coupled with a need to lower the environmental impact through more efficient mining capabilities. An understanding of mineralogy and texture across scales is crucial, improving knowledge at each stage of the mining cycle, from exploration through production and ultimately to waste handling. A key tool in this understanding is Automated Mineralogy.

Automated Mineralogy has been integral to process mineralogy for more than two decades, with SEM as the traditional analytical platform. However, the extension of Automated Mineralogy to scanning micro-XRF instruments allows the technique to be applied across broader spatial scales. In practice, the same logical workflow can be applied from the scale of large cut or split (minimally prepared) drill-core samples, through to polished thin sections or block mounts of various sample types (fragments or crushed plant material). At the most detailed level, the information obtained can be at the sub-micron scale of mineral classification, or even zonation within single grains.

An example of Automated Mineralogy as applied to Au-Co exploration is presented that highlights the benefit of analysis across scales, integrating information collected using the AMICS platform on drill core measured using scanning micro-XRF, and thin sections measured by SEM.  The example will also demonstrate the ability to use the same Automated Mineralogy approach to define and quantify sub-micron information within individual mineral grains.   

How to cite: Menzies, A., Butcher, A. R., and Kelly, N. M.: Automated Mineralogy: from drill cores to sub-micron information, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-22519, 2024.

EGU24-2543 | ECS | Posters on site | ITS1.15/GI1.3

Comparative Analysis of Ground-Based and Satellite-Derived UV Index: Variability and Reliability from Three South American Mid-Latitudes Sites 

Gabriela Reis, Hassan Bencherif, Marco Reis, Bibiana Lopes, Marcelo de Paula Corrêa, Damaris Kirsch Pinheiro, Lucas Vaz Peres, Rodrigo da Silva, and Thierry Portafaix

Solar ultraviolet radiation (UV) corresponds to electromagnetic waves with wavelengths of 100-400 nm, constituting approximately 5% of the energy emitted by the sun. The risks and benefits of UV exposure for life on Earth have been known for many years and include impacts on human health, materials, terrestrial and aquatic ecosystems, and biogeochemical cycles. Climate change, influenced by land use change and other factors, can increase or decrease the intensity of incident UV depending on location, season, and changes in atmospheric composition. The UV intensity reaching the surface can be reported as the UV index, a dimensionless indicator that makes it easier for people to assess UV levels and understand how to protect themselves from excessive sun exposure. In middle-income countries like Brazil and Argentina, networks and instruments for monitoring UV are often sparse and poorly supported in both capacity and funding, so obtaining reliable UV data is difficult. Only a few stations report long-term UV measurements, which significantly restricts extrapolation to all populated areas; satellites therefore offer a way to monitor UV continuously at the global scale. Like ground-based observations, satellite measurements are affected by instrument errors and are subject to uncertainties in the algorithms used to derive surface UV radiation. Therefore, evaluation of satellite-based estimates of surface UV against available ground measurements at many locations around the world is needed to characterize the errors and further refine the surface UV estimates, especially in the Southern Hemisphere, where there has been relatively little work comparing ground-based and satellite-derived UV. This study compares ground-based and satellite-derived UV index levels from OMI (Ozone Monitoring Instrument) at overpass time during clear-sky conditions, which are determined using the Lambertian Equivalent Reflectivity (LER).
A characterization of the diurnal and seasonal variability of the ground-based UV index levels will also be reported. The study period runs from 2005 to 2022, varying according to each data source, and comprises data from two Brazilian cities – Itajubá (22.41°S, 45.44°W, 885 m, Davis 6490 UV sensor) and Santa Maria (29.4°S, 53.8°W, 476 m, Brewer Spectrophotometer MKIII #167) – and from Buenos Aires, Argentina (34.58°S, 58.48°W, 25 m, Solar Light UV Biometer model 501). Comparing satellite-derived data with ground-based measurements helps validate the accuracy of satellite data, identify any discrepancies, and improve the satellite retrieval algorithms, leading to more accurate satellite-derived UV products. Such data verification is also necessary if these data are to be used for long-term trend analysis or for monitoring UV exposure risk and possible impacts on human health, as we intend to do in a future study, to better understand the dynamics of the spatio-temporal variability of surface UV in South America.
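A validation of coincident ground and satellite UV index values typically reduces to a few summary statistics (mean bias, RMSE, correlation). A generic sketch on synthetic values, not the study's data or its exact matchup protocol:

```python
import numpy as np

def validation_stats(ground_uvi, satellite_uvi):
    """Bias, RMSE and Pearson correlation between coincident
    ground-based and satellite-derived UV index values."""
    g = np.asarray(ground_uvi, dtype=float)
    s = np.asarray(satellite_uvi, dtype=float)
    diff = s - g                        # satellite minus ground
    bias = diff.mean()                  # systematic offset
    rmse = np.sqrt((diff ** 2).mean())  # total scatter
    r = np.corrcoef(g, s)[0, 1]         # linear agreement
    return bias, rmse, r

# hypothetical clear-sky overpass matchups
ground = [2.0, 5.0, 8.0, 11.0]
sat    = [2.4, 5.2, 8.6, 11.8]
bias, rmse, r = validation_stats(ground, sat)
```

A positive bias with high correlation, as in this toy case, would point to a systematic retrieval offset (e.g. aerosol absorption not accounted for) rather than random instrument noise.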

How to cite: Reis, G., Bencherif, H., Reis, M., Lopes, B., de Paula Corrêa, M., Kirsch Pinheiro, D., Vaz Peres, L., da Silva, R., and Portafaix, T.: Comparative Analysis of Ground-Based and Satellite-Derived UV Index: Variability and Reliability from Three South American Mid-Latitudes Sites, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-2543, 2024.

A multi-channel brightness temperature (TB) Fundamental Climate Data Record (FCDR) for the period 1991-present has been developed in this study using measurements from two Special Sensor Microwave Imagers (SSM/I) onboard the F11 and F13 satellites and one Special Sensor Microwave Imager/Sounder (SSMIS) onboard the F17 satellite of the US Defense Meteorological Satellite Program (DMSP). Hardware differences among these instruments were corrected using a combination of techniques including Principal Component Analysis (PCA), using the third instrument as an intermediate, and weighted averaging, which accounts for interchannel covariability and observation matching issues. After intercalibration, all imagers were standardized using SSMIS as the observation reference. The average biases of the recalibrated TBs for almost all channels between any two instruments are globally less than 0.2 K, with standard deviations (STDs) of less than 1.2 K. This resulted in a 30-year continuous and stable FCDR. Based on this FCDR, a long time series of column water vapour (CWV) over the global oceans was retrieved. Validation of this retrieved moisture product against reanalysis, in-situ radiosonde, and Global Navigation Satellite System (GNSS) measurements showed reasonable accuracy, suggesting that the presented FCDR has high potential for climate applications. In the future, this research method will be applied to more satellites to create an expanding dataset of satellite observations that could enhance the accuracy of climate model assessments and improve the reliability of climate predictions.
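The bias and standard-deviation figures quoted above come down to statistics of matched brightness temperature differences between an instrument pair. A generic sketch on synthetic matchups (not the actual SSM/I-SSMIS record or the PCA/weighted-averaging correction itself):

```python
import numpy as np

def intercal_stats(tb_ref, tb_target):
    """Per-channel global bias and standard deviation (K) of matched
    brightness temperatures between a reference and a target imager."""
    diff = np.asarray(tb_target) - np.asarray(tb_ref)
    return diff.mean(axis=0), diff.std(axis=0)

# synthetic matchups: 1000 collocated scenes, 7 SSM/I-like channels
rng = np.random.default_rng(0)
ref = 250.0 + 10.0 * rng.standard_normal((1000, 7))
tgt = ref + 0.15 + 0.5 * rng.standard_normal((1000, 7))  # small offset + noise
bias, std = intercal_stats(ref, tgt)
```

After a successful recalibration these per-channel biases should fall below the 0.2 K level reported above, with the remaining standard deviation reflecting scene mismatch and instrument noise rather than calibration drift.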

How to cite: Liu, S. and Wang, Y.: Highly consistent brightness temperature fundamental climate data record from SSM/I and SSMIS, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-4525, 2024.

The National Oceanic and Atmospheric Administration's (NOAA) Joint Polar Satellite System (JPSS) provides critical observations of the Earth and its atmosphere from the ultraviolet region to the microwave region from Low Earth Orbit (LEO). The mission now has three satellites in the same orbit: NOAA20 as the primary satellite, NOAA21 as secondary, and Suomi National Polar-orbiting Partnership (Suomi NPP) as the tertiary satellite. The primary and secondary satellites provide redundancy, since measurements from the mission provide critical inputs to global numerical weather prediction. Since 2011, this multi-mission series of LEO polar-orbiting environmental satellites has served as one of the most important sources of continuous, state-of-the-art observations of the Earth's land, oceans, and atmosphere to protect lives and property and to support the global economy by providing accurate and timely environmental information. The Visible Infrared Imaging Radiometer Suite (VIIRS), the Cross-track Infrared Sounder (CrIS), the Advanced Technology Microwave Sounder (ATMS), the Ozone Mapping and Profiler Suite (OMPS), and the Clouds and the Earth's Radiant Energy System (CERES) observe a large part of the electromagnetic spectrum from the UV region to the microwave region. All the sensors have state-of-the-art onboard calibration sources, and the data undergo extensive pre- and post-launch calibration and validation activities before being declared operational. Additionally, the NOAA/NESDIS Center for Satellite Applications and Research maintains an integrated calibration and validation system to continuously monitor and track the performance of the sensors through the mission life cycle.
NOAA also co-leads the Global Space-based Inter-Calibration System (GSICS), an international collaborative effort initiated in 2005 by the World Meteorological Organization (WMO) and the Coordination Group for Meteorological Satellites (CGMS) to monitor, improve and harmonize the quality of observations from operational weather and environmental satellites of the Global Observing System (GOS). The level 2 geophysical measurements and products also undergo extensive verification and validation through comparison of satellite products with surface-based, airborne, and/or space-based observations, which are extensively documented and shared with users. This presentation will highlight the calibration activities and the performance of JPSS sensors and products.

How to cite: Kalluri, S. and Cao, C.: Calibration and Validation of Low Earth Orbit Observations From NOAA to Support Global Environmental Monitoring, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-6427, 2024.

EGU24-6605 | Posters on site | ITS1.15/GI1.3

Utilizing Libya-4 to intercalibrate overlapping sensors in the same sun-synchronous orbit 

David Doelling, Conor Haney, Prathana Khakurel, Rajendra Bhatt, Benjamin Scarino, and Arun Gopalan

The NASA CERES observed SW and LW broadband fluxes are utilized by the climate community for monitoring the Earth’s energy imbalance and for climate model validation. The SNPP and NOAA20 CERES instruments and associated VIIRS imagers were launched into the same 1:30 PM mean local time sun-synchronous orbit, as will be the future NOAA22 Libera broadband instrument and VIIRS imager. The overlapping sensor records need to be intercalibrated to enable consistent broadband fluxes and imager cloud retrievals. The overlapping satellites are typically placed half an orbit apart, thus preventing the simultaneous nadir overpass (SNO) events required for time-matched inter-calibration strategies. A Pseudo Invariant Calibration Site (PICS), such as Libya-4, can provide overlapping sensor radiometric scaling factors without the use of SNOs. 

The clear-sky Libya-4 observed radiances were characterized both spectrally and angularly and corrected for atmospheric effects. The Libya-4 natural variability was found to be consistent across the CERES and VIIRS records, indicating that the sensor onboard calibration anomalies are smaller than the Libya-4 natural variability. Mitigating the Libya-4 natural variability will reduce the uncertainty of the radiometric scaling factors needed to provide both broadband flux and cloud retrieval continuity across the overlapping sensor records.
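The PICS-based intercalibration described above can be sketched numerically: once spectral, angular and atmospheric effects are corrected, the scaling factor between two overlapping sensors reduces to a ratio of site-mean radiances, with the site's natural variability propagating into its uncertainty. A minimal illustration (function name and inputs are assumptions, not the authors' code):

```python
import numpy as np

def pics_scaling_factor(rad_a, rad_b):
    """Radiometric scaling factor of sensor B relative to sensor A from
    clear-sky PICS overpasses (e.g. Libya-4), without SNO matchups.
    rad_a, rad_b: site-mean radiances per overpass, already corrected
    for spectral, angular (BRDF) and atmospheric effects."""
    rad_a = np.asarray(rad_a, float)
    rad_b = np.asarray(rad_b, float)
    factor = rad_b.mean() / rad_a.mean()
    # The site's natural variability propagates into the factor's
    # uncertainty (assuming independent overpasses):
    rel = np.sqrt((rad_a.std() / rad_a.mean()) ** 2 +
                  (rad_b.std() / rad_b.mean()) ** 2) / np.sqrt(rad_a.size)
    return factor, abs(factor) * rel
```

Shrinking the natural-variability terms (the standard deviations above) directly shrinks the scaling-factor uncertainty, which is the point made in the paragraph above.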

How to cite: Doelling, D., Haney, C., Khakurel, P., Bhatt, R., Scarino, B., and Gopalan, A.: Utilizing Libya-4 to intercalibrate overlapping sensors in the same sun-synchronous orbit, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-6605, 2024.

EGU24-6849 | Orals | ITS1.15/GI1.3

Validation and simulation of existing and future satellite mid and thermal infrared sensors using a combination of automated validation sites and airborne datasets 

Simon Hook, Bjorn Eng, Gerardo Rivera, Robert Freepartner, Brenna Hatch, William Johnson, Dirk Schüttemeyer, Mary Langsdale, and Martin Wooster

Post-launch calibration and validation over the lifetime of missions is needed to ensure that any long-term variation in an observation, e.g. an area getting hotter, can be unambiguously assigned to a change in the Earth system rather than a change in calibration. Such activities enable measurements from different satellites to be inter-compared and used seamlessly to create long-term multi-instrument/multi-platform data records, which serve as the basis for large-scale international science investigations into topics with high societal or environmental importance. To help address this need, we have established a set of automated validation sites where the measurements necessary for validating mid and thermal infrared data from spaceborne and airborne sensors are made every few minutes on a continuous basis. We have also conducted multi-agency airborne campaigns with thermal infrared sensors to develop precursor datasets for future NASA and ESA missions acquiring mid and thermal infrared data, as well as to characterize variability within the automated validation sites.

We have established automated validation sites at several locations, including Lake Tahoe CA/NV, Salton Sea CA and La Crau, France. The Lake Tahoe site was established in 1999, the Salton Sea site in 2008 and the La Crau site in 2023. Each site has one or more custom-built, highly accurate (50 mK) radiometers measuring the surface skin temperature. All measurements are made every few minutes and downloaded hourly via a cellular modem.

Data from the sites have been used to validate numerous satellite instruments including the Advanced Very High Resolution Radiometer (AVHRR) series, the Along Track Scanning Radiometer (ATSR) series, the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), the Landsat series, the Moderate Resolution Imaging Spectroradiometer (MODIS) on both the Terra and Aqua platforms, the Visible Infrared Imaging Radiometer Suite (VIIRS) and the ECOsystem Spaceborne Thermal Radiometer Experiment on Space Station (ECOSTRESS). In all cases the standard products have been validated, including radiance at sensor, radiance at surface, surface temperature and surface emissivity.

Over the last several years NASA and ESA have conducted multiple joint airborne campaigns to obtain data at high spatial and spectral resolutions to simulate future satellite sensors as well as characterize potential validation sites, such as the La Crau validation site. These data are currently being used to simulate the ASI/NASA Surface Biology and Geology (SBG) thermal infrared (TIR) mission, the ESA Land Surface Temperature Monitoring (LSTM) mission and the ISRO/CNES Thermal infraRed Imaging Satellite for High-resolution Natural resource Assessment (TRISHNA) mission.

We will present results from the validation of the mid and thermal infrared data using the automated validation sites as well as results from the recent airborne campaigns.

How to cite: Hook, S., Eng, B., Rivera, G., Freepartner, R., Hatch, B., Johnson, W., Schüttemeyer, D., Langsdale, M., and Wooster, M.: Validation and simulation of existing and future satellite mid and thermal infrared sensors using a combination of automated validation sites and airborne datasets, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-6849, 2024.

EGU24-9248 | ECS | Posters on site | ITS1.15/GI1.3

Monitoring Metop ASCAT backscatter stability over tropical rainforests 

Clay Harrison, Sebastian Hahn, and Wolfgang Wagner

The Advanced Scatterometer (ASCAT) on board the series of Metop satellites is a microwave radar instrument operating in C-band (5.255 GHz). ASCAT was designed to measure wind speed and wind direction over the open ocean, but the instrument has also shown its capability to observe changes in sea ice extent and surface soil moisture over land. While two Metop satellites (Metop-B, launched in September 2012, and Metop-C, launched in November 2018) are operational at the moment, the first Metop mission (Metop-A, launched in October 2006) was successfully completed in November 2021. Regular calibration campaigns based on active transponders located in Turkey ensure continuous quality monitoring, but natural targets (e.g. tropical rainforests) have also been used in the past. Previous analyses have shown that ASCAT is an extremely stable instrument providing high-quality Level 1b backscatter products. Any small changes are evaluated in detail and accounted for if necessary. However, the investigation of calibration anomalies detected by active transponders typically takes time. Monitoring natural targets has the advantage that data are continuously available rather than incremental (as is the case with active transponders), allowing earlier detection of anomalies. In any case, calibration problems can only be fully resolved retrospectively during a reprocessing of historic data and not entirely in Near Real-Time (NRT).
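The idea of monitoring a stable natural target can be sketched as a simple drift detector on a daily backscatter time series: compare a trailing rolling mean against an early-record reference and flag departures beyond a tolerance. A hypothetical illustration (window and threshold values are placeholders, not the operational H SAF settings):

```python
import numpy as np

def detect_drift(gamma0_db, window=30, threshold_db=0.12):
    """Return the day index at which the trailing rolling mean of daily
    rainforest backscatter (dB) first departs from an early-record
    reference by more than threshold_db, or None if it never does."""
    x = np.asarray(gamma0_db, float)
    ref = x[:window].mean()                       # reference level
    roll = np.convolve(x, np.ones(window) / window, mode="valid")
    anomalous = np.abs(roll - ref) > threshold_db
    if not anomalous.any():
        return None
    return int(np.argmax(anomalous)) + window - 1  # window end index
```

For example, a 0.5 dB step introduced on day 100 of an otherwise stable record would be flagged once enough post-step samples enter the 30-day window.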

The upcoming EUMETSAT H SAF ASCAT Surface Soil Moisture (SSM) products sampled at 6.25 km and 12.5 km are divided into three product categories depending on their timeliness: (i) historic data available as a Climate Data Record (CDR), (ii) a continuous and consistent extension of the CDR, also known as the Intermediate CDR (ICDR), and (iii) Near Real-Time (NRT). It is important to note that NRT products could be subject to intentional (e.g. algorithmic updates) or unintentional (e.g. instrument drifts) changes at any given point in time, which would compromise the consistency compared to historic data. Therefore, ICDR products are introduced to fill this gap and maintain consistency as far as possible. For this reason, the ICDR products will be distributed with a one-week delay, and ASCAT Level 1b backscatter will be continuously monitored using data over tropical rainforests.

In this study we present our strategy for monitoring ASCAT Level 1b backscatter stability over tropical rainforests and show results based on historic ASCAT data for all three Metop satellites. We also discuss the practical implementation of the monitoring methodology and its application as an early-warning system in the case of the ASCAT SSM ICDR product. A detected anomaly should trigger a warning for users until a more in-depth analysis determines whether it is advisable to continue or stop product distribution. Discovering problems that undermine the coherence between CDR and ICDR products is of critical importance, since applications like drought monitoring and climate studies rely on consistent time series data.

How to cite: Harrison, C., Hahn, S., and Wagner, W.: Monitoring Metop ASCAT backscatter stability over tropical rainforests, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-9248, 2024.

EGU24-9252 | ECS | Posters on site | ITS1.15/GI1.3

Using hyperspectral sensors on the ground for satellite validation. A focus on the Fluorescence Explorer mission 

Paul Naethe, Andreas Burkart, Matthias Drusch, Dirk Schuettemeyer, Marin Tudoroiu, Roberto Colombo, Mitchell Kennedy, and Tommaso Julitta

The validation of optical satellite data products is a central but challenging component of space missions. To validate the satellite images, ground data are used as reference and also allow assessment of the associated total uncertainty budget. Overall, when comparing ground data and satellite measurements, three main uncertainty sources need to be considered: i) instrument characterisation, ii) algorithm retrieval performance and iii) spatial representativeness. These key components affect the proper comparison of ground measurements with satellite data and thus have to be carefully examined. 

JB devices (FloX and RoX) are hyperspectral instruments acquiring optical field data with standardized hardware and routines. They have collected a legacy of data for over half a decade using a comprehensive and readily implemented open-source data processing chain that takes into account the individual laboratory characterization of each instrument’s optical performance. The instruments are thus capable of providing valuable data products for the purpose of satellite validation. In particular, the FloX (Fluorescence BoX, JB Hyperspectral Devices GmbH) is the first commercially available device for the measurement of solar-induced chlorophyll fluorescence (SIF). The instrument was developed with the support of the scientific community following the specifications of the Fluorescence Explorer (FLEX) mission of the European Space Agency (ESA), expected to be launched in 2024. The FloX features a high-performing spectrometer (FWHM: 0.3 nm, SSI: 0.15, SNR: 1000) and allows stand-alone measurement of SIF emission at canopy level on the ground. Furthermore, the FloX enables continuous measurement of spectral down-welling and up-welling radiance in the VIS-NIR range using an additional spectrometer covering a larger spectral range, and allows the automatic computation of reflectance as well as various vegetation indices (VIs). The instrument synchronously acquires upwelling and downwelling radiance during each measurement cycle, automatically optimizes the integration time according to light conditions, and acquires the dark current and internal quality flags to ensure high-quality data products. In addition to SIF and VIs, the FloX produces time series of high-resolution radiometric parameters, suitable for investigating the optical properties of the monitored targets. In recent years, over 60 FloX units have been deployed worldwide.
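The automatic integration-time optimization mentioned above can be illustrated with a simple proportional control step, assuming detector counts grow linearly with exposure time; all names and values here are assumptions, since the abstract does not describe the FloX firmware's actual control law:

```python
def adjust_integration_time(t_ms, peak_dn, full_well=65535,
                            target=0.8, t_min=1.0, t_max=5000.0):
    """One optimisation step: rescale the integration time so the
    spectrum's peak count lands near a target fraction of the detector's
    dynamic range, clipped to instrument limits. Assumes counts grow
    linearly with integration time."""
    if peak_dn <= 0:
        return t_max                  # scene too dark: longest exposure
    t_new = t_ms * (target * full_well) / peak_dn
    return min(max(t_new, t_min), t_max)
```

A spectrum peaking at 40% of the dynamic range would thus have its exposure roughly doubled on the next cycle, while a dark scene saturates at the maximum allowed exposure.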

Within a current ESA project, we are investigating the instrument uncertainty sources, with the final aim of defining a preliminary version of the FLEX validation plan. At the same time, instruments currently deployed at 10 locations around the world were used to examine the agreement of the ground measurements with available satellite products (i.e. Sentinel-2). This approach reverses the common practice of validating satellite data with ground measurements: the globally available, standardized L2A products of Sentinel-2 are instead used to evaluate the conformance of ground-measured data products across a network of standardized instruments. An unprecedented alignment of satellite and ground data was achieved, confirming the high validity of data products from the network of automated field spectrometers around the globe.

In summary, in this contribution we provide an overview of how field spectroscopy systems can be used in the framework of specific activities with the purpose of satellite validation.

How to cite: Naethe, P., Burkart, A., Drusch, M., Schuettemeyer, D., Tudoroiu, M., Colombo, R., Kennedy, M., and Julitta, T.: Using hyperspectral sensors on the ground for satellite validation. A focus on the Fluorescence Explorer mission, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-9252, 2024.

EGU24-10447 | Orals | ITS1.15/GI1.3

GBOV (Copernicus Ground-Based Observation for Validation) service: latest product updates and evolutions for EO data Cal/Val 

Christophe Lerebourg, Rémi Grousset, Thomas Vidal, Gabriele Bai, Marco Clerici, Nadine Gobron, Jadu Dash, Somnath Bar, Finn James, Luke Brown, Ernesto Lopez-baeza, Ana Perez-hoyos, Darren Ghent, Jasdeep Anand, Jan-Peter Muller, and Rui Song

GBOV (Copernicus Ground-Based Observation for Validation) is an element of CLMS (Copernicus Land Monitoring Service). Its initial purpose was to support the yearly validation effort of core CLMS products (TOC-R, Albedo, LAI, FAPAR, FCOVER, SSM and LST), five of which are listed among the GCOS Essential Climate Variables (ECVs). GBOV has, however, reached a much larger community, with about 1200 users, including the ESA optical MPC. A large variety of ground data is publicly available through numerous networks, including ICOS, BSRN, NEON, TERN, SurfRad and others. For the GBOV service, the choice was made to focus on data from permanent deployments, i.e. long-term datasets, rather than field campaign data. This reduces the number of available ground variables, but long-term deployments ensure the maximum number of ground-to-satellite data matchups as well as consistency of measurement protocols.

GBOV provides ground measurements (the so-called “Reference Measurements”) to the community, but its fundamental interest is that up-scaling procedures are applied to these ground measurements in order to provide Analysis Ready Validation Data (ARVD) to the community, the so-called “Land Products”. The GBOV service is freely accessible online and provides data over 112 sites. Available ground data variables include: Top of Canopy Reflectance (ToC-R), surface albedo, Leaf Area Index (LAI), Fraction of Absorbed Photosynthetically Active Radiation (FAPAR), Fraction of Covered ground (FCover), Surface Soil Moisture (SSM) and Land Surface Temperature (LST).

The networks providing GBOV initial input data are unfortunately not evenly distributed. In an attempt to reduce the thematic and geographical gaps, GBOV is developing its own network in collaboration with the existing networks. In GBOV phase 1, six ground stations were upgraded with additional instrumentation. In GBOV phase 2, a ground station was deployed in August 2023 at the Fuji Hokuroku research station in Japan for vegetation variable monitoring, as part of a collaboration with NIES (National Institute for Environmental Studies). In 2024, a vegetation station will be installed at the Fontainebleau research station (France) as part of a GBOV/ICOS collaboration. Fuji Hokuroku and Litchfield (TERN network, Australia) will receive GBOV LST stations in 2024.

Over the past year, several updates have been implemented in the GBOV database to better respond to CLMS and general user requirements. These include improved uncertainty estimates for vegetation products and improved procedures for the Soil Moisture and LST products. Further effort is being devoted to the end-to-end uncertainty budget computation.

This presentation will emphasise product status and recent product evolutions.

How to cite: Lerebourg, C., Grousset, R., Vidal, T., Bai, G., Clerici, M., Gobron, N., Dash, J., Bar, S., James, F., Brown, L., Lopez-baeza, E., Perez-hoyos, A., Ghent, D., Anand, J., Muller, J.-P., and Song, R.: GBOV (Copernicus Ground-Based Observation for Validation) service: latest product updates and evolutions for EO data Cal/Val, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-10447, 2024.

EGU24-10864 | Orals | ITS1.15/GI1.3

Calibration and Validation Activities in the Context of the 2023 GABONX Airborne SAR Campaign for Tropical Forest Height and Change Analysis over Gabon 

Marc Jaeger, Irena Hajnsek, Matteo Pardini, Roman Guliaev, Kostas Papathanassiou, Markus Limbach, Martin Keller, Andreas Reigber, Temilola Fatoyinbo, Marc Simard, Michele Hofton, Bryan Blair, Ralph Dubayah, Aboubakar Mambimba Ndjoungui, Larissa Mengue, Ulrich Vianney Mpiga Assele, and Tania Casal

Tropical forests are of great ecological and climatological importance. Although they cover only about 6% of Earth’s surface, they are home to approximately 50% of the world’s animal and plant species. Their trees store 50% more carbon than trees outside the tropics. At the same time, they are one of the most endangered ecosystems on Earth: about 6 million hectares per year are felled for timber or cleared for farming. Compared to the other components of the carbon cycle (i.e. the ocean as a sink and the burning of fossil fuels as a source), the uncertainties in local land carbon stocks and carbon fluxes are particularly large. This is especially true for tropical forests: more than 98% of the carbon flux generated by changes in land use may be due to tropical deforestation, which converts carbon stored as biomass into emissions.

In this context, the AfriSAR 2015/16 campaign, supported by ESA, was carried out over four forest sites in Gabon by ONERA (July 2015) during the dry season and by DLR (February 2016) during the wet season. From the data collected, the innovative techniques applied to estimate forest height and biomass could be improved significantly; the results are summarized in the special issue ‘Forest Structure Estimation in Remote Sensing’ of the IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing.

The motivation of the AfriSAR campaign was to acquire demonstration data for the soon-to-be-launched ESA BIOMASS mission, which was selected as the 7th Earth Explorer mission in May 2013 to meet the pressing need for information on tropical carbon sinks and sources by providing estimates of forest height and biomass. AfriSAR focused on African tropical and savannah forest types (with biomass in the 100-300 t/ha range) and complements previous ESA campaigns over Indonesian and Amazonian forest types in 2004 (INDREX-II) and 2009 (TropiSAR).

The present contribution concerns the GABONX campaign, the ESA supported successor to AfriSAR, which took place in May to July 2023. GABONX aims to detect and quantify changes that have occurred since the DLR acquisitions in February 2016. To this end, DLR’s F-SAR sensor acquired interferometric stacks of fully polarimetric L- and P-Band data over the same forest sites in the same flight geometry as in 2016. The results presented give an overview of campaign activities with particular emphasis on the calibration of the SAR instrument as well as the validation of forest parameters derived from polarimetric interferometry. The SAR sensor calibration is based on an innovative approach that leverages state-of-the-art EM simulation to accurately characterize the 5m trihedral reference target deployed for the campaign in Gabon. The validation of derived forest parameters uses lidar measurements obtained in the time frame of the GABONX campaign by NASA’s LVIS sensor. As an outlook, further collaborative calibration and validation activities will hopefully include the cross-calibration of DLR’s F-SAR and NASA’s UAVSAR, which is set to acquire L- and P-Band data over the GABONX sites in 2024.

How to cite: Jaeger, M., Hajnsek, I., Pardini, M., Guliaev, R., Papathanassiou, K., Limbach, M., Keller, M., Reigber, A., Fatoyinbo, T., Simard, M., Hofton, M., Blair, B., Dubayah, R., Mambimba Ndjoungui, A., Mengue, L., Vianney Mpiga Assele, U., and Casal, T.: Calibration and Validation Activities in the Context of the 2023 GABONX Airborne SAR Campaign for Tropical Forest Height and Change Analysis over Gabon, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-10864, 2024.

EGU24-11905 | ECS | Orals | ITS1.15/GI1.3

Multi-angular airborne observations for simulating thermal directionality at the satellite scale 

Mary Langsdale, Martin Wooster, Dirk Schuettemeyer, Simon Hook, Callum Middleton, Mark Grosvenor, Bjorn Eng, Roberto Colombo, Franco Miglietta, Lorenzo Genesio, Jose Sobrino, Gerardo Rivera, Daniel Beeden, and William Jay

Viewing and illumination geometry are known to have significant impacts on remotely sensed retrieval of land surface temperature (LST), particularly for heterogeneous regions with mixed components. Disregarding directional effects can have significant impacts on both the stability and accuracy of satellite datasets, for example when harmonising datasets from different sensors with different viewing geometries. However, it is difficult to accurately quantify these impacts, in part due to the challenges of retrieving high-quality data for the different components in a scene at a variety of different viewing and illumination geometries over a time period where the real surface temperature and sun-sensor geometries are invariant. With LST an Essential Climate Variable and the development of high resolution future thermal infrared missions (e.g. LSTM, SBG, TRISHNA), it is essential that further work is done to redress this.

With this in mind, a joint NASA-ESA airborne campaign focused on directionality was conducted in Italy in the summer of 2023, led by the National Centre for Earth Observation at King’s College London. This campaign involved concurrent acquisition across longwave infrared (LWIR) wavelengths at both nadir and off-nadir viewing angles through the deployment of two aircraft flying simultaneously, each equipped with state-of-the-art LWIR hyperspectral instrumentation. Data was collected to enable simulation of angular effects at the satellite scale over both agricultural and urban surfaces, with the aim of understanding and potentially developing adjustments for wide view angle satellite-based LST retrievals and remotely sensed evapotranspiration estimates. In-situ observations were collected additionally to enable accuracy assessment of the airborne datasets.

This presentation first details the airborne campaign, including the unique and novel data collection strategies and design modifications to enable evaluation of directional effects for thermal satellites. Preliminary results from the campaign are then presented as well as plans for further analysis related to future satellite thermal missions. 

How to cite: Langsdale, M., Wooster, M., Schuettemeyer, D., Hook, S., Middleton, C., Grosvenor, M., Eng, B., Colombo, R., Miglietta, F., Genesio, L., Sobrino, J., Rivera, G., Beeden, D., and Jay, W.: Multi-angular airborne observations for simulating thermal directionality at the satellite scale, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-11905, 2024.

EGU24-12167 | Posters on site | ITS1.15/GI1.3

The Cross-track Infrared Sounder Level 1B Product: NASA’s Accurate and Stable Infrared Hyperspectral Radiance Record 

David Tobin, Joe Taylor, Larrabee Strow, Hank Revercomb, Graeme Martin, Sergio DeSouza-Machado, Jess Braun, Daniel DeSlover, Ray Garcia, Michelle Loveless, Robert Knuteson, Howard Motteler, Greg Quinn, and William Roberts

The Cross-track Infrared Sounder (CrIS) is an infrared Fourier Transform Spectrometer onboard the Suomi-NPP (SNPP), JPSS-1, and JPSS-2 satellites. The CrIS instrument was designed to provide an optimum combination of optical performance, high radiometric accuracy, and compact packaging. While CrIS was developed primarily as a temperature and water vapor profiling instrument for weather forecasting, its high accuracy and extensive information about trace gases, clouds, dust, and surface properties make it a powerful tool for climate applications.

The goal of the NASA CrIS Level 1B project is to support NASA climate research by providing a climate-quality Level 1B (geolocation and calibration) algorithm and creating long-term measurement records for the CrIS instruments currently on orbit on the SNPP, JPSS-1, and JPSS-2 satellites, and for those to be launched on JPSS-3 and JPSS-4. The long-term objectives of the project include:

  • Create well-documented and transparent software that produces climate quality CrIS Level 1B data to continue or improve on EOS-like data records, and to provide this software and associated documentation to the NASA Sounder Science Investigator-led Processing System (SIPS).
  • Provide long-term monitoring and validation of the CrIS Level 1B data record from SNPP and JPSS-1 through JPSS-4, and long-term maintenance and refinement of the Level 1B software to enable full mission reprocessing as often as needed.
  • Provide a homogeneous radiance product across all CrIS sensors through the end of the CrIS series lifetime, with rigorous radiance uncertainty estimates.
  • Develop and support the CrIS/VIIRS IMG software and datasets, which provide a subset of Visible Infrared Imaging Radiometer Suite (VIIRS) products co-located to the CrIS footprints.
  • Develop and support the Climate Hyperspectral Infrared Product (CHIRP) for the AIRS and CrIS sounders. The CHIRP product converts the parent instrument's radiances to a common Spectral Response Function (SRF) and removes inter-satellite biases, providing a consistent inter-satellite radiance record.
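The CHIRP idea in the last bullet, projecting parent radiances onto a common SRF and removing inter-satellite biases, can be sketched as follows; the Gaussian SRF shape and the bias values are illustrative stand-ins, not the actual CHIRP definitions:

```python
import numpy as np

def to_common_srf(wn, radiance, centers, fwhm):
    """Project a hyperspectral radiance spectrum onto one common
    Gaussian SRF per output channel (SRF-weighted average).
    wn: wavenumber grid; centers: common channel centers; fwhm: SRF width."""
    wn = np.asarray(wn, float)
    rad = np.asarray(radiance, float)
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))  # FWHM -> Gaussian sigma
    out = np.empty(len(centers))
    for i, c in enumerate(centers):
        w = np.exp(-0.5 * ((wn - c) / sigma) ** 2)     # channel SRF weights
        out[i] = np.sum(w * rad) / np.sum(w)
    return out

def remove_bias(rad_common, bias):
    """Subtract per-channel inter-satellite biases (e.g. offsets
    estimated from overlap periods) to homogenize the record."""
    return np.asarray(rad_common, float) - np.asarray(bias, float)
```

After both parent instruments are projected onto the same channel set and de-biased this way, their records can be concatenated into one consistent radiance time series.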

The NASA CrIS products are available via the NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC). This presentation will include (1) an overview of the NASA Level 1B calibration algorithm and product, (2) example post-launch calibration/validation results demonstrating the accuracy and stability of the CrIS Level 1B data, and (3) example science results.

How to cite: Tobin, D., Taylor, J., Strow, L., Revercomb, H., Martin, G., DeSouza-Machado, S., Braun, J., DeSlover, D., Garcia, R., Loveless, M., Knuteson, R., Motteler, H., Quinn, G., and Roberts, W.: The Cross-track Infrared Sounder Level 1B Product: NASA’s Accurate and Stable Infrared Hyperspectral Radiance Record, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12167, 2024.

EGU24-12346 | Posters on site | ITS1.15/GI1.3

Multi-frequency SAR measurements to advance snow water equivalent algorithm development 

Chris Derksen, Richard Kelly, Benoit Montpetit, Julien Meloche, Vincent Vionnet, Nicolas Leroux, Courtney Bayer, Aaron Thompson, and Anna Wendleder

Snow mass (commonly expressed as snow water equivalent – SWE) is the only component of the water cycle without a dedicated Earth Observation mission. A number of missions currently under development, however, will provide previously unachieved coverage and resolution at frequencies ideal for retrieving SWE. These missions include a Ku-band synthetic aperture radar (SAR) mission (presently named the ‘Terrestrial Snow Mass Mission’ – TSMM) under development in Canada, and two Copernicus Expansion Missions: the Radar Observing System for Europe at L-band (ROSE-L) and the Copernicus Imaging Microwave Radiometer (CIMR). Airborne measurements are required to support SWE algorithm development for all three of these missions. In this presentation, we will present analysis of measurements from the ‘CryoSAR’ instrument, an InSAR capable L- (1.3 GHz) and Ku-band (13.5 GHz) SAR installed on a Cessna-208 aircraft.

A time series of CryoSAR measurements was acquired over open, forested, and lake sites in central Ontario, Canada during the 2022/23 winter season. These measurements were used to evaluate a new, computationally efficient SWE retrieval technique based on the use of physical snow model simulations to initialize snow microstructure information in forward model simulations for prediction of snow volume scattering at Ku-band. A primary challenge is the treatment of different layers within the snowpack. We show that a k-means classifier based on snow layer properties can effectively reduce a complex snowpack to three ‘radar-relevant’ layers which conserve SWE but simplify calculation of the snow volume radar extinction coefficient. Estimation of the background contribution is based on soil information derived from lower frequency radar measurements (X-, C-, and L-band). Our collective analysis of satellite and airborne radar observations, snow physical modeling, and SWE retrievals is facilitated by the recently developed TSMM simulator, which incorporates outputs from the Environment and Climate Change Canada land surface prediction system to produce synthetic dual-frequency (13.5 and 17.25 GHz) Ku-band radar data products.
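The layer-reduction step described above can be illustrated with a numpy-only k-means over layer properties, aggregating SWE within each class so the total is conserved; this is a sketch of the idea under simple assumptions, not the authors' implementation:

```python
import numpy as np

def reduce_layers(props, swe, k=3, iters=50, seed=0):
    """Cluster snowpack layers into k 'radar-relevant' classes with a
    minimal k-means on layer properties (rows: layers; columns: e.g.
    density, grain size), then sum SWE within each class so that total
    SWE is conserved."""
    X = np.asarray(props, float)
    swe = np.asarray(swe, float)
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # distance of every layer to every centroid, then assignment
        dist = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    swe_k = np.array([swe[labels == j].sum() for j in range(k)])
    return labels, swe_k
```

Because every layer is assigned to exactly one class and SWE is summed per class, the reduced three-layer snowpack carries the same total SWE as the full profile, as required by the abstract.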

The acquisition of multi-frequency airborne radar measurements from the CryoSAR, and the integration of these observations into the TSMM simulator, provides a fundamental new capability to generate precursor datasets that advance SWE algorithms in preparation for upcoming missions.

How to cite: Derksen, C., Kelly, R., Montpetit, B., Meloche, J., Vionnet, V., Leroux, N., Bayer, C., Thompson, A., and Wendleder, A.: Multi-frequency SAR measurements to advance snow water equivalent algorithm development, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12346, 2024.

EGU24-12428 | Orals | ITS1.15/GI1.3

ESA/NASA Quality Assurance Framework for Earth Observation Products 

Samuel Hunt, Clément Albinet, Jaime Nickeson, Batuhan Osmanoglu, Alfreda Hall, Guoqing Lin, Leonardo De Laurentiis, Philippe Goryl, Frederick Policelli, Dana Ostrenga, and Nigel Fox

Across the broad potential user base for Earth Observation (EO) data, confidence in the quality of the available products is vital, particularly for users requiring quantitative measured outputs they can rely on. As the commercial EO sector rapidly expands, however, it is an increasing challenge for the user community to discern reliably between the wide variety of product offerings, especially in terms of product quality.


In response to this, ESA and NASA, through their Joint Program Planning Group (JPPG) Subgroup, have developed a common EO product Quality Assurance (QA) Framework to provide comprehensive assessments of product quality. The evaluation is primarily aimed at verifying that the data have achieved their claimed performance levels, and reviews the extent to which the products have been prepared following community best practice, in a manner that is “fit for purpose”. A Cal/Val maturity matrix provides a high-level, colour-coded summary of the quality assessment results for users. The matrix contains a column for each section of analysis (e.g., metrology) and cells for each subsection of analysis (e.g., sensor calibration). Subsection grades are indicated by the colour of the respective grid cell, as defined in the key.
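The maturity matrix structure can be illustrated with a small data model: one column per analysis section, one cell per subsection, and grades mapped to colours. The section names and grade scale below are invented for illustration and are not the official framework taxonomy:

```python
# Illustrative grade scale and colour coding (not the official key).
GRADE_COLOURS = {"excellent": "green", "good": "yellow",
                 "limited": "orange", "not assessed": "grey"}

# One column per analysis section; one cell per subsection.
matrix = {
    "Metrology": {"Sensor calibration": "good",
                  "Uncertainty characterisation": "limited"},
    "Product documentation": {"ATBD available": "excellent"},
}

def render(matrix):
    """Flatten the matrix to (section, subsection, grade, colour) rows."""
    return [(sec, sub, grade, GRADE_COLOURS[grade])
            for sec, subs in matrix.items()
            for sub, grade in subs.items()]
```

A user scanning the rendered matrix sees at a glance which subsections of the assessment met, partially met, or lacked evidence for their claimed performance.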


Both ESA and NASA have ongoing activities supporting the procurement of commercial EO data that make use of the joint QA Framework, ensuring decisions on data acquisition are made with confidence. On the ESA side, the Earthnet Data Assessment Project (EDAP) performs data assessments on EO missions in the optical, atmospheric and SAR domains. Similarly, the NASA Earth Science Division (ESD) Commercial Smallsat Data Acquisition (CSDA) Program completed a pilot study in 2020 and has since entered a sustainment use phase for some of the commercial data sets.


In this presentation the joint ESA/NASA QA Framework is described, with some examples of its application to commercial EO products.

How to cite: Hunt, S., Albinet, C., Nickeson, J., Osmanoglu, B., Hall, A., Lin, G., De Laurentiis, L., Goryl, P., Policelli, F., Ostrenga, D., and Fox, N.: ESA/NASA Quality Assurance Framework for Earth Observation Products, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12428, 2024.

EGU24-12444 | Posters on site | ITS1.15/GI1.3

Validation of the Radiometric Scales of GLAMR and Grande 

Julia Barsi, Brendan McAndrew, Boryana Efremova, Andrei Sushkov, Nathan Kelley, and Brian Cairns

The NASA/GSFC Code 618 Calibration Laboratories include the Radiometric Calibration Lab (RCL) and the Goddard Laser for Absolute Measurement of Radiance (GLAMR) facility.  Both have large integrating sphere sources with NIST-traceable radiometric calibration.

The workhorse of the RCL is a 1-m integrating sphere with a 25.4-cm port, called Grande, illuminated by nine 150W halogen lamps, providing a broad-band radiance source (300 nm to 2400 nm).  The radiometric calibration of Grande is NIST-traceable through calibrated FEL lamps and a transfer spectroradiometer.

GLAMR is a tunable-laser based system fiber coupled to a large integrating sphere, providing a full-aperture, uniform, monochromatic radiance source. The GLAMR system has two spheres; the one used for this study was a 50-cm sphere with a 20-cm port.  The radiometric calibration is NIST-traceable through a set of calibrated transfer radiometers.

The Research Scanning Polarimeter was calibrated by both sources in 2023. There was a 3% discrepancy in the absolute radiometric calibration between the two systems. In order to investigate the discrepancy, a full wavelength scan of the GLAMR system was run with the Grande spectroradiometer in front of the GLAMR sphere, along with two other spectroradiometers that are used to monitor Grande in real time. The analysis of this dataset should establish the source of the discrepancy between the two systems and bring the two radiometric calibration systems, Grande and GLAMR, within the combined uncertainties of the methods and instruments.

How to cite: Barsi, J., McAndrew, B., Efremova, B., Sushkov, A., Kelley, N., and Cairns, B.: Validation of the Radiometric Scales of GLAMR and Grande, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12444, 2024.

EGU24-12761 | Posters on site | ITS1.15/GI1.3

Simulated Sea Surface Salinity Data from a 1/48° Ocean Model  

Frederick Bingham, Séverine Fournier, Susannah Brodnitz, Akiko Hayashi, Mikael Kuusela, Elizabeth Westbrook, Karly Carlin, Cristina González-Haro, and Verónica González-Gambau

In order to study the validation process for sea surface salinity (SSS), we have generated a year (November 2011 - October 2012) of simulated satellite and in situ “ground truth” data. This was done using the ECCO (Estimating the Circulation and Climate of the Oceans) 1/48° simulation, the highest resolution ocean model currently available. The ground tracks of three satellites, Aquarius, SMAP (Soil Moisture Active Passive) and SMOS (Soil Moisture and Ocean Salinity), were extracted and used to sample the model with a Gaussian weighting similar to that of the satellites. This produced simulated level 2 (L2) data. Simulated level 3 (L3) data were then produced by averaging L2 data onto a regular grid. The model was also sampled to produce simulated Argo and tropical mooring SSS datasets. The Argo data were combined into a simulated gridded monthly 1° Argo product. The simulated data produced from this effort have been used to study sampling errors, matchups, subfootprint variability and the validation process for SSS at L2 and L3.
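Schematically, the footprint-weighted L2 sampling and the L2-to-L3 gridding described above can be sketched as follows; all function names, array layouts and the footprint width are illustrative assumptions, not the actual ECCO sampling code:

```python
import numpy as np

def simulate_l2(model_vals, model_lat, model_lon, track_lat, track_lon,
                footprint_km=40.0):
    """Sample a (flattened) model SSS field along a satellite ground track
    with Gaussian footprint weighting. The 40 km footprint is an assumed,
    illustrative value."""
    sigma = footprint_km / 2.355  # interpret the footprint as the FWHM
    l2 = np.empty(len(track_lat))
    for i, (lat0, lon0) in enumerate(zip(track_lat, track_lon)):
        # local flat-plane distances in km (adequate for a small footprint)
        dx = (model_lon - lon0) * 111.0 * np.cos(np.radians(lat0))
        dy = (model_lat - lat0) * 111.0
        w = np.exp(-0.5 * (dx**2 + dy**2) / sigma**2)
        l2[i] = np.sum(w * model_vals) / np.sum(w)
    return l2

def grid_l3(l2, lat, lon, res=1.0):
    """Average L2 samples onto a regular lat/lon grid (simulated L3)."""
    ilat = np.floor((np.asarray(lat) + 90.0) / res).astype(int)
    ilon = np.floor((np.asarray(lon) + 180.0) / res).astype(int)
    nlat, nlon = int(180 / res), int(360 / res)
    total = np.zeros((nlat, nlon))
    count = np.zeros((nlat, nlon))
    np.add.at(total, (ilat, ilon), np.asarray(l2, float))
    np.add.at(count, (ilat, ilon), 1.0)
    with np.errstate(invalid="ignore"):
        return total / count  # NaN where no samples fall in a cell
```

The same weighted-average structure applies for each satellite, with only the footprint width and track geometry changing.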

How to cite: Bingham, F., Fournier, S., Brodnitz, S., Hayashi, A., Kuusela, M., Westbrook, E., Carlin, K., González-Haro, C., and González-Gambau, V.: Simulated Sea Surface Salinity Data from a 1/48° Ocean Model, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12761, 2024.

EGU24-12907 | Orals | ITS1.15/GI1.3

Advancing Sea Surface Salinity R&D: The Pi-MEP Initiative for Satellite Salinity Data Validation and Exploitation 

Sébastien Guimbard, Nicolas Reul, Roberto Sabia, Raul Díez-García, Sylvain Herlédan, Ziad El Khoury Hanna, Tong Lee, Julian Schanze, Frederic Bingham, and Klaus Scipal

The Pilot-Mission Exploitation Platform (Pi-MEP) for salinity is an initiative originally meant to support and widen the uptake of ESA Soil Moisture and Ocean Salinity (SMOS) mission data over the ocean. Since its beginning in 2017, the project has aimed at setting up a computational web-based platform focusing on satellite sea surface salinity data validation, also supporting process studies over the ocean. It has been designed in close collaboration with a dedicated science advisory group in order to achieve three main objectives: 1) gathering all the data required to exploit satellite sea surface salinity data; 2) systematically producing a wide range of metrics for comparing and monitoring sea surface salinity products’ quality; and 3) providing user-friendly tools to explore, visualize and exploit both the collected products and the results of the automated analyses.

Over the years, the Pi-MEP has become a reference hub for the validation of satellite sea surface salinity mission products (SMOS, Aquarius, SMAP), being collocated with an extensive in situ database (e.g. Argo floats, thermosalinographs, moorings, surface drifters, saildrones and equipped marine mammals) and additional thematic datasets (precipitation, evaporation, currents, sea level anomalies, sea surface temperature, etc.). Collocated databases between satellite products and in situ datasets are systematically generated, together with validation analysis reports for 30 predefined regions. The data and reports are made fully accessible through the web interface of the platform. The datasets, validation metrics and tools of the platform are described in detail in Guimbard et al., 2021. Several dedicated scientific case studies involving satellite SSS data are also systematically investigated by the platform, such as the monitoring of major river plumes, mesoscale signatures in boundary currents, or spatio-temporal evolution in challenging regions (high latitudes, semi-enclosed seas, and the high-precipitation region of the eastern tropical Pacific).

Since 2019, a partnership to sustain the Salinity Pi-MEP project has been agreed between ESA and NASA, encompassing R&D and validation over the entire set of satellite salinity sensors. The two Agencies are now working together to widen the platform features on several technical aspects, such as triple-collocation software implementation, additional match-up collocation criteria and sustained exploitation of data from dedicated in-situ field campaigns (e.g., SPURS, EUREC4A).

In this talk, we will showcase the main results of the latest phase of the project, with the recent distinctive focus on the representation errors characterization of the various satellite salinity missions. 

Guimbard, S.; Reul, N.; Sabia, R.; Herlédan, S.; Khoury Hanna, Z.E.; Piollé, J.-F.; Paul, F.; Lee, T.; Schanze, J.J.; Bingham, F.M.; Le Vine, D.; Vinogradova-Shiffer, N.; Mecklenburg, S.; Scipal, K. & Laur, H. (2021). The Salinity Pilot-Mission Exploitation Platform (Pi-MEP): A Hub for Validation and Exploitation of Satellite Sea Surface Salinity Data. Remote Sensing 13(22):4600.

How to cite: Guimbard, S., Reul, N., Sabia, R., Díez-García, R., Herlédan, S., El Khoury Hanna, Z., Lee, T., Schanze, J., Bingham, F., and Scipal, K.: Advancing Sea Surface Salinity R&D: The Pi-MEP Initiative for Satellite Salinity Data Validation and Exploitation, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12907, 2024.

EGU24-13262 | Orals | ITS1.15/GI1.3

Intercomparison of Landsat OLI and Sentinel 2 MSI performance 

Esad Micijevic, Cody Anderson, Julia Barsi, Rajagopalan Rengarajan, MD. Obaidul Haque, and Joshua Mann

For Landsat 8 and Landsat 9 (L8 and L9), the radiometric stability of the Operational Land Imager (OLI) is monitored using two solar diffusers, three sets of stimulation lamps, and regular lunar collects. Consistent response to the multiple calibrators provides high confidence in the radiometric characterization of the imagers over time and in the calibration parameters needed to maintain the stability of image products. After 11 years on orbit, all spectral bands in Landsat 8 OLI are stable to within 1.5%, while Landsat 9 OLI degradation over its 2.5 years of life remains within 0.3% across all bands.

The MultiSpectral Instruments (MSIs) onboard the Sentinel 2A and 2B (S2A and S2B) satellites were designed with 8 of their 13 spectral bands similar to those of the OLIs, which created opportunities to combine data from both types of instruments and obtain a higher temporal frequency of Earth observations. To ensure proper interoperability among the different instruments, they need to be radiometrically cross-calibrated and consistently georeferenced. We use coincident acquisitions over Pseudo Invariant Calibration Sites (PICS) to monitor the radiometric calibration consistency and stability of the instruments over time. For geometry, Landsat and Sentinel 2 images acquired within a month of each other over the same ground targets were used to assess the co-registration accuracy between the sensor products.

Our results show a general agreement in the radiometry of all four instruments over their lifetimes to within 1%. Following the launch of the MSI instruments, the initial geometric co-registration assessment between the MSI instruments and the Landsat 8 OLI instrument showed more than 12 m Circular Error (CE90), larger than a Sentinel 2 10 m pixel. To further improve co-registration and, thus, interoperability of the four instruments, Landsat Collection-2 products use a geometric reference that was harmonized using the Global Reference Image (GRI). The GRI is a dataset consisting of geometrically refined Sentinel 2 images with an absolute accuracy better than 6 m globally. After adopting a common geometric reference in the generation of Landsat and Sentinel 2 products, our assessment of geometric co-registration of the Landsat and Sentinel terrain-corrected products shows a CE90 of less than 6 m.
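For reference, CE90 as used above is simply the 90th percentile of the radial offsets between matched tie points; a minimal sketch (a hypothetical helper, not the actual assessment tooling):

```python
import numpy as np

def ce90(dx_m, dy_m):
    """Circular Error at 90% confidence: the radius (in metres) within which
    90% of the horizontal offsets between matched tie points fall."""
    radii = np.hypot(np.asarray(dx_m, float), np.asarray(dy_m, float))
    return float(np.percentile(radii, 90.0))
```

Applied to tie-point offsets between terrain-corrected products, this statistic yields results in the same form as the "CE90 of less than 6 m" quoted above.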

Multiple efforts have also been made to validate the accuracy of surface reflectance products from both Landsat and Sentinel 2. In-situ measurements have been made during overpasses of L8, L9, S2A, and S2B using various methods. These measurements also show consistency between all the sensors and can also be used for other missions.

How to cite: Micijevic, E., Anderson, C., Barsi, J., Rengarajan, R., Haque, MD. O., and Mann, J.: Intercomparison of Landsat OLI and Sentinel 2 MSI performance, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-13262, 2024.

EGU24-13926 | ITS1.15/GI1.3

Verification of FengYun-3E MWTS and MWHS Calibration Accuracy Using GPS Radio Occultation Data

The fifth FengYun satellite (FY-3E) was successfully launched into orbit on 5 July 2021. It carries the third-generation microwave temperature sounder (MWTS-III) and the second-generation microwave humidity sounder (MWHS-II), providing global atmospheric temperature and humidity measurements. It is important to assess the in-orbit performance of MWTS-III and MWHS-II and understand their calibration accuracy before applications in numerical weather prediction. Since atmospheric profiles from Global Positioning System (GPS) radio occultation (RO) are stable and accurate, they are very valuable for assessing microwave sounder performance in orbit, as demonstrated by many previous studies. This study aims at quantifying the calibration biases of the FY-3E MWTS-III and MWHS-II sounding channels of interest using collocated GPS RO data from 1 January to 30 September 2023. The MWTS-III channels inherit most of the second-generation MWTS features, with frequencies near the oxygen absorption band (50-60 GHz), and channels at 23.8 and 31.4 GHz were added. Considering that the GPS RO data are more stable and accurate from the mid-troposphere to the lower stratosphere and that the atmospheric radiative transfer model is accurate in the upper troposphere and lower stratosphere, the mid- to upper-level sounding channels of the MWTS-III, i.e. channels 7-14, are of interest in this study. The cross-track scanning instrument MWHS-II provides 15 channels, at frequencies near 89, 118.75, 150 and 183.31 GHz. Of interest to this study are MWHS-II channels 2-6 and 11-15. Using the collocated COSMIC RO data in clear-sky conditions as inputs to the Advanced Radiative Transfer Modeling System (ARMS), brightness temperatures are simulated for FY-3E MWTS-III and MWHS-II at their viewing angles.

The collocation criterion between the radio occultation data and the MWTS-III/MWHS-II measurements is that the spatial and temporal differences are less than 50 km and 3 h, respectively. To simulate more accurate brightness temperatures, the RO data should be obtained under clear-sky conditions over oceans. To determine clear sky for MWTS-III, the cloud liquid water path algorithm developed by Grody et al. (2001) was used, while for MWHS-II the cloud detection algorithm developed by Hou et al. (2019) was used. The initial analysis shows that, for the upper sounding channels, the mean biases of the MWTS-III observations relative to the GPS RO simulations are negative for channels 7-8 and 10-13, with absolute values <2 K, and positive for channels 9 and 14, with values <1 K. For the MWHS-II, the mean biases in brightness temperature are negative for channels 2-6, with absolute values <2 K and relatively small standard deviations. The mean biases are also negative for MWHS-II channels 11-15, with absolute values <1 K but relatively large standard deviations. The biases of both MWTS-III and MWHS-II show scan-angle dependence and are almost symmetrical across the scan line. The long-term mean bias shows only a weak dependence on latitude, which suggests that biases do not vary systematically with brightness temperature. The evaluation results indicate very good prospects for the assimilation of FY-3E microwave sounding data.
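The 50 km / 3 h matchup criterion described above can be sketched as a brute-force search; the function names and data layout are illustrative assumptions, not the study's code:

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in km."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    a = (np.sin((p2 - p1) / 2.0) ** 2
         + np.cos(p1) * np.cos(p2) * np.sin(np.radians(lon2 - lon1) / 2.0) ** 2)
    return 2.0 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

def collocate(ro_lat, ro_lon, ro_t_h, obs_lat, obs_lon, obs_t_h,
              max_km=50.0, max_h=3.0):
    """Return (i_ro, j_obs) index pairs whose spatial and temporal
    separations are within 50 km and 3 h, respectively."""
    pairs = []
    obs_t_h = np.asarray(obs_t_h, float)
    for i, (la, lo, t) in enumerate(zip(ro_lat, ro_lon, ro_t_h)):
        # time window first (cheap), then the great-circle distance check
        for j in np.flatnonzero(np.abs(obs_t_h - t) <= max_h):
            if haversine_km(la, lo, obs_lat[j], obs_lon[j]) <= max_km:
                pairs.append((i, int(j)))
    return pairs
```

In practice a spatial index would replace the inner loop for large datasets; the criterion itself is unchanged.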

How to cite: Hou, X. and Han, Y.: Verification of FengYun-3E MWTS and MWHS Calibration Accuracy Using GPS Radio Occultation Data, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-13926, 2024.

EGU24-14759 | Orals | ITS1.15/GI1.3

Sea Surface Salinity in the Arctic Ocean - Results from the NASA SASSIE Field Campaign, Calibration-Validation of Satellite Observations, and Data Outreach 

Julian Schanze, Peter Gaube, Jessica Anderson, Frederick Bingham, Kyla Drushka, Sebastien Guimbard, Tong Lee, Nicolas Reul, Roberto Sabia, and Elizabeth Westbrook and the NASA Salinity and Stratification at the Sea Ice Edge Field Campaign Team

The National Aeronautics and Space Administration (NASA) Salinity and Stratification at the Sea Ice Edge (SASSIE) field campaign took place in the Arctic Ocean between August and October of 2022. The scientific aim is to understand the relationship between both haline and thermal stratification and sea-ice advance, and to test the hypothesis that a significant fresh layer at the surface can accelerate the formation of sea ice by limiting convective processes. With the advent of satellite-derived sea surface salinity (SSS) observations from SMOS, Aquarius/SAC-D, and SMAP in the last decade, such observations could provide insights into sea ice formation rates and extent. Because the sensitivity of L-band radiometry to SSS is low at the temperatures prevalent in the Arctic Ocean (-2°C to 5°C), and because of additional problems with sea ice contamination in the satellite footprint, careful calibration and validation are needed to determine the quality of satellite-derived SSS in this region, particularly near the ice edge.

Here, we present three components that have resulted from this NASA Field Campaign.

1.) An overview of data gathered is presented, including an unprecedented density of near-surface salinity measurements from diverse platforms. These were measured during a one-month shipboard hydrographic and atmospheric survey in the Beaufort Sea and include continuous observations at radiometric depth (1-2cm) from the salinity snake instrument, more than 3000 high-resolution uCTD profiles, and air-sea flux measurements. Concurrent with the shipborne observations, an airborne campaign to observe ocean salinity, temperature, and other parameters from a low-flying aircraft was performed. Finally, we discuss the deployment and results of autonomous assets, buoys, and floats that were able to observe both the melt season and the sea ice advance. We combine these in situ observations with satellite SSS data to examine the effects of stratification on ocean dynamics in the Beaufort Sea near the sea ice edge and discuss the quality of SSS data in this region.

2.) The NASA Physical Oceanography Program has affirmed its commitment to Open Science and the reproducibility of results. For the SASSIE field campaign, we have created a unique web portal that showcases the datasets gathered during the campaign, giving video overviews as well as written summaries of the available data and the motivations for their collection. We have also created repositories that contain the processing code used in the creation of these datasets, as well as example processing scripts in the form of Jupyter notebooks, which allow end users to execute a live download of datasets from NASA's Physical Oceanography Distributed Active Archive Center (PO.DAAC) and to process and plot these data in Python.

3.) We show the active integration of these tools into the salinity pilot mission exploitation platform (Salinity Pi-MEP), operated by the European Space Agency (ESA) in collaboration with NASA. We demonstrate how such an integration leverages access to other datasets and facilitates calibration-validation efforts for Level-2 and Level-3 satellite data from multiple satellites.

How to cite: Schanze, J., Gaube, P., Anderson, J., Bingham, F., Drushka, K., Guimbard, S., Lee, T., Reul, N., Sabia, R., and Westbrook, E. and the NASA Salinity and Stratification at the Sea Ice Edge Field Campaign Team: Sea Surface Salinity in the Arctic Ocean - Results from the NASA SASSIE Field Campaign, Calibration-Validation of Satellite Observations, and Data Outreach, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-14759, 2024.

EGU24-14920 | Posters on site | ITS1.15/GI1.3

Sentinel-3 Land Ice Thematic Product: Evaluation of Greenland surface elevation and elevation change.  

Sebastian B. Simonsen, Louise Sandberg Sørensen, Stine K. Rose, and Jérémie Aublanc

The Sentinel-3 satellite series, developed by the European Space Agency as part of the Copernicus Programme, currently comprises two satellites, Sentinel-3A and Sentinel-3B, launched on 16th February 2016 and 25th April 2018, respectively. These satellites are equipped with various instruments, including a radar altimeter, enabling them to conduct operational topography measurements of the Earth's surface. The primary objective of the Sentinel-3 constellation concerning land ice is to provide highly accurate topographic measurements of polar ice sheets. These data are crucial in supporting, e.g., ice sheet mass balance studies. Unlike previous missions that utilized conventional pulse-limited altimeters, Sentinel-3 employs an advanced SAR Radar ALtimeter (SRAL) with delay-Doppler capabilities, resulting in significantly enhanced spatial resolution for surface topography measurements. The Sentinel-3 Mission Performance Cluster (MPC) is tasked with monitoring the stability and accuracy of the mission. Here, we report on the latest findings on the Greenland ice sheet.

ESA and the MPC recently developed a specialized delay-Doppler Level-2 processing chain (thematic products) over three dedicated surfaces: inland waters, sea ice, and land ice. For land ice, delay-Doppler processing with an extended window has been implemented to enhance the coverage of the ice sheet margins. With the improved coverage at the ice sheet margins, we can now access and monitor the fastest-changing regions of the Greenland ice sheet. Hence, the essential climate variable surface elevation change (SEC) can be derived directly from Sentinel-3 alone and, due to the operational concept of the Sentinel program, is ensured to provide continuous observations until at least 2030. Here, we present the latest SEC results based on the land ice thematic product and compare them to the other polar altimetric missions (CryoSat-2 and ICESat-2) to provide a benchmark for the performance of the Sentinel-3 mission in a coming period with fewer polar radar altimeters.
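At its core, the SEC variable discussed above is a temporal trend of repeated elevation measurements in each grid cell; the following least-squares sketch illustrates the principle only (the operational processing uses more elaborate repeat-track and plane-fit methods, and all names here are assumptions):

```python
import numpy as np

def elevation_change_rate(time_yr, elev_m):
    """Fit h(t) = h0 + (dh/dt) * t by least squares and return dh/dt in m/yr
    for one grid cell of repeated altimetric elevations."""
    t = np.asarray(time_yr, float)
    h = np.asarray(elev_m, float)
    # centre the time axis so the intercept is the mean elevation
    design = np.vstack([t - t.mean(), np.ones_like(t)]).T
    rate, _ = np.linalg.lstsq(design, h, rcond=None)[0]
    return float(rate)
```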

How to cite: Simonsen, S. B., Sandberg Sørensen, L., Rose, S. K., and Aublanc, J.: Sentinel-3 Land Ice Thematic Product: Evaluation of Greenland surface elevation and elevation change., EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-14920, 2024.

EGU24-15137 | Orals | ITS1.15/GI1.3

Utilizing surface-based observations from the Micro Pulse Lidar Network (MPLNET) for validation of space-based satellite missions 

Jasper Lewis, James Campbell, Erica Dolinar, Simone Lolli, Sebastian Stewart, Larry Belcher, and Ellsworth Welton

Starting with the Lidar In-Space Technology Experiment (LITE) in 1994, spaceborne lidars have provided highly detailed global views of the vertical structure of clouds and aerosols. Since that time, surface-based lidars, as well as aircraft lidars, have been used for validation through correlative measurements. While the validation of space-based lidar systems by surface-based lidar observations is not straightforward, protocols for doing so are well-established and have shown good agreement in many instances.

The Micro Pulse Lidar Network (MPLNET) is a federated, global network of Micro Pulse Lidar systems deployed worldwide to measure aerosol and cloud vertical structure and mixed layer heights. The data have been collected continuously, day and night, for more than 20 years from sites around the world, with multiple sites containing 5+ or 10+ years of data. MPLNET is also a contributing network to the World Meteorological Organization (WMO) Global Atmosphere Watch (GAW) Aerosol Lidar Observation Network (GALION). The use of common instrumentation and processing algorithms within MPLNET allows for direct comparisons between sites. Thus, long-term MPLNET measurements can be used to verify the fidelity of geophysical parameters measured throughout the lifetime of individual satellite missions (e.g. CALIPSO, CATS, EarthCARE, CALIGOLA, and AOS) and provide a metric for intercomparisons between different space-based lidar missions when gaps between satellite missions occur.

In this presentation, we use multiple years of comparisons between MPLNET and the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) flown aboard CALIPSO. For these comparisons, we use newly developed Level 3 MPLNET products consisting of monthly, diurnal statistics for cloud and aerosol retrievals covering a representative range of conditions and locations. Furthermore, we compare top-of-the-atmosphere cirrus cloud radiative forcing derived from these two complementary platforms. Finally, using results from an upcoming validation rehearsal, we demonstrate how these procedures will be utilized during the EarthCARE mission, scheduled to launch in May 2024.    

How to cite: Lewis, J., Campbell, J., Dolinar, E., Lolli, S., Stewart, S., Belcher, L., and Welton, E.: Utilizing surface-based observations from the Micro Pulse Lidar Network (MPLNET) for validation of space-based satellite missions, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-15137, 2024.

EGU24-15316 | ITS1.15/GI1.3

A Fundamental Climate Data Record Derived from AMSR-E, MWRI, and AMSR2

Fundamental climate data records (FCDRs) play a vital role in monitoring climate change. In this article, we develop a spaceborne passive microwave-based FCDR by recalibrating the Advanced Microwave Scanning Radiometer for Earth Observing System (AMSR-E) on the Aqua satellite, the microwave radiometer imager (MWRI) onboard the FengYun-3B (FY3B) satellite, and the Advanced Microwave Scanning Radiometer-2 (AMSR2) onboard JAXA's Global Change Observation Mission first-Water (GCOM-W1) satellite. Before recalibration, it is found that AMSR-E and AMSR2 observations are stable over time, but MWRI drifted colder before May 2015 and had nonnegligible errors in geolocation for most channels. In addition, intersensor differences of brightness temperatures (TBs) are as large as 5–10 K. To improve data consistency and continuity, several intersensor calibration methods are applied by using AMSR2 as a reference while using MWRI to bridge the data gap between AMSR2 and AMSR-E. The double difference method is used to provide intersensor difference time series for correcting calibration biases, such as scene temperature-dependent bias, solar-heating-induced bias, and systematic constant bias. Hardware differences between sensors are corrected using principal component analysis. After recalibration, the mean biases of both MWRI and AMSR-E are less than 0.3 K compared to the AMSR2 reference and their standard deviations are less than 1 K for all channels. Under oceanic rain-free conditions, the TB biases are less than 0.2 K for all channels and no significant relative bias drifts were found between sensors for overlapping observations. These statistics suggest that the consistency between these instruments was significantly improved and the derived FCDR could be useful for obtaining long-term water cycle-related variables for climate research.
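The double difference step described above can be illustrated with a minimal constant-bias sketch; the variable names and the single-scalar bias model are simplifying assumptions, not the authors' implementation:

```python
import numpy as np

def double_difference(tb_obs_target, tb_sim_target, tb_obs_ref, tb_sim_ref):
    """Mean double difference over common scenes:
    (O - B)_target - (O - B)_reference, where O is the observed and B the
    simulated brightness temperature. This estimates the target sensor's
    calibration bias relative to the reference sensor."""
    sd_target = np.asarray(tb_obs_target) - np.asarray(tb_sim_target)
    sd_ref = np.asarray(tb_obs_ref) - np.asarray(tb_sim_ref)
    return float(np.mean(sd_target - sd_ref))

def remove_bias(tb_obs_target, bias):
    """Recalibrate by subtracting the estimated constant intersensor bias."""
    return np.asarray(tb_obs_target) - bias
```

A time series of such double differences, stratified by scene temperature or orbital phase, supports the scene-dependent and solar-heating corrections mentioned above.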

How to cite: Wu, B. and Wang, Y.: A Fundamental Climate Data Record Derived from AMSR-E, MWRI, and AMSR2, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-15316, 2024.

EGU24-15804 | ECS | Orals | ITS1.15/GI1.3

A multi frequency altimetry snow depth product over the Arctic sea ice 

Alice Carret, Sara Fleury, Alessandro Di Bella, Jack Landy, Isobel Lawrence, Antoine Laforge, Nathan Kurtz, and Florent Garnier

For more than 10 years, CryoSat-2 (CS2) has observed and monitored the Arctic Ocean, providing unprecedented spatial and temporal coverage. Satellite altimetry enables the measurement of sea ice thickness, an essential variable for understanding sea ice dynamics. Numerous sea-ice products developed by the community have shown the skill of CS2 in retrieving sea-ice thickness. Nevertheless, several questions remain in order to better quantify the quality of the measurements. One of them is to better assess the snow depth, a key parameter for obtaining the sea ice thickness. In 2018, the ICESat-2 mission was launched, carrying a LIDAR altimeter. We took advantage of the difference in penetration into the snow layer between laser and Ku-band altimetry to compute a snow depth product covering the ICESat-2 period. This product is then validated and compared to in situ datasets, reanalyses, models and other snow depth products from satellite missions such as SARAL. The comparison to in situ datasets gives quite good results, lending confidence in the product's reliability. In July 2020, the orbit of CryoSat-2 was raised, as part of the CRYO2ICE project, to coincide in space and time with tracks from NASA's high-resolution altimeter ICESat-2 over the Arctic Ocean. This is a unique opportunity to benefit from along-track collocated data. We present here a methodology to compare ICESat-2 and CryoSat-2 along coincident tracks and compare the resulting snow depth product to gridded products. The lack of in situ measurements is one of the main limitations in analyzing the along-track product's contribution. Finally, we focus on the advantages of combining laser and Ku-band altimetry to lower the uncertainties. The snow depth uncertainties of our product are about 6 cm on average. This ESA-supported study should help prepare the Copernicus CRISTAL mission, which will include a Ka/Ku dual-frequency altimeter for the first time.
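The laser/Ku-band differencing exploited above rests on laser returns coming from the snow surface and Ku-band returns from near the snow-ice interface; a minimal sketch, assuming the common empirical density-dependent correction for the slower radar propagation in snow (the default density and all names here are assumptions, not the authors' retrieval):

```python
import numpy as np

def snow_depth_m(laser_freeboard_m, radar_freeboard_m, rho_snow_g_cm3=0.3):
    """Snow depth from the difference between laser (snow surface) and
    uncorrected Ku-band radar (snow-ice interface) freeboards. The radar
    range is stretched by the slower propagation in snow, commonly modelled
    as c_snow = c / (1 + 0.51 * rho)**1.5, so the freeboard difference is
    divided by that factor to recover the geometric snow depth."""
    eta = (1.0 + 0.51 * rho_snow_g_cm3) ** 1.5
    diff = np.asarray(laser_freeboard_m) - np.asarray(radar_freeboard_m)
    return np.clip(diff / eta, 0.0, None)  # snow depth cannot be negative
```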

How to cite: Carret, A., Fleury, S., Di Bella, A., Landy, J., Lawrence, I., Laforge, A., Kurtz, N., and Garnier, F.: A multi frequency altimetry snow depth product over the Arctic sea ice, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-15804, 2024.

EGU24-15810 | Posters on site | ITS1.15/GI1.3

Building a comprehensive picture of sea surface, troposphere and ionosphere contributions in precise GNSS reflectometry from space 

Maximilian Semmling, Weiqiang Li, Florian Zus, Mostafa Hoseini, Mario Moreno, Mainul Hoque, Jens Wickert, Estel Cardellach, Andreas Dielacher, and Hossein Nahavandchi

Signals of Global Navigation Satellite Systems (GNSS) are subject to propagation effects such as reflection, refraction and scintillation. Twenty years ago, a first dedicated payload was launched on a satellite mission (UK-DMC) to study Earth-reflected GNSS signals and their potential for Earth observation. It was a milestone in the research field of satellite-based reflectometry. The altimetric use of reflectometry is of particular interest for the geoscience community. The permanent and global availability of GNSS signals, exploited in an altimetric reflectometry concept, can help to improve the rather sparse coverage of today’s altimetric products.

Studies of altimetric reflectometry concepts started thirty years ago. However, the sea surface roughness, the limited GNSS signal bandwidth, orbit uncertainties and the sub-mesoscale variability (we assume here a horizontal scale < 50 km) of the troposphere and ionosphere pose a persistent challenge for the altimetric interpretation and application of reflectometry data.

The ESA nano-satellite mission PRETTY (Passive REflecTometry and dosimeTrY) will investigate the altimetric application of reflectometry, concentrating on a grazing-angle geometry. A mitigation of roughness-induced signal disturbance can be expected at these angles; on the other hand, at grazing angles tropospheric and ionospheric variability rises in importance. The PRETTY satellite and payload were developed by an Austrian consortium and successfully launched on 9th October 2023 into the dedicated polar orbit (orbit height roughly 550 km). We formed a science consortium (among the partners listed here) to merge competences in the fields of altimetry and GNSS signal propagation effects.

Based on the mission’s ATBD (Algorithm Theoretical Baseline Document), we conducted simulations and case studies of existing satellite data. They allow a first quantification of the expected roughness and sea surface topography effects, as well as tropospheric and ionospheric biases, in grazing-angle geometry. The preliminary results show that, for calm ocean areas (significant wave height < 1 m) and over sea ice, altimetric retrievals reach centimeter-level precision. In these specific cases, the residual Doppler shift is small (mHz range), which indicates moderate variability of the tropospheric and ionospheric contributions. New observation data from the PRETTY mission are expected early in 2024. We will then extend our picture toward a more general altimetric use of precise reflectometry data.

How to cite: Semmling, M., Li, W., Zus, F., Hoseini, M., Moreno, M., Hoque, M., Wickert, J., Cardellach, E., Dielacher, A., and Nahavandchi, H.: Building a comprehensive picture of sea surface, troposphere and ionosphere contributions in precise GNSS reflectometry from space, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-15810, 2024.

EGU24-16893 | ITS1.15/GI1.3

An overview of collaborative field campaigns, calval and community science activities enabled through the ESA-NASA Joint Program Planning Group

The US National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA) created a Joint Program Planning Group (JPPG) in 2010 to enhance coordination between NASA and ESA on current and future space Earth Observation missions. One of the three sub-groups of the JPPG is dedicated to collaboration in field measurement campaigns, mission and product calval and, more recently, collaborative EO community science projects.

Since 2010 the JPPG has initiated or informed numerous airborne field campaigns to help develop and document the scientific objectives, develop geophysical retrieval algorithms and provide calibration and/or validation for present and/or future satellites to be operated by NASA, ESA, and their partners. The activities address an underlying need to demonstrate unambiguously that space-based measurements, which are typically based on engineering measurements by the detectors (e.g. photons), are sensitive to and can be used to reliably retrieve the geophysical and/or biogeochemical parameters of interest across the Earth and validate mission design. Such campaigns have included subjects as diverse as atmospheric trace gas composition over the western US, solar-induced fluorescence over the eastern United States, wind profiles over the North Atlantic, vegetation canopy profiles in Gabon, and sea ice and ice sheet properties in the Arctic and Antarctic. The collaborative field campaign and calval activities have supported the use of surface-based, airborne, and/or space-based observations to develop precursor data sets and to support both pre- and post-launch calibration/validation and retrieval algorithm development for space-based satellite missions measuring our Earth system.

The generation of consistent, inclusive, community-based assessments of Earth system change through integrated analyses of these different data sets is also a critically important process in the challenge of documenting Earth system change. To assist in this process the JPPG has supported collaborative community efforts including three installments of the Ice Mass Balance Intercomparison Experiment (IMBIE; two completed, one ongoing), the NASA-ESA Snow on Sea Ice Intercomparison (NESOSI), and the Arctic Methane and Permafrost Challenge (AMPAC).

In this talk, a review of JPPG activities and their results, as well as current plans for future collaborations including campaigns, will be provided.

How to cite: Davidson, M. W. J., Drinkwater, M., and Kaye, J.: An overview of collaborative field campaigns, calval and community science activities enabled through the ESA-NASA Joint Program Planning Group, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-16893, 2024.

EGU24-17973 | Orals | ITS1.15/GI1.3

Post-launch Validation of the Copernicus Atmospheric Composition Satellites: Outcomes of the CCVS Gap Analysis 

Tijl Verhoelst, Jean-Christopher Lambert, Martine De Mazière, Bavo Langerock, Steven Compernolle, Folkert Boersma, Daan Hubert, Arno Keppens, Clémence Pierangelo, Gaia Pinardi, Mahesh Kumar Sha, Frederik Tack, Nicolas Theys, Gijsbert Tilstra, Michel Van Roozendael, Corinne Vigouroux, Angelika Dehn, Philippe Goryl, Thierry Marbach, and Sébastien Clerc

The European Earth Observation programme Copernicus is implementing the next-generation system for atmospheric composition monitoring: after the success of the Sentinel-5 Precursor TROPOMI, a constellation of Sentinel-4 geostationary and Sentinel-5 Low-Earth orbiting missions will be launched in 2025 and beyond for air quality, ozone and climate variables monitoring, while the CO2M missions will observe greenhouse gas emissions and related proxies. Post-launch validation of the data products is essential to determine their quality and enable users to judge their fitness-for-purpose. Therefore, in 2021-2022 the European Union funded the H2020 Copernicus Cal/Val Solution (CCVS) project with the aim of reviewing the status of existing validation infrastructures and methods for all Sentinel missions and defining a holistic solution to overcome limitations. In this contribution we report on the maturity assessment and gap analysis performed in this project. This assessment synthesizes lessons learned from earlier work in FP7 and H2020 projects, and from the operational/routine validation services run in the ESA/Copernicus Atmosphere Mission Performance Cluster (ATM-MPC), the EUMETSAT Atmospheric Composition Satellite Application Facility (AC SAF), the Copernicus Atmosphere Monitoring Service (CAMS) and the Copernicus Climate Change Service (C3S). The CCVS assessment includes feedback from space agencies, Copernicus stakeholders and the CEOS Working Group on Calibration and Validation (WGCV).

The validation means, such as the precursor data sets and comparison methods, have evolved significantly in the past decade: (1) New ground-based instruments have been developed and networks have expanded in geographical coverage and in capabilities, (2) traceability to metrological standards and uncertainty characterization of the (Fiducial) Reference Measurements (FRM) has improved considerably, (3) rapid provision of FRM through data distribution services is becoming commonplace, (4) the advantages of advanced comparison methods have been demonstrated, and (5) all of this has facilitated the development of operational, near-real-time validation systems such as the Validation Data Analysis Facility (VDAF-AVS) of the ATM-MPC for the Sentinel-5P mission.

On the other hand, a number of remaining challenges still restrain the scope and quality of the validation of several atmospheric data products: (1) Station-to-station differences in ground-based validation results suggest (poorly understood) intra-network and inter-network inhomogeneity, (2) the coverage offered by ground-based networks (of the full range of the measurand values and of the influence quantities affecting the retrieval) can have important gaps, (3) timeliness of ground-based data provision remains poor for several products, (4) comparability (representativeness) between ground-based and satellite measurements requires further methodological advances and supporting measurement campaigns, (5) the accuracy and breadth of scope of the latest generation of satellite sounders puts correspondingly tight and difficult-to-meet requirements on the FRM data quality, (6) cross-validation of the different satellites requires a coordinated approach, and (7) some networks and activities experience increased/recurrent funding difficulties.

We conclude this overview of the CCVS gap analysis for atmospheric composition data with illustrations of concrete actions undertaken recently to address some of the validation challenges highlighted by the project.

The CCVS project has received funding from the European Union’s Horizon 2020 programme under grant agreement No 101004242 (Project title: “Copernicus Cal/Val Solution”).

How to cite: Verhoelst, T., Lambert, J.-C., De Mazière, M., Langerock, B., Compernolle, S., Boersma, F., Hubert, D., Keppens, A., Pierangelo, C., Pinardi, G., Kumar Sha, M., Tack, F., Theys, N., Tilstra, G., Van Roozendael, M., Vigouroux, C., Dehn, A., Goryl, P., Marbach, T., and Clerc, S.: Post-launch Validation of the Copernicus Atmospheric Composition Satellites: Outcomes of the CCVS Gap Analysis, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-17973, 2024.

EGU24-19307 | Orals | ITS1.15/GI1.3

Four decades of cryosphere albedo from spaceborne observations - assessment with field data 

Jason Box, Rasmus Bahbah, Andreas Ahlstrøm, Adrien Wehrlé, Alexander Kokhanovsky, Ghislain Picard, and Laurent Arnaud

Snow and ice albedo plays a fundamental role in climate change amplification: by modulating absorbed sunlight, it controls the largest average source of melt energy. Further, the presence or lack of light-absorbing impurities, including living matter, and meltwater effects can strongly influence snow and ice heating rates. Through multiple consecutive satellite missions, cryosphere albedo has now been mapped globally and continuously for more than four decades.
This work examines a 42-year record of cryosphere albedo by joining the satellite climate records of snow and ice albedo from AVHRR (1982 to present), NASA MODIS (1999 to present), and EU Copernicus Sentinel-3 (2017 to present). The long-term stability of the climate records is examined using independent field data from Greenland and Antarctica. Additionally, the work presents long-term trends in snow and ice albedo in relation to the competing effects of surface melting, snowfall and rainfall.

How to cite: Box, J., Bahbah, R., Ahlstrøm, A., Wehrlé, A., Kokhanovsky, A., Picard, G., and Arnaud, L.: Four decades of cryosphere albedo from spaceborne observations - assessment with field data, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-19307, 2024.

EGU24-19918 | Posters on site | ITS1.15/GI1.3

CryoSat Mission: CalVal, Science and International Cooperation Activities 

Alessandro Di Bella and Tommaso Parrinello

Launched in 2010, the European Space Agency’s (ESA) CryoSat mission was the first polar-orbiting satellite flying a SAR interferometric altimeter dedicated to the cryosphere, with the objective of precisely monitoring changes in the thickness of polar ice sheets and floating sea ice. After 14 years in orbit, CryoSat remains one of the most innovative radar altimeters in space and continues to deliver high-quality data, providing unique contributions to several Earth science and application domains. The mission has been extended until the end of 2025 to achieve important scientific objectives and to extend the synergy with other missions by further strengthening international cooperation.

Routine CalVal activities are fundamental to evaluate the accuracy of CryoSat measurements, to monitor the long-term stability of the altimeter, and to characterise uncertainties in the final geophysical retrievals. In this talk, we present the CryoSat mission status and show results from some of the several CalVal activities currently in place, e.g., acquisitions over transponders, comparison of sea level at tide gauges and exploitation of data collected during polar field campaigns. We also highlight the importance of international cooperation in CalVal and science activities from the perspective of the ESA-NASA CRYO2ICE campaign, which aligns CryoSat's orbit with that of ICESat-2, and the Sea Ice Thickness Intercomparison Exercise (SIN’XS) project, which aims to provide reconciled sea ice thickness estimates in both hemispheres. Finally, we discuss how current and future CryoSat activities are crucial to prepare for the upcoming Copernicus CRISTAL mission, which will provide coincident measurements at Ka and Ku bands.

How to cite: Di Bella, A. and Parrinello, T.: CryoSat Mission: CalVal, Science and International Cooperation Activities, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-19918, 2024.

EGU24-20394 | Orals | ITS1.15/GI1.3

Validation and support of space-based measurements with the Pandonia Global Network of ground-based spectrometers 

Thomas Hanisco, Nader Abuhassan, Stefano Casadio, Alexander Cede, Limseok Chang, Angelika Dehn, Barry Lefer, Elena Lind, Apoorva Pandey, Bryan Place, Alberto Redondas, James Szykman, Martin Tiefengraber, Luke Valin, Michel van Roozendael, and Jonas von Bismarck

Since 2019 the NASA Pandora and ESA Pandonia projects have been collaborating to coordinate and facilitate the expansion of a global network of ground-based spectrometers to support space-based measurements of trace gases relevant to air quality (NO2, O3, HCHO, SO2, …). This network of standardized, calibrated Pandora instruments, the Pandonia Global Network (PGN), is focused on providing data needed to help validate satellite measurements and to contribute to scientific studies of air quality. As of January 2024, the PGN is comprised of 158 official sites in 34 countries. This presentation will describe recent efforts to expand and improve the network to support the increased capability and complexity of space-based measurements. Collaborative efforts by partner agencies, especially the US Environmental Protection Agency (EPA) and the Korean National Institute of Environmental Research (NIER), and new programs such as the Increasing Participation in Minority Serving Institutions (IPMSI) and Satellite Needs Working Group (SNWG) have accelerated the growth of the PGN, providing greater global coverage and allowing improved data products. With these improvements and continued input from other suborbital assets, the PGN is well positioned to facilitate the interpretation and validation of high spatial resolution and diurnal measurements provided by the newest orbiting and geostationary satellite instruments.

How to cite: Hanisco, T., Abuhassan, N., Casadio, S., Cede, A., Chang, L., Dehn, A., Lefer, B., Lind, E., Pandey, A., Place, B., Redondas, A., Szykman, J., Tiefengraber, M., Valin, L., van Roozendael, M., and von Bismarck, J.: Validation and support of space-based measurements with the Pandonia Global Network of ground-based spectrometers, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-20394, 2024.

EGU24-20397 | Orals | ITS1.15/GI1.3

Integration of ACIX-III Land Atmospheric Correction Inter-comparison eXercise within the Copernicus Expansion Mission Product Algorithm Laboratory to Support Surface Reflectance Cal/Val 

Kevin Alonso, Noelle Cremer, Valentina Boccia, Philip G. Brodrick, Adam Chlus, Georgia Doxani, Ferran Gascon, Sander Niemeijer, David R. Thompson, Philip Townsend, and Nikhil Ulahannan

Atmospheric Correction Inter-comparison eXercise (ACIX) was initiated in 2016 in the frame of the Committee on Earth Observation Satellites (CEOS) Working Group on Calibration & Validation (WGCV) and it is co-organised by ESA and NASA. The aim of ACIX is to compare the state-of-the-art atmospheric correction (AC) processors. ACIX is a voluntary and open-access initiative to which every AC processor’s developer is invited to participate. In the current third edition, ACIX-III Land, the focus is on imaging spectrometer data, also called hyperspectral data. Data from two spectrometers in orbit (PRISMA and EnMAP) will be used in a suite of test sites. These sites were selected based on the availability of ground-based measurements and flight campaign data with coincident acquisitions, i.e., RadCalNet and CHIME-AVIRIS-NG campaigns.

The ACIX-III Land exercise will intercompare the performance of several AC software suites capable of retrieving Surface Reflectance (SR), Water Vapour (WV) and Aerosol Optical Depth (AOD). The original datasets, along with the participant results, will be catalogued, intercompared, and analysed within the Copernicus Expansion Mission - Product Algorithm Laboratory (CEM-PAL). The CEM-PAL is a virtual environment aiming to facilitate efficient prototyping of the algorithms used to generate and test Expansion Mission Level-2 products, including algorithm modification, hosted processing, qualification functionalities and a scientific validation environment. Once the ACIX-III results are published, the dataset will be repurposed to initially support the CHIME L2 developments, with plans to extend the support to other missions (e.g., SBG, LSTM).

This contribution will present the ACIX-III Land and CEM-PAL initiatives, highlighting the main implementation points, latest status, and future developments to support related Cal/Val activities.

How to cite: Alonso, K., Cremer, N., Boccia, V., Brodrick, P. G., Chlus, A., Doxani, G., Gascon, F., Niemeijer, S., Thompson, D. R., Townsend, P., and Ulahannan, N.: Integration of ACIX-III Land Atmospheric Correction Inter-comparison eXercise within the Copernicus Expansion Mission Product Algorithm Laboratory to Support Surface Reflectance Cal/Val, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-20397, 2024.

EGU24-20612 | Posters on site | ITS1.15/GI1.3

Using In Situ Airborne Measurements to Evaluate Pandora Ground-based Remote Sensing Formaldehyde Data Products  

Jason St. Clair, Glenn Wolfe, and Thomas Hanisco

Measurements of boundary layer formaldehyde (HCHO) are valuable for air quality monitoring, both because HCHO is classified as an air toxic by the US EPA and because HCHO concentrations directly reflect recent VOC oxidation and therefore are a diagnostic for ozone production. The Pandora network, with instruments deployed across the US and around the world, is a promising source of boundary layer HCHO data but previous evaluation of Pandora HCHO data was limited to total column HCHO at two sites during one campaign. Here we extend the evaluation to include Pandora tropospheric column and profiling data products derived from differential optical absorption spectroscopy (DOAS) operation. NASA’s SARP-East program provided a unique opportunity to evaluate the Pandora DOAS data products with profiling spirals by an airborne in situ payload that included the NASA Goddard CAFE HCHO instrument. Comparison of CAFE and Pandora data will be presented with the goal of better informing the Pandora data community of its performance.

How to cite: St. Clair, J., Wolfe, G., and Hanisco, T.: Using In Situ Airborne Measurements to Evaluate Pandora Ground-based Remote Sensing Formaldehyde Data Products, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-20612, 2024.

EGU24-20665 | Orals | ITS1.15/GI1.3

Using Pandora direct sun and MAX-DOAS formaldehyde columns for evaluating satellite retrievals 

Apoorva Pandey, Bryan Place, Jin Liao, Nader Abuhassan, Alexander Cede, Thomas Hanisco, and Elena Lind

Atmospheric formaldehyde (HCHO) is a short-lived but ubiquitous product of hydrocarbon oxidation. It is a tracer of hydrocarbon emissions and reactivity. HCHO has been observed from satellite-based instruments for over two decades. Retrievals typically involve (1) fitting slant columns to the observed UV/IR radiances and (2) deriving vertical columns from the slant columns using air mass factors. Air mass factors are calculated using radiative modeling and a-priori vertical HCHO distributions from a chemical transport model. The Pandora instruments form a ground-based remote sensing network that is valuable for validating satellite retrievals. Pandora provides total and tropospheric columns of HCHO via direct sun (DS) and Multi-Axis Differential Optical Absorption Spectroscopy (MAX-DOAS) observations in the UV, respectively. Here, we discuss conversion of slant columns to vertical columns for DS and MAX-DOAS Pandora measurements, neither of which involves radiative modeling and a-priori assumptions. We intercompare daily and seasonal variations in Pandora HCHO columns from these two distinct measurement techniques for ‘hotspot’ and ‘background’ sites to demonstrate their robustness and complementary strengths, as well as to estimate their uncertainties. We further examine the inter-site and seasonal variability in satellite (e.g., OMI, OMPS) retrievals relative to Pandora HCHO columns.
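For the direct-sun observations mentioned above, the slant-to-vertical conversion indeed needs no radiative transfer model: the air mass factor is purely geometric. A minimal sketch of that generic conversion (an illustration under the plane-parallel approximation, not the Pandora processing code):

```python
import math

def geometric_amf(sza_deg: float) -> float:
    """Geometric air mass factor for direct-sun viewing: in a
    plane-parallel atmosphere the slant path through an absorbing
    layer scales as 1 / cos(solar zenith angle)."""
    return 1.0 / math.cos(math.radians(sza_deg))

def vertical_column(slant_column: float, sza_deg: float) -> float:
    """Convert a fitted slant column density to a vertical column density."""
    return slant_column / geometric_amf(sza_deg)
```

At a solar zenith angle of 60° the geometric AMF is 2, so a fitted slant column of 2.0e16 molec cm-2 corresponds to a vertical column of 1.0e16 molec cm-2.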

How to cite: Pandey, A., Place, B., Liao, J., Abuhassan, N., Cede, A., Hanisco, T., and Lind, E.: Using Pandora direct sun and MAX-DOAS formaldehyde columns for evaluating satellite retrievals, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-20665, 2024.

EGU24-20707 | ECS | Posters on site | ITS1.15/GI1.3

Intercomparison of Pandora surface and vertical profile NO2 retrievals with in-situ network measurements and airborne observations across the Eastern USA 

Bryan Place, Apoorva Pandey, Lukas Valin, Jason St. Clair, Thomas Hanisco, Nader Abuhassan, Alexander Cede, and Elena Spinei

Trace gas total and tropospheric/stratospheric column retrievals from the Pandora instruments across the Pandonia Global Network (PGN) have played a key role in satellite validation. With the addition of multi-axis differential optical absorption spectroscopy (MAX-DOAS) retrievals to the latest Pandora processing software (Blick v1.8), the PGN now generates surface and vertically-resolved trace gas measurements that will further aid in future satellite product validation. The MAX-DOAS retrievals developed for the Pandora instrument rely upon simple assumptions and measurements and do not require complex radiative transfer calculations, allowing for the columns to be retrieved at a sub-hourly timescale. In this presentation, we give a brief overview of the theory and measurements behind the Pandora MAX-DOAS retrievals and provide an evaluation of the MAX-DOAS NO2 products. For the evaluation we show an intercomparison of PGN NO2 surface products with co-located surface network measurements taken from the US Environmental Protection Agency Air Quality System (EPA AQS) database.  We also compare Pandora NO2 vertical profiles with profiles collected from both sonde and aircraft measurements in the Eastern United States.

How to cite: Place, B., Pandey, A., Valin, L., St. Clair, J., Hanisco, T., Abuhassan, N., Cede, A., and Spinei, E.: Intercomparison of Pandora surface and vertical profile NO2 retrievals with in-situ network measurements and airborne observations across the Eastern USA, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-20707, 2024.

EGU24-888 | ECS | Posters on site | NH9.11 | Highlight

Identifying Mixing Components by Natural Tracers in the Lake Hévíz System 

Saeed Bidar Kahnamuei, Katalin Hegedűs-Csondor, Petra Baják, Ákos Horváth, Dénes Szieberth, György Czuppon, Márta Vargha, Bálint Izsák, and Anita Erőss

One of the largest natural thermal lakes in the world, Lake Hévíz is located in the southwestern part of the Transdanubian Range’s karst system (Hungary). It is fed by springs with different temperatures, which are located in a cave beneath the lake. The mixing of cold and hot waters generates the lake’s sulphuric therapeutic water, and it is responsible for the cave formation at the bottom, resulting in the lake's unique ecosystem. The presented research aimed at the comprehensive geochemical characterization of waters in the wider surroundings of the lake (lake water, springs, observation, drinking water, and thermal water wells). Investigating the geochemical characteristics of water took on a novel perspective through the innovative application of radionuclides as natural tracers. Within the framework of this investigation, we utilized uranium, radium, and radon isotopes to identify the mixing of fluids and infer the mixing end members in the Hévíz karst system. Alpha spectrometry was applied on selectively adsorbing Nucfilm discs as an inventive approach to measure uranium and radium isotopes. Moreover, stable isotopic ratios of hydrogen and oxygen (δ2H and δ18O) were determined to supplement the information on waters with different origins. Hydrochemical water analysis for measuring the concentration of major ions and trace elements was carried out using ICP-MS, ion chromatography, and UV-Vis spectrophotometry. The inferred fluid end members and their compositions are anticipated to provide insightful information on the hydrogeological functioning of the Lake Hévíz karst system, which is indispensable in sustainable water resource management and understanding climate change's impact.



Keywords: Thermal lake; Hydrogeochemical characteristics; Mixing fluids; Radionuclides; Stable isotopes; ICP-MS; Nucfilm; Alpha spectrometry

How to cite: Bidar Kahnamuei, S., Hegedűs-Csondor, K., Baják, P., Horváth, Á., Szieberth, D., Czuppon, G., Vargha, M., Izsák, B., and Erőss, A.: Identifying Mixing Components by Natural Tracers in the Lake Hévíz System, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-888, 2024.

EGU24-908 | ECS | Posters on site | NH9.11 | Highlight

Proximal Gamma Ray Spectroscopy for monitoring Soil Water Content in vineyards 

Michele Franceschi, Matteo Alberi, Marco Antoni, Ada Baldi, Alessio Barbagli, Luisa Beltramone, Laura Carnevali, Alessandro Castellano, Giovanni Collodi, Enrico Chiarelli, Tommaso Colonna, Vivien De Lucia, Andrea Ermini, Andrea Maino, Fabio Gallorini, Enrico Guastaldi, Nicola Lopane, Antonio Manes, Fabio Mantovani, Samuele Messeri, Dario Petrone, Silvio Pierini, Kassandra Giulia Cristina Raptis, Andrea Rindinella, Riccardo Salvini, Daniele Silvestri, Virginia Strati, and Gerti Xhixha

Soil Water Content (SWC) is a key parameter in precision agriculture for achieving high levels of crop efficiency and health while reducing water consumption. In particular, in the case of vineyards, due to recent extreme temperature fluctuations, knowledge of the SWC of the entire field becomes crucial to allow timely intervention with emergency irrigation to preserve plant health and yield.

Unlike electromagnetic SWC measurements, which are point-scale, and gravimetric measurements, which are point-scale and also time-consuming, the Proximal Gamma Ray Spectroscopy (PGRS) technique can provide field-scale, non-invasive, and real-time measurements of SWC. This is achieved with an in-situ NaI detector continuously recording photons from the radioactive decay of 40K in the soil, which are attenuated in proportion to the amount of stored water. Given the inverse proportionality between soil moisture and the photons detected by the gamma ray sensor, the SWC value can be readily obtained.
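The inverse proportionality between the 40K count rate and stored water can be condensed into a simple calibration relation; the sketch below uses one commonly quoted form, in which a single constant (the default 1.11 here is illustrative, roughly the water-to-soil mass-attenuation ratio at the 1.46 MeV 40K line) converts a calibrated count-rate ratio into gravimetric water content. This is an assumption-laden illustration, not the authors' processing chain:

```python
def swc_from_k40(count_rate: float, cal_rate: float, cal_swc: float,
                 omega: float = 1.11) -> float:
    """Gravimetric soil water content from a 40K photopeak count rate S,
    given a calibration pair (cal_rate, cal_swc), via
        w = (S_cal / S) * (w_cal + omega) - omega.
    The wetter the soil, the more 40K photons are attenuated, so lower
    count rates map to higher water content, as described in the text."""
    return (cal_rate / count_rate) * (cal_swc + omega) - omega
```

With this form, a count rate equal to the calibration rate returns the calibration water content, and any drop in count rate (e.g. after rainfall) raises the retrieved SWC.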

In this study we investigate the performance of PGRS applied to the case study of a vineyard at the farm “Il Poggione”, located in Montalcino (Siena, Italy).

The effectiveness of the method is supported by two tests. First, a validation compared the PGRS measurement of (5.8 ± 1.5)% with a gravimetric measurement of (9.0 ± 2.5)%, highlighting a 1-σ agreement. Second, a rainfall-recognition test: in correspondence with the most significant rainfall event (18 mm), the SWC value increased by 7.8% from before to after the rain.

Moreover, the integration of the in-situ system with an agrometeorological station resulted in a Web App, allowing for real time data storage and thus facilitating data management, spectrum analysis, and display for both gamma ray sensor and agrometeorological station results, enabling comprehensive studies of environmental parameters (e.g., temperature, air humidity).

This research underlines the potential of PGRS as a precise, real-time, field-scale SWC monitoring tool, not only in vineyards but for cultivated fields in general. Further refinements of the gamma ray spectra analysis and broader applications in environmental monitoring are envisaged for improved agricultural practices.

This study was supported by the project STELLA (Sistema inTEgrato per Lo studio del contenuto d'acqua in agricoLturA) (CUP: D94E20002180009) funded by the Tuscany region under the program POR FESR 2014/2020.

How to cite: Franceschi, M., Alberi, M., Antoni, M., Baldi, A., Barbagli, A., Beltramone, L., Carnevali, L., Castellano, A., Collodi, G., Chiarelli, E., Colonna, T., De Lucia, V., Ermini, A., Maino, A., Gallorini, F., Guastaldi, E., Lopane, N., Manes, A., Mantovani, F., Messeri, S., Petrone, D., Pierini, S., Raptis, K. G. C., Rindinella, A., Salvini, R., Silvestri, D., Strati, V., and Xhixha, G.: Proximal Gamma Ray Spectroscopy for monitoring Soil Water Content in vineyards, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-908, 2024.

EGU24-1450 | ECS | Orals | NH9.11 | Highlight

Origin of radioactivity in a neoformed mineral: the case of epsomite from the Perticara sulfur mine 

Matteo Giordani, Marco Taussi, Maria Assunta Meli, Carla Roselli, Giacomo Zambelli, Ivan Fagiolino, and Michele Mattioli

Recently, high amounts of toxic and radioactive elements have been discovered in epsomite crystals in the abandoned sulphur mine of Perticara, Italy (Giordani et al., 2022). Epsomite represents a neoformed mineral grown in the galleries after the extraction activities of the sulfur mine. In particular, a content of 5.59 ± 0.84 Bq/g of 210Po was detected in the epsomite phase, coupled with other toxic elements such as 228Th, As, Co, Fe, Mn, Ni, Sr, Ti, Zn.

The anomalous content of polonium led to new investigations of the area through the study of different matrices present in the galleries: minerals, host-rock, water, air, dust and bitumen, with the aim to define the origin and the distribution of this hazardous element. The samples were investigated combining several analytical techniques: X-ray Powder Diffraction (XRPD), Environmental Scanning Electron Microscopy (ESEM-EDS), Inductively Coupled Plasma-Atomic Emission (ICP-AES), Inductively Coupled Plasma-Mass Spectrometry (ICP-MS), Atomic Absorption Spectrometry (AAS), Gamma Spectrometry, Alpha Spectrometry, Radon Monitor, and Alpha Track Detector (ATD).

Water samples showed high Al, Fe, Pb, Mg, and Mn content but not radioactive elements. The bitumen sample showed a higher amount of 210Po and 210Pb (0.12 ± 0.02 Bq/g and 0.11 ± 0.02 Bq/g, respectively), compared to the host-rock and fibrous sericolite samples, but lower than fibrous epsomite crystals (210Po 5.59 ± 0.84 Bq/g; 210Pb 5.93 ± 1.19 Bq/g). A slight anomaly in the 40K and 226Ra content of the host-rock was observed (0.38 ± 0.05 Bq/g and 0.052 ± 0.007 Bq/g respectively), and a high 222Rn concentration (up to 2200 ± 300 Bq/m3) was also detected in the tunnels (Giordani et al., 2024).

The confined atmosphere of the mine, with the high 222Rn concentration, is likely the source of the high level of 210Po and 210Pb, in radioactive equilibrium, detected in epsomite. Thus, the 222Rn-rich, anoxic, and hypoxic atmosphere, coupled with the abundance of Mn, Fe, and organic matter in the mine, could play a key role in the 210Po remobilization. This work highlighted that natural epsomite, which is a very common mineral phase in mines, caves, and underground environments, is able to capture 210Po and 210Pb. For this reason, it should be used as a mineral indicator for the presence of radioactive elements in similar environmental conditions, also helping to ensure safe management. These results indicate that in areas with a long history of mining, despite decommissioning, environmental hazards and human health risks may still emerge in terms of radioactivity and potentially toxic elements (PTEs).


Giordani, M., Meli, M.A., Roselli, C., Betti, M., Peruzzi, F., Taussi, M., Valentini, L., Fagiolino, I. and Mattioli, M., 2022. Could soluble minerals be hazardous to human health? Evidence from fibrous epsomite. Environmental Research, 206, p.112579.

Giordani, M., Taussi, M., Meli, M.A., Roselli, C., Zambelli, G., Fagiolino, I. and Mattioli, M., 2024. High-levels of toxic elements and radioactivity in an abandoned sulphur mine: Insights on the origin and associated environmental concerns. Science of the Total Environment, 906, p.167498.

How to cite: Giordani, M., Taussi, M., Meli, M. A., Roselli, C., Zambelli, G., Fagiolino, I., and Mattioli, M.: Origin of radioactivity in a neoformed mineral: the case of epsomite from the Perticara sulfur mine, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-1450, 2024.

The research aimed to analyse variations in soil gas radon concentrations and geogenic radon potential in areas of typical building plots located in regions known for high and low geogenic radon potential. The study was designed to address the following questions:

  • Are spatial variations in soil gas radon concentrations and radon potential statistically significant in the area of a typical building plot? Are these variations similar in regions known for high and low radon potential?
  • How many measurement points should be proposed to properly evaluate geogenic radon potential and radon index on the building plot area?
  • Can an in-situ gamma spectrometric survey, combined with soil properties, be useful in defining the radon index in the area of the building plot?
  • Are seasonal variations of soil gas radon concentration significant at a depth of 0.8 m?  If so, which season is the most appropriate to evaluate geogenic radon potential?

The research was conducted in two counties, Wrocław and Dzierżoniów, located in the Lower Silesian Voivodeship in the southwest part of Poland. Dzierżoniów County is among the counties listed in the Regulation of 18 June 2020 of the Minister of Health where the average radon concentration in a significant number of buildings may exceed the reference level of 300 Bq m−3. In both regions, three building plots, each with an area of 300 m2 (the size of a typical building plot in an urban area in Poland), were identified. At each building plot, five measurement points were designated: at the four corners and in the middle of each plot. The research at each measurement point included the following procedures:

  • Soil gas radon concentrations were measured at a depth of 0.8 m using solid-state nuclear track detectors. The detectors were replaced at the beginning of each season, starting from summer 2023.
  • The radionuclide content of the soil was measured in situ using the gamma-ray spectrometer Exploranium RS-230.
  • The ambient gamma dose rate was measured with the radiometer RK-100.
  • Various soil properties, including grain size, permeability, and filtration coefficient, were determined.

Additionally, at each building plot, the instantaneous radon concentration and soil permeability measurements were performed using Lucas cells and RADON-JOK.

The preliminary research results indicate that in Dzierżoniów County uranium contents ranged from 1.6 ppm to 3.3 ppm and thorium from 5.4 ppm to 8.2 ppm, whereas in Wrocław County uranium contents ranged from 1.6 ppm to 2.5 ppm and thorium from 4.3 ppm to 7.4 ppm. The instantaneous survey revealed that in Dzierżoniów County soil gas radon concentration varied from 10.338 kBq m−3 to 31.050 kBq m−3 and soil permeability from 1×10−12 m2 to 1×10−13 m2, whereas in Wrocław County soil gas radon concentration varied from 0.102 kBq m−3 to 0.266 kBq m−3 and soil permeability from very low (impossible to measure with the equipment used) to 2×10−13 m2.

The research project was supported by the program „Excellence initiative – research university” for the years 2020–2026 for the University of Wrocław.

How to cite: Tchorz-Trzeciakiewicz, D.: Variations of soil gas radon concentrations in a typical building plot area - preliminary results, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-3352, 2024.

EGU24-4664 | Orals | NH9.11

Gamma spectroscopy for geological studies 

Rares Suvaila

Gamma-ray spectroscopy is used in a large number of interdisciplinary applications; it provides information on the identity of radioactive nuclides and allows their quantitative determination.

Gamma rays are electromagnetic radiation of nuclear origin; their detection is indirect, as it depends on the production of secondary particles that are collected to produce an electrical signal.

Of all detector types, we prefer semiconductor ones, particularly hyper-pure germanium detectors, which have very high efficiency and excellent energy resolution. Depending on the sample type, the computerized analysis of the spectra occasionally has to be adapted or customized. The enormous differences among the environmental samples we face (from air filters to sediment, water to organic matter) led us to develop protocols with a common general methodology but different approaches for treating the various matrices, whether homogeneous or not.

The opposite extremes in the use of gamma-ray spectroscopy are low and high count rate systems. Our job is to evaluate limits, adapt to the statistical conditions, and calculate correction factors in order to get results as close as possible to reality.

Among our strengths are various non-standard protocols, as well as the use of information from the sum (coincidence) peaks to determine source activity and volume distribution; a study based only on the single gamma peaks yields only a large domain of possible source positions, without clear activity information. Another important topic is source homogeneity, information on which is given by the count rates of peaks of different nature.

Our work is mainly experimental; most of the experiments are meant to be performed in the laboratory, as an interdisciplinary approach to nuclear and environmental science. One very important issue in this field is the need to adapt to the changing radiation background, whatever the origin of the changes. The possibility of performing in situ gamma spectrometry should also not be neglected, as it offers multiple benefits, such as on-the-spot analysis, quick tests, feasibility studies, accident dosimetry, or simply mapping.

Additionally, we perform neutron activation on the samples, which means we can get initially non-emitting nuclei to de-excite by gamma radiation: following neutron capture, the activated nuclei disintegrate by a beta process and subsequently emit characteristic gamma radiation, which helps us identify initially "silent" isotopes and brings precious additional information.


Our results, obtained experimentally and by Monte Carlo simulations in hypothesis testing of homogeneity properties and/or hot spots in volume sources, are now being patented. We also seek to develop the field of quantum-correlated gamma spectroscopy, which is emerging with new possibilities for treating entangled photons from environmental materials and specimens. Our main purpose for this event is to seek partnership opportunities across Europe.


How to cite: Suvaila, R.: Gamma spectroscopy for geological studies, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-4664, 2024.

EGU24-5787 | ECS | Posters on site | NH9.11

From the collective to the individual radon risk exposure: an insight in the current European regulation 

Eleonora Benà, Giancarlo Ciotoli, Peter Bossew, Eric Petermann, Luca Verdi, Claudio Mazzoli, and Raffaele Sassi

Radon (222Rn) is a radioactive gas considered the major source of ionizing radiation exposure for the population, and it represents a significant health risk when it accumulates in indoor environments. In Europe, regulation has been implemented to address indoor radon exposure, including the setting of national reference levels and the identification of so-called Radon Priority Areas (RPAs). Although the European directive defines RPAs as areas where the annual average indoor radon concentration in a significant number of dwellings is expected to exceed the reference level, the concept and interpretation of a "significant number of buildings" has remained unclear. Under this definition, radon is effectively classified as an anthropogenic hazard, since it is strongly tied to indoor concentrations. However, indoor radon levels can vary significantly at the municipal level, even among neighbouring dwellings, mostly due to differences in building characteristics and inhabitants' habits. Since this approach may bypass the natural origin of radon, many authors (mostly geologists) propose using the Geogenic Radon Potential (GRP) as a hazard indicator. The GRP represents the amount of radon that can potentially enter buildings from geogenic sources. With the radon hazard and risk concepts still debated, researchers have recently proposed a clear transition from the radon hazard to the more comprehensive radon risk concept, arguing that mapping this geo-hazard (GRP) is a fundamental step in defining the collective radon risk exposure. Collective Risk Areas (CRAs) are composed of many possible small Individual Risk Areas (IRAs). Considering that radiation protection aims to reduce detriment, radon abatement policies have to address these CRAs without neglecting areas with high individual risk, in order to protect individuals from high exposure.
On the one hand, collective risk areas have been proposed as geologically based risk areas; on the other hand, individual risk areas are strictly linked to the Indoor Radon Concentration (IRC) and may be assimilated to the "classical" RPA concept. Considering the absence of an unambiguous methodology at the European scale to define RPAs, and the proposed CRA mapping as the first step towards defining the IRAs ("classical" RPAs), with this work we aim to lay the foundation for a definitive methodology for individual risk-based RPA mapping that considers, first of all, the number of people involved. The test area chosen for this study is the Bolzano province (Italy), owing to the high availability of potential predictor variables and a detailed IRC survey campaign covering the entire provincial territory. Starting from this, we propose the first IRA map (i.e., the first individual risk-based RPA definition) using a set of machine learning techniques that connect and validate the geo-hazard against real IRC measured in the province, with the aim of predicting both the collective risk and the possible individual detriment, as required by European regulation.

How to cite: Benà, E., Ciotoli, G., Bossew, P., Petermann, E., Verdi, L., Mazzoli, C., and Sassi, R.: From the collective to the individual radon risk exposure: an insight in the current European regulation, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-5787, 2024.

Brazil is envisaging a large-scale plan for indoor radon assessment. Radon levels shall be mapped and priority areas identified. Given the size of the country and its diversity in natural and socio-economic respects, this is a challenging project. Pilot studies and local surveys have been performed in the past, but no country-wide assessment exists.

In November 2023, the IAEA organized a workshop on radon survey planning in Poços de Caldas, Minas Gerais, Brazil, to support the project. The objective was to identify the issues that have to be resolved before starting the actual experimental (i.e., field and laboratory) work; so to speak, asking the right questions beforehand to make the work as efficient as possible. Experts from several scientific disciplines related to radon participated (physics, statistics, geology, geography, radiology, national demographic database management, etc.). Among the questions arising from experience with past surveys are:

  • What is the objective of the survey? (Assessment of radon hazard, of collective risk, of detriment attributable to radon, a decision base for mitigation action, etc.)
  • What is the target quantity? (Mean concentration in living rooms over an area, probability of exceeding a reference level within an area, status of an area as a priority area, etc.)
  • What is the mapping support, i.e., the geographical area to which a value of the target quantity shall be assigned? (Municipality, administrative region, geological unit, grid cell, etc.)
  • Which spatial estimation strategy is chosen: design-based (inference only from radon measurements) or model-based (inference from predictor quantities such as geology or ambient dose rate)?
  • How can a representative sampling scheme be generated, and how can it be verified?
  • In the case of a design-based strategy, which sample size is required to achieve a given accuracy of the result? More generally, which information is necessary to establish an uncertainty budget of the target quantity?
  • How should an operational database be structured, and which metadata should be included?
  • What should a "cooking recipe" that the generation of new data follows look like ("bottom-up harmonisation")? How can existing data be integrated into the database ("top-down harmonisation")?
  • How can experience gained during pilot and local projects be transferred and "upscaled" to different environments and larger regions?
  • What should a QA/QC scheme appropriate to the project look like?

These questions, some of which are by no means trivial, should be thoroughly discussed and answered before actually starting a survey. Some of them will be addressed in the presentation.
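The design-based sample-size question above can be sketched with standard survey statistics. Assuming indoor radon concentrations are roughly log-normally distributed, the normal-theory sample-size formula applied on the log scale gives a first estimate; the geometric standard deviation and accuracy target below are illustrative assumptions, not survey data.

```python
import math

def sample_size_for_gm(gsd, rel_error, z=1.96):
    """Detectors needed so the geometric mean of a log-normally
    distributed radon concentration is estimated within +/- rel_error
    (e.g. 0.10 = 10 %) at the confidence level implied by z (1.96 -> 95 %)."""
    sigma_log = math.log(gsd)               # SD on the natural-log scale
    half_width = math.log(1.0 + rel_error)  # target half-width, log scale
    return math.ceil((z * sigma_log / half_width) ** 2)

# Illustrative: geometric standard deviation 2.5 (a fairly typical
# indoor-radon dispersion), 10 % accuracy at 95 % confidence.
n = sample_size_for_gm(gsd=2.5, rel_error=0.10)
```

The formula makes the trade-off explicit: relaxing the accuracy target from 10 % to 20 % cuts the required sample size by roughly a factor of four, which is the kind of budget decision the workshop questions point at.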


How to cite: Bossew, P. and Da Silva, N.: Designing an indoor radon survey - results of a recent IAEA workshop on survey planning in Brazil, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-6182, 2024.

EGU24-7604 | Orals | NH9.11 | Highlight

Long-term atmospheric radon measurements and their connection with environmental conditions 

Sebastian Baumann, Valeria Gruber, Joachim Gräser, and Dietmar Roth

Radon is a radioactive noble gas. Accumulated indoors, it is a major source of radiation exposure. Atmospheric radon can be used as a tracer for greenhouse gases and for atmospheric modelling.

We analyzed long-term (> 10 years) time series of atmospheric radon (Rn-222 and Rn-220) at 15 locations in Austria and neighboring countries. The measured quantities are equilibrium-equivalent concentrations (EEC), in which the decay products of radon collected on air filters are measured with a PIPS detector. Other parameters, such as ambient dose rate and weather data (wind, rainfall and precipitation), are measured at the same locations. Additionally, for one year the atmospheric radon concentration was measured directly with a different measurement system (Alphaguard) at three locations.

The analysis of the EEC showed that the temporal variation of atmospheric radon (Rn-222, Rn-220) depends on meteorological parameters. Seasonal and diurnal variations are linked to the stability of atmospheric layers; under stable weather conditions, higher radon concentrations occur. Correlations of the radon concentrations were found primarily with temperature and wind speed. At temperatures below 0 °C, Rn-220 shows very low concentrations and a different behavior than Rn-222. This reduced Rn-220 availability could be associated with frozen or snow-covered soils.

The additional measurements (Alphaguard) of atmospheric radon concentrations provided plausible long-term averages, although individual measurements can yield implausible values (e.g. negative ones). The temporal patterns of the two measurement systems are very similar, and the atmospheric radon concentrations are predominantly higher than the EEC.

A connection between the long-term averages of atmospheric radon and the radon potential of an area was found by comparing atmospheric radon concentrations with indoor radon measurements and predictions of the radon potential in Austria. This indicates that the radon potential determines the average level of the atmospheric radon concentrations, while weather conditions temporally modulate them around this level.

This work is supported by the Austrian Federal Ministry for Climate Action and by the RadoNORM project, which has received funding from the Euratom research and training programme 2019–2020 under grant agreement No 900009.

How to cite: Baumann, S., Gruber, V., Gräser, J., and Roth, D.: Long-term atmospheric radon measurements and their connection with environmental conditions, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-7604, 2024.

EGU24-8408 | ECS | Posters on site | NH9.11

Indoor 222-Rn Modeling in Data-Scarce Regions: An Interactive Dashboard Approach for Bogotá, Colombia 

Martín Dominguez Duran, María Angélica Sandoval Garzón, and Carme Huguet

Radon (222Rn) is a naturally occurring gas that represents a health threat due to its causal relationship with lung cancer. Despite its potential health impacts, several regions have not conducted studies, mainly due to data scarcity and/or economic constraints. This study aims to bridge the baseline information gap by building an interactive dashboard that uses inferential statistical methods to estimate the spatial distribution of indoor radon concentration (IRC) for a target area. We demonstrate the functionality of the dashboard by modelling IRC in the city of Bogotá, Colombia, using 30 in situ measurements. The IRC measured were the highest reported in the country, with a geometric mean of 91 ± 14 Bq/m3 and a maximum concentration of 407 Bq/m3. In 57% of the residences, IRC exceeded the WHO recommendation of 100 Bq/m3. A prediction map for houses registered in Bogotá's cadastre was built in the dashboard using a log-linear regression model fitted to the in situ measurements together with meteorological, geological, and building-specific variables. The model showed a cross-validation root mean squared error of 56.5 Bq/m3. Furthermore, the model showed that the age of the house had a statistically significant positive association with IRC: according to the model, IRC in houses built before 1980 is 72% higher than in those built after 1980 (p-value = 0.045). The prediction map exhibited higher IRC in older buildings, most likely related to cracks in the structure that could enhance gas migration into older houses. This study highlights the importance of expanding 222Rn studies in countries lacking baseline values and provides a cost-effective alternative that could help address the scarcity of IRC data and improve understanding of the place-specific variables that affect the spatial distribution of IRC.
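The modelling step described here, a log-linear regression whose fit is judged by cross-validated RMSE back on the original Bq/m3 scale, can be sketched as follows. The data and predictors are synthetic placeholders, not the Bogotá measurements, and the variable names are illustrative.

```python
# Sketch of a log-linear indoor-radon model with cross-validated RMSE,
# in the spirit of the abstract. Data and predictors are synthetic
# placeholders, not the Bogotá measurements.
import numpy as np

rng = np.random.default_rng(0)
n = 30
age = rng.integers(0, 2, n).astype(float)  # built before 1980? (0/1)
met = rng.normal(0.0, 1.0, n)              # standardised meteorological proxy
# Log-linear data-generating process: older houses ~70 % higher IRC.
irc = np.exp(4.3 + 0.55 * age + 0.2 * met + rng.normal(0.0, 0.4, n))

X = np.column_stack([np.ones(n), age, met])

def cv_rmse(X, y, k=5):
    """k-fold cross-validated RMSE of a log-linear model, evaluated
    back on the original Bq/m3 scale."""
    idx = np.random.default_rng(1).permutation(len(y))
    sq = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        beta, *_ = np.linalg.lstsq(X[train], np.log(y[train]), rcond=None)
        sq.extend((np.exp(X[fold] @ beta) - y[fold]) ** 2)
    return float(np.sqrt(np.mean(sq)))

rmse = cv_rmse(X, irc)

# On the log scale, a coefficient b for house age implies a
# (exp(b) - 1) * 100 % multiplicative change in IRC, which is how a
# "72 % higher in pre-1980 houses" statement is read off such a model.
beta, *_ = np.linalg.lstsq(X, np.log(irc), rcond=None)
age_effect_pct = float((np.exp(beta[1]) - 1.0) * 100.0)
```

The key design point is that the regression is fitted on log(IRC), which handles the right-skewed concentration distribution, while the error metric is reported in Bq/m3 so it is directly interpretable against the reference levels.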

How to cite: Dominguez Duran, M., Sandoval Garzón, M. A., and Huguet, C.: Indoor 222-Rn Modeling in Data-Scarce Regions: An Interactive Dashboard Approach for Bogotá, Colombia, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-8408, 2024.

EGU24-9434 | ECS | Posters on site | NH9.11

A combined approach for the correlation between indoor radon and geological background: application in the western Ligurian Alps (Italy) 

Linda Bonorino, Gianluca Beccaris, Paola Bisi, Paolo Chiozzi, Andrea Cogorno, Elga Filippi, Riccardo Narizzano, Sonja Prandi, and Massimo Verdoya

Radon (222Rn) is one of the most common naturally occurring radioactive elements and is particularly relevant to environmental issues, as it is considered a carcinogenic gas. It is a decay product of 238U, contained in most rocks and soils, and can easily escape from the ground to accumulate in closed spaces, where it may become dangerous. Knowledge of its potential is vital for urban development plans and for protecting people from potential hazards. We recently conducted monitoring campaigns in Liguria (NW Italy) to investigate the relations between the observed indoor radon concentrations and the geo-lithological background. We focused on the geological units of the Western Alps, characterized by various lithotypes ranging from sedimentary to metasedimentary and metavolcanic rocks. The natural gamma radiation was measured on outcrops. Spectrometric measurements indicated that metamorphic acid rocks have the highest specific activity values of 238U (75-85 Bq/kg). In metasedimentary rocks, quartz and mica schists show the highest concentration of 238U, with an average specific activity of 56 Bq/kg. Sedimentary rock types are characterized by average specific activities < 40 Bq/kg. The dosimetric indoor surveys highlighted that about 40% of the investigated public and private buildings show indoor radon values above 200 Bq/m3. These preliminary campaigns revealed a relationship between the uranium content of the bedrock and the indoor radon. The correlation can be used to predict the geogenic radon potential from the geological background when dosimetric data are few or scattered. In this paper, we refined our early analysis by integrating the dataset with further spectrometric and indoor dosimetric records, which were also coupled with soil radon measurements. The radon concentration in soil was investigated focusing on the sites where the previous monitoring campaigns showed high indoor radon concentrations.
Soil radon was recorded at depths between 50 and 80 cm, where radon diffusion from the ground into buildings very likely occurs. Soil radon concentrations substantially agree with the spectrometric measurements. The largest concentrations of 222Rn, about 100 kBq/m3, were found in soils over the more acid metamorphic rocks (porphyroids and porphyritic schists). The lowest values (about 20 kBq/m3) were recorded in soils over sedimentary rocks. Despite the limitations and uncertainties, mainly related to uneven data coverage and the complex interaction between buildings and bedrock, the combined techniques can identify areas of potentially high indoor radon concentration.
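The link between bedrock radionuclide content and soil-gas radon can be roughed out with the standard steady-state expression for the infinite-depth pore-air radon concentration, C = A_Ra · E · ρ_b / φ. Assuming secular equilibrium of 226Ra with the measured 238U and typical soil parameters — all illustrative assumptions, not values from this study — the formula reproduces the order of magnitude reported above:

```python
def soil_gas_radon(a_ra_bq_per_kg, emanation, bulk_density_kg_m3, porosity):
    """Infinite-depth pore-air 222Rn concentration (Bq/m3) from the
    standard steady-state expression C = A_Ra * E * rho_b / porosity."""
    return a_ra_bq_per_kg * emanation * bulk_density_kg_m3 / porosity

# Illustrative soil over acid metamorphic bedrock: 226Ra assumed in
# secular equilibrium with 238U at ~80 Bq/kg, emanation coefficient 0.2,
# bulk density 1600 kg/m3, porosity 0.25 -- all assumed values.
c = soil_gas_radon(80.0, 0.20, 1600.0, 0.25)  # ~1e5 Bq/m3, i.e. ~100 kBq/m3
```

With these assumed parameters the estimate lands near 100 kBq/m3, consistent with the concentrations measured over the acid metamorphic units, while halving the radium activity (as over sedimentary rocks) halves the estimate.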

How to cite: Bonorino, L., Beccaris, G., Bisi, P., Chiozzi, P., Cogorno, A., Filippi, E., Narizzano, R., Prandi, S., and Verdoya, M.: A combined approach for the correlation between indoor radon and geological background: application in the western Ligurian Alps (Italy), EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-9434, 2024.

EGU24-10152 | ECS | Posters virtual | NH9.11

Exploring the hydrothermal vent field of Milos Island in the Aegean Sea using novel radiation instrumentation

Georgios Siltzovalis, Ioannis Madesis, Varvara Lagaki, Theodoros J. Mertzimekis, Pavlos Krassakis, Stavroula Kazana, and Konstantinos Nikolopoulos

Radioactivity monitoring in the marine environment presents various challenges. First and foremost, water-induced attenuation substantially limits the detection capability and range of the sensors. Additionally, the harshness and remoteness of underwater locations pose significant obstacles to existing technological solutions for dense and extended radioactivity mapping of the oceans. The highly ambitious EU FET Proactive research programme RAMONES (Radioactivity Monitoring in Ocean Ecosystems) aims to overcome these limitations by developing and deploying novel underwater radiation-sensing instruments, enabling direct correlation of marine radioactivity with underwater geological and geochemical processes.

The present study focuses on the analysis of experimental data collected during field experiments in the extended hydrothermal vent field of Milos, an island in the southern Aegean Sea that is part of the Hellenic Volcanic Arc. The shallow active hydrothermal system of Milos is associated with calc-alkaline volcanic rocks, from basaltic andesites to dacites and rhyolites, deposited over several cycles of volcanic activity. Novel portable γ-detectors based on lightweight CdZnTe crystals were deployed to acquire in situ measurements at coastal locations in the eastern part of the island. Complementary sediment samples were collected to provide baseline NORM (Naturally Occurring Radioactive Material) levels for Milos Island, which has attracted much attention recently due to its role as a potential geohazard source. These measurements are used to benchmark the γ spectrometers and prepare them for underwater operation aboard autonomous underwater gliders. The collected data will feed a prototype Risk Information System (RIS) titled POIS2ON (PrOtotype Information System for SOcioecoNomic stakeholders). The POIS2ON database will include datasets accompanied by geoinformation, visualized through NORM-level heat maps, and will support detailed Monte Carlo simulations to evaluate radiation doses to local marine ecosystems.

How to cite: Siltzovalis, G., Madesis, I., Lagaki, V., Mertzimekis, T. J., Krassakis, P., Kazana, S., and Nikolopoulos, K.: Exploring the hydrothermal vent field of Milos Island in the Aegean Sea using novel radiation instrumentation, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-10152, 2024.

EGU24-12397 | ECS | Posters on site | NH9.11

Cross-Ventilation Strategies for Efficient Indoor Radon Reduction: Experimental Data and CFD Simulations 

Diana Altendorf, Henning Wienkenjohann, Florian Berger, Jörg Dehnert, Michal Duzynski, Hannes Grünewald, Dmitri Naumov, Ralf Trabitzsch, and Holger Weiß

Naturally occurring radon-222 (Rn) is a widespread indoor air pollutant, posing a potential health risk to humans, particularly by elevating the risk of lung cancer in indoor living and working spaces. One highly promising solution for reducing indoor radon in existing buildings, requiring relatively little technical effort, is the installation of a ventilation system.

As a proof of concept, a series of ventilation experiments utilising a decentralised ventilation system with heat recovery (inVENTer GmbH, Germany) was performed in an unoccupied ground-floor flat in Bad Schlema (Germany).

The flat was divided into three individually controllable ventilation zones using strategically positioned ventilation devices, controlled by a novel real-time measurement system for the indoor radon activity concentration [Rn] (Smart Radon Sensors by SARAD GmbH, Germany) in each room. This innovative approach of employing [Rn] as a control parameter to reduce indoor radon enabled automated switching between different ventilation modes or the option to deactivate the system entirely.

Over three years, the different ventilation experiments successfully reduced elevated indoor radon levels from up to 7000 Bq/m³ to 300 Bq/m³ and below. The effectiveness varied based on factors such as the initial room-specific radon levels before each experiment, the performance level of the fans and meteorological parameters.

Furthermore, we developed a true-to-scale three-dimensional Computational Fluid Dynamics (CFD) model based on the actual flat, enabling the quantitative interpretation of various ventilation experiments within a CFD environment. The CFD model utilised a stationary k-ε turbulent flow model to simulate ventilation-induced airflow inside the flat and was coupled with a transient transport model for radon simulation.

For the development of the CFD model, the "Cross-Ventilation" experiment was chosen. This experiment successfully achieved a room-specific reduction of indoor radon levels from approximately 3,000 Bq/m³ to about 300 Bq/m³. To precisely capture the impact of ventilation on indoor radon, the initial radon values for each room were utilised as initial conditions for the transient radon transport model.

Base-case results showed that the model overestimated the radon reduction achieved by ventilation. Adjusting the parameters for the inflowing radon and the airflow velocity at the inlet resulted in good agreement between the experimental values and the CFD model's outcome.

In summary, this study highlights CFD modeling as a versatile tool for evaluating and optimising ventilation systems, offering valuable insights into the mechanism of managing the air quality in complex real-world indoor environments with elevated radon levels.

How to cite: Altendorf, D., Wienkenjohann, H., Berger, F., Dehnert, J., Duzynski, M., Grünewald, H., Naumov, D., Trabitzsch, R., and Weiß, H.: Cross-Ventilation Strategies for Efficient Indoor Radon Reduction: Experimental Data and CFD Simulations, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12397, 2024.

EGU24-12506 | ECS | Orals | NH9.11

Rapid field measurement of uranium in water samples  

Katalin Hegedűs-Csondor, Heinz Surbeck, Petra Baják, and Judit Mádl-Szőnyi

We present an analytical method that allows the rapid measurement of uranium in water samples. For a 50 ml sample, concentrations down to about 2 µg/l can be measured within an hour. No toxic chemicals are used, and the whole equipment is portable and can be powered by a 12 V battery. The preparation consists of adding 200 mg of silica gel to the 50 ml sample, stirring for 1 hour, filtering out the silica gel, and transferring it to a semi-micro cuvette for the measurement. Several samples can be prepared in parallel, depending on the number of magnetic stirrers available. The measurement takes only 1 minute and uses the uranyl fluorescence, enhanced by adsorption on silica gel. Excitation is done by a pulsed UV-LED at 285 nm. The delayed fluorescence signal around 520 nm is detected by a 6 mm x 6 mm Silicon Photomultiplier (SiPM) behind a 520 nm bandpass filter. Pulsing the LED, converting the SiPM output, and displaying the result are controlled by an Arduino microprocessor. All details of the experimental setup as well as the software code are presented. The design is open source, free to copy, and the total material cost is only around 500 euros.
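The delayed-detection principle described above can be sketched as gated photon counting: because the uranyl fluorescence on silica gel is long-lived, opening the counting gate only after the LED pulse rejects prompt scatter and short-lived background fluorescence. The lifetime, gate timing, and count rates below are illustrative assumptions, not the instrument's actual firmware parameters.

```python
import math

def expected_gate_counts(n_pulses, photons_per_pulse, tau_us,
                         gate_start_us, gate_len_us, bg_per_us):
    """Expected counts in a delayed gate after pulsed UV excitation.

    The uranyl fluorescence decays as exp(-t / tau); opening the gate
    only after the LED pulse suppresses prompt scatter and short-lived
    background while keeping most of the slow uranyl signal.
    """
    # Fraction of the delayed-fluorescence photons falling in the gate.
    frac = math.exp(-gate_start_us / tau_us) - math.exp(
        -(gate_start_us + gate_len_us) / tau_us)
    # Signal plus a flat background rate integrated over the gate.
    return n_pulses * (photons_per_pulse * frac + bg_per_us * gate_len_us)

# Illustrative numbers: lifetime tau ~ 250 us assumed for uranyl on
# silica gel, 1 kHz pulsing over a 1-minute measurement, gate opened
# 50 us after the pulse. A blank run (no uranium) isolates background.
signal = expected_gate_counts(60_000, 5.0, 250.0, 50.0, 500.0, 1e-4)
blank = expected_gate_counts(60_000, 0.0, 250.0, 50.0, 500.0, 1e-4)
net = signal - blank  # counts attributable to uranium
```

Subtracting a blank measurement, as sketched with `net`, is the usual way such a counting instrument is calibrated against standard solutions of known uranium concentration.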

How to cite: Hegedűs-Csondor, K., Surbeck, H., Baják, P., and Mádl-Szőnyi, J.: Rapid field measurement of uranium in water samples, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12506, 2024.

EGU24-12663 | ECS | Posters on site | NH9.11

Preliminary results of two-dimensional multicomponent reactive transport modelling to understand the controlling factors on uranium mobility in a siliciclastic aquifer in Hungary 

Petra Baják, Daniele Pedretti, András Csepregi, Muhammad Muniruzzaman, Katalin Hegedűs-Csondor, and Anita Erőss

In Hungary, the drinking water supply relies on groundwater resources for up to 98% of demand. As a drinking water resource, groundwater must meet strict quality requirements in order to minimise any health effects arising from daily water consumption. Water-rock interactions enrich groundwater not only with essential elements (e.g. Ca, Mg) but also with undesired substances such as heavy metals or radioactive elements. In the last few years, a thorough drinking water quality monitoring campaign carried out in Hungary revealed that some parts of the country are characterised by relatively high uranium concentrations. The causes of these elevated activities have not yet been properly investigated. However, understanding the controls on the release and mobility of uranium is critical for proper groundwater management.

Baják et al. (2022) developed a one-dimensional (1-D) geochemical model using the code PHREEQC (Parkhurst and Appelo, 2013) to examine the processes that determine the fate of uranium in the siliciclastic Miocene-Quaternary aquifer system near the Velence Hills, some 50 km from Budapest. Here, the geological setting (granitic rocks at the surface) favours a high uranium content in groundwater, which is characterised by oxidising conditions. The 1-D model included redox-controlled kinetic reactions as well as other potential uranium-controlling processes (e.g., surface complexation). The results suggested that the uranium distribution is sensitive to redox changes in the aquifer and that its mobility in groundwater depends especially on the residence time of the water relative to the reaction times controlling the consumption of oxidising species.

This study introduces a two-dimensional multicomponent reactive transport model developed using the PHT3D code (Prommer et al., 2003), a coupling of MODFLOW and PHREEQC. The model builds on and extends the capability of the 1-D model to simulate uranium mobility across the multiple flow paths of the aquifer system. The model calibration accounts for 30 groundwater samples collected from drinking water wells in the study area. Physico-chemical parameters (temperature, pH, specific electrical conductivity, redox potential) were measured on site, and the samples were analysed for natural tracers (δ18O, δ2H, 234U, 238U, 226Ra) to gain further insight into the geochemical processes of the aquifer system.

This research was supported by the ÚNKP-23-4 New National Excellence Program of the Ministry for Culture and Innovation from the source of the National Research, Development and Innovation Fund and was supported by the János Bolyai Research Scholarship of the Hungarian Academy of Sciences. The research is part of a project which was funded by the National Multidisciplinary Laboratory for Climate Change, RRF-2.3.1-21-2022-00014.


Baják, P., Csondor, K., Pedretti, D., Muniruzzaman, M., Surbeck, H., Izsák, B., Vargha, M., Horváth, Á., Pándics, T., Erőss, A., 2022. Refining the conceptual model for radionuclide mobility in groundwater in the vicinity of a Hungarian granitic complex using geochemical modeling. Applied Geochemistry 137, 105201.

Parkhurst, D.L., Appelo, C.A.J., 2013. Description of Input and Examples for PHREEQC Version 3—A Computer Program for Speciation, Batch-Reaction, One-Dimensional Transport, and Inverse Geochemical Calculations. (USGS Technical No. 6(A)43). U.S. Geological Survey, Denver, CO, USA.

Prommer, H., Barry, D.A., Zheng, C., 2003. MODFLOW/MT3DMS based reactive multi-component transport modeling. Ground Water, 41(2).

How to cite: Baják, P., Pedretti, D., Csepregi, A., Muniruzzaman, M., Hegedűs-Csondor, K., and Erőss, A.: Preliminary results of two-dimensional multicomponent reactive transport modelling to understand the controlling factors on uranium mobility in a siliciclastic aquifer in Hungary, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12663, 2024.

Understanding the temporal and spatial distribution of soil water content (SWC) is critical for efficient water resource management in agriculture. However, the variability of SWC over time and space presents challenges in obtaining accurate values at field scale using conventional methods. Proximal gamma-ray spectroscopy (PGRS), supported by adequate calibration and biomass corrections, emerges as a promising method for monitoring SWC. The inverse correlation between the gamma counts of the radioisotope 40K (1461 keV) and volumetric SWC (m3/m3) demonstrates potential for reliable soil moisture estimation in agricultural and hydrological applications. This contribution examines the potential application of a portable sodium iodide (NaI) scintillation detector (PGRS) for estimating SWC in an irrigated wheat field. We explore the sensitivity of the 40K variations to changes in soil moisture and detector height. Over the last two months of the growing season, several one-hour manual monitoring surveys were conducted to capture the effect of irrigation and soil moisture status on the 40K signal before and after harvesting. In each survey, total counts of 40K were recorded using a NaI detector positioned at different elevations above the ground in the middle of the wheat field. Preliminary results indicate a general correlation between 40K (cps) and SWC throughout the study period, suggesting the sensitivity of the PGRS detector to SWC variations. Our findings show a slight increase in 40K counts with decreasing detector height for all the field surveys conducted. In addition, we observed that the lowest counts of 40K were recorded during the survey with the highest soil water content, after irrigation. We can conclude that the 40K signal is sensitive to both changes in SWC and the height position of the detector. Furthermore, this detector offers a significant advantage, as it not only captures data on the 40K peak but also analyses the full gamma spectrum.
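
The inverse-correlation idea can be sketched as a simple linear calibration; the count rates and SWC values below are invented for illustration, not the authors' field measurements.

```python
import numpy as np

# Hypothetical calibration data (illustrative values, not the authors'
# field measurements): 40K net count rates and co-located volumetric SWC.
k40_cps = np.array([11.8, 11.2, 10.6, 10.1, 9.5, 9.0])   # 40K counts (cps)
swc = np.array([0.10, 0.14, 0.18, 0.22, 0.27, 0.31])     # SWC (m3/m3)

# Fit the inverse linear relation SWC = a*cps + b; the anticorrelation
# described in the abstract implies a negative slope a.
a, b = np.polyfit(k40_cps, swc, 1)

def swc_from_k40(cps):
    """Estimate volumetric SWC from a 40K count rate via the fitted line."""
    return a * cps + b

print(f"slope = {a:.4f} (m3/m3)/cps, intercept = {b:.3f} m3/m3")
print(f"estimated SWC at 10.0 cps: {swc_from_k40(10.0):.3f} m3/m3")
```

In practice the relation also needs corrections for biomass shielding and detector geometry, as the abstract notes.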

How to cite: Catalá, A., Navas, A., and Gaspar, L.: Assessing the variability of 40K measurements using a portable gamma-ray spectroscopy in an irrigated agricultural field (Spain), EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12700, 2024.

EGU24-15380 | Posters virtual | NH9.11

Application of machine learning methods to improve the radon deficit technique 

David Lorenzo, Fernando Barrio-Parra, Humberto Serrano-García, Miguel Izquierdo-Díaz, and Eduardo De Miguel

The radon deficit technique is a promising screening method for identifying and mapping potential subsurface organic pollution hotspots and thus for optimizing intrusive characterization campaigns. Radon (222Rn) is a naturally produced radionuclide that is particularly suitable for use as a natural tracer due to its preferential partitioning into non-aqueous phase liquids (NAPLs) and its ease of in situ analytical detection (Kram et al., 2001). The ability of the 222Rn technique to locate organic pollution hotspots and provide a semiquantitative analysis has been widely assessed at sites affected by NAPLs (De Miguel et al., 2018; De Miguel et al., 2020). However, radon measurements are affected by several confounding factors, such as variations in soil water saturation and ground-level temperature. Machine learning can be used to study and model these confounding factors and improve the interpretation of in situ radon analytical information.

Machine learning is a class of statistical techniques that has proven to be a powerful tool for modelling the behaviour of complex systems in which response quantities depend on assumed controls or predictors in a complicated way (Janik et al., 2018). The first purpose of this work is the application of machine learning to analyse sampled time series of outdoor 222Rn. The algorithms "learn" from complete sections of multivariate series (containing measurements of soil water content, soil temperature and meteorological information) and derive a dependence model. The model trained in this work can be used to improve the accuracy and reliability of the radon deficit technique, making it a more valuable tool for identifying and mapping subsurface contamination.
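
A minimal sketch of such a dependence model, using entirely synthetic stand-ins for the multivariate series named above (the functional form of the radon response is invented for illustration and is not the authors' model):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins for the multivariate series named in the abstract
# (soil water content, soil temperature, pressure); the functional form
# of the radon response below is invented for illustration.
swc = rng.uniform(0.05, 0.35, n)            # soil water content (m3/m3)
soil_t = rng.uniform(2.0, 25.0, n)          # soil temperature (degC)
pressure = rng.uniform(980.0, 1030.0, n)    # air pressure (hPa)

radon = (40.0 - 60.0 * swc + 0.4 * soil_t
         - 0.05 * (pressure - 1000.0)
         + rng.normal(0.0, 1.0, n))         # outdoor 222Rn proxy (Bq/m3)

# Train a dependence model: predict radon from its confounding factors.
X = np.column_stack([swc, soil_t, pressure])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, radon)

# Residuals of such a model flag radon anomalies not explained by the
# environmental confounders, which is the deficit-technique use case.
print("training R^2:", round(model.score(X, radon), 3))
```

Once the confounder-driven variability is modelled, deviations from the prediction can be interpreted as candidate radon deficits rather than weather artefacts.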


De Miguel, E., Barrio-Parra, F., Elío, J., Izquierdo-Díaz, M., Jerónimo, García-González, E., Mazadiego, L.F., Medina, R., 2018. Applicability of radon emanometry in lithologically discontinuous sites contaminated by organic chemicals. Environ. Sci. Pollut. Res. 25, 20255–20263.

De Miguel, E., Barrio-Parra, F., Izquierdo-Díaz, M., Fernández, J., García-González, J.E., 2020. Applicability and limitations of the radon-deficit technique for the preliminary assessment of sites contaminated with complex mixtures of organic chemicals: a blind field-test. Environ. Int. 138, 105591.

Janik, M., Bossew, P., Kurihara, O., 2018. Machine learning methods as a tool to analyse incomplete or irregularly sampled radon time series data. Sci. Total Environ. 630, 1155–1167.

Schubert, M., 2015. Using radon as environmental tracer for the assessment of subsurface non-aqueous phase liquid (NAPL) contamination – a review. Eur. Phys. J. Spec. Top. 224, 717–730.

How to cite: Lorenzo, D., Barrio-Parra, F., Serrano-García, H., Izquierdo-Díaz, M., and De Miguel, E.: Application of machine learning methods to improve the radon deficit technique, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-15380, 2024.

EGU24-16925 | ECS | Posters on site | NH9.11

Deciphering Radon Variability in the Northern Upper Rhine Graben: An Analysis Using Passive and Active Detection with Random Forest Modelling 

Johannes Mair, Eric Petermann, Rouwen Lehné, and Andreas Henk

This study, conducted about 30 km south of Frankfurt in the Northern Upper Rhine Graben, focuses on deepening the understanding of Radon concentrations in soil air. The selected area, where neotectonic activity was proven in an accompanying project, provides an ideal setting for investigating Radon variability, particularly its potential correlation with fault zones in unconsolidated rocks or sedimentary basins. Understanding the factors influencing Radon levels in the environment is a complex task, as they are affected by a multitude of variables. Our work aims to decipher these influences and, if possible, quantitatively analyse the contributions of each variable. By doing so, we hope to gain a clearer understanding of how different environmental factors interact to determine Radon levels.

A central element of our research is the use of Random Forest models, chosen to handle our multidimensional dataset. This dataset includes a variety of parameters such as Radon measurements, nuclide content, soil grain sizes, weather data, and the distance to fault zones. Random Forest models are particularly effective for this type of complex data because they can analyse many different factors at once and uncover hidden patterns.

Contrary to initial hypotheses, our findings indicate that in unconsolidated rocks and sedimentary basins, the grain size of soil is the most influential factor in determining soil air Radon levels, closely followed by soil moisture. These results challenge the previously held belief that fault zones are the primary influencing factors on Radon concentrations in these geological settings.
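
A Random Forest importance ranking of the kind described above can be sketched as follows; all values and the response function are synthetic, deliberately constructed so that grain size dominates, echoing the reported findings (this is not the authors' dataset or model).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 400

# Hypothetical predictors echoing the abstract's dataset (all values and the
# response function below are synthetic, chosen so grain size dominates).
grain_size = rng.uniform(0.01, 2.0, n)       # soil grain size (mm)
soil_moisture = rng.uniform(0.05, 0.40, n)   # soil moisture (m3/m3)
fault_distance = rng.uniform(0.0, 5.0, n)    # distance to fault zone (km)

radon = (120.0 / (0.1 + grain_size)          # fine soils -> higher radon
         + 80.0 * soil_moisture
         - 2.0 * fault_distance
         + rng.normal(0.0, 5.0, n))

X = np.column_stack([grain_size, soil_moisture, fault_distance])
names = ["grain_size", "soil_moisture", "fault_distance"]

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, radon)
for name, imp in sorted(zip(names, rf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```

Impurity-based feature importances of this kind are one way such models attribute soil-air radon variability to individual predictors.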

How to cite: Mair, J., Petermann, E., Lehné, R., and Henk, A.: Deciphering Radon Variability in the Northern Upper Rhine Graben: An Analysis Using Passive and Active Detection with Random Forest Modelling, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-16925, 2024.

EGU24-17300 | ECS | Posters on site | NH9.11 | Highlight

Investigating the sensitivity of flux maps in simulating radon concentrations at greenhouse gas monitoring sites 

Adam Howes, Dafina Kikaj, Edward Chung, Ute Karstens, Alistair Manning, Stephan Henne, Angelina Wenger, Grant Foster, Simon O'Doherty, Chris Rennick, and Tim Arnold

Given its unique properties as a radioactive, chemically inert gas, radon can act as a valuable atmospheric tracer for evaluating the performance of atmospheric transport models used to calculate the sources of trace gases to the atmosphere. A radon flux map is the scientific starting point for simulating atmospheric radon concentrations using atmospheric transport models. As such, it is important to assess the available high-resolution radon flux maps to ensure that simulated concentrations can be accurately interpreted. The spatial fluxes of radon primarily depend on soil and rock types, while temporal variations are influenced by soil moisture content.

The recent advancements in generating two high-resolution radon flux maps for Europe using two different soil moisture reanalyses, GLDAS Noah and ERA5, have significantly enhanced our understanding of radon flux dynamics. Yet the radon flux values diverge notably between these two maps, and the variations can be substantial, with differences as large as the absolute radon flux itself.

In our work, two available versions of European radon flux maps are coupled with two Lagrangian particle dispersion models – the Met Office’s Numerical Atmospheric Modelling Environment (NAME) and the FLEXPART model – to simulate radon concentrations measured at four tall tower sites in the United Kingdom: Heathfield, Ridge Hill, Tacolneston and Weybourne. We calculate the differences between the modelled and observed radon concentrations at these sites and use these to investigate the sensitivity of the two radon flux maps: GLDAS Noah and ERA5.



How to cite: Howes, A., Kikaj, D., Chung, E., Karstens, U., Manning, A., Henne, S., Wenger, A., Foster, G., O'Doherty, S., Rennick, C., and Arnold, T.: Investigating the sensitivity of flux maps in simulating radon concentrations at greenhouse gas monitoring sites, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-17300, 2024.

EGU24-17369 | Posters on site | NH9.11 | Highlight

INGV experience on radon monitoring in the Ciampino Municipality (Rome, Italy): a link between research and territory 

Alessandra Sciarra, Luca Pizzino, Gianfranco Galli, Daniele Cinti, Giancarlo Ciotoli, and Sabina Bigi

The Ciampino area has been subject, from 1999 onwards, to repeated geochemical surveys of soil gas, spring waters and groundwater, commissioned by the municipality to INGV (National Institute of Geophysics and Volcanology). Indeed, this area is affected by huge CO2 emissions of volcanic origin and high levels of indoor radon. Both gases can pose a serious concern for the local population, a situation known as the Natural Gas Hazard (NGH). Accordingly, the distribution of the two gases in groundwater, soils and indoor environments must be assessed in order to define the sectors of the territory most exposed to the NGH.
Interest in the Natural Gas Hazard arose mainly from November 1995, when several homes, basements and wells were affected by widespread exhalations, to the point of endangering human health.
The most affected area is characterized by abundant and concentrated gas leaks, which caused the death of 29 cattle and some sheep in September 1999 and March 2000, and in December 2000 a paroxysmal episode caused the death of a man.
The main activities carried out in the last 25 years have concerned:
-    sampling of water sites (about 100 natural springs, public and private wells), measuring chemical-physical parameters, CO2 and 222Rn contents;
-    monthly indoor radon measurements (around 500/year) in 14 selected sites (both private homes and workplaces, including schools);
-    measurements of radon in soils (about 300) to identify the areas with the greatest degassing and the possible relationship with existing tectonic structures;
-    continuous indoor radon measurements in a selected home;
-    spot measurements in groundwater and intervention in the event of reports from the municipality and/or private citizens of emergency situations resulting from gaseous emanations falling in areas of the municipal territory of Ciampino.
The data obtained include measurements of flux and concentration of soil gases, distribution of pCO2 and radon in groundwater, radionuclide content in soils from different geological units, indoor radon measurements.
All these data have allowed us to define the sectors at greatest risk through the identification and delimitation of NGH risk areas. Dissemination and information activities on the NGH were carried out through public meetings, seminars and the drafting of brochures. Training activities for the staff of the Civil Protection and Environment Offices of the Municipality were also performed.
The experience gained has allowed the participation of INGV in a European project Life Respire for the monitoring and remediation of the radon problem.
Based on the distribution of the different samples collected (soil gas, terrestrial gamma dose rate, and rock/soil samples characterized by radionuclide content), we were able to provide the local authorities with a map of the geogenic radon potential for the whole municipal territory.

How to cite: Sciarra, A., Pizzino, L., Galli, G., Cinti, D., Ciotoli, G., and Bigi, S.: INGV experience on radon monitoring in the Ciampino Municipality (Rome, Italy): a link between research and territory, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-17369, 2024.

Fumaroles release several elements to the atmosphere, which may include radon contributing to environmental radioactivity. The long-lasting vigorous gaseous emissions of the Campi Flegrei volcanic caldera, i.e., Solfatara and Pisciarelli, occur in densely inhabited areas of Naples where the population may be exposed to ionizing radiation from 222Rn. In 2021, we started a study on radon levels from the Solfatara and Pisciarelli fumaroles using the RAD7 commercial detector, one of the most widely used instruments for measuring 222Rn, either dissolved in water or in soil gas. However, the local high H2S levels and hot temperatures did not allow direct measurements of Rn, resulting in damage to the RAD7 instrumentation. We therefore developed a suitable technique for sampling and measuring radon gas from fumarolic gases in such “critical” areas to overcome this instrumental issue.

At the fumarole sites, i.e., Bocca Nuova and Bocca Grande within the Solfatara crater, and Pisciarelli, the gas was periodically sampled in Tedlar® bags of 1 or 3 litres so that the measurements could be repeated two or three times to verify the accuracy of the data.

In the laboratory, H2S traps were first prepared by filling silicone tubes with lead acetate powder, bordered at both ends by hydrophilic cotton, and closed. The fumarole gas was then transferred from the Tedlar® bag into a glass tube. Finally, radon gas was measured in a closed loop using the RAD7. The RAD7 readings were corrected for the time lag between sampling and measurement. RAD7 and charcoal canister measurements were compared to check the obtained results.
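
The time-lag correction mentioned above is a standard radioactive-decay back-calculation; the sketch below shows only the generic decay law with an invented reading, not the authors' exact correction procedure.

```python
import math

RN222_HALF_LIFE_H = 91.76   # 222Rn half-life (3.8235 days) in hours

def decay_corrected(measured_bq_m3, delay_hours):
    """Back-correct a reading for 222Rn decay between sampling and
    measurement, assuming only radioactive decay acts on the bagged gas."""
    lam = math.log(2.0) / RN222_HALF_LIFE_H
    return measured_bq_m3 * math.exp(lam * delay_hours)

# e.g. a hypothetical reading of 800 Bq/m3 measured 24 h after sampling
print(round(decay_corrected(800.0, 24.0), 1))
```

The correction grows quickly with delay, which is why the lag between bagging the gas and counting it must be recorded.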

Preliminary results, published in Iovine et al. (2023), demonstrate that the methodology enables the analysis of Rn concentrations even in the H2S-bearing gases discharged from the fumaroles of the Campi Flegrei volcano and, most importantly, without instrumental issues. Fumaroles sampled and analyzed over time according to the adopted methodology may be suitable for environmental radioactivity assessment and volcanic monitoring purposes as well.


Iovine, R.S., Avino, R., Minopoli, C., Cuoco, E., Caliro, S., Galli, G., Piochi, M., 2023. A procedure to use the RAD7 detector for measuring 222Rn in soil gases exceeding instrumental limits: an application to chemically aggressive fumaroles of the Campi Flegrei area. Rapp. Tec. INGV, 473, 1–18.

How to cite: Iovine, R. S., Minopoli, C., Avino, R., Caliro, S., Galli, G., and Piochi, M.: Determination of 222Radon (222Rn) from the hot and acidic fumaroles gases to the atmosphere of the highly populated Campi Flegrei caldera (Naples, Southern Italy) by using a RAD7 detector: a procedure overcoming instrumental limits, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-17908, 2024.

EGU24-18058 | ECS | Orals | NH9.11

Studies on radon time series in various underground environments: Case of abandoned Kővágószőlős uranium mine 

Tóth Szabolcs, Horváth Ákos, and Sajó-Bohus László

Field uranium research began in Hungary in 1947 under the guidance of Hungarian specialists. After the research period, mining plants were opened one after the other, and an ore processing plant was also established. The ore grade found in the Mecsek Mountains was less favourable than average: 1 ton of ore contained 1.2 kg of uranium metal. A characteristic of the uranium ore found in the Permian sandstones is that it occurs in several layers and levels, not continuously but in lenticular spots with varied development. This geological occurrence significantly increased the costs. By 1989, Hungarian uranium ore mining had become uneconomical, and a government decision, dating back to 1997, was made to close it down. The recultivation process began in 1998. Currently, environmental damage is being eliminated under the title of long-term monitoring. Due to the proximity of inhabited areas, NORM anomalies, and the presence of radon gas, radiation protection played a particularly important role during and after remediation.

The radon monitoring of the abandoned mine cavity system was carried out with active radon monitors placed in different boreholes, closed shafts and adits. In the last two years, a radon soil gas monitoring station has also been operated on a waste rock pile site covered with 1 m of loess cover to check the radon retention capacity of the soil.

For radon detection, alpha-sensitive photodiodes (sensitive area: 1 cm2) or PIPS detectors (sensitive area: 3 cm2) are used. The Dataqua monitoring system gives one impulse per hour for 140 and 56 Bq/m3 222Rn concentration for the photodiode and PIPS detector, respectively. Besides the radon detector, the multi-channel devices can include additional sensors for temperature, pressure, humidity, water level, salinity, etc., to study the relation between the variation of radon concentration and other environmental parameters. The radon concentration, together with other environmental parameters, has been continuously recorded at a sampling frequency of one measurement per hour for several years.
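
As a minimal illustration of the stated detector sensitivities, hourly impulse counts can be folded into concentrations as follows (the helper name and example counts are ours, not part of the Dataqua system):

```python
# Conversion implied by the stated sensitivities: one impulse per hour
# corresponds to 140 Bq/m3 (photodiode, 1 cm2) or 56 Bq/m3 (PIPS, 3 cm2).
SENSITIVITY_BQ_M3_PER_CPH = {"photodiode": 140.0, "pips": 56.0}

def radon_concentration(impulses_per_hour, detector="pips"):
    """Convert an hourly impulse count into a 222Rn concentration (Bq/m3)."""
    return impulses_per_hour * SENSITIVITY_BQ_M3_PER_CPH[detector]

# e.g. 25 impulses in one hour:
print(radon_concentration(25, "pips"))        # 1400.0 Bq/m3
print(radon_concentration(25, "photodiode"))  # 3500.0 Bq/m3
```

The larger sensitive area of the PIPS detector yields more impulses per Bq/m3, hence the smaller conversion factor.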

In closed, underground places, extremely high radon concentrations (from a few tens up to a hundred kBq/m3) may occur in the absence of ventilation, even in rocks of average radionuclide content. According to our measurements, both the daily and the yearly variation is well recognizable, originating from the variation of meteorological and lunisolar parameters. In the case of a few time series, we revealed a strong correlation between the outside temperature and the resulting radon concentrations. We found that atmospheric pressure also affects radon levels, but to a smaller extent than temperature.

Comprehensive statistics and Fourier analysis were also carried out in order to examine the dominant frequencies, and we also examined the change of the one-day components as a function of time.
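
The Fourier step can be sketched on a synthetic hourly series with a daily and a yearly component, mimicking the periodicities reported above (amplitudes and noise level are invented for illustration):

```python
import numpy as np

# Synthetic hourly radon series with a daily and a yearly component plus
# noise (amplitudes and noise level are invented for illustration).
hours = np.arange(2 * 365 * 24)                               # two years, hourly
series = (1000.0
          + 300.0 * np.sin(2 * np.pi * hours / 24.0)          # daily cycle
          + 500.0 * np.sin(2 * np.pi * hours / (365.0 * 24))  # yearly cycle
          + np.random.default_rng(0).normal(0.0, 50.0, hours.size))

# Fourier analysis: locate the dominant frequency of the detrended series.
spectrum = np.abs(np.fft.rfft(series - series.mean()))
freqs = np.fft.rfftfreq(series.size, d=1.0)                   # cycles per hour

dominant_period_h = 1.0 / freqs[np.argmax(spectrum)]
print(f"dominant period ~ {dominant_period_h:.0f} hours")
```

With these synthetic amplitudes the yearly component dominates the spectrum; the daily cycle appears as a secondary peak at a 24-hour period.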

How to cite: Szabolcs, T., Ákos, H., and László, S.-B.: Studies on radon time series in various underground environments: Case of abandoned Kővágószőlős uranium mine, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-18058, 2024.

EGU24-19068 | ECS | Orals | NH9.11

Long-term Evaluation of HPGe Calibration for Environmental Radioactivity Assessment Using IAEA-U and IAEA-Th Sources 

Debora Siqueira Nascimento, Riccardo Ciolini, Andrea Chierici, Stefano Chiappini, Francesco d'Errico, and Massimo Chiappini

The investigation of the dynamics between environmental radioactivity and its implications for human health stands as a fundamental pursuit in contemporary scientific research. Gamma spectrometry, particularly with High Purity Germanium (HPGe) detectors, emerges as a pivotal methodology for studying environmental radioactivity with precision. The veracity and dependability of these analyses hinge upon scrupulous and precise energy and efficiency calibration of the HPGe system. Within this framework, we used calibrated IAEA-U and IAEA-Th sources, thereby not only ensuring measurement accuracy but also establishing a robust foundation for a comprehensive evaluation of radioactivity levels. Our findings illuminate a comprehensive understanding of the energy and efficiency calibration of the HPGe detector, exemplified by linear energy calibration curves for both IAEA-U and IAEA-Th sources with high correlation coefficients (R² > 0.99). Essential for translating count rates to activity, the efficiency calibration consistently yielded low errors, with the maximum observed efficiency error below 4% for both sources, significantly below the limits recommended by standard rules. This study affirms the reliability and stability of our calibration methods through repeatability assessments over four years. Looking forward, the calibrated HPGe systems are prepared to assume a central role in the spectral analysis of different Italian terrains. Application of these calibrated detectors to Italian soil aims to discern and quantify the presence of radionuclides, thereby contributing to the radioprotection of the region. This prospective dimension underscores the practical application and broader implications of our calibrated systems in addressing environmental and health-related concerns.
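
A linear energy calibration of the kind described can be sketched as a least-squares fit of peak channel against known line energy; the channel numbers below are invented, while the gamma energies are common U/Th-series lines (this is not the authors' IAEA-U/IAEA-Th data).

```python
import numpy as np

# Hypothetical peak positions for an energy calibration (channel numbers are
# invented; the gamma lines are common U/Th-series energies in keV).
channels = np.array([561.6, 828.8, 1435.8, 2147.9, 6165.1])
energies = np.array([238.6, 351.9, 609.3, 911.2, 2614.5])   # keV

# Linear energy calibration E = gain * channel + offset, as in the abstract.
gain, offset = np.polyfit(channels, energies, 1)
predicted = gain * channels + offset

# Coefficient of determination R^2 of the calibration line.
ss_res = np.sum((energies - predicted) ** 2)
ss_tot = np.sum((energies - energies.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"gain = {gain:.4f} keV/channel, offset = {offset:.2f} keV, R^2 = {r2:.5f}")
```

The same fitted-curve logic applies to the efficiency calibration, with efficiency replacing energy as the dependent variable (typically on a log-log scale).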

How to cite: Siqueira Nascimento, D., Ciolini, R., Chierici, A., Chiappini, S., d'Errico, F., and Chiappini, M.: Long-term Evaluation of HPGe Calibration for Environmental Radioactivity Assessment Using IAEA-U and IAEA-Th Sources, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-19068, 2024.

EGU24-19881 | Posters on site | NH9.11

Multi-level continuous monitoring of residential radon in the urban context of Rome 

Gaia Soldati, Maria Grazia Ciaccio, Antonio Piersanti, Valentina Cannelli, and Gianfranco Galli

The urbanized area of Rome is largely built over volcanic deposits, characterized by a significant radionuclide content and radon emanation potential. A first step towards the mitigation of indoor radon exposure is the accurate monitoring of workplaces and residential dwellings. Due to the complex interactions among many environmental parameters on different time scales, a proper assessment of radon diffusion dynamics and concentration variations is better achieved by means of active monitoring approaches. We present here the results of one year of continuous measurements conducted in 6 premises (5 apartments and a basement) at different floors of the same building in the Esquilino district, in the historical center of Rome. The simultaneous tracking of different floors should cancel the influence of geogenic radon and of building characteristics like age, typology, and construction materials, and reveal the characteristics of gas emanation and transport inside the building, and of its temporal fluctuations, with the final goal of selecting the most suitable preventive measures to reduce radon exposure. Conducting the experiment in the Roman urban context, we cannot ignore the specificity of the retrieved data, affected not only by endogenous factors like heating and ventilation of the apartments, but also by exogenous factors like the urban heat island effect.

How to cite: Soldati, G., Ciaccio, M. G., Piersanti, A., Cannelli, V., and Galli, G.: Multi-level continuous monitoring of residential radon in the urban context of Rome, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-19881, 2024.

EGU24-20104 | Orals | NH9.11 | Highlight

Assessing the chemical availability and environmental fate of fallout radionuclides in cryoconite 

Caroline Clason, Harriet Davidson, Geoffrey Millward, Andrew Fisher, and Alex Taylor

Glaciers are stores for contaminants of both local and more distant origin that are released into the environment through anthropogenic processes. Cryoconite, a heterogeneous granular material commonly found on glacier surfaces, is now known to be an efficient accumulator of contaminants such as fallout radionuclides (FRNs) and potentially toxic elements, with multiple regional studies reporting notable concentrations of radioactivity in cryoconite that far exceed those found in other environmental matrices. Indeed, concentrations of FRNs in cryoconite can be as much as three orders of magnitude higher than those found in nearby proglacial sediments. While we now understand that this ‘hyper-accumulation’ of FRNs is commonplace on glaciers around the world, our understanding of the extent to which the release of contaminants stored in cryoconite poses a downstream environmental risk is in its infancy. To assess both the activity concentrations and the chemical availability of FRNs within cryoconite, we conducted novel sequential chemical extractions twinned with gamma spectrometry for cryoconite samples from glaciers in Arctic Sweden and Iceland. The major and minor elemental composition of cryoconite was also analysed with Wavelength Dispersive X-ray Fluorescence (WD-XRF) spectrometry. The results of these experiments demonstrate that different cryoconite-bound FRNs undergo varying degrees of solubilization, with consequences for increased contaminant mobilization under higher melt scenarios. Our work identifies a clear requirement for further research in this field in order to improve understanding of the downstream environmental risk from the secondary release of legacy contaminants under continued glacier retreat.

How to cite: Clason, C., Davidson, H., Millward, G., Fisher, A., and Taylor, A.: Assessing the chemical availability and environmental fate of fallout radionuclides in cryoconite, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-20104, 2024.

EGU24-21822 | Orals | NH9.11

Observation and geological interpretation of the longest vertical radon profile to date: variability of radon concentrations along a 323 m deep drilling 

Rouwen Lehne, Jessica Daum, Johannes Mair, Heiner Heggemann, Christian Hoselmann, and Andreas Henk

Radon soil air measurements and associated permeability measurements are a mandatory prerequisite for the calculation of radon potentials as an important basis for the statistical derivation of an expected radon situation in a defined area. Accordingly, in the federal state of Hesse, as almost everywhere in Germany, numerous measurements have been carried out in recent years and made available to the Federal Office for Radiation Protection (BfS) for the modelling of a radon potential map of Germany, which has since been an important (sometimes the only) basis for the definition of radon precautionary areas for all federal states in Germany. The associated benefits are undoubtedly great.

From a geological perspective, however, the question arises to what extent the large lateral variability of measurable radon concentrations also exists in the vertical and, if so, whether this variability can be placed in a context with the geological development of the area under consideration. The background to this is the fact that the radon soil gas measurements usually address a depth of between 0.8 and 1 m below the ground surface, in rare cases reaching a depth of up to 2 metres.

In addition to the scientific added value, such an investigation approach is also associated with an applied benefit, as building foundations usually lie significantly deeper than 1 m below the ground surface. This means that a significant part of the building envelope is in contact not only with the soil layers but also with the geological subsurface, and must therefore be considered decoupled from the radon concentration determined near the surface, depending on the heterogeneity of the geological bedding.

For this reason, we took a total of 175 samples along a 323 m deep research drilling in the northern Upper Rhine Graben and determined their radon concentrations in the laboratory (= stationary). The results show a very high variability of the measurable radon concentrations, ranging from 16 Bq/m³ to 9086 Bq/m³ with a mean value of approx. 1527 Bq/m³. At the same time, the radon concentrations determined show a very good correlation with both the geological description of the drill core and the gamma log measurements carried out.

In this presentation, we would like to show the results obtained so far and look at the possibility of regionalising the measured values as well as the next work steps.

How to cite: Lehne, R., Daum, J., Mair, J., Heggemann, H., Hoselmann, C., and Henk, A.: Observation and geological interpretation of the longest vertical radon profile to date: variability of radon concentrations along a 323 m deep drilling, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-21822, 2024.

EGU24-3806 | ECS | PICO | G2.7

A low-cost commercial off-the-shelf GNSS receiver for space 

Gregor Moeller, Alexander Wolf, Flavio Sonnenberg, Gerald Bauer, Benedikt Soja, and Markus Rothacher

The era of tracking artificial Earth satellites using Global Navigation Satellite Systems (GNSS) began in the early 1980s when a GPS receiver was launched onboard the Landsat-4 mission. Since then, a large number of Low Earth Orbiters have utilized constantly improved GPS receivers for timing and positioning. GNSS has become a key technique not only for satellite orbit determination but also for atmosphere sounding. With the increasing popularity of miniaturized satellites in recent years, the need for an adapted GNSS payload for nanosatellites arose. Therefore, we developed a small-size, versatile payload board using commercial-off-the-shelf (COTS) low-cost multi-GNSS receivers with extremely small weight, size, and power consumption.

The receiver firmware enables multi-constellation navigation solutions and GNSS raw data output in space with a sampling rate of up to 20 Hz. With this configuration, we can retrieve the required GNSS code and carrier phase measurements, e.g. for precise orbit and attitude determination, to monitor the total air density from drag, the distribution of the electron content, or scintillation effects. The high demands on GNSS receiver performance lead to particular requirements for hardware, payload software, onboard computing, data downlink, and remote control, which will be briefly discussed in the presentation. The resulting low-cost GNSS board fits into a 0.25U form factor, and the modular design makes it a scalable and adaptable payload for CubeSat missions.

In this presentation, we will provide insight into the performance of the GNSS payload under simulated orbit conditions and highlight the necessary modifications that allow us to transform a COTS GNSS receiver into a scientific instrument for space applications.

How to cite: Moeller, G., Wolf, A., Sonnenberg, F., Bauer, G., Soja, B., and Rothacher, M.: A low-cost commercial off-the-shelf GNSS receiver for space, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-3806, 2024.

Real-Time Single-Frequency Precise Point Positioning (PPP) is a cost-effective and promising method for achieving highly accurate navigation at sub-meter or centimeter levels. However, its success heavily relies on real-time ionospheric state estimations to correct delays in Global Navigation Satellite System (GNSS) signals. This research employs the Dynamic Mode Decomposition (DMD) model in conjunction with global ionospheric vertical total electron content (vTEC) Root Mean Square (RMS) maps to create 24-hour forecasts of global ionospheric vTEC RMS maps. These forecasts are integrated with C1P forecast products, and the performance of L1 single-frequency positioning solutions is compared across various ionospheric correction models. The study assesses the impact of assimilating predicted RMS data and evaluates the practicality of the proposed approach using the IGRG product. The results demonstrate that the IGSG RMS prediction-based model significantly enhances positioning accuracy for up to five hours ahead, yielding results comparable to alternative models. This approach holds promise for achieving high-precision navigation.
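
A minimal numpy sketch of the DMD idea referenced above — fitting a linear propagator to a sequence of map snapshots and stepping it forward — on a synthetic toy field (not actual IGS vTEC RMS products):

```python
import numpy as np

# Toy snapshot matrix: each column is a flattened "map" evolving on two
# oscillatory modes (entirely synthetic, standing in for vTEC RMS maps).
rng = np.random.default_rng(0)
n_pix, n_snap = 50, 40
modes = rng.normal(size=(n_pix, 2))
t = np.arange(n_snap)
coeffs = np.vstack([np.cos(2 * np.pi * t / 12.0),
                    np.sin(2 * np.pi * t / 12.0)])
data = modes @ coeffs                     # shape (n_pix, n_snap)

# Dynamic Mode Decomposition core step: fit A with x_{k+1} ~ A x_k.
X1, X2 = data[:, :-1], data[:, 1:]
A = X2 @ np.linalg.pinv(X1)

# One-step forecast from the last observed snapshot, compared with the
# known continuation of the synthetic dynamics.
forecast = A @ data[:, -1]
truth = modes @ np.array([np.cos(2 * np.pi * n_snap / 12.0),
                          np.sin(2 * np.pi * n_snap / 12.0)])
print("one-step forecast error:", np.linalg.norm(forecast - truth))
```

In practice the propagator is usually built on a truncated SVD basis for numerical robustness and iterated over many steps to produce the 24-hour-ahead maps described above.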

How to cite: Reuveni, Y. and Landa, V.: Advancing Real-Time GNSS Single-Frequency Precise Point Positioning through Ionospheric Corrections, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-5282, 2024.

EGU24-5511 | ECS | PICO | G2.7

New neutral density estimates and forecasts in the framework of project ESPRIT 

Andreas Strasser, Sandro Krauss, Manuel Scherf, Barbara Suesser-Rechberger, and Helmut Lammer

In the ongoing project ESPRIT, one goal is to investigate the contribution of the chemical composition and associated chemical reactions to the Earth’s upper atmosphere. This is realized through a combined analysis of thermospheric neutral density estimates and the exploration of external parameters of interplanetary space, including variations in the magnetic field and the merged electric field. Regarding changes in the chemical composition of the Earth’s atmosphere, which might cause heating and cooling effects, we investigated TIMED/SABER measurements in conjunction with findings from the 1D first-principles hydrodynamic upper-atmosphere Kompot code, which shows a significant expansion of the density profile driven mainly by the increased XUV flux from the Sun. The neutral mass densities were processed based on accelerometer measurements as well as on kinematic orbit information (Süsser-Rechberger et al. 2022). This allowed us to successfully process kinematic orbits for 19 different satellites at an altitude range of approximately 400 to 1300 km. Both approaches are realized using the in-house software package GROOPS. During the evaluation, significant improvements in the processing and parametrization were achieved compared to previous solutions, especially through refined models for solar radiation pressure, the Earth’s re-radiation and the thermal radiation of the satellite itself, and through consideration of the chemical composition of the atmosphere. Based on these new neutral density estimates, we investigate the effects of solar eruptions on the various satellites and use the results to forecast the orbital decay of LEO satellites.
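
As an illustration of the final step, the decay of a circular LEO orbit under drag can be forecast from a neutral density estimate with the standard relation da/dt = -sqrt(mu*a)*rho*(Cd*A/m). The density value, ballistic coefficient and step size below are placeholder assumptions, not project results:

```python
import numpy as np

MU = 3.986004418e14  # Earth's gravitational parameter (m^3 s^-2)

def orbit_decay(a0, rho, cd_a_over_m, t_end, dt=60.0):
    """Propagate the semi-major axis of a circular LEO orbit under drag.

    Uses the circular-orbit decay rate da/dt = -sqrt(MU*a) * rho(a) * Cd*A/m,
    with rho a callable returning neutral density (kg/m^3) at semi-major axis a (m).
    """
    a, t = a0, 0.0
    while t < t_end:
        a -= np.sqrt(MU * a) * rho(a) * cd_a_over_m * dt  # explicit Euler step
        t += dt
    return a
```

With an altitude-dependent density model in place of the constant placeholder, the same loop yields a first-order re-entry timeline.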

How to cite: Strasser, A., Krauss, S., Scherf, M., Suesser-Rechberger, B., and Lammer, H.: New neutral density estimates and forecasts in the framework of project ESPRIT, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-5511, 2024.

EGU24-5567 | PICO | G2.7

Sequential calibration and data assimilation for predicting atmospheric variability 

Ehsan Forootan, Saeed Farzaneh, Masoud Dehvari, Leire Retegui-Schiettekatte, and Maike Schumacher

Estimating global and multi-level variations of atmospheric variables, and being able to predict them, is very important for studying coupling processes within the atmosphere and for various geodetic and space weather applications. These variables include the thermospheric neutral density, the ionospheric electron density, and the tropospheric water vapour, which are relevant to applications such as orbit determination, satellite navigation, and weather/climate monitoring. Available models have difficulty predicting these variables realistically due to the simplicity of their structure or to sampling limitations. In this study, we present an ensemble-based simultaneous Calibration and Data Assimilation (C/DA) algorithm to integrate freely available satellite geodetic data (e.g., CHAMP, GRACE(-FO), Swarm, and GNSS) into empirical models, with the focus on improving the predictability of atmospheric variables. The improved model, called the 'C/DA-model', is assessed in relevant geodetic and space weather applications. For demonstration, C/DA-NRLMSISE-00 is examined during seven periods of relatively high geomagnetic activity and C/DA-IRI-ZWD during extensive rainy events.
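
A single assimilation step of such an ensemble-based calibration can be sketched as a stochastic ensemble Kalman update of the model parameters (a generic textbook formulation, not the authors' implementation; all names are illustrative):

```python
import numpy as np

def enkf_parameter_update(params, predictions, obs, obs_var, seed=0):
    """One stochastic ensemble Kalman update of model parameters.

    params      : (n_ens, n_par) ensemble of empirical-model parameters.
    predictions : (n_ens, n_obs) model-predicted observations per member.
    obs         : (n_obs,) observations; obs_var their error variance.
    """
    rng = np.random.default_rng(seed)
    n = params.shape[0]
    Pa = params - params.mean(axis=0)
    Pd = predictions - predictions.mean(axis=0)
    # Cross-covariance (parameters vs predicted obs) and innovation covariance
    C_pd = Pa.T @ Pd / (n - 1)
    C_dd = Pd.T @ Pd / (n - 1) + obs_var * np.eye(obs.size)
    K = C_pd @ np.linalg.inv(C_dd)                    # Kalman gain
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), (n, obs.size))
    return params + (perturbed - predictions) @ K.T
```

Repeating this update as new geodetic observations arrive pulls the ensemble of empirical-model parameters toward the data while retaining the model's own structure for prediction.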

How to cite: Forootan, E., Farzaneh, S., Dehvari, M., Retegui-Schiettekatte, L., and Schumacher, M.: Sequential calibration and data assimilation for predicting atmospheric variability, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-5567, 2024.

The upcoming low Earth orbit (LEO) constellations can bring new opportunities for ionospheric sounding below the LEO satellite altitude. The CENTISPACE™ LEO satellites operate at an altitude of 700 km and broadcast navigation augmentation signals to ground stations. This study established a regional bottomside ionospheric map (RBIM) using navigation augmentation signals from two CENTISPACE™ satellites on April 1, 2023, under moderate solar activity and quiet geomagnetic conditions. The RBIM accuracy was subsequently validated through comparison with multiple datasets, including Global and Regional Ionospheric Maps (GIMs and RIMs) constructed from ground-based GNSS observations, as well as the differential Slant Bottomside Electron Content (dSBEC) derived from LEO observations. To build the RBIM, the vertical bottomside electron content (VBEC) is fitted by two distinct methods, a grid map and a polynomial method. The root mean square (RMS) values of the RBIM fitting residuals are 1.2 TECU and 0.7 TECU for the two methods, respectively. The RBIM precision evaluated against LEO dSBEC is better than 1.0 TECU. Comparing the VBEC from the established RBIM to the GIM/RIM shows RMS values mostly within 3-8 TECU, which can be attributed to the limited modelling precision of the latter two models. Moreover, the RBIM enables probing the proportion of the VBEC in the total electron content using experimental data. The results derived from LEO observations indicate that the VBEC proportion is 83% at noon and 53% at night in the northern mid-latitude region, a reduction of 35.36%, which is more realistic than the value calculated from the empirical International Reference Ionosphere (IRI-2020) model (4.65%). Thus, the RBIM can not only benefit LEO navigation augmentation but also provide significant observations on the vertical distribution of ionospheric electron content.
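
The polynomial variant of the VBEC fit can be sketched as an ordinary least-squares fit of a low-degree surface in latitude and longitude (the degree and variable names are illustrative assumptions, not the authors' parameterisation):

```python
import numpy as np

def fit_vbec_poly(lat, lon, vbec, deg=2):
    """Least-squares fit of a 2-D polynomial surface VBEC(lat, lon).

    Returns the monomial exponents, the fitted coefficients and the RMS
    of the fitting residuals (in the same unit as vbec, e.g. TECU).
    """
    # Design matrix with all monomials lat^i * lon^j for i + j <= deg
    terms = [(i, j) for i in range(deg + 1) for j in range(deg + 1 - i)]
    A = np.column_stack([lat**i * lon**j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(A, vbec, rcond=None)
    rms = np.sqrt(np.mean((vbec - A @ coeffs) ** 2))
    return terms, coeffs, rms
```

The returned residual RMS plays the same role as the 0.7 TECU figure quoted for the polynomial method; the grid-map method would instead bin the VBEC values per cell.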

How to cite: He, R., Li, M., Li, W., and Zhang, Q.: Estimating bottomside ionosphere electron content using navigation augmentation observations from two CENTISPACETM LEO satellites, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-8540, 2024.

EGU24-9779 | ECS | PICO | G2.7

Monitoring short-term dynamic motion with single-frequency observations from a low-cost GNSS receiver 

Mert Bezcioglu, Berkay Bahadur, Ahmet Anil Dindar, and Cemal Ozer Yigit

In the last few decades, GNSS observations have frequently been used in Structural Health Monitoring (SHM) and Earthquake Early Warning (EEW) systems. The primary advantage of high-frequency GNSS techniques over conventional geotechnical sensors is the detection of displacements directly in a terrestrial reference frame. Among GNSS techniques, real-time kinematic (RTK) positioning has predominantly been employed in dynamic displacement monitoring because it provides high accuracy in real time. Nevertheless, an external GNSS infrastructure is essential in RTK applications to achieve high positioning accuracy, which restricts its use in possible mega-earthquake events. On the other hand, Precise Point Positioning (PPP), which can provide high positioning accuracy with a standalone GNSS receiver on a global scale, has emerged as an alternative to traditional GNSS techniques. However, like RTK, the requirement of an external internet connection for real-time PPP applications is the main restriction of this technique in possible mega-earthquake events. Instead, the variometric approach (VA) can provide high accuracy in determining dynamic behaviors with a standalone GNSS receiver and broadcast ephemeris only, meaning it requires no external infrastructure or connection. Furthermore, the emergence of new navigation systems, such as Galileo and BeiDou, brings considerable opportunities to improve the performance of the VA technique in detecting dynamic behaviors. Thanks to progress in GNSS receiver technology, low-cost GNSS receivers have been introduced and have attracted considerable attention from the GNSS community. Their compact design makes low-cost GNSS receivers very suitable for establishing monitoring networks in harsh environments, such as high-rise buildings and bridges. In this context, this study aims to evaluate the capability of the VA technique with a low-cost GNSS receiver in detecting horizontal dynamic motion in real time.
For this purpose, this study employs single-frequency (SF) observations of GPS, GLONASS, Galileo, and BeiDou satellites from the u-blox ZED-F9P receiver for the VA technique. Harmonic motions with amplitudes from 5 to 20 mm and frequencies between 0.3 and 5.0 Hz were generated by a single-axis shake table to analyze the capability of the SF-VA technique in detecting structural motion. In addition, a simulation of the Mw 6.9 1995 Kobe earthquake was performed on the shake table to assess the feasibility of the SF-VA technique in possible EEW systems. In the evaluation, displacements from a Linear Variable Differential Transformer (LVDT) were used as the reference. The results indicated that the peak frequency of short-term harmonic oscillations up to 5 Hz can be detected with the SF-VA technique using GNSS observations from the low-cost receiver. Moreover, the results demonstrated that the SF-VA technique can determine the strong ground motions resulting from mega earthquakes at the mm level.
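
The core of the variometric approach can be sketched as an epoch-wise least-squares inversion of time-differenced carrier phases for 3D displacement increments, which are then accumulated. This is a simplified geometry-only sketch, not the authors' processing chain; satellite motion, clock and atmospheric terms are assumed already removed:

```python
import numpy as np

def variometric_displacement(dphi, los):
    """Accumulate 3-D displacements from time-differenced carrier phases.

    dphi : (n_epochs, n_sats) time-differenced phases in metres, assumed
           already corrected for satellite motion, clocks and atmosphere.
    los  : (n_epochs, n_sats, 3) unit line-of-sight vectors receiver->satellite.
    """
    increments = np.empty((dphi.shape[0], 3))
    for k in range(dphi.shape[0]):
        # A receiver displacement dx changes each range by -los . dx,
        # so every epoch is a small least-squares problem (-los) dx = dphi
        increments[k] = np.linalg.lstsq(-los[k], dphi[k], rcond=None)[0]
    return np.cumsum(increments, axis=0)  # displacement time series
```

Because only broadcast ephemerides and the receiver's own phases enter `dphi`, no external infrastructure or connection is needed, which is the property the abstract highlights.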

How to cite: Bezcioglu, M., Bahadur, B., Dindar, A. A., and Yigit, C. O.: Monitoring short-term dynamic motion with single-frequency observations from a low-cost GNSS receiver, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-9779, 2024.

EGU24-10188 | PICO | G2.7

A new low-cost GNSS instrument for monitoring of ground motions and critical infrastructures within the Greek “Supersite” 

Athanassios Ganas, George Mavropoulos, Ioannis Karamitros, Konstantinos Nikolakopoulos, Vassiliki Charalampopoulou, Dimitrios Anastasiou, Theodoros Athanassopoulos, Aggeliki Kyriou, and Varvara Tsironi

There is a continuous need for integrating multi-parameter instrumental observations and measurements with satellite Earth observation data towards continuous monitoring of the environment and infrastructures. This task attains particular importance within the tectonically and seismically active area of the Greek "Supersite" (Corinth Gulf, Ionian Islands, etc.). The significant level of geohazards in this region has made it necessary to implement new technological approaches that could offer reliable augmentation to permanent networks (both geodetic and seismological). In this contribution, we demonstrate the design, construction and installation of a new technological infrastructure that is based on the collaboration of a multidisciplinary research team and on low-cost equipment. Our low-cost instrumentation includes a multi-GNSS dual-frequency chip (u-blox ZED-F9P module) mounted on a Raspberry Pi 4 compute module IO board together with an industry-standard MEMS accelerometer. It provides signal tracking for most GNSS systems (GPS, GLONASS, Galileo and BeiDou). The GNSS data are collected 24/7/365, quality-checked and processed using open-source software. The combined, synergistic use of these new sensors is compatible with ground motion data provided by GNSS reference stations and accelerometers used by seismic agencies. Current work includes the collection, homogenization, processing and archiving of daily data from three test sites using 4G telemetry. The GNSS data support the ongoing, pre-operational monitoring of three test sites together with InSAR Copernicus data (Tsironi et al. 2022).


Tsironi, V., Ganas, A., Karamitros, I., Efstathiou, E., Koukouvelas, I., Sokos, E. 2022. Kinematics of Active Landslides in Achaia (Peloponnese, Greece) through InSAR Time Series Analysis and Relation to Rainfall Patterns. Remote Sens., 14(4), 844.

How to cite: Ganas, A., Mavropoulos, G., Karamitros, I., Nikolakopoulos, K., Charalampopoulou, V., Anastasiou, D., Athanassopoulos, T., Kyriou, A., and Tsironi, V.: A new low-cost GNSS instrument for monitoring of ground motions and critical infrastructures within the Greek “Supersite”, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-10188, 2024.

The sensitivity of Global Navigation Satellite System (GNSS) receivers to ionospheric disturbances, together with the constant growth of GNSS applications, is resulting in increased concern among GNSS users about the impacts of ionospheric disturbances at mid-latitudes. The geomagnetic storm of June 22-23, 2015, is an example of a rare phenomenon: a spill-over of equatorial plasma bubbles well north of their habitual region of ~±20° around the magnetic equator.

We study the occurrence of small- and medium-scale irregularities in Southern Europe by analysing the behaviour of the amplitude scintillation index S4 and of the Rate Of Total Electron Content Index (ROTI) during the geomagnetic storm of June 22-23, 2015. To this end, we leverage data from local GNSS receivers for scintillation monitoring located in Lisbon (Portugal) and Lampedusa (Italy). These data are complemented with total electron content (TEC) data both from the local GNSS receivers and from global ionospheric maps.
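
The ROTI diagnostic used here follows its standard definition: the standard deviation of the rate of TEC (ROT) over a short window, typically 5 minutes. A minimal sketch (window length and sampling rate below are common choices, not necessarily those of this study):

```python
import numpy as np

def roti(tec, dt=30.0, window=300.0):
    """Rate Of TEC Index: std of the TEC rate over a sliding window.

    tec    : slant TEC series in TECU, sampled every dt seconds.
    window : window length in seconds (300 s = the usual 5 min).
    Returns one ROTI value (TECU/min) per window position.
    """
    rot = np.diff(tec) / (dt / 60.0)        # rate of TEC in TECU per minute
    n = int(window / dt)                    # samples per window
    return np.array([rot[i:i + n].std() for i in range(len(rot) - n + 1)])
```

A smoothly drifting TEC series gives ROTI near zero, while plasma-bubble-induced fluctuations raise it sharply, which is what makes ROTI a convenient irregularity proxy alongside S4.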

The multi-source data allow for a better understanding of the ionospheric dynamics during the studied event.

How to cite: Morozova, A., Estaço, D., Spogli, L., and Barata, T.: Scintillations in the Southern Europe during the geomagnetic storm of June 2015: analysis of a plasma bubbles spill-off using local data, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-10792, 2024.

EGU24-12608 | ECS | PICO | G2.7

NeGIX and TEGIX: two new indices to characterize the topside ionosphere with Swarm 

Juan Andrés Cahuasquí, Mohammed Mainul Hoque, Norbert Jakowski, Dmytro Vasylyev, Stephan Buchert, Grzegorz Nykiel, Martin Kriegel, Paul David, Youssef Tagargouste, and Jens Berdermann

Since its launch in 2013, ESA’s Swarm satellite constellation has pushed the frontiers of space weather research and monitoring by means of its broad spectrum of high-quality on-board experiments. In particular, Swarm observations are being used to globally characterize small- to mid-scale perturbations in the topside ionosphere that may cause severe amplitude and phase scintillations of trans-ionospheric radio signals. Ionospheric scintillation can cause radio signal outages, as well as disruption of modern technological systems used for telecommunication, navigation and remote sensing.

While performing the Swarm DISC project “Monitoring of Ionospheric Gradients at Swarm (MIGRAS)”, the MIGRAS team has profited from the close orbits and synchronization of Swarm satellites Alpha (A) and Charlie (C) to develop two new products that focus on the monitoring of small- to mid-scale plasma density irregularities with horizontal spatial scales in the order of about 100 km - the electron density (Ne) Gradient Ionospheric indeX (NeGIX), and the Total Electron Content (TEC) Gradient Ionospheric indeX (TEGIX). NeGIX estimates spatial Ne gradients using Langmuir probe measurements, and TEGIX estimates spatial TEC gradients using GNSS Precise Orbit Determination (POD) data of Swarm.

In this work, we provide a comprehensive analysis of the capability of these two novel Swarm data products to characterize the perturbation state of the ionosphere at different geographic locations and under different conditions of geomagnetic activity. Our analysis covers the whole period of available Swarm observations to quantitatively describe expected signatures of ionospheric variability, e.g. gradients at sunrise and sunset, or the equatorial crests. The analysis also concentrates on events of perturbed geomagnetic conditions to compare the performance of NeGIX and TEGIX with existing ground-based indices (e.g. GIX) and Swarm products (e.g. IPIR). Moreover, these indices have been developed to be technically compatible with Swarm’s and DLR’s operational data services. Our analysis therefore validates and discusses their applicability for space weather science and operational purposes.

Acknowledgement: The work is funded by the MIGRAS (Monitoring of Ionospheric Gradients At Swarm) project under the Swarm DISC Subcontract Doc. no: SW‐CO‐DTU‐GS‐133, Rev: 1, 13 September 2022.

How to cite: Cahuasquí, J. A., Hoque, M. M., Jakowski, N., Vasylyev, D., Buchert, S., Nykiel, G., Kriegel, M., David, P., Tagargouste, Y., and Berdermann, J.: NeGIX and TEGIX: two new indices to characterize the topside ionosphere with Swarm, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12608, 2024.

EGU24-14279 | PICO | G2.7

Tectonic monitoring with low-cost multi-GNSS installations in Greece 

Jonathan Bedford, Konstantinos Chousianitis, Athanassios Ganas, Vasiliki Mouslopoulou, Efthimios Sokos, Zafeiria Roumelioti, Konstantinos Nikolakopoulos, Christoforos Pappas, Markus Ramatschi, Carsten Falck, Benjamin Männel, Cristian Garcia, Carlos Peña, Kaan Cökerim, Elvira Latypova, Michail Gianniou, Paraskevi Io Ioannidi, Chris Pikridas, Ilias Lazos, and Vasiliki Saltogianni

In 2023, we began installing a low-cost tectonic multi-GNSS network in Greece, funded by the European Research Council. We have installed a total of 45 permanent/continuous-mode stations, with another 15-20 to be installed in 2024. Installations so far have been mainly on the Peloponnese peninsula, with the strategy of increasing spatial resolution in between the existing research and privately operated GNSS networks. Station maintenance is funded by the project (ERC StG: TectoVision) until 2027, but the intention is that as many of these stations as possible remain permanently installed.

The scientific purpose of the new stations is to increase the spatial resolution of microplate motions in Greece, but these data will also be of use to other research fields needing single- or multi-GNSS observables. Accordingly, these data are being released without embargo, subject to completion of quality-control checks (with the data publication and download link to be finalized before EGU 2024).

We consider this installation campaign to be a pilot project in affordable, rapid densification of tectonic-grade GNSS stations. Part of our strategy has been to use relatively low-cost monumentation for the geodetic marker onto which the low-cost equipment is mounted. Most stations are connected to the mains electricity supplies of public buildings, with the monumentation installed on the flat roofs of these buildings. In higher-altitude areas where flat roofs are rare, we have made three special installations at bedrock sites, with radio telemetry linking to a radio-receiving station in a nearby village. We use a range of telemetry solutions, the most common being the transfer of 30 s sampling data via a router containing a Machine-to-Machine (M2M) SIM card.

In this presentation, we will show data quality metrics from the initial analysis of 6-11 months of observations and compare them with time series processed from more expensive receiver-antenna combinations. We will also discuss what the team has learned practically (on-site) and logistically about installing low-cost GNSS stations at scale.

How to cite: Bedford, J., Chousianitis, K., Ganas, A., Mouslopoulou, V., Sokos, E., Roumelioti, Z., Nikolakopoulos, K., Pappas, C., Ramatschi, M., Falck, C., Männel, B., Garcia, C., Peña, C., Cökerim, K., Latypova, E., Gianniou, M., Ioannidi, P. I., Pikridas, C., Lazos, I., and Saltogianni, V.: Tectonic monitoring with low-cost multi-GNSS installations in Greece, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-14279, 2024.

EGU24-14661 | ECS | PICO | G2.7

Assessing measurement noises from low-cost GNSS receivers and antennas 

Ibaad Anwar and Balaji Devaraju

Observations from the Global Navigation Satellite System (GNSS) play a crucial role in numerous applications but are prone to measurement noise, especially when low-cost receivers and antennas are used. These measurement noises matter because they significantly impact the accuracy and reliability of positional data. This study investigates the characteristics and implications of measurement noises in low-cost GNSS systems, with a particular focus on the effects of receiver and antenna quality, environmental factors, and satellite dynamics. It employs a geometry-free approach to GNSS measurement analysis, aiming to identify and quantify the various noise sources in code-pseudorange and carrier-phase observations. The analysis utilized data from two low-cost GNSS stations, each equipped with a u-blox dual-frequency receiver; the stations are fitted with a survey-grade antenna and a navigational antenna, respectively. Additionally, data from the IGS station IITK have been used for comparative analysis.

How to cite: Anwar, I. and Devaraju, B.: Assessing measurement noises from low-cost GNSS receivers and antennas, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-14661, 2024.

EGU24-14817 | ECS | PICO | G2.7

The emergence of low-cost GNSS-IR sensors for surface change monitoring: a case study of the RPR network for measuring the Rhine River level 

Makan A. Karegar, Luciana Fenoglio-Marc, Kristine M. Larson, Jürgen Kusche, and Hakan Uyanik

GNSS Interferometric Reflectometry (GNSS-IR) is redefining its role as an innovative technique in environmental sensing. However, geodetic-quality GNSS receivers and antennas are still very expensive instruments, which limits their use as dedicated environmental sensors. Recently, low-cost GNSS-IR sensors have been developed for monitoring surface changes such as water level, snow depth and soil moisture. Real-time signal-to-noise ratio (SNR) observation, the key observable of ground-based GNSS-IR, opens up a range of possibilities for environmental monitoring with low-cost sensors that can operate unattended for long periods of time. We recently developed a low-cost water-level sensor called the Raspberry Pi Reflector (RPR) based on the GNSS-IR technique (Karegar et al. 2022, Water Resources Research, 58). In spring and summer 2023, a network of eight RPRs was installed along the Rhine, the largest river in Germany, from Petersau to Sankt Goar. Some of these RPRs were installed in the relatively steep and narrow Middle Rhine valley, where the terrain relief around the instrument can influence the effectiveness of the GNSS-IR approach. The water-level measurements provided by these sensors are used to validate SWOT observations of surface water levels. In this presentation, we will present the results of the RPR deployment and discuss the challenges associated with these low-cost sensors.
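
The underlying GNSS-IR retrieval can be sketched as a spectral search for the dominant multipath frequency in the detrended SNR, which maps directly to the antenna height above the reflecting water surface. This is a generic sketch, not the RPR's actual code; the candidate-height grid and GPS L1 wavelength are illustrative assumptions:

```python
import numpy as np

def reflector_height(sin_elev, snr_detrended, wavelength=0.19029, heights=None):
    """Estimate antenna height above the reflecting (water) surface.

    The multipath oscillation follows SNR ~ A*cos(4*pi*h/lambda * sin(e) + phi),
    i.e. its frequency in sin(elevation) is 2h/lambda; a periodogram over
    candidate heights picks the dominant one.
    """
    if heights is None:
        heights = np.arange(0.5, 10.0, 0.01)   # candidate heights (m)
    power = np.empty(len(heights))
    for i, h in enumerate(heights):
        arg = 4.0 * np.pi * h / wavelength * sin_elev
        # Projection power of the SNR series at this oscillation frequency
        power[i] = (snr_detrended @ np.cos(arg)) ** 2 \
                 + (snr_detrended @ np.sin(arg)) ** 2
    return heights[np.argmax(power)]
```

Tracking this height estimate over successive satellite arcs yields the water-level time series; in practice an unevenly sampled periodogram (e.g. Lomb-Scargle) is typically used instead of this uniform-grid sketch.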

How to cite: A. Karegar, M., Fenoglio-Marc, L., M. Larson, K., Kusche, J., and Uyanik, H.: The emergence of low-cost GNSS-IR sensors for surface change monitoring: a case study of the RPR network for measuring the Rhine River level, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-14817, 2024.

EGU24-14867 | ECS | PICO | G2.7

GFZRNX-QC: Advanced GNSS Data Processing and Quality Control for Multi-System Observations 

Xinghan Chen, Thomas Nischan, Zhiguo Deng, Benjamin Männel, and Jens Wickert

The GFZRNX-QC software is designed to streamline the processing of Receiver Independent Exchange Format (RINEX) observations and the generation of summary information by providing a robust and efficient solution for data cleaning and quality control. With a focus on multiple Global Navigation Satellite System (multi-GNSS) observations, GFZRNX-QC offers a comprehensive approach to ensuring data accuracy and reliability. It allows users to efficiently manage and analyze data from various GNSS receivers, especially low-cost ones. The software incorporates advanced algorithms for data cleaning, helping users eliminate inconsistencies and enhance the overall quality of GNSS observations, and it conducts comprehensive quality-control assessments to ensure that the processed data meet the highest standards of accuracy. The software generates detailed statistical results, offering insights into the performance and reliability of observations across the five major GNSS systems; this information aids researchers and analysts in making informed decisions. GFZRNX-QC produces various outputs compatible with legacy processing tools such as teqc, enhancing user convenience and interoperability with other geodetic processing tools.

GFZRNX-QC has been extensively tested by utilizing multi-year data from IGS stations to enable comprehensive long-term statistical analysis. By combining efficient data processing, advanced cleaning algorithms, and extensive quality control measures, GFZRNX-QC serves as a valuable tool for researchers, geodesists, and GNSS professionals seeking reliable and accurate observations and overall information from multiple satellite systems.
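
One widely used quality-control check of the kind such tools perform is cycle-slip screening with the Melbourne-Wübbena combination, which cancels geometry, clocks and ionosphere and so exposes wide-lane jumps. This is a minimal generic sketch, not GFZRNX-QC's actual algorithm; the threshold is illustrative:

```python
import numpy as np

F1, F2 = 1575.42e6, 1227.60e6   # GPS L1/L2 carrier frequencies (Hz)

def mw_cycle_slips(L1, L2, P1, P2, threshold=1.0):
    """Flag epochs where the Melbourne-Wuebbena combination jumps.

    L1, L2 (carrier phase) and P1, P2 (pseudorange) are in metres. The MW
    combination cancels geometry, clocks and ionosphere, leaving the wide-lane
    ambiguity; a jump larger than `threshold` metres marks a cycle slip.
    """
    mw = (F1 * L1 - F2 * L2) / (F1 - F2) - (F1 * P1 + F2 * P2) / (F1 + F2)
    return np.flatnonzero(np.abs(np.diff(mw)) > threshold) + 1  # slip epochs
```

Per-satellite slip counts and data-gap statistics of this kind are the typical ingredients of the summary reports that teqc-style quality control produces.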

How to cite: Chen, X., Nischan, T., Deng, Z., Männel, B., and Wickert, J.: GFZRNX-QC: Advanced GNSS Data Processing and Quality Control for Multi-System Observations, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-14867, 2024.

EGU24-17705 | PICO | G2.7

Atmospheric and Soil Moisture Monitoring in Agriculture Using GNSS: First Results from the MAGDA Project 

Andrea Gatti, Alessandro Fumagalli, Stefano Barindelli, and Eugenio Realini

The Meteorological Assimilation from Galileo and Drones for Agriculture (MAGDA) project aims to advance the integrated use of satellite-borne, drone-borne, and in-situ sensors, enhancing irrigation optimisation and weather hazard mitigation in agriculture. At its core, MAGDA employs low-cost Galileo-enabled GNSS ground stations for retrieving atmospheric water vapour and soil moisture. This data, combined with information from other technologies, is intended for assimilation into numerical weather prediction and hydrological models.

MAGDA’s demonstration sites are strategically located in three diverse agricultural regions of Europe: fruit plantations in Italy’s Piedmont, vineyards in France’s Burgundy, and mixed crops in Romania’s Braila county. Each of these sites is equipped with three low-cost GNSS stations, operational since mid-2023, providing valuable data for testing the efficacy and adaptability of GNSS technology in different agricultural and climatic conditions.

In addition to the three demonstration sites, MAGDA leverages data from pre-existing permanent GNSS stations across these countries. A comprehensive dataset from 397 stations in the Italy-France domain and 74 stations in the Romania domain has been downloaded. This dataset supports the assimilation of GNSS-derived water vapour over the entire weather-model domains, complementing the localised information from the project’s targeted low-cost stations.

GNSS data processing utilises GReD’s proprietary Breva software, capable of analysing multi-frequency and multi-constellation observations. Atmospheric water vapour estimates are obtained through an undifferenced and uncombined batch least squares Precise Point Positioning (PPP) approach. This method has been employed to analyse six weather events that significantly impacted agricultural operations at the demonstration sites, two events per site.

Soil moisture results have been obtained with a newly developed module of the Breva software that applies GNSS reflectometry, based on the analysis of SNR measurements influenced by the moisture of the surface soil. The methodology has been tested and validated at various previously studied sites, as well as directly at the low-cost GNSS stations established by the MAGDA project.

This work presents the preliminary results achieved in the first half of the MAGDA project, outlining encountered limitations and future development plans related to the analysis of MAGDA’s GNSS stations.

How to cite: Gatti, A., Fumagalli, A., Barindelli, S., and Realini, E.: Atmospheric and Soil Moisture Monitoring in Agriculture Using GNSS: First Results from the MAGDA Project, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-17705, 2024.

EGU24-17764 | ECS | PICO | G2.7

GNSS low-cost prototype on ship for catching tsunami wave propagation

Paul Jarrin, Lucie Rolland, Maurin Vidal, Pierre Sakic, Frédérique Leclerc, Jean-Xavier Dessa, and Sylvain Palagonia

Ship navigation data records have been proposed as complementary information for monitoring offshore tsunami currents following great earthquakes. Offshore GPS measurements on the research vessel Kilo Moana of the University of Hawaii following the 2010 Mw 8.8 Maule earthquake illustrated the potential of GPS kinematic positioning solutions, together with a filtering approach, for detecting the ship's vertical displacement induced by the passing tsunami. However, kinematic positioning of GPS observations on ships is challenging because the load, ship speed, and wavefield changes on the open ocean might produce fast changes in the ship's drift and vertical motion. The wavefield can also introduce additional noise frequencies into the GPS positioning, thus decreasing its precision. Here, we present a dual-frequency Global Navigation Satellite System (GNSS) low-cost prototype based on the Septentrio mosaic-X5 module and a low-cost AS-ANT2BCAL antenna. This low-cost GNSS station has been installed on a non-commercial ship fleet in order to assess the precision and noise content of offshore GNSS positioning and of ionospheric Total Electron Content measurements. We discuss our preliminary results by comparing the precision of the multi-GNSS solution (GPS, GLONASS, Galileo) with that of the GPS-only solution, using both long-baseline and Precise Point Positioning approaches in post-processing mode. In a second step, we simulate a real-time multi-GNSS positioning solution to evaluate its ability to capture wavefield changes. We finally discuss the detectability of tsunamis with the newly developed low-cost GNSS prototype under various conditions.

How to cite: Jarrin, P., Rolland, L., Vidal, M., Sakic, P., Leclerc, F., Dessa, J.-X., and Palagonia, S.: GNSS low-cost prototype on ship for catching tsunami wave propagation, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-17764, 2024.

EGU24-1261 | Orals | GM3.1

Machine-learning based 3D point cloud classification and multitemporal change analysis with simulated laser scanning data using open source scientific software 

Bernhard Höfle, Ronald Tabernig, Vivien Zahs, Alberto M. Esmorís Pena, Lukas Winiwarter, and Hannah Weiser

AIM: We will present how virtual laser scanning (VLS), i.e., simulation of realistic LiDAR campaigns, can be key for applying machine/deep learning (ML/DL) approaches to geographic point clouds. Recent results will be shown for semantic classification and change analysis in multitemporal point clouds using exclusively open source scientific software.

MOTIVATION: Laser scanning delivers precise 3D point clouds, which have driven huge progress in geoscience research over the last decade. Capturing multitemporal (4D: 3D + time) point clouds makes it possible to observe and quantify Earth surface process activities, their complex interactions and their triggers. Due to the large size of the 3D/4D datasets that can be captured by modern systems, automatic methods are required for point cloud analysis. Machine learning approaches applied to geographic point clouds, in particular DL, have shown very promising results for many different geoscientific applications [1,2].

METHODS & RESULTS: While new approaches for deep neural networks are developing rapidly [1], the bottleneck of sufficient and appropriate training data (typically annotated point clouds) remains the major obstacle for many applications in geosciences. These data-hungry learning methods depend on proper domain representation by the training data, which is challenging for natural surfaces and dynamics with high intra-class variability. Synthetic LiDAR point clouds generated by means of VLS, e.g. with the open-source simulator HELIOS++ [3], can be a possible solution to overcome the lack of training data for a given task. In a virtual 3D/4D scene representing the target surface classes, different LiDAR campaigns can be simulated, with all generated point clouds being automatically annotated. VLS software like HELIOS++ makes it possible to simulate any LiDAR platform and settings for a given scene, which offers high potential for data augmentation and the creation of training samples tailored to specific applications. In recent experiments [1], purely synthetic training data achieved performance similar to that of costly labeled training data from real-world acquisitions for semantic scene classification.

Furthermore, surface changes can be introduced to create dynamic VLS scenes (e.g., erosion, accumulation, movement/transport). Combining LiDAR simulation with automatic change analysis, such as that offered by the open-source scientific software py4dgeo [5], enables ML-based change analysis in multitemporal point clouds [6]. Recent results show that rockfall activity mapping and classification for permanent laser scanning data can be successfully implemented by combining HELIOS++, py4dgeo and the open-source framework VL3D [4], which can be used to investigate various ML/DL approaches in parallel.

CONCLUSION: Expert domain knowledge (i.e., definition of proper 3D/4D scenes) and the power of AI can be closely coupled in VLS-driven ML/DL approaches to analyze 3D/4D point clouds in the geosciences. Open-source scientific software already offers all required components (HELIOS++, VL3D, py4dgeo). 


[1] Esmorís Pena, A. M., et al. (2024): Deep learning with simulated laser scanning data for 3D point cloud classification. ISPRS Journal of Photogrammetry and Remote Sensing, under revision.

[2] Winiwarter, L., et al. (2022): DOI: 

[3] HELIOS++:

[4] VL3D framework:

[5] py4dgeo:

[6] Zahs, V. et al. (2023): DOI:

How to cite: Höfle, B., Tabernig, R., Zahs, V., Esmorís Pena, A. M., Winiwarter, L., and Weiser, H.: Machine-learning based 3D point cloud classification and multitemporal change analysis with simulated laser scanning data using open source scientific software, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-1261, 2024.

EGU24-1640 | ECS | Posters on site | GM3.1

Automatic Classification of Surface Activity Types from Geographic 4D Monitoring Combining Virtual Laser Scanning, Change Analysis and Machine Learning 

Vivien Zahs, Bernhard Höfle, Maria Federer, Hannah Weiser, Ronald Tabernig, and Katharina Anders

We advance the characterization of landscape dynamics through analysis of point cloud time series by integrating virtual laser scanning, machine learning and innovative open source methods for 4D change analysis. We present a novel approach for automatic identification of different surface activity types in real-world 4D geospatial data using a machine learning model trained exclusively on simulated data.

Our method focuses on classifying surface activity types based on spatiotemporal features. We generate training data using virtual laser scanning of a dynamic coastal scene with artificially induced surface changes. Scenes with surface change are generated using geographic knowledge and the concept of 4D objects-by-change (4D-OBCs) [1, 2], which represent spatiotemporal subsets of the scene that exhibit change with similar properties. Realistic 3D scene modelling is essential for accurately replicating the dynamic nature of coastal landscapes, where morphological changes are driven by both natural processes and anthropogenic activities.

The Earth's landscapes exhibit complex dynamics, spanning large spatiotemporal scales, from high-mountain glaciers to sandy coastlines. The challenge lies in effectively detecting and classifying diverse surface activities with varying magnitudes, spatial extents, velocities, and return frequencies. Effective characterization of these dynamics is crucial for understanding the underlying environmental processes and their interplay with human activities. Supervised machine learning classification of surface activities from point cloud time series is challenging due to the limited availability of comprehensive and diverse real-world datasets for training and validation. Our approach combines virtual laser scanning with machine learning-based classification, enabling the generation of comprehensive training datasets covering the full spectrum of expected change patterns [3].

In our approach, the simulation of LiDAR point clouds is performed in the open-source framework HELIOS++ [4, 5]. HELIOS++ allows the flexible simulation of custom LiDAR campaigns with diverse acquisition modes and settings together with automatic annotations of artificially induced surface changes. We train a supervised machine learning model to classify synthetic 4D-OBCs into typical surface activity types of a sandy beach (e.g. dune erosion/accretion, sediment transport, etc.). Moreover, we investigate descriptors for 4D-OBCs, assessing their suitability for representing general types of surface activity (transferable between use cases) and types specific to particular surface processes.

We evaluate our model for 4D-OBC classification in terms of its capacity to discriminate surface activity types in a real-world dataset of a sandy beach in the Netherlands [6]. 4D-OBCs are extracted, classified into our target classes and validated with manually labelled reference data based on expert evaluation.

Our study showcases the efficacy of coupling virtual laser scanning, innovative open-source 4D change analysis methods, and machine learning for classifying natural surface changes [7]. Our findings not only contribute to advancing the understanding of landscape dynamics but also provide a promising approach to mitigating environmental challenges.


[1] Anders et al. (2022): DOI:

[2] py4dgeo: 

[3] Zahs et al. (2022): DOI:

[4] HELIOS++:

[5] Winiwarter et al. (2022): DOI: 

[6] Vos et al. (2022): DOI:

[7] CharAct4D:

How to cite: Zahs, V., Höfle, B., Federer, M., Weiser, H., Tabernig, R., and Anders, K.: Automatic Classification of Surface Activity Types from Geographic 4D Monitoring Combining Virtual Laser Scanning, Change Analysis and Machine Learning, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-1640, 2024.

The acquisition of aerial photographs for cartographic applications started in the 1930s, and more intensively after World War II. Such old, often panchromatic, imagery offers metre to sub-metre scale spatial resolution over landscapes that have significantly evolved over the decades. Before the appearance of the first digital aerial camera systems at the end of the 20th Century, surveys were performed with analogue metric cameras, with images acquired on films or glass plates and, next, developed on photo papers. In Europe and North America, several institutions hold unique collections of historical aerial photographs having local, national and, in some cases, colonial coverages. They represent invaluable opportunities for environmental studies, allowing the comparison with today’s land use land cover, and the analysis of long-term surface displacements.

Initially, the photogrammetric processing of analogue aerial photographs required expensive equipment, specialised operators, and significant processing time. Thanks to the digital revolution of the past two decades and the development of modern digital photogrammetric approaches, processing this type of image dataset has become less cumbersome, time-consuming and expensive, at least in theory. In practice, matters are more complex, with digitising and processing issues related to the ageing and conservation quality of the aerial photographs, the potential distortions introduced during digitising, and the lack of ancillary data such as flight plans and camera calibration reports. The limited overlap between photographs, typically 60 % along-track and 10-20 % across-track, makes their processing with Structure-from-Motion Multi-View Stereo (SfM-MVS) photogrammetry poorly reliable for accurately reconstructing the topography and orthorectifying the images. Given that some collections comprise up to millions of historical aerial photographs, the digitising, pre-processing, and photogrammetric processing of these images remain a challenge that must be properly tackled if we are to ensure their preservation and large-scale valorisation.

In the present work, we describe the mass-digitising, digital image pre-processing and photogrammetric processing approaches implemented at the Royal Museum for Central Africa (RMCA, Belgium) to preserve and valorise the collection of >320,000 historical aerial photographs conserved in this federal institution. This imagery was acquired between the 1940s and the 1980s over Central Africa, mostly D.R. Congo, Rwanda and Burundi. For the digitising, a system of parallelized flatbed scanners controlled by a Linux computer and self-developed software makes it possible to scan the entire collection in only a few years. A series of Python scripts were developed and combined to allow a swift pre-processing that prepares and optimises the digitised images for photogrammetric processing. Finally, an SfM-MVS photogrammetric approach adapted to historical aerial photos is used. Examples of application to geo-hydrological hazard studies in the western branch of the East African Rift are shown.

How to cite: Smets, B., Dille, A., Dewitte, O., and Kervyn, F.: Digitising, pre-processing and photogrammetric processing of historical aerial photographs for the production of high resolution orthomosaics and the study of geohazards, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-2356, 2024.

EGU24-4399 | ECS | Posters on site | GM3.1

Evaluating the efficacy of multitemporal TLS and UAS surveys for quantifying wind erosion magnitudes of sand dune topography 

László Bertalan, Gábor Négyesi, Gergely Szabó, Zoltán Túri, and Szilárd Szabó

Wind erosion constitutes a prominent land degradation process in regions of Hungary characterized by low annual precipitation. In these areas, it poses significant challenges to agricultural productivity and adversely impacts soil and environmental quality. Presently, human activities exert a more pronounced influence on the endangered areas of Hungary in comparison to climate-related factors. It is noteworthy that the wind erodibility of Hungarian soils not only poses a soil conservation challenge but also gives rise to economic ramifications, such as nutrient loss, as well as environmental and human health concerns. Within agricultural landscapes, wind erosion contributes to the removal and transportation of the finest and biologically active soil fractions, rich in organic matter and nutrients.

High-resolution topographic surveys have become integral for assessing volumetric changes in sand dune mobility and mapping wind erosion. While Unmanned Aerial Systems (UAS) surveys have been extensively employed for erosion rates exceeding the decimeter scale, Terrestrial Laser Scanning (TLS) surveys have demonstrated efficiency in capturing more extensive negative erosional forms, even in a vertical orientation. To enhance the field of view, a mounting framework can be implemented to elevate the TLS. However, determining centimeter-scale material displacement in flat terrain conditions remains challenging and requires an increased number of scanning positions.

To identify optimal settings for surveying centimeter-scale wind erosion magnitudes, we conducted combined multi-temporal TLS and UAS surveys at the Westsik experimental site near Nyíregyháza during the spring of 2023. This site features dune topography with a height of 6 meters. Our investigations encompassed various UAS image acquisition modes, involving different flight altitudes and camera settings, utilizing a DJI Matrice M210 RTK v2 drone and a Zenmuse X7 24 mm lens. Additionally, we generated diverse point clouds through various scanning scenarios using a Trimble X7 TLS device. In the data processing phase, we explored multiple co-registration algorithms to address the challenge of larger Root Mean Square Error (RMSE) in Digital Terrain Models (DTMs) from UAS Structure from Motion (SfM) compared to the actual wind erosion rates.


The research is supported by the NKFI K138079 project.

How to cite: Bertalan, L., Négyesi, G., Szabó, G., Túri, Z., and Szabó, S.: Evaluating the efficacy of multitemporal TLS and UAS surveys for quantifying wind erosion magnitudes of sand dune topography, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-4399, 2024.

EGU24-5142 | Posters on site | GM3.1 | Highlight

Four nationwide Digital Surface Models from airborne historical stereo-images 

Christian Ginzler, Livia Piermattei, Mauro Marty, and Lars T. Waser

Historical aerial images, captured by film cameras in the previous century, have emerged as valuable resources for quantifying Earth's surface and landscape changes over time. In the post-war period, historical aerial images were often acquired to create topographic maps, resulting in the acquisition of large-scale aerial photographs with stereo coverage. Applying photogrammetric techniques to stereo-images makes it possible to extract 3D information to reconstruct Digital Surface Models (DSMs) and orthoimages.

This study presents a highly automated photogrammetric approach for generating nationwide DSMs for Switzerland at 1 m resolution using aerial stereo-images acquired between 1979 and 2006. The 8-bit scanned images, with known exterior and interior orientation, were processed using BAE Systems' SocetSet (v5.6.0) with the "Next-Generation Automatic Terrain Extraction" (NGATE) package for DSM generation. The primary objective of the study is to derive four nationwide DSMs for the epochs 1979-1985, 1985-1991, 1991-1998, and 1998-2006. The study assesses DSM quality in terms of vertical accuracy and completeness of image matching across different land cover types, with a focus on forest dynamics and management research.

The elevation accuracy of the generated DSMs was assessed using two reference datasets. Firstly, the elevation differences between a nationwide reference Digital Terrain Model (DTM - swissAlti3d 2017 by Swisstopo) and the generated DSMs were calculated on points classified as "sealed surface". Secondly, elevation values of the DSMs were compared to approximately 500 independent geodetic points distributed across the country. Six study areas were chosen to assess completeness, which was calculated as the percentage of successfully matched points relative to the potential total number of matched points within a predefined area. This assessment was conducted for six land cover classes based on the land cover/land-use statistics dataset from the Federal Office of Statistics.

Across the entire country, the median elevation accuracy of the DSMs on sealed points ranges from 0.28 to 0.53 m, with a Normalized Median Absolute Deviation (NMAD) of around 1 m (maximum 1.41 m) and an RMSE of at most 3.90 m. The elevation differences between geodetic points and DSMs show higher accuracy, with a median value of at most 0.05 m and an NMAD smaller than 1 m. Completeness results reveal mean completeness between 64 % and 98 % for the classes "glacial and perpetual snow" and "sealed surfaces", respectively, and 93 % specifically for the "closed forest" class.
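
For reference, the two reported quality measures can be sketched in a few lines of Python (a minimal illustration, not the processing chain used in the study; the function names are assumptions):

```python
import statistics

def nmad(differences):
    """Normalized Median Absolute Deviation: a robust spread estimate that
    equals the standard deviation for normally distributed errors."""
    med = statistics.median(differences)
    return 1.4826 * statistics.median(abs(d - med) for d in differences)

def completeness(matched_points, potential_points):
    """Percentage of successfully matched points relative to the potential
    total number of matched points within a predefined area."""
    return 100.0 * matched_points / potential_points
```

Because NMAD is median-based, a few gross matching blunders barely move it, which is why it is reported alongside the (outlier-sensitive) RMSE.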

This work demonstrates the feasibility of generating accurate DSM time series (spanning four epochs) from historical scanned images for the whole of Switzerland in a highly automated manner. The resulting DSMs will be made available upon publication, providing an excellent opportunity to detect major surface changes, such as forest dynamics.

How to cite: Ginzler, C., Piermattei, L., Marty, M., and Waser, L. T.: Four nationwide Digital Surface Models from airborne historical stereo-images, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-5142, 2024.

EGU24-5670 | ECS | Posters on site | GM3.1

Enhancing 3D Feature-based Landslide Monitoring Efficiency by Integrating Contour Lines in Laser Scanner Point Clouds 

Kourosh Hosseini, Jakob Hummelsberger, Daniel Czerwonka-Schröder, and Christoph Holst

Landslides are a pervasive natural hazard with significant societal and environmental impacts. Addressing the critical need for accurate landslide detection and monitoring, our previous research introduced a feature-based monitoring method enhanced by histogram analyses, occupying a middle ground between point-based and point cloud-based methods. This paper expands upon that foundation, introducing an innovative technique that extracts contour lines from various epochs to precisely identify areas prone to deformation. This refined focus diverges from conventional methodologies that analyze entire point clouds. By concentrating on regions where contour lines do not match, indicating potential ground movement, we significantly increase the efficiency and precision of our feature-based monitoring system.


One of the principal challenges of feature-based monitoring is managing a substantial number of outliers. Our prior research tackled this issue effectively by integrating feature tracking with histogram analysis, thereby filtering these outliers from the final results. However, the process of extracting features from each patch and matching them with corresponding patches from different epochs was time-intensive.


The incorporation of contour line extraction into our workflow, using high-resolution laser scanner data, allows for a more focused and efficient analysis. We can now identify and analyze areas of landscape alteration with greater accuracy. This approach limits the application of feature tracking and histogram analysis to these critical areas, thus streamlining the process and significantly reducing computational demands. This focused methodology not only accelerates data processing but also enhances the accuracy of landslide predictions.


Our findings indicate a substantial improvement in the efficiency of landslide monitoring methods. This methodology represents a promising advancement in geospatial analysis, particularly for environmental monitoring and risk management in regions susceptible to landslides. This research contributes to the ongoing efforts to develop more effective, efficient, and accurate approaches to landslide monitoring, ultimately aiding in better informed and timely decision-making processes for hazard mitigation and risk management.

How to cite: Hosseini, K., Hummelsberger, J., Czerwonka-Schröder, D., and Holst, C.: Enhancing 3D Feature-based Landslide Monitoring Efficiency by Integrating Contour Lines in Laser Scanner Point Clouds, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-5670, 2024.

EGU24-5674 | ECS | Orals | GM3.1

Piecewise-ICP: Efficient Registration of 4D Point Clouds for Geodetic Monitoring 

Yihui Yang, Daniel Czerwonka-Schröder, and Christoph Holst

Permanent terrestrial laser scanning (PLS) systems have opened up possibilities for efficient data acquisition at high temporal and spatial resolution, allowing improved capture and analysis of complex geomorphological changes on the Earth's surface. Accurate georeferencing of the four-dimensional point clouds (4DPC) generated by PLS is a prerequisite for the subsequent change analysis. Due to the massive data volume and potential changes between scans, however, efficient, robust, and automatic georeferencing of 4DPC remains challenging, especially in scenarios lacking signalized and reliable targets. Georeferencing is typically realized by designating a reference epoch and registering all other scans to it. Addressing the challenges of targetless registration of topographic 4DPC, we propose a simple and efficient registration method called Piecewise-ICP, which segments the point clouds into planar patches and aligns them in a piecewise manner.

Assuming the stable areas on the monitored surfaces are locally planar, supervoxel-based segmentation is employed to generate small planes from adjacent point clouds. These planes are then refined and classified by comparing defined correspondence distances to a monotonically decreasing distance threshold, progressively eliminating unstable planes in an efficient iterative process and preventing the ICP from converging to local minima. Finally, point-to-plane ICP is performed on the centroids of the remaining stable planes. We introduce the level of detection from change analysis to determine the minimum distance threshold, which mitigates the influence of outliers and deformed areas on registration accuracy. In addition, the spatial distribution of empirical registration uncertainties on the registered point clouds is derived via the variance-covariance propagation law.
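
The stable-plane selection with a monotonically decreasing threshold can be illustrated by a minimal sketch (pure Python, not the authors' implementation; the function name, the decay factor and the dict-based plane representation are illustrative assumptions, and the re-estimation of the ICP transform between iterations is only indicated as a comment):

```python
def select_stable_planes(distances, start_threshold, level_of_detection, decay=0.5):
    """Progressively shrink the correspondence-distance threshold (down to the
    level of detection) and keep only planes whose point-to-plane distance
    stays below it. `distances` maps a plane id to its current correspondence
    distance; the surviving plane ids mark the stable areas used for ICP."""
    stable = dict(distances)
    threshold = start_threshold
    while threshold > level_of_detection:
        # discard planes whose correspondence distance exceeds the threshold
        stable = {pid: d for pid, d in stable.items() if abs(d) < threshold}
        threshold = max(level_of_detection, threshold * decay)
        # In the full method, the transformation would be re-estimated here by
        # point-to-plane ICP on the centroids of the remaining stable planes,
        # and the correspondence distances re-measured before the next pass.
    return set(stable)
```

Bounding the threshold from below by the level of detection is what keeps genuine measurement noise from eroding the stable set.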

Our registration method is demonstrated on two datasets: (1) synthetic point cloud time series with defined changes and transformation parameters, and (2) a 4DPC dataset from a PLS system installed in the Vals Valley (Tyrol, Austria) for rockfall monitoring. The experimental results show that the proposed algorithm achieves higher registration accuracy than existing robust ICP variants. The real-time capability of Piecewise-ICP is significantly improved owing to the centroid-based point-to-plane ICP and the efficient iteration process.

How to cite: Yang, Y., Czerwonka-Schröder, D., and Holst, C.: Piecewise-ICP: Efficient Registration of 4D Point Clouds for Geodetic Monitoring, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-5674, 2024.

EGU24-5757 | Posters on site | GM3.1

Arctic puzzle: pioneering a shrimp habitat model in topographically complex Disko Bay (West Greenland) 

Diana Krawczyk, Tobias Vonnahme, Ann-Dorte Burmeister, Sandra Maier, Martin Blicher, Lorenz Meire, and Rasmus Nygaard

Our study focuses on the geologically, topographically, and oceanographically complex region of Disko Bay in West Greenland, which is also considered a marine biodiversity hotspot in Greenland. Given the impact of commercial fishing on seafloor integrity in the area, seafloor habitat studies are crucial for the sustainable use of marine resources. One of the key fishery resources in Greenland, as in the North Atlantic Ocean generally, is the northern shrimp.

In this study we analyzed (1) multiple monitoring datasets from 2010 to 2019, including data from shrimp and fish surveys, commercial shrimp fishery catches and satellite chlorophyll data, and (2) seafloor models combining high-resolution (25 x 25 m) multibeam data with a low-resolution (200 x 200 m) IBCAO grid. Using multivariate regression analysis and a spatial linear mixed-effects model, we assessed the impact of physical (water depth, bottom water temperature, sediment type), biological (chlorophyll a, Greenland halibut predation), and anthropogenic factors (shrimp fishery catch and effort) on shrimp density in the area. The resulting high-resolution predictive model of northern shrimp distribution in Disko Bay is the first of its kind developed for an Arctic area.

Our findings reveal that shrimp density is significantly associated with static habitat factors, namely sediment type and water depth, explaining 34% of the variation. The optimal shrimp habitat is characterized by medium-deep water (approximately 150-350 m) and mixed sediments, primarily in the north-eastern, south-eastern, and north-western Disko Bay. This pioneering study highlights the importance of seafloor habitat mapping and modeling, providing fundamental geophysical knowledge necessary for long-term sustainable use of marine resources in Greenland.

The developed high-resolution model contributes to a better understanding of detailed patterns in northern shrimp distribution in the Arctic, offering valuable insights for stock assessments and sustainable fishery management. This novel approach to seafloor habitat mapping supports the broader goal of ensuring the responsible utilization of marine resources, aligning with principles of environmental conservation and fisheries management. Our work serves as a foundation for ongoing efforts to balance economic interests with the preservation of marine ecosystems, fostering a harmonious coexistence between human activities and the fragile Arctic environment.

How to cite: Krawczyk, D., Vonnahme, T., Burmeister, A.-D., Maier, S., Blicher, M., Meire, L., and Nygaard, R.: Arctic puzzle: pioneering a shrimp habitat model in topographically complex Disko Bay (West Greenland), EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-5757, 2024.

EGU24-10361 | ECS | Orals | GM3.1

A Time-Series Analysis of Rockfall Evolution in a Coastal Region Using Remote Sensing Data 

Aliki Konsolaki, Emmanuel Vassilakis, Evelina Kotsi, Michalis Diakakis, Spyridon Mavroulis, Stelios Petrakis, Christos Filis, and Efthymios Lekkas

The evolution of technology, particularly the integration of Unmanned Aerial Systems (UAS), Earth observation datasets, and historical data such as aerial photographs, provides fundamental tools for comprehending and reconstructing surface evolution and potential environmental changes. In addition, active geodynamic phenomena, in conjunction with the climate crisis and the increasing frequency of extreme weather events, can cause abrupt events such as rockfalls and landslides, completely altering the morphology at both small and large scales.

This study deals with the temporal evolution of landscapes in general, and focuses specifically on the detection and quantification of a significant rockfall event that occurred at Kalamaki Beach on Zakynthos Island, Greece – a very popular summer destination. Utilizing UAS surveys conducted in July 2020 and July 2023, this research revealed a rockfall that significantly altered the coastal morphology. During this period, two severe natural events occurred, either of which could potentially have caused the rockfall. First, the Mediterranean hurricane ('medicane') 'Ianos' made landfall in September 2020, affecting a large part of the country including the Ionian Islands and causing severe damage to property and infrastructure, along with human casualties, through intense precipitation, flash flooding, strong winds, and wave action. Second, in September 2022, an ML=5.4 earthquake struck between Cephalonia and Zakynthos Islands in the Ionian Sea, with considerable impact on both islands. The study employs satellite images postdating these natural disasters to detect the source of the rockfall at Kalamaki Beach. Additionally, historical analog aerial images from 1996 and 2010 were used to understand the surface evolution. For the quantitative analysis, we applied semi-automated 3D change detection techniques, such as the M3C2 algorithm, to estimate the volume of the rockfall.
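
As a rough illustration of volume estimation from change detection, the sketch below sums elevation losses over a differenced grid. This is a simplified DEM-differencing stand-in, not the M3C2 algorithm used in the study (M3C2 measures distances along local surface normals on the point clouds themselves); all names and the detection threshold are illustrative assumptions:

```python
def rockfall_volume(dem_before, dem_after, cell_size, min_change=0.1):
    """Estimate lost volume (m^3) by differencing two co-registered DEMs,
    given as equally sized 2D lists of elevations in metres. Only elevation
    losses above `min_change` (a crude level of detection) are counted."""
    cell_area = cell_size * cell_size
    volume = 0.0
    for row_before, row_after in zip(dem_before, dem_after):
        for z_before, z_after in zip(row_before, row_after):
            dz = z_before - z_after  # positive where material was lost
            if dz > min_change:
                volume += dz * cell_area
    return volume
```

The detection threshold plays the same role as the level of detection in point-cloud change analysis: differences below it are indistinguishable from registration noise and are excluded from the volume.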

The results provide insights into the complex interplay between natural disasters and geological processes, shedding light on the dynamic nature of landscapes and the potential implications for visitor-preferred areas.

This research not only contributes to our understanding of landscape evolution but also underscores the importance of integrating modern and historical datasets to decipher the dynamic processes shaping the Earth's surface. The proposed methodology serves as a valuable approach for assessing and managing geological hazards in coastal regions affected by both climatic events and geodynamic activity.

How to cite: Konsolaki, A., Vassilakis, E., Kotsi, E., Diakakis, M., Mavroulis, S., Petrakis, S., Filis, C., and Lekkas, E.: A Time-Series Analysis of Rockfall Evolution in a Coastal Region Using Remote Sensing Data, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-10361, 2024.

EGU24-10373 | Orals | GM3.1

A database for ancillary information of three-dimensional soil surface microtopography measurements. 

Kossi Nouwakpo, Anette Eltner, Bernardo Candido, Yingkui Li, Kenneth Wacha, Mary Nichols, and Robert Washington-Allen

Understanding the complex processes occurring at the soil surface is challenging due to the intricate spatial variability and dynamic nature of these processes. An effective tool for elucidating these phenomena is three-dimensional (3D) reconstruction, which employs advanced imaging technologies to create a comprehensive representation of the soil surface at high spatial resolution, often at the mm-scale. Three-dimensional reconstruction techniques are increasingly available to scientists in the fields of soil science, geomorphology, hydrology, and ecology, and many studies have employed these novel tools to advance understanding of surface processes. Much of the data collected in these studies is, however, not interoperable: 3D data from one study may not be directly combined with 3D data from other studies, limiting the ability of researchers to advance process understanding at a broader scope. The limited interoperability of existing data is due in part to the fact that 3D surface reconstruction data are influenced by many factors, including experimental conditions, intrinsic soil properties, and the accuracy and precision limits of the 3D reconstruction technique used. These ancillary data are crucial to any broad-scope efforts that leverage the increasing number of 3D datasets collected by scientists across disciplines, geographic regions, and experimental conditions. We have developed a relational database that archives and serves ancillary data associated with published high-resolution 3D data representing soil surface processes. This presentation introduces the structure of the database with its required and optional variables. We also provide analytics on the currently available records in the database and discuss potential applications of the database and future developments.

How to cite: Nouwakpo, K., Eltner, A., Candido, B., Li, Y., Wacha, K., Nichols, M., and Washington-Allen, R.: A database for ancillary information of three-dimensional soil surface microtopography measurements., EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-10373, 2024.

EGU24-11949 | ECS | Posters on site | GM3.1

Employing satellite imagery interpretation tools to detect land-use/land-cover change dynamics in Italian historical rural landscapes 

Virginia Chiara Cuccaro, Claudio Di Giovannantonio, Giovanni Pica, Luca Malatesta, and Fabio Attorre

Rural landscapes inherited from the past are marked by a strong interaction between man and nature, a relationship rooted in a long history that testifies to the importance of the landscape as one of the most historically representative expressions of a country's cultural identity.

In this broad context, olive groves markedly characterize the agricultural landscape of many European rural areas, particularly in the Mediterranean region. Along with other rural landscapes, they form a semi-natural environment that can contribute to biodiversity conservation, soil protection and ecosystem resilience.

In addition to the global increase in temperatures, the main threats affecting these agrarian landscapes include the abandonment of traditional practices and the intensification of cultivation through the installation of irregular, intensive and overly dense planting beds.

Land-cover classification and change detection can provide useful indications for the restoration, conservation, and enhancement of olive groves.

The objective of this work was to identify rural landscapes in the Lazio region with characteristics of historical interest and to determine their level of conservation. In particular, we investigated the olive landscape of Cures (historic province of Sabina) through a multi-temporal analysis of literature and cartographic information (e.g. orthophotos from the Italian Aeronautical Group flight of 1954).

The approach follows the VASA (Historical Environmental Assessment) methodology, which allows the temporal evaluation of a given landscape and can show how agricultural practices and land use have changed over time.

The software tools Collect Earth and Google Earth were employed to handle the historical series of high-resolution satellite images and to carry out the photointerpretation. The coverage of the identified land units was then estimated to characterize the configuration of the target landscape.

Landscape evolution over time was assessed by overlaying the 1954 and 2022 land-use polygons, resulting in a merged database in which an evolutionary dynamic was associated with each land-use change.
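
The overlay-and-merge step amounts to cross-tabulating land-use classes between the two epochs. A minimal sketch (hypothetical function and per-unit data structure, not the authors' GIS workflow; in practice the transitions come from a polygon overlay rather than a dictionary keyed by land unit):

```python
from collections import Counter

def change_matrix(landuse_1954, landuse_2022):
    """Cross-tabulate land-use transitions between two epochs.
    Each input maps a land-unit id to its land-use class in that epoch;
    returns a Counter of (class_1954, class_2022) -> number of units."""
    shared_units = landuse_1954.keys() & landuse_2022.keys()
    return Counter(
        (landuse_1954[uid], landuse_2022[uid]) for uid in shared_units
    )
```

Off-diagonal entries of this matrix are the evolutionary dynamics; classes whose area shrinks the most between epochs flag the "landscape emergencies" discussed below.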

The approach generated in-depth insights into the significant elements of the Cures olive landscape and informed on the dynamics of the area in relation to the risk of their disappearance, making it possible to identify the "landscape emergencies," i.e., the land uses that have undergone the greatest reduction in area.

The methodologies employed have proven reliable in improving knowledge of target landscapes. They might be useful to promote sustainable agricultural practices for better preservation and management of rural environments, so that cultural traditions can be preserved and the environmental balance of agrarian land maintained.

How to cite: Cuccaro, V. C., Di Giovannantonio, C., Pica, G., Malatesta, L., and Attorre, F.: Employing satellite imagery interpretation tools to detect land-use land-change dynamics in Italian historical rural landscapes, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-11949, 2024.

EGU24-12105 | ECS | Orals | GM3.1 | Highlight

Unleashing the archive of aerial photographs of Iceland, 1945-2000. Applications in geosciences  

Joaquín M. C. Belart, Sydney Gunnarson, Etienne Berthier, Amaury Dehecq, Tómas Jóhannesson, Hrafnhildur Hannesdóttir, and Kieran Baxter

The archive of historical aerial photographs of Iceland consists of ~140,000 vertical aerial photographs acquired between the years 1945 and 2000. It contains an invaluable amount of information about human and natural changes in the landscape of Iceland. We have developed a series of automated processing workflows for producing accurate orthomosaics and Digital Elevation Models (DEMs) from these aerial photographs, which we are making openly available in a data repository and a web map visualization service. The workflow requires two primary inputs: a modern orthomosaic to automatically extract Ground Control Points (GCPs) and an accurate DEM for a fine-scale (sub-meter) alignment of the historical datasets. We evaluated the accuracy of the DEMs by comparing them over unchanged terrain against accurate recent lidar and Pléiades-based DEMs, and we evaluated the accuracy of the orthomosaics by comparing them against Pléiades-based orthomosaics. To show the potential applications of this repository, we present the following showcases where these data reveal significant changes in the landscape of Iceland over the past 80 years: (1) volcanic eruptions (Askja 1961, Heimaey 1973 and the Krafla eruptions, 1975-1984), (2) decadal changes of Múlajökull glacier from 1960-2023, (3) landslides (Steinsholtsjökull 1967, Tungnakvíslarjökull 1945-present) and (4) coastal erosion (Surtsey island).
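Accuracy evaluation over unchanged terrain usually reduces to robust statistics of the elevation differences. The sketch below uses synthetic DEMs and the common median/NMAD convention, which is an assumption here, not necessarily the authors' exact metric:

```python
import numpy as np

# Synthetic stand-ins: a "historical" DEM and a reference (e.g. lidar) DEM
# on the same grid, plus a mask of stable (unchanged) terrain.
rng = np.random.default_rng(0)
dem_ref = 100.0 + rng.normal(0.0, 5.0, size=(200, 200))
dem_hist = dem_ref + rng.normal(0.3, 0.8, size=(200, 200))  # bias + noise
stable = np.ones_like(dem_ref, dtype=bool)  # here: all pixels unchanged

def dem_accuracy(dem_a, dem_b, mask):
    """Median offset and NMAD of elevation differences on stable terrain."""
    dh = (dem_a - dem_b)[mask]
    med = np.median(dh)
    nmad = 1.4826 * np.median(np.abs(dh - med))  # robust spread estimate
    return med, nmad

med, nmad = dem_accuracy(dem_hist, dem_ref, stable)
print(f"median dh = {med:.2f} m, NMAD = {nmad:.2f} m")
```

The median captures any systematic vertical bias of the historical DEM, while the NMAD approximates a standard deviation that is insensitive to outliers such as residual clouds or matching blunders.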

How to cite: Belart, J. M. C., Gunnarson, S., Berthier, E., Dehecq, A., Jóhannesson, T., Hannesdóttir, H., and Baxter, K.: Unleashing the archive of aerial photographs of Iceland, 1945-2000. Applications in geosciences, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12105, 2024.

EGU24-14087 | ECS | Posters on site | GM3.1

A point-cloud deep learning model based on RGB-D images: Application of riverbed grain size survey 

Bo Rui Chen and Wei An Chao

The water level and discharge of a river are crucial parameters for understanding variance in riverbed scour. The detailed behaviour of scouring can be studied by hydraulic simulation, and the grain-size distribution of the riverbed is another crucial parameter for modelling. Efficient and rapid investigation of riverbed grain size is therefore an urgent issue. However, conventional measurement methods such as Wolman counts (particles sampled at a fixed interval) are long and laborious and cannot survey grain size efficiently over large areas. In recent years, image segmentation and recognition have been applied to grain-size investigation; for example, capturing images by UAV and generating orthoimages is a commonly used technique. Although this method can cover large areas, it does not provide information in the field immediately. Hence, a recent study developed a low-cost portable scanner to obtain grain-size distributions in the field. However, camera calibration parameters (e.g., capture height) are required before the survey, and uncertainties in the calculation of image resolution significantly affect the accuracy of the grain-size analysis. Therefore, this study provides an additional algorithm that analyses grain size using RGB-D images as inputs. The application of RGB-D can be categorised into two-dimensional (2D) and three-dimensional (3D) spaces. In the 2D case, depth information is integrated with traditional RGB image processing to separate riverbed grains from the background (e.g., bottomland); depth information is also applied to grain-edge detection. In the 3D case, the collected RGB-D images are transformed into point cloud data, from which 3D features of grain particles are extracted by deep learning, specifically PointNet.
Our study demonstrates that clustering of 3D features can achieve automatic identification of particles. The grain size of each particle can also be estimated by fitting a 3D ellipsoid geometry. Finally, we show grain-size distribution curves obtained with RGB, RGB-D and PointNet recognition and compare them with ground-truth observations. 3D image information provides the point cloud of each grain object, making it possible to estimate the 3D geometric morphology of the object. Our study successfully overcomes the limitation of conventional RGB-based processing, which could only capture size and shape information in the 2D plane. RGB-D-based image recognition is an innovative technique for this hydraulic problem that not only advances survey efficiency but also simplifies the intricate steps required for field investigations.
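A common way to approximate ellipsoid-based grain axes from a segmented particle's point cloud is a principal component analysis of the points. The synthetic grain and the extent-based axis estimate below are illustrative assumptions, not the study's actual fitting procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "grain": points on an ellipsoid with semi-axes a > b > c,
# arbitrarily rotated — a stand-in for one segmented particle's point cloud.
a, b, c = 4.0, 2.0, 1.0
u = rng.normal(size=(5000, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)   # points on the unit sphere
pts = u * np.array([a, b, c])                   # scale to the ellipsoid
theta = 0.7
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0, 0.0, 1.0]])
pts = pts @ rot.T

def grain_axes(points):
    """Estimate the three semi-axes of a particle from PCA extents."""
    centred = points - points.mean(axis=0)
    # Principal directions from the covariance eigenvectors
    _, vecs = np.linalg.eigh(np.cov(centred.T))
    aligned = centred @ vecs
    # Half the extent along each principal direction approximates a semi-axis
    return np.sort(np.ptp(aligned, axis=0) / 2.0)[::-1]

est = grain_axes(pts)
print(est)
```

The sorted semi-axes then feed directly into a grain-size distribution, with the intermediate (b) axis being the conventional sieve-equivalent size.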


Key words: Riverbed grain size, RGB-D image, Point cloud, Deep Learning

How to cite: Chen, B. R. and Chao, W. A.: A point-cloud deep learning model based on RGB-D images: Application of riverbed grain size survey, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-14087, 2024.

EGU24-14680 | Orals | GM3.1

Using current 3D point clouds as a tool to infer on past geomorphological processes 

Reuma Arav, Sagi Filin, and Yoav Avni

Examining deposition and erosion dynamics during the late Pleistocene and Holocene is crucial for gaining insights into soil development, erosion, and climate fluctuations. This urgency intensifies as arable lands face escalating degradation rates, particularly in arid and semi-arid environments. Nevertheless, as the destructive nature of erosional processes allows only for short-term studies, long-term processes in these regions are insufficiently investigated. In that respect, the ancient agricultural installations in the arid Southern Levant offer distinctive and undisturbed evidence of long-term land dynamics. Constructed on a late Pleistocene fluvial-loess section during the 3rd-4th centuries CE and abandoned after 600-700 years, these installations record sediment deposition, soil formation, and erosion processes. The challenge is to trace and quantify these processes based on their current state. In this presentation, we demonstrate how the use of 3D point cloud data enables us to follow past geomorphological processes and reconstruct trends and rates. Utilizing data gathered in the immediate vicinity of the UNESCO World Heritage Site of Avdat (Israel), we illustrate how these point clouds comprehensively document the history of soil dynamics in the region. This encompasses the initial erosion phase, subsequent soil aggradation processes resulting from anthropogenic interruption, and the ongoing reinstated erosion. The unique setting, which uncovers the different fluvial sections, together with the detailed 3D documentation of the site, allows us to develop means for the reconstruction of the natural environment in each of the erosion/siltation stages. Therefore, by utilizing the obtained data, we can recreate the site during its developmental stages up to the present day. Furthermore, we utilize a sequence of terrestrial laser scans acquired over the past decade (2012-2022) to compute current erosion rates.
These are then used to determine past rates, enabling inferences about the climatic conditions prevalent in the region over the last millennium. The in-depth examination of these installations provides valuable insights into approaches for soil conservation, sustainable desert living, and strategies to safeguard world-heritage sites subjected to soil erosion. As the global imperative to address soil erosion intensifies, this case study gains heightened relevance.

How to cite: Arav, R., Filin, S., and Avni, Y.: Using current 3D point clouds as a tool to infer on past geomorphological processes, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-14680, 2024.

EGU24-15439 | ECS | Orals | GM3.1 | Highlight

Utilizing historical aerial imagery for change detection in Antarctica 

Felix Dahle, Roderik Lindenbergh, and Bert Wouters

Our research explores the potential of historical images of Antarctica for change detection in 2D and 3D. We
make use of the TMA Archive, a vast collection of over 330,000 black and white photographs of Antarctica taken
between 1940 and 1990. These photographs, available in both nadir and oblique views, are systematically captured
from airplanes along flight paths and offer an unprecedented historical snapshot of the Antarctic landscape.
Detecting changes between past and present observations provides a unique insight into the long-term impact
of changing climate conditions on Antarctica’s glaciers, and their dynamical response to ice shelf weakening and
disintegration. Furthermore, it provides essential validation data for ice modelling efforts, thereby contributing
to reducing the uncertainties in future sea level rise scenarios.

In previous work, we applied semantic segmentation to these images [1]. By employing classes derived from this
segmentation, we can focus on features of interest and exclude images with extensive cloud coverage, enhancing
the accuracy of change analyses. In the next step, we geo-referenced the images: we assigned the images to
their actual position, scaled them to their true size, and aligned them with their genuine orientation. This
presents novel opportunities for detecting environmental changes in Antarctica, particularly in the retreat of
glaciers and sea ice.

Furthermore, the combination of these two steps allows for the first time a large scale reconstruction of these
images in 3D through Structure from Motion (SfM) techniques, which enables further multidimensional change
detection by comparing historical 3D models with contemporary ones. Due to the high number of images,
manual processing is impractical. Therefore, we are investigating the possibility of automating this process.
We utilize MicMac, an open-source software developed by the French National Geographic Institute for the
creation of the 3D models. Its high modularity allows for necessary customizations to automate the SfM
process effectively. Further adaptations are required due to the poor image quality and monotonous scenery. By
comparing historical 3D models with contemporary ones, we can assess alterations in elevation due to factors
such as glacial isostatic adjustments and glacier retreat.

We have already employed geo-referenced images for detecting changes on the Antarctic Peninsula and are in the
process of creating initial 3D models. Our presentation will outline the workflow we developed for this process
and showcase the initial results of the change detection, both in 2D and 3D formats. This approach marks a
significant step in understanding and visualizing the impacts of climate change on the Antarctic landscape.

This work was funded by NWO-grant ALWGO.2019.044.

[1] F. Dahle, R. Lindenbergh, and B. Wouters. Revisiting the past: A comparative study for semantic segmentation of historical images of Adelaide Island using U-nets. ISPRS Open Journal of Photogrammetry and Remote Sensing, 11:100056, 2024.

How to cite: Dahle, F., Lindenbergh, R., and Wouters, B.: Utilizing historical aerial imagery for change detection in Antarctica, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-15439, 2024.

EGU24-15896 | Orals | GM3.1

Classification and segmentation of 3D point clouds to survey river dynamics and evolution  

Laure Guerit, Philippe Steer, Paul Leroy, Dimitri Lague, Dobromir Filipov, Jiri Jakubinsky, Ana Petrovic, and Valentina Nikolova

3D data for natural environments are now widely available via open data at large scales (e.g., OpenTopography) and can easily be acquired in the field by terrestrial laser scanning (TLS) or by structure-from-motion (SFM) from camera or drone imagery. The 3D description of landscapes gives access to an unprecedented level of detail that can significantly change the way we look at, understand, and study natural systems. Point clouds with millimetric resolution even allow us to go further and investigate the properties of riverbed sediments: dedicated algorithms are now able to extract the sediment size distribution or spatial orientation directly from the point cloud.

Such data can be a real game changer for studying, for example, torrential streams prone to flash floods or debris flows. Such events are usually associated with heavy rainfall, while being conditioned by the geomorphological state of a stream (e.g., channel geometry, vegetation cover). The size and shape of the grains available in the river also strongly influence river erosion and sediment transport during a flood. 3D data can thus help to design prevention and mitigation measures in streams prone to torrential events.

However, it is not straightforward to go from data acquisition to river erosion or to grain-size distributions. Indeed, isolating and classifying the areas of interest can be complex and time-consuming. This can be done manually, at the cost of time and reproducibility. We instead take advantage of a state-of-the-art classification method (3DMASC) to develop a general classifier for point clouds in fluvial environments, designed to identify five classes usually found in such settings: coarse sediments, sand, bedrock, vegetation and human-made structures. We also improved the G3Point sediment segmentation algorithm, developed by our team, to make it more efficient and straightforward to use in the CloudCompare software, which is dedicated to point cloud visualization and analysis. We apply it to the coarse sediments class identified by 3DMASC to provide a more accurate description of grain size and orientation. We also take advantage of the sand class to estimate its relative areal distribution, which can then be compared to the coarse sediment class. This provides valuable information about the type of flow, which is also important for planning mitigation measures for torrential events.

We illustrate this combined approach with two field examples. The first one is based on SFM data acquired along streams prone to torrential events in Bulgaria and in Serbia where we documented sediment size and orientation. The second one is based on TLS data acquired along a bedrock river in France that experienced a major flood which induced dramatic changes in the river morphology. 

This work has been partially funded by PHC Danube n° 49921ZG/ n° KP-06-Danube/5, 14.08.2023 (National Science Fund, Bulgaria) and the H2020 European Research Council (grant no. 803721). 

How to cite: Guerit, L., Steer, P., Leroy, P., Lague, D., Filipov, D., Jakubinsky, J., Petrovic, A., and Nikolova, V.: Classification and segmentation of 3D point clouds to survey river dynamics and evolution, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-15896, 2024.

EGU24-16939 | ECS | Posters on site | GM3.1 | Highlight

Integrating structure-from-motion photogrammetry with 3D webGIS for risk assessment, mapping and monitoring of coastal area changes in the Maltese archipelago 

Emanuele Colica, Daniel Fenech, Christopher Gauci, and George Buhagiar

The Maltese coasts extend for approximately 273 km, representing a notable resource for the country and for one of its pillar economies, the tourism sector. Natural processes and anthropic interventions continue to threaten Malta's coastal morphology, shaping its landscape and triggering soil erosion phenomena. Therefore, many research projects (Colica et al., 2021, 2022 and 2023) have concentrated on the investigation and monitoring of cliff instability and the erosion of pocket beaches. The results of such activities can be widely disseminated and shared with expert and non-expert users through web mapping, which has so far been used only in a very limited way in collaborative coastal management and monitoring by different entities in Malta. This study describes the performance of a WebGIS designed to disseminate the results of innovative geomatic investigations for monitoring and analyzing erosion risk, performed by the Research and Planning Unit within the Public Works Department of Malta. While the aim is to include the entire national coastline, three study areas along the NE and NW coasts of the island of Malta have already been implemented as pilot cases. This WebGIS was generated using the ArcGIS Pro software by ESRI, and a user-friendly interactive interface has been programmed to help users view data in 2D and 3D, satisfying both multi-temporal and multi-scale perspectives. It is envisaged that through further development and wider dissemination there will be a stronger uptake across the different agencies involved in coastal risk assessment, monitoring and management.


Colica, E., D’Amico, S., Iannucci, R., Martino, S., Gauci, A., Galone, L., ... & Paciello, A. (2021). Using unmanned aerial vehicle photogrammetry for digital geological surveys: Case study of Selmun promontory, northern of Malta. Environmental Earth Sciences, 80, 1-14.

Colica, E. (2022). Geophysics and geomatics methods for coastal monitoring and hazard evaluation.

Colica, E., Galone, L., D’Amico, S., Gauci, A., Iannucci, R., Martino, S., ... & Valentino, G. (2023). Evaluating Characteristics of an Active Coastal Spreading Area Combining Geophysical Data with Satellite, Aerial, and Unmanned Aerial Vehicles Images. Remote Sensing, 15(5), 1465.

How to cite: Colica, E., Fenech, D., Gauci, C., and Buhagiar, G.: Integrating structure-from-motion photogrammetry with 3D webGIS for risk assessment, mapping and monitoring of coastal area changes in the Maltese archipelago, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-16939, 2024.

EGU24-17822 | ECS | Posters on site | GM3.1

Evaluating Ordnance Survey sheets (1890s – 1957) for shoreline change analysis in the Maltese Islands  

Daniel Fenech, Jeremie Tranchant, Christopher Gauci, Daniela Ghirxi, Ines Felix-Martins, Emanuele Colica, and George Buhagiar


Research and Planning Unit, Ministry for Transport, Infrastructure and Public Works, Project House, Triq Francesco Buonamici, Floriana, FRN1700, Malta

The assessment of coastal erosion through shoreline change analysis is an exercise of national utility undertaken in many countries. The Maltese Islands are particularly vulnerable to coastal erosion given the economic value of coastal activities and their high ratio of coast to land surface. The integration of historical cartographic material is often used to hindcast shoreline change across long periods of time, as well as to model future erosion rates. The Public Works Department produced detailed 1:2500 maps of Malta in collaboration with the British Ordnance Survey from the end of the 19th century to 1957; however, these maps have never been scientifically assessed. The initial research evaluated the usefulness of the two oldest 25-inch Maltese map series (early 20th century and 1957) for shoreline change analysis. The two series were digitised, georeferenced, and compared in a GIS environment to assess their differences. The inaccuracies of the original drawings, absent shoreline indicators, and the absence of a geographic coordinate system (datum and projection) were identified as limitations for their use in evaluating small gradual changes, but the maps proved ideal for the identification of stochastic, large-scale historic erosion events using difference maps. This assessment showed that the two series are highly congruous and that any changes between them are largely attributable to changes in infrastructure. There were, however, minor exceptions, and these need to be explored on a case-by-case basis. These methods, and the insights garnered from their production, will function as scientific stepping stones towards a holistic national coastal erosion monitoring programme.

How to cite: Fenech, D., Tranchant, J., Gauci, C., Ghirxi, D., Felix-Martins, I., Colica, E., and Buhagiar, G.: Evaluating Ordnance Survey sheets (1890s – 1957) for shoreline change analysis in the Maltese Islands, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-17822, 2024.

EGU24-21396 | ECS | Orals | GM3.1

Automatic detection of river bankfull parameters from high density lidar data 

Alexandre Rétat, Nathalie Thommeret, Frédéric Gob, Thomas Depret, Jean-Stéphane Bailly, Laurent Lespez, and Karl Kreutzenberger

The European Water Framework Directive (WFD), adopted in 2000, set out requirements for a
better understanding of aquatic environments and ecosystems. In 2006, following the transposition of
the WFD into French law (LEMA), France began work on a field protocol for the geomorphological
characterization of watercourses, as part of a partnership between the Centre National de la Recherche
Scientifique (CNRS) and the Office Français de la Biodiversité (OFB). This protocol, known as "Carhyce"
(for "River Hydromorphological Characterisation"), has been tested, strengthened and approved over
the last 15 years at more than 2500 reaches. It consists of collecting standardised qualitative and
quantitative data in the field, essential for the characterisation of a watercourse: channel geometry,
substrate, riparian vegetation... However, certain rivers that are difficult to survey (too deep or too
wide) pose problems for data collection.
To address these issues, and to extend the analysis to a wider scale (full river section), the use of
remote sensing, and in particular LiDAR data, was considered. The major advantage of LiDAR over
passive optical sensors is its better geometric accuracy, especially under vegetation. For a long time,
LiDAR data rarely existed at national scale with a data density similar to passive imagery. Today, the French
LiDAR HD programme (10 pulses per square meter), run by the French mapping agency, offers an
unprecedented amount of data at this scale. Thanks to it, a national 3D coverage of the ground can
be used, and numerous geomorphological measurements can be carried out at a more or less large
scale. This is the case for hydromorphological parameters such as water level and width.
The aim of this study is therefore to use this high-density lidar to automatically determine the
hydromorphological parameters sought in the Carhyce protocol. In particular, we have developed a
lidar-based algorithm to reconstruct the topography from point cloud and automatically identify the
bankfull level at reach scale. Designed to be applicable to every French river, the method must be
robust to all river features such as longitudinal slope, width, sinuosity and multi-channel patterns. For
validation purposes, the bankfull geometry calculated by the algorithm has been compared with field
measurements at some twenty Carhyce stations across France. To select the test stations, we
looked for a diversity of situations in terms of the river characteristics described above, in order to
observe the influence of these features on the results.

How to cite: Rétat, A., Thommeret, N., Gob, F., Depret, T., Bailly, J.-S., Lespez, L., and Kreutzenberger, K.: Automatic detection of river bankfull parameters from high density lidar data, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-21396, 2024.

EGU24-22358 | ECS | Orals | GM3.1 | Highlight

UAV’s to monitor the mass balance of glaciers 

Lander Van Tricht, Harry Zekollari, Matthias Huss, Philippe Huybrechts, and Daniel Farinotti

Uncrewed Aerial Vehicles (UAVs) are increasingly employed for glacier monitoring, particularly for small to medium-sized glaciers. The UAVs are mainly used to generate high-resolution Digital Elevation Models (DEMs), delineate glacier areas, determine surface velocities, and map supraglacial features. In this study, we utilise UAVs across various sites in the Alps and the Tien Shan (Central Asia) to monitor the mass balance of glaciers. We present a workflow for calculating the annual geodetic mass balance and obtaining the surface mass balance using the continuity-equation method. Our results generally demonstrate close alignment between the determined mass balances and those obtained through traditional glaciological methods involving intensive fieldwork. We show that utilising UAV data reveals significantly more spatial detail, such as the influence of debris and collapsing ice caves, which are challenging to capture using conventional methods that rely strongly on interpolation and extrapolation. This underscores the UAV's significance as a valuable add-on tool for quantifying annual glacier mass balance and validating glaciological assessments. Drawing on our experience in on-site UAV glacier surveys, we discuss the methodology's advantages, disadvantages, and potential pitfalls.
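The geodetic mass balance step essentially averages the DEM difference over the glacier and converts the volume change to water equivalent. A minimal sketch with synthetic data follows; the 850 kg m^-3 volume-to-mass conversion density is a widely used convention assumed here, not a value stated in the abstract:

```python
import numpy as np

# Synthetic elevation change (m) between two UAV DEMs one year apart,
# on a 1 m grid; the mask marks glacier pixels.
rng = np.random.default_rng(2)
dh = rng.normal(-1.5, 0.5, size=(100, 100))      # mean thinning of 1.5 m
glacier_mask = np.ones(dh.shape, dtype=bool)
pixel_area = 1.0                                  # m^2
rho_conv, rho_water = 850.0, 999.7                # kg m^-3

def geodetic_mass_balance(dh, mask, px_area, rho, rho_w):
    """Glacier-wide mass balance in m w.e. a^-1 from DEM differencing."""
    volume_change = dh[mask].sum() * px_area      # m^3 of ice/firn
    area = mask.sum() * px_area                   # glacier area, m^2
    return (volume_change / area) * (rho / rho_w)

mb = geodetic_mass_balance(dh, glacier_mask, pixel_area, rho_conv, rho_water)
print(f"annual mass balance ~ {mb:.2f} m w.e.")
```

The continuity-equation method mentioned in the abstract then goes one step further, combining such elevation changes with ice-flux divergence to recover the surface mass balance distribution.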

How to cite: Van Tricht, L., Zekollari, H., Huss, M., Huybrechts, P., and Farinotti, D.: UAV’s to monitor the mass balance of glaciers, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-22358, 2024.

EGU24-1857 | Orals | ESSI2.9

A Replicable Multi-Cloud Automation Architecture for Earth Observation 

Armagan Karatosun, Claudio Pisa, Tolga Kaprol, Vasileios Baousis, and Mohanad Albughdadi

The EO4EU project aims at making the access and use of Earth Observation (EO) data easier for environmental, government and business forecasts and operations.

To reach this goal, the EO4EU Platform will soon be made officially available, leveraging existing EO data sources such as DestinE, GEOSS, INSPIRE, Copernicus and Galileo, and offering advanced tools and services, based also on machine learning techniques, to help users find, access and handle the data they are interested in. The EO4EU Platform relies on a combination of a multi-cloud computing infrastructure coupled with pre-exascale high-performance computing facilities to manage demanding processing workloads.

The EO4EU multi-cloud infrastructure is composed of IaaS resources hosted on the WEkEO and CINECA Ada clouds, on top of which a set of Kubernetes clusters is dedicated to different workloads (e.g. cluster management tools, observability, or specific applications such as an inference server). To automate the deployment and management of these clusters, with advantages in terms of minimising dedicated effort and human errors, we have devised an Infrastructure-as-Code (IaC) architecture based on the Terraform, Rancher and Ansible technologies.

We believe that the proposed IaC architecture, based on open-source components and extensively documented and tested in the field, can be successfully replicated by other EO initiatives leveraging cloud infrastructures.

How to cite: Karatosun, A., Pisa, C., Kaprol, T., Baousis, V., and Albughdadi, M.: A Replicable Multi-Cloud Automation Architecture for Earth Observation, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-1857, 2024.

EGU24-6216 | Posters on site | ESSI2.9

Pangeo environment in Galaxy Earth System supported by Fair-Ease 

Thierry Carval, Marie Jossé, and Jérôme Detoc

The Earth System is a complex and dynamic system that encompasses the interactions between the atmosphere, oceans, land, and biosphere. Understanding and analyzing data from the Earth System Model (ESM) is essential, for example to predict and mitigate the impacts of climate change.

Today, collaborative efforts among scientists across diverse fields are increasingly urgent. The FAIR-EASE project aims to build an interdomain digital architecture for integrated and collaborative use of environmental data. Galaxy is a main component of this architecture and will be used by several domains of study chosen by FAIR-EASE.

Galaxy, an open-source web platform, provides users with an easy and FAIR tool to access and handle multidisciplinary environmental data. By design, Galaxy manages data analyses by sharing and publishing all involved items like inputs, results, workflows, and visualisations, ensuring reproducibility by capturing the necessary information to repeat and understand data analyses.

In this context, a Pangeo environment is a highly relevant tool to be used alongside Earth-system data and processing tools in order to create cross-domain analyses. Conveniently, a Pangeo environment is accessible in Galaxy: it can be used as a JupyterLab instance and allows users to manage their NetCDF data in a Pangeo environment through notebooks. Multiple tutorials are available on the Galaxy Training Network to learn how to use Pangeo.

The Galaxy Training Network significantly contributes to enhancing the accessibility and reusability of tools and workflows. The Galaxy Training platform hosts an extensive collection of tutorials. These tutorials serve as valuable resources for individuals seeking to learn how to navigate Galaxy, employ specific functionalities like Interactive Tools or how to execute workflows for specific analyses.

In summary, Pangeo in Galaxy provides Pangeo users with an up-to-date data analysis platform that ensures reproducibility and combines trainings and tools.

On the Earth System side, a first step was the creation of a Galaxy subdomain for Earth System studies, with dedicated data, models, processing, visualisations and tutorials. It will make Earth System modelling more accessible to researchers in different fields.

In this Galaxy subdomain, we chose to include the Pangeo tools. Our hope is to be able to implement cross-domain workflows including climate and Earth system sciences.

During this session, our aim is to present how you can use the Pangeo environment from Galaxy Earth System.

How to cite: Carval, T., Jossé, M., and Detoc, J.: Pangeo environment in Galaxy Earth System supported by Fair-Ease, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-6216, 2024.

EGU24-7765 | Orals | ESSI2.9

Unleashing the power of Dask with a high-throughput Trust Region Reflectance solver for raster datacubes 

Bernhard Raml, Raphael Quast, Martin Schobben, Christoph Reimer, and Wolfgang Wagner

In remote sensing applications, the ability to efficiently fit models to vast amounts of observational data is vital for deriving high-quality data products, as well as accelerating research and development. Addressing this challenge, we developed a high-performance non-linear Trust Region Reflectance solver specialised for datacubes, by integrating Python's interoperability with C++ and Dask's distributed computing capabilities. Our solution achieves high throughput both locally and potentially on any Dask-compatible backend, such as EODC's Dask Gateway. The Dask framework takes care of chunking the datacube, and streaming each chunk efficiently to available workers where our specialised solver is applied. Introducing Dask for distributed computing enables our algorithm to run on different compatible backends. This approach not only broadens operational flexibility, but also allows us to focus on enhancing the algorithm's efficiency, free from concerns about concurrency. This enabled us to implement a highly efficient solver in C++, which is optimised to run on a single core while still utilising all available resources effectively. For the heavy lifting, such as performing singular value decompositions and matrix operations, we rely on Eigen, a powerful open-source C++ library specialized in linear algebra. To describe the spatial reference and other auxiliary data associated with our datacube, we employ the Xarray framework. Importantly, Xarray integrates seamlessly with Dask. Finally, to ensure robustness and extensibility of our framework, we applied state-of-the-art software engineering practices, including Continuous Integration and Test-Driven Development. In our work, we demonstrate the significant performance gains achievable by effectively utilising available open-source frameworks and adhering to best engineering practices. This is exemplified by our practical workflow demonstration to fit a soil moisture estimation model.
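The per-pixel fit that such a solver performs can be sketched with SciPy's trust-region reflective method (`method="trf"` in `scipy.optimize.least_squares`). The exponential model, bounds and tiny chunk below are invented stand-ins for the actual soil-moisture model; in the distributed setting, this kernel is what Dask would map over datacube chunks:

```python
import numpy as np
from scipy.optimize import least_squares

# Assumed illustrative model: an exponential decay per pixel,
# y(t) = A * exp(-k t). A real retrieval model would replace this.
def model(params, t):
    A, k = params
    return A * np.exp(-k * t)

def fit_pixel(y, t):
    """Bounded trust-region reflective fit for one pixel's time series —
    the kernel a chunked (e.g. dask.array.map_blocks) run would apply."""
    res = least_squares(lambda p: model(p, t) - y, x0=[1.0, 0.1],
                        bounds=([0.0, 0.0], [10.0, 5.0]), method="trf")
    return res.x

# A tiny 2x2 "chunk" of a datacube, with time along the last axis.
t = np.linspace(0.0, 5.0, 20)
true_A, true_k = 2.0, 0.7
cube = true_A * np.exp(-true_k * t)[None, None, :] * np.ones((2, 2, 1))

fitted = np.array([[fit_pixel(cube[i, j], t) for j in range(2)]
                   for i in range(2)])
print(fitted[0, 0])
```

Moving this inner loop into a compiled C++ kernel, as the authors describe, removes the per-pixel Python overhead while keeping the same chunk-in, parameters-out contract that Dask schedules across workers.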

How to cite: Raml, B., Quast, R., Schobben, M., Reimer, C., and Wagner, W.: Unleashing the power of Dask with a high-throughput Trust Region Reflectance solver for raster datacubes, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-7765, 2024.

The Earth System Grid Federation (ESGF) data nodes are usually the first address for accessing climate model datasets from WCRP-CMIP activities. The federation currently hosts datasets from several projects, e.g., CMIP6, CORDEX, Input4MIPs or Obs4MIPs. Datasets are usually hosted on data nodes all over the world, while data access is managed by any of the ESGF web portals through a web-based GUI or the ESGF Search RESTful API. The ESGF data nodes provide different access methods, e.g., HTTPS, OPeNDAP or Globus. 

Beyond ESGF, the Pangeo/ESGF Cloud Data Working Group coordinates efforts related to storing and cataloguing CMIP data in the cloud, e.g., in the Google cloud and in the Amazon Web Services Simple Storage Service (S3), where a large part of the WCRP-CMIP6 ensemble of global climate simulations is now available in analysis-ready, cloud-optimized (ARCO) zarr format. The availability in the cloud has significantly lowered the barrier for users with limited resources and no access to an HPC environment to work with CMIP6 datasets, and at the same time increases the chances of reproducibility and reusability of scientific results. 

Following the Pangeo strategy, we have adapted parts of the Pangeo Forge software stack for publishing our regional climate model datasets from the EURO-CORDEX initiative on AWS S3 cloud storage. The main tools involved are Xarray, Dask, Zarr, Intake and the ETL tools of pangeo-forge-recipes. Thanks to metadata conventions similar to those of the global CMIP6 datasets, the workflows require only minor adaptations. In this talk, we will show the strategy and workflow, implemented and orchestrated in GitHub Actions, as well as a demonstration of how to access EURO-CORDEX datasets in the cloud.
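
Once published, such an ARCO zarr store reduces data access to a single lazy open call. The helper below shows the general Xarray pattern; the store URL is hypothetical (the concrete EURO-CORDEX bucket layout is not given here), and the import is deferred so the sketch stays dependency-light.

```python
def open_cordex_zarr(store_url, anon=True):
    """Lazily open an analysis-ready zarr store on S3 with Xarray.

    `store_url` is a hypothetical s3:// path; the real EURO-CORDEX
    bucket layout may differ.
    """
    import xarray as xr  # Pangeo core package (plus s3fs for s3:// URLs)

    return xr.open_zarr(
        store_url,
        storage_options={"anon": anon},  # anonymous access to public data
        consolidated=True,               # read consolidated metadata in one go
    )

# example (not executed here):
# ds = open_cordex_zarr("s3://example-bucket/EUR-11/tas/")
```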

How to cite: Buntemeyer, L.: Beyond ESGF – Bringing regional climate model datasets to the cloud on AWS S3 using the Pangeo Forge ETL framework, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-8058, 2024.

EGU24-8343 | ECS | Posters on site | ESSI2.9 | Highlight

Implementation of a reproducible pipeline for producing seasonal Arctic sea ice forecasts 

Vanessa Stöckl, Björn Grüning, Anne Fouilloux, Jean Iaquinta, and Alejandro Coca-Castro

This work highlights the integration of IceNet, a cutting-edge sea ice forecasting system leveraging numerous Python packages from the Pangeo ecosystem, into the Galaxy platform, an open-source tool designed for FAIR (Findable, Accessible, Interoperable, and Reusable) data analysis. Aligned with the Pangeo ecosystem's broader objectives, and carried out in the frame of the EuroScienceGateway project, this initiative embraces a collaborative approach to tackle significant geoscience data challenges. The primary aim is to democratise access to IceNet's capabilities by converting a Jupyter Notebook, published in the Environmental Data Science book, into Galaxy Tools and crafting a reusable workflow executable through a Graphical User Interface or standardised APIs. IceNet predicts Arctic sea ice concentration up to six months in advance and outperforms previous systems. This integration establishes a fully reproducible workflow, enabling scientists with diverse computational expertise to automate sea ice predictions. The IceNet workflow is hosted on the European Galaxy Server, along with the related tools, ensuring accessibility for a wide community of researchers. With the urgency of accurate predictions amid global warming's impact on Arctic sea ice, this work addresses challenges faced by scientists, particularly those with limited programming experience. The transparent, accessible, and reproducible pipeline for Arctic sea ice forecasting aligns with Open Science principles. The integration of IceNet into Galaxy enhances accessibility to advanced climate science tools, allowing for automated predictions that contribute to early and precise identification of potential damages from sea ice loss. This initiative mirrors the overarching goals of the Pangeo community, advancing transparent, accessible, and reproducible research. 
The Galaxy-based pipeline presented serves as a testament to collaborative efforts within the Pangeo community, breaking down barriers related to computational literacy and empowering a diverse range of scientists to contribute to climate science research. The integration of IceNet into Galaxy not only provides a valuable tool for seasonal sea ice predictions but also exemplifies the potential for broad interdisciplinary collaboration within the Pangeo ecosystem.

How to cite: Stöckl, V., Grüning, B., Fouilloux, A., Iaquinta, J., and Coca-Castro, A.: Implementation of a reproducible pipeline for producing seasonal Arctic sea ice forecasts, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-8343, 2024.

EGU24-9156 | ECS | Orals | ESSI2.9

DataLabs: development of a cloud collaborative platform for open interdisciplinary geo-environmental sciences  

Michael Tso, Michael Hollaway, Faiza Samreen, Iain Walmsley, Matthew Fry, John Watkins, and Gordon Blair

In environmental science, scientists and practitioners increasingly need to create data-driven solutions to grand environmental challenges, often drawing on data from disparate sources and advanced analytical methods, as well as on expertise from collaborative and cross-disciplinary teams [1]. Virtual labs allow scientists to collaboratively explore large or heterogeneous datasets, develop and share methods, and communicate their results to stakeholders and decision-makers. 

DataLabs [2] has been developed as a cloud-based collaborative platform to tackle these challenges and promote open, collaborative, interdisciplinary geo-environmental sciences. It allows users to share notebooks (e.g. JupyterLab, RStudio, and most recently VS Code), datasets and computational environments, promoting transparency and end-to-end reasoning about model uncertainty. It supports FAIR access to data and digital assets by providing shared data stores and discovery functionality for datasets and assets hosted in the platform’s asset catalogue. Its tailorable design allows it to be adapted to different challenges and applications. It is also an excellent platform for large collaborative teams to work on outputs together [3], as well as for communicating results to stakeholders, by allowing easy prototyping and publishing of web applications (e.g. Shiny, Panel, Voila). It is currently deployed on JASMIN [4] and is part of the UK NERC Environmental Data Service [5]. 

There is a growing number of use cases and requirements for DataLabs, and it will play a central part in several planned digital research infrastructure (DRI) initiatives. Future development needs of the platform to further its vision include a more intuitive onboarding experience, easier access to key datasets at source, better connectivity to other cloud platforms, and better use of workflow tools. DataLabs shares many of the features (e.g. heavy use of Pangeo core packages) and design principles of Pangeo. We would be interested in exploring commonalities and differences, sharing best practices, and growing the community of practice in this increasingly important area. 

[1]  Blair, G.S., Henrys, P., Leeson, A., Watkins, J., Eastoe, E., Jarvis, S., Young, P.J., 2019. Data Science of the Natural Environment: A Research Roadmap. Front. Environ. Sci. 7.  

[2] Hollaway, M.J., Dean, G., Blair, G.S., Brown, M., Henrys, P.A., Watkins, J., 2020. Tackling the Challenges of 21st-Century Open Science and Beyond: A Data Science Lab Approach. Patterns 1, 100103. 

How to cite: Tso, M., Hollaway, M., Samreen, F., Walmsley, I., Fry, M., Watkins, J., and Blair, G.: DataLabs: development of a cloud collaborative platform for open interdisciplinary geo-environmental sciences, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-9156, 2024.

EGU24-9781 | Posters on site | ESSI2.9

Optimizing NetCDF performance for cloud computing: exploring a new chunking strategy 

Flavien Gouillon, Cédric Pénard, Xavier Delaunay, and Florian Wery

Owing to the increasing number of satellites and advancements in sensor resolutions, the volume of scientific data is experiencing rapid growth. NetCDF (Network Common Data Form) stands as the community standard for storing such data, necessitating the development of efficient solutions for file storage and manipulation in this format.

Object storage, emerging with cloud infrastructures, offers potential solutions for data storage and parallel access challenges. However, NetCDF may not fully harness this technology without appropriate adjustments and fine-tuning. To optimize computing and storage resource utilization, evaluating NetCDF performance on cloud infrastructures is essential. Additionally, exploring how cloud-developed software solutions contribute to enhanced overall performance for scientific data is crucial.

Offering multiple file versions with data split into chunks tailored for each use case incurs significant storage costs. Thus, we investigate methods to read portions of compressed chunks, creating virtual sub-chunks that can be read independently. A novel approach involves indexing data within NetCDF chunks compressed with deflate, enabling extraction of smaller data portions without reading the entire chunk.
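
The sub-chunk indexing idea can be illustrated with the standard library alone. The sketch below inserts deflate sync points while compressing and records their offsets, so any block can later be decompressed without reading the rest of the stream. Note that this is a simplification: the authors' method indexes existing deflate chunks as written by NetCDF/HDF5, rather than inserting flush points at write time.

```python
import zlib

def compress_with_index(chunks):
    """Compress chunks into one raw-deflate stream, recording the
    offset at which each independently decodable block starts."""
    comp = zlib.compressobj(9, zlib.DEFLATED, -15)  # raw deflate, no header
    stream, offsets = b"", []
    for chunk in chunks:
        offsets.append(len(stream))
        stream += comp.compress(chunk)
        # Z_FULL_FLUSH byte-aligns the output and resets the dictionary,
        # so the next block can be decoded without the preceding bytes
        stream += comp.flush(zlib.Z_FULL_FLUSH)
    return stream, offsets

def read_block(stream, offsets, i):
    """Decompress only block i, skipping the rest of the stream."""
    end = offsets[i + 1] if i + 1 < len(offsets) else len(stream)
    return zlib.decompressobj(-15).decompress(stream[offsets[i]:end])

data = [b"a" * 100, b"b" * 100, b"c" * 100]
stream, offsets = compress_with_index(data)
middle = read_block(stream, offsets, 1)  # reads one sub-chunk only
```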

This feature is very valuable in use cases such as pixel drilling or extracting small amounts of data from large files with sizable chunks. It also saves reading time, particularly in scenarios of poor network connection, such as those encountered onboard research vessels.

We conduct performance assessments of several libraries across different use cases to recommend the most suitable and efficient library for reading NetCDF data in each situation.

Our tests involve accessing remote NetCDF datasets (two files from the SWOT mission) available on the network via a lighttpd server and an S3 server. Additionally, simulations of degraded Internet connections, featuring high latency, packet loss, and limited bandwidth, are also performed.

We evaluate the performance of four Python libraries (netCDF4, Xarray, h5py, and our chunk indexing library) for reading dataset portions through fsspec or s3fs. A comparison of reading performance using the netCDF, zarr, and nczarr data formats is also conducted on an S3 server.

Preliminary findings indicate that the h5py library is the most efficient, while Xarray exhibits poor performance in reading NetCDF files. Furthermore, the NetCDF format demonstrates reasonably good performance on an S3 server, albeit lower than the zarr or nczarr formats. However, the considerable effort required to convert petabytes of archived NetCDF files and to adapt numerous software libraries for a performance improvement within the same order of magnitude raises questions about the practicality of such an endeavour; the benefit is thus highly dependent on the use case.

How to cite: Gouillon, F., Pénard, C., Delaunay, X., and Wery, F.: Optimizing NetCDF performance for cloud computing: exploring a new chunking strategy, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-9781, 2024.

EGU24-9795 | ECS | Orals | ESSI2.9

Unifying HPC and Cloud Systems; A Containerized Approach for the Integrated Forecast System (IFS) 

Cathal O'Brien, Armagan Karatosun, Adrian Hill, Paul Cresswell, Michael Sleigh, and Ioan Hadade

The IFS (Integrated Forecast System) is a global numerical weather prediction system maintained by the European Centre for Medium-Range Weather Forecasts (ECMWF). Traditionally, ECMWF’s high-performance computing facility (HPCF) is responsible for operationally supporting the IFS cycles. However, with the emergence of new cloud technologies, initiatives such as Destination Earth (DestinE), and the growth of OpenIFS users within Europe and around the globe, the need to run the IFS outside of ECMWF's computing facilities becomes more evident. For such use cases, IFSTestsuite allows the complete IFS system and its dependencies (e.g. ecCodes) to be built and tested outside of ECMWF's HPCF, and is designed to be self-contained, eliminating the need for external tools like MARS or ecCodes. However, users still need to perform multiple steps, and software availability and versions depend on the host operating system, which indicates the potential for a more generic and broader approach. 

Containerization might provide the much-needed portability and disposable environments to trigger new cycles with the desired compiler versions, or even with different compilers. In addition, pre-built container images can be executed on any platform, provided there is a compatible container runtime installed on the target system that adheres to Open Container Initiative (OCI) standards, such as Singularity or Docker. Another benefit of container images is image layering, which can significantly reduce image build time. Lastly, despite their differences, both Singularity and Docker adhere to the OCI standards, and converting one container image to another is straightforward. However, despite the clear advantages, there are several crucial design choices to keep in mind. Notably, the available hardware and software stacks vary greatly across different HPC systems. When performance is important, this heterogeneous landscape limits the portability of containers. The libraries and drivers inside the container must be specially selected with regard to the hardware and software stack of a specific host system to maximize performance on that system. If this is done correctly, the performance of containerized HPC applications can match that of native applications. We demonstrate this process with a hybrid containerization strategy in which compatible MPI stacks and drivers are built inside the containers. The binding of host libraries into containers is also used on systems where proprietary software cannot be rebuilt inside the container.  

In this study we present a containerized solution which balances portability and efficient performance, with examples of containerizing the IFS on a variety of systems, including cloud systems with generic x86-64 architecture, such as the European Weather Cloud (EWC) and Microsoft Azure, and EuroHPC systems such as Leonardo and LUMI, and we provide container image recipes for OpenIFS. 

How to cite: O'Brien, C., Karatosun, A., Hill, A., Cresswell, P., Sleigh, M., and Hadade, I.: Unifying HPC and Cloud Systems; A Containerized Approach for the Integrated Forecast System (IFS), EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-9795, 2024.

EGU24-10741 | Posters on site | ESSI2.9

Harnessing the Pangeo ecosystem for delivering the cloud-based Global Fish Tracking System 

Daniel Wiesmann, Tina Odaka, Anne Fouilloux, Emmanuelle Autret, Mathieu Woillez, and Benjamin Ragan-Kelley

We present our approach of leveraging the Pangeo software stack for developing the Global Fish Tracking System (GFTS). The GFTS project tackles the challenge of accurately modelling fish movement in the ocean based on biologging data with a primary focus on Sea Bass. Modelling fish movements is essential to better understand migration strategies and site fidelity, which are critical aspects for fish stock management policy and marine life conservation efforts.

Estimating fish movements is a highly compute intensive process. It involves matching pressure and temperature data from in-situ biologging sensors with high resolution ocean temperature simulations over long time periods. The Pangeo software stack provides an ideal environment for this kind of modelling. While the primary target platform of the GFTS project is the new Destination Earth Service Platform (DESP), relying on the Pangeo ecosystem ensures that the GFTS project is a robust and portable solution that can be re-deployed on different infrastructure. 
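
The matching step at the heart of such geolocation can be illustrated with a toy example: compare one tag observation against a gridded temperature field and score each cell. The Gaussian error model, the sigma value, and the function name are illustrative assumptions, not pangeo-fish's actual API.

```python
import numpy as np

def position_likelihood(field, obs_temp, sigma=0.5):
    """Per-cell likelihood that the tagged fish was in each grid cell,
    given one in-situ temperature observation (Gaussian error model)."""
    return np.exp(-0.5 * ((field - obs_temp) / sigma) ** 2)

# toy 1D "grid" of simulated sea temperatures; the tag recorded 12.0 degC
field = np.array([8.0, 10.0, 12.0, 14.0])
lik = position_likelihood(field, 12.0)
best = int(np.argmax(lik))  # index of the most probable cell
```

At scale, the same comparison runs over a full model datacube across many time steps, which is where the chunked Dask computation pays off.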

One of the distinctive features of the GFTS project is its advanced data management approach, synergizing with the capabilities of Pangeo. Diverse datasets, including climate change adaptation digital twin data, sea temperature observations, bathymetry, and biologging in-situ data from tagged fish, are seamlessly integrated within the Pangeo environment. A dedicated software called pangeo-fish has been developed to streamline this complex modelling process. The technical framework of the GFTS project includes Pangeo core packages such as Xarray and Dask, which facilitate scalable computations.

Pangeo's added value in data management becomes apparent in its capability to optimise data access and enhance performance. The concept of "data visitation" is central to this approach. By strategically deploying Dask clusters close to the data sources, the GFTS project aims to significantly improve performance of fish track modelling when compared to traditional approaches. This optimised data access ensures that end-users can efficiently interact with large datasets, leading to more streamlined and efficient analyses.

The cloud-based delivery of the GFTS project aligns with the overarching goal of Pangeo. In addition, the GFTS includes the development of a custom interactive Decision Support Tool (DST). The DST empowers non-technical users with an intuitive interface for better understanding the results of the GFTS project, leading to more informed decision-making. The integration with Pangeo and providing intuitive access to the GFTS data is not merely a technicality; it is a commitment to FAIR (Findable, Accessible, Interoperable and Reusable), TRUST (Transparency, Responsibility, User focus, Sustainability and Technology) and open science principles. 

In short, the GFTS project, within the Pangeo ecosystem, exemplifies how advanced data management, coupled with the optimization of data access through "data visitation," can significantly enhance the performance and usability of geoscience tools. This collaborative and innovative approach not only benefits the immediate goals of the GFTS project but contributes to the evolving landscape of community-driven geoscience initiatives.

How to cite: Wiesmann, D., Odaka, T., Fouilloux, A., Autret, E., Woillez, M., and Ragan-Kelley, B.: Harnessing the Pangeo ecosystem for delivering the cloud-based Global Fish Tracking System, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-10741, 2024.

EGU24-12410 | Orals | ESSI2.9

Towards Enhancing WaaS and Data Provenance over Reana 

Iraklis Klampanos, Antonis Ganios, and Antonis Troumpoukis

Interoperability and reproducibility are critical aspects of scientific computation. The data analysis platform Reana [1], developed by CERN, enhances the interoperability and reproducibility of scientific analyses by allowing researchers to describe, execute, and share their analyses. This is achieved via the execution of workflows described in standardised languages, such as CWL, within reusable containers. Moreover, it allows execution to span different types of resources, such as Cloud and HPC. 

In this session we will present ongoing work to enhance Reana’s Workflows-as-a-Service (WaaS) functionality and also support Workflow registration and discoverability. Building upon the design goals and principles of the DARE platform [2], this work aims to enhance Reana by enabling users to register and discover available workflows within the system. In addition, we will present the integration of Data Provenance based on the W3C PROV-O standard [3] allowing the tracking and recording of data lineage in a systematic and dependable way across resource types. 

In summary, key aspects of this ongoing work include:

  • Workflows-as-a-Service (WaaS): Extending Reana's service-oriented mode of operation, allowing users to register, discover, access, execute, and manage workflows by name or ID, via APIs, therefore enhancing the platform's accessibility and usability.
  • Data Provenance based on W3C PROV-O: Implementing support for recording and visualising data lineage information in compliance with the W3C PROV-O standard. This ensures transparency and traceability of data processing steps, aiding in reproducibility and understanding of scientific analyses.
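
A minimal PROV-O-flavoured record for one workflow run might look as follows. The property names come from the PROV-O vocabulary, but the overall JSON-LD shape and the IDs are illustrative, not Reana's actual provenance schema.

```python
import json

# A single workflow run ("ex:run-42") that used one input entity and
# generated one output entity; IDs and file names are illustrative.
record = {
    "@context": {
        "prov": "http://www.w3.org/ns/prov#",
        "ex": "http://example.org/",
    },
    "@graph": [
        {"@id": "ex:input.nc", "@type": "prov:Entity"},
        {"@id": "ex:run-42", "@type": "prov:Activity",
         "prov:used": {"@id": "ex:input.nc"}},
        {"@id": "ex:result.nc", "@type": "prov:Entity",
         "prov:wasGeneratedBy": {"@id": "ex:run-42"}},
    ],
}
serialized = json.dumps(record, indent=2)  # ready to store alongside outputs
```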

This work aims to broaden Reana's functionality, aligning with best practices for reproducible and transparent scientific research. We aim to make use of the enhanced Reana-based system on the European AI-on-demand platform [4], currently under development, to address the requirements of AI innovators and researchers when studying and executing large-scale AI-infused workflows.


[1] Simko et al. (2019). Reana: A system for reusable research data analyses. EPJ Web Conf., 214:06034.

[2] Klampanos et al. (2020). DARE Platform: a Developer-Friendly and Self-Optimising Workflows-as-a-Service Framework for e-Science on the Cloud. Journal of Open Source Software, 5(54), 2664.

[3] PROV-O: The PROV Ontology: (viewed 9 Jan 2024)

[4] The European AI-on-Demand platform: (viewed 9 Jan 2024)

This work has received funding from the European Union’s Horizon Europe research and innovation programme under Grant Agreement No 101070000.

How to cite: Klampanos, I., Ganios, A., and Troumpoukis, A.: Towards Enhancing WaaS and Data Provenance over Reana, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12410, 2024.

EGU24-12669 | ECS | Orals | ESSI2.9

DeployAI to Deliver Interoperability of Cloud and HPC Resources for Earth Observation in the Context of the European AI-on-Demand Platform 

Antonis Troumpoukis, Iraklis Klampanos, and Vangelis Karkaletsis

The European AI-on-Demand Platform (AIoD) is a vital resource for leveraging and boosting the European AI research landscape towards economic growth and societal advancement across Europe. Following and emphasising European values, such as openness, transparency, and trustworthiness in developing and using AI technologies, the AIoD platform aims to become the main one-stop shop for exchanging and building AI resources and applications within the European AI innovation ecosystem. The primary goal of the DIGITAL-EUROPE CSA initiative DeployAI (DIGITAL-2022-CLOUD-AI-B-03, 01/2024-12/2027) is to build, deploy, and launch a fully operational AIoD platform, promoting trustworthy, ethical, and transparent European AI solutions for industry, with a focus on SMEs and the public sector.

Building on open-source and trusted software, DeployAI will provide a number of technological assets, such as a comprehensive and trustworthy AI resource catalogue and marketplace offering responsible AI resources and tools, workflow composition and execution systems for prototyping and user-friendly creation of novel services, and responsible foundational models and services to foster dependable innovation. In addition, building upon the results of the ICT-49 AI4Copernicus project [1], which provided a bridge between the AIoD platform, the Copernicus ecosystem and the DIAS platforms, DeployAI will integrate impactful Earth Observation AI services into the AIoD platform. These will include (but are not limited to) satellite imagery preprocessing, land usage classification, crop type identification, super-resolution, and weather forecasting.

Furthermore, DeployAI will allow the rapid prototyping of AI applications and their deployment to a variety of Cloud/Edge/HPC infrastructures. The project will focus on establishing a cohesive interaction framework that integrates with European Data Spaces and Gaia-X initiatives, HPC systems with an emphasis on the EuroHPC context, and the European Open Science Cloud. Interfaces to European initiatives and industrial AI-capable cloud platforms will be further implemented to enable interoperability. This capability enables the execution of Earth Observation applications not only within the context of a DIAS/DAS but also within several other compute systems. This level of interoperability enhances the adaptability and accessibility of AI applications, fostering a collaborative environment where geoscientific workflows can be seamlessly executed across diverse computational infrastructures and made available to a wide audience of innovators.

[1] A. Troumpoukis et al., "Bridging the European Earth-Observation and AI Communities for Data-Intensive Innovation", 2023 IEEE Ninth International Conference on Big Data Computing Service and Applications (BigDataService), Athens, Greece, 2023, pp. 9-16, doi:10.1109/BigDataService58306.2023.00008.

This work has received funding from the European Union’s Digital Europe Programme (DIGITAL) under grant agreement No 101146490.

How to cite: Troumpoukis, A., Klampanos, I., and Karkaletsis, V.: DeployAI to Deliver Interoperability of Cloud and HPC Resources for Earth Observation in the Context of the European AI-on-Demand Platform, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-12669, 2024.

EGU24-15366 | ECS | Posters on site | ESSI2.9

Enabling seamless integration of Copernicus and in-situ data 

Iason Sotiropoulos, Athos Papanikolaou, Odysseas Sekkas, Anastasios Polydoros, Vassileios Tsetsos, Claudio Pisa, and Stamatia Rizou

BUILDSPACE aims to combine terrestrial data from buildings collected by IoT devices with aerial imaging from drones equipped with thermal cameras and location-annotated data from satellite services (i.e., EGNSS and Copernicus) to deliver innovative services at building scale, enabling the generation of high-fidelity multi-modal digital twins, and at city scale, providing decision support services for energy demand prediction, urban heat and urban flood analysis. A pivotal element and foundational support of the BUILDSPACE ecosystem is the Core Platform, which plays a crucial role in facilitating seamless data exchange, secure and scalable data storage, and streamlined access to data from three Copernicus services, namely Land, Atmosphere, and Climate Change. The platform's underlying technology is robust, incorporating two key components: OIDC for user authentication and group authorization over the data, and a REST API to handle various file operations. OIDC stands for OpenID Connect, a standard protocol that enables secure user authentication and allows for effective management of user groups and their access permissions. The platform employs the REST API for seamless handling of file-related tasks, including uploading, downloading, and sharing. This combination ensures efficient and secure data exchange within the system. Additionally, the use of an S3-compatible file system ensures secure and scalable file storage, while a separate metadata storage system enhances data organization and accessibility. Currently deployed on a Kubernetes cluster, the platform offers numerous advantages, including enhanced scalability, efficient resource management, and simplified deployment processes. The implementation of the Core Platform has led to a current focus on integrating APIs from Copernicus services into the Core Platform's API. 
This ongoing effort aims to enhance the platform's capabilities by seamlessly incorporating external data, enriching the overall functionality and utility of the project.

How to cite: Sotiropoulos, I., Papanikolaou, A., Sekkas, O., Polydoros, A., Tsetsos, V., Pisa, C., and Rizou, S.: Enabling seamless integration of Copernicus and in-situ data, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-15366, 2024.

EGU24-15416 | ECS | Orals | ESSI2.9

XDGGS: Xarray Extension for Discrete Global Grid Systems (DGGS) 

Alexander Kmoch, Benoît Bovy, Justus Magin, Ryan Abernathey, Peter Strobl, Alejandro Coca-Castro, Anne Fouilloux, Daniel Loos, and Tina Odaka

Traditional geospatial representations of the globe on a 2-dimensional plane often introduce distortions in area, distance, and angles. Discrete Global Grid Systems (DGGS) mitigate these distortions and introduce a hierarchical structure of global grids. Defined by ISO standards, DGGSs serve as spatial reference systems facilitating data cube construction, enabling integration and aggregation of multi-resolution data sources. Various tessellation schemes such as hexagons and triangles cater to different needs: equal area, optimal neighborhoods, congruent parent-child relationships, ease of use, or vector field representation in modeling flows.
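
The hierarchical structure that makes DGGSs attractive can be sketched with a plain lon/lat quadtree. This toy grid is not equal-area (real systems such as HEALPix, H3 or rHEALPix tessellate the sphere itself); it only illustrates the congruent parent-child property, where a cell's parent identifier is a prefix of its own.

```python
def cell_id(lon, lat, level):
    """Toy hierarchical grid index: at each level, pick one of four
    quadrants of the current bounding box and append its digit."""
    cid = ""
    lon0, lon1, lat0, lat1 = -180.0, 180.0, -90.0, 90.0
    for _ in range(level):
        lon_mid, lat_mid = (lon0 + lon1) / 2, (lat0 + lat1) / 2
        quadrant = (lon >= lon_mid) + 2 * (lat >= lat_mid)
        cid += str(quadrant)
        # shrink the bounding box to the chosen quadrant
        if lon >= lon_mid:
            lon0 = lon_mid
        else:
            lon1 = lon_mid
        if lat >= lat_mid:
            lat0 = lat_mid
        else:
            lat1 = lat_mid
    return cid

vienna = cell_id(16.37, 48.21, 6)  # a level-6 cell identifier
parent = cell_id(16.37, 48.21, 5)  # its level-5 parent is a prefix of it
```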

The fusion of Discrete Global Grid Systems (DGGS) and Datacubes represents a promising synergy for integrated handling of planetary-scale data.

The recent Pangeo community initiative at the ESA BiDS'23 conference has led to significant advancements in supporting Discrete Global Grid Systems (DGGS) within the widely used Xarray package. This collaboration resulted in the development of the Xarray extension XDGGS. The aim of xdggs is to provide a unified, high-level, and user-friendly API that simplifies working with various DGGS types and their respective backend libraries, seamlessly integrating with Xarray and the Pangeo scientific computing ecosystem. Executable notebooks demonstrating the use of the xdggs package are also being developed to showcase its capabilities.

This development represents a significant step forward, though continuous efforts are necessary to broaden the accessibility of DGGS for scientific and operational applications, especially in handling gridded data such as global climate and ocean modeling, satellite imagery, raster data, and maps.

Keywords: Discrete Global Grid Systems, Xarray Extension, Geospatial Data Integration, Earth Observation, Data Cube, Scientific Collaboration

How to cite: Kmoch, A., Bovy, B., Magin, J., Abernathey, R., Strobl, P., Coca-Castro, A., Fouilloux, A., Loos, D., and Odaka, T.: XDGGS: Xarray Extension for Discrete Global Grid Systems (DGGS), EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-15416, 2024.

EGU24-15872 | Posters on site | ESSI2.9

Deploying Pangeo on HPC: our experience with the Remote Sensing Deployment Analysis environmenT on SURF infrastructure 

Francesco Nattino, Meiert W. Grootes, Pranav Chandramouli, Ou Ku, Fakhereh Alidoost, and Yifat Dzigan

The Pangeo software stack includes powerful tools that have the potential to revolutionize the way in which research on big (geo)data is conducted. A few of the aspects that make them very attractive to researchers are the ease of use of the Jupyter web-based interface, the level of integration of the tools with the Dask distributed computing library, and the possibility to seamlessly move from local deployments to large-scale infrastructures. 

The Pangeo community and Project Pythia are playing a key role in providing training resources and examples that showcase what is possible with these tools. These are essential to guide interested researchers with clear end goals, but also to provide inspiration for new applications. 

However, configuring and setting up a Pangeo-like deployment is not always straightforward. Scientists whose primary focus is domain-specific often do not have the time to spend solving issues that are mostly ICT in nature. In this contribution, we share our experience in providing support to researchers in running use cases backed by deployments based on Jupyter and Dask at the SURF supercomputing center in the Netherlands, in what we call the Remote Sensing Deployment Analysis environmenT (RS-DAT) project. 

Despite the popularity of cloud-based deployments, which is justified by the enormous data availability at various public cloud providers, we discuss the role that HPC infrastructure still plays for researchers, due to the ease of access via merit-based allocation grants and the requirement of integration with pre-existing workflows. We present the solution that we have identified to seamlessly access datasets from the SURF dCache massive storage system; we stress how installation and deployment scripts can facilitate adoption and re-use; and we finally highlight how technical research-support staff, such as Research Software Engineers, can be key in bridging researchers and HPC centers. 
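
A Pangeo-style deployment of the kind described here typically pairs the batch scheduler with Dask. The sketch below shows the common dask-jobqueue pattern for a SLURM system; the queue name and resource numbers are illustrative, not RS-DAT's actual configuration, and the imports are deferred so the sketch stays self-contained.

```python
def start_hpc_cluster(queue="normal", n_jobs=4):
    """Start a Dask cluster whose workers run as SLURM batch jobs,
    then return a client connected to it. Resource numbers are
    illustrative; an actual deployment tunes them per system."""
    from dask.distributed import Client
    from dask_jobqueue import SLURMCluster  # pip install dask-jobqueue

    cluster = SLURMCluster(queue=queue, cores=4, memory="16GB")
    cluster.scale(jobs=n_jobs)  # submit n_jobs worker jobs to the queue
    return Client(cluster)

# usage (on a SLURM login node, not executed here):
# client = start_hpc_cluster(queue="short")
```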

How to cite: Nattino, F., Grootes, M. W., Chandramouli, P., Ku, O., Alidoost, F., and Dzigan, Y.: Deploying Pangeo on HPC: our experience with the Remote Sensing Deployment Analysis environmenT on SURF infrastructure, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-15872, 2024.

EGU24-17111 | Posters on site | ESSI2.9

Cloudifying Earth System Model Output 

Fabian Wachsmann

We introduce a data server for efficient access to prominent climate data sets stored on disk at the German Climate Computing Center (DKRZ). We show how we “cloudify” data from two projects, EERIE and ERA5, and how one can benefit from it. 

The European Eddy-rich Earth System Model (EERIE) project aims to develop state-of-the-art high-resolution Earth System Models (ESM) that are able to resolve ocean mesoscale processes. These models are then used to perform simulations over centennial scales and make their output available to the global community. At present, the total volume of the EERIE data set exceeds 0.5 PB and is rapidly growing, posing challenges for data management.
ERA5 is the fifth generation ECMWF global atmospheric reanalysis. It is widely used as forcing data for climate model simulations, for model evaluation or for the analysis of climate trends. DKRZ maintains a 1.6 PB subset of ERA5 data at its native resolution.

We use Xpublish to set up the data server. Xpublish is a Python package and a plugin for Pangeo's central analysis package, Xarray. Its main feature is to serve ESM output by mapping any input data to virtual zarr data sets, which users can retrieve as if they were cloud-native and cloud-optimized. The data server features:

  • Parallel access to data subsets on chunk-level
  • Interfaces to make the data more FAIR
    • User friendly content overviews with displays of xarray-like dataset representations
    • Simple browsing and loading data with an intake catalog
  • On-the-fly server-side computation 
    • Register simple xarray routines for generating customized variables
    • Compression for speeding up downloads
  • Generation of interactive geographical plots, including animations

The data server is a solution to make EERIE data more usable by a wider community.

How to cite: Wachsmann, F.: Cloudifying Earth System Model Output, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-17111, 2024.

EGU24-17150 |