Sept. 3, 2025, 18:15 – 19:00
Neural networks have been applied for fast downscaling of environmental fields. However, their inherent randomness can lead to prediction instability. This study introduces an ensemble neural network to assess the effectiveness of the ensemble method in mitigating instability in statistical spatial wave downscaling. Its performance is compared with a deterministic linear regression model. Significant wave height (SWH) in the western Black Sea is considered, with low-resolution SWH and wind data from ERA5 and high-resolution SWH data from a regional numerical model. Both self-variable downscaling (from low-resolution SWH) and cross-variable downscaling (from low-resolution wind fields) are considered. Results show that the ensemble method significantly reduces the base neural network’s prediction instability. In self-variable SWH downscaling, the two models perform similarly well, whereas in cross-variable downscaling, the ensemble model outperforms the linear model. These findings provide valuable insights into downscaling methodologies, contributing to improved spatial wave predictions.
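As a rough illustration of the ensemble idea (not the study's actual architecture or data), the sketch below trains several identically configured networks that differ only in their random initialization and averages their predictions; the hyperparameters and array shapes are assumptions.

```python
# Minimal sketch: train N identically configured networks with different
# random seeds and average their predictions. Shapes and hyperparameters
# are illustrative, not those of the study.
import numpy as np
from sklearn.neural_network import MLPRegressor

def ensemble_downscale(X_train, y_train, X_test, n_members=10):
    """Average the outputs of n_members independently initialized networks."""
    preds = []
    for seed in range(n_members):
        net = MLPRegressor(hidden_layer_sizes=(64, 64),
                           random_state=seed, max_iter=500)
        net.fit(X_train, y_train)          # each member sees the same data
        preds.append(net.predict(X_test))  # but starts from different weights
    # the ensemble mean damps the member-to-member variability that a
    # single randomly initialized network would exhibit
    return np.mean(preds, axis=0)
```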
Coastal zones face increasing pressure from both natural forces and human activity, including sea-level rise, erosion, and expanding infrastructure. Understanding how these landscapes evolve over time is essential for informed decision-making in environmental management, urban planning, and climate adaptation.
We present AutoCoast, a web-based platform for long-term coastal monitoring that combines multi-source satellite imagery with machine learning to detect and visualize shoreline changes from 2015 to 2024. Initially developed for the Baltic Sea, the system is being expanded to cover additional regions such as the North Sea.
A key component of the platform is a custom annotation tool that supports rapid image labeling through active learning. This approach reduces manual effort while maintaining high-quality training data. Our curated dataset, based on Sentinel-2 imagery, includes coastal-specific classes such as beaches, marshes, tidal flats, cliffs, and man-made structures. The resulting segmentation model can reliably identify and classify coastal landforms.
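A minimal sketch of how such uncertainty-driven selection could work, assuming a segmentation model that exposes per-pixel class probabilities (the `predict_proba` interface and the scoring are placeholders, not the platform's actual implementation):

```python
# Hedged sketch of uncertainty-based active learning for image labeling:
# the model proposes the tiles it is least sure about, and only those are
# sent to the annotator.
import numpy as np

def select_for_annotation(model, unlabeled_tiles, budget=50):
    """Rank unlabeled tiles by mean per-pixel predictive entropy, highest first."""
    scores = []
    for tile in unlabeled_tiles:
        probs = model.predict_proba(tile)              # (H, W, n_classes), assumed API
        entropy = -np.sum(probs * np.log(probs + 1e-9), axis=-1)
        scores.append(entropy.mean())                  # mean uncertainty per tile
    ranked = np.argsort(scores)[::-1]                  # most uncertain first
    return [unlabeled_tiles[i] for i in ranked[:budget]]
```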
To enhance temporal consistency and spatial accuracy, we implement post-processing steps such as tidal normalization and integrate complementary Sentinel-1 radar data for detecting elevation-driven changes and improving resilience to cloud cover.
The user interface supports dynamic visualization and comparison of coastline evolution, enabling exploration of trends in erosion, accretion, and land use change. …
When applying sustainable Nature-based Solutions (NbS) in coastal engineering, a major challenge lies in determining how effective these approaches are at mitigating coastal erosion. The efficacy of NbS is influenced by various factors, including the specific location, layout, and scale of implementation. This study integrates artificial intelligence (AI) with hydro-morphodynamic numerical simulations to develop an AI-based emulator focused on predicting Bed Level Changes (BLC) as indicators of erosion and deposition dynamics. In particular, we explore the influence of seagrass meadows, which vary in their initial depth (hs) and depth range (hr), on the attenuation of coastal erosion during storm events.
The framework employs a hybrid approach combining the SCHISM-WWM hydrodynamic model with XBeach to simulate 180 depth-range and starting-depth combination (hr-hs) scenarios along the Norderney coast in the German Bight. A Convolutional Neural Network (CNN) architecture with two inputs, roller energy and Eulerian velocity, is used to efficiently predict BLC. The CNN shows high accuracy in replicating spatial erosion patterns and quantifying erosion/deposition volumes, achieving an R² of 0.94 and an RMSE of 3.47 cm during validation.
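A minimal PyTorch sketch of a two-input CNN of this kind is shown below; the layer sizes are illustrative assumptions, not the emulator's actual architecture.

```python
# Illustrative two-input CNN in the spirit of the emulator described above:
# roller energy and Eulerian velocity fields enter as separate channels and
# the network regresses a bed-level-change map.
import torch
import torch.nn as nn

class BLCEmulator(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        # 1x1 convolution maps features back to a single BLC channel
        self.head = nn.Conv2d(64, 1, kernel_size=1)

    def forward(self, roller_energy, eulerian_velocity):
        # stack the two physical inputs as channels: (N, 2, H, W)
        x = torch.cat([roller_energy, eulerian_velocity], dim=1)
        return self.head(self.encoder(x))  # predicted bed level change
```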
This innovative integration of AI and NbS reduces computational costs associated with traditional numerical modelling and improves the …
Monitoring marine mammals is critical during noisy activities such as seismic surveys and naval operations, where the use of loud airguns and sonars can harm whales and seals. Traditional visual monitoring by marine mammal observers is limited by factors such as low light, rough seas, fog, and human fatigue. Drawing on over 10 years of at-sea and ashore research, thousands of whale cues, primarily blows and bodily displays, were captured using infrared cameras. Studies demonstrated that infrared imaging reliably detects whale blows worldwide across all climate zones and operates continuously over extended periods (with fog being the primary limitation) [1].
To enable continuous, 24/7 monitoring of whales, we developed a deep learning framework that utilizes infrared video captured by a commercial, cooled 360° thermal imaging sensor for automatic whale blow detection. We evaluated multiple machine learning models, including 2D CNN, 3D CNN, and classical algorithms such as Random Forest and SVM, with infrared video data as input. Each video contains a set of 30 frames, which the system analyzes to determine whether a whale is present in any of them.
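The clip-level setup could be sketched as follows; the 3D CNN below is an illustrative stand-in with assumed layer sizes, not one of the evaluated models.

```python
# Sketch of clip-level classification: a 3D CNN consumes a stack of 30
# infrared frames and outputs the probability that a whale blow is present
# anywhere in the clip.
import torch
import torch.nn as nn

class BlowDetector3D(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),   # pool over time and space
        )
        self.classifier = nn.Linear(32, 1)

    def forward(self, clip):
        # clip: (N, 1, 30, H, W) -- 30 thermal frames per sample
        x = self.features(clip).flatten(1)
        return torch.sigmoid(self.classifier(x))  # P(whale blow in clip)
```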
Our experiments demonstrated strong detection performance. The results highlighted the robustness of our model and its adaptability across different environmental …
The novel DSHIP land system integrates new concepts and state-of-the-art technology to explore, access, and process environmental time series data from various platforms. In this demo session we would like to i) show where to find and how to access the long-established features of the land system, ii) highlight some of the new features, such as filtering, querying, and subsetting of (mass) data, iii) present options for traceability and interoperability, and iv) give some insights and benchmarks of the system.
Soil moisture is a critical environmental variable that strongly influences hydrological extremes, plant water availability, and climate processes. Accurate and extensive in situ soil moisture data are therefore crucial to assess these environmental impacts. The International Soil Moisture Network (ISMN, https://ismn.earth) collects such data from numerous monitoring networks, providing a harmonized, quality-controlled, and freely accessible archive.
Collecting and integrating soil moisture data from various providers presents significant challenges, as they use diverse measurement technologies and data formats. In addition to ingesting data into the database daily, keeping the metadata up to date and clean remains challenging. ISMN’s existing metadata mapping between input data and the database could lead to data corruption due to non-robust data processing.
To ensure reliable data ingestion and mitigate corruption risks, we significantly enhanced the data integration and harmonization process. This involved improving the robustness of data downloads, developing an automated procedure for metadata mapping, and implementing automated metadata checks and retrievals. After successfully testing the automated detection and incorporation of metadata changes (e.g., sensor types, exchange dates), maintenance effort and the need for human intervention were reduced significantly. Future work will focus on expanding this functionality to other providers and fully incorporating it …
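As a hedged illustration of such an automated metadata check (field names are hypothetical, not ISMN's actual schema), a new delivery's metadata can be diffed against the database record and any change flagged for incorporation:

```python
# Compare the metadata shipped with a new data delivery against the
# database record and flag changes (e.g. a swapped sensor).
def detect_metadata_changes(db_record: dict, delivered: dict) -> dict:
    """Return the fields whose values differ between database and delivery."""
    watched = ["sensor_type", "sensor_depth", "installation_date"]  # illustrative
    changes = {}
    for field in watched:
        if db_record.get(field) != delivered.get(field):
            changes[field] = {"old": db_record.get(field),
                              "new": delivered.get(field)}
    return changes

# Example: a provider exchanged a sensor between deliveries
changes = detect_metadata_changes(
    {"sensor_type": "TDR-A", "sensor_depth": 0.05},
    {"sensor_type": "TDR-B", "sensor_depth": 0.05},
)
# -> {'sensor_type': {'old': 'TDR-A', 'new': 'TDR-B'}}
```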
Effective data stewardship in research hinges upon the consistent and FAIR (Findable, Accessible, Interoperable, Reusable) representation of scientific variables across diverse environmental disciplines. Within the Helmholtz Earth and Environment DataHub initiative, we are therefore developing an innovative approach utilizing Large Language Models (LLMs) to support data producers by automating the semantic annotation of research data. Our service employs the community-driven I-ADOPT framework, which decomposes variable definitions from natural language descriptions into essential atomic parts, ensuring naming consistency and interoperability.
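A minimal sketch of this annotation step, assuming a generic `call_llm` backend (the prompt wording and return format are illustrative; only the component names follow the I-ADOPT framework):

```python
# Prompt an LLM to decompose a free-text variable description into
# I-ADOPT components and parse the JSON reply.
import json

PROMPT = """Decompose the following variable description into I-ADOPT
components. Reply as JSON with keys: property, object_of_interest,
matrix, context_object, constraints.

Description: {description}"""

def annotate_variable(description: str, call_llm) -> dict:
    """Return the I-ADOPT decomposition proposed by the language model."""
    reply = call_llm(PROMPT.format(description=description))
    return json.loads(reply)

# annotate_variable("nitrate concentration in lake surface water", call_llm)
# might yield: property=concentration, object_of_interest=nitrate,
# matrix=surface water, context_object=lake
```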
In this poster, we present our approach to developing an LLM-based annotation service, highlighting key challenges and solutions as well as the integration into higher-level infrastructures of the Helmholtz DataHub and beyond. The proposed annotation framework significantly streamlines the integration of environmental data and harmonizes its description across domains such as climate, biodiversity, and atmospheric sciences, aligning closely with the objectives of the NFDI and the European Open Science Cloud (EOSC).
This contribution showcases how advanced semantic annotation tools can support data stewardship in practical research contexts, enhancing reproducibility, interoperability, and collaboration within the scientific community.
The Helmholtz Metadata Collaboration (HMC) Hub Earth and Environment seeks to create a framework for semantic interoperability across the diverse research data platforms within the Helmholtz research area Earth and Environment (E&E). Standardizing metadata annotations and aligning the use of semantic resources are essential for overcoming barriers in data sharing, discovery, and reuse. To foster a unified, community-driven approach, HMC, together with the DataHub, has established the formal "Metadata-Semantics" Working Group, which brings together engaged data stewards from major Helmholtz research data platforms within the E&E domain.
As part of its strategy to standardize metadata annotation in collaboration with the community, the working group will begin by harmonizing device-type denotations across two Helmholtz sensor registries: the O2A REGISTRY, developed at AWI, and the Sensor Management System (SMS), maintained by UFZ, GFZ, KIT, and FZJ. This harmonization involves the development of a shared FAIR controlled vocabulary and the implementation of a peer-reviewed curation process for it.
The common vocabulary will support the creation of referenceable ad-hoc terms when needed, incorporate versioning and quality assurance measures, and establish links with existing terminologies in the field (e.g., NERC L05, L06, ODM2, GCMD). Its development will involve experts from various disciplines within Helmholtz E&E …
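As a hedged example of what a FAIR vocabulary entry with a link to an existing terminology could look like in SKOS (the namespace URI is a placeholder, not the working group's published identifier, and the NERC mapping is illustrative):

```python
# Sketch of a shared device-type vocabulary entry expressed as SKOS,
# including a mapping to an existing terminology.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import SKOS

HELM = Namespace("https://example.org/helmholtz/device-types/")  # placeholder

g = Graph()
ctd = HELM["CTD"]
g.add((ctd, SKOS.prefLabel, Literal("CTD", lang="en")))
g.add((ctd, SKOS.definition,
       Literal("Instrument measuring conductivity, temperature and depth.",
               lang="en")))
# link to an existing terminology (e.g. a NERC L05 device category)
g.add((ctd, SKOS.exactMatch,
       URIRef("http://vocab.nerc.ac.uk/collection/L05/current/130/")))

print(g.serialize(format="turtle"))
```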
Climate change affects many areas of our daily lives. Developing effective mitigation and adaptation strategies therefore requires a thorough understanding of climate change and its consequences. The wealth of available data and services is key to building that understanding. However, these resources are often not easily accessible or interpretable, and the hurdle for non-domain experts to work with them is high.
The EU Horizon project FOCAL aims to bridge the gap between data, services, and end users by implementing a platform that enables easy and efficient exploration of climate data on a local scale. In particular, the developed tools shall support decision-making by stakeholders in forestry and urban planning. To this end, an open compute platform will be implemented and launched that combines intelligent workflow management with high-performance computing (HPC) resources. A modular and interoperable platform architecture will be used that provides software container-based services. Furthermore, new AI tools will be investigated, developed, and made available via the platform to make climate data analysis faster, more robust, and more accurate, and to broaden the toolkit of climate data analysis and impact assessment tools.
An additional co-design …
Confronting the escalating impacts of climate change on our coastlines demands a revolution in how we monitor, predict, and protect these vital zones. The rapid advancement of the Digital Ocean, powered by high-resolution coastal observations enhanced through AI and data fusion and by sophisticated AI-driven numerical models, offers this crucial leap forward. Building on this technology and the Copernicus Marine Environment Monitoring Service (CMEMS), the Horizon Europe programme of the European Commission launched the Forecasting and Observing the Open-to-Coastal Ocean for Copernicus Users (FOCCUS) project (foccus-project.eu), consisting of 19 partners from 11 countries. Member State Coastal Systems (MSCS) and users will collaborate to advance the coastal dimension of CMEMS by improving existing capability and developing innovative coastal products.
FOCCUS enhances CMEMS’s coastal capabilities through three key pillars: i) developing novel high-resolution coastal data products by integrating multi-platform observations (remote sensing and in-situ), Artificial Intelligence (AI) algorithms, and advanced data fusion techniques for improved monitoring; ii) developing advanced hydrology and coastal models including a pan-European hydrological ensemble for improved river discharge predictions, and improving member state coastal systems by testing new methodologies in MSCS production chains while taking advantage of stochastic simulation, ensemble approaches, and AI technology; and iii) demonstrating innovative products and improved …
Software development and data curation or analysis share many of their issues: keeping track of the evolution of files -- ideally with information on how, why, when, and by whom --, organizing collaboration with multiple people, keeping track of known issues and other TODOs, discussing changes, making versions available to others, automating tasks, and more. Often you will even have to write code as part of a data project, blurring the line between the two even more. In the free and open-source software development world these issues already have well-established solutions: a version control system keeps track of your project's history and ongoing development, a forge can serve as a collaboration hub, and CI/CD services provide flexible automation. So, why not apply them to our data management needs? Forgejo-aneksajo does just that and extends Forgejo with git-annex support, making it a versatile (meta-)data collaboration platform that neatly fits into a data management ecosystem around git, git-annex and DataLad.
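A brief sketch of the workflow this enables, using DataLad's Python API with a placeholder remote URL standing in for a Forgejo-aneksajo instance:

```python
# Version a dataset with DataLad (git + git-annex underneath) and publish
# it to a forge acting as the collaboration hub.
import datalad.api as dl

ds = dl.create(path="field-campaign-2024")        # new git/git-annex dataset
# ... copy data files into the dataset directory ...
ds.save(message="Add raw CTD profiles")           # records how, when, by whom
ds.siblings(action="add", name="forge",
            url="git@forge.example.org:lab/field-campaign-2024.git")  # placeholder
ds.push(to="forge")                               # annexed content included
```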
Morbidity from heat extremes is much higher than mortality and incurs higher costs, yet has received less attention. In addition, there are few, if any, models available to assess the current and potential future impacts of heat extremes on morbidity under climate change. In this study we develop a machine learning model based on a large insurance dataset for springtime (Q2: April, May, June) and summertime (Q3: July, August, September) for the period 2013-2023 in Germany. From this dataset, we construct a spatially distributed 1 km² dataset on the incidence of heat stroke and volume depletion for the federal state of North Rhine-Westphalia. We link this to detailed estimates of past heat extremes (maximum air temperature, average air temperature, number of hot days) as well as air pollution (NO₂, O₃, PM₁₀, and PM₂.₅) and socioeconomic factors (education level, household income, and unemployment rate) to explain temporal and spatial differences in incidence. We present results for the XGBoost algorithm, as well as initial results for deep-learning algorithms.
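An illustrative XGBoost setup for this kind of regression is sketched below; the feature columns and the synthetic stand-in data are placeholders for the study's actual predictors.

```python
# Predict heat-related incidence per grid cell from heat, air-quality and
# socioeconomic predictors.
import numpy as np
import pandas as pd
import xgboost as xgb
from sklearn.model_selection import train_test_split

features = ["tmax", "tmean", "hot_days",              # heat extremes
            "no2", "o3", "pm10", "pm25",              # air pollution
            "education", "income", "unemployment"]    # socioeconomic factors
# stand-in data: one row per 1 km² grid cell (the real features come from
# the insurance, weather, air-quality and census data described above)
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.random((1000, len(features) + 1)),
                  columns=features + ["incidence"])

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["incidence"], test_size=0.2, random_state=0)
model = xgb.XGBRegressor(n_estimators=500, max_depth=6, learning_rate=0.05)
model.fit(X_train, y_train)
print(model.score(X_test, y_test))   # R² on held-out grid cells
```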
ADCP moving ship measurements provide high-resolution insight into the hydrodynamics and sediment transport processes of a waterbody, thus improving the understanding of physical processes. In order to facilitate scientific advancement, the BAW provides ADCP data from a measurement campaign conducted in the Eider estuary in 2020. Data compliant with the FAIR principles can be downloaded from the BAW-Datenrepository. The raw and analysed data are available in ASCII format. Entering the ISO metadata and uploading the files are currently performed manually.
As an increasing number of datasets are to be published, the error-prone manual work is to be replaced by an automated workflow. In a first step, the raw data are converted into binary NetCDF files with metadata that complies with the CF metadata conventions. Measurement data and metadata are thus inseparable, which makes the process less error-prone. The existing ADCP2NETCDF converter has been extended for this purpose. Software that supports the generated CF NetCDF feature type “trajectoryProfile” can process the offered files directly.
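A minimal sketch of writing such a file with xarray, assuming illustrative variable names and sizes; the global attribute featureType = "trajectoryProfile" is what CF-aware software keys on:

```python
# Write a small CF-style NetCDF file of the kind described above.
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"velocity": (("obs", "z"), np.zeros((100, 20)))},
    coords={
        "time": ("obs", np.arange(100, dtype="float64")),
        "depth": ("z", np.linspace(0.5, 10.0, 20)),
    },
    attrs={"Conventions": "CF-1.8", "featureType": "trajectoryProfile"},
)
ds["velocity"].attrs.update(units="m s-1", long_name="current velocity")
ds["depth"].attrs.update(units="m", positive="down")
ds["time"].attrs.update(units="seconds since 2020-06-15 00:00:00")
ds.to_netcdf("adcp_trajectoryProfile.nc")  # binary NetCDF with embedded metadata
```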
In a second processing step, it is planned to convert the NetCDF files into ISO-compliant metadata in XML format, which can be imported into the metadata information system (MIS) of the BAW-Datenrepository. A method that has already …
Based on statistical analysis combined with numerical modeling and machine learning, we investigated annual- to decadal-scale morphodynamic patterns of the German Wadden Sea and their predictability at the relevant scales. Results from the multivariate EOF (Empirical Orthogonal Function) analysis of annual bathymetry data spanning 1998 to 2022, together with potentially related drivers and environmental factors (tidal range, storm surge level and frequency, sediment properties, and longshore currents), provide insights into the morphodynamic patterns of the study area. Both extreme water levels (storm surges) and tidal range show a significant positive correlation with the magnitude of morphological change, indicating their important role in controlling sediment transport and morphological evolution. Coastal longshore currents correlate with the movement of tidal channels, which are continuously migrating and deepening in the East and North Frisian regions and oscillating in the estuarine areas (Ems, Weser and Elbe). Numerical modeling was then applied to derive a process-based understanding of the feedback mechanisms between the physical drivers and the morphology of the Wadden Sea. Finally, state-of-the-art machine learning approaches were used to explore the predictability of morphological change in the Wadden Sea and were compared with numerical predictions to identify the strengths and weaknesses of both methods.
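The core of an EOF analysis of this kind can be sketched in a few lines (array shapes are illustrative): stack the annual bathymetries into a years-by-gridpoints matrix, remove the temporal mean, and take the SVD.

```python
# EOF decomposition via SVD: singular vectors give the spatial patterns
# (EOFs) and their principal-component time series.
import numpy as np

bathy = np.random.rand(25, 5000)        # 25 annual grids (1998-2022), flattened
anom = bathy - bathy.mean(axis=0)       # anomalies about the temporal mean

U, s, Vt = np.linalg.svd(anom, full_matrices=False)
eofs = Vt                                # spatial patterns, one per mode
pcs = U * s                              # amplitude of each mode per year
explained = s**2 / np.sum(s**2)          # fraction of variance per mode
print(f"Mode 1 explains {explained[0]:.1%} of interannual variance")
```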