In forestry, consistently relevant and correct information is critical for environmentally responsible decision making. Consulting commercially available or open-source Large Language Models (LLMs) can support this decision-making process, but LLMs currently show deficiencies in three critical areas: unreliable information sources, lack of access to real-world data, and ambiguity in their scientific reasoning ability.
To overcome these shortcomings, we built a curated knowledge base containing information from relevant CC0-licensed research articles. With clearly defined constraints applied to each research article before ingestion into the knowledge base, the underlying LLM can produce feedback that is correct and concise. Some of these constraints are in place to adhere to European regulations on the ethical use of AI and to comply with copyright law.
Aside from access to relevant research papers, access to real-world data is one of the cornerstones of the proposed framework. By utilizing calibrated level-1 data from multiple sensors, platforms, and measuring devices, we implement agentic RAG functionality to retrieve information about our area of interest.
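The retrieval step described above can be sketched as a minimal tool that an LLM agent might call. All record fields, values, and function names below are illustrative assumptions, not part of the actual framework:

```python
from dataclasses import dataclass

# Hypothetical record for calibrated level-1 sensor data; the fields are
# illustrative and do not reflect the framework's real data model.
@dataclass
class SensorRecord:
    sensor: str
    lat: float
    lon: float
    value: float

def retrieve_for_area(records, lat_min, lat_max, lon_min, lon_max):
    """Return records inside the area of interest: the kind of filtering
    step an agentic RAG tool would expose to the LLM."""
    return [r for r in records
            if lat_min <= r.lat <= lat_max and lon_min <= r.lon <= lon_max]

# Two made-up observations; only the first falls inside the query box.
records = [
    SensorRecord("lidar", 53.5, 9.9, 17.2),
    SensorRecord("dendrometer", 48.1, 11.6, 0.4),
]
hits = retrieve_for_area(records, 53.0, 54.0, 9.0, 10.5)
```

In a full agentic setup, the retrieved records would be serialized into the LLM's context before it answers a question about the area.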
Lastly, reasoning abilities in Large Language Models …
The Helmholtz Model Zoo (HMZ) is a cloud-based platform that provides remote access to deep learning models within the Helmholtz Association. It enables seamless inference execution via both a web interface and a REST API, lowering the barrier for scientists to integrate state-of-the-art AI models into their research.
Scientists from all 18 Helmholtz centers can contribute their models to HMZ through a streamlined, well-documented submission process on GitLab. This process minimizes effort for model providers while ensuring flexibility for diverse scientific use cases. Based on the information provided about the model, HMZ automatically generates the web interface and API, tests the model, and deploys it. The REST API further allows for easy integration of HMZ models into other computational pipelines.
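As a rough illustration of how such a REST API could be integrated into a computational pipeline, the sketch below builds an inference request using only the Python standard library. The endpoint path, payload fields, and token header are assumptions for the example, not the documented HMZ schema:

```python
import json
from urllib import request

# The URL layout, JSON fields, and bearer-token header below are
# illustrative assumptions; consult the HMZ documentation for the
# actual REST schema.
def build_inference_request(base_url, model_name, inputs, token):
    """Construct a POST request for a hypothetical model-inference endpoint."""
    payload = json.dumps({"model": model_name, "inputs": inputs}).encode()
    return request.Request(
        f"{base_url}/models/{model_name}/inference",
        data=payload,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_inference_request("https://example.org/api", "demo-model",
                              {"image": "scan_001.png"}, "MY_TOKEN")
# In a pipeline step, the request would be sent with
# urllib.request.urlopen(req) and the JSON response parsed.
```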
With the launch of HMZ, researchers can now run AI models within the Helmholtz Cloud while keeping their data within the association. The platform imposes no strict limits on the number of inferences or the volume of uploaded data, and it supports both open-access and restricted-access model sharing. Data uploaded for inference is stored within HIFIS dCache InfiniteSpace and remains under the ownership of the uploading user.
HMZ is powered by GPU nodes equipped with four NVIDIA L40 GPUs per …
Coastal zones face increasing pressure from both natural forces and human activity, including sea-level rise, erosion, and expanding infrastructure. Understanding how these landscapes evolve over time is essential for informed decision-making in environmental management, urban planning, and climate adaptation.
We present AutoCoast, a web-based platform for long-term coastal monitoring that combines multi-source satellite imagery with machine learning to detect and visualize shoreline changes from 2015 to 2024. Initially developed for the Baltic Sea, the system is being expanded to cover additional regions such as the North Sea.
A key component of the platform is a custom annotation tool that supports rapid image labeling through active learning. This approach reduces manual effort while maintaining high-quality training data. Our curated dataset, based on Sentinel-2 imagery, includes coastal-specific classes such as beaches, marshes, tidal flats, cliffs, and man-made structures. The resulting segmentation model can reliably identify and classify coastal landforms.
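The active-learning selection behind the annotation tool can be illustrated with simple least-confidence uncertainty sampling: images where the model's top class probability is lowest are routed to a human annotator first. The tile names and probabilities below are made up for the example, and the platform's actual selection strategy may differ:

```python
def least_confident(predictions, k):
    """Uncertainty sampling: rank unlabeled images by the model's top-class
    confidence and return the k least confident ones for manual labeling."""
    ranked = sorted(predictions.items(), key=lambda kv: max(kv[1]))
    return [name for name, _ in ranked[:k]]

# Hypothetical per-tile class probabilities (e.g. beach, marsh, tidal flat).
preds = {
    "tile_a.png": [0.97, 0.02, 0.01],  # confident -> no annotation needed
    "tile_b.png": [0.40, 0.35, 0.25],  # uncertain -> send to annotator
    "tile_c.png": [0.55, 0.30, 0.15],
}
to_label = least_confident(preds, 2)  # -> ["tile_b.png", "tile_c.png"]
```

Labeling only the uncertain tiles in each round is what keeps the manual effort low while the training set stays informative.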
To enhance temporal consistency and spatial accuracy, we implement post-processing steps such as tidal normalization and integrate complementary Sentinel-1 radar data for detecting elevation-driven changes and improving resilience to cloud cover.
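The idea behind tidal normalization can be sketched under a planar-beach assumption: the apparent shoreline shifts horizontally by the tide offset divided by the beach slope, so observations at different tide levels can be referred to a common datum. The function, slope, and values below are illustrative, not the platform's implementation:

```python
import math

def normalize_shoreline(cross_shore_pos, tide_height, beach_slope_deg,
                        reference_level=0.0):
    """Shift an observed cross-shore shoreline position (in meters) to a
    common reference tide level, assuming a planar beach profile."""
    horizontal_shift = (tide_height - reference_level) / math.tan(
        math.radians(beach_slope_deg))
    return cross_shore_pos + horizontal_shift

# An image taken 0.5 m above the reference level on a 2-degree beach
# shifts the apparent shoreline by roughly 14 m seaward.
corrected = normalize_shoreline(120.0, 0.5, 2.0)
```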
The user interface supports dynamic visualization and comparison of coastline evolution, enabling exploration of trends in erosion, accretion, and land use change. …
Digital twins of the ocean (DTOs) make marine data available to support the development of the blue economy and enable direct interaction through bi-directional components. Typical DTOs provide insufficient detail near the coast because their resolution is too coarse and the underlying models lack processes that become relevant in shallow areas, e.g., the wetting and drying of tidal flats. As roughly 2.13 billion people worldwide live near a coast, downscaling ocean information to a local scale becomes necessary, since many practical applications, e.g., sediment management, require high-resolution data. For this reason, we focused on the appropriate downscaling of regional and global data from existing DTOs using a high-resolution (hundreds of meters), unstructured, three-dimensional, process-based hindcast model in combination with in-situ observations. This high-resolution model allows fine tidal channels, estuaries, and coastal structures such as dams and flood barriers to be represented digitally. Our digital twin includes tidal dynamics, salinity, sea water temperature, waves, and suspended sediment transport. Thanks to the fast and intuitive web interface of our prototype digital twin, the model data enable a wide range of coastal applications and support sustainable management. Bi-directional web processing services (WPS) were implemented within the interactive web-viewer …