of neural hydrology, where hydrological models are directly learned from data via machine learning (e.g., LSTM neural networks, [1]). Initially, these models ignored all physical background knowledge and did
-
– from modeling material behavior through material development to the finished component. PhD Position in Machine Learning and Computer Simulation Reference code: 50145735_2 – 2025/WD 1
-
Description Are you interested in developing novel scientific machine learning models for a special class of ordinary and differential-algebraic equations? We are currently looking for a PhD
-
: Develop innovative machine learning architectures for the mining, prediction, and design of enzymes. Combine state-of-the-art ML (e.g., deep learning, generative models) with computational biochemistry
-
dynamics, data science, and machine learning are beneficial. Please submit your detailed application with the usual documents by August 15, 2025 (stamped arrival date of the university central mail
-
: Approximately 2,000 EUR/month for three years Website: IMPRS-ESM Application Contact: office.imprs at mpimet.mpg.de The International Max Planck Research School on Earth System Modelling (IMPRS-ESM) invites
-
breakage models, e.g. with stochastic tessellations; development and implementation of estimation methods for the model parameters, e.g. with machine learning or statistical methods; lab work and collection
-
Mathematics / Approximation Theory, to be filled at the earliest possible starting date. The Chair of Applied Mathematics, headed by Prof. Marcel Oliver, is part of the Mathematical Institute for Machine Learning
-
and data analytics (including machine learning and deep learning); from high-performance computing to high-performance analytics; from data integration to data-related topics such as uncertainty
-
written and spoken English skills; a high degree of independence and commitment. Experience with machine learning and high-performance computing is advantageous, but not necessary. Our Offer: We work on the very