-
optimization is considered a strong advantage. Experience with Python scripting for data analysis is considered valuable. A strong commitment to research. Enthusiasm for working with research students
-
such as R, Julia, or Python. Excellent communication skills and the ability to work in teams and in a cross-disciplinary research center. Excellent written and spoken English skills. Place of employment
-
learning, data science, atmospheric sciences, geophysics, or related fields. Solid numerical modelling and programming skills (e.g., Python, TensorFlow, scikit-learn) are essential, along with a basic
-
proficiency in Python, R, or MATLAB. Experience with Deep Learning frameworks (PyTorch, TensorFlow) and LLM APIs is an asset. Communication: Fluent English skills, both written and spoken, with a demonstrated
-
field that provides a sufficient degree of background in computer science, artificial intelligence, mathematics, and data science. Fluency in English, Python, and C/C++ is required. Experience
-
for predictive modelling and large foundation models for Earth systems. Skill sets such as programming in Python/R, Google Earth Engine, JavaScript, database management environments, Geographical AI, and machine learning
-
Experience with scientific programming (e.g. in Python). Experience with modelling of disordered materials is seen as an advantage. Experience with automated computational workflows is seen as an advantage.
-
methods, omics data analysis, and spatial tools is highly valued. Programming expertise in Python and/or R is essential. As a person, you demonstrate high ambition. You are equally innovative – and result
-
Python or similar. Experience with optics and/or reflectometry for X-rays and/or neutrons. Personal drive and self-motivation. Good communication skills in English, both written and spoken. Being able
-
development section, specifically: Develop and maintain custom Python graphical user interfaces that enhance the usability of the group’s research tools. Optimize data workflows through efficient data