- … computer vision and pattern recognition, including but not limited to biomedical applications. Strong interest in applied machine learning, including but not limited to deep learning. Experience utilising GPU …
- … programming (shared and distributed memory, GPU programming, etc.). Demonstrated experience with distributed-memory MPI programming. Experience with collaborative software design, development, and testing …
- … TimeSformer, CLIP/BLIP or similar) in PyTorch, including scalable training on GPUs and reproducible experimentation. Demonstrated experience building explainable models (e.g., concept bottlenecks, prototype …
- … scientists and engineers are accustomed to. Moreover, the vast majority of the performance associated with these reduced-precision formats resides on special hardware units such as tensor cores on NVIDIA GPUs …
- … disease insights. The lab has state-of-the-art computing capabilities with an in-house cluster serving 80 CPU cores and 1.5 TB of RAM, as well as a newly acquired NVIDIA DGX box with eight H100 GPUs and 224 …
- … Programming & Software Development: proficiency in Python, PyTorch, JAX, or other ML frameworks. Computing: some experience with large-scale datasets, parallel computing, and GPUs/TPUs. Algorithm Development …
- … Expertise in data and model parallelism for distributed training on large GPU-based machines is essential. Candidates with experience using diffusion-based or other generative AI methods as …
- … Practical experience with cloud computing platforms (e.g., AWS, GCP, Azure). Additional qualifications: experience with multi-GPU model training and large-scale inference. Familiarity with modern AI …
- … managing supercomputer resources. Strong skills in algorithm development for large sparse matrices. Excellence in programming GPU accelerators from all major vendors. Very good command of written and spoken …
- … E13) up to 5 years. International collaboration to build a large radiotherapy dataset. Dedicated GPU infrastructure. Strong collaborations within TUM's AI ecosystem. High-impact publication potential.