-
… integrate linear and circular processes, enabling used products to be transformed into new generations.
What you will do:
- Implement GPU-accelerated Gaussian Mixture Model (GMM) learning in PyTorch
- Optimize …
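The GMM item above is a concrete algorithmic task. As a rough illustration, below is a minimal pure-Python EM fit for a 1-D, two-component Gaussian mixture; the posting asks for a GPU-accelerated PyTorch implementation, which would express the same E- and M-step updates as batched tensor operations. All function and variable names here are illustrative, not taken from the posting.

```python
import math
import random

def normal_pdf(x, mu, var):
    # Density of N(mu, var) evaluated at x.
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def gmm_em_1d(data, k=2, iters=100):
    # Fit a 1-D Gaussian mixture with k components via expectation-maximization.
    n = len(data)
    sorted_d = sorted(data)
    # Deterministic quantile initialization for the means; shared variance; uniform weights.
    mu = [sorted_d[int((j + 0.5) * n / k)] for j in range(k)]
    mean = sum(data) / n
    var = [sum((x - mean) ** 2 for x in data) / n] * k
    pi = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibilities r[i][j] = P(component j | data[i]).
        r = []
        for x in data:
            w = [pi[j] * normal_pdf(x, mu[j], var[j]) for j in range(k)]
            s = sum(w)
            r.append([wj / s for wj in w])
        # M-step: re-estimate weights, means, and variances from the responsibilities.
        for j in range(k):
            nj = sum(r[i][j] for i in range(n))
            pi[j] = nj / n
            mu[j] = sum(r[i][j] * data[i] for i in range(n)) / nj
            var[j] = max(
                sum(r[i][j] * (data[i] - mu[j]) ** 2 for i in range(n)) / nj,
                1e-6,  # guard against variance collapse
            )
    return pi, mu, var

# Synthetic data: two well-separated clusters around 0 and 5.
rng = random.Random(1)
data = ([rng.gauss(0.0, 0.5) for _ in range(200)]
        + [rng.gauss(5.0, 0.5) for _ in range(200)])
pi, mu, var = gmm_em_1d(data, k=2)
```

A PyTorch port would replace the per-point Python loops with responsibility matrices computed in one batched operation, which is what makes the GPU acceleration mentioned in the posting worthwhile.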
-
… with the domain of optical material behavior acquisition at a decent pace.
What you bring to the table:
- Very good C++ programming skills
- GPU and shader programming, ideally knowledge of PBR (Physically …
-
… or similar deep learning frameworks
- GPU know-how: familiar with GPU workflows and distributed training setups
- Data competence: experience in preprocessing, augmentation, and dataset organization; confident …
-
… models on one or more GPUs, and the ability to work with existing codebases to set up training runs
- Research interest in one or more of the following areas: probabilistic machine learning, time series …
-
…, Pandas, SQL, Docker, git, etc.
- PyTorch skills: experience in training machine learning models with one or more GPUs; ability to work with pre-existing codebases and get a training run going
- A versatile …
-
…- and multi-GPU setups, and the ability to work with existing codebases to quickly get training pipelines running
- Deep research interest in one or more of the following areas: 3D Gaussian Splatting, Neural …
-
… experience with accelerated architectures (e.g., GPUs or other accelerators)
- Experience with performance analysis, profiling, and optimization
Note that it is not necessary to fulfil all of these requirements …
-
… GPU-capable, parallelized simulation frameworks.
- Work closely with experts in HPC and power systems to enhance scalability and computational performance.
- Disseminate your findings through scientific …
-
…-following inverters.
- Implementing and optimizing scalable algorithms for transient and stability analyses on HPC architectures (CPU, GPU, hybrid)
- Enhancing the numerical robustness and efficiency of existing …
-
… performance computing systems or cloud infrastructure (including GPU-accelerated workloads)
- Practical experience with modern deep learning frameworks, model serving in production, and building end-to-end data …