- optimization – with rigorous theoretical analysis. The ideal candidate has strong machine learning and AI expertise and is comfortable with – or eager to learn – large-scale multi-GPU experimentation
- Computational Imaging Research Lab (CIR), Department of Biomedical Imaging and Image-guided Therapy | Austria | about 1 month ago
  imaging datasets across modalities (X-ray, ultrasound, MRI). Scalable ML workflows: GPU-based training, experiment tracking, reproducible pipelines, model validation, and deployment. Research excellence
- segmentation." CVPR. 2022. [3] van Spengler, Max, and Pascal Mettes. "Low-distortion and GPU-compatible Tree Embeddings in Hyperbolic Space." ICML. 2025. [4] Pal, Avik, Max van Spengler, Guido Maria D'Amely di
- simulation workload and update the solver data structures when the mesh changes. These approaches would be applied on modern large-scale heterogeneous parallel computing environments where both CPUs and GPUs
- background in machine learning, deep learning, and/or computer vision; Experience in programming: Python is a must; lower-level GPU programming experience is a bonus; Strong grasp of the English language
- selectivity mechanisms in potassium channels; Simulation of electrophysiology experiments. Please indicate in your application which of the above-listed projects is most intriguing to you. Your profile
- , GPUs, AI accelerators, etc.) have high power demands and require optimized power distribution networks (PDNs) to improve power efficiency and preserve power integrity. Integrated voltage regulators (IVRs
- , considering discrete modulation; Contribute to the implementation of digital signal processing algorithms on an FPGA platform; Contribute to the implementation of information reconciliation algorithms on a GPU
- on conventional computing platforms such as GPUs, CPUs, and TPUs. As language models become essential tools in society, there is a critical need to optimize their inference for edge and embedded systems