- …and GPU-accelerated tools for circuit and system design optimization, addressing challenges in physical design, timing analysis, and large-scale hardware design automation. The researcher will…
- …simulation methods, GPU-accelerated computations, several programming languages, and presenting results to broad technical and non-technical audiences. The candidate will also develop theory and…
- …Practical experience with cloud computing platforms (e.g., AWS, GCP, Azure). Additional qualifications: experience with multi-GPU model training and large-scale inference; familiarity with modern AI…
- …100% funding per SNSF guidelines (~CHF 90'000/year); access to modern GPU clusters and confidential-computing infrastructure; collaboration with leading researchers in AI & HPC systems and digital health…
- …environments. Experience with parallel computing environments and HPC in a Linux environment; experience with surrogate modeling and data analytics techniques; familiarity with C++ and GPU programming…
- …computing environment that includes GPU clusters, large-memory servers, and an NVIDIA DGX B200 system. These resources support the training of large multimodal models involving audio, video, and language…
- …GPU programming with one or more parallel computing models, including SYCL, CUDA, HIP, or OpenMP; experience with scientific computing and software development on HPC systems; ability to conduct…
- …variety of computational devices (e.g., CPUs and GPUs) while ensuring overall consistency and performance; contribute to identifying new CSE application domains, such as condensed matter systems and quantum…
- …with OFDM modulation required. Skills: programming skills in MATLAB and/or Python required; experience with wireless testbeds desirable; some familiarity with GPU programming desirable (to support…
- …computer vision and pattern recognition, including but not limited to biomedical applications; strong interest in applied machine learning, including but not limited to deep learning; experience utilising GPU…