of the art equipment within DNA and RNA sequencing, laboratory automation, CPU and GPU compute resources, proteomics, metabolomics, and advanced microscopy. This position offers an excellent opportunity
-
programming LAMP stack design and implementation experience; Knowledge of GPU and FPGA cluster management; Experience with federal research compliance and security requirements; Background in AI/ML computing
-
., Bayesian, hierarchical, time-series), experience with sensor-based data (such as eye tracking, EEG, and heart rate), and proficiency in computational workflows, including distributed and GPU-based systems
-
is of advantage: Knowledge of parallel programming and HPC architectures, including accelerators (e.g., GPUs); Experience in modelling and simulation, ideally in the field of energy systems; Experience
-
, telemetry systems) into immersive environments. Optimize XR applications for performance, including CPU/GPU profiling, draw call reduction, shader optimization, memory management, and LOD systems. Develop
-
-dimensional biological datasets. Familiarity with GPU computing and high-performance computing (HPC) environments. Other Requirements Ability to work collaboratively with researchers across computational and
-
with edge computing or embedded systems (e.g., NVIDIA Jetson, Raspberry Pi); Background in real-time processing and GPU acceleration (CUDA); Participation in relevant competitions (e.g., Kaggle, computer
-
. Knowledge of GPU architectures, GPU cloud computing services, and strong familiarity with Linux operating systems. Knowledge of BIM, universal scene description, and scene composition. Knowledge of physics and
-
-term project. We are looking for a software engineer to develop new features and extend the capabilities of a real-time neural data processing and decoding platform. This includes optimizing GPU
-
inference pipelines using modern ML tooling (e.g., PyTorch/TensorFlow/JAX), version control, containers, and HPC/GPU resources. Support the publication of intermediate data products, models, code, and