- programming; LAMP stack design and implementation experience; knowledge of GPU and FPGA cluster management; experience with federal research compliance and security requirements; background in AI/ML computing
- inference; develop distributed model training and inference architectures leveraging GPU-based compute resources; implement serverless and containerized solutions using Docker, Kubernetes, and cloud-native
- conferences. Qualifications: PhD in computer science with file systems and GPU architecture experience; proven ability to articulate research work and findings in peer-reviewed proceedings; knowledge of systems