45 parallel-programming "Michigan State University" Postdoctoral positions at Argonne
- in GPU programming using one or more parallel computing models, including SYCL, CUDA, HIP, or OpenMP. Experience with scientific computing and software development on HPC systems. Ability to conduct
- Postdoctoral Appointee - Uncertainty Quantification and Modeling of Large-Scale Dynamics in Networks: Knowledge in modeling and algorithms for large-scale ordinary differential equations (ODEs) and differential-algebraic equations (DAEs). Proficiency in a scientific programming language (e.g., C, C++, Fortran
- is supported by a DOE-funded research program on ultrafast science involving Argonne National Laboratory, the University of Washington, and MIT. The goal of this research program is to understand and
- may include work at Jefferson Lab, the Electron-Ion Collider (EIC) program, detector research and development, and applications of AI in nuclear physics. Applications received by Tuesday, November 4
- on experiment progress, technique development, and new initiatives to peer reviewers and Q-NEXT program managers. Position Requirements: Completed Ph.D. within the last 0-5 years (or soon to be completed) in
- and tracing tools (LTTng, Babeltrace, perf, ftrace, etc.). Strong C (and/or C++) system-programming skills and familiarity with dynamic linking (e.g., ldd). Experience developing and optimizing
- , Quantum Information and Quantum Simulation. The successful candidate will be expected to carry out an independent and collaborative research program in particle theory that strengthens and complements
- to/from the memory via optical fibers. The candidate will be primarily responsible for: (1) advancing our experimental program to fabricate new hybrid devices in Argonne’s Center for Nanoscale Materials
- 2.0) program. The collaboration team includes Clarence Chang, Tim Hobbs, Dafei Jin, Yi Li, Marharyta Lisovenko, Valentine Novosad, Zain Saleem, Tanner Trickle, and Gensheng Wang. We seek highly
- development, and publication in peer-reviewed venues. Strong background in machine learning, with research experience in deep learning, foundation models, or related areas. Solid programming ability in Python