a variety of different ways ranging from conducting basic biomedical research, empowering educators, inspiring students, developing the next generation of scientists – even stretching into film and
-
optimize new techniques such as tissue clearing, expansion microscopy, and large-scale RNA labeling. All in support of research to develop new molecular tools and imaging, to understand neuronal circuits
-
, integrate, test, debug, modify, and optimize simple to complex systems, instruments, and one-of-a-kind prototype component parts for research. Study project objectives, priorities, and requirements
-
develop new features and extend the capabilities of a real-time neural data processing and decoding platform. This includes optimizing GPU-accelerated signal processing pipelines, improving system
-
immune development. Our research places particular emphasis on host–microbe interactions at barrier tissues and during critical developmental and reproductive windows. By uncovering the maternal-offspring
-
and sharing of scientific knowledge to benefit us all. As a biomedical research organization and philanthropy, HHMI supports a vibrant community of academic researchers, educators, students, and
-
Computing Associate (SCA) position represents an alternative to traditional scientific roles (e.g. postdoc) and provides an ideal environment to establish a career in computational research or software
-
years to support AI-driven projects and to embed AI systems throughout every stage of the scientific process in labs across HHMI. The AI initiative will be centered at HHMI’s Janelia Research Campus
-
infrastructure, model training, and inference systems. You'll design, develop, and optimize scalable data pipelines and build multi-node GPU training and inference pipelines for foundational models. You'll also