- … detailed 3D modeling and design, design and safety reviews. Work closely with scientific staff as well as other engineering groups to develop solutions to complex problems and integration issues. Prepare cost …
- … hands-on with mechanical equipment, and collaborating with a team to develop and build unique equipment. Demonstrated ability with 3D Computer Aided Design modeling software. Ability to perform mechanical …
- … the equivalent tools associated with the EIC Project scope. Required Knowledge, Skills, and Abilities: BA/BS Degree (or equivalent experience), preferably in Computer Science or a related discipline. At least ten …
- … Condensed Matter Theory Group within the Division focuses on band structure theory, tensor-network models and DMRG, analytical methods for low-dimensional systems, and on other aspects of strongly correlated …
- … of existing ones for scientific applications; (ii) Large Language Models (LLMs) and multi-modal Foundation Models; (iii) large vision-language models (VLMs) and computer vision techniques; and (iv) techniques …
- … and imaging x-ray tools, supported by computational resources for high-throughput analysis and modeling. The program’s scientific focus is to understand and optimize the structural and chemical features …
- … large language models and multi-modal Foundation Models; ML models for computer vision (CV) and natural language processing (NLP) related tasks; and techniques applied to scientific discovery, i.e., …
- The National Synchrotron Light Source II (NSLS-II) at Brookhaven National Laboratory seeks a leader for the Spectroscopy Program within the Physical Sciences & Research Operations Division.
- … the area of electric power grids. Essential Duties and Responsibilities: perform cyber-physical system modeling and simulation for the power grid; perform research on power system dynamics and …
- … foundational ML and NLP innovations. This work involves the development and applications of encoding and generative NLP models (e.g., large language models, LLMs), and includes model training, tuning, and …