capabilities, catastrophic forgetting, scaling laws, and other emergent physical phenomena. Building on this foundation, we aim to explore more effective training algorithms for Large Language Models (LLMs).
-
for process industries (e.g., chemical, pharmaceutical, materials, energy) or discrete manufacturing (e.g., electronics assembly, automotive, home appliances). Explore the integration of large language models and reinforcement learning for real-time optimization, fault self-recovery, and production scheduling in industrial processes. Publish research results in top-tier international conferences
-
quantum Monte Carlo (QMC), density-matrix renormalization group (DMRG), etc. The applicant should be able to write and implement numerical codes; a good command of at least one programming language and basic Linux experience are required
-
years. As a research-led international university, XJTLU integrates global expertise to offer high-quality education and cutting-edge research, with English as the primary working language. With 25,000
-
frontier research programs in the fields of Condensed Matter Physics, Particle & Nuclear Physics, and Astronomy & Astrophysics. English is the working language at the institute. Further information about the
-
multi-messenger era • Numerical simulation of core-collapse supernovae. IHEP values diversity and welcomes candidates from all backgrounds worldwide. Knowledge of the Chinese language is not required. We provide