23 natural language processing PhD positions at Chalmers University of Technology in Sweden
- …environment is international and English is the working language. If Swedish is not your native language and you are interested in learning it, Chalmers offers Swedish courses that are free for PhD students.
- We are looking for a highly motivated, skilled, and persistent PhD student with experience in computational fluid dynamics (CFD) and some knowledge of structural analysis. The research aims…
- …Excellent collaboration skills; strong proficiency in English, both written and spoken. Merits: Experience in organic electronics and/or fiber spinning is considered a strong asset. Contract terms: As a PhD…
- Are you interested in doctoral research and in acquiring exciting career opportunities in academia and beyond? We now offer a PhD student position in architectural design and robotic 3D printing with…
- …month (valid from May 25, 2025). Application procedure: Please submit your application in English and include the following documents. Maximum size for each file is 40 MB. Statement of purpose (max. 1 page…
- …Python programming. Application procedure: The application should be written in English and attached as PDF files, as below. Maximum size for each file is 40 MB. Please note that the system does not support…
- …harness the broad diversity of flavor profiles among different marine micro- and macroalgae, which ultimately depend on cultivation and post-harvest processing. As a PhD student, you will be part of…
- …for the recruiting process in connection with this position. *** Chalmers University of Technology in Gothenburg conducts research and education in technology and natural sciences at a high international level. The…
PhD Position in Theoretical Machine Learning – Understanding Transformers through Information Theory
Transformers are central to many of today’s most successful AI models, from language understanding to computer vision. Yet, their success remains largely empirical, with limited theoretical understanding.
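As background only, and not part of the advertisement itself: the architecture this project studies is built around scaled dot-product self-attention. Below is a minimal single-head sketch in plain NumPy; the names self_attention, Wq, Wk, and Wv are illustrative assumptions, not code from the project.

import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Single-head scaled dot-product self-attention.
    # X: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_head).
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_head = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_head)   # (seq_len, seq_len) attention logits
    weights = softmax(scores, axis=-1)   # each row is a distribution over positions
    return weights @ V                   # weighted mixture of value vectors

# Toy example: 4 tokens, model width 8, head width 4.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 4)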