- human rights, and protects health, safety, and the environment” through a set of risk-based rules for developers and users of specific applications of AI. Methodologically, the project is open to
- to “foster trustworthy AI in Europe” that is sold and used in ways that are “safe, respect human rights, and protects health, safety, and the environment” through a set of risk-based rules for developers and
- staff position within a Research Infrastructure? No Offer Description How do regulators balance food safety, sustainability and innovation in circular food systems? Will you join our multidisciplinary
- scholarly merit and potential abuse of research grants, they create serious problems in the form of safety risks for patients and other supposed beneficiaries of scientific research. In collaboration with
- securing well-designed and well-executed analyses of food and feed? Are you eager to take on the challenge of improving analytical techniques in this field using state-of-the-art measurement systems such as
- protest demonstrations or riots. Trust and the perceived legitimacy of institutions play a key role in the dynamics of crisis conflict: waning trust in the parties handling the crisis is usually a driving
- discovery and safety, bio-therapeutics, and systems biomedicine and pharmacology. State-of-the-art expertise and infrastructures ensure that we are strategically positioned in (inter)national collaborations
- project focuses on developing methods to assess the sustainability of nearly 100 nationwide AI systems within the National Lab on Education and AI (NOLAI). You will create methods to predict energy
- evident not only in the ongoing conflict in Ukraine, but also as a central part of hybrid conflict – confrontation that takes place below the threshold of open war. In this context, the high-paced
- to breakthrough solutions for autonomous driving and safety-critical applications. This fully funded, four-year PhD position is a collaboration between the Intelligent Vehicles Section and the Department of Systems