-
of the processing system online. Our approach will be to draw on a broad selection of tools including (deep) reinforcement learning, queuing networks, online algorithms and systems engineering. In addition, a large
-
electricity price signals, demand-response mechanisms, and time-of-use optimization. AI-Driven Optimization using Reinforcement Learning: Apply RL algorithms to develop and train agents that optimize power
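This listing points to RL agents that respond to electricity price signals and time-of-use tariffs. As a loose illustration only (the tariff, state space, and reward below are invented for this sketch and are not the project's design), a tabular Q-learning agent can learn when to charge or discharge storage under a fixed hourly price profile:

```python
# Toy illustration: tabular Q-learning for time-of-use battery dispatch.
# Prices, state space, and reward shaping are assumed, not from the listing.
import random
from collections import defaultdict

HOURS = 24
SOC_LEVELS = 5                      # discretised battery state of charge
ACTIONS = [-1, 0, +1]               # discharge, idle, charge (one SoC step)
PRICE = [0.10 if 0 <= h < 7 else    # hypothetical off-peak / peak tariff
         0.30 if 17 <= h < 21 else
         0.20 for h in range(HOURS)]

Q = defaultdict(float)              # Q[(hour, soc, action)]
alpha, gamma, eps = 0.1, 0.95, 0.1

def step(hour, soc, action):
    """Apply one action; reward is money earned (discharge) or spent (charge)."""
    new_soc = min(max(soc + action, 0), SOC_LEVELS - 1)
    delta = new_soc - soc                      # energy actually moved
    reward = -PRICE[hour] * delta              # pay to charge, earn to discharge
    return (hour + 1) % HOURS, new_soc, reward

for episode in range(5000):
    hour, soc = 0, SOC_LEVELS // 2
    for _ in range(HOURS):
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: Q[(hour, soc, x)])
        nh, nsoc, r = step(hour, soc, a)
        best_next = max(Q[(nh, nsoc, x)] for x in ACTIONS)
        Q[(hour, soc, a)] += alpha * (r + gamma * best_next - Q[(hour, soc, a)])
        hour, soc = nh, nsoc

# Greedy policy at mid state of charge: charge in cheap hours, discharge at peak.
policy = {h: max(ACTIONS, key=lambda x: Q[(h, SOC_LEVELS // 2, x)]) for h in range(HOURS)}
print(policy)
```

In this toy setup the learned greedy policy tends to charge during the assumed cheap overnight hours and discharge during the assumed evening peak.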
-
algorithms, have excelled in areas such as computer vision, image recognition, and large language models (LLMs). However, their reliance on extensive computational resources results in excessively high energy
-
(entities) given the rules and the rules given the molecules. The aim of this project is to develop a theory and accompanying algorithms to decide if an abstract system can be instantiated by a concrete
-
algorithm. Design methods: Develop novel control methods for power electronic converters feeding electric machines. Simulation: Learn advanced simulation tools such as Ansys to simulate and analyze the effect
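Converter control of this kind is usually prototyped in simulation before hardware tests. As a generic sketch only, assuming an idealised R-L load and hand-tuned gains (none of these parameters come from the position description), a discrete-time PI current loop looks roughly like:

```python
# Generic illustration: discrete-time PI current controller on a first-order
# R-L load model. All plant parameters and gains below are assumed.
R, L = 0.5, 1e-3          # load resistance (ohm) and inductance (H), assumed
Ts = 1e-4                 # control/sampling period (s)
Kp, Ki = 2.0, 800.0       # PI gains, hand-tuned for this toy plant

i, integ = 0.0, 0.0       # load current (A) and integrator state
i_ref = 10.0              # current reference (A)

for k in range(2000):
    err = i_ref - i
    integ += Ki * err * Ts
    v = Kp * err + integ                 # converter output voltage command
    v = max(min(v, 48.0), -48.0)         # assumed DC-link voltage limit
    # Forward-Euler update of di/dt = (v - R*i) / L
    i += Ts * (v - R * i) / L
    if k % 400 == 0:
        print(f"t={k*Ts*1e3:5.1f} ms  i={i:6.3f} A")
```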
-
biased, outdated, or sensitive data? That's where the project TRAI comes in. This research project aims to develop machine unlearning algorithms to selectively erase specific knowledge from trained AI
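One common baseline in the machine-unlearning literature, which the TRAI project may or may not build on, alternates gradient ascent on the data to be forgotten with ordinary fine-tuning on retained data. A minimal PyTorch sketch with synthetic stand-in data:

```python
# Rough sketch of a simple unlearning baseline (gradient ascent on the forget
# set, then a repair step on retained data). Generic illustration only, not
# the algorithm the TRAI project will develop; all data here is synthetic.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
loss_fn = nn.CrossEntropyLoss()

# Hypothetical stand-ins for the retained and to-be-forgotten data.
x_retain, y_retain = torch.randn(256, 20), torch.randint(0, 2, (256,))
x_forget, y_forget = torch.randn(32, 20), torch.randint(0, 2, (32,))

opt = torch.optim.SGD(model.parameters(), lr=1e-2)

for epoch in range(5):
    # 1) Push the model *away* from the forget set by maximising its loss.
    opt.zero_grad()
    (-loss_fn(model(x_forget), y_forget)).backward()
    opt.step()

    # 2) Repair accuracy on the retained data with an ordinary descent step.
    opt.zero_grad()
    loss_fn(model(x_retain), y_retain).backward()
    opt.step()

print("retain loss:", loss_fn(model(x_retain), y_retain).item())
print("forget loss:", loss_fn(model(x_forget), y_forget).item())
```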
-
assets, in combination with other types of power system assets (batteries, etc.). From a research perspective, this can involve techniques from algorithmic game theory and AI-based mechanism design
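As a small illustration of the kind of mechanism-design building block such work can draw on (not the project's actual mechanism; asset names and bids are invented), a sealed-bid second-price auction allocating one block of flexible capacity:

```python
# Minimal mechanism-design illustration: a single-item Vickrey auction.
def vickrey_auction(bids):
    """bids: dict mapping asset name -> reported value for the capacity block.
    Returns (winner, payment). Truthful bidding is a dominant strategy here,
    since the winner pays the second-highest bid rather than its own."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    payment = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, payment

bids = {"battery_A": 42.0, "ev_fleet": 37.5, "heat_pump_pool": 30.0}
print(vickrey_auction(bids))   # ('battery_A', 37.5)
```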
-
scholarship holder in the field of planning and scheduling of in-orbit service missions. Position: You will work actively on the preparation and defence of a PhD thesis in the domain of operations research
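For flavour only, a toy operations-research model for such a mission (task names, propellant costs, and values are invented) selects servicing tasks under a propellant budget via a 0/1 knapsack dynamic program:

```python
# Toy OR illustration: pick servicing tasks under a propellant budget.
def select_tasks(tasks, budget):
    """tasks: list of (name, propellant_cost, value); budget: propellant units."""
    best = [(0.0, []) for _ in range(budget + 1)]   # best (value, chosen) per budget
    for name, cost, value in tasks:
        for b in range(budget, cost - 1, -1):       # standard 0/1 knapsack DP
            cand_value = best[b - cost][0] + value
            if cand_value > best[b][0]:
                best[b] = (cand_value, best[b - cost][1] + [name])
    return best[budget]

tasks = [("refuel_sat_1", 4, 10.0), ("inspect_sat_2", 2, 4.0),
         ("deorbit_debris", 5, 12.0), ("relocate_sat_3", 3, 7.0)]
print(select_tasks(tasks, budget=9))   # e.g. (22.0, ['refuel_sat_1', 'deorbit_debris'])
```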
-
programming, creating their own mechanical designs, implementing and testing them accordingly, and implementing control algorithms on physical experiments. In addition, the candidates are expected to contribute with
-
, storage and demand. YOUR TASKS: You will develop mathematical models and metaheuristic algorithms for complex optimization problems in the context described above; see, e.g., https://arxiv.org/abs/2503.01325
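As a generic sketch of the metaheuristic side only, not the models from the cited paper, a simulated-annealing skeleton over a made-up hourly storage schedule with a quadratic imbalance cost:

```python
# Generic simulated-annealing skeleton; the objective is an invented
# imbalance penalty, used only to show the structure of the metaheuristic.
import math
import random

random.seed(0)
DEMAND = [random.randint(5, 20) for _ in range(24)]     # hypothetical hourly demand

def cost(schedule):
    """Penalise mismatch between a dispatch schedule and the demand profile."""
    return sum((d - s) ** 2 for d, s in zip(DEMAND, schedule))

def neighbour(schedule):
    """Perturb one hour of the schedule by +/- 1 unit."""
    s = schedule[:]
    h = random.randrange(len(s))
    s[h] = max(0, s[h] + random.choice([-1, 1]))
    return s

def simulated_annealing(iters=20000, T0=50.0, alpha=0.9995):
    current = [10] * 24
    best, T = current, T0
    for _ in range(iters):
        cand = neighbour(current)
        delta = cost(cand) - cost(current)
        if delta < 0 or random.random() < math.exp(-delta / T):
            current = cand                      # accept improving or lucky move
        if cost(current) < cost(best):
            best = current
        T *= alpha                              # geometric cooling
    return best, cost(best)

print(simulated_annealing())
```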