- …perform measurements of AI algorithms to fill in the unknowns uncovered in such a data flow diagram. The energy scalability of the core algorithms of a new nationwide AI system can be predicted using …
- The Faculty of Science, Leiden Institute of Advanced Computer Science, is looking for a PhD Candidate, Efficient LLM Algorithm, Hardware and System Design (1.0 FTE). Project description: We …
- …the ranking. However, the STV method becomes considerably more complex with encrypted ballots. Our goal is to develop an algorithm/protocol to count encrypted ballots using the STV method. Our first point of …
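As background to the snippet above: STV counting over plaintext ballots is already an iterative, data-dependent procedure, which is what makes reproducing it homomorphically over encrypted ballots hard. A minimal sketch of the single-seat case (which reduces to instant-runoff voting; multi-seat STV additionally needs quotas and surplus transfers) is shown below. This is an illustrative plaintext baseline with assumed function and candidate names, not the project's protocol.

```python
from collections import Counter

def irv_winner(ballots):
    """Single-seat STV, i.e. instant-runoff voting: repeatedly eliminate
    the candidate with the fewest first-preference votes until one
    candidate holds a strict majority of the continuing ballots.
    Each ballot is a list of candidate names in preference order."""
    continuing = {c for b in ballots for c in b}
    while True:
        # Count each ballot for its highest-ranked continuing candidate;
        # ballots whose candidates are all eliminated become exhausted.
        tally = Counter({c: 0 for c in continuing})
        for b in ballots:
            for c in b:
                if c in continuing:
                    tally[c] += 1
                    break
        total = sum(tally.values())
        leader, votes = tally.most_common(1)[0]
        if 2 * votes > total or len(continuing) == 1:
            return leader
        # Eliminate the weakest candidate (ties broken arbitrarily here).
        continuing.discard(min(tally, key=tally.get))

# Example: five ballots; B is eliminated first and transfers to A.
print(irv_winner([["A", "B"], ["A", "B"], ["B", "A"],
                  ["C", "B"], ["C", "B"]]))  # prints A
```

Every step here (tallying, comparing counts, branching on the minimum) reads the ballot data in the clear; an encrypted-ballot protocol must perform the same comparisons and eliminations without ever decrypting individual ballots.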
- …needs. By bridging human-centric innovation, generative algorithms, and sustainability metrics, this project seeks to redefine how novel products and systems are conceived, developed, and evaluated. You …
- …harness advanced techniques such as machine learning, optimization algorithms, and sensitivity analysis to automate and enhance the mode selection process. The result will be a scalable methodology that …
- …) uses principles from systems neuroscience to develop reliable, low-power spiking neural networks and learning algorithms for implementation in a new generation of neuromorphic hardware. Both projects …
- …%). The position will be offered for three years and will have as its topic the classical emulation of quantum algorithms for the simulation of complex quantum systems. The position will be based at the Institute …
- (Europe | 3 days ago) …manufacturing, development of machine learning algorithms, and design of optical communication networks or power consumption and energy saving. The synergies of the MATCH consortium act together to enable the thirteen …
- …specialist collaborator to guarantee adequate integration of perception and action; advanced motion-planning and control algorithms, continuously refined via robotic digital twins, enable reliable handling …
- …experimentation with Asst. Prof. Eli N. Weinstein. Your goal will be to develop fundamental algorithmic techniques to overcome critical bottlenecks on data scale and quality, enabling scientists to gather vastly …