-
direction of this research, shaping the application domains we explore based on their interests and vision for where quantum networks can make a difference. The successful applicant has an excellent track record
-
Offer Description: Contribute your computer vision
-
Offer Description: Energy transitions are essential
-
Offer Description: Postdoctoral researcher in
-
can work to a high level of independence, but you also like working in a team, and you effectively incorporate feedback and different perspectives in your work. Your experience and profile: PhD in Ecology, Soil Science, Environmental Sciences, or a related discipline; a thorough understanding of linkages between
-
methods, integrating qualitative approaches (e.g., interviews, walk-alongs, focus groups) with quantitative techniques (e.g., computer vision, physiological sensing, environmental monitoring, crowd behaviour analysis), as well as researching existing sources of knowledge in the literature and
-
to your expertise. What do we require? A PhD degree (or equivalent qualification) in AI (e.g., machine learning, natural language processing, or computer vision); a strong scientific track record, documented by publications at first-tier conferences and journals (e.g., NeurIPS