or withdrawal by the selected candidate, the next candidate on the final classification list will be notified. XV - The full competition process can be consulted by candidates, subject to prior scheduling, by
This is an exciting PhD opportunity to develop innovative AI and computer vision tools to automate the identification and monitoring of UK pollinators from images and videos. Working at
communication of the provisional lists of the results in the different phases of the selection process (Admission and Exclusion, Curricular Evaluation, and Interview), the candidates have 10 working days
The short-listing process for studentship applications will begin on 14 November 2025.
into meaningful information. This data includes measurements, images, text, recipes, and more. The project addresses the real-world challenge of extracting value from semi-structured textual data, a
Instituto de Investigação e Inovação em Saúde da Universidade do Porto (i3S) | Portugal | about 2 months ago
the reference COMPETE2030-FEDER-00683900, Operation No. 15733, funded by COMPETE2030 and national funds (FCT) under Call for Applications no. MPr-2023-12. Scientific Area: Neuroscience and Neuroengineering 1
about working at NTNU and the application process here. About the position The PhD research project is part of the new Norwegian Maritime AI Centre, led by NTNU. Digitalization and AI are transforming
At NTNU, 9,000 employees and 43,000 students work to create knowledge for a better world. You will find more information about working at NTNU and the application process here.
the design and testing of a prototype, this project will offer a robust theoretical and practical model for museums to address difficult pasts more effectively and resonantly in the digital age. This PhD
, the project accelerates trait data acquisition by applying computer vision to herbarium specimens and field photos, as well as large language models to extract complementary information from literature and