-
develop AI- and deep learning–based computer vision tools to automatically identify and quantify intertidal organisms. Beyond computer vision, it will leverage machine learning for large-scale, data-driven
-
of real-time adaptive 3D inspection, dynamically adjusting its measurement strategy based on data quality as well as environmental and scene cues. Positioned at the intersection of robotics, computer vision
-
manufacturing sectors, from SMEs to large global manufacturers. For details, visit the MTC website. Entry requirements: A 1st or high 2:1 degree in computer science, manufacturing/industrial engineering, data
-
computational lung modelling. Working with our proprietary respiratory system model, you will generate large-scale, high-quality virtual airflow and acoustic datasets. You will then develop and train state
-
are looking for an enthusiastic, self-motivated candidate with a 1st or high 2:1 degree in robotics, computer science, physics, mechanical engineering, electrical and electronic engineering, or a related field
-
applications should be made online. Under Campus, please select Loughborough and select the programme 'Ph.D. Sport, Exercise and Health Sciences'. Please quote the advertised reference number SSEHS/JK26 in your
-
This PhD project focuses on advancing computer vision and edge-AI technology for real-time marine monitoring. In collaboration with CEFAS (the Centre for Environment, Fisheries and Aquaculture Science
-
(computer vision technologies). The interdisciplinary nature of this PhD will require the integration of environmental science, engineering, and community science methodologies. Supervisors: Primary
-
outlining how you would approach the project, and an up-to-date CV. Under programme name, please select 'Architecture, Building and Civil Engineering'. Please quote reference RAINDROP-CH. Only applicants with
-
, the project accelerates trait data acquisition by applying computer vision to herbarium specimens and field photos, as well as large language models to extract complementary information from literature and