- across different imaging devices, including future sensors with unknown spectral sensitivities. Training: The student will be based at the Colour & Imaging Lab at the School of Computing Sciences, which has
- implementing control systems for robotic arms, including vision-based control and sensor integration. Carrying out experimental validation, system calibration, and performance optimisation of robotic and
- interconnected computing nodes, actuators and sensors communicating over networks to achieve complex functionalities, at both slow and fast timeframes and at different safety criticalities. Future connectivity
- It will use signals from different sources, such as radio signals and internal sensors, to maintain robust and accurate PNT, even when satellite signals are weak or missing. A built-in intelligent
- de-identified scans, records and sensor feeds to answer questions such as: Can we predict a patient’s response to treatment without ever seeing their raw file? Can an algorithm learn the warning signs of trouble
- the ground, in different weather conditions. 4) develop methods for sensor management and data fusion linked with inference and decision making, jointly applied to several wildfire detection tasks. 5) embed
- path finding and routing algorithms, sense of direction, human-computer interaction, cognitive navigation, intelligent mobility, and artificial intelligence. Sensor fusion and Signals of Opportunity: We
- functional outcomes. In parallel, you will develop new sensors for intracellular potassium concentration, leveraging AI-based protein design algorithms. Techniques used are biomolecular NMR spectroscopy and