- …basic programming skills (e.g., MATLAB, Python) for data handling and sensor signal analysis. Enthusiasm for interdisciplinary research and the ability to support multiple concurrent projects. The terms…
- …environment. Develop a robot localization approach leveraging data from multiple sensors, including but not limited to cameras and lidars. Develop an efficient approach for multi-robot trajectory planning and tracking…
- …international research groups, and with national and regional industry partners. The research programme includes the research themes of software engineering, engineering computing, sensor networks and measurement…
- …archiving of environmental data collected at a meso-network of biometeorological field sites across California for the Oikawa Laboratory at the Cal State East Bay campus. This role is responsible for the day…
- …remains a critical challenge. This project will focus on designing AI-driven cognitive navigation solutions that can adaptively fuse multiple sensor sources under uncertainty, enabling safe and efficient…
- …interconnected computing nodes, actuators and sensors, communicating over networks, to achieve complex functionalities, at both slow and fast timeframes, and at different safety criticalities. Future connectivity…
- …Virtual Training Environment (VTE) for disaster response simulation, integration of Building Information Modelling (BIM) with Structural Health Monitoring (SHM) using smart sensor networks, and resilience…
- …health records (EHR), waveforms from bedside monitors, radiology images and wearable sensors. This position offers a unique opportunity to work closely with clinicians on applications of machine learning…
- …to monitor oceanic CO2 uptake with improved confidence. Any future observational network utilises a range of instrument/sensor technologies, deployed on different platforms, and measuring multiple variables…
- …to reach level-4/5 autonomy, we need teamwork: nearby vehicles, drones, and roadside units must co-perceive their environment, sharing and fusing complementary sensor views in real time. Yet raw video…