-
and simple embedding alignment to develop architectures that can process and reason across modalities (vision, language, audio, sensor data) from the ground up. How do we build a truly unified
-
algorithms, like multi-camera approaches or sensor fusion. You will implement currently available methods in this area, evaluate their performance and systematically study their dependability. Onsite
-
of Biomedical Engineering, please visit https://www.bme.ubc.ca/ . HuMBL, a research lab within the SBME, specializes in wearable sensor technologies and algorithm-driven health analytics to enhance
-
multi-sensor SLAM systems for firefighting robots, optimizing for speed, memory, and power. ROS2, Jetson platform experience, and deep understanding of SLAM backends are essential. Key Responsibilities
-
The Research Engineer’s role is to work on research projects that involve developing metacognitive algorithms and integrating multiple sensors for autonomous mobile robots. Key Responsibilities
-
with the latest sensors (camera and LiDAR), is available for the work. What you will do – development of algorithms for 3D multi-object tracking based on heterogeneous sensor data fusion
-
to conduct and lead research on sensor array and radar processing, in particular the detection, localization and tracking of active and passive targets in a cellular network under harsh urban environments
-
transmission methods (wired or wireless) will be optimised for robust data capture in natural sleep environments. AI-Driven Analysis: Develop advanced AI algorithms to analyse the collected sensor data, aiming
-
of fingers, the shapes of the fingers, and the positions of tactile sensors), and the control policy for that hand, when given a particular task or set of tasks. Through this, we aim to develop a framework
-
collaboration between the OU and Teledyne e2v (T-e2v), a world-leading manufacturer of scientific and industrial image sensors. The CEI is dedicated to conducting research into advanced imaging technologies