-
, validation, deployment, monitoring, and (preferably) MLOps practices. You can develop and integrate AI models into operational systems, including: user‑facing applications (computer vision, sensor fusion
-
the complementarity of both types. Specifically, you will perform the following tasks: you will set up a simulation framework for both conventional photon-counting and hyperspectral X-ray detectors with high-Z sensors
-
Engineering or a related field, with a focus on Robotic Manipulation. Experience setting up and operating ROS-compatible robotic manipulation systems: robotic arms, cameras, F/T sensors, sensor calibration
-
variety of fields including optics, sensors, lasers, materials, cryogenic cooling, vibration damping, vacuum and controls. The ETpathfinder Smart Skills lab is a new initiative with the aim of providing
-
, or autonomous driving applications Perception and Sensor Fusion: Strong background in processing and fusing data from cameras, LiDAR, radar, or other sensors for robust autonomous system perception ML Engineering
-
the Computer Vision Lab, equipped with state-of-the-art visual sensors and acquisition systems, along with the Zero-G Lab, a unique facility designed to emulate proximity operations under space-like conditions