- … Our approach combines advanced nanofabrication with phase-engineered optical excitation to generate and visualize elastic waves with high spatial and temporal resolution. By designing and fabricating …
- … (optical fibers and electrodes); conducting behavioral tests: visual and auditory discrimination tasks (go/no-go or two-alternative paradigms); in vivo extracellular electrophysiological …
- … the preparation of technology transfer and valorization steps. • Software development (Python, PsychoPy, real-time analysis and visualization tools). • Integration of acquisition modules (EEG, fMRI …
- … in Python, PyTorch/TensorFlow, and medical visualization tools (e.g., 3D Slicer, ITK-SNAP, MONAI, …). - Mastery of deep learning platforms: TensorFlow/PyTorch/scikit-learn. - English: high level …
- … writing) is mandatory. 4. Strong ability in data analysis and visualization is required. 5. Knowledge of French or Chinese is an asset. • International Experience: Prior research experience abroad is highly …
- … and medium-scale computing environments (Linux, job schedulers, parallel computing). • Familiarity with data analysis, visualization, and handling large scientific datasets. • Ability to run, adapt …
- … analysis and visualization, signal processing, and ideally machine learning. • Working knowledge of Distributed Acoustic Sensing (DAS) and its applications in seismology (appreciated). • Aptitude …
- … augmented reality applications for history education and heritage visualization. Multimodal Technologies and Interaction, 3(2), 39. Endacott, J., & Brooks, S. (2013). An updated theoretical and practical …