-
learning, focusing on identifying abrupt shifts in the properties of data over time. These shifts, commonly referred to as change-points, indicate transitions in the underlying distribution or dynamics of a
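The idea of a change-point can be made concrete with a minimal sketch: a brute-force least-squares detector that finds the index where a series' mean shifts. This is a generic illustration of the concept, not the method of any listed project; the function name and data are hypothetical.

```python
def best_split(xs):
    """Find the index that best splits xs into two segments with
    different means, by minimizing the total squared error of a
    piecewise-constant fit with a single change-point."""
    best_idx, best_cost = None, float("inf")
    for k in range(1, len(xs)):
        left, right = xs[:k], xs[k:]
        ml = sum(left) / len(left)   # mean of the left segment
        mr = sum(right) / len(right)  # mean of the right segment
        cost = (sum((x - ml) ** 2 for x in left)
                + sum((x - mr) ** 2 for x in right))
        if cost < best_cost:
            best_idx, best_cost = k, cost
    return best_idx

# A series whose mean jumps from roughly 0 to roughly 5 at index 6
series = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2, 5.1, 4.9, 5.2, 5.0, 4.8, 5.1]
print(best_split(series))  # → 6
```

Practical change-point methods (e.g. binary segmentation, PELT) apply this kind of cost comparison recursively or with pruning to handle multiple change-points efficiently.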
-
research work will be to devise efficient algorithms for source separation in DAS measurements. Issues such as large data volumes that can exceed 1 TB per day and per fiber, instrument noise, and the complex nature
-
huge data storage capacity to accelerate the research performed in intensive computing and large-scale data analytics (Big Data). This characteristic distinguishes the HPC center at the university from
-
transcriptomics) opens new research horizons on cancer pathologies. These high-dimensional, high-volume data raise new methodological challenges in terms of statistical and mathematical analysis, as
-
through text, images, networks, … A similar situation can be encountered in the context of medical data, where the data types may be even more varied. It is therefore of strong interest to be able to analyze
-
to perceive their environment because this sensor can produce precise depth measurements at high density. LiDAR measurements are, however, generally sparse, mainly geometric, and lack semantic information. Therefore
-
mouse models (FELASA or equivalent certification); Ability to handle and analyze large-scale omics data; Experience with gene cloning. Soft Skills: Strong sense of initiative, rigor, and motivation
-
], which states that random neural networks can be pruned to approximate a large class of functions without changing the initial weights. We are also interested in Neural Combinatorial Optimization, where we
-
this project we will draw analogies between memory dynamics in rodent brains and challenges in machine unlearning, particularly in foundation models such as Large Language Models. Using experimental data from