-
advances the mathematical foundations, algorithms, and real-world applications of epistemic uncertainty in machine learning, with a strong focus on imprecise probabilities, uncertainty representation and
-
Responsibilities: Development of new machine learning modeling approaches; development of new advanced control and optimization algorithms; optimization of carbon capture process operation; provide regular project
-
researcher in natural language processing and large language models to work with a team from multiple disciplines of machine learning and artificial intelligence to develop multimodal large language models
-
at the intersection of mitochondrial biology, functional genomics, and machine learning. This interdisciplinary initiative focuses on discovering, decoding and engineering mitochondrial microproteins (mito-MPs) with
-
on emerging privacy-preserving techniques such as homomorphic encryption, secure multi-party computation, and federated learning. Key Responsibilities: Conduct advanced research in the areas of privacy-preserving
-
, and innovators to thrive in the digital age. Located in the heart of Asia, NTU's College of Computing and Data Science is an 'exciting place to learn and grow'. We welcome you to join our community
-
requirements: PhD Degree in Computer Science, Artificial Intelligence, Machine Learning, Data Science, or a related field, obtained within the last five years. Research experience in one or more of the following
-
computer programming to verify the efficiency of the designed solution algorithms; analyze data acquired from the field survey; develop machine learning models for prediction and recommendation. Job
-
programming languages such as C and Python; proficiency in deep learning frameworks such as PyTorch and TensorFlow; knowledge of imaging and computing devices and equipment; good written and oral
-
Responsibilities: Conduct programming and software development for data management. Design and implement machine learning models for optimizing data management. Conduct experiments and evaluations of the designed