118 parallel-processing-bioinformatics positions at SINGAPORE INSTITUTE OF TECHNOLOGY (SIT)
- …Interaction (HCI) and social science methodologies to lead user research and policy development, as part of an interdisciplinary team investigating harmful user-generated content in virtual worlds…
- …of the award, awardees will be considered for an Assistant Professorship. Awardees will also be assigned a faculty mentor for the duration of the scheme. Application Process: applications are open throughout…
- …ensure high-quality delivery. Job Requirements: relevant competence in computer vision; a Bachelor's or Master's degree in computer science, data science, AI, or a related field…
- …available at all levels from Lecturer to Full Professor in Chemical Engineering or related disciplines. Priority skill areas include process monitoring and control, Industry 4.0, modelling, AI and ML…
- …Automation: develop pipelines and utilities to support data processing, model training, evaluation, and continuous improvement. Collaboration and Knowledge Transfer: document technical processes, contribute…
- …processes to ensure seamless operations, compliance with safety and quality standards, and effective client engagement. The ideal candidate will also contribute to continuous process improvement, training…
- …EU programme. Is the job related to a staff position within a Research Infrastructure? No. Offer Description: FACULTY POSITIONS (AT ALL LEVELS) IN COMPUTER SCIENCE, COMPUTER ENGINEERING AND INFORMATION…
- …or above in Electrical Engineering, Computer Engineering, or a related field from a recognised university. Strong programming skills in MATLAB/Simulink, Python, C/C++, or similar environments. Understanding…
- …models (classification, detection, segmentation, enhancement) into modular pipelines. Build APIs to manage model training, deployment, and inference. Automate the process of pushing trained models…