- … quantitative fields. Experience in writing production-level code in Python. Strong background in data engineering, big data management, data modeling and warehousing. Strong background in other programming …
- … projects, including large-scale surveys, intervention trials, and qualitative studies. Manage project administration and logistics, including data management and adherence to IRB procedures. Disseminate …
- … experience for transactional and analytics systems, including querying and tuning large, complex data sets and performance analysis. • Working experience in big data and DataSecOps is an added advantage …
- … services, preferably Microsoft Fabric. • Strong database working experience for transactional and analytics systems, including querying and tuning large, complex data sets and performance analysis …
- … data from major facilities, including NASA’s Kepler, TESS, and JWST missions; ESA’s Gaia mission; and large-scale spectroscopic surveys such as LAMOST. Experience working with large, complex datasets is …
- … the management of large datasets (in excess of 21,000 data points) and advanced data analyses, including Hierarchical Linear Modelling and Growth Modelling. Additional responsibilities include providing drafts …
- … and implement AI-based NLP solutions, with a focus on large language models, reasoning, and healthcare applications. Assist with data curation, dataset construction, and conducting experiments or user …
- … An applicant is required to have a Doctoral Degree in Hospitality and Tourism Management or a related field (e.g., Data Analytics, Big Data, Entrepreneurship, Asset Management, AI), preferably with extensive …
- … (SPHS) is a large-scale, population-based health research project by the Saw Swee Hock School of Public Health, National University of Singapore. We are looking for a team player with experience in making …
- … trends. 2) Analyse Energy Data: Collect, clean, and interpret large datasets from various sources. 3) Monitor and Validate Model Performance: Continuously evaluate data accuracy and refine models to improve …