213 parallel and distributed computing positions at California Institute of Technology
- …performs initial review, preparation, and distribution of all meeting materials, including meeting agendas, minutes, SOPs, and Committee communications with investigators. Maintains IRB and HESC records…
- …overarching communications strategy. Implement and innovate on current and future best practices and workflows associated with multimedia production, post-production, and distribution. Other duties as assigned…
- …study of natural isotope distributions. Coordinate experiments to study, investigate, test, and/or resolve scientific problems. Partner with research groups across disciplines (i.e., geochemistry, geology…)
- …distribution of all meeting materials, including meeting agendas, minutes, SOPs, and Committee communications with investigators. Maintains IRB and HESC records, monitors and administers IRB and HESC email…
- …Computer Science, Computer Engineering, or a related field. At least 3 years of related experience, with a strong understanding of software engineering principles and best practices (e.g., design patterns, object…)
- …out research on distributed acoustic sensing (DAS) data analysis and participate in continued development on the inclusion of DAS data within the network monitoring systems. We seek applicants who like…
- …and distribution. Other duties as assigned. Basic Qualifications: Bachelor's degree in film, journalism, or communications discipline. 3 or more years of experience in video storytelling and multimedia…
- …technologies. Essential Job Duties: Collaborate with Principal Investigator(s) to determine direction of research projects in the fundamental and applied study of natural isotope distributions. Coordinate…
- …promote, and protect Caltech’s image, reputation, and brand. We create and distribute a wide range of content across multiple formats and platforms to tell the story of Caltech’s people, their research, and…
- …batch-oriented workflows with Apache Airflow or other task schedulers, using a Kubernetes cluster or other distributed computer cluster. Experience with developing software code using Python and/or C…