- …experiences will enrich and strengthen our organization to apply. Learn more: Our benefits, where we prioritize your well-being and success to enhance every aspect of your life. Being a part of the University…
- …limited to: vacuuming, mopping, and waxing floors; emptying trash receptacles; washing lavatories and fixtures within; dusting and polishing wood and metal fixtures. Learn more: Our benefits, where we…
- …training opportunities and access to recreational resources. Here, people from all backgrounds and cultures challenge and inspire each other to discover, learn and succeed. We focus on creating and…
- …knowledge of unmanned aerial vehicles, microwave and mmWave, software development, and digital twins. Solid knowledge of Artificial Intelligence/Machine Learning. Proficient in C, C++, Python, and MATLAB…
- …Bioinformatics and/or Cheminformatics. Experience working in computational drug discovery, particularly multiscale applications. Experience working in Python and Bash. Understanding of machine/deep learning topics…
- …collaborative multidisciplinary team involving medical, research and behavioral medicine care; willingness to learn and develop additional computer skills related to data management and preparation of data…
- …track record of exceptional time and attendance. Candidate must have solid computer skills and the ability to learn and properly use software. Minimum Qualifications, Non-competitive: one year of trade…
- …media and solution preparation, as well as the disposal of hazardous and biological waste. You will also assist other lab members with various tasks as needed. Learn more: Our benefits, where we…
- …graduate-level trainees and student assistants. Perform clerical duties, assist with daily office operations, and train student assistants. The successful candidate will have computer skills, the ability…
- …(including video analysis and 3D reconstruction), machine learning (including big data analytics and adversarial machine learning), natural language processing (audio-visual multimodal understanding…