- …measurements, including raw data processing and post-processing. The data will contribute to one or two scientific articles. Work duties: research within the subject area; contributing to publishing results in high…
- …, imaging or spectroscopy analyses. We deal with high data rates (GB/s) and “big data” volumes. To strengthen the support we can offer to the beamlines and their users, we are looking for a Beamline Data…
- …and algorithm development as well as engineering methods that enable robust and efficient practical solutions. As society and technology evolve toward increasingly large-scale, data-intensive, and…
- …for large specimens”. Your role/tasks — main: the development of a PtyPy multibeam ptychography high-throughput data-processing framework; supporting: the design and construction of the multibeam ptychography…
- …computational and experimental mechanics of biological tissues, where experimental work largely takes place at large-scale facilities, using imaging, scattering and spectroscopy-based techniques. The research…
- …experience of supervising students, in particular students working together in large groups on molecular-biology-related experimental projects. It is an additional qualification if you have…
- …the genomic landscape of breast cancer using large-scale tumor sequencing data. You can read more about the project on our website https://portal.research.lu.se/sv/persons/staaf-johan/. The research team is…
- …bioinformatician position at Lund University! National Bioinformatics Infrastructure Sweden (NBIS; https://nbis.se) is a large, rapidly developing national infrastructure providing support, tools and training…
- …harness data collected during the entire lifecycle. In large-scale cloud-edge services, DevOps involves many tools working together (e.g., build servers, code review, program analysis) to feed back…
- …, and institutional environments, is a central research challenge. At the same time, rapid technological transformation is reshaping the conditions for research itself, making large-scale data…