81 distributed algorithm positions at Monash University in Australia
- …software frameworks, algorithms, robust testing and validation methods, and/or empirically validated solutions that contribute directly to social good, promoting trust, fairness, transparency, and…
- …distribution across multiple HSR scenarios. You will work alongside a team of internationally renowned experts in transport and urban planning, including Associate Professor Liton Kamruzzaman, Professor Hai Vu…
- This project aims to employ advanced machine learning techniques to analyse text, audio, images, and videos for signs of harmful behaviour. Natural language processing algorithms are utilised…
- …the given non-classical logic. The proof of the claim contains an algorithm for deciding whether an arbitrary formula is true or false. This proof can then be exported automatically to produce a formally…
- …at one time. In non-stationary environments, on the other hand, the same algorithms cannot be applied, as the underlying data distributions change constantly and the models are no longer valid. Hence, we need…
- …determined by combining the observed space density of galaxies, the measured spatial distribution of galaxies, and simulations of the dark matter distribution. Example themes for student projects follow and…
- While existing deep learning-based time series classification (TSC) algorithms have had some success on multivariate time series, their accuracy is not high when applied to brain EEG time series (65…
- …challenging data problem. Weak signals from collisions of compact objects can be dug out of noisy time series because we understand what the signal should look like, and can therefore use simple algorithms…
- …indicators beyond current proxies. High-Speed Rail (HSR) and National Spatial Optimisation: examine how HSR infrastructure reshapes urban and regional population distribution, and develop a multi-objective…
- …-readable representations, such as distributed representations of text augmented with random noise [1] or unnatural text curated by replacing sensitive tokens with random non-sensitive ones [2]. First, such…