... the given non-classical logic. The proof of the claim contains an algorithm for deciding whether an arbitrary formula is true or false. This proof can then be exported automatically to produce a formally ...
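As a purely illustrative aside: the excerpt describes a proof that yields a decision procedure for formula validity. The excerpt does not give the semantics of the non-classical logic in question, so the sketch below only shows the general shape of such an algorithm for classical propositional formulas; the tuple encoding and function names are assumptions made here for illustration.

```python
# Illustrative only: a brute-force truth-table decision procedure for
# classical propositional formulas. The project's logic is non-classical,
# so this sketch only shows the overall shape of such an algorithm.
from itertools import product

# Formulas are nested tuples, e.g. ("imp", ("var", "p"), ("or", ("var", "p"), ("var", "q")))
def eval_formula(f, valuation):
    """Evaluate a formula under a {variable: bool} valuation."""
    op = f[0]
    if op == "var":
        return valuation[f[1]]
    if op == "not":
        return not eval_formula(f[1], valuation)
    if op == "and":
        return eval_formula(f[1], valuation) and eval_formula(f[2], valuation)
    if op == "or":
        return eval_formula(f[1], valuation) or eval_formula(f[2], valuation)
    if op == "imp":
        return (not eval_formula(f[1], valuation)) or eval_formula(f[2], valuation)
    raise ValueError(f"unknown connective {op!r}")

def variables(f):
    """Collect the propositional variables occurring in a formula."""
    if f[0] == "var":
        return {f[1]}
    return set().union(*(variables(sub) for sub in f[1:]))

def is_valid(f):
    """Decide validity by checking every valuation (exponential, but total)."""
    vs = sorted(variables(f))
    return all(eval_formula(f, dict(zip(vs, bits)))
               for bits in product([False, True], repeat=len(vs)))

# Example: p -> (p or q) is valid, so this prints True.
print(is_valid(("imp", ("var", "p"), ("or", ("var", "p"), ("var", "q")))))
```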
-
This project aims to employ advanced machine learning techniques to analyse text, audio, images, and videos for signs of harmful behaviour. Natural language processing algorithms are utilised ...
-
... monthly stipends. This research direction will involve close collaboration with Prof Lin Chen of China. The supervisor is currently engaged in a joint research project with Prof Lin Chen that will fund ...
-
" (with Prof Kris Helmerson) "High-bandwidth continuous magnetic sensing of an ensemble of electric spins" (with Prof Kris Helmerson) "Developing a spatially sensitive optical magnetometer catheter probe
-
... at one time. In non-stationary environments, on the other hand, the same algorithms cannot be applied, as the underlying data distributions change constantly and the same models are no longer valid. Hence, we need ...
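One common way to cope with the non-stationarity described above is to learn incrementally and rebuild the model when the recent error rate jumps. The sketch below illustrates that idea only; the window size, drift threshold, and the choice of scikit-learn's SGDClassifier are assumptions made for illustration, not the project's approach.

```python
# A minimal sketch (not the project's method): incremental learning on a
# data stream, with a crude drift check that resets the model when the
# recent error rate jumps well above its initial level.
import numpy as np
from collections import deque
from sklearn.linear_model import SGDClassifier

def stream_learn(stream, classes, window=200, drift_factor=2.0):
    """Consume (x, y) pairs; restart from a fresh model when drift is suspected."""
    model = SGDClassifier()
    fitted = False
    recent_errors = deque(maxlen=window)
    baseline = None
    for x, y in stream:
        x = np.asarray(x).reshape(1, -1)
        if fitted:
            # Prequential evaluation: test on the sample before training on it.
            recent_errors.append(float(model.predict(x)[0] != y))
        model.partial_fit(x, [y], classes=classes)
        fitted = True
        if len(recent_errors) == window:
            rate = sum(recent_errors) / window
            if baseline is None:
                baseline = rate
            elif rate > drift_factor * max(baseline, 1e-3):
                # Distribution appears to have shifted: start a fresh model.
                model = SGDClassifier()
                fitted = False
                recent_errors.clear()
                baseline = None
    return model
```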
-
... with leading researchers in glass science/engineering and diffraction physics/crystallography in Australia and around the world. "Local structure and symmetry in metallic glasses" (with Assoc Prof Scott ...
-
Existing deep-learning-based time series classification (TSC) algorithms have had some success on multivariate time series, but their accuracy is not high when applied to brain EEG time series (65 ...
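For context, a typical deep TSC baseline of the kind the excerpt refers to is a small 1D convolutional network over the channels of a multivariate series. The PyTorch sketch below is only an illustration; the layer sizes, channel count, and number of classes are assumptions, not the models evaluated in the project.

```python
# A minimal 1D-CNN baseline for multivariate time series classification,
# sketched in PyTorch. Channel counts, kernel sizes and the number of
# classes are illustrative assumptions.
import torch
import torch.nn as nn

class TinyTSC(nn.Module):
    def __init__(self, n_channels: int, n_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=7, padding=3),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # global average pooling over time
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):              # x: (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

# Example: a batch of 8 EEG-like recordings, 32 channels, 256 time steps.
model = TinyTSC(n_channels=32, n_classes=2)
logits = model(torch.randn(8, 32, 256))
print(logits.shape)                    # torch.Size([8, 2])
```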
-
... challenging data problem. Weak signals from collisions of compact objects can be dug out of noisy time series because we understand what the signal should look like, and can therefore use simple algorithms ...
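In gravitational-wave searches of this kind, the usual such algorithm is matched filtering: cross-correlating the noisy series with a template of the expected signal and looking for a peak. The NumPy sketch below illustrates the idea on synthetic data; the template, noise level, and injected signal are made up for the example and do not represent the project's pipeline.

```python
# Illustrative only: a basic matched filter, which slides a known template
# over a noisy time series and reports where the correlation peaks.
import numpy as np

def matched_filter(data, template):
    """Slide a unit-norm copy of the template over the data (cross-correlation)."""
    t = template / (np.linalg.norm(template) + 1e-12)
    return np.correlate(data, t, mode="valid")

rng = np.random.default_rng(0)
template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 200)) * np.hanning(200)
data = rng.normal(0.0, 1.0, 5000)
data[3000:3200] += template            # bury a sub-noise copy of the signal

response = matched_filter(data, template)
print("best match near index", int(np.argmax(np.abs(response))))  # ~3000
```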
-
... short- (Illumina) and long-read sequencing (Oxford Nanopore), data mining of electronic medical records, and the use of machine learning to predict several outcomes. Assoc Prof David Dowe will be the primary ...
-
...-readable representations, such as distributed representations of text augmented with random noise [1] or unnatural text curated by replacing sensitive tokens with random non-sensitive ones [2]. First, such ...
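To make the two strategies mentioned above concrete, the sketch below shows (a) perturbing a text embedding with random noise and (b) replacing sensitive tokens with randomly chosen non-sensitive ones. It is illustrative only: the embedding size, noise scale, and token lists are assumptions and do not reproduce the methods of [1] or [2].

```python
# A minimal sketch of the two ideas mentioned in the excerpt: noising an
# embedding, and swapping sensitive tokens for random non-sensitive ones.
import numpy as np

rng = np.random.default_rng(42)

def noisy_embedding(vec, scale=0.1):
    """Return the embedding with isotropic Gaussian noise added."""
    vec = np.asarray(vec, dtype=float)
    return vec + rng.normal(0.0, scale, size=vec.shape)

def redact_tokens(tokens, sensitive, replacement_pool):
    """Swap any sensitive token for a randomly chosen non-sensitive one."""
    return [str(rng.choice(replacement_pool)) if tok in sensitive else tok
            for tok in tokens]

embedding = rng.normal(size=8)                      # stand-in for a text embedding
print(noisy_embedding(embedding, scale=0.05))

tokens = ["alice", "visited", "the", "clinic", "on", "monday"]
print(redact_tokens(tokens, {"alice", "clinic"}, ["person", "place", "entity"]))
```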