I am currently a DPhil student in Autonomous Intelligent Machines and Systems (AIMS CDT) at the University of Oxford.
Through AIMS, I aim to contribute to the advancement of AI techniques that can be applied practically to real-world problems, with a genuine capacity to benefit lives. I am particularly excited to apply my knowledge to create better, safer, and more effective AI tools.
Before starting my DPhil, I completed a Master of Engineering in Discrete Mathematics at the University of Warwick, graduating at the top of my cohort with First Class Honours. During my degree, I completed a third-year project on an efficient 4D rendering algorithm for the Mandelbrot set, which won the department's outstanding project prize. I also undertook a summer research project on the theoretical foundations of machine learning, where I investigated the iterative instability of gradient descent.
In my free time, I enjoy playing a variety of tabletop games, learning new things, and turning the curiosities I learn about into software projects.
Master of Engineering, Discrete Mathematics
1st Year: 85.2%
2nd Year: 88.1% (awarded)
3rd Year: 88.3% (awarded)
4th Year: 80.2% (awarded)
All four years are a high first (a first is >70%).
| Computer Science: | A* |
| Maths: | A* |
| Further Maths: | A* |
| Physics: | A* |

Jonathan Auton, Ranko Lazic, Matthias Englert
URSS Showcase 2024
Artificial neural networks are a type of self-learning computer algorithm that has become central to the development of modern AI systems. The most widely used learning technique is gradient descent, a simple yet effective algorithm that improves a network by repeatedly adjusting its parameters in a direction of increasing accuracy. However, recent findings suggest the step size cannot be made small enough to avoid the effects of iterative instability. As a result, the learning process tends to become chaotic and unpredictable. What is fascinating is that, despite this chaotic nature, gradient descent still finds effective solutions. My project seeks to understand the mechanisms underlying this paradoxically effective chaos.
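The instability the abstract describes can be seen even in one dimension. Below is a toy illustration (not code from the project, and the step sizes are arbitrary choices for demonstration): on the quadratic f(x) = x², gradient descent contracts toward the minimum when the step size is small, but once the step size crosses a threshold the iterates oscillate and grow without bound.

```python
def gradient_descent(grad, x0, lr, steps):
    """Run plain gradient descent: x <- x - lr * grad(x), repeated."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# f(x) = x^2 has gradient 2x, so each step maps x to (1 - 2*lr) * x.
# The iteration is stable iff |1 - 2*lr| < 1, i.e. lr < 1.
grad = lambda x: 2 * x
stable = gradient_descent(grad, 1.0, 0.4, 50)    # |1 - 0.8| = 0.2: contracts to ~0
unstable = gradient_descent(grad, 1.0, 1.1, 50)  # |1 - 2.2| = 1.2: blows up
```

For a neural network the loss surface is far more complicated, and the threshold itself changes as training proceeds, which is what makes the chaotic regime hard to analyse.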

Jonathan Auton, Marcin Jurdzinski
Master's Dissertation 2024
A Java software application for rendering a 4D representation of the Mandelbrot set. The project uses novel 4D projection techniques to efficiently visualise and navigate the space, and generalises well to alternative formulae for fractal generation.
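At the core of any Mandelbrot renderer is the classic escape-time iteration: z is repeatedly mapped to z² + c, and the point c is coloured by how quickly |z| escapes past 2. The sketch below shows that 2D core only (the dissertation itself is in Java, and its 4D projection techniques are beyond this snippet):

```python
def mandelbrot_escape(c, max_iter=100):
    """Count iterations of z -> z^2 + c before |z| exceeds 2.

    Returns max_iter if the orbit stays bounded, i.e. c appears
    to lie inside the Mandelbrot set.
    """
    z = 0
    for n in range(max_iter):
        if abs(z) > 2:
            return n
        z = z * z + c
    return max_iter

inside = mandelbrot_escape(complex(-1, 0))  # period-2 orbit, never escapes
outside = mandelbrot_escape(complex(2, 0))  # escapes almost immediately
```

The "alternative formulae" mentioned above correspond to swapping out the z² + c update rule, which is why the escape-time structure generalises so readily.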