I am currently a 4th-year Master of Engineering student in Discrete Mathematics at the University of Warwick. I particularly enjoy applying my knowledge of algorithms and mathematics to deep and fascinating problems, creating something new and unexplored, and collaborating with a team to discover something amazing!
I have broad interests across all of:
I am currently looking for a PhD position within these interests, so if you are interested in supervising such a project, please email me.
") does not match the recommended repository name for your site ("
").
", so that your site can be accessed directly at "http://
".
However, if the current repository name is intended, you can ignore this message by removing "{% include widgets/debug_repo_name.html %}
" in index.html
.
",
which does not match the baseurl
("
") configured in _config.yml
.
baseurl
in _config.yml
to "
".
Master of Engineering, Discrete Mathematics
1st Year: 85.2%
2nd Year: 88.1% (awarded prize for highest result)
3rd Year: 88.3% (awarded for project)
All are high firsts (>70% is a first).
| Subject          | Grade |
| ---------------- | ----- |
| Computer Science | A*    |
| Maths            | A*    |
| Further Maths    | A*    |
| Physics          | A*    |
Jonathan Auton, Ranko Lazic, Matthias Englert
URSS Showcase 2024
Artificial neural networks are a type of self-learning computer algorithm that has become central to the development of modern AI systems. The most widely used learning technique is gradient descent, a simple yet effective algorithm that iteratively improves a network by repeatedly adjusting it in a direction of increasing accuracy. However, recent findings suggest that the step size cannot be made small enough to avoid the effects of iterative instability, so the learning process tends to become chaotic and unpredictable. What is fascinating is that, despite this chaos, gradient descent still finds effective solutions. My project seeks to understand the underlying mechanisms of this paradoxically effective chaotic behaviour.
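The step-size instability described above can be illustrated with a minimal sketch (not code from the project itself): gradient descent on the toy loss f(x) = x², where the iteration is provably stable only for step sizes below 1. The function names and parameters here are illustrative assumptions.

```python
def gradient_descent(grad, x0, step, iters):
    """Minimal gradient descent: repeatedly move against the gradient."""
    x = x0
    history = [x]
    for _ in range(iters):
        x = x - step * grad(x)
        history.append(x)
    return history

# Toy loss f(x) = x^2 with gradient 2x.
# The update is x -> (1 - 2*step) * x, so it is stable iff step < 1.
stable = gradient_descent(lambda x: 2 * x, 1.0, 0.4, 50)    # contracts each step
unstable = gradient_descent(lambda x: 2 * x, 1.0, 1.1, 50)  # oscillates and diverges
```

On a real, non-convex neural-network loss the picture is far richer than this one-dimensional divergence, which is precisely what the project investigates.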
Jonathan Auton, Marcin Jurdzinski
Master's Dissertation 2024
A Java software application for rendering a 4D representation of the Mandelbrot set. The project uses novel 4D projection techniques to visualise and navigate the space efficiently, and generalises well to alternative formulae for fractal generation.
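For readers unfamiliar with the underlying mathematics: the Mandelbrot iteration z → z² + c naturally spans a 4D space when both the starting point z₀ and the parameter c are allowed to vary (two real dimensions each). The sketch below shows only the standard escape-time test, in Python rather than the project's Java, and says nothing about the project's actual projection techniques; the `z0` parameter is an illustrative assumption.

```python
def escape_time(c, z0=0j, max_iter=100, bailout=2.0):
    """Standard Mandelbrot escape-time test: iterate z -> z^2 + c from z0
    and return the iteration count at which |z| first exceeds the bailout,
    or max_iter if the point appears to stay bounded (i.e. is in the set)."""
    z = z0
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > bailout:
            return n
    return max_iter

# Varying the real and imaginary parts of both z0 and c gives the
# four dimensions of the combined Mandelbrot/Julia parameter space.
```

For example, `escape_time(0j)` returns `max_iter` because the origin never escapes, while `escape_time(2 + 0j)` escapes after a single iteration.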