My main focus is on non-standard neural machines and methods, and on hybrid systems that combine those non-standard approaches with SOTA AI models.
The December 2024 version of my resume: resume-2024.pdf.
The December 2024 version of my list of publications: publications-2024.pdf.
I am focusing on dataflow matrix machines, a highly expressive flavor of programmable neural networks (see links on page 3 of this resume).
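Here is a minimal sketch of the DMM "two-stroke engine" (my own illustration, not code from the linked materials; the neuron count, activation functions, and shapes are illustrative assumptions, not fixed by the architecture). On the "up movement" each neuron applies its built-in transformation to its input; on the "down movement" the network matrix recombines neuron outputs into the next round of inputs:

    import jax.numpy as jnp
    from jax import nn

    def up_movement(x):
        # "Up movement": each neuron applies its own built-in transformation.
        # Illustrative choices: neuron 0 is the identity, neuron 1 is tanh,
        # neuron 2 is softmax over its input vector.
        return jnp.stack([x[0], jnp.tanh(x[1]), nn.softmax(x[2])])

    def down_movement(w, y):
        # "Down movement": each neuron input is a linear combination of
        # neuron outputs, with coefficients from the network matrix w.
        return w @ y

    def run(w, x, steps=10):
        for _ in range(steps):
            x = down_movement(w, up_movement(x))
        return x

    d = 4                      # dimension of the vector-like objects
    w = jnp.eye(3) * 0.5       # toy 3x3 network matrix
    print(run(w, jnp.ones((3, d))))

Because the network matrix is itself data, it can be edited or even emitted by the network on the fly; that is the source of the expressive power mentioned above.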
I am looking for an organization or a group of people that would like to take the lead in developing technologies based on dataflow matrix machines.
I am also quite open to informal research and engineering collaborations on dataflow matrix machines and related topics.
I also want to collaborate on attention-based models, including Transformers, as they are currently the cutting edge of machine learning and AI.
There are deep connections between dataflow matrix machines and attention-based models: linear combinations of high-dimensional vector-like objects are at the center of both approaches.
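As a toy illustration of that shared motif (my own gloss, with illustrative dimensions and data): single-query attention produces its output as a convex combination of the value vectors, just as a DMM computes each neuron input as a linear combination of neuron outputs:

    import jax.numpy as jnp
    from jax import nn

    def single_query_attention(q, K, V):
        scores = K @ q / jnp.sqrt(q.shape[-1])  # similarity of q to each key
        coeffs = nn.softmax(scores)             # non-negative, sums to 1
        return coeffs @ V                       # linear combination of the rows of V

    n, d = 5, 8
    q = jnp.ones(d)
    K = jnp.linspace(0.0, 1.0, n * d).reshape(n, d)  # toy keys
    V = jnp.eye(n, d)                                # toy values
    print(single_query_attention(q, K, V))

The softmax makes the combination data-dependent, which is exactly where the two approaches meet.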
The aim of dataflow matrix machines is to achieve convergence between programs and models, cf. "The future of deep learning" by François Chollet.
Modern differentiable programming frameworks such as Zygote.jl (Julia Flux) and JAX have sufficient expressive power to handle the full flexibility of dataflow matrix machines.
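For example, here is a hedged sketch (the loss, dimensions, and number of steps are illustrative assumptions, not taken from the white paper) showing JAX differentiating end-to-end through a DMM-style loop in which the network matrix itself is the trainable parameter:

    import jax
    import jax.numpy as jnp

    def dmm_step(w, x):
        y = jnp.tanh(x)        # "up movement": built-in neuron transformation
        return w @ y           # "down movement": recombination via the matrix

    def loss(w, x0, target, steps=5):
        x = x0
        for _ in range(steps):
            x = dmm_step(w, x)
        return jnp.sum((x - target) ** 2)

    n, d = 3, 4
    w = jnp.eye(n) * 0.5
    grads = jax.grad(loss)(w, jnp.ones((n, d)), jnp.zeros((n, d)))
    print(grads)               # gradient with respect to the network matrix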
A white paper on dataflow matrix machines: dmm-white-paper-2022.pdf.
An interdisciplinary and collaborative research agenda: dmm-collaborative-research-agenda.pdf.
I am particularly interested in various ways to use dataflow matrix machines to enhance the magic of modern Transformers.
I am also exploring avenues to contribute to AI safety research.
My Mastodon account at sigmoid.social.
Michael Bukatin
P.O. Box 391894, Cambridge, MA 02139
e-mail: bukatin@cs.brandeis.edu