About the Role

As a Research Engineer for Foundational Models, you will be responsible for investigating, developing, and experimenting with privacy-preserving techniques for Transformer models.

We are looking for a mid-to-senior-level Research Engineer who can build and modify transformer-based architectures from scratch (initially LLMs). This is a hands-on role for someone excited about digging into the inner workings of LLMs and experimenting with novel neural network layers.

You are passionate about the mathematical elegance that goes into building neural networks and are able to explain these ideas clearly to others.

You'll also have opportunities to occasionally work on-site with our research partners at SUPSI and IDSIA in Switzerland to accelerate our joint innovation.

What you’ll do

Required Qualifications