Gunshi Gupta

Deep Learning Researcher

University of Oxford

OATML

Wayve

MILA

Biography

Hey! I’m in my final year as a Machine Learning D.Phil. student in the OATML group at the University of Oxford. I’m co-supervised by Prof. Yarin Gal, Prof. Tim Rudner and Dr. Adrien Gaidon.

I’m currently working on designing methods, architectures and benchmarks that enable embodied agents to learn long-horizon tasks by creating and accessing episodic memories. I am specifically focusing on transformer-based agents trained through large-scale RL. Some of the topics I have researched over the past two years include:

  • Leveraging advances in visual diffusion modeling for robotics
  • Mechanistic interpretability in transformer-based world models
  • Training tokenised visual world models, and
  • Causally-correct policy learning from imbalanced data.

I also collaborate closely with researchers from Toyota Research on topics related to robot learning.

Prior to starting my Ph.D., I worked as a deep learning researcher at Wayve, a London-based startup applying end-to-end deep learning to autonomous driving. Before that, I graduated from the Research Master’s in Machine Learning at Mila (Sept 2020), where I primarily did research on Bayesian deep learning, continual learning and inverse reinforcement learning. I was also an ED&I Fellow with the MPLS Division at the University of Oxford in the 2022–2023 cohort.

I was introduced to robotics during a year-long research internship at IIIT Hyderabad, India (2017–2018), where I worked on Multi-Robot SLAM and view-invariant place recognition and relocalisation.

Download my résumé.

Interests

  • Policy Learning for Robotics, Reinforcement Learning
  • Diffusion modeling
  • Continual Learning, Meta Learning
  • Memory-augmented models

Education

  • D.Phil. in Machine Learning (AIMS CDT), 2024

    University of Oxford

  • Research Master's in Machine Learning, 2020

    Montreal Institute for Learning Algorithms (Mila)

  • B.Tech in Maths and Computing (Applied Mathematics), 2016

    Delhi Technological University (DTU/DCE)

Experience


Deep Learning Intern (RL/IL)

Microsoft Research

Apr 2023 – Jul 2023 Cambridge
  1. Contributed to a team submission to NeurIPS titled “WHAM: World and Human Action Modelling in a Modern Xbox Game”, exploring a VQGAN-transformer-based world-and-action model trained on three years of gameplay trajectories in a high-fidelity multi-player game.
  2. Developed an evaluation suite for mechanistic interpretability of transformer representations, tracking the emergence of game-relevant concepts such as adversary locations and the player’s health resources.

Deep Learning Researcher

Wayve

Jul 2020 – Sep 2021 London
I was part of the policy learning team, which explores algorithms that learn robustly and sample-efficiently, aided by expert demonstrations.

Graduate Research Assistant

Robotics Research Center, IIITH

Feb 2017 – Apr 2018 Hyderabad

Here I:

  • Developed a Multi-Robot Visual SLAM framework for the Centre for Artificial Intelligence and Robotics (CAIR, India), tested on the Husky UGV platform
  • Published “View-Invariant Intersection Recognition from Videos using Deep Network Ensembles” at IROS 2018

Software Developer

Microsoft

Jun 2016 – Feb 2017 Hyderabad
  • Built prediction and summarisation modules for employee performance feedback using deep learning and NLP.
  • Organised workshops on ‘Machine Learning Fundamentals’ for Microsoft employees.

Computer Vision Intern

Nayi Disha Studios

Jan 2016 – Mar 2016 Hyderabad
Developed a camera-based gesture- and action-tracking system to replace Lidar in an interactive, gameplay-based primary-school education platform.

Software Developer Intern

Microsoft

Jun 2014 – Aug 2014 Hyderabad

Publications & Preprints


Contact

  • Queen Mary Road, Montreal, QC H3W1W3
  • Monday 10:00 to 13:00
    Wednesday 09:00 to 10:00
  • Book an appointment