Search Results for author: Victoria Dean

Found 4 papers, 3 papers with code

Hearing Touch: Audio-Visual Pretraining for Contact-Rich Manipulation

no code implementations • 14 May 2024 • Jared Mejia, Victoria Dean, Tess Hellebrekers, Abhinav Gupta

Although pretraining on a large amount of data is beneficial for robot learning, current paradigms perform large-scale pretraining only for visual representations, whereas representations for other modalities are trained from scratch.

Train Offline, Test Online: A Real Robot Learning Benchmark

1 code implementation • 1 Jun 2023 • Gaoyue Zhou, Victoria Dean, Mohan Kumar Srirama, Aravind Rajeswaran, Jyothish Pari, Kyle Hatch, Aryan Jain, Tianhe Yu, Pieter Abbeel, Lerrel Pinto, Chelsea Finn, Abhinav Gupta

Three challenges limit the progress of robot learning research: robots are expensive (few labs can participate), everyone uses different robots (findings do not generalize across labs), and we lack internet-scale robotics data.

Interesting Object, Curious Agent: Learning Task-Agnostic Exploration

1 code implementation • NeurIPS 2021 • Simone Parisi, Victoria Dean, Deepak Pathak, Abhinav Gupta

In this setup, the agent first learns to explore across many environments without any extrinsic goal in a task-agnostic manner.
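To make the setup concrete, below is a minimal sketch of task-agnostic exploration driven by an intrinsic reward, in the spirit of curiosity-based methods. It is not the algorithm from this paper; the prediction-error reward, the forward model, the environment (MountainCar-v0), and the random exploration policy are illustrative assumptions only, showing how an agent can collect experience with the extrinsic reward ignored entirely.

```python
# Sketch of intrinsic-reward exploration (assumed setup, not the paper's method).
# The agent is "rewarded" for reaching states its forward model predicts poorly,
# with no extrinsic goal used at any point.
import gymnasium as gym
import torch
import torch.nn as nn

env = gym.make("MountainCar-v0")          # placeholder environment
obs_dim = env.observation_space.shape[0]
n_actions = env.action_space.n

# Forward dynamics model: predicts the next observation from (obs, action).
forward_model = nn.Sequential(
    nn.Linear(obs_dim + n_actions, 64),
    nn.ReLU(),
    nn.Linear(64, obs_dim),
)
optimizer = torch.optim.Adam(forward_model.parameters(), lr=1e-3)

obs, _ = env.reset(seed=0)
for step in range(1000):
    action = env.action_space.sample()    # stand-in for the exploration policy
    next_obs, _extrinsic, terminated, truncated, _ = env.step(action)

    # One-hot encode the action and predict the next observation.
    a = torch.zeros(n_actions)
    a[action] = 1.0
    pred = forward_model(torch.cat([torch.tensor(obs, dtype=torch.float32), a]))
    target = torch.tensor(next_obs, dtype=torch.float32)

    # Intrinsic reward = prediction error; the extrinsic reward is discarded.
    loss = nn.functional.mse_loss(pred, target)
    intrinsic_reward = loss.item()

    # Train the forward model; a full agent would also update its policy
    # to maximize intrinsic_reward.
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    obs = next_obs
    if terminated or truncated:
        obs, _ = env.reset()
```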

