no code implementations • 18 Apr 2024 • Alex Sheng
Rather than manually developing static software to augment LLMs through human engineering effort, we propose that an LLM agent can systematically generate software to augment itself.
1 code implementation • DeepLo 2022 • Xiang Pan, Alex Sheng, David Shimshoni, Aditya Singhal, Sara Rosenthal, Avirup Sil
Pretrained language models have shown success in various areas of natural language processing, including reading comprehension tasks.
no code implementations • 30 Apr 2022 • Alex Sheng, Shankar Padmanabhan
Prior work in meta-learning and neural architecture search has led to substantial successes across various task domains, spawning myriad approaches for algorithmically optimizing the design and learning dynamics of deep learning models.
no code implementations • 1 Jan 2022 • Alex Sheng, Derek He
Our contributions are twofold: we provide the first assessment of evolutionary meta-learning in a supervised setting, and create a general framework for distributed evolution strategies on TPUs.
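The abstract above mentions a general framework for distributed evolution strategies on TPUs. As a rough single-process illustration of the underlying technique (not the paper's actual framework or its distributed TPU implementation), here is a minimal sketch of an OpenAI-style evolution strategies update loop; the function name, hyperparameters, and toy objective are all hypothetical:

```python
import numpy as np

def evolution_strategies(fitness, theta, sigma=0.1, lr=0.03, pop=50, iters=200, seed=0):
    """Minimal evolution strategies loop (illustrative sketch only).

    fitness: maps a parameter vector to a scalar reward (higher is better).
    theta:   initial parameter vector (np.ndarray).
    """
    rng = np.random.default_rng(seed)
    for _ in range(iters):
        # One Gaussian perturbation per (hypothetical) worker in the population.
        noise = rng.standard_normal((pop, theta.size))
        rewards = np.array([fitness(theta + sigma * n) for n in noise])
        # Normalize rewards so the update is scale-invariant.
        rewards = (rewards - rewards.mean()) / (rewards.std() + 1e-8)
        # Stochastic gradient estimate: reward-weighted sum of the perturbations.
        theta = theta + lr / (pop * sigma) * noise.T @ rewards
    return theta

# Toy usage: maximize -||x - 3||^2, whose optimum is x = [3, 3].
best = evolution_strategies(lambda x: -np.sum((x - 3.0) ** 2), np.zeros(2))
```

In a distributed setting, the reward evaluations inside the loop are the part that would be sharded across accelerator cores, with workers exchanging only scalar rewards and shared random seeds rather than full parameter vectors.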