no code implementations • 29 May 2023 • Leonard Tang, Gavin Uberti, Tom Shlomi
We consider the emerging problem of identifying the presence and use of watermarking schemes in widely used, publicly hosted, closed-source large language models (LLMs).
no code implementations • 9 Mar 2023 • Leonard Tang, Tom Shlomi, Alexander Cai
In recent years, knowledge distillation has become a cornerstone of efficient machine learning deployment, with labs and industry using it to train inexpensive, resource-optimized models.
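As background for the abstract above, the standard knowledge-distillation objective (in the style of Hinton et al.) matches a student's temperature-softened output distribution to a teacher's. The sketch below is a generic, self-contained illustration, not the specific method of this paper; all function names are hypothetical.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) between temperature-softened distributions,
    scaled by T^2 as in the standard distillation formulation."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Identical logits give zero loss; mismatched logits give a positive loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))      # 0.0
print(distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0]) > 0)  # True
```

A higher temperature flattens the teacher's distribution, exposing the relative probabilities of incorrect classes ("dark knowledge") that the student can learn from.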