Your Daily AI Research tl;dr — 2022-09-27 🧠
Today: the Cohere For AI Scholars Program, generative models of neural network checkpoints, and an interpretable, efficient predictor using pre-trained language models!
Welcome to your official daily AI research tl;dr (often with code and news) for AI professionals, where I share the most exciting papers I find each day, along with a one-liner summary to help you quickly decide whether the article (and code) is worth investigating.
Special birthday edition today! 🎉 I hope you all have a fantastic Tuesday!
1️⃣ LEARNING TO LEARN WITH GENERATIVE MODELS OF NEURAL NETWORK CHECKPOINTS
“A conditional diffusion transformer that, given an initial input parameter vector and a prompted loss, error, or return, predicts the distribution over parameter updates that achieve the desired metric. At test time, it can optimize neural networks with unseen parameters for downstream tasks in just one update.”
Link to the paper: https://arxiv.org/pdf/2209.12892.pdf
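To make the mechanism concrete, here is a minimal toy sketch of the idea in PyTorch. This is not the paper's G.pt diffusion transformer; the architecture, the interpolation-style noising, and all names (ParamDenoiser, param_dim, the random stand-in checkpoint data) are illustrative assumptions. What it shows is the conditioning: the denoiser sees the starting parameters and a prompted target metric, and predicting clean parameters directly is what lets a single forward pass act as "one update" at test time.

```python
# Minimal sketch of the core idea (NOT the paper's G.pt code): a conditional
# denoising model over flattened network parameters. All names and the toy
# data below are made up for illustration.
import torch
import torch.nn as nn

param_dim = 64  # size of the flattened parameter vector (toy scale)
hidden = 256

class ParamDenoiser(nn.Module):
    """Predicts clean 'updated' parameters from noisy ones, conditioned on
    the starting parameters and a prompted target metric (e.g., a loss)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(param_dim * 2 + 2, hidden),  # noisy + start params, t, target
            nn.SiLU(),
            nn.Linear(hidden, hidden),
            nn.SiLU(),
            nn.Linear(hidden, param_dim),
        )

    def forward(self, noisy_params, start_params, t, target_metric):
        x = torch.cat([noisy_params, start_params, t, target_metric], dim=-1)
        return self.net(x)  # predicted clean parameters

model = ParamDenoiser()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Training data: pairs of (start params, improved params) plus the metric the
# improved params achieved -- random stand-ins here for real SGD checkpoints.
for step in range(100):
    start = torch.randn(32, param_dim)
    better = start - 0.1 * torch.randn(32, param_dim)  # pretend SGD result
    target = torch.rand(32, 1)                         # achieved loss
    t = torch.rand(32, 1)                              # diffusion time in [0, 1]
    noise = torch.randn_like(better)
    noisy = (1 - t) * better + t * noise               # simple interpolation noising
    pred = model(noisy, start, t, target)
    loss = ((pred - better) ** 2).mean()               # x0-prediction objective
    opt.zero_grad()
    loss.backward()
    opt.step()

# "One update" at test time: start from pure noise at t=1 and ask the model
# directly for parameters that achieve a prompted low loss.
with torch.no_grad():
    start = torch.randn(1, param_dim)                  # an unseen network's params
    prompted_loss = torch.tensor([[0.01]])             # desired metric
    sampled = model(torch.randn(1, param_dim), start,
                    torch.ones(1, 1), prompted_loss)
```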
2️⃣ EMB-GAM: AN INTERPRETABLE AND EFFICIENT PREDICTOR USING PRE-TRAINED LANGUAGE MODELS
They use a pre-trained language model to extract embeddings for each input before learning a linear model in the embedding space. The result is a generalized additive model (GAM): a transparent, interpretable function that is linear in its input features and feature interactions.
Link to the paper: https://arxiv.org/pdf/2209.11799.pdf
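Below is a minimal sketch of my reading of that recipe, not the authors' reference code. I assume bert-base-uncased as the backbone (the paper works with pre-trained LMs generally), and the helpers embed_ngram and featurize are names I made up. The point it demonstrates: because the classifier is linear and the feature vector is a sum of n-gram embeddings, the prediction decomposes into one additive contribution per n-gram, which is exactly what makes the model a GAM.

```python
# Sketch of the Emb-GAM-style recipe: embed n-grams with a frozen pre-trained
# LM, sum them, and fit a plain linear classifier in the embedding space.
import numpy as np
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
lm = AutoModel.from_pretrained("bert-base-uncased").eval()

def embed_ngram(ngram):
    """Mean-pooled last-hidden-state embedding of a single n-gram."""
    with torch.no_grad():
        batch = tokenizer(ngram, return_tensors="pt")
        return lm(**batch).last_hidden_state.mean(1).squeeze(0).numpy()

def featurize(text, n=2):
    """Sum of embeddings of all n-grams up to size n (the GAM's additive form)."""
    words = text.split()
    ngrams = [" ".join(words[i:i + k]) for k in range(1, n + 1)
              for i in range(len(words) - k + 1)]
    return np.sum([embed_ngram(g) for g in ngrams], axis=0)

# Toy sentiment data; the paper uses standard text-classification benchmarks.
train_texts = ["great movie", "terrible plot", "loved it", "awful acting"]
train_labels = [1, 0, 1, 0]
clf = LogisticRegression(max_iter=1000).fit(
    np.stack([featurize(t) for t in train_texts]), train_labels)

# Interpretability: since the model is linear over a SUM of n-gram embeddings,
# each n-gram's additive contribution to the score is just coef . embedding.
for g in ["great movie", "terrible"]:
    print(g, float(clf.coef_[0] @ embed_ngram(g)))
```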
Cohere just announced their “Scholars Program”
They are “inviting a class of emerging talent to work alongside our team — exploring the unknown, together. If you’re looking for an opportunity to develop your research skills, your journey starts here.” If you are into NLP or research, or are currently learning AI, have a look at this cool opportunity! Learn more.
If the newsletter was helpful, please subscribe, share it with your AI-techy friends, and join our 4,000+ readers. Feel free to follow my weekly newsletter, where I go in-depth into one of the papers shared, and our newsletter at Towards AI, sharing the most exciting news, papers, articles, and memes weekly.
Thank you for reading,