Chinese firms ‘distilling’ US AI models to create rival products, warns OpenAI
(www.theguardian.com)
Chains of distillation are mostly uncharted territory! There aren't many distillations out there because each one is still very expensive (at least tens of thousands of dollars, and possibly millions for big models).
Usually a distillation is used to make a smaller model out of a bigger one.
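The standard recipe for that is to train the small student model to match the big teacher's full output distribution, not just its top answer. A minimal sketch of that loss follows; the temperature value, function names, and toy logits are illustrative assumptions, not taken from any particular distillation run:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; a higher temperature softens the distribution."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence from the teacher's softened outputs to the student's.

    Minimizing this trains the student to mimic the teacher's whole
    probability distribution over tokens, which carries more signal
    than the single most likely token alone.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # KL(teacher || student), averaged over the batch
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)), axis=-1)
    return float(kl.mean())

# Toy batch: 2 examples over a 4-token vocabulary
teacher = np.array([[4.0, 1.0, 0.5, 0.2],
                    [0.1, 3.5, 0.3, 0.8]])
aligned_student = teacher.copy()          # perfect mimic -> loss is ~0
random_student  = np.zeros_like(teacher)  # uniform guesser -> positive loss

print(distillation_loss(aligned_student, teacher))
print(distillation_loss(random_student, teacher))
```

In a real training loop this term is usually mixed with an ordinary cross-entropy loss on ground-truth labels, but the KL part is what makes it "distillation."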
But the idea behind distilling from multiple models is to "add" together the knowledge and strengths of each teacher. There's no formal selection process; it's just whatever the researchers happen to try. You can read about another example here: https://huggingface.co/arcee-ai/SuperNova-Medius
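One simple way to combine multiple teachers is to average their softened output distributions into a single training target for the student. This is only one possible scheme (the weights, temperature, and toy logits below are assumptions for illustration):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multi_teacher_target(teacher_logits_list, temperature=2.0, weights=None):
    """Blend several teachers into one soft target distribution.

    Each teacher's softened softmax is averaged (optionally weighted);
    the student is then trained against this merged distribution,
    picking up signal from every teacher at once.
    """
    probs = [softmax(t, temperature) for t in teacher_logits_list]
    if weights is None:
        weights = [1.0 / len(probs)] * len(probs)  # equal weighting by default
    return sum(w * p for w, p in zip(weights, probs))

# Two hypothetical teachers that disagree on the best token
teacher_a = np.array([[3.0, 0.5, 0.1]])
teacher_b = np.array([[0.1, 3.0, 0.5]])
target = multi_teacher_target([teacher_a, teacher_b])
print(target)  # probability mass spread over both teachers' preferred tokens
```

Projects like SuperNova-Medius do something more involved (aligning vocabularies across different model families before merging), but the averaging idea above is the basic intuition.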