Google trained a trillion-parameter AI language model
Updated: Feb 18, 2021
VentureBeat article https://venturebeat.com/2021/01/12/google-trained-a-trillion-parameter-ai-language-model
Google Switch Transformers: Scaling to Trillion Parameter Models with constant computational costs
Towards Data Science article https://towardsdatascience.com/google-switch-transformers-scaling-to-trillion-parameter-models-with-constant-computational-costs-806fd145923d
Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity
33-minute YouTube video by Yannic Kilcher discussing the paper https://www.youtube.com/watch?v=iAR8LkkMMIM
Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity
Google paper on arXiv https://arxiv.org/abs/2101.03961
Google paper PDF https://arxiv.org/pdf/2101.03961
Google paper GitHub: Mesh TensorFlow - Model Parallelism Made Easier https://github.com/tensorflow/mesh (a toy sketch of the routing idea follows below)
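For readers curious about the core idea, here is a minimal NumPy sketch of the top-1 ("switch") expert routing the paper describes: a router picks a single expert feed-forward network per token, so compute per token stays roughly constant while total parameters grow with the number of experts. The function and variable names here are hypothetical, not from Google's code; the released implementation is the Mesh TensorFlow repo linked above.

```python
import numpy as np

def switch_layer(tokens, router_w, experts):
    """Toy top-1 expert routing in the spirit of the Switch Transformer.

    tokens:   (n_tokens, d_model) activations
    router_w: (d_model, n_experts) router weights (hypothetical name)
    experts:  list of per-expert FFN weight pairs (w_in, w_out)
    """
    logits = tokens @ router_w                       # (n_tokens, n_experts)
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)       # softmax over experts
    choice = probs.argmax(axis=-1)                   # top-1 expert per token
    gate = probs[np.arange(len(tokens)), choice]     # router prob scales output

    out = np.zeros_like(tokens)
    for e, (w_in, w_out) in enumerate(experts):
        mask = choice == e                           # tokens routed to expert e
        if mask.any():
            h = np.maximum(tokens[mask] @ w_in, 0)   # expert FFN with ReLU
            out[mask] = gate[mask][:, None] * (h @ w_out)
    return out

# Toy usage: 8 tokens, d_model=16, 4 experts. Each token activates only one
# expert, so per-token compute is constant as the expert count (and hence the
# parameter count) grows.
rng = np.random.default_rng(0)
d, n_exp, d_ff = 16, 4, 32
tokens = rng.normal(size=(8, d))
router_w = rng.normal(size=(d, n_exp))
experts = [(rng.normal(size=(d, d_ff)), rng.normal(size=(d_ff, d)))
           for _ in range(n_exp)]
print(switch_layer(tokens, router_w, experts).shape)  # (8, 16)
```

This sketch omits the load-balancing loss and capacity limits the paper uses to keep tokens spread evenly across experts; it only illustrates why sparse routing decouples parameter count from per-token compute.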
If you enjoyed this post, please like and share it using the buttons at the bottom!
Stay up to date. Subscribe to my posts https://morrislee1234.wixsite.com/website/contact
Website with my other posts by category https://morrislee1234.wixsite.com/website