
News to help your R&D in artificial intelligence, machine learning, robotics, computer vision, smart hardware


By morrislee

Google trained a trillion-parameter AI language model

Updated: Feb 18, 2021





Google's Switch Transformer scales language models to over a trillion parameters while keeping the computational cost per token roughly constant: a sparse mixture-of-experts layer routes each token to a single expert, so adding experts adds parameters without adding per-token compute.
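Below is a minimal sketch of that top-1 (switch) routing idea in plain NumPy. The function and variable names are assumptions made for illustration only; the paper's actual implementation uses Mesh TensorFlow and adds load-balancing losses, expert capacity limits, and distributed expert parallelism that are omitted here.

```python
# A minimal sketch of Switch Transformer top-1 expert routing in NumPy.
# Illustrative only: names, shapes, and single-matrix "experts" are assumptions,
# not the paper's Mesh TensorFlow implementation.
import numpy as np

def switch_layer(tokens, router_w, expert_ws):
    """Route each token to exactly one expert (top-1).

    Because only one expert runs per token, per-token compute stays constant
    as more experts (and thus more parameters) are added.

    tokens:    (num_tokens, d_model)
    router_w:  (d_model, num_experts) router weights
    expert_ws: list of (d_model, d_model) expert weight matrices
    """
    logits = tokens @ router_w                          # (num_tokens, num_experts)
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)          # softmax over experts
    expert_idx = probs.argmax(axis=-1)                  # chosen expert per token

    out = np.zeros_like(tokens)
    for e, w in enumerate(expert_ws):
        mask = expert_idx == e
        if mask.any():
            # Output is scaled by the router probability of the chosen expert,
            # which keeps the router differentiable end to end.
            out[mask] = (tokens[mask] @ w) * probs[mask, e:e + 1]
    return out

# Tiny usage example: 8 tokens, d_model = 16, 4 experts
rng = np.random.default_rng(0)
tokens = rng.normal(size=(8, 16))
router_w = rng.normal(size=(16, 4))
expert_ws = [rng.normal(size=(16, 16)) for _ in range(4)]
print(switch_layer(tokens, router_w, expert_ws).shape)  # -> (8, 16)
```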


Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity

A 33-minute YouTube video by Yannic Kilcher discussing the paper: https://www.youtube.com/watch?v=iAR8LkkMMIM



The Google paper on arXiv: https://arxiv.org/abs/2101.03961


Code from the Google paper on GitHub: Mesh TensorFlow (Model Parallelism Made Easier)


If you enjoyed this post, please like and share it using the buttons at the bottom!


Stay up to date. Subscribe to my posts https://morrislee1234.wixsite.com/website/contact

Website with my other posts by category: https://morrislee1234.wixsite.com/website

