Google MorphNet makes your neural net faster and smaller
Ensemble deep learning: A review
NVIDIA, Stanford and Microsoft: Efficient Trillion-Parameter Model Training on GPU Clusters
Microsoft DeepSpeed lets a single GPU train a 40-billion-parameter neural net
CPU beats GPU by 15x for neural net training with Rice University method
Top 10 AI and Machine Learning Papers in 2020
Facebook AI's MADGRAD optimizer improves neural network training
Google's MoViNets: Mobile Video Networks for Efficient Video Recognition
Top 10 Computer Vision Papers in 2020
GPT-3 AI can write a passing college paper in 20 minutes
Georgia Tech and Facebook Reduce Size of Deep Learning Recommendation Models by Factor of 112
Google's Model Search AutoML automatically identifies and optimizes AI models
Google trained a trillion-parameter AI language model
Machine learning with less than one example per class
Google's Snorkel replaces slow hand labeling with labeling functions that automatically assign labels to training data (a minimal labeling-function sketch follows this list)
Amazon SageMaker JumpStart for AutoML trains neural networks and lets you choose from 150+ models
AI scientists focusing on higher-level tasks like domain expertise rather than hyperparameter tuning
MuZero algorithm masters Go, chess, shogi and Atari WITHOUT being told the rules
MIT's MCUNet brings deep learning to the Internet of Things (IoT) using an $8 CPU with 0.5 MB SRAM and 2 MB flash
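
The Snorkel item above describes programmatic labeling: instead of hand-labeling training data, you write small labeling functions whose noisy, overlapping votes are combined into training labels. Below is a minimal sketch using the open-source snorkel library; the spam/ham task, the example texts, and both labeling functions are illustrative assumptions, not details from the article.

    # Minimal programmatic-labeling sketch with the open-source snorkel library
    # (pip install snorkel). Task, texts, and labeling functions are hypothetical.
    import pandas as pd
    from snorkel.labeling import labeling_function, PandasLFApplier
    from snorkel.labeling.model import LabelModel

    ABSTAIN, HAM, SPAM = -1, 0, 1

    @labeling_function()
    def lf_contains_link(x):
        # Heuristic: messages containing URLs are likely spam.
        return SPAM if "http" in x.text.lower() else ABSTAIN

    @labeling_function()
    def lf_short_message(x):
        # Heuristic: very short messages are likely legitimate.
        return HAM if len(x.text.split()) <= 4 else ABSTAIN

    df_train = pd.DataFrame(
        {"text": ["check out http://example.com for free stuff",
                  "great video, thanks",
                  "click http://example.org to win a prize now"]}
    )

    # Apply every labeling function to every row, producing a label matrix
    # with one column per labeling function and one row per example.
    applier = PandasLFApplier(lfs=[lf_contains_link, lf_short_message])
    L_train = applier.apply(df=df_train)

    # The LabelModel learns to combine the noisy votes into one label per row,
    # which can then be used to train any downstream classifier.
    label_model = LabelModel(cardinality=2, verbose=False)
    label_model.fit(L_train=L_train, n_epochs=200, seed=0)
    print(label_model.predict(L=L_train))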