MIT 6.S191: Recurrent Neural Networks, Transformers, and Attention
Alexander Amini