Module 7 of 22 | 90 min

Transformer Architecture

Attention is all you need

Module Content

The Transformer Revolution (video, 15:00)
Attention Is All You Need Paper (reading, 20 min)
Self-Attention Mechanism Deep Dive (video, 18:45)
Implementing Attention from Scratch (code, 25 min)
Multi-Head Attention (video, 14:30)
Building a Transformer Block (code, 20 min)
Module Quiz (quiz, 12 questions)
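As a preview of what the "Implementing Attention from Scratch" lesson covers, here is a minimal sketch of scaled dot-product self-attention in NumPy. The function name, toy dimensions, and random inputs are illustrative assumptions, not taken from the course materials:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the core of self-attention."""
    d_k = Q.shape[-1]
    # Similarity of every query to every key, scaled to stabilize the softmax
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax (subtracting the max for numerical stability)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors
    return weights @ V, weights

# Toy example: 3 tokens with 4-dimensional embeddings attending to each other.
# In self-attention, queries, keys, and values all come from the same sequence.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(x, x, x)
```

In a full transformer block (the topic of the later lessons), this operation is repeated in parallel across several heads with learned projection matrices, then followed by a feed-forward layer with residual connections.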