Attention Is All You Need — Build Your Own Transformer


The Transformer architecture replaces recurrence entirely with self-attention, a breakthrough that achieved state-of-the-art performance in machine translation while making training fully parallelizable.
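The self-attention mechanism mentioned above can be sketched in a few lines. This is a minimal NumPy illustration of scaled dot-product attention (the core operation from the paper), not the full multi-head Transformer layer; the function name and toy shapes are illustrative.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weight each value by the softmax-normalized query-key similarity.

    Q, K: (seq_len, d_k); V: (seq_len, d_v). Illustrative sketch only.
    """
    d_k = Q.shape[-1]
    # Similarity of every query with every key, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)                 # (seq_len, seq_len)
    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output row is a convex combination of the value rows
    return weights @ V

# Toy self-attention: 3 tokens, model dimension 4, Q = K = V = x
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```

Because queries, keys, and values all come from the same sequence here, every token attends to every other token in one matrix multiply, which is what removes the sequential dependency of recurrent models.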

Created by 0xDf38B0D6...
on 4/4/2026

