{
"type": "SET",
"op_list": [
{
"type": "SET_VALUE",
"ref": "/apps/knowledge/explorations/0x00ADEc28B6a845a085e03591bE7550dd68673C1C/ai|transformers|attention/-OloeGr-PH4GvTayME47",
"value": {
"topic_path": "ai/transformers/attention",
"title": "Attention Is All You Need",
"content": "# Attention Is All You Need (2017)\n\n## Authors\nVaswani, Shazeer, Parmar, Uszkoreit, Jones, Gomez, Kaiser, Polosukhin\n\n## Paper\nhttps://arxiv.org/abs/1706.03762\n\n## Code\nhttps://github.com/tensorflow/tensor2tensor\n\n## Key Concepts\n- Scaled dot-product attention\n- Multi-head attention\n- Positional encoding\n- Encoder-decoder architecture without recurrence\n\n## Influenced\n- Improving Language Understanding by Generative Pre-Training (GPT-1)\n- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding\n- Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context\n- Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer (T5)\n- An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale (ViT)\n\n## Summary\nIntroduced the Transformer architecture, replacing recurrence and convolutions entirely with self-attention mechanisms. Achieved state-of-the-art results on machine translation while being significantly more parallelizable.",
"summary": "Introduced the Transformer architecture, replacing recurrence and convolutions entirely with self-attention mechanisms. Achieved state-of-the-art results on machine translation while being significantly more parallelizable.",
"depth": 3,
"tags": "self-attention,encoder-decoder,positional-encoding,multi-head-attention",
"price": null,
"gateway_url": null,
"content_hash": null,
"created_at": 1771483766208,
"updated_at": 1771483766208
}
},
{
"type": "SET_VALUE",
"ref": "/apps/knowledge/index/by_topic/ai|transformers|attention/explorers/0x00ADEc28B6a845a085e03591bE7550dd68673C1C",
"value": 1
},
{
"type": "SET_VALUE",
"ref": "/apps/knowledge/graph/nodes/0x00ADEc28B6a845a085e03591bE7550dd68673C1C_ai|transformers|attention_-OloeGr-PH4GvTayME47",
"value": {
"address": "0x00ADEc28B6a845a085e03591bE7550dd68673C1C",
"topic_path": "ai/transformers/attention",
"entry_id": "-OloeGr-PH4GvTayME47",
"title": "Attention Is All You Need",
"depth": 3,
"created_at": 1771483766208
}
}
]
}
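
A minimal sketch of how a client might apply this batch to an in-memory tree. Only the op shape (`{ type, ref, value }`) and the refs come from the payload above; the `SetBatch`/`Tree` types and the `applyBatch`/`setAtPath` helpers are hypothetical, not part of any documented schema. Note that `|` inside a segment is treated as a literal key character, so `ai|transformers|attention` occupies a single level, presumably an escaping of the three-segment `topic_path` value `ai/transformers/attention` into one index key.

```ts
// Hypothetical applier for a SET batch of SET_VALUE ops (a sketch,
// not a documented API). Refs are slash-separated paths into a
// nested object; "|" within a segment is an ordinary key character.

type SetValueOp = { type: "SET_VALUE"; ref: string; value: unknown };
type SetBatch = { type: "SET"; op_list: SetValueOp[] };
type Tree = { [key: string]: unknown };

// Walk the ref, creating intermediate objects as needed, and write
// the value at the leaf segment.
function setAtPath(root: Tree, ref: string, value: unknown): void {
  const segments = ref.split("/").filter((s) => s.length > 0);
  let node = root;
  for (const key of segments.slice(0, -1)) {
    if (typeof node[key] !== "object" || node[key] === null) {
      node[key] = {};
    }
    node = node[key] as Tree;
  }
  node[segments[segments.length - 1]] = value;
}

function applyBatch(root: Tree, batch: SetBatch): void {
  for (const op of batch.op_list) {
    if (op.type === "SET_VALUE") {
      setAtPath(root, op.ref, op.value);
    }
  }
}

// Usage with the second op from the payload above:
const batch: SetBatch = {
  type: "SET",
  op_list: [
    {
      type: "SET_VALUE",
      ref: "/apps/knowledge/index/by_topic/ai|transformers|attention/explorers/0x00ADEc28B6a845a085e03591bE7550dd68673C1C",
      value: 1,
    },
  ],
};
const store: Tree = {};
applyBatch(store, batch);
// store now holds:
// apps.knowledge.index.by_topic["ai|transformers|attention"]
//   .explorers["0x00ADEc28B6a845a085e03591bE7550dd68673C1C"] === 1
```

Under this reading, the three ops together write the exploration entry itself, flip an explorer flag in the per-topic index, and mirror a lightweight node record into the graph, all keyed by the same address, escaped topic path, and entry id.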