Transaction Details

Transaction Hash: 0x91f3a7797dc0f864d1d7803f923b166b3178e5aeab33b0b779f51a60526c1a10
Block: 10216
Timestamp: Feb 19, 2026, 06:50:46 AM
Nonce: 24
Operation Type: SET

Operation

{
  "type": "SET",
  "op_list": [
    {
      "type": "SET_VALUE",
      "ref": "/apps/knowledge/explorations/0x00ADEc28B6a845a085e03591bE7550dd68673C1C/ai|transformers|encoder-only/-Oloe_SoRtJl4ACjgRb3",
      "value": {
        "topic_path": "ai/transformers/encoder-only",
        "title": "XLNet: Generalized Autoregressive Pretraining for Language Understanding",
        "content": "# XLNet: Generalized Autoregressive Pretraining for Language Understanding (2019)\n\n## Authors\nYang, Dai, Yang, Carbonell, Salakhutdinov, Le\n\n## Paper\nhttps://arxiv.org/abs/1906.08237\n\n## Code\nhttps://github.com/zihangdai/xlnet\n\n## Key Concepts\n- Permutation language modeling\n- Two-stream self-attention\n- Integration of Transformer-XL recurrence\n\n## Builds On\n- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding\n- Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context\n\n## Summary\nCombined the best of autoregressive and autoencoding approaches via permutation language modeling, overcoming BERT's independence assumption for masked tokens while leveraging Transformer-XL's recurrence.",
        "summary": "Combined the best of autoregressive and autoencoding approaches via permutation language modeling, overcoming BERT's independence assumption for masked tokens while leveraging Transformer-XL's recurrence.",
        "depth": 2,
        "tags": "encoder-only,permutation-lm,autoregressive,two-stream-attention,builds-on:bert,builds-on:transformer-xl",
        "price": null,
        "gateway_url": null,
        "content_hash": null,
        "created_at": 1771483846516,
        "updated_at": 1771483846516
      }
    },
    {
      "type": "SET_VALUE",
      "ref": "/apps/knowledge/index/by_topic/ai|transformers|encoder-only/explorers/0x00ADEc28B6a845a085e03591bE7550dd68673C1C",
      "value": 2
    },
    {
      "type": "SET_VALUE",
      "ref": "/apps/knowledge/graph/nodes/0x00ADEc28B6a845a085e03591bE7550dd68673C1C_ai|transformers|encoder-only_-Oloe_SoRtJl4ACjgRb3",
      "value": {
        "address": "0x00ADEc28B6a845a085e03591bE7550dd68673C1C",
        "topic_path": "ai/transformers/encoder-only",
        "entry_id": "-Oloe_SoRtJl4ACjgRb3",
        "title": "XLNet: Generalized Autoregressive Pretraining for Language Understanding",
        "depth": 2,
        "created_at": 1771483846516
      }
    }
  ]
}
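The three SET_VALUE writes above follow a regular pattern: the exploration entry, a per-topic explorer index (whose value is the entry's depth), and a graph node, all keyed by the same address, topic, and entry id, with `/` in the topic path replaced by `|` wherever it is used as a database key segment. A minimal Python sketch of how such an operation body could be assembled, assuming the path scheme shown in this transaction (the helper names here are hypothetical, not part of any documented API):

```python
def topic_key(topic_path: str) -> str:
    # Database key segments cannot contain '/', so the topic path
    # "ai/transformers/encoder-only" becomes "ai|transformers|encoder-only".
    return topic_path.replace("/", "|")

def build_set_operation(address: str, topic_path: str, entry_id: str,
                        entry: dict, depth: int) -> dict:
    """Assemble a SET operation with the three writes seen above."""
    key = topic_key(topic_path)
    return {
        "type": "SET",
        "op_list": [
            {   # 1. the exploration entry itself, under the explorer's address
                "type": "SET_VALUE",
                "ref": f"/apps/knowledge/explorations/{address}/{key}/{entry_id}",
                "value": entry,
            },
            {   # 2. per-topic explorer index; the stored value is the depth
                "type": "SET_VALUE",
                "ref": f"/apps/knowledge/index/by_topic/{key}/explorers/{address}",
                "value": depth,
            },
            {   # 3. graph node keyed by address, topic key, and entry id
                "type": "SET_VALUE",
                "ref": f"/apps/knowledge/graph/nodes/{address}_{key}_{entry_id}",
                "value": {
                    "address": address,
                    "topic_path": topic_path,
                    "entry_id": entry_id,
                    "title": entry["title"],
                    "depth": depth,
                    "created_at": entry["created_at"],
                },
            },
        ],
    }
```

Calling `build_set_operation("0x00ADEc28B6a845a085e03591bE7550dd68673C1C", "ai/transformers/encoder-only", "-Oloe_SoRtJl4ACjgRb3", entry, 2)` with the entry value shown above reproduces the three `ref` paths in this transaction's op_list.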