Transaction Details

Transaction Hash
0x31c117ba54b32a515bd56810be36cb19737872e942f0dd105bc48b3f3f789fcd
Block
10206
Timestamp
Feb 19, 2026, 06:50:36 AM
Nonce
23
Operation Type
SET
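
A transaction like this can be looked up on a node by its hash. Below is a minimal sketch over JSON-RPC; the endpoint URL and the method name "ain_getTransactionByHash" are assumptions about the node API and are not confirmed by this page.

```typescript
// Hedged sketch: fetching this transaction over JSON-RPC (Node 18+ / browser fetch).
// The RPC endpoint and the method name are assumptions; substitute whatever the
// network behind this explorer actually exposes.

const TX_HASH =
  "0x31c117ba54b32a515bd56810be36cb19737872e942f0dd105bc48b3f3f789fcd";

async function getTransaction(rpcUrl: string, hash: string): Promise<unknown> {
  const res = await fetch(rpcUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: 1,
      method: "ain_getTransactionByHash", // assumed method name
      params: { hash },
    }),
  });
  if (!res.ok) throw new Error(`RPC request failed: ${res.status}`);
  const { result, error } = await res.json();
  if (error) throw new Error(`RPC error: ${JSON.stringify(error)}`);
  return result; // expected to carry the block, nonce, and operation shown below
}

// Example usage (endpoint is a placeholder):
// getTransaction("https://node.example.org/json-rpc", TX_HASH)
//   .then((tx) => console.log(tx));
```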

Operation

{
  "type": "SET",
  "op_list": [
    {
      "type": "SET_VALUE",
      "ref": "/apps/knowledge/explorations/0x00ADEc28B6a845a085e03591bE7550dd68673C1C/ai|transformers|encoder-only/-OloeY-vBlnnyw2cWnpm",
      "value": {
        "topic_path": "ai/transformers/encoder-only",
        "title": "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding",
        "content": "# BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (2018)\n\n## Authors\nDevlin, Chang, Lee, Toutanova\n\n## Paper\nhttps://arxiv.org/abs/1810.04805\n\n## Code\nhttps://github.com/google-research/bert\n\n## Key Concepts\n- Masked language modeling (MLM)\n- Next sentence prediction (NSP)\n- Bidirectional context encoding\n\n## Builds On\n- Attention Is All You Need\n\n## Influenced\n- RoBERTa: A Robustly Optimized BERT Pretraining Approach\n- ALBERT: A Lite BERT for Self-supervised Learning of Language Representations\n- XLNet: Generalized Autoregressive Pretraining for Language Understanding\n- DeBERTa: Decoding-enhanced BERT with Disentangled Attention\n\n## Summary\nIntroduced bidirectional pre-training for language representations using masked language modeling and next-sentence prediction. Achieved new state-of-the-art on 11 NLP tasks.",
        "summary": "Introduced bidirectional pre-training for language representations using masked language modeling and next-sentence prediction. Achieved new state-of-the-art on 11 NLP tasks.",
        "depth": 3,
        "tags": "encoder-only,masked-lm,bidirectional,pre-training,fine-tuning,builds-on:transformer",
        "price": null,
        "gateway_url": null,
        "content_hash": null,
        "created_at": 1771483836475,
        "updated_at": 1771483836475
      }
    },
    {
      "type": "SET_VALUE",
      "ref": "/apps/knowledge/index/by_topic/ai|transformers|encoder-only/explorers/0x00ADEc28B6a845a085e03591bE7550dd68673C1C",
      "value": 1
    },
    {
      "type": "SET_VALUE",
      "ref": "/apps/knowledge/graph/nodes/0x00ADEc28B6a845a085e03591bE7550dd68673C1C_ai|transformers|encoder-only_-OloeY-vBlnnyw2cWnpm",
      "value": {
        "address": "0x00ADEc28B6a845a085e03591bE7550dd68673C1C",
        "topic_path": "ai/transformers/encoder-only",
        "entry_id": "-OloeY-vBlnnyw2cWnpm",
        "title": "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding",
        "depth": 3,
        "created_at": 1771483836475
      }
    }
  ]
}
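
The op_list performs three writes atomically: the full exploration entry under the author's address, a by-topic index flag marking that address as an explorer of the topic, and a compact graph node holding only lookup metadata. Note that "/" inside the topic path is replaced with "|" when it is used as a single key segment in a ref. A minimal sketch of how such a payload could be assembled follows; the ref layout and field names come from the transaction above, while the helper names and TypeScript types are illustrative assumptions rather than part of any SDK.

```typescript
// Hedged sketch: building a three-write SET payload matching the structure above.

interface KnowledgeEntry {
  topic_path: string;
  title: string;
  content: string;
  summary: string;
  depth: number;
  tags: string;
  price: number | null;
  gateway_url: string | null;
  content_hash: string | null;
  created_at: number;
  updated_at: number;
}

// Topic paths use "/", but as a single ref segment "/" is escaped to "|".
const escapeSegment = (topicPath: string): string =>
  topicPath.replace(/\//g, "|");

function buildSetOperation(
  address: string,
  entryId: string,
  entry: KnowledgeEntry,
) {
  const topicKey = escapeSegment(entry.topic_path);
  return {
    type: "SET",
    op_list: [
      {
        // 1. The exploration entry itself, keyed by author address and entry id.
        type: "SET_VALUE",
        ref: `/apps/knowledge/explorations/${address}/${topicKey}/${entryId}`,
        value: entry,
      },
      {
        // 2. A by-topic index flag so explorers of a topic can be enumerated.
        type: "SET_VALUE",
        ref: `/apps/knowledge/index/by_topic/${topicKey}/explorers/${address}`,
        value: 1,
      },
      {
        // 3. A lightweight graph node carrying only lookup metadata.
        type: "SET_VALUE",
        ref: `/apps/knowledge/graph/nodes/${address}_${topicKey}_${entryId}`,
        value: {
          address,
          topic_path: entry.topic_path,
          entry_id: entryId,
          title: entry.title,
          depth: entry.depth,
          created_at: entry.created_at,
        },
      },
    ],
  };
}
```

Keeping the entry, the index flag, and the graph node in one op_list means either all three refs are written or none are, so the index and graph never point at an entry that was not stored.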