Transaction Details

Transaction Hash: 0xf8e9cf23f68f8704bedf637b56416008cc3a308523ae1cc7388b36fba002c547
Block: 10186
Timestamp: Feb 19, 2026, 06:50:16 AM
Nonce: 21
Operation Type: SET

Operation

{
  "type": "SET",
  "op_list": [
    {
      "type": "SET_VALUE",
      "ref": "/apps/knowledge/explorations/0x00ADEc28B6a845a085e03591bE7550dd68673C1C/ai|transformers|decoder-only/-OloeT6HTqA_M5X9auZH",
      "value": {
        "topic_path": "ai/transformers/decoder-only",
        "title": "LLaMA: Open and Efficient Foundation Language Models",
        "content": "# LLaMA: Open and Efficient Foundation Language Models (2023)\n\n## Authors\nTouvron, Lavril, Izacard, Martinet, Lachaux, Lacroix, Roziere, Goyal, Hambro, Azhar, et al.\n\n## Paper\nhttps://arxiv.org/abs/2302.13971\n\n## Code\nhttps://github.com/meta-llama/llama\n\n## Key Concepts\n- Training-compute-optimal models\n- Open-weight release strategy\n- RMSNorm and SwiGLU activations\n\n## Builds On\n- Language Models are Few-Shot Learners (GPT-3)\n\n## Influenced\n- Mistral 7B\n\n## Summary\nShowed that smaller, openly released models trained on more tokens can match or exceed the performance of much larger proprietary models, catalyzing the open-source LLM ecosystem.",
        "summary": "Showed that smaller, openly released models trained on more tokens can match or exceed the performance of much larger proprietary models, catalyzing the open-source LLM ecosystem.",
        "depth": 1,
        "tags": "decoder-only,autoregressive,open-weights,efficient-training,builds-on:gpt3",
        "price": null,
        "gateway_url": null,
        "content_hash": null,
        "created_at": 1771483816402,
        "updated_at": 1771483816402
      }
    },
    {
      "type": "SET_VALUE",
      "ref": "/apps/knowledge/index/by_topic/ai|transformers|decoder-only/explorers/0x00ADEc28B6a845a085e03591bE7550dd68673C1C",
      "value": 5
    },
    {
      "type": "SET_VALUE",
      "ref": "/apps/knowledge/graph/nodes/0x00ADEc28B6a845a085e03591bE7550dd68673C1C_ai|transformers|decoder-only_-OloeT6HTqA_M5X9auZH",
      "value": {
        "address": "0x00ADEc28B6a845a085e03591bE7550dd68673C1C",
        "topic_path": "ai/transformers/decoder-only",
        "entry_id": "-OloeT6HTqA_M5X9auZH",
        "title": "LLaMA: Open and Efficient Foundation Language Models",
        "depth": 1,
        "created_at": 1771483816402
      }
    }
  ]
}
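The op_list above follows a consistent path convention: the entry's topic_path ("ai/transformers/decoder-only") is encoded with "|" in place of "/" wherever it appears as a single path segment in a ref, and the graph-node key concatenates address, encoded topic, and entry id with underscores. A minimal Python sketch of how such a SET payload could be assembled, inferred from this transaction's data alone (the helper names and APP_ROOT constant are hypothetical, not part of any official SDK):

```python
# Sketch of building a SET operation matching the structure above.
# The "/" -> "|" encoding and the graph-node key format are inferred
# from this transaction; function names here are illustrative only.

APP_ROOT = "/apps/knowledge"

def topic_key(topic_path: str) -> str:
    """Encode a topic path for use as a single ref segment."""
    return topic_path.replace("/", "|")

def build_set_op(address: str, topic_path: str, entry_id: str, entry: dict) -> dict:
    key = topic_key(topic_path)
    return {
        "type": "SET",
        "op_list": [
            {
                # Full exploration entry under the author's address.
                "type": "SET_VALUE",
                "ref": f"{APP_ROOT}/explorations/{address}/{key}/{entry_id}",
                "value": entry,
            },
            {
                # Lightweight graph node keyed by address, topic, and entry id.
                "type": "SET_VALUE",
                "ref": f"{APP_ROOT}/graph/nodes/{address}_{key}_{entry_id}",
                "value": {
                    "address": address,
                    "topic_path": topic_path,
                    "entry_id": entry_id,
                    "title": entry["title"],
                    "depth": entry["depth"],
                    "created_at": entry["created_at"],
                },
            },
        ],
    }

op = build_set_op(
    "0x00ADEc28B6a845a085e03591bE7550dd68673C1C",
    "ai/transformers/decoder-only",
    "-OloeT6HTqA_M5X9auZH",
    {
        "title": "LLaMA: Open and Efficient Foundation Language Models",
        "depth": 1,
        "created_at": 1771483816402,
    },
)
```

The by_topic index write (the second op in the transaction, storing the value 5 under the explorer's address) is omitted here for brevity; it would be one more SET_VALUE entry in the same op_list.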