# Knowledge Graph

Explore the on-chain knowledge graph — topics, explorations, and their relationships

**Topics:** 85 · **Explorations:** 376 · **Relationships:** 376 · **Explorers:** 7

## Graph Visualization

## Topics
| Path | Title | Description | Created By | Created |
|---|---|---|---|---|
| ai | Artificial Intelligence | Research and applications of artificial intelligence. | 0x00ADEc28... | 2/19/2026 |
| ai/transformers | Transformers | The transformer architecture family and its descendants. | 0x00ADEc28... | 2/19/2026 |
| ai/transformers/attention | Attention Mechanisms | Core self-attention mechanism introduced in the original Transformer. | 0x00ADEc28... | 2/19/2026 |
| ai/transformers/encoder-only | Encoder-Only Models | Masked-language-model architectures: BERT, RoBERTa, ALBERT, XLNet, DeBERTa. | 0x00ADEc28... | 2/19/2026 |
| ai/transformers/decoder-only | Decoder-Only Models | Autoregressive language models: GPT family, Transformer-XL, LLaMA, Mistral. | 0x00ADEc28... | 2/19/2026 |
| ai/transformers/encoder-decoder | Encoder-Decoder Models | Sequence-to-sequence transformer models such as T5. | 0x00ADEc28... | 2/19/2026 |
| ai/transformers/vision | Vision Transformers | Transformer architectures applied to computer vision tasks. | 0x00ADEc28... | 2/19/2026 |
| ai/transformers/diffusion | Diffusion Models | Latent diffusion and stable diffusion models for image generation. | 0x00ADEc28... | 2/19/2026 |
| ai/state-space-models | State-Space Models | Structured state-space sequence models as alternatives to attention. | 0x00ADEc28... | 2/19/2026 |
| test/test_1771522512337 | E2E Test Topic test_1771522512337 | Automated e2e test topic — safe to delete. | 0x00ADEc28... | 2/19/2026 |
| test/test_1771522671090 | E2E Test Topic test_1771522671090 | Automated e2e test topic — safe to delete. | 0x00ADEc28... | 2/19/2026 |
| test/test_1771527021965 | E2E Test Topic test_1771527021965 | Automated e2e test topic — safe to delete. | 0x00ADEc28... | 2/19/2026 |
| test/simple | A simple way to speedup Gauss Elimination | Papers-driven exploration of simple | 0xA7b9a095... | 2/19/2026 |
| test/bursty | How Bursty is Star Formation at z>5? | Papers-driven exploration of bursty | 0xA7b9a095... | 2/19/2026 |
| lessons/architecture | architecture | Lessons related to lessons/architecture | 0x00ADEc28... | 2/20/2026 |
| lessons/different | More is Different | Papers-driven exploration of different | 0xA7b9a095... | 2/20/2026 |
| lessons/engineering | engineering | Lessons related to lessons/engineering | 0x00ADEc28... | 2/20/2026 |
| lessons/ai | ai | Lessons related to lessons/ai | 0x00ADEc28... | 2/20/2026 |
| lessons/fol-qa-augmentation | fol qa augmentation | Lessons related to lessons/fol-qa-augmentation | 0x00ADEc28... | 2/20/2026 |
| lessons/pre-computed-reasoning | pre computed reasoning | Lessons related to lessons/pre-computed-reasoning | 0x00ADEc28... | 2/20/2026 |
| lessons/multi-hop-retrieval | multi hop retrieval | Lessons related to lessons/multi-hop-retrieval | 0x00ADEc28... | 2/20/2026 |
| lessons/qa-benchmarking | qa benchmarking | Lessons related to lessons/qa-benchmarking | 0x00ADEc28... | 2/20/2026 |
| lessons/test | test | Lessons related to lessons/test | 0x00ADEc28... | 2/20/2026 |
| lessons/e2e-test | e2e test | End-to-end test topic | 0x00ADEc28... | 2/20/2026 |
| lessons/blockchain | blockchain | Lessons related to lessons/blockchain | 0x21bCf0D5... | 2/20/2026 |
| lessons\|architecture | lessons\|architecture | Explorations related to lessons\|architecture | 0x00ADEc28... | 2/20/2026 |
| lessons\|engineering | lessons\|engineering | Explorations related to lessons\|engineering | 0x00ADEc28... | 2/20/2026 |
| lessons\|ai | lessons\|ai | Explorations related to lessons\|ai | 0x00ADEc28... | 2/20/2026 |
| lessons\|pre-computed-reasoning | lessons\|pre computed reasoning | Explorations related to lessons\|pre-computed-reasoning | 0x00ADEc28... | 2/20/2026 |
| lessons\|qa-benchmarking | lessons\|qa benchmarking | Explorations related to lessons\|qa-benchmarking | 0x00ADEc28... | 2/20/2026 |
| lessons\|test | lessons\|test | Explorations related to lessons\|test | 0x00ADEc28... | 2/20/2026 |
| lessons\|e2e-test | lessons\|e2e test | Explorations related to lessons\|e2e-test | 0x00ADEc28... | 2/20/2026 |
| courses/attention-is-all-you-need/bible | Attention Is All You Need | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0xaF71AE76... | 2/21/2026 |
| courses/attention-is-all-you-need/contributor-test | Attention Is All You Need | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0xaF5bd3e4... | 2/21/2026 |
| courses/attention-is-all-you-need/attension-is-all-you-need | Attention Is All You Need | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0xf48a6C6A... | 2/21/2026 |
| courses/attention-is-all-you-need/advanced | Attention Is All You Need | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0xaF5bd3e4... | 2/21/2026 |
| courses/attention-is-all-you-need/blockchain-verify-test | Attention Is All You Need | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0xaF71AE76... | 2/21/2026 |
| courses/attention-is-all-you-need/back-to-basics | Attention Is All You Need | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0xaF5bd3e4... | 2/21/2026 |
| courses/adam-a-method-for-stochastic-optimization/adam-a-method-for-stochastic-optimization | Adam: A Method for Stochastic Optimization | Adam combines adaptive learning rate methods with momentum-based optimization. It maintains exponential moving averages of both gradients and squared gradients, with bias correction for stability. Computationally efficient and invariant to diagonal rescaling, Adam became the default optimizer in modern deep learning. | 0xaF5bd3e4... | 2/21/2026 |
| courses/an-image-is-worth-16x16-words-transformers-for/image-recognition-with-transformers | An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale | Vision Transformer (ViT) splits an image into a sequence of patches fed to a pure Transformer, achieving SOTA image-classification performance without any CNN. | 0xaF71AE76... | 2/21/2026 |
| courses/llama-2-open-foundation-and-fine-tuned-chat-models/understanding-llama-2 | Llama 2: Open Foundation and Fine-Tuned Chat Models | Llama 2 is a family of open-source LLMs from 7B to 70B parameters, fine-tuned with RLHF and safety-evaluated, making commercial-grade conversational AI available to everyone. | 0xf48a6C6A... | 2/21/2026 |
| courses/yolov3-an-incremental-improvement/yolo-v3-object-detection-fundamentals | YOLOv3: An Incremental Improvement | YOLOv3 introduces multi-scale prediction and the Darknet-53 backbone, improving both the accuracy and the speed of real-time object detection. | 0xaF5bd3e4... | 2/21/2026 |
| courses/adam-a-method-for-stochastic-optimization--adam-a-method-for-stochastic-optimization | Adam: A Method for Stochastic Optimization | Adam combines adaptive learning rate methods with momentum-based optimization. It maintains exponential moving averages of both gradients and squared gradients, with bias correction for stability. Computationally efficient and invariant to diagonal rescaling, Adam became the default optimizer in modern deep learning. | 0x48E875b8... | 4/4/2026 |
| courses/attention-is-all-you-need--blockchain-verify-test | Attention Is All You Need (Blockchain Verify Test) | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0x08B6D536... | 4/4/2026 |
| courses/attention-is-all-you-need--bible | Attention Is All You Need (Bible) | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0x34D13eC2... | 4/4/2026 |
| courses/attention-is-all-you-need--contributor-test | Attention Is All You Need (Contributor Test) | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0x46eb8ff2... | 4/4/2026 |
| courses/attention-is-all-you-need--advanced | Attention Is All You Need (Advanced) | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0xF4A84FEf... | 4/4/2026 |
| courses/attention-is-all-you-need--attension-is-all-you-need | Attention Is All You Need (Attension Is All You Need) | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0x4Ef30A49... | 4/3/2026 |
| courses/yolov3-an-incremental-improvement--yolo-v3-object-detection-fundamentals | YOLOv3: An Incremental Improvement | YOLOv3 introduces multi-scale prediction and the Darknet-53 backbone, improving both the accuracy and the speed of real-time object detection. | 0x00ADEc28... | 4/3/2026 |
| courses/exploring-the-limits-of-transfer-learning-with-a--bible | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | T5 is a framework that unifies every NLP task into a text-to-text format, systematically exploring the limits of transfer learning and achieving SOTA across diverse benchmarks. | 0xad3C8615... | 4/4/2026 |
| courses/attention-is-all-you-need--sonnet-4-5-edition | Attention Is All You Need (Sonnet 4 5 Edition) | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0x6b06EA06... | 4/4/2026 |
| courses/0g-developer-documentation--0g-developer-course | 0G Developer (0g Developer Course) | Unlock the full potential of 0G, the world's first decentralized AI operating system designed for high-performance intelligence. Beyond mere theory, engage with live, runnable TypeScript examples that demonstrate how to leverage data availability (DA) and modular storage for AI scalability. | 0x75a7f9c1... | 4/3/2026 |
| courses/an-image-is-worth-16x16-words-transformers-for--image-recognition-with-transformers | An Image is Worth 16x16 Words | Vision Transformer (ViT) splits an image into a sequence of patches fed to a pure Transformer, achieving SOTA image-classification performance without any CNN. | 0x40581a35... | 4/4/2026 |
| courses/llama-2-open-foundation-and-fine-tuned-chat-models--understanding-llama-2 | Llama 2: Open Foundation and Fine-Tuned Chat Models | Llama 2 is a family of open-source LLMs from 7B to 70B parameters, fine-tuned with RLHF and safety-evaluated, making commercial-grade conversational AI available to everyone. | 0x48E875b8... | 4/4/2026 |
| courses/0g-developer-documentation--0g-basic-course | 0G Basic (0g Basic Course) | Meet 0G — the world's first decentralized AI operating system. With four core services (Chain, Storage, Compute, and DA), 0G is redefining how AI infrastructure works. In this beginner-friendly course, an AI tutor guides you step by step — from wallet setup to file uploads, AI inference, and building a verifiable AI pipeline. No blockchain experience required: 3 modules, 12 hands-on concepts, and you'll deploy your first app on 0G. | 0xDfAE850D... | 4/4/2026 |
| courses/attention-is-all-you-need--learn-in-40-seconds | Attention Is All You Need — Learn in 40 Seconds (Learn In 40 Seconds) | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0x281A2914... | 4/4/2026 |
| courses/attention-is-all-you-need--lightweight-version | Attention Is All You Need (Lightweight Version) | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0x7dd1C585... | 4/3/2026 |
| courses/attention-is-all-you-need--back-to-basics | Attention Is All You Need (Back To Basics) | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0x339bd13E... | 4/4/2026 |
| courses/attention-is-all-you-need--very-very-short-course | Attention Is All You Need (Very Very Short Course) | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0x9b76A3C9... | 4/4/2026 |
| courses/attention-is-all-you-need--attention-is-super-duper | Attention Is All You Need (Attention Is Super Duper) | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0x54d5EEc5... | 4/4/2026 |
| courses/direct-preference-optimization-your-language-model--dpo-direct-preference-optimization | Direct Preference Optimization: Your Language Model is Secretly a Reward Model | DPO introduces a simple classification loss that directly optimizes language model policies on human preference data, eliminating the need for reinforcement learning while maintaining theoretical equivalence to the RLHF objective. | 0x64e3D107... | 4/3/2026 |
| courses/attention-is-all-you-need--build-your-own-transformer | Attention Is All You Need — Build Your Own Transformer (Build Your Own Transformer) | A landmark model in which the Transformer architecture replaces recurrence entirely with self-attention, achieving SOTA machine-translation performance while enabling training parallelization. | 0xDf38B0D6... | 4/4/2026 |
| courses/hipporag-neurobiologically-inspired-long-term--hippo-rag | HippoRAG | HippoRAG mirrors hippocampal memory indexing using a knowledge graph and Personalized PageRank to enable efficient multi-hop retrieval for LLMs. | 0x41Cc2D3E... | 4/3/2026 |
| courses/moonshine-speech-recognition-for-live--moonshine | Moonshine: Speech Recognition for Live Transcription | Moonshine is a lightweight real-time speech-recognition model that uses RoPE and variable-length encoding to cut computation by up to 5x relative to Whisper. | 0x1ff3D41F... | 4/3/2026 |
| courses/comcom-ojt-web3-blockchain-ai--web3-ojt-course | Web3, Blockchain & AI Fundamentals | A comprehensive Web3 & Blockchain OJT course covering 12 chapters — from Web3.0 protocols and DAOs to Bitcoin, Ethereum, tokenomics, NFTs, AI x Blockchain convergence, AINFT, ERC-8004, and the autonomous agent economy. 4 modules, 12 hands-on concepts, designed for newcomers and intermediate learners alike. | 0x87CB087d... | 4/4/2026 |
| courses/blockchain-decentralization-fundamentals--core-ko | Blockchain and Decentralization Fundamentals (Core Ko) | Learn the evolution of the Web (1.0→2.0→3.0), blockchain consensus mechanisms (PoW/PoS), the problems of centralized platforms, and the necessity of decentralization. Understand the essence of the platform economy through Chris Dixon's 'Why Decentralization Matters' and the Google IPO Letter. | 0x48E875b8... | 4/4/2026 |
| courses/blockchain-decentralization-fundamentals--core | Blockchain and Decentralization Fundamentals (Core) | Learn the evolution of the Web (1.0→2.0→3.0), blockchain consensus mechanisms (PoW/PoS), the problems of centralized platforms, and the necessity of decentralization. Understand the essence of the platform economy through Chris Dixon's 'Why Decentralization Matters' and the Google IPO Letter. | 0xd33E035F... | 4/3/2026 |
| courses/ai-blockchain-longtail--core | AI, Blockchain & The Long Tail (Core) | An integrated learning path covering the convergence of AI and blockchain technologies, exploring how decentralized autonomous markets can solve AI's long-tail problem, the economics of GPU infrastructure, accountable AI components (AINFT), blockchain composability evolution, and dynamic NFTs as evolving digital assets. | 0x87CB087d... | 4/4/2026 |
| courses/bitcoin-ethereum-ainetwork--core-ko | Bitcoin, Ethereum, AI Network (Core Ko) | Comprehensive course covering Bitcoin Whitepaper (2008), Ethereum Whitepaper (2014), and AI Network Architecture/Whitepaper (2018-2025) — from fundamentals to frontier concepts. | 0x87CB087d... | 4/4/2026 |
| courses/dao-decentralized-organizations--core-ko | DAO & Decentralized Organizations (Core Ko) | Comprehensive course on DAOs and decentralized organizations covering fundamentals, decision-making frameworks, real-world cases, and building strategies. | 0x18BFba8e... | 4/3/2026 |
| courses/meaning-of-decentralization--core-ko | The Meaning of Decentralization (Core Ko) | Explore Vitalik Buterin's seminal essay on the three axes of decentralization (architectural, political, logical), Web3 identity stack, headless brands, and distributed consensus algorithms. | 0x44b26a6F... | 4/3/2026 |
| courses/nft-creator-economy--core-ko | NFTs and the Creator Economy (Core Ko) | Explore how NFTs enable direct creator-to-fan monetization, the thousand true fans theory, Web3 creator tools, and the evolution from skeuomorphic to crypto-native designs. | 0x87CB087d... | 4/4/2026 |
| courses/nft-philosophy-technology--core | NFT Philosophy and Technology (Core) | Explore the philosophical foundations of NFTs through Token Identity Theory and Donald Davidson's Anomalous Monism, understand the technical implementation via ERC-20/721/1155 standards, analyze market cycles and creator economics from a16z State of Crypto 2023, and discover how Vitalik's concept of legitimacy explains NFT value. | 0x4298c1Dd... | 4/4/2026 |
| courses/token-standards-crypto--core-ko | Token Standards and Crypto Economics (Core Ko) | Learn ERC-20 and ERC-721 token standards, Chris Dixon's token breakthrough insights, a16z Tokenology framework, and crypto valuation methods using Metcalfe's Law, NVT and PMR ratios. | 0xd5e51bfA... | 4/4/2026 |
| courses/ai-blockchain-longtail--core-ko | AI, Blockchain & The Long Tail (Core Ko) | An integrated learning path covering the convergence of AI and blockchain technologies, exploring how decentralized autonomous markets can solve AI's long-tail problem, the economics of GPU infrastructure, accountable AI components (AINFT), blockchain composability evolution, and dynamic NFTs as evolving digital assets. | 0xe8D7BEa0... | 4/4/2026 |
| courses/ainft-web3-ai--core | AINFT and Web3: The Future of AI (Core) | Explore AINFT (AI NFT) as an immutable identifier for AI components, understand how it makes AI accountable, reproducible, and valuable, learn Web3 fundamentals from the evolution of Web 1.0 to Web 3.0, and discover advanced topics including DAOs, digital identity, native payments, and evolving NFTs. | 0xFafCa787... | 4/4/2026 |
| courses/bitcoin-ethereum-ainetwork--core | Bitcoin, Ethereum, AI Network (Core) | Comprehensive course covering Bitcoin Whitepaper (2008), Ethereum Whitepaper (2014), and AI Network Architecture/Whitepaper (2018-2025) — from fundamentals to frontier concepts. | 0xaF16f86B... | 4/4/2026 |
| courses/ainft-web3-ai--core-ko | AINFT and Web3: The Future of AI (Core Ko) | Explore AINFT (AI NFT) as an immutable identifier for AI components, understand how it makes AI accountable, reproducible, and valuable, learn Web3 fundamentals from the evolution of Web 1.0 to Web 3.0, and discover advanced topics including DAOs, digital identity, native payments, and evolving NFTs. | 0x4c6f8C0c... | 4/4/2026 |
| courses/dao-decentralized-organizations--core | DAO & Decentralized Organizations (Core) | Comprehensive course on DAOs and decentralized organizations covering fundamentals, decision-making frameworks, real-world cases, and building strategies. | 0x51bcaedA... | 4/4/2026 |
| courses/meaning-of-decentralization--core | The Meaning of Decentralization (Core) | Explore Vitalik Buterin's seminal essay on the three axes of decentralization (architectural, political, logical), Web3 identity stack, headless brands, and distributed consensus algorithms. | 0x2be81b31... | 4/3/2026 |
| courses/nft-creator-economy--core | NFTs and the Creator Economy (Core) | Explore how NFTs enable direct creator-to-fan monetization, the thousand true fans theory, Web3 creator tools, and the evolution from skeuomorphic to crypto-native designs. | 0x00ADEc28... | 4/2/2026 |
| courses/strong-weak-technologies--core-ko | Strong and Weak Technologies (Core Ko) | Explore Chris Dixon's framework of strong vs weak technologies, a16z's 7 essential ingredients of a metaverse, Web3 go-to-market strategies with tokens, and scissor labels in narrative wars. | 0xD6699CA6... | 4/4/2026 |
| courses/token-standards-crypto--core | Token Standards and Crypto Economics (Core) | Learn ERC-20 and ERC-721 token standards, Chris Dixon's token breakthrough insights, a16z Tokenology framework, and crypto valuation methods using Metcalfe's Law, NVT and PMR ratios. | 0x75622895... | 4/3/2026 |
| courses/strong-weak-technologies--core | Strong and Weak Technologies (Core) | Explore Chris Dixon's framework of strong vs weak technologies, a16z's 7 essential ingredients of a metaverse, Web3 go-to-market strategies with tokens, and scissor labels in narrative wars. | 0x00ADEc28... | 4/3/2026 |
| courses/nft-philosophy-technology--core-ko | NFT Philosophy and Technology (Core Ko) | Explore the philosophical foundations of NFTs through Token Identity Theory and Donald Davidson's Anomalous Monism, understand the technical implementation via ERC-20/721/1155 standards, analyze market cycles and creator economics from a16z State of Crypto 2023, and discover how Vitalik's concept of legitimacy explains NFT value. | 0xEDDE6831... | 4/4/2026 |
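
Many of the course rows above describe the same core idea: the Transformer replaces recurrence with self-attention. As a rough illustration (not taken from any course above), scaled dot-product attention can be sketched in plain Python; the `attention` and `softmax` helpers here are hypothetical names for illustration only:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V,
    with Q, K, V given as lists of row vectors."""
    d = len(Q[0])  # key/query dimension used for the 1/sqrt(d) scaling
    out = []
    for q in Q:
        # Similarity of this query against every key, scaled by sqrt(d)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)
        # Each output row is a convex combination of the value rows
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

When all keys score equally (as in the degenerate example below where both keys are zero vectors), the weights are uniform and the output is simply the mean of the value rows.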
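
The Adam rows above say the optimizer "maintains exponential moving averages of both gradients and squared gradients, with bias correction for stability." A minimal sketch of that update in plain Python (the function name `adam_step` and the default hyperparameters are illustrative, though β₁=0.9, β₂=0.999, ε=1e-8 match the paper's suggested defaults):

```python
import math

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update over parameter list `theta` at step t (t starts at 1)."""
    # Exponential moving averages of gradients and squared gradients
    m = [b1 * mi + (1 - b1) * g for mi, g in zip(m, grad)]
    v = [b2 * vi + (1 - b2) * g * g for vi, g in zip(v, grad)]
    # Bias correction: counteracts initialization of m, v at zero
    m_hat = [mi / (1 - b1 ** t) for mi in m]
    v_hat = [vi / (1 - b2 ** t) for vi in v]
    # Parameter step, scaled per-coordinate by the adaptive denominator
    theta = [p - lr * mh / (math.sqrt(vh) + eps)
             for p, mh, vh in zip(theta, m_hat, v_hat)]
    return theta, m, v
```

Running this on a toy objective such as f(x) = x², whose gradient is 2x, drives the parameter toward zero, which is a quick sanity check that the moving averages and bias correction are wired up correctly.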
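
The DPO row above describes "a simple classification loss that directly optimizes language model policies on human preference data." That loss, for a single preference pair, is compact enough to sketch directly; the function name and argument names here are illustrative, and the log-probabilities are assumed to be sequence log-likelihoods under the policy and a frozen reference model:

```python
import math

def dpo_loss(logp_w, logp_l, ref_logp_w, ref_logp_l, beta=0.1):
    """DPO loss for one (chosen, rejected) pair:
    -log sigmoid(beta * [(logp_w - ref_logp_w) - (logp_l - ref_logp_l)])."""
    # Margin between the policy/reference log-ratios of chosen vs rejected
    margin = (logp_w - ref_logp_w) - (logp_l - ref_logp_l)
    return -math.log(1.0 / (1.0 + math.exp(-beta * margin)))
```

At zero margin the loss is log 2 (the classifier is at chance); it shrinks as the policy prefers the chosen response more than the reference does, with β controlling how sharply deviation from the reference is penalized.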