Note: This wiki will over time grow into an efficient way to learn Generative AI (GPT) in a contiguous 72-hour period (optionally including sleep). For now it is a collection of notes and links. The goal is for this to become an Alexey Guzey-style intense, minute-by-minute curriculum.
Goal
Hacker Retreats are fun. We’ve done music-themed ones, ham-radio ones, etc. The goal of this web page is to compose a curriculum, or simply an intense schedule, that would guide a group of hackers from point A to point Z in building a ChatGPT-like system from scratch.
What
- Books
- Videos
- HN
  - Show HN: Train a GPT to write like Shakespeare - from scratch, in one Python file | Hacker News
  - Show HN: GPT-2 from scratch in PyTorch – Video tutorial | Hacker News
  - WyGPT: C++ GPT Language Model from Scratch | Hacker News
  - https://news.ycombinator.com/item?id=34726115
- Articles
  - How to Build an LLM from Scratch | Towards Data Science
  - https://youtu.be/UU1WVnMk4E8
  - Fine-Tuning Overview — The GenAI Guidebook
  - Neural Networks: Zero To Hero
  - https://nandgame.com/
  - https://www.nand2tetris.org/
  - GPT in 60 Lines of NumPy | Jay Mody
  - Speeding up the GPT - KV cache | Becoming The Unbeatable against AGI (the trick is sketched in code after this list)
  - https://course.fast.ai/
  - Box2D - Gymnasium Documentation
  - learn-pytorch/SAC.ipynb at main · DevJac/learn-pytorch · GitHub
  - Grokking Deep Reinforcement Learning - Miguel Morales
  - Welcome to Spinning Up in Deep RL! — Spinning Up documentation
  - TDDC17 > Lab 5: Reinforcement Learning
  - Sutton & Barto Book: Reinforcement Learning: An Introduction
  - Mosaic LLMs: GPT-3 quality for <$500k
  - How to build a large language model from scratch | Online Courses, Learning Paths, and Certifications - Pluralsight
  - Training a causal language model from scratch - Hugging Face LLM Course
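
A taste of the "GPT in 60 Lines of NumPy" and KV-cache links above, in code. This is my own minimal sketch (NumPy-only, a single attention head with random weights; not code from either post) of why caching keys and values speeds up autoregressive decoding: each new token computes only its own query, key, and value, attends over the cached history, and still matches a full recompute of causal attention.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16                                             # head dimension
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attend_full(X):
    """Recompute causal attention over the whole sequence (O(T^2) per step)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    T = X.shape[0]
    mask = np.triu(np.full((T, T), -np.inf), k=1)  # hide future positions
    return softmax(Q @ K.T / np.sqrt(d) + mask) @ V

def attend_cached(x_new, cache):
    """Process one new token, reusing cached K/V from earlier steps (O(T))."""
    q, k, v = x_new @ Wq, x_new @ Wk, x_new @ Wv
    cache["K"].append(k)
    cache["V"].append(v)
    K, V = np.stack(cache["K"]), np.stack(cache["V"])
    return softmax(q @ K.T / np.sqrt(d)) @ V       # only the newest row

X = rng.normal(size=(5, d))                        # five token embeddings
cache = {"K": [], "V": []}
stepwise = np.stack([attend_cached(x, cache) for x in X])
assert np.allclose(stepwise, attend_full(X))       # same output, far less work
```

Per token, the cached path does O(T) work instead of O(T^2); a real GPT just does the same thing per layer and per head.
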
How (time management)
- apply Alexey’s lessons: My 2022 self (I don't know them) was very wrong about meditation, huge monitors, and... sleep. - Alexey Guzey
- use a local LLM (via Gerganov’s llama.cpp) to avoid online distractions: GitHub - ggml-org/llama.cpp: LLM inference in C/C++
- Ultralearning - Scott Young’s MIT challenge
- Memorizing a programming language using spaced repetition software | Derek Sivers (the scheduling idea is sketched in code after this list)
- https://tracingwoodgrains.medium.com/speedrunning-college-my-plan-to-get-a-bachelors-degree-within-a-year-abe741c4c8bb
- How To Remember Anything Forever-ish
- Effective learning: Twenty rules of formulating knowledge - SuperMemo
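
For the spaced-repetition links above (Sivers, SuperMemo), here is a minimal sketch of SM-2, the classic SuperMemo scheduling algorithm that most flashcard tools descend from. The grade scale and the ease-factor formula are the published SM-2 ones; the Card class and review function are my own framing.

```python
from dataclasses import dataclass

@dataclass
class Card:
    interval: int = 0     # days until the next review
    repetitions: int = 0  # consecutive successful reviews
    ease: float = 2.5     # ease factor; SM-2 starts every card at 2.5

def review(card: Card, grade: int) -> None:
    """Update a card after a review graded 0 (blackout) to 5 (perfect)."""
    if grade < 3:
        card.repetitions, card.interval = 0, 1     # failed: relearn tomorrow
    else:
        card.repetitions += 1
        if card.repetitions == 1:
            card.interval = 1
        elif card.repetitions == 2:
            card.interval = 6
        else:
            card.interval = round(card.interval * card.ease)
    # Published SM-2 ease update, floored at 1.3 so intervals keep growing.
    card.ease = max(1.3, card.ease + 0.1 - (5 - grade) * (0.08 + (5 - grade) * 0.02))

card = Card()
for grade in (5, 5, 4, 5):
    review(card, grade)
    print(card.interval, round(card.ease, 2))      # intervals: 1, 6, 16, 43
```
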
More Tools
- My 4-monitor computer setup (16-inches + 49-inches + 34-inches + 24-inches) - Alexey Guzey
- Worker League app by Misha Yagudin (https://twitter.com/mishayagudin)
Who (can help)
Is this even a good idea
- not sure - https://twitter.com/ID_AA_Carmack/status/1200111228460376064
- maybe yes - https://twitter.com/hardmaru/status/971297318665322498
- definitely yes - https://twitter.com/ID_AA_Carmack/status/970512819107090432
- No! - Cal Newport would likely disagree with speedrunning this (though maybe not, since this is more like cramming than creating something new) - My new book: Slow Productivity - Cal Newport
Prereqs
- Focus
- Food
- Machines ready to go
- Slow Internet? (Distraction-free internet)
- Pen & Paper
- Felt-tip pen, like Bill Gates - Memex: Today's Interesting Links (Bill Gates writing with felt-tip pens; competing with ed) (2024-01-15)