Andrej Karpathy Launches Nanochat: An Open-Source ChatGPT-Style Model Training Platform

OpenAI co-founder and Eureka Labs founder Andrej Karpathy has introduced nanochat, an open-source project designed to help users train and deploy their own ChatGPT-style models with ease. Building on the success of his earlier project nanoGPT, which focused solely on pretraining, nanochat now offers a complete end-to-end pipeline for training, fine-tuning, and inference—accessible through a simple setup and web-based interface.

Sharing the update on X, Karpathy said, “You boot up a cloud GPU box, run a single script, and in as little as four hours, you can talk to your own LLM in a ChatGPT-like web UI.” The repository spans about 8,000 lines of code, covering tokenizer training in Rust, Transformer LLM pretraining on FineWeb, supervised fine-tuning, and reinforcement learning with GRPO (Group Relative Policy Optimization). It also supports efficient inference with KV caching, letting users interact via the command line or a web UI.
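The KV caching mentioned above is a standard inference optimization: past keys and values are stored once so each new token requires only one new projection instead of recomputing attention over the whole prefix. This is a minimal, generic sketch of the idea in NumPy, not nanochat's actual implementation; all names here (`KVCache`, `attend`) are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(q, K, V):
    # Single-head scaled dot-product attention for one query vector.
    scores = (K @ q) / np.sqrt(q.shape[-1])
    return softmax(scores) @ V

class KVCache:
    """Append-only cache: keys/values for past tokens are stored once
    and reused, so decoding each new token is O(t) instead of O(t^2)."""
    def __init__(self, d):
        self.K = np.empty((0, d))
        self.V = np.empty((0, d))

    def step(self, q, k, v):
        # Append this token's key/value, then attend over the full prefix.
        self.K = np.vstack([self.K, k[None, :]])
        self.V = np.vstack([self.V, v[None, :]])
        return attend(q, self.K, self.V)

# Token-by-token decoding with the cache matches full recomputation.
rng = np.random.default_rng(0)
d, T = 4, 5
Q, K, V = (rng.normal(size=(T, d)) for _ in range(3))
cache = KVCache(d)
cached = [cache.step(Q[t], K[t], V[t]) for t in range(T)]
full = [attend(Q[t], K[:t + 1], V[:t + 1]) for t in range(T)]
assert np.allclose(cached, full)
```

The equivalence check at the end is the whole point: the cache changes the cost of decoding, not the result.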

Nanochat supports scalable model training—from a $100, 4-hour run on 8×H100 GPUs to a $1,000, 42-hour run that produces a model capable of solving math and coding problems. Karpathy calls nanochat the capstone project for his upcoming LLM101n course at Eureka Labs.