Weights
- Definition
- The numerical values inside a neural network that encode everything the model has learned during training. Model weights are the core intellectual property of AI companies and the subject of intense open-vs.-closed debates.
- Why it matters
- Weights are the product of billions of dollars in compute, data, and engineering. They represent accumulated knowledge, and controlling access to them is the central business decision for every AI company. Open weights (Meta's Llama, Mistral) democratize access and build an ecosystem, but sacrifice direct monetization. Closed weights (OpenAI, Anthropic) enable API-based business models but create vendor dependency. For enterprises, the weights question determines whether you can self-host (data sovereignty), fine-tune (customization), and audit (transparency) your AI systems. The open-weight movement has been the biggest driver of AI democratization, enabling startups and researchers worldwide to build on frontier-quality models.
- In practice
- A 70B-parameter model's weights occupy approximately 140 GB in FP16 precision, requiring specialized hardware to load and run. Meta's decision to release Llama's weights publicly was arguably the most consequential open-release decision in AI history, spawning thousands of fine-tuned variants and enabling much of the open-source AI ecosystem. Weight theft is a major security concern: a single exfiltrated weight file could hand a competitor years of training investment for free. Google and OpenAI invest heavily in weight security, including physical security at data centers, encrypted storage, and access logging. The legal status of weights (are they copyrightable? patentable?) remains unsettled and varies by jurisdiction.
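The 140 GB figure follows from simple arithmetic: each FP16 parameter takes 2 bytes, so storage scales linearly with parameter count and precision. A minimal sketch (the function name is illustrative):

```python
def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate memory needed just to store a model's weights."""
    return num_params * bytes_per_param / 1e9

# 70B parameters at 2 bytes each (FP16) -> 140 GB
print(weight_memory_gb(70e9, 2))    # 140.0
# The same model at 4-bit precision (0.5 bytes) -> 35 GB
print(weight_memory_gb(70e9, 0.5))  # 35.0
```

Note this counts only the weights themselves; activations, KV caches, and optimizer state (during training) add substantially on top.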
Related terms
Parameter
A learnable value inside a neural network that gets adjusted during training. Model size is measured in parameters (e.g., 70B, 405B), which roughly correlates with capability and cost.
Open weight
A model whose trained parameters are publicly downloadable but whose training data and code may not be shared. Most 'open-source' models are technically open-weight, an important legal and strategic distinction.
Neural network
A computing architecture inspired by the brain, made of layers of interconnected nodes (neurons) that learn patterns from data. Neural networks are the fundamental building block of all modern AI.
Quantization
Reducing the numerical precision of a model's weights (e.g., from 32-bit to 4-bit) to shrink its memory footprint and speed up inference. Quantization makes it possible to run large models on consumer hardware.
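The quantization idea above can be sketched in a few lines: scale the weights so they fit in a small integer range, round, and store the integers plus one scale factor. This is a minimal symmetric 4-bit scheme using NumPy; production methods (e.g., GPTQ, AWQ) quantize per-group and are considerably more sophisticated:

```python
import numpy as np

def quantize_int4(w: np.ndarray):
    """Symmetric quantization: map floats onto integers in [-7, 7]."""
    scale = np.abs(w).max() / 7.0
    q = np.clip(np.round(w / scale), -7, 7).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from integers and the scale."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal(1024).astype(np.float32)
q, scale = quantize_int4(w)
w_hat = dequantize(q, scale)

# Rounding error is bounded by half the quantization step.
print(float(np.abs(w - w_hat).max()) <= scale / 2 + 1e-6)  # True
```

Each weight now needs 4 bits instead of 16 or 32, at the cost of the small reconstruction error bounded above, which is why quantized large models fit on consumer GPUs.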