Foundational Concepts

Parameters

The internal numerical values a model adjusts during training; more parameters generally mean a more capable model.

Definition

When you hear a model described as having '7 billion parameters' or '70 billion parameters,' this refers to the number of numerical values inside the model that were adjusted during training. These values collectively encode everything the model has learned. More parameters generally allow a model to capture more nuance and handle more complex tasks, but also require more computing power to run. Smaller models can now perform remarkably well on focused tasks, making them cheaper to deploy.
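To make the idea concrete, here is a minimal sketch in plain Python. It assumes a toy network built only from fully connected layers (an illustrative simplification; real language models use many other layer types) and shows two things: how a parameter count is simply the total number of trainable values, and why a 7-billion-parameter model needs a meaningful amount of memory just to store its weights. The layer sizes and the 16-bit storage assumption are hypothetical figures chosen for illustration.

```python
# A minimal sketch, assuming a toy network of fully connected layers only.
# Parameter count = total number of trainable numerical values in the model.

def linear_layer_params(inputs: int, outputs: int) -> int:
    """One fully connected layer: one weight per input-output pair, plus one bias per output."""
    return inputs * outputs + outputs

# Hypothetical three-layer network: 1,000 inputs -> 4,096 -> 4,096 -> 1,000 outputs.
layers = [(1_000, 4_096), (4_096, 4_096), (4_096, 1_000)]
total = sum(linear_layer_params(i, o) for i, o in layers)
print(f"Toy model parameters: {total:,}")  # roughly 25 million

# Rough memory needed just to hold the weights of a 7-billion-parameter model
# stored at 16-bit (2-byte) precision. An illustrative estimate, not a benchmark.
params_7b = 7_000_000_000
print(f"7B model at 16-bit precision: ~{params_7b * 2 / 1e9:.0f} GB of memory")
```

The same logic explains why smaller models are cheaper to deploy: fewer values to store means less memory, and fewer values to multiply means less computing power per response.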

Heard enough terminology — ready to talk outcomes?

We translate AI concepts into measurable business results. No upfront fees — you pay only when independently verified results are delivered.


Disclaimer

This definition is provided for educational and informational purposes only. It represents a general explanation of a technical concept and does not constitute professional, technical, or investment advice. Artificial intelligence is a rapidly evolving field; terminology, techniques, and capabilities change frequently. Coaley Peak Ltd makes no warranty as to the accuracy, completeness, or currency of the information provided. Nothing on this page should be relied upon as the sole basis for commercial, technical, legal, or investment decisions without independent professional advice.

Document reference: ISO_webpage_knowledge-base_glossary_v1

Last modified: 29 March 2026
