Glossary


These concepts are not unique to Anthropic’s language models, but we present a brief summary of key terms below.

Context window

The "context window" refers to the amount of text a language model can look back on and reference when generating new text. This is distinct from the large corpus of data the model was trained on; instead, it acts as the model's "working memory." A larger context window allows the model to understand and respond to more complex and lengthy prompts, while a smaller context window may limit its ability to handle long prompts or maintain coherence over extended conversations.
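To make the idea concrete, the sketch below checks whether a conversation is likely to fit in a model's context window. The 200,000-token limit and the four-characters-per-token heuristic are illustrative assumptions only; real tokenizers and model limits differ, so treat this as a rough sanity check, not exact behavior.

```python
# Illustrative sketch: estimate whether a conversation fits a context window.
# The token limit and chars-per-token ratio below are assumptions for
# demonstration, not the behavior of any specific model or tokenizer.

CONTEXT_WINDOW_TOKENS = 200_000  # assumed limit for illustration


def estimate_tokens(text: str) -> int:
    """Very rough estimate: about 4 characters per token for English text."""
    return max(1, len(text) // 4)


def fits_in_context(messages: list[str], window: int = CONTEXT_WINDOW_TOKENS) -> bool:
    """Return True if the running conversation likely fits in the window."""
    total = sum(estimate_tokens(m) for m in messages)
    return total <= window


conversation = [
    "Hello, how do context windows work?",
    "A context window is the model's working memory for a conversation.",
]
print(fits_in_context(conversation))
```

In practice you would use the model provider's own tokenizer or token-counting API rather than a character heuristic, but the principle is the same: everything the model should "remember" must fit within the window, or older turns must be truncated or summarized.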

