LLM Context Window Visualizer

See how your prompts fill the context window. Import files, folders, or GitHub repos to measure token usage.

[Interactive visualizer: a 200,000-token context window gauge with a usage breakdown (System, User, Files, Output, Remaining); drop zones for files, folders, and GitHub repo links; an output-length slider from short reply to long response; and running totals for tokens used, tokens available, file count, characters, and words.]

Understanding Context Windows

A context window is the maximum amount of text (measured in tokens) that an AI model can process in a single conversation. This includes your system prompt, all messages in the conversation, any files or code you include, and the space needed for the AI's response.
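This budgeting can be sketched in a few lines. The 200,000-token window below matches the visualizer's default; the component names and figures are illustrative, not tied to any particular model:

```python
# Sketch of context-window budgeting. The 200,000-token window is an
# illustrative default; real models vary.
CONTEXT_WINDOW = 200_000

def remaining_tokens(system: int, messages: int, files: int, reserved_output: int) -> int:
    """Tokens still available after the prompt components and the space
    reserved for the model's response are accounted for."""
    used = system + messages + files + reserved_output
    return CONTEXT_WINDOW - used

# e.g. a 1,200-token system prompt, 4,500 tokens of conversation,
# 150,000 tokens of imported files, and 8,000 tokens reserved for output:
left = remaining_tokens(system=1_200, messages=4_500, files=150_000, reserved_output=8_000)
# leaves 36,300 tokens of headroom
```

If `remaining_tokens` goes negative, the request exceeds the window and something must be trimmed or summarized.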

What are tokens?

Tokens are the chunks of text a model actually reads: often whole words, but also word fragments, punctuation, and whitespace. In English prose, one token averages roughly 4 characters, or about ¾ of a word. Code typically consumes more tokens per character because of symbols, identifiers, and formatting.
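The 4-characters-per-token rule of thumb can be written as a one-line estimator. This is a heuristic only; a real BPE tokenizer will produce different counts, especially on code:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4-characters-per-token heuristic
    for English prose. A real tokenizer will differ, especially on code."""
    return round(len(text) / 4)

estimate_tokens("The quick brown fox")  # 19 characters -> ~5 tokens
```

For an accurate count, run the actual tokenizer for your target model; the heuristic is only good enough for budgeting.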

Why import files?

When using AI for code review or analysis, the relevant parts of your codebase must fit in the context window alongside your prompt and the model's response. This tool measures how much of that space your files will consume before you send them.
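A minimal sketch of that measurement, assuming the same 4-characters-per-token heuristic: walk a directory, read each text file, and tally files, characters, and estimated tokens (binary and unreadable files are skipped):

```python
from pathlib import Path

def measure_directory(root: str, chars_per_token: float = 4.0) -> dict:
    """Tally how many tokens the text files under `root` would consume,
    using a rough characters-per-token heuristic."""
    totals = {"files": 0, "chars": 0, "tokens": 0}
    for path in sorted(Path(root).rglob("*")):
        if not path.is_file():
            continue
        try:
            text = path.read_text(encoding="utf-8")
        except (UnicodeDecodeError, OSError):
            continue  # skip binary or unreadable files
        totals["files"] += 1
        totals["chars"] += len(text)
        totals["tokens"] += round(len(text) / chars_per_token)
    return totals
```

Comparing `totals["tokens"]` against the window size (minus space reserved for the response) tells you whether the import will fit.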