LLM Context Window Visualizer
See how your prompts fill the context window. Import files, folders, or GitHub repos to measure token usage.
Understanding Context Windows
A context window is the maximum amount of text (measured in tokens) that an AI model can process in a single conversation. This includes your system prompt, all messages in the conversation, any files or code you include, and the space needed for the AI's response.
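The arithmetic is simple: everything listed above shares one fixed token budget. A minimal sketch, using made-up numbers rather than any real model's limits:

```python
# Rough context-window budget. All numbers are illustrative
# assumptions, not the limits of any specific model.
CONTEXT_WINDOW = 128_000  # hypothetical model limit, in tokens

budget = {
    "system_prompt": 500,
    "conversation_history": 6_000,
    "attached_files": 90_000,
    "reserved_for_response": 4_000,  # space the model needs to answer
}

used = sum(budget.values())
remaining = CONTEXT_WINDOW - used
print(f"used {used} of {CONTEXT_WINDOW} tokens, {remaining} remaining")
```

If `used` exceeds the window, something has to be trimmed before the model can respond.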
What are tokens?
Tokens are pieces of words. In English, one token is roughly 4 characters or ¾ of a word. Code typically consumes more tokens per character because of punctuation, symbols, and whitespace-heavy formatting.
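The 4-characters-per-token rule of thumb is enough for a quick estimate. A minimal sketch (a real tokenizer such as a BPE tokenizer will give different counts, especially on code):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters/token rule of thumb."""
    # Real tokenizers split on learned subword boundaries; this
    # heuristic only approximates typical English prose.
    return max(1, round(len(text) / 4))

# 44 characters / 4 ≈ 11 tokens
estimate_tokens("The quick brown fox jumps over the lazy dog.")
```

For exact counts you would run the model's own tokenizer, but the heuristic is close enough for budgeting.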
Why import files?
When using AI for code review or analysis, your code has to share the context window with the prompt, the conversation, and the model's response. This tool measures how much of that space your files will consume before you paste them in.