== Why this matters for your work ==

'''Long conversations degrade.''' If you've noticed AI giving worse answers later in a conversation than at the beginning, this is why. As the context window fills up, the model has more text to process and older information gets less "attention." Starting a fresh conversation for a new topic isn't a sign of failure; it's good practice.

'''Uploaded documents have limits.''' When you upload a PDF or paste a long document, it consumes context window space. A 50-page report might use 20,000+ tokens, leaving less room for your actual questions and the AI's responses. If you're working with long documents, consider summarizing or extracting the relevant sections first.

'''"AI forgot what I said" is usually a context issue.''' AI doesn't have memory between conversations (unless you're using features like Claude's Projects or custom GPTs that provide persistent context). Even within a conversation, if you're 30 messages in, the AI may lose track of something you said at the beginning because it's being pushed out of the active window.

'''This is why the [[The Handoff Protocol|Handoff Protocol]] exercise matters.''' When you learn to structure handoffs between AI sessions (summarizing context, carrying forward the essential information) you're working around context window limits intelligently.
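To get a feel for how quickly a document eats context window space, you can do a back-of-the-envelope estimate before uploading. The sketch below assumes the common rule of thumb that English prose averages roughly 4 characters per token; real tokenizers vary by model, and the per-page character count used here is an illustrative assumption, not a measured figure.

<syntaxhighlight lang="python">
def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Rough token estimate for English prose.

    Assumes ~4 characters per token, a common rule of thumb.
    Real BPE tokenizers differ by model, so treat this as a ballpark only.
    """
    return round(len(text) / chars_per_token)


# Hypothetical example: a 50-page report at an assumed ~3,000 characters per page.
report_text = "x" * (50 * 3000)
print(estimate_tokens(report_text))  # ~37,500 tokens -- a big slice of most context windows
</syntaxhighlight>

If the estimate is a large fraction of the model's context window, that's the cue to summarize or extract the relevant sections first rather than pasting the whole document.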