Whenever I get to a certain point in the chat, I hit this error. Then I have to open a new chat window and start over from scratch.
Error:
This model's maximum context length is 8192 tokens. However, your messages resulted in 8407 tokens. Please reduce the length of the messages.
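If the conversation is going through the chat completions API rather than the web UI, one common workaround is to trim the oldest turns before each request so the prompt stays under the 8192-token limit reported in the error. Below is a minimal sketch of that idea; the model name, the per-message token overhead, and the `REPLY_BUDGET` headroom are assumptions for illustration, not details from the post.

```python
import tiktoken

MAX_CONTEXT = 8192    # limit reported by the error message
REPLY_BUDGET = 1024   # hypothetical headroom reserved for the model's reply

def trim_history(messages, model="gpt-4"):
    """Drop the oldest non-system messages until the prompt fits the context window.

    `messages` is a list of {"role": ..., "content": ...} dicts as used by the
    chat completions API. Token counting here is approximate (content only,
    plus a small assumed per-message overhead).
    """
    enc = tiktoken.encoding_for_model(model)

    def count(msgs):
        return sum(len(enc.encode(m["content"])) + 4 for m in msgs)

    trimmed = list(messages)
    # Keep the first message (often the system prompt) and trim the oldest turn after it.
    while count(trimmed) > MAX_CONTEXT - REPLY_BUDGET and len(trimmed) > 1:
        del trimmed[1]
    return trimmed
```

In the web UI there is no equivalent control, so starting a new chat (or asking the model to summarize the conversation so far and pasting that summary into a fresh chat) is the usual way around it.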