I think I have a record for the most tokens processed / Refresh your results to reduce errors?

First, please take a look at that token count… seems off to me, right?

https://preview.redd.it/l6kfk4v65h3g1.png?width=2348&format=png&auto=webp&s=115ee86b6394e8f35b0d5cdf5be0a745ad843b80

https://preview.redd.it/7m12gd2z5h3g1.png?width=1746&format=png&auto=webp&s=4d4409f0dc72b94e96155c89fafb6c57bb1f6eb4

Weirder yet, the responses in this conversation were originally produced with phrases missing and sentences cut off halfway. One earlier discussion, a guide to downloading software for vocab study, was completely incomprehensible: two conversations, the actual instructions and then an FAQ, were mashed together into one paragraph. I abandoned that thread and tried this one instead, in which Gemini created some listening questions based on YouTube audio. Once again there were missing phrases and similar oddities, but when I refreshed my history, the output mostly resolved itself.

Look at Question B, option A, or Question C, option C for the easiest examples.

Huh?

I didn't change the prompt; I just "refreshed" the results by selecting one prompt from the history, then another.
