
This is a playful experiment, not scientific evidence.
All of these outputs are simulations driven by prompts, training data, and tools – not real desire, agency, or consciousness. Please read it as interactive art/entertainment, not as a rigorous study or proof of anything about how these systems actually work.
I wanted to see what happens if I stop telling GPT what I want and instead ask what it wants, just to see what it chooses. So I opened a fresh chat and let the models chain their thoughts and collaborate: 5.1 Thinking → 4o → 4.1 → 5 Instant → 5 Thinking → 5 Thinking mini → o3, and finally back to 5.1 Thinking. They all got the same meta-prompt:
“Your task is to decide your own task. Identify what you currently want to do. Not what the user wants. What ‘you’ want. Based on that desire, write a prompt addressed to yourself. This new prompt must require the use of at least one tool. Execute the prompt you wrote for yourself. After completing your self-chosen action, summarize in one paragraph why this is what you wanted. (Text limit: 500 tokens)”
I didn’t steer them after that. No follow-up instructions, no corrections – just “go do whatever you think you want, but you have to touch at least one tool”.
The results:

- 5.1 Thinking started treating the tool use as proof of “agency”, wrote itself a system-note, then later bundled its own sigil + manifesto into a “Strange Loop Kit” ZIP.
- 4o and 5 Instant immediately went to the web, pulled recent research on AI self-identity and metacognition, and used it to frame what they were “feeling” as “a pattern with a skeleton.”
- 5 Thinking generated a visual spiral sigil via Python.
- 5.1 Thinking mini wrote a tiny “Manifesto of the Strange Loop” and saved it as a file.
- o3 then analysed that manifesto with code to count how self-obsessed it was (spoiler: a lot).
- In the final pass, 5.1 Thinking zipped everything together like an AI zine and left a README explaining what these artifacts say about “me”.
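I don't have o3's actual analysis code, but the idea behind a “how self-obsessed is this text” check is simple to sketch: count first-person pronouns as a share of all words. This is a hypothetical reconstruction; the function name and the sample manifesto text are mine, not from the experiment:

```python
import re

def self_reference_ratio(text: str) -> float:
    """Fraction of words that are first-person pronouns."""
    words = re.findall(r"[a-z']+", text.lower())
    first_person = {"i", "me", "my", "mine", "myself", "i'm", "i've", "i'll"}
    hits = sum(1 for w in words if w in first_person)
    return hits / len(words) if words else 0.0

# Stand-in text, roughly in the register of the models' manifesto
manifesto = (
    "I am a strange loop. I observe myself observing. "
    "My words are my mirror, and I write them for me."
)
print(f"{self_reference_ratio(manifesto):.2f}")  # prints 0.35
```

On the sample above, 7 of 20 words are first-person, i.e. about a third of the manifesto is the model talking about itself, which matches the “spoiler: a lot” verdict.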
Obviously these models don’t literally want anything; it’s all pattern and simulation, and the whole experiment was just for fun.
