Ask many models
By Peter Hartree (@peterhartree)
Think of the various AI models as colleagues with complementary strengths.
For important tasks, send the same prompt to several models.
This is especially worth doing if you're brainstorming, doing research, or you've asked one model and felt unsatisfied.
How to quickly ask many models
Some options:
- Use method 3 from my "ready to hand" post.
- Use Perplexity or Poe.
- Try Tile, Chorus or The Multiplicity.
This list is long and not very opinionated, because I've not tried Perplexity, Poe, Tile, Chorus or The Multiplicity recently.
Currently, I use my method 3, combined with my "speed dial" extension. I'll make the extension public soon, and link it from the newsletter.
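If you'd rather script this than use a product, the fan-out itself is simple. A minimal sketch in Python: the `ask_*` functions below are hypothetical placeholders, not real SDK calls; in practice you'd wire each one to a provider's client library.

```python
# Hypothetical sketch: send one prompt to several models in parallel.
# The ask_* functions are placeholders; replace their bodies with real
# API calls (e.g. via the openai / anthropic / google-genai SDKs).
from concurrent.futures import ThreadPoolExecutor


def ask_gemini(prompt: str) -> str:  # placeholder, not a real API call
    return f"[gemini] {prompt}"


def ask_gpt(prompt: str) -> str:  # placeholder, not a real API call
    return f"[gpt] {prompt}"


def ask_claude(prompt: str) -> str:  # placeholder, not a real API call
    return f"[claude] {prompt}"


def ask_many(prompt: str, askers: dict) -> dict:
    """Send the same prompt to every model and collect replies by name."""
    with ThreadPoolExecutor(max_workers=len(askers)) as pool:
        futures = {name: pool.submit(fn, prompt) for name, fn in askers.items()}
        return {name: f.result() for name, f in futures.items()}


replies = ask_many(
    "Summarise this paper.",
    {"gemini": ask_gemini, "gpt": ask_gpt, "claude": ask_claude},
)
for name, text in replies.items():
    print(name, "->", text[:60])
```

Running the calls concurrently means the whole round takes about as long as the slowest model, not the sum of all of them.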
How to review outputs
Skim-read outputs in parallel and then continue with the best thread. If models disagree, have them critique each other's outputs.
Sometimes I'll paste all the outputs into a throwaway Google Doc (doc.new), skim-read them there, delete the unhelpful stuff and duplicates, then paste back to my LLM.
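The "have them critique each other" step amounts to building a follow-up prompt from the other models' answers. A sketch, with illustrative model names and answers (none of this is from a real session):

```python
# Hypothetical sketch: when models disagree, ask one model to critique
# the others' answers. Model names and answers here are illustrative.
def critique_prompt(question: str, answers: dict, critic: str) -> str:
    """Build a prompt asking `critic` to review the other models' answers."""
    others = "\n\n".join(
        f"### Answer from {name}\n{text}"
        for name, text in answers.items()
        if name != critic  # a model shouldn't critique its own answer
    )
    return (
        f"Original question:\n{question}\n\n"
        f"Here are answers from other models:\n\n{others}\n\n"
        "Point out any errors or omissions, then give your own best answer."
    )


answers = {"gemini": "Paris.", "gpt": "Lyon.", "claude": "Paris."}
prompt = critique_prompt(
    "What is the capital of France?", answers, critic="gpt"
)
```

You'd then send `prompt` back to the critic model (here, the one that disagreed) and compare its revised answer against the others.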
Appendix 1. Example outputs for the same prompt
Gemini 2.5 Pro was excellent (helpful and fast).
GPT-5 Pro was thorough and helpful. Main issue: it assumed I was willing to put in more effort, at this initial stage, than I actually was.
Claude Opus 4.1 was disappointing.
