Ask AI what it knows before asking what you need. The right context leads to better answers.
Same question, two approaches. One asks directly. The other gathers context first.
The model might answer correctly, or it might not connect the dots. Without activating relevant knowledge, it's working from whatever comes to mind first.
Hit or miss. Depends on what the model happens to recall.
AI: "In golf, each stroke adds to your score. The goal is to complete the course in as few strokes as possible. Par is the expected number of strokes..."
Then: "Yes, the lowest score wins in golf."
Relevant facts activated. Answer grounded in context.
AI knows more than it uses. Knowledge is encoded across billions of parameters, but a direct question only activates a narrow slice. When you ask AI to recall relevant facts first, those facts enter the context window, where they directly condition the answer that follows.
It's like asking yourself "What do I know about this?" before answering a question. The pause to gather context produces better, more grounded responses.
AI can generate incorrect "facts" with confidence. If a wrong fact enters the context, the final answer will likely inherit the error. For anything important, verify the recalled facts before relying on the answer. This technique works best for common knowledge — not specialized, recent, or critical information, where you should provide verified sources instead.
Before asking your real question, ask AI to recall what it knows about the topic. Those facts become context for a better answer. Simple, but effective — especially when you don't have external sources to provide.
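The two-step pattern can be sketched as a pair of prompt templates. This is an illustrative sketch, not a fixed API: the function names and prompt wording are assumptions, and you would send each prompt to your model of choice in sequence.

```python
def recall_prompt(topic: str) -> str:
    # Step 1: ask the model to surface what it knows about the topic.
    return f"Before we start, list the key facts you know about {topic}. Be concise."


def grounded_prompt(question: str, recalled_facts: str) -> str:
    # Step 2: ask the real question, with the recalled facts placed in
    # context so the answer is conditioned on them.
    return (
        "Using the facts below as context, answer the question.\n\n"
        f"Facts:\n{recalled_facts}\n\n"
        f"Question: {question}"
    )


# Usage: send recall_prompt("golf scoring") to the model, capture its
# reply as `facts`, then send grounded_prompt(question, facts).
facts = "In golf, each stroke adds to your score. Lower totals are better."
final = grounded_prompt("Does the lowest score win in golf?", facts)
```

The key design choice is that the model's own recalled facts, not your guesses, form the context — which is also why the verification caveat above matters: whatever step 1 produces, right or wrong, shapes step 2.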