AI Fundamentals

Better output from ChatGPT, Gemini, and Claude? Stop polishing your prompts and start providing better context

Job van den Berg
April 15, 2026
3 min read

For years, “writing a good prompt” was the most important skill for anyone who wanted to get the most out of AI. That is no longer true. What really helps language models like ChatGPT, Claude, and Gemini today is context. Here are two techniques that work immediately.

From prompts to context: what's the difference?

A prompt is the instruction or question that you ask a language model. Context is the information you provide to help the model understand who you are, what you want to achieve, and who the output is intended for.

In the past, language models were limited enough that the precise wording of your prompt made the difference between useful and useless output. Models had to be steered almost word for word.

Modern models understand language much better. They need less rigid instructions, but they still can't guess what you mean if you don't tell them. The gap is no longer in understanding the question but in the absence of background. That is exactly where context wins out over perfect wording.

Two techniques to provide more context

Tip 1 — Ask the language model to interview you

This may feel unnatural, but it is one of the most powerful techniques. Instead of cramming all the information into one long prompt yourself, ask the model to pose the questions it needs to do the job properly.

The model then decides which context it lacks. Your answers fill exactly the gaps that would otherwise lead to vague or generic output. The result matches what you are looking for much more closely, without you having to know in advance what information to provide.

Example prompt: “Before you begin: ask me the questions you need to do this in the best way possible.”
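If you use these models through an API rather than a chat window, the interview technique becomes a two-step conversation. The sketch below only builds the message history in the widely used role/content chat format; the actual API call (to OpenAI, Anthropic, or Google) is deliberately left out, and the task and answers shown are hypothetical examples.

```python
# Sketch of the interview technique as a two-step chat conversation.
# Only the message history is built here; sending it to a real model
# API is left out, so the example runs anywhere.

INTERVIEW_SUFFIX = (
    "Before you begin: ask me the questions you need "
    "to do this in the best way possible."
)

def start_interview(task: str) -> list[dict]:
    """Open the conversation with the task plus the interview request."""
    return [{"role": "user", "content": f"{task}\n\n{INTERVIEW_SUFFIX}"}]

def add_answers(messages: list[dict], model_questions: str, answers: str) -> list[dict]:
    """Fold the model's questions and your answers back into the history."""
    return messages + [
        {"role": "assistant", "content": model_questions},
        {"role": "user", "content": answers},
    ]

# Usage: send `messages` to the model, collect its questions, answer
# them, then send the extended history back for the final output.
messages = start_interview("Write a proposal for a new client.")
messages = add_answers(
    messages,
    model_questions="Who is the client, and what is the budget?",
    answers="A mid-sized retailer; the budget is around 20,000 euros.",
)
```

The point of the sketch: your answers become ordinary conversation turns, so the model has them as context when it finally produces the deliverable.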

Tip 2 — Ask for improvement proposals after output

You've had the model write a proposal, email, or other text. Fine, but don't stop there. Ask the model: “What could you improve about this output?”

Language models are strong at iterative problem solving. They can take what already exists to the next level, but only if you give them that opportunity. By asking this question, you force the model to reflect on its own work, while also giving it more context about what works and what doesn't in the output that is already there.

Example prompt: “What could you improve about this answer? Come up with concrete suggestions for improvement or an improved version.”
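In an API setting, the improvement question is simply an extra turn appended after the model's draft. The sketch below assumes the same role/content chat format as before; `draft` stands for output the model already produced, and no real API call is made.

```python
# Sketch of the improvement question as a follow-up conversation turn.
# `draft` represents output the model already produced; this code only
# prepares the extended history, it does not call any model API.

IMPROVE_PROMPT = (
    "What could you improve about this answer? Come up with concrete "
    "suggestions for improvement or an improved version."
)

def ask_for_improvement(history: list[dict], draft: str) -> list[dict]:
    """Append the model's draft and the improvement question to the history."""
    return history + [
        {"role": "assistant", "content": draft},
        {"role": "user", "content": IMPROVE_PROMPT},
    ]

# Usage: after the model answers, feed its draft back with the question.
history = [{"role": "user", "content": "Write a short welcome email."}]
history = ask_for_improvement(history, draft="Welcome aboard! We're glad ...")
```

Because the draft sits in the history as an assistant turn, the model critiques concrete text it can see, rather than a vague memory of what it "meant" to write.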

Why do these techniques work so well?

Both tips have one thing in common: they increase the amount of relevant information the model has when it formulates an answer. That's exactly what context does.

With the interview technique, the model actively collects the missing information before it starts. With the improvement question, it reflects on the delivered output with a critical eye, and combines that reflection with what it already knows about good writing, structure, and argumentation.

In either case, you give the model more input to work with. And more relevant input almost always leads to better, more customized output.

Summary: apply immediately

Context is more important than wording: the quality of your output depends on how much relevant information you provide, not on how beautifully structured your question is.

Get interviewed: ask the language model what information it needs before it starts. Your answers form the context.

Ask for improvement: after the output is there, ask the model what could be improved. You get concrete areas for improvement or a stronger version.

Repeat: both techniques are stackable. Use them together for the strongest results.

FAQs

Why is context more important than a good prompt in AI?

Modern language models have greatly improved in understanding language. As a result, the precise wording of your question matters less and less. What does matter is the underlying information that you provide. Without context, the model lacks the information to deliver a customized response — no matter how well your prompt is written.

How do I let an AI interview me?

Send a message like, “Before you get started, ask me the questions you need to get this done properly.” The language model then responds with specific questions. Your answers provide the context with which it can deliver a much better result.

What is the difference between a prompt and context in AI?

A prompt is the instruction or question you ask. Context is the background information you provide: who you are, what your goal is, who the output is intended for and what tone of voice fits. Context makes a prompt effective.

Can I ask an AI to improve its own output?

Yes, and it works surprisingly well. After the language model has provided output, ask: “What could you improve on this?” The model then reflects on its own work and comes up with concrete areas for improvement or an improved version.

Does this work with ChatGPT, Claude and Gemini?

Yes. Both techniques work with all major language models. They are based on how language models work in general, not on the specific properties of one platform.
