40 Questions Management Consultants Should Ask Before Trusting AI Research
When a client pays for your analysis, they are paying for your independent judgment, not a polished summary of what ChatGPT found. These questions help you identify where AI has filled gaps in your thinking rather than replaced your thinking.
These are suggestions. Use the ones that fit your situation.
Before you use AI research in a client deliverable
1. Can you name the specific source documents where this finding appears, or have you only seen it in the AI summary?
2. If you removed this AI output from your analysis, could you still make the same recommendation to your client?
3. Has the AI tool cited sources for its claims, and have you actually opened those sources to verify the claim is accurately represented?
4. Does this insight challenge an assumption you held before asking the AI, or does it simply confirm what you already believed?
5. Which person in your client organisation would be most likely to spot that this finding is wrong, and would they accept it without pushback?
6. Have you found contradictory information about this topic, and if so, why did you choose to trust the AI output over the contradiction?
7. Could you explain why this pattern exists, or are you repeating an explanation the AI provided without testing it?
8. If a competitor consultant presented this same analysis to your client next month, what would they ask you that you cannot answer?
9. Does this research support the recommendation you want to make, or did you ask the AI to find support after you had already decided?
10. What would change your mind about this finding, and is that test something you can actually perform before the presentation?
When using ChatGPT, Claude, or Perplexity for analysis
11. Did you prompt the AI with your own analysis first, or did you ask it to generate analysis from scratch?
12. Have you given the AI conflicting instructions that might explain why it generated multiple competing frameworks in the same response?
13. Does the AI's summary of an industry report match what you read when you actually opened the report?
14. When Perplexity shows you sources, have you actually clicked through to confirm the claim appears in that source?
15. Has the AI generated a list of competitors, and do you recognise any as companies that do not actually compete with your client?
16. If you asked Claude the same question twice with different phrasings, did you get substantially different answers?
17. Are you using the AI to fill knowledge gaps you should probably close yourself before the client meeting?
18. Did you tell the AI what country or region the client operates in, and if not, how reliable is that data without context?
19. Have you tested the AI's numbers by checking them against the most recent annual report of the companies mentioned?
20. When the AI cites a stat about market size or growth rate, have you found the original source, or are you relying on the AI's citation?
When building strategic frameworks or client slides
21. Is this framework something you could defend in a room where someone disagrees, or does it feel persuasive only when you read the AI's reasoning?
22. Could you draw this framework from memory after a week, or have you outsourced the understanding to the AI?
23. Have you tested this framework against a client situation you worked on in the past, or are you presenting it untested?
24. Does the AI's framework actually match the problem your client described, or did you reshape the problem to fit the framework?
25. If you presented three different AI-generated frameworks to your client and asked them to choose, which one would they pick and why?
26. Have you identified at least one limitation of this framework that the AI did not mention?
27. When you build the slide deck, are you using the AI's language or translating it into language that reflects your actual understanding?
28. Does this framework help your client see something they could not see before, or does it simply organise information they already had?
29. Have you worked through the framework with a junior consultant to see where it breaks down in real application?
30. If your client asks what you would do differently if this framework were wrong, do you have an answer?
Questions about your own analytical practice
31. In the last three months, how many client insights came from your own analysis versus from AI-generated research you then shaped?
32. Can you recall a recent client deliverable where you disagreed with what an AI tool produced and chose not to use it?
33. Are you asking AI to do your initial problem structuring, and if so, when did you stop doing that work yourself?
34. Have you noticed yourself reading AI output less critically because it comes from a tool you trust?
35. When a junior consultant brings you analysis, can you still tell whether they did it themselves or assembled it from AI?
36. If an AI tool became unavailable tomorrow, which core parts of your analytical process would stop functioning?
37. How much time do you spend challenging your own thinking versus looking for AI tools that will do the thinking for you?
38. Have you found yourself presenting frameworks you do not fully believe because they came from an AI with high credibility?
39. When you disagree with an AI analysis, do you dig into why you disagree, or do you simply choose a different tool?
40. What skills have you stopped practising because AI now handles them?
How to use these questions
When Perplexity cites a source, click through before the client meeting. AI misrepresentation of sources is common enough that this five-minute check can save your credibility.
Ask yourself: would I stake my reputation on this analysis if the client fact-checked it internally? If the answer is no, the analysis is not ready for the deck.
Junior consultants develop judgment by doing difficult research themselves. If you hand them AI outputs to polish, you are training presenters, not analysts. Give them the messy problem first.
When you notice you are asking an AI tool to do something you used to do manually, stop and do it manually one more time. You need to stay ahead of the tool, not behind it.
The competitive advantage of consulting is not speed. If you are trading analytical rigour for delivery speed, you are competing on cost, not value. Slow down on the analyses that matter.