40 Questions Marketing and Advertising Should Ask Before Trusting AI

AI tools now generate creative, plan media, and segment audiences faster than any human team. The risk is that speed replaces judgement, and what gets measured improves while what makes a brand memorable gets lost. Asking the right questions before you act on an AI recommendation is the difference between a campaign that works and one that merely runs.

These are suggestions. Use the ones that fit your situation.

Creative Strategy and Brand Distinctiveness

1 When Midjourney or Adobe Firefly generates three visual directions, how many of them look like they could belong to a competing brand in the same category?
2 Does the AI-generated copy for this campaign say something about the client's brand that only this brand could say, or could it run unchanged for a competitor?
3 If you removed the logo from the creative output Claude or ChatGPT produced, could a consumer still recognise the brand?
4 What craft decision did the AI make about tone, visual style, or message priority, and is that decision based on what works for this specific brand or on statistical averages across similar brands?
5 Has anyone on your team articulated the strategic reason why this particular audience would care about this message, or has the AI recommendation skipped straight to message variants without strategy?
6 When you prompted the AI tool, did you give it your brand strategy document and competitive context, or did you give it only the product brief?
7 Does this creative idea depend on a human insight about your audience that the AI could not have discovered on its own?
8 If the AI generated ten creative variations, are the meaningful ones genuinely different or are they surface-level rephrasings of the same idea?
9 Can you explain to the client why you chose this direction over the others in brand strategy language, or only in efficiency metrics?
10 What would this campaign look like if you had spent two days on strategy before you opened the AI tool?

Media Buying, Targeting, and Audience Judgment

11 When Google Performance Max recommends an audience segment, what is it optimising for: your stated conversion goal or its own model of user behaviour that may not match your brand's actual customer?
12 Does the media plan reflect thinking about who your brand can genuinely win with, or has it defaulted to the largest addressable audience?
13 If the AI tool recommends bidding more heavily on audiences that are statistically similar to past converters, does that overlap with audiences you actively want to avoid?
14 What audience segment is the algorithm excluding, and would you lose brand equity by being absent from that segment?
15 Does the media buying recommendation account for the fact that your brand may need to reach people who have never heard of you, or only people who already look like existing customers?
16 When Performance Max or a programmatic buying AI allocates budget across channels, is it favouring short-term conversion metrics over channels that build brand memory?
17 Has anyone checked whether the AI's audience definition matches what your client actually knows about their customer, or does it contradict what your account team learned in the last strategy meeting?
18 If you removed AI-driven audience targeting and ran the media plan based on human strategic selection of three core audiences, how different would the results be?
19 Does the AI recommendation explain why this audience matters for this brand, or does it present the numbers without the reasoning?
20 What happens to brand distinctiveness if you let the algorithm reach whoever is cheapest to convert, rather than whoever is most valuable for building brand equity?

Measurement, Effectiveness, and What Actually Gets Counted

21 The metric the AI is optimising for (clicks, conversions, efficiency ratio) is what you are measuring. What are you not measuring that your client actually cares about?
22 Has anyone run this campaign against a control group that received no AI optimisation, so you know whether the AI is actually improving results or just shifting spend toward lower-hanging fruit?
23 If this campaign achieves its efficiency targets but brand awareness or brand preference declines in tracking studies, will the AI-driven approach still be considered successful?
24 The AI tool reports that creative variation B outperformed variation A. On what metric, over what timeframe, and among which audience subset?
25 Is the campaign being measured on what the brand strategy said success looks like, or on whatever metric the AI platform makes easiest to track?
26 When you compare this AI-optimised campaign to the last campaign a human planner optimised, are you comparing them on the same metrics or different ones?
27 Has anyone audited the AI's reporting to check whether it is attributing results fairly across channels, or is it taking credit for activity that would have happened anyway?
28 If short-term conversion improves but your client's brand is becoming less distinctive than competitors, does your measurement framework catch that decline before the client does?
29 The AI recommends cutting spend on an audience segment that is expensive to convert. What is the lifetime value of that audience, and is the AI factoring it in?
30 What would you need to measure to prove that this campaign built brand equity, as opposed to simply shifting conversions from expensive channels to cheap ones?

Team Judgement, Craft Knowledge, and Decision-Making Power

31 When the account lead makes a recommendation that contradicts what the AI tool suggests, what happens to that recommendation?
32 Does your team still have a strategy document that exists before the AI recommendations, or has the strategy become whatever the AI outputs?
33 Can the person running the AI tools explain in plain language why they prompted the model the way they did, or are they following a standard template?
34 If the AI recommends a direction that feels wrong to your most experienced strategist, do you have permission to reject it, or does the client expect you to follow the algorithm?
35 Who decides what counts as a good outcome: the metrics dashboard or a human conversation about what the brand needed to achieve?
36 Has anyone on your team spent time recently learning what actually makes a campaign effective in your client's category, or has that knowledge been replaced by prompt engineering?
37 When you onboard a new team member, do you teach them media strategy and creative principles or do you teach them how to use the tools?
38 If the client asks 'Why did you choose this direction?', can your team give an answer that is based on brand thinking, or only on what the AI generated?
39 What is one campaign decision from the last three months that was made by a human account person instead of by algorithm, and what was the result?
40 Does your retainer agreement with the client include time for strategy and judgement, or has it been squeezed down to time for generating and testing variations?

The Book — Out Now

Cognitive Sovereignty: How To Think For Yourself When AI Thinks For You

Read the first chapter free.
