For Graphic Designers
Graphic designers often hand their brief interpretation to AI and call that design thinking. This trades away the problem-solving work that clients actually pay for.
These are observations, not criticism. Recognising the pattern is the first step.
Designers prompt Midjourney, get four results, and pick the best one. This skips the thinking stage where you'd normally sketch, reject ideas, and develop a concept before production. You end up refining aesthetic choices instead of solving the brief.
The fix
Generate images only after you have written down three different conceptual directions that respond to the brief, then use AI to visualise each direction separately.
Pasting a client brief into ChatGPT and using its summary to guide your design skips your own critical reading. You miss the specific language, contradictions, and unstated needs that reveal what the client actually wants. AI generalisations flatten the brief.
The fix
Read the brief twice before opening any AI tool, and write three questions you need to ask the client before you start any visual work.
Canva's AI layout tool works fast, but it makes decisions about what matters based on what looks balanced, not what communicates priority. You inherit a layout instead of building one that serves the information architecture the brief demands.
The fix
Sketch the visual hierarchy on paper first, assign elements importance levels, then use Canva AI only to help execute the grid you've already decided on.
Adobe Firefly's variation tool is fast, so designers generate 20 slightly different versions of one concept. This feels productive but it's just polishing. You never test whether a completely different visual approach would solve the problem better.
The fix
Use Firefly to generate variations only after you've tested at least two separate conceptual directions in low fidelity first.
Mood boards take time. AI image generation is fast. But mood boards force you to articulate visual language, emotional tone, and reference points before you start designing. Without this step, your visual choices feel reactive instead of strategic.
The fix
Build a mood board in Figma or Pinterest with real reference images before you use Midjourney or DALL-E for client work.
AI tools have visible strengths: rich textures, smooth gradients, photorealistic rendering. Over time, designers develop a preference for these qualities because they look polished when you export them. The problem is the brief might need something AI does poorly, like bold type hierarchy or stark minimalism.
The fix
For each project, write down one visual quality the brief specifically requires that is NOT a strength of the AI tool you're using, then make sure it appears in your final work.
Midjourney and DALL-E generate colours based on their training data and your prompt. Those colours often work together beautifully on screen. But they may clash with an existing brand palette or fail accessibility standards you need to meet. You stop checking because the output looks good.
The fix
Before showing any AI-generated image to a client, drop the colour palette into a contrast checker and compare it against the brand guidelines.
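The check itself can be scripted rather than eyeballed. The sketch below uses the standard WCAG 2.1 relative-luminance formula to compute a contrast ratio between two hex colours; the colour values are placeholders, not any real brand palette:

```python
# Sketch of a WCAG 2.1 contrast check for an AI-generated palette.
# The hex values used below are placeholders, not a real brand palette.

def srgb_to_linear(c: float) -> float:
    """Linearise one sRGB channel (0-1) per the WCAG 2.1 formula."""
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_colour: str) -> float:
    """Relative luminance of a '#rrggbb' colour, 0.0 (black) to 1.0 (white)."""
    h = hex_colour.lstrip("#")
    r, g, b = (int(h[i:i + 2], 16) / 255 for i in (0, 2, 4))
    return (0.2126 * srgb_to_linear(r)
            + 0.7152 * srgb_to_linear(g)
            + 0.0722 * srgb_to_linear(b))

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio between two colours, from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black on white is the maximum possible contrast, 21:1.
# WCAG AA requires at least 4.5:1 for body text, 3:1 for large text.
print(round(contrast_ratio("#000000", "#ffffff"), 1))  # 21.0
```

Running every foreground/background pairing in a generated image through a function like this takes seconds and catches palettes that look fine on screen but fail AA for body text.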
Firefly and Midjourney now generate photorealistic images that feel complete. This makes it tempting to present them as final artwork. But photorealism without lighting intention, colour grading, and retouching often looks generic because the AI had no brief to guide it.
The fix
When you use photorealistic AI output, always spend time on colour grading, shadow control, and detail refinement to give it the specific visual signature the client's brand needs.
Certain visual styles render extremely well in current AI models: soft focus, warm colour grading, atmospheric lighting, nostalgic texture. When you need results quickly, you gravitate toward what the tool does best. Soon your portfolio starts looking like everyone else's.
The fix
When you feel drawn to an AI output's aesthetic, ask yourself if that choice responds to the brief or to the tool's capabilities. If it's the latter, regenerate with a constraint that pushes against the tool's strength.
Graphic design depends on how type, colour, and image work together. When you use AI for image generation, it's easy to focus entirely on the visual and treat type as secondary. The result is an image that looks good but doesn't communicate priority or guide the eye where it needs to go.
The fix
Mock up every piece of type against the AI image in Figma or Adobe Illustrator before you finalise anything, and test that a user's eye lands on the most important information first.
Without AI, you'd show a concept sketch and get approval before investing in finished artwork. With Midjourney and Firefly, you can have a polished visual in minutes. Clients see it and assume you're done faster, so they expect lower fees and faster turnarounds on future work.
The fix
Show concept work separately from execution work. Present a low-fidelity exploration first, get approval on direction, then show the refined version. Document the thinking steps.
A brief like 'make it feel modern and fresh' would normally force you to ask clarifying questions. With AI, you can generate 50 interpretations in an hour. This feels productive but it means you've done concept work without understanding what the client actually values. You're guessing instead of solving.
The fix
Before you open any AI tool, require yourself to extract at least five specific requirements from the brief, and ask the client to rank them by importance.
When you hand-sketch, clients see your working. When you use AI, they see only results. Without explanation, the work looks like you clicked a button. This erodes the perceived value of your judgement, and clients start requesting 'just one more variation' as if it costs nothing.
The fix
For every deliverable, include a brief written note explaining which part of the brief each visual choice addresses and why you rejected other directions.
The pressure to deliver faster often means using AI to compress the work rather than improve it. You save time but reduce the quality of thinking. Clients feel it, even if they can't name it. Work that looks fast rarely feels intentional.
The fix
When you use AI, spend the time you save on deeper client research, testing multiple directions, or refining the work beyond what the brief asked for.
If you deliver polished work in half the time, clients assume you should charge half the price. They don't see the thinking time. They see the output speed. This traps you in a race to the bottom where the only way to make money is to work faster and do less thinking.
The fix
Keep your fees tied to the strategic value you deliver, not the rendering time. Be clear with clients about the difference between concept development and asset production.