For Game Developers
GitHub Copilot can write functional game code in seconds, but you cannot maintain what you do not understand. When every studio uses the same AI asset generators and the same NPC behaviour models, the games start to feel interchangeable. The real risk is not that AI will replace you, but that you will stop making the decisions that separate your work from everyone else's.
These are suggestions. Your situation will differ. Use what is useful.
When Copilot generates a pathfinding algorithm or state machine for your NPC behaviour, you must trace through it line by line before it enters your codebase. Generated code that works is not the same as generated code you understand. Six months later, when that NPC behaves strangely under load or you need to add a new behaviour state, you will be blocked by code you cannot explain to yourself or your team. Read it as if you wrote it. Rewrite the parts that feel opaque.
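As a point of comparison, this is the level of explicitness worth demanding from generated behaviour code: a minimal state machine where every transition is visible in a single table, so there is nothing to trace that is not on the page. This is an illustrative sketch, not code from any particular engine or Copilot output; all names are hypothetical.

```python
from enum import Enum, auto


class State(Enum):
    IDLE = auto()
    PATROL = auto()
    CHASE = auto()


class NpcStateMachine:
    """An explicit transition table makes every state change traceable."""

    # (current state, event) -> next state; any pair not listed is ignored
    TRANSITIONS = {
        (State.IDLE, "player_spotted"): State.CHASE,
        (State.IDLE, "patrol_timer"): State.PATROL,
        (State.PATROL, "player_spotted"): State.CHASE,
        (State.CHASE, "player_lost"): State.PATROL,
    }

    def __init__(self):
        self.state = State.IDLE

    def handle(self, event: str) -> State:
        # Unknown (state, event) pairs leave the state unchanged,
        # which is itself a decision you should be able to defend.
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state


npc = NpcStateMachine()
npc.handle("player_spotted")  # IDLE -> CHASE
npc.handle("player_lost")     # CHASE -> PATROL
```

If the generated version of this logic is buried in nested conditionals rather than a table you can read at a glance, that is a part worth rewriting.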
Midjourney and DALL-E produce images that fit a brief. They do not make the artistic choices that made your previous games feel alive. A character model generated from a text prompt will share visual DNA with character models from every other studio using the same tool. Your game's visual identity comes from constraint and intention. If you use generative art, treat it as a base layer you transform, not as final output. The decisions you make after generation matter more than the generation itself.
AI-assisted design tools optimise for what worked before. They predict player behaviour based on existing games. This makes them excellent at producing competent, predictable experiences and terrible at producing the weird, risky choices that become the games people remember. When you use AI to suggest level layouts or NPC dialogue trees, treat those suggestions as starting points for argument, not ending points. The moments your game diverges from what the AI suggested are where your creative judgement lives.
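One practical way to keep AI-suggested dialogue as a starting point rather than an endpoint is to store it as plain data you can review, diff, and rewrite before it ships. A hypothetical sketch (node names, structure, and lines are illustrative, not from any real tool):

```python
# A dialogue tree as plain data: each node has a spoken line and a map
# from player-choice text to the next node id. AI suggestions held in
# this form are easy to inspect and argue with, node by node.
dialogue = {
    "greet": {
        "line": "You again. What do you want?",
        "choices": {"Ask about the ruins": "ruins", "Leave": "end"},
    },
    "ruins": {
        "line": "Nobody who goes in comes back the same.",
        "choices": {"Leave": "end"},
    },
    "end": {"line": "...", "choices": {}},
}


def walk(tree, node_id="greet"):
    """Yield each line along a default path (first choice every time)."""
    while True:
        node = tree[node_id]
        yield node["line"]
        if not node["choices"]:
            return
        node_id = next(iter(node["choices"].values()))


for line in walk(dialogue):
    print(line)
```

Because the tree is ordinary data, a reviewer can walk every branch and replace the generic lines with ones only your game would say.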
The cognitive risk is not dramatic. It is slow. Each time you accept a Copilot suggestion without understanding it, you lose a small opportunity to practice reading code and building mental models of how game systems work. Over two years, you have accepted hundreds of snippets. When you need to debug a complex interaction between your AI NPC behaviour and your player input system, you are working in code you only partially understand. Protect your learning by regularly writing code without assistance, especially for systems that are critical to your game's feel.
Your game's aesthetic identity comes partly from constraints you chose and partly from the tools available to you. When you and your competitors all use Inworld AI for NPC dialogue, those NPCs will sound subtly similar because they share the same training data and optimisation targets. When you all use Midjourney for environments, the environments will share the same visual language. This is not inevitable. You can recognise this convergence and make choices to avoid it. Use AI tools where they genuinely save you time on non-distinctive work, but invest your own judgement in the systems that make your game feel like yours.