For Arts and Culture
Protecting Artistic Judgement: A Guide for Arts Organisations Using AI
When you use Midjourney to explore visual ideas or ChatGPT to draft grant applications, you are outsourcing parts of the thinking that make your work distinctly yours. The risk is not that AI will replace artists and curators, but that the everyday use of these tools will gradually erode the specific judgements and skills that audiences come to you for. The question becomes: how do you use AI as a tool without letting it become the decision maker?
These are suggestions. Your situation will differ. Use what is useful.
Know the Difference Between Exploration and Outsourcing
Using DALL-E to generate ten variations on a visual concept is different from using it to finish work you lack the skill or patience to complete yourself. The first keeps your judgement in control. The second hands control to the software. Ask yourself before each use: am I using this tool to expand what I can think through, or am I using it to avoid the hard work of deciding? When you use Midjourney or Runway ML, the most valuable moment is afterwards, when you decide what to keep, what to discard, and what it means.
- Generate variations, then spend time explaining to yourself why one version is stronger than the others. Write this down. That explanation is your actual creative work.
- If you cannot articulate why an AI-generated option is better than the alternatives, you have not yet made a real choice. Keep thinking before you commit.
- Use AI-generated material as a starting point for your own skill work, not as a replacement for it. Treat outputs like sketch material, not finished work.
Protect Curation Judgement from Engagement Metrics
Many organisations now use AI analytics tools to identify which shows or programmes drew the biggest audiences. This data is useful, but it should inform your decisions, not make them. If you let engagement metrics and audience predictors drive what you programme, your selections will gradually narrow to what algorithms already know people like. You will show fewer debuts, fewer experimental works, fewer artists from communities that do not yet appear in historical data. Curation means deciding what culture needs to exist, not just what data says people want.
- Ask your analytics vendor explicitly: what data is this prediction based on, and what kinds of artists or work might it systematically undervalue?
- Keep a record of programming decisions you made against the AI recommendation. Review these after a year to see whether the algorithmic suggestion would have been better. Often you will have been right.
- Ensure at least 20 per cent of your annual programme comes from work that does not fit the engagement profile. Fund it from core budgets, not from earned revenue targets.
Write Grant Applications That Sound Like You
ChatGPT is tempting when funding deadlines loom. It can produce a compelling first draft in minutes. But funders receive hundreds of applications, many now written with AI assistance, and they can tell when the voice is generic rather than genuine. An application that sounds like no one in particular is easier to reject than one that sounds wrong but honest. Your grant applications are one of the few places where your actual voice and values come through on paper. That voice is what distinguishes your organisation from dozens of others applying for the same funds.
- Use ChatGPT to generate structure or to help you clarify your thinking, but write the paragraphs that reveal your actual perspective yourself. Let AI help you think, not replace your thinking.
- Read your application aloud after you finish. If you would not say it that way in conversation with the funder, rewrite it.
- Ask a colleague who knows your organisation well: does this sound like us, or could it be anyone? If they hesitate, start over with a blank page.
Stay Honest About What Humans Made
If your artwork includes AI-generated material, your audience deserves to know. The question of authorship and authenticity in art is not yet resolved, and it may never be fully resolved. But transparency is something you control right now. When you use Midjourney imagery, Adobe Firefly for design elements, or AI assistance in any part of your process, tell people. Label it. Explain your reasoning. To serious audiences, this honesty can matter as much as the work itself, because it shows that you are still the one making real choices about what gets made and how.
- Include a brief artist's statement whenever AI is part of your process. Say what software was used and why you chose to use it. Say what you did with the output.
- Consider how AI involvement changes the work's meaning or message. Is that change intentional? If not, remove the AI element.
- In institutional contexts, disclose AI-assisted material in your annual reports and public documentation. This builds trust and sets standards others will follow.
Build Institutional Practices That Require Judgement
Organisations that maintain strong artistic and curatorial cultures are the ones that have explicit conversations about how AI fits into their decision-making processes. This is not a one-time policy but an ongoing practice. When hiring curators, programmers, or artists, prioritise people who can articulate their aesthetic judgements clearly and defend them. When reviewing internal processes that use AI recommendations, always ask: what would we decide if this data did not exist? If the answer is that nothing would change, you have handed your judgement to the software. Create systems where human disagreement and debate are built into how you work.
- In curatorial meetings, require at least one person to argue against AI-recommended programming choices. Make this a formal role, rotated among staff.
- When adopting new AI tools, pilot them with small teams first. Have people use both the AI recommendation and their own judgement, then compare results over time.
- Schedule quarterly reviews of how AI is shaping your programme, collections, or hiring. Invite artists and community members to these conversations, not just staff.
Key principles
1. Use AI to expand your thinking, not to replace the hard work of deciding what your art or programme actually means.
2. Protect your organisation's distinctive voice and values from being smoothed into algorithmic averages by engagement metrics.
3. Tell your audiences when AI has been part of your creative or curatorial process, because transparency about authorship is what earns and keeps their trust.
4. Keep human disagreement and debate at the centre of how you make decisions, especially in curation and programming.
5. The most important judgements in arts and culture are about what culture needs to exist, not what data says people already like.
Key reminders
- When you use generative tools like Midjourney or DALL-E, spend as much time deciding what to reject as you spend generating options. That rejection process is where your actual creative judgement lives.
- Before adopting any AI analytics tool for programming or curation, ask the vendor what kinds of artists, communities, or work the system is likely to undervalue. The answer is always something.
- Write your own grant applications. If you must use ChatGPT, only use it on structural questions like 'what should I explain first' rather than asking it to write sentences for you.
- Document your AI-assisted creative decisions the same way you would document any other part of your practice. This creates a record that helps you stay honest about what you actually decided and why.
- Invite artists and community members into conversations about how your organisation uses AI. They will often spot problems your staff have become blind to: staff use the tools daily, while outsiders see them with fresh eyes.