Cognitive Sovereignty Self-Audit for Social Workers

This audit measures whether AI tools are supporting your professional judgement or replacing it in child and adult social care practice. It focuses on risk assessment, documentation, and the decisions you make about people's safety.

This takes about two minutes. Answer honestly.

1. When you write a risk assessment for a child protection case, how do you use the risk score generated by your organisation's AI tool?

2. You notice that an AI tool consistently flags families from a particular ethnic background as higher risk. What do you do?

3. Your organisation uses AI to auto-populate case notes and generate summaries. How often do you actually read and edit what the AI has written before you sign off?

4. A senior colleague or manager challenges your decision on a case because it contradicts the AI risk score. How do you respond?

5. You are about to make a safeguarding decision about removing a child from a home. How much does the AI tool influence your thinking?

6. When you use ChatGPT or Copilot to help draft case notes or assessments, what do you do with the output?

7. How often do you carry out a full risk assessment without consulting an AI tool or algorithm first?

8. You feel your organisation is pushing you to adopt AI tools to reduce documentation time. What is happening to your direct contact time with service users?
