Cognitive Sovereignty Self-Audit for Social Workers
This audit assesses whether AI tools are supporting your professional judgement or replacing it in child and adult social care practice. It focuses on risk assessment, documentation, and the decisions you make about people's safety.
Write your risk assessment on paper or in a separate document first, before the AI tool sees any of it. This breaks the habit of letting algorithms shape your thinking.
When an AI tool flags risk, ask yourself: what specific behaviour or fact am I seeing with my own eyes that confirms or contradicts this? Write that down.
Request a copy of any algorithmic tool's training data and bias audit. If your organisation cannot provide this, question whether it should be in use at all.
In supervision, explicitly discuss cases where your judgement differed from the AI score. Use these to sharpen your reasoning, not to second-guess yourself.
Protect direct contact time fiercely. If AI promises to free up time, agree with your manager in advance that this time goes to families, not other paperwork.