
Episode Summary:
Sarah Kaur is not your typical technologist, and that’s the point. As Principal Design Strategist at Portable, she brings an artist’s eye and a community-first mindset to the world of AI and justice tech. In this episode, Sarah joins Georgie Healy to explore what it really means to build AI that works with people, not just on them. They dive into the hidden biases in design, the pitfalls of “black box” systems, and the real cost of leaving communities out of the development process. From building tools in Australia’s family law system to dreaming up a sovereign digital twin that protects your data, Sarah’s approach to ethical innovation is bold, reflective, and deeply grounded in care. Whether you’re designing for scale or simply wondering how AI can be made more equitable, this conversation will give you a new lens on what it means to build technology that respects, adapts, and listens.
Chapters:
03:20 – From Artist to AI Strategist: Sarah’s Unconventional Path
07:42 – What Human-Centred AI Actually Means
11:57 – Sitting in Ambiguity: The Black Box Camera Experiment
14:40 – Using ChatGPT to Surface Personal Bias
18:20 – What the Public Doesn’t Understand About AI Design
22:32 – Why Co-Design Matters More Than Ever
30:32 – Designing With Communities, Not Just for Them
35:40 – When Explainability Matters More Than Accuracy in AI
40:00 – Inside Portable’s AI Sprint Model
43:02 – Ethical AI vs. Imaginative AI
47:31 – Dream Tech: A Sovereign Digital Twin That Asks for Consent
Resources:
👩‍💻 Sarah Kaur on LinkedIn – https://www.linkedin.com/in/sarahkaur/
🏢 Portable – https://www.portable.com.au/
📘 Amica (Family Law Tool) – https://www.amica.gov.au/
🛠️ AI Sprint Model by Portable – https://www.portable.com.au/articles/ai-design-sprints