The Bias Built Into AI, And Why It Should Scare You

Episode Summary

Tracey Spicer is one of Australia’s most respected journalists and the author of Man-Made: How the Bias of the Past Is Being Built into the Future. In this episode, Georgie sits down with Tracey for a sharp, funny, and occasionally jaw-dropping conversation about what happens when we treat AI like neutral math instead of what it really is: opinion written in code.

They unpack why algorithmic bias is getting worse in the generative AI era, how recommendation engines can quietly radicalise people (from Andrew Tate pipelines to hyper-performative “tradwife” culture), and why “move fast” without guardrails is a dangerous blueprint. The discussion also goes into the weird and unsettling frontier of humanoid home robots, privacy risks in always-on devices, and what Tracey learned researching sex robots, including the disturbing ways consent is engineered out of the product.

Plus: why Tracey’s favourite AI tool is Claude, what she thinks about Grok and the chaos machine of X, why we are not getting a four-day work week anytime soon, and her case for “regulatory sandpits” to test AI safely before it hits the rest of the world.

Chapters:

01:10 – Tracey’s TEDx “The lady stripped bare” moment and why it still matters

04:45 – Beauty standards, AI filters, and why expectations on young women have intensified

08:20 – Man-Made and the epiphany that sparked Tracey’s AI obsession

11:10 – The AI arms race, speed, and why we are in the “seatbelt era” of tech

14:30 – Digital natives vs critical thinking: the hallucination blind spot

16:45 – Tracey’s AI stack: why Claude is her daily driver

19:05 – Humanoid home robots: convenience vs surveillance

21:55 – Strength vs security: what actually scares Tracey about robots

24:35 – Sex robots and the consent problem manufacturers do not talk about

28:10 – Algorithms as “opinions in code” and how radicalisation happens

33:10 – Removing bias: conversations, perspective checks, and inclusive design

35:00 – Grok, MechaHitler, and what happens when platforms mirror their owners

36:45 – Deepfake porn, consent, and why regulation is finally catching up

38:10 – No, AI will not magically deliver a four-day work week

41:10 – Future jobs: law, AI assistants, and why juniors still need fundamentals

44:15 – Indigenous knowledge, language revitalisation, and the full-circle AI story

46:50 – Rapid fire: brain chips, Waymo, smart glasses, and AI “snog marry avoid”

49:55 – What we should do now: regulatory sandpits and real guardrails

Let's work together

We help founders scale their voice

Discover how we can help you build a media engine for your startup

Day One exists to help founders and startup operators make better business decisions more often

Subscribe for helpful content from other successful founders, operators and investors

© Copyright W2D1 Media Pty Ltd. All rights reserved. 2025
