The Bias Built Into AI, And Why It Should Scare You


Tracey Spicer is one of Australia’s most respected journalists and the author of Man-Made: How the Bias of the Past Is Being Built into the Future. In this episode, Georgie sits down with Tracey for a sharp, funny, and occasionally jaw-dropping conversation about what happens when we treat AI like neutral math instead of what it really is: opinion written in code.

They unpack why algorithmic bias is getting worse in the generative AI era, how recommendation engines can quietly radicalise people (from Andrew Tate pipelines to hyper-performative “tradwife” culture), and why “move fast” without guardrails is a dangerous blueprint. The discussion also goes into the weird and unsettling frontier of humanoid home robots, privacy risks in always-on devices, and what Tracey learned researching sex robots, including the disturbing ways consent is engineered out of the product.

Plus: why Tracey’s favourite AI tool is Claude, what she thinks about Grok and the chaos machine of X, why we are not getting a four-day work week anytime soon, and her case for “regulatory sandpits” to test AI safely before it reaches the rest of the world.

Chapters

01:10 – Tracey’s TEDx “The lady stripped bare” moment and why it still matters

04:45 – Beauty standards, AI filters, and why expectations on young women have intensified

08:20 – Man-Made and the epiphany that sparked Tracey’s AI obsession

11:10 – The AI arms race, speed, and why we are in the “seatbelt era” of tech

14:30 – Digital natives vs critical thinking: the hallucination blind spot

16:45 – Tracey’s AI stack: why Claude is her daily driver

19:05 – Humanoid home robots: convenience vs surveillance

21:55 – Strength vs security: what actually scares Tracey about robots

24:35 – Sex robots and the consent problem manufacturers do not talk about

28:10 – Algorithms as “opinions in code” and how radicalisation happens

33:10 – Removing bias: conversations, perspective checks, and inclusive design

35:00 – Grok, MechaHitler, and what happens when platforms mirror their owners

36:45 – Deepfake porn, consent, and why regulation is finally catching up

38:10 – No, AI will not magically deliver a four-day work week

41:10 – Future jobs: law, AI assistants, and why juniors still need fundamentals

44:15 – Indigenous knowledge, language revitalisation, and the full-circle AI story

46:50 – Rapid fire: brain chips, Waymo, smart glasses, and AI “snog marry avoid”

49:55 – What we should do now: regulatory sandpits and real guardrails


Day One® exists to help founders and startup operators make better business decisions more often

Subscribe for helpful content from other successful founders, operators and investors

Join 755 other founders & investors receiving our emails. They're cool, are you? :)


Day One® is a registered trademark of W2D1 Media Pty Ltd. All rights reserved. © 2026 W2D1 Media Pty Ltd.
