177 – Decision Trees and Butlerian Jihads with Matt Freeman

Matt Freeman gives a primer on how to use Bayesian decision making in everyday life via decision trees. We also discuss utilitarianism, the Guild of the Rose, and recent AI advances. We are now in the early singularity.
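As a quick illustration of the kind of calculation the episode's decision trees formalize, here is a minimal Python sketch that picks the option with the highest expected value. The options, probabilities, and payoffs are hypothetical, not examples from the episode.

```python
# A toy decision tree: each option branches into possible outcomes,
# given as (probability, payoff) pairs. Pick the option with the
# highest expected value. All numbers here are hypothetical.

options = {
    "take the new job": [(0.6, 20_000), (0.4, -5_000)],
    "stay put":         [(1.0, 2_000)],
}

def expected_value(outcomes):
    """Sum of probability-weighted payoffs for one branch."""
    return sum(p * payoff for p, payoff in outcomes)

for name, outcomes in options.items():
    print(f"{name}: EV = {expected_value(outcomes):+,.0f}")

best = max(options, key=lambda name: expected_value(options[name]))
print("Highest expected value:", best)
```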

Guild of the Rose:
Playlist of Decision Tree “lectures”
Guild workshop page 1
Guild workshop page 2
Guild workshop page 3

Reflecting on the 2022 Guild of the Rose Workshops

How to identify whether text is true or false directly from a model’s *unlabeled activations*.

Zvi On the Diplomacy AI

AI superpowers – can tell race via chest X-ray. Can tell sex via retina scan (87% accuracy).

0:00:23 Practical Decision Trees
1:05:46 Utilitarian & Other Ethics considerations
1:31:01 Guild of the Rose info
1:39:02 Bayesian AI Corner
2:08:18 Should we have a Butlerian Jihad?
2:31:36 Thank the Patron!


Hey look, we have a discord! What could possibly go wrong?

Our Patreon page, your support is most rational, and totally effective. Also merch!


Rationality: From AI to Zombies, The Podcast

LessWrong Sequence Posts Discussed in this Episode:

Nope

Next Episode’s Sequence Posts: (for real this time)

0 And 1 Are Not Probabilities

Beautiful Math


176 – Circling with Aella

Aella comes on the show to tell us what the heck Circling is and why we should do it.

0:01:21 How Many Rationalists Are There?
0:04:40 Circling
1:09:42 LW Posts
1:41:35 Thank the Patron!

Aella’s Blog

The Patreon bit where we discuss The Murder Button

Circle Anywhere

Authentic Relating International

Julia Galef on Bayes’ Equation
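For reference (not part of the original show notes), the equation the Julia Galef link walks through is the standard form of Bayes' theorem:

```latex
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E)}
            \;=\; \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
```

In words: the posterior probability of hypothesis H given evidence E is the prior P(H), scaled by how much more likely the evidence is under H than it is overall.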


Hey look, we have a discord! What could possibly go wrong?

Our Patreon page, your support is most rational, and totally effective. Also merch!


Rationality: From AI to Zombies, The Podcast

LessWrong Sequence Posts Discussed in this Episode:

Absolute Authority

Infinite Certainty

Next Episode’s Sequence Posts: (for real this time)

0 And 1 Are Not Probabilities

Beautiful Math


175 – FTX + EA, and Personal Finance

David from The Mind Killer joins us. We got to talking about the FTX collapse and some of the waves it sent through the Effective Altruism community. Afterwards, David helps us out with personal finance.

0:00:00 Intro
0:02:08 The Sequences Now Formatted For Zoomers
0:11:25 FTX Collapse & EA Shockwaves
0:54:20 Personal Finance
1:43:20 Thank the Patron!

The Sequences In A Zoomer-Readable Format

FTX: The $32B implosion – a good fast roundup of what the hell happened

Yudkowsky’s essay on FTX Future Fund money that’s been paid out

Yudkowsky’s essay from 14 years ago warning against humans trying to use utilitarianism

Bunch of Twitter – The reason we have excess money to give; Robert Wiblin; a god complex; conflating utilitarianism with naive utilitarianism; the ultimate take-away

Our recent episode on Virtue Ethics

Erik Hoel criticizing EA

More details about the FTX stuff will be on the next episode of The Mind Killer

“Investment is a bet that a thing will be important in the future” – David

Nassim Taleb’s Barbell Investment Strategy
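Taleb's barbell is commonly summarized as putting most of a portfolio in very safe assets and a small slice in high-risk, high-upside bets, with nothing in the middle. A minimal sketch with hypothetical numbers (not from the episode, and not financial advice):

```python
# Rough sketch of the barbell idea: most of the portfolio in very safe
# assets, a small slice in high-risk / high-upside bets, and nothing
# in the medium-risk middle. Hypothetical numbers, not financial advice.

portfolio_value = 100_000           # hypothetical total in dollars
safe_fraction = 0.90                # e.g. T-bills / cash equivalents
risky_fraction = 1 - safe_fraction  # e.g. speculative, high-upside positions

allocation = {
    "very safe (e.g. T-bills)":      portfolio_value * safe_fraction,
    "very risky (speculative bets)": portfolio_value * risky_fraction,
    "medium risk":                   0,  # deliberately skipped
}

for bucket, dollars in allocation.items():
    print(f"{bucket:<32} ${dollars:>10,.0f}")

# Worst case (every risky bet goes to zero) is bounded by the small slice:
print(f"Max loss if all risky bets fail: ${portfolio_value * risky_fraction:,.0f}")
```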


Hey look, we have a discord! What could possibly go wrong?

Our Patreon page, your support is most rational, and totally effective. Also merch!


Rationality: From AI to Zombies, The Podcast

LessWrong Sequence Posts Discussed in this Episode:

none

Next Episode’s Sequence Posts: (for really real this time)

Absolute Authority

Infinite Certainty


BREAKING – FTX collapse & EA shockwaves

While recording the next episode we got to talking about the FTX collapse and some of the waves it sent through the Effective Altruism community. We decided to break it out into a separate segment so it can air while it’s still relevant.

FTX: The $32B implosion – a good fast roundup of what the hell happened

Yudkowsky’s essay on FTX Future Fund money that’s been paid out

Yudkowsky’s essay from 14 years ago warning against humans trying to use utilitarianism

Bunch of Twitter – The reason we have excess money to give; Robert Wiblin; a god complex; conflating utilitarianism with naive utilitarianism; the ultimate take-away

Our recent episode on Virtue Ethics

Erik Hoel criticizing EA

More details about the FTX stuff will be on the next episode of The Mind Killer


Hey look, we have a discord! What could possibly go wrong?

Our Patreon page, your support is most rational, and totally effective. 🙂


174 – Jon Stewart, The Consensus Emancipator

Wes and David from The Mind Killer show up for a special cross-over episode.

We discuss How Stewart Made Tucker

Listen to The Mind Killer 🙂


Hey look, we have a discord! What could possibly go wrong?

Also merch!


Rationality: From AI to Zombies, The Podcast

LessWrong Sequence Posts Discussed in this Episode:

none

Next Episode’s Sequence Posts: (for real this time)

Absolute Authority

Infinite Certainty


173 – Oh Lawd, Strong AI is Comin’

Matt Freeman returns to discuss artificial general intelligence timelines, and why they are short.

Our primary text is: Why I think strong general AI is coming soon

Other links:

“Let’s think step by step” is all you need

NVIDIA A100 info

The Mind Killer

AI suggested 40,000 new possible chemical weapons in six hours

Yo be real GPT

Hack language models with Ignore Previous Instructions

Very Bad Wizards: Is it GPT or Dan Dennett?

Tesla Bot

“it’s obviously conscious”

Whenever anyone says Elon doesn’t deliver…

HPMoR Christmas chapter extended by AI (w/ guidance)


0:00:00 Intro/Main Topic
2:02:18 Thank the Patron!


Hey look, we have a discord! What could possibly go wrong?

Also merch!


Rationality: From AI to Zombies, The Podcast

LessWrong Sequence Posts Discussed in this Episode:

none

Next Episode’s Sequence Posts:

Absolute Authority

Infinite Certainty
