Ancestors

Toot

Written by David Gerard on 2025-02-01 at 00:08

Post your current Ziz news and discussion here

https://awful.systems/post/3420407

Descendants

Written by David Gerard on 2025-02-01 at 00:08

fwiw, this /r/slatestarcodex poster says that Ziz of the Zizians did indeed take her name from Worm the web serial

“indirect personal communication”

www.reddit.com/r/slatestarcodex/…/m9yyhw8/

Written by David Gerard on 2025-02-01 at 00:08

this is a nice writeup of the story of Ziz

Written by David Gerard on 2025-02-01 at 10:00

Ziz was originally radicalised by the sex abuse and abusers around CFAR

deleted comments from the thread lesswrong.com/…/we-run-the-center-for-applied-rat…

Written by YourNetworkIsHaunted@awful.systems on 2025-02-01 at 02:06

You know, I was wondering where the name came from, and it’s sufficiently plausible that I believe it. Notably, in the story, her threat - the reason just being around her is so dangerous - is that she has some kind of perfect predictive ability on top of all the giant psychic kaiju nonsense. So she attacks a city and finds the one woman who needs to die in order for her super-thinker husband to go mad and build an army of evil robots or whatever.

It very much rhymes with the Rationalist image of a malevolent superintelligence and I can definitely understand it being popular in those circles, especially the “I’m too edgy to recognize that Taylor is wrong, actually” parts of the readership.

Written by David Gerard on 2025-02-01 at 02:18

and imagine the ego to name yourself after the Simurgh from Worm

Written by saucerwizard@awful.systems on 2025-02-01 at 01:02

At least one fellow on Twitter is painting us as Zizians.

Written by David Gerard on 2025-02-01 at 02:01

lol. link?

Written by saucerwizard@awful.systems on 2025-02-01 at 08:23

xcancel.com/thecollegehill/…/1884843666130239923#…

m.youtube.com/watch?v=Y8hu9N-Kz3I

Written by David Gerard on 2025-02-01 at 09:55

College Hill is very good and on top of this shit

Written by TinyTimmyTokyo@awful.systems on 2025-02-02 at 06:07

Lots of discussion on the orange site post about this today.

(I mentioned this in the other sneerclub thread on the topic but reposted it here since this seems to be the more active discussion zone for the topic.)

Written by David Gerard on 2025-02-02 at 14:01

came here to post this!

I loved this comment:

[Former member of that world, roommates with one of Ziz’s friends for a while, so I feel reasonably qualified to speak on this.]

The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community.

As relevant here:

While following logical threads to their conclusions is a useful exercise, each logical step often involves some degree of rounding or unknown-unknowns. A -> B and B -> C means A -> C in a formal sense, but A -almostcertainly-> B and B -almostcertainly-> C does not mean A -almostcertainly-> C. Rationalists, by tending to overly formalist approaches, tend to lose the thread of the messiness of the real world and follow these lossy implications as though they are lossless. That leads to…

Precision errors in utility calculations that are numerically unstable. Any small chance of harm times infinity equals infinity. This framing shows up a lot in the context of AI risk, but it works in other settings too: infinity times a speck of dust in your eye >>> 1 times murder, so murder is “justified” to prevent a speck of dust in the eye of eternity. When the thing you’re trying to create is infinitely good or the thing you’re trying to prevent is infinitely bad, anything is justified to bring it about or prevent it, respectively.
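
The arithmetic behind both of those points is easy to make concrete. A minimal sketch, assuming each step in a chain of implications holds with probability 0.95 and fails independently - both numbers invented for illustration, not taken from the comment:

```
# Confidence decay across a chain of "almost certain" implications.
# Assumes each step holds with probability 0.95 and that failures are
# independent -- illustrative numbers, not figures from the comment.

def chain_confidence(step_confidence: float, steps: int) -> float:
    """Probability that every link in an n-step implication chain holds."""
    return step_confidence ** steps

for steps in (1, 5, 10, 20):
    print(f"{steps:2d} steps: {chain_confidence(0.95, steps):.2f}")

# Output:
#  1 steps: 0.95
#  5 steps: 0.77
# 10 steps: 0.60
# 20 steps: 0.36
```

And the infinite-utility instability: once any outcome is assigned infinite (dis)utility, any nonzero probability of affecting it saturates the expected-value comparison, so every finite consideration drops out. Again, all numbers below are invented:

```
# The "anything is justified" arithmetic with an infinite stake.
# All probabilities and utilities here are invented for illustration.

infinite_stake = float("inf")  # utility assigned to the apocalyptic outcome
p_affect = 1e-30               # any nonzero chance of influencing it
murder_cost = 1e9              # a very large but finite harm

expected_gain = p_affect * infinite_stake
print(expected_gain)                # inf
print(expected_gain > murder_cost)  # True: the finite harm never registers
```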

Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with. I think even people who like e.g. Eliezer would agree that he is not a humble man by any stretch of the imagination (the guy makes Neil deGrasse Tyson look like a monk). They have, in the past, responded to criticism with statements to the effect of “anyone who would criticize us for any reason is a bad person who is lying to cause us harm”. That kind of framing can’t help but get culty.

The nature of being a “freethinker” is that you’re at the mercy of your own neural circuitry. If there is a feedback loop in your brain, you’ll get stuck in it, because there’s no external “drag” or forcing functions to pull you back to reality. That can lead you to be a genius who sees what others cannot. It can also lead you into schizophrenia really easily. So you’ve got a culty environment that is particularly susceptible to internally-consistent madness, and finally:

It’s a bunch of very weird people who have nowhere else they feel at home. I totally get this. I’d never felt like I was in a room with people so like me, and ripping myself away from that world was not easy. (There’s some folks down the thread wondering why trans people are overrepresented in this particular group: well, take your standard weird nerd, and then make two-thirds of the world hate your guts more than anything else, you might be pretty vulnerable to whoever will give you the time of day, too.)

TLDR: isolation, very strong in-group defenses, logical “doctrine” that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult. Or multiple cults, really. Ziz’s group is only one of several.

Written by bitofhope@awful.systems on 2025-02-03 at 08:36

Almost nostalgic to see a TREACLES sect still deferring to Eliezer’s Testament. For the past couple of years the Ratheology of old with the XK-class end of the world events and alignof AI has been sadly[1] sidelined by the even worse phrenology and nrx crap. If not for the murder cults and sex crimes, I’d prefer the nerds reinventing Pascal’s Wager over the JAQoff lanyard nazis[2].

[1] And it being sad is in and of itself sad.

[2] A subspecies of the tie nazi, adapted to the environmental niche of technology industry work.

Written by David Gerard on 2025-02-03 at 09:27

> A subspecies of the tie nazi

OBJECTION! Lanyard nazis include many a shove-in-a-locker nazi
