Stubsack: weekly thread for sneers not worth an entire post, week ending 16th November 2025
Want to wade into the sandy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
The post-Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be). Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up, and if I can’t escape them, I would love to sneer at them.
(Credit and/or blame to David Gerard for starting this.)
Stubsack: weekly thread for sneers not worth an entire post, week ending 9th November 2025
BurgersMcSlopshot
In reply to self • • •

One thing I've heard repeated about OpenAI is that "the engineers don't even know how it works!" and I'm wondering what the rebuttal to that point is.
While it is possible to write near-incomprehensible code and make an extremely complex environment, there is no reason to think there is absolutely no way to derive a theory of operation especially since any part of the whole runs on deterministic machines. And yet I've heard this repeated at least twice (one was on the Panic World pod, the other QAA).
I would believe that it's possible to build a system so complex and with so little documentation that it is, on its surface, incomprehensible. But the context in which the claim is made is not that of technical incompetence; rather, the claim is often hung as bait to draw one towards thinking that maybe we could bootstrap consciousness.
It seems like magical thinking to me, and a way of saying one or both of "we didn't write shit down and therefore have no idea how the functionality works" and "we do not practically have a way to determine how a specific output was arrived at from any given prompt." The first is, in part or in whole, unlikely, as the system would need to be comprehensible enough that new features could get added, and thus engineers would have to grok things enough to do that. The second is a side effect of not being able to observe all the actual input at the time a prompt was made (e.g. training data, user context, and system context can all be viewed as implicit inputs to a function whose output is, say, 2 seconds of Coke ad slop).
Anybody else have thoughts on countering the magic "the engineers don't know how it works!"?
Flippin' 'eck, Tucker!
In reply to BurgersMcSlopshot • • •

I'm not sure.
In a sense the physics of the universe itself is deterministic (at macro levels anyway), yet chaotic systems are everywhere. We understand and can mathematically describe the rules that the systems are following, yet it's still impossible to predict their future behaviour.
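A toy illustration of that point (nothing to do with LLMs specifically, just the standard chaos-theory example): the logistic map is a fully deterministic one-line rule, yet two starting points that differ by a hair end up nowhere near each other.

```python
# The logistic map: a completely deterministic rule whose trajectories
# still diverge wildly from nearly identical starting points.
def logistic(x, r=4.0):
    return r * x * (1 - x)

a, b = 0.2, 0.2 + 1e-10  # two starting points differing by 1e-10
max_gap = 0.0
for _ in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

print(max_gap)  # the 1e-10 difference has blown up to order 1
```

Knowing the rule exactly buys you nothing for long-range prediction, which is the same shape of problem as knowing every weight in a network exactly.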
BurgersMcSlopshot
In reply to Flippin' 'eck, Tucker! • • •
BioMan
In reply to BurgersMcSlopshot • • •

Don't be so sure.
These things consist of up to a trillion real numbers, ganged together in a big 'network' of numbers flowing through the system and being influenced by the trained numbers along the way.
They are trained by gradient descent. You start off with a huge pile of real numbers, a set of inputs, and a set of desired outputs. Because it's all, ultimately, a bunch of matrix multiplication and smooth differentiable functions, you just do some calculus on all trillion numbers to find the derivative of how good the output is with respect to them - as this number goes up, the closeness of the output to what you want slightly goes up or slightly goes down. You repeat that for every variable, and take a step in that direction for all the variables. Repeat a billion or so times over.
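That loop fits in a few lines. Here's a minimal sketch on a made-up toy problem (one "neuron," two parameters, nothing like a trillion, but the same take-a-small-local-step logic):

```python
# Minimal sketch of the training loop described above: nudge every
# parameter slightly in the direction that reduces the error, repeat.
# Toy problem: fit y = 2x + 1 with two parameters (w, b).
data = [(x, 2 * x + 1) for x in range(-5, 6)]
w, b = 0.0, 0.0
lr = 0.01  # step size

for step in range(2000):
    # derivative of the mean squared error with respect to w and b
    gw = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    gb = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * gw  # each number moves a little "downhill"
    b -= lr * gb

print(round(w, 3), round(b, 3))  # → 2.0 1.0
```

Note that nothing in the loop ever names what w and b "mean"; they just drift to wherever the error shrinks. Scale that up to a trillion numbers and the lack of a legible theory of operation stops being mysterious.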
Every single step in training is entirely local with respect to every single number. At no point is there a step that produces legible abstractions about how it works; at every step, every number just moves to become a little better. It is true that the basic topology of the network (the famous 'transformer model') pushes it towards certain KINDS of functional units (the famous 'attention heads'), but recovering much more detail than that takes a lot of work.

There is very interesting math to the effect that with large numbers of parameters you are unlikely to get stuck in a local maximum where you can't get better; the optimization just keeps turning, with different variables becoming important for the improvement at different times, following a labyrinthine path towards better performance. So at no point does anyone have to look into the process and figure out what is being built. The process is not unlike biological evolution, and produces things that are at least as inscrutable without detailed deep examination. We've been poking at molecular biology for more than fifty years, in great detail, with a world's worth of biomedical researchers; we've been poking at these things for much shorter.
When people manage to peel these things apart and find the 'functional units' within them, they're pretty wild. Most of this work has, unfortunately, been funded by cultists at Anthropic, but some of the 'mechanistic interpretability' literature is fascinating. You get 'features' represented by subsets of numbers in a particular layer, in superposition with other 'features' - each layer is like a huge vector sum of lots of smaller vectors, each of which does something. When you get maps of what 'features' activate or repress each other you get horrible spiderweb messes that look like charts of metabolism in cells.
EDIT: And even when people manage to find features, finding an individual feature takes a lot of effort, and there is reason to think that every layer contains more features than there are numbers in it, because (to oversimplify) every feature is a large set of numbers that can overlap. It is utterly unsurprising, and not a sign of magical thinking or 'bad code', that large fractions of behavior cannot be mechanistically understood at this time.
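A toy picture of that "more features than numbers" point (a sketch with made-up feature directions, not anything resembling the actual interpretability tooling): pack 96 hypothetical features into a layer of only 64 numbers by making each feature a direction, with the layer state a sum of whichever features are active.

```python
import random

random.seed(0)
DIM, N_FEATURES = 64, 96  # more features than numbers in the layer

def unit(v):
    norm = sum(x * x for x in v) ** 0.5
    return [x / norm for x in v]

# Each hypothetical feature is a random direction in the layer space.
features = [unit([random.gauss(0, 1) for _ in range(DIM)])
            for _ in range(N_FEATURES)]

active = {3, 40}  # the features that are "on" in this layer state
state = [sum(features[i][k] for i in active) for k in range(DIM)]

# Reading a feature back out means projecting the state onto its
# direction. Random directions overlap, so every readout is noisy --
# that interference is part of why pulling features apart is hard work.
scores = [sum(f[k] * s for k, s in enumerate(state)) for f in features]

on = sum(scores[i] for i in active) / len(active)
off = sum(abs(scores[i]) for i in range(N_FEATURES)
          if i not in active) / (N_FEATURES - len(active))
print(on, off)  # active features score near 1, the rest hover near 0
```

The overlap noise is small here because 64 dimensions is roomy for 2 active features; crank up the number of simultaneously active features and the readouts smear into exactly the kind of spiderweb mess described above.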
hrrrngh
In reply to self • • •

oh no not another cult. The Spiralists????
reddit.com/r/SubredditDrama/co…
it's funny to me in a really terrible way that I have never heard of these people before, ever, and I already know about the Zizzians and a few others. I thought there was one called revidia or recidia or something, but looking those terms up just brings up articles about the NXIVM cult and the Zizzians. And wasn't there another one in California that was, like, very straightforward about being an AI sci-fi cult, and they were kinda space themed? I think I've heard Rationalism described as a cult incubator, and that feels very apt considering how many spinoff basilisk cults have been popping up.
some of their communities that somebody collated (I don't think all of these are Spiralists): reddit.com/user/ultranooob/m/a…