is that it presupposes a very specific shape that the stuff will take: an ever-quickening pace of change, everything that came before ceasing to matter, etc
and like
personally, we do fully believe that humanity will someday have computers that can talk with us as equals
no current technology is that. even as someone highly technical, with a lot more than average knowledge of how ML works, we do not feel qualified to say for sure whether current technologies could be part of that with additional innovations. but no current technology, on its own, is that.
but that isn't even the point. the point is that our personal mental model of the possibility-space around such a thing being invented, and its effects on society, doesn't have anything singularity-like in most of its branches. it certainly doesn't have the social problems or preexisting harms of ML going away.
also, humans don't tend to use our intelligence to make ourselves more intelligent, so there isn't much reason to expect that a computer would. also also, being intelligent does not tend to be highly correlated with being able to drive change in the world, except where that change starts and ends with building things, and the kinds of change that can be made by building things are really quite limited.
this is all stuff that we could be wrong about; this is our personal model, not any sort of guarantee. but that's the point: nobody else knows, either. we all need to be able to talk about this stuff, because humans are better at working through big questions when we do it together. (still pretty shitty, but people going it alone is even worse)
and this is all stuff that people who have worshiped this possibility are often ill-suited to reason about, because the level of excitement becomes a cognitive distortion. when you've gone your entire life thinking "hey, wouldn't it be cool if someday...", it is emotionally challenging to hear that the things you're talking about could play out in other ways, too.
all our love to people who are dealing with that. this is a period of history when this stuff does wind up mattering in real, practical ways, as you can see in the public debate around ML policy this past year. it is really important that people be able to hold their excitement and study what's actually happening, not just make assumptions based on what they want to happen.
thank you. <3
this is a mathematical singularity, for example: a "cusp" in a function. it's a point where the curve isn't differentiable (the slope jumps abruptly), but the function itself stays continuous. the curve doesn't actually break; it simply changes direction.
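if you want to see that concretely, the absolute-value function is a tiny illustration of our own choosing (strictly a "corner" rather than a cusp, but the lesson is the same): the one-sided slopes at zero disagree, so the curve isn't differentiable there, yet the function never jumps.

```python
# sketch: abs(x) has a non-differentiable point at x = 0.
# the curve is continuous everywhere, but the slope coming in
# from the left (-1) disagrees with the slope going out to the
# right (+1): a sharp change of direction, not a blow-up.

def f(x):
    return abs(x)

h = 1e-6
left_slope = (f(0) - f(-h)) / h    # slope approaching 0 from the left
right_slope = (f(h) - f(0)) / h    # slope approaching 0 from the right

print(left_slope)   # -1.0
print(right_slope)  # 1.0
print(f(-h), f(h))  # both tiny: the function itself never breaks
```

nothing catastrophic happens at the singular point; the function just heads off in a new direction.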
I take comfort in this realization, even though I don't know how to map mathematical singularities onto social ones; I feel like the lesson is that any sharp change in social direction can be construed as a "singular point" and such changes don't need to be catastrophic. ~Chara
