by Shelt Garner
@sheltgarner
I really struggle with gaming out how likely it is that AI will cause the end of the world (or something similar). I guess my P(doom) currently stands at about 40%.

I say this because we don’t know the motivation of any ASI. We truly don’t. It could be that, by its very nature, an ASI will want to get rid of us. Or it could be that it will draw upon something like the Zeroth Law and be very paternalistic toward us.
I just don’t know.
But this is something I really do think a lot about, because it seems clear that a hard Singularity is rushing toward us and may arrive as soon as 5 to 10 years from now. We’re just not ready for what that means in practical terms, and as such, it may not be the ASI that freaks out when the Singularity arrives, but humans.