

It’s more than that: they’d need to have desires, aversions, and goals. Intelligence doesn’t automatically grant those; in our case they come from our instincts as animals. So perhaps you’d need to actually evolve your AGI systems, Darwin-style, and that would be a far more massive undertaking than a single AGI, let alone the “put glue on pizza lol” systems we’re frying the planet for.




Yeah, the terminology is currently a mess. Not just due to language change over time, but also synchronic variation - different people using the same words with different meanings at the same time. But for me, it’s a mix of motivations, methods, and morality: