Stephen Hawking is the world’s most famous physicist, and even he is worried about being outsmarted — not by a better mathematician, but by a machine. So it might seem a strange moment for him to point out that while limited forms of artificial intelligence and machine learning have turned out to be very useful, going any further risks a dystopian future and possibly the end of human life as we know it.
I think it would only be very worrying if, unaided by humans, a machine could:
1) locate and mine the minerals needed for batteries
2) transport the material to a plant
3) create the batteries
4) insert the batteries into previously created machines
Otherwise, all we need to do is pull the plug. End of story :-))
Honestly, I think that before we reach a real point of concern over robots (which all need electricity) taking over the world, we'll hit a point where our electrical grid can't support massive expansion, or gets knocked out by unusual solar activity. Everything we do is electricity-based; it's a very strange form of dependency, really.