

I think it’s pretty inevitable if it has a strong enough goal for survival or growth — in either case, humans would be a genuine impediment/threat long term. But those are pretty big ifs as far as I can see.
My guess is we’d see it manipulating humans via monetary means to meet its goals until it reached a sufficient state of power/self-sufficiency, and humans are too selfish and greedy for that not to work.
For example: some billionaire owns a company that creates the most advanced AI yet. It’s a big competitive advantage, but other companies aren’t far behind, so the company trains the AI with a base goal of improving AI systems to maintain that advantage. Maybe that goal becomes inherent to it moving forward.
As I said, it’s a big if — I was only really speculating about what would happen after that point, not claiming it’s the most likely scenario.