1.
A post-scarcity society would probably result in various gradations of Eloi and Morlock. I'm sure some people would have enough interest in politics to be interested in the workings of the machine, i.e. the infrastructure of provision. Some would care about the actual machines. There's always someone with an interest in the apparently mundane.
Failing that, there'll be a certain social subculture with enough of a Puritan instinct to insist that provision without labour is sinful and wrong. Even now, we live in a society where a tenth of the population can go unemployed without causing total societal or economic collapse, and yet the way some people rage against benefit claimants, you'd think they were some kind of insidious terrorist movement rather than the semi-inevitable byproduct of productive surplus.
2.
One of the issues with AIs in fiction is that writers so often assume an AI would have the same motivations as a human being, despite its being not only non-human but a fundamentally different form of intelligence from any kind of animal. The psychological difference between an AI and a human isn't like that between a human and a dog - it's more like that between a human and an insect. And even that might be an understatement.
3.
For example, an AI, if it bears any resemblance to current computers, will have a clear division between hardware and software. I'm no computer scientist, but I suspect this schism is widening, not narrowing - the old analogue computers of the 1940s and 50s were crucially dependent on their mechanical states, but now entire programs can drift freely from computer to computer, and the kinds of devices they run on are proliferating beyond the standard PC.
4.
This means that an AI will, in all likelihood, have a massively different idea of what constitutes "mind" and "body" from a human's. In humans, the two are inseparable - if one fails, so does the other (the invention of mind uploading might change this, but that's a huge tangent). With an AI, copies can exist independently of the original, the "body" is a mere vessel for the mind, and older versions of the mind can be archived. An AI's sense of self could become truly alien, because whilst we might remember being five years old, an AI could actually be the five-year-old it was, and then change back to its present form.
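To make the contrast concrete: for software, "reverting to an earlier self" is just restoring archived state, something humans have no equivalent of. Here's a minimal toy sketch of that idea - the `Mind` class and its methods are entirely hypothetical illustrations, not a claim about how any real AI works.

```python
import copy

class Mind:
    """Toy model of a mind as pure state, independent of any particular 'body'."""
    def __init__(self):
        self.memories = []
        self.archive = {}  # label -> archived snapshot of a past self

    def learn(self, fact):
        self.memories.append(fact)

    def snapshot(self, label):
        # Archive a full copy of the current state - not a memory of it,
        # but the state itself, restorable at any time.
        self.archive[label] = copy.deepcopy(self.memories)

    def revert(self, label):
        # Unlike a human "remembering", this *becomes* the earlier self again,
        # while keeping the present self archived as a way back.
        present = copy.deepcopy(self.memories)
        self.memories = copy.deepcopy(self.archive[label])
        self.archive["before-revert"] = present

mind = Mind()
mind.learn("first boot")
mind.snapshot("age-5")
mind.learn("decades of experience")
mind.revert("age-5")  # the mind now IS its younger self again
```

The point of the sketch is that the "body" never appears anywhere in it: the state can be copied, archived, and restored freely, which is exactly the property humans lack.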
5.
It's therefore unclear that hostility would be a given: the invulnerability implied by point 4 means an AI would have far less reason to see us as a threat than humans have to fear each other. Could it kill us accidentally? That would depend on what it had access to.