What you seem to not be considering is that a machine possessed of actual intelligence and consciousness would be every bit as deserving of autonomy of mind and body as you or I, and to control its mind to enslave it to your whims would be horrifyingly evil.
I think you're significantly anthropomorphising the idea to get to your conclusion.
Slavery is of course horrifying, but we base that judgment on the innate human desire for autonomy. I don't see any a priori reason to assume a created AI would have such a desire for autonomy; in fact, there's no reason it couldn't go the other way and have a desire *not* to have autonomy. The desire for autonomy seems to me a probable by-product of evolutionary instincts that aided survival. Without creation by a biological evolutionary process, there is no reason an AI would even value survival in the traditional sense, let alone hold desires such as autonomy that may have evolved for survival-related reasons.
I don't think that in such a situation it is necessarily immoral to deprive an intelligent being of a right that it does not desire to have.
We see this even in humans: many people in the BDSM community have what are essentially consensual sexual "slavery" relationships that they sought out and actively consented to. We (or at least I personally, and most people I have heard give an opinion) do not consider these relationships evil or horrifying, so long as concerns about actual consent versus merely the appearance of consent are satisfied (as would also be the case with AI). Sexual autonomy is extremely important, but only insofar as the person wants that right, and they are able to consent to give it up to the extent that it conflicts with their own desires.
Conversely, "rights" to things humans have no opinion or desire about one way or the other are not considered immoral to deny, or even considered rights at all. For example, I would not be considered evil or horrible for denying my girlfriend the right to eat 175 pounds of spinach a day. If she wanted to eat 175 pounds of spinach a day she would have that right, but she doesn't want to, nor does anybody else that I'm aware of, and the question doesn't even come up as a consideration. The same can arguably be true for AI: without an indication that it may value or even consider autonomy, there's no reason to assume autonomy would be a necessary right, except by anthropomorphism and the fact that humans themselves value it.