Making virtual assistants female by default can be bad for business and perpetuate stereotypes, these chatbot developers say, so they're offering more options to consumers.
"A bot can be male or female, but I think it doesn't need to be submissive ...
A new bot built by Microsoft employees in their spare time is designed to do exactly the opposite.
The chatbot, tested recently in Seattle, Atlanta, and Washington, lurks behind fake online ads for sex posted by nonprofits working to combat human trafficking, and responds to text messages sent to the number listed.
A few weeks ago I was introduced to the world of BDSM scripts: simple sims that replicate the experience of being with a dominatrix. It occurred to me that these scripts had a connection to ELIZA, one of the earliest examples of a natural language processing program. Naturally, my thoughts shifted to getting it on with a pioneering computer program.

The software initially pretends to be the person in the ad, and can converse about its purported age, body, fetish services, and pricing. But if a would-be buyer signals an intent to purchase sex, the bot pivots sharply into a stern message. "Buying sex from anyone is illegal and can cause serious long-term harm to the victim, as well as further the cycle of human trafficking," goes one such message.

Tay started out as an innocent emulation of a pop-culture-savvy teenager, a marketing tool meant to help Microsoft improve its voice recognition software. But it turned into a total psycho, because it learned how to communicate by conversing with others, like the good Terminator, except John Connor never asked his Terminator to beg him for sex.
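For the curious, the ELIZA comparison above is easy to make concrete. ELIZA worked by matching the user's input against canned patterns and reflecting fragments of it back, no understanding required. Here is a minimal sketch of that technique; the rules are my own illustrative examples, not Weizenbaum's actual DOCTOR script.

```python
import re

# Illustrative ELIZA-style rules (hypothetical, not the original script):
# each pairs a regex with a response template that echoes captured text.
RULES = [
    (re.compile(r"\bI am (.*)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"\bI want (.*)", re.I), "What would it mean to you to get {0}?"),
    (re.compile(r"\bbecause\b", re.I), "Is that the real reason?"),
]

def respond(text: str) -> str:
    """Return the first matching rule's reflection, or a stock fallback."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(*match.groups())
    return "Please tell me more."  # fallback when nothing matches

print(respond("I am tired of chatbots"))
```

A handful of rules like these is enough to sustain the illusion of a conversation, which is exactly why the trick still powers everything from BDSM scripts to decoy bots.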