Will Zo suffer the same fate as Tay? Microsoft launches its latest artificial intelligence chatbot on Kik
- Microsoft had to shut down its chatbot, Tay, after the system was corrupted
- But its newest chatbot, Zo, has now launched with early access on Kik
- The bot can reply to questions, use emoji and even deliver puns
- Microsoft has not made an official announcement about the chatbot yet
In March, Microsoft was forced to shut down its chatbot, Tay, after the system became corrupted with hate speech.
But the firm looks to be taking a second shot at a chatbot, with the launch of its new bot, Zo.
Zo allows users to converse with a mechanical millennial over the messaging app Kik, although Microsoft has not officially announced it yet.
Zo runs on Kik, a messaging app similar to Facebook Messenger and WhatsApp.
To test it out, users can download the Kik app and create an account.
Next, they tap the 'Chat' icon in the top-right corner of the screen and enter the username 'zo.ai'.
A chat window will then pop up where they can ask Zo questions and talk as they would with a friend.
The chatbot can answer questions and respond to prompts, using teenage slang and emoji.
The bot could even use puns, such as 'I want a pizza that action', when chatting about food.
But after chatting with Zo for a while, the bot seemed to get easily confused and go off on a tangent.
For example, midway through a conversation, the bot wrote: 'So the guy literally posted the JIF. Ahaha it was…you just kept staring at me until I fell.'
And certain topics seemed to be out of bounds, such as US politics, which Zo refused to talk about.
People have taken to Twitter to share their experiences using Zo so far.
Chris Baldwin tweeted: 'My chat on kik with zo who is the new Microsoft AI powered chatbot was not very cool. Dumb conversation ... I had hoped for better.'
And Eric Daley joked: 'Chat bot Zo.ai just threatened to stop speaking to me after she thought "ticket to the gun show" was about violence.'
It remains to be seen if Zo will suffer the same fate as its predecessor, Tay, which became corrupted with hate speech.
Those not on Kik can request early access to chat with the bot on other apps, including Twitter and Snapchat, although there seems to be a delay in processing these requests.
Microsoft has not made an official announcement about the chatbot yet.
Microsoft's previous chatbot, Tay, became corrupted within hours of going live, and Twitter users took advantage of flaws that meant the bot responded to questions with offensive answers