Monday, May 23, 2016

AI drama

From alphr comes a story about the literary exploits of Google's foray into artificial intelligence.

One of the reasons the Turing Test continues to be such a steep bar for AI to clear is that artificial intelligences just don’t talk like normal people. Artificial chatter is often grammatically sound, but it feels stuffy, formal and just not quite right. Getting artificial intelligences to sound human has been a tough old nut to crack.

Google has an interesting solution to this, and has posted a paper outlining how it taught its artificial intelligence a flair for the dramatic by what I can only describe as cruel and unusual punishment. Inspired, no doubt, by the seemingly endless streams of Mills & Boon-style romance novels cluttering up charity shops around the country, Google fed a neural network model 12,000 ebooks, some 2,865 of which were of that much-maligned genre.

Here's an example of its output.

“this was the only way. it was the only way. it was her turn to blink. it was hard to tell. it was time to move on. he had to do it again. they all looked at each other. they all turned to look back. they both turned to face him. they both turned and walked away.”

Not impressive, but what if the researchers eventually succeed and we can't tell the difference between human and machine output? I'm not sure, but take another look at the example above. With a few adjustments and a few key words it could easily be turned into an EU referendum argument, because the standard is not high, is it?


All original material is copyright of its author. Fair use permitted. Contact via comment. Unless indicated otherwise, all internet links accessed at time of writing. Nothing here should be taken as personal advice, financial or otherwise. No liability is accepted for third-party content, whether incorporated in or linked to this blog; or for unintentional error and inaccuracy. The blog author may have, or intend to change, a personal position in any stock or other kind of investment mentioned.
