
Plasticity wants to help chatbots seem less robotic

From TechCrunch - August 11, 2017

Y Combinator-backed Plasticity is tackling the problem of getting software systems to better understand text, using deep learning models trained to understand what they're reading on Wikipedia articles, and offering an API for developers to enhance their own interfaces.

Specifically, they're offering two APIs for developers to build "more robust conversational interfaces," as they put it, with the aim of becoming a centralized solution for natural language processing (NLP). Their APIs are due to be switched from private to public beta on Monday.

"One thing where we think this is really useful for is conversational interfaces where you want to integrate real world knowledge," says co-founder Alex Sands. "We think it's also really useful when you want to provide instant answers in your application, whether that's over the entire Internet or over a custom corpus."

"One example might be search engines that are competing with Google and don't have their own instant answer technology. [They] could use something like this. We're in talks with a few of them," notes Sands.

"The other application is conversational interfaces who want a new NLP stack that will give them a lot more information than what an academic package like Stanford CoreNLP would provide them today," he adds.
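For context, an academic toolkit of the kind Patel mentions typically returns per-token annotations such as part-of-speech tags and dependency relations. A minimal sketch using Stanford's stanza Python package (a relative of CoreNLP) gives a feel for that baseline output; the example sentence and pipeline configuration here are illustrative assumptions, not anything Plasticity ships.

```python
# Illustrative only: what a typical academic NLP toolkit surfaces for a sentence,
# using Stanford's stanza package (pip install stanza) as a stand-in.
import stanza

stanza.download("en")  # one-time model download
nlp = stanza.Pipeline(lang="en", processors="tokenize,pos,lemma,depparse")

doc = nlp("Who directed The Martian?")
for sent in doc.sentences:
    for word in sent.words:
        head = sent.words[word.head - 1].text if word.head > 0 else "ROOT"
        # token, part-of-speech tag, dependency label, and the word it attaches to
        print(word.text, word.upos, word.deprel, head)
```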

A few years back, the founders worked on a hack project that expanded the powers of Apple's AI voice assistant Siri by adding support for custom commands, such as playing a Spotify track or dialing up the temperature via a Nest device. This was before Apple opened up Siri to third-party apps, so they were routing voice commands through a proxy, and they claim to have basically built the first app store for voice commands.

The experience taught them that NLP in general is not robust for handling more complex commands and queries, says other co-founder Ajay Patel.

"The other problem was a lot of the natural language processing tools out there really take a simplistic approach to understanding what a user says," he adds. "The most simplistic way to explain it is they're looking for keywords to figure out what a user is asking."

Plasticity is taking a different approach from these keyword-based NLP systems, building a system that understands the semantics of a sentence so it can perform a linguistic breakdown to figure out all of the relationships and entities in it.

They can then hand that information to developers so they can build "more robust conversational interfaces" around it, as Patel puts it, such as a chatbot that's more conversational and capable, given it can serve up answers it found online.
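Plasticity hasn't made its API public at the time of writing, but the general idea of a linguistic breakdown, as opposed to keyword spotting, can be sketched with the open-source spaCy library: parse the sentence, read off its named entities, and walk the dependency tree to recover who did what to whom. The sentence and dependency labels below are illustrative assumptions, not Plasticity's output format.

```python
# A rough sketch of entity and relationship extraction via a dependency parse,
# using spaCy (pip install spacy; python -m spacy download en_core_web_sm).
# This is not Plasticity's API, just an illustration of the general technique.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Ridley Scott directed The Martian in 2015.")

# Named entities: people, works of art, dates, and so on.
for ent in doc.ents:
    print(ent.text, ent.label_)

# Relationships: for each verb, pair its grammatical subject with its object.
for token in doc:
    if token.pos_ == "VERB":
        subjects = [t.text for t in token.children if t.dep_ in ("nsubj", "nsubjpass")]
        objects = [t.text for t in token.children if t.dep_ in ("dobj", "obj")]
        if subjects and objects:
            print(subjects, token.lemma_, objects)
```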

"Today you can ask Siri fact-based questions, like who directed a movie, or who sang a particular song. But you can't ask it a more useful question, like when is Stanford spring break?" he adds. "It can't take a sentence from the Internet and then find the direct answer in that sentence and then return that to the user."

Instead, Siri usually performs a Google search and serves those results to the user, leaving users to do the last-mile legwork of extracting an actual answer.
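The capability Patel describes, pulling the answer span out of a sentence rather than returning a page of links, is what NLP researchers call extractive question answering. A minimal sketch using the Hugging Face transformers library shows the shape of it; the passage about Stanford's spring break is a made-up stand-in for a sentence found on the web, and this is not how Siri or Plasticity actually works.

```python
# Extractive question answering: find the answer span inside a given passage.
# Illustrative sketch using Hugging Face transformers (pip install transformers),
# not Siri's or Plasticity's implementation.
from transformers import pipeline

qa = pipeline("question-answering")  # downloads a default QA model on first use

# Hypothetical sentence that a web search might have surfaced.
passage = ("Stanford University's spring break runs from March 25 "
           "through March 29 this year.")

result = qa(question="When is Stanford spring break?", context=passage)
print(result["answer"], result["score"])  # extracted span and a confidence score
```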

