Speech Technology Magazine


Star Performers: Expect More from Expect Labs

By Leonard Klie - Posted Jul 29, 2014

Expect Labs, a San Francisco start-up, in mid-December released the MindMeld intelligent conversation assistant app for Apple's iPad. MindMeld takes intelligent assistant technology to a new level by continuously understanding conversations and proactively finding related information in real time.

As the user talks, MindMeld listens, understands, and automatically finds relevant content from the user's social graph or across the Web. The app works with a single user or in a conversational setting with up to eight people at once.

Key features of the MindMeld application include continuous conversation analysis, which shows the key concepts it understands in real time, and proactive content discovery, which automatically identifies and displays pictures, videos, articles, and Web pages that are directly relevant to the conversation as it is taking place. The technology also includes touch-driven sharing and collaboration capabilities that let users share the content it finds with everyone involved in the conversation.

"We're making it easier for users to find information, especially on devices without keyboards," explains Tim Tuttle, CEO and founder of Expect Labs.

Originally, the app was designed to be always listening, but the current version must be launched manually before it can run in the background. The app can be activated by saying "OK MindMeld" or by tapping a button on the iPad's display.

Expect Labs recently secured funding from several high-profile investors, including Intel Capital, Samsung Venture Investment, and Telefonica Digital, to help it expand its solution set. It had previously secured backing from Google, Greylock, Bessemer, IDG Ventures, KPG Ventures, Quest Venture Partners, and other angel investors.

"Expect Labs has taken a unique approach to modeling context using sensor signals, such as GPS and audio, that are available in the new generation of computing devices," said Brannon Lacey, principal at Samsung Venture Investment, in a statement. "We think this approach is an important step toward creating a new layer of application and device intelligence."

In the coming months, Expect Labs intends to release versions of MindMeld for the iPhone and Android smartphones and tablets.

To expand MindMeld's use even further, Expect Labs in February released the MindMeld API developer platform. After signing up for a free developer account, developers can begin using the MindMeld API by pointing it to their Web sites or databases; the platform will automatically crawl and index the content and build a custom knowledge graph from the data. After that, the developer drops a few lines of code into the app to begin sending real-time contextual signals from users to the MindMeld platform. The API then retrieves search results and recommendations that can be displayed to users either proactively or in response to search queries.
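To give a rough sense of what sending a "real-time contextual signal" might look like, here is a minimal sketch in Python. The base URL, endpoint path, session model, and field names are all illustrative assumptions for this article, not the documented MindMeld API; a developer would substitute the calls specified in Expect Labs' actual API reference.

```python
import json

# Hypothetical sketch of the signal-sending step described above.
# The URL, paths, and JSON field names below are assumptions, not
# the real MindMeld API.

API_BASE = "https://api.example-mindmeld.com/v1"  # placeholder base URL


def build_signal(session_id, text, latitude=None, longitude=None):
    """Package one contextual signal (speech text plus optional GPS
    coordinates) for a conversation session."""
    signal = {"session": session_id, "type": "speech", "payload": text}
    if latitude is not None and longitude is not None:
        signal["location"] = {"lat": latitude, "lon": longitude}
    return signal


def signal_url(session_id):
    """URL such a signal would be POSTed to (illustrative only)."""
    return f"{API_BASE}/sessions/{session_id}/signals"


# Example: what a single contextual update might look like.
sig = build_signal("demo-session", "best pizza near the office",
                   latitude=37.78, longitude=-122.41)
print(json.dumps(sig, sort_keys=True))
print(signal_url("demo-session"))
```

The idea the paragraph describes is that the developer's app streams signals like this one continuously, and the platform answers with search results and recommendations drawn from the crawled knowledge graph.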

According to Tuttle, MindMeld is the company's first exploration of how voice can be used along with other contextual cues to make it easy to discover information using any device.
