
OrCam Unveils OrCam Hear

OrCam Technologies has introduced the OrCam Hear device, an artificial intelligence-driven assistive technology for individuals with hearing impairment.

Powered by deep learning voice-enhancement models, OrCam Hear isolates the voices of individual speakers using their voice signatures, amplifying the selected speakers while suppressing other voices and ambient noise.

The OrCam Hear's earbuds and mobile phone dongle are controlled by a dedicated app available for iPhone. After sampling the selected voices for a few seconds, the app uses AI to create a unique speaker profile that encapsulates each speaker's voice signature. This allows users to selectively isolate specific voices, even in noisy environments. Users can choose who is part of the conversation by toggling each speaker on or off with a single tap.
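In broad strokes, speaker-selective filtering of this kind can be sketched as follows. This is a minimal illustration under stated assumptions, not OrCam's implementation: the voice_signature, enroll, and gate_segment functions are hypothetical placeholders, and the simple averaged-spectrum signature stands in for the deep-learning speaker embedding a real hearing device would use.

```python
import numpy as np

def voice_signature(audio: np.ndarray, sample_rate: int = 16000) -> np.ndarray:
    """Stand-in for a learned speaker-embedding model: summarizes a clip
    as a normalized average magnitude spectrum."""
    frame = int(0.025 * sample_rate)              # 25 ms frames
    n_frames = len(audio) // frame
    frames = audio[: n_frames * frame].reshape(n_frames, frame)
    spectrum = np.abs(np.fft.rfft(frames, axis=1)).mean(axis=0)
    return spectrum / (np.linalg.norm(spectrum) + 1e-9)

def enroll(profiles: dict, name: str, sample: np.ndarray) -> None:
    """Create a speaker profile from a few seconds of sampled speech."""
    profiles[name] = voice_signature(sample)

def gate_segment(segment: np.ndarray, profiles: dict, enabled: set,
                 threshold: float = 0.85) -> np.ndarray:
    """Pass a segment through only if it matches an enabled speaker profile."""
    sig = voice_signature(segment)
    for name in enabled:
        if float(np.dot(sig, profiles[name])) >= threshold:
            return segment          # keep (and, on a real device, enhance) it
    return np.zeros_like(segment)   # suppress other voices and ambient noise

# Example: enroll one speaker from a 3-second sample, then filter a segment.
profiles, enabled = {}, {"alice"}
rng = np.random.default_rng(0)
enroll(profiles, "alice", rng.standard_normal(3 * 16000))
filtered = gate_segment(rng.standard_normal(16000), profiles, enabled)
```

The design choice mirrored here is the one described above: a short enrollment clip is condensed into a compact signature, and each incoming segment is kept or suppressed depending on which enrolled speakers the user has toggled on.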

"The use of deep networks plus the latest network architecture large language models are harnessed to make a game-changing experience for hearing aids and hearables in general. The problem of speech in noise or commonly called the cocktail party problem in academic literature has been a very difficult problem to tackle at a product level. Moving from an academic demonstration to a seamless product took years to perfect," said Amnon Shashua, OrCam's co-founder, in a statement.

OrCam also this week unveiled an advanced AI companion for OrCam MyEye for people with visual impairments. It offers text reading, facial recognition, product identification, and dynamic interactions similar to natural conversations. Whether by voice, gesture, or tap on the device, users can now inquire about their surroundings.

OrCam MyEye's artificial intelligence extends beyond the immediate environment to access extensive information. By querying the web, the AI companion offers users a detailed understanding of a wide range of topics and current events, providing verbal access to visual and text information from both offline and online sources.

"OrCam's MyEye, equipped with the new AI companion, including the latest large language models (LLMs), now serves as a beacon of innovation in assistive technology," Shashua said. "To provide users with visual impairments with a high level of autonomy, the OrCam MyEye uses sophisticated algorithms and LLMs models to better understand and respond to natural language queries. This makes interactions more intuitive and effective for users as they navigate their informational and physical surroundings. This new AI companion highlights the progress being made in AI to create a more inclusive and empowered experience for those who rely on technological assistance in their daily lives."

OrCam also introduced a Just Ask feature in OrCam Read 3, its AI magnifier for the visually impaired, enabling free-speech interaction.

OrCam Read 3 includes an interactive AI assistant and functions as a handheld reading companion, a magnifier, or a stationary reader. The Just Ask smart magnifier feature captures an image of printed text or handwriting and processes it with the device's AI capabilities.

It first summarizes the captured text; the user can then ask follow-up questions about the text and beyond, and the device can also read the full text aloud.
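Conceptually, this is a capture-summarize-answer-speak pipeline. The sketch below is a hypothetical illustration under that assumption; run_ocr, run_llm, speak, and JustAskSession are invented placeholder names, not OrCam's software or APIs.

```python
from dataclasses import dataclass

# Hypothetical stand-ins for the device's OCR, language-model, and
# text-to-speech components; none of these are OrCam APIs.
def run_ocr(image_bytes: bytes) -> str:
    return image_bytes.decode("utf-8", errors="ignore")   # placeholder "OCR"

def run_llm(prompt: str) -> str:
    return f"[model response to: {prompt[:60]}...]"        # placeholder model

def speak(text: str) -> None:
    print(f"(spoken) {text}")                              # placeholder TTS

@dataclass
class JustAskSession:
    """Captured-text session: summarize first, then take follow-up questions."""
    text: str = ""

    def capture(self, image_bytes: bytes) -> str:
        self.text = run_ocr(image_bytes)
        summary = run_llm(f"Summarize this text:\n{self.text}")
        speak(summary)                  # the summary is read aloud first
        return summary

    def ask(self, question: str) -> str:
        answer = run_llm(f"Text:\n{self.text}\n\nQuestion: {question}")
        speak(answer)                   # follow-up answers are spoken as well
        return answer

    def read_full_text(self) -> None:
        speak(self.text)                # the full captured text can be read aloud

# Example flow: capture a page, hear the summary, then ask a follow-up.
session = JustAskSession()
session.capture(b"Gate B12 boarding begins at 14:05 for flight LY 315.")
session.ask("What time does boarding start?")
session.read_full_text()
```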

"Advancements in AI are not merely a matter of technological evolution; they represent a significant shift in how we interact with our world. With the right application, AI can extend human capabilities, open new prospects for innovation, and create a landscape where technology and humanity progress in harmony," Shashua said.
