Matt Yuschik is a Human Factors Specialist at Convergys Labs in Cincinnati, OH. He is currently investigating multimodal user interfaces (MMUIs), which combine speech, graphics, and the keyboard at the user's desktop, and is addressing key issues of modal integration and user modality preferences. Yuschik previously worked at Comverse Network Services, where he designed, developed, and performed usability tests on Voice Controlled Voice Mail, a world-first network-based service deployed in the US and in numerous European countries and languages. His dialog model for this VUI has earned two patents. Matt was also a lead UI designer at Ameritech Services, where he helped develop a first-in-the-network TTS service for reverse directory assistance. He holds a PhD in Electrical Engineering, with strengths in DSP and natural language processing, and has served on the AVIOS Board of Directors for eight years.
Articles by Matt Yuschik
Banking on Multimodality
10 Nov 2012
For financial transactions, gains are seen but challenges remain.
A View from the Voice Search Conference
01 Jun 2009
Opportunities abounded to try something new.
A Look at AVIOS' Speech and Multimodality Contest
01 Jun 2008
Students introduce voice to apps covering everything from airplanes to arithmetic.
Multimodal Interfaces Discussed at SpeechTEK/AVIOS 2006
03 Jul 2006
Moving from the Art to the Science of Voice User Interfaces (VUIs)
30 Jun 2003
Voice user interfaces (VUIs) are moving from an art form to an applied science. Many ASR vendors now include toolkits with modules for common interactions (such as entering a telephone number). While uncertain areas remain in the voice transaction between a person and a computer, the understanding of the requirements for successful basic interactions continues to grow.
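To make the idea of a reusable interaction module concrete, here is a minimal Python sketch of what a "collect a telephone number" building block might look like. This is an illustrative assumption, not any vendor's actual toolkit API: the names `normalize_digits` and `collect_phone_number`, and the ten-digit US format, are all hypothetical choices for the sketch.

```python
import re

# Hypothetical spoken-digit vocabulary; real ASR toolkits ship grammars for this.
_DIGIT_WORDS = {
    "zero": "0", "oh": "0", "one": "1", "two": "2", "three": "3",
    "four": "4", "five": "5", "six": "6", "seven": "7",
    "eight": "8", "nine": "9",
}

def normalize_digits(utterance: str) -> str:
    """Map a recognized utterance like 'five one three' to a digit string."""
    tokens = re.findall(r"[a-z]+|\d", utterance.lower())
    return "".join(
        _DIGIT_WORDS.get(t, t if t.isdigit() else "") for t in tokens
    )

def collect_phone_number(utterance: str, expected_len: int = 10):
    """A reusable 'enter a telephone number' interaction step.

    Returns (digits, confirmation_prompt) when the right number of digits
    was heard, or (None, reprompt) so the dialog can ask again.
    """
    digits = normalize_digits(utterance)
    if len(digits) == expected_len:
        grouped = f"{digits[:3]} {digits[3:6]} {digits[6:]}"
        return digits, f"I heard {grouped}. Is that correct?"
    return None, f"Please say all {expected_len} digits of the telephone number."
```

The design point the article hints at: once steps like this are packaged as modules with built-in reprompting and confirmation, VUI design shifts from inventing each interaction to composing proven ones.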