
Q&A: Dr. Nava Shaked on Evaluation, Testing Methodology & Best Practices for Speech-Based Interaction Systems


Dr. Nava Shaked, Head of the Multidisciplinary Department at the Holon Institute of Technology, recently answered the following questions about testing speech and multimodal applications:

Q: Tell us about the three-hour workshop you will present on April 29 at the SpeechTEK Conference in Washington, DC.

A: Testing and evaluation processes are crucial to the success of any NLP conversational system, but testing IVR and multimodal systems presents unique challenges. Focusing on multimodal applications that combine speech with other modalities, we describe the multiple layers of testing and QA: engine quality, application functionality, VUI, interfaces and infrastructure, load balancing, backup, and recovery. Learn how to set testing goals, targets, and success factors; specify and measure metrics; test and measure “soft” and “immeasurable” targets; handle test documentation at all stages; manage a testing project; and identify who should be on the testing team.
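To make the point about specifying and measuring metrics concrete, here is a minimal sketch of how testing targets and success factors might be written down as checkable thresholds. The metric names and numbers are hypothetical illustrations, not figures from the workshop:

```python
# A minimal sketch of turning testing goals into measurable targets.
# The metric names and threshold values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TestTarget:
    name: str                     # e.g. "in-grammar recognition accuracy"
    threshold: float              # the success criterion for this metric
    higher_is_better: bool = True

    def passes(self, measured: float) -> bool:
        # Compare a measured value against the success criterion.
        if self.higher_is_better:
            return measured >= self.threshold
        return measured <= self.threshold

# Hypothetical targets for a speech/multimodal application.
targets = [
    TestTarget("in-grammar recognition accuracy", 0.95),
    TestTarget("task completion rate", 0.85),
    TestTarget("average response latency (seconds)", 1.5, higher_is_better=False),
]

# Hypothetical measurements collected during a test run.
measured = {
    "in-grammar recognition accuracy": 0.96,
    "task completion rate": 0.82,
    "average response latency (seconds)": 1.2,
}

for target in targets:
    value = measured[target.name]
    status = "PASS" if target.passes(value) else "FAIL"
    print(f"{target.name}: {value} (target {target.threshold}) -> {status}")
```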

Q: How is testing speech-based systems different from testing other software?

A: Testing software is mainly about the functionality and stability of code to support proper implementation of the tool, app, or device. Speech systems add another level of testing: the quality of the dialog, the usability of the implementation, and the customer experience. The speech engine can function perfectly and yet the system as a whole may still not be operational.

Q: Do text-based testing strategies and tools work for multimodal systems?

A: Text-based interaction is one aspect of multimodal interaction. Text can support the other modalities when needed: it can disambiguate the interaction and clarify language intent, and it can add information to what is conveyed by speech or visuals, among other things.

Q: How much of the testing process can be automated?

A: A lot can be automated, but some issues will never fall under this category; for example, identifying false-accept recognitions, which can only be done manually. User experience can also be partly tested automatically, but subjective judgment is crucial for understanding user behavior.
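As a rough illustration of the automatable part, the sketch below replays recorded utterances through the engine under test and checks the recognized intent, while setting aside suspected false accepts for manual review. The `recognize` function, file names, and intents are placeholders, not a real engine API:

```python
# A sketch of an automated regression pass over recorded utterances.
# `recognize` is a stand-in for whatever speech engine API is actually used.
def recognize(audio_path: str) -> dict:
    """Placeholder: send the audio to the engine under test and return its hypothesis."""
    raise NotImplementedError("wire this up to the engine under test")

# (recorded utterance, expected intent); None means the utterance should be rejected.
TEST_CASES = [
    ("audio/check_balance_01.wav", "check_balance"),
    ("audio/transfer_funds_03.wav", "transfer_funds"),
    ("audio/out_of_grammar_07.wav", None),
]

def run_regression(cases):
    failures, suspected_false_accepts = [], []
    for audio, expected in cases:
        hypothesis = recognize(audio)
        intent = hypothesis.get("intent")
        if expected is None and intent is not None:
            # The engine accepted an utterance it should have rejected.
            # Confirming a true false accept still needs a human listener,
            # which is the part that cannot be automated.
            suspected_false_accepts.append((audio, hypothesis))
        elif intent != expected:
            failures.append((audio, expected, intent))
    return failures, suspected_false_accepts
```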

Q: Sometimes developers don’t think about some classes of potential users.  How do testers identify and include these users in testing?

A: While engaging in usability testing, we try to reach different segments that are not typical users. We can also do some Wizard of Oz (WOz) testing (a technique that lets us test functionality before it is implemented) to identify out-of-the-box approaches to UI/UX design.

Q: What’s wrong with just releasing a "beta" of the app and letting prospective users test it?

A: You can do that in a “friends and family” pilot, but going “live” with a beta system in the digital era, for millennial users, is very risky.

Register for the SpeechTEK Conference and Dr. Shaked's workshop. There are still openings for SpeechTEK University workshops and presentations. Submit proposals here by October 11, 2002. 
