
Convolutional neural network for substantiation of children's books, intelligent interactive communication and application analysis of voice question answering.

Authors :
Zhang, Wenting
Liu, Qian
Source :
Entertainment Computing; Jan2024, Vol. 48, pN.PAG-N.PAG, 1p
Publication Year :
2024

Abstract

• Artificial intelligence has also made great progress with these improvements.
• Traditional toys cannot compare with children's reading robots.
• The robot can carry 100 kg of objects, and its speed is comparable to a person's.
• Audrey, an early speech recognition system, was developed at AT&T Bell Laboratories.

At present, there are many robots that accompany children, and a key indicator of their performance is convenient, efficient human–computer interaction. Implementing voice question answering requires solving three problems: speech recognition, knowledge-base construction, and answer matching. As the front-end data entry point, speech recognition accuracy directly determines the answer-matching quality of the back-end question-answering system. Based on convolutional neural networks (CNNs) and intelligent communication technology, this paper analyzes the characteristics of a children's reading robot and designs and implements an intelligent voice interaction function on the Android platform. The paper focuses on Chinese word segmentation and on keyword extraction and matching within the interaction function, and further designs a simple question-and-answer database for the children's reading robot. Temporal connection features of the convolutional neural network are used as the feature parameters. Through extensive experiments, the CNN structure is repeatedly adjusted to minimize the word error rate, optimize the model's recognition rate, and better recognize users' spoken questions. [ABSTRACT FROM AUTHOR]
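The segmentation-then-matching pipeline described in the abstract can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the toy lexicon, the hypothetical question–answer table, the forward-maximum-matching segmenter, and the keyword-overlap score are all assumptions chosen to show the general technique.

```python
# Hedged sketch: dictionary-based forward maximum matching (FMM) for Chinese
# word segmentation, plus keyword-overlap answer matching against a small
# QA table. All data below is illustrative, not from the paper.

DICTIONARY = {"恐龙", "是", "什么", "为什么", "天空", "蓝色", "的"}  # toy lexicon

def fmm_segment(text, dictionary, max_len=4):
    """Greedy forward maximum matching: at each position take the longest
    dictionary word, falling back to a single character."""
    words, i = [], 0
    while i < len(text):
        for j in range(min(max_len, len(text) - i), 0, -1):
            cand = text[i:i + j]
            if j == 1 or cand in dictionary:
                words.append(cand)
                i += j
                break
    return words

QA_DB = {  # hypothetical question -> answer table
    "恐龙是什么": "恐龙是很久以前生活在地球上的爬行动物。",
    "天空为什么是蓝色的": "因为大气把蓝光散射得最厉害。",
}

def match_answer(query, qa_db, dictionary):
    """Score each stored question by keyword overlap with the query
    and return the answer for the best-scoring question."""
    q_words = set(fmm_segment(query, dictionary))
    best, best_score = None, 0.0
    for question, answer in qa_db.items():
        k_words = set(fmm_segment(question, dictionary))
        score = len(q_words & k_words) / max(len(k_words), 1)
        if score > best_score:
            best, best_score = answer, score
    return best

print(match_answer("天空是蓝色的", QA_DB, DICTIONARY))
```

A production system would replace the toy lexicon with a full segmentation dictionary (or a statistical segmenter) and the overlap score with a weighted or embedding-based similarity, but the recognize–segment–match flow matches the structure the abstract describes.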

Details

Language :
English
ISSN :
1875-9521
Volume :
48
Database :
Supplemental Index
Journal :
Entertainment Computing
Publication Type :
Academic Journal
Accession number :
173752034
Full Text :
https://doi.org/10.1016/j.entcom.2023.100608