Сучасні інформаційні системи (Dec 2020)
A DYNAMIC EXPLANATION MODEL FOR HUMAN-COMPUTER INTERFACE
Abstract
The subject of the article is the automated construction of explanations of how an intelligent system operates, intended for use in the human-computer interface. The goal is to develop a dynamic explanation model for the human-computer interface that uses temporal knowledge about the process of functioning of the intelligent system. Temporal knowledge makes it possible to specify admissible sequences of decision-making actions in the intelligent system based on the known temporal order of pairs of such actions. Tasks: to develop an approach to constructing explanations of the operation of an intelligent system based on temporal knowledge, and to develop a three-aspect model of explanation that uses temporal knowledge. Methods used: knowledge representation based on temporal dependencies, and construction of chatbot answers using both handcrafted rules and automatically generated rules. The following results were obtained. The aspects of explanation are structured with regard to how they can be described by temporal knowledge; a temporal approach to constructing explanations is proposed; a dynamic explanation model based on temporal rules is developed. Conclusions. The scientific novelty of the results is as follows. A temporal approach to constructing explanations of the operation of an intelligent system is proposed. The approach describes an explanation as a process consisting of a temporally ordered sequence of facts. The temporal order for pairs of facts is determined by temporal rules. Such rules can define the explanation process with varying degrees of detail over time, depending on the request for clarification. Detailed explanations reflect the subject-domain model and include the basic and alternative sequences of actions performed by the intelligent system. Explaining the basic patterns of the intelligent system makes it possible to interpret the constraints that affect the obtained solution. An explanation of the system as a whole implicitly reflects the key causal relationships, which yields a simplified interpretation of the results of the intelligent system. A dynamic model for describing explanations based on temporal knowledge is proposed for use in the human-computer interface. The model takes into account the description of actions in the subject domain, the patterns of these actions, and the generalized causal relationships between such patterns. It makes it possible to present the dynamics of the intelligent system's operation at the required level of detail and to change that level in order to refine the explanation at the user's request.
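The core mechanism described above (pairwise temporal rules that order decision-making actions, with the resulting explanation presented at an adjustable level of detail) can be illustrated by the following minimal sketch. The names Action, RULES, and build_explanation, as well as the specific actions and the two-level detail filter, are illustrative assumptions and not the authors' implementation or API.

```python
# Minimal sketch: temporal rules of the form "a occurs before b" order the
# decision-making actions of an intelligent system, and an explanation is a
# temporally ordered sequence of those actions, filtered by detail level.
from dataclasses import dataclass
from graphlib import TopologicalSorter


@dataclass(frozen=True)
class Action:
    name: str    # decision-making action of the intelligent system (assumed)
    detail: int  # 0 = generalized pattern, 1 = detailed domain-level action


# Temporal rules: each pair (a, b) means "a happens before b" (assumed example).
RULES = [
    ("collect_inputs", "match_pattern"),
    ("match_pattern", "rank_alternatives"),
    ("rank_alternatives", "select_decision"),
]

ACTIONS = {
    "collect_inputs": Action("collect_inputs", 1),
    "match_pattern": Action("match_pattern", 0),
    "rank_alternatives": Action("rank_alternatives", 1),
    "select_decision": Action("select_decision", 0),
}


def build_explanation(max_detail: int) -> list[str]:
    """Return a temporally ordered sequence of actions, limited to the
    requested level of detail (coarser explanations drop detailed steps)."""
    # Build a predecessor graph from the pairwise temporal rules.
    graph: dict[str, set[str]] = {name: set() for name in ACTIONS}
    for before, after in RULES:
        graph[after].add(before)
    ordered = TopologicalSorter(graph).static_order()
    return [a for a in ordered if ACTIONS[a].detail <= max_detail]


if __name__ == "__main__":
    print(build_explanation(max_detail=0))  # generalized explanation
    print(build_explanation(max_detail=1))  # detailed explanation on request
```

Changing the max_detail argument plays the role of the user's request for clarification: the same temporal rules yield either a simplified or a detailed explanation sequence.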
Keywords