Trust-Aware Goal Modeling from Use Case for Cooperative Self-Adaptive Systems


Bibliographic Details
Published in: 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 4405-4410
Main Authors: Kim, Min-Ju; Shehab, Mohamed; Lee, Hyo-Cheol; Lee, Seok-Won
Format: Conference Proceeding
Language: English
Published: IEEE, 01.10.2018

Summary: Self-adaptive systems can adjust their behavior in dynamically changing environments to provide appropriate services to the user. However, as users demand more complicated services, self-adaptive systems must cooperate with other systems that offer dedicated services for specific demands, and each system should select an appropriate cooperation partner from among many candidates. In this situation, trust becomes one of the important criteria for achieving a minimum level of security in an open and distributed environment. However, many existing works on self-adaptive systems focus on the internal adaptation process or on adaptation in a closed environment. As a first attempt to tackle the trust attribute, in this paper we propose trust-aware goal modeling from use cases for cooperative self-adaptive systems. By analyzing the characteristics of trust in the requirements engineering process, we can understand when trust should be considered and how it can be represented in the system design. In addition, we illustrate an unmanned vehicle scenario to show how the proposed approach can be applied to a real case.
ISSN: 2577-1655
DOI: 10.1109/SMC.2018.00744