Adaptable and trustworthy systems: Modeling dependencies to evaluate system quality
15 January 2009, Aida Omerovic, SINTEF IKT/UiO
Outline
Motivation
PREDIQT method
Practical application of the method
Background
System change is inevitable, due to:
Maintenance and system evolution
Organisational change
New collaboration patterns, functionalities, users, regulations, standards and technologies
Dynamics of systems/components collaborating with varying QoS
Objectives
A trustworthy architecture over time, in spite of adaptations.
Preserve both adaptability and trustworthiness.
Reduce the time and risk of enabling and deploying additional types of collaborations.
Approach
Only technical, objective and quantifiable aspects
Modeling and simulation
Predicting architectural change impacts on relevant quality characteristics
PREDIQT: Model-Based Prediction of Change Impacts on Architecture Quality
Enables prediction of architectural change impacts on the quality characteristics, prior to change deployment.
A preventative or adaptive, rather than corrective, approach.
Based on: quality models, design models and dependency views.
Method overview: the output levels
Dependency views
Method overview: the process
[Flow diagram. Phase 1, creation and evaluation of the prediction models: collect input (usage profile, SLA, requirements), define requirements, build quality models (UML, ISO), design models (UML) and dependency views (UML), and fit the models (quantitative and graphical) from estimation, measurement, statistics, indicators and logs; models QA. Phase 2, use of the models to predict impacts of changes: establish whether the change is within the predictability domain, enforce the change on the prediction models, run quality impact analysis and simulation, and produce the quality prediction. Once the change is deployed, model evaluation assesses the quality of the predictions and derives the predictability domain; the prediction models are revised if needed.]
Applying the method
A case study addressing quality prediction of the Validation Authority service (DNV), Fall 2008.
Extracts from VAS-specific quality models
Conceptual VA quality model - overall
Quality: "The totality of characteristics of an entity that bear on its ability to satisfy stated and implied needs" [ISO 8402]
Conceptual VA quality model - availability
Def.: "the capability of the software product to be in a state to perform a required function at a given point in time, under stated conditions of use" [ISO/IEC 9126-1]
Rating: Availability = uptime / (uptime + downtime)
Both overall availability and service continuity have to be taken into account.
Downtime: incorrect operation time, planned or unplanned downtime, etc.
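The availability rating above can be computed directly from measured uptime and downtime; a minimal sketch (the function name and sample values are illustrative, not from the case study):

```python
def availability_rating(uptime_hours: float, downtime_hours: float) -> float:
    """Availability = uptime / (uptime + downtime), per the rating above."""
    total = uptime_hours + downtime_hours
    if total <= 0:
        raise ValueError("uptime + downtime must be positive")
    return uptime_hours / total

# e.g. 8700 h up and 60 h down over a year:
print(round(availability_rating(8700, 60), 4))  # 0.9932
```

Note that planned and unplanned downtime both count toward the denominator, matching the definition above.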
Extracts from design models of the Validation Authority Service
VA interfaces
Certificate validation
Signature verification
A general dependency view
Dependency views
Based on the quality and design models, we derive attribute-specific dependency views.
Decomposition is carried out until estimates can be assigned with acceptable certainty.
Nodes in the views represent modules, subsystems, services and aspects.
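In the fitted dependency views, each node carries a QCF (quality characteristic fulfillment) value and each arc an EI (estimated impact) weight; a parent's QCF can then be inferred as the EI-weighted sum of its children's QCFs. A minimal sketch under that assumption (the class, field names and values are ours, not the tool's):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DVNode:
    """A dependency-view node: a module, subsystem, service or aspect."""
    name: str
    qcf: Optional[float] = None          # quality characteristic fulfillment in [0, 1]
    ei: float = 1.0                      # estimated impact of this node on its parent
    children: List["DVNode"] = field(default_factory=list)

    def inferred_qcf(self) -> float:
        """Leaves keep their estimated QCF; internal nodes get the
        EI-weighted sum of their children's QCFs."""
        if not self.children:
            assert self.qcf is not None, f"leaf {self.name} needs an estimate"
            return self.qcf
        return sum(c.ei * c.inferred_qcf() for c in self.children)

# Fictitious extract of an availability view:
availability = DVNode("Availability", children=[
    DVNode("Hardware", qcf=0.98, ei=0.40),
    DVNode("Network", qcf=0.95, ei=0.35),
    DVNode("Updates", qcf=0.90, ei=0.25),
])
print(round(availability.inferred_qcf(), 4))  # 0.9495
```

Decomposition stops once leaf QCFs can be estimated with acceptable certainty, as stated above.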
Simulations
Dependency views for each quality attribute
Sensitivity analysis
Tool-supported simulation of change impacts, applied automatically on all the dependency views
Next: live demos with an exemplary change
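Tool-supported simulation amounts to re-running the QCF propagation after perturbing a node, which is what the sensitivity curves on the following slides plot. A self-contained sketch, assuming a root QCF that is the EI-weighted sum of its children's QCFs (all names and numbers are fictitious):

```python
# Fictitious EI weights and current QCFs for children of the Availability root.
eis = {"Updates": 0.25, "Hardware": 0.40, "Network": 0.35}
qcfs = {"Updates": 0.90, "Hardware": 0.98, "Network": 0.95}

def root_qcf(qcfs: dict, eis: dict) -> float:
    """Root QCF as the EI-weighted sum of child QCFs."""
    return sum(eis[n] * qcfs[n] for n in eis)

# Sensitivity sweep: vary one child's QCF over [0, 1] and track the root,
# as in the QCF sensitivity charts.
for q in (0.0, 0.25, 0.5, 0.75, 1.0):
    perturbed = dict(qcfs, Updates=q)
    print(f"Updates QCF = {q:.2f} -> Availability = {root_qcf(perturbed, eis):.4f}")
```

The slope of such a curve is the node's EI weight, so nodes with larger EI dominate the sensitivity plots.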
Fitted dependency views (fictitious values)
[Charts: Availability sensitivity. QCF (0.65-1.00) and EI (0.00-0.50) plotted against change magnitude for Updates, Upgrading, Monitoring, Database eff. mech., Message routing, OS support services, Middleware support serv., Hardware, Network, Measures for OP env. prot., User management, Gateway and Other, relative to the current values.]
Fitted dependency views (fictitious values)
[Charts: Scalability sensitivity. QCF (0.77-0.97) and EI (0.00-0.25) plotted against change magnitude for the same nodes (Updates, Upgrading, Monitoring, Database eff. mech., Message routing, OS support services, Middleware support serv., Hardware, Network, Measures for OP env. prot., User management, Gateway, Other), relative to the current values.]
Fitted dependency views (fictitious values)
[Charts: Security sensitivity. QCF (0.60-1.00) and EI (0.00-0.40) plotted against change magnitude for the same nodes, relative to the current values.]
Fitted dependency views (fictitious values)
[Charts: Total quality sensitivity. Total quality (0.40-1.00) plotted against the QCF and contribution (EI) of the quality attributes Availability, Scalability, Security and Other, relative to the current values.]
Applying PREDIQT
1. Specify the change.
2. Enforce the change on the prediction models: design models, and nodes (and arcs) of the dependency views.
3. Establish whether the change is within the predictability domain: point 2 was feasible and the sensitivity is within the acceptable threshold.
4. Simulate the impacts on the dependency views.
Summary
The method enables prediction of the implications of architectural changes on the quality characteristics.
Applicable as a preventative or adaptive approach at any lifecycle stage.
Reduces the time and risk of enabling and deploying adaptations.
Tried out in a major case study and evaluated empirically.
Thank you! Comments, questions?