During QoS analysis, the software architecture is refined with information on the deployment context, the usage model, and the internal structure of components. Figure 2.6 shows the process in detail.
The deployer starts with the resource environment specification, based on the software architecture and the use case models. Given this information, the required hardware and software resources and their interconnections are derived. This workflow yields a description of the resource environment, for example, a deployment diagram without allocated components or an instance of the resource environment model described in section 3.4.4. Instead of specifying a new resource environment, the deployer can also reuse descriptions of existing hardware and software resources. Moreover, a set of representative system environments can be designed if the final resource environment is still unknown. For QoS analysis, detailed information on the resources modelled in the environment is required.
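As an illustration only, a resource environment could be captured with data structures like the following. This is a minimal sketch with hypothetical names (ProcessingResource, LinkingResource, ResourceContainer) and example values; it is not the resource environment metamodel of section 3.4.4.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingResource:
    """A hardware or software resource, e.g. a CPU core or a thread pool."""
    name: str
    processing_rate: float              # work units processed per second
    scheduling_policy: str = "processor-sharing"

@dataclass
class LinkingResource:
    """A network connection between two resource containers."""
    source: str
    target: str
    latency_s: float
    throughput_bytes_per_s: float

@dataclass
class ResourceContainer:
    """A node (server) hosting processing resources."""
    name: str
    resources: list = field(default_factory=list)

# Example environment: two servers connected by a LAN link
app_server = ResourceContainer("AppServer", [ProcessingResource("CPU", 3.0e9)])
db_server = ResourceContainer("DBServer", [ProcessingResource("CPU", 2.4e9)])
lan = LinkingResource("AppServer", "DBServer", latency_s=0.0005,
                      throughput_bytes_per_s=1.25e8)
```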
During allocation, the deployer specifies the mapping of components to resources. The result of this workflow can be a complete deployment diagram or a resource environment plus allocation contexts for the components, as described in section 3.4.5. These specifications form part of the input to the QoS analysis models used later. The fully specified resource environment and component allocation are passed to the QoS analyst.
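Continuing the sketch above, an allocation can be viewed as a simple mapping from component names to resource containers, together with a consistency check. The component names are invented for illustration and do not stem from the original text.

```python
# Allocation contexts: mapping of component instances to resource containers
allocation = {
    "BookingService": "AppServer",
    "PaymentService": "AppServer",
    "CustomerDB":     "DBServer",
}

def validate_allocation(allocation, containers):
    """Check that every component is mapped to a known resource container."""
    known = {c.name for c in containers}
    unmapped = {comp: node for comp, node in allocation.items() if node not in known}
    if unmapped:
        raise ValueError(f"components mapped to unknown containers: {unmapped}")

validate_allocation(allocation, [app_server, db_server])
```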
The domain expert refines the use case models from the requirements during the use case analysis workflow. A description of the usage scenarios is created based on an external view of the current software architecture. The scenarios describe how users interact with the system and which dependencies exist in the process. For example, activity charts or usage models (see section 3.5.2) can be used to describe such scenarios. The scenario descriptions are input to the usage model refinement. The domain expert annotates the descriptions with, for example, branching probabilities, the expected size of different user groups, the expected workload, user think times, and parameter characterisations.
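A refined usage scenario of this kind might, under the same illustrative assumptions as before, be annotated as follows: a closed workload with a user population and think time, plus branching probabilities over the calls a user issues. The scenario and call names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class UsageScenario:
    """A closed workload: a fixed user population with a think time."""
    name: str
    population: int        # expected size of the user group
    think_time_s: float    # time a user waits between two interactions

browse = UsageScenario("BrowseAndBook", population=200, think_time_s=5.0)

# Branching probabilities over the system calls a user may issue next
call_probabilities = {
    "searchCatalogue": 0.70,
    "viewItem":        0.25,
    "bookTicket":      0.05,
}
assert abs(sum(call_probabilities.values()) - 1.0) < 1e-9
```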
As the central role in QoS analysis, the QoS analyst integrates the QoS-relevant information, performs the evaluation, and delivers the feedback to all involved parties. In the QoS requirement annotation workflow, the QoS analyst maps the QoS requirements to concrete requirements on elements of the software architecture. For example, the maximum waiting time of a user becomes the upper limit of the response time of a component's service. While doing so, the QoS analyst selects the QoS metrics, such as response time or probability of failure on demand, that are evaluated during the later workflows.
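The mapping from user-level requirements to architecture-level metric bounds could be recorded in a form like the following sketch. The service name and bounds are invented; only the idea of attaching a metric and an upper bound to a service follows from the text.

```python
qos_requirements = [
    # "a user shall not wait longer than 2 s" -> bound on the service's response time
    {"service": "BookingService.bookTicket", "metric": "response_time_s", "upper_bound": 2.0},
    # reliability requirement expressed as probability of failure on demand (POFOD)
    {"service": "BookingService.bookTicket", "metric": "pofod", "upper_bound": 1e-4},
]

def violated(requirements, predictions):
    """Return the requirements whose predicted metric value exceeds the bound."""
    return [r for r in requirements
            if predictions.get((r["service"], r["metric"]), 0.0) > r["upper_bound"]]
```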
During QoS information integration, the QoS analyst collects the specifications provided by the component developers, deployers, domain experts, and software architects, checks them for soundness, and integrates them into an overall QoS model of the system. If specifications are missing, the QoS analyst is responsible for deriving the missing information, either by contacting the respective roles or by estimation and measurement. The complete system specification is then automatically transformed into a prediction model.
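Building on the earlier sketches, the integration step can be pictured as assembling the partial specifications into one model and handing it to a transformation. The transformation shown here is only a stand-in for the automated step mentioned above, not the actual transformation used by the approach.

```python
# Collect the individual specifications into one overall QoS model of the system.
qos_model = {
    "resource_environment": [app_server, db_server],
    "allocation": allocation,
    "usage": {"scenario": browse, "calls": call_probabilities},
    "requirements": qos_requirements,
}

def to_prediction_model(model):
    """Stand-in for the automated transformation into an analysis model,
    e.g. a queueing network with one service centre per processing resource."""
    return {f"{c.name}.{r.name}": r.processing_rate
            for c in model["resource_environment"] for r in c.resources}

service_centres = to_prediction_model(qos_model)
```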
The QoS evaluation workflow yields either an analytical or a simulation result. QoS evaluation aims, for example, at testing the scalability of the architecture and at identifying bottlenecks. The QoS analyst interprets the results, derives possible design alternatives, and delivers the results to the software architect. If the results show that the QoS requirements cannot be fulfilled with the current architecture, the software architect has to modify the specifications or renegotiate the requirements.
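To make the analytical case concrete, the following self-contained sketch uses a textbook M/M/1 approximation to predict response times under an increasing workload and to point at the resource that saturates first. This is merely an illustration of the kind of result such an evaluation produces; the service rates are invented, and the actual approach is not restricted to this queueing model.

```python
def mm1_response_time(arrival_rate, service_rate):
    """Mean response time of an M/M/1 queue; grows without bound as utilisation -> 1."""
    if arrival_rate >= service_rate:
        return float("inf")          # the resource cannot keep up: a bottleneck
    return 1.0 / (service_rate - arrival_rate)

# Scalability check: increase the workload and watch which resource saturates first.
service_rates = {"AppServer.CPU": 50.0, "DBServer.CPU": 30.0}   # requests per second
for arrival_rate in (10.0, 20.0, 25.0, 29.0):
    times = {r: mm1_response_time(arrival_rate, mu) for r, mu in service_rates.items()}
    bottleneck = max(times, key=times.get)
    print(f"lambda={arrival_rate}: worst response time "
          f"{times[bottleneck]:.3f} s at {bottleneck}")
```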