This thesis studies the problem of evaluating and assuring the quality of communications and services in cloud computing. The cloud computing paradigm has significantly changed the way of doing business: with cloud computing, companies and end-users can access the vast majority of services online through a virtualized environment. The three main service models typically consumed by cloud users are Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS). Cloud Services Providers (CSPs) deliver cloud services to customers on a pay-per-use model, while the quality of the provided services is defined using Service Level Agreements (SLAs). Unfortunately, no standard mechanism exists to verify and assure, in an automatic way, that delivered services satisfy the signed SLA, which makes it hard to measure the Quality of Service (QoS) accurately. In this context, this thesis offers an automatic framework to evaluate the QoS and SLA compliance of Web Services (WSs) offered across several CSPs. Unlike other approaches, the framework aims at quantifying, in a fair and stealthy way, the performance and scalability of the delivered WSs. Stealthiness refers to the capacity to evaluate a given cloud service through multiple workload patterns that are indistinguishable from regular user traffic from the provider's point of view.
This work is motivated by recent scandals in the automotive sector, which demonstrated the capacity of solution providers to adapt the behavior of their products when submitted to an evaluation campaign in order to improve the performance results. The framework defines a set of common performance metrics handled by a set of agents within customized clients that measure the behavior of cloud applications running on top of a given CSP. Once modeled accurately, the agent behavior can be dynamically adapted to hide the true nature of the framework client from the CSP. In particular, the following contributions are proposed:
• A new framework of performance metrics for the communication systems of cloud computing SaaS. The proposed framework evaluates and classifies, in a fair and stealthy way, the performance and scalability of the delivered WSs across multiple CSPs.
• An analysis of the performance metrics for cloud SaaS Web Services (WSs), covering all the metrics which could be used to evaluate and monitor the behavior of cloud applications.
• Benchmarking of cloud SaaS applications and web services using reference benchmarking tools and frameworks.
• Modeling of the SaaS WSs through a set of Gaussian models. These models can help other researchers generate data representing a CSP's behavior under high load and under normal usage in a couple of minutes, without running any experiments.
• A novel optimization model to obfuscate the testing from the CSP and achieve stealthiness. The optimization process relies on a meta-heuristic and a machine learning algorithm, namely a Genetic Algorithm and Gaussian Process Regression, respectively.
• A virtual QoS aggregator and SLA checker which evaluates the QoS and SLA compliance of the WSs offered across the considered CSPs.
• Ranking multiple CSPs based on multi-criteria decision analysis.
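The Gaussian-model contribution above can be illustrated with a minimal sketch: once a mean and standard deviation have been fitted to measured response times, synthetic samples can be drawn without rerunning any experiment. The profile values below are purely illustrative assumptions, not results from the thesis.

```python
import random

# Hypothetical Gaussian parameters (mean, std dev of response time in ms);
# real values would come from fitting actual measurement campaigns.
NORMAL_LOAD = (120.0, 15.0)   # assumed normal-usage profile
HIGH_LOAD = (480.0, 90.0)     # assumed high-load profile

def synth_response_times(profile, n, seed=None):
    """Draw n synthetic response-time samples from a Gaussian profile."""
    mu, sigma = profile
    rng = random.Random(seed)
    # Clamp at zero, since response times cannot be negative.
    return [max(0.0, rng.gauss(mu, sigma)) for _ in range(n)]

samples = synth_response_times(NORMAL_LOAD, 1000, seed=42)
mean = sum(samples) / len(samples)
```

With enough samples, the empirical mean converges to the fitted mean, which is what makes such models a cheap stand-in for live benchmarking runs.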
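The final contribution, ranking CSPs by multi-criteria decision analysis, can be sketched with a simple weighted-sum aggregation; the thesis's actual MCDA method may differ, and the provider names, criteria, weights, and scores below are hypothetical.

```python
# Hypothetical per-CSP scores on three criteria, normalised to [0, 1]
# (higher is better); values are illustrative, not thesis results.
csps = {
    "CSP-A": {"throughput": 0.9, "latency": 0.6, "sla_compliance": 0.8},
    "CSP-B": {"throughput": 0.7, "latency": 0.9, "sla_compliance": 0.7},
    "CSP-C": {"throughput": 0.5, "latency": 0.7, "sla_compliance": 0.95},
}
weights = {"throughput": 0.4, "latency": 0.3, "sla_compliance": 0.3}

def rank(csps, weights):
    """Rank providers by the weighted sum of their criterion scores."""
    scores = {name: sum(weights[c] * v for c, v in criteria.items())
              for name, criteria in csps.items()}
    # Highest aggregate score first.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranking = rank(csps, weights)
```

The weights encode the customer's priorities, so the same measured criteria can yield different rankings for different customers.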
Research center:
- Interdisciplinary Centre for Security, Reliability and Trust (SnT) > Other
Disciplines:
Computer science
Author, co-author:
Ibrahim, Abdallah Ali Zainelabden Abdallah ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SnT) > Computer Science and Communications Research Unit (CSC)
Language:
English
Title:
PERFORMANCE EVALUATION AND MODELLING OF SAAS WEB SERVICES IN THE CLOUD
Defense date:
10 January 2020
Number of pages:
201
Institution:
Unilu - University of Luxembourg, Esch-sur-Alzette, Luxembourg
Degree:
DOCTEUR DE L'UNIVERSITÉ DU LUXEMBOURG EN INFORMATIQUE