Resource dependency processing in web scaling frameworks

Thomas Fankhauser, Qi Wang, Ansgar Gerlicher, Christos Grecos

Research output: Contribution to journal › Article › peer-review


Abstract

The upsurge of mobile devices, paired with highly interactive social web applications, generates enormous numbers of requests that web services have to deal with. Consequently, in our previous work, a novel request flow scheme with scalable components was proposed for storing interdependent, permanently updated resources in a database. The major challenge is to process dependencies in an optimal fashion while maintaining dependency constraints. In this work, three research objectives are evaluated by examining resource dependencies and their key graph measurements. An all-sources longest-path algorithm is presented for efficient processing, and dependencies are analysed to find correlations between performance and graph measures. Two algorithms, with parameters based on six real-world web service structures (e.g. the Facebook Graph API), are developed to generate dependency graphs, and a model is developed to estimate performance based on resource parameters. An evaluation of four graph series discusses the performance effects of different graph structures. The results of an evaluation of 2000 web services with over 850 thousand resources and 6 million requests indicate that resource dependency processing can be up to a factor of two faster than a traditional processing approach, while an average model fit of 97% allows accurate prediction.
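The abstract's core idea — an all-sources longest-path computation over a resource dependency graph — can be illustrated with a short sketch. This is not the paper's implementation; the function name, the edge representation, and the example resources (`user`, `post`, `feed`) are assumptions for illustration. On a DAG, the longest dependency chain ending at each resource gives the earliest processing level at which all of its dependencies are already up to date, computable in linear time via a topological order (the dynamic-programming flavour the keywords suggest):

```python
from collections import defaultdict, deque

def longest_path_levels(edges):
    """All-sources longest-path depths in a DAG of resource dependencies.

    edges: iterable of (dependency, dependent) pairs. Returns a dict
    mapping each resource to the length of the longest dependency
    chain ending at it, i.e. its earliest valid processing level.
    """
    graph = defaultdict(list)
    indegree = defaultdict(int)
    nodes = set()
    for u, v in edges:
        graph[u].append(v)
        indegree[v] += 1
        nodes.update((u, v))

    # Kahn's topological sort; source resources (indegree 0) sit at level 0.
    level = {n: 0 for n in nodes if indegree[n] == 0}
    queue = deque(level)
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            # A dependent's level is one past its deepest dependency.
            level[v] = max(level.get(v, 0), level[u] + 1)
            indegree[v] -= 1
            if indegree[v] == 0:
                queue.append(v)
    return level

# Hypothetical web resources: a feed depends on posts and the user profile.
deps = [("user", "post"), ("post", "feed"), ("user", "feed")]
print(longest_path_levels(deps))  # {'user': 0, 'post': 1, 'feed': 2}
```

Resources sharing a level have no chains between them, so they can be processed in parallel without violating dependency constraints — one plausible reading of why this outperforms a purely sequential traditional approach.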
Original language: English
Pages (from-to): 1-14
Number of pages: 14
Journal: IEEE Transactions on Services Computing
Volume: PP
Issue number: 99
DOIs
Publication status: Published - 3 May 2016

Keywords

  • reactive processing
  • scalability
  • web service
  • cloud computing
  • graph processing
  • job scheduling
  • dynamic programming
