DOI: 10.1145/2462307.2462322

Position paper: cloud system deployment and performance evaluation tools for distributed databases

Published: 21 April 2013

ABSTRACT

Creating system setups for controlled performance evaluation experiments of distributed systems is time-consuming and expensive. Re-creating experiment setups and reproducing experimental results that have been published by other researchers is even more challenging. In this paper, we present an experiment automation approach for evaluating distributed systems in compute cloud environments. We propose three concepts that should guide the design of experiment automation tools: (1) capture experiment plans in software modules, (2) run experiments in a publicly accessible, cloud-based Elastic Lab, and (3) collaborate on experiments in an open, distributed collaboration system. We have developed two tools that implement these concepts, and we discuss challenges and lessons learned during their implementation. An initial exemplary use case with Apache Cassandra on top of Amazon EC2 provides a first insight into the types of performance and scalability experiments enabled by our tools.
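To make the first concept more concrete, below is a minimal, hypothetical sketch of what "capturing an experiment plan in a software module" could look like. The names (ExperimentPlan, run_trial) and the parameter values are illustrative assumptions for a YCSB-style Cassandra experiment on EC2, not an API published in the paper.

```python
# Hypothetical sketch: an experiment plan captured as a version-controllable
# software module. All names and values are assumptions for illustration.
from dataclasses import dataclass, field
from itertools import product
from typing import List


@dataclass
class ExperimentPlan:
    """Declarative description of one performance evaluation experiment."""
    system: str = "cassandra"
    provider: str = "aws-ec2"
    instance_type: str = "m1.large"
    cluster_sizes: List[int] = field(default_factory=lambda: [3, 6, 12])
    workloads: List[str] = field(default_factory=lambda: ["read-heavy", "write-heavy"])
    repetitions: int = 3

    def trials(self):
        """Enumerate every (cluster size, workload, repetition) combination."""
        for size, workload, rep in product(
            self.cluster_sizes, self.workloads, range(self.repetitions)
        ):
            yield {"nodes": size, "workload": workload, "repetition": rep}


def run_trial(trial: dict) -> None:
    # Placeholder: a real driver would provision the cloud instances, deploy
    # the database, run the load generator, and collect latency/throughput logs.
    print(f"provision {trial['nodes']} nodes, run '{trial['workload']}' "
          f"(repetition {trial['repetition']})")


if __name__ == "__main__":
    plan = ExperimentPlan()
    for trial in plan.trials():
        run_trial(trial)
```

Keeping such a plan module under version control alongside the deployment scripts is what would allow other researchers to re-run the same experiment and reproduce the published results.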


Published in
HotTopiCS '13: Proceedings of the 2013 International Workshop on Hot Topics in Cloud Services
April 2013, 94 pages
ISBN: 9781450320511
DOI: 10.1145/2462307
Copyright © 2013 ACM


Publisher
Association for Computing Machinery, New York, NY, United States


Qualifiers
research-article

Acceptance Rates
HotTopiCS '13 paper acceptance rate: 10 of 15 submissions, 67%. Overall acceptance rate: 10 of 15 submissions, 67%.
