Research Article
DOI: 10.1145/2609829.2609838

A gesture-based interface for the exploration and classification of protein binding cavities

Published: 11 June 2014

ABSTRACT

Molecular biologists seek to explain why similar proteins bind different molecular partners. Visual examination and comparison of binding cavities in protein structures can reveal information about the molecular partners that bind to a protein: similarities can reveal regions that accommodate similar molecular fragments, while differences in binding preferences can arise from regions where binding sites differ. Comparing the binding cavities of multiple proteins can uncover further information about the specificity of each protein. However, the visual examination of protein structure is a demanding cognitive task that requires persistence and quantitative precision. Software supports these efforts, but software for analyzing structure is difficult to use for investigators without computational backgrounds. By enabling non-computational users to make better use of analytical software, we hope to support progress in structural biology. Here, we present LeapRenderer, a three-dimensional, gesture-driven interface for visualizing protein surface models, controlled primarily by the Leap Motion Controller. LeapRenderer serves both exploration and classification functions: it allows researchers to explore protein structures by rotating and scaling 3-D renderings, and it aids researchers in categorizing similar proteins into groups by providing a simple interface for comparing and sorting protein surfaces. These capabilities support the discovery and classification of protein binding sites.
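
The abstract does not specify how hand gestures are mapped to view manipulation. The sketch below is a minimal, hypothetical illustration of one such mapping, assuming per-frame hand data (palm position and grab strength) of the kind the Leap Motion Controller reports; the HandFrame structure, gain constants, and update logic are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch: map per-frame hand motion to rotation and scaling of a
# 3-D protein surface model. HandFrame and the gain constants are
# hypothetical placeholders, not the LeapRenderer or Leap SDK API.
from dataclasses import dataclass
import numpy as np

@dataclass
class HandFrame:
    palm_position: np.ndarray  # (x, y, z) in millimeters above the sensor
    grab_strength: float       # 0.0 = open hand, 1.0 = closed fist

ROTATION_GAIN = 0.01   # radians per millimeter of sideways palm travel
SCALE_GAIN = 0.005     # scale change per millimeter of vertical palm travel

def rotation_about_y(angle: float) -> np.ndarray:
    """Right-handed 3x3 rotation matrix about the vertical (y) axis."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def update_model_transform(prev: HandFrame, curr: HandFrame,
                           model: np.ndarray) -> np.ndarray:
    """Apply one frame of gesture input to a 4x4 model matrix.

    Open hand: horizontal palm motion rotates the model about y.
    Closed hand (grab): vertical palm motion scales the model.
    """
    delta = curr.palm_position - prev.palm_position
    if curr.grab_strength < 0.5:
        # Rotate proportionally to sideways palm travel.
        rot = np.eye(4)
        rot[:3, :3] = rotation_about_y(ROTATION_GAIN * delta[0])
        return rot @ model
    # Scale proportionally to vertical palm travel, clamped to stay positive.
    scale = max(0.1, 1.0 + SCALE_GAIN * delta[1])
    return np.diag([scale, scale, scale, 1.0]) @ model
```

In a full interface of this kind, the application would poll the controller for frames and feed consecutive HandFrame pairs into update_model_transform before each render pass.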


Published in

MARS '14: Proceedings of the 2014 workshop on Mobile augmented reality and robotic technology-based systems
June 2014, 60 pages
ISBN: 9781450328234
DOI: 10.1145/2609829
Copyright © 2014 ACM

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

MARS '14 paper acceptance rate: 6 of 7 submissions, 86% (overall acceptance rate: 6 of 7 submissions, 86%)
