
Parametric motion graphs

Published: 30 April 2007
DOI: 10.1145/1230100.1230123

Abstract

In this paper, we present an example-based motion synthesis technique that generates continuous streams of high-fidelity, controllable motion for interactive applications such as video games. Our method uses a new data structure called a parametric motion graph to describe valid ways of generating linear blend transitions between motion clips that are dynamically generated through parametric synthesis in real time. Our system specifically uses blending-based parametric synthesis to accurately generate any motion clip from an entire space of motions by blending together examples from that space. The key to our technique is using sampling methods to identify and represent good transitions between these spaces of motion, each parameterized by a continuously valued parameter. This approach allows parametric motion graphs to be constructed with little user effort. Because parametric motion graphs organize all motions of a particular type, such as reaching to different locations on a shelf, under a single parameterized graph node, they are highly structured, facilitating fast decision-making for interactive character control. We have successfully created interactive characters that perform sequences of requested actions, such as cartwheeling or punching.
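
To make the data structure concrete, the sketch below shows one way a parametric motion graph could be organized in Python. It is a minimal illustration under stated assumptions, not the paper's implementation: parameters are scalars, the blend weights are simple inverse-distance weights standing in for blending-based parametric synthesis, and each edge's sampled transition data is looked up by the nearest sampled source parameter. All names (MotionSpace, ParametricMotionGraph, valid_target_ranges) are hypothetical.

from dataclasses import dataclass, field


@dataclass
class MotionSpace:
    """A graph node: a parameterized family of example clips,
    e.g. "punch at target height h". Clips are left abstract here."""
    name: str
    examples: dict  # parameter value -> example clip

    def synthesize(self, p, blend):
        """Approximate the clip at parameter p by blending nearby examples.

        Inverse-distance weights are a simplifying stand-in; the paper
        relies on blending-based parametric synthesis to compute accurate
        blend weights for a requested parameter value.
        """
        w = {q: 1.0 / (abs(p - q) + 1e-6) for q in self.examples}
        total = sum(w.values())
        return blend([(clip, w[q] / total) for q, clip in self.examples.items()])


@dataclass
class ParametricMotionGraph:
    """Nodes are MotionSpaces; an edge stores, for sampled source-parameter
    values, the target-parameter ranges that give good (low-error) linear
    blend transitions into the destination space."""
    nodes: dict = field(default_factory=dict)  # name -> MotionSpace
    edges: dict = field(default_factory=dict)  # (src, dst) -> {src param: [(lo, hi), ...]}

    def add_node(self, space):
        self.nodes[space.name] = space

    def add_edge(self, src, dst, sampled_transitions):
        self.edges[(src, dst)] = sampled_transitions

    def valid_target_ranges(self, src, dst, src_param):
        """Target-parameter ranges reachable from src_param along (src, dst).

        Nearest-sample lookup is a simplification of how the sampled
        transition regions would be represented and interpolated.
        """
        table = self.edges[(src, dst)]
        nearest = min(table, key=lambda q: abs(q - src_param))
        return table[nearest]

At run time, a character controller would pick the requested destination node, ask the connecting edge which target-parameter ranges are reachable from the current clip's parameter value, clamp the requested parameter into one of those ranges, and blend the destination space's examples to produce the next clip. This mirrors, in simplified form, the fast decision-making the abstract attributes to the graph's parameterized-node structure.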

    Published In

    I3D '07: Proceedings of the 2007 Symposium on Interactive 3D Graphics and Games
    April 2007, 196 pages
    ISBN: 9781595936288
    DOI: 10.1145/1230100

    Publisher

    Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. motion capture
    2. motion graphs
    3. motion synthesis

    Conference

    I3D '07: Symposium on Interactive 3D Graphics and Games 2007
    April 30 - May 2, 2007
    Seattle, Washington

    Acceptance Rates

    Overall Acceptance Rate 148 of 485 submissions, 31%
