US7931604B2 - Method for real time interactive visualization of muscle forces and joint torques in the human body - Google Patents

Method for real time interactive visualization of muscle forces and joint torques in the human body

Info

Publication number
US7931604B2
Authority
US
United States
Prior art keywords
muscle
motion
real time
forces
look
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US12/251,688
Other versions
US20090082701A1
Inventor
Oshri Even Zohar
Antonie J van den Bogert
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Motek BV
Original Assignee
Motek BV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Motek BV
Priority to US12/251,688
Assigned to MOTEK B.V. (assignors: VAN DEN BOGERT, ANTONIE J.; ZOHAR, OSHRI EVEN)
Publication of US20090082701A1
Application granted
Publication of US7931604B2
Legal status: Active
Anticipated expiration: adjusted

Classifications

    • A: HUMAN NECESSITIES; A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/22: Ergometry; Measuring muscular strength or the force of a muscular blow
    • A61B 5/224: Measuring muscular strength
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1113: Local tracking of patients, e.g. in a hospital or private home
    • A61B 5/1114: Tracking parts of the body
    • A61B 5/1116: Determining posture transitions
    • A61B 5/1126: Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B 5/1127: Measuring movement of the entire body or parts thereof using a particular sensing technique using markers
    • A61B 5/45: For evaluating or diagnosing the musculoskeletal system or teeth
    • A61B 5/4519: Muscles
    • A61B 5/4528: Joints
    • A61B 5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801: Arrangements specially adapted to be attached to or worn on the body surface
    • A61B 5/6802: Sensor mounted on worn items
    • A61B 5/6804: Garments; Clothes
    • G: PHYSICS; G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS; G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 23/00: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B 23/28: Models for scientific, medical, or mathematical purposes for medicine

Definitions

  • This invention most generally relates to a system that combines motion capture technology and a 3D computational musculoskeletal model to create a real time display environment where muscle forces and joint torques are illustrated. More specifically, various embodiments of the present invention create real time visualizations of the physical muscle forces and joint torques in the body during movement.
  • Motion capture is a term covering a variety of techniques, and the technology has been used for many years in a wide range of applications.
  • The aim of motion capture is to create three-dimensional (3D) animation and natural simulations in a performance oriented manner.
  • Motion capture allows an operator to use computer-generated characters.
  • Motion capture can be used to create complex motion, drawing on the full range of human movements, and also allows inanimate objects to move realistically.
  • Some motion capture systems provide real-time feedback of the data and allow the operator to immediately determine whether the motion works sufficiently.
  • Motion capture can be applied to full body motion as well as to hand animation, facial animation and real time lip sync.
  • Motion capture is also used in medical, simulation, engineering and ergonomic applications, and in feature films, advertising, TV and 3D computer games.
  • Kinematics is the process of calculating the position in space of the end of a linked structure, given the angles of all the joints. Inverse kinematics does the reverse: given the end point of the structure, it calculates the angles the joints need to be in to achieve that end point. This process is used in robotics, 3D computer animation and some engineering applications.
  • One embodiment of the present invention provides a method for real time display of the array of muscle forces and joint torques in a human body using color space animation of a 3D human body muscle model.
  • A data stream coming from a motion capture system is parsed through a pipeline of specially written algorithms that derive joint orientations, accelerations and velocities, and forward and inverse dynamics, resulting in real time measurements of muscle forces and joint torques. These are passed in real time to a 3D human muscle model, making the forces and torques visible to the user as they happen.
  • Another embodiment of the present invention provides runtime interaction by a user or operator.
  • a further embodiment of the present invention provides a combination of motion capture technologies, simulation technology and custom real time data processing algorithms, using a combination of hardware and software elements combined with the authoring and control software to customize the visualization in real time of forces and torques exerted by the human body.
  • Still another embodiment of the invention creates a new measurement and visualization tool, bearing applications in various industries.
  • The invention creates the possibility of looking at muscle force transference in the body for determining, registering and evaluating human functional performance in response to a range of given situations.
  • Yet another embodiment of the present invention provides a new measurement and visualization tool, bearing applications in various industries.
  • The invention creates the possibility of looking at muscle force and joint force transference in the body for determining, registering and evaluating human functional performance in response to a range of given situations.
  • Other applications include orthopedic and ergonomic studies and designs.
  • A yet further embodiment of the present invention provides a process that feeds real time 3D marker data streams coming from a motion capture system through real-time sets of algorithms that derive, from the 3D marker cloud, the joint centers of rotation, positions and orientations, then derive accelerations and velocities and convert those into an array of muscle forces. These are passed to the 3D human body muscle model as a data stream used in the 3D color space visualization of the muscle forces and joint torques.
  • Muscle forces are typically invisible by nature and one can normally only see the results of applied muscle forces on the individual's surroundings.
  • One embodiment of the present invention makes it possible to view simulated muscle forces in the human body in real-time, in a way that makes clear the force transference in the human musculoskeletal system.
  • the process of achieving this functionality relies on fast and accurate real time motion capture data processing into an IK (inverse kinematics) skeletal layer containing joint positions and orientations, a further process deriving accelerations and velocities, a further process deriving inverse dynamics in real time, a further process deriving muscle forces from joint torques, and a final process converting the result streams into 3D visualizations of color and form changes in a 3D accurate human body muscle model.
  • One embodiment of the invention is a method for real time display of the array of muscle forces and joint torques in a human body using color space animation of a 3D human body muscle model.
  • Data streams coming from a motion capture system are parsed through a pipeline of specially written algorithms that derive joint orientations, accelerations and velocities, and forward and inverse dynamics, resulting in real time measurements of muscle forces and joint torques. These are passed in real time to a 3D human muscle model, making the forces and torques visible to the eye as they happen.
  • One embodiment of the present invention allows runtime interaction by a user or operator.
  • Such an embodiment of the invention can be seen as a combination of motion capture technologies, simulation technology and custom real time data processing algorithms, using a combination of hardware and software elements combined with the authoring and control software to visualize in real time the forces and torques exerted by the human body.
  • One embodiment of the invention provides a new measurement and visualization tool, bearing applications in various industries.
  • One embodiment of the invention creates the possibility of looking at muscle force transference in the body for determining, registering and evaluating human functional performance to a range of given situations.
  • Although at least one embodiment of the present invention is intended for medical applications, embodiments of the present invention are adaptable to other market segments, including ergonomics and sports.
  • This system allows the visualization of muscle forces for any given exercise in real-time.
  • Such a system, illustrated in FIG. 3 , can be used to enhance, optimize and improve muscle forces by providing a realistic real time visualization of the given forces and torques.
  • The system allows the user 30 to see the force transference to various muscles in the body and achieve the desired effect.
  • A motion capture system 32 instantly records the user's motion and provides immediate muscle force visualizations 34 .
  • One embodiment of the present invention may be utilized by the medical community by making it possible to view muscle forces and torques in real-time. It can assist and improve the quality of life of many patients and allow the perception of physical movement and muscle behaviors for those not otherwise capable of such motion.
  • The system may be useful for victims of traumatic brain injury, cerebral damage, and spinal damage.
  • The study of motion recognition supports the notion that the body remembers certain movements and can even regenerate synaptic pathways. By visualizing the desired muscle force, the body can be re-trained to make that movement.
  • In the field of orthopedics and prosthetics, embodiments of the present invention can assist patients in understanding their present situation, where they lack muscle force and where they are exerting too much force for compensation reasons. With orthopedics, prosthetics, and amputees, the system can visualize and track muscle deficiencies while training and improving movements.
  • Yet another embodiment of the present invention combines muscle forces and resultant joint force into a calculation and visualization of the forces acting within joints. This is useful as a training tool to prevent and treat overuse injuries in the workplace, in ergonomics and in sports.
  • Each calculated joint is drawn as a sphere in the drawing of FIG. 2 .
  • Inverse kinematics is used to calculate the joint orientations from the motion capture data before deriving the accelerations and velocities of every body part.
  • The next step in the pipeline is to take the calculated joint angles and to derive values of accelerations and velocities for every joint (representing every body part). The acceleration and velocity values are the basis for calculating, through the use of inverse dynamics, the muscle forces and joint torques, which are then passed to the 3D muscle model display as color information.
  • a development project called “Virtual Gait Lab” is one embodiment of the system operating in the real-time domain. Such an embodiment pertains to the development of a virtual reality system in which the muscle forces and joint torques of the human body can be seen and evaluated in real time in a variety of reproducible conditions.
  • The enhancement is that, for the first time, a medical expert team has the opportunity to view and analyze muscle force and joint torque patterns as they happen in a controlled real-time environment.
  • Such a system consists of a combination of an instrumented treadmill that is capable of measuring ground reaction forces, a large screen or projection system for the display of the forces, a real time motion capture system, and a custom computational pipeline translating the capture data into a muscle force and joint torque display.
  • An embodiment of the present invention seeks to develop an interactive virtual real-time muscle model, which can provide patients with means of almost unlimited exploratory behaviors and at the same time provide medical experts with accurate measurement tools for monitoring the complex array of forces present in the human body.
  • The patterns of muscle activation determine whether a subject falls or not. These simulations are aimed at an understanding of normal or pathological response patterns in certain balance tasks.
  • Such an embodiment offers not only a test and learning environment for patients and doctors, but is also a valuable research environment for motor control. It opens the door to a new type of experiment in which real time muscle force visualization can be offered.
  • The muscle force tremors observed in Parkinson patients are considered an enigma by many clinicians and human movement scientists. In these patients some visual cues are sufficient to trigger rather normal looking muscle force patterns (for instance those used in walking), while in the absence of such stimuli a pattern cannot even be started.
  • The continuous control of muscle force transference during walking is possible because a multi-channel sensory input acts on a vast library of learned motor patterns. Once it is possible to view the emergence of muscle force patterns in real time, this will lead to fundamental improvement in the understanding and possible treatment of the disease. Such an embodiment will allow a new glimpse into the complexity of the natural processes associated with human motion.
  • Another example of an application for one embodiment of the present invention is the prevention and treatment of low back pain through teaching of proper lifting techniques. Real-time calculation and visualization of the forces acting on the intervertebral discs will provide immediate feedback to the patient concerning the quality of their movement.
  • Muscle forces will be visualized, but certain training applications may provide audio signals driven by muscle force values from the computational pipeline. Other training applications may use muscle force values as input for a virtual environment, causing changes in the position of virtual objects or of the motion platform on which the subject is standing.
  • The computational pipeline that results in real time muscle force display is flexible and allows forward dynamics simulations to be run at any time during runtime of the system.
  • In that case the flow of movements as an input to the inverse dynamics simulation is stopped during a sequence, the calculated joint moments are used as input, and the movements become the output.
  • Forward simulations calculate movements and reaction forces from the moments of force produced around the joints of the subjects. These forward simulations can be visualized as part of the virtual environment, and will show what might happen to the patient in hypothetical situations.
  • the forward and inverse dynamic calculations typically consist of a large set of equations. Depending on the methods used to form these equations, they are expressed in various ways, such as Newton-Euler equations, Lagrange's equations, or Kane's equations. These are called the equations of motion, which contain the relation between the generalized forces applied at the body and the generalized movements. “Generalized” in this respect means that they are formulated along the movement possibilities (or degrees of freedom) of the human body, rather than in terms of three dimensional coordinates in the external world. This implies that most of the generalized forces are actually moments of force (or torque). Equations can be added describing the kind of interaction with the environment, such as contacts with the floor. The equations can be solved simultaneously in a forward simulation, solved algebraically in an inverse simulation or rearranged and solved to do a mixed inverse and forward simulation. In one embodiment of the present invention these computations are all happening in real time.
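As an editorial illustration (not part of the patent text), the sketch below shows the forward/inverse relationship for a single-degree-of-freedom pendulum-like segment; the parameter values are arbitrary, and the full-body model naturally involves many coupled degrees of freedom.

```python
# Minimal sketch: equations of motion for a one-DOF pendulum-like segment.
# Forward dynamics turns a generalized force (torque) into motion; inverse
# dynamics recovers the torque from the observed motion. Values are arbitrary.
import numpy as np

m, L, g = 2.0, 0.4, 9.81           # segment mass (kg), length (m), gravity
I = m * L**2 / 3.0                 # moment of inertia about the joint

def forward_dynamics(theta, tau):
    """Angular acceleration produced by joint torque tau at angle theta."""
    return (tau - m * g * (L / 2.0) * np.sin(theta)) / I

def inverse_dynamics(theta, alpha):
    """Joint torque required to produce angular acceleration alpha."""
    return I * alpha + m * g * (L / 2.0) * np.sin(theta)

# Forward simulation (simple Euler integration), then an inverse-dynamics check
dt, theta, omega, tau_in = 0.001, 0.3, 0.0, 1.5
for _ in range(1000):
    alpha = forward_dynamics(theta, tau_in)
    omega += alpha * dt
    theta += omega * dt
print(inverse_dynamics(theta, forward_dynamics(theta, tau_in)))   # ~1.5
```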
  • The main tasks of the real time computational pipeline are processing the input data coming from the motion capture sensors, mapping the collected data onto the above mentioned human body model, and processing the various input and/or computed data depending on the case at hand.
  • Other tasks concern the display of real-time 3D graphic representations of muscle forces and joint torques 28 , as well as driving the output devices such as a treadmill 38 and a display system 34 as illustrated in FIG. 3 .
  • The user interface for the operator is implemented, through a custom written software program, as the means to communicate with the real time 3D muscle model 26 of FIG. 1D .
  • The real time 3D muscle model is projected on the screen in front of the subject.
  • The user stands on a platform or treadmill, which can be controlled as part of the system or as a reaction to movements of the subject.
  • The user wears motion capture markers 20 , as illustrated in FIGS. 1A and 2 , whose positions are recorded. These are fed into an algorithm that turns them into the degrees of freedom of the human body model, which is filled with the segment masses 22 and inertia of the subject and displayed as color space real time animations of the 3D muscle model of FIG. 1E .
  • The human body model 26 produces the joint moments of force of the subject; if necessary, this information can be offered in the projected image to be used by the subject. Forward dynamics simulation can also be computed to indicate where weak parts in the motor pattern are located.
  • FIGS. 1A-1E illustrate an overview of one embodiment of the present invention's computational real time pipeline wherein as illustrated in FIG. 1A a user is equipped with a number of motion capture sensors or markers 20 attached at various strategic locations of the body. The data from the sensors is received by a motion capture system 32 .
  • The motion capture data set contains the X axis, Y axis, and Z axis positions of the user's full body, and is transmitted at >100 FPS (frames per second) to the computer 36 .
  • The computer 36 interactively operates with the operator's interface 34 and executes the first step in the computational pipeline, converting the positional data in real time to an inverse kinematics skeleton 22 illustrated in FIG. 1B .
  • This data is typically applied to the inverse kinematics skeleton 22 to drive a 3D anatomically correct skeleton 24 in approximately real time ( FIG. 1C ). Then a 3D anatomically correct muscle layer 26 of FIG. 1D is connected to the human skeleton 24 , and the muscle forces and joint torques resulting from the real time computational pipeline are applied to real time animations of colors 28 of the respective muscles in the 3D muscle model of FIG. 1E .
  • A person is outfitted with markers 20 and a template 22 is processed for an initial or balance position.
  • The markers 20 are typically used to record the motion. They are captured substantially instantaneously and used to process a complete template.
  • The template 22 utilizes a template matching algorithm to interpolate for missing or bad marker data.
  • The template matching result is passed to the computational inverse kinematics skeleton 24 .
  • Position data of the markers is mapped in real time to joint orientations in the computational skeleton 24 .
  • Through constraint based rigging, the data in turn drives a geometry (anatomically correct) skeleton. This skeleton is the base for the muscle force visualization layer.
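The patent does not give the template matching algorithm itself; as a loose, hypothetical illustration of the idea, the sketch below builds a table of expected inter-marker distances from a calibration trial, flags frames that violate it, and repairs a marker trajectory by interpolation.

```python
# Hypothetical sketch only: a calibration template of inter-marker distances
# is used to flag bad frames, which are then repaired by interpolation.
import numpy as np

def build_template(calib_frames):
    """Mean pairwise marker distances from a static calibration trial.
    calib_frames: (T, M, 3) array of marker positions."""
    diff = calib_frames[:, :, None, :] - calib_frames[:, None, :, :]
    return np.linalg.norm(diff, axis=-1).mean(axis=0)      # (M, M) distances

def flag_bad_frames(frames, template, pair=(0, 1), tol=0.02):
    """Flag frames where two markers drift more than tol metres from the
    calibrated distance between them."""
    d = np.linalg.norm(frames[:, pair[0]] - frames[:, pair[1]], axis=-1)
    return np.abs(d - template[pair]) > tol

def repair_marker(traj, bad, t):
    """Interpolate one marker's trajectory (T, 3) over frames flagged as bad."""
    good = ~bad
    for axis in range(3):
        traj[bad, axis] = np.interp(t[bad], t[good], traj[good, axis])
    return traj
```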
  • FIG. 3 illustrates an embodiment of the present invention comprising a computer-based motion capture system that links a treadmill instrumented with force and weight sensors, multiple optical motion capture cameras, and a plasma screen or other video display means to a control computer running predictive feedback software for generating, on the display, an image of a 3D real time interactive muscle model of the figure on the treadmill. The patient 30 on the instrumented treadmill 38 looks at the 3D real time interactive muscle model 34 of himself, seeing the muscles in action as muscle force is exerted.
  • This interactive muscle force model 34 is calculated by a processor 36 , using the method described above, from data obtained from optical motion capture sensors 32 disposed on the patient's body 30 in combination with sensors disposed in the instrumented treadmill 38 .
  • Weight sensors may be disposed in the instrumented treadmill 38 , while other sensors such as accelerometers, speedometers, rotation and position sensors may also be included.
  • FIG. 4 is a block diagrammatic view illustrating the hardware and software elements and possible interconnections of one embodiment of a motion capture system.
  • Computers running predictive feedback loop software are linked to presence sensing technology such as optical motion capture systems, magnetic motion capture systems, inertial motion capture systems, video based motion capture systems, force sensors, weight sensors, temperature sensors, electromyography systems, electroencephalography systems, electrocardiography systems and sound sensors.
  • the computers are also linked to motion generator devices such as passive treadmills, active treadmills, instrumented treadmills, hydraulic motion platforms, electric motion platforms, pneumatic motion platforms, and motion simulators.
  • the presence sensing technology and the motion generation devices are linked by various possible interface devices to the subject.
  • The hardware platform is based on high-end multi-core, multi-processor workstations.
  • The multi-CPU hardware platform 36 is used as the computer means for processing, memory, and interfacing.
  • The various peripherals and communications are handled over standard high-speed Ethernet, serial, and SCSI connections to dedicated hosts.
  • the dedicated host can be a separate personal computer (PC) or an integrated on-board computer that interfaces with the peripheral equipment.
  • the optical motion capture system of one embodiment includes six cameras, and the data acquisition unit of the optical motion capture system translates the camera input into the desired data set.
  • the data set of one embodiment is 3D position information of the sensors 20 obtained from a person 30 in real time, and is accessible to a dedicated host that allows for the fast exchange of data to the CPU 36 .
  • Data is, in one embodiment, delivered in a custom made file format.
  • The chosen main optical capture system of one embodiment is a real-time passive marker system 32 , which is readily configurable for many setups. This technology is capable of converting and displaying 3D data coordinates of up to 300 optical markers at >100 Hz.
  • The instrumented treadmill 38 is interconnected to a dedicated host that connects to the CPU for transferring data and control information.
  • The treadmill 38 of one embodiment can measure real time ground reaction forces by the use of force sensors under the treadmill belt.
  • a projection device 34 such as a plasma screen or a video projector and screen is used to display the real time 3D muscle model to the user.
  • FIG. 5 illustrates a flow chart illustrating the operation of a system configured according to one embodiment of the present invention for generating real time muscle force visualization of a subject using real time motion capture with force sensors processed through a real-time pipeline utilizing a first human body model lookup table of skeleton and markers configurations, a second human body model lookup table of mass properties, and a third human body lookup table of muscle paths.
  • Input from the motion capture system 1 , in the form of 3D marker coordinates, is used as input for the Kinematic Solver 6 .
  • The Kinematic Solver 6 also uses resource files from the first lookup table of skeleton definitions and marker set templates 3 .
  • The Kinematic Solver 6 outputs the current skeleton pose in real-time.
  • Real-time low-pass filtering and differentiation turn the changes in skeleton pose into first and second derivatives, i.e. velocities and accelerations, that are used as input to the Motion Equations 7 .
  • The Kinematic Solver output also drives, via the third (muscle path) lookup table, the generation of muscle paths for all respective muscles 5 , and outputs the schematic skeleton used for the visualization 9 .
  • The Motion Equations 7 also use input from ground reaction forces and other external forces coming from an array of force sensors 2 .
  • The Motion Equations 7 also use input from resource files of the second human body model lookup table, which contains the respective body mass properties 4 .
  • The Equations of Motion 7 output joint moments to the Optimization process 8 .
  • The Optimization process 8 also uses input of muscle lengths and moment arms coming from the respective muscle paths 5 in the muscle path third lookup table.
  • The Optimization process 8 outputs the muscle forces used in the Real Time muscle force Visualization 9 .
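As an illustration of the filtering-and-differentiation step, here is a minimal sketch; the patent does not specify the filter design, so the filter order, cutoff and sampling rate below are assumptions.

```python
# Assumed sketch: low-pass the streamed generalized coordinates, then take
# first and second numerical derivatives to obtain the velocities and
# accelerations fed into the Motion Equations.
import numpy as np
from scipy.signal import butter, lfilter

FS = 120.0                                   # pose stream rate in Hz (assumed)
B, A = butter(2, 6.0 / (FS / 2.0))           # 2nd-order low-pass, 6 Hz cutoff (assumed)

def smooth_and_differentiate(q):
    """q: (T, n_dof) generalized coordinates -> filtered pose, velocity, acceleration."""
    q_filt = lfilter(B, A, q, axis=0)                 # causal low-pass filter
    q_dot = np.gradient(q_filt, 1.0 / FS, axis=0)     # first derivative: velocity
    q_ddot = np.gradient(q_dot, 1.0 / FS, axis=0)     # second derivative: acceleration
    return q_filt, q_dot, q_ddot
```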
  • The skeleton pose (i.e. the set of generalized coordinates) is calculated in real-time by using the Levenberg-Marquardt nonlinear least-squares algorithm to solve the global optimization problem.
  • The use of the analytical Jacobian matrix makes the computations very fast.
  • The equations of motion are produced via software that creates C code for the forward kinematics equations. Those equations generate the coordinates of markers on the body from the generalized coordinates of the skeleton. The derivatives of the forward kinematics equations, forming a Jacobian matrix, are generated via symbolic differentiation. Finally, one embodiment of the present invention translates these equations into computer code which is incorporated into the computational pipeline, which executes the calculations at run time.
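To make the pose-solving step concrete, here is a toy sketch using SciPy's Levenberg-Marquardt least-squares routine on a two-joint planar chain; the segment lengths and marker layout are invented, and SciPy estimates the Jacobian numerically rather than using the analytical Jacobian described above.

```python
# Toy illustration of the real-time pose solve: fit generalized coordinates so
# that predicted marker positions match measured ones, via Levenberg-Marquardt.
import numpy as np
from scipy.optimize import least_squares

L1, L2 = 0.45, 0.42                       # assumed segment lengths (m)

def forward_kinematics(q):
    """Predicted elbow and wrist marker positions for joint angles q (2-DOF planar chain)."""
    elbow = np.array([L1 * np.cos(q[0]), L1 * np.sin(q[0])])
    wrist = elbow + np.array([L2 * np.cos(q[0] + q[1]), L2 * np.sin(q[0] + q[1])])
    return np.concatenate([elbow, wrist])

def residuals(q, measured):
    return forward_kinematics(q) - measured

def solve_pose(measured, q_prev):
    """Warm-start from the previous frame so each real-time solve needs few iterations."""
    return least_squares(residuals, q_prev, args=(measured,), method="lm").x

measured = forward_kinematics(np.array([0.4, 0.8])) + 0.002   # synthetic noisy markers
print(solve_pose(measured, q_prev=np.zeros(2)))               # recovers roughly [0.4, 0.8]
```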
  • The muscle forces are the solution of a static optimization problem of the general form: minimize the sum of the normalized muscle forces raised to the Nth power, subject to the requirements that all muscle forces are non-negative and that the muscle forces, multiplied by their respective moment arms, reproduce the joint torques solved by the inverse dynamics equations.
  • Normalized muscle force is defined as the muscle force relative to the maximal force capacity of the muscle.
  • Moment arm is the distance from the muscle force vector to the instantaneous center of rotation of a particular joint and is mathematically calculated as the derivative of muscle length with respect to the joint's generalized coordinate.
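A minimal numerical sketch of this static optimization, using SciPy's SLSQP solver on an invented one-joint, two-muscle example (the patent does not prescribe a particular solver):

```python
# Sketch: distribute a joint torque over muscles by minimizing the sum of
# normalized muscle forces to the Nth power, with non-negative forces whose
# moments reproduce the inverse-dynamics joint torques. Numbers are invented.
import numpy as np
from scipy.optimize import minimize

def solve_muscle_forces(moment_arms, f_max, joint_torques, n=2):
    """moment_arms: (n_joints, n_muscles) matrix R; solve
    min sum((f / f_max)**n)  subject to  R f = tau,  f >= 0."""
    objective = lambda f: np.sum((f / f_max) ** n)
    constraint = {"type": "eq", "fun": lambda f: moment_arms @ f - joint_torques}
    bounds = [(0.0, None)] * f_max.size
    res = minimize(objective, x0=np.ones(f_max.size), method="SLSQP",
                   bounds=bounds, constraints=[constraint])
    return res.x

R = np.array([[0.05, 0.03]])                      # moment arms in metres (invented)
forces = solve_muscle_forces(R, f_max=np.array([1500.0, 900.0]),
                             joint_torques=np.array([30.0]))
print(forces, R @ forces)                         # forces reproduce the 30 N·m torque
```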
  • Motion Capture is a phrase used to describe a variety of techniques for capturing the movement of a body or object, and the technology has existed for many years in a variety of applications.
  • the aim of motion capture is to create three-dimensional (3D) animation and natural simulations in a performance oriented manner.
  • Motion capture allows an operator to use computer-generated characters.
  • Motion capture is used to create complex natural motion, drawing on the full range of human movements, and also allows inanimate objects to move realistically.
  • Some motion capture systems provide real-time feedback of the data and allow the operator to immediately determine whether the motion works sufficiently.
  • Motion capture can be applied to full body motion as well as to hand animation, facial animation and real time lip sync.
  • Motion capture is also used in medical, simulation, engineering and ergonomic applications, and in feature films, advertising, TV and 3D computer games. In the context of the present invention, motion capture is used to output 3D XYZ marker positions.
  • Force sensors are used in many industries such as automotive, robotics and various engineering applications. Typically, a force sensor measures the total forces applied to it; these can be vertical force components or horizontal and shear force components.
  • In the present system, force sensors are used to measure ground reaction forces from the treadmill a person is standing, walking or running on.
  • The treadmill of one embodiment can measure real time ground reaction forces by the use of force sensors under the treadmill belt. Its speed is interconnected to the computational pipeline, establishing a feedback loop between the motion capture system and the treadmill so that the person remains at the center of the treadmill regardless of changes in walking/running speed.
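The control law for this feedback loop is not given in the text; the following is a hypothetical sketch in which the belt speed is nudged each control tick in proportion to how far the subject has drifted from the treadmill centre. The gain, limits and update rate are assumptions, not values from the patent.

```python
# Hypothetical self-pacing sketch: keep the subject near the treadmill centre
# by adjusting belt speed from the motion-capture pelvis position.
def update_belt_speed(current_speed, pelvis_x, centre_x=0.0,
                      gain=0.8, dt=0.01, max_speed=5.0):
    """pelvis_x: fore-aft position from motion capture (m, positive = forward)."""
    drift = pelvis_x - centre_x              # how far the subject has moved forward
    new_speed = current_speed + gain * drift * dt
    return min(max(new_speed, 0.0), max_speed)

speed = 1.2
for pelvis_x in (0.10, 0.12, 0.15):          # subject drifting forward
    speed = update_belt_speed(speed, pelvis_x)
print(speed)                                 # belt speeds up slightly to re-centre
```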
  • Skeleton definition and marker set templates 3 are resource files used in the computational pipeline of the current invention. People differ in size and weight, and a skeleton template is selected from a group of skeleton templates to get the best match for every person.
  • Marker templates are used to define where the markers are placed on the human body. Typically, such markers are disposed at every joint of the body.
  • Body mass properties 4 pertain to the weight of different body parts of different people. People vary in weight, and this has ramifications for the muscle force they exert to generate specific motions. The mass properties are used as a resource for the correct real time force computations.
  • Muscle paths 5 are utilized to compensate for differences in build between users. Variations in length and width between subjects have ramifications for the force computations, as a longer muscle will exert a different force to generate the same motion than a shorter muscle; the placement of the ligaments will also differ between people. In the context of one embodiment of the present invention, muscle paths are used to assist the computations of muscle forces and joint torques.
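To illustrate how such resource lookup tables might be organized, here is a small sketch; the field names and scaling rules are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch of the resource lookup tables: skeleton/marker templates
# (3) and body mass properties (4), with simple subject-specific selection.
from dataclasses import dataclass

@dataclass
class SkeletonTemplate:
    name: str
    stature_m: float                 # body height the template was built for
    marker_labels: tuple             # marker names in the template marker set

@dataclass
class BodyMassProperties:
    segment_mass_fraction: dict      # e.g. {"thigh": 0.10, "shank": 0.0465}

def select_skeleton_template(templates, subject_stature_m):
    """Pick the stored template closest in stature to the subject."""
    return min(templates, key=lambda t: abs(t.stature_m - subject_stature_m))

def scale_segment_masses(props, subject_mass_kg):
    """Subject-specific segment masses from total body mass and fractions."""
    return {seg: frac * subject_mass_kg
            for seg, frac in props.segment_mass_fraction.items()}
```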
  • Kinematic solver 6 provides for the calculation of joint orientation using inverse kinematics.
  • Kinematics is the process of calculating the position in space of the end of a linked structure, given the angles of all the joints. Inverse Kinematics does the reverse. Given the end point of the structure, what angles do the joints need to be in to achieve that end point? This process is used in robotics, 3D computer animation and some engineering applications.
  • The kinematic solver is a single step in the data analysis pipeline, taking the data stream from the motion capture system and calculating the joint angles for every body part.
  • Inverse kinematics is used to calculate the joint orientation from the motion capture data, and to thereby convert XYZ positional data to rotation angle data of the joints in degrees or radians.
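As a minimal, stand-alone illustration of converting XYZ marker positions into a joint angle (a gross simplification of the full-body inverse-kinematics solve):

```python
# Toy example: knee flexion angle from hip, knee and ankle marker positions.
# The real pipeline solves all generalized coordinates simultaneously.
import numpy as np

def knee_flexion_deg(hip, knee, ankle):
    thigh = hip - knee
    shank = ankle - knee
    cosine = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    return 180.0 - np.degrees(np.arccos(np.clip(cosine, -1.0, 1.0)))

print(knee_flexion_deg(np.array([0.0, 0.00, 1.00]),
                       np.array([0.0, 0.10, 0.55]),
                       np.array([0.0, 0.05, 0.10])))   # roughly 19 degrees of flexion
```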
  • Motion Equations 7 are sets of mathematical equations designed to combine incoming streams of kinematics data with marker and skeleton templates and convert those into forward and inverse dynamics data. These can be Lagrangian equation sets, Kane's equation sets, or Newton-Euler equation sets.
  • The motion equations 7 provide the relationship between generalized forces applied at the body and generalized movements. “Generalized” in this respect means that they are formulated along the movement possibilities (or degrees of freedom) of the human body, rather than in terms of forces in the external world.
  • Equations 7 can be added describing the kind of interaction with the environment, such as contacts with the floor.
  • the equations 7 can be solved simultaneously in a forward simulation, solved algebraically in an inverse simulation or rearranged and solved to do a mixed inverse and forward simulation. In one embodiment of the present invention these computations are all happening in real time.
  • Effective delay is eliminated using efficient algorithms, achieving a minimal real time sampling rate greater than 30 Hz, a standard familiar to those in the television and broadcast industries.
  • Faster rates would likewise be acceptable or desirable in some applications.
  • An optimization process 8 uses the input of muscle lengths and moment arms coming from the respective muscle paths to output muscle forces and joint torques.
  • The optimization 8 of the data contains routines for data normalization and several real time software filters.
  • Real Time muscle force visualization 9 is driven by the muscle force and joint torque inputs, which are used to animate the colors of the respective muscles displayed as a 3D human body model on screen.
  • The color brightness and hue correlate with the muscle force amplitude, gain and activation patterns.
  • The user and operator can see a real time animation of the muscle forces active in the human body at any given time.
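The exact color scheme is not specified in the text; as one possible mapping, the sketch below drives hue from cool to warm and brightness from dim to bright as the normalized muscle force rises.

```python
# Assumed colour mapping: normalized muscle force -> (hue, brightness) -> RGB.
# Low force renders as a dim blue, maximal force as a bright red.
import colorsys

def force_to_rgb(force, max_force):
    activation = max(0.0, min(force / max_force, 1.0))    # normalize to 0..1
    hue = (1.0 - activation) * 2.0 / 3.0                  # 2/3 (blue) down to 0 (red)
    value = 0.3 + 0.7 * activation                        # brightness rises with force
    return colorsys.hsv_to_rgb(hue, 1.0, value)

print(force_to_rgb(450.0, max_force=900.0))               # mid activation: a green tone
```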
  • Various embodiments of the present invention provide applications adaptable for other market segments. Sports and fitness is one such market.
  • One embodiment of the present invention provides a tool that is useful in numerous applications, including the fitness industry.
  • This system allows the visualization of muscle forces for any given exercise in real-time.
  • the system can be used to enhance and improve muscle forces, by providing a realistic visualization of the given forces and torques.
  • the present system allows the user to see the force transference to various muscles in the body and achieve a desired effect.
  • the motion capture system instantly records the user's motion and provides immediate muscle force visualizations.
  • One embodiment of the present invention may have an enormous impact in the medical community by making it possible to view muscle forces and torques in real-time. It can assist and improve the quality of life of many patients and allow the perception of physical movement and muscle behaviors for those not otherwise capable of such motion.
  • the system is useful for victims of traumatic brain injury, cerebral damage, and spinal damage.
  • The study of motion recognition supports the notion that the body remembers certain movements and can even regenerate synaptic pathways. By visualizing the desired muscle force, the body can be re-trained to make that movement.
  • the present invention can assist patients in understanding their present situation, where they lack muscle force and where they are exerting too much force for compensation reasons.
  • the system can visualize and track muscle deficiencies while training and improving movements.
  • One embodiment of the present invention in relation to medical applications can serve as an example.
  • One development project called “Virtual Gait Lab” is one embodiment of the system operating in the real-time domain. This project pertains to the development of a virtual reality system in which the muscle forces and joint torques of the human body can be seen and evaluated in real time in a variety of reproducible conditions.
  • One of the major objectives of such a project is to enhance diagnostic and therapeutic activities in a range of medical fields.
  • The enhancement is that, for the first time, a medical expert team has the opportunity to view and analyze muscle force and joint torque patterns as they happen in a controlled real-time environment.
  • The system consists of a combination of an instrumented treadmill 38 that is capable of measuring ground reaction forces, a large screen or projection system for the display of the forces 34 , a real time motion capture system 32 , and a custom computational pipeline 36 translating the capture data into a muscle force and joint torque display.
  • Various embodiments of the present invention seek to develop an interactive virtual real-time muscle model, which can provide patients with means of almost unlimited exploratory behaviors and at the same time provide medical experts with accurate measurement tools for monitoring the complex array of forces present in the human body.
  • the patterns of muscle activation determine whether a subject falls or not.
  • These simulations are aimed at an understanding of normal or pathological response patterns in certain balance tasks.
  • Such an embodiment offers not only a test and learning environment for patients and doctors, but is also a valuable research environment for motor control.
  • Such an embodiment opens the door to a new type of experiment in which real time muscle force visualization can be offered.
  • the muscle force tremors as observed in Parkinson patients are considered to be an enigma by many clinicians and human movement scientists.
  • One embodiment of the invention is a new principle in real time visualization, where muscle force is seen and evaluated in a totally new way. This principle establishes a mechanism to achieve a visualization state whereby the persons involved can see immediately which muscles they are using and to what extent.
  • One embodiment of the present invention is a muscle force processing system comprising a processing means and a motion capture system connected to the processing means.
  • The motion capture data is taken from a plurality of motion sensors and is processed in real-time.
  • A further embodiment adds an instrumented treadmill capable of measuring ground reaction forces, wherein the measurements of said ground reaction forces are integrated into the computational pipeline, resulting in a real time view of muscle forces and joint torques.
  • A further embodiment is a 3D interactive muscle model further comprising an inverse kinematics skeleton layer, a 3D geometry anatomically correct skeleton layer and an anatomically correct muscle model layer.
  • An additional embodiment is a real time computational pipeline, further comprising a memory means for recording the motion capture data and processing the data in real time through said layers of the real time processing pipeline.
  • Another embodiment is a method and system for real time visualization registration, evaluation, and correction of muscle forces and joint torques in the human body, wherein the full process is happening in real time.

Abstract

A method and system are provided for the visual display of anatomical forces, that system having: a motion capture system; a computer, receiving data from said motion capture system; and a computational pipeline disposed on said computer; that computational pipeline being configured to calculate muscle forces and joint torques in real time and visually display those forces and torques.

Description

RELATED APPLICATIONS
This application is a continuation of U.S. application Ser. No. 11/832,726, filed Aug. 2, 2007, which claims the benefit of U.S. Provisional Application No. 60/893,394, filed Mar. 7, 2007. This application is herein incorporated in its entirety by reference.
FIELD OF THE INVENTION
This invention most generally relates to a system that combines motion capture technology and a 3D computational musculoskeletal model to create a real time display environment where muscle forces and joint torques are illustrated. More specifically, various embodiments of the present invention create real time visualizations of the physical muscle forces and joint torques in the body during movement.
BACKGROUND OF THE INVENTION
Currently there is no known system or method available for visualizing in 3D the muscle forces exerted by the human body in real time. Most rehabilitation clinics and medical research institutes use specialized therapeutic programs based on cause related classifications of movement disorders, but there is no known way for them to view the body force arrays in real time: it usually takes many hours or days of calculation to derive those parameters, and the results are numerical or graphical and not intuitive to the viewer.
Motion capture is a term covering a variety of techniques, and the technology has been used for many years in a wide range of applications. The aim of motion capture is to create three-dimensional (3D) animation and natural simulations in a performance oriented manner. In the entertainment industry, motion capture allows an operator to use computer-generated characters. Motion capture can be used to create complex motion, drawing on the full range of human movements, and also allows inanimate objects to move realistically. Some motion capture systems provide real-time feedback of the data and allow the operator to immediately determine whether the motion works sufficiently. Motion capture can be applied to full body motion as well as to hand animation, facial animation and real time lip sync. Motion capture is also used in medical, simulation, engineering and ergonomic applications, and in feature films, advertising, TV and 3D computer games.
Kinematics is the process of calculating the position in space of the end of a linked structure, given the angles of all the joints. Inverse kinematics does the reverse: given the end point of the structure, it calculates the angles the joints need to be in to achieve that end point. This process is used in robotics, 3D computer animation and some engineering applications.
Dynamics is the process of calculating the accelerations of a linked structure in space, given the set of internal and external forces acting on the structure. Inverse dynamics does the opposite. Given the accelerations of the structure, and a set of measured forces, it calculates the unknown internal forces needed to produce those accelerations. The result is typically provided as a set of joint torques and resultant joint forces.
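To make the inverse-dynamics definition concrete, here is a small sketch for a single two-dimensional foot segment during stance, recovering the unknown ankle force and torque from measured kinematics and the ground reaction force; it is a simplification of the multi-segment calculation used in the pipeline, and the numbers are invented.

```python
# Sketch of planar inverse dynamics for one segment: given measured motion and
# the ground reaction force, recover the unknown joint force and torque from
# the Newton-Euler balance.
import numpy as np

def segment_inverse_dynamics(m, I, a_com, alpha, r_grf, grf, r_joint, g=9.81):
    """m, I: segment mass and moment of inertia about its centre of mass;
    a_com: (2,) linear acceleration of the centre of mass; alpha: angular
    acceleration; r_grf, r_joint: (2,) vectors from the centre of mass to the
    centre of pressure and to the joint; grf: (2,) ground reaction force."""
    weight = np.array([0.0, -m * g])
    joint_force = m * a_com - grf - weight                     # sum of forces = m a
    cross2d = lambda r, f: r[0] * f[1] - r[1] * f[0]
    joint_torque = I * alpha - cross2d(r_grf, grf) - cross2d(r_joint, joint_force)
    return joint_force, joint_torque

# Toy stance-phase numbers (not from the patent)
print(segment_inverse_dynamics(m=1.2, I=0.01,
                               a_com=np.array([0.5, 0.2]), alpha=3.0,
                               r_grf=np.array([0.08, -0.04]),
                               grf=np.array([20.0, 700.0]),
                               r_joint=np.array([-0.05, 0.06])))
```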
What is needed, therefore, are techniques for combining all the described steps into a single computational pipeline running in real time, creating for the first time the capability to view muscle forces as they occur.
SUMMARY OF THE INVENTION
One embodiment of the present invention provides a method for real time display of the array of muscle forces and joint torques in a human body using color space animation of a 3D human body muscle model. A data stream coming from a motion capture system is parsed through a pipeline of specially written algorithms that derive joint orientations, accelerations and velocities, and forward and inverse dynamics, resulting in real time measurements of muscle forces and joint torques. These are passed in real time to a 3D human muscle model, making the forces and torques visible to the user as they happen.
Another embodiment of the present invention provides runtime interaction by a user or operator.
A further embodiment of the present invention provides a combination of motion capture technologies, simulation technology and custom real time data processing algorithms, using a combination of hardware and software elements combined with the authoring and control software to customize the visualization in real time of forces and torques exerted by the human body.
Still another embodiment of the invention creates a new measurement and visualization tool, bearing applications in various industries. The invention creates the possibility of looking at muscle force transference in the body for determining, registering and evaluating human functional performance in response to a range of given situations.
Yet another embodiment of the present invention provides a new measurement and visualization tool, bearing applications in various industries. The invention creates the possibility of looking at muscle force and joint force transference in the body for determining, registering and evaluating human functional performance in response to a range of given situations. Other applications include orthopedic and ergonomic studies and designs.
A yet further embodiment of the present invention provides a process that feeds real time 3D marker data streams coming from a motion capture system through real-time sets of algorithms that derive, from the 3D marker cloud, the joint centers of rotation, positions and orientations, then derive accelerations and velocities and convert those into an array of muscle forces that are passed to the 3D human body muscle model as a data stream used in the 3D color space visualization of the muscle forces and joint torques.
The features and advantages described herein are not all-inclusive and, in particular, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and not to limit the scope of the inventive subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
FIG. 1A is a computer generated image illustrating, in a three dimensional representation, the motion capture points disposed on a user (not shown), configured in accordance with, but not limited to, one embodiment of the present invention.
FIG. 1B is a computer generated image illustrating a kinematics skeleton configured in accordance with one embodiment of the present invention derived from the same motion capture points data as FIG. 1.
FIG. 1C is a computer generated image illustrating an anatomically correct skeleton configured in accordance with one embodiment of the present invention, in conjunction with the data of FIG. 1.
FIG. 1D is a computer generated image illustrating a three dimensional anatomically correct muscle layer configured in accordance with one embodiment of the present invention corresponding with the data set of FIG. 1.
FIG. 1E is a computer generated image illustrating a three dimensional anatomically correct muscle layer disposed on an anatomically correct three dimensional skeleton configured in accordance with one embodiment of the present invention, corresponding to the data points of FIG. 1, where active muscle forces are indicated by color brightness and hue.
FIG. 2 is a computer generated image illustrating pipeline layer connections for processing the capture point data of a person with sensors and template matching it from a lookup table to generate a computational skeleton from which is derived a geometry skeleton configured in accordance with one embodiment of the present invention.
FIG. 3 is a computer generated image illustrating a V-Gait configured in accordance with one embodiment of the present invention, produced by a human figure adorned with optical motion capture sensors standing erect on an instrumented treadmill configured with force sensors and weight sensors, viewing a plasma screen or other video display TV, while being monitored by multiple optical motion capture cameras connected to a control computer running predictive feedback software and generating an image on the TV of a 3D real time interactive muscle model of the human figure.
FIG. 4 is a block diagram illustrating a motion capture computer system configured in accordance with one embodiment of the present invention.
FIG. 5 is a flow chart illustrating a method of motion capture configured in accordance with one embodiment of the present invention.
DETAILED DESCRIPTION
Muscle forces are typically invisible by nature and one can normally only see the results of applied muscle forces on the individual's surroundings. One embodiment of the present invention makes it possible to view simulated muscle forces in the human body in real-time, in a way that makes clear the force transference in the human musculoskeletal system. The process of achieving this functionality relies on fast and accurate real time motion capture data processing into an IK (inverse kinematics) skeletal layer containing joint positions and orientations, a further process deriving accelerations and velocities, a further process deriving inverse dynamics in real time, a further process deriving muscle forces from joint torques, and a final process converting the result streams into 3D visualizations of color and form changes in a 3D accurate human body muscle model.
The applicant herein incorporates by reference U.S. Pat. No. 6,774,885 for all purposes.
One embodiment of the invention is a method for real time display of the array of muscle forces and joint torques in a human body using color space animation of a 3D human body muscle model. Data streams coming from a motion capture system are parsed through a pipeline of specially written algorithms that derive joint orientations, accelerations and velocities, and forward and inverse dynamics, resulting in real time measurements of muscle forces and joint torques. These are passed in real time to a 3D human muscle model, making the forces and torques visible to the eye as they happen.
One embodiment of the present invention allows runtime interaction by a user or operator. Such an embodiment of the invention can be seen as a combination of motion capture technologies, simulation technology and custom real time data processing algorithms, using a combination of hardware and software elements combined with the authoring and control software to visualize in real time the forces and torques exerted by the human body.
One embodiment of the invention provides a new measurement and visualization tool, bearing applications in various industries. One embodiment of the invention creates the possibility of looking at muscle force transference in the body for determining, registering and evaluating human functional performance to a range of given situations. Although at least one embodiment of the present invention is intended for medical applications, embodiments of the present invention are adaptable for other market segments including ergonomics and sports.
Various embodiments of the present invention provide tools that are useful in numerous applications, including the sports and fitness industries. This system allows the visualization of muscle forces for any given exercise in real-time. Such a system, illustrated in FIG. 3, can be used to enhance, optimize and improve muscle forces by providing a realistic real time visualization of the given forces and torques. The system allows the user 30 to see the force transference to various muscles in the body and achieve the desired effect. A motion capture system 32 instantly records the user's motion and provides immediate muscle force visualizations 34.
One embodiment of the present invention may be utilized by the medical community by making it possible to view muscle forces and torques in real-time. It can assist and improve the quality of life of many patients and allow the perception of physical movement and muscle behaviors for those not otherwise capable of such motion. The system may be useful for victims of traumatic brain injury, cerebral damage, and spinal damage. The study of motion recognition supports the notion that the body remembers certain movements and can even regenerate synaptic paths. By visualizing the desired muscle force, the body can be re-trained to make that movement. In the field of orthopedics and prosthetics, embodiments of the present invention can assist patients in understanding their present situation: where they lack muscle force and where they are exerting too much force for compensation reasons. With orthopedics, prosthetics, and amputees, the system can visualize and track muscle deficiencies while training and improving movements.
Yet another embodiment of the present invention combines muscle forces and resultant joint force into a calculation and visualization of the forces acting within joints. This is useful as a training tool to prevent and treat overuse injuries in the workplace, in ergonomics and in sports.
In the context of one embodiment of the present invention, this is the first step in the data analysis pipeline illustrated in FIG. 2: taking the data stream from the motion capture system and calculating the joint angles for every body part; each calculated joint is drawn as a sphere in this drawing. In one embodiment of the present invention, inverse kinematics is used to calculate the joint orientations from the motion capture data before deriving the velocities and accelerations of every body part. The next step in the pipeline is to take the calculated joint angles and derive values of velocities and accelerations for every joint (representing every body part). These velocity and acceleration values are the basis for calculating, through the use of inverse dynamics, the muscle forces and joint torques, which are then passed to the 3D muscle model display as color information.
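As a concrete illustration of the differentiation step, the sketch below (an assumption-level example, not the patented filter) estimates joint velocity and acceleration from the last three skeleton poses by backward finite differences; a practical pipeline would low-pass filter the pose stream first so the second derivative does not amplify marker noise.

```python
import numpy as np

def differentiate(q_prev2, q_prev, q_curr, dt):
    """Backward finite differences giving joint velocity and acceleration
    from the three most recent skeleton poses."""
    q0, q1, q2 = (np.asarray(q, dtype=float) for q in (q_prev2, q_prev, q_curr))
    vel = (q2 - q1) / dt                     # first derivative
    acc = (q2 - 2.0 * q1 + q0) / dt ** 2     # second derivative
    return vel, acc

# Example: a knee flexion angle (radians) sampled at 100 frames per second.
vel, acc = differentiate([0.10], [0.12], [0.15], dt=0.01)
```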
One embodiment of the present invention in relation to medical applications can serve as an example. A development project called “Virtual Gait Lab” is one embodiment of the system operating in the real-time domain. Such an embodiment pertains to the development of a virtual reality system in which the muscle forces and joint torques of the human body can be seen and evaluated in real time in a variety of reproducible conditions.
Among the features of such an embodiment is the ability to enhance diagnostic and therapeutic activities in a range of medical fields. The enhancements are defined by allowing a medical expert team, for the first time, the opportunity to view and analyze muscle force and joint torque patterns as they happen in a controlled real-time environment.
Such a system consists of a combination of an instrumented treadmill that is capable of measuring ground reaction forces, a large screen or projection system for the display of the forces, a real time motion capture system, and the custom computational pipeline translating the captured data into a display of muscle forces and joint torques.
An embodiment of the present invention seeks to develop an interactive virtual real-time muscle model, which can provide patients with means of almost unlimited exploratory behaviors and at the same time provide medical experts accurate measurement tools for monitoring the complex array of forces present in the human body.
Especially in complex balance tasks, the patterns of muscle activation determine whether a subject falls or not. These simulations are aimed at an understanding of normal or pathological response patterns in certain balance tasks.
Such an embodiment offers not only a test and learning environment for patients and doctors, but is also a valuable research environment for motor control. Such an embodiment opens the door to a new type of experiments in which real time muscle force visualization can be offered.
For example, the muscle force tremors observed in Parkinson patients are considered an enigma by many clinicians and human movement scientists. In these patients some visual cues are sufficient to trigger rather normal looking muscle force patterns (for instance those used in walking), while in the absence of such stimuli a pattern cannot even be started. In healthy subjects, the continuous control of muscle force transference during walking is possible through multi-channel sensory input onto a vast library of learned motor patterns. Once it is possible to view the emergence of muscle force patterns in real time, this will lead to fundamental improvement in the understanding and possible treatment of the disease. Such an embodiment will allow a new glimpse into the complexity of the natural processes associated with human motion.
Other examples can be found among patients with peripheral disorders, such as partial paralysis or paresis of a limb. In these situations, gait and balance are compromised both by a partial lack of sensory input and by a lack of muscle coordination. The usual result is that, in order to obtain functional gait and balance, patients find compensations, resulting in deviant movement patterns in healthy parts of the body. Making use of the real time muscle force and joint torque visualization can help to sort out the distinction between compensations and primary disorders.
Another example of an application for one embodiment of the present invention is the prevention and treatment of low back pain through teaching of proper lifting techniques. Real-time calculation and visualization of the forces acting on the intervertebral discs will provide immediate feedback to the patient concerning the quality of their movement.
In many embodiments the muscle forces will be visualized, but certain training applications may provide audio signals driven by muscle force values from the computational pipeline. Other training applications may use muscle force values as input for a virtual environment, causing changes in position of virtual objects or changes in position of the motion platform on which the subject is standing.
The computational pipeline that results in real time muscle force display is flexible and allows forward dynamics simulations to be run at any time during runtime of the system. The flow of movements as an input to the inverse dynamics simulation is stopped during a sequence, and the calculated joint moments are then used as input while the movements become output. Thus forward simulations calculate movements and reaction forces from the moments of force produced around the joints of the subjects. These forward simulations can be visualized as part of the virtual environment and will show what might happen to the patient in hypothetical situations.
The forward and inverse dynamic calculations typically consist of a large set of equations. Depending on the methods used to form these equations, they are expressed in various ways, such as Newton-Euler equations, Lagrange's equations, or Kane's equations. These are called the equations of motion, which contain the relation between the generalized forces applied at the body and the generalized movements. “Generalized” in this respect means that they are formulated along the movement possibilities (or degrees of freedom) of the human body, rather than in terms of three dimensional coordinates in the external world. This implies that most of the generalized forces are actually moments of force (or torque). Equations can be added describing the kind of interaction with the environment, such as contacts with the floor. The equations can be solved simultaneously in a forward simulation, solved algebraically in an inverse simulation or rearranged and solved to do a mixed inverse and forward simulation. In one embodiment of the present invention these computations are all happening in real time.
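As a one-degree-of-freedom illustration of such equations of motion (an assumed single-segment example, not the full-body model of the invention), the joint moment of a rigid segment swinging about a fixed joint can be computed from its generalized coordinate and its derivatives, and the same equation can be rearranged for a forward simulation:

```python
import numpy as np

# Assumed shank-like segment parameters: mass, distance of the segment's
# center of mass from the joint, moment of inertia about that center, gravity.
M, L_C, I_C, G = 3.5, 0.25, 0.06, 9.81

def inverse_dynamics(q, qdd):
    """Joint moment (generalized force) from angle q (rad, from vertical) and
    angular acceleration qdd; velocity terms vanish for a single planar joint."""
    return (I_C + M * L_C ** 2) * qdd + M * G * L_C * np.sin(q)

def forward_dynamics(q, tau):
    """The same equation rearranged: angular acceleration from a joint moment."""
    return (tau - M * G * L_C * np.sin(q)) / (I_C + M * L_C ** 2)

tau = inverse_dynamics(q=0.3, qdd=2.5)    # inverse simulation: motion -> moment
qdd = forward_dynamics(q=0.3, tau=tau)    # forward simulation: moment -> motion
```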
From the dynamic simulation the location of the center of mass is calculated, which, together with the position of the feet, can be used to drive the motion of the platform, if this is required by the virtual environment. The human body model produces the joint moments of force of the subject. Forward dynamics simulation can be started to indicate where weak parts in the motor pattern are located.
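A minimal sketch of the center-of-mass computation follows, assuming segment masses from a body-mass-properties table and segment center positions from the skeleton pose; the numbers are illustrative only.

```python
import numpy as np

def whole_body_com(segment_centers, segment_masses):
    """Mass-weighted mean of the segment centers of mass (meters)."""
    p = np.asarray(segment_centers, dtype=float)   # shape (n_segments, 3)
    m = np.asarray(segment_masses, dtype=float)    # shape (n_segments,)
    return (m[:, None] * p).sum(axis=0) / m.sum()

# Toy example with three segments (trunk, thigh, shank) and assumed masses (kg).
com = whole_body_com([[0.00, 0.00, 1.10],
                      [0.10, 0.00, 0.70],
                      [0.10, 0.00, 0.30]],
                     [35.0, 8.0, 4.0])
```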
The main tasks of the real time computational pipeline are processing the input data coming from the motion capture sensors, mapping the collected data into the above mentioned human body model, and processing the various input and/or computed data depending on the particular case. Other tasks concern the display of real-time 3D graphic representations of muscle forces and joint torques 28, as well as driving the output devices such as a treadmill 38 and a display system 34 as illustrated in FIG. 3.
The user interface for the operator is implemented as the means to communicate with the real time 3D muscle model 26 of FIG. 1D through a custom written software program. As an example of operation, after the type of motions to execute has been decided, the real time 3D muscle model is projected on the screen in front of the subject. The user stands on a platform or treadmill, which can be controlled as part of the system or as a reaction to movements of the subject. The user wears motion capture markers 20, as illustrated in FIGS. 1A and 2, the positions of which are recorded. These are fed into an algorithm that turns them into the degrees of freedom of the human body model, which is filled with the segment masses 22 and inertia of the subject and displayed as color space real time animations of the 3D muscle model of FIG. 1E.
From the skeleton motion and mass properties, the location of the center of mass is also calculated, which, together with the position of the feet, can be used to drive the motion of the treadmill or platform as required by the environment. The human body model 26 produces the joint moments of force of the subject; if necessary, this information can be offered in the projected image to be used by the subject. Forward dynamics simulation can also be computed to indicate where weak parts in the motor pattern are located.
FIGS. 1A-1E illustrate an overview of the real time computational pipeline of one embodiment of the present invention. As illustrated in FIG. 1A, a user is equipped with a number of motion capture sensors or markers 20 attached at various strategic locations of the body. The data from the sensors is received by a motion capture system 32. In a preferred embodiment, the motion capture data set contains the X axis, Y axis, and Z axis positions of the user for the full body, and is transmitted at >100 FPS (frames per second) to the computer 36. The computer 36 interactively operates with the operator's interface 34 and executes the first step in the computational pipeline, converting the positional data in real time to an inverse kinematics skeleton 22 illustrated in FIG. 1B. This data is typically applied to the inverse kinematics skeleton 22 to drive a 3D anatomically correct skeleton 24 in approximately real time (FIG. 1C). Then a 3D anatomically correct muscle layer 26 of FIG. 1D is connected to the human skeleton 24, and the muscle forces and joint torques resulting from the real time computational pipeline are applied as real time animations of colors 28 of the respective muscles in the 3D muscle model of FIG. 1E.
Referring to FIG. 2, a person is outfitted with markers 20 and a template 22 is processed for an initial or balance position. The markers 20 are typically used to record the motion. They are substantially instantaneously captured and used to process a complete template. The template 22 utilizes a template matching algorithm to interpolate for missing or bad marker data. The template matching result is passed to the computational inverse kinematics skeleton 24. Here position data of the markers is mapped in real time to joint orientations in the computational skeleton 24. Using constraint-based rigging, the data in turn drives a geometry (anatomically correct) skeleton. This skeleton is the base for the muscle force visualization layer.
FIG. 3 illustrates an embodiment of the present invention, comprising a computer-based motion capture system linking a treadmill instrumented with force and weight sensors, multiple optical motion capture cameras and a plasma screen or other video display means to a control computer running predictive feedback software for generating an image on the TV of a 3D real time interactive muscle model of a figure on the treadmill, wherein the patient 30 on instrumented treadmill 38 is looking at the 3D real time interactive muscle model 34 of himself, seeing the muscles in action as muscle force is exerted. This interactive muscle force model 34 is calculated by a processor 36 using the method described above, with data obtained from optical motion capture sensors 32 disposed on the patient's body 30, in combination with sensors disposed in the instrumented treadmill 38. In one such embodiment, weight sensors may be disposed in the instrumented treadmill 38, while other sensors such as accelerometers, speedometers, rotation and position sensors may also be included.
FIG. 4 is a block diagrammatic view illustrating the hardware and software elements and possible interconnections of one embodiment of a motion capture system. Computers running predictive feedback loop software are linked to presence sensing technology such as optical motion capture systems, magnetic motion capture systems, inertial motion capture systems, video based motion capture systems, force sensors, weight sensors, temperature sensors, electromyography systems, electroencephalography systems, electrocardiography systems and sound sensors. The computers are also linked to motion generator devices such as passive treadmills, active treadmills, instrumented treadmills, hydraulic motion platforms, electric motion platforms, pneumatic motion platforms, and motion simulators. The presence sensing technology and the motion generation devices are linked by various possible interface devices to the subject. The hardware platform is based on high-end, multi-core, multi-processor workstations.
Referring to FIGS. 3 and 4, in one embodiment the multi-CPU hardware platform 36 is used as the computer means for processing, memory, and interface. Communication with the various peripherals is accomplished using standard high-speed Ethernet, serial, and SCSI connections to dedicated hosts. The dedicated host can be a separate personal computer (PC) or an integrated on-board computer that interfaces with the peripheral equipment. The optical motion capture system of one embodiment includes six cameras, and the data acquisition unit of the optical motion capture system translates the camera input into the desired data set.
The data set of one embodiment is 3D position information of the sensors 20 obtained from a person 30 in real time, and is accessible to a dedicated host that allows for the fast exchange of data with the CPU 36. Data is, in one embodiment, delivered in a custom made file format. Though not limited to this type of system, the chosen main optical capture system of one embodiment is a real-time passive marker system 32, which is readily configurable for many setups. This technology is capable of converting and displaying 3D data coordinates of up to 300 optical markers at >100 Hz. The instrumented treadmill 38 is interconnected to a dedicated host that connects to the CPU for transferring data and control information. The treadmill 38 of one embodiment is capable of measuring real time ground reaction forces through force sensors under the treadmill belt. Its speed is interconnected with the computational pipeline to establish a feedback loop between the motion capture system 32 and the treadmill 38, so that the person remains at the center of the treadmill regardless of changes in walking/running speed. A projection device 34 such as a plasma screen or a video projector and screen is used to display the real time 3D muscle model to the user.
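One way such a centering feedback loop could be realized is sketched below: a simple proportional correction, applied at each control step, speeds the belt up when the subject drifts forward of the treadmill center and slows it when the subject drifts backward. The gain, limits, and update rate are assumed values, not parameters disclosed for the patented system.

```python
def update_belt_speed(current_speed, subject_x, center_x=0.0,
                      gain=0.8, dt=0.01, max_speed=5.0):
    """Return a new belt speed (m/s) nudging the subject back toward center_x.
    subject_x is the fore-aft position from motion capture (m, + = forward)."""
    error = subject_x - center_x
    new_speed = current_speed + gain * error * dt   # proportional correction per step
    return min(max(new_speed, 0.0), max_speed)      # keep within safe limits

speed = update_belt_speed(current_speed=1.2, subject_x=0.15)
```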
FIG. 5 illustrates a flow chart of the operation of a system configured according to one embodiment of the present invention for generating real time muscle force visualization of a subject, using real time motion capture with force sensors processed through a real-time pipeline that utilizes a first human body model lookup table of skeleton and marker configurations, a second human body model lookup table of mass properties, and a third human body model lookup table of muscle paths. Input from the motion capture system 1, in the form of 3D marker coordinates, is used as input for the Kinematic Solver 6. The Kinematic Solver 6 also uses resource files from the first lookup table of skeleton definitions and marker set templates 3, and outputs the current skeleton pose in real time. Real-time low-pass filtering and differentiation processes the changes in skeleton pose into first and second derivatives (velocities and accelerations) that are used as input to the Motion Equations 7. The Kinematic Solver output is also passed to the third lookup table of muscle paths, driving the generation of muscle paths for all respective muscles 5, and the solver outputs the schematic skeleton used for the visualization 9. The Motion Equations 7 also use input from ground reaction forces and other external forces coming from an array of force sensors 2, as well as input from resource files of the second human body model lookup table containing the respective body mass properties 4. The Motion Equations 7 output joint moments to the Optimization process 8. The Optimization process 8 also uses input of muscle lengths and moment arms coming from the respective muscle paths 5 of the muscle paths third lookup table, and outputs the muscle forces used in the real time muscle force visualization 9.
In one embodiment of the invention, the skeleton pose (i.e. the set of generalized coordinates) is calculated in real-time by using the Levenberg-Marquardt nonlinear least-squares algorithm to solve the global optimization problem. The use of the analytical Jacobian matrix makes the computations very fast.
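The sketch below shows the same idea on a toy planar two-joint chain with two markers: the pose is found by Levenberg-Marquardt least squares over the marker residuals, supplying the analytical Jacobian. The chain, segment lengths, and use of SciPy are illustrative assumptions, not the patented full-body solver.

```python
import numpy as np
from scipy.optimize import least_squares

L1, L2 = 0.30, 0.25   # assumed segment lengths (m)

def predicted_markers(q):
    """Forward kinematics: marker positions at the ends of the two segments."""
    q1, q2 = q
    elbow = np.array([L1 * np.cos(q1), L1 * np.sin(q1)])
    wrist = elbow + np.array([L2 * np.cos(q1 + q2), L2 * np.sin(q1 + q2)])
    return np.concatenate([elbow, wrist])

def residuals(q, measured):
    return predicted_markers(q) - measured

def jacobian(q, measured):
    """Analytical derivatives of the marker coordinates w.r.t. the pose."""
    q1, q2 = q
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([[-L1 * s1,             0.0],
                     [ L1 * c1,             0.0],
                     [-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

measured = predicted_markers(np.array([0.6, 0.4])) + 0.002   # "captured" markers
fit = least_squares(residuals, x0=np.zeros(2), jac=jacobian,
                    args=(measured,), method='lm')
pose = fit.x   # recovered generalized coordinates (joint angles, rad)
```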
In one embodiment of the invention, the equations of motion have been produced via software that creates C code for the forward kinematics equations. Those equations generate the coordinates of markers on the body from the generalized coordinates of the skeleton. The derivatives of the forward kinematics equations, forming a Jacobian matrix, are generated via symbolic differentiation. Finally, one embodiment of the present invention translates these equations into computer code, which is incorporated into the computational pipeline that executes the calculations at run time.
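The mechanism can be illustrated with a symbolic-math sketch: forward kinematics of a single marker on an assumed planar two-joint chain is differentiated symbolically, and the resulting Jacobian is emitted as C source. SymPy is used here as a stand-in for whatever code-generation software the embodiment employs.

```python
import sympy as sp
from sympy.utilities.codegen import codegen

# Symbolic forward kinematics of one marker on a planar two-joint chain
# (illustrative only; the real pipeline covers the full body).
q1, q2, l1, l2 = sp.symbols('q1 q2 l1 l2')
marker = sp.Matrix([l1 * sp.cos(q1) + l2 * sp.cos(q1 + q2),
                    l1 * sp.sin(q1) + l2 * sp.sin(q1 + q2)])

J = marker.jacobian([q1, q2])        # symbolic differentiation -> Jacobian

# Translate the symbolic Jacobian into compilable C source and a header.
[(c_name, c_code), (h_name, h_header)] = codegen(('marker_jacobian', J),
                                                 language='C')
print(c_code)
```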
In one embodiment, the muscle forces are the solution of a static optimization problem with the general form: minimize the sum of normalized muscle forces raised to the Nth power, while requiring that all muscle forces are non-negative and that the set of muscle forces, multiplied by their respective moment arms, is identical to the joint torques solved by the inverse dynamics equations. Normalized muscle force is defined as the muscle force relative to the maximal force capacity of the muscle. Moment arm is the distance from the muscle force vector to the instantaneous center of rotation of a particular joint, and is mathematically calculated as the derivative of muscle length with respect to the joint's generalized coordinate. Traditional optimization methods are too slow for real-time applications. For N=2, which is commonly used in muscle force estimation, a solution is obtained in real time using a neural network algorithm for quadratic programming.
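A generic version of this static optimization can be sketched as follows; a general-purpose SLSQP solve is used here purely as a stand-in for the real-time neural-network quadratic-programming algorithm referred to above, and the moment arms, torque, and force capacities are assumed toy values.

```python
import numpy as np
from scipy.optimize import minimize

def static_optimization(R, tau, f_max, N=2):
    """Minimize sum((f/f_max)**N) subject to f >= 0 and R @ f = tau,
    where R holds the muscle moment arms and tau the joint torques."""
    cost = lambda f: np.sum((f / f_max) ** N)
    constraint = {'type': 'eq', 'fun': lambda f: R @ f - tau}
    bounds = [(0.0, None)] * len(f_max)          # non-negative muscle forces
    result = minimize(cost, x0=np.ones(len(f_max)), method='SLSQP',
                      bounds=bounds, constraints=[constraint])
    return result.x

# Toy example: one joint torque of 30 N*m shared by two flexors whose moment
# arms are 0.05 m and 0.03 m and whose maximal forces are 2000 N and 1500 N.
R = np.array([[0.05, 0.03]])
forces = static_optimization(R, tau=np.array([30.0]),
                             f_max=np.array([2000.0, 1500.0]))
```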
Motion capture is a phrase used to describe a variety of techniques for capturing the movement of a body or object, and the technology has existed for many years in a variety of applications. The aim of motion capture is to create three-dimensional (3D) animation and natural simulations in a performance oriented manner. In the entertainment industry, motion capture allows an operator to use computer-generated characters. Motion capture is used to create complex natural motion using the full range of human movements, and also allows inanimate objects to move realistically. Some motion capture systems provide real-time feedback of the data and allow the operator to immediately determine whether the motion works sufficiently. Motion capture can be applied to full body motion as well as to hand animation, facial animation and real time lip sync. Motion capture is also used in medical, simulation, engineering and ergonomic applications, and in feature films, advertising, TV and 3D computer games. In the context of the present invention, motion capture is used to output 3D XYZ marker positions.
Force sensors are used in many industries such as automotive, robotics and various engineering applications. Typically a force sensor measures the total force applied to it, which can include vertical force as well as horizontal and shear force components. In the context of the present invention, force sensors are used to measure ground reaction forces from the treadmill a person is standing, walking or running on. For example, the treadmill of one embodiment is capable of measuring real time ground reaction forces through force sensors under the treadmill belt. Its speed is interconnected with the computational pipeline to establish a feedback loop between the motion capture system and the treadmill, so that the person remains at the center of the treadmill regardless of changes in walking/running speed.
Skeleton definition and marker set templates 3 are resource files used in the computational pipeline of the current invention. People differ in size and weight, and a skeleton template is selected from a group of skeleton templates to get the best match for every person. Marker templates are used to define where the markers are placed on the human body. Typically, such markers are disposed at every joint of the body.
Body mass properties 4 pertain to the weight of different body parts of different people. People vary in weight, and this has ramifications for the muscle force they exert to generate specific motions. The mass properties are used as a resource for the correct real time force computations.
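A minimal sketch of how such look-up resources might be organized and queried is shown below; the template names and segment-mass fractions are assumed illustrative values in the style of standard anthropometric tables, not data from the patent.

```python
# Assumed resource tables: skeleton templates keyed by stature, and segment
# masses expressed as fractions of total body mass.
SKELETON_TEMPLATES = {1.60: 'small', 1.75: 'medium', 1.90: 'tall'}
MASS_FRACTIONS = {'trunk': 0.497, 'thigh': 0.100, 'shank': 0.047}

def select_skeleton_template(height_m):
    """Pick the template whose reference stature is closest to the subject."""
    closest = min(SKELETON_TEMPLATES, key=lambda h: abs(h - height_m))
    return SKELETON_TEMPLATES[closest]

def segment_masses(total_mass_kg):
    """Scale tabulated mass fractions by the subject's total body mass."""
    return {seg: frac * total_mass_kg for seg, frac in MASS_FRACTIONS.items()}

template = select_skeleton_template(1.78)   # -> 'medium'
masses = segment_masses(82.0)               # per-segment masses in kg
```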
Muscle paths 5 are utilized to compensate for differences in build between users. Variations in length and width between subjects have ramifications for the force computations, as a longer muscle will exert a different force to generate the same motion than a shorter muscle; the placement of the ligaments will also differ between people. In the context of one embodiment of the present invention, muscle paths are used to assist the computations of muscle forces and joint torques.
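Because the moment arm is defined above as the derivative of muscle length with respect to the joint's generalized coordinate, it can be estimated numerically from any muscle-path length function, as in the sketch below (the toy path and sign convention are assumptions for illustration).

```python
def moment_arm(muscle_length_fn, q, dq=1e-4):
    """Central-difference estimate of dL/dq, the moment arm of a muscle whose
    path length (m) is given as a function of the joint coordinate q (rad)."""
    return (muscle_length_fn(q + dq) - muscle_length_fn(q - dq)) / (2.0 * dq)

# Toy muscle path: a constant tendon length plus wrapping around a 0.04 m pulley.
length = lambda q: 0.25 + 0.04 * q
r = moment_arm(length, q=0.5)     # ~0.04 m
```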
Kinematic solver 6 provides for the calculation of joint orientation using inverse kinematics. Kinematics is the process of calculating the position in space of the end of a linked structure, given the angles of all the joints. Inverse kinematics does the reverse: given the end point of the structure, what angles do the joints need to be in to achieve that end point? This process is used in robotics, 3D computer animation and some engineering applications. In the context of one embodiment of the present invention it is a single step in the data analysis pipeline, taking the data stream from the motion capture system and calculating the joint angles for every body part. In the context of one embodiment of the present invention, inverse kinematics is used to calculate the joint orientation from the motion capture data, and thereby to convert XYZ positional data to rotation angle data of the joints in degrees or radians.
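For a chain simple enough to solve in closed form, the inverse-kinematics question posed above has an analytic answer; the planar two-segment example below (with assumed segment lengths) returns a pair of joint angles that reach a given end point.

```python
import numpy as np

def two_link_ik(x, y, l1=0.30, l2=0.25):
    """Joint angles (rad) placing the end of a planar two-segment chain at
    (x, y); returns one of the two mirror-image solutions when reachable."""
    cos_q2 = (x * x + y * y - l1 ** 2 - l2 ** 2) / (2.0 * l1 * l2)
    q2 = np.arccos(np.clip(cos_q2, -1.0, 1.0))          # distal joint angle
    q1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(q2), l1 + l2 * np.cos(q2))
    return q1, q2

q1, q2 = two_link_ik(0.35, 0.20)
```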
Equations used in the calculation of motion and force are known to those skilled in the physical sciences, or are readily derived from equations well known in the field of physics. Motion equations 7 are sets of mathematical equations designed to combine incoming streams of kinematics data with marker and skeleton templates and convert those to forward and inverse dynamics data. These can be Lagrangian equation sets, Kane's equation sets, or Newton-Euler equation sets. In the context of one embodiment of the present invention, the motion equations 7 provide the relationship between generalized forces applied at the body and generalized movements. "Generalized" in this respect means that they are formulated along the movement possibilities (or degrees of freedom) of the human body, rather than in terms of forces in the external world. This implies that most of the generalized forces are actually moments of force (or torque). Equations 7 can be added describing the kind of interaction with the environment, such as contacts with the floor. The equations 7 can be solved simultaneously in a forward simulation, solved algebraically in an inverse simulation, or rearranged and solved to do a mixed inverse and forward simulation. In one embodiment of the present invention these computations all happen in real time. In one embodiment, effective delay is eliminated using efficient algorithms, achieving a minimal real-time sampling rate greater than 30 Hz, a standard familiar to those in the television and broadcast industries. One skilled in the art will readily appreciate that faster rates would likewise be acceptable or desirable in some applications.
An optimization process 8 uses the input of muscle lengths and moment arms coming from the respective muscle paths to output muscle forces and joint torques. The optimization 8 of the data contains routines for data normalization and several real time software filters.
Real time muscle force visualization 9 is driven by the inputs of muscle forces and joint torques, which are used to drive color animation of the respective muscles displayed as a 3D human body model on screen. The color brightness and hue correlate with the muscle force amplitude, gain and activation patterns. The user and operator can see a real time animation of the muscle forces active in the human body at any given time.
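One simple way to realize such a mapping is sketched below: normalized muscle force drives both hue (cool to warm) and brightness. The specific color ramp is an assumed illustration, not the patented color scheme.

```python
import colorsys

def force_to_rgb(force, f_max):
    """Map a muscle force to an RGB color: hue sweeps from blue (idle) toward
    red (maximal) while brightness grows with activation."""
    activation = min(max(force / f_max, 0.0), 1.0)
    hue = (1.0 - activation) * 2.0 / 3.0      # 0.67 = blue ... 0.0 = red
    value = 0.4 + 0.6 * activation            # brighter when more active
    return colorsys.hsv_to_rgb(hue, 1.0, value)

rgb = force_to_rgb(force=450.0, f_max=1500.0)   # a moderately active muscle
```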
Various embodiments of the present invention provide applications adaptable for other market segments. Sports and fitness is one such market. One embodiment of the present invention provides a tool that is useful in numerous applications, including the fitness industry. This system allows the visualization of muscle forces for any given exercise in real-time. The system can be used to enhance and improve muscle forces by providing a realistic visualization of the given forces and torques. The present system allows the user to see the force transference to various muscles in the body and achieve a desired effect. The motion capture system instantly records the user's motion and provides immediate muscle force visualizations.
One embodiment of the present invention may have an enormous impact in the medical community by making it possible to view muscle forces and torques in real-time. It can assist and improve the quality of life of many patients and allow the perception of physical movement and muscle behaviors for those not otherwise capable of such motion. The system is useful for victims of traumatic brain injury, cerebral damage, and spinal damage. The study of motion recognition supports the notion that the body remembers certain movements and can even regenerate synaptic paths. By visualizing the desired muscle force, the body can be re-trained to make that movement. In the field of orthopedics and prosthetics, the present invention can assist patients in understanding their present situation: where they lack muscle force and where they are exerting too much force for compensation reasons. With orthopedics, prosthetics, and amputees, the system can visualize and track muscle deficiencies while training and improving movements. One embodiment of the present invention in relation to medical applications can serve as an example. One development project called "Virtual Gait Lab" is one embodiment of the system operating in the real-time domain. This project pertains to the development of a virtual reality system in which the muscle forces and joint torques of the human body can be seen and evaluated in real time in a variety of reproducible conditions. One of the major objectives of such a project is to enhance diagnostic and therapeutic activities in a range of medical fields. The enhancements are defined by allowing a medical expert team, for the first time, the opportunity to view and analyze muscle force and joint torque patterns as they happen in a controlled real-time environment.
In one embodiment such as that illustrated in FIG. 3, the system consists of a combination of an instrumented treadmill 38 that is capable of measuring ground reaction forces, a large screen or projection system for the display of the forces 34, real time motion capture system 32 and the custom computational pipeline 36 translating the capture data to muscle forces and joint torques display.
Various embodiments of the present invention seek to develop an interactive virtual real-time muscle model, which can provide patients with means of almost unlimited exploratory behaviors and at the same time provide medical experts accurate measurement tools for monitoring the complex array of forces present in the human body. Especially in complex balance tasks, the patterns of muscle activation determine whether a subject falls or not. These simulations are aimed at an understanding of normal or pathological response patterns in certain balance tasks. Such an embodiment offers not only a test and learning environment for patients and doctors, but is also a valuable research environment for motor control. Such an embodiment opens the door to new types of experiments in which real time muscle force visualization can be offered. For example, the muscle force tremors observed in Parkinson patients are considered an enigma by many clinicians and human movement scientists. In these patients some visual cues are sufficient to trigger rather normal looking muscle force patterns (for instance those used in walking), while in the absence of such stimuli a pattern cannot even be started. In healthy subjects, the continuous control of muscle force transference during walking is possible through multi-channel sensory input onto a vast library of learned motor patterns. Once it is possible to view the emergence of muscle force patterns in real time, this will lead to fundamental improvement in the understanding and possible treatment of the disease. Such an embodiment will allow a new glimpse into the complexity of the natural processes associated with human motion. Other examples can be found among patients with peripheral disorders, such as partial paralysis or paresis of a limb. In these situations, gait and balance are compromised both by a partial lack of sensory input and by a lack of muscle coordination. The usual result is that, in order to obtain functional gait and balance, patients find compensations, resulting in deviant movement patterns in healthy parts of the body. Making use of the real time muscle force and joint torque visualization can help to sort out the distinction between compensations and primary disorders.
One embodiment of the invention is a new principle in real time visualization, where muscle force is seen and evaluated in a totally new way. This principle establishes a mechanism to achieve a visualization state whereby the persons involved can see immediately which muscles they are using and to what extent.
One embodiment of the present invention is a muscle force processing system comprising a processing means and a motion capture system connected to the processing means. The motion capture data is taken from a plurality of motion sensors and is processed in real-time. There is a computational pipeline connected to the processing means, wherein the resulting data is also processed in real-time and is visualized in real time through color space changes in a 3D muscle model showing the muscle forces and joint torques. There is also a means of interfacing with the muscle model through a runtime control input. A further embodiment is an instrumented treadmill capable of measuring ground reaction forces, wherein the measurements of said ground reaction forces are integrated in the computational pipeline, resulting in a real time view of muscle forces and joint torques. A further embodiment is a 3D interactive muscle model further comprising an inverse kinematics skeleton layer, a 3D geometry anatomically correct skeleton layer and an anatomically correct muscle model layer. An additional embodiment is a real time computational pipeline further comprising a memory means for recording the motion capture data and processing the data in real time through the said layers of the real time processing pipeline. Another embodiment is a method and system for real time visualization, registration, evaluation, and correction of muscle forces and joint torques in the human body, wherein the full process happens in real time.
The foregoing description of the embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of this disclosure. It is intended that the scope of the invention be limited not by this detailed description, but rather by the claims appended hereto.

Claims (10)

1. A method for computing, measuring, recording and viewing in real time muscle forces and joint moments of a body in motion, employing a system for visual display and output of anatomical forces, said system comprising:
a motion capture system for collecting and recording real time position data;
a computer receiving data from said motion capture system and deriving rotational data therefrom;
a first look up table comprising biomechanical body skeleton definition and marker set templates, a second look up table comprising body mass properties, and a third look up table comprising muscle paths;
a computational pipeline disposed on said computer; and
a computer display unit for displaying color animation,
said method comprising:
placing the body in motion within range of the motion capture system;
collecting and recording in real time three dimensional coordinates of selected marker points on the body in motion;
calculating in real time a skeleton pose of the body in motion from the three dimensional coordinates and from said first look up table;
calculating in real time joint moments of the body using the skeleton pose of the body in motion, a first derivative velocity of the skeleton pose, a second derivative acceleration of the skeleton pose, force vectors representing external forces affecting the body in motion, and body mass properties from said second look up table;
calculating in real time muscle lengths and muscle moment arms using the skeleton pose and said third look up table;
calculating in real time muscle forces using the joint moments, muscle lengths and muscle moment arms, by using quadratic programming and a neural network optimization algorithm for said quadratic programming; and
generating and displaying in real time from the calculated muscle forces while the body in motion remains in motion, a color animation of a muscle model whereby a relative degree of muscle force of the body in motion is displayed by a relative coloring of the respective muscles of the color animation.
2. The method of claim 1, wherein said color animation of a muscle model further comprises joint moments presented by scale and color of three dimensional vector animations.
3. The method of claim 1, further comprising:
placing at least one motion capture marker on said body defining said three dimensional coordinates;
collecting and recording real time positional data from said at least one marker; and
deriving rotational data from said at least one of the motion capture markers.
4. The method of claim 1, wherein said body in motion is a human body, and said color animation is being displayed to said body in motion in real time as visual feedback.
5. The method of claim 1, wherein said motion capture system comprises sensors for said collecting and recording of real time position data, the sensors selected from the group of sensors consisting of optical, magnetic, inertial, and video based sensors.
6. The method of claim 1, wherein said system for visual display and output of anatomical forces comprises an instrumented platform supporting said body in motion.
7. The method of claim 2 wherein said muscle forces and said joint moments are displayed on a representation of a human body.
8. The method of claim 4, further comprising:
selecting a biomechanical body skeleton definition and marker set template approximating a build of the human body.
9. The method of claim 1, wherein said computation pipeline comprises use of a Jacobian matrix to facilitate said computing in real time.
10. A method for viewing in real time muscle forces and joint moments of a body in motion employing a system for visual display and output of anatomical forces, said system comprising:
a motion capture system for collecting and recording real time position data;
a computer receiving data from said motion capture system and deriving rotational data therefrom;
a first look up table comprising biomechanical body skeleton definition and marker set templates, a second look up table comprising body mass properties, and a third look up table comprising muscle paths;
a computational pipeline disposed on said computer; and
a computer display unit for displaying color animation;
said method comprising:
placing a body in motion within range of said motion capture system;
calculating in real time a skeleton pose of the body in motion from three dimensional coordinates of selected marker points on the body in motion and from said first look up table;
calculating in real time joint moments of the body in motion using the skeleton pose of the body, a first derivative velocity of the skeleton pose, a second derivative acceleration of the skeleton pose, force vectors representing external forces affecting the body in motion, and body mass properties from said second look up table;
calculating in real time muscle lengths and muscle moment arms using the skeleton pose and said third look up table;
calculating in real time muscle forces of the body in motion using the joint moment, muscle lengths and muscle moment arms, and optimization algorithms;
generating and displaying in real time from the calculated muscle forces a color animation of a muscle model whereby a relative degree of muscle force of the body in motion is presented by relative coloring of the respective muscles in the color animation; and
having an operator interact with the system during system runtime.
US12/251,688 2007-03-07 2008-10-15 Method for real time interactive visualization of muscle forces and joint torques in the human body Active 2027-10-01 US7931604B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/251,688 US7931604B2 (en) 2007-03-07 2008-10-15 Method for real time interactive visualization of muscle forces and joint torques in the human body

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US89339407P 2007-03-07 2007-03-07
US11/832,726 US20080221487A1 (en) 2007-03-07 2007-08-02 Method for real time interactive visualization of muscle forces and joint torques in the human body
US12/251,688 US7931604B2 (en) 2007-03-07 2008-10-15 Method for real time interactive visualization of muscle forces and joint torques in the human body

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/832,726 Continuation US20080221487A1 (en) 2007-03-07 2007-08-02 Method for real time interactive visualization of muscle forces and joint torques in the human body

Publications (2)

Publication Number Publication Date
US20090082701A1 US20090082701A1 (en) 2009-03-26
US7931604B2 true US7931604B2 (en) 2011-04-26

Family

ID=39739017

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/832,726 Abandoned US20080221487A1 (en) 2007-03-07 2007-08-02 Method for real time interactive visualization of muscle forces and joint torques in the human body
US12/251,688 Active 2027-10-01 US7931604B2 (en) 2007-03-07 2008-10-15 Method for real time interactive visualization of muscle forces and joint torques in the human body

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/832,726 Abandoned US20080221487A1 (en) 2007-03-07 2007-08-02 Method for real time interactive visualization of muscle forces and joint torques in the human body

Country Status (6)

Country Link
US (2) US20080221487A1 (en)
EP (1) EP2120710B1 (en)
JP (1) JP5016687B2 (en)
CA (1) CA2680462C (en)
ES (1) ES2679125T3 (en)
WO (1) WO2008109248A2 (en)

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110052005A1 (en) * 2009-08-28 2011-03-03 Allen Joseph Selner Designation of a Characteristic of a Physical Capability by Motion Analysis, Systems and Methods
US20110069888A1 (en) * 2009-09-22 2011-03-24 Samsung Electronics Co., Ltd. Image processing apparatus and method
US20110091070A1 (en) * 2009-09-15 2011-04-21 Sony Corporation Combining multi-sensory inputs for digital animation
US20110137138A1 (en) * 2008-05-29 2011-06-09 Per Johansson Patient Management Device, System And Method
US20110199291A1 (en) * 2010-02-16 2011-08-18 Microsoft Corporation Gesture detection based on joint skipping
US20120108909A1 (en) * 2010-11-03 2012-05-03 HeadRehab, LLC Assessment and Rehabilitation of Cognitive and Motor Functions Using Virtual Reality
US20120154409A1 (en) * 2010-12-15 2012-06-21 Microsoft Corporation Vertex-baked three-dimensional animation augmentation
US8315822B2 (en) 2011-04-20 2012-11-20 Bertec Corporation Force measurement system having inertial compensation
US8315823B2 (en) 2011-04-20 2012-11-20 Bertec Corporation Force and/or motion measurement system having inertial compensation and method thereof
US8363891B1 (en) * 2012-03-26 2013-01-29 Southern Methodist University System and method for predicting a force applied to a surface by a body during a movement
US20130059281A1 (en) * 2011-09-06 2013-03-07 Fenil Shah System and method for providing real-time guidance to a user
US20140028539A1 (en) * 2012-07-29 2014-01-30 Adam E. Newham Anatomical gestures detection system using radio signals
US8696450B2 (en) 2011-07-27 2014-04-15 The Board Of Trustees Of The Leland Stanford Junior University Methods for analyzing and providing feedback for improved power generation in a golf swing
US8704855B1 (en) 2013-01-19 2014-04-22 Bertec Corporation Force measurement system having a displaceable force measurement assembly
US8764532B1 (en) 2012-11-07 2014-07-01 Bertec Corporation System and method for fall and/or concussion prediction
US8847989B1 (en) 2013-01-19 2014-09-30 Bertec Corporation Force and/or motion measurement system and a method for training a subject using the same
US9081436B1 (en) 2013-01-19 2015-07-14 Bertec Corporation Force and/or motion measurement system and a method of testing a subject using the same
US9168420B1 (en) 2012-01-11 2015-10-27 Bertec Corporation Force measurement system
US9275488B2 (en) 2012-10-11 2016-03-01 Sony Corporation System and method for animating a body
US9504909B2 (en) 2011-05-05 2016-11-29 Qualcomm Incorporated Method and apparatus of proximity and stunt recording for outdoor gaming
US9526451B1 (en) 2012-01-11 2016-12-27 Bertec Corporation Force measurement system
US9526443B1 (en) 2013-01-19 2016-12-27 Bertec Corporation Force and/or motion measurement system and a method of testing a subject
US9588582B2 (en) 2013-09-17 2017-03-07 Medibotics Llc Motion recognition clothing (TM) with two different sets of tubes spanning a body joint
US9770203B1 (en) 2013-01-19 2017-09-26 Bertec Corporation Force measurement system and a method of testing a subject
US9916011B1 (en) 2015-08-22 2018-03-13 Bertec Corporation Force measurement system that includes a force measurement assembly, a visual display device, and one or more data processing devices
US10010286B1 (en) 2013-01-19 2018-07-03 Bertec Corporation Force measurement system
US10216262B1 (en) 2015-08-22 2019-02-26 Bertec Corporation Force management system that includes a force measurement assembly, a visual display device, and one or more data processing devices
US10231662B1 (en) 2013-01-19 2019-03-19 Bertec Corporation Force measurement system
US10390736B1 (en) 2015-08-22 2019-08-27 Bertec Corporation Force measurement system that includes a force measurement assembly, at least one visual display device, and one or more data processing devices
US10413230B1 (en) 2013-01-19 2019-09-17 Bertec Corporation Force measurement system
US10555688B1 (en) 2015-08-22 2020-02-11 Bertec Corporation Measurement system that includes at least one measurement assembly, a head-mounted visual display device, and a data processing device
US10588546B2 (en) 2013-06-26 2020-03-17 The Cleveland Clinic Foundation Systems and methods to assess balance
US10646153B1 (en) 2013-01-19 2020-05-12 Bertec Corporation Force measurement system
US10860843B1 (en) 2015-08-22 2020-12-08 Bertec Corporation Measurement system that includes at least one measurement assembly, a head-mounted visual display device, and a data processing device
US10856796B1 (en) 2013-01-19 2020-12-08 Bertec Corporation Force measurement system
US20200401224A1 (en) * 2019-06-21 2020-12-24 REHABILITATION INSTITUTE OF CHICAGO d/b/a Shirley Ryan AbilityLab Wearable joint tracking device with muscle activity and methods thereof
US10981047B2 (en) 2018-05-29 2021-04-20 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11052288B1 (en) 2013-01-19 2021-07-06 Bertec Corporation Force measurement system
US11167172B1 (en) 2020-09-04 2021-11-09 Curiouser Products Inc. Video rebroadcasting with multiplexed communications and display via smart mirrors
US11301045B1 (en) 2015-08-22 2022-04-12 Bertec Corporation Measurement system that includes at least one measurement assembly, a visual display device, and at least one data processing device
US11311209B1 (en) 2013-01-19 2022-04-26 Bertec Corporation Force measurement system and a motion base used therein
US11465030B2 (en) 2020-04-30 2022-10-11 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11530957B2 (en) * 2018-12-13 2022-12-20 Hyundai Motor Company Method for predicting clamp force using convolutional neural network
US11529025B2 (en) 2012-10-11 2022-12-20 Roman Tsibulevskiy Technologies for computing
US11540744B1 (en) 2013-01-19 2023-01-03 Bertec Corporation Force measurement system
US11857331B1 (en) 2013-01-19 2024-01-02 Bertec Corporation Force measurement system
US11992746B1 (en) 2015-08-22 2024-05-28 Bertec Corporation Hybrid display system for a force measurement assembly, an exercise device, or an interactive game
US12094360B1 (en) 2019-08-07 2024-09-17 University Of South Florida Apparatuses and methods for practicing a reduction procedure for treating radial head subluxation

Families Citing this family (119)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8352400B2 (en) 1991-12-23 2013-01-08 Hoffberg Steven M Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore
US7966078B2 (en) 1999-02-01 2011-06-21 Steven Hoffberg Network media appliance system and method
US7782358B2 (en) * 2007-06-08 2010-08-24 Nokia Corporation Measuring human movements—method and apparatus
ATE512624T1 (en) * 2007-11-14 2011-07-15 Zebris Medical Gmbh ARRANGEMENT FOR GAIT ANALYSIS
US20090144092A1 (en) * 2007-12-04 2009-06-04 Terence Vardy Collection of medical data
US8620635B2 (en) 2008-06-27 2013-12-31 Microsoft Corporation Composition of analytics models
US8411085B2 (en) 2008-06-27 2013-04-02 Microsoft Corporation Constructing view compositions for domain-specific environments
US7980997B2 (en) * 2008-10-23 2011-07-19 University Of Southern California System for encouraging a user to perform substantial physical activity
US8314793B2 (en) 2008-12-24 2012-11-20 Microsoft Corporation Implied analytical reasoning and computation
US7996793B2 (en) 2009-01-30 2011-08-09 Microsoft Corporation Gesture recognizer system architecture
US20100199231A1 (en) 2009-01-30 2010-08-05 Microsoft Corporation Predictive determination
US8444564B2 (en) * 2009-02-02 2013-05-21 Jointvue, Llc Noninvasive diagnostic system
WO2010099361A1 (en) * 2009-02-25 2010-09-02 Sherlock Nmd, Llc Devices, systems and methods for capturing biomechanical motion
TWI396110B (en) * 2009-06-18 2013-05-11 Nat Univ Chin Yi Technology From the plane image of the establishment of three human space center of gravity and computer program products
US9330503B2 (en) 2009-06-19 2016-05-03 Microsoft Technology Licensing, Llc Presaging and surfacing interactivity within data visualizations
US8493406B2 (en) 2009-06-19 2013-07-23 Microsoft Corporation Creating new charts and data visualizations
US8531451B2 (en) 2009-06-19 2013-09-10 Microsoft Corporation Data-driven visualization transformation
US8788574B2 (en) 2009-06-19 2014-07-22 Microsoft Corporation Data-driven visualization of pseudo-infinite scenes
US8866818B2 (en) 2009-06-19 2014-10-21 Microsoft Corporation Composing shapes and data series in geometries
US8692826B2 (en) 2009-06-19 2014-04-08 Brian C. Beckman Solver-based visualization framework
US8412669B2 (en) * 2009-08-21 2013-04-02 Life Modeler, Inc. Systems and methods for determining muscle force through dynamic gain optimization of a muscle PID controller
US8527217B2 (en) * 2009-09-08 2013-09-03 Dynamic Athletic Research Institute, Llc Apparatus and method for physical evaluation
US8352397B2 (en) 2009-09-10 2013-01-08 Microsoft Corporation Dependency graph in data-driven model
DE112010003667A5 (en) * 2009-09-16 2012-12-06 Otto-Von-Guericke-Universität Magdeburg Medizinische Fakultät Device for determining vertebral spacing of the spinal column of a patient
US9043296B2 (en) 2010-07-30 2015-05-26 Microsoft Technology Licensing, Llc System of providing suggestions based on accessible and contextual information
US9283429B2 (en) 2010-11-05 2016-03-15 Nike, Inc. Method and system for automated personal training
US9977874B2 (en) 2011-11-07 2018-05-22 Nike, Inc. User interface for remote joint workout session
EP2635988B1 (en) 2010-11-05 2020-04-29 NIKE Innovate C.V. Method and system for automated personal training
US9457256B2 (en) 2010-11-05 2016-10-04 Nike, Inc. Method and system for automated personal training that includes training programs
US10420982B2 (en) 2010-12-13 2019-09-24 Nike, Inc. Fitness training system with energy expenditure calculation that uses a form factor
US20120165703A1 (en) * 2010-12-22 2012-06-28 Paul William Bottum Preempt Muscle Map Screen
US8840527B2 (en) * 2011-04-26 2014-09-23 Rehabtek Llc Apparatus and method of controlling lower-limb joint moments through real-time feedback training
TWI534756B (en) * 2011-04-29 2016-05-21 國立成功大學 Motion-coded image, producing module, image processing module and motion displaying module
US8771206B2 (en) 2011-08-19 2014-07-08 Accenture Global Services Limited Interactive virtual care
CN102426709B (en) * 2011-08-19 2013-12-25 北京航空航天大学 Real-time motion synthesis method based on fast inverse kinematics
ITFI20110232A1 (en) * 2011-10-21 2013-04-22 Zionamento Sant Anna METHOD FOR CALCULATING THE MASS CENTER FOR A UMANOID PLATFORM
US9811639B2 (en) 2011-11-07 2017-11-07 Nike, Inc. User interface and fitness meters for remote joint workout session
JP6185053B2 (en) 2012-06-04 2017-08-23 ナイキ イノベイト シーブイ Combined score including fitness subscore and athletic subscore
US9478058B2 (en) * 2012-08-06 2016-10-25 CELSYS, Inc. Object correcting apparatus and method and computer-readable recording medium
US20210001170A1 (en) * 2012-08-31 2021-01-07 Blue Goji Llc Apparatus for natural torso and limbs tracking and feedback for electronic interaction
US9504336B2 (en) * 2013-02-13 2016-11-29 Jon Dodd Configurable bed
ES2432228B1 (en) * 2013-02-15 2014-11-18 Asociación Instituto De Biomecánica De Valencia Procedure and installation to characterize the support pattern of a subject
US20140267611A1 (en) * 2013-03-14 2014-09-18 Microsoft Corporation Runtime engine for analyzing user motion in 3d images
US10402517B2 (en) * 2013-06-26 2019-09-03 Dassault Systémes Simulia Corp. Musculo-skeletal modeling using finite element analysis, process integration, and design optimization
US20150124566A1 (en) 2013-10-04 2015-05-07 Thalmic Labs Inc. Systems, articles and methods for wearable electronic devices employing contact sensors
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US10188309B2 (en) 2013-11-27 2019-01-29 North Inc. Systems, articles, and methods for electromyography sensors
US10042422B2 (en) 2013-11-12 2018-08-07 Thalmic Labs Inc. Systems, articles, and methods for capacitive electromyography sensors
WO2015054426A1 (en) * 2013-10-08 2015-04-16 Ali Kord Single-camera motion capture system
WO2015061750A1 (en) * 2013-10-24 2015-04-30 Ali Kord Motion capture system
US20160262685A1 (en) 2013-11-12 2016-09-15 Highland Instruments, Inc. Motion analysis systemsand methods of use thereof
WO2015135981A1 (en) * 2014-03-12 2015-09-17 Movotec A/S System, apparatus and method for measurement of muscle stiffness
US9858391B2 (en) * 2014-04-17 2018-01-02 The Boeing Company Method and system for tuning a musculoskeletal model
US9880632B2 (en) 2014-06-19 2018-01-30 Thalmic Labs Inc. Systems, devices, and methods for gesture identification
US10140745B2 (en) * 2015-01-09 2018-11-27 Vital Mechanics Research Inc. Methods and systems for computer-based animation of musculoskeletal systems
JP6589354B2 (en) * 2015-04-23 2019-10-16 学校法人立命館 Lower limb training equipment
US9836118B2 (en) * 2015-06-16 2017-12-05 Wilson Steele Method and system for analyzing a movement of a person
KR102459677B1 (en) 2015-11-05 2022-10-28 삼성전자주식회사 Method and apparatus for learning algorithm
US20170177833A1 (en) * 2015-12-22 2017-06-22 Intel Corporation Smart placement of devices for implicit triggering of feedbacks relating to users' physical activities
US11511156B2 (en) 2016-03-12 2022-11-29 Arie Shavit Training system and methods for designing, monitoring and providing feedback of training
US10625137B2 (en) * 2016-03-18 2020-04-21 Icon Health & Fitness, Inc. Coordinated displays in an exercise device
US11219575B2 (en) * 2016-03-23 2022-01-11 Zoll Medical Corporation Real-time kinematic analysis during cardio-pulmonary resuscitation
WO2018022658A1 (en) 2016-07-25 2018-02-01 Ctrl-Labs Corporation Adaptive system for deriving control signals from measurements of neuromuscular activity
US11179066B2 (en) 2018-08-13 2021-11-23 Facebook Technologies, Llc Real-time spike detection and identification
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
WO2018022657A1 (en) 2016-07-25 2018-02-01 Ctrl-Labs Corporation System and method for measuring the movements of articulated rigid bodies
CN110300542A (en) 2016-07-25 2019-10-01 开创拉布斯公司 Method and apparatus for predicting musculoskeletal location information using wearable automated sensors
WO2018022597A1 (en) 2016-07-25 2018-02-01 Ctrl-Labs Corporation Methods and apparatus for inferring user intent based on neuromuscular signals
US10687759B2 (en) 2018-05-29 2020-06-23 Facebook Technologies, Llc Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
US11331045B1 (en) 2018-01-25 2022-05-17 Facebook Technologies, Llc Systems and methods for mitigating neuromuscular signal artifacts
US20190121306A1 (en) 2017-10-19 2019-04-25 Ctrl-Labs Corporation Systems and methods for identifying biological structures associated with neuromuscular source signals
US10772519B2 (en) 2018-05-25 2020-09-15 Facebook Technologies, Llc Methods and apparatus for providing sub-muscular control
US20190362853A1 (en) * 2017-02-10 2019-11-28 Apex Occupational Health Solutions Inc. Method and system for ergonomic augmentation of workspaces
CN108211308B (en) * 2017-05-25 2019-08-16 深圳市前海未来无限投资管理有限公司 A motion effect display method and device
US11253173B1 (en) * 2017-05-30 2022-02-22 Verily Life Sciences Llc Digital characterization of movement to detect and monitor disorders
WO2018236936A1 (en) 2017-06-19 2018-12-27 Mahfouz Mohamed R Surgical navigation of the hip using fluoroscopy and tracking sensors
US11544871B2 (en) * 2017-12-13 2023-01-03 Google Llc Hand skeleton learning, lifting, and denoising from 2D images
CN112005198A (en) 2018-01-25 2020-11-27 脸谱科技有限责任公司 Hand state reconstruction based on multiple inputs
WO2019147958A1 (en) * 2018-01-25 2019-08-01 Ctrl-Labs Corporation User-controlled tuning of handstate representation model parameters
US11961494B1 (en) 2019-03-29 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments
US11150730B1 (en) 2019-04-30 2021-10-19 Facebook Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US10970936B2 (en) 2018-10-05 2021-04-06 Facebook Technologies, Llc Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment
WO2019148002A1 (en) * 2018-01-25 2019-08-01 Ctrl-Labs Corporation Techniques for anonymizing neuromuscular signal data
US10460455B2 (en) 2018-01-25 2019-10-29 Ctrl-Labs Corporation Real-time processing of handstate representation model estimates
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
US11069148B2 (en) 2018-01-25 2021-07-20 Facebook Technologies, Llc Visualization of reconstructed handstate information
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
EP3742961A4 (en) 2018-01-25 2021-03-31 Facebook Technologies, Inc. Calibration techniques for handstate representation modeling using neuromuscular signals
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
CN108338791B (en) * 2018-02-09 2024-08-27 苏州衡品医疗科技有限公司 Detection device and detection method for unsteady motion data
US10592001B2 (en) 2018-05-08 2020-03-17 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
CN110109532A (en) * 2018-06-11 2019-08-09 成都思悟革科技有限公司 A human motion comparison system based on a human body posture acquisition system
EP3807795A4 (en) 2018-06-14 2021-08-11 Facebook Technologies, LLC. User identification and authentication with neuromuscular signatures
US11045137B2 (en) 2018-07-19 2021-06-29 Facebook Technologies, Llc Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device
WO2020047429A1 (en) 2018-08-31 2020-03-05 Ctrl-Labs Corporation Camera-guided interpretation of neuromuscular signals
WO2020061451A1 (en) 2018-09-20 2020-03-26 Ctrl-Labs Corporation Neuromuscular text entry, writing and drawing in augmented reality systems
EP3857342A4 (en) 2018-09-26 2021-12-01 Facebook Technologies, LLC. Neuromuscular control of physical objects in an environment
EP3886693A4 (en) 2018-11-27 2022-06-08 Facebook Technologies, LLC. Methods and apparatus for autocalibration of a wearable electrode sensor system
US10905383B2 (en) 2019-02-28 2021-02-02 Facebook Technologies, Llc Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces
US11331006B2 (en) 2019-03-05 2022-05-17 Physmodo, Inc. System and method for human motion detection and tracking
WO2020181136A1 (en) 2019-03-05 2020-09-10 Physmodo, Inc. System and method for human motion detection and tracking
US11497962B2 (en) 2019-03-05 2022-11-15 Physmodo, Inc. System and method for human motion detection and tracking
JP7133511B2 (en) * 2019-06-11 2022-09-08 本田技研工業株式会社 Information processing device, information processing method, and program
WO2020252484A1 (en) * 2019-06-14 2020-12-17 The Regents Of The University Of California Deep learning of biomimetic sensorimotor control for biomechanical model animation
US11790536B1 (en) 2019-10-11 2023-10-17 Bertec Corporation Swing analysis system
US11097154B1 (en) * 2019-10-11 2021-08-24 Bertec Corporation Swing analysis system
US11458362B1 (en) * 2019-10-11 2022-10-04 Bertec Corporation Swing analysis system
US12089953B1 (en) 2019-12-04 2024-09-17 Meta Platforms Technologies, Llc Systems and methods for utilizing intrinsic current noise to measure interface impedances
JP7501543B2 (en) 2019-12-27 2024-06-18 ソニーグループ株式会社 Information processing device, information processing method, and information processing program
KR102362464B1 (en) * 2020-02-11 2022-02-15 한국과학기술연구원 Apparatus and method for evaluating force control ability of upper limb and prosthesis
CN112115813B (en) * 2020-08-31 2024-08-13 深圳市联合视觉创新科技有限公司 Labeling method and device for human body electromyographic signals and computing equipment
WO2022056271A1 (en) * 2020-09-11 2022-03-17 University Of Iowa Research Foundation Methods and apparatus for machine learning to analyze musculo-skeletal rehabilitation from images
CN116507276A (en) * 2020-09-11 2023-07-28 爱荷华大学研究基金会 Method and apparatus for machine learning to analyze musculoskeletal rehabilitation from images
CN112733301B (en) * 2021-01-21 2023-08-08 佛山科学技术学院 Six-dimensional moment sensor gravity compensation method and system based on neural network
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
US11941824B2 (en) * 2021-04-12 2024-03-26 VelocityEHS Holdings, Inc. Video-based hand and ground reaction force determination
CN113456060B (en) * 2021-05-27 2023-01-17 中国科学院软件研究所 Extraction device for motion function characteristic parameters
US11951357B1 (en) * 2022-11-30 2024-04-09 Roku, Inc. Platform for visual tracking of user fitness

Citations (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5623428A (en) * 1990-12-25 1997-04-22 Shukyohoji, Kongo Zen Sohozan Shoriji Method for developing computer animation
US5623944A (en) 1991-10-10 1997-04-29 Neurocom International, Inc. Method for characterizing gait
US5625577A (en) 1990-12-25 1997-04-29 Shukyohojin, Kongo Zen Sohonzan Shorinji Computer-implemented motion analysis method using dynamics
US5791351A (en) * 1994-05-26 1998-08-11 Curchod; Donald B. Motion measurement apparatus
US5872858A (en) * 1991-09-17 1999-02-16 Fujitsu Limited Kawasaki Moving body recognition apparatus
US5904484A (en) * 1996-12-23 1999-05-18 Burns; Dave Interactive motion training device and method
US5930741A (en) * 1995-02-28 1999-07-27 Virtual Technologies, Inc. Accurate, rapid, reliable position sensing using multiple sensing technologies
US5937081A (en) * 1996-04-10 1999-08-10 O'brill; Michael R. Image composition system and method of using same
US6102832A (en) * 1996-08-08 2000-08-15 Tani Shiraito Virtual reality simulation apparatus
US6119516A (en) * 1997-05-23 2000-09-19 Advantedge Systems, Inc. Biofeedback system for monitoring the motion of body joint
US20020045517A1 (en) 2000-08-30 2002-04-18 Oglesby Gary E. Treadmill control system
US6738065B1 (en) 1999-08-10 2004-05-18 Oshri Even-Zohar Customizable animation system
US6774885B1 (en) 1999-01-20 2004-08-10 Motek B.V. System for dynamic registration, evaluation, and correction of functional human behavior
US20040256754A1 (en) 2003-02-26 2004-12-23 Fuji Photo Film Co., Ltd. Three-dimensional image forming method and apparatus
US20060135883A1 (en) 2004-12-22 2006-06-22 Jonsson Helgi Systems and methods for processing limb motion
US20060206215A1 (en) 2005-02-02 2006-09-14 Clausen Arinbjorn V Sensing systems and methods for monitoring gait dynamics
US20060247904A1 (en) * 2001-06-29 2006-11-02 Behzad Dariush Exoskeleton controller for a human-exoskeleton system
US7136722B2 (en) 2002-02-12 2006-11-14 The University Of Tokyo Method for generating a motion of a human type link system
US20070172797A1 (en) * 2006-01-12 2007-07-26 Kabushiki Kaisha Toyota Chuo Kenkyusho Method of constructing computer-based musculoskeletal model by redefining directions of pivot axes of joints in the same model
US7292151B2 (en) * 2004-07-29 2007-11-06 Kevin Ferguson Human movement measurement system
US7308826B2 (en) * 2004-06-16 2007-12-18 The University Of Tokyo Muscular strength acquiring method and device based on musculoskeletal model
US20080009771A1 (en) * 2006-03-29 2008-01-10 Joel Perry Exoskeleton
US20080180448A1 (en) * 2006-07-25 2008-07-31 Dragomir Anguelov Shape completion, animation and marker-less motion capture of people, animals or characters
US20090135189A1 (en) * 2007-11-22 2009-05-28 Electronics And Telecommunications Research Institute Character animation system and method
US7554549B2 (en) * 2004-10-01 2009-06-30 Sony Corporation System and method for tracking facial muscle and eye motion for computer graphics animation
US7573477B2 (en) * 2005-06-17 2009-08-11 Honda Motor Co., Ltd. System and method for activation-driven muscle deformations for existing character motion

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3355113B2 (en) * 1997-09-02 2002-12-09 株式会社モノリス Method for simulating human body movement and method for generating animation using the method
JP4351808B2 (en) * 1998-09-22 2009-10-28 モーテック・ビー.ブイ. A system for dynamically registering, evaluating and correcting human functional behavior
JP3860076B2 (en) * 2002-06-06 2006-12-20 独立行政法人科学技術振興機構 BODY MODEL GENERATION METHOD, BODY MODEL GENERATION PROGRAM, RECORDING MEDIUM RECORDING THE SAME, AND RECORDING MEDIUM RECORDING BODY MODEL DATA
JP3960536B2 (en) * 2002-08-12 2007-08-15 株式会社国際電気通信基礎技術研究所 Computer-implemented method and computer-executable program for automatically adapting a parametric dynamic model to human actor size for motion capture
JP4358653B2 (en) * 2003-02-26 2009-11-04 富士フイルム株式会社 3D image forming method and apparatus
US8382684B2 (en) * 2005-03-11 2013-02-26 Rsscan International Method and apparatus for displaying 3D images of a part of the skeleton
JP2007004732A (en) * 2005-06-27 2007-01-11 Matsushita Electric Ind Co Ltd Image generation device and method

Patent Citations (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5623428A (en) * 1990-12-25 1997-04-22 Shukyohoji, Kongo Zen Sohozan Shoriji Method for developing computer animation
US5625577A (en) 1990-12-25 1997-04-29 Shukyohojin, Kongo Zen Sohonzan Shorinji Computer-implemented motion analysis method using dynamics
US5872858A (en) * 1991-09-17 1999-02-16 Fujitsu Limited Kawasaki Moving body recognition apparatus
US5623944A (en) 1991-10-10 1997-04-29 Neurocom International, Inc. Method for characterizing gait
US5791351A (en) * 1994-05-26 1998-08-11 Curchod; Donald B. Motion measurement apparatus
US5826578A (en) * 1994-05-26 1998-10-27 Curchod; Donald B. Motion measurement apparatus
US5930741A (en) * 1995-02-28 1999-07-27 Virtual Technologies, Inc. Accurate, rapid, reliable position sensing using multiple sensing technologies
US5937081A (en) * 1996-04-10 1999-08-10 O'brill; Michael R. Image composition system and method of using same
US6102832A (en) * 1996-08-08 2000-08-15 Tani Shiraito Virtual reality simulation apparatus
US5904484A (en) * 1996-12-23 1999-05-18 Burns; Dave Interactive motion training device and method
US6119516A (en) * 1997-05-23 2000-09-19 Advantedge Systems, Inc. Biofeedback system for monitoring the motion of body joint
US6774885B1 (en) 1999-01-20 2004-08-10 Motek B.V. System for dynamic registration, evaluation, and correction of functional human behavior
US6738065B1 (en) 1999-08-10 2004-05-18 Oshri Even-Zohar Customizable animation system
US20020045517A1 (en) 2000-08-30 2002-04-18 Oglesby Gary E. Treadmill control system
US20060247904A1 (en) * 2001-06-29 2006-11-02 Behzad Dariush Exoskeleton controller for a human-exoskeleton system
US7136722B2 (en) 2002-02-12 2006-11-14 The University Of Tokyo Method for generating a motion of a human type link system
US20040256754A1 (en) 2003-02-26 2004-12-23 Fuji Photo Film Co., Ltd. Three-dimensional image forming method and apparatus
US7308826B2 (en) * 2004-06-16 2007-12-18 The University Of Tokyo Muscular strength acquiring method and device based on musculoskeletal model
US7292151B2 (en) * 2004-07-29 2007-11-06 Kevin Ferguson Human movement measurement system
US7554549B2 (en) * 2004-10-01 2009-06-30 Sony Corporation System and method for tracking facial muscle and eye motion for computer graphics animation
US20060135883A1 (en) 2004-12-22 2006-06-22 Jonsson Helgi Systems and methods for processing limb motion
US20060206215A1 (en) 2005-02-02 2006-09-14 Clausen Arinbjorn V Sensing systems and methods for monitoring gait dynamics
US7573477B2 (en) * 2005-06-17 2009-08-11 Honda Motor Co., Ltd. System and method for activation-driven muscle deformations for existing character motion
US20070172797A1 (en) * 2006-01-12 2007-07-26 Kabushiki Kaisha Toyota Chuo Kenkyusho Method of constructing computer-based musculoskeletal model by redefining directions of pivot axes of joints in the same model
US20080009771A1 (en) * 2006-03-29 2008-01-10 Joel Perry Exoskeleton
US20080180448A1 (en) * 2006-07-25 2008-07-31 Dragomir Anguelov Shape completion, animation and marker-less motion capture of people, animals or characters
US20090135189A1 (en) * 2007-11-22 2009-05-28 Electronics And Telecommunications Research Institute Character animation system and method

Non-Patent Citations (10)

* Cited by examiner, † Cited by third party
Title
"Nonlinear Models", Cambridge University Press, 1986-1992, pp. 675-684.
De Leva, Paolo, "Adjustments to Zatsiorsky-Seluyanov's Segment Inertia Parameters", J. Biomechanics, 1996, pp. 1223-1230, vol. 29, No. 9, Elsevier Science Ltd., Great Britain.
Delp, Scott L. et al., "An Interactive Graphics-Based Model of the Lower Extremity to Study Orthopedic Surgical Procedures", IEEE Transactions on Biomedical Engineering, Aug. 1990, pp. 757-767, vol. 37, No. 8.
McLean, Scott G. et al., "Sagittal Plane Biomechanics Cannot Injure the ACL During Sidestep Cutting", Clinical Biomechanics, 2004, pp. 828-838, Elsevier Ltd.
PCT Search Report dated Aug. 27, 2008 of Patent Application No. PCT/US08/54239 filed Feb. 19, 2008.
PCT Search Report dated Jan. 16, 2009 for PCT Application No. PCT/IB08/01835 filed May 5, 2008.
Rule 132 Declaration for Itzhak Siev-Ner, MD for U.S. Appl. No. 11/832,726, 4 pages.
Van Den Bogert, Antonie J. et al., "A Weighted Least Squares Method for Inverse Dynamic Analysis", Computer Methods in Biomechanics and Biomedical Engineering, 2007, pp. 1-7, vol. 00, No. 0.
Van Der Helm, F.C.T., "A Finite Element Musculoskeletal Model of the Shoulder Mechanism", J. Biomechanics, 1994, pp. 551-569, vol. 27, No. 5, Elsevier Science Ltd, Great Britain.
Xia, Youshen et al., "An Improved Neural Network for Convex Quadratic Optimization with Application to Real-Time Beamforming", Neurocomputing, 2005, pp. 359-374, Elsevier B.V.

Cited By (110)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110137138A1 (en) * 2008-05-29 2011-06-09 Per Johansson Patient Management Device, System And Method
US9307941B2 (en) 2008-05-29 2016-04-12 Bläckbild Patient management device, system and method
US8821416B2 (en) * 2008-05-29 2014-09-02 Cunctus Ab Patient management device, system and method
US20110052005A1 (en) * 2009-08-28 2011-03-03 Allen Joseph Selner Designation of a Characteristic of a Physical Capability by Motion Analysis, Systems and Methods
US8139822B2 (en) * 2009-08-28 2012-03-20 Allen Joseph Selner Designation of a characteristic of a physical capability by motion analysis, systems and methods
US8736616B2 (en) * 2009-09-15 2014-05-27 Sony Corporation Combining multi-sensory inputs for digital animation
US20110091070A1 (en) * 2009-09-15 2011-04-21 Sony Corporation Combining multi-sensory inputs for digital animation
US20110069888A1 (en) * 2009-09-22 2011-03-24 Samsung Electronics Co., Ltd. Image processing apparatus and method
US8885920B2 (en) * 2009-09-22 2014-11-11 Samsung Electronics Co., Ltd. Image processing apparatus and method
US8633890B2 (en) * 2010-02-16 2014-01-21 Microsoft Corporation Gesture detection based on joint skipping
US20110199291A1 (en) * 2010-02-16 2011-08-18 Microsoft Corporation Gesture detection based on joint skipping
US20120108909A1 (en) * 2010-11-03 2012-05-03 HeadRehab, LLC Assessment and Rehabilitation of Cognitive and Motor Functions Using Virtual Reality
US20120154409A1 (en) * 2010-12-15 2012-06-21 Microsoft Corporation Vertex-baked three-dimensional animation augmentation
US8963927B2 (en) * 2010-12-15 2015-02-24 Microsoft Technology Licensing, Llc Vertex-baked three-dimensional animation augmentation
US8315823B2 (en) 2011-04-20 2012-11-20 Bertec Corporation Force and/or motion measurement system having inertial compensation and method thereof
US8315822B2 (en) 2011-04-20 2012-11-20 Bertec Corporation Force measurement system having inertial compensation
US9504909B2 (en) 2011-05-05 2016-11-29 Qualcomm Incorporated Method and apparatus of proximity and stunt recording for outdoor gaming
US8696450B2 (en) 2011-07-27 2014-04-15 The Board Of Trustees Of The Leland Stanford Junior University Methods for analyzing and providing feedback for improved power generation in a golf swing
US9656121B2 (en) 2011-07-27 2017-05-23 The Board Of Trustees Of The Leland Stanford Junior University Methods for analyzing and providing feedback for improved power generation in a golf swing
US20130059281A1 (en) * 2011-09-06 2013-03-07 Fenil Shah System and method for providing real-time guidance to a user
US9526451B1 (en) 2012-01-11 2016-12-27 Bertec Corporation Force measurement system
US9168420B1 (en) 2012-01-11 2015-10-27 Bertec Corporation Force measurement system
US8363891B1 (en) * 2012-03-26 2013-01-29 Southern Methodist University System and method for predicting a force applied to a surface by a body during a movement
US9110089B2 (en) 2012-03-26 2015-08-18 Southern Methodist University System and method for predicting a force applied to a surface by a body during a movement
US9235241B2 (en) * 2012-07-29 2016-01-12 Qualcomm Incorporated Anatomical gestures detection system using radio signals
US20140028539A1 (en) * 2012-07-29 2014-01-30 Adam E. Newham Anatomical gestures detection system using radio signals
US11882967B2 (en) 2012-10-11 2024-01-30 Roman Tsibulevskiy Technologies for computing
US9275488B2 (en) 2012-10-11 2016-03-01 Sony Corporation System and method for animating a body
US11529025B2 (en) 2012-10-11 2022-12-20 Roman Tsibulevskiy Technologies for computing
US8764532B1 (en) 2012-11-07 2014-07-01 Bertec Corporation System and method for fall and/or concussion prediction
US8847989B1 (en) 2013-01-19 2014-09-30 Bertec Corporation Force and/or motion measurement system and a method for training a subject using the same
US10231662B1 (en) 2013-01-19 2019-03-19 Bertec Corporation Force measurement system
US9526443B1 (en) 2013-01-19 2016-12-27 Bertec Corporation Force and/or motion measurement system and a method of testing a subject
US9770203B1 (en) 2013-01-19 2017-09-26 Bertec Corporation Force measurement system and a method of testing a subject
US11311209B1 (en) 2013-01-19 2022-04-26 Bertec Corporation Force measurement system and a motion base used therein
US10010286B1 (en) 2013-01-19 2018-07-03 Bertec Corporation Force measurement system
US8704855B1 (en) 2013-01-19 2014-04-22 Bertec Corporation Force measurement system having a displaceable force measurement assembly
US11540744B1 (en) 2013-01-19 2023-01-03 Bertec Corporation Force measurement system
US10413230B1 (en) 2013-01-19 2019-09-17 Bertec Corporation Force measurement system
US9081436B1 (en) 2013-01-19 2015-07-14 Bertec Corporation Force and/or motion measurement system and a method of testing a subject using the same
US11857331B1 (en) 2013-01-19 2024-01-02 Bertec Corporation Force measurement system
US10856796B1 (en) 2013-01-19 2020-12-08 Bertec Corporation Force measurement system
US10646153B1 (en) 2013-01-19 2020-05-12 Bertec Corporation Force measurement system
US11052288B1 (en) 2013-01-19 2021-07-06 Bertec Corporation Force measurement system
US10588546B2 (en) 2013-06-26 2020-03-17 The Cleveland Clinic Foundation Systems and methods to assess balance
US9588582B2 (en) 2013-09-17 2017-03-07 Medibotics Llc Motion recognition clothing (TM) with two different sets of tubes spanning a body joint
US11301045B1 (en) 2015-08-22 2022-04-12 Bertec Corporation Measurement system that includes at least one measurement assembly, a visual display device, and at least one data processing device
US10555688B1 (en) 2015-08-22 2020-02-11 Bertec Corporation Measurement system that includes at least one measurement assembly, a head-mounted visual display device, and a data processing device
US11992746B1 (en) 2015-08-22 2024-05-28 Bertec Corporation Hybrid display system for a force measurement assembly, an exercise device, or an interactive game
US10860843B1 (en) 2015-08-22 2020-12-08 Bertec Corporation Measurement system that includes at least one measurement assembly, a head-mounted visual display device, and a data processing device
US10216262B1 (en) 2015-08-22 2019-02-26 Bertec Corporation Force management system that includes a force measurement assembly, a visual display device, and one or more data processing devices
US9916011B1 (en) 2015-08-22 2018-03-13 Bertec Corporation Force measurement system that includes a force measurement assembly, a visual display device, and one or more data processing devices
US10390736B1 (en) 2015-08-22 2019-08-27 Bertec Corporation Force measurement system that includes a force measurement assembly, at least one visual display device, and one or more data processing devices
US11117039B2 (en) 2018-05-29 2021-09-14 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11623129B2 (en) 2018-05-29 2023-04-11 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11123626B1 (en) 2018-05-29 2021-09-21 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11135504B1 (en) 2018-05-29 2021-10-05 Curiouser Products, Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11135505B2 (en) 2018-05-29 2021-10-05 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11135503B2 (en) 2018-05-29 2021-10-05 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11890524B2 (en) 2018-05-29 2024-02-06 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11173378B2 (en) 2018-05-29 2021-11-16 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11173377B1 (en) 2018-05-29 2021-11-16 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11179620B2 (en) 2018-05-29 2021-11-23 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11219816B2 (en) 2018-05-29 2022-01-11 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11253770B2 (en) 2018-05-29 2022-02-22 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11110336B2 (en) 2018-05-29 2021-09-07 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11298606B2 (en) 2018-05-29 2022-04-12 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11090547B2 (en) 2018-05-29 2021-08-17 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11883732B2 (en) 2018-05-29 2024-01-30 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11376484B2 (en) 2018-05-29 2022-07-05 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11383147B2 (en) 2018-05-29 2022-07-12 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11383146B1 (en) 2018-05-29 2022-07-12 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11383148B2 (en) 2018-05-29 2022-07-12 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11400357B2 (en) 2018-05-29 2022-08-02 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US10981047B2 (en) 2018-05-29 2021-04-20 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11872469B2 (en) 2018-05-29 2024-01-16 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11872467B2 (en) 2018-05-29 2024-01-16 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US20210146222A1 (en) 2018-05-29 2021-05-20 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11065527B2 (en) 2018-05-29 2021-07-20 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11045709B2 (en) 2018-05-29 2021-06-29 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11833410B2 (en) 2018-05-29 2023-12-05 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
USD982032S1 (en) 2018-05-29 2023-03-28 Curiouser Products Inc. Display screen or portion thereof with graphical user interface
US11117038B2 (en) 2018-05-29 2021-09-14 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
USD1006821S1 (en) 2018-05-29 2023-12-05 Curiouser Products Inc. Display screen or portion thereof with graphical user interface
US11813513B2 (en) 2018-05-29 2023-11-14 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11679318B2 (en) 2018-05-29 2023-06-20 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11697056B2 (en) 2018-05-29 2023-07-11 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11701566B2 (en) 2018-05-29 2023-07-18 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11786798B2 (en) 2018-05-29 2023-10-17 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11712614B2 (en) 2018-05-29 2023-08-01 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11717739B2 (en) 2018-05-29 2023-08-08 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11731026B2 (en) 2018-05-29 2023-08-22 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11752416B2 (en) 2018-05-29 2023-09-12 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11759693B2 (en) 2018-05-29 2023-09-19 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11771978B2 (en) 2018-05-29 2023-10-03 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11530957B2 (en) * 2018-12-13 2022-12-20 Hyundai Motor Company Method for predicting clamp force using convolutional neural network
US11803241B2 (en) * 2019-06-21 2023-10-31 Rehabilitation Institute Of Chicago Wearable joint tracking device with muscle activity and methods thereof
US20200401224A1 (en) * 2019-06-21 2020-12-24 REHABILITATION INSTITUTE OF CHICAGO d/b/a Shirley Ryan AbilityLab Wearable joint tracking device with muscle activity and methods thereof
US12094360B1 (en) 2019-08-07 2024-09-17 University Of South Florida Apparatuses and methods for practicing a reduction procedure for treating radial head subluxation
US11497980B2 (en) 2020-04-30 2022-11-15 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11986721B2 (en) 2020-04-30 2024-05-21 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11465030B2 (en) 2020-04-30 2022-10-11 Curiouser Products Inc. Reflective video display apparatus for interactive training and demonstration and methods of using same
US11819751B2 (en) 2020-09-04 2023-11-21 Curiouser Products Inc. Video rebroadcasting with multiplexed communications and display via smart mirrors
US11602670B2 (en) 2020-09-04 2023-03-14 Curiouser Products Inc. Video rebroadcasting with multiplexed communications and display via smart mirrors
US11433275B2 (en) 2020-09-04 2022-09-06 Curiouser Products Inc. Video streaming with multiplexed communications and display via smart mirrors
US11351439B2 (en) 2020-09-04 2022-06-07 Curiouser Products Inc. Video rebroadcasting with multiplexed communications and display via smart mirrors, and smart weight integration
US11167172B1 (en) 2020-09-04 2021-11-09 Curiouser Products Inc. Video rebroadcasting with multiplexed communications and display via smart mirrors
US11633661B2 (en) 2020-09-04 2023-04-25 Curiouser Products Inc. Video rebroadcasting with multiplexed communications and display via smart mirrors, and smart weight integration
US11707664B2 (en) 2020-09-04 2023-07-25 Curiouser Products Inc. Video rebroadcasting with multiplexed communications and display via smart mirrors
US11633660B2 (en) 2020-09-04 2023-04-25 Curiouser Products Inc. Video rebroadcasting with multiplexed communications and display via smart mirrors, and smart weight integration

Also Published As

Publication number Publication date
EP2120710A4 (en) 2013-04-10
ES2679125T3 (en) 2018-08-22
JP5016687B2 (en) 2012-09-05
WO2008109248A4 (en) 2008-12-31
CA2680462A1 (en) 2008-09-12
EP2120710A2 (en) 2009-11-25
US20080221487A1 (en) 2008-09-11
JP2010520561A (en) 2010-06-10
WO2008109248A2 (en) 2008-09-12
WO2008109248A3 (en) 2008-11-20
EP2120710B1 (en) 2018-04-18
US20090082701A1 (en) 2009-03-26
CA2680462C (en) 2016-08-16

Similar Documents

Publication Publication Date Title
US7931604B2 (en) Method for real time interactive visualization of muscle forces and joint torques in the human body
US11052288B1 (en) Force measurement system
US11301045B1 (en) Measurement system that includes at least one measurement assembly, a visual display device, and at least one data processing device
US11311209B1 (en) Force measurement system and a motion base used therein
US10646153B1 (en) Force measurement system
US10231662B1 (en) Force measurement system
US10413230B1 (en) Force measurement system
US6774885B1 (en) System for dynamic registration, evaluation, and correction of functional human behavior
US9763604B1 (en) Gait perturbation system and a method for testing and/or training a subject using the same
US9770203B1 (en) Force measurement system and a method of testing a subject
US10010286B1 (en) Force measurement system
US10117602B1 (en) Balance and/or gait perturbation system and a method for testing and/or training a subject using the same
Zhao et al. A Kinect-based rehabilitation exercise monitoring and guidance system
US9526443B1 (en) Force and/or motion measurement system and a method of testing a subject
US11540744B1 (en) Force measurement system
Williams et al. Evaluation of walking in place on a Wii Balance Board to explore a virtual environment
EP1131734B1 (en) System for dynamic registration, evaluation, and correction of functional human behavior
US20020009222A1 (en) Method and system for viewing kinematic and kinetic information
US11857331B1 (en) Force measurement system
Ye et al. Sensation transfer for immersive exoskeleton motor training: Implications of haptics and viewpoints
Van der Eerden et al. CAREN-computer assisted rehabilitation environment
Bauer et al. Interactive visualization of muscle activity during limb movements: Towards enhanced anatomy learning
Panchaphongsaphak et al. Three-dimensional touch interface for medical education
Roozbahani et al. Development of a novel real-time simulation of human skeleton/muscles
White et al. A virtual reality application for stroke patient rehabilitation

Legal Events

Date Code Title Description
AS Assignment

Owner name: MOTEK B.V., NETHERLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZOHAR, OSHRI EVEN;VAN DEN BOGERT, ANTONIE J.;REEL/FRAME:021689/0658;SIGNING DATES FROM 20070728 TO 20070730

Owner name: MOTEK B.V., NETHERLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZOHAR, OSHRI EVEN;VAN DEN BOGERT, ANTONIE J.;SIGNING DATES FROM 20070728 TO 20070730;REEL/FRAME:021689/0658

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12