Generating realistic synthetic motion of objects is a complex and difficult challenge in computer graphics. Traditionally this has been achieved by pre-breaking modelled geometry through the application of shatter patterns, but such an approach is not well suited to modern production demands - it is time intensive, complicated and entirely subjective... 

This was an observation I made back in 2011, when finite element techniques - a branch of traditional structural engineering that simulates incredibly realistic deformation and fracture - were in their real-time computational infancy; namely, it took a long time to get any meaningful results!

Fortunately for me (and my bank account), Framestore (the guys that went on to make Gravity - it's awesome!) were interested in exploring the notion of a non-traditional physics engine and, lo and behold, I was given six months to prove my worth. The outcome of my endeavours was a prototype Maya plug-in demonstrating the elastic and plastic deformation of linear tetrahedra before fracture occurs, all in a visually plausible manner...

Did anyone understand that? It's a pretty messy area of engineering - crazy mathematics, algorithmic principles, optimisation techniques... but fear not, because this blog series intends to explain the entire process end to end - heck, I will even throw in some lovely C++ code - so that you can start developing your very own physics engine.


  1. Introduction and Background
  2. Continuum Mechanics
  3. The Finite Element Model
  4. Code Implementation (Maya)
  5. Plasticity and Fracture 



Physical realism is becoming an increasingly mandatory requirement for computer graphics. People have an instinctive ability to detect when an image is unnatural or implausible, destroying the suspension of disbelief and removing their sense of immersion. This pursuit of enhanced realism has led to a range of significant innovations towards graphics that are visually indistinguishable from the real world. However, as the overall look or visual fidelity of computer generated sequences improves through increasingly accurate lighting algorithms and more powerful rendering programs, it is inaccuracies in the behaviour or kinetic fidelity of a simulation that become more glaringly obvious to the viewer.

Typically, kinematics has been used to achieve pre-configured animation, an approach that works well for animating articulated figures that possess joints or hinges. Artists use a technique called key-framing to explicitly define an object's start and end positions, interpolating in-between to get the full motion (think back to school and flicking through a notebook of a stick man):  


However, realism can only be judged subjectively and its application to other simulations such as destruction can be a long and tedious process. Therefore, a compelling argument to enhance the overall believability is to incorporate the laws of physics into models for more realistic procedural movement. Recent efforts into this highly-interdisciplinary area of interest, encompassing advanced topics of engineering and mathematics, have culminated in a field of study called Physics-Based Animation.

In the film industry, traditional forms of practical special effects involve the use of live pyrotechnics, complicated stunt work or intricate miniature models. The first widely acknowledged special effect can be seen in 'The Execution of Mary Queen of Scots (1895)', while you will undoubtedly be aware of the iconic models used in Star Wars (1977) as well as the incredible live action shots from 'Saving Private Ryan (1998)'. Such techniques have proven to be effective in telling a story and prevailed as the de-facto choice of directors for many years, ultimately due to a lack of viable alternative. However, in certain situations, artistic vision stipulates effects that are simply too dangerous and/or expensive to be realised practically, such as the perilous trip by Tom Hanks and crew to the moon in 'Apollo 13 (1995)'. Beyond safety and cost, logistics can cause problems if locations required for re-shoots are unavailable and need to be recreated digitally.

These fundamental problems have been alleviated by the digital revolution of the last 30 years, beginning with the unprecedented efforts to create the ground-breaking visual effects (VFX) seen in 'Tron (1981)' - a watershed moment that heralded a new era of visual entertainment and spawned companies such as Pixar, Weta Digital and Framestore. Previously infeasible effects, such as the large-scale destruction seen in the films '2012' and 'Transformers: Dark of the Moon', cannot be realised without the assistance of visually realistic computer simulations that behave in a manner identical to an actual building collapsing or vehicle exploding. Thus, the new and exciting discipline of physics-based animation has (generally) usurped traditional approaches - although a hybrid of the two is often seen. Modern cinema, saturated with a range of ground-breaking visual effects, showcases many algorithms for physics-based animation. A wide gamut of computer vision techniques, such as de-warping, feature tracking and image segmentation, has also been adapted for incredible VFX applications.

Here is a summary of the main types of forward dynamic simulations:  

  • Particle Systems

These systems can be considered the earliest and simplest form of dynamic simulation and can be used to replicate a variety of simple phenomena such as explosions, smoke, snow and rain. Typically, particles have a position and velocity but no orientation and are generally implemented in three-dimensional space as point masses. Application of Newtonian laws governs the forces acting on particles that, when coupled together, model so-called fuzzy objects.

  • Rigid Body Simulation

Rigid Body Dynamics (RBD) employs non-penetrating multi-body systems composed of unyielding materials that are assigned physical characteristics along with the imposition of constraints. They differ from particle systems in that they occupy a physical space and are geometrically more complex. Various physics engines such as Havok, PhysX and PhysBAM have been employed by facilities to create such simulations. Most of these are closed-source engines that cost money to incorporate into a pipeline. Open-source engines are more inviting as they allow developers to customise the code to suit specific needs. Previously, ODE was a popular choice, but it has recently been displaced by Bullet as the de-facto engine of choice in many studios.  

  • Deformable Body Simulation

Deformable objects, as the name suggests, are not rigid and include a range of different objects such as cloth, fur, hair and rope. The complexities of self-intersection and collision response make them a challenging problem. The academy-award-winning efforts of Terzopoulos in 1987 have been followed up by further pioneering work in inelastic deformation, including plasticity and fracture. Generally, mass-spring systems are used, where particles are connected together based on the principles of Hooke's Law, but these are unable to capture the volumetric effects or the precision stipulated by more complex systems. Alternatively, Finite Differences can be used to simulate deformable models based on continuum models.

  • Computational Fluid Dynamics

The simulation of fluid phenomena, including complex fire, smoke and water environments (that particle systems cannot handle), is accurately described by the Navier-Stokes equations. In their original form, dating back to 1821, the equations simply describe the physics of fluids but have since been extended to account for turbulence, shallowness and high velocities. The challenges faced by computer graphics have been tackled by many, including Digital Domain for 'The Day After Tomorrow' and DreamWorks for the pioneering work seen in 'Antz'. The current state-of-the-art software for fluid simulation is Naiad by Exotic Matter, while proprietary software developed by VFX facility Scanline earned an Oscar nomination for the tsunami sequence seen in the film 'Hereafter'.

  • Finite Element Methods

Based on theories taken from continuum mechanics, the finite element method can be used to discretise an object based on stress-strain relations and more accurately model deformable objects, as well as fracture. Originally, these methods were used from a medical perspective, and then the work of O'Brien at the turn of the century looked at non-real-time, non-linear animation of brittle and ductile materials. The general consensus in the community is that such methods are the future of physics-based animation for visual effects.
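As a taste of what the stress-strain relation actually looks like, the simplest case (one-dimensional linear elasticity - the full tensor version will appear in the continuum mechanics post) is just:

$$\sigma = E\,\varepsilon$$

where $\sigma$ is the stress (force per unit area), $\varepsilon$ is the strain (relative deformation) and $E$ is Young's modulus, a material stiffness constant. The finite element method applies relations of this kind over each tetrahedron of a discretised volume.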

A caveat to using physical laws in the production of visual effects for film is that it is difficult to obey the laws of nature as well as the Principles of Animation, a set of suggested rules used by animators to exaggerate motion and engage the audience. For example, the squash and stretch principle is often seen in cartoons or animations to provide surrealistic movements favoured by younger audiences, enabling unnatural character poses and impossible physical scenarios. Further to this, directors may actively seek out unrealistic effects that cannot occur in the real world, such as dream sequences, alien worlds or fantasy environments. Thus, it is imperative to retain perspective when developing new tools for VFX - an inherently artistic endeavour that regularly looks to push the boundaries of acceptability and manipulate algorithms in ways that were not necessarily intended.

Further to this, typical requirements encountered in traditional engineering applications - where the emphasis is very much on accuracy - are not appropriate to VFX. Instead, it is speed and predictability that are of paramount importance, so that artists are able to convey interesting object motion with complete control. To keep pace with ever-changing production demands and the evolution of technology, tools must be extensible while maintaining stability and efficiency.