Digital puppetry

Digital puppetry is the manipulation and performance of digitally animated 2D or 3D figures and objects in a virtual environment that are rendered in real-time by computers. It is most commonly used in filmmaking and television production but has also been used in interactive theme park attractions and live theatre.

The exact definition of what is and is not digital puppetry is subject to debate among puppeteers and computer graphics designers, but it is generally agreed that digital puppetry differs from conventional computer animation in that it involves performing characters in real-time, rather than animating them frame by frame.

Digital puppetry is closely associated with character animation, motion capture technologies, and 3D animation, as well as skeletal animation. Digital puppetry is also known as virtual puppetry, performance animation, living animation, aniforms, live animation and real-time animation (although the latter also refers to animation generated by computer game engines). Machinima is another form of digital puppetry, and Machinima performers are increasingly being identified as puppeteers.

History and usage

Early experiments

One of the earliest pioneers of digital puppetry was Lee Harrison III. He conducted experiments in the early 1960s that animated figures using analog circuits and a cathode ray tube. Harrison rigged up a body suit with potentiometers and created the first working motion capture rig, animating 3D figures in real time on his CRT screen. He made several short films with this system, which he called ANIMAC.[1] Among the earliest digital puppets produced with the system was a character called "Mr. Computer Image", who was controlled by a combination of the ANIMAC's body control rig and an early form of voice-controlled automatic lip sync.[2]

Waldo C. Graphic

Perhaps the first truly commercially successful example of a digitally animated figure being performed and rendered in real time is Waldo C. Graphic, a character created in 1988 by Jim Henson and Pacific Data Images for the Muppet television series The Jim Henson Hour. Henson had used the Scanimate system to generate a digital version of his Nobody character in real time for the television series Sesame Street as early as 1970,[3] and Waldo grew out of experiments Henson conducted to create a computer-generated version of his character Kermit the Frog[4] in 1985.[5]

Waldo's strength as a computer-generated puppet was that he could be controlled by a single puppeteer (Steve Whitmire[6]) in real-time in concert with conventional puppets. The computer image of Waldo was mixed with the video feed of the camera focused on physical puppets so that all of the puppeteers in a scene could perform together. (It was already standard Muppeteering practice to use monitors while performing, so the use of a virtual puppet did not significantly increase the complexity of the system.) Afterward, in post-production, PDI re-rendered Waldo in full resolution, adding a few dynamic elements on top of the performed motion.[7]

Waldo C. Graphic also appears in Jim Henson's Muppet*Vision 3D at Disney's Hollywood Studios in Lake Buena Vista, Florida.

Mike Normal

Another significant development in digital puppetry in 1988 was Mike Normal, which Brad deGraf and his partner Michael Wahrman developed to show off the real-time capabilities of Silicon Graphics' then-new 4D series workstations. Unveiled at the 1988 SIGGRAPH convention, it was the first live performance of a digital character. Mike was a sophisticated talking head driven by a specially built controller that allowed a single puppeteer to control many parameters of the character's face, including the mouth, eyes, expression, and head position.[8]

The system developed by deGraf/Wahrman to perform Mike Normal was later used to create a representation of the villain Cain in the motion picture RoboCop 2, which is believed to be the first example of digital puppetry being used to create a character in a full-length motion picture.

Trey Stokes was the puppeteer for both Mike Normal's SIGGRAPH debut and RoboCop 2.

Sesame Street: Elmo's World

One of the most widely seen examples of digital puppetry in a television series is Sesame Street's "Elmo's World" segment, in which a set of furniture characters created with CGI is performed in real time on set, simultaneously with Elmo and the other physical puppets. As with Henson's Waldo C. Graphic above, the digital puppets' video feed was seen live by both the digital and physical puppet performers, allowing the digital and physical characters to interact.[9]

Disney theme parks

Walt Disney Imagineering has also been an important innovator in the field of digital puppetry, developing new technologies to enable visitors to Disney theme parks to interact with some of the company's famous animated characters.[10] In 2004, the company used digital puppetry techniques to create the Turtle Talk with Crush attractions at Epcot and Disney California Adventure Park. In the attraction, a hidden puppeteer performs and voices a digital puppet of Crush, the laid-back sea turtle from Finding Nemo, on a large rear-projection screen. To the audience, Crush appears to be swimming inside an aquarium and engages in unscripted, real-time conversations with theme park guests.

Disney Imagineering continued its use of digital puppetry with the Monsters, Inc. Laugh Floor, an attraction in Tomorrowland at Walt Disney World's Magic Kingdom that opened in the spring of 2007. Guests temporarily enter the "monster world" introduced in Disney and Pixar's 2001 film Monsters, Inc., where they are entertained by Mike Wazowski and other monster comedians who are attempting to capture laughter, which they convert into energy. Much like Turtle Talk, the puppeteers interact with guests in real time, just as a live comedian would interact with an audience.

Disney also uses digital puppetry techniques in Stitch Encounter, which opened in 2006 at the Hong Kong Disneyland park. Disney has another version of the same attraction in Disneyland Resort Paris called Stitch Live!

Military simulation and training

Since 2014, the United States Army's Program Executive Office for Simulation, Training, Research, and Instrumentation (PEO STRI), a division of the US Army Simulation and Training Technology Center (STTC), has been experimenting with digital puppetry as a method of teaching advanced situational awareness to infantry squads.[11] A single improviser using motion capture technology from Organic Motion, Inc. interacted with squads through several life-sized avatars of varying ages and genders, which were projected onto multiple walls throughout an urban operations training facility. The motion capture technology was paired with real-time voice shifting to achieve the effect.[12]

Types of digital puppetry

Waldo puppetry

A digital puppet is controlled onscreen in real time by a puppeteer who uses a telemetric input device known as a Waldo (after the short story "Waldo" by Robert A. Heinlein, which features a man who invents and uses such devices), connected to the computer. The X-Y-Z axis movement of the input device causes the digital puppet to move correspondingly.
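
In principle the mapping is simple: each axis reading from the device is sampled once per frame and scaled onto one coordinate of the puppet. The following Python sketch is purely illustrative; the controller, its raw value range, and the Puppet class are hypothetical stand-ins rather than the software of any actual Waldo system.

# Illustrative sketch only: mapping a hypothetical Waldo-style controller's
# raw X-Y-Z potentiometer readings onto a puppet's position each frame.
class Puppet:
    def __init__(self):
        self.position = [0.0, 0.0, 0.0]

    def render(self):
        print("puppet at", self.position)

def read_axes():
    """Stand-in for sampling the controller hardware (raw 0-1023 readings)."""
    return 512, 700, 300

def normalize(raw, lo=0, hi=1023, scale=1.0):
    """Map a raw reading onto the puppet's coordinate range [-scale, scale]."""
    return ((raw - lo) / (hi - lo)) * 2.0 * scale - scale

puppet = Puppet()
for _ in range(3):  # in a live performance this loop runs once per video frame
    x, y, z = read_axes()
    puppet.position = [normalize(v) for v in (x, y, z)]
    puppet.render()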

Computer facial animation

Computer facial animation is primarily an area of computer graphics that encapsulates methods and techniques for generating and animating images or models of a character's face. The importance of the human face in verbal and non-verbal communication, together with advances in computer graphics hardware and software, has generated considerable scientific, technological, and artistic interest in computer facial animation.

Motion capture puppetry/performance animation

An object (puppet) or a human body is used as a physical representation of a digital puppet and is manipulated by a puppeteer. The movements of the object or body are mirrored by the digital puppet in real time. Motion capture puppetry is commonly used, for example, by VTubers, who rig digital avatars to follow the movements of their heads.
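
As a rough illustration of the underlying idea, the Python sketch below applies one frame of tracking data to a digital rig's joints with simple smoothing. The data source, joint names, and smoothing factor are assumptions for illustration only, not the API of any particular capture system.

# Illustrative sketch only: streaming hypothetical motion-capture data onto
# a digital puppet's joints once per captured frame.
def capture_frame():
    """Stand-in for one frame of tracking data: joint name -> rotation in degrees."""
    return {"head": (5.0, -2.0, 0.0), "jaw": (12.0, 0.0, 0.0)}

class Rig:
    def __init__(self, joints):
        self.rotations = {j: (0.0, 0.0, 0.0) for j in joints}

    def apply(self, frame, smoothing=0.5):
        # Blend new readings with the previous pose to damp sensor jitter.
        for joint, target in frame.items():
            prev = self.rotations.get(joint, (0.0, 0.0, 0.0))
            self.rotations[joint] = tuple(
                prev[i] + smoothing * (target[i] - prev[i]) for i in range(3)
            )

rig = Rig(["head", "jaw"])
for _ in range(3):  # in a live session this runs once per captured frame
    rig.apply(capture_frame())
    print(rig.rotations)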

Virtual human

A virtual human (also known as a meta human or digital human) is a simulation of a human being on a computer. The research domain is concerned with their representation, movement, and behavior; studies in advertising contexts have found, for example, that human-like virtual humans convey higher message credibility than anime-like ones. A particular case of the virtual human is the virtual actor, a virtual human (avatar or autonomous) that represents an existing personality and acts in a film or series.

Aniforms

Aniforms are two-dimensional cartoon characters operated like puppets and displayed to live audiences or in visual media. The concept was invented by Morey Bunin, a puppeteer who had worked with string marionettes and hand puppets, together with his spouse Charlotte. The distinctive feature of an Aniforms character is that its physical form appears "animated" on a real or simulated television screen. The technique was used in television production.

Machinima

Machinima is a production technique that can be used to perform digital puppets. It involves creating computer-generated imagery (CGI) using the low-end 3D engines in video games: players act out scenes in real time using characters and settings within a game, and the resulting footage is recorded and later edited into a finished film.[13]

References

  1. ^ A Critical History of Computer Graphics and Animation: Analog approaches, non-linear editing, and compositing Archived 2007-03-28 at the Wayback Machine, accessed April 28, 2007
  2. ^ Mr. Computer Image Demo (video). 1968.
  3. ^ Jim Henson's Red Book Entry, accessed October 10, 2014
  4. ^ Finch, Christopher. Jim Henson: The Works (New York: Random House, 1993)
  5. ^ Sturman, David J. A Brief History of Motion Capture for Computer Character Animation Archived October 12, 2012, at the Wayback Machine, accessed February 9, 2007
  6. ^ Henson.com Featured Creature: Waldo C. Graphic (archive.org), accessed February 9, 2007
  7. ^ Walters, Graham. The story of Waldo C. Graphic. Course Notes: 3D Character Animation by Computer, ACM SIGGRAPH '89, Boston, July 1989, pp. 65-79
  8. ^ Barbara Robertson, Mike, the talking head Computer Graphics World, July 1988, pp. 15-17.
  9. ^ Yilmaz, Emre. Elmo's World: Digital Puppetry on Sesame Street. Conference Abstracts and Applications, SIGGRAPH '2001, Los Angeles, August 2001, p. 178
  10. ^ Kleczek, Jakub (2015). "Digital Puppeteering". Theatr Lalek (119). POLUNIMA.
  11. ^ Gregory, Rick (July 2014). "Squad Overmatch Study Looks to Build Resilience on the Battlefield" (PDF). Inside PEO STRI. United States Army. Archived from the original (PDF) on October 20, 2016.
  12. ^ Thuermer, Karen (December 15, 2015). "Avatars for Training". Military Training International. Defense House Publishing.
  13. ^ Hancock, Hugh (2007). Machinima For Dummies. John Wiley & Sons, Inc. ISBN 978-0-470-19583-3.

External links

  • Animata - Free, open-source real-time animation software commonly used to create digital puppets.
  • Mike the talking head - Web page about Mike Normal, one of the earliest examples of digital puppetry.
  • Organic Motion LIVE - Commercial digital puppetry technology currently used for simulation & training purposes.