Joe Gran demonstrates the use of the Leap Motion to control the behavior of a character in real time.

“The making of: Dog of Wisdom” is a video showing the potential of this mid-air input device.

The work of Joe Gran is absolutely amazing. He mapped the fingers to the character rig in Maya (probably using the Leap Motion plugin).

It works great!!
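
As a rough illustration of that kind of finger-to-rig mapping, here is a minimal sketch assuming Maya's maya.cmds module and a hypothetical joint name; it is not Joe Gran's actual setup:

```python
import maya.cmds as cmds  # only available inside Maya's Python interpreter

def drive_joint_from_finger(finger_y_mm, joint="jaw_jnt",
                            y_min=50.0, y_max=300.0, max_angle=30.0):
    """Map a Leap-style fingertip height (mm above the device) to a joint rotation.

    The joint name "jaw_jnt" and the tracked range are hypothetical placeholders.
    """
    t = (finger_y_mm - y_min) / (y_max - y_min)
    t = max(0.0, min(1.0, t))  # clamp to the tracked range
    cmds.setAttr(joint + ".rotateX", t * max_angle)

# Would be called once per tracking frame, e.g.:
# drive_joint_from_finger(frame.fingers[0].tip_position.y)
```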

Open Sound Control Organization – http://opensoundcontrol.org

OSC for Python (SimpleOSC 0.3.2) – https://pypi.python.org/pypi/SimpleOSC

QLab OSC-Script – http://archiedestructo.github.io/OSC-Script/

Maya via Python try-out – http://www.creativecrash.com/forums/python/topics/osc-node

Maya OSC Slibase (for old Maya versions) – http://slidertime.blogspot.pt/2008/02/slibase-v02-with-osc-plugin.html

Maya OSC Slibase (for new versions of Maya) – http://www.paoloemilioselva.it/2008pages.php?page=slibase

Cinema4D OSC to work with Kinect – http://www.908video.de/lab_project/kinectcapture2-kicapsdk-kicaposc/

Cinema4D – http://frieslandav.com/xtension_osc.php

Traktor – http://hexler.net/docs/touchosc-setup-traktor

Quartz Composer – http://hexler.net/software/qcosc

OSC library for Processing – http://www.sojamo.de/libraries/oscP5/

Max/MSP – http://opensoundcontrol.org/implementation/opensoundcontrol-maxmsp

Livegrabber for Ableton Live – http://sonicbloom.net/en/livegrabber-to-sendreceive-osc-in-ableton-live/

Open Sound Control Monitor – http://www.kasperkamperman.com/blog/osc-datamonitor/

Unreal: https://github.com/monsieurgustav/UE4-OSC

For Unity

UnityOSC – https://github.com/jorgegarcia/UnityOSC
Developed by Jorge Garcia. It was the first OSC implementation for Unity that I know of.
Implements the OSC v1.0 specification over UDP (see the raw-packet sketch after this list).
Does not support Bundles.

VVVV Unity OSC – https://github.com/frankiezafe/VVVVUnityOSC
Developed by Frankie.
It was made to work with VVVV.
Supports Bundles.
Needs a trick for publishing.

Unity OSC Receiver – https://github.com/heaversm/unity-osc-receiver
Developed by Mike Heavers.
Supports Bundles.
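
For reference, here is a minimal sketch of what all of these implementations do under the hood: encoding an OSC 1.0 message by hand (null-padded address string, type tag string, big-endian arguments) and sending it over UDP from Python. The address pattern and port are hypothetical:

```python
import socket
import struct

def osc_string(s):
    """OSC strings are ASCII, null-terminated, and padded to a multiple of 4 bytes."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def osc_message(address, *args):
    """Build a single OSC 1.0 message with float32 arguments (no Bundle support)."""
    data = osc_string(address)
    data += osc_string("," + "f" * len(args))  # type tag string, e.g. ",fff"
    for value in args:
        data += struct.pack(">f", value)       # 32-bit big-endian float
    return data

# Hypothetical address and port; any of the receivers listed above should parse this.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/puppet/head/rotate", 0.0, 45.0, 0.0), ("127.0.0.1", 9000))
```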


Marionette Zoo is a digital puppetry experience for Leap Motion!

It was one of the projects accepted into the Leap Motion Developer Program in early 2013, but it was only finished in 2014.

The user moves the puppets with the hand, which is mapped to a virtual controller similar to a traditional marionette control bar. The controller is connected to the virtual marionette by strings, and the puppets are driven by a physics simulation to achieve a marionette-style look.
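
Here is a toy sketch of that string idea (my own illustration, not Marionette Zoo's actual Cinder/Bullet code): each string only pulls when taut, so the puppet point dangles below the control bar under gravity. All constants are made up:

```python
GRAVITY = -9.8       # m/s^2
REST_LENGTH = 0.5    # string length (m)
STIFFNESS = 60.0     # spring constant pulling the puppet up when the string is taut
DAMPING = 2.0
DT = 1.0 / 60.0      # one 60 fps frame

def step(hand_y, puppet_y, puppet_vy):
    """Advance one puppet point a single frame; the string pulls but never pushes."""
    stretch = (hand_y - puppet_y) - REST_LENGTH
    force = GRAVITY
    if stretch > 0.0:  # taut string: spring force plus damping
        force += STIFFNESS * stretch - DAMPING * puppet_vy
    puppet_vy += force * DT
    puppet_y += puppet_vy * DT
    return puppet_y, puppet_vy

# Raise the hand slowly and watch the puppet follow, dangling below it.
y, vy = 0.0, 0.0
for frame in range(240):
    hand_y = 1.0 + 0.2 * (frame / 240.0)  # stand-in for the Leap Motion palm height
    y, vy = step(hand_y, y, vy)
print(round(y, 2))  # settles a bit more than REST_LENGTH below the hand (gravity sag)
```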

The 3D was made with Blender, using the Bullet physics engine.

The sound was developed with Pure Data.

The application was developed with Cinder.

Link to the >> Project <<

Tagtool is a creative toolkit for iOS.

An incredible tool for artists, who can draw and animate in very few steps.

You can even produce performance animation…


An incredible digital puppetry performance with a digital puppet at SCES in 1992.

The actor wore a facial Waldo developed by The Character Shop and a 3D mouse, best known as the flying mouse.

Brilliance

In 1985, Robert Abel and Associates developed the first television production to employ motion capture technology with computer graphics for digital character animation. “Brilliance” was not just a television advertisement; it was one of the greatest challenges in the use of computer graphics for production, featuring techniques that had never been attempted or achieved before.

A live cartoon show for television, developed by TV Animation in 1999.

An interactive television show that allowed children to call into the show and appear in real time as an animated cartoon character.

The Cartoon Broadcast System was a custom hardware/software real-time animation system developed by TV Animation, which allowed producers to have real-time lip-sync of up to 16 simultaneous sources, live pan, zoom and tracking, and electronic puppeteering.

Behind the scenes of the Nelly Nut Show in Portugal, where it was called “Rita Catita e o Ursinho Oops”. It was produced by Miragem, a Portuguese production company, as a 30-minute daily show, and broadcast by TVI, a Portuguese television channel.

Hugo was an interactive television show created by a Danish company called Interactive Television Entertainment (ITE) in 1990. It was an interactive game show broadcast all over the world (in more than 40 countries) until 2007. Players at home could play with the character by calling the television show and using their telephone keys as a controller. Hugo was developed on custom-made ITE hardware based on a Commodore Amiga 3000, with MIDI devices and a DTMF digital converter for control. Later it was ported to a PC-based system using motion capture for live animation: the puppeteer's body movements and facial expressions were retargeted to the Hugo virtual character. It was an early digital puppetry system with great success over 17 years of live television entertainment.
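
Since the telephone keypad was the controller, the DTMF converter had to turn key-press tones into game input. Here is a minimal sketch of how such decoding can be done with the Goertzel algorithm (an illustration only, not ITE's actual hardware):

```python
import math

# DTMF: each key mixes one row tone and one column tone (frequencies in Hz).
ROW_FREQS = [697.0, 770.0, 852.0, 941.0]
COL_FREQS = [1209.0, 1336.0, 1477.0]
KEYS = ["123", "456", "789", "*0#"]

def goertzel_power(samples, sample_rate, freq):
    """Signal power at one target frequency, via the Goertzel recurrence."""
    n = len(samples)
    coeff = 2.0 * math.cos(2.0 * math.pi * round(n * freq / sample_rate) / n)
    s1 = s2 = 0.0
    for x in samples:
        s1, s2 = x + coeff * s1 - s2, s1
    return s1 * s1 + s2 * s2 - coeff * s1 * s2

def detect_key(samples, sample_rate=8000):
    """Pick the strongest row and column tone and map the pair to a keypad key."""
    row = max(range(4), key=lambda i: goertzel_power(samples, sample_rate, ROW_FREQS[i]))
    col = max(range(3), key=lambda i: goertzel_power(samples, sample_rate, COL_FREQS[i]))
    return KEYS[row][col]

# Synthesize 50 ms of the '5' tone pair (770 Hz + 1336 Hz) and decode it.
fs, n = 8000, 400
tone = [math.sin(2 * math.pi * 770 * t / fs) + math.sin(2 * math.pi * 1336 * t / fs)
        for t in range(n)]
print(detect_key(tone, fs))  # -> '5'
```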


Motion capture system used to control the Hugo character.


The ITE 4000, a PC-based system.

ITE’s Animation Mask System (AMS): a computer, a motion capture system (a helmet with sensors), a control panel, and a remote computer console.


A MIDI control panel with a joystick and buttons to manipulate the ONLINE content.

FlickerLab’s Cartoon Broadcast System (2012)


CBS is an update of TV Animation's Nelly Nut Show animation system.
It brings new features and provides more flexibility for live animation using new controls.
Lip-sync is performed in real time.

Here is an example of a show called “Café Central”, made by a Portuguese production house (HOP).

Johnny Bravo Live

In 2000, Cartoon Network presented and broadcast a live cartoon show: Johnny Bravo, a 2D cartoon digital puppet animated in real time. Johnny Bravo was a kind of “CJ”, or cartoon jockey, listening to telephone requests for nostalgic cartoons.

Kaydara FilmBox On-Air was the first commercial solution to provide a generic real-time animation system using motion capture. It brought more flexibility to the broadcast industry because production houses or television channels could produce their own live shows.
Producers or animators/puppeteers could trigger 3D/2D animation clip sequences in real time.
This solution was designed to provide performance and versatility for live broadcasting.

Although the character is not fully live-animated, it can respond to real-time impulses, creating the feeling of interaction with the audience. As a matter of fact, the only live element of the performance is the lip-sync: the system recognizes the sounds spoken by an on-set actor and changes the mouth image to the closest corresponding one.
The puppeteer just presses buttons to trigger animations, which are built in advance from dozens of animated sequences made with hundreds of frames.
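
A minimal sketch of that sound-to-mouth substitution (a guess at the general approach, not Kaydara's implementation): classify each audio frame into a phoneme class and swap in the closest pre-drawn mouth image (viseme). All names are hypothetical:

```python
# Hypothetical phoneme classes mapped to pre-drawn mouth images (visemes).
VISEMES = {
    "AA": "mouth_open.png",
    "EE": "mouth_wide.png",
    "OO": "mouth_round.png",
    "MBP": "mouth_closed.png",
    "REST": "mouth_rest.png",
}

def mouth_for(phoneme):
    """Return the mouth image matching the detected phoneme class."""
    return VISEMES.get(phoneme, VISEMES["REST"])

# A speech recognizer (not shown) would emit one phoneme class per audio frame:
for phoneme in ["MBP", "AA", "EE", "REST"]:
    print(mouth_for(phoneme))
```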

Cartoon Network used a beta version of On-Air.

Reference: http://www.cgw.com/Publications/CGW/2000/Volume-23-Issue-8-August-2000-/Cartoon-Jockey.aspx