## Digital Hand Puppetry

maniPULLaction is a digital hand puppetry prototype built for the Virtual Marionette research project. It was developed in March 2015 to evaluate hand dexterity with the Leap Motion device for manipulating digital puppets in real time. One year after its development, the maniPULLaction prototype is now released for Windows and OS X.
This prototype proposes an ergonomic mapping model that takes advantage of the full dexterity of the hand for expressive performance animation.

## Try and evaluate

You can try the model and contribute with your evaluation by recording the 12 tests that are available: 8 tests with the puppet Luduvica and 4 tests with the Cardboard Boy puppet. If you agree to take the tests, just write a name in the label and start by pressing the “1” key or the “>” button. After the first test you can switch to the next one by pressing the “>” button or the numeric keys “2”, “3”, and so on up to “8”. Then you can jump to the next puppet by pressing the “2” button or the “Shift-F2” key. There are 4 more tests with this puppet, which can be accessed with the “1”, “2”, “3”, “4” keys or the “>” button. After finishing all the tests, you can send the files that were saved to “virtual.marionette@grifu.com”. You can find or replay the files by pressing “browse”. The files are text files and motion capture binary files.

## How to use

Just use your hand to drive the puppets. This prototype presents four distinct puppets with different interaction approaches, which you can try by pressing the 1, 2, 3, and 4 keys.

  • The palm of the hand drives the puppet's position and orientation
  • The little finger drives the eye direction (pupils) with 2 DOF (left and right)
  • The ring finger and middle finger are mapped to the left and right eyebrows with 1 DOF (up and down)
  • The index finger drives the eyelids with 1 DOF (open and close)
  • The thumb is mapped to the mouth with 1 DOF (open and close)
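
To make the mapping concrete, here is a small, hypothetical sketch of how per-finger values from a Leap Motion-style hand frame could drive such puppet parameters. The field and parameter names are invented for illustration; this is not the prototype's actual code.

```python
from dataclasses import dataclass

@dataclass
class HandFrame:
    # Hypothetical, normalized values (0..1) as a Leap Motion-style tracker might report them.
    palm_position: tuple      # (x, y, z)
    palm_rotation: tuple      # (pitch, yaw, roll)
    pinky: float              # little finger extension
    ring: float
    middle: float
    index: float
    thumb: float

def map_hand_to_puppet(hand: HandFrame) -> dict:
    """Ergonomic mapping sketch: palm -> head pose, fingers -> facial features."""
    return {
        "head_position": hand.palm_position,
        "head_rotation": hand.palm_rotation,
        # Eye direction left/right (the prototype describes 2 DOF; simplified to one axis here).
        "eye_direction": (hand.pinky - 0.5) * 2.0,
        "brow_left":  hand.ring,      # up/down
        "brow_right": hand.middle,    # up/down
        "eyelids":    hand.index,     # open/close
        "mouth":      hand.thumb,     # open/close
    }
```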

There are other interaction methods with each puppet:

  1. Luduvica – a digital glove puppet. This yellow bird offers simple manipulation by driving just the puppet's head. If you use a second hand, you can have two puppets for dramatic play. The performance test (key 7) allows you to drive the puppet in Z-depth.
  2. Cardboard Boy (paperboy) – a digital wireless marionette (or rod puppet). This physics-based puppet extends the interaction of Luduvica by introducing physics and a second hand for direct manipulation. As in traditional puppetry, you can play with the puppet with an external hand.
  3. Mr. Brown – a digital marionette/rod puppet. This physics-based puppet follows a similar approach to the paperboy but with more expressions. A remote console interface allows driving a full set of facial expressions, and the second hand allows the digital puppeteer to drive the puppet's hands (although not available in this version).
  4. Minster Monster – a digital muppet with arm and hand. The face and the hands are the most expressive parts of our bodies, and the same holds for monster characters. While the head of the puppet is driven by the palm of the hand, the mouth is controlled by a pinch gesture, in a similar way to traditional muppets. A second hand drives the arm and hand of the monster for expressive animation.

## Live cameras and lighting

To simulate a live show you can switch among cameras, change the lighting setup or even focus and defocus the camera.

It also offers live camera switching and lighting control through remote OSC on port “7000”, using applications such as TouchOSC (try the “simple” preset). Switch between the 4 cameras with the messages “/2/push1” to “/2/push4”; to manipulate the lights use “/1/fader1” to “/1/fader4”; to focus and defocus camera 3 on the Cardboard and Mr. Brown scenes use “/1/fader5”. You can also navigate through the scenes with the addresses “/2/push13” to “/2/push16”.
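
As a rough illustration, the sketch below sends a few of these messages with the python-osc package (an assumption; any OSC-capable tool such as TouchOSC works the same way). The host address is hypothetical and should point at the machine running maniPULLaction.

```python
# pip install python-osc
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical host running maniPULLaction; port 7000 as stated above.
client = SimpleUDPClient("192.168.1.50", 7000)

client.send_message("/2/push2", 1)      # cut to camera 2
client.send_message("/1/fader1", 0.8)   # raise the first light
client.send_message("/1/fader5", 0.3)   # defocus camera 3 (Cardboard / Mr. Brown scenes)
client.send_message("/2/push14", 1)     # jump to another scene
```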

## Download

maniPULLaction for OSX

[wpdm_package id=”1716″]

maniPULLaction for Windows

[wpdm_package id=”1719″]

## Info

Developed by Grifu in 2015 for the Digital Media PhD at FEUP (Austin | Portugal), supported by FCT. It requires the Leap Motion device.

## Demo video

 

[vimeo 140126085 w=580&h=326]

Hellblade is a film in development that demonstrates the potential of combining high-DOF digital puppetry with real-time, top-quality rendering.


This is an amazing breakthrough, demonstrating how digital puppetry methods (motion capture) can provide expressive animation rendered in real time with top-quality CG graphics.

Another step forward in the development of live films.

 

We have reached a new era in the production of animation. Forget about offline rendering; we are now living in real time. This means that digital puppetry will soon become a hot topic. If we can render animation at film quality in real time, then we need to generate animation in real time as well, and digital puppetry is our solution for achieving a brand new style of animation.

Adam – Unity's real-time rendered short film

 

This short film was produced with the Unity 5.4 beta using the new cinematic sequencer tool.

It implements real-time area lights and makes use of the CaronteFX physics simulation plugin.

 

 

Live cinema is arriving, and with the recent release of the MARZA Movie Pipeline for Unity we are closer to this new realm.


MARZA allows the production of movies inside the Unity platform, changing the process of movie-making. It is now closer to the concept of machinima, including digital puppetry methods to produce performance animation that is recorded to the film medium in real time.

MARZA brings video game technology into film production.

MARZA Movie Pipeline for Unity — Key features 

Pursuing rich animation expression

  • Alembic importer co-developed with Unity Technologies
  • Reproducing rich animation expression and advanced, complex VFX

Implementing modern texture expression

  • Skin Shader: Character skin textures reproduction
  • Eye Shader: Character eye expression
  • Fake Fur Shader: Character “fur” expression

Dramatic improvement for efficient visual creation through automatic shot scene creation

  • All required assets, camera and animations are automatically read by the Unity platform, allowing for instant visual checking
  • Asset updates after scene creation

Enhancing visual quality
From Unity to Composite Software

  • Using Frame Capture, co-developed with Unity Technologies
  • Capturing game view as a sequence of OpenEXR image files
  • Using render pass system to output additional information as extra OpenEXR layers
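
To make the last point concrete, here is a small, hypothetical sketch that lists the layers stored in one captured frame. It assumes the OpenEXR Python bindings and an example filename; the actual pass names depend on how the render pass system is configured.

```python
# pip install OpenEXR
import OpenEXR

# Hypothetical frame from the captured OpenEXR sequence.
exr = OpenEXR.InputFile("frame_0001.exr")

# Each render pass shows up as extra channels/layers in the file header.
channels = sorted(exr.header()["channels"].keys())
print(channels)   # e.g. ['A', 'B', 'G', 'R', 'depth.Z', 'normals.B', ...]
```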

Thanks to the collaboration with Unity Technologies, we are happy to announce that some of the key features in the “MARZA Movie Pipeline for Unity” are already available as open source, under the MIT license on GitHub, for Unity and the CG community at large.

 


 

Marionette Programming Language

Pull The Strings is a digital puppetry visual node-based control environment for performance animation.
It acts as a middleware interface between puppeteers and puppets, between device drivers and driven objects, between input interfaces and applications.

[vimeo 158084097 w=580&h=326]

It is a marionette programming engine that works as a digital mediator between devices and applications, providing the building blocks for the digital puppeteer to establish the manipulation semantics. It is a visual programming language inspired by the strings of marionettes and by the patch cords that connect modules in old analog video synthesizers. In this environment, signals are processed and sent over the network. Finally, the data arrives at Remote Control for Unity, a plugin that facilitates the mapping of OSC messages.
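
As a rough, hypothetical illustration of this mediator role (not the actual Pull The Strings implementation), the Python sketch below receives OSC from an input device, rescales the value, and forwards it to a Unity instance listening through Remote Control for Unity. The ports and addresses are assumptions.

```python
# pip install python-osc
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical ports: 8000 for incoming device data, 9000 for the Unity listener.
unity = SimpleUDPClient("127.0.0.1", 9000)

def forward(address, value):
    # Rescale a 0..1 fader into a -180..180 rotation before forwarding.
    rotation = (float(value) - 0.5) * 360.0
    unity.send_message("/puppet/head/rotation", rotation)  # assumed address

dispatcher = Dispatcher()
dispatcher.map("/1/fader1", forward)  # e.g. a TouchOSC fader

BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()
```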

It was designed with a minimalistic but intuitive interface and was developed in C++ with openFrameworks, making use of multiple add-ons to provide its main features.
A multidisciplinary environment (MMMM: Multi-platform, Multi-modal, Multi-mediator, Multi-user).
An environment for artists without a programming background, designed for digital puppeteers.

Pull The Strings is a middleware, an interface between applications, devices, and performing objects. It can be considered a visual programming environment for remote control, made for artists and designers.
Sometimes it is hard to find one environment or application that offers all the features you are looking for, so why not use a set of multiple applications in real time?
The goal of Pull The Strings is to facilitate the use and control of all the resources found on your computer and on the network.
It is an abstraction from technology, making use of generic signals and communication protocols.
It connects and transforms signals from input and virtual devices into performing-object controls, and maps and orchestrates multimedia data using OSC, DMX, MIDI and VRPN.
It is a remote control application, an interface that connects inputs to outputs and provides a series of functionalities that help to control the animation. It is an OSC-based environment and all nodes work with the OSC format.

Version Alpha 0.050316 (first release 6 March 2016)
Pull The Strings will become open source soon. It is completely free to use in your art, research, or professional projects.
Built with [openFrameworks](http://www.openframeworks.cc).
It is developed by Luís Leite (GRIFU) for the Digital Media PhD with the support of FCT – Fundação para a Ciência e a Tecnologia.

## Download

Download Pull The Strings:

[Mac OS X 10.9 (32-bit) only]

[wpdm_package id=”1684″]

## Demo video

## License

Copyright 2012-2016 Luís Leite

GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007

Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

Preamble

The GNU General Public License is a free, copyleft license for
software and other kinds of works.

## Pull The Strings Interface

Pull The Strings allows you to create, save and load your projects.
You should always start by saving your project, which will create a folder to gather all the needed files.
To navigate through the canvas use the key; to open the menu use the and then start typing the operator name, or use the mouse to navigate.
To delete an operator just use the ; to access an operator's options, click the <*> on the operator.
Connect nodes from the outputs to the inputs. Add new input nodes by clicking the <+ O> button.
You can create multiple graph windows.

### OSC Communication
Pull The Strings is based on the OSC protocol. It receives and sends OSC messages, and its entire interface communicates internally via OSC.
You can send several messages from one output port.
It supports Bonjour, so you can easily find Pull The Strings with Bonjour-compatible applications such as TouchOSC.
It makes use of RemoteUI to remotely control the operators from other applications or devices.
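
For intuition only, this is roughly what advertising an OSC port over Bonjour looks like: a hypothetical sketch using the Python zeroconf package, with an assumed service name, address and port (Pull The Strings itself does this in C++ via ofxBonjour).

```python
# pip install zeroconf
import socket
from zeroconf import Zeroconf, ServiceInfo

# Assumed values: service name, LAN address and OSC input port.
info = ServiceInfo(
    "_osc._udp.local.",
    "Pull The Strings._osc._udp.local.",
    addresses=[socket.inet_aton("192.168.1.10")],
    port=8000,
)

zc = Zeroconf()
zc.register_service(info)   # Bonjour-aware OSC apps such as TouchOSC can now discover it
```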

### Pull The Strings operators

input.number (creates a float or an int inside the interface or from remote applications)
input.string (creates strings)
math.scale (a scaling function with a learning input-range option)
osc.combine (combines several OSC messages into one; allows message synchronization)
osc.expand (expands an OSC message into separate output nodes; allows inspecting the data types and choosing the address)
osc.port.in (opens a port for incoming OSC data)
osc.port.out (opens a port for sending OSC data)
print (for debugging; prints the incoming messages in the interface)
show.plot (plots the incoming values)
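
To illustrate what an operator such as math.scale does, here is a hypothetical Python equivalent of a scaling node with a "learning" input range; the class and method names are made up for the example and do not come from the Pull The Strings source.

```python
class ScaleOperator:
    """Rescale an input range to an output range, optionally learning the input range."""

    def __init__(self, out_min=0.0, out_max=1.0, learn=True):
        self.out_min, self.out_max = out_min, out_max
        self.learn = learn
        self.in_min, self.in_max = float("inf"), float("-inf")

    def process(self, value):
        if self.learn:
            # Grow the input range as new extremes are observed.
            self.in_min = min(self.in_min, value)
            self.in_max = max(self.in_max, value)
        if self.in_max <= self.in_min:
            return self.out_min
        t = (value - self.in_min) / (self.in_max - self.in_min)
        return self.out_min + t * (self.out_max - self.out_min)

# Example: map raw sensor values onto a 0..180 joint angle.
scale = ScaleOperator(out_min=0.0, out_max=180.0)
for raw in (12.0, 250.0, 130.0):
    print(scale.process(raw))
```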

## Version History

– Alpha 003 (March 5th 2016)

## Support

[Support](http://www.virtualmarionette.grifu.com)

## Other features
Pull The Strings has other operators that are not yet available, such as:
Timelines
LeapMotion support
Wiimote support
Math operations
Recording OSC performance

## Developed with
Pull The Strings is based on ofxDuct and makes use of many other ofxAddons such as:
ofxRemoteUI
ofxLeapMotion
Wiiuse
ofxBonjour
ofxAppGLFWindowMulti
ofxHistoryPlot
ofxUI
ofxTimecode
ofxMSATimer
ofxRange
ofxTimeline
ofxOsc
ofxGui
ofxMSAInteractiveObject

Results from the exploratory work during an artistic residency at Espaço-Tempo in December 2015.

Solitária is an ongoing project from Alma d'Arame that explores human loneliness in solitary confinement.

We explored the human mind and body in the dark.
How do we move, see, listen and think in a place where there is no light?
The absence of light removes any spatial–temporal references and draws us into an unbalanced world.

Solitária Work-in-Progress, 15th December 2015

We are trapped in a square space, and our body is the only connection to the physical world, the only sensor capable of recognizing the physical space.

Our focus was on the relations between the human gestures and the space.

[vimeo 150417382 w=580&h=326]

In graphic terms, we attempted a minimal, graphic approach focused on emptiness and silence. The graphic elements are generated essentially from lines that evoke prison bars. Letters also appear, characterizing memories, and silhouettes appear as a representation of the real. But this reality is progressively linearized (forgotten), starting with the silhouettes and evolving to their synthesized state: lines that gain expression. In terms of interaction, we work with the body and objects in space and time. We begin by working with the body and objects in the space of the solitary cell, then explore the space of the body itself, examining the spatial relation between the hands and the body's center of gravity, and finally return to the space of the cell, relating the body to the physical space. A depth sensor and video cameras are used.

Performer: Amândio Anastácio
Multimedia: Grifu

Technical framework
[HARDWARE]
– Macintosh Laptop
– Microsoft Kinect
– Video Camera
– iPad

[SOFTWARE]
– Qlab
– OpenFrameworks
– eMotion
– QuartzComposer
– PureData
– MaxMsp
– PullTheStrings
– TouchOSC
– Syphon

Dec. 2015

 

Remote Control for Unity (RCu) is an open-source generic mapping interface for controlling and exposing the properties and methods of objects in Unity using remote applications and devices.

Remote Control Framework Scheme

Remote Control for Unity was published in December 2015 and is available on GitHub.

This Video shows the framework of this digital puppetry environment.
[vimeo 150416605 w=580&h=326]

Remote Control is open source and provides a GUI that facilitates the mapping procedure, ideal for non-expert programmers: an interface that offers an easy way to control Unity parameters in real time from devices such as a smartphone.

RCu makes use of Jorge Garcia's UnityOSC to access Open Sound Control (OSC) functionality and provides connectivity with all devices and applications that support this protocol.

Videos:
Pre-Release Video
[vimeo 135032229 w=580&h=326]
This video shows the pre-released version from June 2015.

Tutorial 1 [Simple]: Controlling an object's position
[vimeo 150416957 w=580&h=326]
In this tutorial you will learn how to connect, map, and control an object's position remotely.
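
As a rough companion to the tutorial, the sketch below streams a position from a remote machine over OSC using python-osc. The host, port and the “/cube/position” address are assumptions, since the actual addresses are whatever you map in the RCu GUI.

```python
# pip install python-osc
import math, time
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical Unity host and the port RCu was configured to listen on.
client = SimpleUDPClient("192.168.1.20", 6666)

# Move a mapped object along a small circle, ~30 updates per second, for 10 seconds.
t = 0.0
while t < 10.0:
    x, z = math.cos(t), math.sin(t)
    client.send_message("/cube/position", [x, 0.0, z])  # assumed mapped address
    time.sleep(1 / 30)
    t += 1 / 30
```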

Tutorial 2 [Intermediate]: Controlling Blend-Shapes
[vimeo 150417112 w=580&h=326]
In this tutorial you will learn how to control the blend shapes of your character with a flexible interface.

Tutorial 3 [Complex]: Control Lights, Cameras and Activator

In this tutorial you will learn how to change light parameters, how to move cameras and how to activate them remotely to create an interactive performative environment.
[vimeo 150417214 w=580&h=326]

Published in Dec.2015
GitHub: https://github.com/grifu/RemoteControlUnity

Resurrection performance at Casa da Música (casadamusica.com) on the 7th of December 2015.
A major production from ESMAE (esmae-ipp.pt) for the 30th anniversary of IPP (portal.ipp.pt), embracing all the school's departments (Theatre, Music, Audiovisual, Multimedia).

Resurrection at Casa da Música – 7 December 2015

Inspired by Gustav Mahler's 2nd Symphony, Lee Beagley wrote and directed “Resurrection – Life Lines”, based on 65 interviews he conducted over one year in Portugal with people who needed a second chance in their lives.

This video shows interventions that were performed in several places of Casa da Música and were adapted from the theatrical project presented at Teatro Helena Sá e Costa in November 2015.

Because my intervention was mainly in the multimedia field, here is a brief insight into the project setup.
[vimeo 150416124 w=580&h=326]
[HARDWARE]
– Lobby and main Foyer (two 10k Lumen video projectors for the walls)
– Sala Renascença (one 7k Lumen projecting onto a translucent screen on a window to Sala Suggia)
– Exterior Wall (one video projector for the south wall)
– Two Macintoshes
– Two Microsoft Kinects

[SOFTWARE]
– Qlab (Video mapping, video routing, video effects)
– OpenFrameworks (Computer Vision + Image Processing)
– Syphon (Video routing)
– Open Sound Control (for remote control)

A project with the students of ESMAE.

CREDITS
Text and Direction: Lee Beagley
Translation: José Topa

Assistant Direction and Production: Xana Miranda
Movement Assistant: Vítor Gomes
Voice Assistants: Bernardo Soares, Gabriela Amaro

Stage Management and Production: Gonçalo Gregório, Mariana Silva
Set Design: Inês Mota, Luís Mesquita, Miguel Costa, Vera Matias
Costumes: Manuel de Faria, Mónica Melo, Samanta Duarte
Lighting and Sound: Alexandre Candeias, Sérgio Vilela

Cast: Aldair Pereira, Carlos Alves, Cláudia Gomes, Gabriela Brás, Gabriela Costa, Guilherme de Sousa, Hugo Olim, Inês de Oliveira, João Lourenço, Mafalda Canhola, Maria Inês Peixoto, Mariana Coelho, Mariana Santos Silva, Marta Dias, Marta Rosas, Nuno Granja, Raquel Cunha, Rita Fernandes, Sara Xavier, Teresa Zabolitzki, Xavier Miguel, Miguel Marinho (special appearance)

Musicians: Diego Alonso, Gorka Oya Diez, Bernardo Soares, Ricardo Casaleiro, Gabriela Amaro
Photography: Paula da Fonte, Joana Machado, Rui Sá
Multimedia: Grifu, João Gigante, Ricardo Couto
DAI Faculty: Luís Leite (Grifu), Marco Conceição

/// This project hopefully opens new dialogues and breaks down barriers ///

The play Prometeu was presented at the 10th International Shadow Theater Festival in Schwäbisch Gmünd. In the closing session of the festival program, Marcelo Lafontana gave life to the story of Prometheus with digital-media shadow puppetry.


The combination of shadow theater with a cinematic language makes this play unique.


“This multimedia performance is on the one hand inspired by the traditional Indonesian theatre „Wayang Kulit“ with its jointed silhouettes but on the other hand it makes use of the very latest in contemporary media: Against a lighted background the scene is created by the use of sand to create design and texture and to fashion space and visual surroundings. The puppets are moved in front of this background. The resulting scenes are recorded by video, electronically processed and then projected on to a screen. This is the story of Prometheus, the philanthropist, harbinger of culture and helper of people. He gave them fire and for this was severely punished by Zeus. The performance is an impressive example of the synthesis between the traditional structure of the Javanese Wayang Kulit shadow theatre and its transformation into the trappings and characteristics of the western world.”

Setup of the play – smooth and with good vibrations!


A large number of seats (more than 300)


 


 

I was invited to be part of a round table during the Shadow Theater Festival at Schwäbisch Gmünd to discuss the latest developments in shadow theater, in particular the use of new media.


It was a great opportunity to discuss with a panel of personalities related to shadow theater. The round table was moderated by Hartmut Topf, with Fabrizio Montecchi from Teatro Gioco Vita, Marcelo Redondo from Lafontana Formas Animadas, Alberto Jona from Controluce Teatro de Sombras and Robert Drobniuch from Teatr Kubuś.
