Virtual Marionette puppet tools were used to develop several interactive installations for the animation exhibition Animar 12 at Solar, the cinematic art gallery in Vila do Conde. The exhibition opened on 18 February 2017.

 

Animar 12 - Interactive Installations

Faz bem Falar de Amor

An interactive installation that challenges participants to use virtual characters to interpret scenes from the animated music video created by Jorge Ribeiro for the song “Faz bem falar de amor” by Quinta do Bill.

The Puppit tool was adapted to drive two cartoon characters (a very strong lady and a skinny young man) using the body motion of visitors captured with one Microsoft Kinect. The virtual characters’ skeletons differ from human body proportions, so each puppet exhibits distinct behaviours that do not exactly mirror the participant’s movement. Although our intention was to challenge participants to adapt their bodies to the target puppet, we helped them a little. To resolve this discrepancy, I used two skeletons: a human-like skeleton is mapped directly to the performer’s body, and the virtual character’s skeleton is then mapped to this human clone skeleton through an offset and scale function. In this way, it was possible to scale the movement of specific joints of the clone skeleton up and down and make the virtual character behave in a more natural, cartoonish way. This application was developed using Unity, OpenNI and Blender.
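The offset-and-scale retargeting can be illustrated with a short sketch. This is a minimal C++ illustration of the idea described above, not the original Unity code; all type and function names are hypothetical.

```cpp
#include <cstdio>

// Minimal sketch of the two-skeleton retargeting idea; names are hypothetical.
// A "clone" skeleton mirrors the performer directly; each cartoon joint is
// derived from its clone counterpart through a per-joint offset and scale.
struct Joint {
    float x, y, z;      // joint position
};

struct JointMapping {
    Joint offset;       // constant offset between clone and cartoon joint
    float scale;        // amplifies or damps the motion for a cartoon feel
};

// Map one clone joint onto the corresponding cartoon joint.
Joint retarget(const Joint& clone, const JointMapping& m) {
    Joint out;
    out.x = clone.x * m.scale + m.offset.x;
    out.y = clone.y * m.scale + m.offset.y;
    out.z = clone.z * m.scale + m.offset.z;
    return out;
}

int main() {
    Joint elbow = {0.30f, 1.20f, 0.05f};           // from the tracked performer
    JointMapping m = {{0.0f, 0.15f, 0.0f}, 1.4f};  // exaggerate the arm motion
    Joint cartoonElbow = retarget(elbow, m);
    std::printf("%.2f %.2f %.2f\n", cartoonElbow.x, cartoonElbow.y, cartoonElbow.z);
    return 0;
}
```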

É Preciso que eu Diminua

This animated music video was created by Pedro Serrazina for Samuel Úria. It tells the story of a character who suffers from a scale problem, towering over the buildings around him. The character tries to shrink by pulling his arms and legs close to his body; at the same time, he feels the need to break free from these strict boundaries and push everything around him away. To convey this feeling of bodily expansion, the visitor drives a shadow-like silhouette with a contour line around it that grows when the participant expands his body and shrinks when he contracts it. The silhouette, captured by a Microsoft Kinect, is projected onto a set of cubes that deform the body shape. This application was developed with openFrameworks using the ofxKinect and ofxOpenCV addons.
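A minimal openFrameworks sketch of this pipeline could look like the following, assuming the standard ofxKinect and ofxOpenCV addons (oF 0.9+ API); the threshold value and the blob-area heuristic driving the contour width are illustrative assumptions, not the installation’s actual parameters.

```cpp
#include "ofMain.h"
#include "ofxKinect.h"
#include "ofxOpenCv.h"

// App class only; the usual main() / ofRunApp() boilerplate is omitted.
class ofApp : public ofBaseApp {
public:
    ofxKinect kinect;
    ofxCvGrayscaleImage depthImage;
    ofxCvContourFinder contourFinder;

    void setup() {
        kinect.init();
        kinect.open();
        depthImage.allocate(kinect.width, kinect.height);
    }

    void update() {
        kinect.update();
        if (kinect.isFrameNew()) {
            depthImage.setFromPixels(kinect.getDepthPixels());
            depthImage.threshold(70);   // isolate the body from the background
            // look for one large blob: the visitor's silhouette
            contourFinder.findContours(depthImage, 1000,
                                       kinect.width * kinect.height, 1, false);
        }
    }

    void draw() {
        if (contourFinder.nBlobs > 0) {
            // the blob area grows as the visitor expands the body,
            // so it can drive the thickness of the contour line
            float expansion = contourFinder.blobs[0].area
                            / float(kinect.width * kinect.height);
            ofSetLineWidth(2 + expansion * 20);
            contourFinder.blobs[0].draw(0, 0);
        }
    }
};
```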

Estilhaços

For this short animated film produced by Jorge Miguel Ribeiro, which addresses the Portuguese colonial war, the intention was to trigger segments of the film whenever a visitor places his body over a mine. Two segments of the film show two distinct perspectives on the war: one from a father who experienced it, and the other from his child, who understood the war through his father’s indirect account. A webcam was used to capture the position of the visitor’s body; whenever the body enters the mine area, an OSC message is sent as a trigger to a video player application. Both the video trigger and the video player applications were developed with openFrameworks.
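The trigger side can be sketched in a few lines with the standard ofxOsc addon; the host, port, OSC address, and mine region used here are illustrative assumptions.

```cpp
#include "ofMain.h"
#include "ofxOsc.h"

// Sketch of the video-trigger side: when the tracked body enters the mine
// region, an OSC message is sent to the video player application.
class TriggerApp : public ofBaseApp {
public:
    ofxOscSender sender;
    ofRectangle mineArea;                // screen-space region of the mine

    void setup() {
        sender.setup("localhost", 9000); // video player host and port (assumed)
        mineArea.set(200, 300, 120, 120);
    }

    // called with the body position extracted from the webcam image
    void onBodyPosition(float x, float y) {
        if (mineArea.inside(x, y)) {
            ofxOscMessage m;
            m.setAddress("/mine/trigger");   // assumed address
            m.addIntArg(1);                  // which film segment to play
            sender.sendMessage(m, false);
        }
    }
};
```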

 

 

 

Untitled Anarchive is an experimental multimodal multimedia live performance of the PO.EX archive. The Digital Archive of Portuguese Experimental Literature was performed live at Salão Brazil on the 4th of February by members of Retroescavadora (Rui Torres, Luís Aly, and Luís Grifu) and their invited guests (Bruno Ministro and Sandra Guerreiro).

Frames from Untitled Anarchive at Salão Brazil, Coimbra

This collaborative intervention explores new approaches to experimental literature, mixing media and manipulation techniques in an interactive environment that offers a common space for dialogue between the inter-actors.

A demo and poster of Mani-pull-action were shown at the Austin|Portugal Annual Conference at UNL Lisboa on May 23rd and 24th, 2016.

Manipullaction at Austin Conference 1

For this exhibition Mani-pull-action was shown inside a paper theatre, or kamishibai, as a physical connection to, and metaphor for, the art of puppetry. Participants were invited to manipulate different puppets with their hands and to experiment with digital puppetry.

Manipullaction at Austin Conference

Each puppet required a different level of dexterity simulating direct and indirect manipulation with physics and inverse kinematics approaches. The participants could also play with virtual cameras and lights using a touch device with a set of controls.


This was another contribution to disseminating digital puppetry among the community and to attracting more enthusiasts to this art form.

 

leapITString

LEAP IT – STRING is a simple application that lets you share the hand motion data captured by the Leap Motion over the network via the OSC protocol, with the following features (a minimal receiver sketch follows the list):

– Record the performance
– Replay a recorded performance
– Select joints for broadcast
– Convert a recorded binary performance into a text file
– Hide/Show the GUI
– Convert the scale of the hand motion data
– Switch Palm Orientation between Pitch, Yaw, Roll or Yaw, Pitch, Roll
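As a rough idea of how another application could consume this stream, here is a minimal ofxOsc receiver sketch; the port number and the “/hand/palm” address pattern are assumptions for illustration, not the application’s documented protocol.

```cpp
#include "ofMain.h"
#include "ofxOsc.h"

// Sketch of a client listening to the broadcast hand data.
class HandReceiver : public ofBaseApp {
public:
    ofxOscReceiver receiver;
    ofVec3f palm;                       // last received palm position

    void setup() {
        receiver.setup(8000);           // assumed broadcast port
    }

    void update() {
        while (receiver.hasWaitingMessages()) {
            ofxOscMessage m;
            receiver.getNextMessage(m);
            if (m.getAddress() == "/hand/palm") {   // assumed address
                palm.set(m.getArgAsFloat(0),
                         m.getArgAsFloat(1),
                         m.getArgAsFloat(2));
            }
        }
    }
};
```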

This application is a “string” from the Virtual Marionette Interaction Model developed by Luís Grifu for his research on digital puppetry. The application is free to use.

LeapIT String is now available for Mac and Windows

[wpdm_package id="1753"]

[wpdm_package id="1751"]

Digital Hand Puppetry

maniPULLaction is a digital hand puppetry prototype built for a research project called Virtual Marionette. It was developed in March 2015 to evaluate hand dexterity with the Leap Motion device for manipulating digital puppets in real time. One year after its development, the maniPULLaction prototype is now released for Windows and OSX.
This prototype proposes an ergonomic mapping model that takes advantage of the hand's full dexterity for expressive performance animation.

Try and evaluate

You can try the model and contribute your evaluation by recording the 12 available tests: 8 with the puppet Luduvica and 4 with the Cardboard Boy puppet. If you agree to take the tests, just write a name in the label field and start by pressing the “1” key or the “>” button. After the first test you can switch to the next one by pressing the “>” button or the numeric keys “2”, “3”, and so on up to “8”. You can then jump to the next puppet by pressing the “2” button or the “Shift-F2” key. There are 4 more tests with this puppet, accessible with the “1”, “2”, “3”, “4” keys or the “>” button. After finishing all the tests, you can send the saved files to “virtual.marionette@grifu.com”. You can locate or replay the files by pressing “browse”. The files are text files and motion-capture binary files.

How to use

Just use your hand to drive the puppets. This prototype presents 4 distinct puppets with different interaction approaches, which you can try by pressing the 1, 2, 3 and 4 keys.

  • The palm of the hand drives the puppet's position and orientation
  • The little finger drives the eye direction (pupil) with 2 DOF (left and right)
  • The ring finger and middle finger are mapped to the left and right eyebrows with 1 DOF (up and down)
  • The index finger drives the eyelids with 1 DOF (open and close)
  • The thumb is mapped to the mouth with 1 DOF (open and close)
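This mapping can be sketched against the (now legacy) Leap Motion C++ SDK v2; the puppet setter functions are hypothetical placeholders standing in for the actual rig, and the binary extended/curled flexion is a simplification of the continuous control described above.

```cpp
#include "Leap.h"

// Hypothetical puppet rig API; the real prototype drives its own rig instead.
void setPuppetPose(const Leap::Vector& pos, float pitch, float yaw, float roll) {}
void setEyeDirection(float v) {}   // pinky  -> pupil direction
void setLeftBrow(float v) {}       // ring   -> left eyebrow
void setRightBrow(float v) {}      // middle -> right eyebrow
void setEyelids(float v) {}        // index  -> eyelids
void setMouth(float v) {}          // thumb  -> mouth

void mapHandToPuppet(const Leap::Controller& controller) {
    const Leap::Frame frame = controller.frame();
    if (frame.hands().isEmpty()) return;

    const Leap::Hand hand = frame.hands().frontmost();

    // The palm drives the puppet's position and orientation
    setPuppetPose(hand.palmPosition(),
                  hand.direction().pitch(),
                  hand.direction().yaw(),
                  hand.palmNormal().roll());

    // Each finger drives one facial feature (simplified to on/off flexion)
    for (const Leap::Finger& f : hand.fingers()) {
        float bend = f.isExtended() ? 0.0f : 1.0f;
        switch (f.type()) {
            case Leap::Finger::TYPE_PINKY:  setEyeDirection(bend); break;
            case Leap::Finger::TYPE_RING:   setLeftBrow(bend);     break;
            case Leap::Finger::TYPE_MIDDLE: setRightBrow(bend);    break;
            case Leap::Finger::TYPE_INDEX:  setEyelids(bend);      break;
            case Leap::Finger::TYPE_THUMB:  setMouth(bend);        break;
            default: break;
        }
    }
}
```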

There are other interaction methods with each puppet:

  1. Luduvica – a digital glove puppet. This yellow bird offers a simple manipulation scheme, driving just the puppet's head. If you use a second hand, you can have two puppets for a dramatic play. The performance test (“7” key) allows you to drive the puppet in the Z-depth.
  2. Cardboard Boy (paperboy) – a digital wireless marionette (or rod puppet). This physics-based puppet extends the interaction of Luduvica by introducing physics and a second hand for direct manipulation. As in traditional puppetry, you can play with the puppet with an external hand.
  3. Mr. Brown – a digital marionette/rod puppet. This physics-based puppet takes a similar approach to the paperboy but with more expressions. A remote console interface allows you to drive a full set of facial expressions, and the second hand allows the digital puppeteer to drive the puppet's hands (although this is not available in this version).
  4. Minster Monster – a digital muppet with an arm and a hand. The face and hands are the most expressive parts of our bodies, and the same holds for monster characters. While the head of the puppet is driven by the palm of the hand, the mouth is controlled by a pinch gesture, in a similar way to traditional muppets. A second hand drives the arm and hand of the monster for expressive animation.

Live cameras and lighting

To simulate a live show, you can switch between cameras, change the lighting setup, or even focus and defocus the camera.

It also offers live camera switching and lighting control through remote OSC on port 7000, using applications such as TouchOSC (try the “simple” preset). Switch between the 4 cameras with the messages “/2/push1” to “/2/push4”; to manipulate the lights, use “/1/fader1” to “/1/fader4”; to focus and defocus camera 3 in the Cardboard and Mr. Brown scenes, use “/1/fader5”. You can also navigate through the scenes with the addresses “/2/push13” to “/2/push16”.
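On the application side, handling these documented addresses could look like the following minimal ofxOsc sketch; the camera and light setter functions are hypothetical placeholders.

```cpp
#include "ofMain.h"
#include "ofxOsc.h"

// Sketch of a handler for the TouchOSC addresses documented above.
class RemoteControl {
public:
    ofxOscReceiver receiver;

    void setup() {
        receiver.setup(7000);   // remote control port used by the prototype
    }

    void update() {
        while (receiver.hasWaitingMessages()) {
            ofxOscMessage m;
            receiver.getNextMessage(m);
            const std::string addr = m.getAddress();

            // "/2/push1" .. "/2/push4": switch between the four cameras
            for (int i = 1; i <= 4; ++i) {
                if (addr == "/2/push" + ofToString(i)) switchToCamera(i);
            }
            // "/1/fader1" .. "/1/fader4": light intensities
            for (int i = 1; i <= 4; ++i) {
                if (addr == "/1/fader" + ofToString(i))
                    setLightIntensity(i, m.getArgAsFloat(0));
            }
            // "/1/fader5": focus/defocus camera 3
            if (addr == "/1/fader5") setCameraFocus(3, m.getArgAsFloat(0));
        }
    }

private:
    void switchToCamera(int i) { /* hypothetical */ }
    void setLightIntensity(int i, float v) { /* hypothetical */ }
    void setCameraFocus(int cam, float v) { /* hypothetical */ }
};
```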

Download

maniPULLaction for OSX

[wpdm_package id="1716"]

maniPULLaction for Windows

[wpdm_package id="1719"]

Info

Developed by Grifu in 2015 for the Digital Media PhD at FEUP (Austin | Portugal), supported by FCT. It requires the Leap Motion device.

Demo video

 

[vimeo 140126085 w=580&h=326]

Shape Your Body

Shape Your Body is a multi-platform digital puppetry application for manipulating silhouette puppets with the body. It makes use of low-cost motion capture technology (Microsoft Kinect) to provide an interactive environment for performance animation. It challenges users to explore their bodies as marionette controllers and, in doing so, to get to know their own bodies a little better.

This project brings the art of shadow puppetry into digital performance animation.

Shadow theatre is an ancient art form that brings life to inanimate figures, and shadow puppetry is a great medium for storytelling. Our objective was therefore to build a digital puppetry tool that non-expert artists can use to create expressive virtual shadow plays with body motion. A framework was built on the Microsoft Kinect, using OpenNI and Unity, to animate a silhouette in real time. We challenge participants to play and to search for the most suitable body movements for each character.
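For reference, reading a skeleton joint with the OpenNI 1.5 C++ API looks roughly like the sketch below; initialization error checks and the user calibration callbacks are omitted, and the Unity layer consumes the same joint data through the OpenNI wrapper.

```cpp
#include <XnCppWrapper.h>

int main() {
    xn::Context context;
    context.Init();

    xn::UserGenerator userGen;
    userGen.Create(context);
    userGen.GetSkeletonCap().SetSkeletonProfile(XN_SKEL_PROFILE_ALL);
    context.StartGeneratingAll();

    while (true) {
        context.WaitAndUpdateAll();

        XnUserID users[4];
        XnUInt16 nUsers = 4;
        userGen.GetUsers(users, nUsers);

        for (XnUInt16 i = 0; i < nUsers; ++i) {
            // skeleton tracking starts only after user calibration (omitted)
            if (!userGen.GetSkeletonCap().IsTracking(users[i])) continue;

            XnSkeletonJointPosition head;
            userGen.GetSkeletonCap().GetSkeletonJointPosition(
                users[i], XN_SKEL_HEAD, head);
            // head.position.X/Y/Z would drive the silhouette's head joint
        }
    }
    return 0;
}
```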

Requirements to run this application
– PC / Mac computer
– OpenNI drivers version 1.5 (you can find the Zigfu drivers above)
– Microsoft Kinect sensor

[wpdm_package id="1558"]
[wpdm_package id="1559"]

This project was published at ACM CHI 2012.
You can download the article:
[wpdm_package id="1414"]

Poster for CHI

 

Video
[vimeo 37128614 w=580&h=326]

 

BodyPuppetry

With BodyPuppetry, the user's body is the puppet controller, producing performance animation with body motion. We challenge participants to explore their bodies to give life to puppets whose morphology differs from the human body, as if they were using hand shadows to create imaginative figures.

BePuppit: BodyPuppetry is an interactive application that explores the potential of body motion to drive virtual puppets, in particular silhouettes.
Users are challenged to deconstruct their bodies and use them as marionette controllers to drive silhouettes. Because these puppets are two-dimensional, the player needs to find the body poses that work best with each puppet. I call this search for the best pose the distance of manipulation: the more direct the manipulation (when the manipulated subject mirrors the puppeteer), the more acting skills are needed; as this distance increases, puppetry skills are needed instead.

BODYPUPPETRY is part of a digital puppetry research project called Virtual Marionette.

BodyPuppetry challenges the participant to use his body as a puppetry controller, giving life to a virtual silhouette through acting.

Inspired by traditional marionette methods, such as shadow puppetry, the project's goal is to study novel interfaces as digital puppetry controllers for performance animation. In this particular case, the challenge was to deconstruct the body as if it were a marionette controller to give life to non-human silhouette figures. There is also a human-like 3D model for comparison purposes. This project was used in an experimental study aimed at understanding how non-expert artists behave with their bodies when challenged to control silhouette figures.

Implementation
This application was developed for the Microsoft Kinect device using the OpenNI wrapper for Unity. NITE gestures were used to drive a virtual cursor, allowing the interface to be controlled with the hands. (In this release, only the Windows version works with NITE.)

Requirements
– PC/MAC with OpenNI drivers (version 1.5)
– Microsoft Kinect sensor

If you need the drivers, you can download the ZigFu package for Mac or Windows from the Files section.

Files
[wpdm_package id="1534"]
[wpdm_package id="1551"]
Acknowledgements
Thanks to Luís Silva (the author of the Hercules figure), Luís Felix (author of the Punch character), Marcelo Lafontana for all the puppetry support and knowledge, to Sónia Barbosa, Vasco Barbosa, Marta Barbosa for being my inspiration.

This project was developed and released in 2012 by Luís Leite (Aka GRIFU) for the Digital Media PhD. For more information please visit WWW.VIRTUALMARIONETTE.GRIFU.COM

bepuppit

BePuppit is a set of tools, applications and experiments based on interaction methods and techniques to drive and manipulate digital performing objects with our body.

Mapping human body limbs to digital subjects is not trivial, particularly when mapping our bodies to digital models with a non-human morphology.

PUPPIT – Play with Virtual Puppets
Inspired by traditional puppetry, we challenge participants to explore their bodies as marionette controllers by playing with puppets. This performance-driven animation project explores different digital puppetry methods, presenting new ways to interact with virtual puppets.

PUPPIT consists of several prototypes that are the work in progress of a PhD research project called “Virtual Marionette – Interaction Model for Digital Puppetry”. The goal of the prototypes is to explore and evaluate different rigging, mapping and interaction methods, and to develop tools for artists and non-expert artists that can be used in a collaborative environment for storytelling – making live narratives.

For any of the projects based on Kinect you should download and install OpenNI 1.5.

Here are two simple installation packages developed by ZigFu that provide the OpenNI drivers in a very straightforward way. Just download and install, and that's it: you can start using the Microsoft Kinect.

[wpdm_package id="1553"]
[wpdm_package id="1555"]

 

mani-PULL-action is an interactive digital hand puppetry environment that provides a rich and expressive medium for performance animation.

It is based on the maxim that our hands are a powerful tool for manipulation: we are used to manipulating all kinds of things on an everyday basis.

digital hand puppetry

ManiPULLaction makes use of hand-based interface devices such as the Leap Motion to offer a high degree of freedom of manipulation, providing an expressive means of creating real-time animation that can enhance the storytelling experience.

You can find some previous video experiments, such as DigiMario

[vimeo 110798138 w=580&h=326]

 

or the Luduvica digital bird puppet

[vimeo 110452298 w=580&h=326]

 

or Mr. Gonzaga

[vimeo 101553868 w=580&h=326]

[vimeo 101467576 w=580&h=326]

 

 

The Common Spaces poster was on display during the two days of the 1st Joint Conference and Exhibition of the International Partnerships. I had the opportunity to demonstrate the Common Spaces framework during the conference with a simple setup.


This conference highlighted the Portugal-U.S. partnerships: a joint conference and exhibition on fostering science and innovation, showcasing the activities and outcomes of the international partnerships with Portugal.

 

Common Spaces at the 1st Joint Conference and Exhibition of the International Partnerships