The Virtual Marionette project was mentioned in the monthly news of Instituto de Telecomunicações (IT).

news

“(…)

Instituto de Telecomunicações has participated in the science show that was held at the Pavilhão do Conhecimento – Ciência Viva in Lisbon. A team from Porto Interactive Center presented three demos: the project LIFEisGAME, which explores the possibility of teaching people with Autism Spectrum Disorder to recognize facial emotions, using real time synthesis and automatic facial expression analysis; Virtual Marionette, a research on digital puppetry via an interdisciplinary approach that brings the art of puppetry into the world of digital animation, and 3D Scanner, a system to create a high-quality human like 3D facial avatar by drastically improving the quality of 3D avatars and reducing the time spent in the process. From IT at Lisbon came BITalino, described in page 1, and a couple of demos of the Internet of the Future concept based on RFID. The stand from IT attracted more than 500 very interested visitors.”

link to the paper

SIC National Television broadcast a news report from Porto Interactive Center showing our projects.

Although the focus was the LIFEisGAME project, many other projects were shown, and Virtual Marionette also appears in this report.

Circus on the Strings is a very interesting puppetry play showing the great manipulation skills of Viktor Antonov.

Although FIMP (Porto International Marionette Festival) doesn't focus much on traditional marionettes, Circus on the Strings was the kind of show that fulfills the expectations of a more traditional audience.

Viktor-Antonov-Circus-On-The-Strings-02-482x756

Each time I see a great puppeteer manipulating string puppets with great precision, I wonder how they achieve this level of manipulation. It is very difficult to handle so many strings with such precision, giving life to puppets in an expressive way.

circus

Viktor brought to FIMP this amazing work, a classic puppetry play with superb details.

The puppet mechanics were simply amazing; puppets that move their heads, mouths, eyes, even the color of their heads. To control all these parts, the manipulation controllers were very sophisticated.

 

Ellen Fullman started to develop the Long String Instrument in 1981.
It is an instrument with dozens of metallic strings, each more than 15 meters long.

Ellen plays it with rosin-coated fingers, producing a chorus of organ-like partials.

I had the privilege to participate in Ellen's workshop at FIMP (Porto International Marionette Festival).

 

Ellenplaying

Beyond the magnificent experience, I realized the connection between manipulating a marionette and an instrument like this.

 

Ellen Fullman 2

Ellen must have full concentration to manipulate dozens of strings with great precision.

Her body needs to be in perfect balance, moving from one place to another to produce the different tones.

 

IMG_1071

 

In this picture it is possible to see the wooden resonance boxes and Konrad Sprenger with his guitar.

Konrad uses a guitar as an interaction interface for making music without playing it directly.

He uses a solenoid on each string of an electric guitar to produce sound. With servo motors attached to the tuners of the guitar, he can control and manipulate the tone of the strings. All the manipulation is done on the computer in a MaxMSP patch.
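To give an idea of this kind of setup (this is only an illustrative sketch, not Konrad's actual MaxMSP patch): a computer can sequence per-string solenoid triggers and servo tuner positions by sending OSC messages to a microcontroller bridge. The addresses, port and value ranges below are hypothetical.

from pythonosc.udp_client import SimpleUDPClient
import time

# Hypothetical bridge that drives the solenoids and tuner servos.
client = SimpleUDPClient("127.0.0.1", 8000)

# Pluck string 3 with its solenoid, then sweep its tuner servo to bend the pitch.
client.send_message("/string/3/pluck", 1)
for position in range(0, 101, 10):          # servo position in percent
    client.send_message("/string/3/tuner", position)
    time.sleep(0.05)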

VIDEO: Konrad guitar

 

 


INVERSUS – The Sensitive Machine

Inversus is the sensitive machine that makes no sense. An artistic installation exploring interaction with common objects.

Why should a lamp be used only to illuminate?

Lamps, speakers or fans are usually used as output interfaces; what would happen if we turned these outputs into input interfaces? Inversus explores this inversion by using lamps as light sensors, speakers as pressure sensors and fans as blowing sensors. The main concept of Inversus is to invert the meaning of transmission devices into reception devices: a sensitive machine that captures human interaction to produce sound and visual kinetics. It is a performing instrument that gives life to a mechanical flower that spins when someone blows into the machine, producing an animated shadow as in shadow puppetry. There is also a virtual marionette inside the machine that reacts to the pressure of the pads; this marionette is rigged with bones that are mapped to the pads, which make them squash and stretch, producing animation. Virtual and real animation is generated based on human interaction. Made from a washing machine, this audiovisual instrument changes the sense of things by spinning a colorful wheel.

There are 3 different types of interaction:

1. by touching 4 color pads that produce sounds in the same way as playing a drum (pressure-sensitive); the virtual marionette reacts to this interaction by moving its arms and legs (see the sketch after this list)

2. by passing the hands over 3 LEDs, which produce sound like an organ (the sound keeps playing until the hand moves away from the LED)

3. by blowing on the fan: the frequency of the sound changes and the flower starts to spin.
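A minimal sketch of the pad-to-bone mapping from item 1, assuming each pad delivers a normalized pressure reading; the function name and scaling factors are mine, not the installation's actual code.

# Hypothetical pad-to-bone mapping: pressing a pad squashes a limb of the
# virtual marionette and stretches it sideways to compensate.
def squash_stretch(pressure, rest_length=1.0):
    """pressure: normalized pad reading in 0..1."""
    squash = 1.0 - 0.5 * pressure        # limb shortens as the pad is pressed
    stretch = 1.0 / squash               # widen to roughly preserve volume
    return rest_length * squash, stretch

# Example: a half-pressed pad shortens the limb to 0.75 and widens it to about 1.33.
length, width = squash_stretch(0.5)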

Inversus ( Transmission -> Reception)

This is a 3D picture simulating the appearance of the installation

Inversus - interactive installation 3D simulation

Inversus – interactive installation 3D simulation

Picture from the “CHEIA” exhibition at Póvoa de Varzim – Portugal (from the 5th to the 31st of October)

Inversus interactive installation

Inversus interactive installation, Póvoa de Varzim, Oct. 2013


Multitouch surface testing (FTIR)
The purpose of these tests is to build a puppetBox, a multitouch surface to manipulate puppets using the fingers.
I made silicone layers diluted with synthetic and cellulose diluent.
Avoid dissolving the silicone with cellulose diluent because it takes too much time to dry; use synthetic diluent instead.
Of course, you can use more expensive materials to avoid these issues.

 

Material used:
– Optoma projector PK320, 80 ANSI lumens
– EyeCam with default lens and 850 nm filter (I changed the lens back to the default because I used a very small surface)
– Infrared LED strip, 850 nm
– Baking paper with 2 layers of silicone + synthetic diluent
– CCV 1.2 on Mac and CCV 1.3 on Windows
Conclusions:
Systems:
– The EyeCam gives better performance on Windows (60 fps) than on Mac OS X (30 fps)
– The Flash CCV demos run about 2 times faster on Windows than on Mac OS X
Screens:
– The Rosco screen gives a high-resolution picture and feels great to the touch
– Baking paper gives a brighter picture with lower resolution, but it is very low-cost
– The Rosco screen needs more layers of silicone to track blobs
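For reference, CCV streams the tracked blobs as TUIO messages over OSC (UDP port 3333 by default), which any client application can consume. A minimal listening sketch, assuming the python-osc package; the handler just prints the normalized cursor positions.

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_cursor(address, *args):
    # CCV bundles "alive", "set" and "fseq" messages on /tuio/2Dcur;
    # a "set" message carries: session id, x, y, velocities and acceleration.
    if args and args[0] == "set":
        session_id, x, y = args[1], args[2], args[3]   # x, y normalized 0..1
        print(f"blob {session_id}: x={x:.3f} y={y:.3f}")

dispatcher = Dispatcher()
dispatcher.map("/tuio/2Dcur", on_cursor)

# CCV sends TUIO to port 3333 by default.
BlockingOSCUDPServer(("0.0.0.0", 3333), dispatcher).serve_forever()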

Experiment made in September 2013

Virtual Marionette demonstration at Noite Europeia dos Investigadores 2013 Lisboa (Parque das Nações)

The participants were able to explore the bePuppit system, an experiment exploring different ways to interact with puppets using our body. The players can choose the digital puppet setup with just their hands.

nei2013_4

nei2013_12-select

Body motion to control a Dragon figure

nei2013_13-select

Trying several body expressions with the puppets, in this case, a hug.

nei2013_14-select

Two players interacting with the puppets using their hands as puppetry controllers.


I participated in a very important workshop supervised by Marcelo Lafontana from the Lafontana Puppetry Company.

In this workshop we made and explored different puppetry styles, namely:

– Shadow puppetry;

– Toy theatre;

– Punch & Judy;

 

teatro formas animadas

In each style we had the opportunity to build and craft the puppets, props and backgrounds, and also to perform in that particular style.

And so, we were able to understand the differences between the styles: constraints, performance issues, manipulation controllers, materials, and so on…

 

shadow

Building the shadow puppetry project

OSCeleton2Ramdance 0.1 beta
Jun 2013

This MaxMSP patch was developed to make it possible to use the Microsoft Kinect (and more) with RAM Dance.

Kinect Framework 1: Osceleton->Osceleton2RamDance->RamDance

RAM Dance is an excellent toolkit to create environments for dancers. It is based on openFrameworks, so it is possible to modify it and create new behaviors.
RAM Dance can also send and receive OSC messages for remote control. This feature is a great extension to explore.

Although I work with digital puppetry, I found RAM Dance to be a very interesting environment to explore.

By the time this version was finished, another application developed by “eight” had been released, presenting an easier way to connect the Kinect to RamDance. I recommend using that application if you are on a Mac with the Microsoft Kinect v1 and don't want to modify any parameters. If you want to use another operating system or a different depth sensor, or to modify parameters, OSCeleton2Ramdance may be a better choice. Why? Basically because you can modify the patch to meet your needs; for instance, sending the OSC data to other ports and applications is easy.
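Outside Max, that forwarding idea can be sketched like this (a hypothetical Python relay, not part of the patch), assuming OSCeleton's default /joint messages arriving on port 7110:

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

# Resend every /joint message from OSCeleton to two extra applications.
targets = [SimpleUDPClient("127.0.0.1", p) for p in (9000, 9001)]

def forward(address, *args):
    for target in targets:
        target.send_message(address, list(args))

dispatcher = Dispatcher()
dispatcher.map("/joint", forward)
BlockingOSCUDPServer(("127.0.0.1", 7110), dispatcher).serve_forever()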

Because OSCeleton sends orientation data as a 3×3 matrix, I needed to convert it to an angle-based format (ffff) to be compatible with RamDance; that is why mat2axis is included.
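The conversion itself amounts to extracting an axis-angle representation from the rotation matrix. A rough sketch of the math, assuming numpy (the actual mat2axis is a Max patch, and the degenerate 180° case is ignored here):

import numpy as np

def mat_to_axis_angle(m):
    """Convert a 3x3 rotation matrix to (angle, axis_x, axis_y, axis_z)."""
    m = np.asarray(m, dtype=float).reshape(3, 3)
    # The rotation angle follows from the matrix trace.
    angle = np.arccos(np.clip((np.trace(m) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(angle, 0.0):
        return 0.0, 1.0, 0.0, 0.0            # no rotation: axis is arbitrary
    # The axis comes from the antisymmetric part of the matrix.
    axis = np.array([m[2, 1] - m[1, 2],
                     m[0, 2] - m[2, 0],
                     m[1, 0] - m[0, 1]]) / (2.0 * np.sin(angle))
    return angle, axis[0], axis[1], axis[2]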

Files included in the package
– Folder mat2axis (orientation data conversion)
– max.jar (this is a maxmsp file just to be sure)
– OSCeleton2Ramdance-b01.app (MAC application)
– OSCeleton2Ramdance-b01.maxpat (MAXmsp patch)
– readme.txt (this file)
(MAC or Windows)
What you need:
– You need OpenNI installed (www.openni.org)
– OSCeleton from sensebloom (https://github.com/Sensebloom/OSCeleton)
– Ramdance (http://interlab.ycam.jp/en/projects/ram/ram_dance_toolkit)

How to use:
1. Open a Terminal (Mac) or Command Prompt (Windows) to execute OSCeleton (please read the OSCeleton readme file for a better understanding)
You can execute the following line: ('-w' shows the input image / '-xt' enables orientation data)
./osceleton12 -w -xt -mx 150 -my -150
2. Run the RamDance application
3. Open the OSCeleton2Ramdance-b01 and turn it on

That´s it…

good exploration

 

DOWNLOAD

[wpdm_package id=”1668″]

 

questions or comments to virtual.marionette@grifu.com

 

Future Work:
– connect Wiimotes to control points of the skeleton
– optimize the patch
– Bridge Output data

Sensor  Joint             Sensor  Joint
0       Head              12      Right Elbow
1       Neck              13      Right Wrist
2       Torso             14      Right Hand
3       Waist             15      Right Fingertip
4       Left Collar       16      Left Hip
5       Left Shoulder     17      Left Knee
6       Left Elbow        18      Left Ankle
7       Left Wrist        19      Left Foot
8       Left Hand         20      Right Hip
9       Left Fingertip    21      Right Knee
10      Right Collar      22      Right Ankle
11      Right Shoulder    23      Right Foot
