The interactive installation Ditador was part of the exhibition “Abílio-José Santos. Revelação: Concretos e Visuais”, open to the public from 4 July to 8 September 2019 at Fórum Maia.

 

This installation was developed by Grifu, Luís Aly, and Marco Jerónimo using eMotion, QLab, and Kinect.

Ditador

Solitária

Poster for the Solitária performance

Solitária is a performance-play that explores human loneliness in solitary confinement, imagining the behavior of a prisoner through his mind and body. Our body is our only connection to the physical world, the only sensor capable of recognizing physical space. However, our sensory capabilities begin to misbehave, and soon we become disoriented.

Premiere on 9 November 2017 at 21h30, Blackbox of O Espaço do Tempo.
Further performances on 10 and 11 November at 21h30, Blackbox of O Espaço do Tempo.

[embedyt] http://www.youtube.com/watch?v=BmSXnxXGOXk[/embedyt]

 

Solitária follows the performance-play approach that has characterized the work of Alma d’Arame from its beginning. Each work starts inside a delimited space, inside its own confinement. On one hand, the space of narrative, theater, puppet, being, and object; on the other, the space of programming, kinetics, and multimedia. Starting from the solitary and creative space of each one, we watched the common space of creation being born. We all have, and need, that time with ourselves. It is in this time that we find the space of each one, which is ours alone, and where we can relive memories, hide, think, feel, register. Here we reach our own states. That is what this performative act is all about.
It is in this laboratory space that this solitary confrontation between man and machine, between real and virtual, unfolds, and it is this confrontation that will lead us to experimentation and the search for new narratives.
The kinetic movement of the body and how it occupies the empty space will build this visual and sound narrative.

Art team
Artistic direction | Amândio Anastácio;
Performance | Susana Nunes;
Multimedia | Luís Grifu;
Music | João Bastos;
Puppet | Raul Constante Pereira;
Lighting design and set design | Amândio Anastácio;
Lighting design and rigging | António Costa;
Production direction | Isabel Pinto Coelho;
Production assistant | Alexandra Anastácio;
Photography | Inês Samina;
Video | Pedro Grenha;

Production | Alma d’Arame

Support | Câmara Municipal de Montemor-o-Novo
Partnership | O Espaço do Tempo
Structure funded by | DGArtes and the Government of Portugal

The new version of Stringless for Unity supports Zeroconf networking through Bonjour for auto-discovery, as well as RemoteUI.

stringless

Just press Play in Unity, and all the parameters you have chosen for remote control will automatically appear in RemoteUI on another computer, or on an iOS or Android device.
You no longer need to set up the OSC addresses manually: just drag a Receiver onto a GameObject and define the value range, and RemoteUI will automatically recognize it and show you a slider to manipulate. These sliders can also be mapped to MIDI devices or other HID controllers, in a manner similar to a MIDI-learn feature. Just press the name of the parameter in RemoteUI until it starts to blink, then move a slider on your MIDI controller or a joystick on a gamepad. The mapping will be saved for future sessions. You can also define distinct devices for the same mapping, allowing device interchange.
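The learn-and-bind flow described above can be sketched in a few lines of Python. All names here are hypothetical illustrations, not RemoteUI's actual API:

```python
# Sketch of a MIDI-learn style binder: pressing a parameter name puts it
# in "learn" mode, and the next control event that arrives gets bound to it.
class LearnableMapper:
    def __init__(self):
        self.learning = None   # parameter currently blinking, if any
        self.bindings = {}     # (device, control) -> parameter name

    def start_learn(self, param):
        """User pressed the parameter name until it started to blink."""
        self.learning = param

    def on_control(self, device, control, value):
        """A slider or joystick moved on some controller."""
        if self.learning is not None:
            self.bindings[(device, control)] = self.learning
            self.learning = None
        return self.bindings.get((device, control)), value

mapper = LearnableMapper()
mapper.start_learn("arm.rotation")
param, value = mapper.on_control("midi-1", 7, 0.5)  # binds CC7 to arm.rotation
print(param)  # arm.rotation
# A second device can be bound to the same parameter, allowing interchange:
mapper.start_learn("arm.rotation")
mapper.on_control("gamepad", "left-x", 0.0)
```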

[vimeo 219604247 w=580&h=326]

 

Stringless support for RemoteUI is just a prototype and still in development.
It will be available for download soon.

Music: bensound.com
May.2017

Virtual Marionette puppet tools were used to develop several interactive installations for the animation exhibition Animar 12 at the Solar cinematic gallery in Vila do Conde. The exhibition opened on 18 February 2017.

 

Animar 12 - Interactive Installations

Faz bem Falar de Amor

An interactive installation that challenges participants to act out, with virtual characters, scenes from the animated music video created by Jorge Ribeiro for the song “Faz bem falar de amor” by Quinta do Bill.

The Puppit tool was adapted to drive two cartoon characters (a very strong lady and a skinny young man) using the body motion of visitors, captured with one Microsoft Kinect. The virtual characters’ skeletons differ from human body proportions, so each puppet exhibits distinct behaviors that do not exactly mirror the participant’s movement. Although our intention was to challenge participants to adapt their bodies to the target puppet, we helped them a little. To solve this discrepancy, I used two skeletons. A human-like skeleton is mapped directly to the performer’s body; the virtual character’s skeleton is then mapped to this human clone skeleton through an offset-and-scale function. In this way, it was possible to scale the movement of specific joints of the clone skeleton up or down, making the virtual character behave in a more natural, cartoonish way. This application was developed using Unity, OpenNI, and Blender.
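The offset-and-scale retargeting step can be sketched as a pure function in Python. The joint names, the per-joint angle representation, and the parameter values are illustrative assumptions, not the installation's actual code:

```python
# Retarget a clone-skeleton pose onto the cartoon skeleton: each joint gets
# an optional offset and scale, exaggerating or damping motion where needed.
def retarget(clone_pose, offsets, scales):
    """clone_pose: {joint: angle in degrees}; returns the cartoon pose."""
    return {joint: offsets.get(joint, 0.0) + scales.get(joint, 1.0) * angle
            for joint, angle in clone_pose.items()}

pose = {"elbow": 30.0, "knee": 10.0}
cartoon = retarget(pose, offsets={"elbow": 5.0}, scales={"knee": 2.0})
print(cartoon)  # {'elbow': 35.0, 'knee': 20.0}
```

Scaling a joint above 1.0 makes the puppet overshoot the performer's movement, which is what gives the cartoonish feel described above.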

É Preciso que eu Diminua

This animated music video was created by Pedro Serrazina for Samuel Úria. It tells the story of a character who suffers from a scale problem, towering over the buildings. The character tries to shrink by pulling his arms and legs close to his body; on the other hand, he feels the need to break free from these strict boundaries and push everything around him away. To convey this feeling of bodily expansion, the visitor drives a shadow-like silhouette with a contour line around it that grows when the participant expands his body and shrinks when he contracts it. The silhouette, captured by a Microsoft Kinect, is projected onto a set of cubes that deform the body shape. This application was developed with openFrameworks, using ofxKinect and ofxOpenCV.
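One plausible way to compute the contour growth described above is to compare the body's current spread with a resting spread. This is a sketch under assumptions (the spread measure and gain are illustrative, not the installation's actual implementation):

```python
# Contour scale for the silhouette: > 1 when the body expands beyond its
# resting spread, < 1 when it contracts toward the body.
def contour_scale(joints, rest_spread, gain=1.0):
    """joints: list of (x, y) tracked points; rest_spread: neutral spread."""
    xs = [p[0] for p in joints]
    ys = [p[1] for p in joints]
    spread = (max(xs) - min(xs)) + (max(ys) - min(ys))
    return 1.0 + gain * (spread / rest_spread - 1.0)

# Arms and legs spread wide -> the contour grows
print(contour_scale([(0, 0), (2, 0), (1, 2)], rest_spread=2.0))      # 2.0
# Limbs pulled close to the body -> the contour shrinks
print(contour_scale([(0, 0), (0.5, 0), (0.2, 0.5)], rest_spread=2.0))  # 0.5
```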

Estilhaços

This short animated film, produced by Jorge Miguel Ribeiro, addresses the Portuguese colonial war. The intention was to trigger segments of the film when the visitor places his body above a mine. Two segments show two distinct perspectives on the war: one from a father who experienced it, and the other from his child, who understood the war through his father’s indirect reports. A webcam captures the position of the visitor’s body, and whenever the body enters the mine space, an OSC message is sent as a trigger to a video-player application. Both the video-trigger and video-player applications were developed with openFrameworks.
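The enter-the-mine logic can be sketched as edge-triggered region detection in Python. The region coordinates and the OSC address are hypothetical; note that it fires only when the body enters the region, not on every frame spent inside it:

```python
# Fire an OSC-style trigger message once, on the transition from outside
# the mine region to inside it.
class MineTrigger:
    def __init__(self, region):
        """region: (x0, y0, x1, y1) rectangle of the mine in camera space."""
        self.x0, self.y0, self.x1, self.y1 = region
        self.inside = False

    def update(self, x, y):
        """Returns an (address, value) message on entry, else None."""
        now_inside = self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1
        fired = now_inside and not self.inside
        self.inside = now_inside
        return ("/mine/trigger", 1) if fired else None

mine = MineTrigger((1, 1, 3, 3))
print(mine.update(0, 0))  # None (outside)
print(mine.update(2, 2))  # ('/mine/trigger', 1) (just entered)
print(mine.update(2, 2))  # None (still inside, no retrigger)
```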


leapITString

LEAP IT – STRING is a simple application that lets you share the hand-motion data captured by a Leap Motion over the network via the OSC protocol, with the following features:

– Record the performance
– Replay a recorded performance
– Select joints for broadcast
– Convert a recorded binary performance into a text file
– Hide/Show the GUI
– Convert the scale of the hand motion data
– Switch Palm Orientation between Pitch, Yaw, Roll or Yaw, Pitch, Roll
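Two of the features above, scale conversion and palm-orientation ordering, can be sketched in Python. The function names and the scale factor are illustrative assumptions, not the application's actual API:

```python
# Convert the scale of a tracked point, e.g. from millimeters to scene units.
def rescale(point, factor):
    return tuple(c * factor for c in point)

# Reorder the palm angles: "PYR" = pitch, yaw, roll; "YPR" = yaw, pitch, roll.
def palm_orientation(pitch, yaw, roll, order="PYR"):
    axes = {"P": pitch, "Y": yaw, "R": roll}
    return tuple(axes[a] for a in order)

print(rescale((100.0, 50.0, 20.0), 0.01))         # (1.0, 0.5, 0.2)
print(palm_orientation(10, 20, 30, order="YPR"))  # (20, 10, 30)
```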

This application is a “string” from the Virtual Marionette Interaction Model developed by Luís Grifu for research on digital puppetry. It is free to use.

LeapIT String is now available for Mac and Windows

[wpdm_package id=”1753″]

[wpdm_package id=”1751″]

Joe Gran demonstrates the use of the Leap Motion to control the behavior of a character in real-time.

“The making of: Dog of Wisdom” is a video showing the potential of this mid-air device.

The work of Joe Gran is absolutely amazing. He mapped the fingers to the rig in Maya (probably using the Leap Motion plugin).

It works great!!

 

pull

 

Marionette Programming Language

Pull The Strings is a digital-puppetry, visual node-based control environment for performance animation.
It acts as a middleware interface between puppeteers and puppets, between device drivers and driven objects, between input interfaces and applications.

[vimeo 158084097 w=580&h=326]

It is a marionette programming engine that works as a digital mediator between devices and applications, providing the building blocks for the digital puppeteer to establish the manipulation semantics. It is a visual programming language inspired by the strings of marionettes and by the patch cords that connect modules in old analog video synthesizers. In this environment, signals are processed and sent over the network. Finally, the data arrives at Remote Control for Unity, a plugin that facilitates the mapping of OSC messages.
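The patching idea can be illustrated with a minimal dataflow sketch in Python. The node set, names, and addresses are hypothetical simplifications, not the real engine:

```python
# Operators are nodes, "strings" connect an output to downstream inputs, and
# a value arriving at a device node propagates through the graph until it
# leaves as an OSC-style message.
class Node:
    def __init__(self, fn):
        self.fn = fn          # the operator's transform
        self.targets = []     # downstream nodes (the "strings")

    def connect(self, other):
        self.targets.append(other)

    def send(self, value):
        out = self.fn(value)
        if not self.targets:
            return [out]      # a sink: emit the processed value
        msgs = []
        for t in self.targets:
            msgs.extend(t.send(out))
        return msgs

device = Node(lambda v: v)                   # e.g. a MIDI slider in [0, 127]
scale = Node(lambda v: v / 127.0)            # normalize, like math.scale
to_osc = Node(lambda v: ("/puppet/arm", v))  # format an OSC-style message
device.connect(scale)
scale.connect(to_osc)
print(device.send(127))  # [('/puppet/arm', 1.0)]
```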

It was designed with a minimalistic but intuitive interface and was developed in C++ with openFrameworks, making use of multiple add-ons to provide its main features.
A multidisciplinary environment (MMMM: Multi-platform, Multi-modal, Multi-mediator, Multi-user).
An environment for artists with a non-programming background, designed for digital puppeteers.

Pull The Strings is middleware, an interface between applications, devices, and performing objects. It can be considered a visual programming environment for remote control, made for artists and designers.
Sometimes it is hard to find one environment or application that offers all the features you are looking for, so why not use a set of applications together in real time?
The goal of Pull The Strings is to facilitate the use and control of all the resources found on your computer and on the network.
It is an abstraction from technology, making use of generic signals and communication protocols.
It connects and transforms signals from input and virtual devices into performing-object controls, mapping and orchestrating multimedia data using OSC, DMX, MIDI, and VRPN.
It is a remote-control application, an interface that connects inputs to outputs and provides a series of functions that help to control the animation. It is an OSC-based environment, and all nodes work with the OSC format.

Version Alpha 0.050316 (first release 6 March 2016)
Pull The Strings will become open source soon. It is completely free to use in your art, research, or professional projects.
Built with [openFrameworks](http://www.openframeworks.cc).
It is developed by Luís Leite (GRIFU) for the Digital Media PhD with the support of FCT – Fundação para a Ciência e a Tecnologia.

## Download

Download Pull The Strings:

[Mac OS X 10.9 (32-bit) only]

[wpdm_package id=”1684″]

## Demo video

## License

Copyright 2012-2016 Luís Leite

GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007

Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

Preamble

The GNU General Public License is a free, copyleft license for
software and other kinds of works.

## Pull The Strings Interface

Pull The Strings allows you to create, save, and load your projects.
Always start by saving your project, which will create a folder to gather all the needed files.
To navigate through the canvas use the key; to open the menu use the key, then start typing the operator name or use the mouse to navigate.
To delete an operator just use the key; to access the operator’s options, click the <*> on the operator.
Connect nodes from outputs to inputs. Add new input nodes by clicking the <+ O> button.
You can create multiple graph windows.

### OSC Communication
Pull The Strings is based on the OSC protocol. It receives and sends OSC messages, and its whole interface communicates internally via OSC.
You can send several messages from one output port.
It supports Bonjour, so you can easily find Pull The Strings with Bonjour-compatible applications such as TouchOSC.
It makes use of RemoteUI to remotely control the operators from other applications or devices.

### Pull The Strings operators

input.number (creates a float or int, set inside the interface or with remote applications)
input.string (creates strings)
math.scale (a scaling function with a learning input-range option)
osc.combine (combines several OSC messages into one; allows message syncing)
osc.expand (expands an OSC message into separate output nodes; allows inspecting the data types and choosing the address)
osc.port.in (opens a port for incoming OSC data)
osc.port.out (opens a port for sending)
print (for debugging; prints the incoming messages in the interface)
show.plot (plots the values)
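As an illustration, a math.scale-style operator with a learning input range might look like the following Python sketch (a hypothetical reconstruction, not the actual implementation):

```python
# Maps an input range to an output range; in learn mode, the input range
# widens automatically as values arrive, like math.scale's learning option.
class MathScale:
    def __init__(self, out_min=0.0, out_max=1.0):
        self.in_min = None
        self.in_max = None
        self.out_min, self.out_max = out_min, out_max

    def learn(self, value):
        """Widen the input range to include this observed value."""
        self.in_min = value if self.in_min is None else min(self.in_min, value)
        self.in_max = value if self.in_max is None else max(self.in_max, value)

    def scale(self, value):
        """Linearly map value from the learned range to the output range."""
        if self.in_min is None or self.in_max == self.in_min:
            return self.out_min
        t = (value - self.in_min) / (self.in_max - self.in_min)
        return self.out_min + t * (self.out_max - self.out_min)

op = MathScale(out_min=0.0, out_max=1.0)
op.learn(0.0)    # observe the extremes of the incoming signal
op.learn(10.0)
print(op.scale(5.0))  # 0.5
```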

## Version History

– Alpha 003 (March 5th 2016)

## Support

[Support](http://www.virtualmarionette.grifu.com)

## Other features
Pull The Strings has other operators that are not yet available, such as:
Timelines
LeapMotion support
Wiimote support
Math operations
Recording OSC performance

## Developed with
Pull The Strings is based on ofxDuct and makes use of many other ofxAddons such as:
ofxRemoteUI
ofxLeapMotion
Wiiuse
ofxBonjour
ofxAppGLFWindowMulti
ofxHistoryPlot
ofxUI
ofxTimecode
ofxMSATimer
ofxRange
ofxTimeline
ofxOsc
ofxGui
ofxMSAInteractiveObject

This is a small application developed by Eight (Vladimir Gusev) to use the Kinect as a motion-capture system for RamDance. It requires OpenNI.

It allows more than one user and accepts external “.oni” motion files (motion-capture files from OpenNI).

It replaces the need to work with OSCeleton and my own Max/MSP patch (although my patch is aimed at broader use).

It captures the motion and sends it directly to RamDance, without any intermediate application, making use of the OSC protocol. You can configure the network address to send to a different computer.

Download Mac Binary

[wpdm_package id=”1670″]

This file is the Mac OS X binary; please visit GitHub to download the source: https://github.com/eighteight/CocoKinect

 

Digimario - Digital Rod Puppet Style

Digi Mario – a digital rod-puppet style with physics. A hand-manipulated puppet using the Leap Motion controller: one hand controls the puppet while the other simulates the puppeteer’s virtual hand for physical interaction.

[vimeo 110798138 w=580&h=326]

This video shows how to animate a puppet with just one hand using physics to recreate the marionette aesthetics.

Marionette animation is fascinating but requires a lot of skill. This virtual marionette is much simpler, yet the digital puppeteer can control many aspects of the puppet; with little training, the puppeteer can bring it to life in a traditional style.
All the fingers are mapped to the face controls and the hand is mapped to the head. When you move the head, the body follows along, as if a rod were connected to the head.
An interesting effect: when you turn your hand pinky finger first, the eyes look toward the target and the head follows the eyes.

Framework: Leap Motion v2 + Unity

[vimeo 110452298 w=580&h=326]

Demonstration of full hand control for expressive digital puppetry

This video shows how to animate a digital puppet in real-time with just one hand in an expressive manner.
It is part of a PhD research in the digital puppetry field.

A stringless hand controller (a metaphor for the marionette controller).
The hand and five fingers control different aspects of the puppet:

Hand: position and orientation of the puppet
Pinky finger: Eye Pupils rotation in all directions
Index finger: Eyelashes rotation in the +Y and -Y axis (open and close)
Middle finger: Right Eyebrow blend shape deformation for character expressions
Ring finger: Left Eyebrow blend shape deformation for character expressions
Thumb: Mouth (open and close) blend shape deformation

There are different degrees of freedom (DOF) for each finger mapped to a certain puppet control.
The hand has 6 DOF (position and rotation); you can move your hand freely around the tracking area, but you must be careful with the occlusion problem.
The pinky finger has 3 DOF for the rotation of the eye pupils; although controlling the pinky independently has some constraints, it is more than adequate for this kind of small motion.
The middle and ring fingers have more constraints and are very hard to control independently, so I mapped just 1 DOF (up and down) of each finger to the eyebrows.
The index finger has more potential because you can control more degrees of freedom. I tried using it to control the eye pupils, but the results were not great; instead, the index finger is mapped to the eyelashes rotation with just 1 DOF (up and down).
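The mapping above can be summarized as a small table in code, clamping each tracked feature to the DOF count of its control. This is a hypothetical Python sketch (control names and data shapes are illustrative), not the actual Unity implementation:

```python
# Each hand feature drives one puppet control with a limited number of DOF.
MAPPING = {
    "hand":   {"control": "head",          "dof": 6},  # position + rotation
    "pinky":  {"control": "eye_pupils",    "dof": 3},  # pupil rotation
    "index":  {"control": "eyelashes",     "dof": 1},  # open/close
    "middle": {"control": "right_eyebrow", "dof": 1},  # blend shape
    "ring":   {"control": "left_eyebrow",  "dof": 1},  # blend shape
    "thumb":  {"control": "mouth",         "dof": 1},  # open/close
}

def apply_pose(hand_pose):
    """Clamp each tracked feature's values to its control's DOF count."""
    return {m["control"]: hand_pose[f][: m["dof"]]
            for f, m in MAPPING.items() if f in hand_pose}

pose = {"thumb": [0.7, 0.1], "pinky": [0.2, 0.3, 0.4, 0.9]}
print(apply_pose(pose))  # {'eye_pupils': [0.2, 0.3, 0.4], 'mouth': [0.7]}
```

Keeping the mapping in one table makes it easy to retarget the same hand data to a second puppet, as the two-puppet prototype below suggests.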

This is a good direction for digital puppeteers who want full control of the character’s expressivity. It requires some training to act like a character, but that’s the magic of puppeteering.

This prototype can animate two different, fully controlled puppets for interaction.
A powerful model for performance animation using digital puppetry techniques.

Hardware: Leap motion device for hand tracking and a macbook
Software: Unity Engine