VeraVoegelin
contact: veravoegelin[at]annaverakelle.de
WORK IN PROGRESS!
Let's get started with Unity and Azure Kinect! But first of all, we have to wait for Unity to finish installing (version 2019.4.22f1) ...

... Unity is a Real-Time Development Platform we use to build a multiplayer 3D space ...

Register for our OpenDoorEvent here
We are having a #TakeCare Residency at Pathos Theater in Munich and will be on-site for 14 days.
What we will explore is:
How can we connect the installation EARTHBOUNDS with a virtual 3D multiplayer world with the help of an Azure Kinect so that you can interact with it remotely?
... we spent three days trying to get the Kinect connected to Unity on a NUC8, with Unity crashing every time we tried to run it. Finally we found out that the graphics card is not compatible with the body tracking of the Kinect, which needs a specific driver for NVIDIA cards (CUDA). So caution: the Kinect body tracking only runs with an NVIDIA GPU! All other sensors can be used by Unity!
Azure Kinect DK is a developer kit with advanced AI sensors that provide sophisticated computer vision and speech models. The Kinect includes a depth sensor, a spatial microphone array, an RGB video camera, and an orientation sensor, along with multiple software development kits (SDKs).

With the Azure Kinect Viewer (v1.4.10) you can check out the different sensors; this example is a recording from the depth sensor.
Azure Kinect Body Tracking Viewer
Some Screenshots of the Demos we used to explore the Kinect in Unity.
We are using the Azure Kinect Examples for Unity from the Asset Store as well as examples Friedrich Kirschner made for a basic Unity multiplayer.


>> Link to Azure Kinect sensor SDK system requirements
>> Link to Short description of Demo Scenes
>> Link to Basic Unity Multiplayer by Friedrich Kirschner on GitHub
>> Link to Azure Kinect Examples for Unity in the Asset Store
The room is recorded by the depth sensor of the Kinect, transformed into a mesh, and streamed into the 3D room.
We also spent several days trying to give the mesh a MeshCollider so we can "walk" on it.
However, we haven't found a solution yet. Any ideas?
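For reference, the obvious direction is to copy the generated scene mesh into a MeshCollider whenever it updates. A minimal sketch (sourceMeshFilter is a placeholder for whatever object the Kinect asset writes its mesh to, and the update interval is an example value):

using UnityEngine;

// Copies the scene mesh generated from the depth data into a
// MeshCollider at a fixed interval (re-cooking colliders every
// frame would be far too expensive).
[RequireComponent(typeof(MeshCollider))]
public class SceneMeshColliderUpdater : MonoBehaviour
{
    public MeshFilter sourceMeshFilter; // assign the Kinect scene-mesh object here
    public float updateInterval = 1f;   // seconds between collider updates

    private MeshCollider meshCollider;
    private float nextUpdate;

    void Start()
    {
        meshCollider = GetComponent<MeshCollider>();
    }

    void Update()
    {
        if (sourceMeshFilter == null || Time.time < nextUpdate) return;
        nextUpdate = Time.time + updateInterval;

        // Re-assigning sharedMesh forces Unity to re-cook the collider.
        meshCollider.sharedMesh = null;
        meshCollider.sharedMesh = sourceMeshFilter.mesh;
    }
}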


ColliderDemo


BlobDetectionDemo


PointCloudDemo / SceneMeshDemo


How can we communicate with the earth on which we live?
At Schaubude Berlin (in autumn 2020) we created a setting for one participant on-site:
Earthbounds is a speculative performance that expands the experience of the body(ies) in space through sensors. In a prototypical world, the participants have the possibility to get in contact with their surrounding world through their being in the world and to experience themselves as part of the critical zone.


For the visual translation of the setting into our virtual multiplayer world we are using the Kinect. To connect all the sensors and actuators in it, we will use serial communication protocols and Open Sound Control (OSC). We started off with the serial connection over a COM port. For that we used the free Ardity package. It's easy to use and comes with a setup guide that we followed. The Arduino Uno talks to Unity and the other way round:
Serial.begin(9600);  // in the Arduino sketch; the baud rate has to match Ardity's SerialController
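On the Unity side, Ardity works with a SerialController component that you point at a "message listener" GameObject. A minimal listener script, following Ardity's setup guide (the log texts are just examples):

using UnityEngine;

// Attach this to the GameObject that is referenced as "Message Listener"
// on Ardity's SerialController component. Ardity calls these two methods
// via SendMessage whenever a line arrives or the connection state changes.
public class ArduinoMessageListener : MonoBehaviour
{
    void OnMessageArrived(string msg)
    {
        Debug.Log("Arduino says: " + msg);
    }

    void OnConnectionEvent(bool success)
    {
        Debug.Log(success ? "Connected to Arduino" : "Connection lost");
    }
}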

As in EARTHBOUNDS some of the feedback we get from our surroundings is bound to sound (sensors are connected to piezos), we want to integrate the microphone of the Azure Kinect into our multiplayer world. Unity provides a microphone audio module in its engine. However, we found out that it only lets you integrate simple microphones. The Kinect as well as the NUC come with a microphone array, and it seems that Unity doesn't like that. We always got the same error.
>> Link to Unity Real-Time Development Platform
>> Link to Ardity in the Unity Asset Store
>> Other people facing the same issues with the microphone array
>> Tutorial to set up an ESP32 as a WiFi access point
At the moment our EARTHBOUNDS setting consists of four ESP32 DevKits and three Arduino Unos. One ESP is set up as a WiFi access point, which allows us to use OSC to send messages between the ESPs. Our first idea was to let the Unity multiplayer setting communicate with the ESPs via WiFi using OSC messages as well. But it turns out that an ESP WiFi access point can only handle four connected clients. As we already have four ESPs that all work as clients, there is no capacity for another client.
As our Arduinos will always be close to the computer that works as the host server (we also need to connect the Kinect to it), we can just as well use serial for the whole communication between the multiplayer and the Arduinos connected to the physical space.
So we will do that!
However, if you ever want to change your IP settings on Windows 10 manually, you need to assign some parameters yourself; this is what we needed (and if you want to ping, don't forget to turn off your firewall).
In the following we will document our process continuously. However, it's not intended to be exhaustive. It's difficult to decide where to start and where to stop, and you might have a different familiarity with the whole topic. So don't hesitate to contact us if we leave you behind with questions!

As we found out, other people have had the same issue, and there doesn't seem to be a solution for integrating microphone arrays in Unity yet.
With a simple microphone it was fairly easy in the end. You just need to add an AudioSource component to an empty GameObject, plus a script that starts the microphone and adds it as an AudioClip to your AudioSource.
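A minimal version of such a script could look like this (null selects the default microphone; the clip length and sample rate are just example values):

using UnityEngine;

// Routes the default microphone into an AudioSource.
// This works for simple (single-channel) microphones,
// not for the Kinect's microphone array.
[RequireComponent(typeof(AudioSource))]
public class MicrophonePlayer : MonoBehaviour
{
    void Start()
    {
        AudioSource source = GetComponent<AudioSource>();
        // null = default microphone; loop a 10-second buffer at 44.1 kHz
        source.clip = Microphone.Start(null, true, 10, 44100);
        if (source.clip == null) return; // no microphone found
        source.loop = true;
        // wait until the microphone actually delivers samples
        while (Microphone.GetPosition(null) <= 0) { }
        source.Play();
    }
}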

We haven't taken care of the audio also working for multiplayer clients yet. We will come back to that later and now continue with OSC and serial communication...
>> Here are some ways to do it ...
We connected three Unos to our host server via serial.
This works fine, but be careful: Unity doesn't like to get too many serial messages and tends to crash. So you shouldn't use serial for sending frequent debug messages, and make sure you have a decent delay in your Arduino code!
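If you poll Ardity instead of using its message listener, you can also cap how many messages Unity processes per frame. A minimal sketch (the cap of 5 is an arbitrary example value):

using UnityEngine;

// Polls Ardity's SerialController and processes at most a few queued
// messages per frame, so a chatty Arduino can't flood Unity.
public class ThrottledSerialReader : MonoBehaviour
{
    public SerialController serialController; // Ardity component, assign in the Inspector
    public int maxMessagesPerFrame = 5;       // example value, tune as needed

    void Update()
    {
        for (int i = 0; i < maxMessagesPerFrame; i++)
        {
            string msg = serialController.ReadSerialMessage();
            if (msg == null) break; // queue is empty
            // Ardity also delivers connect/disconnect notifications as
            // special messages; we ignore that here for brevity.
            Debug.Log("Received: " + msg);
        }
    }
}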
We had some problems with one of our Arduino codes: whenever we sent the inByte 500, it left the if block with a super high delay, and we didn't know where that came from. But by chance we inserted this line of code into our if block, and it worked!
So far we haven't really thought about why (there are various conspiracy theories floating around about Arduino messing with us), but ok...
The 14 days are over. We really enjoyed working on the setup so intensely and learned a lot... However, it was not much time, it's not ready yet, and there are still some things we would love to figure out and look at in the future...
>> How can we position the Azure Kinect better so that you can see more of the setup (> get a better mesh) and also see the full body of someone walking around in the setup?
>> How can we transform the mesh into a physical collider?
>> Making a WebGL-Build so one can actually visit EARTHBOUNDS remotely.
>> Position the AudioSources properly.
>> Setting up the audio for multiplayer-clients.
>> Rethinking the transition of EARTHBOUNDS into a virtual world conceptually to make it even more sensible and sensitive.
>> Would it be interesting to include body tracking of people walking around in the physical space so that avatars can interact with them?
Supported by Fonds Darstellende Künste with funds from the Federal Government Commissioner for Culture and the Media
watch our little video!
© 2020, of all content: Leoni Voegelin | Impressum