Project 2

The Stick Figure Dance Company

In Project 1, Romeo and Juliet, we learned how to make Processing talk. In this project, we will learn how to make it see and dance. We will use Microsoft's Kinect to implement the seeing part and a dancing human to teach a group of stick figures how to dance. Unless you find someone else who is willing to lead your stick figures, you will have to get up and dance yourself.

Computer vision and 3D scanning have long been the domain of very specialized and expensive hardware and software. With the ever increasing computing power of CPUs and graphics cards, and new hardware controllers like Kinect, computer vision projects have invaded living rooms via various game controllers, and are accessible to everyday programmers like you and me.

Mission Briefing

In this project, we will learn how to connect Microsoft's Kinect to a computer and use depth imaging and user tracking from Processing. We will rush through the installation of the OpenNI framework in the first task, as this library is used by the Processing library SimpleOpenNI. We will then learn how to use the depth image feature of the Kinect infrared camera and the player tracking function in Processing.
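To give a rough idea of what this will look like on the Processing side, here is a minimal sketch, assuming the SimpleOpenNI library is already installed (we deal with the installation itself in this project's tasks). The calls used here (enableDepth(), update(), and depthImage()) are part of the SimpleOpenNI API:

// Minimal depth-image preview; assumes SimpleOpenNI is installed
// and a Kinect is connected and powered.
import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);                     // matches the Kinect depth resolution
  context = new SimpleOpenNI(this);   // open the OpenNI context
  context.enableDepth();              // switch on the depth generator
}

void draw() {
  context.update();                   // fetch the next frame from the sensor
  image(context.depthImage(), 0, 0);  // show the grayscale depth map
}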

Then, we will use the so-called skeleton tracker to locate not only the user in front of the camera, but also his or her head, neck, and elbows. The 3D coordinates of these joints will allow us to control a stick figure.
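As a preview of how the skeleton tracker is queried, the following fragment is a sketch of the idea rather than the final project code: it assumes that user tracking has been enabled via SimpleOpenNI's enableUser() call and that userId comes from the library's user callbacks. getJointPositionSkeleton() and convertRealWorldToProjective() are SimpleOpenNI calls; the joint constants name the body parts we are interested in:

// Fragment, meant to live inside draw(): read two joints of a tracked user.
// userId is a placeholder for the ID reported by SimpleOpenNI's user callbacks.
int userId = 1;

if (context.isTrackingSkeleton(userId)) {
  PVector head = new PVector();
  PVector neck = new PVector();

  // 3D world coordinates of the joints, in millimeters
  context.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_HEAD, head);
  context.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_NECK, neck);

  // project into screen coordinates before drawing
  PVector headScreen = new PVector();
  PVector neckScreen = new PVector();
  context.convertRealWorldToProjective(head, headScreen);
  context.convertRealWorldToProjective(neck, neckScreen);

  ellipse(headScreen.x, headScreen.y, 40, 40);                   // head
  line(headScreen.x, headScreen.y, neckScreen.x, neckScreen.y);  // neck
}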

In the final task of our current mission, we are going to add a group of additional dancers that will also be controlled by the 3D coordinates of the players' limbs.
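One straightforward way to add such a group, sketched here with a hypothetical drawDancer() function standing in for whatever code draws a single stick figure from the tracked joints, is to render the same figure several times with a horizontal offset:

// Hypothetical: draw a row of five dancers that all follow the same joints.
for (int i = 0; i < 5; i++) {
  pushMatrix();
  translate(i * 120, 0);   // shift each copy to the right
  drawDancer();            // placeholder for the stick-figure drawing code
  popMatrix();
}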

You can see a screenshot of the final sketch here:

Why Is It Awesome?

Kinect enables a whole lot of new possibilities for interacting with a computer. It allows the player to control a computer by simply moving in front of it. How awesome is that? And as a nice side effect, it makes computer users move—something they usually aren't very good at.

Disclaimer

The author is not responsible for injuries and accidents that arise if you place your limbs forcefully into places that are already occupied by other space-time aggregations. So please make sure you have enough free space around you while dancing.

Apart from tracking the positions of the user's body parts, Kinect also provides us with a depth map of the surroundings and a bitmap that allows us to determine which pixels of the image belong to the player and which don't.
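In Processing, that per-pixel player information can be read as an array of user labels. The following is a minimal sketch of the idea, assuming a SimpleOpenNI version built for OpenNI 1.5.x, where getUsersPixels(SimpleOpenNI.USERS_ALL) returns one label per depth pixel (0 for background):

// Fragment for draw(): keep only the pixels that belong to a player.
// Assumes a 640x480 sketch with enableDepth() and enableUser(...) called in setup().
context.update();
int[] userMap = context.getUsersPixels(SimpleOpenNI.USERS_ALL);
PImage depth = context.depthImage();
depth.loadPixels();

loadPixels();
for (int i = 0; i < userMap.length; i++) {
  // a non-zero label means this pixel belongs to a tracked player
  pixels[i] = (userMap[i] != 0) ? depth.pixels[i] : color(0);
}
updatePixels();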

Your Objectives

This Hotshot project is split into the following four tasks:

- Connecting the Kinect
- Making Processing see
- Making a dancer
- Dance! Dance! Dance!

Mission Checklist

To complete this mission, you need a Kinect to connect to your computer. Kinect is connected to a USB port, but it needs more power than what a standard USB port can provide. Therefore, Kinect comes with a non-standard USB connector and a separate power connector. If you got your Kinect in a bundle with the Xbox 360, you will probably need to buy a USB power adapter for it; newer versions of the Xbox 360 provide a special port to connect the Kinect directly. You can get the adapters at an electronics store or online.

Since this project is about dancing, you will also need some music to dance to. The stick figures we are going to create are very tolerant when it comes to genres of music, so feel free to choose whatever music you like—Donna Summer, Zillertaler Schürzenjäger, Bobby Brown, Skrillex. If you like it, your stick figures will like it too.

Connecting the Kinect

In this first task, I will guide you through the installation of the OpenNI framework and a library from PrimeSense (the company that developed the Kinect for Microsoft). We will use OpenNI example programs to test our installation.

Engage Thrusters

1. At the time of writing, the most recent version of the OpenNI framework was 2.1 Beta, but the Processing library that we are going to use for our next task requires Version 1.5.4 of the OpenNI framework. Open the site http://www.openni.org/openni-sdk/openni-sdk-history-2/ in your browser, as shown in the following screenshot, and select the download package for your platform:

2. If you are running Linux or Mac OS X, open a Terminal window, go to the directory where you downloaded the file, unpack it using the tar command, and install it as root by running the install.sh script:

tar xvjf openni-bin-dev-linux-x86-v1.5.4.0.tar.bz2
cd OpenNI-Bin-Dev-Linux-x86-v1.5.4.0

sudo ./install.sh

3. On Windows, just execute the MSI file you downloaded.

4. Repeat the above steps for the NiTE v1.5.2.21 file by downloading the package and installing it using the following commands:

tar xvjf nite-bin-linux-x86-v1.5.2.21.tar.bz2
cd NITE-Bin-Dev-Linux-x86-v1.5.2.21/

sudo ./install.sh

5. Now go to https://github.com/avin2/SensorKinect/tree/unstable/Bin and download the sensor package for Kinect. You do not need the sensor package you get from http://openni.org; this is for another sensor bar and won't work with Kinect.

6. Unpack the sensor package and install it on Linux or Mac OS X by running the following commands:

tar xvjf SensorKinect093-Bin-Linux-x86-v5.1.2.1.tar.bz2
cd Sensor-Bin-Linux-x86-v5.1.2.1/

sudo ./install.sh

7. On Windows, install the sensor package by double-clicking on the MSI file.

8. Now connect your Kinect to the power adapter and then to an empty USB port, and place it in front of your monitor. Make sure you have some free space in your room to stand in front of the sensor bar, as shown in this diagram:

[Diagram: placement of the Kinect, the human, and the monitor]

9. Go to the development package and run the NiViewer example program from the x86-Release folder under OpenNI-Bin-Dev-Linux-x86-v1.5.4.0/Samples/Bin/:

cd ../OpenNI-Bin-Dev-Linux-x86-v1.5.4.0
cd Samples/Bin/x86-Release/

./NiViewer

10. Place a human in front of the Kinect sensor and ask him/her to move. In the following screenshot, you can see the author move his coffee cup:

Objective Complete - Mini Debriefing

We just downloaded and installed the OpenNI open source drivers to access the Kinect sensor board. These drivers allow us to access infrared and RGB images from the Kinect controller. Kinect also does a depth scan using the infrared camera and is capable of tracking users.

After installing the development kit, the middleware, and the sensor driver for the Kinect controller, we used one of the example programs from the development kit to test the installation.

Classified Intel

There are also other sensor bars apart from Kinect that use the PrimeSense technology and are supported by the OpenNI package. To use these, you need to install the sensor packages from http://www.openni.org, which we skipped in the Engage Thrusters section.

I have run and tested the programs in this project using a Kinect, but you could try them with other sensor bars such as PrimeSense Sensors or ASUS Xtion.
