Sunday, November 6, 2011

Installation Complete!



The installation is up and running along with the selection-of-years feature, so the bulk of the programming work is FINALLY COMPLETE. The only thing left to do is to put down the floor decal. With so many visitors expected each day over the long weekend, there hasn't been a chance to place the decals yet, but hopefully there will be an opportunity soon (some visitors and children don't seem to read the instructions on the corner of the projection explaining how to interact with it, and instead just run amok in front of it, not knowing what is going on).

IMG_1544

IMG_1543

I feel lucky that we got a powerful computer for the installation. I could not even test the Years feature on my Mac, as it was painfully slow to run anything at all. A fast PC really makes all the difference: everything runs magically smoothly onsite at MBS.

I think the reason it was possible to do this within two weeks is that the equipment is so easily available (Kinects can even be bought at the 24-hour Challenger store!), the device does not need to be physically hacked apart, and the documentation online for the Kinect and as3kinect is excellent (although you will need to sift out the white noise of random discussions and blog posts covering similar problems and errors). Although... I did work overnight in the kitchen for a few days in order to program and test it...

IMG_1442

Friday, October 28, 2011

Kinect Testing On-site

AttractionGridTriangles

photo 3

photo 2

A version of the program is up and working now. I made it in two weeks, mostly just using the depth data and blob detection. Granted, I could have used more advanced Kinect features like skeleton tracking, but mind you, I only had two weeks to code something up, and no one was going to help me debug if I wrote myself into a corner.

2am: One of the docents points out that I need to mirror the video image. Why yes, of course I do! Thankfully this is easily done in as3kinect.
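(For anyone wondering what "mirroring" amounts to: as3kinect makes it easy, but by hand it is just a horizontal flip of the display object holding the video. A minimal sketch, with videoHolder as a stand-in name:)

import flash.display.DisplayObject;

// Minimal mirroring sketch: flip the holder horizontally and shift it
// back into place. "videoHolder" is a stand-in name for whatever
// DisplayObject the Kinect video is drawn into.
function mirrorHorizontally(obj:DisplayObject):void {
    obj.scaleX *= -1;
    obj.x += (obj.scaleX < 0) ? obj.width : -obj.width;
}

mirrorHorizontally(videoHolder);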

3am: Terrifying discovery of CABLES BEING TOO SHORT / NOT WORKING resulted in a crazy 4am trip to Funan Challenger to get VGA cables and connectors. The cables had to be run across the giant ceiling, but we didn't have the connectors either. Sadly, Funan Challenger, despite having a MASSIVE wall of cables, did not have our particular M/F cable (everything was M/M!), and we found it later at our next 24-hour option, Mustafa's, after searching a massive pasar-malam-style cage full of dusty, poorly wrapped VGA cables in ragged plastic bags. But there they were! THEY WERE THERE!

Picture 5

7am: STILL SITTING HERE hoping the technician will have a breakthrough with the cabling! This is now all out of my control!

Also, because I don't want to break the code 2 hours before the opening, I am going to integrate the time/years function after today's opening. So for today's version there is only the map of 1819.

Picture 11

Picture 14

9am: The technician cannot figure out what is wrong, despite doggedly spending the last few hours cabling, recabling, and tracing the cables. So we have sadly decided to postpone the setup to tomorrow, so the tech guys or contractors can figure out how to connect a giant projector to a computer some 10 metres away.

In the meantime, everyone shall be regaled with this promo video I made the day before!

Monday, October 24, 2011

Debug

Today I installed the Kinect drivers on the actual Windows PC to be used in the installation. I am grateful that it is brand new and has lots and lots of RAM. Running all the Kinect tests on it is A JOY, unlike on my Mac, which induces MORE AND MORE RAGING PANIC because I can visibly see the lag.




Problem: When installing on Windows 7 (32-bit), running glview, glpclview, or as3-server generates the following error:

"freenect_win_as3server_0.9b\libusb0.dll is either not designed to run on Windows or it contains an error."

My Mac runs Windows 7 (64-bit), so I hadn't encountered this before. Luckily the solution was easy to find; it was simply a matter of finding the right USB driver for the job.

Solution: Copy libusb0_x86.dll from the freenect_drivers > xbox nui camera > x86 folder, paste it into the folder where as3-server is, and rename it to libusb0.dll.




Problem: Need to import a Flex project into a Flash project.

Picture 19
Just press update.




Next Problem: "ArgumentError: Error #1063: Argument count mismatch on Spring(). Expected 0, got 5."

I was moving a Flex project over to Flash CS5 so I could have more control over the look of the interface with the Kinect, buttons and all. However, the working code I had simply broke at this one point, returning this miscreant error.

So I have a Spring that is expecting to receive 5 values, but for SOME REASON it says it expects 0? I checked the Spring class and it was properly constructed. I checked the other classes and they were all straightforward. So what was wrong? My initial thought: maybe the deserializer has to construct an instance of the object before setting its properties, so maybe I have to define a default value for the arguments, even if I just set them to null first. BUT THIS DIDN'T WORK.
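For the record, this is the shape of that attempted workaround (a hypothetical reconstruction, not my actual Spring class): give every constructor argument a default value so the deserializer can call new Spring() with no arguments. It compiled, but the error persisted for me.

package {
    // Hypothetical reconstruction of the workaround, not the real class:
    // defaulted arguments let the deserializer call new Spring() with
    // zero arguments before filling in the public properties.
    public class Spring {
        public var x1:Number, y1:Number;
        public var x2:Number, y2:Number;
        public var stiffness:Number;

        public function Spring(x1:Number = 0, y1:Number = 0,
                               x2:Number = 0, y2:Number = 0,
                               stiffness:Number = 0) {
            this.x1 = x1; this.y1 = y1;
            this.x2 = x2; this.y2 = y2;
            this.stiffness = stiffness;
        }
    }
}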

I checked it over and over again. A FEW HOURS LATER (after frantic trial and error) I JUST COMPILED IT WITH AN OLDER VERSION OF FLASH/playerglobal.swc, for Flash Player 10.3. AND THEN THE THING WORKED? I still have no idea why such an error would surface in Flash CS5. I can't see why; it was not very complicated to begin with. Sometimes AS3 really annoys me. Grrrrrrrrrrrr.

SOLUTION:
Picture 5

Also, when you move a Flex project over, you will be prompted (if you use Flex features) to import the Flex SDK as well. Woo and yay. In the end I used Flex SDK 3.5. Someone explain to me the difference between 3.5 and 4. But explain it to me later, when I have more time to comprehend this all slowly.

PS: IF YOU RANDOMLY IMPORT LOTS OF SWCs FOR NO GOOD REASON, YOU WILL ALSO BREAK OTHER PARTS OF YOUR CODE. The intricacies of this are lost on me at this point.

Friday, October 21, 2011

Setting up a Kinect-based installation in one weekend? Oh yes.

Goal 1: CAN COMPUTER GET IMAGE? - YES



Picture 11

YIPPEE!

I followed the instructions on this O'Reilly page. I should add that there are many descriptions and tutorials out there, but not all of them will work for you. It also depends on what you want to do and what you can work with (in my case, I can only code in AS3). I flapped about aimlessly for half a day, constantly downloading, installing, and uninstalling loads of random drivers, and it got me nowhere. In the end, this informative and detailed introduction to openkinect and as3kinect really sorted me out.

I am running Windows 7 via Parallels Desktop 7 on a unibody MacBook Pro (2.53 GHz Intel Core 2 Duo, 4 GB RAM).

  1. Uninstall all the other goddamned Kinect-related drivers you've previously installed. UNINSTALL EVERYTHING THAT IS CONFUSING.
  2. Download freenect_drivers.zip and unzip it. Install the drivers ONE BY ONE in Device Manager - http://as3kinect.org/distribution/win/freenect_drivers.zip
  3. Download and install Microsoft Visual C++ 2010 Redistributable Package - "The Microsoft Visual C++ 2010 Redistributable Package installs runtime components of Visual C++ Libraries required to run applications developed with Visual C++ on a computer that does not have Visual C++ 2010 installed." - If you installed Visual 2010 Express you can skip this part. (OR PRETEND TO "REPAIR" THE INSTALLATION IF YOU ARE PARANOID! Makes no difference, it seems, except to calm one's RAGING PANIC)
  4. Download the as3 server bridge - http://as3kinect.org/distribution/win/freenect_win_as3server_0.9b.zip
    Contents of the folder, as described in the original guide:
    • as3-server.exe—This is the bridge to ActionScript.
    • freenect_sync.dll—Part of the OpenKinect API. This is used to get the data when it can be processed instead of when it is available.
    • glut32.dll—This is an OpenGL library dependency.
    • pthreadVC2.dll—Unix multithreading library.
    • freenect.dll—This is the main OpenKinect library (the driver).
    • glpclview.exe—Demo from OpenKinect (3D projection of depth and colour).
    • glview.exe—Demo from OpenKinect (2D projection of depth and colour).
    • libusb0.dll—USB library.
    • tiltdemo.exe—Demo from OpenKinect for controlling the Kinect's tilt motor.
  5. Running glview.exe returns the above image.


Goal 2: CAN AS3SERVER SEND KINECT DATA TO AS3CLIENT? - YES



Because I needed to understand how this worked, I redrew the image explaining how the information gets from the Kinect to my AS3 client. (Although the original diagram was a lifesaver and I really appreciate it, I am also a spelling nazi, so itchyfingers here had to correct it.)

kinect

Test 1: Opening test.exe (Flash executable) provided in freenect_demo_pkg_0.9b.zip
Switch on as3-server and then open test.exe.

Picture 2

YES, Flash can receive video data, and the Kinect can receive messages from Flash (changing LED colour, motor tilt, etc.).

Test 2: Opening test.fla in Flash/Flash Builder to figure out how it works

In the given test.fla there is a piano. When you present a "finger" shape (depending on depth calibration), a tone sounds when it overlaps the "piano".

Picture 4

The green spot represents the blob that is detected. The blue rectangle represents a successful hit. On a successful hit, a tone plays, corresponding to the location of the "piano key" within the scale. In the example, the tone is produced in AS3 using AS3 Sound Synthesis IV – Tone Class.
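(For the curious, generating a tone in pure AS3 just means feeding samples to a Sound object. Here is my own stripped-down sine-wave reduction of the idea, not the tutorial's actual Tone class:)

import flash.events.SampleDataEvent;
import flash.media.Sound;

// Stripped-down sine tone (my own reduction of the idea, not the
// tutorial's Tone class). The demo maps the key position to freq.
const SAMPLE_RATE:Number = 44100;
var freq:Number = 440; // A4
var phase:Number = 0;
var tone:Sound = new Sound();
tone.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);

function onSampleData(e:SampleDataEvent):void {
    for (var i:int = 0; i < 2048; i++) {
        var sample:Number = Math.sin(phase) * 0.25; // keep the volume gentle
        e.data.writeFloat(sample); // left channel
        e.data.writeFloat(sample); // right channel
        phase += 2 * Math.PI * freq / SAMPLE_RATE;
    }
}

tone.play(); // plays until stopped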

I DO NOT KNOW WHY IT THINKS THERE IS A BLOB AT THE TABLE. Is it because detection goes by depth? Should my "zone of interaction" just be a line instead? And why did I not think of sound as a response for interactions earlier (is it too late to get a speaker set up on site as well)?

Picture 5

This is the portion of the code that refers to the piano function. It looks straightforward and humanly understandable.
Crucially, it says: "Events are fired by as3kinectUtils.fireTouchEvent"

More information about the as3kinectUtils class: Openkinect AS3 API.
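So on the client side, responding to a hit should look something like the sketch below. I am assuming here that fireTouchEvent surfaces standard flash.events.TouchEvent objects, and pianoKey is just a stand-in name:

import flash.events.TouchEvent;

// Assumption: as3kinectUtils.fireTouchEvent ends up dispatching standard
// TouchEvents on the target. "pianoKey" is a stand-in display object.
pianoKey.addEventListener(TouchEvent.TOUCH_BEGIN, onKeyTouched);

function onKeyTouched(e:TouchEvent):void {
    trace("blob hit at", e.stageX, e.stageY, "touch id:", e.touchPointID);
    // ...play the tone that corresponds to this key...
}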

Side observation: latency becomes highly noticeable once I turn on blobs. I am not sure if Parallels or insufficient RAM is the reason for the latency during blob detection, although there appears to be no lag in the normal RGB video. I will only know when I test it on a native Windows PC tomorrow.

At this point I am finally beginning to see the limitations of OpenKinect, as only OpenNI/NITE can do skeleton detection (even though it cannot do all the other things, like changing the LED colour and so on). Should I take the risk and try the OpenNI wrapper now that I can finally see it sort of working with OpenKinect? Time is very short and I have to make it work within the next 24 hours... yes, my timeline is 24 hours. Oh, decisions, decisions...

Goal 3: CAN AS3CLIENT RECEIVE BLOB DETECTION/TOUCHEVENTS ACCURATELY?



Skitch

Using this simple blob detection, the key is in the DEPTH DETECTION. My fingers in this example have to be at the right depth, which makes it basically like the blob detection in reacTIVision, except that now I don't need a special screen or anything, because the depth detection serves as the invisible screen/line. I can increase the blob size in as3kinect.as, where the default settings are defined under these vars:

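// Minimum and maximum size (in pixels) for a detected blob: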
public static const BLOB_MIN_WIDTH:uint = 15;
public static const BLOB_MAX_WIDTH:uint = 100;
public static const BLOB_MIN_HEIGHT:uint = 15;
public static const BLOB_MAX_HEIGHT:uint = 100;

Users would have to stand at a line and wave their hands in front of them. I think I am fine with that as an interaction.
This also means that I have to estimate the blob size. If I am going with blob sizes (estimated hand size as the interaction), then I had better make some controls to change the blob size on the fly.
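Something like the sketch below might do. This assumes I change the BLOB_* constants in as3kinect.as from const to static var so they can be nudged at runtime:

import flash.events.KeyboardEvent;
import flash.ui.Keyboard;

// Hypothetical on-the-fly calibration: UP/DOWN nudge the maximum blob
// bounding box. Assumes the BLOB_* consts have been changed to static vars.
var step:uint = 5;
stage.addEventListener(KeyboardEvent.KEY_DOWN, onCalibrateKey);

function onCalibrateKey(e:KeyboardEvent):void {
    if (e.keyCode == Keyboard.UP) {
        as3kinect.BLOB_MAX_WIDTH += step;
        as3kinect.BLOB_MAX_HEIGHT += step;
    } else if (e.keyCode == Keyboard.DOWN && as3kinect.BLOB_MAX_WIDTH > step) {
        as3kinect.BLOB_MAX_WIDTH -= step;
        as3kinect.BLOB_MAX_HEIGHT -= step;
    }
    trace("blob max box:", as3kinect.BLOB_MAX_WIDTH, "x", as3kinect.BLOB_MAX_HEIGHT);
}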

Goal 4: CAN AS3CLIENT USE THE BLOBS/TOUCHEVENTS AS EVENTS IN MY RIVER APP?



Picture 9

Picture 10

My original application was made with Flash Builder 4; it looks something like the images above. My aim by Wednesday afternoon is to do the following:

- integrate multiple TouchEvents from the Kinect into it
- add a trace/function for keyboard recalibration of the hand-blob size
- reduce lag/latency

More info on flash.ui.Multitouch and other TouchEvents
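The rough shape of the integration I have in mind: treat each Kinect blob as an independent touch point, tracked by its touchPointID, so several hands can pull at the river at once (riverCanvas and dragRiverPoint are stand-in names for things in my app):

import flash.events.TouchEvent;
import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;
import flash.utils.Dictionary;

// Track each blob as an independent touch point by its touchPointID.
// "riverCanvas" and "dragRiverPoint" are stand-in names for my app.
Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
var activeTouches:Dictionary = new Dictionary();

riverCanvas.addEventListener(TouchEvent.TOUCH_BEGIN, onTouchBegin);
riverCanvas.addEventListener(TouchEvent.TOUCH_MOVE, onTouchMove);
riverCanvas.addEventListener(TouchEvent.TOUCH_END, onTouchEnd);

function onTouchBegin(e:TouchEvent):void {
    activeTouches[e.touchPointID] = true;
}

function onTouchMove(e:TouchEvent):void {
    if (activeTouches[e.touchPointID]) {
        dragRiverPoint(e.touchPointID, e.stageX, e.stageY);
    }
}

function onTouchEnd(e:TouchEvent):void {
    delete activeTouches[e.touchPointID];
}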




My work entitled "\\" or "The Singapore River as a Psychogeographical Faultline II" will be at the Singapore Local Stories Gallery (at the end of the Titanic exhibition) at the Marina Bay Sands ArtScience Museum from this Saturday (29th October 2011) onwards. Some artifacts from Pulau Saigon will also be shown there, so please come and see them when you can.

To find out more about Pulau Saigon, come down THIS THURSDAY EVENING (27th October 2011) to THE SUBSTATION, where the Singapore Psychogeographical Society will be giving a lecture on psychogeoforensics.

Thursday, October 20, 2011

The Singapore Sandbar in the Singapore River

I am producing a moving image of the Singapore River. I previously made a version of the map based on a number of historical maps, but as I was building many other things simultaneously, I was unable to verify the accuracy of all those maps. I am now cleaning up the images and cross-checking them against other maps.



I drew this on 27 July last year (2010). Some artistic liberties have been taken in this version to flatten it into a line.

This year, to make it more accurate, I will also try to add the sandbar in the Singapore River that is sometimes spoken about. The sandbar existed as far back as the 14th century, in the days when Singapore was called Temasek, and there are a number of references to it in old maps of Singapore. Some very interesting information can be found here, from another map sleuth. Interestingly, it is also mentioned in this game: World of Temasek...




A FEW HOURS LATER...



ALAMAK, I'M STILL IN THE WORLD OF TEMASEK. WHAT A DIGRESSION

Props to Jianwei and all the other people at Magma Studios who made this lovely game. It runs really smoothly in my browser. Picking mangoes here... I electronically wipe away a nostalgic tear thinking of the first time I started playing Second Life and went around looking for the money trees, walking around and around a money tree to find the precise angle at which the flat money icons could be seen. (OMG, IT'S BEEN ALMOST 5 YEARS SINCE THEN)

\\ The Singapore River as a Psychogeographical Faultline



The Singapore River is a site of historical and commercial significance for Singapore, as well as a place to socialise and to dream of things to come. But what does the Singapore River look like? When prompted to reflect on it, many find it hard to recall the geography of Singapore's most significant river, which has changed drastically in purpose, form, and colour over the last hundred years.

Originally developed with the support of The Substation’s Open Call in 2010, \\ is an interactive map installation exploring the Singapore River as a “psychogeographical faultline” where reality, memories, and imagined spaces interact, merge, or drift apart, like a series of tectonic plates.




I will be adapting the installation from the reactable to a wall installation using the Kinect as the sensor. Visitors will be able to play with the shape of the Singapore River by waving their arms and walking in front of its projection.

Sunday, October 16, 2011

Kinect Technical Specs



Sensor
Colour and depth-sensing lenses
Voice microphone array
Tilt motor for sensor adjustment
Fully compatible with existing Xbox 360 consoles

Field of View
Horizontal field of view: 57 degrees
Vertical field of view: 43 degrees
Physical tilt range: ± 27 degrees
Depth sensor range: 1.2m - 3.5m

Data Streams
320x240 16-bit depth @ 30 frames/sec
640x480 32-bit colour @ 30 frames/sec
16-bit audio @ 16 kHz

Skeletal Tracking System
Tracks up to 6 people, including 2 active players
Tracks 20 joints per active player
Ability to map active players to LIVE Avatars

It appears that the Kinect also projects a constellation of IR light points, which it uses to map the depth of objects in the camera's view.

See also:
Description of Kinect's operation including how it does Depth calculation

Saturday, October 15, 2011

TouchOSC & Osculator

While waiting for everything to install on Windows, I started playing around with some other things which I meant to install on my Mac.


TouchOSC & TouchOSC Editor - http://hexler.net/software/touchosc

TouchOSC sends and receives Open Sound Control messages over a Wi-Fi network using UDP (User Datagram Protocol). It also supports CoreMIDI (the Mac's built-in MIDI) and can make use of messages from devices with accelerometers. The editor also allows you to make your own custom modular interface for the messages/values you are sending back and forth.
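(To make the "OSC over UDP" part concrete: the receiving end is just a UDP socket pulling out the address pattern. A minimal AS3 sketch follows. Caveats: it needs AIR, since Flash Player has no UDP; it ignores OSC type tags and arguments entirely; and port 8000 is just TouchOSC's default outgoing port as far as I recall.)

import flash.events.DatagramSocketDataEvent;
import flash.net.DatagramSocket;

// Minimal OSC receiver sketch (AIR only): reads just the null-terminated
// address pattern and ignores type tags and arguments.
var socket:DatagramSocket = new DatagramSocket();
socket.addEventListener(DatagramSocketDataEvent.DATA, onOscPacket);
socket.bind(8000); // TouchOSC's default outgoing port (an assumption)
socket.receive();

function onOscPacket(e:DatagramSocketDataEvent):void {
    var address:String = "";
    var c:int;
    while (e.data.bytesAvailable > 0 && (c = e.data.readUnsignedByte()) != 0) {
        address += String.fromCharCode(c);
    }
    trace("OSC message:", address); // e.g. /1/fader1
}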



Osculator - http://www.osculator.net/

Osculator helps connect your devices/controllers to your music/video software. You can use it with the Nintendo Wiimote, iPhone, Wacom tablets, etc., and with Ableton Live, Processing, TUIO + reacTIVision... the technology is there, so make up a use for it!
  • Supports nearly all Wiimote extensions: Motion Plus, Guitar Hero World Tour Guitar and Drums, Nunchuk, Classic Controller and even the Balance Board.
  • Bi-directional MIDI with TouchOSC
  • Wacom Tablet
  • Symbolic Sound's Kyma workstation
  • TUIO / reacTIVision
  • Space Navigator
  • Mouse / keyboard control
  • Advanced OSC routing

Friday, October 14, 2011

Operating System

The first hurdle was that I hadn't upgraded my Mac OS since 10.5. I know, I know, this is a shocking admission for someone doing the kind of work that I do... but there is only so much I can do at any one point in time (I focused more on teaching myself the CS5 versions of the Adobe applications, one by one), and I will say that upgrading to Lion was simply not very high on my priority list. Not until now, when I wanted to be able to do this for an installation.

There are a few reasons why I am going with Windows for this, the silliest being:

"MacOSX: Only OSX 10.6 (Snow Leopard) and above with an Intel based CPU is currently supported."

You have to upgrade to 10.6 (Snow Leopard) from the CD BEFORE you can upgrade again to 10.7 (Lion), which is available online. Forgetful people like Debbie who are looking to buy the Snow Leopard CD at this embarrassingly belated date (October 2011) will be devastated to hear that no brick-and-mortar computer shop in Singapore still stocks this mystical CD, which is required for the upgrade to Lion. No need to walk around Funan and Sim Lim; it's all gone. It is also said that you can probably approach an Apple Service Centre for a copy of this elusive Snow Leopard, but I am sure you'll find your own way around it. You probably aren't even having to deal with this problem if you are techy enough to be reading this far into the page.

Anyway, in the end, the faster method seemed to be installing Parallels Desktop 7 + Windows 7, as I wanted to potentially try the Microsoft Kinect SDK (Windows only), which apparently has some differences from PrimeSense's OpenNI. We will see later, after I install both and try them out.

In any case, I am keen to explore Windows, having seen all the stuff people are making with Windows-only tools such as vvvv, so it's as good a reason as any to start installing Parallels on my Mac.

This is the first time I've had Windows running in almost ten years. After patiently waiting for it to install, I looked up at the screen and there was a giant bouncy arrow, screaming "PARALLELS!!! PARALLELS!" I clicked on it and HOLY SHIT IT'S THE WINDOWS START BAR!



See Also:
Parallels
Windows 7 Home Premium

Introduction

This blog will document the two-week push to convert my multitouch table project into a projection-based installation using the Kinect as a sensor.

The original project was made possible as part of the exhibition I built for the Substation Open Call. It consisted of a multitouch table which I built from scratch and programmed with reacTIVision and Flash. I am not formally trained in math or computing, but as a hobbyist I am always interested in learning more about how sensors and cameras can be used for interactivity: how the images are processed, and how we can use them intelligently to understand what human gestures were made, so that we can program a suitable response.

I am keeping this blog both as a record for myself and as a way to share how I put it together (or rather, how I decoded what was going on from information on the internet and put it into practice).