
Creating a simple Flash app with Kinect

[Image: lucky larry flash kinect]

First of all, read up on how to set the Kinect up on your computer; I’m using Windows XP here.

Getting the Kinect setup on your PC: OpenNI / Primesense

Interfacing Kinect with Flash & AS3

Now you’ve read those (if you needed to). For this tutorial you’ll also need Flash CS4 or later; you can download a demo from Adobe for free. Alternatively you can use the open-source ActionScript compilers – I’m only using Flash to quickly draw some objects because I can’t be bothered to code them, so you can take my ActionScript file and add to it if you don’t want to use the Flash IDE. One note though: you’ll need to compile and run this as an executable (.exe).

You can download ALL my files here

I say *ALL* because it seems that everyone’s more than happy to show their video of this working, but no one shares their code, or at the very least their .exe files. So here it all is, and it’s very simple to follow, making use of the excellent AS3 server and libraries from the as3kinect project.

[Image: lucky larry flash kinect files]

To quickly test this out, plug in the Kinect, download and unzip my folder, start as3-server.exe and then start LuckyLarryFlashKinect.exe. Perform the Cornholio pose and it should start tracking your right hand. Push your hand out to press a button.

Hopefully that worked, albeit a bit glitchy. So how was it done? I’ll walk through this assuming you have some Flash knowledge, but essentially it’s just a rehash of the as3kinect demo.

First, get that demo of Flash. I tried CS3 but the AS3 kinect libraries are a bit buggy with it, so I’d recommend grabbing the latest demo from Adobe’s site. I’m using CS5.5, which works fine for this, but CS4 works just as well.

The LuckyLarryFlashKinect.fla file literally just has the buttons, the hand icon and a link to an external ActionScript file. If you really wanted to, you could just draw all of that with ActionScript code and compile the file using an open-source alternative – I’m just being lazy.

If you open that up, you’ll see I have a movieclip called all, with an instance name of all.

[Image: actionscript instance name]

That contains the movieclip buttons and 6 instances of the button movieclip.

[Image: flash kinect buttons]

The button movieclip has 2 frames, one for the on state and another for the off state. You could easily change the colour etc. in ActionScript, but I wanted to specifically show the nextFrame function in my code.

[Image: flash kinect hand]

The hand movieclip is called Right; this is the hand icon, which contains another movieclip called handhit. I have this so that I can have a much smaller, more precise hit-test bounding area. Under the properties for each movieclip on the stage you’ll see that they have instance names to link them to ActionScript. Other than that, the file’s publish settings are set to ActionScript 3.0, and to create a SWF and an EXE file.
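If you’d rather avoid the Flash IDE entirely, the stage could be built in code along these lines. This is a sketch of my own, not part of the original .fla – only the instance names (all, buttons, right, handhit) match the real file, the sizes and colours are made up, and code-drawn buttons wouldn’t have the two on/off frames, so you’d swap the nextFrame calls for a colour change:

```actionscript
// Sketch only: recreate the stage contents by code. Layout values are
// assumptions; just the instance names match the original .fla.
var all:MovieClip = new MovieClip();
all.name = "all";
addChild(all);

var buttons:MovieClip = new MovieClip();
buttons.name = "buttons";
all.addChild(buttons);

// six placeholder buttons in a row
for (var i:int = 0; i < 6; i++) {
	var btn:MovieClip = new MovieClip();
	btn.graphics.beginFill(0x3399FF);
	btn.graphics.drawRect(0, 0, 80, 80);
	btn.graphics.endFill();
	btn.x = i * 100;
	buttons.addChild(btn);
}

// the hand, with a deliberately small inner hit area
var right:MovieClip = new MovieClip();
var handhit:MovieClip = new MovieClip();
handhit.name = "handhit";
handhit.graphics.beginFill(0xFF0000);
handhit.graphics.drawCircle(0, 0, 5); // small = more precise hit tests
handhit.graphics.endFill();
right.addChild(handhit);
addChild(right);
```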

That’s it. Everything else happens in the actionscript file which I’ll walk through in more detail.

Looking at the ActionScript file now…

First we create our package and import the necessary libraries

	// load relevant libraries
	import flash.display.MovieClip;
	import flash.events.Event;
	// AS3 kinect
	import org.as3kinect.*;
	import org.as3kinect.as3kinectWrapper;
	// Greensock for programmatic tweening for rollovers etc...
	import com.greensock.*;
	import com.greensock.easing.*;

Now we create our package class, extending the MovieClip class, and declare our private variables

	public class LuckyLarryFlashKinect extends MovieClip {
		// declare a new instance of the AS3 wrapper
		private var as3w:as3kinectWrapper;
		// depth threshold in millimetres - you need to stand further back than this
		private var depthLimit:Number = 800;
		// store the depth of the right hand
		private var right_z:Number;
		// store whether something has been pushed
		private var isPush:Boolean;

The next function, the constructor, creates the link to the skeleton data via the library files and then registers the event listeners: one for the skeletal data and the others for Enter Frame

		// register the event listeners for skeleton tracking and enter frame
		public function LuckyLarryFlashKinect() {
			as3w = new as3kinectWrapper();
			as3w.addEventListener(as3kinectWrapperEvent.ON_SKEL, on_skeleton);
			// run EnterFrame every frame; the interaction listener is
			// added and removed inside EnterFrame depending on hand depth
			addEventListener(Event.ENTER_FRAME, EnterFrame);
		}

In the EnterFrame function we check the depth of your hand, adding or removing the interaction event listener depending on the distance. In this function I also handle the rollover, since that’s present every frame regardless of depth; I’ll explain that for loop a bit later on.

		private function EnterFrame(event:Event):void {
			// Depth detection - add or remove the interaction listener
			// depending on the right hand's distance from the sensor
			if (right_z < depthLimit){
				removeEventListener(Event.ENTER_FRAME, interaction);
			} else if (right_z > depthLimit){
				addEventListener(Event.ENTER_FRAME, interaction);
			}

			/* Rollover
			 * Because we're not detecting depth here, just x and y, this
			 * lives outside of the interaction function. It loops through
			 * all the items within the buttons movieclip and does a hit
			 * test on each one; instead of declaring each individual hit
			 * test we handle them all in one small for loop */
			var i:int = 0;
			for (i; i < all.buttons.numChildren; ++i){
				if (right.handhit.hitTestObject(all.buttons.getChildAt(i))){
					// use greensock to make the button bigger on rollover
					TweenLite.to(all.buttons.getChildAt(i), 0.1, {scaleX:1.10, scaleY:1.10, ease:Cubic.easeIn});
				} else {
					TweenLite.to(all.buttons.getChildAt(i), 0.1, {scaleX:1, scaleY:1, ease:Cubic.easeIn});
				}
			}
		}


After this we declare the skeleton function to get the data for the right hand and map it to our movieclip named right. For the x and y data I alter this to translate the tracked data to the screen; without this you can only control a very small space inside the movieclip. Unlike the demo files, here we're also looking at the z index/distance from the sensor – this is the real difference from the demos.

		private function on_skeleton(event:as3kinectWrapperEvent):void {
			// grab the hand tracking data to build into gestures
			// (assuming the wrapper delivers the skeleton on event.data)
			var skel:Object = event.data;
			// right hand position, fetched from the as3kinect wrapper / skeleton data
			// I add a multiplier (*4) and an offset (-400) to compensate for using the right hand only, saving me stretching!
			right.x = (skel.r_hand.x * 4) - 400;
			right.y = (skel.r_hand.y * 3) - 400;
			right_z = skel.r_hand.z;
		}

In the final function, interaction, I just observe the right hand depth (z); if that's less than my depth variable I then execute the hit tests dynamically to figure out which button to click/animate. So I'm assuming that you stand further back than my depth limit; when the hand is closer than this, you're pushing a button. Instead of calculating the distance of your hand from the sensor, we could calculate the distance of your hand from your shoulder, head or neck, allowing you to stand as close to or as far from the sensor as you want, as your hand depth then becomes relative to your body and not the sensor.

		private function interaction(e:Event):void {
			/* function that does all the button clicking / gestures -
			 * in this case we're using the detected depth to figure out
			 * if a button has been 'pushed' */
			// if the right hand is at the required depth for 'pushing'
			if (right_z < depthLimit){
				removeEventListener(Event.ENTER_FRAME, interaction);
				/* Same as the rollover loop: detect the hit, but also
				 * register that something has been 'pushed' */
				var i:int = 0;
				for (i; i < all.buttons.numChildren; ++i){
					if (right.handhit.hitTestObject(all.buttons.getChildAt(i)) && isPush == false){
						isPush = true;
						// skip the button on to its 'on' frame
						MovieClip(all.buttons.getChildAt(i)).nextFrame();
						// greensock: scale the button back down
						TweenLite.to(all.buttons.getChildAt(i), 0.06, {scaleX:1, scaleY:1, ease:Cubic.easeIn});
						if (all.buttons.getChildAt(i).name == "myMovieClipName") {
							// Do something for an individual button
						}
					} else if (right.handhit.hitTestObject(all.buttons.getChildAt(i)) && isPush == true){
						isPush = false;
						MovieClip(all.buttons.getChildAt(i)).prevFrame();
						TweenLite.to(all.buttons.getChildAt(i), 0.06, {scaleX:1, scaleY:1, ease:Cubic.easeIn});
						if (all.buttons.getChildAt(i).name == "myMovieClipName") {
							// Do something for an individual button
						}
					}
				}
			}
		}

By setting a depth threshold (depthLimit), we create the push-to-click gesture. However there is an issue: because we're always calculating the z index of your hand in the application file there's quite a bit of lag, and if you want to use logic loops such as 'while' this can cause memory issues in Flash, especially since ActionScript is not a multithreaded language.
The final part of the interaction function is the hit test, which loops through each button and skips to its next frame if it's the one hit. I also left in some logic to get the name, in case you wanted specific functions per button. I use the for loop rather than writing a whole series of if statements: it gets the number of child movieclips in the parent movieclip and then iterates through that number to get the index of each child. With that we can also get the name, frames etc. of that clip. You can read more about dynamic hit tests in Flash & AS3 here.

In this loop we record whether something has been pushed, so we can skip back and forward between frames. If you were loading new content or a new screen on a button press then you wouldn't need this logic; it's just there as a quick hack.
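As an aside, the body-relative push idea mentioned earlier could be sketched like this. I'm assuming the skeleton object also exposes an r_shoulder joint with a z value alongside r_hand, which I haven't verified – check the as3kinect skeleton data for the actual joint names:

```actionscript
// Sketch: detect a push relative to the body rather than the sensor.
// Assumes (unverified) that the skeleton data includes an r_shoulder
// joint with a z value, like r_hand does.
private var pushThreshold:Number = 300; // mm the hand must extend past the shoulder

private function on_skeleton_relative(event:as3kinectWrapperEvent):void {
	var skel:Object = event.data;
	// treating a smaller z as closer to the sensor, so a hand z well
	// below the shoulder z means the arm is stretched forward
	var relativeDepth:Number = skel.r_shoulder.z - skel.r_hand.z;
	isPush = (relativeDepth > pushThreshold);
}
```

With something like this, depthLimit no longer matters and you could stand wherever you like, since the gesture is measured against your own shoulder rather than the sensor.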

Finally, be warned: if you introduce a lot of complex ActionScript, expect it to slow down drastically – the tween animations here are actually running much slower than they should. This is because of the way we're grabbing the Kinect data. If you were to do this with the Microsoft Kinect SDK on Windows 7 you'd still have the same issue, due to the amount of middleware needed to get this to run. Looking forward, my hope is that from Windows 8 onwards we could have native Kinect support like the Xbox, though I doubt Microsoft will ever support any non-Microsoft platform, or even their own older OSes.

Here's the video of me using the app, just to prove that it does work, although the push gesture is a bit glitchy…

Interfacing Kinect with Flash & AS3


A very quick guide on how to hook up the Kinect with the AS3Kinect project; you can find all the relevant information on the different wrappers on the AS3Kinect project site.

The AS3Kinect project works with either OpenKinect or OpenNI, and you’ll need a specific version depending on the wrapper you’re using. Talking of wrappers, here are a couple of things to remember about them:

  • You can’t have more than one wrapper running, so you have to pick and install just one
  • Each one has different features; the main advantage of OpenNI is that it provides skeletal tracking data

I show how to set up the OpenNI & Primesense drivers here: Getting the Kinect setup on your PC: OpenNI / Primesense

I went the route of OpenNI purely for skeletal tracking for some of the projects I was working on. To get that skeletal data you could just as easily use the Microsoft drivers, which they released only after everyone hacked the Kinect and Microsoft realised they were missing out. BUT while they work a bit better and provide everything, they tie you to Windows 7, which I don’t have and don’t need. I also dislike the idea of being tied to specific software.

So, anyway, follow my setup for OpenNI, as that’s what I’ll be using in later tutorials. Once you’ve done that it’s very easy to use the AS3 server; just download it from the AS3Kinect project site.

Unzip that somewhere, and in the XML file you’ll need to add in the license key (similar to the OpenNI setup steps):

For reference, my later projects/examples will have this info done for you.

The .exe file is your AS3 socket server; it pulls the data from the OpenNI drivers for you to use in your Flash projects via the as3kinect ActionScript libraries. I tend to keep an instance of the server in the same folder as my Flash project, along with the libraries. You’ll always need to have this running to get Kinect data; incidentally, if you close your Flash projector file (.exe, .swf) you’ll need to restart the server each time.

[Image: AS3 demo files]

If the AS3 server doesn’t start and you’re sure the drivers are installed, first check the power! If the Kinect is not plugged into the power supply and you’re just trying to run it from USB power, this isn’t going to work.

Anyway, to test this works download the demo AS3 files:

Unzip that, start up your AS3 server and then run 3d_test.exe. If it’s all working you should get something like the following screenshot:


To get the Kinect/OpenNI drivers to recognise you, you’ll need to perform the pose which I’ll refer to as ‘The Cornholio’:

[Image: cornholio]

Once you’ve done that you should be able to see the program tracking your hand movements.

Now that’s all working, have a look at the demo source files to get an idea of how to pull data using the as3kinect library (in the org folder).

Got it? Good, time to move on and I’ll show some examples of using this data.

Getting the Kinect setup on your PC: OpenNI / Primesense


Here are the full details and the process for getting your Kinect running on your PC using the OpenNI and Primesense drivers. Yes, this may be a little old news for some, but my guide here will actually get your Kinect running pretty much first time, rather than leaving you to guess and try to follow the other guides out there.

So first of all you’ll need a Kinect! Now we’ve sorted that out, the rest is easy and just a matter of following the steps – I’m documenting this as much for my own use as yours… I’m doing this on a Windows XP laptop, but you can do this on Ubuntu, Windows 7, Mac OSX etc…

1. Remove any existing drivers

First of all, if you have any drivers previously installed for the Kinect, such as OpenKinect, Freenect, the Microsoft Kinect drivers etc., you’ll need to remove/uninstall them. They will appear in the device manager under Human Interface Devices, normally called something like Xbox NUI Motor, Xbox NUI Camera and Xbox NUI Audio. Incidentally, when you install the OpenNI/Primesense drivers they are named differently, and you do not want to use the above drivers under any circumstances.

2. Download and install the following drivers & binaries

[Image: download kinect sensor driver]

Head on over to GitHub and get this package; download it, then navigate to the Bin directory and install the relevant driver for your OS. This has the drivers precompiled for you, but you can also compile them from source. The file will be something like:

  1. SensorKinect-Win-OpenSource32-

[Image: OpenNI downloads]

Next, you’ll want to go to OpenNI.org’s download page; for each of the options select stable, then choose the development package you want for your OS:

  1. OpenNI Binaries
  2. OpenNI Compliant Middleware Binaries
  3. OpenNI Compliant Hardware Binaries

You can also compile these from source by going to the OpenNI GitHub page

So now you should have 3 files something like:

  1. openni-win32-
  2. nite-win32-
  3. sensor-win32-

Or whatever you’ve compiled from Git. Run the installers etc…

On the Primesense NITE installation, if prompted for a key, add the following:


Now that’s done, you’ll need to download and install the Microsoft Visual C++ Redistributable Package for your OS, which you can find on Microsoft’s download pages.

In my instance I’m using the x86 one; for Windows 7 use x64 etc. Just grab the latest one available. This allows you to run the OpenNI applications.

3. Plug and pray

Now plug in the Kinect – make sure you have the power plugged in as well, not just the USB! Your computer should recognise the Kinect and want to install the drivers – make sure you choose the Primesense ones if prompted. I found that when I had old drivers it remembered their location, so I just zipped up the folder and archived it to remove it from the driver search.

[Image: Device manager after successful install]

If all goes well you should see 3 hardware installation screens on Windows, and the device manager should show the hardware under Primesense.

4. Setup

Now that’s installed, there are a few more things you need to do to hook up OpenNI and Primesense: namely, a few XML config files to which you just need to add the same license key so that Primesense will work for you. That key was:


[Image: Insert OpenNI key]

[Image: Link OpenNI and Primesense]
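For reference, the licence entry you end up adding looks roughly like the following. The element and attribute names here are from memory of the OpenNI config files, so treat this as a sketch and double-check the file you're editing; the placeholder stands in for the actual NITE key from the install step above:

```xml
<!-- Sketch of the OpenNI/Primesense licence entry; verify the exact
     element names against the XML file you're editing, and replace the
     placeholder with the real NITE key. -->
<Licenses>
	<License vendor="PrimeSense" key="YOUR-NITE-KEY-HERE"/>
</Licenses>
```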

5. Finished

Now that’s done you can start to play around with the examples – there are various wrappers/APIs available that you can now use to create applications.

  1. Test out OpenNI by running the NiViewer sample
  2. Test out Primesense by running any of the samples there

[Image: Primesense sample]

Now you have that installed you can move on to using various wrappers etc… to hook the Kinect up to Flash, Silverlight, Processing, Arduino, Java, Python etc…