Seven teams gathered over seven weeks and engaged in what looked like an elite VIP hackathon; rather than competing ruthlessly, they shared code and collectively aimed to produce applications that would demonstrate the bleeding edge of hardware and software development. Intel’s Perceptual Computing SDK comes packed with a Creative camera and lets you leverage voice, touch, and of course the camera itself. Perceptual computing is the next evolution of computing, adding new means of interacting with your machine.
As a judge, it was difficult to name a winner because each project brought something unique to the table. In the end, though, a competition has to have a winner, and Sixense walked away with the Ultimate Coder title.
Sixense put on a great showcase of perceptual computing: they created a puppet game where you control a wolf who visits each of the three little pigs’ houses… and you also get to control the pigs! The game records your voice, and you can turn the whole thing into your own little story. The concept is easy to grasp even for anyone unfamiliar with camera-driven computing. With programs like these, it’s important to take the time to teach users how to position their hands to control the puppets, and to educate them about the new way they’re interacting with their machine.
Here is a little video if you want to check it out.
There were actually a few other prizes given out: Best Technical Merit, Best UX for PerC, Best Video, and Best Blogs.
Technical Merit went to Quel Solaar, who didn’t make a game or an app; he made an interface development platform that hides the nuts and bolts of the SDK and allows developers to move their ideas forward quickly and seamlessly. The five-finger touch is stunning; this alone is an interface I would kill for on my tablets right now! The head-tracking implementation is very well done, without any lag. If you want to check it out, here is a video:
Best UX for PerC
The guys from Code Monkeys took their existing game, StarGate Gunship, and added in the Perceptual Computing SDK. What I loved about their implementation is that they threw in a Dr. Manhattan-like character so you could tell what was going on with your interactions with the camera. Part of the problem with programming for the leading edge, with pushing the way people are used to interacting with things, is that the user needs feedback so they know what they are doing is right. There is also the chance that things might go wrong, so the on-screen feedback let the user know when they needed to recalibrate to get things working smoothly again.
Best Video goes to Lee Bamber, because every week he never failed to regale me with a story of his week’s work that was both entertaining and very, very informative. Here is an example of what I’m talking about!
With Lee’s videos, it’s really no surprise that he also took away Best Blog, but he had company with Infrared5. These guys made an awesome flying/racing cat-killing game with great head-tracking implementation and voice commands. They shared their process in great detail every week, and it really helped me understand the challenges of the camera.
I also wanted to show off some of the art because it really was fantastic!
I do have to admit, I’m sad the contest is over, as I enjoyed every post and watching the teams evolve.
If you think you’ve got the chops to take home such a title, there is another Perceptual Computing Challenge coming soon. If you are interested in this technology, you can learn more about the Perceptual Computing SDK from the SDK download page.