If you haven’t been following the Intel Ultimate Coder Challenge over the past 3 weeks, we’ll forgive you, but suggest that you travel back in time to get caught up. This week the 7 teams have published their progress, and we take a look at what they’ve done and what they’ve shared with the entire community. There are only 3 weeks left, so we’re glad to see that real progress is being made!
Before we get into it, if you want to learn more about the program you can head to Intel’s Ultimate Coder Challenge II: Going Perceptual page to learn all the dirty details. Basically, each team has been given a Yoga 13, a Creative Interactive Gesture Camera Developer Kit, a link to download the Intel Perceptual Computing SDK 2013 Beta, and a directive to amaze.
Love the progress that you’ve made — the fact that we’re able to play your game and experience that progress first hand. What you’re attempting with pupil and eye tracking is totally fascinating to read; it’s the most interesting part of the SDK, but I know it’s only meant to support gaze tracking. How you’re integrating different platforms to make it happen is really great!
It seems that everyone working on gaze tracking or eye tracking is having difficulty. In an interview with the Perceptual Computing team, they told me that eye tracking was going to be implemented later, so I tip my hat to anyone making a go of it now.
These fellas are looking to integrate the Perceptual Computing SDK into their game, Stargate Gunship. They had a lot of ambitious ideas about how they were going to change interaction with their game: voice commands, eye tracking for aiming, etc. I like that they’re taking a more realistic approach to what they can actually get done in the allotted time.
I wouldn’t have minded a little more explanation around the thought process, though.
Seems like you had a big coding week, and I’m looking forward to the visualizations next week. I’m going to be particularly interested in whether the Ultrabook is going to be powerful enough for the real-time processing you’re going to require.
Since real-time realistic pottery isn’t going to be possible, taking the environment into consideration is going to be a very important part of how you package your application.
Thanks for setting me straight about what types of puppets you’re going to be enabling. Love the storyline of the Three Little Pigs, and your enthusiasm for the project really comes through in your posts.
I wouldn’t have minded a little more detail on why you went with the AVPro Movie Capture Unity Plugin over the other options, but congratulations for getting it working!
Your progress last week was substantial, both controlling the puppet and capturing the screen. Seems like you’re well on your way…and the art looks killer!!
Wibble Peter…Wibble! I have to admit, this was a tough read for me, but in the same breath you deserve a well-earned high five! It really was a post for coders looking to make serious moves and get their app working well.
I like your explanations of why you’ve chosen particular libraries to speed your development time. You’re also making better progress than I expected — you’re dealing with quite a bit of data — and I do appreciate the step by step…even if I can’t quite follow all of it! (Don’t change your style at all; you’re writing this for other coders, and it’s clear there is a ton of amazing insight in there!)
I loved the video; it was such a help for me to really understand how the SDK works and what the process of debugging something actually looks like. The gaze tracking example was especially interesting — I can really understand why so many of the teams are having trouble.
Extra points for putting the entire competition and the SDK into perspective!! I really can’t stress how much I loved that video!
Wait… I can…here it is! If you haven’t watched it check it out!
As an aside, I think the fuzziness is fine — it’s very holographic, and I think it looks cool.
I love it! I totally love what you’re doing! I finally feel like I understand your project, and the big data visualization you’re planning to show off the framework really is what the future looks like in the movies.
I hope to see you integrate the hardware next week. I understand the GUI is an enormous task — the entire framework is an enormous task — but getting the hardware in there is really key. I’m looking forward to seeing that in action, and I wouldn’t change a thing! I’m just glad that I’ve properly got my head around it; apologies it took me so long!
If you’re looking for more information about our thought process, we shot another judges video because Chippy, Sascha and myself happened to be at CeBIT together. This should be the last video…which I personally don’t think we’ll be able to top!
Special thanks to Sascha for his … performance.
Full disclosure – We are being compensated for content related to the Ultimate Coder and to attend an Intel event. We have committed to posting at least once per week on this subject and judging the final applications. All posts are 100% written and edited by Mobile Geeks