It’s the final post before I have to submit my score card naming the winner of Intel’s Ultimate Coder: Going Perceptual, and I’ve poured myself a drink because I can see value in everyone’s submission. As I sit down to write this post I have to laugh, because my roommate has been looking at me suspiciously as I’ve been yelling at my computer in a very bossy and demanding way. “Would it kill you to say please? Maybe that will get it to work” is what I heard over the weekend. Enough of my bellyaching; it’s a tough position to be in because each app has a killer feature that earns it the top spot.
I’m going to start off with a few of the issues that I had in general with all of the apps. You need to be in the perfect position, with the perfect lighting, with your hands or face at just the right angle. As much as everyone complained that the circumstances weren’t perfect at GDC, I think there are countless variables that need to be taken into account when assessing performance. Every app required countless restarts, and system crashes became commonplace.
Since the score cards are due Friday, I decided that I would write this post without refreshing my memory as to what exactly the criteria are, and just write about each app, laying out what I loved and what needed work. A personal look at how I viewed each offering.
Before we get into it, if you want to learn more about the program you can head to Intel’s page about the Ultimate Coder Challenge II: Going Perceptual to learn all the dirty details. Basically, each team was given a Yoga 13, a Creative Interactive Gesture Camera Developer Kit, a link to download the Intel Perceptual Computing SDK 2013 Beta, and a directive to amaze.
Peter, your app is great. I really like the gestures that were selected to apply filters and make changes to the photos. Using hand gestures in combination with hand actions like a fist with thumb up, or an open hand with fingers spread, was smart. The gestures are quite intuitive, and once you get used to them they really are natural.
A lot of the elegance of this application is lost for me because resizing the framing or view of the photo is extremely difficult. Pinch to zoom is fine, but it shrinks the photo to the bottom left of the window while the window defaults to the top left. Pinch too quickly and it shrinks too much, and then you have trouble locating it and getting it to fit perfectly in the photo viewing window. The photo should automatically be set so you see the whole thing; if the main goal of the program is photo editing, seeing the photo to apply all the fancy perceptual computing jazz is key.
Like all the apps, I ran into a few bugs and glitches, but overall the app worked as intended: you could apply filters using your voice and through hand gestures.
Peter, I would like to thank you for your input over the course of the challenge; you’ve given me a lot of insight into how you deal with the volumes of data that the camera delivers. Your application has a lot of potential. It’s ground zero for demonstrating the value proposition of how perceptual computing is ready for today’s applications.
This puppet game is a fantastic concept, and it’s been a pleasure to watch it grow into a game with tremendous potential. What do I think overall? The art is stellar, and if I haven’t said it enough, it looks fantastic! It’s obvious that this is a graphically intensive program; Danny was telling me at GDC that the power problems they were running into were greater than what they were talking about in their posts. The addition of the wolf’s legs … I was against this from the start…again…Nicole is wrong! Damn my love of legless Wii-like characters lol … the legs really show off just how much physics is involved in this. The true puppet nature of the characters is demonstrated: I became more aware of the height and level of my hand because of the floppy nature of the wolf’s legs. I then noticed how the ears really move; I didn’t notice this as much until the legs really showcased the ragdoll physics the characters had going on.
The hand tracking to control the puppets really took me some time to master; you need to keep your hand at just the right distance to get it to work. Like all of the apps, you need to take the time to figure out the sweet spot for getting the best responses. The addition of the calibration really helped, but I felt like it worked a little better at GDC, when I had step-by-step guidance letting me know when the program had lost track of me and how to reset it. Having said that, I got a video to work, and I played around with the different resolutions available to get it working like butter.
I love it; it’s a great concept that came together. Thanks for making this contest so entertaining! I’ve always looked forward to your posts, and the detail that you provided really gave me a solid understanding of what is going on behind the scenes.
StarGate Gunship has leveraged the perceptual computing SDK seamlessly. I think the additions made to an existing game were done very well … even better now that I really understand the how and why behind the decisions made. If I wanted to be entertained I’d watch TV; I’m taking part in this competition to learn…not that learning can’t also be fun.
It was brilliant that you added the Dr. Manhattan-like character to help with the camera tracking. That seems to be one of my biggest issues with most of the apps: I can’t figure out what is going on between my actions and the camera. This is a contest about the bleeding edge of software interactions, and it’s hopeful and ambitious to assume that everything will work seamlessly. This addition was key in helping me modify my behavior to somewhat accurately control the game. I have to admit that even with this addition, the accuracy of the aiming wasn’t perfect.
Taking a step away from the camera, I find it very interesting how you reimagined the touch schema for your game. You’ve done a great job, and I can see your audience loving the addition of the perceptual computing SDK to their experience.
Kiwi Catapult has been a fun game to follow. The art on this one is spectacular; I’ve said it before and I’ll say it again: the 2D in a 3D environment is stellar! There weren’t that many surprises with this one, since they’ve been offering the phone application for testing as the contest has gone along. Having said that, playing it with the camera and the voice commands was a completely new experience. The head tracking really does require you to be in just the right position to make it useful within the game. The voice commands are hilarious! I totally love them. I was saying it without conviction and my roommate asked me what I was doing…I told her, and she said, “If you’re going to kill a cat, do you think that voice would do it?” She was right! It really is Fiiiiirrrre!!!!
The gameplay is challenging. I don’t play a lot of first-person shooter games because I am so bad at them, and I’m tired of the comments about my crap gameplay. Look at the graphics and not my poor aim, I say…no one ever listens. This feeling of not being good was definitely there; having to aim while swooping was challenging. But like all first-person shooter games, I have to play them for an hour before I don’t embarrass myself. You should see me on Dead Trigger or Frontline Commando; I’m finally awesome, if I do say so myself!
Even though Kiwi Catapult is technically a kind of first-person shooter, it is very different, since it has a little bit of a racing feel to it with all the swooping. I didn’t find the head tracking quite sensitive enough in the beginning, but I have to say that once I got into it, it was a valuable addition to the gameplay. You have to get the timing right when you’re breathing fire at the cats, and with the difficulty I was having with swooping, aiming and timing Fire at just the right moment was definitely an added challenge. But that’s the point of playing games…they’re challenging.
Congratulations on a great game!
I must admit this app is so different from the rest that it’s hard not to want to put it in a category, or even a class, of its own. The five-finger touch is stunning; this alone as an interface is something that I would kill for on my tablets right now! That it’s a part of an interface development tool blows my mind. Since I last got to play with it at GDC, I like the addition that the further you drag out, the more accurately you can set a measure. It’s sleek, simple and smart. My only concern is that it may not leverage and really demonstrate the features of the SDK, though what is demonstrated is executed very well.
Things started out a little rocky, with grand statements about redesigning the way that we interact with our technology. It seemed a little grand and vague, but I see now that those weren’t baseless statements. The implications of what you’ve done here are tremendous; if you took this onto any platform it would be successful, and Intel is lucky that you’ve done it here first.
Oh Lee! How I’ll miss your verbose posts. If there is one thing that can be said, it’s that anyone following your progress will know a TON about what type of data is being produced by this camera and how it can be used.
I have to start off with the fact that your app took me the longest to deal with, because I am crap at networks. I hide it well, because as a tech blogger I should be good at that; I had to google step one… You’re going to need to get some sort of networking engineer on board for that aspect of things, because it is a huge pain in the A$$.
Moving on! Once it was working, how much fun was that! Sascha and I chatted with each other; we both had to hit the CTRL button to buffer things so that we were in sync. Your directions were clear, the voice activation worked well, and your videos showing me what I needed to check out were fantastic.
You’ve got a real winner on your hands here! I know the contest is over, but I know that your work isn’t done! Congratulations on all your hard work and your excellent posts that really embodied the spirit of the competition. I’m happy that I was wrong and that you did have enough time to pull it off, and as a one-man show to boot! High Five!
Full disclosure: We are being compensated for content related to the Ultimate Coder challenge and to attend an Intel event. We have committed to posting at least once per week on this subject and judging the final applications. All posts are 100% written and edited by Mobile Geeks.