Enter OpenTiles

8:00am, April 20th: I enter the warehouse known as Making Awesome, Tallahassee, Florida’s own makerspace. It was one of the 75 locations around the globe chosen to host the world’s largest collaborative two-day hackathon, the 2013 International Space Apps Challenge. I see a friend of mine at his laptop amidst the tables, wires, and people. I approach him, and after the usual greetings we begin the following dialogue.


Nathan: “So what project are you thinking of working on?”
Olmo: “I have no idea… I was just going to find some people working on a cool project and join them.”
Nathan: “Wow, I had the same plan…”
Olmo: “Well it doesn’t look like any of the groups here are working on a cool project.”
Nathan: “Ya… How about we just start one ourselves?”

And that’s exactly what we did.

We developed a solution to the Earthtiles challenge. Both Olmo and I work at the Center for Ocean-Atmospheric Prediction Studies, known simply as COAPS, and we’ve collected a good deal of experience processing satellite data, which was the underlying principle of the challenge. Shortly after we got started, a new participant approached us: Samuel Rustan, an Electrical Engineering student who had lofty goals of working with people on one of the cubesat challenges. After we explained our project to him, he continued perusing. I think he quickly realized there was no one at our local branch working on the cubesats, and he got along with Olmo and me pretty well, so he came back and asked if he could join our group. We gladly welcomed him. His programming prowess was humble, but his enthusiasm was unrivaled.

We worked through the night, and come Sunday evening, when the judges arrived at our station, we presented our work. They seemed impressed, and we were content with our presentation. After their lengthy deliberation (in the room filled with the cookies and snacks), they emerged to announce the winners. Our project was announced last, along with a very cool board game project, as one of the two that would continue on to global judging.

Our Project – OpenTiles

Our project page, OpenTiles, was finished last night and will be the medium that presents our project to the international judges. Judging will end on May 22nd, when the 5 award-winning projects will be announced. The 5 awards are…

  • Best Use of Data – The solution that best makes space data accessible or leverages it to a unique purpose / application.
  • Best Use of Hardware – The solution that exemplifies the most innovative use of hardware.
  • Galactic Impact – The solution that has the most potential to significantly improve life on Earth or in the universe.
  • Most Inspiring – The solution that captured our hearts and attention.
  • People’s Choice – Determined from a public voting process facilitated through the spaceappschallenge.org website.

If we have any chance of winning, it will be for the Best Use of Data award. Our project is very specialized, and it’s unlikely that most laypeople will understand it, so the People’s Choice award looks improbable… unless all of my altruistic readers vote for our project on the SpaceApps website!

Posted in OpenTiles | Leave a comment

Ubuntu gets Commercial Grade Video, Audio, and Games


As an avid open source user and unabashed Ubuntu advocate I’ve been politely coercing people to use Ubuntu/Linux products for years. I’m sure we’ve all heard the most common rebuttals, “I’ve got nothing against Linux, but…”

  • “There’s no audio or video editing software.”
  • “Most of my favorite games won’t run on Linux.”
  • “I can’t edit my documents and spreadsheets for work.”
  • “No one really uses it.”

The list goes on and on… While the majority of these excuses are just plain silly, many of the others had been “patched” with the advent of WINE. Unfortunately, most of the users who would actually need it don’t know about it. But in the past few months, the two leading excuses have suffered a brutal defeat. While the war is not over, today we can all be proud Ubuntu and Linux users.

04/30/2013 – The Beta release of a Hollywood quality video/audio editing suite for Ubuntu…

Lightworks

02/14/2013 – The long awaited arrival of Ubuntu’s future gaming pride…

Steam

May all of our futures be rich with Ubuntu software.

Posted in Ubuntu | Leave a comment

Registering Two Point Clouds


I finished this project last year and it’s been ‘dying’ to be posted. My goal was to take point clouds obtained from two Kinects and register them into one coordinate system. After the registration process was completed, the new point cloud would be much more robust with many of the obstructed blank spots filled in.



The project follows a simple algorithm.

  1. Use libfreenect to obtain point clouds from both Kinects
  2. Use OpenGL to display the point clouds in an interactive virtual environment
  3. Use OpenCV to display the RGB streams for the user to select correspondences
  4. Calculate the transformations using Procrustes Analysis and the correspondence matrices
  5. Apply the translation and rotation to the point clouds visualized in OpenGL

The implementation of this algorithm can be found here.
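Step 4, the Procrustes fit, can be sketched with the standard SVD-based solution. This is a minimal numpy sketch with hypothetical names, not the actual project code:

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate the rotation R and translation t mapping src -> dst
    (orthogonal Procrustes / Kabsch algorithm on N x 3 point arrays)."""
    src_c = src.mean(axis=0)                 # centroid of each cloud
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Example: recover a known rotation about the z-axis plus a translation.
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
pts = np.random.default_rng(0).random((10, 3))
R, t = rigid_transform(pts, pts @ R_true.T + t_true)
assert np.allclose(R, R_true) and np.allclose(t, t_true)
```

With noise-free correspondences the transform is recovered exactly; with hand-selected correspondences from the RGB streams it is a least-squares fit instead.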

After a few weeks of stagnant development with the project, I made a bet with my friends and advisor that I could finish the project in one weekend before I left for a vacation. Below is the video which resulted from my sleepless weekend hackathon.

After the project was finished, I used it to complete what is called an “Honors Thesis” here at my university. It is an undergraduate research project which, once defended successfully, allows the student to graduate “with honors” on their diploma. Looking back on the thesis now, I would have done things differently, but isn’t that almost always the case? Nonetheless, it was a milestone in my life and it is my work.

Let me know if you’d like a copy of it and I’ll be happy to send it to you!

Posted in 3D Scene Reconstruction | 22 Comments

Interest Curves

Without play testing, the structure of the interest curve is hard to determine. We have the opening animation, which explains the plot and gets the player familiar with the story. After the player volunteers to go into their own mind to explore their understanding of emotions, the first-person shooter is encountered. This stage will increase interest even further; the challenge of the gameplay, mixed with the intellectual focus required to find the right target, will retain the player’s attention.

Posted in Educational Game | Leave a comment

Balance

Our game has very fundamental and simple implementations of most of the basic components of a good game. Challenge vs. success is inherent in an FPS: the challenge is shooting the targets, and the success is successfully shooting them. This is developed a bit further in our game, though, because the objective is not just to shoot every target; it is to shoot the appropriate apparition representing the emotion or feeling in question. We do more than incorporate the hand-eye coordination required for an FPS; we also challenge the player’s mind. They’re tasked with shooting the correct target: finding the correct target is a challenge for the head, then being able to shoot it is a challenge for the hands.

We also have a punishment system and a scaffolding system. If the player chooses the wrong target and begins to shoot it, they will lose points and be warned that their choice was incorrect. So far we only have a punishment for the intellectual component of the game; as for the hands and the skill required to shoot the appropriate targets, we have not yet implemented a scaffolding system to help players develop those skills.

Posted in Educational Game | Leave a comment

Probability and Chance

Our game introduces only a simple implementation of chance, so far. After the player enters the mind of the main character to begin exploring feelings, they have to hunt down manifestations of those feelings. To make this hunt interesting, these objects need to be difficult to catch, so they must follow arbitrary paths. Using the random library in Python, I assign a random theta value for a 2D rotation, then a random distance to travel in the new direction the object is facing. With only a random rotation and a random forward distance traversed, the paths of the objects appear quite random, and the result is an appealing and somewhat natural desire to hunt them down.
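The movement scheme described above might look something like this outside of Blender. This is a standalone sketch; the function name and the turn/distance ranges are mine, not the game’s:

```python
import math
import random

def wander(x, y, heading, steps, rng=random.Random(42)):
    """Random-walk path: at each step apply a random turn (theta) and
    then travel a random forward distance in the new facing direction.
    Returns the list of visited (x, y) points."""
    path = [(x, y)]
    for _ in range(steps):
        heading += rng.uniform(-math.pi / 4, math.pi / 4)  # random rotation
        dist = rng.uniform(0.5, 2.0)                       # random forward travel
        x += dist * math.cos(heading)
        y += dist * math.sin(heading)
        path.append((x, y))
    return path

path = wander(0.0, 0.0, 0.0, steps=20)
```

Because each step’s displacement stays within a fixed range, the motion looks erratic without the object ever teleporting or stalling.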

Future implementations of chance could be in the aiming system. Currently, direct ray tracing is done from the crosshair; if the player clicks the left mouse button while the ray pings back a positive hit, the appropriate actions are taken for scoring and game response. The direction of the ray could randomly modulate through a tiny circle around the ray origin. Being able to shoot efficiently with this implementation will require skill. Players will need to become familiar with the “workings” of the targeting system, much like understanding the strengths and weaknesses of real guns and weapons.
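A screen-space version of that modulation could be sketched as follows (hypothetical names and parameters; the real implementation would perturb the ray direction in 3D rather than the crosshair point):

```python
import math
import random

def jittered_crosshair(cx, cy, radius, rng=random.Random(7)):
    """Offset the crosshair point uniformly within a small disc, so each
    shot 'modulates through a tiny circle' around the true aim point."""
    r = radius * math.sqrt(rng.random())   # sqrt -> uniform density over the disc
    phi = rng.uniform(0.0, 2.0 * math.pi)  # random direction of the offset
    return cx + r * math.cos(phi), cy + r * math.sin(phi)

# The shot lands somewhere within `radius` pixels of the true aim point.
x, y = jittered_crosshair(320.0, 240.0, radius=3.0)
```

The `sqrt` on the radius keeps the jitter uniform over the disc’s area; without it, shots would cluster near the center.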

The player currently has no ability to influence the chance or probability in the game. The movement of the NPCs, as well as the random inaccuracies of the targeting system, are fundamental game mechanics. The player has no access to anything about the NPCs beyond what they can observe, but this is plenty to get a good estimation of how their movement works. It should be noted, however, that even though the random-movement Python script is simple, just a series of rotations and straight-line traversals, this is not as obvious to see in game, with your own movements and the NPCs moving around in front of you in 3D.

As requested by my advisor:
The probability of drawing a king of diamonds AND an ace of spades from two full, shuffled decks of cards is simply the product of their individual probabilities (taking the draws in a fixed order)… 2/104 x 2/103 = 1/2678
Throw three dice (with faces 1-6). What is the probability that the sum will be 10? 12? 14?
Each answer is the number of combinations that make the sum, divided by the total number of combinations (6³ = 216)…
10: 27/216 = 1/8
12: 25/216
14: 15/216 = 5/72
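These figures can be checked by brute force with exact fractions:

```python
from fractions import Fraction
from itertools import product

# Enumerate all 6^3 = 216 ordered rolls of three dice.
rolls = list(product(range(1, 7), repeat=3))

def p_sum(target):
    """Exact probability that three dice sum to `target`."""
    hits = sum(1 for r in rolls if sum(r) == target)
    return Fraction(hits, len(rolls))

assert p_sum(10) == Fraction(1, 8)     # 27/216
assert p_sum(12) == Fraction(25, 216)
assert p_sum(14) == Fraction(5, 72)    # 15/216

# Card draw: two specific-rank cards from a shuffled double deck, fixed order.
assert Fraction(2, 104) * Fraction(2, 103) == Fraction(1, 2678)
```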

Posted in Educational Game | Leave a comment

In Game Animation

So far you can move the hand in the 4 basic directions with the usual w, s, a, d keys. You can also use e and r to move up and down in the z direction. I added a simple animation where the hand spins on its side! Playing it in the game engine doesn’t look exactly as planned, but it got the job done perfectly!

Here is the blendfile -> hand_anim.tar.gz

Posted in Educational Game | Leave a comment

Game Rules and Mechanics

Our game will have very few rules. It is being designed as a sort of pseudo-natural experience: the player stands in front of the Kinect, and then they’re standing in a party on the screen. This is in accommodation of our audience; we “know” our audience, and we’re designing the game mechanics and parameters to suit their gaming experience. Once in the party, moving their body around will be the interaction in the game. This will be VERY natural, and the game rules will be inherently learned after a few moments in front of the camera.

Currently, the player’s motivation or ‘reward system’ will come from within; we’re going for the player’s curiosity. We have not yet decided on an appropriate goal, but one that came to my mind builds on the curiosity component. Not only will the player be using a natural interface, but there will be an NPC in the house motioning for the player to follow them. The player will become ponderous… “Who is this person?” “Why are they asking me to follow them?” “Where are we going?” This will be the motivation to continue through the game, where along the way the story will unravel and the player will be exposed to various emotional states and experiences from the NPCs.

The operative action for the player will be moving their hand. Until further testing and development can be done with the Kinect sensor, planning for any further functionality will be unfeasible. The added sophistication to the interface would take a tremendous amount of time not within the scope of our project, but it is definitely something that should be considered in the future. Through this single operative action there will be various resultant actions within the game: the player will be able to control the position of their avatar, and they will be able to make in-game choices that decide the development of the game and where the story goes. We can also add simple functionality to interact with the 3D environment they are in, adding a simple but interesting and intriguing degree of sophistication.

Though the development of virtual skills will be very limited, the real-life skills learned in this game should be extensive. The entire purpose of this educational game is to expose autistic children to a wide variety of emotions and help them learn what these emotions are and how to interpret and react to them. Using the sophisticated scaffolding system being developed with psychological research, the skill set and knowledge left with the player should be useful in the real world, at home, and in public.

Posted in Educational Game | Leave a comment

My Team Role

I am the team programmer. Though my intention has always been to focus on the design and production of the Kinect/Blender interface, I have been spending time working on modeling and planning out the story with my other teammates. Our team’s progress had been stagnant for the past few weeks after a dispute regarding the direction of the game, but with that behind us it seems like we will be moving forward with more fervor.

I have various components to my programming tasks. The most overwhelming and time-consuming is implementing the tracking algorithm within the Python code on every frame received from the Kinect. Plugging it into Blender wasn’t very difficult: there is a Python interface to the Kinect, so you just call a Python script from within the game engine which accesses the information from the Kinect, and voilà. What to do with this information is more difficult. Applying the required computer vision algorithms to the arrays to find and track the hands is a task I don’t yet completely comprehend. However, once an implementation has been produced, the next step will be coordinate system registration.
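As a placeholder for the tracking step, the simplest possible heuristic would be to treat the nearest object in the depth frame as the hand. This is purely illustrative (my own names, run here on a synthetic frame), not the algorithm we will actually use:

```python
import numpy as np

def nearest_blob_centroid(depth, band=50):
    """Toy hand finder: assume the hand is the object closest to the
    camera, take all pixels within `band` depth units of the minimum,
    and return their centroid (row, col). Zeros are treated as invalid
    (the Kinect reports 0 where it has no depth reading)."""
    valid = depth > 0
    nearest = depth[valid].min()
    mask = valid & (depth <= nearest + band)
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Synthetic 480x640 depth frame with a near "hand" patch around (100, 200).
frame = np.full((480, 640), 2000, dtype=np.uint16)
frame[90:110, 190:210] = 600
r, c = nearest_blob_centroid(frame)
```

A real tracker would need to be more robust (connected components, temporal smoothing), but even this crude version gives a point to feed into the registration step.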

So the user will be standing in the ‘user coordinate system’ while the game happens in the computer’s world, in particular the ‘world coordinate system,’ as Blender calls it, though here we’ll call it the ‘game coordinate system.’ I will need to find the appropriate transformation that maps user coordinates into game coordinates. After this has been done, the coordinates of the user’s hand will be Blender game coordinates, and all that is left is to have a Blender object move to the new points every frame.
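The simplest such transformation is a per-axis linear rescale. This is a sketch with hypothetical ranges; a full registration might also need a rotation if the Kinect isn’t aligned with the game’s axes:

```python
def make_mapper(user_min, user_max, game_min, game_max):
    """Return a function mapping one user-space axis into game space
    by linear rescaling: user_min -> game_min, user_max -> game_max."""
    scale = (game_max - game_min) / (user_max - user_min)
    return lambda u: game_min + (u - user_min) * scale

# Example: the hand moves +/-0.5 m in front of the sensor, and the
# avatar should sweep +/-10 Blender units across the scene.
to_game_x = make_mapper(-0.5, 0.5, -10.0, 10.0)
```

One mapper per axis (x, y, z) is enough for the prototype; each frame, the tracked hand coordinates are pushed through them and assigned to the Blender object’s position.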

This will be the rough prototype for the Kinect interface into Blender. In our game, however, the hand is going to be used to interact with the environment. There will need to be more programming done to recognize where in the environment the hand model is and to trigger predefined animations, or responses, to being in those locations.

Posted in Educational Game | Leave a comment

Progress Update

Though sluggish, our team is still making progress towards producing our game. We have three current directions being taken to develop our game.

  • The Story – Sungwoong and Filiz are developing the story together, while implementing psychological components to enhance the educational benefits for the players.
  • The Graphics – We have all been working on small components to contribute to the collection of graphics we’re planning to use in the game. I have constructed the hand model that we will be using for the selection process; Sungwoong and Filiz have broken the construction of the environment into the NPCs and the house model.
  • The Interface – I have been working to plug the Kinect sensor into the game engine to use what it sees as input into the game. Ultimately we will be tracking the player’s hand and using it to move objects on the screen. This is mostly going to be used for answering questions.

Our original idea of just having the child answer questions from some on-screen avatar wasn’t comprehensive enough. Through the presentation of our game and the feedback we received, we have refactored the components of our game to provide more “game-like” functionality. We made a decision and began working in that direction, but after reviewing where we started and where we ended up, we learned a lot about how to improve our game.

Various components could go wrong in this game. The most likely candidate is the natural user interface. Plugging the Kinect into the game engine proved to be easier than anticipated; however, analyzing the data from the Kinect, applying computer vision to it, and using the results as input is proving more daunting. There are many aspects that could go wrong, number one being processing time. Depending on the implementation I use to track the hand, if it is slower than the intrinsic frame rate of the game engine, then we will see the ever-abhorrent lag…

I am confident that most users of this game will be impressed and pleased with the interface. Being able to use one’s body to manipulate the virtual world has always been the stuff of science fiction, but with the advent of the Kinect it has become a very affordable reality. As part of the feedback system, we’re developing a sophisticated scaffolding system which should provide very ‘pleasurable’ feedback to the user when things go well. However, if the user is having difficulty making correct decisions, or is not following the rules of the game, the feedback will be more direct and stern. Our target audience is autistic children, so we plan to ensure that all feedback and game parameters are geared towards optimizing their in-game experience.

Posted in Educational Game | Leave a comment