Date: 13th of June 2013
Participating group members: Kenneth Baagøe, Morten D. Bech and Thomas Winding
Activity duration: 4 hours
Our vision for this end-course project was to build a game consisting of a remote-controlled robot and a number of autonomous robots, as described in End-course project 0.
Building and programming the RC robot was the first achievement of this project. The simple idea of a remote-controlled robot driven through a first-person view gave a greater sense of immersion than we had expected.
In this project, the autonomous robots are the elements we are most proud of. Making them behave like animals that search for food while also trying to protect themselves from the player was a challenge, given the sensors and the environment. We are also very satisfied with the remote-controlled robot, which lived up to our expectations of its functionality.

The game is fully functional; however, we were not able to achieve some of our envisioned goals. Originally we intended to control the remote-controlled robot through an interface on a tablet, but the current implementation has the video feed and the controls on a PC instead. The cause has been described in an earlier entry: we were unable to achieve an acceptable amount of latency on a video feed implemented in the program for controlling the robot. Instead of implementing the video stream in the remote control program, we used a Skype video call, which had a much more acceptable amount of latency. As a result, we needed to have two different programs running simultaneously. Due to time constraints we did not look into how to handle this on a tablet device and instead chose to run the Skype call on one machine and the remote control program on another.
Observations and future work
After testing the game, we thought of some changes that could help improve it. The first-person view gives the game a realistic and fun aspect, but it also causes some unforeseen complications. When the player robot approaches an autonomous robot to hit it, the distance between the two becomes very short, so the first-person view is completely obscured by the autonomous robot. This makes it hard to see whether it is actually possible to land a hit from the current position. As a result, players may find themselves looking directly at the arena instead of at the screen displaying the robot's view, which we did not intend. One solution would be to rebuild the RC vehicle, placing the camera so that it provides a third-person view instead of the first-person view. This could give a better overview of what is going on in the arena during the game, while still providing immersion.
Another improvement to the remote-controlled robot would be to add gradual acceleration. The latency in both the video feed and the Bluetooth communication meant that, when controlling the robot, it would instantly go to full speed, resulting in jerky movements. The instant full speed also made it problematic to make small adjustments to the robot's position.
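Such gradual acceleration could be implemented as a slew-rate limiter on the motor power. The sketch below is a minimal illustration in plain Java, assuming power values in the range -100 to 100 and a fixed-rate control loop; the class and method names are our own invention, not taken from the actual control program.

```java
// Minimal sketch of a slew-rate limiter for motor power (hypothetical
// names; assumes power in -100..100 and a fixed-rate update loop).
public class PowerRamp {
    private final int maxStepPerTick; // largest allowed power change per update
    private int current = 0;          // power currently applied to the motor

    public PowerRamp(int maxStepPerTick) {
        this.maxStepPerTick = maxStepPerTick;
    }

    // Move the applied power toward the requested target, but no faster
    // than maxStepPerTick per call, so the robot speeds up gradually
    // instead of jumping straight to full speed.
    public int update(int target) {
        int diff = target - current;
        if (diff > maxStepPerTick) diff = maxStepPerTick;
        if (diff < -maxStepPerTick) diff = -maxStepPerTick;
        current += diff;
        return current;
    }
}
```

Calling `update` once per control tick with the joystick's requested power would then ease the robot up to speed, and small target changes would translate into equally small power changes.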
Another observation was that determining whether the autonomous robots were inside or outside the arena using the color sensor was sometimes troublesome. As described in End-course project 5, this was caused by changes of light and brightness in the room. Because of this problem, the autonomous robots sometimes ended up outside the arena, unable to get back, or constantly repeated the avoid behavior. To solve this, we could try using light sensors instead, measuring the light levels to determine whether the robot is at the edge of the arena or not. This might also provide a solution to the problem we had with the “food”, as we could possibly remove the light shielding we added to the color sensor, meaning that we could reintroduce the Lego base on the food.
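Edge detection with a light sensor would amount to comparing readings against a calibrated threshold. The following is a rough sketch under assumptions of our own: readings on a 0 to 100 scale, and a calibration step where the sensor is first held over the floor and then over the edge marking; all names are hypothetical.

```java
// Hedged sketch of threshold-based edge detection with a light sensor
// (hypothetical names; assumes readings on a 0..100 scale and a prior
// calibration over the floor and over the edge marking).
public class EdgeDetector {
    private final int threshold;        // midpoint between the two calibrations
    private final boolean edgeIsBrighter;

    public EdgeDetector(int floorReading, int edgeReading) {
        this.threshold = (floorReading + edgeReading) / 2;
        this.edgeIsBrighter = edgeReading > floorReading;
    }

    // True when the current reading is on the edge side of the threshold.
    public boolean atEdge(int reading) {
        return edgeIsBrighter ? reading >= threshold : reading <= threshold;
    }
}
```

Recalibrating at the start of each game would let the threshold follow the room's current lighting, which is exactly what the fixed color-based check could not do.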
As the project is now, it lacks an actual game element. With some changes we believe we could add a simple one: instead of having the animals grab the food when “eating” and move around with it for a while, we could have them remove the food from the game completely, for instance by simply holding on to it indefinitely. The player would then have to defend the food even more, because once eaten, the food would not return to the game; the player could lose the game when no more food is available. Another idea would be to have the animals move the food to a specific destination and remove it from the game that way. This could be done by mapping the arena and making the robots aware of their own positions, thus making it possible to have them go to specific destinations, using, for example, fiducials and camera tracking. Using this idea we could expand the game further: the autonomous robots would have to bring the food back to their home to eat, without removing the food from the game, and the player robot would have hunger implemented as well. The player robot could then eventually “die” from hunger if it does not eat, and the same would apply to the autonomous robots, again providing win/loss scenarios.
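If camera tracking of fiducials gave us each robot's position and heading in arena coordinates, steering a robot toward a destination would reduce to computing a turn angle and a distance. A minimal sketch of that calculation, with hypothetical names and headings assumed to be in degrees:

```java
// Sketch of navigation from tracked positions (hypothetical names;
// assumes arena coordinates and headings in degrees, 0 = along the x-axis).
public class Navigator {
    // Angle in degrees (-180..180) the robot must turn to face the target,
    // given its tracked position (x, y) and current heading.
    public static double turnAngle(double x, double y, double heading,
                                   double targetX, double targetY) {
        double bearing = Math.toDegrees(Math.atan2(targetY - y, targetX - x));
        double turn = bearing - heading;
        while (turn > 180) turn -= 360;   // normalize to the shortest turn
        while (turn < -180) turn += 360;
        return turn;
    }

    // Straight-line distance from the robot to the target.
    public static double distance(double x, double y,
                                  double targetX, double targetY) {
        return Math.hypot(targetX - x, targetY - y);
    }
}
```

A robot carrying food could then repeatedly turn by `turnAngle` and drive forward until `distance` to its home falls below some small tolerance.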
Taking the above into account could also lead us to a solution to another problem: the IR emitter. The infrared light only needs to be emitted horizontally, not vertically. If we rebuilt the RC vehicle for the third-person view described above, the new construction might block the IR light. The IR emitter obviously needs to be unobstructed, which we could solve by having a band of IR LEDs surrounding the robot. But using the fiducials mentioned above, we could track every robot in the arena and thereby make them aware of their surroundings, removing the IR-dependent behavior from the autonomous robots and replacing it with one that depends on the tracking instead.