Date: 16th of May 2013
Participating group members: Kenneth Baagøe, Morten D. Bech and Thomas Winding
Activity duration: 7 hours
Goal
The goal of this lab session is to discuss our project idea with Ole and, if he approves of it, to start constructing and programming the remote-controlled animal, as well as finding a proper way of streaming video from the robot to a computer or tablet.
Plan
The plan is straightforward once we have received approval from Ole. We start by establishing a Bluetooth connection between the NXT and a computer, which we have done in a previous lab session. Next, we need to encode key presses on the computer in such a way that they can be transmitted over the Bluetooth connection, and the NXT needs to decode them and act appropriately. Furthermore, we need to find a device that can stream video wirelessly.
Progress
After a quick chat with Ole Caprani we got started on the RC animal, as he approved our idea and gave us some project blogs from last year that we could use as inspiration for our project. We split our work into two separate jobs: one was to build a phone holder and the other was to program the remote control software for the NXT and the computer.
The simple part of building the phone mount was assigned to Morten, who quickly built a first prototype. It was then rebuilt to make it possible to put the phone in and take it out again without having to take the whole mount apart each time. In the picture below you can see the phone mount attached to the standard robot we used in the weeks leading up to the project start.
The other part we worked on was the software for remote controlling the robot via a Bluetooth connection. As we had worked with a Bluetooth connection to the robot before, we could draw some inspiration from that project, which meant that the basic problem of establishing the connection was easily solved:
We wrote a Bluetooth connection class for the NXT that runs in a separate thread and accepts connections. When the thread receives data from the controller, it relays it to the thread controlling the robot.
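A minimal sketch of how such a class might look is shown below, assuming the standard leJOS NXT Bluetooth API; the ActionSink interface and setAction() method are placeholders for our actual robot-control thread, not the real code.

import java.io.DataInputStream;
import java.io.IOException;

import lejos.nxt.comm.BTConnection;
import lejos.nxt.comm.Bluetooth;

// Runs on the NXT: waits for the PC to connect over Bluetooth and relays
// each received action code to the thread that controls the motors.
public class BTReceiver extends Thread {

    // Placeholder for the robot-control thread; our actual class differs.
    public interface ActionSink {
        void setAction(int action);
    }

    private final ActionSink robotControl;

    public BTReceiver(ActionSink robotControl) {
        this.robotControl = robotControl;
    }

    public void run() {
        // Blocks until the controller (the PC) opens a connection.
        BTConnection connection = Bluetooth.waitForConnection();
        DataInputStream in = connection.openDataInputStream();
        try {
            while (true) {
                int action = in.readInt();      // one int per state change
                robotControl.setAction(action); // relay to the control thread
            }
        } catch (IOException e) {
            connection.close();
        }
    }
}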
The first implementation of the code was able to accept single key presses on the controller, the controller in this case being a PC, which meant that the robot could simply move forward, move backward and turn in place. When we tested this functionality we noted that being able to steer while moving forward would be useful, so we set about implementing this.
We established an enum to tell which action should be sent to the robot and a number of booleans to tell which buttons are pressed. When a button is pressed its boolean is set to true, and when it is released the boolean is set back to false; a sketch of the key handling is shown after the declarations below.
private enum State {FORWARD, BACKWARD, RIGHT, LEFT, FORWARD_RIGHT, FORWARD_LEFT, BACKWARD_RIGHT, BACKWARD_LEFT, STOP};

private boolean upPressed = false, downPressed = false, leftPressed = false, rightPressed = false, spacePressed = false;
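The booleans can be flipped from a standard AWT KeyListener on the PC side. The sketch below shows how this might look; the methods are assumed to live in the same class as the booleans, updateState() and sendMove() (using java.awt.event.KeyEvent and KeyListener), and the key bindings are an assumption, with space presumably reserved for the attack command mentioned in the backlog.

// These methods implement java.awt.event.KeyListener and sit in the same
// class as the booleans, updateState() and sendMove() shown below.
public void keyPressed(KeyEvent e) {
    setKey(e.getKeyCode(), true);
}

public void keyReleased(KeyEvent e) {
    setKey(e.getKeyCode(), false);
}

public void keyTyped(KeyEvent e) {
    // Ignored; only press/release matters for driving.
}

private void setKey(int keyCode, boolean pressed) {
    switch (keyCode) {
        case KeyEvent.VK_UP:    upPressed = pressed; break;
        case KeyEvent.VK_DOWN:  downPressed = pressed; break;
        case KeyEvent.VK_LEFT:  leftPressed = pressed; break;
        case KeyEvent.VK_RIGHT: rightPressed = pressed; break;
        case KeyEvent.VK_SPACE: spacePressed = pressed; break;
    }
    updateState(); // recompute the state from the booleans
    sendMove();    // transmit the resulting action to the NXT
}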
Following this, the updateState() method is called and finally the action is sent to the robot via the sendMove() method.
private void updateState() {
    if (upPressed && rightPressed) { state = State.FORWARD_RIGHT; return; }
    if (upPressed && leftPressed) { state = State.FORWARD_LEFT; return; }
    if (downPressed && rightPressed) { state = State.BACKWARD_RIGHT; return; }
    if (downPressed && leftPressed) { state = State.BACKWARD_LEFT; return; }
    if (upPressed) { state = State.FORWARD; return; }
    if (downPressed) { state = State.BACKWARD; return; }
    if (rightPressed) { state = State.RIGHT; return; }
    if (leftPressed) { state = State.LEFT; return; }
    state = State.STOP;
}
private void sendMove() {
    int action;

    switch (state) {
        default:
            action = 0;
            break;
        case FORWARD:
            action = 1;
            break;
        //...
        case BACKWARD_LEFT:
            action = 8;
            break;
    }

    try {
        dos.writeInt(action);
        dos.flush();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
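The DataOutputStream dos is the stream obtained when the PC opens the Bluetooth connection to the NXT. With the leJOS PC API this could look roughly as sketched below; the RobotLink class name and the "btspp://" device URL are assumptions for illustration, not our exact code.

import java.io.DataOutputStream;
import java.io.IOException;

import lejos.pc.comm.NXTConnector;

// PC side: opens the Bluetooth connection to the NXT and wraps its raw
// output stream, producing the dos that sendMove() writes to.
public class RobotLink {
    public static DataOutputStream open() throws IOException {
        NXTConnector connector = new NXTConnector();
        // "btspp://" lets the connector search for any NXT over Bluetooth;
        // a specific brick could be addressed by name, e.g. "btspp://OurNXT".
        if (!connector.connectTo("btspp://")) {
            throw new IOException("Could not connect to the NXT over Bluetooth");
        }
        return new DataOutputStream(connector.getOutputStream());
    }
}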
When the robot receives the action it then acts accordingly via a simple switch:
switch (action) {
    default:
        pilot.stop();
        break;
    case 1:
        pilot.forward();
        break;
    case 2:
        pilot.backward();
        break;
    //...
    case 8:
        pilot.steerBackward(-turnSharpness);
        break;
}
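The pilot object in the switch is a leJOS DifferentialPilot driving the two motors. Setting it up could look roughly like this; the wheel diameter, track width and motor ports below are placeholders, not our robot's actual measurements.

import lejos.nxt.Motor;
import lejos.robotics.navigation.DifferentialPilot;

// Fields of the NXT-side program; wheel diameter and track width are given
// in millimetres and the values here are placeholders for our robot's measurements.
DifferentialPilot pilot = new DifferentialPilot(56, 120, Motor.B, Motor.C);
double turnSharpness = 50; // steer() turn rate: 0 drives straight, 200 turns in place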
When we got the code working we were able to drive around using the remote control software, with a Skype video chat live-streaming the view from the robot. We had a bit of fun driving the robot around in the Zuse building and made a small video of it, which can be viewed below.
The video shows the robot driving on one of the arenas, while in the corner of the video you can see the view available to the controller – the two feeds are synchronized.
Backlog
We implemented a quick way of telling the robot to attack. However, when we tested this we noted that the robot was not able to attack while moving forward and steering left, or while moving backward and steering right. We need to look into the cause of this bug.
We would like to have a single Java application in which the video stream is shown and key presses are detected and transmitted to the robot, instead of the current solution that uses Skype to send the video feed. However, we are going to leave it be for the moment, as there is plenty of other stuff that does not work yet which we also have to complete.