A MOARbots volunteer (Scott) made some Python visualizations of the data recorded from the multiple waypoint navigation competition.
Each path comes from a different piece of code running on the robot (sometimes completely different, sometimes only slightly). The waypoints are the large gray circles and always appear in this same pattern, though sometimes the board was rotated (the tags were physically taped to it).
You can see the curvature of the path segments for robots that did not account for the difference in output wheel speed, relative to PWM (pulse-width modulation) value, between the left and right motors. The physical robots were not the same across all these runs (the number at the top is just the robot's ID tag, which is removable). The paths all curve left because the right-hand wheel motor is mounted in the bias direction when moving forward, while the left-hand wheel motor is in the anti-bias direction.
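One common fix for this kind of drift is to trim the PWM value sent to the faster motor. The sketch below is a hypothetical illustration, not the code any team actually ran; the `TRIM` constant and the `drive_forward` helper are assumptions, and in practice the trim factor would be found empirically for each robot.

```python
# Hypothetical sketch: compensate for a left-curving drift by scaling down
# the PWM value sent to the faster (right) motor.

TRIM = 0.92  # assumed: right motor spins ~8% faster at the same PWM value

def drive_forward(speed_pwm):
    """Return (left_pwm, right_pwm) with the right motor's PWM trimmed."""
    left = speed_pwm
    right = int(speed_pwm * TRIM)
    return left, right
```

A robot running uncorrected code drives both motors at the same PWM value and curves left; with a well-tuned trim factor the path segments straighten out.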
The target is considered reached if the robot's tag center point is within 20 units (pixels; our camera never moves) of the goal tag's center. The score is computed as follows:
Score = 120 - t + 25*n
t: time elapsed in seconds since trial started
n: number of waypoints visited
Total number of waypoints: 5
The trial ends if the score reaches zero
The maximum score is therefore 245, for a theoretical 'teleporting' robot.
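The reach test and scoring rule above can be sketched in a few lines of Python. The function names here are illustrative, not from the competition code; the constants come directly from the rules (20-pixel goal radius, 5 waypoints, score floored at zero).

```python
import math

GOAL_RADIUS = 20   # pixels; the camera never moves, so pixel distance is consistent
N_WAYPOINTS = 5

def reached(robot_xy, goal_xy, radius=GOAL_RADIUS):
    """Waypoint is visited when tag centers are within the goal radius."""
    return math.dist(robot_xy, goal_xy) <= radius

def score(t_seconds, n_visited):
    """Score = 120 - t + 25*n, floored at zero (the trial ends at zero)."""
    return max(0, 120 - t_seconds + 25 * n_visited)
```

Plugging in the 'teleporting' robot (all 5 waypoints at t = 0) gives `score(0, 5) == 245`, matching the stated maximum.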
The competition can be summed up by two key challenges: (1) figuring out how to drive quickly toward a single waypoint without overshooting it, and (2) computing the shortest path given the goal locations and the robot's start position.
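For challenge (2), with only five waypoints a brute-force search over every visiting order is entirely feasible (5! = 120 orderings). This is a minimal sketch of that approach, assuming waypoints are given as (x, y) pixel coordinates; the function names are illustrative.

```python
import math
from itertools import permutations

def path_length(start, order):
    """Total Euclidean length of the path start -> order[0] -> ... -> order[-1]."""
    pts = [start] + list(order)
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))

def best_order(start, waypoints):
    """Try every visiting order and return the one with the shortest total path."""
    return min(permutations(waypoints), key=lambda o: path_length(start, o))
```

Brute force stops scaling quickly (10 waypoints would already mean 3.6 million orderings), but for a five-waypoint course it is the simplest correct answer to the shortest-path challenge.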
The robot in the visualization above zig-zagged a little too much, and it did not compute the best path.
The robot in the visualization above correctly found the minimum-length path given the tag locations and the robot's start position.