Wednesday, June 7, 2017

Indoors Navigation Robot ("Driving Drawers") Post 2

This post is about the path planning algorithm for the robot.

Because the robot is a non-holonomic, non-radially-symmetric vehicle with a non-zero turn radius, the path planning problem was fairly difficult.

I was struggling to find a ready-made solution to the following problem: plot a curve between obstacles, with a constraint on the sharpest allowable curvature along the path.

For a setting with no obstacles, with a minimum turn radius specified, and where the starting and ending states are given as a coordinate and an angle in 2D space, I found Reeds-Shepp curves, which are already implemented in Python on GitHub: https://github.com/ghliu/pyReedsShepp
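
To give a sense of the interface, here is a minimal sketch of how a Reeds-Shepp library like that gets called. The module and function names are illustrative rather than taken verbatim from the repo, so check its README for the real bindings.

```python
import math

# Illustrative bindings: a pyReedsShepp-style module typically exposes
# something like path_length(q0, q1, rho) and path_sample(q0, q1, rho, step),
# where q = (x, y, heading) and rho is the minimum turn radius.
import reeds_shepp  # module name is an assumption; see the repo's README

start = (0.0, 0.0, 0.0)           # x, y, heading in radians
goal = (3.0, 4.0, math.pi / 2)    # end up at (3, 4) facing "north"
rho = 1.0                         # minimum turn radius, in map units

length = reeds_shepp.path_length(start, goal, rho)        # shortest path length
samples = reeds_shepp.path_sample(start, goal, rho, 0.1)  # poses every 0.1 units
print(length, len(samples))
```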

I briefly considered using Rapidly-exploring Random Trees (RRTs) to search this problem space, but after reading some literature and testing some code I decided on something different. I think some of the published RRT variants would work nicely, but I didn't want to implement them in Python from scratch. Two repositories I looked at when making this decision:
https://github.com/iamprem/rmp
https://github.com/ArianJM/rapidly-exploring-random-trees

I decided to go with A* for the search. To prepare my problem for that, I first discretized the map of obstacles (represented by polygons) using a visibility graph. I built the visibility graph using another Python repository, pyvisgraph. I spent several days hunting down bugs in the code, turning up one rounding error and one logic error that were causing problems in my setup. I opened a few Issues through the GitHub portal so the author can fix them; I considered doing a pull request, but for just a few lines of code I figured the Issues would suffice. After fixing the bugs I added support for interior polygons and for adding obstacles to the visibility graph without recomputing it, so I should probably clean it all up and do a real pull request eventually.
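
For reference, the core pyvisgraph usage is only a few lines; the coordinates below are placeholders, not my actual room map:

```python
import pyvisgraph as vg

# Each obstacle is a list of vertices (placeholder coordinates).
obstacles = [
    [vg.Point(1.0, 1.0), vg.Point(3.0, 1.0), vg.Point(3.0, 3.0), vg.Point(1.0, 3.0)],
    [vg.Point(5.0, 2.0), vg.Point(7.0, 2.5), vg.Point(6.0, 4.0)],
]

graph = vg.VisGraph()
graph.build(obstacles)  # compute which obstacle vertices can "see" each other

# Shortest obstacle-avoiding polyline between two free-space points.
path = graph.shortest_path(vg.Point(0.0, 0.0), vg.Point(8.0, 5.0))
print(path)
```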

I also had to discretize the robot angle at this point. I decided to go with 4 of what I called 'cones': North, East, South, West. The code has this parameter as K, so I call them K-cones (you can just as easily specify five cones, but then they're harder to refer to casually when describing a solution). Since Reeds-Shepp takes an exact angle, when computing the cost of a path from (start point, start cone) to (goal point, goal cone), I try all four combinations of the clockwise and counterclockwise limits of the start cone and goal cone, then take the worst score and use that as the cost. Paths intersecting the obstacle map are ignored in this step, so they don't count toward the worst score.
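
Roughly, that edge cost works out to the sketch below. The reeds_shepp_length and path_hits_obstacle helpers are stand-ins for the library call and the collision check, not functions lifted from my actual code.

```python
import math

K = 4                          # number of cones: North, East, South, West
CONE_WIDTH = 2 * math.pi / K

def cone_limits(cone_index):
    """Clockwise and counterclockwise angle limits of a cone."""
    center = cone_index * CONE_WIDTH
    return (center - CONE_WIDTH / 2, center + CONE_WIDTH / 2)

def edge_cost(start_xy, start_cone, goal_xy, goal_cone, rho,
              reeds_shepp_length, path_hits_obstacle):
    """Worst-case Reeds-Shepp length over the four combinations of cone limits.

    Colliding paths are skipped, so they never count toward the worst score.
    Returns None if every combination collides with the obstacle map.
    """
    worst = None
    for a0 in cone_limits(start_cone):
        for a1 in cone_limits(goal_cone):
            q0 = (start_xy[0], start_xy[1], a0)
            q1 = (goal_xy[0], goal_xy[1], a1)
            if path_hits_obstacle(q0, q1, rho):
                continue
            length = reeds_shepp_length(q0, q1, rho)
            worst = length if worst is None else max(worst, length)
    return worst
```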

Then I searched over this space with A* and found solutions in the form of a list of (location, cone) pairs. Turning this into a navigable path still required resolving the cones (North, East, South, West) into angles. If there are n steps in the path, and we sample d regularly spaced angles within each cone to find the best overall path length, then the size of the search space is d raised to the power n. It isn't easy, or necessarily possible, to make n any smaller than it already is, so keep d small. I found that even d=2 (checking just the most clockwise and most counterclockwise limits of each cone) could find a solution, and the differences in the best total path length found were minimal anyhow.
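
The angle resolution step is just brute force over those d**n combinations; here is a sketch using the same stand-in helpers as above:

```python
import itertools

def resolve_angles(path, rho, d, cone_samples,
                   reeds_shepp_length, path_hits_obstacle):
    """Pick a concrete heading for each (location, cone) step found by A*.

    path         -- list of ((x, y), cone_index)
    cone_samples -- function returning d regularly spaced angles within a cone
    Tries all d**n combinations and keeps the shortest collision-free one.
    """
    candidates = [cone_samples(cone, d) for _, cone in path]
    best_angles, best_length = None, float("inf")

    for angles in itertools.product(*candidates):  # d**n combinations
        total, feasible = 0.0, True
        for i in range(len(path) - 1):
            q0 = (*path[i][0], angles[i])
            q1 = (*path[i + 1][0], angles[i + 1])
            if path_hits_obstacle(q0, q1, rho):
                feasible = False
                break
            total += reeds_shepp_length(q0, q1, rho)
        if feasible and total < best_length:
            best_angles, best_length = angles, total

    return best_angles, best_length
```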

I used Shapely to handle geometry-related tasks like unioning polygons, checking for intersections, and so forth. I used matplotlib for drawing things to the screen.
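
For example, merging obstacle polygons and testing a sampled path against them takes only a few lines with Shapely (placeholder coordinates again):

```python
from shapely.geometry import LineString, Polygon
from shapely.ops import unary_union

# Two overlapping obstacles merged into a single shape.
obstacles = unary_union([
    Polygon([(1, 1), (3, 1), (3, 3), (1, 3)]),
    Polygon([(2, 2), (4, 2), (4, 4), (2, 4)]),
])

# A candidate path, e.g. the (x, y) samples along a Reeds-Shepp curve.
path = LineString([(0, 0), (2.5, 2.5), (5, 0)])

print(obstacles.intersects(path))           # True: this path clips the obstacles
print(path.intersection(obstacles).length)  # how much of the path lies inside them
```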

You had to read all that, so here are some pictures!


An example solution after the angles have been resolved.

An easy solution, where the goal is immediately reached from the start point with a single Reeds-Shepp curve.

Here's a messy picture of all the edges in the graph for a particular setup. The pink things represent the cones.

This picture that I saved at some point in my work represents paths from points on the visibility graph that intersect the walls of this empty room. The intersecting sections are shown in red.

The strongest thing I can say for my hacked-together system, "A* over Reeds-Shepp with K-cones," is that it seems to work in all the cases I need it to work. I can't say anything about the optimality of the solution, or even about whether a solution will be found if one exists. Computing the visibility graph and its edge costs, expanded for the 4 cones on each point, takes about ten seconds on my laptop, for what that is worth. The result can be saved and reloaded for future use, assuming the obstacle map hasn't changed.
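
If you want that same save/reload behavior, a pickle-based cache is one simple way to get it; the file name and the shape of the cached data below are assumptions, not my actual format.

```python
import os
import pickle

CACHE_FILE = "visgraph_cache.pkl"  # hypothetical cache file name

def load_or_build(build_fn):
    """Reload precomputed graph data if present, otherwise build and save it.

    build_fn is whatever expensive step produces the visibility graph plus the
    K-cone edge costs. The cache is only valid while the obstacle map is unchanged.
    """
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE, "rb") as f:
            return pickle.load(f)
    data = build_fn()
    with open(CACHE_FILE, "wb") as f:
        pickle.dump(data, f)
    return data
```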

I'm taking a break from this project to prepare for the Advanced Topics in CS class that I will be teaching this summer (high school grade levels, but college-level materials). When I come back to this project, I will take the hardware from the last post, the path finding from this post, and a few bits of math I've worked out in my notes, and put the whole thing together into a robot that navigates from room to room using ArUco tags.

Indoors Navigation Robot ("Driving Drawers") Post 1

[Foreword: you may notice the fonts are mismatched in this post. Or maybe you noticed that the images in all the posts on this blog, when viewed on a larger screen (as opposed to a small mobile device screen), are placed haphazardly within the body of the post. This is because Blogger leaves much to be desired when it comes to the WYSIWYG post editor: partly because WYS ("what you see") is not always WYG ("what you get") in Blogger's editor, and partly because there are very limited options for aesthetic image layouts in the body of a post. I am planning a Jekyll-based webpage for this blog in the general future that will fix these problems. Until then, this is a cheap and easy way to make sure I keep a blog at all.]


I had an idea for a chest of drawers that would drive to my location in a space (assuming no stairs).

Here's the first part of my progress on that project.

Physical Build

I started by dissecting my collection of Roombas and taking the wheels from two different models and hot gluing them, along with some support material, to the base of a plastic chest of drawers.








I purchased a Raspberry Pi 3 and a PiCam, and while I waited I put together a 3D-printed pan-tilt design from Thingiverse. I'll provide a link (link to thing), but keep in mind I do not recommend this design. Unless your 3D printer is very precise, you'll have to do quite a bit of filing and sanding to get it to go together. The pan servo absorbs all the impact whenever the assembly hits something, and it will fail (mine did). If you can find a ring of the right thickness and diameter to stick in between the orange disc and blue plate in the photo, the problem is mitigated (I used a bearing I found in a bin--way overkill, as it isn't even acting as a bearing--but a quicker solution than cutting my own ring on the lathe).




Not entirely certain of the layout I wanted, I just taped everything in place with masking tape. The battery is stored underneath, in the topmost drawer of the robot. It's a 30,000 mAh battery I bought for use with my smartphone, with a port that will source 2.5 A, which is what the Raspberry Pi needs. I paid about $75 for this model; you should be able to find comparable batteries from other brands if that one is not available. (Sometimes, when an item is out of stock, a few vendors will offer it at an inflated price, so beware: the price on Amazon for this model was briefly $399 before dropping back to $69.99.)
I pulled the Arduino Mega out of a project I'm not working on at the moment, though it is of course overkill for this application. I wasn't sure how many sensors and actuators I wanted on board, so this 54-I/O-pin Arduino allows quite a bit of room to grow the project. The Raspberry Pi 3 itself only has one PWM-enabled pin available to me, so the Arduino is convenient for handling all the low-level stuff. It talks to the Raspberry Pi over USB. The micro servos are powered from the Arduino Mega, which is in turn powered off the Raspberry Pi; the micro servos' stall current is low enough for this to be possible.
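
On the Raspberry Pi side, talking to the Arduino over USB serial can be as simple as the sketch below; the port name, baud rate, and message format are placeholders for whatever protocol ends up running on the Mega.

```python
import serial  # pyserial

# An Arduino Mega usually enumerates as /dev/ttyACM0 on the Pi; adjust as needed.
arduino = serial.Serial("/dev/ttyACM0", 115200, timeout=1)

def send_command(left_speed, right_speed, steer_angle):
    """Send one comma-separated line for the sketch on the Mega to parse."""
    line = "{},{},{}\n".format(left_speed, right_speed, steer_angle)
    arduino.write(line.encode("ascii"))
    return arduino.readline().decode("ascii").strip()  # optional acknowledgement
```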




The Roomba wheel motors are safe at 12 volts (the Roomba battery voltage), so I put another battery in the system just for them: a 3-cell lithium polymer battery, which measures roughly 11.1 V when it needs to be recharged and 12.6 V when fully charged. The motor drivers are L298 chips on those red Sparkfun breakout boards, with heatsinks mounted to them.







So at this point the robot was driving, but only in a straight line. Turns would drag at least one wheel and make a terrible noise. Only very slight turns worked. This was fairly predictable, but trying to make it work anyway was very much in keeping with my glue-and-tape, then iterate style of prototyping. Jesse helped me put together a steering mechanism in a very short amount of time. It worked, so that evening I took the robot out for an inaugural journey around Sector67, using it as a sort of telepresence robot as I controlled it from my desk.




Then I broke the steering mechanism gearmotor by switching it back and forth too fast when I got stuck in a corner. The gear before the final output shaft broke into a bunch of tiny pieces.



I replaced it with the gearmotor on the right in the image above, the one with the partly blue casing. Now that I had a working robot again, it was time to work on the high-level path planning and code. I'll put that in the next post.