Wednesday, January 9, 2019

Rhombic Enneacontahedron

Over a year ago I decided that the next 3D-printed, magnetic-tile-based polyhedron I would create would be the rhombic enneacontahedron.

In the past I created shapes mostly by looking up the dihedral and face vertex angles on the internet. Only for the most recent one, the parametric pyritohedra/dodecahedron script, did I generate the points of the whole polyhedron and manipulate them before creating the individual faces as tiles. Some of the trapezohedra I attempted to create had the wrong angles, and it wasn't clear to me whether the issue was in the angles I had gotten from the internet or in errors in my OpenSCAD script. So I decided the right way forward would be to build the tiles as I did in the pyritohedra script: create the full polyhedron, then compute the angles for the tiles myself.

I had just learned about Conway polyhedron notation and figured that concept was the right way to build this. But I didn't end up using an existing library for Conway polyhedron notation, or writing my own library of these operators. Instead I did everything a bit less abstractly and elegantly than I'd have liked, but I think that's normal for a first draft.

I switched from OpenSCAD to Python at this point, because I find it impossible to get anything done with for loops in OpenSCAD (they're highly inefficient and slow). For creating polygons in 3D space I used pyny3d, which is already integrated with matplotlib for drawing, and I used numpy for the math. I finally switched to Python 3 for this and wrote every print statement incorrectly the first time...
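
To give a flavor of pyny3d, basic usage looks something like this (a minimal sketch based on its documentation, not my actual script; the coordinates are arbitrary):

import numpy as np
import pyny3d.geoms as pyny

# a single planar polygon floating in 3D space, drawn via matplotlib
face = pyny.Polygon(np.array([[0, 0, 0], [1, 0, 0], [1, 1, 1], [0, 1, 1]]))
face.plot('b')  # 'b' for blue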

We begin with the icosahedron:


For the icosahedron I started with a list of 12 vertices as given in various places online.
Each of the vertex descriptions below, in the format (x, y, z), expands into four vertices, since two of the dimensions carry a ± option:
(0, ±1, ± φ)
(±1, ± φ, 0) 
(± φ, 0, ±1)
Where φ (phi) is the golden ratio, 0.5*(1+sqrt(5))
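
Here's a minimal sketch of generating those vertices with numpy (not my original script):

import itertools
import numpy as np

phi = 0.5 * (1 + np.sqrt(5))  # the golden ratio

vertices = []
for s1, s2 in itertools.product((1, -1), repeat=2):
    vertices.append((0, s1, s2 * phi))    # (0, +/-1, +/-phi)
    vertices.append((s1, s2 * phi, 0))    # (+/-1, +/-phi, 0)
    vertices.append((s1 * phi, 0, s2))    # (+/-phi, 0, +/-1)
vertices = np.array(vertices)             # shape (12, 3)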




The first goal is to truncate the icosahedron. I came up with my implementation of truncation by interpreting the picture and the one-line description from Wikipedia, quoted below (the picture illustrates it on an example cube):


Truncate cuts off the polyhedron at its vertices but leaves a portion of the original edges.

I describe truncation like this:
For each vertex, find every edge it is a part of, and create a vertex one third of the way inward along that edge. The new shape's vertices are the newly created vertices.
I separate the new shape's faces into two categories.
The first category is created by drawing edges between all the new vertices spawned off a single old vertex. In our case, five edges meet at each vertex of the icosahedron, so these are five-sided faces (pentagons).
The second category is the truncated versions of the original faces. For each original face, consider each of its original vertices. That original vertex spawned some new vertices, but only two of them lie on the original face we are currently evaluating. The new face that replaces the old face is defined by all of the new vertices, collected over the old face's vertices, that lie on that face.
The icosahedron had triangular faces, so with two new vertices per original vertex times three original vertices, we have created six-vertex faces (hexagons).
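
Here's a minimal sketch of the new-vertex step (assuming vertices as a numpy array and edges as a list of index pairs; not my full script):

import numpy as np

def truncation_points(vertices, edges):
    """For each edge (i, j), create the two new vertices one third of the
    way in from each endpoint. Keyed by (old vertex index, edge) so the
    faces can be assembled afterwards."""
    new_points = {}
    for i, j in edges:
        a, b = vertices[i], vertices[j]
        new_points[(i, (i, j))] = a + (b - a) / 3.0  # one third from a
        new_points[(j, (i, j))] = b + (a - b) / 3.0  # one third from b
    return new_points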



The truncated icosahedron is a soccer ball, shown above in red pentagons and blue hexagons.

Now we must run a join operation on the truncated icosahedron. Again, I expanded from only the short Wikipedia entry:

Join creates quadrilateral faces. 

That's not a lot to go on... To better understand it I considered a similar operation, kis:


Kis raises a pyramid on each face

Okay, so join is like kis without the original edges. The old vertices can stay put, but the new vertices, created in the middle of each face, need to be lifted. But by how much? And where do we put them so they're centered on the face?

I had already gone over the math for finding the vertex on the face when I worked through the math for the dual operator. Luckily for me the pentagons and hexagons here are regular, so in this case I could just take the mean of the face's points.

The insight for how to lift the vertices came from symmetry. Some of the newly created quadrilaterals go from one hexagon center point to another hexagon center point. To preserve the symmetries, I assumed all hexagon center points get lifted by the same amount. I drew a quick diagram to work out how far to raise them so that all four points of such a quadrilateral are coplanar (otherwise we don't have a valid face; at the time of writing, this site, which I enjoyed using while learning about Conway operators, seems to have some nonplanar 'faces', so it looks very strange).

New quadrilaterals connecting old hexagon centers to old pentagon centers must also be created, using two old vertices that come from the edge shared between the adjacent hexagon and pentagon. Since the height of the point raised off the hexagon center is already fixed, and the two points from the shared edge stay where they were, the only thing left is to compute how far to raise the pentagon center along the normal of the pentagon face until it lies on the plane defined by the other three points.
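
That last step is a small linear solve. A minimal sketch with illustrative names (p is the pentagon center, n its unit face normal, q1 through q3 the three already-fixed points):

import numpy as np

def lift_to_plane(p, n, q1, q2, q3):
    m = np.cross(q2 - q1, q3 - q1)        # normal of the target plane
    t = np.dot(m, q1 - p) / np.dot(m, n)  # solve dot(m, p + t*n - q1) = 0
    return p + t * n                      # the lifted, coplanar point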

To draw the faces I needed to keep track of which faces were adjacent to which other faces, so I created some dictionaries and populated them with simple for loops; I'm not concerned about optimizing this code at all, given what I'm using it for.



The next step was to compute and print out the angles I needed to create a tile. This was just a matter of identifying the right vertices, edges, and vectors and computing various norms and cross products.
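
For example, a dihedral angle can be computed from the two adjacent faces' outward normals (a minimal sketch, not my exact code):

import numpy as np

def face_normal(pts):
    # pts: ordered vertices of a planar face, shape (k, 3)
    return np.cross(pts[1] - pts[0], pts[2] - pts[0])

def dihedral_angle_deg(n1, n2):
    # the interior dihedral angle is 180 minus the angle between
    # the two faces' outward unit normals
    n1 = n1 / np.linalg.norm(n1)
    n2 = n2 / np.linalg.norm(n2)
    return 180.0 - np.degrees(np.arccos(np.clip(np.dot(n1, n2), -1.0, 1.0)))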

For the tiles I moved back to OpenSCAD. I reused my rhombic triacontahedron tile code for the parallelogram faces (red). For the kite faces (blue) I created some new, but very similar code.

Here's the info it printed out. There's definitely going to be some rounding error, but nothing significant for my purposes.

There are two dihedral angles:
Red to blue faces: 159.4455573432933 degrees
Blue to blue faces: 160.81186354627906 degrees

The red parallelogram faces have an acute interior angle: 56.67993474152847 degrees
And the supplement (computed just as a check, not needed as additional info): 123.32006525847153 degrees

The blue kites are more interesting. The skinniest angles are at the tips of the five-pointed stars they create:
56.67993474152846 degrees

The opposite angle, which lies in the middle of those stars where they meet in groups of five, is close but certainly not the same angle:
69.7316519531762 degrees

It is a kite, so it is symmetric in the remaining dimension, and both of the larger angles are the same:
116.7942066526476 degrees

This didn't seem to match the info on Wikipedia, and I'm also not seeing such a clear distinction between 'slim' and 'broad' rhombi. I wonder if I did something wrong, though so far everything is internally consistent in my construction. I wonder if there's an assumption I made that removed a degree of freedom....

Finally I needed some info about the edges. The red edges are all the same length, and they match the longer of the two edge lengths on the blue tiles. The other type of edge is the one the blue tiles share with each other; I call this a 'short edge.'

I computed a long side: 0.4339800705873414
And a short side: 0.36037602329028573

And the ratio of short to long:
0.36037602329028573 / 0.4339800705873414 ≈ 0.8304

I ordered a few kinds of magnets in sufficient quantity (I need 90 faces * 4 magnets per face, but I might also double up and do 2 magnets per edge, i.e. 8 magnets per face). While I wait, I am 3D printing some tiles just to tape together and see if the angles work. So far it is looking promising.

parallelogram faces model, shown with magnet holes


 kite faces model, shown without magnet holes


They look fairly similar to the untrained eye...

Taped together with masking tape as a test: red parallelograms and white kites. They were all printed in white; I then colored the parallelograms red with a Sharpie.



A five pointed star of kites.



Since these come together nicely, I think I have all the vertex angles and the dihedral angles correct.

I've always used either masking tape (for testing) or magnets (super-glued into the slots) to construct these, but somebody who printed one of my designs and tried gluing the tiles together noticed an accumulating offset that grew as the shape was constructed and kept it from closing. It seems that you need to leave some flexibility in order to distribute the errors that come from the manufacturing process or from misalignment during assembly. Some ideas I haven't tried: putty, velcro.

Once I've had a chance to build the whole thing, I'll follow up with another post, and clean up the code and commit it to GitHub as well.

Monday, December 3, 2018

Joystick Color Wheel with 3 Op Amps

I love microcontrollers but I've seen one too many 'Raspberry Pi deployed in order to blink a light' projects. Don't they know you can do that without a computer? They might not know.

I was sitting at the hardware table at HackUMass, watching everybody check out Arduinos and Raspberry Pis while ignoring the transistors. So I thought I'd make a few simple circuits for demonstration. First was a simple flex sensor controlling an LED. Then I inverted the behavior--the flex sensor turned the LED off instead of on. Then I did the same sort of thing with the potentiometers from a joystick. It made sense to upgrade to an RGB (red green blue) LED. But there are two potentiometers (and one switch) on the joystick, and three colors in the LED. What kind of behavior would be most satisfying?

It was suggested that I implement a classic color wheel: three axes, set 120 degrees apart from each other, for red, blue, and green.



Okay, perfect--I can do that with some op amps. There are two axes (potentiometers) on the joystick, and each is configured as a voltage divider. We need to make weighted sums of the X-direction and Y-direction voltage divider outputs to create the Blue and Green directions. Red is already aligned with the Y axis.


https://en.wikipedia.org/wiki/Operational_amplifier_applications#Summing_amplifier

I went through a few plans for the design. I ultimately settled on the LM358N chip with a single-sided supply and a virtual ground. I set the virtual ground to half Vcc with a simple voltage divider, guessing that the joystick rests at half Vcc (which might not be completely true).

The Blue direction sits 30 degrees below the X+ direction: X*cos(30) + Y*sin(30), implemented as a summing amplifier--that's the first op amp. For Green I used the same calculation but flipped the X axis with an inverting amplifier, so the Green axis takes two more op amps. There are two op amps per LM358N, so that's two ICs.
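
In code form, the three channel values work out to something like this (a minimal sketch of just the math, ignoring the virtual-ground offset and the inversion the amplifiers introduce):

import numpy as np

def channel_values(x, y):
    # red rides the Y axis directly; blue and green are weighted sums,
    # with green's X contribution flipped by the inverting amplifier
    red = y
    blue = x * np.cos(np.radians(30)) + y * np.sin(np.radians(30))
    green = -x * np.cos(np.radians(30)) + y * np.sin(np.radians(30))
    return red, green, blue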

Each axis controls an NPN BJT transistor (I chose the 2N4401 for this). The red axis is just the joystick's Y-axis voltage divider output controlling its transistor. It looks like it would go in the wrong direction (-Y), but because the summing amplifiers also invert their values relative to the virtual ground, everything ends up working out.

Finally, there are potentiometers inline with the base resistors to allow calibration of the three color channels. I found I got the best results when the lights were all on and balanced to a medium white in the joystick's default center state.



This description isn't very detailed, but this isn't a tutorial. I'm hoping to create a series of high-quality instructional resources in the future, and I'd create a module on this circuit as part of that. It takes a lot of time to create high-quality content, though, and I just don't have that time at the moment. So for now, if you want to make one, I'll leave the details of implementation as an exercise to you, the reader.

Finally, here's the video. I put a piece of plastic over the RGB LED to diffuse the light, because it came in a clear package. That made it much nicer to look at, but the video still suffers from poor dynamic range.



Wednesday, November 28, 2018

Ubuntu won't boot, waiting on dev-mapper-cryptswap

I updated Ubuntu 18.10 cosmic and of course everything got crappy. The trackpad started registering spurious touches of my palm in the upper left corner, which made typing very frustrating. And then the machine started hanging with a weird window manager (GNOME?) glitch after spending the night in sleep mode with the lid shut, so I had to hard reboot in the morning.

This happened twice, and then it wouldn't boot anymore. It would hang on the purple screen with the word 'ubuntu' and some loading-bar dots. I force rebooted (hold Shift for GRUB), selected 'Advanced options', and chose a recovery mode entry for the latest version. Then I could see that boot was waiting endlessly on dev-mapper-cryptswap1.device--that was why it would not boot, not even in recovery.

I found several sites suggesting I edit the /etc/fstab file and comment out any lines mentioning cryptswap. (Note to self for later: figure out whether I should set up an encrypted swap or just live without swap.)
https://ubuntuhak.blogspot.com/2017/05/a-job-is-running-for-dev-mapper.html

OK but if I can’t boot, I can’t access any shell! How do I edit the fstab file??? I found this answer:
https://superuser.com/questions/1013658/how-to-skip-startup-jobs-for-fstab-no-timeout-centos7

But I was missing the context for it. Where do I enter that emergency boot parameter?

I was getting sick of typing Google search queries into my phone, so I went back to GRUB and looked around. In the advanced options section, the text says you can press 'e' to edit an entry. Then I saw something like this:

setparams 'Ubuntu, with Linux 4.18.0-11-generic'
    recordfail
    load_video
    gfxmode $linux_gfx_mode
    insmod gzio
    if [ x$grub_platform = xxen ]; then insmod xzio; insmod lzopio; fi
    insmod part_gpt
    insmod ext2
    if [ x$feature_platform_search_hint = xy ]; then
      search --no-floppy --fs-uuid --set=root [uuid]
    else
      search --no-floppy --fs-uuid --set=root [uuid]
    fi
    echo 'Loading Linux 4.18.0-11-generic ...'
    linux /boot/vmlinuz-4.18.0-11-generic root=UUID=[the uuid] ro acpi_rev_override quiet splash $vt_handoff
    echo 'Loading initial ramdisk ...'
    initrd /boot/initrd.img-4.18.0-11-generic

That seemed promising. I added a -b to the /boot/vmlinuz arguments, like so:

linux /boot/vmlinuz-4.18.0-11-generic root=UUID=[the uuid] ro acpi_rev_override quiet splash $vt_handoff -b

Then I hit Ctrl+X, as the instructions suggested, to boot. This doesn't permanently change the options; it boots with this modified entry just this once. So I entered emergency mode, hit Ctrl+D as the instructions suggested, and I was in.

Back to the instructions from the answer from before: https://superuser.com/questions/1013658/how-to-skip-startup-jobs-for-fstab-no-timeout-centos7
In case this leaves the root file system read-only, you can run mount -o remount,rw / once in the shell.
I didn't try skipping that step; I just assumed it was necessary, and that if I had skipped it I would have found myself looking at my file system in read-only mode.

So back to https://ubuntuhak.blogspot.com/2017/05/a-job-is-running-for-dev-mapper.html
The solution is to remove or comment out the "cryptswap" entries from /etc/fstab and /etc/crypttab. This can be done easily by editing the above mentioned files and commenting out the lines that say cryptswap by placing a "#" in front of the matching lines.
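
For example (the device name below is illustrative; yours may differ):

# in /etc/fstab (and similarly in /etc/crypttab):
#/dev/mapper/cryptswap1 none swap sw 0 0
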
I did that and saved the file in nano. (I forgot how to use nano, but the bottom of the screen suggested some commands, so I ran the one for exit, and it asked me to type Y to save before exiting.) Then I restarted the computer, I believe with the shutdown command, pressing the power button afterwards to boot back up.

Then I figured before I got back to work, I could make a short blog post out of it. Here you go.

-Shira

Tuesday, June 5, 2018

CRT and Magnets Exhibit



This Cathode Ray Tube + Magnets exhibit started off with a couple of CRT monitors that were gathering dust in storage. I was asked to consider putting them out in the main space of the makerspace. I decided I would only allow this if they did something, and I set out to decide what that something would be. It turned into a fun, easy exhibit, accessible to all ages, that we now turn on for every tour we lead through the makerspace. It is a great way to quickly and cheaply construct a meaningful interactive science exhibit to add to your collection.




Here's a document I put together to explain what's happening. I taped it to one of the TV antennas (the antenna is not used) so it stays front and center in the exhibit and people are encouraged to actually read it. Here's a lower-quality image of the same document so you can see it embedded in the post.


Here's the bill of materials:

  • CRT television. You may have to turn to eBay or Craigslist. These may only get harder to acquire with time.
  • If the CRT has VHF/UHF inputs, you'll need a box like this one to convert the signal to composite video: https://www.amazon.com/dp/B0014KKV7W/
  • For the camera, the backup camera is cheap and outputs a composite video signal over RCA connectors. https://www.ebay.com/itm/CMOS-Car-Rear-View-Reverse-Backup-Camera-Parking-Night-Vision-Waterproof-7-LED/291918612347
  • If you'd like to place the camera elsewhere, you can get a cheap 2.4GHz transmitter like this one https://www.ebay.com/itm/2-4G-Wireless-Video-Transmitter-Receiver-Kit-for-Car-Rear-Backup-View-Camera/163041336550
  • A powerful neodymium magnet. You want something strong enough to have an effect and a good size to be easy to handle. Something like 0.5 inch diameter and 0.5 inch height seems like a good size to me, but I'd suggest just seeing what's available and trying it. You can always stack up multiple smaller magnets.
Assembly is just a question of plugging everything into power and getting the signal into the TV. Note that the backup camera requires a little work to plug in; it is designed to be hooked up to a 12V car battery connection point. You will need a 12VDC wall adapter if it does not come with one, you will need to make sure the red/positive and black/negative connectors go to the right places, and you will probably need to solder at least one connection (or use another strategy to connect the wires).

The magnet should be protected with something soft to avoid it hitting against metal and breaking or pinching fingers. I used two furniture feet and some masking tape. I also suggest putting it on a string so it doesn't wander off.

Cable management was the longest part of the project. I zip-tied all the extra cable length to the back of the assembly. A single switch controls the extension cord the entire unit is plugged into. Note that CRTs make a high-pitched noise that some people don't like to hear all day, so I don't leave it on all the time, since the main room of the makerspace is also a meeting and study space.

The cameras are taped to the top of the CRTs and point at brightly colored pieces of paper. This is important because the magnet's effect is not nearly as visible on black & white images.


An additional, optional modification I made to the backup camera was removing the infrared LEDs it contains for visibility at night. The camera ran very warm, and when the IR LEDs activated--which sometimes happened when the exhibit sat in low-light conditions--the image washed out. I opened the unit and desoldered the IR LEDs. My original plan was to put gaffer's tape over them, but that seemed to make the camera heat up even more. There is a light sensor inside the unit, and in bright lighting the infrared LEDs are not activated, so this is not necessarily something that needs to be addressed for the exhibit to function.


It is also important to note that the backup camera is designed to mirror its image. The sign in the image above is printed as a mirror image so that it shows up correctly on the display. The backup distance overlay is another artifact of choosing a backup camera; I find it fun, and it adds to the color distortion effect since it is displayed in bright colors.

Wednesday, June 7, 2017

Indoors Navigation Robot ("Driving Drawers") Post 2

This post is about the path planning algorithm for the robot.

Because the robot is a non-holonomic, non-radially symmetric, non-zero turn radius type of vehicle, the path planning problem was fairly difficult.

I was struggling to find a readymade solution to the following problem: plot a curve between obstacles, with a constraint on the sharpest allowable curvature in the path.

For a setting with no obstacles, a specified minimum turn radius, and starting and ending states given as a coordinate and angle in 2D space, I found Reeds-Shepp curves, already implemented in Python on GitHub: https://github.com/ghliu/pyReedsShepp

I briefly considered using Rapidly Exploring Random Trees to search for solutions in this problem space, but decided on something different instead after reading some literature and testing some code. I think some of the variants I've seen published of RRTs would work nicely, but I didn't want to implement them in Python from scratch. Two repositories I looked at when making this decision:
https://github.com/iamprem/rmp
https://github.com/ArianJM/rapidly-exploring-random-trees

I decided to go with A* for the search. To prepare my problem for that, I first discretized the map of obstacles (represented by polygons) using a visibility graph, which I built with another Python repository, pyvisgraph. I spent several days hunting down bugs in the code, turning up one case of rounding error and one logical error that were causing problems in my setup. I opened a few Issues through the GitHub portal so the author can fix them; I considered a pull request, but for just a few lines of code I figured the Issues would suffice. After fixing the bugs I added support for interior polygons and for adding obstacles to the visibility graph without recomputing it, so I guess I should probably clean it all up and do a real pull request eventually.
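
Basic pyvisgraph usage looks roughly like this (a sketch following the project's README; the coordinates are arbitrary):

import pyvisgraph as vg

# one triangular obstacle; build the visibility graph around it
polys = [[vg.Point(0.0, 1.0), vg.Point(3.0, 1.0), vg.Point(1.5, 4.0)]]
graph = vg.VisGraph()
graph.build(polys)

# shortest obstacle-avoiding path between two points
path = graph.shortest_path(vg.Point(1.5, 0.0), vg.Point(4.0, 6.0))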

I also had to discretize the robot's angle at this point. I decided to go with 4 of what I call 'cones': North, East, South, West. The code has this count as a parameter K, so I call them K-cones (you can just as easily specify five cones, but then they're harder to refer to casually when describing a solution). Since Reeds-Shepp takes an exact angle, when computing the cost of a path from (start point, start cone) to (goal point, goal cone) I try all four combinations of the clockwise and counterclockwise limits of the start and goal cones, then take the worst score and use that as the cost. Paths intersecting the obstacle map are ignored in this step, so they don't count towards the worst score.
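
A minimal sketch of that worst-case edge cost, assuming the path_length(q0, q1, rho) helper from the pyReedsShepp repository (the obstacle check is left out):

import math
import reeds_shepp  # from pyReedsShepp

K = 4                  # number of cones: North, East, South, West
CONE = 2 * math.pi / K
RHO = 1.0              # minimum turn radius

def cone_limits(cone):
    # the clockwise and counterclockwise angle limits of a cone
    center = cone * CONE
    return (center - CONE / 2, center + CONE / 2)

def edge_cost(p0, cone0, p1, cone1):
    # worst Reeds-Shepp length over the four limit combinations
    # (paths hitting the obstacle map would be skipped here)
    costs = []
    for a0 in cone_limits(cone0):
        for a1 in cone_limits(cone1):
            costs.append(reeds_shepp.path_length(
                (p0[0], p0[1], a0), (p1[0], p1[1], a1), RHO))
    return max(costs)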

Then I searched over this space with A* and found solutions in the form of a list of (location, cone) pairs. But resolving this into a navigable path still required resolving the cones (North, East, South, West) into angles. If there are n steps in the path, and we choose to sample d regularly spaced angles within each cone to find the best overall path length, then the size of the space is d raised to the power n. It isn't easy, or necessarily possible, to make n any smaller than it already is--so keep d small. I found that even d=2 (checking just the most clockwise and most counterclockwise limits of each cone) could find a solution. The differences between the best total path lengths found were minimal anyhow.
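
A minimal sketch of enumerating those d**n combinations (reusing cone_limits from the sketch above):

import itertools

def candidate_angle_sets(path_cones, d=2):
    # d regularly spaced angles per cone; d=2 checks just the two limits
    per_step = []
    for cone in path_cones:
        lo, hi = cone_limits(cone)
        step = (hi - lo) / (d - 1) if d > 1 else 0.0
        per_step.append([lo + k * step for k in range(d)])
    return itertools.product(*per_step)  # d**n tuples of concrete angles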

I used Shapely to handle geometry-related tasks like union-ing polygons together, checking for intersections, and so forth, and matplotlib for drawing things to the screen.
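
For instance (a minimal sketch; the polygons are arbitrary):

from shapely.geometry import LineString, Polygon
from shapely.ops import unary_union

# union two overlapping square obstacles into one shape
obstacles = unary_union([
    Polygon([(0, 0), (2, 0), (2, 2), (0, 2)]),
    Polygon([(1, 1), (3, 1), (3, 3), (1, 3)]),
])

# check whether a straight path segment would hit them
segment = LineString([(-1, -1), (4, 4)])
print(segment.intersects(obstacles))  # True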

You had to read all that, so here are some pictures!


An example solution after the angles have been resolved.

An easy solution, where the goal is immediately reached from the start point with a single Reeds-Shepp curve.

Here's a messy picture of all the edges in the graph for a particular setup. The pink things represent the cones.

This picture that I saved at some point in my work represents paths from points on the visibility graph that intersect the walls of this empty room. The intersecting sections are shown in red.

The strongest thing I can say for my hacked-together system, "A* over Reeds-Shepp with K-Cones", is that it seems to work in all the cases I need it to work. I can't say anything about the optimality of the solutions, or even about whether a solution will be found when one exists. For what it is worth, computing the visibility graph and the edge costs on it, expanded for the 4 cones on each point, takes about ten seconds on my laptop. The result can be saved and reloaded for future use, assuming the obstacle map hasn't changed.

I'm taking a break from this project to prepare for the Advanced Topics in CS class I will be teaching this summer (high school grade levels, but college-level materials). When I come back to this project, I will take the hardware from the last post, the path finding from this post, and a few bits of math I've worked out in my notes, and put the whole thing together into a robot that navigates from room to room using ArUco tags.

Indoors Navigation Robot ("Driving Drawers") Post 1

[Foreword: you may notice the fonts are mismatched in this post. Or maybe you noticed that the images in all the posts in this blog, when viewed on a larger screen (as opposed to a small mobile device screen), are placed haphazardly within the body of the post. This is because Blogger leaves much to be desired when it comes to the WYSIWYG post editor. In part because WYS ("What you see") is not always WYG ("what you get") in the context of Blogger's post editor, but also because there are very limited options for aesthetic image layouts in the body of a post. I am planning a Jekyll based webpage for this blog in the general future that will fix these problems. Until then, this is a cheap and easy way to make sure I keep a blog at all.]


I had an idea for a chest of drawers that would drive to my location in a space (assumes no stairs).

Here's the first part of my progress on that project.

Physical Build

I started by dissecting my collection of Roombas and taking the wheels from two different models and hot gluing them, along with some support material, to the base of a plastic chest of drawers.








I purchased a Raspberry Pi 3 and a PiCam, and while I waited I put together a 3D-printed pan-tilt design from Thingiverse. I'll provide a link (link to thing), but keep in mind I do not recommend this design. Unless your 3D printer is very precise, you'll have to do quite a bit of filing and sanding to get it to go together, and the pan servo absorbs all the impact whenever the assembly hits something, so it will fail (mine did). If you can find a ring of the right thickness and diameter to stick in between the orange disc and the blue plate in the photo, the problem is mitigated (I used a bearing I found in a bin--way overkill, as it isn't even acting as a bearing--but a quicker solution than cutting my own ring on the lathe).




Not entirely certain of the layout I wanted, I just taped everything in place with masking tape. The battery is stored underneath, in the topmost drawer of the robot. It's a 30,000 mAh battery I bought for use with my smartphone; it has a port that will source the 2.5A the Raspberry Pi needs. I paid about $75 for this model; you should be able to find comparable batteries from other brands if that one is not available. (Sometimes, when an item is out of stock, a few vendors will offer it at an inflated price, so beware: the price on Amazon for this model was briefly $399 before dropping to $69.99 again.)
I pulled the Arduino Mega out of a project I'm not working on at the moment, though it is of course overkill for this application. I wasn't sure how many sensors and actuators I'd want on board, and this 54-I/O-pin Arduino allows quite a bit of room to grow the project. The Raspberry Pi 3 itself only has one PWM-enabled pin available to me, so the Arduino is convenient for handling all the low-level stuff. It talks to the Raspberry Pi over USB. The micro servos are powered from the Arduino Mega, which is in turn powered off the Raspberry Pi; the micro servos' stall current is low enough for this to be possible.




The Roomba wheel motors are safe at 12 volts (the Roomba battery voltage), so I put another battery in the system just for them: a 3-cell lithium polymer battery, which measures roughly 11.1 volts when it needs to be recharged and 12.6 volts when fully charged. The motor drivers are L298 chips on the red Sparkfun breakout boards, with heatsinks mounted to them.







So at this point the robot was driving, but only in a straight line. Turns would drag at least one wheel and make a terrible noise; only very slight turns worked. This was fairly predictable, but trying to make it work anyway was very much in keeping with my glue-and-tape-then-iterate style of prototyping. Jesse helped me put together a steering mechanism in a very short amount of time. It worked, so that evening I took the robot out for an inaugural journey around Sector67, using it as a sort of telepresence robot controlled from my desk.




Then I broke the steering mechanism gearmotor by switching it back and forth too fast when I got stuck in a corner. The gear before the final output shaft broke into a bunch of tiny pieces.



I replaced it with the gearmotor on the right in the image above, the one with the partly blue casing. Now that I had a working robot again, it was time to work on the high-level path planning and code. I'll put that in the next post.



Sunday, May 7, 2017

FRC Machine Shop with Sector67

In April Sector67 volunteered to run a machine shop for both FIRST Robotics Seven Rivers Regional and St. Louis Worlds events. It had been a while since I participated in FRC in any form (though I did mentor an FTC team this year, which is another FIRST competition).

Here are some photos. Since I only took photos when there was a break in the work orders coming in to the shop (usually because there was some mandatory attendance event for teams), this photo set gives the impression that we were a lot less busy than we actually were.

Seven Rivers Regional one view of the machine shop

Seven Rivers Regional game field

Teaching lockpicking at the table, photo left


St. Louis Worlds Machine shop front side, 3D printer and laser cutter with fume extractor

More stuff in the St. Louis machine shop setup

View of the St. Louis playing fields in the convention center arena



Monday, April 3, 2017

Giant Display and Home for my First Arduino


The extra large (6.5") seven segment display array pictured above has been at my desk for a long time now, but I never posted about it. Remembering how it came together is a fun walk through memory lane.

Central to this project is my very first Arduino. My boyfriend at the time bought it, spent an afternoon on it with a friend, and then decided he didn't want to do anything else with it and asked me if I wanted it. It is an Arduino Diecimila, and this must have been around 2008-2009, because I remember one of my first projects with Arduino was interfacing with an ultrasonic rangefinder and an optical encoder that year. It later became Swervy's brain. Memories of my freshman dorm remind me why students need hackerspaces:


I was a really bad roommate...I shared this room with my unfortunate roommate. The dividing line is supposed to be roughly in between the black backpack and black storage cube.

My 'hackerspace' at the time was a $99 refurbished Ryobi drill + circular saw, a $30 set of Ryobi drill bits, a RadioShack soldering iron, and misc. things taken out of the dumpsters at USC, along with some sensors I got from my FIRST robotics team and the aforementioned Arduino Diecimila.

Sophomore year of college, at the beginning of the spring semester, Sparkfun held a big promotional event/server load test: they were giving away one hundred $100 free orders (Sparkfun Free Day). I skipped that morning's calculus lecture and hit refresh on the page until I won my free order. The four 6.5" red seven-segment displays were in my cart for that order, along with a few other things, including a Simon Says kit that taught me how to do surface-mount soldering.


At some point in 2015 I dug up these parts and put it all together. The circuit is just a bunch of BJT transistors and some buttons to set the input. I never figured out what I wanted it to do, so I put it up on the wall and update it as a calendar, manually, whenever I come in to my office. The Arduino doesn't have enough output pins to drive all the digits at once, so it enables and sets one digit at a time, repeatedly and very quickly--a very common solution (multiplexing) to a common problem, but it was a fun afternoon project anyway. The wiring harness took the most time. I think I glued the whole thing to the wall with no consideration for whether I'd want to move it in the future; I'll have to tear it off eventually, since Sector67 is getting a new building. Chris jokes sometimes that it looks like what the TSA might think is a bomb; this clock slightly predates the recent clock incident. Which also reminds me of this page, which left a huge impression on me when I first read it in college and turned me into a fan.

Friday, March 31, 2017

Telegraph Sounder Player

Yesterday I put the finishing touches on my latest project, the telegraph sounder player.

Video link first if you want to see it in action before you read: https://www.youtube.com/watch?v=vKzZMfzuw68



The device on the left is the telegraph sounder player, and the device on the right is the telegraph sounder, a vintage telegraph module. The player allows up to four sounders to be plugged in to it, and plays back random snippets from text files loaded onto an SD card. It uses American Morse code (as opposed to International Morse). I loaded the SD card with excerpts from my favorite novels, as well as full novels from Project Gutenberg.

The project was made for the owner of the telegraph sounder, Don. Don used to work as a telegrapher for a railroad company (US land based telegraph operations used American Morse code). He had a first version of the player built for him during the most recent Build Madison event and came back to Sector67 with notes for some features he would like for the next version.

Features of this telegraph player include: SD card playback, support for 4 sounders, a knob for playback speed, a separate knob for inter-sentence interval length, a status LED and ON/OFF toggle, a CNC-milled hardwood box with brass corners, battery power (est. 10 hours of playback), and an onboard battery charging circuit.

Design process in brief:

I had four 5V relays I found in a bin here at Sector, so I chose to support up to 4 sounders and to use a 5V Arduino for the brains. The relays are each switched by a BJT transistor circuit, and when switched they connect the sounder port to the battery pack.

The ports are barrel jack connectors with a 10uF capacitor across the leads and a high-power-rating, low-resistance resistor. The resistor is mostly just in case the leads going to the telegraph are placed so that they touch each other.

I had been playing around with trying to minimize the currents induced by the sounder turning off (the clack, as opposed to the click). A large ferrite choke close to the telegraph, with many windings of the connector cable, worked, but was a little unwieldy. The capacitor doesn't entirely solve the problem, but it helps make the clack sharper. With too large a value the clack actually bounces, presumably as the capacitor charges on the induced current and then discharges back into the sounder, re-enabling it somewhat. The 'problem' is in any case limited to the Arduino resetting, and only when connected to a computer during the disconnection of the sounder, so it was possible to ignore.

The SD card is connected through a bidirectional level converter. The Arduino has limited memory, so the code grabs small chunks of the files into four small buffers, one for each of the four ports. The Morse codes are stored as 2-byte unsigned int objects, with the exception of a few special 4-byte unsigned long objects for some of the punctuation. The code can be found here: Arduino Code.
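
The linked code has the real encoding; just to illustrate the flavor of packing variable-length element sequences into fixed-width ints, here's one possible scheme (a guess, not the actual encoding):

# purely illustrative: 2 bits per element, 0 terminates, so up to 7
# elements fit in 16 bits; longer codes would spill into 32 bits
DOT, DASH, LONG_DASH = 1, 2, 3  # American Morse also has extra-long dashes

def pack(elements):
    word = 0
    for el in reversed(elements):
        word = (word << 2) | el
    return word

def unpack(word):
    elements = []
    while word & 0b11:
        elements.append(word & 0b11)
        word >>= 2
    return elements

assert unpack(pack([DOT, DASH])) == [DOT, DASH]  # 'A' = dot dash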

Access to the Arduino is through the bottom of the box, which is attached with screws to corners that are glued inside the box.


The project uses two battery cells that each have overcurrent and undervoltage protection built in. The charging circuit is as described here.

You can find more photos of the project and an additional write up here: https://imgur.com/gallery/fIBFZ

And a video here:
https://www.youtube.com/watch?v=vKzZMfzuw68






Friday, March 17, 2017

Painting Robot Compilation

A while back somebody donated an old robot arm kit to Sector67. I built a small model of it with glue, tape, foamboard, and potentiometers, and hooked it up to an Arduino. The project is very simple, but it has been very popular at science fairs (and has survived many hours of use, requiring only minimal fixes). Here's a compilation video.