The Micro One-Armed Bandit

Over the holidays I came up with an idea to create a micro slot machine coded in #CircuitPython using the Adafruit 5×5 NeoBFF LED matrix. A micro switch, or limit switch, is used to ‘pull’ the slot machine arm. A piezo buzzer bleeps and bloops familiar tones and may signal a win. Ding-ding-ding!

I paired this add-on board with the Atmel SAMD21 QT Py microcontroller, which posed some challenges due to its limited memory. Those constraints made this a fun puzzle. The ESP32 and RP2040 versions of the QT Py, on the other hand, would allow more advanced libraries to be used out of the box and could lead to a very different solution.

If you’d like to jump straight to the CircuitPython code you can find it here:
https://github.com/somenice/micro-one-armed-bandit
Otherwise I’ll walk through the build and code.

Parts required:
Adafruit 5×5 NeoPixel Grid BFF Add-On for QT Py and Xiao
Adafruit QT Py – SAMD21 Dev Board with STEMMA QT
Piezo Buzzer
Micro Switch w/Lever
Header pins and a bit of wire
Optional: Right angle USB-C cable

Pins used:
The micro switch’s ‘NO’ (Normally Open) pin is on A1
The piezo is on pin A2, which requires PWM
The 5×5 matrix uses pin A3 by default
All share ground
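
For reference, here is roughly how those pins could be declared in CircuitPython. This is a sketch with my own names and values, not necessarily the exact setup in the repo:

import board
import digitalio
import pwmio
import neopixel

# Micro switch 'NO' contact on A1, active-low with the internal pull-up
switch_pin = digitalio.DigitalInOut(board.A1)
switch_pin.direction = digitalio.Direction.INPUT
switch_pin.pull = digitalio.Pull.UP

# Piezo on A2 as a PWM output (duty cycle 0 = silent)
buzzer = pwmio.PWMOut(board.A2, frequency=440, duty_cycle=0, variable_frequency=True)

# The 5x5 NeoBFF is a chain of 25 NeoPixels on A3
pixels = neopixel.NeoPixel(board.A3, 25, brightness=0.1, auto_write=False)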

I first clipped two strips of seven header pins, then soldered the 5×5 NeoBFF on top of the headers while they were fitted in a breadboard. Next I soldered the QT Py board to the bottom of the headers, back-to-back, making sure to line up the boards in the correct orientation, which is smartly printed on the back of the matrix.

There is really no case or supports. The lever arm and piezo buzzer were soldered to the back with bits of wire, dead-bug style.

Back, top, bottom, and side views of wiring.
A piece of heat shrink tubing is wedged in for some protection against short circuits.

The right angle USB-C cable is perfect for letting the matrix sit upright.

Almost all of the code is derived from Adafruit CircuitPython tutorials.
One of the key functions of the micro switch as the slot machine arm was to have the action happen on the release, or rise, of the lever.
This was made easy with the Adafruit Debouncer library.

from adafruit_debouncer import Debouncer
if switch.rose:
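
Put together, the release-triggered spin looks roughly like the sketch below. The pin setup matches the wiring above, and spin() is a stand-in for the reel animation described next, not a function from the repo.

import board
import digitalio
from adafruit_debouncer import Debouncer

pin = digitalio.DigitalInOut(board.A1)
pin.direction = digitalio.Direction.INPUT
pin.pull = digitalio.Pull.UP          # 'NO' contact pulls the pin low while the lever is held

switch = Debouncer(pin)

while True:
    switch.update()                   # feed the debouncer every pass through the loop
    if switch.rose:                   # lever released: pin went low -> high
        spin()                        # stand-in for the reel animation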

The idea of the slot machine was to have five “reels” which get randomly offset as they “spin”. Most physical slot machines have a different sequence on each reel, but this example uses just one.

I declared hex values to identify the colours, then manually created the “Reel” with different colour frequencies. Note the number of RED instances vs. PURPLE.

RED = 0x100000
REEL = [RED, GREEN, YELLOW, BLUE, PURPLE, YELLOW, RED, GREEN, BLUE, RED, YELLOW, RED, RED]

On release of the lever, the first step is to animate the existing columns down until they slowly come to rest at a random offset on the reel.

Yup, this whole thing relies on random.randint(0, len(REEL)), so it’s not exactly a hardened gambling system.
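
In other words, each column gets its own random stop on the reel, and the visible colours are read with a wrap-around. Here is the gist as I understand it, with placeholder values for every colour except RED:

import random

RED, GREEN, YELLOW = 0x100000, 0x001000, 0x101000   # placeholder values in the same dim style as RED
BLUE, PURPLE = 0x000010, 0x100010
REEL = [RED, GREEN, YELLOW, BLUE, PURPLE, YELLOW, RED, GREEN, BLUE, RED, YELLOW, RED, RED]

offsets = [random.randint(0, len(REEL) - 1) for _ in range(5)]   # one random stop per column

for col in range(5):
    for row in range(5):
        colour = REEL[(offsets[col] + row) % len(REEL)]   # wrap around the reel
        # pixels[5 * col + row] = colour                  # the matrix mapping is covered below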

I played with piezo tones, simulated physical mechanisms, and settled on some familiar notes that 8-bit fans may remember from collecting coins in another world. The final slot gets a slightly higher note, adding to the audio and visual stimulation.
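
With pwmio, a tone is just a frequency plus a 50% duty cycle. Something like the sketch below produces a two-note coin bleep; the exact notes and timings in the project may differ.

import time
import board
import pwmio

buzzer = pwmio.PWMOut(board.A2, frequency=440, duty_cycle=0, variable_frequency=True)

def beep(freq, duration=0.1):
    buzzer.frequency = freq
    buzzer.duty_cycle = 2 ** 15   # 50% duty cycle = tone on
    time.sleep(duration)
    buzzer.duty_cycle = 0         # silence

beep(988)    # roughly B5...
beep(1319)   # ...then E6, the familiar coin jingle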

One of the most fun and satisfying puzzles was working with only the Adafruit CircuitPython NeoPixel library to animate the 5×5 RGB matrix using a nested loop of columns and rows.

for i in range(5): #COLUMNS
    for j in range(5): #ROWS

Various simple bits of index math, new-to-me discoveries made through trial and error, produce very satisfying results:

pixels[20 - (5 * i - j)] = colour  # TOP LEFT, ACROSS
pixels[20 - (5 * j - i)] = colour  # TOP LEFT, DOWN
pixels[5 * i + j] = colour         # BOTTOM LEFT, ACROSS
pixels[5 * j + i] = colour         # BOTTOM LEFT, UP
pixels[24 - (5 * i + j)] = colour  # TOP RIGHT, ACROSS
pixels[24 - (5 * j + i)] = colour  # TOP RIGHT, DOWN
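
For example, a simple column wipe using the “bottom left, across” mapping looks like this (a self-contained sketch, assuming the 25-pixel NeoBFF on A3):

import time
import board
import neopixel

pixels = neopixel.NeoPixel(board.A3, 25, brightness=0.1, auto_write=False)
BLUE = 0x000010

for i in range(5):                # COLUMNS
    for j in range(5):            # ROWS
        pixels[5 * i + j] = BLUE  # BOTTOM LEFT, ACROSS
    pixels.show()                 # reveal one full column per frame
    time.sleep(0.1)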

Once again, you can see the code and fork here: https://github.com/somenice/micro-one-armed-bandit

As mentioned earlier, the memory constraints were a fun challenge, as was using minimal parts on a basic microcontroller.
Being a hobbyist with almost endless access to sensors, radios, and actuators can often lead to scope creep. There’s nothing stopping me from endlessly adding functionality until I get bored and quit. This one was fun to get to at least V0.1.

Wooden Macropad

What is it?

It’s an open source electronics kit, the Adafruit Macropad, embedded in a solid block of quarter-sawn tigerwood.

What can it do?

It’s a programmable HID keyboard with an OLED display and rotary encoder, running CircuitPython, which is a hardware-specific, lightweight port of Python for microcontrollers.
The keys have RGB LEDs and can be programmed to send single or multiple keystrokes to the computer.

It shows up as a mountable drive and you can live-edit the code.py file, so when you save, the new code is automatically loaded. No compiling.
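
As a taste, a bare-bones code.py for the Macropad might look like the sketch below, using the adafruit_macropad helper library: light a key when it’s pressed and fire off a keystroke. The key colours and keycodes are arbitrary examples, not the layout from the hotkeys demo.

from adafruit_macropad import MacroPad
from adafruit_hid.keycode import Keycode

macropad = MacroPad()

while True:
    event = macropad.keys.events.get()
    if event:
        if event.pressed:
            macropad.pixels[event.key_number] = 0x202020     # light the pressed key
            macropad.keyboard.send(Keycode.GUI, Keycode.C)   # e.g. Cmd/Win + C
        else:
            macropad.pixels[event.key_number] = 0x000000     # off again on release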

Like other computer mice, keyboards, and peripherals, it’s powered by USB, so it does not run standalone.

The woodworking began with a solid block of South American tigerwood. Nice pattern and hardness, but I can’t say I enjoy the smell of this wood. It’s got a gluey stank which is not particularly enjoyable.

I used a bandsaw to take a 1/4 inch veneer off the top. This will be the cover.

The bottom base was hollowed out using a plunge router, a device you must respect. It can clear out a lot of material quickly, but at 10k to 30k RPM it can easily get away from you in a hurry. Safety always first.

The two pieces were then carved and fitted to accept the Macropad. It’s a bit of a shame to seal up the beautiful silkscreen art of this particular PCB.

The fitted Macropad had one other addition. I used a cut-down, credit-card-sized plastic magnifying glass, using lenticular (I think?) magnification. The offset from the OLED gives the display a slight floating feeling.

Macrophotography of the Macropad

The project was finished with a couple of coats of Minwax Tung Oil finish. Not a “true” tung oil, but it makes the grain pop while not filling the wood pores.

The seams are a little more visible than I would like. From a distance, however, it’s not that noticeable. I’ve learned that minimal handling is required after separating the pieces with the “bandsaw box” technique, to prevent dings.

As for programming, I’m mainly using the application hotkeys demo found in the Adafruit Learning Guide, but the sky is the limit because it can be programmed to do anything a keyboard or mouse can do.

Motivation to write this up comes from the recent Hackaday.io contest for Odd Inputs and Peculiar Peripherals.

“Big Flood” lightbox

Blue LED animation in steel laser-cut Coast Salish artwork

Using the Adafruit AdaBox #017, I added LEDs and an e-ink display to a laser-cut steel lightbox by Coast Salish artist Xwalacktun as a gift this year.

A miniature version of He-yay meymuy (Big Flood), it’s 30cm tall with an 11cm diameter. The original piece, made of aluminum, is an impressive 487.8cm tall by 167.6cm wide. Located at the entrance to the Audain Art Museum, it’s a powerful piece inside a beautiful building.

I used a metre of RGB NeoPixels wrapped around a cardboard tube, diffused with the bubble wrap, and plugged into the Adafruit MagTag to animate rain. In this mode the lightbox eventually fills to a full blue colour.

There are three other modes with different colours and animations using the adafruit_led_animation library. Each mode animates differently and updates the e-ink display with a section of art from the lightbox.
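
A rough sketch of that mode setup with adafruit_led_animation is below. The pin, pixel count, and the specific animations are my guesses, not the exact code running in the gift.

import board
import neopixel
from adafruit_led_animation.animation.comet import Comet
from adafruit_led_animation.animation.sparkle import Sparkle
from adafruit_led_animation.animation.solid import Solid
from adafruit_led_animation.sequence import AnimationSequence
from adafruit_led_animation.color import BLUE, WHITE, JADE

# Strip wired to one of the MagTag's STEMMA ports (pin is an assumption)
pixels = neopixel.NeoPixel(board.D10, 30, brightness=0.3, auto_write=False)

modes = AnimationSequence(
    Comet(pixels, speed=0.05, color=BLUE, tail_length=8),   # "rain" running along the strip
    Sparkle(pixels, speed=0.05, color=WHITE),
    Solid(pixels, color=JADE),
    auto_clear=True,
)

while True:
    modes.animate()   # a button press could call modes.next() and refresh the e-ink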

Adafruit MagTag e-ink display
Top View: With e-ink the last image will remain, even after removing power.

I even upcycled the spool from the NeoPixel strip to mount the MagTag, as it fit snugly. There are plenty of other features I’ve yet to take advantage of, like the built-in Wi-Fi, light sensor, etc.

Art by Coast Salish artist Xwalacktun
296 x 128 px Indexed Colour .BMP used to display on e-ink
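
Loading that BMP onto the MagTag’s display is only a few lines of displayio. A minimal sketch, with a placeholder file name:

import time
import board
import displayio

display = board.DISPLAY

bitmap = displayio.OnDiskBitmap("/art.bmp")   # 296x128 indexed-colour BMP on the CIRCUITPY drive
group = displayio.Group()
group.append(displayio.TileGrid(bitmap, pixel_shader=bitmap.pixel_shader))

display.root_group = group                    # display.show(group) on older CircuitPython
time.sleep(display.time_to_refresh)           # e-ink enforces a minimum time between refreshes
display.refresh()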

The base is made of glued pieces of western red cedar, to mimic the architecture of the museum, and carved to receive the artwork.

Grow Conference 2014

I just attended the Grow Conference here in Whistler and wanted to share a couple of thoughts.

Billed as “an experiential playground exploring the future of innovation, growth and entrepreneurship”, the conference’s tagline is LIVING IN A CONNECTED WORLD. These themes are near and dear to me, so I was thrilled when they announced they were coming to Whistler this year. Thanks to the Lean pass on offer, I was able to secure a fairly priced ticket without the need to pay for transportation, etc.

The conference, which took place at the Fairmont Chateau Whistler, was well organized, and aside from a couple of audio and scheduling issues everything ran smoothly. Alongside the conference was a two-day hack-a-thon. The challenge: to create a “connected” resort town! I wish it had preceded the conference so that I could have participated in both.

The mix of technical and non-technical attendees kept the discussions mostly high-level. The people I tended to connect with were lower-level developer types. With my poor entrepreneurship skills, I was unable to secure millions of dollars in venture capital. I wasn’t actually trying to sell anything. Except maybe Whistler. I would offer local advice, help every lost attendee, and generally just say “how awesome is this place, eh?”. I probably wouldn’t have turned down $100k for my idea to start a hackerspace here in Whistler. In fact, I am trying to start a hackerspace right here in Whistler. Sadly, this town isn’t filled with geeks. There are only a few local tech companies, like Guestfolio and Ridebooker, and they represent a small percentage of the employers. No reason that can’t increase.

A few highlights and common themes:

Wearables and Internet of Things – These are definitely the hot topics getting all the attention. The Internet of Things (IoT), a term everyone agrees is a poor one, seems to be what you call any connected device that’s not a computer. Wearables, obviously, are worn. Think Google Glass, Recon Instruments, Fitbit, Nike+, etc.

Privacy and Security – These topics always come up immediately afterwards. As soon as you think “Cool, I can open my door from the internet”, you realize that theoretically anyone else can too. That sleep monitor, check-in data, and all the things tracking you for convenience can be mined or interpreted and used against you. My opinion is that those who take this seriously will win out.
I was very impressed with SmartThings founder Jeff Hagins’ take on the subject, and glad they are staying separate from Samsung after recently being acquired. I can’t say the same for the lax attitude of Life360, whose founder repeatedly made broad statements like “I don’t think your average user cares”. Their product GPS-tracks family members, btw.

Data – Data, data, everywhere data. It’s not enough to just collect it; you need to use it. Inform.
Scale, infrastructure, APIs, and the other things that were once hard have since been figured out. Some companies’ whole business is collecting customer data. Once you have the data… (see Privacy and Security).

The best talk, and the one that resonated with me most, was Scott Jenson’s How to Make Everything Discoverable with the Physical Web.
In it, he smartly discussed how the physical web will need a mechanism unlike the current app model. The idea that every future smart thing would require its own smart app obviously has flaws.
Just Google “app fatigue” and you’ll see that for the past year or so the world isn’t very appy anymore. Yay web! And that’s the idea: broadcast URIs, no passive tracking. I’m looking forward to experimenting with this. You can too. Find out more, with examples, on GitHub.

Overall it was a great conference. I hope to be back again next year.

PABLO

Pablo is a physical chatbot. An open source social robot.

Pablo got his name from “PyAIML Arduino Bluetooth Low-Energy Object”, or something like that.

1. A Python program running on a host computer accepts input from a web form.
2. Input is interpreted using Artificial Intelligence Markup Language (AIML)
3. Response is sent via Bluetooth and spoken by Pablo.
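
A rough host-side sketch of steps 1 to 3: take a line of text, run it through PyAIML, and push the reply out over the Bluetooth serial port. The AIML file, port name, and baud rate are placeholders; in the real project the input arrives from a web form.

import aiml
import serial

kernel = aiml.Kernel()
kernel.learn("pablo.aiml")                    # placeholder AIML brain file

bt = serial.Serial("/dev/tty.EZ-Link", 9600)  # placeholder Bluetooth serial port

def ask_pablo(text):
    response = kernel.respond(text)           # AIML pattern matching
    bt.write((response + "\n").encode())      # Pablo's Arduino reads this and speaks it
    return response

print(ask_pablo("What's your name?"))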

Pablo is open source software and hardware. The code can be found on GitHub.

He can be found on Twitter here https://twitter.com/pablo_robot

Using the basic PyAIML example plus a simple web.py form, we are able to talk to PABLO. I still need to get the text-to-speech JavaScript API working. I was using the speech-to-text input feature in Chrome (x-webkit-speech), but it has since been deprecated.
AIML responses are constructed from a set of reduced answers to planned questions, e.g. “What’s your name?”, “Who are you?”, “What are you called?” = PABLO.

NOTE: The USB cable in the eye is a temporary 5V power source only.
It’s not quite natural language processing, but with random responses and recollection it can make for a convincing conversation.

On the physical side of things, PABLO is made up of:
Pablo in pieces

(Clockwise from top right)
Arduino Duemilanove microcontroller (Or any compatible)
Adafruit BlueFruit EZ-Link Bluetooth Shield
Emic2 text-to-speech Module
1000 mAh Lipo battery
2 Adafruit NeoPixel rings (I’m using one 16 and one 12 pixel)
Adafruit 3v Trinket
Adafruit 4-channel I2C-safe Bi-directional Logic Level Converter
2 Hobby servo motors with Pan-tilt brackets
Cardboard head with wire-spool LED diffusing eyes
8 Ohm speaker

An Arduino with the Adafruit EZ-Link Bluetooth Shield receives the response from the host computer. The response is interpreted, then commands are issued to the eyes, servos, and speech module. I used the proto area of the shield to connect headers into which I can temporarily plug the text-to-speech module, the two servo motors, and the level converter connection to the eyes.

The eyes are controlled by a small microcontroller from Adafruit called the Trinket and are powered by the LiPo battery. They are self-supported and can easily be repurposed for other projects. I used a 16-pixel ring and a 12-pixel ring, which made some of the eye functions a little specific to this build. The logic level converter is used to receive commands over I2C from the 5V Arduino microcontroller using the TinyWire library.

Pablo

Everything is currently crammed into a cardboard box with a speaker pointed down into the mouth. A talking function randomly moves the jaw servo while Pablo is talking, opening and closing the mouth. This, combined with the advanced settings of the Emic2 voice module, makes for endless hilarity. A second servo twists the head briefly, as one might picture a confused dog doing, when an answer is not known. The datasheet from Parallax (PDF) shows you how to change the basic settings and take advantage of the more powerful DECtalk processor.

Lots of things to build on and still tons to do, not least of which is his “personality”.

I plan to document more of the details and code as I go, because today I’m hoping Pablo can help me win a trip to space! If not, he’s about the size of a CubeSat and I’ll send him into space myself.

UPDATE: Pablo was honoured to make an appearance on Adafruit’s July 23rd Show and Tell!