Pablo is a physical chatbot: an open-source social robot.

Pablo got his name from PyAIML Arduino Bluetooth Low-Energy Object (or something like that).

1. A Python program running on a host computer accepts input from a web form.
2. Input is interpreted using Artificial Intelligence Markup Language (AIML).
3. Response is sent via Bluetooth and spoken by Pablo.
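The host-side loop can be sketched like this (a minimal Python illustration; the pattern table is a stand-in for a real AIML brain, and the function names are my own placeholders, not the actual project code):

```python
# Minimal sketch of the host pipeline: web input -> pattern match -> reply out.
# The dict below stands in for a real AIML knowledge base.
PATTERNS = {
    "WHAT IS YOUR NAME": "My name is PABLO.",
    "HELLO": "Hello, human.",
}

def respond(user_input):
    """Normalize the input and look up a canned response."""
    key = user_input.strip().rstrip("?!.").upper()
    return PATTERNS.get(key, "I do not know that yet.")

def handle_form_submission(text):
    reply = respond(text)
    # In the real build this reply goes out over the Bluetooth serial link
    # to the Arduino, which hands it to the Emic2 to be spoken.
    return reply
```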

Pablo is open source software and hardware. The code can be found on GitHub.

He can be found on Twitter here.

Using the basic PyAIML example plus a simple web.py form, we are able to talk to PABLO. I still need to get the text-to-speech JavaScript API working. I was using the speech-to-text input feature in Chrome (x-webkit-speech), but it has since been deprecated.
AIML responses are constructed from a set of reduced answers to planned questions, e.g. “What’s your name?”, “Who are you?”, and “What are you called?” all resolve to PABLO.
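The reduction idea can be sketched in plain Python: several surface forms first collapse to one canonical pattern (the way AIML does with its srai tag), which maps to a single stored answer. The patterns here are illustrative, not the project's actual AIML set:

```python
# AIML-style symbolic reduction: many phrasings collapse to one
# canonical pattern, which carries the single stored answer.
REDUCTIONS = {
    "WHO ARE YOU": "WHAT IS YOUR NAME",
    "WHAT ARE YOU CALLED": "WHAT IS YOUR NAME",
}
ANSWERS = {
    "WHAT IS YOUR NAME": "PABLO",
}

def reduce_and_answer(pattern):
    """Map a phrasing to its canonical pattern, then look up the answer."""
    canonical = REDUCTIONS.get(pattern, pattern)
    return ANSWERS.get(canonical)
```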

NOTE: The USB cable in the eye is a temporary 5v power source only.
Not quite natural language processing, but with random responses and recollection it can make for a convincing conversation.

On the physical side of things, PABLO is made up of:
Pablo in pieces

(Clockwise from top right)
Arduino Duemilanove microcontroller (or any compatible board)
Adafruit BlueFruit EZ-Link Bluetooth Shield
Emic2 text-to-speech Module
1000 mAh Lipo battery
2 Adafruit NeoPixel rings (I’m using one 16 and one 12 pixel)
Adafruit 3v Trinket
Adafruit 4-channel I2C-safe Bi-directional Logic Level Converter
2 Hobby servo motors with Pan-tilt brackets
Cardboard head with wire-spool LED diffusing eyes
8 Ohm speaker

An Arduino with the Adafruit EZ-Link Bluetooth Shield receives the response from the host computer. The response is interpreted, then commands are issued to the eyes, servos, and speech module. I used the proto area of the shield to add headers so I can temporarily plug in the text-to-speech module, the two servo motors, and the level-converter connection to the eyes.
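The dispatch logic looks roughly like this (sketched in Python for readability; the real firmware is Arduino C++, and the command names and unknown-answer string are hypothetical placeholders):

```python
# Sketch of the Arduino-side dispatch: each incoming response triggers
# speech plus eye/servo commands.
def dispatch(response):
    """Turn one text response into an ordered list of device commands."""
    commands = []
    if response == "I do not know that yet.":
        commands.append("SERVO_HEAD_TILT")   # puzzled-dog head twist
    commands.append("EYES_TALKING")          # animate the NeoPixel eyes
    commands.append("SPEAK:" + response)     # hand the text to the Emic2
    return commands
```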

The eyes are controlled by a small Adafruit microcontroller called the Trinket, powered by the LiPo battery. They are self-contained and can easily be repurposed for other projects. I used a 16-pixel ring and a 12-pixel ring, which made some of the eye functions a little specific to this build. The logic level converter lets the Trinket receive commands over I2C from the 5v Arduino microcontroller using the TinyWire library.


Everything is currently crammed into a cardboard box with a speaker pointed down into the mouth. A talking function randomly moves the jaw servo while Pablo is talking, opening and closing the mouth. This, combined with the advanced settings of the Emic2 voice module, makes for endless hilarity. A second servo twists the head briefly, as one might picture a confused dog, when an answer is not known. The datasheet from Parallax (PDF) shows you how to change the basic settings and take advantage of the more powerful DECTalk processor.

Lots of things to build on and still tons to do, not least of which is his “personality”.

I plan to document more of the details and code as I go because today I’m hoping Pablo can help me win a trip to space! If not, he’s about the size of a CubeSat and I’ll send him into space.

UPDATE: Pablo was honoured to make an appearance on Adafruit’s July 23rd Show and Tell!

GPS Glove with RGB LEDs


Thrilled to have been mentioned on Adafruit’s Ask an Engineer live webcast last week, I decided to write up more about my glove project. Here is a quick six-second video.

Using Adafruit’s open-source Arduino compatible board, the Flora, a GPS module, and four RGB LED “pixels”, I adapted a North Face Hyvent glove to passively respond to my location on earth and to relay data.


Mostly for fun, but I can see how this could be useful to someone in certain situations. I’m discovering some limitations along the way as well.

The project is basically a fork of Adafruit’s own excellent Flora GPS Jacket tutorial. You can find their code on GitHub.

I’ve added coordinates for all the lifts on Whistler Blackcomb in an array that is checked against my current location. If I’m within a specified range (10 m), the LEDs blink a calming red pulse (or a warning).

GPS Coordinates of Whistler Blackcomb Lifts

#define GEO_LAT1 50.093744
#define GEO_LON1 -122.988872

#define GEO_LAT2 50.085781
#define GEO_LON2 -122.963975

#define GEO_LAT3 50.066972
#define GEO_LON3 -122.951978

#define GEO_LAT4 50.067606
#define GEO_LON4 -122.931094

#define GEO_LAT5 50.058631
#define GEO_LON5 -122.918017

#define GEO_LAT6 50.084258
#define GEO_LON6 -122.941625

#define GEO_LAT7 50.112939
#define GEO_LON7 -122.953297

//W Gondola
#define GEO_LAT8 50.112906
#define GEO_LON8 -122.954214

//B Gondola
#define GEO_LAT9 50.113458
#define GEO_LON9 -122.953381

#define GEO_LAT10 50.115503
#define GEO_LON10 -122.947719

#define GEO_LAT11 50.106319
#define GEO_LON11 -122.920436

#define GEO_LAT12 50.099919
#define GEO_LON12 -122.915525

#define GEO_LAT13 50.078456
#define GEO_LON13 -122.895761

#define GEO_LAT14 50.106478
#define GEO_LON14 -122.900831

#define GEO_LAT15 50.106447
#define GEO_LON15 -122.899889

#define GEO_LAT16 50.109542
#define GEO_LON16 -122.90565

#define GEO_LAT17 50.111767
#define GEO_LON17 -122.922981
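The proximity test behind these coordinates can be sketched like this (a Python rendering of the logic; the real check runs on the Flora in Arduino C, and only the first two lift coordinates are shown):

```python
import math

EARTH_RADIUS_M = 6371000  # mean Earth radius in metres

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points via the haversine formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# First two lift coordinates from the list above.
LIFTS = [(50.093744, -122.988872), (50.085781, -122.963975)]

def near_a_lift(lat, lon, radius_m=10.0):
    """True when the current fix is within radius_m of any stored lift."""
    return any(distance_m(lat, lon, la, lo) <= radius_m for la, lo in LIFTS)
```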

Altitude, time, location, and speed are each displayed by a unique animation indicating the data type; the numbers are then blinked in base-10 sequence across the fingers. Digits on digits.
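Splitting a reading into its per-finger blink counts is simple (a sketch of the idea, not the glove's actual code):

```python
def blink_sequence(value):
    """Split a reading (e.g. altitude in metres) into its base-10 digits,
    each of which is blinked out on a finger in turn."""
    return [int(d) for d in str(int(value))]
```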

I’ve got it set up to also indicate when I’m above a mile high (1609 m) and to perform other functions at precise times or locations. Get moving over twenty knots (37 km/h) and the lights flash alternating blue and red for a police-chase effect. I purposely let the glove do its thing and stay passive rather than introduce controls to make it interactive.
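The threshold logic amounts to a couple of comparisons on each GPS fix (a sketch; the effect names are my own labels for the behaviours described above):

```python
MILE_M = 1609   # one mile in metres
CHASE_KN = 20   # speed threshold in knots (~37 km/h)

def pick_effect(altitude_m, speed_kn):
    """Choose a passive light effect from the current GPS fix."""
    if speed_kn > CHASE_KN:
        return "police_chase"   # alternating blue and red
    if altitude_m > MILE_M:
        return "mile_high"
    return "idle"
```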


Building the glove was made easy by its design, or so I thought. The mesh inside pocket on top of the glove easily houses everything, and the mesh makes weaving conductive thread easier. Working with conductive thread was *by far* the hardest part: thread with a mind of its own. Nail polish as a knot sealer is the key here. Snaps sewn inside allow the Flora and GPS boards to be removed and used in other projects.

It’s holding up after a few days’ wear and tear skiing and snowshoeing, except around the LEDs where the fabric is lifting. I’m still experimenting and fixing little things. Next steps are to add logging. Once it’s stable I’ll share it on GitHub.

555 PIR Motion Activated Camera

I have just started playing around with robotics and electronics after getting hooked by the Arduino microcontroller, mostly tinkering with different sensors and servos and discovering how things work. The 555 timer showed up in a lot of my reading as a simple but very versatile integrated circuit. There are a ton of examples on the internet of what the little chip can do, and the 555 Contest sparked my interest to learn more. Here is my entry into the contest.

A circuit with a 555 in monostable mode triggered by a PIR

Nothing really original here; I’m simply using the 555 timer in monostable mode. A PIR sensor detects motion and, via an NPN transistor, triggers the 555 by pulling pin 2 low. Pin 3 then sends just over 3 V to the Canon WL-DC100 camera remote, which has been rigged with its button permanently pressed.
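In monostable mode the length of the output pulse on pin 3 is set by the timing resistor and capacitor: T = 1.1·R·C. A quick sketch of the arithmetic (the component values here are examples, not necessarily the ones in my circuit):

```python
def monostable_pulse_s(r_ohms, c_farads):
    """555 monostable output pulse width in seconds: T = 1.1 * R * C."""
    return 1.1 * r_ohms * c_farads

# Example: 100 kOhm with 10 uF holds pin 3 high for about 1.1 s,
# plenty for the camera remote to register the press.
```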

Depending on your camera, you could trigger it directly with a cable but I like the idea of separating the sensor from the camera.

Motion sensor camera

Next step is to get this hooked up with batteries in a weather-proof case and try to catch some wildlife.
Lots of fun learning all about the 555.

Here is the schematic for the circuit:

