RFID Gesture Generating Robot

I built a robot that was exhibited at the Takeaway Festival, which ran from May 19th to 21st 2009 at the Dana Centre in London. It generated gestures in space in response to RFID cards (e.g. Oyster cards) presented by visitors.

Media Files

The RFID robot installed in the Dana Centre.

RFID gesture generating robot video no. 1

RFID gesture generating robot video no. 2

RFID gesture generating robot video no. 3

RFID gesture generating robot video no. 4

RFID gesture generating robot video no. 5

Exhibition brief

The Takeaway Festival of Do It Yourself Media 2009 put out a call at the end of 2008. The brief I responded to was:

“Artists are asked to submit proposals of work that utilize RFID (Radio Frequency Identification). This technology is normally used for retail stock control, supply and security but also has great potential as a way of identifying changes in the environment. Electronic RFID tags are attached to objects to allow them to transmit certain signals within a short distance or enclosed space.

Artists may choose to use RFID to share information, collect information, or to create reactions to the transmitted signals. Ideas involving the use of shop windows that are easily accessible to the public are welcome. RFID tags are part of the manufacture of many products and everyone has at least one on their person at any time. The selection panel will be looking for work that exposes this hidden nature of RFID. (You will find some useful links on the website that will enable artists who have not used RFID to learn about the technology. These also include existing examples of projects and associated information).”

My Proposal

My response was a proposal for an installation comprising an RFID reader mounted in a console and a robot that swept out gestures in space determined by the tag number read from users’ cards. I was awarded the commission in January 2009 and spent the next few months building and programming the system.

I am interested in the human reaction to physical motion, particularly the characteristics of motion that make something appear elegant as it moves. In addition, I am fascinated by how one can encourage empathy with a machine through the way it moves. I am convinced that a moving, tangible object has a different impact on people than a 2D image displayed on a screen does.

Rendering of proposed robot

I proposed a robot made of two stepper motors mounted so that one rotated in a horizontal plane and one in a vertical plane. Together they moved a wand-like appendage in space, much like an orchestra conductor’s baton. An RFID reader read visitors’ cards and generated a pattern of motion from the tag number, producing a gesture that was unique to (and repeatable for) that card. It was intended to be an elegant, sweeping kind of motion, with the path and speed changing according to the tag number.

Mifare cards store a tag number as 4 bytes (allowing 4,294,967,296 different tag numbers). The algorithm split the bytes into nibbles (4 bits each), giving an array of eight values, each ranging from 0 to 15. Sixteen (arbitrary) points in space, arranged around the periphery of the robot’s workspace, were assigned to these values. When a card was read, the robot started moving towards the point representing the first nibble in its tag number. Before it could get there, the point associated with the next nibble started attracting the robot. Again, before it got there, the following nibble attracted it. This continued until all eight points had been processed. The motion was smoothed so that the robot moved in sweeping curves. Thus, each card generated a pattern unique to that card, but one that was the same every time the card was read. The gesture “belonged” to that card’s user.
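A minimal sketch of this algorithm in C++: the evenly spaced circular point layout, the chase gain, and the number of steps per leg are illustrative assumptions, not the original parameters.

```cpp
#include <cstdint>
#include <cstdio>
#include <cmath>

const int NUM_POINTS    = 16;  // attractor points around the workspace periphery
const int NUM_NIBBLES   = 8;   // a 4-byte tag yields eight 4-bit nibbles
const int STEPS_PER_LEG = 15;  // assumed: switch targets well before arrival

struct Point { float pan, tilt; };

// Sixteen arbitrary points on the edge of the workspace; here they are
// simply spaced evenly on a unit circle (an assumption, not the real layout).
Point pointForNibble(uint8_t n) {
    float a = 6.2831853f * n / NUM_POINTS;
    return { std::cos(a), std::sin(a) };
}

// Split a 32-bit Mifare tag number into eight nibbles (values 0..15).
void tagToNibbles(uint32_t tag, uint8_t out[NUM_NIBBLES]) {
    for (int i = 0; i < NUM_NIBBLES; ++i)
        out[i] = (tag >> (4 * (NUM_NIBBLES - 1 - i))) & 0x0F;
}

// Chase each nibble's point in turn, but hand over to the next point
// before arriving, so the path becomes a smooth sweeping curve.
void traceGesture(uint32_t tag) {
    uint8_t nib[NUM_NIBBLES];
    tagToNibbles(tag, nib);
    Point pos = {0.0f, 0.0f};   // start at the workspace centre
    const float gain = 0.1f;    // fraction of remaining distance covered per step
    for (int i = 0; i < NUM_NIBBLES; ++i) {
        Point target = pointForNibble(nib[i]);
        for (int s = 0; s < STEPS_PER_LEG; ++s) {
            pos.pan  += gain * (target.pan  - pos.pan);
            pos.tilt += gain * (target.tilt - pos.tilt);
            std::printf("%.3f %.3f\n", pos.pan, pos.tilt); // stand-in for motor commands
        }
        // The leg ends before pos reaches target; the next nibble takes over.
    }
}

int main() {
    traceGesture(0x1A2B3C4Du);  // an example tag number
}
```

Because each leg is cut short, the wand never quite reaches any point, yet the same tag always traces the same curve.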

Gesture Simulation results

I wrote a simulation program that generated gestures from RFID tag numbers. When it ran, it was clear that the robot was “trying” to get to a destination but was always prevented from doing so. This triggered an empathic response in observers, who “felt sorry” for it.

The build process

I built the robot in the period January 2009 to May 19 2009.

The base of the robot was an assembly of machined aluminium that I had salvaged from a skip some years before. I loved its cooling fins, which gave it a kind of art deco look. It was very heavy and provided a stable base for the robot. It was screwed to a large wooden baseboard to prevent the robot from tipping over.

The main body of the robot was an acrylic tube, 200mm in diameter and 1000mm long, that I bought from Retail Engineering Design Ltd.

I used stepper motor drivers and power supplies I bought from Motion Control Products Ltd. I mounted them on horizontal aluminium plates that were secured on four 1000mm threaded rods, running vertically up the robot body.

The robot was run by one Arduino, together with some extra electronics (e.g. a relay to switch the mains light on and off). It was mounted in the body of the robot on another aluminium plate.

Two fans were mounted in the body to circulate air and keep the electronics cool.

I had an acrylic sheet cut to fit the top of the acrylic tube, secured to the four long threaded rods. A large stepper motor was secured to the underside of the plate, with its shaft inserted through a hole in the plate. Both the stepper motors I used were surplus ones I have had for some years. They were large, powerful and very heavy.

A turntable ball bearing was mounted on the upper surface of the top plate. In hindsight this was a bad choice: after the robot had run for a long period during the exhibition, the bearing failed. The weight of the arm put too much torque on it, twisting it upwards.

A platform was laser etched from acrylic and mounted on the bearing. The shaft of the stepper motor was connected directly to it. Another stepper motor was mounted on this platform in a horizontal position.

The design of the arm went through a series of iterations, the final one being an assembly of aluminium strips (to keep it as lightweight as possible), slightly bowed to keep it stiff. A mains-powered blue LED light was mounted on the distal end of the arm, and its weight was balanced with counterweights. The arm was about 1000mm long and sat at head height.

A console was built to house the RFID reader from tinker.it, mounted on an Arduino. An LED indicated when the robot was ready to run, and a button was provided to generate random RFID tag numbers in case a visitor didn’t have a card. The Arduino read the tag number from a card and split it into bytes. The byte values were sent as a serial stream over a wire to the Arduino inside the robot. That Arduino generated suitable stepper motor control pulses to move the motors towards the targets specified by the tag number, as described above, with a smoothing algorithm making the motion as fluid as possible. The blue LED light was lit while the robot was tracing out a gesture.
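A minimal sketch of the console side as an Arduino program, assuming a simple framed serial link: the pin numbers, baud rate, start byte and gesture duration are illustrative assumptions, and the tinker.it reader call is left as a placeholder.

```cpp
const int READY_LED_PIN  = 13;  // assumed: "ready" indicator LED
const int RANDOM_BTN_PIN = 2;   // assumed: button for visitors without a card

void setup() {
  pinMode(READY_LED_PIN, OUTPUT);
  pinMode(RANDOM_BTN_PIN, INPUT_PULLUP);
  Serial.begin(9600);                 // serial wire to the robot's Arduino
  digitalWrite(READY_LED_PIN, HIGH);  // ready for the first card
}

// Frame one 4-byte tag with a start marker so the robot can resynchronise.
void sendTag(unsigned long tag) {
  digitalWrite(READY_LED_PIN, LOW);   // busy while the gesture is traced
  Serial.write((byte)0xAA);           // assumed start-of-frame byte
  for (int i = 3; i >= 0; i--) {
    Serial.write((byte)((tag >> (8 * i)) & 0xFF));
  }
  delay(8000);                        // assumed gesture duration
  digitalWrite(READY_LED_PIN, HIGH);
}

void loop() {
  // Placeholder for the tinker.it RFID reader library call:
  // unsigned long tag;
  // if (readCard(&tag)) sendTag(tag);
  if (digitalRead(RANDOM_BTN_PIN) == LOW) {
    sendTag(random(2147483647L));     // random tag for card-less visitors
  }
}
```

The Arduino inside the robot would parse the same frame and drive the motors using the attractor scheme sketched earlier.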

Installation

I had a lot of problems with the installation. Despite my efforts to make the arm as light as possible, moving it still required high torque from the stepper motors. The motors got very hot (normal for stepper motors), which partially melted the couplings I had machined out of acrylic. I would have used metal couplings, but I didn’t have access to suitable workshop facilities. I initially programmed the robot to run only when someone presented a card, but I was asked to alter that so that it moved more, to attract visitors’ interest. Because it ran continuously all day, there was far more strain on the structure than was originally planned for. At some point during the exhibition the main bearing failed.

I am planning to modify some of the mechanics of the robot so that it can be exhibited again at some point.
