SCARA Robot Calibration

Background

I work as a technician in the mechanical engineering department of a local university where I am responsible for the robotics and control laboratory. I am currently developing a new piece of lab equipment: namely, a bespoke letterpress greeting card line.

In addition to being used for various lab classes, the card line will also give visitors the opportunity to take home a memento that they might actually value - a personalised birthday card for a relative or the like. The fact that imprinted on the back of each card will be a cute little logo saying something like “Made by Robots!” is neither here nor there :slight_smile:

The Problem

The robot I will be using to perform the composition and imposition of type is an Epson SCARA robot. Due to an error on my part, calibration information for this robot was lost. As a result, the robot has sat idle for several years - something I now wish to change.

A Solution

Whilst there are several ways to calibrate such a robot and thus bring it back to factory levels of accuracy and repeatability, I am going to attempt to do so using the minimum amount of equipment, i.e. no laser trackers, ballbars, or other expensive toys.

Having thought about it a bit, I am convinced that I can calibrate the robot using nothing more than a random printed grid and the robot’s own vision system. What do I mean by a random grid? Picture a grid whose squares measure 10mm per side, and whose overall dimensions exceed the robot’s work area. Now randomly colour the squares black or white, and you get the idea.
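Generating such a grid takes only a few lines of NumPy. A minimal sketch, with the work-area dimensions assumed (a 600 mm square covered by 10 mm squares); the fixed seed means the exact same pattern can be regenerated later on the image-processing side:

```python
import numpy as np

# Assumed parameters: a 600 mm x 600 mm working area covered by 10 mm squares.
SQUARES = 60  # squares per side (600 mm / 10 mm)

# A fixed seed makes the grid reproducible, so the same script can
# regenerate the printed pattern when processing camera images.
rng = np.random.default_rng(seed=42)
grid = rng.integers(0, 2, size=(SQUARES, SQUARES), dtype=np.uint8)

# Write a plain-text PBM, one pixel per square (1 = black). Scale it to
# physical size at print time with nearest-neighbour (no smoothing), so
# each pixel becomes one crisp 10 mm square on paper.
with open("random_grid.pbm", "w") as f:
    f.write(f"P1\n{SQUARES} {SQUARES}\n")
    for row in grid:
        f.write(" ".join(map(str, row)) + "\n")
```

One pixel per square keeps the file tiny; the print driver (or a quick raster-to-vector pass) handles the upscaling.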

The Procedure

The calibration procedure will go something like this:

  1. Drink way too much coffee
  2. Write a script to generate a random grid
  3. Print the grid onto a rigid sheet
  4. Secure the grid to the robot table
  5. Perform a crude robot joint calibration
  6. Drive the robot TCP to random (x,y) locations
  7. Capture an image of the grid at each location
  8. Process the image to determine the true TCP (x,y) location
  9. Use sparse bundle adjustment to back out the D-H parameters
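Step 9 is the heavy one. A full sparse bundle adjustment over the complete D-H parameter set is more involved, but the core idea — adjusting kinematic parameters until the model's predicted TCP positions agree with the measured ones — can be sketched on a simplified two-link planar model with `scipy.optimize.least_squares`. All the link lengths and offsets below are hypothetical numbers, and the "measurements" are synthetic stand-ins for what steps 6–8 would actually produce:

```python
import numpy as np
from scipy.optimize import least_squares

def fk(params, thetas):
    """Planar 2-link forward kinematics: returns (N, 2) TCP positions.

    params = (l1, l2, o1, o2): link lengths and joint zero offsets.
    thetas is an (N, 2) array of commanded joint angles in radians.
    """
    l1, l2, o1, o2 = params
    t1 = thetas[:, 0] + o1
    t2 = thetas[:, 1] + o2
    x = l1 * np.cos(t1) + l2 * np.cos(t1 + t2)
    y = l1 * np.sin(t1) + l2 * np.sin(t1 + t2)
    return np.stack([x, y], axis=1)

# Synthetic "ground truth" standing in for the grid measurements:
true_params = np.array([0.325, 0.275, 0.002, -0.001])  # hypothetical values
rng = np.random.default_rng(0)
thetas = rng.uniform(-1.5, 1.5, size=(100, 2))   # commanded joint angles
measured = fk(true_params, thetas)               # what the camera would report

# Start from the nominal (uncalibrated) model and refine.
nominal = np.array([0.3, 0.25, 0.0, 0.0])

def residuals(p):
    return (fk(p, thetas) - measured).ravel()

fit = least_squares(residuals, nominal)
print(fit.x)  # should recover something very close to true_params
```

The real problem adds link twists, tool offsets, and measurement noise, but the structure — a residual function over all poses handed to a least-squares solver — stays the same.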

Devember 2020

My goal for Devember is thus to learn enough OpenCV that I can perform the image processing necessary to determine whereabouts on the grid the robot TCP is positioned. I could do the image processing using the MATLAB image processing toolbox, but would like to learn OpenCV because why not.


Hey, another robotics person! I also work on robots. Calibration is a big topic, but I want to steer you away from generating your own targets and towards using something pre-existing. You're going to have enough on your plate applying the calibration results to the robot.

For the ‘new’ hotness, check out ChArUco targets: https://docs.opencv.org/master/da/d13/tutorial_aruco_calibration.html

Another preexisting solution is apriltags: https://pypi.org/project/apriltag/

Is there a reason the robot’s built-in joint calibration isn’t providing you with the spatial repeatability you need? From a quick search, the SCARA has a pretty robust joint calibration routine (here).
