The JARVIS of testing POS: meet GRANT

Being in the payment business, we often feel the heavy burden of testing on different terminals. Two years ago, the Payworks Team started a project to automate some of the testing process using a LEGO Mindstorms EV3; you can read more about that here. The limitations of that solution were quickly reached, and we started looking for something more reliable.

Great Robotic Assistant Necessary for Testing (GRANT)

A challenger arose: I built the first iteration of a robotic solution to automate testing of the products being developed at the company. Compared to its LEGO predecessor, GRANT is capable of using both NFC and the magnetic stripe, and the terminal being tested is easy to replace. GRANT is triggered by commits and kept busy by continuous development, which helps our developers focus on improving our product.

 

Breakdown

This project is the love child of wood and plastic: I used wood to hold the components in place and custom 3D-printed parts to convert the rotational movement of the servos. Just switching from the softer LEGO plastic to PLA made a huge improvement.

Let’s see the recipe:

  • 1 Raspberry Pi 3 Model B: this runs the Python code that controls the robot. It exposes an API, so you can trigger movement with simple HTTP GET requests, e.g. /arm/1234 or /icc/use (a minimal sketch of such an API follows this list). I also built a web page where the positions of the buttons can be recorded. This is super important if you want to switch terminals: different devices have different dimensions; for example, the buttons are aligned differently and their sizes can also differ.
  • 7 smart servo motors: four of them each give the arm two degrees of freedom, one taps the NFC card on the terminal, one inserts the card and one swipes. The servos I’ve chosen are daisy-chainable, which is really convenient for the arm: you just connect each servo to the one next to it, saving you a lot of cable-management trouble. They are all connected to a serial hub, which also feeds them the juice: its power barrel is connected to a 12 V, 5 A PSU (roughly the equivalent of four lithium-ion AAA batteries).
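
To make this concrete, here’s a minimal sketch of what such an HTTP layer could look like in Python with Flask. Only the two routes above come from the actual robot; the Flask framing, the press_button helper and the recorded positions are hypothetical stand-ins.

    # Minimal sketch of GRANT-style HTTP control, assuming Flask.
    # press_button() and BUTTON_POSITIONS are hypothetical stand-ins
    # for the real servo driver and the recorded calibration data.
    from flask import Flask, jsonify

    app = Flask(__name__)

    # Per-terminal calibration: servo angles recorded for each button.
    BUTTON_POSITIONS = {
        "1": (90, 45), "2": (92, 45), "3": (94, 45), "4": (90, 50),
        "0": (92, 60), "enter": (96, 60),
    }

    def press_button(label):
        """Move the arm to the recorded position and tap the key."""
        angles = BUTTON_POSITIONS[label]
        # ...drive the daisy-chained servos to `angles` via the serial hub...
        print("pressing", label, "at", angles)

    @app.route("/arm/<digits>")
    def arm(digits):
        # e.g. GET /arm/1234 types 1-2-3-4 on the terminal's keypad
        for digit in digits:
            press_button(digit)
        return jsonify(status="ok", typed=digits)

    @app.route("/icc/use")
    def icc_use():
        # e.g. GET /icc/use drives the card-inserting servo
        print("inserting card")
        return jsonify(status="ok", action="icc")

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)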

 

Current development

The distant vision for automation in the company is to be able to run test cases from suites such as ADVT and M-TIP. Currently we do our tests with emulated cards; however, it’s not only our SDK that has to be tested: we also have to ensure that our software works as intended after being integrated into other solutions. For this, specific test scenarios have to be performed. The test environment has a built-in server, so I was able to create a command line interface that can:

  • remotely start a test case,
  • show the info bound to that specific test (e.g. what kind of validations to look out for),
  • trigger the card simulation automatically when the test starts,
  • and, as soon as it gets the details of the test case from the server, automatically trigger a transaction on the terminal with the amount defined in the test case.
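
To illustrate, here’s a rough sketch of that flow in Python. Every URL, endpoint and JSON field below is an assumed placeholder; the article doesn’t spell out the test server’s actual interface.

    # Sketch of the CLI flow: fetch a test case, start the card
    # simulation, then drive a transaction through GRANT's HTTP API.
    # All addresses and fields are assumptions, not the real protocol.
    import requests

    TEST_SERVER = "http://test-server.local"  # assumed address
    ROBOT_API = "http://grant.local:8080"     # GRANT's HTTP API

    def run_test_case(case_id):
        # 1. Fetch the details of the selected test case.
        case = requests.get(f"{TEST_SERVER}/cases/{case_id}").json()
        print("Validations to look out for:", case["validations"])

        # 2. Trigger the card simulation bound to this case.
        requests.post(f"{TEST_SERVER}/cases/{case_id}/simulate-card")

        # 3. Start a transaction with the amount defined in the test
        #    case by having GRANT type it on the terminal's keypad.
        requests.get(f"{ROBOT_API}/arm/{case['amount']}")

    run_test_case("ADVT-01")  # hypothetical test case id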

In short: you only have to select the test case in the tool, then insert the card into the terminal. After this, the tool collects some of the data required by the test case (e.g. the currency, sometimes the TSI, TVR, etc.), sends you the customer/merchant receipt, and retrieves the card logs if they exist.

This solution is already capable of making use of the attached magnetic stripe simulator, so I will be able to use any kind of card that has an image defined by the major brands (e.g. MasterCard).

 

What does the future hold?

 


 

For starters, since the first iteration is constantly being used in our continuous delivery flow, I need to build a second robot. Sneak peek: no more wood.

Second: software-side improvements. Currently, the results can only be partially evaluated automatically. The problem mainly arises when the terminal shows behavior we can’t process on our side: it beeps or displays some text, and the test doesn’t necessarily specify all the steps precisely, but the terminal gives you instructions (e.g. when you need to select an application). There are so many possible combinations of such scenarios that starting to do “exception-driven development” sounds like the least reasonable option.
But wait! Python? Raspberry Pi? I already have the two main ingredients for a deep learning system! Add a camera to the mix, let the robot run and collect errors, and after a while it’ll learn which steps lead to a successful test! Python supports a lot of incredibly efficient, open-source image recognition and deep learning libraries (e.g. TensorFlow), which will make it easier to implement such a system.
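
As a taste of what that could look like, here’s a tiny image classifier for terminal screen prompts using tf.keras. It assumes a recent TensorFlow and a labeled set of screen photos; the labels, input shape and dataset are made up for illustration.

    # Illustrative sketch: classify camera frames of the terminal's
    # screen into known prompts. Dataset and labels are hypothetical.
    import tensorflow as tf

    # e.g. "insert card", "select application", "enter PIN", "approved"
    NUM_PROMPTS = 4

    model = tf.keras.Sequential([
        tf.keras.layers.Rescaling(1.0 / 255, input_shape=(96, 96, 1)),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(NUM_PROMPTS, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # model.fit(screen_images, prompt_labels, epochs=10)  # given labeled data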

Stay tuned for more developments!

 

Written by Pal Szentgyorgyi, Automation Engineer at Payworks
First published on Medium on 11-10-2017