ARbattles is a project that combines robotics, electronics, computer vision, and projection mapping in a game where players battle each other through augmented reality. It is controlled by a Python script consisting of two modules: one for the game itself (shooting bullets, keeping score, and displaying it all on screen through pygame) and one for tracking where the robots are with a camera (computer vision). The project was displayed at various events, including xLabs at TEDxYouth@Austin, Austin Maker Faire, iPadPalooza, and the B&N Mini Maker Faire.


Project Lead

Riya Aggarwal


Michael Huang
Lee Balboa
Rachel Gardner


Project Status: Completed

Exhibition Dates:

Project Tags:

TedX, Robots, Programming, Computer Vision, Projection Mapping


ARbattles is made of two main parts: the software for the game and computer vision, and the robots themselves, which are connected and controlled via Bluetooth Low Energy (BLE) signals. The software uses computer vision to find the position and rotation of each robot and sends this information to the pygame part of the script. It was built with OpenCV as the basis of the computer vision, with tweaks to integrate it into the game module of the script. The game module uses pygame, a cross-platform set of Python modules designed for writing video games; we use it to create a window that is projected onto a whiteboard where the game is played. It takes controller and camera input, and outputs Bluetooth signals via a Bluetooth transmitter as well as the screen that is projected on the whiteboard. Each robot receives these Bluetooth signals, which contain a packet with the settings for its continuous rotation servos, and sets the servos accordingly through a BLE RN4020 module and an Arduino Pro Mini.
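The exact layout of the servo packet isn't documented here, but the idea can be sketched in a few lines. This is an illustrative sketch only, assuming a hypothetical two-byte payload with one byte (0-255) per continuous rotation servo; the real ARbattles packet format may differ:

```python
def encode_servo_packet(left: int, right: int) -> bytes:
    """Pack two servo settings (0-255 each) into a two-byte BLE payload.

    By convention for continuous rotation servos: 0 = full speed one way,
    ~128 = stop, 255 = full speed the other way. Hypothetical format,
    shown for illustration; not the exact ARbattles packet.
    """
    for value in (left, right):
        if not 0 <= value <= 255:
            raise ValueError("servo value must be 0-255")
    return bytes([left, right])

# Example: left wheel full forward, right wheel stopped
packet = encode_servo_packet(255, 128)
print(packet.hex())  # "ff80"
```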

ARbattles Overview
ARbattles Software

Software: pygame and OpenCV

[Find the ARbattles repository on GitHub here]
ARbattles uses two main technologies for its software: pygame and OpenCV. The game part of the software is a pygame script. It receives information from the computer vision module and the joysticks; executes game-mechanics code (score, bullets, rotation, etc.); displays the game in a window that is projected on the whiteboard; and sends out Bluetooth signals as packets of hex bytes (0-255 for servo values) to the Bluetooth transmitter connected to the computer, which relays them to the BLE modules on the robots. The computer vision module uses OpenCV to find the rotation and location of each robot based on its color and shape, and sends this to the pygame script so it can display bullets in relation to those locations and rotations and detect bullet collisions.
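One common way to recover a robot's rotation from color-and-shape tracking is to locate two differently colored markers on the robot and take the angle between them. The sketch below assumes the OpenCV side has already produced the (x, y) pixel centroids of a front and a back marker (e.g. via cv2.inRange and cv2.moments); the function names and marker scheme are assumptions for illustration, not the exact ARbattles code:

```python
import math

def robot_pose(front: tuple, back: tuple):
    """Estimate a robot's position and heading from two marker centroids.

    `front` and `back` are (x, y) pixel centroids of two colored markers.
    Returns the midpoint between them and the heading angle in degrees
    (0 = +x axis, measured counter-clockwise in standard math convention).
    Illustrative sketch only.
    """
    cx = (front[0] + back[0]) / 2
    cy = (front[1] + back[1]) / 2
    # Note: image and pygame y-axes point down, so a real game loop
    # may need to flip the sign of the angle.
    angle = math.degrees(math.atan2(front[1] - back[1], front[0] - back[0]))
    return (cx, cy), angle

pose, heading = robot_pose((110.0, 100.0), (90.0, 100.0))
print(pose, heading)  # (100.0, 100.0) 0.0
```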


The electronics for each robot are essentially identical. The main processor is an Arduino Pro Mini, connected to a Bluetooth Low Energy chip that relays Bluetooth messages to the Arduino; the Arduino then sets the continuous rotation servos when the Program Switch is in "Run" mode. When the switch is in "Prog" mode, the connection is broken so that each chip can be programmed individually. The robot body is 3D-printed plastic with a magnet, two wheels driven by continuous rotation servos, and a caster.
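The Arduino firmware itself isn't shown here, but the mapping it performs can be sketched in Python: each received servo byte (0-255) maps linearly onto a servo pulse width, conventionally 1000-2000 microseconds for continuous rotation servos, with ~1500 meaning stop. The endpoints below mirror Arduino's map(value, 0, 255, 1000, 2000) idiom and are assumptions; the actual ARbattles firmware may drive its servos differently:

```python
def byte_to_pulse_us(value: int) -> int:
    """Map a received servo byte (0-255) onto a pulse width in microseconds.

    Continuous rotation servos conventionally treat ~1500 us as stop,
    1000 us as full speed one way, and 2000 us as full speed the other.
    Integer math mirrors Arduino's map(); endpoints are assumed values.
    """
    if not 0 <= value <= 255:
        raise ValueError("servo byte must be 0-255")
    return 1000 + value * 1000 // 255

print(byte_to_pulse_us(0), byte_to_pulse_us(128), byte_to_pulse_us(255))
# 1000 1501 2000
```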

Close up Robot

TEDx Exhibition

We took ARbattles to TEDx, and it was an amazing experience! After three months of working endlessly on this project, it was very rewarding to see it in action. After setting up the night before, we tested calibration, charged batteries, and did a few test runs. We saw some issues with collisions and boundaries, so we worked on debugging right up until xLabs began. During the hour-and-a-half-long event, over 100 people visited our booth, fascinated by the robots that seemed to float (via magnets) on the whiteboard. It was exciting to see kids play with our project and enjoy the competitive aspect of the game. After the event, we listed a variety of items to work on before our next exhibition, the Mini Maker Faire on May 7th.