Group 2: Card playing robot – Sprint 3


Sprint 3: 04.09.2020 – 11.09.2020

During our weekly meeting we updated each other on the status of the different tasks on the scrum board and the challenges we faced during our last sprint. Several concepts within our system were still up for debate after the last sprint, including the card dealing and shuffling functionality, so this was something we discussed at the meeting. After several discussions, we concluded that we would implement a dealer with a robot arm as shown below, and that shuffling the cards would be a lower priority.


Sketch of shuffler and dealer

The drawing shows a rough sketch of the shuffler and the robotic arm. The idea was that the shuffler has a piston that can lower and raise a platform that the card deck rests on. When it needs to shuffle the cards, the piston raises the platform up to the shuffler. After shuffling, the cards are fed onto the platform once again, and the piston lowers the card deck down for the robotic arm to grab and distribute. 

Having put the shuffler function aside for now, we decided to focus on the card distribution function first. The robotic arm shall be equipped with a distributor sub-system that deals the cards to the players and is able to distribute cards facing both up and down.

Machine (Simen):

To prove the concept and design the parts as precisely as possible, we have started designing an arm in SolidWorks that we aim to 3D print as a prototype. After our initial testing, we will remake and refurbish it into a sturdier model better suited to the "task at hand".

The difficulties with both designing and printing it will be the weight and size factors (this is an arm meant for holding regular playing cards), in addition to the functionality of the arm itself.


Initial design of the robot arm.

This arm will need two servos to rotate the arm and hand along the arm's axis, in addition to three servos to adjust the angles of the arm parts. The hand (to the right in the picture) is the integral part, and will also need a motor for shooting the cards. As it stands, the arm reaches about 34 cm from the first angle joint. This may be too short, but we will figure it out during our initial testing and prototyping.
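To get a feel for how strong the angle-joint servos need to be, a rough back-of-the-envelope torque estimate can be sketched. Only the 34 cm reach comes from the design so far; the link masses and centre-of-mass distances below are placeholder guesses, not measured values:

```python
# Rough servo torque estimate for the first angle joint with the arm fully
# extended. All masses/distances are placeholder guesses for illustration.

G = 9.81  # gravitational acceleration, m/s^2

def required_torque(segments):
    """Torque (N*m) at the base joint: sum of mass * g * lever arm."""
    return sum(mass * G * dist for mass, dist in segments)

# Hypothetical breakdown: two printed links plus the card 'hand' at the
# 34 cm tip (the only number taken from the actual design).
arm = [
    (0.10, 0.10),  # inner link: 100 g, centre of mass 10 cm out
    (0.08, 0.25),  # outer link: 80 g, centre of mass 25 cm out
    (0.12, 0.34),  # hand + card deck: 120 g at the 34 cm tip
]

torque_nm = required_torque(arm)
torque_kgcm = torque_nm * 100 / G  # hobby servos are usually rated in kg*cm
```

With these guessed masses the joint would need roughly 7 kg·cm of holding torque, which is a useful sanity check when browsing servo datasheets later.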

The arm could also be made of plywood. That would require more parts and, most likely, more servos to lift and turn it, but on the other "hand" it would greatly increase the potential length of the arm, since most 3D printers can only print parts up to a limited size (unless we partition the arm into many pieces, which could be a possibility if needed), and the added servos would also increase the strength of the arm. There is also the option of making it out of composites, although that would take time and may not yield enough of a benefit to be worth focusing on for this project.

The next design I will focus on is the hand mechanism: how to let it hold a deck of cards and shoot them. One idea in the making is pre-made decks as cartridges, which the arm pushes into and self-inserts, uses, then ejects after use. A mock design has been made, and I aim to 3D print it to find out how realistic the parts are and which material and design choices to make going forward. It is shown in the picture below:

The one armed bandit

Data: Components-(Danial)

During our last sprint, we chose to use the Raspberry Pi on its own, but after further discussion we concluded that, since we want to increase the complexity of our system, we have to connect an Arduino Mega 2560 to the Raspberry Pi. The robot arm and the ejector require more than four motors, and the Raspberry Pi does not have enough suitable pins to drive that many motors directly. This combination also gives the best result for a reasonable price. The Arduino should handle the simplest commands, since it does not have the ability to run multiple tasks at the same time. The Raspberry Pi's pins also deliver 3.3 V, which is too low since we are going to use motors that require 5 V. That is another reason for connecting the Arduino: it runs at 5 V and will be able to drive the motors (stepper motors, etc.). We will still have to use a transformer or similar for power; this will be presented in the electrical part.
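One way to split the work as described is to let the Pi send small text commands to the Arduino over the USB serial link. The command format below ("M&lt;id&gt;:&lt;angle&gt;") is purely our own working convention, not an established protocol, and the port name is just an example:

```python
# Minimal sketch of a text protocol the Pi could use to drive motors through
# the Arduino over USB serial. The "M<id>:<angle>" format is our own
# invention; the Arduino sketch would have to parse the same convention.

def encode_motor_command(motor_id: int, angle: int) -> bytes:
    """Build one newline-terminated servo command for the Arduino."""
    if not 0 <= angle <= 180:
        raise ValueError("servo angle must be 0-180 degrees")
    return f"M{motor_id}:{angle}\n".encode("ascii")

# On the Pi, the command would be written out with pySerial, e.g.:
#   import serial
#   with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as port:
#       port.write(encode_motor_command(2, 90))  # motor 2 to 90 degrees
```

Keeping the encoding in its own function means it can be tested on any machine before the real hardware arrives.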

We have also looked at the possibility of using a PWM board, which can control all the motors from a single board with 16 output channels; we will explore this option further. The Jetson Nano is another alternative we have considered and will keep in mind going forward. We are still waiting to receive the various parts from our teacher so that we can test the computer components; specifically, we are waiting for the Raspberry Pi and the Raspberry Pi Camera Module.
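If the 16-channel PWM board turns out to be a PCA9685-style driver (a common choice, but an assumption on our part), the servo maths is straightforward: a 12-bit counter running at 50 Hz, with the servo angle mapped onto the pulse width. The 0.5–2.5 ms pulse range below is a common hobby-servo convention, though the servos we actually receive may need different endpoints:

```python
# Servo-angle-to-PWM conversion for a 16-channel PWM board, assuming a
# PCA9685-style chip: 12-bit resolution, driven at 50 Hz. Pulse endpoints
# (0.5-2.5 ms) are a typical hobby-servo assumption, not a measured value.

FREQ_HZ = 50
PERIOD_MS = 1000 / FREQ_HZ   # 20 ms frame at 50 Hz
RESOLUTION = 4096            # 12-bit on-time counter

def angle_to_ticks(angle: float, min_ms: float = 0.5, max_ms: float = 2.5) -> int:
    """Convert a servo angle (0-180 degrees) to the board's 12-bit ticks."""
    if not 0 <= angle <= 180:
        raise ValueError("angle out of range")
    pulse_ms = min_ms + (max_ms - min_ms) * angle / 180
    return round(pulse_ms / PERIOD_MS * RESOLUTION)
```

For example, 90 degrees maps to a 1.5 ms pulse, which is 307 of the 4096 ticks in a 20 ms frame.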

Data: Object detection-(Azim)

For our card playing robot we will implement object detection in order to identify the number and suit of each card. As mentioned in the previous post, we will do this using a Raspberry Pi and a Pi Camera Module. I'm fairly new to machine learning, but luckily this area of computer science is very popular, and there are a lot of helpful articles and tutorials to learn from when trying to choose the best application. From this research I quickly learned that the best library for our task would be OpenCV, a library of programming functions for real-time computer vision. It is also widely used with Python and C++, which is an advantage considering that I have worked a lot with both languages.

For the object-detection algorithm we initially looked at YOLO (You Only Look Once), a state-of-the-art, real-time object detection system. It applies a single neural network to the full image, divides the image into regions, and predicts bounding boxes and probabilities for each region, and it works on video streams. This seemed perfect at first, but when we researched further, and more importantly consulted previous groups who had tried to run YOLO on the Raspberry Pi, we came across recurring issues with low FPS, unexpected crashes, and overheating of the Raspberry Pi. We also considered that the task we need to perform did not warrant the extensiveness of the YOLO system, which is why we decided to work with TensorFlow object detection instead, an open source library that works well for image recognition and training neural networks.

In order for the system to recognize our cards with different numbers and suits, we need training images. For this purpose, we can either go with an existing labeled dataset of playing cards or create one ourselves from several training images. We decided on the latter for better optimization and control of the dataset.
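Building the dataset ourselves means we also get to pick the class labels. One simple scheme, sketched below as our working convention (the "10H"-style names are our own choice, not a TensorFlow requirement), is one class per rank/suit combination:

```python
# Sketch of the label set for a self-made playing-card dataset: one class per
# rank/suit combination, 52 in total. The naming scheme (e.g. "10H" for the
# ten of hearts) is just our working convention.

RANKS = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
SUITS = ["H", "D", "S", "C"]  # hearts, diamonds, spades, clubs

CLASS_NAMES = [rank + suit for suit in SUITS for rank in RANKS]
```

Keeping rank and suit inside one label keeps the detector to a single classification head, and the label can still be split back apart in the game logic later.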

Building up data and tagging

While I'm waiting for the Raspberry Pi, I've started playing with TensorFlow using my webcam, setting up the object detection and seeing how well it recognizes objects. Obviously, the object detection itself is not the hardest part, considering there are many great tutorials one can follow. What I'm spending more time on is finding the most efficient way of processing the camera's information about the card number and suit, and using that information to build up the blackjack game.
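As a first step in that direction, the detector's output can be fed straight into a hand-scoring function. The sketch below assumes the detections arrive as "10H"-style rank-plus-suit labels (our own convention) and computes the blackjack value of a hand, counting aces as 11 or 1:

```python
# Sketch of turning detected card labels into a blackjack hand value.
# Assumes labels use our rank-plus-suit convention, e.g. "AH", "10S", "KD".

def hand_value(labels):
    """Blackjack value of a hand, demoting aces from 11 to 1 when bust."""
    total, aces = 0, 0
    for label in labels:
        rank = label[:-1]  # strip the trailing suit letter
        if rank == "A":
            total += 11
            aces += 1
        elif rank in ("J", "Q", "K"):
            total += 10
        else:
            total += int(rank)
    # While the hand is bust and we still hold a "soft" ace, count it as 1.
    while total > 21 and aces:
        total -= 10
        aces -= 1
    return total
```

This keeps the game logic completely independent of the vision code, so it can be developed and tested before the camera hardware arrives.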

Data: App-(Bjørnar)

We figured an app connected to the robot via Bluetooth would be the best and easiest solution for remote connection. Virtually everyone in the modern world has a phone with Bluetooth, and can therefore control the robot simply by installing our app. By designing the app ourselves we can also make it as user-friendly and simple to use as we want. One weakness of Bluetooth is a slight input delay, but since our app will mostly be used for changing settings rather than real-time control, this should not be a problem for us.

To change the game's settings we needed to choose between several methods of interaction. We discussed the following:

  1. Voice recognition
  2. Physical buttons
  3. Bluetooth
  4. LAN connection

Furthermore, many functions within the app are still under discussion, but there are a couple of functions with high priority that we will absolutely implement, like:

  1. Establishing a stable Bluetooth connection with the Raspberry Pi
  2. Injecting or changing variables in the Raspberry Pi's Python script from the UI in the app

So far the baseline for the app's interface is beginning to take shape. I have also written some code that, in theory, should set up a connection between the phone and the Raspberry Pi. The Python script on the Raspberry Pi will constantly listen on the Bluetooth channel and pick up the different bitstreams that the app is sending. These bitstreams can be converted to integers, which in turn change variables in the script. All of this is hard to test before we get our hands on a real unit, but we anticipate getting the Raspberry Pi within the next week, so progress will continue in the next sprint.
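The bitstream-to-variable step can be sketched independently of the radio link. The payload layout below, pairs of (setting id, value) bytes, and the setting names are hypothetical placeholders for whatever the app and script eventually agree on:

```python
# Sketch of the Pi-side parsing: the app sends small byte payloads over
# Bluetooth, and the script turns them into integers that update the game
# settings. The two-byte (setting id, value) layout and the setting names
# are our own assumptions, not a fixed protocol yet.

SETTINGS = {0: "num_players", 1: "num_decks", 2: "face_up"}

def apply_payload(payload: bytes, settings: dict) -> dict:
    """Decode (id, value) byte pairs and write them into the settings dict."""
    for i in range(0, len(payload) - 1, 2):
        setting_id, value = payload[i], payload[i + 1]
        name = SETTINGS.get(setting_id)
        if name is not None:  # silently skip unknown setting ids
            settings[name] = value
    return settings

# On the real unit the payload would come from the Bluetooth socket's
# recv() call instead of a hard-coded bytes literal.
```

Because the parsing is a pure function, the app's messages can be simulated and verified long before the physical Raspberry Pi shows up.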


Work in progress picture of the app’s “game settings” page

Electro (Sondre):

Deciding on the design of the arm means that decisions about the strength of the arm and how fast we want it to move can be made. After the arm has been 3D printed, the process of placing the actuators that will move it can begin. Which actuators we'll use hasn't been decided yet, as we're still unsure how heavy the arm and its 'hand' will be. After a brief discussion with the person responsible for the equipment about which motors are available to us, it turns out there are plenty of motors to choose from.

Knowing that there is no lack of motors to choose from, we have to determine which motors are capable of generating enough power to run the system efficiently, while ensuring that they are as small as possible so they fit inside the parts. Moving forward, calculations and measurements must be made to choose the right motors, as well as which type of motor we want, since the motors need to perform different operations and functions. In some cases we will probably need precision over speed, and vice versa in other cases. By next week, we will begin with the motors on the main part of the arm.

Determining which motors are needed, and how many, also raises the question of how we're going to run them. As of now, the system uses a Raspberry Pi to run the system's operations and functions. When the Raspberry Pi sends out signals to the motors, we need something that can handle these signals and pass them on to the correct motors. Our initial thought was to use an Arduino board for this, but after some consideration, another possibility is to include a dedicated board for this purpose, in the form of a multiplexer. This is something we'll be looking at in the next sprint.



