This week we looked into how to integrate ROS with the Raspberry Pi and the other components of our system. We started by setting up a virtual machine with Ubuntu and ROS installed, then used the catkin Python tools to build the workspace and create our own package with some simple dependencies. This custom package is what will let us use the cameras for the 3D mapping.
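To make the package a bit more concrete, here is a minimal sketch of the kind of node it could contain, assuming rospy and std_msgs as the simple dependencies; the node and topic names are placeholders rather than the final design.

```python
#!/usr/bin/env python
# Minimal sketch of a node that could live in our custom package.
# Assumes the package depends on rospy and std_msgs; the node and
# topic names are placeholders, not the final design.
import rospy
from std_msgs.msg import String

def main():
    rospy.init_node('mapping_status')              # register the node with the ROS master
    pub = rospy.Publisher('status', String, queue_size=10)
    rate = rospy.Rate(1)                           # publish once per second
    while not rospy.is_shutdown():
        pub.publish(String(data='3d mapping package is alive'))
        rate.sleep()

if __name__ == '__main__':
    main()
```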
When choosing the camera input we found that the system needs either one stereo camera as the optical sensor or two monocular cameras that we define as a stereo pair. This means we need a package that lets us take each camera's output and declare it as the left or right camera for the 3D mapping. After defining the left and right camera we still have to calibrate and synchronize them so that the data is actually usable for the 3D mapping.
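One way to "declare" two monocular cameras as a stereo pair is a small relay node that republishes their image topics under a shared stereo namespace; the sketch below assumes topic names like /camera_a/image_raw and a /stereo namespace, which will depend on the camera driver we end up using.

```python
#!/usr/bin/env python
# Sketch of a relay that presents two monocular cameras as one stereo pair.
# The input topics (/camera_a/..., /camera_b/...) and the /stereo namespace
# are assumptions; they depend on the camera driver we end up using.
import rospy
from sensor_msgs.msg import Image

class StereoRelay(object):
    def __init__(self):
        self.left_pub = rospy.Publisher('/stereo/left/image_raw', Image, queue_size=5)
        self.right_pub = rospy.Publisher('/stereo/right/image_raw', Image, queue_size=5)
        # Forward each incoming image straight to the matching stereo topic.
        rospy.Subscriber('/camera_a/image_raw', Image, self.left_pub.publish)
        rospy.Subscriber('/camera_b/image_raw', Image, self.right_pub.publish)

if __name__ == '__main__':
    rospy.init_node('stereo_relay')
    StereoRelay()
    rospy.spin()
```

The same effect can often be had with topic remapping in a launch file, so we may not need a node like this at all, but it shows the idea of a left and right camera defined from two separate devices.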
We also spent time learning ROS itself, since we are not very experienced with programming in it or with its syntax. We found sample code for running both cameras at the same time, but it does not show how to calibrate them or how to use the data outputs in the 3D mapping.
Synchronizing the two streams will require yet another ROS package on top of the one we use to define the cameras. For the calibration itself the plan is to use a chessboard pattern and OpenCV.
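A common choice for the synchronization part is the message_filters package with its ApproximateTimeSynchronizer, which hands you left/right frames that arrived close together in time. The sketch below pairs the two streams and checks each pair for the calibration chessboard with OpenCV; the topic names, the 8x6 board size and the 0.05 s slop are assumptions about our eventual setup.

```python
#!/usr/bin/env python
# Sketch: synchronize the left/right streams and check each pair for the
# calibration chessboard. Topic names, the 8x6 board size and the slop
# value are assumptions about our eventual setup.
import rospy
import cv2
import message_filters
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

BOARD_SIZE = (8, 6)   # inner corners of the printed chessboard (assumed)
bridge = CvBridge()

def synced_pair(left_msg, right_msg):
    # Convert the ROS images to OpenCV and look for the chessboard in both.
    left = bridge.imgmsg_to_cv2(left_msg, desired_encoding='mono8')
    right = bridge.imgmsg_to_cv2(right_msg, desired_encoding='mono8')
    ok_l, corners_l = cv2.findChessboardCorners(left, BOARD_SIZE)
    ok_r, corners_r = cv2.findChessboardCorners(right, BOARD_SIZE)
    if ok_l and ok_r:
        rospy.loginfo('chessboard seen in both cameras, pair usable for calibration')
        # The corner lists would be collected here and later fed to
        # cv2.stereoCalibrate to recover the camera matrices and extrinsics.

if __name__ == '__main__':
    rospy.init_node('stereo_calibration_capture')
    left_sub = message_filters.Subscriber('/stereo/left/image_raw', Image)
    right_sub = message_filters.Subscriber('/stereo/right/image_raw', Image)
    sync = message_filters.ApproximateTimeSynchronizer([left_sub, right_sub],
                                                       queue_size=10, slop=0.05)
    sync.registerCallback(synced_pair)
    rospy.spin()
```

ROS also ships a camera_calibration package with a stereo calibration tool, so we may end up using that instead of writing the calibration step ourselves.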
When it comes to the arm, we finished the physical build and attached it to the car. We had some friction problems and had to change the model; after the change the problem was solved. The stepper motor is mounted upside down because that makes it easier to attach to the car and lets the gears rotate more freely with less friction.