AROWEEK – Week 1 & 2

Bram van den Nieuwendijk – Mechanical Engineering

Week 2 I did some research into possible layouts of the car and started thinking about what kinds of mechanical aspects come into play in the design. I did some brainstorming with Rick about this as well.

Åsmund Thygesen – Software Engineer

Week 2 I was supposed to start looking into the software side of the motion control system and getting familiar with using the GPIO pins on the Raspberry Pi. Due to “life”, I have not really gotten started on much, other than looking a bit into controlling the GPIO pins on the Pi.
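As a starting point, here is a minimal sketch of driving two DC motors through the GPIO pins with the gpiozero library. The pin numbers and the H-bridge wiring are placeholders, since we have not settled on hardware yet.

```python
# Minimal sketch: drive two DC motors via GPIO using gpiozero.
# BCM pin numbers and the H-bridge wiring are placeholders.
from gpiozero import Motor
from time import sleep

left = Motor(forward=4, backward=14)
right = Motor(forward=17, backward=18)

left.forward(0.5)    # run both wheels at half speed
right.forward(0.5)
sleep(2)             # drive for two seconds
left.stop()
right.stop()
```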

I’m off to a bit of a slow start, but we have now borrowed some equipment and narrowed down the plan a bit, so hopefully this next week will be more productive.

Rick Embregts – Electrical Engineer

To make a first design for the electrical system we need to figure out what components we can use and what the vehicle is going to look like in its simplest form. Together with Bram I had a short brainstorming session to come up with a few rough layouts of the vehicle. Next week will be about deciding on the vehicle layout based on the inventory at school.

Darkio Luft – Software Engineer

Week 2 I was searching for tools I could use for the image recognition and revisiting implementation techniques for developing the neural network.

I decided to use the YOLO framework and implement it on a Raspberry Pi 4 to recognize the weeds.
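As a rough sketch of what the inference step could look like with the ultralytics YOLO package: the model file is a generic pretrained one and the image path is a placeholder, since a model would still need to be trained on weed images.

```python
# Minimal sketch of YOLO inference on the Pi with the ultralytics
# package; "yolov8n.pt" is a generic pretrained model and
# "field_photo.jpg" is a placeholder, not our trained weed detector.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")            # nano model, light enough for a Pi 4
results = model("field_photo.jpg")    # run detection on one image

for box in results[0].boxes:          # print class id, confidence, bbox
    print(box.cls, box.conf, box.xyxy)
```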

Sulaf Bozomqita – Software Engineer

Navigation and Terrain Mapping Options for AROWEEK

Overview

During this week I looked into different navigation options and alternatives for our autonomous robot.

I weighed the alternatives against several criteria, using these questions as a guideline:

Is the navigational system autonomous enough?

Is it too simplistic?

Is it achievable?

Do we have the materials to implement the functionalities needed?

Option 1 – Real-Time Line Tracking with Obstacle Detection

The first navigation concept involves real-time movement mapping using lines on the ground that the robot can pick up using IR sensors. Combined with ultrasonic sensors, obstacle detection can also be included as reassurance that the robot will be capable of avoiding any objects in its way. While this approach offers dynamic responsiveness, it presents several limitations, particularly with scalability.
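A minimal sketch of this control loop, assuming gpiozero's LineSensor for the IR sensor and DistanceSensor for the ultrasonic one; all pin numbers and thresholds are placeholders:

```python
# Minimal sketch of line following with ultrasonic obstacle detection.
# BCM pins and thresholds are placeholders, not actual wiring.
from gpiozero import LineSensor, DistanceSensor, Motor
from time import sleep

line = LineSensor(21)                        # IR reflectance sensor
sonar = DistanceSensor(echo=23, trigger=24)  # ultrasonic rangefinder
left = Motor(forward=4, backward=14)
right = Motor(forward=17, backward=18)

while True:
    if sonar.distance < 0.2:             # object within 20 cm: stop
        left.stop(); right.stop()
    elif line.line_detected:             # on the line: drive straight
        left.forward(0.4); right.forward(0.4)
    else:                                # off the line: pivot to find it
        left.forward(0.3); right.backward(0.3)
    sleep(0.05)
```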

This method may work on a small scale, but it is not very scalable for larger or more complex environments. While we do not need to make the project function on a large scale, it would be advantageous to create a product that could be adapted to a larger scale if it were ever deployed in agricultural fields. Additionally, since our project is only a prototype and not intended for commercial deployment, laying out lines (or the wires that might substitute for them) across a large area adds unnecessary complexity.

Option 2 – Terrain Recognition & Mapping

LIDAR-Based Mapping is another option I looked into, and it seemed more promising the more I read about it.

By using a LIDAR, which emits laser pulses and measures their return time to map the surrounding terrain, the terrain mapping part could be tackled. Paired with a stereo camera, which estimates depth via disparity (the difference between the two images), it could offer highly accurate terrain detection.
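To illustrate the disparity idea: with focal length f and camera baseline B, depth is roughly f·B / disparity. A minimal OpenCV sketch, assuming already-rectified image pairs and placeholder camera parameters:

```python
# Minimal sketch of depth-from-disparity with OpenCV block matching.
# Assumes rectified grayscale stereo images; f and B are placeholders.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

f, B = 700.0, 0.06                       # focal length (px), baseline (m)
depth = np.where(disparity > 0, f * B / disparity, 0.0)  # metres
```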

Even so, there were certain limitations. For one, I did not know whether we had such a device on campus. However, I was able to find one online for a decent price. Shipping time and the reliability of the product were also things to consider when looking at this option. I kept this option in the back of my mind while I kept looking for other alternatives.

Option 3 – Hybrid: Obstacle Detection with a Varied Navigational System

Just like the first option, Real-Time Line Tracking, this method could use ultrasonic sensors to navigate around obstacles. This option experiments with the idea of having a predefined terrain pattern instead of any real-time navigation. The path through the field could be a zig-zag motion to ensure larger terrain coverage, just as autonomous lawn mowers maneuver. This idea also explores wheel encoders, which measure how far the robot has traveled, combined with an IMU sensor to track its orientation. I found out that this method is already well established; it is called dead reckoning.
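A minimal sketch of the dead-reckoning pose update for a differential-drive robot, with placeholder values for the wheel radius, track width, and encoder resolution:

```python
# Minimal dead-reckoning sketch: integrate wheel encoder ticks into
# an (x, y, theta) pose estimate. All physical constants are placeholders.
import math

WHEEL_RADIUS = 0.03    # m
TRACK_WIDTH = 0.15     # m, distance between the two wheels
TICKS_PER_REV = 360    # encoder ticks per wheel revolution

x, y, theta = 0.0, 0.0, 0.0

def update_pose(left_ticks, right_ticks):
    """Fold one pair of encoder readings into the pose estimate."""
    global x, y, theta
    dl = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    dr = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    d = (dl + dr) / 2                  # distance travelled by the centre
    dtheta = (dr - dl) / TRACK_WIDTH   # change in heading
    x += d * math.cos(theta + dtheta / 2)
    y += d * math.sin(theta + dtheta / 2)
    theta += dtheta
```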

How will it work?

It relies on local sensors to detect obstacles, which in our case could be other flowers or rocks in the terrain, and effectively avoids them. It also has a camera, of course, to identify the weeds and trigger the spray option. The IMU, as briefly mentioned above, could be used for the navigation itself, tracking the robot's orientation and movement. This method does not require GPS or LiDAR: the robot can operate with the help of sensors alone and remember where it has been in the terrain, so it effectively avoids visiting the same spot twice within a cycle (all handled by the Raspberry Pi). The robot could also be programmed to follow a pattern that covers the entire terrain, if the predefined pattern is known. And by having the robot avoid obstacles while maneuvering through the terrain, we have a solid plan for the navigational system.

All these options will be presented to the group in the upcoming week, and we will discuss them along with any other alternatives that could be viable.
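As a quick illustration of the zig-zag coverage idea mentioned above, here is a minimal sketch that generates waypoints sweeping a rectangular field; the field dimensions and row spacing are placeholders:

```python
# Minimal sketch of a zig-zag (boustrophedon) coverage path over a
# rectangular field. Dimensions and row spacing are placeholders.
def zigzag_waypoints(width, height, row_spacing):
    """Yield (x, y) waypoints sweeping the field in alternating rows."""
    y, going_right = 0.0, True
    while y <= height:
        if going_right:
            yield (0.0, y)
            yield (width, y)
        else:
            yield (width, y)
            yield (0.0, y)
        y += row_spacing
        going_right = not going_right

for waypoint in zigzag_waypoints(width=3.0, height=2.0, row_spacing=0.5):
    print(waypoint)
```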

