{"id":11255,"date":"2025-10-13T09:15:21","date_gmt":"2025-10-13T08:15:21","guid":{"rendered":"https:\/\/dronesonen.usn.no\/?p=11255"},"modified":"2025-12-08T17:59:07","modified_gmt":"2025-12-08T16:59:07","slug":"astrorover-group-5-week-8-matetoget-mater-videre","status":"publish","type":"post","link":"https:\/\/dronesonen.usn.no\/?p=11255","title":{"rendered":"ASTROROVER &#8211; GROUP 5 &#8211; WEEK 8 &#8211; MATETOGET MATER VIDERE"},"content":{"rendered":"\n<h2 class=\"wp-block-heading\">August:<\/h2>\n\n\n\n<p>We are back in the saddle. This week we started making the car as autonomous as possible. In our first iteration we will make it dumb-autonomous: the car will drive forward and stop when the ultrasonic sensor detects an obstacle closer than the threshold allows. We will later integrate this with the LiDAR and make it smart. As always, I will follow best practice. That way it will be easier to integrate with ROS2 later, and easier to maintain, extend, and read the code.<\/p>\n\n\n\n<p>Before I start a task, I want to make sure that I don't miss a thing. When Sander and I wrote the Game Design Document, he made a checklist in Excel. He's a frickin' genius. We will never forget anything ever again (unless we forget to write something on the list). 
<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Implementation of Ultrasonic Sensor &#8211; Start of Autonomous Driving<\/h3>\n\n\n\n<h4 class=\"wp-block-heading\">Ultrasonic Sensor &#8211; Check List &#8211; Iteration 1<\/h4>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"615\" height=\"192\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-51.png\" alt=\"\" class=\"wp-image-11265\" style=\"width:650px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-51.png 615w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-51-300x94.png 300w\" sizes=\"auto, (max-width: 615px) 100vw, 615px\" \/><\/figure>\n\n\n\n<p>I started by getting a grip on how to connect the ultrasonic sensor (from now on: US) to the correct pins on the micro:bit driver board. I of course did that by following best practice and creating a connection scheme:<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Connection scheme<\/h4>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1402\" height=\"1094\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-13.png\" alt=\"\" class=\"wp-image-11393\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-13.png 1402w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-13-300x234.png 300w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-13-1024x799.png 1024w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-13-768x599.png 768w\" sizes=\"auto, (max-width: 1402px) 100vw, 1402px\" \/><\/figure>\n\n\n\n<p>Then I connected the pins physically. 
<\/p>\n\n\n\n<p>VCC-&gt;3v3<\/p>\n\n\n\n<p>TRIG-&gt;P1<\/p>\n\n\n\n<p>ECHO-&gt;P2<\/p>\n\n\n\n<p>GND-&gt;GND<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"768\" height=\"1024\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/IMG_7360-768x1024.jpg\" alt=\"\" class=\"wp-image-11256\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/IMG_7360-768x1024.jpg 768w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/IMG_7360-225x300.jpg 225w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/IMG_7360-1152x1536.jpg 1152w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/IMG_7360-1536x2048.jpg 1536w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/IMG_7360-scaled.jpg 1920w\" sizes=\"auto, (max-width: 768px) 100vw, 768px\" \/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Simple Flow Chart of the Implementation<\/h4>\n\n\n\n<p>The code is more complex than this, but it's a good way to explain it to others, I think. 
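<\/p>

<p>The flow chart boils down to two small calculations: turn the ECHO pulse width into a distance, and compare it to the threshold. Here is a minimal Python sketch of that logic (a sketch only; the real code is C++ on the micro:bit, and the 20 cm threshold here is just a placeholder):<\/p>

```python
# The US raises ECHO for as long as the ping's round trip takes, so
# distance = (echo time * speed of sound) / 2.

SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s at room temperature

def echo_to_cm(echo_us: float) -> float:
    """Convert an ECHO pulse width in microseconds to a distance in cm."""
    return echo_us * SPEED_OF_SOUND_CM_PER_US / 2.0

def should_stop(distance_cm: float, threshold_cm: float = 20.0) -> bool:
    """Dumb-autonomous rule: stop as soon as an obstacle is inside the threshold."""
    return distance_cm < threshold_cm
```

<p>The division by two is because the echo time covers the trip out to the obstacle and back. 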
<\/p>\n\n\n\n<figure class=\"wp-block-image size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"375\" height=\"1024\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-15-375x1024.png\" alt=\"\" class=\"wp-image-11399\" style=\"width:427px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-15-375x1024.png 375w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-15-110x300.png 110w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-15-768x2095.png 768w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-15-563x1536.png 563w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-15-751x2048.png 751w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-15-scaled.png 938w\" sizes=\"auto, (max-width: 375px) 100vw, 375px\" \/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">First Ultrasonic Sensor readings<\/h4>\n\n\n\n<p>When this was okay, I wrote a simple piece of code, just to make sure that the hardware worked and that I got readings in the terminal. <\/p>\n\n\n\n<figure class=\"wp-block-video\"><video controls src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/219FF887-F83C-49BB-BA2B-28B95891E86A.mp4\"><\/video><\/figure>\n\n\n\n<p>I then wanted to make the car drive forward. I struggled for some time by myself before I contacted Sander. He had worked on this and sent me his code, and I got the car moving.<\/p>\n\n\n\n<p>After this I rewrote the ultrasonic sensor code in OOP style. Extremely satisfying work. 
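<\/p>

<p>To give a taste of the structure, here is a Python mirror of the idea (a sketch: the actual implementation is the C++ in the screenshots, and the class and state names here are illustrative):<\/p>

```python
from enum import Enum

class CarState(Enum):
    # Hypothetical states; the real set lives in CarState.h.
    DRIVING = 1
    STOPPED = 2

class UltrasonicSensor:
    """The sensor class only measures; deciding what the car
    does with the measurement is left to the caller."""

    def __init__(self, read_echo_us, threshold_cm=20.0):
        self._read_echo_us = read_echo_us  # injected hardware read, so it's testable
        self.threshold_cm = threshold_cm

    def distance_cm(self) -> float:
        return self._read_echo_us() * 0.0343 / 2.0

    def obstacle_detected(self) -> bool:
        return self.distance_cm() < self.threshold_cm

def next_state(sensor: UltrasonicSensor) -> CarState:
    """Dumb-autonomous policy: stop when something is inside the threshold."""
    return CarState.STOPPED if sensor.obstacle_detected() else CarState.DRIVING
```

<p>Injecting the hardware read as a function means the class can be exercised without the sensor attached.<\/p>

<p>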
Picture of the US code:<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">UltrasonicSensor.h<\/h4>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"506\" height=\"348\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-47.png\" alt=\"\" class=\"wp-image-11259\" style=\"width:579px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-47.png 506w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-47-300x206.png 300w\" sizes=\"auto, (max-width: 506px) 100vw, 506px\" \/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">CarState.h<\/h4>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"211\" height=\"213\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-48.png\" alt=\"\" class=\"wp-image-11260\" style=\"width:264px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-48.png 211w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-48-150x150.png 150w\" sizes=\"auto, (max-width: 211px) 100vw, 211px\" \/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">UltrasonicSensor.cpp<\/h4>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"566\" height=\"401\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-49.png\" alt=\"\" class=\"wp-image-11261\" style=\"width:627px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-49.png 566w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-49-300x213.png 300w\" sizes=\"auto, (max-width: 566px) 100vw, 566px\" \/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">main.cpp<\/h4>\n\n\n\n<p>This will probably be refactored into a function in a later iteration. 
That is when we will integrate it with the other subsystems running on ROS2, together with the LiDAR.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"531\" height=\"235\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-52.png\" alt=\"\" class=\"wp-image-11268\" style=\"width:592px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-52.png 531w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-52-300x133.png 300w\" sizes=\"auto, (max-width: 531px) 100vw, 531px\" \/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Important from MotorController.cpp<\/h4>\n\n\n\n<p>This code will change drastically in the next iterations. It's just a matter of time. I will probably add a number of handler functions, one for each of the different switch cases.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"402\" height=\"445\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-53.png\" alt=\"\" class=\"wp-image-11269\" style=\"width:573px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-53.png 402w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-53-271x300.png 271w\" sizes=\"auto, (max-width: 402px) 100vw, 402px\" \/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Plug and play<\/h4>\n\n\n\n<p>After the OOP rewrite, it was plug and play:<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video controls src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/ACB7AB00-D779-4206-8E50-48574DF2697C.mp4\"><\/video><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">List after finishing this task:<\/h4>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"617\" height=\"192\" 
src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-54.png\" alt=\"\" class=\"wp-image-11283\" style=\"width:650px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-54.png 617w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-54-300x93.png 300w\" sizes=\"auto, (max-width: 617px) 100vw, 617px\" \/><\/figure>\n\n\n\n<p>Pushed to GitHub, quality approved, and moving on for more fun!<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\" \/>\n\n\n\n<h3 class=\"wp-block-heading\">Sense HAT &#8211; Gyroscope and Accelerometer &#8211; Continuation of Autonomous Driving<\/h3>\n\n\n\n<h4 class=\"wp-block-heading\">Sense HAT &#8211; Check List &#8211; Iteration 1<\/h4>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"645\" height=\"213\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-56.png\" alt=\"\" class=\"wp-image-11289\" style=\"width:650px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-56.png 645w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-56-300x99.png 300w\" sizes=\"auto, (max-width: 645px) 100vw, 645px\" \/><\/figure>\n\n\n\n<p>We want to make our car autonomous. We could just mount US sensors on all sides, default to driving forward, and move away when an obstacle is in the way. But we want the car to be smarter, and hopefully run some sort of A* algorithm. To make it smarter we want to add a LiDAR and a Sense HAT. The Sense HAT has both an accelerometer and a gyroscope. 
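<\/p>

<p>For reference, this is the kind of thing A* does; a minimal textbook sketch in Python on a toy grid (not our code, and our rover would run it over a map built from the LiDAR):<\/p>

```python
import heapq

def astar(grid, start, goal):
    """Minimal A* on a 4-connected grid (0 = free, 1 = obstacle).
    Returns the shortest path as a list of (row, col), or None."""
    def h(p):
        # Manhattan distance heuristic: admissible on a 4-connected grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_set = [(h(start), 0, start, [start])]  # (f, g, position, path so far)
    best_g = {start: 0}
    while open_set:
        _, g, pos, path = heapq.heappop(open_set)
        if pos == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (pos[0] + dr, pos[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                if g + 1 < best_g.get(nxt, float("inf")):
                    best_g[nxt] = g + 1
                    heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None
```

<p>The heuristic is what separates A* from plain Dijkstra: it pulls the search toward the goal without giving up optimality.<\/p>

<p>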
With this as our starting point, the sky is the limit.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Connecting Sense HAT -&gt; RPi4<\/h4>\n\n\n\n<p>I started by connecting the Sense HAT to the GPIO pins on the RPi4 and turned it on, and this is what it looks like:<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/mail.google.com\/mail\/u\/0?ui=2&amp;ik=fce5898061&amp;attid=0.1&amp;permmsgid=msg-a:r-5911224203533243333&amp;th=199ce2e59d350f11&amp;view=fimg&amp;fur=ip&amp;permmsgid=msg-a:r-5911224203533243333&amp;sz=s0-l75-ft&amp;attbid=ANGjdJ-kBnaClDmEQYHzec81Whr2NrZjDlF2FKl9QzpnIqRcTPlVV-1DqbNVM1FFYztx_nJ0GuOA9mlABJZJDSZ4-aYAJUwVOuEvakIu-KlqRn8K34TIlyHrbycZR-U&amp;disp=emb&amp;realattid=0AF515EE-5C0B-460A-AABD-0F6F540F2F49&amp;zw\" alt=\"\" \/><\/figure>\n\n\n\n<p>I then had to see if I had contact with it. I wondered if it would be plug and play; not spending 2-3 hours just getting readings would be fricking amazeballs. Let's find out!<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">See Sense HAT in Terminal<\/h4>\n\n\n\n<p>Started by running:<\/p>\n\n\n\n<p><code>sudo raspi-config<\/code><\/p>\n\n\n\n<p>There I enabled I2C and pressed Finish. 
Then:<\/p>\n\n\n\n<p><code>sudo reboot<\/code><\/p>\n\n\n\n<p>Then I ran:<\/p>\n\n\n\n<p><code>sudo apt install i2c-tools<\/code><\/p>\n\n\n\n<p>and voila:<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"490\" height=\"196\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-58.png\" alt=\"\" class=\"wp-image-11299\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-58.png 490w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-58-300x120.png 300w\" sizes=\"auto, (max-width: 490px) 100vw, 490px\" \/><\/figure>\n\n\n\n<ul class=\"wp-block-list\">\n<li>1c = IMU &#8211; accelerometer<\/li>\n\n\n\n<li>6a = IMU &#8211; gyro<\/li>\n\n\n\n<li>5f = Temperature and Humidity<\/li>\n\n\n\n<li>5c = Barometer<\/li>\n<\/ul>\n\n\n\n<h4 class=\"wp-block-heading\">The first hiccup &#8211; SSH -&gt; RPi4 via VS Code<\/h4>\n\n\n\n<p>My good friend and genius, Sander Pedersen, showed me that he could connect to the RPi4 via SSH from VS Code. This was fricking awesome, but he did it on his desktop PC, which runs Ubuntu. I had to do it from a WSL terminal opened in PowerShell. This took approximately 2 hours to fix; it was easy once I found out what the frick the problem was. When I typed <code>code .<\/code> in the WSL terminal, VS Code still opened the Windows version. I had to fix the PATH. Thank god for terminal magic, and le prompt: <code>which code<\/code>. There were also some issues with the passphrase for the SSH key, but this was fixed in the user settings JSON file. Let's continue. 
Mate p\u00e5.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"630\" height=\"143\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-59.png\" alt=\"\" class=\"wp-image-11315\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-59.png 630w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-59-300x68.png 300w\" sizes=\"auto, (max-width: 630px) 100vw, 630px\" \/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">System Architecture Scoped Down<\/h4>\n\n\n\n<p>This is a scoped-down diagram of what I hope to finish this sprint. This is how I think the system is, and how it will be, at least for this iteration.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"1011\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-14-1024x1011.png\" alt=\"\" class=\"wp-image-11395\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-14-1024x1011.png 1024w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-14-300x296.png 300w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-14-768x758.png 768w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-14-1536x1516.png 1536w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/smarte_system.drawio-14-2048x2021.png 2048w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Get readings in Terminal from Sense HAT<\/h4>\n\n\n\n<p>This was a bit tricky. I had to find out which sensors and buttons on the Sense HAT work with the Ubuntu 22.04 distribution. The buttons and joystick are not supported. Luckily, that ain't an issue; all the sensors are fully supported. It is the python3-sensehat package that does not fully support Ubuntu 22.04. 
I had to go into the library and set the joystick to None. After that, we had some sensor readings. Frickin' love it.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Making the Sense HAT a ROS2 Node<\/h4>\n\n\n\n<p>To get readings from the Sense HAT, I had to implement it as a ROS2 node, named sensehat_node. To create the node I used Python and the ROS2 library rclpy for communication between nodes. <\/p>\n\n\n\n<p>The class SenseHATPublisher inherits from rclpy.node.Node and runs as an independent process in the ROS2 system. On startup a publisher is created, which publishes messages on a topic. It uses std_msgs\/msg\/String, where the data is sent as a JSON string containing the different readings.<\/p>\n\n\n\n<p>The Sense HAT module is read via the sense_hat library, as mentioned before. The node publishes every second, and the requirement is fulfilled.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"404\" height=\"134\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-62.png\" alt=\"\" class=\"wp-image-11343\" style=\"width:650px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-62.png 404w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-62-300x100.png 300w\" sizes=\"auto, (max-width: 404px) 100vw, 404px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"249\" height=\"76\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-65.png\" alt=\"\" class=\"wp-image-11347\" style=\"width:367px;height:auto\" \/><\/figure>\n\n\n\n<p>I also get readings from the other sensors.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">List after finishing this iteration<\/h4>\n\n\n\n<p>Again, thanks to the genius himself, Sander Pedersen, no task will ever be forgotten. This is great for structuring a single task. 
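<\/p>

<p>A minimal sketch of the message-building part of sensehat_node (the field names here are illustrative; the real schema lives in our node):<\/p>

```python
import json

def build_sensehat_msg(accel: dict, gyro: dict) -> str:
    """Build the JSON string that gets published as std_msgs/msg/String.data."""
    return json.dumps({"accelerometer": accel, "gyroscope": gyro})

# In the actual node, SenseHATPublisher(rclpy.node.Node) creates a publisher
# with create_publisher(String, <topic>, 10) and a 1.0 s timer (create_timer)
# whose callback reads the sense_hat library and publishes this string.
```

<p>Shipping JSON inside a String topic is quick to get going; a typed message (e.g. sensor_msgs\/Imu) would be the more ROS2-idiomatic next step.<\/p>

<p>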
Everything is quality approved. Mate p\u00e5 for more fun. The Sense HAT now works as a ROS2 node, publishing data in real time.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"626\" height=\"223\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-60.png\" alt=\"\" class=\"wp-image-11326\" style=\"width:650px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-60.png 626w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-60-300x107.png 300w\" sizes=\"auto, (max-width: 626px) 100vw, 626px\" \/><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\" \/>\n\n\n\n<h3 class=\"wp-block-heading\">Show GUI readings from Sense HAT<\/h3>\n\n\n\n<h4 class=\"wp-block-heading\">Check list &#8211; GUI Iteration 36<\/h4>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"630\" height=\"162\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-61.png\" alt=\"\" class=\"wp-image-11337\" style=\"width:650px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-61.png 630w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-61-300x77.png 300w\" sizes=\"auto, (max-width: 630px) 100vw, 630px\" \/><\/figure>\n\n\n\n<p>I now want the Sense HAT readings displayed in the GUI, so I will continue iterating on it. The GUI is starting to get very complex; it changes every week, and our systems are getting more and more complex as well. In this iteration I will first of all display the readings, and I want to make a whole sensor-readings page. If I have time, I will do one more thing this sprint, and that is to clean up the GUI. 
This will have its own paragraph, if I can find the time.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Designing the GUI<\/h4>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"546\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-63-1024x546.png\" alt=\"\" class=\"wp-image-11344\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-63-1024x546.png 1024w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-63-300x160.png 300w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-63-768x409.png 768w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-63-1536x819.png 1536w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-63.png 1593w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"343\" height=\"396\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-64.png\" alt=\"\" class=\"wp-image-11345\" style=\"width:443px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-64.png 343w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-64-260x300.png 260w\" sizes=\"auto, (max-width: 343px) 100vw, 343px\" \/><\/figure>\n\n\n\n<p>The acceleration and gyroscope labels are ready. The problem, however, is that I can't get readings between the Raspberry Pi and the WSL terminal. WSL runs behind NAT, so I can't get communication between the RPi4 and the WSL virtual network. The solution will be to install Ubuntu desktop on my computer. I tried, oh how I tried; I spent a lot of time troubleshooting this, way too much. That was the end of this iteration. I'm out of time. The positive is that everything is ready for next week! 
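<\/p>

<p>For next time, a quick connectivity probe like this could have saved some of that troubleshooting (a sketch; note that ROS2's DDS discovery uses UDP multicast, so a successful TCP connect only rules out basic reachability, it does not prove discovery will work):<\/p>

```python
import socket

def tcp_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
    """True if a TCP connection to host:port succeeds, e.g. probing the
    Pi's SSH port from inside WSL to see whether NAT is in the way."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

<p>If this fails from WSL but works from the Windows host, the WSL NAT is the culprit. 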
<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Checkbox &#8211; End of this Iteration<\/h4>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"631\" height=\"163\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-66.png\" alt=\"\" class=\"wp-image-11349\" style=\"width:650px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-66.png 631w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-66-300x77.png 300w\" sizes=\"auto, (max-width: 631px) 100vw, 631px\" \/><\/figure>\n\n\n\n<p>End of iteration. It's not completely done; I checked some boxes that I shouldn't have checked. The problem is that WSL version 2 runs with NAT, so there is no local communication. The solution will be to install Ubuntu desktop on my computer. The GUI itself is now ready. All in all a great week. <\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Next week:<\/h4>\n\n\n\n<p>The number one priority will be to continue the work toward a fully autonomous car. The Sense HAT is now a ROS2 node, and we are ready.<\/p>\n\n\n\n<p>I will try to remove the screen from my laptop and use the terminal without being able to see it. I will also smash my mousepad and only use the keyboard. I want to go full topless in the terminal. <\/p>\n\n\n\n<p>I tried this prompt in the terminal:<\/p>\n\n\n\n<p><code>sudo apt updt dtbs curl -m \"make car design in hardware, and connect all the sensors to the car\"<\/code> reabase origin git stash pop<\/p>\n\n\n\n<p>Nothing happened. The car did not get a chassis. I still believe this will be possible some day, but I can't spend too much time fixing that prompt at the moment; I will have to breathe curl commands first. We must talk to Richard and ask if he can help us with our design and vision. He said yes long ago; the ball is still in our court. We will get to it next week. It is muy importante, to say the least. 
Highly prioritized quest.<\/p>\n\n\n\n<p>We also have some problems regarding the power supply for our car. We will speak to Steven about this.<\/p>\n\n\n\n<p>We will have a group meeting, set up a new backlog, and continue to mate p\u00e5.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\" \/>\n\n\n\n<h2 class=\"wp-block-heading\">Sander:<\/h2>\n\n\n\n<p>This week I did some testing to see if I could get the rover to drive wirelessly using two power banks. The Raspberry Pi 4 works great with the power bank, but the motor driver card did not work as I hoped. This is due to the starting power draw of the 4 motors being too high for the power bank to handle. This was what I feared, but I had to try it. I tried some workarounds like starting the motors one at a time, but this did not work either. I tried different delays on the motors and different starting speeds, but nothing worked. I also had a problem with the power banks turning off due to low power draw when idle; I tried to fix this by sending a PWM signal that pulses one wheel every 5 seconds. I spent a bit too much time trying to get this to work. 
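<\/p>

<p>The staggered ramp-up workaround can be sketched as pure scheduling logic (a simplified sketch of the idea only; the duty-cycle numbers are made up, and the real code in the screenshots drives the motor board's PWM):<\/p>

```python
def staggered_ramp(n_motors=4, start=20, target=100, step=20):
    """Per-motor duty-cycle schedule that brings the motors up one at a
    time, so the inrush current never hits the power bank all at once.
    Returns a list of (motor_index, duty_percent) steps in order."""
    schedule = []
    for m in range(n_motors):
        duty = start
        while duty < target:
            schedule.append((m, duty))
            duty += step
        schedule.append((m, target))
    return schedule
```

<p>In practice even this was not gentle enough for the power banks, as described above. 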
<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"536\" height=\"288\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/powerbankkdoe.png\" alt=\"\" class=\"wp-image-11405\" style=\"width:385px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/powerbankkdoe.png 536w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/powerbankkdoe-300x161.png 300w\" sizes=\"auto, (max-width: 536px) 100vw, 536px\" \/><\/figure>\n\n\n\n<p>This is the code I tried in order to keep the power bank from turning off.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"776\" height=\"252\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/koderamp.png\" alt=\"\" class=\"wp-image-11406\" style=\"width:530px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/koderamp.png 776w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/koderamp-300x97.png 300w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/koderamp-768x249.png 768w\" sizes=\"auto, (max-width: 776px) 100vw, 776px\" \/><\/figure>\n\n\n\n<p>This is the code I tried to make the motors ramp up individually, but it didn&#8217;t work.<\/p>\n\n\n\n<p>The next step now is to get a battery pack made for this. In the meantime, I will just use a 3-meter cable for the RPi4 and use the Pi to provide power to the motor board, as seen in the picture below. 
The reason I wanted the power banks to work was that it would make testing the LiDAR much simpler, without a cable hanging from the car.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"768\" height=\"1024\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/demobilde-768x1024.jpg\" alt=\"\" class=\"wp-image-11404\" style=\"width:275px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/demobilde-768x1024.jpg 768w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/demobilde-225x300.jpg 225w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/demobilde-1152x1536.jpg 1152w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/demobilde-1536x2048.jpg 1536w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/demobilde-scaled.jpg 1920w\" sizes=\"auto, (max-width: 768px) 100vw, 768px\" \/><\/figure>\n\n\n\n<p>This is the demo setup I have been running while getting the LiDAR to work and testing it on a moving car. Next week will be fully dedicated to making the car autonomous: the plan is to use the LiDAR for path planning and mapping, while the ultrasonic code that August got working this week will function as a fail-safe that blocks the car from crashing into an object. <\/p>\n\n\n\n<p>I also suddenly had some difficulties getting the joystick to work again this week and spent some time debugging it, because the teleop_twist_joy node doesn\u2019t work as it should on my new computer. This ended with me cleaning up the workspace, as there were a lot of junk files left over from building and testing the code. I also created a \u201ctasks.json\u201d file to run tasks faster in VS Code. 
I also improved my development setup for the RPi4. Instead of having to SSH to the Pi and write code in the terminal, I can now write code directly from VS Code using the Remote SSH extension. This means I can open the remote workspace on my PC in VS Code, which makes it much easier to keep things organized and made my workflow more efficient. <\/p>\n\n\n\n<p>The backlog was also updated this week, and we did a bit of brainstorming.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Lidar<\/h3>\n\n\n\n<figure class=\"wp-block-video\"><video controls src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/IMG_2886.mov\"><\/video><\/figure>\n\n\n\n<p>The LiDAR is mounted on the car and publishes to the Raspberry Pi, which I am connected to with SSH from my computer over Wi-Fi. The plan is for the Pi to be a Wi-Fi hotspot for us to connect to; this is not optimal, but it is easier when testing at school. The next step for mapping and processing this data is to use the data from the Sense HAT that August has worked on to create a dynamic transform for the laser_scan, so that the map stays static while the rover moves; right now the whole scan rotates with the car. I will do more research on this in the upcoming week. 
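<\/p>

<p>The dynamic-transform idea in miniature: rotate each scan point by the IMU yaw so obstacles land in a static frame instead of spinning with the car (a sketch; the real thing will be a ROS2 tf2 transform, and a translation from odometry would be added the same way):<\/p>

```python
import math

def scan_to_map_frame(ranges, angle_min, angle_increment, yaw):
    """Project LaserScan-style ranges into a static frame, given the
    rover's current yaw (radians) from the IMU. Returns (x, y) points."""
    points = []
    for i, r in enumerate(ranges):
        if not math.isfinite(r):
            continue  # skip inf/NaN returns
        a = angle_min + i * angle_increment + yaw  # beam angle + rover heading
        points.append((r * math.cos(a), r * math.sin(a)))
    return points
```

<p>This is exactly why IMU drift matters: a slowly drifting yaw smears the whole map. 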
We will also have to look into ways to mitigate IMU drift.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"458\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-85-1024x458.png\" alt=\"\" class=\"wp-image-11447\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-85-1024x458.png 1024w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-85-300x134.png 300w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-85-768x344.png 768w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-85.png 1379w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"239\" height=\"299\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/10\/image-68.png\" alt=\"\" class=\"wp-image-11409\" \/><\/figure>\n\n\n\n<p>Here we can see that all the topics are visible from the computer using ROS2 DDS.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Next week:<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Research lidar data processing\/SLAM (slam_toolbox)<\/li>\n\n\n\n<li>Look into IMU drift<\/li>\n\n\n\n<li>Transforms in ROS2<\/li>\n\n\n\n<li>Navigating using lidar data<\/li>\n<\/ul>\n\n\n\n<p>I might first try using the LiDAR without a transform to test collision avoidance for &#8220;dumb&#8221; autonomous driving, and then try using a transform with the IMU so that obstacles are mapped and remembered for path planning. 
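<\/p>

<p>The transform-free, &#8220;dumb&#8221; collision avoidance could look something like this (a sketch; the sector width and stop distance are placeholders to be tuned on the car):<\/p>

```python
import math

def blocked_ahead(ranges, angle_min, angle_increment,
                  sector_rad=math.radians(30), stop_dist=0.4):
    """Fail-safe check on a LaserScan-style sweep: True if any valid
    return within +/- sector_rad of straight ahead (angle 0, the car's
    driving direction) is closer than stop_dist metres."""
    for i, r in enumerate(ranges):
        a = angle_min + i * angle_increment
        if abs(a) <= sector_rad and math.isfinite(r) and 0.0 < r < stop_dist:
            return True
    return False
```

<p>This needs no map and no transform, which is why it works as a first step and later as a fail-safe under the planner. 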
<\/p>\n\n\n\n<p><\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-wide\" \/>\n\n\n\n<p><\/p>\n\n\n\n<p><strong>Oliver<\/strong><\/p>\n\n\n\n<p>This week, I was unfortunately unable to work due to health-related reasons. However, I did look into and order a replacement belt for the LiDAR, and I am now waiting for it to arrive.<\/p>\n\n\n\n<p><\/p>\n","protected":false},"excerpt":{"rendered":"<p>August: We are back on the horeseback. This week we have started to make the car autonomous to the max. In our first iteration we will make it dumb autonomous. What I mean with this, is that we will make the car drive forward, and stop driving when the Ultrasonic sensor detects obstacles closer to [&hellip;]<\/p>\n","protected":false},"author":115,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-11255","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"_links":{"self":[{"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=\/wp\/v2\/posts\/11255","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=\/wp\/v2\/users\/115"}],"replies":[{"embeddable":true,"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=11255"}],"version-history":[{"count":96,"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=\/wp\/v2\/posts\/11255\/revisions"}],"predecessor-version":[{"id":13412,"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=\/wp\/v2\/posts\/11255\/revisions\/13412"}],"wp:attachment":[{"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=11255"}],"wp:term":[{"ta
xonomy":"category","embeddable":true,"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=11255"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=11255"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}