{"id":10882,"date":"2025-09-22T14:24:52","date_gmt":"2025-09-22T13:24:52","guid":{"rendered":"https:\/\/dronesonen.usn.no\/?p=10882"},"modified":"2025-12-08T17:04:38","modified_gmt":"2025-12-08T16:04:38","slug":"week-5-astrorover","status":"publish","type":"post","link":"https:\/\/dronesonen.usn.no\/?p=10882","title":{"rendered":"WEEK 5 &#8211; Astrorover"},"content":{"rendered":"\n<p><strong>AUGUST:<\/strong><\/p>\n\n\n\n<p><strong>NightVision Camera:<\/strong><\/p>\n\n\n\n<p>Over the past week, I spent far too many hours trying to get a Raspberry Pi 4 running Ubuntu 22.04 to successfully capture an image with a ZeroCam NoIR IR camera module. The process turned out to be far more challenging than expected, and I went through several iterations before finding a working solution.<\/p>\n\n\n\n<p>At first, I attempted to integrate the camera as a ROS 2 node using ninja and colcon, but quickly realized it was not a viable path for this setup. I tried several times, spending multiple hours just waiting for colcon to build the ROS 2 workspace. 
The breakthrough came when I installed raspi-config by running sudo apt install raspi-config, launched the Raspberry Pi configuration tool by running sudo raspi-config, and explicitly enabled the camera module interface. This was really a nightmare, but a great feeling to finally figure it out.<\/p>\n\n\n\n<p>Once the camera was recognized by the system, I switched to OpenCV (cv2) for image capture and streaming. With that, I was finally able to capture images directly from the Pi. First picture, what a relief:<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"636\" height=\"476\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-64.png\" alt=\"\" class=\"wp-image-10883\" style=\"width:488px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-64.png 636w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-64-300x225.png 300w\" sizes=\"auto, (max-width: 636px) 100vw, 636px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"333\" height=\"512\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-65.png\" alt=\"\" class=\"wp-image-10884\" style=\"width:244px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-65.png 333w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-65-195x300.png 195w\" sizes=\"auto, (max-width: 333px) 100vw, 333px\" \/><\/figure>\n\n\n\n<p><strong>Flask server:<\/strong><\/p>\n\n\n\n<p>This week, I expanded the system\u2019s API 
to include several new routes that provide direct control over the camera module and allow external clients to monitor its status and stream live video. These additions make the system more modular, accessible, and easier to integrate with the GUI and other components.<\/p>\n\n\n\n<p>\/api\/camera\/on: Activates the camera and updates its status.<\/p>\n\n\n\n<p>\/api\/camera\/off: Deactivates the camera and releases its resources.<\/p>\n\n\n\n<p>\/api\/camera\/: Returns the current status of the camera (active\/inactive).<\/p>\n\n\n\n<p>\/api\/camera\/capture: Captures an image and stores it in the predefined directory.<\/p>\n\n\n\n<p>\/video_feed: Streams live video from the camera, which is embedded into the GUI or accessed via a browser.<\/p>\n\n\n\n<p>\/api\/history: Returns the blockchain; from any client, browse to the Pi\u2019s IP address and port followed by \/api\/history to view it.<\/p>\n\n\n\n<p><strong>GUI:<\/strong><\/p>\n\n\n\n<p>Thanks to the camera, I could extend the GUI by streaming a live video feed into the video feed placeholder. The GUI application runs on my PC, which connects to the Pi over SSH. The Arduino is connected to the Pi by USB. 
The GUI has been expanded to support several new features:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Image capture: the user can trigger the camera from the GUI, and the images are automatically stored in a predefined directory.<\/li>\n\n\n\n<li>Camera control: the GUI provides buttons to turn the camera on and off, updating the status display accordingly.<\/li>\n\n\n\n<li>Alarm integration: if the alarm system is triggered, an email notification is sent along with a newly captured image of the intruder as an attachment.<\/li>\n\n\n\n<li>Future extensions: the GUI has placeholders prepared for car control and autonomous driving status. These will let the user choose between controlling the car with a joystick and letting it drive autonomously, along with a label showing the current driving status.<\/li>\n<\/ul>\n\n\n\n<p>I also tried to implement a flash every time the user captures an image, but the GUI always crashed after capturing with flash. I decided to drop the flash and instead print in the GUI\u2019s terminal that the picture has been captured. 
By doing this, the GUI no longer crashes, but it still lets the user know that an image has been taken and acknowledges this.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"701\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-67-1024x701.png\" alt=\"\" class=\"wp-image-10886\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-67-1024x701.png 1024w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-67-300x205.png 300w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-67-768x526.png 768w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-67.png 1279w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"521\" height=\"337\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-68.png\" alt=\"\" class=\"wp-image-10887\" style=\"width:650px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-68.png 521w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-68-300x194.png 300w\" sizes=\"auto, (max-width: 521px) 100vw, 521px\" \/><\/figure>\n\n\n\n<p><strong>Blockchain:<\/strong><\/p>\n\n\n\n<p>To make the alarm system credible and secure, I implemented a blockchain. This allows the system to store each alarm event in an immutable chain, making it possible to verify the authenticity of captured images and sensor data over time. Each block contains a timestamp, the system mode (ALARM), temperature and humidity readings, PIR sensor status, and a SHA256 hash of the captured image.<\/p>\n\n\n\n<p>The blockchain is implemented using a custom Python module with two core classes: <code>Block<\/code> and <code>Blockchain<\/code>. 
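A minimal sketch of what such a module might look like, built only on the standard library (the exact field names, the `record_alarm` helper, and method signatures are my assumptions based on the description, not the project's actual code):

```python
# Sketch of the Block / Blockchain classes described in the post.
# Field names, record_alarm(), and save() are illustrative assumptions.
import hashlib
import json
import time


class Block:
    def __init__(self, index, timestamp, data, previous_hash):
        self.index = index
        self.timestamp = timestamp
        self.data = data                  # e.g. sensor readings + image hash
        self.previous_hash = previous_hash
        self.hash = self.compute_hash()

    def compute_hash(self):
        # Hash this block's contents plus the previous hash, chaining blocks:
        # changing any earlier block invalidates every later hash.
        payload = json.dumps(
            {"index": self.index, "timestamp": self.timestamp,
             "data": self.data, "previous_hash": self.previous_hash},
            sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()


class Blockchain:
    def __init__(self):
        # Start the chain with a genesis block.
        self.chain = [Block(0, time.time(), {"genesis": True}, "0")]

    def add_block(self, data):
        prev = self.chain[-1]
        block = Block(prev.index + 1, time.time(), data, prev.hash)
        self.chain.append(block)
        return block

    def save(self, path="chain.json"):
        # Serialize the whole chain for persistence and later verification.
        with open(path, "w") as f:
            json.dump([vars(b) for b in self.chain], f, indent=2)


def record_alarm(chain, image_bytes, temperature, humidity, pir_triggered):
    # When the alarm fires: hash the captured image and append the event.
    return chain.add_block({
        "mode": "ALARM",
        "temperature": temperature,
        "humidity": humidity,
        "pir": pir_triggered,
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
    })
```

Hashing the image file rather than storing it in the chain keeps chain.json small while still letting anyone verify that a stored image matches the block that recorded it.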
The <code>Block<\/code> class defines the structure of each block, including its index, timestamp, data payload, previous hash, and its own hash. The <code>Blockchain<\/code> class manages the chain, starting with a genesis block and appending new blocks as events occur.<\/p>\n\n\n\n<p>When an alarm is triggered, the system captures an image, calculates its hash using Python\u2019s hashlib library, and packages the relevant sensor data into a dictionary. This dictionary is then passed to the <code>add_block()<\/code> function, which creates a new block and appends it to the chain. The entire chain is serialized and saved to a chain.json file for persistence and future verification.<\/p>\n\n\n\n<p><strong>Arduino and sensor system:<\/strong><\/p>\n\n\n\n<p>I have created a wiring diagram for the Arduino, so it will be easy to integrate the sensor system\u2019s hardware into the car. I have iterated on the sensor code in C++ and cleaned it up, and it now follows best practice throughout. The sensor system works as intended.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"856\" height=\"798\" src=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-66.png\" alt=\"\" class=\"wp-image-10885\" style=\"width:517px;height:auto\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-66.png 856w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-66-300x280.png 300w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2025\/09\/image-66-768x716.png 768w\" sizes=\"auto, (max-width: 856px) 100vw, 856px\" \/><\/figure>\n\n\n\n<p><strong>Plan for further development<\/strong><\/p>\n\n\n\n<p>Overall, the system is now much more robust and modular. 
The Raspberry Pi handles the camera feed and alarm logic, while the GUI offers intuitive controls and feedback. With OOP-compliant Arduino code and a scalable GUI, the platform is ready for the next stage: integrating autonomous driving and controller-based driving.<\/p>\n\n\n\n<p><strong>Sander<\/strong><\/p>\n\n\n\n<p>This week I started exploring how to make the rover move autonomously, without direct joystick input. The idea was to use the lidar sensor to detect obstacles and then let a new ROS 2 \u201cautomove\u201d node make decisions about movement. Since the lidar we got is not working, I had to do some high-level planning of the logic behind the autoMove node without hands-on access to the hardware. The lidar we have is missing the belt that spins the sensor. If we can\u2019t fix this during the next few days, we will order a new lidar made specifically for ROS 2 and the Pi, as this is an important component for the further development of the rover\u2019s core capabilities.<\/p>\n\n\n\n<p>LIDAR as Input<\/p>\n\n\n\n<p>The lidar provides a 360\u00b0 view around the rover in 2D, publishing data on the \/scan topic as a sensor_msgs\/LaserScan. Each message contains hundreds of distance points, which essentially give the robot a &#8220;radar-like&#8221; perception of its surroundings. This can be visualized in RViz once we get the lidar, and later the point cloud can be displayed in the GUI.<\/p>\n\n\n\n<p>Autopilot Node Concept<\/p>\n\n\n\n<p>I designed the architecture for an autopilot node that subscribes to \/scan, processes the data, and decides on safe motion commands. At a high level, the logic works like this:<\/p>\n\n\n\n<p>Split the lidar data into sectors, simplifying the decision-making process in the autoMove node. 
The ultrasonic sensor can also serve as a safety measure, detecting obstacles that the lidar might miss. Mounted at the front of the rover, its code can run directly on the micro:bit that handles motor control, ensuring that an object in front of the rover stops it from moving forward independently of the commands coming from the Pi.<\/p>\n\n\n\n<p>Check the minimum distance in each sector.<\/p>\n\n\n\n<p>If the front is clear -&gt; move forward.<\/p>\n\n\n\n<p>If the front is blocked -&gt; turn towards the side with more space.<\/p>\n\n\n\n<p>If all sides are blocked -&gt; stop.<\/p>\n\n\n\n<p>The node would then publish its decisions as geometry_msgs\/Twist on \/cmd_vel, in the same way the joystick teleop node does.<\/p>\n\n\n\n<p>Integration with Existing Pipeline<\/p>\n\n\n\n<p>This approach fits neatly into the current setup:<\/p>\n\n\n\n<p>The joystick (teleop_twist_joy) and the autopilot both publish to \/cmd_vel.<\/p>\n\n\n\n<p>The rover\u2019s motor driver node listens to \/cmd_vel and forwards commands to the micro:bit motor controller.<\/p>\n\n\n\n<p>One of the plans is to have a button on the joystick toggle between autonomous and manual mode, with the current mode also displayed in the GUI.<\/p>\n\n\n\n<p>Next Steps<\/p>\n\n\n\n<p>Hands-on testing with the lidar.<\/p>\n\n\n\n<p>Implement and test the autopilot node in ROS 2 with real lidar data.<\/p>\n\n\n\n<p>Tune the sector definitions and thresholds for smoother decision making.<\/p>\n\n\n\n<p>Experiment with scaling speed based on distance to obstacles.<\/p>\n\n\n\n<p>This week was about laying the foundation for autonomy. While the rover can already be teleoperated smoothly, the next phase will let it navigate and avoid obstacles on its own using the lidar. 
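The sector logic planned above could be sketched roughly as a pure decision function (the sector boundaries, the safety threshold, and the velocity values are my placeholder assumptions; the real node would wrap this in an rclpy subscriber on \/scan and publish the result as a geometry_msgs\/Twist on \/cmd_vel):

```python
# Rough sketch of the autopilot's sector-based decision logic.
# SAFE_DISTANCE and the three-sector split are untuned assumptions.
SAFE_DISTANCE = 0.5  # metres; below this a sector counts as blocked


def split_into_sectors(ranges):
    """Split one scan's list of distances into right, front, left thirds."""
    n = len(ranges)
    third = n // 3
    return {
        "right": ranges[:third],
        "front": ranges[third:2 * third],
        "left": ranges[2 * third:],
    }


def decide(ranges):
    """Return (linear, angular) velocity from one LaserScan's ranges."""
    sectors = split_into_sectors(ranges)
    # Minimum distance per sector; zero readings are invalid and ignored.
    mins = {name: min((r for r in rs if r > 0.0), default=float("inf"))
            for name, rs in sectors.items()}

    if mins["front"] > SAFE_DISTANCE:
        return 0.2, 0.0                      # front clear -> move forward
    if mins["left"] > SAFE_DISTANCE or mins["right"] > SAFE_DISTANCE:
        # front blocked -> turn towards the side with more space
        return 0.0, 0.5 if mins["left"] > mins["right"] else -0.5
    return 0.0, 0.0                          # all sides blocked -> stop
```

Keeping the decision step free of ROS imports like this would also make it easy to unit-test before the replacement lidar arrives.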
This is the main goal we need to work on next week, together with the presentations.<\/p>\n\n\n\n<p><strong>Sondre<\/strong><\/p>\n\n\n\n<p>This week I started researching alternative cameras after running into several issues trying to get the Pi Camera Module working on Ubuntu. I found a promising USB camera and decided to order it. The main reason I switched from CSI to USB was the simplicity of plug-and-play, which allows me to focus on integrating the camera with the ROS environment and setting up a camera node.<\/p>\n\n\n\n<p>After a few days of waiting, the camera finally arrived. I started experimenting with it but initially ran into several issues: the Raspberry Pi and my laptop couldn\u2019t see each other\u2019s topics in ROS 2, which meant the camera feed wasn\u2019t showing up. After some hours of trial and error, I tried switching from Cyclone DDS to Fast DDS. Finally the two devices started sharing topics and the camera feed published successfully. From there, I could stream video from the Raspberry Pi and view it on my laptop as a subscriber.<\/p>\n\n\n\n<p>In addition, I spent some time learning more about RViz and plan to test it further to visualize the camera data.<\/p>\n\n\n\n<p>The next step is to start working with OpenCV for video processing. My goal for next week is to have both a camera node and an image processing node up and running.<\/p>\n\n\n\n<p><strong>Oliver<\/strong><\/p>\n\n\n\n<p>I have unfortunately been unable to work during this period due to health-related reasons.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>AUGUST: NightVision Camera: Over the past week, I spent far too many hours trying to get a Raspberry Pi 4 running Ubuntu 22.04 to capture an image with a ZeroCam NoIR IR camera module. 
With that, I was finally able to capture images directly from the Pi. First picture, what a relief: Flask [&hellip;]<\/p>\n","protected":false},"author":115,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-10882","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"_links":{"self":[{"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=\/wp\/v2\/posts\/10882","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=\/wp\/v2\/users\/115"}],"replies":[{"embeddable":true,"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=10882"}],"version-history":[{"count":30,"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=\/wp\/v2\/posts\/10882\/revisions"}],"predecessor-version":[{"id":13390,"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=\/wp\/v2\/posts\/10882\/revisions\/13390"}],"wp:attachment":[{"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=10882"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=10882"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dronesonen.usn.no\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=10882"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}