{"id":3166,"date":"2019-09-12T22:37:32","date_gmt":"2019-09-12T21:37:32","guid":{"rendered":"http:\/\/dronesonen.hibu.no\/?p=3166"},"modified":"2019-12-05T08:22:15","modified_gmt":"2019-12-05T07:22:15","slug":"3166","status":"publish","type":"post","link":"https:\/\/dronesonen.usn.no\/?p=3166","title":{"rendered":"Haptix \u2013 Week 37"},"content":{"rendered":"\n<p>Virtual\u00a0meeting\u00a0at 10:00 AM\u00a0on Thursday\u00a0through\u00a0Microsoft Teams.\u00a0The\u00a0group\u00a0decided\u00a0to\u00a0split\u00a0up\u00a0into\u00a0a\u00a0computer and\u00a0electro\u00a0party for\u00a0this\u00a0week.\u00a0The\u00a0computer students\u00a0will focus on\u00a0the setup of\u00a0a Unity environment for the\u00a0VR,\u00a0while also\u00a0learning\u00a0how we can take advantage of\u00a0the software.\u00a0<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"637\" src=\"http:\/\/dronesonen.usn.no\/wp-content\/uploads\/2019\/09\/image-16-1024x637.png\" alt=\"\" class=\"wp-image-3167\" srcset=\"https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2019\/09\/image-16.png 1024w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2019\/09\/image-16-300x187.png 300w, https:\/\/dronesonen.usn.no\/wp-content\/uploads\/2019\/09\/image-16-768x478.png 768w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>The Unity application will be companied with&nbsp;<a href=\"https:\/\/www.vive.com\/us\/\" target=\"_blank\" rel=\"noreferrer noopener\">HTC Vive<\/a>,&nbsp;<a href=\"https:\/\/www.leapmotion.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">Leap Motion<\/a>&nbsp;tracking&nbsp;and the Haptic gloves.&nbsp;Something which will be connected in a later session.&nbsp;But for now,&nbsp;the computer students&nbsp;will have to make a foundation for how we are going to connect all these technologies together.&nbsp;<\/p>\n\n\n\n<p>We have so far had the opportunity to borrow a Leap Motion device and HTC&nbsp;Vive&nbsp;from the University.&nbsp;So, one of the first challenges we meet&nbsp;this week&nbsp;is to get those&nbsp;set up.&nbsp;However,&nbsp;it can be noted that the&nbsp;computer students&nbsp;do not have to focus solely on the hardware&nbsp;in order to make progress.&nbsp;Since Unity has an excellent&nbsp;way to implement the VR device, the computer students can already start to outline&nbsp;an&nbsp;application.&nbsp;Which can include the scenes, user interface and interactions.&nbsp;<\/p>\n\n\n\n<p>When it comes to the electro students, then the task is to design and make&nbsp;two functional Haptic gloves.&nbsp;The purpose of the gloves is to feel&nbsp;(in the physical world)&nbsp;when you touch&nbsp;something&nbsp;inside the Unity application. To achieve this, the gloves must give&nbsp;some&nbsp;sort of&nbsp;feedback to the hands when something in&nbsp;the&nbsp;application&nbsp;is touched. This is made possible by having some sort of tiny vibration motors at the tip of each finger.&nbsp;Which is&nbsp;then again&nbsp;controlled by a microcontroller&nbsp;that&nbsp;constantly listens to the position&nbsp;of the hands&nbsp;in&nbsp;Unity.&nbsp;&nbsp;<\/p>\n\n\n\n<p>The current solution to see the hands, and correspond its position in VR, is to use a&nbsp;Leap&nbsp;Motion camera that traces the&nbsp;position of the&nbsp;hands. 
The current solution for seeing the hands, and mapping their position into VR, is to use a Leap Motion camera that tracks the position of the hands. The need for the Leap Motion camera would be eliminated if the gloves themselves could track the position of the hands and fingers, which is theoretically possible with sensors placed in some or all of the joints of the glove.

[Figure: https://dronesonen.usn.no/wp-content/uploads/2019/09/image-17.png]

For the next week, the group should take full advantage of the GitHub repository for the project (https://github.com/Danielskry/Haptix). With GitHub we will be able to track the technical progress better and use a Kanban/Scrum sprint system, which is important now that we have done our research and are ready to start development at full scale.

Daniel:

Went into Unity to explore the possibilities we have for creating a VR application, specifically the part about creating a scene and a user interface. A goal for the scene is to create a terrain with textures and materials that look appealing. For example, one could create a fantasy terrain that locks the player into a small section of the scene and then place the interaction in that section. That way, the player gets to experience an interesting environment while also being limited in movement. The reason for this is simply to draw attention to the interaction rather than to the player's movement.

[Figure: https://dronesonen.usn.no/wp-content/uploads/2019/09/image-2.gif]

The image above shows a white box (the player) and a Unity scene with a fantasy terrain. The player is locked onto the circular platform and cannot move outside it, since everything outside the platform is lava and spiky mountains. This might be an ideal environment for the player, because we then have the advantage of placing the interaction (e.g. a Rubik's cube) on the platform, which should immediately draw attention to the main objective of the application. As a small side note, adding a simple sprite to the scene should also remove the unpleasant greyish background seen in the image.

While creating this scene I also wrote two small scripts, one for the camera (first-person) and one for the player, so that one can easily move around, look at and interact with the environment. I did this intentionally to leave an opportunity to interact with the environment before the VR device and haptic gloves are implemented. The textures and materials seen in the image were imported from the Unity Asset Store as free assets; fortunately for us, Unity has a great selection of high-quality assets that are free.
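The two scripts themselves are not shown in the post, so the following is only a rough sketch of what a combined first-person camera and player controller of that kind could look like; the class name and values are illustrative.

```csharp
using UnityEngine;

// Minimal sketch of a first-person controller of the kind described above:
// WASD movement for the player capsule and mouse look for the camera.
[RequireComponent(typeof(CharacterController))]
public class SimpleFirstPerson : MonoBehaviour
{
    public Transform playerCamera;   // child camera transform
    public float moveSpeed = 4f;
    public float lookSensitivity = 2f;

    private CharacterController controller;
    private float pitch;             // accumulated up/down camera rotation

    void Start()
    {
        controller = GetComponent<CharacterController>();
        Cursor.lockState = CursorLockMode.Locked;
    }

    void Update()
    {
        // Mouse look: yaw rotates the body, pitch tilts only the camera.
        float yaw = Input.GetAxis("Mouse X") * lookSensitivity;
        pitch = Mathf.Clamp(pitch - Input.GetAxis("Mouse Y") * lookSensitivity, -80f, 80f);
        transform.Rotate(0f, yaw, 0f);
        playerCamera.localRotation = Quaternion.Euler(pitch, 0f, 0f);

        // WASD / arrow-key movement relative to where the player is facing.
        Vector3 input = new Vector3(Input.GetAxis("Horizontal"), 0f, Input.GetAxis("Vertical"));
        controller.SimpleMove(transform.TransformDirection(input) * moveSpeed);
    }
}
```

Attaching something like this to a capsule with the main camera as a child is enough to walk around the platform and inspect the scene before any VR hardware is hooked up.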
A user interface for the application should also serve some purpose: with a user interface, one has the option to customize the application's settings and to choose an interaction and/or scene. The latter, though, is more about aesthetics than anything else.

William:

Worked with Unity, testing and exploring effects, visualization and animations. Checked out several tutorials and ideas that our group can potentially implement in our system. Next week I will continue working with Unity and C# to gain more knowledge and figure out what we can potentially implement or create.

Tom Erik:

Had a meeting with Herman at the university, where we drew up a rough layout of the sensor and feedback motor positions. We also researched different IMUs and initially settled on placing one IMU on each hand.

Further research went into different kinds of sensors and into ways of giving the user feedback from VR space through different means.

We are looking at different technologies for tracking the position of the hands. So far, our options are:

- Visual camera tracking, e.g. Leap Motion
- Echolocation (very unlikely)
- Several IMUs connected to the fingers
- Stretch sensor / strain gauge
- IR tracking
- 3G grid or directional

Each method has its drawbacks, which we will spend more time looking into in the coming week.

For feedback, we are considering whether to use vibration or to create a pushing sensation somehow, perhaps with a solenoid.

Even:

Started development of the UML diagrams:

- Use case (complete)
- Sequence diagram (not finished)

Petter:

Created a prototype of a Rubik's cube in Unity and discussed a potential solution for how the cube is going to interact with the glove,
while also thinking about how to implement the programming side of the interaction. I also started setting up the Leap Motion device that we received earlier on Thursday.

[Figure: https://dronesonen.usn.no/wp-content/uploads/2019/09/image-18.png]

As a brief explanation: despite its many combinations, the Rubik's cube has very few options when it comes to movement. By rotating one corner, the rest of that section rotates either horizontally or vertically.

For example, if I rotate the top right of the green face, the vertical section that cube piece belongs to reacts accordingly (shown in the illustration below).

[Figure: https://dronesonen.usn.no/wp-content/uploads/2019/09/image-19.png]

We could also add movement on the middle pieces, but it is not necessary for solving the cube.

This means that the only movement we need to script is this single one; from there we can apply it to every corner of the cube. However, there is one problem with this: in real life it works because the rest of the cube is held fast by the other hand. One solution that avoids this is to take the vector of the movement and apply that direction to the cube, but this also assumes that there is only a small difference in the angle at which people normally rotate. We need to test the cube in VR to see how it reacts.
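As an illustration of that single movement, here is one way it could be scripted in Unity: collect the small cubies that sit on the chosen layer and rotate them 90 degrees around an axis through the cube's centre. The prototype's actual structure is not shown in the post, so this sketch assumes each cubie is a child transform of the cube root, spaced one unit apart.

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch: rotate one layer of the cube 90 degrees.
public class RubiksCube : MonoBehaviour
{
    public float turnSpeed = 180f;   // degrees per second

    // Rotate the layer selected by 'axis' (e.g. Vector3.up) and 'layer'
    // (-1, 0 or +1 for a 3x3 cube) by 90 degrees in the given direction.
    public IEnumerator RotateLayer(Vector3 axis, int layer, bool clockwise)
    {
        var moving = new List<Transform>();
        foreach (Transform cubie in transform)
        {
            // Pick the cubies sitting on the chosen layer along the axis.
            float distance = Vector3.Dot(cubie.localPosition, axis);
            if (Mathf.RoundToInt(distance) == layer)
                moving.Add(cubie);
        }

        float remaining = 90f;
        float sign = clockwise ? 1f : -1f;
        while (remaining > 0f)
        {
            float step = Mathf.Min(turnSpeed * Time.deltaTime, remaining);
            foreach (Transform cubie in moving)
                cubie.RotateAround(transform.position, transform.TransformDirection(axis), sign * step);
            remaining -= step;
            yield return null;
        }
    }
}
```

A turn triggered by the glove or controller would then just be `StartCoroutine(cube.RotateLayer(Vector3.up, 1, true));` with the axis and layer chosen from which corner the hand grabbed.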
Herman:

- Researched different IMUs
- Researched circuit design and Arduino code for the most basic features
- Roughly mapped out how to implement wireless communication through WLAN or BLE
- Roughly sketched out the component layout for the gloves

Next week:

- Research how the current VR handheld controllers work and interact with the VR sensors.
- Acquire and test an IMU.
- Prepare a piece of code for the IoT possibility, through either wireless or wired communication (a rough sketch of the wireless option follows below).
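Since the choice between WLAN, BLE and a wired link has not been made yet, the following is only a minimal sketch of one of the options Herman mapped out: streaming the tracked hand position from Unity to the glove's microcontroller as UDP datagrams over WLAN. The address, port and message format are placeholders.

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Hedged sketch of the WLAN option: Unity streams the tracked hand position
// to the glove's microcontroller as small UDP datagrams.
public class GloveLink : MonoBehaviour
{
    public Transform trackedHand;                 // e.g. the Leap Motion hand object
    public string gloveAddress = "192.168.4.1";   // assumed address of the glove's Wi-Fi module
    public int glovePort = 4210;                  // assumed port

    private UdpClient udp;
    private IPEndPoint endpoint;

    void Start()
    {
        udp = new UdpClient();
        endpoint = new IPEndPoint(IPAddress.Parse(gloveAddress), glovePort);
    }

    void FixedUpdate()
    {
        // Send "x;y;z" at the physics rate; the microcontroller parses it
        // and decides whether any fingertip motor should vibrate.
        Vector3 p = trackedHand.position;
        byte[] msg = Encoding.ASCII.GetBytes($"{p.x:F3};{p.y:F3};{p.z:F3}");
        udp.Send(msg, msg.Length, endpoint);
    }

    void OnDestroy()
    {
        udp?.Close();
    }
}
```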