{"id":33,"date":"2024-03-13T01:15:33","date_gmt":"2024-03-13T01:15:33","guid":{"rendered":"https:\/\/geoffzink.com\/?page_id=33"},"modified":"2024-03-18T04:49:26","modified_gmt":"2024-03-18T04:49:26","slug":"invoke","status":"publish","type":"page","link":"https:\/\/geoffzink.com\/?page_id=33","title":{"rendered":"Invoke"},"content":{"rendered":"\n<p>First of all &#8211; I must sincerely thank the highly talented Invoke team I worked with to realise the exploratory work described below:<\/p>\n\n\n\n<ul>\n<li><a href=\"https:\/\/www.linkedin.com\/in\/charles-bridger-9b182b132\/\">Charles Bridger<\/a> <em>Co-Founder<\/em><\/li>\n\n\n\n<li><a href=\"https:\/\/www.linkedin.com\/in\/akram-darwazeh\/\">Akram Darwazeh<\/a> <em>Lead Software Engineer<\/em><\/li>\n\n\n\n<li><a href=\"https:\/\/www.linkedin.com\/in\/lloydmcinnes\/\">Lloyd McInnes<\/a> <em>Software Engineer<\/em><\/li>\n\n\n\n<li><a href=\"https:\/\/www.linkedin.com\/in\/craig-vaz\/\">Craig Vaz<\/a> <em>Super Intern #1<\/em><\/li>\n\n\n\n<li><a href=\"https:\/\/www.linkedin.com\/in\/sammcgillicudy\/\">Sam McGillicudy<\/a> <em>Super Intern #2<\/em><\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1000\" height=\"500\" src=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Invoke-Badge-Large.png\" alt=\"\" class=\"wp-image-43\" style=\"width:311px;height:auto\" srcset=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Invoke-Badge-Large.png 1000w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Invoke-Badge-Large-300x150.png 300w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Invoke-Badge-Large-768x384.png 768w\" sizes=\"(max-width: 1000px) 100vw, 1000px\" \/><\/figure>\n\n\n\n<div class=\"wp-block-columns alignfull is-layout-flex wp-container-core-columns-layout-1 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<div 
class=\"wp-block-group has-global-padding is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-cover aligncenter wp-duotone-duotone-4\"><span aria-hidden=\"true\" class=\"wp-block-cover__background has-background-dim\"><\/span><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" class=\"wp-block-cover__image-background wp-image-46\" alt=\"\" src=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Invoke-Tracker-Hero-Image-1024x576.jpg\" data-object-fit=\"cover\" srcset=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Invoke-Tracker-Hero-Image-1024x576.jpg 1024w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Invoke-Tracker-Hero-Image-300x169.jpg 300w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Invoke-Tracker-Hero-Image-768x432.jpg 768w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Invoke-Tracker-Hero-Image-1536x864.jpg 1536w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Invoke-Tracker-Hero-Image-2048x1152.jpg 2048w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><div class=\"wp-block-cover__inner-container is-layout-flow wp-block-cover-is-layout-flow\">\n<p class=\"has-text-align-center has-contrast-color has-text-color has-link-color has-x-large-font-size wp-elements-288cebac44478aa185a3921450282817\"><strong><em><a href=\"#Tracker\">Tracker<\/a><\/em><\/strong><\/p>\n<\/div><\/div>\n<\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<div class=\"wp-block-cover aligncenter wp-duotone-duotone-4\"><span aria-hidden=\"true\" class=\"wp-block-cover__background has-background-dim\"><\/span><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"681\" class=\"wp-block-cover__image-background wp-image-47\" alt=\"\" src=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Shared-Use-Example-1024x681.png\" data-object-fit=\"cover\" 
srcset=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Shared-Use-Example-1024x681.png 1024w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Shared-Use-Example-300x200.png 300w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Shared-Use-Example-768x511.png 768w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Shared-Use-Example.png 1405w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><div class=\"wp-block-cover__inner-container is-layout-flow wp-block-cover-is-layout-flow\">\n<p class=\"has-text-align-center has-contrast-color has-text-color has-link-color has-x-large-font-size wp-elements-7494e30df5b55904366fac5b8c938167\"><strong><em><a href=\"#Portal\">Portal<\/a><\/em><\/strong><\/p>\n<\/div><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<div class=\"wp-block-cover aligncenter wp-duotone-duotone-4\"><span aria-hidden=\"true\" class=\"wp-block-cover__background has-background-dim\"><\/span><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"623\" class=\"wp-block-cover__image-background wp-image-48\" alt=\"\" src=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Screen-Shot-2024-03-13-at-3.31.30-PM-1024x623.png\" data-object-fit=\"cover\" srcset=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Screen-Shot-2024-03-13-at-3.31.30-PM-1024x623.png 1024w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Screen-Shot-2024-03-13-at-3.31.30-PM-300x183.png 300w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Screen-Shot-2024-03-13-at-3.31.30-PM-768x467.png 768w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Screen-Shot-2024-03-13-at-3.31.30-PM.png 1385w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><div class=\"wp-block-cover__inner-container is-layout-flow wp-block-cover-is-layout-flow\">\n<p class=\"has-text-align-center has-contrast-color has-text-color has-link-color has-x-large-font-size 
wp-elements-c80c538d0be68e112c60d4e45828f3a6\"><strong><em><a href=\"#Projection-Mapping\">Projection Mapping<\/a><\/em><\/strong><\/p>\n<\/div><\/div>\n<\/div>\n<\/div>\n\n\n\n<p>I am fascinated by the future of spatial computing. Following my studies in mechatronics engineering and commerce, I went on to co-found a startup called Invoke with my final year engineering project partner Charles Bridger.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\"  src=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Core-MR-Inputs-Diagram.svg\" alt=\"\" class=\"wp-image-51\"\/><\/figure>\n\n\n\n<p>Building on the proof-of-concept project, we developed a device for dedicated object tracking to supplement the user and spatial tracking systems already showcased on mixed reality hardware at the time.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\"  src=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Layers-of-Abstraction-Diagram.svg\" alt=\"\" class=\"wp-image-52\"\/><\/figure>\n\n\n\n<p>As users, we have learned to interact with virtual information in abstracted ways. Creating an intuitive interface for spatial computing depends heavily on the performance and reliability of the tracking systems. 
More recent <a href=\"http:\/\/youtube.com\/watch?v=OFvXuyITwBI\">reactions to the Apple Vision Pro&#8217;s eye tracking interface<\/a> show how natural interactions can overcome more cumbersome solutions.<\/p>\n\n\n\n<h2 class=\"wp-block-heading is-style-asterisk\" id=\"Tracker\">Tracker<\/h2>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Invoke: Mixed Reality Object Tracking\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/5gbb8SgrhNc?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<details class=\"wp-block-details is-layout-flow wp-block-details-is-layout-flow\"><summary>More about the Invoke Tracker<\/summary>\n<figure class=\"wp-block-image aligncenter size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"411\" height=\"348\" src=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Axes.png\" alt=\"\" class=\"wp-image-141\" srcset=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Axes.png 411w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Axes-300x254.png 300w\" sizes=\"(max-width: 411px) 100vw, 411px\" \/><\/figure>\n\n\n\n<p>The purpose of the Tracker is to allow a target physical object to become interactive in a mixed reality environment. 
The Tracker was designed to be mounted to the target object, and would send realtime 6DOF (X, Y, Z position and yaw, pitch, roll rotation) data for anchoring virtual objects or interfaces.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-medium is-resized\"><img decoding=\"async\"  src=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Tracker-Object-Splash.svg\" alt=\"\" class=\"wp-image-137\" style=\"width:378px;height:auto\"\/><\/figure>\n\n\n\n<p>For example, this could be used on a golf club, tennis racket or other sports equipment to allow you to practice your swing. With a mixed reality headset or display, you could hit a virtual ball, or analyse your swing and technique to improve your form. A great prototype example of this type of interaction is shown in this <a href=\"https:\/\/www.youtube.com\/watch?v=ziY5UVNKnas\">Table Tennis demo by Leap Motion<\/a> (now <a href=\"https:\/\/www.ultraleap.com\/\">UltraLeap<\/a>).<\/p>\n\n\n\n<p>Another application is music education. The popular video game <a href=\"https:\/\/en.wikipedia.org\/wiki\/Guitar_Hero\">Guitar Hero<\/a> uses a guitar style controller for players to play along to songs, although the simplification of the instrument (while accessible) means that these skills don&#8217;t translate to actually being able to play the instrument.<\/p>\n\n\n\n<p>The Tracker would allow a real instrument to become a virtual interface. This allows music data (e.g. a MIDI file) to be overlaid on the fretboard or finger positions, instead of sheet music or tablature, which is not as easily digested by new players.<\/p>\n<\/details>\n\n\n\n<details class=\"wp-block-details is-layout-flow wp-block-details-is-layout-flow\"><summary>How it Works<\/summary>\n<p>The tracking system was designed to be compatible with &#8220;Inside-Out&#8221; tracking systems. 
This is where sensors on the headset are used to run <a href=\"https:\/\/www.interaction-design.org\/literature\/topics\/slam#:~:text=SLAM%20is%20the%20foundation%20of,appears%20in%20the%20real%20world.\">SLAM<\/a> algorithms, allowing for precise anchored placement of virtual objects without the need for externally positioned base stations\/sensors. As a result, we chose to build an infrared LED constellation system, combining the results of computer vision with an inertial measurement unit (IMU) via statistical data fusion in real-time. This method is similar to <a href=\"https:\/\/developer.oculus.com\/blog\/tracking-technology-explained-led-matching\/\">how VR controllers are tracked relative to standalone headsets<\/a>.<\/p>\n\n\n\n<div class=\"wp-block-group has-global-padding is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-columns are-vertically-aligned-center has-contrast-background-color has-background is-layout-flex wp-container-core-columns-layout-2 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:100%\">\n<figure class=\"wp-block-image aligncenter is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"732\" src=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/System-Diagram-Wearable-1024x732.png\" alt=\"\" class=\"wp-image-138\" style=\"width:537px;height:auto\" srcset=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/System-Diagram-Wearable-1024x732.png 1024w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/System-Diagram-Wearable-300x214.png 300w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/System-Diagram-Wearable-768x549.png 768w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/System-Diagram-Wearable-1536x1097.png 1536w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/System-Diagram-Wearable-2048x1463.png 2048w\" sizes=\"(max-width: 1024px) 100vw, 
1024px\" \/><\/figure>\n<\/div>\n<\/div>\n<\/div>\n<\/details>\n\n\n\n<details class=\"wp-block-details is-layout-flow wp-block-details-is-layout-flow\"><summary>Challenges<\/summary>\n<p>Commercially, Invoke struggled to gain traction for two reasons:<\/p>\n\n\n\n<ul>\n<li>Development kits were difficult to sell at the time (2018), as there were very few potential customers with mixed reality devices. The emerging devices (Magic Leap One, Microsoft Hololens) were much more limited than we had anticipated, particularly when it came to field of view (FOV). Small FOVs made the virtual elements of our tracked objects underwhelming, even if the tracking itself was performing well.<\/li>\n\n\n\n<li>These platforms were not open-source, and developers often had very restricted access to the camera streams (essential to our tracking method). This made it very difficult for third parties to create such peripherals. An exception is the <a href=\"https:\/\/www.roadtovr.com\/tundra-labs-steamvr-tracking-hdk-tl448k6d-gp-hdk\/\">Triad HDK system<\/a> compatible with Steam VR. While this delivers good tracking performance, it is an &#8220;outside-in&#8221; tracking system, which defeats the portable use case we had in mind for the Tracker.<\/li>\n<\/ul>\n\n\n\n<p>Technically, a challenge we discovered was minimising the <a href=\"https:\/\/medium.com\/@DAQRI\/motion-to-photon-latency-in-mobile-ar-and-vr-99f82c480926\">motion-to-photon latency<\/a>. Minimising this latency is important, but not absolutely critical, for VR-only systems. The brain can accommodate some delay without breaking immersion or making the user feel unbalanced\/nauseous. However, with a waveguide display on an AR headset (such as the <a href=\"https:\/\/www.magicleap.com\/magic-leap-2\">Magic Leap<\/a> or <a href=\"https:\/\/www.microsoft.com\/en-au\/hololens\">Microsoft Hololens<\/a>), even latencies &lt;20ms are much more noticeable. 
If you rotate your head side to side while looking at a virtual object on a table, for example, you can notice it oscillating in space, trying to catch up to your realtime view. This is where pass-through displays such as those on the <a href=\"https:\/\/www.meta.com\/au\/quest\/quest-3\/\">Meta Quest 3<\/a> and <a href=\"https:\/\/www.apple.com\/apple-vision-pro\/\">Apple Vision Pro<\/a> are stronger solutions. The pass-through displays not only allow for a wider FOV and crisper colour and shadows; they also sync the motion-to-photon latency of the real-world feed with the movement of virtual objects, so they feel much more anchored in the space.<\/p>\n<\/details>\n\n\n\n<details class=\"wp-block-details is-layout-flow wp-block-details-is-layout-flow\"><summary>Are there Similar Products out there?<\/summary>\n<p>Now the <a href=\"https:\/\/www.vive.com\/au\/accessory\/vive-ultimate-tracker\/\">Vive Ultimate Tracker<\/a> is available, which executes its own inside-out SLAM using inertial and optical sensors. 
These are relatively expensive at the time of writing, and are primarily marketed for full body tracking in VR.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"672\" src=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/vive-ultimate-tracker-v1-1366-1024x672.png\" alt=\"\" class=\"wp-image-144\" srcset=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/vive-ultimate-tracker-v1-1366-1024x672.png 1024w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/vive-ultimate-tracker-v1-1366-300x197.png 300w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/vive-ultimate-tracker-v1-1366-768x504.png 768w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/vive-ultimate-tracker-v1-1366.png 1524w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>As these trackers are supposedly platform agnostic (do not require Steam VR base stations, can send tracking data to any device), I&#8217;m excited for the potential of the applications we envisioned for the Invoke Tracker to be realised on devices like the Apple Vision Pro or Meta Quest 3. 
In my view, trackers like these could be a great way to provide modular tracking of useful objects for pass-through mixed reality \/ spatial computing.<\/p>\n<\/details>\n\n\n\n<h2 class=\"wp-block-heading is-style-asterisk\" id=\"Portal\">Portal<\/h2>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Invoke Portal | Mixed Reality | Invoke\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/PRP0mqX4oWE?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<details class=\"wp-block-details is-layout-flow wp-block-details-is-layout-flow\"><summary>More about the Invoke Portal<\/summary>\n<p>As Invoke was not gaining traction with the Tracker development, we pivoted towards a headset-less VR system we built called the Portal. The pick-up-and-play nature of the canvases lent itself well to eye-catching activations, where users of all abilities and ages can jump into a fun experience for five minutes or so.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"675\" height=\"751\" src=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Value-Proposition.png\" alt=\"\" class=\"wp-image-38\" srcset=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Value-Proposition.png 675w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Value-Proposition-270x300.png 270w\" sizes=\"(max-width: 675px) 100vw, 675px\" \/><\/figure>\n\n\n\n<p>We made good progress on developing this system, and were set to run an initial product showcase as invitees to Augmented World Expo (AWE) in California in May 2020. Unfortunately, COVID-19 interrupted these plans and forced an end to Invoke&#8217;s operations. 
During the planned weekend of the expo, the Santa Clara convention centre was converted into a temporary hospital for an overflow of COVID patients.<\/p>\n<\/details>\n\n\n\n<details class=\"wp-block-details is-layout-flow wp-block-details-is-layout-flow\"><summary>How it Works<\/summary>\n<div class=\"wp-block-columns alignfull is-layout-flex wp-container-core-columns-layout-3 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image aligncenter size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"824\" src=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Iso-1024x824.png\" alt=\"\" class=\"wp-image-37\" style=\"width:611px;height:auto\" srcset=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Iso-1024x824.png 1024w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Iso-300x241.png 300w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Iso-768x618.png 768w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Iso.png 1243w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n<\/div>\n<\/div>\n\n\n\n<p>The Portal canvases are tracked in realtime using Vive Trackers and the Steam VR tracking system. The canvases consist of a tensioned sheet of translucent film, held in a lightweight aluminium frame. Controllers on either side of the frame provide gripping points and allow the user to interact with the virtual environment in ways other than moving around spatially.<\/p>\n\n\n\n<p>We built a calibration system to quickly map the position of a set of projectors to the Steam VR coordinates. This requires a facilitator to align the portal with a set of projected rays, and confirm when they match. 
From this data, a set of vector operation algorithms could determine the projector&#8217;s 6DOF position with a high degree of accuracy in a matter of minutes.<\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-layout-4 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column has-contrast-background-color has-background is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"773\" height=\"1024\" src=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Top-Metric-773x1024.png\" alt=\"\" class=\"wp-image-36\" srcset=\"https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Top-Metric-773x1024.png 773w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Top-Metric-227x300.png 227w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Top-Metric-768x1017.png 768w, https:\/\/geoffzink.com\/wp-content\/uploads\/2024\/03\/Top-Metric.png 1000w\" sizes=\"(max-width: 773px) 100vw, 773px\" \/><\/figure>\n<\/div>\n<\/div>\n\n\n\n<p>Once calibrated, the projectors emit a selective window of the virtual scene. This window view is generated as if from a camera at the user&#8217;s head position. The image lands on the back side of the moving canvas (back-projection). This means that as a user, you can look straight into the canvas as if it were a literal &#8220;portal&#8221; into the virtual space, which you can carry around with you to explore.<\/p>\n\n\n\n<p>This form of VR is more social than headset based VR, as you can share the experience with others. 
Additionally, it is nicely scalable: additional projectors can cover a wider space\/more angles, and multiple Portals can be used in the same play space at the same time.<\/p>\n<\/details>\n\n\n\n<h2 class=\"wp-block-heading is-style-asterisk\" id=\"Projection-Mapping\">Projection Mapping<\/h2>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Realtime Projection Mapping | Mixed Reality | Invoke\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/JBpdWUz05dE?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<details class=\"wp-block-details is-layout-flow wp-block-details-is-layout-flow\"><summary>More about Invoke&#8217;s Realtime Projection Mapping<\/summary>\n<p>This project was our first experimentation with projection mapping. It was inspired by other projection mapping showcases (e.g. those applied to a static set or building), however we had the ambition to make the projections follow dynamic objects with the use of the Steam VR tracking technology.<\/p>\n\n\n\n<p>&#8220;<a href=\"https:\/\/www.youtube.com\/watch?v=lX6JcybgDFo\">Box<\/a>&#8221; is an outstanding production of a similar concept worth watching. Their system is also dynamic projection mapping, with the camera and projection surfaces running through an &#8220;on rails&#8221; routine. Our system aimed to allow for realtime interaction, with users supplying varied input movements.<\/p>\n\n\n\n<p>We created some cool demos on this system, with physics models adding to the immersion. Through user testing, we found the single viewing angle (from a set of play glasses that were actually a tracked virtual &#8220;camera&#8221;) was quite limiting. 
The users would also get between the target object and the projectors and not understand why they couldn&#8217;t see anything (they were casting a shadow over the object). These findings led to the development of the <a href=\"#Portal\">Invoke Portal<\/a>, which was a more successful concept in practice.<\/p>\n<\/details>\n","protected":false},"excerpt":{"rendered":"<p>First of all &#8211; I must sincerely thank the highly talented Invoke team I worked with to realise the exploratory work described below: I am fascinated by the future of spatial computing. Following my studies in mechatronics engineering and commerce, I went on to co-found a startup called Invoke with my final year engineering project [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"inline_featured_image":false,"footnotes":""},"_links":{"self":[{"href":"https:\/\/geoffzink.com\/index.php?rest_route=\/wp\/v2\/pages\/33"}],"collection":[{"href":"https:\/\/geoffzink.com\/index.php?rest_route=\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/geoffzink.com\/index.php?rest_route=\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/geoffzink.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/geoffzink.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=33"}],"version-history":[{"count":24,"href":"https:\/\/geoffzink.com\/index.php?rest_route=\/wp\/v2\/pages\/33\/revisions"}],"predecessor-version":[{"id":150,"href":"https:\/\/geoffzink.com\/index.php?rest_route=\/wp\/v2\/pages\/33\/revisions\/150"}],"wp:attachment":[{"href":"https:\/\/geoffzink.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=33"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}