Indoor AR Navigation System

Design Journal - 2021

Julian Whatley


Navigation systems have come a long way, from paper maps to phones, and now to systems that use the many sensors in a smartphone to make navigating more immersive. Augmented Reality (AR) takes the sensor data from the smartphone and applies it to a camera overlay, where the user sees computer-generated imagery superimposed over the real world. This Design Journal reflects the journey I took while researching the potential of an Indoor AR Navigation System: the many times I restarted the project, converting an existing project from iOS to Android, and then building my own system based on those ideas. With the primary focus on the code design rather than the art or sound, I set off on an adventure I never thought I would take and found it challenging yet rewarding. The primary use for this application is to provide accessibility to those who want independence but still need a small helping hand.

Useful Links

Week 1 - Starting 08/02/2021

Within the first week I created a mock-up video demonstrating what I envisioned, to help better understand the idea I wanted to accomplish. Putting my video skills to good use, I recorded a path through my house in 8K and imported the footage into Adobe After Effects. I recorded at 8K to capture as much detail as possible so I could track the footage and add objects into it.

Here is the result of the video prototype:

Week 2 - Starting 15/02/2021

In this week I researched what tools already existed and what third-party tools could make my life easier when making an AR indoor navigation system. The reason for this was to get a better understanding of what has already been done and what is possible with current smartphone technology. It also allowed me to iterate on an idea and develop it further, tying in Ocean3D's 3D point-cloud data from their scans.

I had been thinking about two ways to approach it. The first was using the point-cloud data provided by Ocean3D and the MatterPak files: by using image recognition/computer vision, the AR application could pinpoint with great accuracy where the device is in the real world. The other way was to build the point-cloud data myself from the AR application: while the device is scanning, I could capture images and store them for future use, comparing against them to determine the location and position of the device. At this point I didn't want to reinvent the wheel if I didn't need to.

I built the example HelloAR scene provided by Google ARCore and, following their instructions, tested it out on my smartphone.

I quickly moved on to finding a floorplan of my flat, scanning it in from the documentation provided by Lang Town and Country.

Once I had a floorplan, I imported it into Unity, into a project that Jan Hardy uploaded to GitHub and provided a walkthrough for on this website:

This walkthrough shows how to make a simple but effective indoor navigation system: scanning a QR code localises the person indicator on the minimap and allows for navigating on one floor level. I used this project as a benchmark for what to expect when I went to build my own, or adapt and iterate.

Using the floorplan, I measured my hallway in real life and placed a cube at a scale of 1 unit, which equals 1 metre by default in Unity, then scaled the minimap to the measurements I took of the hallway. This gave me a rough guesstimate of the measurements for the rest of the floorplan in Unity. With the floorplan set up, I placed a cylinder, which acts as the person indicator on the minimap, into the scene at a starting position of 0, 0, 0, which I set as the hallway on the floorplan, facing into the sitting room.

Here’s the video of the test:

Week 3 – Starting 22/02/2021

Continuing with the simple prototype, I struggled to get the location to work accurately and found that Jan Hardy's project was causing frame rates to drop, although I don't think the project itself is to blame; it's more likely down to Google ARCore. Due to the limiting nature of the API I couldn't build to iOS, so I looked at more alternatives.

I started a new project and looked at the Vuforia engine, as it allows 3D scans to be imported and a navmesh to be built on the scanned model, which can serve as a base for the AR navigation.

After that I had a 30-minute call with Joel Hodges, my tutor on the project, to discuss what I was doing. We discussed my Project Proposal, an email I had sent to Chris Wood about getting access to a nearby local 3D scan, and my habit of repeatedly starting new projects, and I was advised to look at cloud anchors.

Knowing that 3D-scanned point-cloud data takes up a lot of storage space, I quickly realised it was not an ideal basis for an AR app. So I started looking at Unity and the AR Foundation API, and set up yet another project with the AR Foundation package and the ARCore and ARKit plugins.

My first idea consisted of scanning a barcode at fixed locations, such as a hallway or reception area; the AR app would then know your exact location in world space and what orientation you're facing, allowing the AR navigation to be accurate.

I spent a long while going over different ways to read QR codes through the device camera in Unity, especially using the AR Foundation API. I looked over Jan Hardy's documentation on how to implement QR code recognition, but it was written for ARCore, which doesn't work with AR Foundation.

Week 4 – Starting 01/03/2021

I started up another project and downloaded the AR Foundation Samples from Unity Technologies on GitHub, hoping that I could access certain APIs and scripts; however, I still had issues accessing the camera API to read QR codes.

This resulted in me starting the project yet again, but this time I followed a different guide on using the AR Foundation package to read QR codes and other symbols, which can be found here: But I stumbled into yet another roadblock and found that I couldn't downgrade the latest AR Foundation package to 4.1.5, which is apparently stable according to Unity.

This resulted in yet another project restart, but I couldn't get AR Foundation to work at all without loads of errors. I tried an older version of Unity but couldn't download 4.1.5, which had the API I needed to access the raw camera feed to read QR codes.

Week 5 – Starting 08/03/2021

Another day of figuring out how to get AR Foundation 4.1.5 installed and a QR code reader working. Starting the 7th project resulted in Android build errors saying I had the incorrect Gradle, JDK and SDK installed, so I backed up the project and tried upgrading it to a later version of Unity, only to find that once I built the Android app, the camera didn't work at all. So I reverted to the previous version of Unity, only for Unity to say I didn't even have AR Foundation installed. At this point I was pulling my hair out. I restarted the PC and reopened the 7th project; this time I had no errors, but it was stuck on an older version of AR Foundation and I couldn't upgrade the package for unknown reasons.

I started an 8th project on Unity 2020.2, and this time everything worked smoothly: the AR Foundation 4.1.5 package and the ARCore and ARKit plugins all downloaded fine.

Continuing on project 8 for the next few days, I got the AR Foundation package and QR barcode reader working, and reading further through the guide on setting up an AR indoor navigation system in Unity using ARCore gave me a better understanding of what I wanted to achieve.

I adapted the barcode reader that worked for ARCore to work with AR Foundation, as I was able to access the raw camera API in AR Foundation.

I continued following the walkthrough and added a few lines to the barcode relocator function in the script, allowing me to scan a QR code and reposition the person indicator on the minimap to the location of the QR code game object. With the minimap located in the bottom left of the screen and a person indicator situated on it, all that was left to do was testing.
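The relocator step boils down to something like the sketch below. The class and the "QrCodeTarget_" naming scheme are my illustration, not the walkthrough's actual code; it assumes a marker object has been placed on the minimap for each printed QR code.

```csharp
using UnityEngine;

// Hypothetical sketch of the QR relocator described above.
public class QrRelocator : MonoBehaviour
{
    [SerializeField] private Transform personIndicator;

    // Called when the barcode reader decodes a QR payload, e.g. "Hallway".
    public void Relocate(string qrPayload)
    {
        GameObject target = GameObject.Find("QrCodeTarget_" + qrPayload);
        if (target == null) return; // unknown code: ignore

        // Snap the person indicator to the scanned code's map position
        // and adopt its facing direction.
        personIndicator.SetPositionAndRotation(
            target.transform.position,
            target.transform.rotation);
    }
}
```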

Before I could test it, I had to get the person indicator to represent the rotation and position of the AR camera. It was a simple case of getting the x and z axes of the AR camera and translating the position to the person indicator; for rotation, I got the camera's rotation and applied it to the arrow on the person indicator.
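Mirroring the camera onto the indicator looks roughly like this (field names are illustrative):

```csharp
using UnityEngine;

// Sketch: copy the AR camera's x/z position onto the minimap person
// indicator, and only the y (heading) rotation onto its arrow.
public class PersonIndicatorFollow : MonoBehaviour
{
    [SerializeField] private Transform arCamera;
    [SerializeField] private Transform arrow; // child arrow graphic

    void Update()
    {
        Vector3 p = arCamera.position;
        transform.position = new Vector3(p.x, transform.position.y, p.z);

        // Keep only the heading so the arrow doesn't tilt when the
        // phone is pitched up or down.
        arrow.rotation = Quaternion.Euler(0f, arCamera.eulerAngles.y, 0f);
    }
}
```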

Testing resulted in it working as expected, but only if you start at the exact position on the minimap that the person indicator is in. At this moment in time, I was unable to get the QR code relocating function to work; scanning the QR code caused the person indicator to move in unpredictable ways.

After further investigation into why the relocating function wouldn't work, it turned out I had gravity turned on on the person indicator, causing it to behave unpredictably. So I unticked Use Gravity, and that stopped the unpredictable behaviour; however, the person indicator still wasn't relocating to the correct spot on the minimap.

I wrote a script that lets the user control the minimap view: double-tap to recentre the camera over the person indicator; tap, drag and hold to move the minimap around; and pinch to zoom. I used an orthographic camera for the minimap view to keep the perspective the same at any zoom level.
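The pinch-to-zoom part of that script can be sketched as below; since the minimap camera is orthographic, zooming is just scaling its orthographic size. The clamp range and speed are placeholder values.

```csharp
using UnityEngine;

// Minimal pinch-to-zoom for the orthographic minimap camera.
public class MapPinchZoom : MonoBehaviour
{
    [SerializeField] private Camera mapCamera; // orthographic
    [SerializeField] private float zoomSpeed = 0.01f;

    void Update()
    {
        if (Input.touchCount != 2) return;

        Touch a = Input.GetTouch(0);
        Touch b = Input.GetTouch(1);

        // Compare this frame's finger distance with the previous frame's.
        float prevDist = ((a.position - a.deltaPosition) -
                          (b.position - b.deltaPosition)).magnitude;
        float currDist = (a.position - b.position).magnitude;

        // Fingers moving apart shrinks the orthographic size (zoom in).
        float delta = (prevDist - currDist) * zoomSpeed;
        mapCamera.orthographicSize =
            Mathf.Clamp(mapCamera.orthographicSize + delta, 2f, 20f);
    }
}
```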

While making and testing the script, I found that the screen kept dimming if it wasn't being touched, so I added a line to the Start function of the IndoorNavController script telling the screen to never dim.
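The never-dim fix is a one-liner in Unity:

```csharp
void Start()
{
    // Stop the OS dimming or sleeping the screen while navigating.
    Screen.sleepTimeout = SleepTimeout.NeverSleep;
}
```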

The minimap rotates with the person indicator after scanning a QR code and relocating. I thought maybe I could change it so only the arrow on the person indicator rotates, but that didn't work either, so I put the QR code relocating on hold.

I moved on to building a navmesh on the minimap and writing a few lines of code to spawn some arrows in front of the AR camera at a lower height, but quickly found that the arrows didn't spawn. So I took a few days off to get my head straight and come back with a fresh mind.

I thought spawning arrows in front of the AR camera one at a time was the best approach, and at the time nothing else came to mind on how else to approach navigation.

Two test videos:

Week 6 – Starting 15/03/2021

As it turns out, the arrows didn't spawn because I didn't understand how to use anchors. So I used a Unity AR Foundation sample and added a touch script, meaning that when the screen on the phone was touched, an anchor with an arrow prefab attached would spawn in world space. However, the arrows were tiny and were instantiating inside the camera/device, meaning I couldn't see them until I moved my phone around.
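The touch test follows a common AR Foundation pattern: raycast against detected planes and parent a prefab to a new anchor at the hit pose. This is my reconstruction of that pattern, not the sample's exact code; AddAnchor was the 4.x-era API (later versions attach an ARAnchor component instead).

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: spawn an anchored arrow where a touch hits a detected plane.
public class TouchAnchorSpawner : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private ARAnchorManager anchorManager;
    [SerializeField] private GameObject arrowPrefab;

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        if (raycastManager.Raycast(touch.position, hits,
                                   TrackableType.PlaneWithinPolygon))
        {
            ARAnchor anchor = anchorManager.AddAnchor(hits[0].pose);
            if (anchor != null)
                Instantiate(arrowPrefab, anchor.transform);
        }
    }
}
```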

Moving on, I later figured out how to get destination pins appearing at the destination selected in a drop-down list. Bear in mind that up to this point all I had done was follow a guide on setting up AR indoor navigation with ARCore, so I had to convert it to work with AR Foundation.

I played around with different ways of getting the anchors and arrows to spawn in the world, and one way that seemed to work was using a screen-to-world-point conversion. Using what I did in the touch test for spawning anchors and arrows, I got the arrows working; however, they were still coming out of the screen at this point, meaning it wasn't very clear to the user where the arrows were, and the arrows were also set to the orientation of the screen, which made it even less clear.

I went about stopping the anchors taking on the rotation of the screen by setting just the position of the pose instead of the rotation, but quickly found I couldn't do that, as I still needed to pass a rotation to the anchor. So I set the rotation to Quaternion.identity. That didn't work either. So I set the rotation to Quaternion.Euler(0, 0, 0), but that resulted in chaos: the arrows seemed to face in any direction they wanted, and to me it didn't make sense.

Week 7 – Starting 22/03/2021

I watched numerous YouTube videos to try to get more of an understanding of how AR and indoor navigation can work, and I changed a few lines of code so that the anchors are placed from the arCamera instead of via the screen-to-world-point conversion. It worked, but the angles were wrong.

Joel, my tutor, and I got on a call and discussed how I could fix the arrows. He added two very small lines of code which make sure the arrows always face the direction the user needs to go, and suggested I have multiple arrows instead of just one.
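I don't have Joel's exact lines to quote, but the gist of the fix is to orient each arrow along the path direction rather than the screen orientation, along these lines (names illustrative):

```csharp
// Face an arrow toward the next waypoint on the route, keeping it
// level with the floor rather than inheriting the screen's tilt.
void FaceAlongPath(Transform arrow, Vector3 nextWaypoint)
{
    Vector3 toNext = nextWaypoint - arrow.position;
    toNext.y = 0f; // ignore height difference

    if (toNext.sqrMagnitude > 0.001f)
        arrow.rotation = Quaternion.LookRotation(toNext.normalized);
}
```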

To add multiple arrows I would have had to rewrite the anchor spawning system, so instead I added two more arrows to the arrow prefab and made some tweaks to the scripts so that the anchors and arrows get destroyed correctly. However, that didn't work as intended and instead left the arrows and pin at the destination.

Two test videos:

Week 8 – Starting 29/03/2021

This is where the exciting part happens. I took on the suggestion of having a line drawn to the destination, like on the minimap, but from the camera, so I moved the minimap over to the start position of the camera, which is at 0, 0, 0. I used two cameras: one for the minimap layer and the arCamera for AR. This meant I was able to hide the minimap and navigation line from the arCamera but still have them visible on the render-to-texture minimap in the bottom left of the screen.

With the minimap now at the start position, I added walls using Unity's default 3D cube asset and shaped them according to the floorplan walls. I then added an occlusion shader to the walls, meaning that the line can't be seen from the other side of a wall.

I worked on getting a second line renderer into the scene, to be used as the AR navigation line, the line that is seen through the camera.
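One common way to drive a navigation line like this, and a plausible sketch of the setup described above, is to calculate a NavMesh path each frame and feed its corners straight into the LineRenderer:

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch: rebuild the navigation line from a NavMesh path every frame.
public class NavLine : MonoBehaviour
{
    [SerializeField] private LineRenderer line;
    [SerializeField] private Transform personIndicator;
    [SerializeField] private Transform target; // null when no destination

    private readonly NavMeshPath path = new NavMeshPath();

    void Update()
    {
        if (target == null) { line.positionCount = 0; return; }

        if (NavMesh.CalculatePath(personIndicator.position,
                                  target.position,
                                  NavMesh.AllAreas, path))
        {
            line.positionCount = path.corners.Length;
            line.SetPositions(path.corners);
        }
    }
}
```

Recalculating per frame keeps the line glued to the person indicator as it moves; for larger scenes the calculation could be throttled to every few frames.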

I created an arrow in Photoshop which simply represents the direction the user needs to move towards.

Something I tried exploring was setting a pin at the destination and keeping it in an object pool; that way I would only need one pin in the scene and could move its transform to a new position. However, I found that I couldn't move the transform while the object wasn't active, and when I tried keeping it active the transform still couldn't be moved, so I backtracked and kept using Instantiate for the destination pin.

I moved on quickly and made a new script called UpdateNav to check when the person indicator enters a destination trigger, which sets the target to null and both the line and the arLine to 0, meaning only the pin and AR UI are displayed.
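A sketch of what UpdateNav does, assuming the NaviController exposes a public target field as mentioned later in this journal:

```csharp
using UnityEngine;

// Sketch of the UpdateNav destination check: on arrival, clear the
// target and collapse both line renderers.
public class UpdateNav : MonoBehaviour
{
    [SerializeField] private NaviController naviController;
    [SerializeField] private LineRenderer line;   // minimap line
    [SerializeField] private LineRenderer arLine; // AR camera line

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Destination")) return;

        naviController.target = null; // stop navigating
        line.positionCount = 0;       // hide the minimap line
        arLine.positionCount = 0;     // hide the AR line
    }
}
```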

I went back to the QRRecognition script to try to fix the relocate function, thinking I could just rotate the arrow on the person indicator and move its position; it still didn't work.

One thing I looked into was the AR Foundation documentation, where I saw there's a Reset function I can access from the QRRecognition script. Unfortunately, calling Reset literally resets the AR Session to 0, 0, 0. I thought it might be easy to overcome by resetting the AR Session and then moving the transform of the TrackedPoseDriver on the arCamera and person indicator straight after the reset; however, the Reset function just doesn't work like that. AR Foundation seems to have some limits which can't be controlled.

I changed how the camera follows the person indicator on the minimap: instead of being a child of the person indicator, I made it a standalone GameObject and added a few lines to the MapZoom script in LateUpdate which allow the camera to follow the person indicator.

For the next few weeks, I took a break from it all as it was Half Term/Easter Holidays, this allowed me to reset my head and come back with a fresh way of thinking.

Two tests:

Test video with new line:

Test video with Arrows on it:

Week 11 – Starting 12/04/2021

After coming back with a fresh mind, I followed the AR Foundation documentation and created a new BasePoseProvider script, which I thought would allow the arCamera to be repositioned when scanning a QR code. However, the arCamera just stayed still once the QR code was scanned; it did move the person indicator, though.

With all the issues in trying to find the best way to recalibrate the arCamera and person indicator by scanning a QR code, I decided it was probably best to move on, as I was wasting time. So instead I started working on UI that would live in the AR world and UI that would always be visible; it made me feel as though I had accomplished something.

For the UI that is in the AR world, I added a script with a simple but effective line that looks at a target, in this case the camera, meaning the in-world UI will always face the camera when it's active. At this stage I made a placeholder image to get a feel for what it may look like and to test the look-at script.
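The look-at script is essentially a billboard; a minimal version might look like this (the "face away" detail is a common tweak so canvas text isn't mirrored, not necessarily what my script did):

```csharp
using UnityEngine;

// Billboard sketch: keep in-world UI facing the AR camera.
public class LookAtCamera : MonoBehaviour
{
    [SerializeField] private Transform target; // the AR camera

    void LateUpdate()
    {
        // Face away from the camera so UI text reads the right way round.
        transform.rotation =
            Quaternion.LookRotation(transform.position - target.position);
    }
}
```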

With the new AR UI, I needed a way to enable it at the correct destinations, so I created a different AR UI for each destination and wrote a switch statement which checks the index of the target destination and enables the correct AR UI.

I had some initial problems with the switch statement, as I had forgotten to add a default case.

To make this modular and allow more AR UI to be added in the future, I would need to add more cases to the switch statement, perhaps rename the AR UI to something numbered like ARUI_1, ARUI_2, etc., and document it somewhere. That way, all I would need to do is move the AR UI in the scene to the destinations, change the image, and let the code do the rest, as long as I document which number belongs to which destination.
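An illustrative version of that switch, using the numbered naming scheme suggested above (the arUiPanels array and destination mapping are hypothetical):

```csharp
// Enable the AR UI panel matching the selected destination index.
void ShowArUi(int destinationIndex)
{
    // Hide everything first so only one panel is ever active.
    foreach (GameObject ui in arUiPanels) ui.SetActive(false);

    switch (destinationIndex)
    {
        case 0: arUiPanels[0].SetActive(true); break; // e.g. ARUI_1, Kitchen
        case 1: arUiPanels[1].SetActive(true); break; // e.g. ARUI_2, Sitting Room
        default: break; // no AR UI for this destination
    }
}
```

With the numbered scheme the switch could arguably collapse into a single array index, but the explicit cases keep the destination-to-panel mapping readable and documented in code.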

With the AR UI now in place, I decided to work on the overall UI to make it easier to navigate and control. Thinking about user-friendliness and a better user experience, I opted for larger buttons, as these give the user clear, concise controls throughout the entire AR experience.

I continued working on the UI and added some tutorial prompts to help guide the user when they first start the AR experience, explaining each button and making sure they know how to perform each action. This greatly improves the user experience by telling them how to use the app without taking them away from getting things done.

This is where I experienced my first crash of the project, and luckily I had been saving regularly: locally to my computer, to my Network Attached Storage (NAS), and to the Unity Cloud. I was able to quickly continue where I left off and only lost the text in the tutorial prompt boxes. I had taken a screenshot of the text boxes and was able to make out what was in each one, allowing me to simply rewrite the text.

The UI still needed a lot of work at this stage, as it still looked clunky, with inconsistencies all over the place.

I continued with getting the tutorial to work when performing actions; I created a function in the NaviController called Tutorial, which has a set of if statements and a double-check method, one check being a bool.

Once I wrapped up the Tutorial function it worked as expected, but I noticed some issues with the UI and how clunky it was; as I said earlier, it had a lot of inconsistencies. I also noticed that the minimap controls were extremely sensitive to touch, so I reworked the MapZoom script to accommodate single-touch operation, allowing the user to drag and move the camera around on the minimap. But this introduced a new bug: the camera no longer follows the person indicator.

Week 12 – Starting 19/04/2021

I worked on fixing up the UI and wanted it to conform to a more modern feel, as that's what most people are familiar with. With consistency between my app and other apps, a user can pick up the interface and know what most of the buttons do through familiarity.

While looking into how to modernise the UI, I redesigned the drop-down list by making a menu with a ScrollRect holding buttons. The ScrollRect lets the user scroll up and down a list, and the text on each button changes to match the title of the destination, allowing for a modular design approach.

Once I wrapped up the new placeholder UI, I started work on a new start scene where the user is presented with instructions on how to start the AR navigation experience. Because I didn't have the QR relocation working at that time, I resorted to a camera image overlay: the user holds their phone and aligns the camera view as closely as they can to the overlay. I then added a start button which starts the AR scene.

Week 13 – Starting 26/04/2021

It was discussed in a class scrum that I should add a font to really bring out the UI and give a better user experience. So I went off to research what fonts are common across user interfaces, apps, websites, YouTube videos, etc., and came across a font called Montserrat, which I installed and used across all text fields in my AR app. To top off the user experience, I also revamped the UI with much sharper corners instead of rounded ones, and changed some of the buttons to commonly used symbols, like the three horizontal lines used across many apps to access a menu.

The other symbols include: the Clear button, which clears the current navigation on screen (a refresh icon with a trash can); a cog wheel for settings; and a magnifying glass as the search icon.

The reason for the UI overhaul was to provide consistency, not only within my app but with other apps, allowing users to understand what the buttons and icons do with ease.

After building, testing, and sharing a video of my AR app, I received positive comments; however, I noted a bug with the minimap where I couldn't drag and hold the minimap to move it around. The bug existed because the camera was snapping its position to the person indicator every LateUpdate, meaning I couldn't move it freely.

I quickly combated the bug with a timer and a bool that gets set every time the user moves the minimap. After a set time, the camera snaps back over the person indicator, but while the user is dragging and holding the minimap a bool says the camera can't do its LateUpdate follow until the user has removed their finger and the timer has run its course.
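The snap-back logic might look like this; the delay value and the drag-callback wiring are illustrative:

```csharp
using UnityEngine;

// Sketch: pause the camera's follow behaviour while the user drags
// the minimap, and resume after a short delay.
public class MapFollow : MonoBehaviour
{
    [SerializeField] private Transform personIndicator;
    [SerializeField] private float snapBackDelay = 3f;

    private float lastDragTime;
    private bool dragging;

    // Hooked up to the minimap's drag events.
    public void OnDragStart() { dragging = true; }
    public void OnDragEnd()   { dragging = false; lastDragTime = Time.time; }

    void LateUpdate()
    {
        // Don't follow while dragging, or until the timer has run out.
        if (dragging || Time.time - lastDragTime < snapBackDelay) return;

        // Snap back over the person indicator, keeping the camera height.
        Vector3 p = personIndicator.position;
        transform.position = new Vector3(p.x, transform.position.y, p.z);
    }
}
```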

I decided against using arrows on the arLine renderer, as the arrows were very distorted when going round corners. I investigated the line renderer documentation for more information and discovered that the line renderer isn't really designed for the way I intended to use it, so I settled on just using a colour gradient: red at the target and green at the person indicator.

With the new arLine in place, I figured I also needed a way to tell the user which direction to turn if they can't see the line. I went through a process of figuring out the best way to approach it. I thought about having an indicator on the left and right side of the screen that activates when a collider situated just in front of the person indicator arrow isn't touching the arLine. However, I quickly realised I was overcomplicating it, as I already had a similar system for determining the rotation of the camera and person indicator. So I went with a method seen in many games, where an arrow appears in the middle of the screen and rotates to tell you which direction to travel or which way to turn.

With that method, I went about adding a few lines of code to find the first position on the line and calculate the rotation of the arrow when the line isn't in line of sight. As I already had the rotation code from the person indicator, all I had to do was copy it into a new function and adapt it. I realise I could use a single function to get the rotation for both game objects, but I want it to run only when needed rather than all the time.
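The rotation calculation can be sketched as a signed angle between the camera's heading and the direction to the first point on the line (names illustrative):

```csharp
// Rotate the on-screen direction arrow toward the start of the
// navigation line, ignoring height differences.
void PointArrowAtLine(RectTransform arrowRect, Transform arCamera,
                      LineRenderer line)
{
    Vector3 toLine = line.GetPosition(0) - arCamera.position;
    toLine.y = 0f;

    Vector3 forward = arCamera.forward;
    forward.y = 0f;

    // Signed angle between where the user faces and where the line is.
    float angle = Vector3.SignedAngle(forward, toLine, Vector3.up);

    // UI arrows rotate around z; negate so a rightward target points right.
    arrowRect.localEulerAngles = new Vector3(0f, 0f, -angle);
}
```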

I also revamped a lot of the UI, including the floorplan, which I took into Photoshop, drawing shapes over the old floorplan at a much higher resolution and applying the same Montserrat font used across the app. With that change I also added a header, to separate the menu and Clear button from the camera view, and a footer, to make the status text clear. I matched the grey colour tone of the menu, although at this point it was just a placeholder colour until I later got feedback.

I pulled in a script to help with detecting the safe area for the header on phones with notches, as I want the user to have use of as much of the screen as possible. I made sure the header only moves down a little below the camera notch, if needed, to accommodate the Menu and Clear buttons.
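The usual safe-area pattern, and likely what that script does, is to map Screen.safeArea into the anchors of a full-screen RectTransform:

```csharp
using UnityEngine;

// Common safe-area pattern: fit a full-screen RectTransform inside
// Screen.safeArea so UI clears notches and rounded corners.
public class SafeArea : MonoBehaviour
{
    void Awake()
    {
        var rect = GetComponent<RectTransform>();
        Rect safe = Screen.safeArea;

        // Convert pixel coordinates to normalised (0..1) anchors.
        Vector2 min = safe.position;
        Vector2 max = safe.position + safe.size;
        min.x /= Screen.width;  min.y /= Screen.height;
        max.x /= Screen.width;  max.y /= Screen.height;

        rect.anchorMin = min;
        rect.anchorMax = max;
    }
}
```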

I fixed a bug where an error was thrown saying I needed to set a value for the NaviController target, even though it needs to be null when starting the AR app or when no destination is selected. It turned out to be a collider issue on the person indicator; I simply wrapped the logic in OnTriggerEnter with a null check, and that caught the error before it happened.

After some feedback on the new UI, it was made clear that I should start considering the colour of the UI, and suggestions were voiced in class about having the colour theme and buttons match Ocean3D's website, which happens to be rather simple but effective: purple and white. Using the eyedropper tool in Unity I got the hex code, and inputting it into a colour palette tool gave me a few colour palettes I could work from.

With most of the new UI in place, I created a second menu which houses the settings, with a sound toggle and a button that links directly to the Ocean3D website. I also decided to change some of the icons again, to match the familiarity of Google's Material Design.

Some more tweaks to the UI involved putting panels behind the tutorial text to make it clearer to read. I added a few lines of code to the directional arrow that points the user in the correct direction when the arLine is out of sight: the arrow shows as normal unless you are facing the wrong way, in which case text appears instead saying 'Turn around.' Once the main AR scene was almost done, I finalised some of the art for the start screen and added the Material buttons and panels to keep it consistent throughout the app. I also added a three-dot loading animation to show that the AR scene is starting when the user clicks Start, although it often loads too quickly for a user to even notice. I then added a SoundManager script which varies the pitch ever so slightly on a single sound file when clicking buttons and other UI elements.
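The SoundManager trick is a classic one-clip approach; a sketch, with the pitch range as a placeholder value:

```csharp
using UnityEngine;

// Sketch: one click sound reused everywhere, with a slight random
// pitch so repeated taps don't sound identical.
public class SoundManager : MonoBehaviour
{
    [SerializeField] private AudioSource source;
    [SerializeField] private AudioClip clickClip;

    // Wired to button OnClick events.
    public void PlayClick()
    {
        source.pitch = Random.Range(0.95f, 1.05f);
        source.PlayOneShot(clickClip);
    }
}
```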

After testing, I went back to fix a bug with the AR reset; the user has an option within the settings menu to restart the app. The ARReset function wasn't resetting because I had missed a line in the Awake function of the NaviController script: all I needed to add was a call to ARReset so that every time the AR app starts it resets the AR Session.

Finally, after fixing the bug, I polished up some more of the artwork by creating simple white silhouettes of areas in a location, such as an animation for the kitchen (how to make a cup of tea), a sitting room, a toilet, a bedroom, and an exit. I then built the application and tested it one more time before wrapping up the rest of this document.

Four test videos:

Week 14 – Starting 03/05/2021

I had a studentship meeting and gave an update on my progress with the AR app, and was advised that, if I wanted, I could use some of my studentship budget to hire artists. Being a stronger programmer than artist, I think I may take up that offer at a later date.

Last test video:

Week 15 – Starting 10/05/2021

I made a quick change on the fullscreen minimap where the close button was using the wrong texture.


To conclude my learning experience, I have to say I learnt a whole new field and applied my knowledge of games design to the best of my ability on something I thought I would never do. It's been a fun field of study, learning new technologies and design ideals and pushing the boundaries of what I know to develop an AR experience.

To sum up what I've accomplished, let's start with the knowledge I went in with. Video editing skills allowed me to convey the idea of a visual AR navigation system, with arrows guiding a user to a destination and a UI panel indicating what to do once at the location, based on the idea developed by myself and Ocean3D. I found it easier to convey the idea visually with pictures and videos than over a video call with Ocean3D using words.

Once the idea was conveyed, I swiftly moved on to figuring out the basics of developing an AR application using the Unity engine. I found I was almost out of my depth, but I didn't stop there; I wanted to learn something new and pursue new knowledge. With the new information I gathered about developing AR applications in Unity, I investigated what other people have developed for indoor navigation systems using AR and downloaded an example project to adapt to my location. I had to adapt the code to work on Android, as I didn't have an iOS device to test on.

After adapting the example project, I moved on to developing my own navigation system, which utilises lines instead of objects. I later developed a reorientation system letting the user know when they are facing the wrong way and directing them to face towards the line.

With everything I did on this project, I found my strongest skill is programming; I did all the art, design and sound myself but found it rather difficult. I brought this up in my studentship update meetings, and it was said that after I submit this work I can continue the project by hiring artists, sound engineers and other people who specialise in accessible design, using my studentship budget of £2,000. This made me realise I don't have to do projects solo all the time and can outsource parts of a project when needed.

So overall, I set out to develop an AR application delivering an indoor navigation experience, and I believe I have accomplished that goal, with both positive and negative feedback along the way. With this experience, I can further research what it takes to develop an Indoor AR Navigation System.

The final result is this:


DanMillerDev (2019) Occlusion shader for Unity. Can be used for mobile AR Occlusion, Gist. Available at: (Accessed: 22 March 2021).

Akwakwak (2020) Mixing manual main camera move and Tracked pose driver ?, Unity Forum. Available at: (Accessed: 12 April 2021).

Andrew (2021) How Augmented Reality Indoor Navigation System Works, MobiDev. Available at: (Accessed: 23 February 2021).

Bhise, S. (2020) How to augment in real world using QRCode, Medium. Available at: (Accessed: 26 February 2021).

blanx, et al. (2018) Resolved – Reset AR Session, Unity Forum. Available at: (Accessed: 26 March 2021).

Chamuth (2017) Chamuth/unity-webcam, GitHub. Available at: (Accessed: 19 April 2021).

Cousins, C. (2019) ‘What Is Modern UI Design? 10 Tips & Examples’, 30 October. Available at: (Accessed: 19 April 2021).

Devaiah, V. (2020) Day 8: How to use occlusion to hide objects using AR Foundation in Unity. — Tutorials For AR, Medium. Available at: (Accessed: 22 March 2021).

Dilmer Valecillos (2020a) Unity AR Foundation – AR Draw Adding Multi-Touch Support (Part 2). Available at: (Accessed: 9 March 2021).

Dilmer Valecillos (2020b) Unity AR Foundation – AR Draw With AR Anchor Manager (Part 1). Available at: (Accessed: 9 March 2021).

DitzelGames (2019) Pinch and Scroll to Move and Zoom in Unity for Mobile Games. Available at: (Accessed: 16 April 2021).

Eyal Biran, et al. (2018) unity3d – Unity to ios Notch And Safe Are Problems, Stack Overflow. Available at: (Accessed: 28 April 2021).

fabian-mkv, et al. (2015) IsPointerOverGameObject not working with touch input – Unity Answers. Available at: (Accessed: 15 April 2021).

Free Font (2017a) ‘Free Montserrat Font Family Download’, Free Proxima Nova, 1 July. Available at: (Accessed: 26 April 2021).

Free Font (2017b) Free Proxima Nova Font – Download Fonts, Free Proxima Nova. Available at: (Accessed: 26 April 2021).

Geßwein, M. (2016) Smart UI Dimensions for any Screen Size, Medium. Available at: (Accessed: 19 April 2021).

Google (2021) Runtime considerations | ARCore | Google Developers. Available at: (Accessed: 24 February 2021).

Google (2021) Cloud Anchors overview for Unity | ARCore, Google Developers. Available at: (Accessed: 24 February 2021).

Google (2021) Quickstart for Android | ARCore, Google Developers. Available at: (Accessed: 16 February 2021).

Google Developers (2021) Enable ARCore, Google Developers. Available at: (Accessed: 24 February 2021).

Grygierczyk, M. (2020) AR Onboarding Application, Dribbble. Available at: (Accessed: 8 February 2021).

Hallberg, M. (2021) MatthewHallberg/IndoorNavPlaceNote. Available at: (Accessed: 16 February 2021).

Hardik (2019) unity3d – How to track the device’s position and orientation in physical space using ARFoundation?, Stack Overflow. Available at: (Accessed: 3 March 2021).

Hardy, J. (2020) Creating an ARCore powered indoor navigation application in Unity, Raccoons. Available at: (Accessed: 16 February 2021).

Hendrickx, S. (2019) Creating an ARCore powered indoor navigation application in Unity, Medium. Available at: (Accessed: 3 March 2021).

Jason Weimann (2018) Unity3D 101 – Creating Buttons Dynamically. Available at: (Accessed: 19 April 2021).

karmatha, et al. (2017) Re-orient TrackedPoseDriver, Unity Forum. Available at: (Accessed: 12 April 2021).

Karmi, Y. (2021) Do Users Really Care About a Modern UI? Available at: (Accessed: 19 April 2021).

Kitweadr, et al. (2018) unity3d – Getting positions of a line renderer on moving and rotating a line, Stack Overflow. Available at: (Accessed: 16 March 2021).

Lopez Mendez, R. (2018) Indoor Real Time Navigation with SLAM on your Mobile. Available at: (Accessed: 18 February 2021).

Makarov, A. (2019) Indoor Navigation App Development With ARKit – DZone IoT. Available at: (Accessed: 26 March 2021).

Mapbox (2018) Indoor navigation in AR with Unity, Medium. Available at: (Accessed: 16 February 2021).

Material Design (2021) Introduction, Material Design. Available at: (Accessed: 19 April 2021).

MatthewHallberg (2018) Coding INDOOR NAVIGATION with A* Pathfinding. Available at: (Accessed: 16 February 2021).

mgear (2016) Decode QRCode with + Unity, Unity Coding – Unity3D. Available at: (Accessed: 25 February 2021).

N, J. (2020) Arrow PNG, Medium. Available at: (Accessed: 8 February 2021).

Placenote (2021) Introduction. Available at: (Accessed: 16 February 2021).

QR Code Generator (2021) QR Code Generator | Create Your Free QR Codes. Available at: (Accessed: 25 February 2021).

RyanJVR, et al. (2020) Find the ARAnchor attached to specific ARPlane? · Issue #649 · Unity-Technologies/arfoundation-samples, GitHub. Available at: (Accessed: 5 March 2021).

Sapio, D. (2020) 10 Essential Skills for the Modern UI & UX Designer, Medium. Available at: (Accessed: 19 April 2021).

Satwant Singh (2018) ARFoundation Overview In Depth – Part 1. Available at: (Accessed: 15 March 2021).

Schoen, M. (2020) Unity MARS Companion Apps. Available at: (Accessed: 26 March 2021).

sigbuserror, et al. (2016) Nav generating INSIDE non-walkable objects, Unity Forum. Available at: (Accessed: 27 April 2021).

Çankırı, Z. T. et al. (2016) ‘Guido: Augmented Reality for Indoor Navigation Using Commodity Hardware’, in 2016 20th International Conference Information Visualisation (IV), Lisbon: IEEE. doi: 10.1109/IV51561.2020.00123.

Unity Technologies (no date a) Unity – Scripting API: SceneManagement.SceneManager.LoadSceneAsync. Available at: (Accessed: 19 April 2021).

Unity Technologies (no date b) Unity – Scripting API: Screen.sleepTimeout. Available at: (Accessed: 5 March 2021).

Unity Technologies (no date c) Unity – Scripting API: Transform.LookAt. Available at: (Accessed: 16 March 2021).

TobiasF (2019) c# – Unity IsPointerOverGameObject Issue, Stack Overflow. Available at: (Accessed: 15 April 2021).

turadr (2021) Unity-Technologies/NavMeshComponents. Unity Technologies. Available at: (Accessed: 5 March 2021).

TutorialsForAR (2020) ‘Day 8: How to use occlusion to hide objects using AR Foundation in Unity.’, Tutorials For AR, 24 February. Available at: (Accessed: 22 March 2021).

Unity (2021a) About AR Foundation | AR Foundation | 4.1.5. Available at: (Accessed: 24 February 2021).

Unity (2021b) AR anchor manager | AR Foundation | 4.1.5. Available at: (Accessed: 5 March 2021).

Unity (2021c) Placing an Object on a Plane in AR, Unity Learn. Available at: (Accessed: 26 February 2021).

Unity (no date) Rich Text | Unity UI | 1.0.0. Available at: (Accessed: 15 April 2021).

ViewAR Augmented Reality (2020) GuideBOT QR Tutorial – AR Indoor Navigation. Available at: (Accessed: 26 March 2021).

‘Zhen_Shi_thesis_2020.pdf’ (2020). Available at: (Accessed: 26 March 2021).