06 December 2017

Getting your logos and splash screens right for Mixed Reality apps

Intro

Way too often I still see Mixed Reality apps with default (white) splash screens, app windows, or – even worse – the default Unity icon. When Mixed Reality was HoloLens-only and brand new, you kinda got away with that, as the WOW factor of the actual app eclipsed almost everything else, and the iconography did not matter much as there was only the Holographic Start menu. Also, although judging from my download numbers a fair number of HoloLenses are in use worldwide, that number is pretty limited compared to, say, PCs.

These days of leeway are over. With the advent of the Fall Creators Update, Mixed Reality apps can run on PCs with the right hardware – and instead of a $3000+ device, a $400 headset will do. And as soon as I made an app available for immersive headsets, my download numbers exploded. More eyeballs means much more visibility, and in Windows, the flexible start menu and its different tile sizes make omissions in the iconography glaringly visible. I also notice ‘ordinary’ users are a lot less forgiving. So, time to step up your game.

Note: I am currently using Unity 2017.2.0p2 MRTP5. Things may change. They tend to do that very much in this very new and exciting field :)

So what do you need?

At the very minimum, you need an icon of 620x620 pixels and a 1240x600 splash screen. 620x620 is the 310x310 icon at 200% scale. I tend to use 200% icons, as those are the default sizes created by the Unity editor. I give the starting 620x620 file the default name “Square310x310Logo.scale-200.png” and go from there.

In my Unity Assets folder I create a folder “UWPAssets” and I drag the “Square310x310Logo.scale-200.png” in there. I then proceed to make the following files from the first one:

  • Wide310x150Logo.scale-200.png (620x310 pixels)
  • Square150x150Logo.scale-200.png (300x300 pixels)
  • Square71x71Logo.scale-200.png (142x142 pixels)
  • Square44x44Logo.scale-200.png (88x88 pixels)
  • StoreLogo.scale-100.png (50x50 pixels)

Do pay attention to the last one – that‘s 100% scale, so it’s the actual size. To the right you see my actual icon.

These are just the standard sizes for UWP apps; if you are coming in from a Unity perspective you may not be familiar with them.
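Since every 200% asset is just the 100% base size doubled, you can sanity-check the whole set with a few lines of script. A minimal sketch in shell; it only prints the expected file names and pixel sizes:

```shell
# The 200% asset sizes are simply the 100% base sizes doubled.
# This only prints the expected names and sizes; feeding them to an
# actual image resizer is up to you.
for base in 310 150 71 44; do
  size=$((base * 2))
  echo "Square${base}x${base}Logo.scale-200.png -> ${size}x${size} px"
done
echo "Wide310x150Logo.scale-200.png -> 620x310 px"
echo "StoreLogo.scale-100.png -> 50x50 px"  # 100% scale: actual size
```

With ImageMagick installed you could, for instance, resize the master with `convert Square310x310Logo.scale-200.png -resize 300x300! Square150x150Logo.scale-200.png` – the tool choice is my assumption, not part of the original workflow.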

And then, as I said, you need the splash screen. This needs to be 1240x600. I have created this ‘awesome’ splash screen:

image

I suggest you take something that says a bit more about your app, but this will do for an example.

The result in the Unity editor:

image

Putting icons in the right place in Unity

This is not rocket science: Hit File/Build Settings and then the button “Player Settings”. This will open the “PlayerSettings” pane on the right side of your editor. First, you open the pane “Icon” and then the section “Store logo”. From the UWPAssets folder, drag the “StoreLogo.scale-100” into the top box.


image

You can leave the rest empty. Then skip all the other sections, scroll all the way down to the section “Universal 10 tiles and logos”. Open “Square 44x44 Logo”

image

Drag the “Square44x44Logo.scale-200.png” into the 4th box from the top, marked 200%. Then open section “Square 71x71 Logo” and drag  “Square71x71Logo.scale-200.png” in. Proceed similarly with the remaining three sizes till all the 200% boxes are filled.

Splash screen, splash screen and splash screen

Now dig this – there are actually three places to put splash screens. Potentially you can have different splash images - I tend to use the same one everywhere. So scroll further down to the Splash Screen section. First up, on top, is the “Virtual Reality Splash Image”. You can drag your splash screen in there if you want.

image

Next up, a bit further down, is the Windows splash screen:

image

This is the most important one. This appears on your desktop when the app starts, it is also shown in a rectangle in Immersive Headsets during the app ‘transition scene’ (when your Cliff House is replaced by the app), and it’s displayed on your default app screen:

imageimage


And finally, there’s Windows Holographic Splash screen:

image

This is actually important on a HoloLens. As that device does not have a ‘transition scene’, it shows a floating splash screen for a few seconds. You can see it in an immersive headset too, but usually for a very short time – most of the time it flashes by in a split second before the actual app appears.

So what is the very first splash screen for? To be honest, I have no idea. Possibly it’s for other, non-Windows Mixed Reality based apps.

Just one tiny thing left

I tend to generate the app in a subfolder “App” in the root of the project. Typically, I place a .gitignore in there that looks like this:

*.*
*

true to it’s name, it ignores – everything. Because, well, it’s a generated app, right? But if you go into the generated app’s assets (App\SplashScreen\Assets) you will notice something odd:image

Although we nicely created an icon for every size, there’s still this pesky “Square44x44Logo.targetsize-24_altform-unplated.png” icon with the default logo that – interestingly – is 24x24 pixels. So where is this used? Apparently in the task bar and – as I have noticed – in the developer portal. So although your customers won’t see it in the Store (I think), you will; it will clutter up your app list in the developer portal, and that’s annoying. I have found no way to actually create this from Unity, so I take the easy approach: I create yet another icon, now 24x24, overwrite that “Square44x44Logo.targetsize-24_altform-unplated.png” in App\SplashScreen\Assets and add it to Git manually there. Unity won’t overwrite it, and if it gets deleted or overwritten, I can always revert. You just need to remember to check the icon before actually submitting.
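If you would rather not rely on remembering the manual add, gitignore negation patterns can whitelist the icon declaratively. Note that Git cannot re-include a file if a parent directory of that file is excluded, so each parent directory has to be un-ignored first. A sketch of what the App/.gitignore could look like, assuming the folder layout above:

```gitignore
# Ignore the whole generated app...
*
# ...but un-ignore the path down to the one icon we maintain by hand.
# Each parent directory must be re-included before its contents can be.
!SplashScreen/
SplashScreen/*
!SplashScreen/Assets/
SplashScreen/Assets/*
!SplashScreen/Assets/Square44x44Logo.targetsize-24_altform-unplated.png
```

With this in place, a plain `git add` will pick up changes to that one file while everything else in the generated app stays ignored.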

Conclusion

So now you know how to properly assign the minimum iconography and splash screens, and let’s agree on no longer publishing stuff with default icons or empty splash screens to the Store, right? ;)

Project code – although it does not actually do anything – can be found here. But it’s useful reference material.

12 November 2017

Finding the floor - and displaying holograms at floor level in HoloLens apps

Intro

Both of my HoloLens apps in the store ask the user to identify some kind of horizontal space to put holograms on – in one case Schiphol Airport, in the other case a map of some part of the world. Now you can of course use Spatial Understanding, which is awesome and offers very advanced capabilities, but it requires some initial activity by the user. Sometimes you just want to identify the floor or some other horizontal place - if only to find out how tall the user is. This is the actual code I wrote for Walk the World; I extracted it for you. In the demo project, it displays a white plane at floor level.

Setting the stage

We start by creating an empty project. Then we proceed with importing the newest Mixed Reality Toolkit. Once you have done that, you will notice the extra menu option no longer says "HoloToolkit" but "Mixed Reality Toolkit". It still has three settings, but one of them has profoundly changed: the setting "Apply Mixed Reality Scene Settings" basically adds everything you need to get started:

imageimage

  • A "MixedRealityCameraParent" - the replacement for the HoloLensCamera that encapsulates a camera that will work both on HoloLens and immersive headsets.
  • A default cursor
  • An input manager that processes input from gestures, controllers or other devices depending on whether your app runs on a HoloLens or an immersive headset

Now I tend to organize stuff a little differently, so after I have set up the scene it looks like this:

image

I make an empty game object "HologramCollection" that will hold my holograms, and all the standard or non-graphical stuff I tend to chuck into a game object "Managers". Notice I also added SpatialMapping. If you click the other two options in the Mixed Reality Toolkit/Configure menu, our basic app setup is ready to go.

Some external stuff to get

Then we proceed to import LeanTween and Mobile Power Ups Vol Free 1 from the Unity Store. Both are free. Note – the latter is deprecated, but the one arrow asset we need from it is still usable. If you can't get it from the store anymore, just nick it from my code ;)

image

Recurring guest appearances

We need some more stuff I wrote about before:

  • The Messenger, although its internals have changed a little since my original article
  • The KeepInViewController class to keep an object in view – an improved version of the MoveByGaze class from my post about a floating info screen
  • The LookAtCamera class to keep an object oriented towards the camera

Setting up the initial game objects

The end result should be this:

image

HologramCollection has three objects in it:

image

  • A 3DTextPrefab "LookAtFloorText"
  • An empty game object "ArrowHolder"
  • A simple plane. This is the object we are going to project on the floor.

Inside the ArrowHolder we place two more objects:

  • A 3DTextPrefab "ConfirmText"
  • The "Arrows Green" prefab from Mobile Power Ups Vol Free 1, which initially looks like the image on the right

So let's start at the top setting up our objects:

LookAtFloorText

image

This is fairly easy. We scale it to 0.005 in all directions, enter the text "Please look towards the floor" in the text mesh, and set a couple of other parameters:

  • Z position is 1, so it will spawn 1 meter in front of you.
  • Character size to 0.1
  • Anchor to middle center
  • Font size to 480
  • Color to #00FF41FF (or any other color you like)

After that, we drag two components on it

  • KeepInViewController
  • LookAtCamera

image

With the following settings:

  • Max Distance 2.0
  • Move time 0.8
  • Rotate Angle 180

This will keep the text at a maximum distance of 2 meters, or closer if you are looking at an object that is nearer; it will move to a new position in 0.8 seconds, and with an angle of 180 degrees it will always be readable to you.

image

ConfirmText

This is hardly worth its own subparagraph, but for the sake of completeness I show its configuration.

Notice:

  • Y = 0.27, Z is 0 here.
  • This has only the LookAtCamera script attached to it, with the same settings. There is no KeepInViewController here.

Arrows Green

image

This is the object nicked from Mobile Power Ups Vol Free 1.

  • Y = 0.1, I moved it a little upwards so it will always appear just above the floor
  • I rotated it over X so it will point to the floor.
  • I added a HorizontalAnimator to it with a spin time of 2.5 seconds over an absolute (vertical) axis so it will slowly spin around.

Plane

The actual object we are going to show on the floor. It's a bit big (default size = 10x10 meters), so we are going to scale it down a little:

image

And now for some code.

The general idea

Basically, there are two classes doing the work; the rest is only 'support crew':

  • FloorFinder actually looks for the floor, and controls a prompt object (LookAtFloorText) that prompts you - well, to look at the floor
  • FloorConfirmer displays an object that shows you where the floor was found, and then waits for a method to be called that either accepts or rejects the floor. In the sample app, this is done by speech commands.
  • They both communicate by a simple messaging system

The message class is of course very simple:

using UnityEngine;

public class PositionFoundMessage
{
    public Vector3 Location { get; }

    public PositionFoundMessage(Vector3 location)
    {
        Location = location;
    }

    public PositionFoundStatus Status { get; set; }
}

With an enum to go with it, indicating where in the process we are:

public enum PositionFoundStatus
{
    Unprocessed,
    Accepted,
    Rejected
}

On startup, the FloorConfirmer hides its confirmation object (the arrow with text). FloorFinder shows its prompt object. When it detects a floor, it sends the position in a PositionFoundMessage with status "Unprocessed". It also listens to PositionFoundMessages. If it receives one that has status "Unprocessed" (which is, in this sample, only sent by itself), it will disable itself and hide the prompt object (the text saying "Please look towards the floor").

If the FloorConfirmer receives a PositionFoundMessage with status "Unprocessed", it will show its confirmation object at the location where the floor was detected. And then, as I wrote, it waits for its Accept or Reject method to be called. If Accept is called, it resends the PositionFoundMessage with status "Accepted" to anyone who might be interested - in this app, that's a simple class "ObjectDisplayer" that shows a game object that has been assigned to it at the correct height below the user's head. If the Reject method is called, FloorConfirmer resends the message as well - but with status "Rejected". Which will wake up the FloorFinder again.

Finding the actual floor

using HoloToolkit.Unity.InputModule;
using HoloToolkitExtensions.Messaging;
using HoloToolkitExtensions.Utilities;
using UnityEngine;

public class FloorFinder : MonoBehaviour
{
    public float MaxDistance = 3.0f;

    public float MinHeight = 1.0f;

    private Vector3? _foundPosition = null;

    public GameObject LabelText;

    private float _delayMoment;

    void Start()
    {
        _delayMoment = Time.time + 2;
        Messenger.Instance.AddListener<PositionFoundMessage>(ProcessMessage);
#if !UNITY_EDITOR
        Reset();
#else
        LabelText.SetActive(false);
#endif
    }

    void Update()
    {
        if (_foundPosition == null && Time.time > _delayMoment)
        {
            _foundPosition = LookingDirectionHelpers.GetPositionOnSpatialMap(MaxDistance,
                GazeManager.Instance.Stabilizer);
            if (_foundPosition != null)
            {
                if (GazeManager.Instance.Stabilizer.StablePosition.y -
                    _foundPosition.Value.y > MinHeight)
                {
                    Messenger.Instance.Broadcast(
                        new PositionFoundMessage(_foundPosition.Value));
                    PlayConfirmationSound();
                }
                else
                {
                    _foundPosition = null;
                }
            }
        }
    }

    public void Reset()
    {
        _delayMoment = Time.time + 2;
        _foundPosition = null;
        if (LabelText != null)
        {
            LabelText.SetActive(true);
        }
    }

    private void ProcessMessage(PositionFoundMessage message)
    {
        if (message.Status == PositionFoundStatus.Rejected)
        {
            Reset();
        }
        else
        {
            LabelText.SetActive(false);
        }
    }

    private void PlayConfirmationSound()
    {
        Messenger.Instance.Broadcast(new ConfirmSoundMessage());
    }
}

image

This has three public properties - a prompt object (this becomes the text "Please look towards the floor"), a maximum distance to try to find the floor, and the minimum height the floor should be below the user's head. These properties are set as displayed on the right:

The Update method does all the work - if a position on the spatial map is found that's at least MinHeight below the user's head, then we might have found the floor, and we send out a message (with default status Unprocessed). The method below Update, ProcessMessage, actually gets that message too and hides the prompt text.

The helper method "GetPositionOnSpatialMap" in LookingDirectionHelpers simply tries to project a point on the spatial map at a maximum distance along the viewing direction of the user. It's like drawing a line projecting from the user's head ;)

public static Vector3? GetPositionOnSpatialMap(float maxDistance = 2,
                                               BaseRayStabilizer stabilizer = null)
{
    RaycastHit hitInfo;

    var headReady = stabilizer != null
        ? stabilizer.StableRay
        : new Ray(Camera.main.transform.position, Camera.main.transform.forward);

    if (SpatialMappingManager.Instance != null &&
        Physics.Raycast(headReady, out hitInfo, maxDistance, 
        SpatialMappingManager.Instance.LayerMask))
    {
        return hitInfo.point;
    }

    return null;
}

Is this the floor we want?

using HoloToolkitExtensions.Messaging;
using UnityEngine;

public class FloorConfirmer : MonoBehaviour
{
    private PositionFoundMessage _lastReceivedMessage;

    public GameObject ConfirmObject;

    // Use this for initialization
    void Start()
    {
        Messenger.Instance.AddListener<PositionFoundMessage>(ProcessMessage);
        Reset();
#if UNITY_EDITOR
        _lastReceivedMessage =  new PositionFoundMessage(new Vector3(0, -1.6f, 0));
        ResendMessage(true);
#endif
    }

    public void Reset()
    {
        if(ConfirmObject != null) ConfirmObject.SetActive(false);
        _lastReceivedMessage = null;
    }

    public void Accept()
    {
        ResendMessage(true);
    }

    public void Reject()
    {
        ResendMessage(false);
    }

    private void ResendMessage(bool accepted)
    {
        if (_lastReceivedMessage != null)
        {
            _lastReceivedMessage.Status = accepted ? 
                 PositionFoundStatus.Accepted : PositionFoundStatus.Rejected;
            Messenger.Instance.Broadcast(_lastReceivedMessage);
            Reset();
            if( !accepted) PlayConfirmationSound();
        }
    }

    private void ProcessMessage(PositionFoundMessage message)
    {
        _lastReceivedMessage = message;
        if (message.Status != PositionFoundStatus.Unprocessed)
        {
            Reset();
        }
        else
        {
            ConfirmObject.SetActive(true);
            ConfirmObject.transform.position = 
                message.Location + Vector3.up * 0.05f;
        }
    }


    private void PlayConfirmationSound()
    {
        Messenger.Instance.Broadcast(new ConfirmSoundMessage());
    }
}

A rather simple class - it disables its confirm object at startup. If it gets a PositionFoundMessage, two things might happen:

  • If it's an Unprocessed message, it will activate its confirm object (the arrow) and place it at the location provided inside the message (well, 5 cm above that).
  • For any other PositionFoundMessage, it will deactivate itself and hide the confirm object

If the Accept method is called from outside, it will resend the message with status Accepted for any interested listener and deactivate itself. If the Reject method is called, it will resend the message with status Rejected - effectively deactivating itself too, but waking up the floor finder again.

And thus these two objects, the FloorFinder and the FloorConfirmer, can work seamlessly together while having no knowledge of each other whatsoever.

The final basket

For anything to happen after a PositionFoundMessage with status Accepted is sent, there also needs to be something that actually receives it and acts upon it. It places the game object it's attached to at the same vertical position as the point it received - that is, 5 cm above it. It's not advisable to put it at the exact vertical floor position, as stuff might disappear under the floor. I have found horizontal planes are never smooth or, indeed, actually horizontal.

using HoloToolkitExtensions.Messaging;
using UnityEngine;

public class ObjectDisplayer : MonoBehaviour
{
    void Start()
    {
        Messenger.Instance.AddListener<PositionFoundMessage>(ShowObject);

#if !UNITY_EDITOR
        gameObject.SetActive(false);
#endif
    }

    private void ShowObject(PositionFoundMessage m)
    {
        if (m.Status == PositionFoundStatus.Accepted)
        {
            transform.position = new Vector3(transform.position.x, m.Location.y,
                transform.parent.transform.position.z) + Vector3.up * 0.05f;
            if (!gameObject.activeSelf)
            {
                gameObject.SetActive(true);
            }
        }
    }
}

image

This script is dragged onto the plane that will appear on the floor.

Wiring it all together

FloorFinder and FloorConfirmer sit together in the Managers object, but there's more stuff needed to tie all the knots:

  • The Messenger, for if there are messages to be sent, there should also be something to send it around
  • A Speech Input Source and a Speech Input Handler. Notice the latter calls FloorConfirmer's Accept method on "Yes", and the Reject method on "No".

Adding some sound


If you download the source code and run it, you might notice my trademark "pringggg" sound when the app does something. You might also have noticed various scripts sending ConfirmSoundMessage messages. In the Managers object there's another game object called "ConfirmSoundManager". It has an audio source and a ConfirmSoundRinger that, as you might expect, is not too complicated:

image
using HoloToolkitExtensions.Messaging;
using UnityEngine;

public class ConfirmSoundRinger : MonoBehaviour
{
    void Start()
    {
        Messenger.Instance.AddListener<ConfirmSoundMessage>(ProcessMessage);
    }

    private void ProcessMessage(ConfirmSoundMessage arg1)
    {
        PlayConfirmationSound();
    }

    private AudioSource _audioSource;

    private void PlayConfirmationSound()
    {
        if (_audioSource == null)
        {
            _audioSource = GetComponent<AudioSource>();
        }
        if (_audioSource != null)
        {
            _audioSource.Play();
        }
    }
}

Conclusion

And that's it. Simply stare at a place below your head (default: at least 1 meter), say "Yes", and the white plane will appear exactly on the ground. Or, as I explained, a little above it. Replace the plane by your object of choice and you are good to go, without needing Spatial Understanding or complex code.

As usual, demo code can be found on GitHub.

04 November 2017

Build for both from one source – HoloLens and Immersive apps

Intro

The time has come. As I predicted in my previous blog post, it’s now possible to build a HoloLens app and an Immersive app (targeted at the new immersive headsets) from one source, thanks to the awesome work done by the folks working on the Mixed Reality Toolkit. No need to keep two branches anymore. There is still some fiddling to do, but that is merely following a script when you create and submit the final apps. And that script is exactly what I am going to describe.

Stuff you will need

Build the HoloLens app

  • Open the project in Unity 2017.1.x
  • You may get the following popup if the project was last opened in Unity 2017.2.x:

image

  • Just click “Continue”
  • You may get a popup like this:

image

  • Click “Cancel”
  • Open the Build Settings (File/Build settings)
  • Select SDK 15063

image

  • I always create Unity C# projects, but that checkbox is not mandatory
  • Click “Player Settings”
  • Expand Panel “Other settings”

image

  • Make sure Windows Holographic is available. I always remove the “WindowsMR (missing from build)” entry if it says so. You will only see this if you previously have built an Immersive app using Unity 2017.2.x
  • Build the Visual Studio solution by clicking the Build button on the Build settings Window
  • Open the resulting solution
  • Open the Package.appxmanifest in an XML editor (not the default editor – you will have to make manual changes).
  • Find the text TargetDeviceFamily Name="Windows.Universal" and change that to TargetDeviceFamily Name="Windows.Holographic"
  • While you are at it, check if MinVersion in that line is set to 10240 and MaxVersionTested is set to 10586
  • Build the store packages as usual. Don’t forget to set the configuration to Master and build only x86 packages, as those are the only ones required (and supported) by HoloLens.
  • Run the WACK (Windows App Certification Kit) to see if nothing odd has happened.
  • Upload the packages to the store. The store should automatically select Holographic as only target. Don’t submit it yet. There’s more to add to it.
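For reference, a sketch of how the edited dependency element in Package.appxmanifest could end up looking. The surrounding layout is the standard UWP manifest; the version numbers are the ones named above:

```xml
<Dependencies>
  <!-- Was: Name="Windows.Universal"; changed so the package targets HoloLens only -->
  <TargetDeviceFamily Name="Windows.Holographic"
                      MinVersion="10.0.10240.0"
                      MaxVersionTested="10.0.10586.0" />
</Dependencies>
```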

Build the Immersive App

  • Open the Unity project with Unity 2017.2.0f3-MRTP3
  • You might once again get a popup complaining about the non-matching editor. Just click “Continue” again
  • Likewise, press “Cancel” again if the editor complains about not being able to open library files
  • Open the Build Settings (File/Build settings)
  • Select SDK 16299

image

  • Click “Player Settings”
  • Expand the bottom panel, XR Settings
  • Verify Windows Mixed Reality is selected.

image

  • Build the Visual Studio solution by clicking the Build button on the Build settings Window
  • Open the resulting solution
  • Open the Package.appxmanifest in an XML editor again
  • Find the text TargetDeviceFamily Name="Windows.Universal" and change that to TargetDeviceFamily Name="Windows.Desktop"
  • While you are at it, check if MinVersion and MaxVersionTested in that line are both set to 16299
  • For all projects in the solution, change the Min version to 16299. We don’t want this app to land on anything older than the Fall Creators Update, since only the FCU supports Windows Mixed Reality

image 

  • Build the store packages (configuration Master again).
    • Don’t forget to set the configuration to Master but this time build x86 and x64 packages, as those are the platforms supported for desktop PCs (although in reality, I think most if not all Mixed Reality capable PCs will be x64)
    • Make sure you don’t overwrite the HoloLens app you created earlier – choose a different folder or copy the generated packages.
    • Make sure – and this is important – the version numbers of both apps are different. The store won’t accept two packages with the same number. As you can see, I created the HoloLens app with version 3.2.8.0 and the Immersive app with 3.2.9.0

image

  • Upload the package to the same submission.
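And again for reference, a sketch of the edited dependency element for the desktop package, with the same caveats as before:

```xml
<Dependencies>
  <!-- Was: Name="Windows.Universal"; changed so the package targets desktop PCs only -->
  <TargetDeviceFamily Name="Windows.Desktop"
                      MinVersion="10.0.16299.0"
                      MaxVersionTested="10.0.16299.0" />
</Dependencies>
```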

If you have done everything right, it should kind of look like this, as I showed in my previous post:

image

This is an actual screenshot from the version I submitted successfully to the store last week, which has just been rolled out (I got it on my HoloLens and my MR PC yesterday, and just got the mail that processing has been completed). Make sure to check all the other boxes you need to check – see once again my previous post – to prevent disappointing your users.

Some little tricks I used

The Mixed Reality Toolkit has now reached a maturity level where you don’t have to do much in your app to support both scenarios. The most dramatic change is using a Mixed Reality Camera instead of the good old HoloLensCamera. The new camera has separate quality settings for HoloLens and Immersive apps. For HoloLens, this is set to “Fastest” by default; the “Opaque Display Settings” (those are for Immersive apps) are set to “Fantastic”. I tend to crank up the HoloLens setting a notch or two (in this case one)

image

This results in considerably better graphics, but be very careful with that – you might hose your HoloLens app’s performance. So test thoroughly before messing with this setting.

A little piece of code I use, inspired by Mike Taulty, to check if I am on a HoloLens or not:

public static class OpaqueDetector
{
    public static bool IsOpaque
    {
        get
        {
#if !UNITY_EDITOR
            if (Windows.Foundation.Metadata.ApiInformation.IsTypePresent(
                "Windows.Graphics.Holographic.HolographicDisplay"))
            {
               return Windows.Graphics.Holographic.HolographicDisplay.GetDefault()?.IsOpaque == true;
            }
#endif
            return false;
        }
    }
}

It turns out I could call HolographicDisplay in the newest SDK, but not in the old one. Using Mike’s trick allowed it to work. Which allows me to do simple things like

SomeObject.SetActive(OpaqueDetector.IsOpaque)

This, for instance, I use to disable the MotionControllers prefab inside the MixedRealityCameraParent because I don’t want all that stuff to be initialized.

image

I also use it to skip the first phase of the HoloLens app – where the user has to detect the floor.

Conclusion

If you are writing a HoloLens app now, especially if you think general market, it’s almost a no-brainer to support Immersive headsets as well. Although HoloLens seems to do pretty well worldwide judging from my download numbers, adding an Immersive app has added a serious spike to those numbers, and opened my app to a considerably wider audience.

Disclaimer: as with everything in this super fast moving space, this is how it works now. I think the plan is for Unity to release a unified editor that can generate both apps in the near future. But it’s already so easy that you would be crazy not to go forward now.

One final tip: be sure to regularly test on both types of devices if you are serious about expanding to Immersive. A bit of code that works fine on one type may not work so hot on the other – or sometimes not even compile. You still have to watch your step, but it’s magnitudes easier now than it was in the era running up to the Fall Creators Update release.

Enjoy! I hope you build something awesome!

14 October 2017

Lessons learned from adapting Walk the World from pure HoloLens to Windows Mixed Reality

Heads-up (pun intended)

image

This is not my typical code-with-sample story - this is a war story from the front line of Windows Mixed Reality development. How did I get here, what did I learn, what mistakes did I make, what scars do I have to show, and how did I win in the end?

The end

On the evening (CET) of Tuesday, October 10, 2017, Kevin Gallo - VP of Windows Developer Platform - announced in London the release of the SDK for the Windows 10 Fall Creators Update and the opening of the Windows Store for apps targeting that release - including Mixed Reality apps. 

Mere hours after that - Thursday had just arrived in the Netherlands - an updated version of Walk the World with added support for Windows Mixed Reality passed certification, and became available for download in the store on Friday the 13th around 8:30pm CET. I was able to download, install and verify it was working as I expected. Four days before the actual official rollout of the FCU including the Mixed Reality portal, I was in the Store. Against all odds, I not only managed to make my app available, but also get it available as a launch title.

Achievement unlocked ;)

What happened before

On June 28th, 2017, I was invited to Unity Unite Europe 2017 by Microsoftie Desiree Lockwood, whom I met numerous times in the course of becoming an MVP. Not having to fly 9 hours to meet an old friend but only having to take a short hop on a train, I gladly accepted. On a whim, I decided to bring my HoloLens with Walk the World for HoloLens loaded on it. We had lunch and I demoed the app, showing Mount Rainier about 4 meters high, in a side building. That apparently made quite an impression. Talk quickly moved to the FCU and Mixed Reality, and how much work it would be to make my app available for MR as well. In a very uncharacteristic moment of hubris I said "you get me a headset, I will get you this app". I got guidance on how to pitch my app, I followed the instructions, and on July 27th the headset arrived.

Suddenly it was time to make sure I lived up to my big words.

Before

Challenge 1: hardware

My venerable old development PC, dating back to 2011, had a video card that in no way in Hades would be able to drive a headset. I was able to get a 2nd hand video card, and the PC, running the Creators Update AKA RS2, said it was ready to rock. So I happily enabled a dual boot config, added the Fall Creators Update Insiders preview, and then ran into my first snag.

On the Creators Update the Mixed Reality portal is nothing more than a preview; the preview ran nicely on my old PC, but the upcoming production version apparently would not. Maybe I should have read this part of the Mixed Reality development documentation better. Not only did the GPU not cut it by far, the CPU was way too old as well. So with a headset on the way, I was looking at this.

After

Fortunately, one of my colleagues is an avid gamer. She and her husband took a look at the specs, and built an amazing PC for me in a few days. Its specs are:

  • CPU: AMD Ryzen 7 1700, 3.0 GHz (3.7 GHz Turbo Boost) socket AM4 processor
  • Graphics card: Gigabyte GeForce GTX 1070 G1 Gaming 8GB
  • Motherboard: ASUS PRIME B350-PLUS, socket AM4
  • Memory: Corsair 16 GB DDR4-3000
  • Storage: Crucial MX300 1TB M.2
  • Power supply: Seasonic Focus Plus Gold 650W

This is built into a Fractal Design Core 2500 Tower with an extra Fractal Design Silent Series R3 120mm Case fan. My involvement in the actual creation of this monster was supplying maximum physical dimensions and entering payment details. Software is my shtick, not hardware. But I can tell you this device runs Windows Mixed Reality like a charm, and very stable, too. Thanks Alexandra and Miles!

Lesson 1: RTFM, and then wait till the FM is indeed final before making assumptions.

Lesson 2: don’t skimp on hardware, especially when you are aiming for development.

Challenge 2: tools in flux

Developing for Windows Mixed Reality in early August 2017 was a bit of a challenge. Five factors were in play:

  • The Fall Creators Update Insiders preview
  • The Mixed Reality Portal
  • Visual Studio 2017.x (a few updates came out during the timeframe)
  • Unity 2017 (numerous versions)
  • The HoloToolkit (halfway rechristened the Mixed Reality Toolkit) – or actually, the development branch for Mixed Reality.

Only when all five of these stars aligned would things actually work together. Only three of the stars were in Microsoft’s control – Unity is of course made by Unity, and the Mixed Reality Toolkit is an open source project only partially driven by Microsoft. Four of them were very much in flux. A new version would come out for one of these stars, and the whole constellation would start to wobble. Fun things I had to deal with were, amongst others:

  • For quite some time, Unity could not generate Unity C# projects, but only ‘player’ projects, which meant debugging was nearly impossible. But it also made it a fun second-guessing-the-compiler game, kind of like in the very old days before live debugging (yes kids, that’s how long I have been developing software). Effectively, this meant I had to leave the "Unity C# Projects" checkbox unchecked in the build settings, because checking it created something the compiler did not want to compile – let alone something deployable.
  • An update in Visual Studio 2017 made it impossible to run Unity generated projects unless you manually edited project.lock.json or downgraded Visual Studio (which, in the end, I did).
  • For a while, apps ran only once; then you had to reset the MR Portal. Or they only showed a black screen. The next time, they ran flawlessly. This was caused by a video card driver crash – in a manner of speaking, a 6th star in the constellation, one that fortunately quickly disappeared.
  • For a while, I could not start apps from Visual Studio. I could only start them from the start menu. And only from the desktop start menu. Not from the MR start menu.
  • The Mixed Reality version and the HoloLens version of the app got quite far out of sync at one point.

Lesson 3: on the bleeding edge is where you suffer pain. But you also get the biggest gain. And this is where the community’s biggest strengths come to light.

A tale of two tool sets

I wanted to move forward with Mixed Reality, but at the same time I wanted to maintain the integrity of the HoloLens version. So although the sources I wrote myself remained virtually the same, at one point the versions of Unity, the Mixed Reality Toolkit and even Visual Studio that I needed were different. For HoloLens development I used:

  • Visual Studio 2017 15.3.5
  • Unity 2017.1.1f1
  • The Mixed Reality Toolkit master branch.

For Mixed Reality development I used:

  • Visual Studio 2017 15.4.0 preview 5
  • Unity 2017.2.0f1
  • The Mixed Reality Toolkit Dev_Unity_2017.2.0 branch

Why a Visual Studio preview? Well, that particular preview contained the 16299 SDK (as fellow MVP Oren Novotny pointed out in a tweet), and although I did not know for sure that 16299 would indeed be the FCU, I decided to go for it. Late afternoon (CET) on Sunday, October 8, I built the package, pressed it through the WACK, and submitted it. And as I wrote before, it sneaked through shortly after the Store was declared open, becoming an unplanned Mixed Reality release title. Unplanned by Microsoft, that is. It was definitely planned by me. ;)

In the meantime, things are still changing – see this screenshot from the HoloDeveloper Slack group, which I highly recommend joining, especially the immersive_hmd_info and mrtoolkit_holotoolkit channels, as these give a lot of up-to-date information on the five-star-constellation changes and wobbles:

image

Lesson 4: keep tight track of your tool versions.

Lesson 5: Join the HoloDeveloper slack channel (and this means something from a self-proclaimed NOT fan of Slack;) )

A tale of two source trees

As I already mentioned, I needed to use two versions of the Mixed Reality Toolkit. These are distributed in the form of Unity packages, which means they insert themselves into the source of your app, as source. It’s not like referencing an assembly or a NuGet package. This had a kind of nasty consequence – if I wanted to move forward and keep my HoloLens app intact for the moment, I had to make a separate branch for Mixed Reality development, which is exactly what I did. So although the sources I wrote for my app are virtually the same, there was a different toolkit in my sources. Wow, did Microsoft mess this one up, right?

No. Not at all. Think with me.

  • I have a master branch that is based upon the master branch of the Mixed Reality Toolkit – this contains my HoloLens app
  • I have an MR branch based upon the Dev_Unity_2017.2.0 branch – this is the Mixed Reality variant. In this branch sits all the intelligence that makes the app work on a HoloLens and an immersive headset.
  • At one point the stars will align to a point where I can use one version of everything (most notably, Unity, which keeps on being a wild card in this constellation) to generate an app that will work on all devices. Presumably the Mixed Reality Dev_Unity_2017.2.0 branch will become the master branch. Then I will not merge to my master branch – that will be deleted. My MR branch will be based upon the latest stuff and will become the source of everything.

Changes in code and Unity objects

Preprocessing directives

In the phase in which I could not create Unity C# projects – hence debuggable projects – it seemed to me that UWP code within #if UNITY_UWP compiler directives did not get executed in the non-debuggable projects that were the only thing I could generate. Peeking in the HoloToolkit – I beg your pardon – Mixed Reality Toolkit, I saw that all UNITY_UWP compiler directives were gone and several others were used instead. I tried WINDOWS_UWP and lo and behold – it worked. Wanting to keep backwards compatibility, I changed all the #if UNITY_UWP directives to #if UNITY_UWP || WINDOWS_UWP. I am not really sure it's still necessary – looking at the Visual Studio solution build configuration now, I see both conditionals defined. I decided to leave it there.
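In practice the pattern looks something like this – a minimal sketch, with the class and URL invented for illustration; only the directive combination is taken from my actual code:

```csharp
using UnityEngine;

public class UrlLauncher : MonoBehaviour
{
    // Hypothetical example method; the #if pattern is what matters.
    public void OpenHelpPage()
    {
#if UNITY_UWP || WINDOWS_UWP
        // UWP-only API, compiled only into the UWP player/solution build
        UnityEngine.WSA.Launcher.LaunchUri("https://example.com/help", false);
#else
        // Editor and other platforms fall back to the regular Unity API
        Application.OpenURL("https://example.com/help");
#endif
    }
}
```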

Camera

Next to the tried and trusted HoloLensCamera, there's now the MixedRealityCamera. This also includes support for controllers and stuff. What you need to do is to disable (or remove) the HoloLensCamera and add a MixedRealityCameraParent:

image

This includes the actual camera, the in-app controller display (just like in the Cliff House, and it looks really cool) as well as a default floor – a kind of bluish square that appears at ground level. I think its apparent size is about 7x7 meters, but I did not check. As Walk the World has its own 'floor' – a 6.5x6.5 meter map – I did not need that, so I disabled that portion.

Runtime headset checking - for defining the floor

I am not sure about this one – but when running a HoloLens app, position (0,0,0) is the place where the HoloLens is at the start of the app. That is why Walk the World for HoloLens starts up with a prompt for you to identify the floor. That way, I can determine your height and decide how far below your head I need to place the map to make it appear on the floor. Simply a matter of sending a Raycast, having it intersect with the Spatial Mapping at least 1 meter below the user's head, and going from there. I will blog about this soon. In fact, I had already started doing so, but then this came around.

First of all, we don't have Spatial Mapping in an immersive headset. But experimenting, I found out that (0,0,0) is not the user's head position but apparently the floor directly beneath the headset on startup. This makes life a whole lot easier. I just check

if(Windows.Graphics.Holographic.HolographicDisplay.GetDefault().IsOpaque)

then skip the whole floor-finding experience, place the initial map at (0,0,0), and I am done.
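Wrapped in a bit of context, that check might look like the sketch below. The class and method names (MapPlacer, StartFloorDetection) are invented for illustration, and the UWP call is guarded by the same compiler directives discussed earlier so the code still compiles in the editor:

```csharp
using UnityEngine;

public class MapPlacer : MonoBehaviour
{
    void Start()
    {
        if (IsOpaqueDisplay())
        {
            // Immersive headset: (0,0,0) is already at floor level,
            // so the map can simply be placed at the origin.
            transform.position = Vector3.zero;
        }
        else
        {
            // HoloLens: (0,0,0) is the head position at startup, so run
            // the floor-finding flow (raycast against Spatial Mapping).
            StartFloorDetection();
        }
    }

    private bool IsOpaqueDisplay()
    {
#if UNITY_UWP || WINDOWS_UWP
        return Windows.Graphics.Holographic.HolographicDisplay.GetDefault().IsOpaque;
#else
        return false; // in the editor, behave like a see-through device
#endif
    }

    private void StartFloorDetection()
    {
        // Placeholder for the floor-finding flow described above.
    }
}
```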

Stupid tight loops

In my HoloLens app I got away with calling this on startup.

private bool CheckAllowSomeUrl()
{
    var checkLoader = new WWW("http://someurl");
    while(!checkLoader.isDone);
    return checkLoader.text == "true";
}

This worked, as it was in the class that was used to build the map. In the HoloLens app that class was not used until the user had defined the floor, so it had plenty of time to do its thing. Now, this line was called almost immediately after app startup, the whole thing got stuck in a tight loop, and I only got a black screen.

In the meantime, I have upgraded the Unity version that builds the HoloLens app from 5.6.x to 2017.1.x, and this problem now occurs there as well. Yeah, I know it's stupid. I wrote this quite some time ago, it worked, and I forgot about it. Thou shalt use yield. Try to pinpoint that while you cannot debug.
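The yield-based fix might look something like this – a sketch, with the callback parameter added by me so the result can be delivered asynchronously instead of being returned from a blocking call:

```csharp
using System;
using System.Collections;
using UnityEngine;

public class UrlChecker : MonoBehaviour
{
    // Coroutine version: instead of spinning on isDone and freezing
    // the main thread, yield until the download has completed.
    private IEnumerator CheckAllowSomeUrl(Action<bool> onResult)
    {
        var checkLoader = new WWW("http://someurl");
        yield return checkLoader;          // resumes when the request is done
        onResult(checkLoader.text == "true");
    }

    void Start()
    {
        StartCoroutine(CheckAllowSomeUrl(allowed =>
            Debug.Log("Allowed: " + allowed)));
    }
}
```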

Skybox

A HoloLens app has a black Skybox, as it does not need to generate a virtual environment – its environment is reality. An immersive headset does not have that, so in order to prevent the user from feeling like they are floating in an empty dark space, you have to provide some context. Now Unity has a default Skybox, but according to a Microsoft friend who helped me out, using the default Skybox is Not A Good Thing and the hallmark of low-quality apps. Since I had only ever made HoloLens apps, this had never occurred to me. With the aid of the HoloDeveloper Slack channel I selected this package of Skyboxes and picked the HaloSky, which gives a nice half-overcast sky.

Coming from HoloLens, having never had to bother with Skyboxes before, you can spend quite some time looking for how the hell you are supposed to set one. I assume it's all very logical for Unity buffs, but the fact is that you don't have to look in the Scene or the Camera – the most logical places to look, after all – but you have to select Window/Lighting/Settings from the main menu. That will give a popup where you can drag the Skybox material in.

image

You can find this in the documentation, on a page that is titled "How do I Make a Skybox?" – but since I did not want to make one, just use one, it took me a while to find it. I find this confusing wording rather typical of the Unity documentation. The irony is that the page itself is called "HOWTO-UseSkybox.html".
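For what it's worth, you can also set the skybox from code via RenderSettings – a minimal sketch, assuming you assign your skybox material (the HaloSky, for instance) in the inspector:

```csharp
using UnityEngine;

public class SkyboxSetter : MonoBehaviour
{
    // Assign the skybox material in the Unity inspector.
    public Material SkyboxMaterial;

    void Start()
    {
        // Equivalent to dragging the material into Window/Lighting/Settings.
        RenderSettings.skybox = SkyboxMaterial;
    }
}
```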

Upgrading can be fun - but not always

At one point I had to upgrade from Unity 5.6.x to 2017.1.x and later 2017.2.x. I have no idea what exactly happened and how, but at some point some settings I had changed from their defaults in some of my components in the Unity editor got reverted to default. This was fortunately easy to track down with a diff using TortoiseGit. I also noticed my Store icon got reverted to its default value – no idea why or how, but still.

In the course of upgrading, you will also notice some namespaces have changed in Unity. For instance, everything that used to be in UnityEngine.VR.WSA is now in UnityEngine.XR.WSA. Similar things happened in the Mixed Reality Toolkit. For reasons I don't quite understand, the TextToSpeechManager can now only be called from the main thread. For extra fun, in a later release its name changed to TextToSpeech (sans "Manager"), and the method name changed a little too.

Submitting for multiple device families

Having only submitted either all-device-type UWP apps or HoloLens apps that, well, only ran on HoloLens, I was a bit puzzled about how to make separate packages for separate device families. I wanted to have an x86-based package for HoloLens, and an x86/x64 package for Windows Mixed Reality. I actually built those on different machines, and I also gave them different version numbers.

image

But whatever I tried, I could not get this to work. If I checked the checkboxes for both Holographic and Windows, the portal said it would offer both packages on both platforms, depending on capabilities. I don't know if that would have caused any problems, but I got a tip from my awesome friend Matteo Pagani that I should dig into the Package.appxmanifest manually.

In my original Package.appxmanifest it said:

<Dependencies>
<TargetDeviceFamily Name="Windows.Universal" MinVersion="10.0.10240.0" 
                    MaxVersionTested="10.0.15063.0" />
</Dependencies>

For my HoloLens app, I changed that into

<Dependencies>
<TargetDeviceFamily Name="Windows.Holographic" MinVersion="10.0.10240.0" 
                    MaxVersionTested="10.0.15063.0" />
</Dependencies>

For my Mixed Reality app, I changed that into

<Dependencies>
<TargetDeviceFamily Name="Windows.Desktop" MinVersion="10.0.16299.0" 
                    MaxVersionTested="10.0.16299.0" />
</Dependencies>

And then I got the results I wanted, and I was absolutely sure the right packages were offered only to the right (and capable) devices.

Setting some more submission options

From the same Microsoft friend who alerted me to my Skybox issues, I also got some hints on how to submit a proper Mixed Reality headset app. There were a lot of options I had never been aware of. Under "Properties", for instance, I set this

image

as well as this under "System requirements" (left is minimum hardware, right is recommended hardware)

image

Actually, you should set quite a few more settings concerning the minimal PC specs. Detailed instructions can be found here, including the ones I just discussed ;)

Conclusion

It was a rocky ride but a fun one too. I spent an insane amount of time wrestling with unfinished tools, but seeing my app work on the Mixed Reality headset for the very first time was an adrenaline high I will not forget easily. Even better was the fact I managed to sneak in the back door to get my app in the Store ready for the Fall Creators Update launch - that was a huge victory.

In the end, I did it all myself, but I could not have gotten there without the help of all the people I already mentioned, not to mention some heroes from the Slack Channel, particularly Lance McCarthy and Jesse McCulloch who were always there to get me unstuck.

In hindsight, Mixed Reality development is not harder than HoloLens development. In fact, I'd call it easier, because you are not constrained by device limits, deployment and testing go faster, and the Mixed Reality Toolkit has evolved to a point where things get insanely easy. Nearly all my woes were caused by my stubborn determination to be there right out of the gate, so I had to use tools that still had severe issues. Now that stuff is fleshed out, there's not nearly as much pain. The fun thing is, when all is said and done, HoloLens apps and Mixed Reality apps are very much the same. Microsoft's vision of a platform for 3D apps is really coming true. You can re-use your HoloLens knowledge for Mixed Reality – and vice versa. Which brings us to:

Lesson 6: if you thought HoloLens development was too expensive for you, get yourself a headset and a PC to go with it. It's insanely fun, and a completely new, exciting and nearly empty market is waiting for you!

Enjoy!

02 August 2017

Creating a 3D topographical map in your HoloLens / Windows MR app with the Bing Maps Elevation API

Intro

All right, my friends. It's time for the real deal, the blog post I have been wanting to write since I first published the 3D version of Walk the World some two months ago. It took me a while to extract the minimal understandable code from my rather convoluted app. Everyone who has ever embarked on a trip of 'exploratory programming' (meaning you have an idea of what you want to do, but only some vague clues how) knows how you end up with a repo (and code) full of failed experiments, side tracks, etc. So I had to clean that up a little first. Also, my app does a lot more than just show the map, and those features would obscure the general idea. As a bonus – after writing this blog post I finally actually understand myself how, and most importantly why, the app works :).

So, without further ado, I am going to show you how to display a 3D map in your HoloLens or Windows MR headset. Just like in Walk the World. I will build upon my previous post, in which I showed you how to make a flat slippy map. This time, we are going 3D.

The general idea

As you can read in the previous post, I 'paste' the actual map tiles – mere images – onto a Unity3D Plane. A Plane is a so-called mesh that consists of a grid of 11x11 points that form the vertices. If I could somehow ascertain the actual elevation at those locations, I could move those points up and down and actually get a 3D map. The tile image itself will be stretched along. The idea of manipulating the insides of a mesh, which turns out to be very simple, is explained by the awesome Rick Bazarra in the first episode of his must-see "Unity Strikes Back" explanatory video series on YouTube, a follow-up to his Creative Coding with Unity series on Channel 9, which I consider the standard starting point for everyone who wants to get off the ground with Unity3D.

So where do we get those elevations? Enter the awesome Microsoft service called the Bing Maps Elevation API. It seems to be built-to-order for this task. Your first order of business - get yourself a Basic Bing Maps key.

Adding some geo-intelligence to the tile

The Bing Maps Elevation API documentation describes an endpoint GetElevations that allows you to get altitudes in several ways. One of them is a grid of altitudes in a bounding box. That is what we want - our tiles are square. The documentation says the bounding box should be specified as follows:

"A bounding box defined as a set of WGS84 latitudes and longitudes in the following order:
south latitude, west longitude, north latitude, east longitude"

If you envision a tile positioned so that north is up, we are required to calculate the geographical location of the top-right and bottom-left of the tile. The Open Street Maps Wiki provides code for the north-west corner of the tile, i.e. top left. I translated the code to C#...

//http://wiki.openstreetmap.org/wiki/Slippy_map_tilenames#C.23
private WorldCoordinate GetNorthWestLocation(int tileX, int tileY, int zoomLevel)
{
    var p = new WorldCoordinate();
    var n = Math.Pow(2.0, zoomLevel);
    p.Lon = (float)(tileX / n * 360.0 - 180.0);
    var latRad = Math.Atan(Math.Sinh(Math.PI * (1 - 2 * tileY / n)));
    p.Lat = (float) (latRad * 180.0 / Math.PI);
    return p;
}

... and it works fine, but we need the south-west and north-east points. Well, if you consider how the tiles are stacked, you can easily see that the north-east point is the north-west point of the tile to the right of our current tile, and the south-west point is the north-west point of the tile below our tile. Therefore we can use the north-west points of those adjacent tiles to find the values we actually need – like this:

public WorldCoordinate GetNorthEast()
{
    return GetNorthWestLocation(X+1, Y, ZoomLevel);
}

public WorldCoordinate GetSouthWest()
{
    return GetNorthWestLocation(X, Y+1, ZoomLevel);
}

That was easy, right?

Size matters

Although Microsoft is a US company, it fortunately has an international orientation, so the Bing Maps Elevation API returns no yards, feet, inches, miles, furlongs, stadia, or any other deprecated distance unit – it returns plain old meters. Which is very fortunate, as the Windows Mixed Reality distance unit is – oh joy – meters too. But it returns elevation in real-world values, and while it might be fun to show Kilimanjaro at its real height, it would be a bit too big to fit in my room (or any room, for that matter). Open Street Map is shown at a definite scale per zoom level – and as a GIS guy, I like the height to be correctly scaled, to get that real-life feeling. Once again referring to the Open Street Map Wiki – there is a nice table that shows how many meters a pixel is at any given zoom level. We will need the size per tile (which is 256 pixels, as I explained in the previous post), so we add the following code that will give you a scale factor for the available zoom levels of Open Street Map:

//http://wiki.openstreetmap.org/wiki/Zoom_levels
private static readonly float[] _zoomScales =
{
    156412f, 78206f, 39103f, 19551f, 9776f, 4888f, 2444f,
    1222f, 610.984f, 305.492f, 152.746f, 76.373f, 38.187f,
    19.093f, 9.547f, 4.773f, 2.387f, 1.193f, 0.596f, 0.298f
};

private const int MapPixelSize = 256;

public float ScaleFactor
{
    get { return _zoomScales[ZoomLevel] * MapPixelSize; }
}
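To make that concrete with some back-of-the-envelope arithmetic: at zoom level 14 a pixel covers about 9.547 meters, so a 256-pixel tile spans roughly 2444 meters of real terrain. Dividing real-world elevations by that factor scales them down to Unity meters – a quick sketch (Mount Rainier's height of about 4392 m is just an illustration):

```csharp
// Back-of-the-envelope check of the scale factor at zoom level 14,
// using the same constants as the code above.
float metersPerPixel = 9.547f;          // OSM zoom level table, zoom 14
int mapPixelSize = 256;                 // tile size in pixels
float scaleFactor = metersPerPixel * mapPixelSize;   // ~2444 m per tile

// Mount Rainier is about 4392 m high; divided by the scale factor
// that comes out at roughly 1.8 Unity meters in the scene.
float scaledHeight = 4392f / scaleFactor;
```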

Creating the request

Now we move to MapTile. We add the following code to download the Bing Maps Elevation API values

private string _mapToken = "your-map-token-here";

public bool IsDownloading { get; private set; }

private WWW _downloader;

private void StartLoadElevationDataFromWeb()
{
    if (_tileData == null)
    {
        return;
    }
    var northEast = _tileData.GetNorthEast();
    var southWest = _tileData.GetSouthWest();

    var urlData = string.Format(
    "http://dev.virtualearth.net/REST/v1/Elevation/Bounds?bounds={0},{1},{2},{3}&rows=11&cols=11&key={4}",
     southWest.Lat, southWest.Lon, northEast.Lat, northEast.Lon, _mapToken);
    _downloader = new WWW(urlData);
    IsDownloading = true;
}

This simply queries the TileInfo structure using the new methods we just created. Notice it then builds the URL, containing the bounds, the hard-coded 11x11 points that are in a Unity Plane, and the key. Then it calls a piece of Unity3D code called "WWW", which is a sort of HttpClient named by someone with a lot of imagination (NOT). And that's it. We add a call to the existing SetTileData method like this:

public void SetTileData(TileInfo tiledata, bool forceReload = false)
{
    if (_tileData == null || !_tileData.Equals(tiledata) || forceReload)
    {
        TileData = tiledata;
        StartLoadElevationDataFromWeb();
    }
}

so that whenever tile data is supplied, it not only initiates the downloading of the tile image, but also the downloading of the 3D data.

Processing the 3D data

Next up is a method ProcessElevationDataFromWeb, which is called from Update (so about 60 times a second). In this method we check whether the MapTile is downloading – and if the download is done, we process the data:

protected override void OnUpdate()
{
    ProcessElevationDataFromWeb();
}

private void ProcessElevationDataFromWeb()
{
    if (TileData == null || _downloader == null)
    {
        return;
    }

    if (IsDownloading && _downloader.isDone)
    {
        IsDownloading = false;
        var elevationData = JsonUtility.FromJson<ElevationResult>(_downloader.text);
        if (elevationData == null)
        {
            return;
        }

        ApplyElevationData(elevationData);
    }
}

An ElevationResult is a class to deserialize the result of a call to the Bing Maps Elevation API into. I entered the result of a manual call into Json2CSharp and got a class structure back – only I changed all properties into public fields so the rather limited Unity JsonUtility, which does not seem to understand the concept of properties, can handle it. I also initialized the lists in the objects from the constructors. It's not very interesting, but if you want a look, go here in the demo project.
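For reference, such a structure might look roughly like the sketch below – based on the documented shape of the Bing Maps Elevation API response, with public fields instead of properties so JsonUtility can populate them (the field names must match the JSON keys exactly):

```csharp
using System;
using System.Collections.Generic;

[Serializable]
public class ElevationResult
{
    public List<ResourceSet> resourceSets = new List<ResourceSet>();
}

[Serializable]
public class ResourceSet
{
    public List<Resource> resources = new List<Resource>();
}

[Serializable]
public class Resource
{
    // 121 elevation values (in meters) for an 11x11 grid, in row order
    public List<int> elevations = new List<int>();
}
```

This matches the way the data is accessed later on: elevationData.resourceSets[0].resources[0].elevations[i].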

Applying the 3D data.

So now it’s time to actually move the mesh points up and down. Mostly using code I stole from Rick Bazarra, with a few adaptations of mine:

private void ApplyElevationData(ElevationResult elevationData)
{
    var threeDScale = TileData.ScaleFactor;

    var resource = elevationData.resourceSets[0].resources[0];

    var verts = new List<Vector3>();
    var mesh = GetComponent<MeshFilter>().mesh;
    for (var i = 0; i < mesh.vertexCount; i++)
    {
        var newPos = mesh.vertices[i];
        newPos.y = resource.elevations[i] / threeDScale;
        verts.Add(newPos);
    }
    RebuildMesh(mesh, verts);
}

private void RebuildMesh(Mesh mesh, List<Vector3> verts)
{
    mesh.SetVertices(verts);
    mesh.RecalculateNormals();
    mesh.RecalculateBounds();
    DestroyImmediate(gameObject.GetComponent<MeshCollider>());
    var meshCollider = gameObject.AddComponent<MeshCollider>();
    meshCollider.sharedMesh = mesh;
}

First we get the scale factor – that’s simply the value by which the elevation data must be divided to make it match the current zoom level. Next, we get the elevation data itself, which sits two levels down in the ElevationResult. And then we modify the elevation of the mesh points to match the elevations we got. For some reason – and that's why I said the Bing Maps Elevation API looks to be built-to-order for this task – the points come in at exactly the right order for Unity to process in the mesh.

As I learned from Rick, you cannot modify the points of a mesh; you have to replace them. So we loop through the mesh points and fill a list with points that have their y – the vertical direction – changed to a scaled value of the elevation. Then we call RebuildMesh, which simply replaces the entire mesh with the new vertices, does some recalculation and rebuilds the collider, so your gaze cursor will actually play nice with the new mesh. I also noticed that if you don’t do the recalculation stuff, you will end up looking partly through tiles. I am sure people with a deeper understanding of Unity3D will understand why. I just found out that it needs to be done.

Don't press play yet! There are a few tiny things left to do to make the result look good.

Setting the right location and material

First of all, the map is kind of shiny, which was more or less okay-ish for the flat map, but if you turn the map into 3D you will get an overbright effect. So open up the project in Unity, create a new material “MapMaterial” and apply the properties as displayed below left. The color of the material should be #BABABAFF. When that is done, drag it on top of the MapTile (see right image).

imageimage

Then, the app is still looking at Redmond. While that’s an awesome place, there isn’t much spectacular to see as far as geography is concerned. So we mosey over to the MapBuilder script. There we change the zoom level to 14, the Latitude to 46.78403 and the Longitude to -121.7543

image

It's a little east and quite a bit south of Redmond. In fact, when you press play, you will see a landmark that is very familiar if you live anywhere near the Seattle area or have visited it:

image

famous Mount Rainier, the volcano sitting about 100 km from Seattle, very prominently visible from aircraft – weather permitting. To get this view, I had to fiddle with the view control keys a little after pressing play – if you press play initially, you will see Rainier from a lot closer up.

And that, my friends, is how you make a 3D map in your HoloLens. Of almost any place in the world. Want to see Kilimanjaro? Change Latitude to -3.21508 and Longitude to 37.37316. Press play.

image

Niagara Falls? Latitude 43.07306, Longitude -79.07561, and change the zoom level to 17. Rotate the view forward a little with your mouse and pull back. You have to look down. But then, here you go.

image

GIS is cool, 3D GIS is ultra-cool! All that is left is to generate the UWP app and deploy it to your HoloLens, or view it in your new Windows Mixed Reality device.

Caveat emptor

Awesome, right? Now there are a few things to consider. In my previous post I said this app was a bandwidth hog, as it downloads 169 tiles per map. In addition, it now also fires 169 requests per map at the Bing Maps Elevation API. For. Every. Single. Map. Every time. Apart from the bandwidth and power consequences, there's another thing to consider. If you go to this page and click "Basic Key", you will see something like this:

image

What it boils down to is: if your app is anywhere near successful, it will eat through your allotted request limit very fast, you will get a mail from the Bing team kindly informing you of this (been there, got that) – and then suddenly you will have a 2D app again. You will have to buy an enterprise key, and those are not cheap. So I advise you to do some caching – both in the app and, if possible, in an Azure service. I employed a Redis cache to that end.
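An in-app cache can be as simple as a dictionary keyed on tile coordinates, so revisiting the same tile skips the network call entirely – a minimal sketch, with all names invented for illustration:

```csharp
using System.Collections.Generic;

// Minimal in-memory elevation cache, keyed on tile X/Y/zoom.
// All names here are invented for illustration.
public class ElevationCache
{
    private readonly Dictionary<string, int[]> _cache =
        new Dictionary<string, int[]>();

    private static string Key(int x, int y, int zoom)
    {
        return zoom + "/" + x + "/" + y;
    }

    public bool TryGet(int x, int y, int zoom, out int[] elevations)
    {
        return _cache.TryGetValue(Key(x, y, zoom), out elevations);
    }

    public void Store(int x, int y, int zoom, int[] elevations)
    {
        _cache[Key(x, y, zoom)] = elevations;
    }
}
```

The idea would be to consult TryGet before firing the elevation request, and call Store once the response has been parsed; a persistent on-disk or Azure-side cache follows the same pattern with a different backing store.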

Furthermore, I explained that I calculate the north-east and south-west points of a tile using the north-west points of the tiles to the right of and below the current tile. If those tiles are not present, because you are at the edge of the map, I have no idea what will happen – but presumably it won't work as expected. You can run into this when you are sufficiently zoomed out in the very south or east of the map. But then you are either at the International Date Line (which runs from the North Pole to the South Pole on the side of the Earth exactly opposite the Greenwich Meridian) or at Antarctica. At the first spot, there's mostly ocean (why else do you think they put it there?) and thus no (visible) geography to speak of. As far as Antarctica goes, you'll hit another limitation, for it clearly says in the Bing Maps Elevation API documentation:

"There is a limitation in that latitude coordinates outside of the range of -85 and 85 are not supported."

So beware. Stay away from the Earth's edges. Your app might fall off :).

Some assembly required, batteries not included

Indeed, it does not look exactly like in the videos I showed. Walk the World employs different map tile sets (plural indeed), and there's also all kinds of other stuff my app does – like sticking the map to the floor so that even at high elevations you have a nice overview, reverse geocoding so you can click on the map to see what's there, tracking the user's moves so it can make a new map appear where the user is walking off it – connecting to the old one, zooming in/out centered on the user's location on the map, showing a map of the physical surroundings... there's a lot of math in there. I only showed you the basics. If you need a HoloDeveloper who knows and understands GIS to the core, you know who to contact now :)

Conclusion

Once you know the basics, it's actually pretty easy to create a 3D scaled map of about anywhere in the world – that is, anywhere the Bing Maps Elevation API is supported. The 3D stuff is actually the easy part – knowing how to calculate tiles and build a slippy map is harder. But in the end, it is always easy when you know what to do. Like I said, I think GIS is a premier field in which Mixed Reality will shine. I am ready for it; I hope you are too. Innovate or die – there are certainly companies I know that could take that advice and get moving.

Get the demo project and get inspired!

Credits

Thanks to René Schulte, the wise man from 51.050409, 13.737262 ;) for pointing me to the Bing Maps Elevation API. And of course to Rick Bazarra, who inspired me so often and actually provided some crucial code in his YouTube training video series.