The Zed Mini Camera is here! I first have to mount it on the Vive Pro and install the relevant Zed SDK, and then I can start developing AR/MR. Wow, things do move fast.
Prior to diving into it, I will first do some research on various resources related to developing MR/AR using the Zed.
The first thing I come across is the page by StereoLabs: https://www.stereolabs.com/docs/getting-started/. This seems like a good place to start. It contains information about installation, usage instructions and even beginner tutorials about AR/MR development.
So that’s good. But now, let’s think a bit about how we are going to proceed with this and set some smart goals for the day.
- Mount the camera on the Vive, install the Zed SDK and successfully demo it or play a game like Ping Pong. This will verify that it works.
- Study the Zed manual page and go through their tutorials to make basic AR/MR apps. That will give me a handle on what developing with these tools looks like.
At that point, I will feel safe and sound using the Zed Mini for AR/MR, and I may even have some ideas about what kind of app I want to make. But I do believe that this is where I should stop and look around a bit. I am still quite a novice at developing with C# and Unity, and later in this project I will mainly need AR/MR in order to build an interface on which the user interaction can be based. Thus, I will definitely need to dive more deeply into game programming. So the following goals are for the weekend:
- Follow a more advanced Unity Game Tutorial and get more acquainted with using Scripts, Prefabs etc.
- Read more formal C# from my book, C# Precisely.
Now, looking even further ahead, next week’s tasks will focus on building an AR/MR UI for testing. That will include making clickable events, text, and perhaps an android that can simulate the sign language motions. All of those definitely seem within my grasp at the moment, and they will be even more so after this weekend’s tutorials.
Lastly, the issue of wearing the gloves and exchanging their data with the MR/AR application will come up soon. This is something I will have to discuss with the rest of the team, but for now I believe that as long as there exists a way to process the information, the only thing left is piping it somehow to the Unity application. We’ll see.
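Nothing is decided yet, but one plausible route would be to have the glove software stream sensor readings over the network and pick them up in Unity. A minimal sketch of the receiving side, assuming (purely as a placeholder) UTF-8 text packets over UDP port 9050:

```csharp
using System.Net;
using System.Net.Sockets;
using System.Threading;
using UnityEngine;

// Hypothetical receiver for glove data. The transport (UDP), the port
// number, and the packet format are all assumptions at this stage.
public class GloveDataReceiver : MonoBehaviour
{
    private UdpClient client;
    private Thread listenThread;

    // Last packet received; read this from other scripts on the main thread.
    public volatile string latestPacket = "";

    void Start()
    {
        client = new UdpClient(9050);
        listenThread = new Thread(Listen) { IsBackground = true };
        listenThread.Start();
    }

    void Listen()
    {
        var endPoint = new IPEndPoint(IPAddress.Any, 0);
        while (true)
        {
            // Blocks until a datagram arrives, then stores it as text.
            byte[] data = client.Receive(ref endPoint);
            latestPacket = System.Text.Encoding.UTF8.GetString(data);
        }
    }

    void OnDestroy()
    {
        client?.Close();
    }
}
```

The listening happens on a background thread because Unity’s main loop must not block on network I/O; other scripts can then poll `latestPacket` in `Update()`.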
Ok, so problem #0. The base where the camera is supposed to be mounted isn’t made for VivePro and so it’s not sticking really well. I need to be careful and get the replacement printed mount as soon as possible.
Problem #1: The camera needs to be connected to a USB port and the computer only has two. That means no mouse. I need to bring a multi-port USB adaptor from home.
Securing camera wire with Vive headset wire: OK!
Zed SDK GitHub: https://github.com/stereolabs/
The Zed SDK required a CUDA installation. Installed the Zed SDK; it required a computer restart. The USB cable is not reversible, so one has to follow the arrows on it; that caused the camera to not be detected at first. Now it works. Downloading and installing the Zed World app, which will let me demo AR/MR.
Ran Zed World. First impressions: not completely what I expected. The camera has a limited FOV and cannot really see things in detail. I also didn’t see any AR/MR going on. The computer also seemed to really be straining to handle it. Just now it crashed and I got a SYSTEM_THREAD_EXCEPTION_NOT_HANDLED, which isn’t good.
The second time I tried running Zed World, I had a more positive experience. I think the FOV is just that way, and it actually seems enough for our applications. The camera does seem to have a limited resolution, which prevents the user from making out very fine details, but that can still be ignored. My biggest issue was the AR. With the exception of some morphed, blurry images flying by in a way that is extremely difficult to detect or interact with, the overall AR experience was disappointing. Zed World glitched and was in general really slow, which is definitely something we do not want. I’ll look into this topic first.
I downloaded and started testing some things with the Zed Unity Plugin. It turns out (I think) that you also must import the SteamVR plugin for it to work; otherwise, when you run the scenes, only the camera works (the Vive isn’t involved). After that, I ran some demo scenes and got to see some AR/MR, which was cool. But it wasn’t that great, and the quality was definitely lower than I had anticipated. I suspect it’s perhaps my room setup and the fact that I am working in a very confined space. But at least it worked most of the time, and now I can dive into developing a simple AR application.
The first thing we learn by studying the documentation of the ZED Plugin for Unity is that there are custom camera rigs made by ZED specifically for AR/MR. These are the ZED_Rig_Mono and ZED_Rig_Stereo prefabs, and they take the place of the camera we were used to before. So it is important to delete the Main Camera from a Unity scene when planning to use these prefabs.
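Normally you would just delete the Main Camera in the Hierarchy and drag in the ZED rig by hand, but the same cleanup can be sketched in a script (purely illustrative; the check is only against the "MainCamera" tag):

```csharp
using UnityEngine;

// Illustrative sketch: remove the scene's default Main Camera so the
// ZED_Rig_Mono / ZED_Rig_Stereo prefab can provide the camera instead.
public class SceneZedPrep : MonoBehaviour
{
    void Awake()
    {
        // Camera.main returns the first enabled camera tagged "MainCamera".
        Camera main = Camera.main;
        if (main != null)
        {
            Debug.Log("Removing default Main Camera in favour of the ZED rig.");
            Destroy(main.gameObject);
        }
    }
}
```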
The camera GameObjects have access to a script called ZED Manager. It handles a lot of the parameters that go into forming the AR experience. There is a parameter called Input Type: select ‘USB’ and then set the Resolution as you like. Let’s see if something changed by running the scene (the input type was SVO before). The scene doesn’t play. Planetarium plays, but not sampleMR. Let’s try another scene. The drones sort of work and so does disco, which is good. Moving on.
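These settings can also be inspected from code. A sketch of reading them at runtime, with the caveat that the exact field names on `ZEDManager` are assumptions from the plugin version I have and may differ in other releases:

```csharp
using UnityEngine;

// Sketch: log the ZED Manager's current settings. Field names
// (inputType, resolution) are assumed and may vary by plugin version.
public class ZedSettingsLogger : MonoBehaviour
{
    void Start()
    {
        // The ZEDManager component sits on the ZED_Rig_* prefab's root.
        ZEDManager manager = FindObjectOfType<ZEDManager>();
        if (manager == null)
        {
            Debug.LogWarning("No ZED rig found in the scene.");
            return;
        }
        Debug.Log("ZED input type: " + manager.inputType);
        Debug.Log("ZED resolution: " + manager.resolution);
    }
}
```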