#VR #UX
May 07, 2018
4min read
VR Zone at MCE! Meet Our Teleoperating Robot
Don't have your ticket yet? Hurry! You can register at the regular price only until May 10th!
As part of the VR/AR team's research & development activities, we are keen on exploring the potential applications of VR for business. We believe that at some point in the future, emerging technologies like Machine Learning, IoT, VR/AR and Robotics will converge, creating powerful everyday tools for decision makers. In close collaboration with NoMagic.ai - which built the software platform controlling the robot - and Encon-Koester, exclusively for this year's MCE, we've developed a VR industrial-training-inspired game in which the virtual actions of the human player are followed by a real-life robot player.
Where VR-controlled robots come into play
VR is a new medium that completely immerses the user's senses in a virtual environment and creates scenarios that feel almost realistic. Such techniques are especially well suited to virtual reality manufacturing training: teaching the high-level skills needed to operate heavy machinery in Industry 4.0, without the risk of costly mistakes. In our game you will become a conveyor belt operator in a futuristic factory orbiting Earth. You'll need good reflexes and quick thinking to get a high score. But don't worry - our highly reliable robot Marvin will help you do the job properly ;) While you do your part of the job in VR, Marvin will be doing some of it in real life. For the attendees who take part in the competition and get top scores, we have prepared amazing prizes. There may even be some Easter eggs ;)
Jarek
Principal Software Engineer
I can't give you any more details about our VR experience, as I don't want to spoil the fun. Stay tuned - the mystery will be unravelled during the MCE conference. If you want to learn first-hand about the implementation details of the VR game and the robotic software, Jarek and I will be giving a talk on the 2nd day of the conference.
Besides the already mentioned use of VR in staff training, we believe that in the future we will see more VR used to teleoperate robots remotely. In fact, Toyota has already prototyped such a virtual reality robot system.
The idea behind MCE
5 years ago, when we first started organizing MCE, there were few, if any, conferences that tried to combine the two so-called "opposite" worlds of developers and designers. We've always believed that developers and designers should have a common place to network and exchange ideas and experiences. With MCE we managed to create a space for both groups to meet. By listening to inspiring talks from world-class experts, combined with various networking opportunities, attendees can get out of their comfort zone and get inspired. Met with great success, year by year we've been making MCE bigger and more ambitious. And now we are in 2018.
For the last two years, there has been more and more buzz about VR, AR, AI, Blockchain and Robotics. We are being fed crazy stories like the first AI robot with citizenship, robots doing backflips, or robots trained with human movements through VR. All these technologies are still emerging, but in a couple of years they will redefine the world we live in. Smartphones will expand into smart glasses or smart lenses, Skype or Spotify will have to run on elderly-care robots, and so on. This is the main inspiration behind the MCE 2018 theme - #redefiningtech.
What's to come
During 2 intensive days, the audience will discuss immersive experiences in VR, UX and AI, cross-platform development, digital inclusion and more.
The line-up features some of the biggest names in the tech industry. Just to name a few:
Pete Trainor - Co-Founder and Director of Human Focused Technology at Us Ai and author of "Hippo - The Human Focused Digital Book"
Tom Greever - author of the bestseller "Articulating Design Decisions"
Mike Alger - VR designer at Google
Joanna Chwastowska - Engineering Coach and Growth Manager at DeepMind Health
Chris Wróbel
VR/AR Expert
#VR #UX
July 02, 2018
6min read
How to Make a VR Game? Step-By-Step UX Review
The MCE conference gathered developers, designers and tech influencers from all around the world. The 5th edition of the event was built around 2 tracks of talks, with well-known speakers representing various industries. One of the attractions waiting for participants was the VR zone, with the Sort-It VR game that I had the pleasure to work on. In this article I would like to focus on the UX aspects of our VR experience.
How to design and develop a VR Game?
Our main goal was to design and develop a VR experience that would be fun for MCE 2018 attendees and also utilize a robotic arm (for real! We could use the UR-3 robot thanks to the NoMagic company). The experience had to be interesting both for the user currently wearing the VR headset and for the people around, watching it in the real world. So when we got the UR-3 robotic arm at our disposal, we were aware of the huge possibilities, and also of the big challenge that came with it. What is more, we used the HTC Vive, so we had the opportunity to make 6 Degrees of Freedom fun for our players. Above all, we wanted to inspire industrial decision makers to consider VR in their businesses.
Why a VR Game
Virtual Reality is a fast-growing field. That's why giants like Facebook and Google invest considerable resources in this technology. Our challenge was not to make our game... just a game. It was supposed to be a game that is fun and delivers measurable value, with the potential to be used in a business context as well. What is more, the robot (we named him Marvin, inspired by The Hitchhiker's Guide to the Galaxy) had to help us make our app unique.
UX research first
You can't make a great product without UX research! We started with the fundamental step of understanding our users and context. We used the Personas tool and constructed 3 types of personalities we wanted to involve in our VR space. We tried to empathize with them to understand the needs and perspectives of our potential users - their emotions, inspirations and expectations. All in order to make their conference experience fun! We've made 3 personas:
Identifying personas helped us to empathize with and understand our guests. This kind of research always helps with creating solutions tailored to users' needs. It also allows us to predict feelings that we want to avoid.
Next, we conducted broad research to learn as much as we could about the VR medium, the technical possibilities of integrating our VR game with the robot in real life, and previous achievements in this field. We already knew which tools would be helpful (or simply necessary): Unity3D, Sketch, diagramming and wireframing tools, paper, pencils, VR hardware and post-it notes.
Let's ideate!
I specifically remember the excitement during the ideation phases. We brainstormed to generate as many experience concepts as possible. To be thorough, we ran a participatory process with all stakeholders (Developer, UX Designers, VR Designer, Project Manager and Product Owner), because every perspective was crucial to understanding the core of the project. It's absolutely fantastic how many crazy ideas can be produced by just a few people with totally different skills and personalities. Finally, we chose 3 main concepts to refine, describe and sketch as simple app visual flows.
After validating the concepts against the initial requirements, we were left with one final game concept to develop. By then we knew we wanted to create a game that imitates sorting items off a conveyor belt, just like in an industrial factory. We made a very simple prototype of this experience in Unity. During our first demo we let the first players sort simple cubes into containers by color. Despite the lack of scoring logic and the other things that make games addictive, players reacted enthusiastically. The contact with our VR medium and stepping into the virtual world was a great experience for our first game testers, and at the same time very encouraging for us.
Good flow is good. Test it well!
Next, we created the game rules to engage our players. The setup was simple: the player has limited time to get a high score by effectively sorting items on the conveyor belt. First, we had to adapt our interaction ideas to the technical possibilities of the VR hardware. We wanted to make sure that our users were comfortable with grabbing and throwing items, so we had to set up the VR controllers as intuitively and simply as possible. Our tutorial allowed us to guide the players through the experience, making it more fun. To test our work, we made a few simple prototypes, which helped us choose the best solutions. For example, we were not sure what type of button interaction would be more intuitive for people - interestingly, most people preferred touching the button directly, and sometimes they even tried to do it by pushing the trigger on the controller.
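The rules above (a timed round in which correct sorts earn points) can be captured in a few lines. This is a minimal, hypothetical Python sketch of that logic, not the actual Unity implementation; all names and point values are illustrative assumptions.

```python
from dataclasses import dataclass

# Illustrative values - the real game's timing and scoring are not public.
ROUND_SECONDS = 90.0     # assumed round length
POINTS_CORRECT = 10      # assumed reward for a correct sort
POINTS_WRONG = -5        # assumed penalty for a wrong container

@dataclass
class SortRound:
    """One timed round of the conveyor-belt sorting game."""
    time_left: float = ROUND_SECONDS
    score: int = 0

    def sort_item(self, item_color: str, container_color: str) -> None:
        """Score a sort attempt; sorting stops counting when time is up."""
        if self.time_left <= 0:
            return
        if item_color == container_color:
            self.score += POINTS_CORRECT
        else:
            self.score += POINTS_WRONG

    def tick(self, dt: float) -> None:
        """Advance the round timer by dt seconds."""
        self.time_left = max(0.0, self.time_left - dt)

game = SortRound()
game.sort_item("red", "red")    # correct container: +10
game.sort_item("red", "blue")   # wrong container: -5
print(game.score)               # 5
```

Keeping the rules in one small, testable unit like this is what let us iterate on scoring quickly during playtests.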
Remember! Test your app with users, ask them questions and analyze the data - it will give you great feedback and an opportunity to refine the effects of your work. One more important thing: the flow kept evolving for almost the entire duration of the project.
Customer Journey Map in VR
We needed to look at the game's structure from a wider perspective and visualize it more graphically. To do so, we used a Customer Journey Map - a clear visual sketch, helpful in picturing the player and the environment around him or her, so we could focus on interactions and the game experience. We found a great tool for screenplays, perfect for visualizing the VR player's perspective. What is more, to make the game comfortable to play, we prototyped the gamespace - distances, sizes of items, etc. - because that let us check it with users at an early stage of development.
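One practical trick when prototyping a gamespace like this is to keep the dimensions in a single tunable config, so each playtest adjustment is one number. A hypothetical sketch - every value below is an illustrative guess, not the game's real measurements:

```python
# Hypothetical gamespace config: centralizing distances and sizes makes
# early comfort testing cheap. All values are illustrative assumptions.
GAMESPACE = {
    "belt_distance_m": 0.6,     # conveyor belt distance from the player
    "belt_height_m": 1.0,       # belt height, roughly waist level
    "item_size_m": 0.15,        # size of a graspable item
    "container_spacing_m": 0.5, # gap between sorting containers
}

def within_reach(distance_m: float, arm_reach_m: float = 0.75) -> bool:
    """Check a prototype distance against a comfortable seated/standing
    arm reach (the reach value is also an assumption)."""
    return distance_m <= arm_reach_m

print(within_reach(GAMESPACE["belt_distance_m"]))  # True
```

In the real project this checking happened with people in the headset rather than in code, but the principle is the same: make the numbers easy to change between tests.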
Moodboard mood
Every game needs a defined mood board - it's a very good way to start building the virtual environment. Everyone in the team shared their ideas during a brainstorming session, and we finally established assumptions about the style and overall feeling of our game. We decided to place our player in an industrial space - bright, full of shadows and a bit dusty. Only some items were drawn in pastel colors. The objects to grab and sort were animated: we opted for funny little monsters in 4 colors and 4 different moods (sleepy, angry, giggling, shocked). We've also placed some blocks from the MCE Conference key visual that the players have to put on the screen in front of them (for some extra points). Towards the end of the project we came up with the idea of placing our factory in space, which is why you can see space behind your back while you're playing the game... and a Tesla drifting by with an astronaut inside. Sounds awesome? It is.
Wireframes time
Finally, it was time to create the first UI concepts, lo-fi UI wireframes and Unity asset research. The main interaction was dragging and dropping items in VR, and we wanted to make this experience the best we could. The tutorial and intuitive interactions with interfaces and game objects were crucial. First, we took care of clear text communication. Based on many tests with users, we chose the best options. Once that was ready, we moved on to the hi-fi wireframes.
Shiny experience. Animations, sounds, haptic feedback
Over the course of the whole project, we tested various gamespace types with many people, to be sure that we exhausted the VR space's possibilities and made it comfortable and fun for users. Next, we worked simultaneously on the environment design and UI design, as well as on animations, sounds and haptic feedback, which turned our VR game into an immersive experience.
But wait... Where's Marvin, the robot?
We knew what would happen with Marvin once we had established the idea of the game. Still, there were questions to be answered: should the robot compete with our player? Or help? Finally, we decided that Marvin would reflect, in reality, what the player does in VR - when he or she matches an MCE block with the special board, the robot does the same thing in real life, with exactly the same block. We believe that Sort-It VR shows the potential of how robots, and robotic arms specifically, could help humans in real life.
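The mirroring described above is essentially an event bridge: the VR game emits a "block matched" event, and a robot-side worker replays each move in real life. Here is a hypothetical Python sketch of that shape, using an in-process queue as a stand-in for the real channel; all names are illustrative, and in the actual setup the moves were executed on the UR-3 arm via NoMagic's control software.

```python
import queue

# Stand-in for the real game-to-robot channel (an assumption; the actual
# transport between the Unity game and the robot controller is not public).
events: "queue.Queue[dict]" = queue.Queue()

def on_block_matched(block_id: str, slot: str) -> None:
    """Called by the VR game when the player places a block on the board."""
    events.put({"block": block_id, "slot": slot})

def robot_worker() -> list:
    """Drain pending events and 'perform' each move with the arm.
    Here we just record the moves instead of commanding real hardware."""
    performed = []
    while not events.empty():
        move = events.get()
        performed.append(f"place {move['block']} in {move['slot']}")
    return performed

on_block_matched("mce-logo-1", "top-left")
print(robot_worker())  # ['place mce-logo-1 in top-left']
```

Decoupling the game from the robot through a queue like this also means the VR side never blocks while the (much slower) physical arm catches up.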
Creating immersive experiences takes time, but it is worth it. The moment you see how players are getting involved and fascinated with your game and want to compete with each other is priceless.
If you have any questions about our game or VR in general, get in touch!
Paulina Ficner
Junior VR/AR Designer
More details about the project (articles written by the team members):
Source of articles: www.polidea.com
Their website no longer works because Polidea was acquired by Snowflake.
Articles captured by https://archive.org/web/