// A project by Jay-Cee at PLANETART //

Bachelor Thesis - June 2021

Name: Jay-Cee Samar
Student number: 448837
Studying Creative Media & Game Technologies (CMGT) at Hogeschool Saxion University of Applied Sciences

Company Supervisor
Name: Kees De Groot

Graduation Coach
Name: Kasper Kamperman
Institution: Hogeschool Saxion University of Applied Sciences

The following chapter is about the specifications of the product and the hardware and software used.

Problem definition

Problem Analysis

Looking at the current goals for the project, the given problem is as follows: more people need to visit WARP Technopolis and GOGBOT to experience and get inspired by virtual art. A successful virtual reality exhibition would be a step towards solving this problem. The problem behind the problem may be that WARP Technopolis has rooms meant to host art installations that, apart from festivals, remain largely unused. The given assignment could give WARP Technopolis more content within the building, bringing in more visitors.

If successful, the product could help the client bring in more visitors. PLANETART is an artist collective, so connected artists often visit the building. Local artists can try the product, and if it delivers sufficiently, it could inspire them about virtual artworks.


The main question

The main question of this assignment is as follows:

"How to develop a VR experience to liven up a room at WARP Technopolis in the theme of the GOGBOT festival?"

Sub-questions are meant to help answer the main question:

  • "How can the GOGBOT theme be translated to a VR experience?"

  • "How to create an appealing style within Quill?"

  • "How to connect a Quill created VR experience to a physical room?"

Planned features of the product are:

– Visually appealing virtual space
– Sound effects
– Animated assets
– Room-scale VR
– Untethered VR experience
– Simple and foolproof experience
– Easy exhibition set up in a room

Potential additions to this project are:


– Set up manual
– Moodboard
– Concept Art
– Marketing campaign (Social media posts, branding and poster)
– Showcase website
– Product presentation

The client's problem could originate from the empty spaces and the lack of incentive for people to visit WARP Technopolis. Apart from its decorative value, an exhibition in the building could strengthen the focus on creative technology.



This blog is written by Jay-Cee, a student at Hogeschool Saxion University of Applied Sciences studying Creative Media & Game Technologies (CMGT) and an intern at PLANETART for this project.
Jay-Cee is an artist who specialises in animation and motion graphics.
Virtual reality brings a unique way of experiencing and creating digital art to the table. On this website, said topic will be explored for PLANETART to inspire artists about the possibilities of creating virtual art and to summarise the design of a Virtual Reality product and any relevant information.


The project

The project consists of research into creating and adding animated art to a physical space using Virtual Reality.

The final product is expected to be a functional, visually interesting exhibition, experienced by physically walking around in a space while looking around and discovering animated art pieces within virtual reality.

The Virtual Reality exhibition will be used to liven up a physical room at WARP Technopolis to attract visitors to experience virtual art. The product can also be shown off at festivals such as GOGBOT to attract more visitors.

The company

PLANETART (founded in 1995) is an artist collective that organises projects, events and festivals in the field of art and technology, mass media, popular culture, DIY, music, experiment and activism.

Since 2004, PLANETART has organised the annual GOGBOT festival in the centre of Enschede, and since 2014 TEC Art in Rotterdam. PLANETART is a co-initiator of the development of Enschede's station square into an urban playground, from WARP Technopolis to SPACEBAR.


The project

The client is looking for an art installation that can attract more visitors to WARP Technopolis and festivals such as GOGBOT. Said installation will also be used to inspire artists about the possibilities of virtual art.

Products and services

This art installation will be created in the form of a virtual exhibition.
The exhibition should be experienced by walking around a physical space and should include visually appealing animated visuals. Additionally, the product should be experienced using a VR headset, preferably standalone or otherwise connected to a PC.
Next to the exhibition itself, this project will include concept art and animated 3D models. Sound design will be briefly researched for this project as well.
To attract more visitors, a potential promotional campaign for this experience should be looked into.

Product goal

This product aims to attract visitors to WARP Technopolis and events organised by PLANETART, such as GOGBOT festival. Next to that, it should inspire artists about virtual art and the possibilities of creating virtual art.

Project boundaries

The realisation of the project is limited by the time constraints given by Saxion, so the creative project can only go through a limited number of iterations and test phases. Furthermore, due to the ongoing pandemic, the safety measures around an exhibition where visitors share a single headset need to be looked into.
Saxion's XR-Lab provides the necessary tools to develop the project, such as powerful PCs and VR sets. In addition, PLANETART provides space for project development and the eventual showcase, plus connections to artists with exhibition experience for consulting.
Next to this, virtual reality artists and developers may need to be contacted for professional consulting and feedback.




PLANETART is one of the stakeholders in this assignment. The finished product will be displayed and showcased at WARP Technopolis and GOGBOT. They expect a creative and inspiring product that fits with the image they portray as a company and the theme of the GOGBOT festival.

Jay-Cee Samar

Jay-Cee, the writer and executor of this project, also has personal goals for this project. With this project, they want to grow as an artist and improve their skill in animation and VR painting within the Quill VR software. In addition, with this project, they want to profile themselves as a freelance Virtual reality artist and animator by creating and animating all assets for this project using Virtual Reality and Quill VR.


Saxion, the institution the student is graduating at, requires documentation of research and development of the product and gives a limited timeframe to work with. The project also needs to fulfil the twelve design competencies of Saxion to be sufficient. Saxion can also provide students with state-of-the-art virtual and augmented reality devices, computers and such through their XR-Lab.

Approach & Timeline

Double diamond process model

Double Diamond is a process model created by Design Council, a British organisation, in 2005. The model provides a graphic representation of a design process. [...]
The model presents four main stages across two adjacent diamonds.
Each of the four stages is characterised by either convergent or divergent thinking. These stages are:

  • Discover – identify, research and understand the initial problem.

  • Define – limit and define an apparent problem to be solved.

  • Develop – focus on and develop a solution.

  • Deliver – test and evaluate, ready the concept for production and launch. [2]


Double diamond process model [2]


Since this will be a one-person project under a time limit, the end product must be limited in scope. The scope of this project has been set to a single-user, one-room experience. The main focus is going to be on the visual side of the experience. One VR setup will be used for the entirety of this project to avoid differences in user experience that can occur across multiple pieces of hardware.
Due to the measures the pandemic brings with it, this product can only be tested by a limited number of people to ensure safety, which means that most testing must be done with employees. The pandemic also means the full potential of this product can't be realised yet: as of now, it's advised not to bring visitors into a building. This is why the experience should be simple to set up for later use, when covid measures soften.

Planning & Approach

The planning for this project is based on the process of creating an animation and is divided into six parts, using different methods from the creative industry. To ensure the quality of research, methods from the CMD-method pack are applied. Since this project is an interactive experience, it requires testing; because of this, a testing and iterating phase has been added to the planning. Furthermore, the double diamond process model has been implemented within this planning. Finally, the planning has been visualised as a timeline.
Each phase of the double diamond, discover, define, develop and deliver, has a place within this timeline.



A VR headset is needed to provide virtual reality to the user. A virtual reality headset provides a stereoscopic display, stereo sound, motion tracking sensors and motion controllers.
For this current project, the headset needs to comply with particular conditions:

  • Easy and safe to use

  • Allows for the user to walk around in a physical space using the headset (Roomscale VR)

  • Load custom VR experiences

Numerous headsets were available. Saxion's XR-Lab provides several headsets and other hardware to lend and experiment with. In addition to that, I purchased the Oculus Quest 2 when it came out, which makes this headset convenient since it was available at all times.

Oculus Quest 2

The Oculus Quest 2 is a wireless virtual reality headset that can load room-scale VR games without a cable, thanks to an integrated processor and battery. In addition, the Oculus tracking technology gives the user six degrees of freedom, meaning the user can walk around and rotate in real space while the headset tracks the movement and adjusts the virtual world to the user's position and orientation. This is crucial for a virtual exhibition.
The wireless nature of the device makes it possible to create an easy-to-use and safe experience, since there is no wire for the user to potentially get tangled in or trip over, breaking the immersion.
A potential downside to this headset is the battery life of 2-3 hours. This means the headset needs to be charged between uses, or the user would need to carry a battery pack if the battery happens to run low.
The resolution of the screen within the headset is 1832×1920 per eye [3]. This high resolution increases image quality and immersion by eliminating the 'screen-door effect', a mesh-like appearance that occurs due to gaps between pixels and can cause eyestrain [4].
The Oculus Quest 2 is powered by a Qualcomm® Snapdragon™ XR2 chip that can load and play VR games on the device itself. This means the experience could be loaded onto the headset without connecting it to a desktop. Because this is a mobile chip, the experience needs to be optimised, since these chips aren't as powerful as a high-end desktop. It is also possible to play desktop VR games and experiences on the Quest 2 by plugging in a Link cable or even wirelessly over a local network [5].


Oculus Quest 2 [3]

Guardian System

The Oculus Quest 2 also provides a Guardian System. 
The Guardian boundary is designed to display in-application wall and floor markers when users get near the play-area borders they defined. When the user gets too close to the edge of a boundary, a translucent mesh grid is displayed as a layer that is superimposed over the game or experience [6].

This guardian boundary should help the user walk around the physical space safely, preventing them from walking into a wall.
The boundary can be set up by pointing the controller towards the floor and drawing the shape of the border on the floor. The headset will remember the boundary shape when it recognises the room.

Setting up a boundary system is demonstrated in a published YouTube video [20].


Setting up Guardian System [20]



Quill is the VR illustration and animation tool built to empower artists and creators, whether to create final art or as a production tool for concept creation aid.

Quill allows users to paint and animate in virtual reality on an infinitely scalable canvas - with rich colors and intuitive tools. Quill is designed to be expressive, precise and to let the artist's "hand" come through clearly - whether that's a watercolor style, pencil style, oil painting style or other.

Uploading and distributing your Quill creations to a VR audience has been made easier with the media management tool, Oculus Media Studio [7].

Quill is made to be intuitive to artists. It's expressive, efficient and comfortable to use over a long period of time.

Quill supports various animation approaches such as frame-by-frame, keyframe, anim brush and puppeteering techniques. This set of powerful tools and workflows allows artists to control the look and feel of the animation without requiring any traditional CG technical knowledge, such as rigging or curve manipulation.

WAV and MP3 files and Ambisonic sounds can be imported. These sounds can be used as stereo or spatial audio sources. Quill supports sphere, cone and frustum-based spatial audio emitters [8].

Personal goals

Mastering this software is something I strive for. I have previously used Quill, but I want to use this software within my professional arsenal as a freelancer, and I want to be known for my VR art. Creating all art and animation for this project in Quill allows me to gain experience in this field.

For these reasons, Quill has been chosen for creating and developing this experience.
Quill experiences can be loaded onto a Quest 2 using Quill Theater. An Oculus Media Studio profile is needed to publish a Quill project to Quill Theater.

Oculus Media Studio

Oculus Media Studio is a new media management tool for immersive creators to upload, publish, and analyze VR-first content. From this single hub, creators can easily distribute their stories directly to VR, where VR content is best enjoyed, while also enabling out-of-headset content discovery.

Visual fidelity is given a high priority in this pipeline. Accepted formats are high quality 4k+ 360 videos, 180 videos, and Quill creations that look good in VR [9].

Quill Theater

Regularly updated, Quill Theater lets you visit high-quality immersive animations from the comfort of Oculus TV, bringing you the best in immersive entertainment. These immersive animations are fully 6DOF, some including multiple viewpoints to let you move positionally within the art [10].

Vibe Aliens

An animation called Vibe Aliens has been created and optimised to work with Quill Theater to test if Quill Theater was applicable for hosting the VR experience.


Quill Theater requires spawn areas.

Spawn areas determine the user's starting point, measured from the floor.

A YouTube tutorial has been followed to use spawn areas properly [21].

Because of this, the animation has been made a static experience, where the user can watch dancing aliens on a floating rock in space.
The number of polygons had to be reduced to save resources on the Oculus Quest, which can be done with the optimisation feature within Quill.
After publishing the experience to Quill Theater, it's shareable via an Oculus media link and is set public after approval, which takes a few days.

Vibe Aliens:


Quill Theater [10]

Vibe Aliens on Oculus Media Studio


The following chapter is about coming up with content for the product, the message the art should convey and determining the look and feel of the product.


The goal of this brainstorm is to come up with ideas for the content within the product.

PLANETART would like to display the VR experience at WARP Technopolis and the GOGBOT 2021 festival.
Both are heavily themed after neo-future, retro-future and technology, which inspired me to create a similarly themed experience to strengthen consistency with the building and the festival.
For GOGBOT 2021, a brainstorm has been held about the theme, 'INFOCALYPSE NOW – Recalibrate Reality'. Kees de Groot has sent an email containing information about this theme and brainstorm, including several keywords and descriptions:


In 2021, GOGBOT will research the idea of the truth/post-truth from within our digitalised society and look at the use of technology in document manipulation, propaganda, populism and conspiracy theories. Within this central topic, GOGBOT looks into issues such as what these technologies can mean for our privacy, the rise of alternative media, debates about gender and a new definition of reality.


Under the theme INFOCALYPSE NOW, Recalibrate reality, the program of GOGBOT 2021 dives into the transformation of our information ecosystem in a time of synthetic media, deep fakes, populism, physical distancing, trust crisis, the rise of conspiracy theories etc.

Wondering about the impact of the democratisation of AI technologies on our society and its communication networks, GOGBOT invites artists, creators, and thinkers to present their latest work to inspire us and show us how to recalibrate our senses...


Adjacent words:

  • Reality

  • Human

  • Unreal

  • Future

  • Loading

  • Synthetic

  • Paradise

  • Deepfake

  • SFX

  • Collision

  • GAN

  • Alternative Facts

Based on this document, I started brainstorming about the contents of the VR experience. Abstract concepts needed to be implemented and twisted to make the product more appealing.


"How can the GOGBOT theme be translated to a VR experience?"

The CMD method 'Storytelling' was applied to realise this. With storytelling, you can make abstract concepts concrete and strengthen the empathy for the user within your team [11].

A big part of the inspiration came from the neo-future theming already present at WARP.
The fundamental nature of this experience already complies with much of the theming of the GOGBOT festival itself.
The experience will be purely digital and visual, using a VR headset to see and experience a new reality within an otherwise empty physical room. This creates a collision between the digital world and the physical one: a visitor in the same room who is not wearing the goggles gets an entirely different experience than the visitor who is.

More ideas for the cybercity:

  • Freeways are computer circuits

  • Skyscrapers are servers

  • Vending machines are arcade machines

  • Social bubbles are apartments

Screenshot 2021-06-04 at 19.01.55.png

Inspiration wall


The next goal is to establish an art direction and look and feel for the product and the promotional campaign.

One of several professional methods to save and organise creative ideas during a project is using an inspiration wall. An inspiration wall is a way to save and organize creative ideas during a project, or even permanently, in order to have access to them very quickly and let ideas ‘simmer’ for a while [12].

A commonly used service to collect inspiration is Pinterest. This service allows you to create a pinboard to collect images and algorithmically suggests similar ones.


A couple of ideas were generated using this method.

1. Character design direction
Colourful characters created from simple shapes to fill the world. The simplicity of the designs would translate well inside a busy city.

2. Landscape direction
To illustrate the digital look and feel of the neo-city, the landscape this city is located in could be shown off in the style of retro wave artworks where the mountains are visualised using lines and grids.

3. Colours
This city is going to be a neo-future-style city. A few vibrant colour palettes that fit this aesthetic were found in art from 'Kurzgesagt'.

4. Marketing
Retrowave design fits quite well with the neo-city, WARP Technopolis, GOGBOT and the 'Infocalypse' theme. Applying this design to marketing, such as poster design or social media posts, makes sense and adds consistency.


Style test


A small scene was created in Quill to test whether the look and feel from the inspiration wall would translate to VR.
A screenshot from the Pinterest board was taken and imported into Quill. This way, the colours could be picked directly from the image and applied within the scene.

The results left me pretty positive, and I believe this style would work well in the VR experience. This simplistic style would allow me to create and animate entire environments. The results were also shared with stakeholder Kees de Groot, who was also delighted with them.

World style tests in Quill

Character style tests in Quill


Mood board


"How to create an appealing style within Quill?"


The next goal is to further establish an art direction and look and feel for the product, in a more uniform and pitchable way.

Mood boards are an essential part of many aesthetic design projects, in particular for communication across stakeholders. This way, the client can be involved early in the process.
To create a mood board, you make a physical or digital collage of images, typography and a colour scheme to describe 'fluffy' things such as the 'mood', 'feel' or other core design concepts you want your product to have. The images are often gathered from magazines or visual discovery tools like Pinterest [13].

A mood board was constructed using images from the inspiration wall within the software Adobe Photoshop. This mood board has been pitched to stakeholder Kees de Groot.


Mood board iterations


Final mood board created using images from the inspiration wall.




The product is mainly visual. The goal is to make the ideas that came out of the brainstorm concrete and visual to communicate to stakeholders.

Sketching is a standard method that is used to explore and communicate ideas [14].

Ideas from the brainstorm were put on paper. Then, more ideas were generated by sketching on top of these initial ideas. Eventually, this led to comprehensive concept art covering the different items that the product is supposed to include.
These were sent to client Kees de Groot, who was happy with the results. Communicating these ideas through sketches and concept art helped the client get involved early on within the process.


A concept art piece was created based on colours inspired by the mood board and inspiration wall to make the experience cohesive in colour.

The sketches and the concept art are created using Clip Studio Paint, a digital art software used for illustration.

Concept art

Asset list


For managing this project, there needs to be a way to prioritise certain elements and assets. Otherwise, the developer could sink too much time into parts of the project that are less important than others.

MoSCoW is a prioritisation method used to decide which requirements to complete first, which must come later and which to exclude.
Unlike a numbering system for setting priorities, the words mean something and make it easier to discuss what's important.

MoSCoW stands for must, should, could and would:

  • M - Must have this requirement to meet the business needs

  • S - Should have this requirement if possible, but project success does not rely on it

  • C - Could have this requirement if it does not affect anything else on the project

  • W - Would like to have this requirement later, but delivery won't be this time [15]

The following asset list has been created using this method. The primary purpose of this list is to motivate the developer to work on the project in an organised manner, making sure the prioritised items get worked on first.

The most important things are at the top. They have been chosen based upon what I thought makes a basic city, such as a ground, sky, streets, sound and buildings. Next to those, I chose the concepts that appeal the most to me and are directly influenced by the main theme of GOGBOT written in the e-mail from Kees. 
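As an illustration of how the method orders work, the prioritisation can be sketched in a few lines of Python. The asset names below are hypothetical placeholders; the project's actual asset list is shown in the image.

```python
# MoSCoW prioritisation sketch; asset names are placeholders.
MOSCOW_ORDER = {"must": 0, "should": 1, "could": 2, "would": 3}

assets = [
    ("Eye in the sky", "should"),
    ("Gradient sky", "must"),
    ("Marketing poster", "would"),
    ("Street layout", "must"),
]

def prioritise(items):
    """Sort assets by MoSCoW category; sorted() is stable, so the
    original order within each category is preserved."""
    return sorted(items, key=lambda item: MOSCOW_ORDER[item[1]])

for name, priority in prioritise(assets):
    print(f"{priority:>6}: {name}")
```

Working down the sorted list guarantees that all 'must' items are finished before any time is spent on 'could' or 'would' items.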

Asset list

The following chapter is about the creation of the product. It goes into development using Quill, sound design, optimisation and prototyping.


Creating assets in Quill

Quill, a VR illustration and animation tool, is used for creating the assets for the VR experience. It was chosen because I wanted to master this tool and gain more experience with it. Quill also includes a feature to export a project to Quill Theater, software used to experience Quill creations with six degrees of freedom.


Several assets being created in Quill.

The coloured concept art was imported into Quill to make sure the art stayed cohesive across the experience. Then, the eyedropper tool within Quill was used to pick colours directly from the concept art. This way, the colours stayed consistent and colourful across the different assets.

Quill has animation features that allow the artist to animate their assets and work intuitively. Using keyframes, the artist can make clean interpolated movements. This technique was helpful for animating robots, giving the robot a robotic motion. Quill allows for a workflow similar to 2D traditional animation, where the artist can paint frame-by-frame with onion skin. The frame-by-frame technique was used to add smear frames to certain animated assets, to convey sudden movements such as Tunnelbear entering its tunnel or Surfshark doing a kickflip.

A smear frame 

Surfshark animated frame by frame

Robotic movement created with keyframe animations

Next to those techniques, Quill allows the artist to puppeteer using the grab tool, enabling the artist to grab parts of the model and move them while the timeline plays. These movements are then recorded instantly, which streamlines the animation process significantly. 

For the creation of any asset, optimisation needs to be kept in mind; if not, this will result in more work later in production.
Every Quill scene can only have a certain number of draw calls and polygons. Draw calls are taken up by the use of different VR brushes and layers. To optimise an asset and reduce the number of draw calls it takes up, the artist needs to create it with as few brushes and layers as possible.
The building asset below has been created using only two different brushes, the cylinder brush and the cube brush.
All of the animated assets have been baked and combined into one layer.
Quill also includes a tool to reduce the number of polygons. This will make certain curved and bent shapes look polygonal and minimise detail in colour gradients, so the artist needs to be selective about where and how much this optimisation tool is used. The number of draw calls and polygons can be seen at all times in the performance tab.

A YouTube tutorial that goes deeper into optimisation techniques has been followed [22].

Optimisation tool to reduce polygons

Cube and cylinder brush only takes up 2 draw calls

The entire building after optimising also just takes up 2 draw calls

This slideshow features objects, characters and buildings that have been created and animated using Quill. They are based on their respective sketches, but further design decisions were made during the creation process.

Every asset has been created in a separate Quill file; these are later imported into one master file.




All the assets created during the production phase need to be combined into one Quill project so the user can see them all when walking around. In addition, the size of the digital world has to fit the exhibition space. The functionality of this project needs to be tested and adjusted to fit the room accordingly.

One method to test functionality would be prototyping. Building a prototype helps the developer test a specific aspect of the product [16].

The following approach has been used to build the prototype.
Firstly, a sketch of the overall world was made. The room housing the experience had been measured beforehand and has a square footprint, which the sketch took into account. This sketch was used to create a rough map of the city, which is placed on the floor of the experience, allowing the built assets to be placed accordingly.
Next to that, several environmental assets were added, such as clouds, mountains, a gradient sky and the eye in the sky from the concept art. Some simplified buildings were also created to fill space.

The sketch, the map and the assets being placed on the map within Quill VR.


Addition of mountains, buildings, clouds, eye in the sky and a gradient skybox


Sound Design

Sound design was added to give the product more depth. Recordings of city soundscapes were downloaded from YouTube and manipulated using different filters to fit the cyber theme. GarageBand was used to create this sound design. So far, three tracks were created based on three locations: a main city ambience, windy sounds for the mountains and a soundscape for the catfish market.
These sounds were made loopable using the sound-looping technique of mixing the tail end of a track into the head end of the track. This way, the user should not notice where the sound starts and ends. The next step would be implementing these sounds into the experience.
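The crossfade trick described above can be sketched on raw sample values. This is a minimal illustration of the technique, not the actual GarageBand workflow used for the project:

```python
def make_loopable(samples, fade_len):
    """Mix the tail of a track into its head with a linear crossfade,
    then trim the tail, so the end flows seamlessly back to the start."""
    assert 0 < fade_len <= len(samples) // 2, "fade must fit in the track"
    head, tail = samples[:fade_len], samples[-fade_len:]
    faded = [
        h * (i / fade_len) + t * (1 - i / fade_len)
        for i, (h, t) in enumerate(zip(head, tail))
    ]
    # Crossfaded head + untouched middle; the dropped tail is now mixed
    # into the head, so the last output sample connects back to the first.
    return faded + samples[fade_len:-fade_len]
```

Because the start of the output begins exactly where the trimmed end leaves off, playing the result on repeat produces no audible seam.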


Mixing ambient audio using GarageBand

To implement the sound design, the sounds need to be imported into Quill. Within Quill, the sound can be adjusted and placed in 3D space for a spatial audio effect.
The audio for specific landmarks of SYNTHCITY has been placed accordingly.
Spatial audio allows the user to associate different sounds with different spaces within the virtual world.
An internet tutorial has been followed to achieve this effect [23].
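The core idea behind a spatial emitter can be sketched as distance-based attenuation: the closer the listener is to a landmark's emitter, the louder it sounds. Quill's actual sphere, cone and frustum emitters are more sophisticated; the function below is only a generic illustration, and the `min_dist`/`max_dist` values are assumptions.

```python
import math

def spatial_gain(listener, source, min_dist=1.0, max_dist=15.0):
    """Simple inverse-distance attenuation: full volume within
    min_dist metres of the emitter, silent beyond max_dist."""
    d = math.dist(listener, source)  # Euclidean distance in metres
    if d <= min_dist:
        return 1.0
    if d >= max_dist:
        return 0.0
    return min_dist / d
```

For example, a listener standing two metres from the catfish-market emitter would hear it at half volume, while the mountain wind across the room fades out entirely.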


Importing audio files into Quill


The following chapter is about testing and mapping the prototype. It also covers the creation of promotional content and information cards. Finally, some user tests are conducted.


Mapping the experience to a room

"How to connect a Quill created VR experience to a physical room?"

360° room

One of the main goals of the experience is that it should enrich a physical space by adding digital art. The space provided by PLANETART within WARP Technopolis is the 360° expo room. A 7x7m room where the four walls of the room can be projected by eight projectors located on the ceiling, creating a full 360° image.
This space has been used to test the VR experience. Several other students were working on projects at the time. One bottleneck for testing was the desk in the middle of the room, which means that not every part of the room was accessible for the user.
Since the VR headset uses an inside-out tracking system, setting it up was relatively simple: the user draws the play boundary within the guardian system. For optimal tracking, there needed to be enough light in the room. This could be achieved by turning on all eight projectors at the same time or by turning on the lights. Since the lights were temporarily disabled, a light panel was propped against the wall to light up the room, which proved sufficient for stable tracking.


The 360° expo room at WARP Technopolis

The goal of this test was to see whether the experience fits the room properly. If the experience is too big, the physical walls prevent the user from taking a good look at particular highlights. On the other hand, if the experience is too small, the user is left with empty space to walk around in, and the content becomes too condensed.

To test this out on standalone hardware, the experience needed to be uploaded to Oculus Media Studio.

As previously demonstrated with Vibe Aliens, publishing to Oculus Media Studio can take a few days, since submissions need to be approved by the curators of Oculus TV. Waiting a few days between tests would be highly inconvenient, so the Virtual Animation Discord group was consulted. Nick Ladd, a VR artist and animator, pointed out that the experience can be loaded from a draft folder on any headset logged into the same profile that uploaded it. This allows for direct testing after uploading iterations.

The size of the entire experience can easily be adjusted by scaling and repositioning the spawn area, which was tweaked until the edges of the city matched the edges of the room.
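The fitting step amounts to a single scale factor. A sketch of the calculation (the city dimensions below are hypothetical; in practice the scaling was done by eye inside Quill):

```python
def fit_scale(city_w: float, city_d: float,
              room_w: float = 7.0, room_d: float = 7.0) -> float:
    """Uniform scale factor that makes the city's edges meet the room's
    walls, taking the tighter axis so nothing ends up inside a wall."""
    return min(room_w / city_w, room_d / city_d)

# Hypothetical example: a 10 m x 9 m city scaled into the 7 x 7 m expo room.
print(fit_scale(10.0, 9.0))  # 0.7
```

Taking the minimum of the two ratios guarantees the larger dimension is the one that just touches the walls.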


During these tests, it was discovered that the spawn area is not locked to the guardian system: every time the user takes off the headset, the user's position resets towards the spawn area, offsetting the entire experience. To solve this, the proximity sensor on the headset was blocked off with tape. This way, the headset does not register when the user takes the headset off and therefore does not reset the user's position.
On top of that, an arrow was drawn on the floor using tape, which allows the spawn area to be calibrated against a fixed position in the physical room. Spawn areas are measured from the floor level set by the guardian system. The arrow helps set up the experience: stand on the line, facing in the direction of the arrow, before loading up the experience. After loading, the headset can be placed on the chair next to the arrow while staying powered on. A visitor then only has to put the headset on to experience SYNTHCITY.


A piece of tape covering the proximity sensor of the Oculus Quest 2




Information cards

Several information cards were created to bring out the storytelling aspect of the project. The experience is structured like an art exhibition, enabling the user to walk around and look at specific buildings and places. Just as art exhibitions often place signs with information next to the displayed artwork, these information cards describe particular highlights of SYNTHCITY.
These information cards have been designed using Adobe Illustrator and imported into Quill.


The created information cards


The cards implemented within SYNTHCITY


Promotional content

Name & branding

The name of a project, product or artwork is crucial in communication.
A name needed to be decided upon to communicate this project through social media and promotional content. During a brainstorm, several candidate names were made up. Finally, together with stakeholder Kees de Groot, the name 'SYNTHCITY' was chosen. According to Kees, this name fits the hype around synthetic media and covers digital VR animation nicely.

Different brainstormed names for the experience

Visual identity and poster in progress

With SYNTHCITY chosen as the name, a poster was created using Adobe Illustrator. Taking inspiration from posters collected on the inspiration wall, the visual identity of SYNTHCITY follows the line of retro digital futurism, similar to the movie Tron and other science-fiction movie posters from the '80s. This visual identity is used for the poster and will be the leading theme throughout the promotional campaign and the presentation of the product.


SYNTHCITY social media post

A framework in this visual style was created using Adobe After Effects to showcase the different assets made for this product. The footage can be swapped out within the framework, so a whole series of asset showcases can be produced. This way, the assets can be presented and sorted in a library separately from the VR experience.

Visual framework created in Adobe After Effects


User tests


The goal is to determine whether this version of the experience is usable, whether the storytelling comes through, whether the experience is comfortable, and to gather general opinions.

Several CMD methods were used to test this.

Usability testing is used to detect problems users have with your design and correct these before the product goes live [17].


The experience was set up, and the people working in the WARP Technopolis building were invited to try it. The participants were asked to find four highlights (the market, school, pyramid and bank) and were otherwise free to wander around the experience. They were asked to think aloud, to understand the reasons behind the users' behaviour [18].
After they finished, a short interview was conducted about their experience. Interviews help to understand users better by gathering their opinions and behaviours [19].

During testing, most participants found all four highlights, but not all information cards were read.
All of the participants reacted positively to the experience. They spent 8 minutes on average within SYNTHCITY, and new ideas were generated during the sessions. The guardian system also worked as intended and prevented the participants from walking into the walls of the room.

Several inconveniences occurred during testing. Initially, the headset was supposed to record the participant's view during the test; the footage, however, turned out corrupted and could not be viewed. Next to that, the desk in the middle of the room obstructed the pathway towards some of the highlights, and participants had to be warned when they almost collided with it. Another problem arose when a participant moved too close to the light pointed at the wall: the tracking would get lost, and the experience needed to be recalibrated.


SYNTHCITY being tested by Aaike Lutkemulller


Product video

Finally, a product teaser was created to promote SYNTHCITY.
It was filmed at WARP Technopolis and edited using Adobe After Effects. One of the goals of this teaser is to inform potential visitors that this world was created by me, using VR as a tool. Firstly, the video introduces me as an artist and the technology I use. Secondly, it introduces the company I am graduating at. Finally, it shows the product and how it is experienced, ending with a call to action.

Product video



Firstly, the sub-questions need to be answered before the main question can be answered.

"How can the GOGBOT theme be translated to a VR experience?"

Apply storytelling to make abstract concepts concrete. Collect ideas using an inspiration wall. Create sketches based upon the brainstorm and work out how these concepts connect.

"How to create an appealing style within Quill?"
Import the concept art and sketches into your Quill scene and colour pick to keep your art consistent. Keep optimisation in mind when creating assets.

"How to connect a Quill created VR experience to a physical room?"

Set up the guardian system accordingly. Scale the spawn area so the experience is the same size as the room. Cover the proximity sensor of the VR headset with tape so the user's position does not reset when the headset is taken off. Make sure the room is evenly lit for stable tracking. Pinpoint a location in the room that matches the spawn area in VR and use it to recalibrate the headset.

The main question: 
"How to develop a VR experience to liven up a room at WARP Technopolis in the theme of the GOGBOT festival?"

First, gain inspiration to come up with ideas. Connect the abstract ideas from the theme of the GOGBOT festival to concrete concepts using storytelling. Create sketches to visualise the product and communicate these with the client. Build the assets with optimisation in mind and try to keep the visuals simple.



Results and expectations
The final product is fully operational and visually appealing. In addition, the art has been implemented and animated, making the world of SYNTHCITY a colourful experience.
The experience is easily mapped to a room using the recalibration arrow and maintains stable tracking under the right lighting conditions.

New insights
Once mastered, Quill is a surprisingly intuitive tool that enables an artist to create multiple assets in a short amount of time. Keeping optimisation in mind from the beginning of the production process left more room to play with when combining the assets into one file.
Still, the creation process of these assets was time-consuming, to the point where some of the planning had to be pushed back. Developing this whole experience single-handedly made clear that a team working on such a project would vastly increase the amount of content within and around the experience.
For now, this project gave insight into what it takes to create such an experience, and it shows that such an experience is marketable and, for me as a freelancer, perhaps something I could sell as a product.

Because other students were working in the 360° room, it was not fully available for use: a desk took up space in the centre of the room, which limited testing. Due to the pandemic, most of the work on the project was done from home. The number of days I could visit PLANETART and connect with the people working at WARP Technopolis was limited, which personally lowered my motivation to work on the project.

Follow-up research 
Although touched upon, features of the experience that connect with the real world deserve more attention in follow-up research. For example, could the 360° projectors add value to the product? Are there ways for spectators to interact with the user? Would adding other stimuli, such as being able to touch the objects or feeling wind from fans, enhance the user experience?
These are all potential research questions I would recommend exploring further. The product itself could also be enriched in the future with more sound effects and with assets beyond the original asset list.



  • WARP Technopolis - Building in Enschede that houses freelancers and creative companies such as PLANETART. This building also includes exhibition spaces.

  • Room-scale VR - Roomscale VR allows users to move around freely through immersive experiences by moving around in real life.

  • 6DOF - Short for six degrees of freedom, allows the user to move within 6 degrees in virtual space. These degrees change in position as forward/backwards, up/down, left/right, and change in rotation along these axes.

  • Polygons - Straight-sided shapes a 3D model is built out of. The more polygons a 3D model has, the more data and detail it carries.

  • Retrowave - Aesthetic that draws inspiration from retro-tech and neon colours.

  • Adobe Photoshop - Software for illustration and image manipulation

  • Draw calls - The count of different brushes amongst different layers within a Quill project

  • Onion skin - A feature amongst animation software where the artist can view multiple frames of animation simultaneously.

  • Smear frames - Frames where the animated object is often stretched or duplicated to convey motion

  • Inside-out tracking - Instead of using external tracking beacons to determine the position and orientation of the VR headset and controllers, the tracking is done by several cameras within the headset itself.

  • Adobe Illustrator - Software for creating vector-based graphics.

  • Adobe After Effects - Video manipulation software, often used for VFX and motion graphics.