// A project by Jay-Cee at PLANETART //

Bachelor Thesis - June 2021

Name: Jay-Cee Samar
Student number: 448837
Studying Creative Media & Game Technologies (CMGT) at Hogeschool Saxion University of Applied Sciences

Company Supervisor
Name: Kees De Groot

Graduation Coach
Name: Kasper Kamperman
Institution: Hogeschool Saxion University of Applied Sciences



This blog is written by Jay-Cee, a student at Hogeschool Saxion University of Applied Sciences, studying Creative Media & Game Technologies (CMGT) and an intern at PLANETART for this project.
Jay-Cee is an artist who specialises in animation and motion graphics.
Virtual Reality brings a unique way of experiencing and creating digital art to the table. On this website, this topic is explored for PLANETART to inspire artists about the possibilities of creating virtual art, and to summarise the design of a Virtual Reality product along with any relevant information.


The project
The project consists of research into the process of adding animated art to a physical space using Virtual Reality.
The final product is expected to be a functional, visually interesting exhibition that is experienced by physically walking around a space, looking around and discovering animated art pieces within virtual reality.

The Virtual Reality exhibition is going to be used to liven up a physical space at WARP Technopolis to attract visitors to experience virtual art. The product can also be shown off at festivals such as GOGBOT to attract more visitors.

The company
PLANETART (founded in 1995) is an artists collective that organizes projects, events and festivals in the field of art and technology, mass media, popular culture, DIY, music, experiment and activism.

Since 2004, PLANETART has organised the annual GOGBOT festival in the centre of Enschede, and since 2014 TEC Art in Rotterdam. PLANETART is co-initiator of the development of Enschede’s station square into an urban playground, from WARP Technopolis to SPACEBAR.


The project
The client is looking for an artistic installation that has the potential to attract more visitors to WARP Technopolis and festivals such as GOGBOT. Said installation will also be used to inspire artists about the possibilities of virtual art.

Products and services

This artistic installation will be created in the form of a virtual exhibition.
Said exhibition should be experienced by walking around a physical space and should include visually appealing animated visuals. The product should be experienced by using a VR headset, preferably standalone or otherwise connected to a PC.
Next to the exhibition itself, this project will include concept art and animated 3D models. Sound design will be briefly researched for this project as well.
To attract more visitors, a potential promotional campaign for this experience should be looked into.

Product goal

The goal of this product is to attract visitors to WARP Technopolis and events organised by PLANETART such as GOGBOT festival. Next to that it should inspire artists about virtual art and the possibilities of creating virtual art.

Project boundaries

The realisation of the project is limited by the time constraints given by Saxion, as a creative project can only have a limited number of iterations and test phases. Due to the ongoing pandemic, the safety measures for a virtual experience in which visitors share a headset during an exhibition also need to be examined.
Saxion’s XR-Lab provides the necessary tools to develop the project, such as powerful PCs and VR sets. PLANETART provides space for project development and the eventual showcase, as well as connections to artists who have experience with exhibitions for consulting.
In addition, virtual reality artists and developers may need to be contacted for professional consulting and feedback.



PLANETART is one of the stakeholders in this assignment. The finished product will be displayed and showcased at WARP Technopolis and GOGBOT. They expect a creative and inspiring product that fits with the image they portray as a company and the theme of the GOGBOT festival.

Jay-Cee Samar

Jay-Cee, the writer and executor of this project, also has personal goals during this project. With this project they want to grow as an artist and improve their skills in animation and VR painting within the Quill VR software. They want to profile themselves as a freelance Virtual Reality artist and animator by creating and animating all assets for this project using Virtual Reality and Quill VR.

Saxion, the institution the graduate student studies at, requires documentation of the research and development of the product and gives a limited timeframe to work with. The project also needs to fulfil the twelve design competencies of Saxion to be sufficient. Through its XR-Lab, Saxion is also able to provide students with state-of-the-art virtual and augmented reality devices, computers and more.

Approach & Timeline

Double diamond process model
Double Diamond is a process model created in 2005 by Design Council, a British organisation. The model provides a graphic representation of a design process. [...]
The model presents four main stages across two adjacent diamonds.
Each of the four stages is characterised by either convergent or divergent thinking. These stages are:

  • Discover – identify, research and understand the initial problem.

  • Define – limit and define a clear problem to be solved.

  • Develop – focus on and develop a solution.

  • Deliver – test and evaluate, ready the concept for production and launch.



Since this will be a one-man project under a time limit, the scope of the end product has to be limited: the project is restricted to a single-user, one-room experience. The main focus of this project is going to be on the visual side of the experience. One VR setup will be used for the entirety of this project, to avoid the differences in experience that can occur across multiple sets of hardware.
Due to the measures the pandemic brings with it, this product can only be tested by a limited number of people to assure safety. This means that most of the testing has to be done with employees. The current pandemic also means the full potential of this product can’t be realised: as of now, it is advised not to bring visitors into the building. It therefore needs to be taken into account that this experience should be simple to set up for later use, once the measures soften.

Planning & Approach

The planning for this project is based on a planning similar to the process of creating an animation. Different methods used in the creative industry are divided amongst six parts. To assure the quality of the research, methods from the CMD method pack are applied in addition. Since this project is an interactive experience, it requires testing; because of this, a testing and iterating phase has been added to the planning. The double diamond process model has been implemented within this planning, which has been visualised as a timeline.
Each phase of the double diamond, discover, define, develop and deliver, has a place within this timeline.


Problem definition

Problem Analysis

Looking at the current goals for the project, the problem is as follows: more people need to visit WARP Technopolis and GOGBOT to experience and get inspired by virtual art. If successful, a virtual reality exhibition would be a step in the right direction towards solving this problem. The underlying problem may be that WARP Technopolis has unused rooms which are meant to host art installations; apart from festivals, these exhibition rooms are mostly unused. The given assignment could give WARP Technopolis more content within the building, which has the potential to bring in more visitors.

If successful, said product could help the client bring in more visitors. Since PLANETART is an artist collective, connected artists often visit the building. Local artists can come into contact with the product and, if it delivers sufficiently, this could have an inspiring influence on those artists regarding virtual artworks.



Planned features of the product are:

– Visually appealing virtual space
– Sound effects
– Animated assets
– Room scale VR
– Untethered VR experience
– Simple and foolproof experience
– Easy exhibition set up in a room

Potential additions to this project are:


– Set up manual
– Moodboard
– Concept Art
– Marketing campaign (Videos, social media posts and posters)
– Showcase website
– Product presentation

The client’s problem could originate from the empty spaces and the lack of incentive for people to visit WARP Technopolis. Apart from its decorative value, an exhibition that takes place in the building could strengthen the focus on creative technology and the interest of potential visitors.

360 Scan.png


To provide virtual reality to the user, a virtual reality headset is needed. A virtual reality headset provides a stereoscopic display, stereo sound, motion tracking sensors and motion controllers.
For this current project the headset needs to comply with certain conditions:

  • Easy and safe to use

  • Allows for the user to walk around in a physical space using the headset (Roomscale VR)

  • Load custom VR experiences

Numerous headsets were available to use. Saxion lends out several headsets and other hardware to experiment with from their XR-Lab. In addition to that, I purchased the Oculus Quest 2 when it came out, which makes this headset convenient since it is available at all times.

Oculus Quest 2

The Oculus Quest 2 is a wireless headset that is able to load room-scale VR games without requiring a cable, thanks to an integrated processor and battery. The Oculus tracking technology gives the user six degrees of freedom, meaning the user can walk around and rotate in real space while the headset tracks the movement and adjusts the virtual world to the position and orientation of the user. This is crucial for a virtual exhibition.
The wireless nature of the device makes it possible to create an easy-to-use and safe experience for the user, since there is no wire for the user to potentially get tangled in or trip over, breaking the immersion.
A potential downside to this headset is the battery life of 2-3 hours. This means the headset needs to be charged between uses, or the user needs to carry a battery pack if the battery happens to run low.
The resolution of the screen within the headset is 1832×1920 per eye. This high resolution increases image quality and immersion because it reduces the ‘screen-door’ effect, a mesh-like appearance caused by the gaps between pixels that can cause eyestrain.

The Oculus Quest 2 is powered by a Qualcomm® Snapdragon™ XR2 chip that can load and play VR games on the device itself. This means the experience can run on the headset without connecting it to a desktop. Because this is a mobile chip, the experience needs to be optimised, since mobile chips aren’t as powerful as a high-end desktop. It is also possible to play desktop VR games and experiences on the Quest 2 by plugging in a Link cable, or even wirelessly over a local network.


Oculus Quest 2

Guardian System

The Oculus Quest 2 also provides a Guardian System. 

The Guardian boundary is designed to display in-application wall and floor markers when users get near the play-area borders they defined. When the user gets too close to the edge of a boundary, a translucent mesh grid is displayed as a layer superimposed over the game or experience.

This guardian boundary should help the user walk around the physical space safely, preventing them from walking into a wall.
The boundary can be set up by pointing the controller towards the floor and drawing the shape of the border on the floor. The headset will remember the boundary shape when it recognises the room.
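The behaviour described above can be sketched as a simple proximity check on the floor plane: given the boundary polygon the user drew, the warning grid appears once the headset comes within some distance of an edge. This is an illustrative sketch, not Oculus’ actual implementation; the 0.4 m threshold and the square boundary are made-up values.

```python
import math

def dist_point_to_segment(p, a, b):
    """Shortest distance from point p to line segment a-b (2D floor plane)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def guardian_grid_visible(position, boundary, threshold=0.4):
    """True when the user is within `threshold` metres of any boundary edge."""
    edges = zip(boundary, boundary[1:] + boundary[:1])
    return any(dist_point_to_segment(position, a, b) < threshold for a, b in edges)

# A 7 x 7 m square play area, like the 360-degree expo room
room = [(0, 0), (7, 0), (7, 7), (0, 7)]
print(guardian_grid_visible((3.5, 3.5), room))  # centre of the room -> False
print(guardian_grid_visible((0.2, 3.5), room))  # 20 cm from a wall -> True
```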

Screenshot 2021-06-09 at 21.29.43.png
Screenshot 2021-06-09 at 21.30.41.png
Screenshot 2021-06-09 at 21.30.29.png

Setting up Guardian System

Source: Oculus on YouTube



Quill is the VR illustration and animation tool built to empower artists and creators, whether to create final art or as a production tool for concept creation aid.

Quill allows users to paint and animate in virtual reality on an infinitely scalable canvas - with rich colors and intuitive tools. Quill is designed to be expressive, precise and to let the artist's "hand" come through clearly - whether that's a watercolor style, pencil style, oil painting style or other.

Uploading and distributing your Quill creations to a VR audience is now made easier with the media management tool, Oculus Media Studio.

Quill is made to be intuitive to artists. It's expressive, efficient and comfortable to use over a long period of time.

Quill supports various animation approaches such as frame-by-frame, keyframe, anim brush and puppeteering techniques. This set of powerful tools and workflows allow artists to control the look and feel of the animation without requiring any traditional CG technical knowledge, such as rigging or curve manipulation.

WAV, MP3 and Ambisonic sound files can be imported. These sounds can be used as stereo or spatial audio sources. Quill supports sphere, cone and frustum-based spatial audio emitters.

Personal goals

Mastering this software is something I strive for. Quill is something I have previously used before but I want to be able to use this software within my professional arsenal as a freelancer and I want to be known for my VR art. Creating all art and animation for this project in Quill allows me to gain experience in this field. 


For these reasons, Quill has been chosen to use for creating and developing this experience.
Similar experiences can be loaded up on a Quest 2 using Quill Theater. To publish a Quill project to Quill Theater, an Oculus Media Studio profile is needed.

Oculus Media Studio

Oculus Media Studio is a new media management tool for immersive creators to upload, publish, and analyze VR-first content. From this single hub, creators can easily distribute their stories directly to VR, where VR content is best enjoyed, while also enabling out-of-headset content discovery.

Visual fidelity is given a high priority in this pipeline. Accepted formats are high quality 4k+ 360 videos, 180 videos, and Quill creations that look good in VR.

Quill Theater

Regularly updated, the Quill Theater lets you visit high-quality immersive animations from the comfort of Oculus TV, bringing you the best in immersive entertainment. These immersive animations are fully 6DOF, some including multiple viewpoints to let you move positionally within the art.

Vibe Aliens

To test if Quill Theater was applicable for hosting the VR experience, an animation has been created and optimised to work with Quill Theater, called Vibe Aliens.

Quill Theater requires spawn areas.

Spawn areas are used to determine the starting point of the user, measured from the floor.
The following tutorial has been followed to properly use spawn areas. 

Because of this, the animation has been made a static experience, in which the user can watch dancing aliens on a floating rock in space.
To save resources on the Oculus Quest, the number of polygons had to be reduced, which can be done with the optimisation feature within Quill.
After publishing the experience to Quill Theater, it is sharable with an Oculus media link and set to public after approval, which takes a few days.

Vibe Aliens:

Screenshot 2021-06-12 at 17.13.28.png

Quill Theater

Vibe Aliens on Oculus Media Studio



The goal of this brainstorm is to come up with ideas for the content within the product.

PLANETART would like to display the VR-experience at WARP-Technopolis and the GOGBOT 2021 festival.
Both are heavily themed after neo-future, retro-future and technology. This inspired me to create a similarly themed experience to strengthen the consistency with the building and the festival.
For GOGBOT 2021, a brainstorm has been held about the theme ‘INFOCALYPSE NOW – Recalibrate Reality’. Kees de Groot sent an email containing information about this theme and brainstorm, including several keywords and descriptions:


In 2021, GOGBOT will research the idea of the truth/post truth from within our digitalised society and look at the use of technology in document manipulation, propaganda, populism and conspiracy theories. Within this main topic GOGBOT looks into topics such as what these technologies can mean for our privacy, the rise of alternative media, debates about gender and a new definition of reality.


Under the theme INFOCALYPSE NOW, Recalibrate Reality, the program of GOGBOT 2021 dives into the transformation of our information ecosystem in a time of synthetic media, deepfake, populism, physical distancing, trust crisis, rise of conspiracy theories etc…

Wondering what the impact of the democratisation of AI technologies is on our society and its communication networks, GOGBOT invites artists, creators, thinkers to present their latest work to inspire us and show us the way to recalibrate our senses..


Adjacent words:

  • Reality

  • Human

  • Unreal

  • Future

  • Loading

  • Synthetic

  • Paradise

  • Deepfake

  • SFX

  • Collision

  • GAN

  • Alternative Facts

Based on this document I started brainstorming about the contents of the VR experience. Abstract concepts needed to be implemented and twisted to make the product more interesting.

To do this, the CMD method 'Storytelling' was applied.

Already a big part of the inspiration came from the neo-future theming going on at WARP.
The entire nature of this experience already complies with a lot of the theming of the GOGBOT festival itself.

The experience is going to be something that is only visible digitally, using a VR headset to see and experience a 'new reality' within an empty physical room. This creates a collision between the digital and the physical world, where a visitor in the same room not wearing the goggles gets a completely different experience than the visitor who does.

More ideas for the cybercity:

  • Freeways are computer circuits

  • Skyscrapers are servers

  • Vending machines are arcade machines

  • Apartments as social bubbles

Screenshot 2021-06-04 at 19.01.55.png

Inspiration wall


The next goal is to establish an art direction, and look and feel for the product and the promotional campaign.

One of several professional methods to save and organise creative ideas during a project, is the use of an inspiration wall.

A commonly used service to collect inspiration is Pinterest. This service allows you to create a pin board where you can collect images, and it algorithmically suggests similar ones.

Screenshot 2021-06-01 at 15.07.55.png

A couple of ideas were generated using this method.

1. Character design direction
Colourful characters created from simple shapes to fill the world. The simplicity of the designs would translate well inside a busy city.

2. Landscape direction
To illustrate the digital look and feel of the neo-city, the landscape this city is located in could be shown off in the style of retrowave artworks where the mountains are visualised using lines and grids.

3. Colours
This city is going to be a neo-future style city. A few vibrant colour palettes that would fit this aesthetic were found in Kurzgesagt artwork.

4. Marketing
Retrowave design fits quite well with the neo-city, WARP Technopolis, GOGBOT and the infocalypse theme. Applying this design to the marketing, such as poster design or social media posts, makes sense and adds consistency.


Style test


To test out if the look and feel from the inspiration wall would translate to VR, a small scene was created in Quill.

A screenshot of the Pinterest board was taken and imported into Quill; this way, the colours could be picked directly from the image and applied within the scene.
The results left me pretty positive, and I believe this style would work well in the VR experience. This simplistic style would allow me to fully animate and create environments. The results were also shared with stakeholder Kees de Groot, who was very positive about them.

World style tests in Quill

Character style tests in Quill


Mood board


The next goal is to further establish an art direction and look and feel for the product, in a more uniform and pitchable way.


Mood boards are an essential part of many aesthetic design projects, in particular for communication across stakeholders. This way the client can be involved early in the process.

A mood board was constructed in Adobe Photoshop using images from the inspiration wall. This mood board has been pitched to stakeholder Kees de Groot.

Screenshot 2021-06-01 at 15.45.42.png

Mood board iterations


Final mood board created using images from the inspiration wall.




The product is mainly visual. The goal is to make the ideas that came out of the brainstorm concrete and visual, so these can be communicated to the stakeholders.

Sketching is a common method that is used to explore and communicate ideas.

Ideas that came from the brainstorm were put on paper. More ideas were generated by sketching out these initial ideas. Eventually, this led to comprehensive concept art of the different elements that the product is supposed to contain.
These were sent to client Kees de Groot, who was happy about the results. Communicating these ideas through sketches and concept art helped involve the client early on in the process.

Screenshot 2021-06-04 at 19.00.09.png

To make the experience cohesive in colour, a piece of concept art was created with colours inspired by the mood board and the inspiration wall.

Both the sketches and the concept art were created using Clip Studio Paint, digital art software used for illustration.

Concept art.png

Asset list


For managing this project, there needs to be a way to prioritise certain elements and assets; otherwise, the developer could sink too much time into parts of the project that are not as important as others.

MoSCoW as a prioritisation method is used to decide which requirements to complete first, which must come later and which to exclude.

Unlike a numbering system for setting priorities, the words mean something and make it easier to discuss what's important.

MoSCoW stands for must, should, could and would:

  • M - Must have this requirement to meet the business needs

  • S - Should have this requirement if possible, but project success does not rely on it

  • C - Could have this requirement if it does not affect anything else on the project

  • W - Would like to have this requirement later, but delivery won't be this time

Using this method, the following asset list has been created. The main purpose of this list is to motivate the developer to work on the project in an organised manner, making sure the prioritised items get worked on first.
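As a small illustration of how such a list can be kept in priority order, the MoSCoW labels can be mapped to ranks and the assets sorted by them. The asset names below are only examples, and the ranking values are arbitrary:

```python
# Map each MoSCoW label to a rank so the asset list can be sorted by priority
MOSCOW_RANK = {"must": 0, "should": 1, "could": 2, "would": 3}

# Example assets with their MoSCoW labels (illustrative, not the real list)
assets = [
    ("Set-up manual", "could"),
    ("Animated city buildings", "must"),
    ("Sound design", "should"),
    ("Marketing campaign", "would"),
]

# Print the list from highest to lowest priority
for name, label in sorted(assets, key=lambda a: MOSCOW_RANK[a[1]]):
    print(f"{label:>6}: {name}")
```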

Asset List.png

Creating assets in Quill

Quill was used to create the assets for the VR experience. Quill is a VR illustration and animation tool. Quill was chosen because I wanted to master this tool and gain more experience using this. Quill includes a feature where the project could be exported to Quill Theater, a software that is used to experience Quill creations within 6 degrees of freedom.


Several assets being created in Quill.

To make sure the art was cohesive across the experience, the coloured concept art was imported into Quill. The eyedropper tool within Quill was used to pick colours directly from the concept art. This way, the colours kept consistent and colourful across the different assets.

Quill has animation features that allow the artist to animate their assets and work intuitively. Using keyframes the artist can make clean interpolated movements. This technique was useful to animate robots, giving the robot a robotic motion. Quill allows for a workflow similar to 2D traditional animation, where the artist can paint frame-by-frame with onion skin. The frame-by-frame technique was used to add smear frames to certain animated assets, to convey sudden movements such as Tunnelbear entering its tunnel or Surfshark doing a kickflip.
In addition to those techniques, Quill allows for puppeteering using the grab tool. The artist can grab parts of the model and move them while the timeline plays; these movements are recorded instantly, which streamlines the animation process significantly.


For the creation of any asset, optimisation needs to be kept in mind. If not, this will result in more work later in production.
Every Quill scene can only have a certain number of draw calls and polygons. Draw calls are taken up by the use of different VR brushes and layers. To optimise an asset and reduce the number of draw calls it takes up, the artist needs to create the asset with as few brushes as possible and make sure the asset uses as few layers as possible.
The Building asset below was created using only two different brushes: the cylinder brush and the cube brush.
All of the animated assets have been baked and combined into one layer.
Quill also includes a tool that reduces the number of polygons. This makes certain curved and bent shapes look polygonal and reduces detail in colour gradients, so the artist needs to be selective about where and how much this optimisation tool is used.
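As a rough sketch of the bookkeeping involved: if every (layer, brush) combination costs one draw call, a running total shows when a scene goes over budget. All the numbers below, including the budget, are invented for illustration; they are not actual Quill or Quest limits.

```python
# Hypothetical per-scene budget; real Quill/Quest limits differ.
DRAW_CALL_BUDGET = 100

# Each asset: (name, number of layers, brushes used per layer)
scene = [
    ("Building", 1, 2),    # one baked layer, cylinder + cube brush
    ("Tunnelbear", 1, 3),
    ("Skybox", 1, 1),
]

# One draw call per (layer, brush) combination, summed over the scene
total = sum(layers * brushes for _, layers, brushes in scene)
print(f"Estimated draw calls: {total} / {DRAW_CALL_BUDGET}")
if total > DRAW_CALL_BUDGET:
    print("Over budget: bake layers together or reduce brush variety.")
```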

The following YouTube tutorial goes deeper into optimisation techniques:




All the assets created during the production phase need to be combined into one Quill project so the user can see them all when walking around. The size of the digital world has to fit the exhibition space. The functionality of this project needs to be tested and adjusted to fit the space accordingly.

One method to test functionality would be prototyping. Building a prototype helps the developer test a specific aspect of the product.

The following approach has been used to build the prototype.
Firstly, a sketch of the overall world was made. The room housing the experience was measured beforehand and has a square footprint, which the sketch took into account. This sketch has been used to create a rough map of the city, which is placed on the floor of the experience. This allows for placing the created assets accordingly.
Next to that, several environment assets were added, such as clouds, mountains, a gradient sky and the eye in the sky from the concept art. Some simplified buildings have also been created to fill space.

Screenshot 2021-06-08 at 23.34.02.png
Rough map.png
Rough map2.png

The sketch, the map and the assets being placed on the map within Quill VR.


Addition of mountains, buildings, clouds, eye in the sky and a gradient skybox


Sound Design

To add more depth to the product, sound design was added. Recordings of city soundscapes were downloaded and manipulated using different filters to fit the cyber theme. The software used to create this sound design was GarageBand. So far, three tracks were created based on three locations: a main city ambience, windy sounds for the mountains and a soundscape for the catfish market.

These sounds were made loopable by mixing the tail end of each track in with its head end. This way, the user should not notice where the sound starts and ends. The next step is implementing these sounds into the experience.
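This crossfade trick can be sketched in a few lines of Python, with plain numbers standing in for audio samples: the tail of the track fades out while the (removed) head fades in, so the final sample flows straight back into the loop start. This is a conceptual sketch, not GarageBand’s implementation.

```python
def make_loopable(samples, overlap):
    """Mix the tail of a track into its head so it loops without a seam.

    The last `overlap` samples fade out while the removed head fades in,
    so the final output sample connects smoothly to the loop start.
    """
    head, rest = samples[:overlap], samples[overlap:]
    body, tail = rest[:-overlap], rest[-overlap:]
    mixed = [t * (1 - (i + 1) / overlap) + h * ((i + 1) / overlap)
             for i, (t, h) in enumerate(zip(tail, head))]
    return body + mixed

track = [float(i) for i in range(8)]   # stand-in for audio samples
loop = make_loopable(track, overlap=2)
print(loop)  # [2.0, 3.0, 4.0, 5.0, 3.0, 1.0] -> the end flows back into 2.0
```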


To implement the sound design that has been created, it needs to be imported into Quill. Within Quill, the sound can be adjusted and placed in 3D space for a spatial audio effect.
The audio for specific landmarks of SYNTHCITY has been placed accordingly.
Spatial audio allows the user to associate different sounds with different spaces within the virtual world.

An internet tutorial has been followed to achieve this effect.
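The effect of a spatial audio emitter can be illustrated with the common inverse-distance attenuation model: a source plays at full volume up close and falls off with distance. The reference and maximum distances below are arbitrary illustrative values, not Quill’s actual parameters.

```python
import math

def spatial_gain(listener, source, ref_dist=1.0, max_dist=25.0):
    """Inverse-distance attenuation, a standard model behind spatial audio.

    Full volume inside `ref_dist`, silent beyond `max_dist`; in between,
    the gain falls off with 1 / distance.
    """
    d = math.dist(listener, source)
    if d <= ref_dist:
        return 1.0
    if d >= max_dist:
        return 0.0
    return ref_dist / d

# The catfish-market soundscape gets louder as the visitor approaches it
print(spatial_gain((0, 0, 0), (4, 0, 3)))   # 5 m away -> 0.2
```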


Mapping the experience to a room

360° room

One of the main goals of the experience is that it should enrich a physical space by adding digital art. The space provided by PLANETART within WARP Technopolis is the 360° expo room: a 7 by 7 metre room in which the four walls can be projected on by 8 projectors mounted on the ceiling, creating a full 360° image.
This space has been used to test the VR experience. Several other students were working on projects there at the time; one bottleneck for testing was a desk in the middle of the room, meaning not every part of the room was accessible to the user.

Since the VR headset has inside-out tracking, setting up the headset was relatively simple: the user only needs to draw the boundary within the guardian system. For optimal tracking there needed to be enough light in the room, which could be achieved by either turning on the eight projectors at the same time or turning on the lights. Since the lights were temporarily disabled, a light panel was propped against the wall to light up the room. This proved sufficient for stable tracking.


The 360° expo room at WARP Technopolis

The goal of this test was to see if the experience would fit within this room properly. If the experience is too big, the physical walls of the room prevent the user from taking a proper look at certain highlights. If the experience is too small, the user would walk around in empty space and the content would be too condensed.
To test this on standalone hardware, the experience needed to be uploaded to Oculus Media Studio.
As previously demonstrated with Vibe Aliens, publishing to Oculus Media Studio can take a few days, since it needs to be approved by the curators of Oculus TV. Waiting a few days between tests would be highly inconvenient, so the Virtual Animation Discord group was consulted. Nick Ladd, a VR artist and animator, pointed out that it is possible to load the experience from a draft folder on a headset logged into the same profile that uploaded the experience. This allows for immediate testing after uploading iterations.

The size of the entire experience can easily be adjusted by scaling and repositioning the spawn area. This was repeated until the edges of the city matched the edges of the room.
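Scaling by eye worked here, but the target scale factor can also be computed directly: divide the room size by the city’s footprint in each dimension and take the smaller ratio, so that nothing pokes through a physical wall. The city footprint below is a made-up example; only the 7 × 7 m room size comes from the actual space.

```python
def fit_scale(world_size, room_size):
    """Uniform scale factor that makes a world footprint fit inside a room.

    Both sizes are (width, depth) in metres; the smaller ratio is used
    so neither dimension extends past a physical wall.
    """
    ww, wd = world_size
    rw, rd = room_size
    return min(rw / ww, rd / wd)

# Hypothetical city footprint scaled into the 7 x 7 m expo room
scale = fit_scale(world_size=(10.0, 8.0), room_size=(7.0, 7.0))
print(round(scale, 3))  # 0.7
```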


During these tests, it was discovered that the spawn area is not locked to the guardian system, which means that every time the user takes off the headset, the position of the user resets to the spawn area, offsetting the entire experience. To solve this issue, the proximity sensor on the headset was blocked off with a piece of tape. This way the headset does not know when the user takes off the headset and does not reset the user’s position.
On top of that, an arrow was drawn on the floor using tape. This allows for calibrating the spawn-area position with a position in the physical room. Spawn areas are measured from the floor level set up by the guardian system; having an arrow on the floor helps set up the experience by standing on the line in the direction of the arrow before loading up the experience. After loading, the headset can be placed on the chair next to the arrow while powered on. The visitor only has to put the headset on to experience SYNTHCITY.


A piece of tape covering the proximity sensor of the Oculus Quest 2




Information cards

To bring forth the storytelling aspect of the project, several information cards were created. The experience is structured like an art exhibition, enabling the user to walk around and look at certain buildings and places. Just as art exhibitions often place signs with information next to the displayed artwork, these information cards were created to describe certain highlights of SYNTHCITY.
These information cards have been designed using Adobe Illustrator and imported into Quill.


The created information cards


The cards implemented within SYNTHCITY


Promotional content

Name & branding

The name of a project, product or artwork is crucial in communication.
To communicate this project through social media and promotional content, a name needed to be decided upon. During a brainstorm, several different names were made up. Together with stakeholder Kees de Groot, the name ‘SYNTHCITY’ was chosen. According to Kees, this name fits the hype around synthetic media and covers the digital VR animation nicely.

Different brainstormed names for the experience

Visual identity and poster in progress

With SYNTHCITY chosen as the name, a poster was created using Adobe Illustrator. Taking inspiration from posters collected on the inspiration wall, the visual identity of SYNTHCITY follows the line of retro digital futurism, similar to the movie Tron and other science-fiction movie posters from the 80s. This visual identity is not only used for the poster but will also be the leading theme throughout the promotional campaign and the presentation of the product.






SYNTHCITY social media post

To showcase the different assets created for this product, a visual framework in this visual style was created within Adobe After Effects. In this framework, the footage can be swapped out, so a series of showcases for multiple assets can be produced. This way the assets can be showcased and sorted in a library separately from the VR experience.

Visual framework created in Adobe After Effects


User tests


The goal is to find out if this version of the experience is usable, if the storytelling comes through, if the experience is comfortable and to get general opinions from users. 

Several CMD methods were used to test this. The experience was set up, and the people working in the WARP Technopolis building were invited to test it. The participants were asked to find four highlights (the market, the school, the pyramid and the bank) and were otherwise free to wander around the experience. They were asked to think aloud so the reasons behind their behaviour could be understood.
After they finished, a short interview was conducted about their experience.

During testing, most participants found all four highlights, but not all information cards were read.
All of the participants reacted positively to the experience, spending eight minutes on average within SYNTHCITY.
Some new ideas were generated during these tests. The Guardian system successfully prevented the participants from walking into the walls of the room.

Several inconveniences occurred during testing. Originally, the headset was supposed to record the participant's view during the test; however, the footage turned out to be corrupted and could not be viewed. Next to that, the desk in the middle of the room obstructed the pathway towards some of the highlights, and participants had to be warned when they almost collided with it. Another problem that arose is that when a participant moved too close to the light pointed at the wall, tracking was lost and the experience needed to be recalibrated.


SYNTHCITY being tested by Aaike Lutkemulller


Test conclusions

Based on the user tests, a few points can be concluded.
For more tracking stability, the room needs to be brighter and more evenly lit.