With the rise in the use of VR headsets, there are huge opportunities for developers to explore the uncharted territory of VR game development. VR is an exciting new development field, requiring engagement with a number of new disciplines and ideas as the developer seeks to create seamless, enjoyable and effective VR experiences.

As VR attracts more and more developers, it is important to know the available development tools and what functionalities they offer. Some players in the market have been in business for years; some are new to the field. Some are building on experience in other areas that give them a head-start, and some are built on a strong community fostered through ease of use.

This is a brief overview of a few of the available programming tools in the field.

Game Maker Studio

Launched in 1999, Game Maker Studio has been part of the game development community for over 17 years now, since long before the advent of VR. That background means it has a lot of built-in experience around user experience, which is vital when changing the interaction space in the way VR does. Users need to feel a sense of delight, or at least of ease, when encountering a VR environment; that means the developer of that environment can’t afford to skip thinking about what the end product will feel like in terms of user experience.

Game Maker Studio allows developers to produce games with high compatibility across iOS, Android, Windows and even HTML5, and is suitable for developers who are learning the basics of VR development. However, as an environment its days may be numbered; while it has extensions like GMOculus for developing apps for the Oculus Rift, many believe that GameMaker has outlived its usefulness and will likely be cast aside as VR game development continues to mature as a sector.

If you want to play with it, try this. The C# snippet below assists in the map generation of a game by adding elements called “Sprites” to map columns.

using System.Collections.Generic;

// eSprite and eSpriteFlags are enums defined elsewhere in the project.
public class Sprite
{
    public eSprite SpriteType;

    public int x;          // 0 to 255 (offset into tile)
    public int y;          // ditto
    public int z;          // 0 to 1023
    public float scale;    // x and y scales are the same
    public float angle;
    public eSpriteFlags flags;
}

public class MapColumn
{
    public List<int> column = new List<int>();
    public List<Sprite> sprites = new List<Sprite>();

    public void Add(int _tile)
    {
        column.Add(_tile);
    }

    public void AddSprite(eSprite _spr, int _x, int _y, int _z, float _scale, float _angle, eSpriteFlags _flags)
    {
        Sprite spr = new Sprite();
        spr.SpriteType = _spr;    // record the sprite type (previously dropped)
        spr.x = _x & 0xff;        // mask the offset into the 0-255 range
        spr.y = _y & 0xff;
        spr.z = _z & 0x3ff;       // mask the height into the 0-1023 range
        spr.scale = _scale;
        spr.angle = _angle;
        spr.flags = _flags;
        sprites.Add(spr);
    }
}
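As a quick usage sketch, here is the same class exercised end to end. The eSprite and eSpriteFlags enums below are hypothetical stand-ins for whatever the real project defines; the point is to show how out-of-range coordinates wrap under the bit masks.

```csharp
using System;
using System.Collections.Generic;

public enum eSprite { None, Tree, Rock }                   // stand-in enum for illustration
[Flags] public enum eSpriteFlags { None = 0, FlipX = 1 }   // stand-in flags for illustration

public class Sprite
{
    public eSprite SpriteType;
    public int x, y, z;
    public float scale, angle;
    public eSpriteFlags flags;
}

public class MapColumn
{
    public List<int> column = new List<int>();
    public List<Sprite> sprites = new List<Sprite>();

    public void Add(int _tile) { column.Add(_tile); }

    public void AddSprite(eSprite _spr, int _x, int _y, int _z,
                          float _scale, float _angle, eSpriteFlags _flags)
    {
        Sprite spr = new Sprite();
        spr.SpriteType = _spr;
        spr.x = _x & 0xff;    // out-of-range offsets wrap into 0-255
        spr.y = _y & 0xff;
        spr.z = _z & 0x3ff;   // heights wrap into 0-1023
        spr.scale = _scale;
        spr.angle = _angle;
        spr.flags = _flags;
        sprites.Add(spr);
    }
}

public static class Demo
{
    public static void Main()
    {
        var col = new MapColumn();
        col.Add(7);   // tile index for this column
        col.AddSprite(eSprite.Tree, 300, 5, 2000, 1.0f, 0.0f, eSpriteFlags.None);
        Console.WriteLine(col.sprites[0].x);   // 300 & 0xff = 44
        Console.WriteLine(col.sprites[0].z);   // 2000 & 0x3ff = 976
    }
}
```

Note that the masking silently wraps rather than clamps: an x offset of 300 becomes 44, which is cheap but can hide bugs if callers pass genuinely out-of-range values.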

Unity

Unity is undoubtedly one of the most popular game development engines today. Built with simplicity and a shallow learning curve in mind, it allows developers to build both two- and three-dimensional applications. All versions of Unity can target desktop and mobile platforms, while the Unity Pro version adds support for consoles such as the Nintendo Wii and PlayStation. It’s a solid option for onboarding a VR development process in a team that’s not used to it, or for a team that wants to sandbox for a while before choosing a triple-A platform to develop for. It’s also superb for indie game development.

Developing apps for VR using Unity is considered relatively simple; Unity has built well on its existing simplicity of approach. Most developers regard Unity as easy to learn, and it offers strong performance with mobile VR kits, lowering the bar for end users and developers alike. Unity comes highly recommended for developers who are new to the world of VR development.

Here’s a C# snippet, using the OSVR-Unity plugin, that provides automatic re-centering in Unity.

using UnityEngine;
using System.Collections;

namespace OSVR
{
    namespace Unity
    {
        public class AutoRecenter : MonoBehaviour
        {
            private ClientKit _clientKit;
            private DisplayController _displayController;
            private bool recentered = false;

            void Awake()
            {
                recentered = false;
                _clientKit = FindObjectOfType<ClientKit>();
                _displayController = FindObjectOfType<DisplayController>();
            }

            // Once the display has started up, rotate the room to match the
            // user's initial head orientation, then stop checking.
            void Update()
            {
                if (!recentered)
                {
                    if (_displayController != null && _displayController.CheckDisplayStartup())
                    {
                        if (_displayController.UseRenderManager)
                        {
                            _displayController.RenderManager.SetRoomRotationUsingHead();
                        }
                        else
                        {
                            _clientKit.context.SetRoomRotationUsingHead();
                        }
                        recentered = true;
                    }
                }
            }
        }
    }
}

Unreal Engine

Unreal Engine is another old warhorse of game development. Launched in 1998, Unreal Engine has since been one of the consistent big guns in the game development industry, known for its high-end products and for enabling exceptional graphics and display features. A leader in the 3D development space, VR was a natural move for Unreal, and its graphical strengths have allowed it to become a huge part of this new space.

Unreal Engine allows the developer to achieve exceptionally high frame rates, high resolutions and low latency; feats which are crucial for a good VR app, as they allow developers to sidestep common problems that arise from the processor-intensive nature of creating a consistent, smooth visual space for the user to inhabit. Smoothness is vital since there is a real danger of making users motion-sick if the environment that surrounds them is rendered in a glitchy or slow way.
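To put rough numbers on that: a headset refreshing at 90 Hz leaves only about 11 ms to render each frame. This is a back-of-the-envelope sketch in plain C# (no engine API involved), just to illustrate the budgets involved:

```csharp
using System;

public static class FrameBudget
{
    // Milliseconds available to render one frame at a given headset refresh rate.
    public static double BudgetMs(double refreshHz) => 1000.0 / refreshHz;

    public static void Main()
    {
        foreach (var hz in new[] { 60.0, 90.0, 120.0 })
        {
            // A frame that overruns this budget gets dropped or reprojected,
            // producing the judder that makes VR users unwell.
            Console.WriteLine($"{hz} Hz -> {BudgetMs(hz):F1} ms per frame");
        }
    }
}
```

At 90 Hz the budget works out to roughly 11.1 ms per frame, which is why every millisecond the engine saves on rendering matters so much more in VR than on a flat screen.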

The downside here is Unreal’s reputation for being hard to learn and explore. This stems from a deliberate decision on Unreal’s part: to create a codebase appropriate for triple-A games, it is built directly on C++. That choice is largely responsible for its better graphical output; compared with Unity’s C# basis, C++’s inherent strengths in speed let Unreal squeeze out more raw performance.

Here is a C++ code snippet which allows the player to pick up a cube in a game developed in Unreal Engine.

#include "VRCode.h"
#include "PickupCube.h"

// Sets default values
APickupCube::APickupCube()
{
    // Set this actor to call Tick() every frame. You can turn this off to improve performance if you don't need it.
    PrimaryActorTick.bCanEverTick = true;

    StaticMeshComponent = CreateDefaultSubobject<UStaticMeshComponent>( TEXT( "StaticMeshComponent" ) );
    StaticMeshComponent->SetSimulatePhysics( true );
    StaticMeshComponent->bGenerateOverlapEvents = true;
    StaticMeshComponent->SetCollisionProfileName( UCollisionProfile::PhysicsActor_ProfileName );
}

// Called when the game starts or when spawned
void APickupCube::BeginPlay()
{
    Super::BeginPlay();
}

// Called every frame
void APickupCube::Tick( float DeltaTime )
{
    Super::Tick( DeltaTime );
}

// Disable physics and attach the cube to the grabbing component (e.g. a motion controller).
void APickupCube::Pickup_Implementation( class USceneComponent *AttachTo )
{
    StaticMeshComponent->SetSimulatePhysics( false );
    USceneComponent *Root = GetRootComponent();

    FAttachmentTransformRules AttachmentTransformRules( EAttachmentRule::KeepWorld, false );
    Root->AttachToComponent( AttachTo, AttachmentTransformRules );
}

// Re-enable physics and detach, dropping the cube where it is.
void APickupCube::Drop_Implementation()
{
    StaticMeshComponent->SetSimulatePhysics( true );

    FDetachmentTransformRules DetachmentTransformRules( EDetachmentRule::KeepWorld, true );
    DetachFromActor( DetachmentTransformRules );
}

The development tool you choose essentially comes down to two things: the functional requirements of your VR app, and ease of use for your level of development experience. You can keep it simple or go for complexity, depending on your needs and the challenge you want to set yourself. Happy coding!