Unity Vs Unreal Engine: Which Is Better For XR Development?

Many developers use Unity and Unreal Engine to build XR (Extended Reality) applications. Both engines let you design immersive apps on top of a unified codebase with access to powerful, battle-tested features. At Webo 360 Solutions, we use both Unity and Unreal to deliver XR projects efficiently and on time. In this post, we walk through the strengths and weaknesses of each engine using real project examples and client experiences.

Unity: Our Starting Point In XR

Our XR story began with a VR interior design app that lets users furnish virtual living spaces on Meta Quest. Thanks to Unity's XR Interaction Toolkit, prototyping was straightforward. We finished the MVP within a week and handed it to the client for early feedback, and that feedback let us iterate quickly and build out the full experience with confidence. Here's a basic script we used to let users grab, rotate, and scale objects in VR:

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class DecorObjectInteraction : MonoBehaviour
{
    private XRGrabInteractable grabInteractable;
    private Vector3 initialScale;

    void Awake()
    {
        grabInteractable = GetComponent<XRGrabInteractable>();
        initialScale = transform.localScale;
        grabInteractable.selectEntered.AddListener(OnGrabbed);
        grabInteractable.selectExited.AddListener(OnReleased);
    }

    void OnDestroy()
    {
        grabInteractable.selectEntered.RemoveListener(OnGrabbed);
        grabInteractable.selectExited.RemoveListener(OnReleased);
    }

    private void OnGrabbed(SelectEnterEventArgs args)
    {
        Debug.Log($"{gameObject.name} grabbed.");
        // Optional: highlight or play a sound
    }

    private void OnReleased(SelectExitEventArgs args)
    {
        Debug.Log($"{gameObject.name} released.");
        // Optional: snap to grid or ground
    }

    // Optional: scale/rotate logic driven by UI or gestures
    public void ScaleUp()
    {
        transform.localScale += Vector3.one * 0.1f;
    }

    public void ScaleDown()
    {
        // Never shrink below the object's initial scale
        transform.localScale = Vector3.Max(initialScale, transform.localScale - Vector3.one * 0.1f);
    }

    public void RotateRight()
    {
        transform.Rotate(Vector3.up, 15f);
    }

    public void RotateLeft()
    {
        transform.Rotate(Vector3.up, -15f);
    }
}

This simple interaction gave life to our VR environment and laid the foundation for future XR apps at Webo 360 Solutions.

Switching To Unreal For Realistic Visuals

Later on, a real estate client approached us looking for a photorealistic VR walkthrough. Unity couldn't deliver the level of visual fidelity the project required, so we switched to Unreal Engine and started from its Collaborative Viewer Template.

Unreal's Lumen (dynamic global illumination) and Nanite (virtualized geometry) let us produce stunning VR environments that users could explore together.

We also wrote a C++ component that lets users change wall textures or floor finishes in real time using a laser pointer. Here's part of that code:

// MyMaterialChangerComponent.h
UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class YOURPROJECT_API UMyMaterialChangerComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    UMyMaterialChangerComponent();

    // Call this to change the material at runtime
    UFUNCTION(BlueprintCallable, Category="Material")
    void ChangeMaterial(UMaterialInterface* NewMaterial, int32 ElementIndex = 0);

protected:
    virtual void BeginPlay() override;

private:
    UPROPERTY()
    UMeshComponent* MeshComponent;
};

// MyMaterialChangerComponent.cpp
#include "MyMaterialChangerComponent.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "Components/SkeletalMeshComponent.h"

UMyMaterialChangerComponent::UMyMaterialChangerComponent()
{
    PrimaryComponentTick.bCanEverTick = false;
}

void UMyMaterialChangerComponent::BeginPlay()
{
    Super::BeginPlay();

    // Try to find a mesh component on the owner
    AActor* Owner = GetOwner();
    if (Owner)
    {
        MeshComponent = Owner->FindComponentByClass<UStaticMeshComponent>();
        if (!MeshComponent)
        {
            MeshComponent = Owner->FindComponentByClass<USkeletalMeshComponent>();
        }
    }
}

void UMyMaterialChangerComponent::ChangeMaterial(UMaterialInterface* NewMaterial, int32 ElementIndex)
{
    if (MeshComponent && NewMaterial)
    {
        MeshComponent->SetMaterial(ElementIndex, NewMaterial);
    }
}

This quick feature turned out to be a game-changer: it let users personalize the space during live sessions, which boosted interactivity and client satisfaction.

Performance Challenges And Solutions

While Unreal delivers impressive graphics, it can strain performance in standalone VR. One of our VR safety training apps suffered frame-rate drops until we optimized it with Hierarchical Instanced Static Meshes and baked lighting. Unity, on the other hand, makes it easier to fine-tune performance for mobile VR. For example, we used a script that adjusts an object's LOD (Level of Detail) based on the user's distance from it, which boosted performance by about 30% on the Meta Quest 2.

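The distance-based LOD switching we describe above worked roughly like this. This is a minimal sketch rather than our exact production script: the class name, field names, and thresholds are illustrative, and it simply toggles between a high- and low-detail object assigned in the Inspector.

```csharp
using UnityEngine;

// Illustrative sketch of distance-based LOD switching (not the exact
// production script). Assign the high- and low-detail objects and a
// switch distance in the Inspector.
public class DistanceLodSwitcher : MonoBehaviour
{
    public GameObject highDetail;       // full-resolution mesh
    public GameObject lowDetail;        // simplified mesh
    public float switchDistance = 10f;  // metres; tune per device
    public float checkInterval = 0.25f; // throttle checks to save CPU

    private Transform cam;
    private float nextCheck;

    void Start()
    {
        // Assumes the user's headset is the main camera
        cam = Camera.main.transform;
    }

    void Update()
    {
        if (Time.time < nextCheck) return;
        nextCheck = Time.time + checkInterval;

        // Use squared distance to avoid a square root every check
        float sqrDist = (cam.position - transform.position).sqrMagnitude;
        bool useHigh = sqrDist < switchDistance * switchDistance;
        highDetail.SetActive(useHigh);
        lowDetail.SetActive(!useHigh);
    }
}
```

In practice, Unity's built-in LODGroup component covers most of this out of the box; a custom script like the above is mainly useful when you want direct control over switch distances and check frequency on mobile hardware.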

We always encourage our developers to use smart LODs and culling techniques when building for standalone VR devices. It spares you both performance problems and development headaches.

MetaHuman Experiments

The MetaHuman Animator stands out as one of the coolest tools we've used in Unreal. We created a face-scanning app that produces realistic digital characters. We used Live Link to stream facial data from an iPhone and map it directly to MetaHuman characters. We then placed them in a VR training room where users could interact with these lifelike avatars in real time. Here's how our setup worked:

  • Face data streamed from the iPhone via Live Link
  • Synced with MetaHuman face controls
  • Triggered expressions and animations in real time

The results were so realistic that clients thought we hired a Hollywood studio. That’s how powerful this tech is when paired with good hardware.

Mixed Reality: High Potential, Limited Readiness

We gave MR app development a shot on Quest Pro and Vision Pro, but we weren’t happy with the outcome. Here’s what we noticed:

  • Visual distortion in the passthrough feed
  • Poor lighting matching between virtual and real objects
  • Lag in hand tracking

One app struggled to align virtual workstations with the real world, and bright spaces made the problem worse. For now, we recommend MR only for small guided interactions, and steering clear of anything that needs pixel-perfect passthrough visuals. The tech still has some catching up to do.

Learning And Playing: XR At Home

At home, I saw XR's potential to boost learning firsthand. I set up Tilt Brush for my kid, and he drew a whole 3D dragon out of thin air. That showed me how XR can spark creativity and help kids learn. We also created an AR dinosaur flashcard app with Unity: kids scan a card and a roaring 3D dino pops up. Here's a simplified version of the code we used:

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using System.Collections.Generic;

public class DinoImageTracker : MonoBehaviour
{
    public ARTrackedImageManager trackedImageManager;

    [System.Serializable]
    public class DinoData
    {
        public string imageName;      // Must match the reference image name
        public GameObject dinoPrefab; // 3D prefab to spawn
    }

    public List<DinoData> dinoLibrary = new List<DinoData>();

    private Dictionary<string, GameObject> spawnedDinos = new Dictionary<string, GameObject>();

    void OnEnable()
    {
        trackedImageManager.trackedImagesChanged += OnTrackedImagesChanged;
    }

    void OnDisable()
    {
        trackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged;
    }

    void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs eventArgs)
    {
        // Handle newly detected images
        foreach (ARTrackedImage trackedImage in eventArgs.added)
        {
            SpawnOrUpdateDino(trackedImage);
        }

        // Handle updates (position, tracking state)
        foreach (ARTrackedImage trackedImage in eventArgs.updated)
        {
            SpawnOrUpdateDino(trackedImage);
        }

        // Handle removed images
        foreach (ARTrackedImage trackedImage in eventArgs.removed)
        {
            if (spawnedDinos.ContainsKey(trackedImage.referenceImage.name))
            {
                Destroy(spawnedDinos[trackedImage.referenceImage.name]);
                spawnedDinos.Remove(trackedImage.referenceImage.name);
            }
        }
    }

    void SpawnOrUpdateDino(ARTrackedImage trackedImage)
    {
        string imageName = trackedImage.referenceImage.name;

        if (!spawnedDinos.ContainsKey(imageName))
        {
            GameObject prefab = GetDinoPrefab(imageName);
            if (prefab != null)
            {
                GameObject dino = Instantiate(prefab, trackedImage.transform.position, trackedImage.transform.rotation);
                spawnedDinos[imageName] = dino;

                // Optional: play roar animation and sound
                Animator animator = dino.GetComponent<Animator>();
                if (animator != null) animator.SetTrigger("Roar");

                AudioSource audioSource = dino.GetComponent<AudioSource>();
                if (audioSource != null) audioSource.Play();
            }
        }
        else
        {
            // Update position if already spawned
            GameObject dino = spawnedDinos[imageName];
            dino.transform.position = trackedImage.transform.position;
            dino.transform.rotation = trackedImage.transform.rotation;
        }
    }

    GameObject GetDinoPrefab(string imageName)
    {
        foreach (var data in dinoLibrary)
        {
            if (data.imageName == imageName)
                return data.dinoPrefab;
        }
        return null;
    }
}

Seeing kids interact with XR in this way made us build a full prototype for XR education. Spatial learning keeps them focused, creative, and excited.

Final Thoughts: XR Is A Blend Of Worlds

These days, AR, VR, and MR are all part of one big family: XR (Extended Reality). It's not just about games anymore; XR is changing how we train, learn, and even talk to each other. From squeezing frame rates out of standalone headsets to designing realistic characters, trying out new gear, and teaching children, it keeps opening fresh opportunities.

Try Our XR Demo Builds

At Webo 360 Solutions, we've tackled numerous XR projects over the years, and we've learned one key thing: you never stop learning. We're constantly trying out fresh concepts and tools. Right now, we have several demo builds ready for you to check out, including:

  • A motion capture viewer for Meta Quest
  • An AR model viewer for product showcases

If you’re curious or want to try them out, feel free to reach out. We will be happy to help you.

Thinking About XR Development?

For those beginning their XR adventure, we suggest giving both Unity and Unreal Engine a shot. Each has its strong points, and knowing when to use which one can make a big difference.

Need help choosing the right tool?

Looking for real project code to kick-start your development?

We are happy to share tips, code samples, and lessons we’ve learned along the way.

Let’s build and explore the future of immersive tech together.
