Developer Documentation

Neighborhood Showcase

We have built a simple showcase based on Unity and Mirror Networking where players run around a foggy neighborhood and can experience 3D voice, positioned according to their location in 3D space, as well as a position-independent walkie-talkie with multiple channels.

In addition to voice, this demo leverages UserData to sync each player's location on the map with the ODIN servers. A simple web app shows player positions, whether they are talking and which direction they are running. You can also talk to other players directly from within the Commander App.

Screenshots

These shots give you an impression of the demo. Try to find your friends and colleagues in this foggy world.

Downloads

You can download the showcase demo with the URLs below. Try it with your friends and have fun :-).

Platform          Size     Download
Windows x86_64    1.1 GB   Download
Linux amd64       1.1 GB   Download
macOS Universal   1.1 GB   Download

Commander

In this web app, you can see all players that are active within the demo showcase. Download the demo, run it and you’ll see yourself running around in the app below. This app is built with the Web SDK.

You can also open the Commander standalone on your mobile device using this link:

Open the Commander in a new tab

The web app is built as a Web Component, which can be added to any website with just a few lines:

<script src="/path/to/widget/odin-showcase-commander.js" type="text/javascript" ></script>
<link href="/path/to/widget/odin-showcase-commander.css" rel="stylesheet">

<odin-showcase-commander></odin-showcase-commander>

As you can see, ODIN can be integrated anywhere with ease. Check out this live demo embedded directly into our developer documentation.

Whitepaper

Some technical details about how everything works.

Showcase Demo

This demo is built with Unity and Mirror Networking. A dedicated server runs on our 4Netplayers server hosting service, and the default IP address in the lobby points to this server. You can also host your own server-client combo by clicking the Host button in the lobby.

While building this app, we simply followed our own guide on integrating ODIN with Unity and Mirror Networking. Follow that guide to learn how the basic network topology is designed and how 3D positional audio is integrated into the game. Don’t worry, it’s a short read, as the whole process is pretty simple and straightforward.

The demo features two ways to talk to each other: 3D positional audio, where volume and direction are adjusted automatically based on the position of the other players in the scene, and walkie-talkie communication with audio effects and multiple channels. Our event handling guide describes how to handle multiple voice rooms simultaneously.
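
For reference, joining both kinds of rooms from Unity can be as small as the following sketch. This is not the exact showcase code; it assumes the JoinRoom call provided by the ODIN Unity SDK’s OdinHandler singleton and uses the room names referenced later in this document (“ShooterSample” and rooms prefixed with “WalkieTalkie”):

Joining multiple rooms (sketch)
// Join the 3D positional audio room and one walkie-talkie channel.
// OdinHandler keeps both connections open at the same time, so the player
// hears spatial voice and the selected radio channel simultaneously.
OdinHandler.Instance.JoinRoom("ShooterSample");
OdinHandler.Instance.JoinRoom("WalkieTalkie1");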

Commander App

The commander app is a very basic web application built with Angular 13, integrating our ODIN Web SDK. It’s based on our Angular demo that is available in our GitHub repository.

Sending Player Positions

To integrate ODIN with Mirror Networking, we already leverage User Data to map Mirror Networking IDs to ODIN Peer IDs (so that the voice of Player A is attached to the corresponding player game object in the scene).

For the Commander App, we extended the PlayerUserDataJsonFormat class with a few additional properties:

[Serializable]
public class PlayerUserDataJsonFormat : IUserData
{
    public string name;
    public string seed;
    
    // These are new
    public float xPosition;
    public float yPosition;
    public float heading;
    public bool spatialTalking;
    public bool walkieTalkieTalking;
    
    // ...
}
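
The name and seed fields already existed for the Mirror Networking integration described above. Assuming the seed field is what carries the Mirror network id (which is how we read the mapping described above; the real showcase may store it differently), the GetPlayerForOdinPeer helper used further below could look roughly like this:

Mapping ODIN peers to Mirror players (sketch)
private PlayerController GetPlayerForOdinPeer(PlayerUserDataJsonFormat userData)
{
    // Compare the id stored in the peer's user data with the netId that
    // Mirror assigns to every networked player object in the scene.
    foreach (PlayerController player in FindObjectsOfType<PlayerController>())
    {
        if (player.netId.ToString() == userData.seed)
        {
            return player;
        }
    }
    // No match found: this peer has no player presence in the game (e.g. a spectator)
    return null;
}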

The UpdatePlayerPosition function in our GameManager singleton then creates a JSON representation of the PlayerUserDataJsonFormat and sends it to the ODIN server at most ten times a second; the ODIN server makes sure that this user data stays in sync with every client:

UpdatePlayerPosition example
public void UpdatePlayerPosition(PlayerController controller)
{
    // Update heading, position (top down) and talking indicators        
    playerUserData.xPosition = controller.transform.position.x;
    playerUserData.yPosition = controller.transform.position.z;
    playerUserData.heading = controller.transform.eulerAngles.y;
    playerUserData.spatialTalking = controller.spatialTalking;
    playerUserData.walkieTalkieTalking = controller.walkieTalkieTalking;

    // Make sure we only send data if it has changed, and at most once per update interval
    if (Time.time - _lastPositionUpdateTime > updatePositionIntervalInSeconds)
    {
        // Compare the JSON representation with the last payload we sent to
        // avoid pushing identical updates to the server
        var userDataJson = playerUserData.ToString();
        if (userDataJson != _lastSentUserData)
        {
            // Send the JSON representation of the user data to the ODIN server
            OdinHandler.Instance.UpdateUserData(playerUserData.ToUserData());
            _lastPositionUpdateTime = Time.time;
            _lastSentUserData = userDataJson;
        }
        }
    }
}
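
The rate limiting happens inside the function itself, so the caller doesn’t have to worry about it. As a hedged illustration (the actual call site in the showcase may differ), the local player could simply push its state every frame from its Update loop:

Calling UpdatePlayerPosition (sketch)
// Inside PlayerController (a Mirror NetworkBehaviour); hypothetical call site.
void Update()
{
    // Only the local player reports its own position; remote players are
    // updated through the user data they send themselves.
    if (isLocalPlayer)
    {
        GameManager.Instance.UpdatePlayerPosition(this);
    }
}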

User Data is a very powerful feature. You just define a data structure that you want to attach to each peer in the ODIN room, and ODIN makes sure that this data is shared with every connected client. User Data can be anything, as internally it is just a byte array. JSON is a good format as it’s available cross-platform, can be processed easily and is human-readable, so you can quickly check via console logs whether everything works as expected.
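
For completeness, the ToUserData and FromUserData helpers used throughout this page can be as small as the following sketch. It assumes Unity’s JsonUtility for serialization and that the ODIN Unity SDK’s UserData type can be constructed from a string and decoded again via ToString(), as in the SDK samples:

JSON serialization helpers (sketch)
// Part of PlayerUserDataJsonFormat; a minimal sketch, not the exact showcase code.
public UserData ToUserData()
{
    // Serialize all public fields of this class to JSON and wrap the string
    // in ODIN's UserData (internally just a byte array).
    return new UserData(JsonUtility.ToJson(this));
}

public static PlayerUserDataJsonFormat FromUserData(UserData userData)
{
    // Decode the byte array back into a JSON string and parse it.
    return JsonUtility.FromJson<PlayerUserDataJsonFormat>(userData.ToString());
}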

Leveraging player positions

Our web-based commander app connects to the same ODIN room. Once connected, it receives PeerUserDataChanged events that contain the JSON representation we built within our Unity application. The user data looks like this:

User data as JSON
{
  "heading": 109.49986267089844,
  "name": "John Mclain",
  "seed": "14",
  "spatialTalking": true,
  "walkieTalkieTalking": false,
  "xPosition": -21.31228256225586,
  "yPosition": -0.24028445780277252
}

Using that data, we map the position in Unity units to the 2D-position on our top-down map shown in the commander. Then we just use the CSS properties left and top to position our location pin and use transform: rotate(..deg) to rotate the direction indicator.

A simple icon within the direction pin reflects if the user is talking right now.

Commander’s voice

We also wanted to allow the commander to talk to players, to showcase the multi-platform support ODIN provides. Everyone can talk with everyone else, independently of OS, framework or ecosystem, within the same ODIN room (and, if you need it, with permissions handled by our token system).

Warning

We disabled audio output in this demo, so the commander can say something to players in the game but cannot hear them. As players within the game might not be aware of the commander app, we did not want to expose their voices to everyone with an internet connection. Our token system allows detailed permission handling to make sure only those with permission can hear others, but as this is a simple tech demo, we kept it simple.

Keep the lower button pressed and start talking. All players in the game will hear your voice with a nice audio effect.

This is as simple as it gets. When connecting to a room, we create an OdinMedia object but don’t start it yet, i.e. everything is prepared, but no data is sent to the ODIN servers:

Adding Media to room
odinRoom.createMedia(mediaStream).then((media) => {
  this.ownMedia = media;
});

Then, when the button is pressed, the startTalking function is called, and once the button is released, the stopTalking function is called, which stops the media. The microphone stays active, but no data is sent to ODIN anymore; the user is effectively muted again.

Mute and unmuting example
  startTalking() {
    if (this.ownMedia) {
      this.ownMedia.start();
    }
  }

  stopTalking() {
    if (this.ownMedia) {
      this.ownMedia.stop();
    }
  }

When the commander app is started, the user joins the ODIN room as a “spectator”. Within the game, we need to handle that differently than a “real” user joining the room with a player object in the scene. This is very easy, and we’ll show you how we handled it. Whenever a peer joins a room, all clients receive a PeerJoined event, and once that peer adds a media stream, the OnMediaAdded callback shown below is invoked; that is where we decide which game object the voice gets attached to.

While this might look a bit confusing, it’s very simple:

  • If a peer joined one of the walkie-talkie rooms, we attach the user’s voice (a dynamic AudioSource provided by the ODIN SDK) to the walkie-talkie game object of our player “skin”.
  • If a user joins the “ShooterSample” room (we just gave it this name), we try to find a player that has a presence within the game and attach the voice to that player object. This way, the sound is positioned in 3D space and Unity can process the audio so that volume and direction reflect the position in 3D space.
  • If there is no corresponding player in the scene, this is a spectator joining from anywhere (this can be a web app, an iOS or Android app, or even someone who launches the game and joins as a spectator). In this case, we attach the voice to an arbitrary object in the scene and set spatialBlend to 0.0f so that the volume is always the same (i.e. a god-like voice).

OnMediaAdded callback in Unity
public void OnMediaAdded(object sender, MediaAddedEventArgs eventArgs)
{
    Room room = sender as Room;
    Debug.Log($"ODIN MEDIA ADDED. Room: {room?.Config.Name}, PeerId: {eventArgs?.PeerId}, MediaId: {eventArgs?.Media.Id}");

    // Check if this is 3D sound or Walkie Talkie
    if (room.Config.Name.StartsWith("WalkieTalkie"))
    {
        // A player connected via walkie-talkie. Attach the playback to the local player's walkie-talkie
        var localPlayerController = GameManager.Instance.GetLocalPlayerController();
        if (localPlayerController && localPlayerController.walkieTalkie)
        {
            PlayerUserDataJsonFormat userData = PlayerUserDataJsonFormat.FromUserData(eventArgs.Peer.UserData);
            PlayerController player = GetPlayerForOdinPeer(userData);
            if (player)
            {
                AttachWalkieTalkiePlayback(localPlayerController, player, room, eventArgs.PeerId, eventArgs.Media.Id);    
            }
            else
            {
                Debug.LogWarning("Attaching Walkie Talkie failed, could not find player");
            }
        }
    }
    else
    {
        // This is 3D sound: find the player object for this stream and attach the AudioSource to it
        if (!eventArgs.Peer.UserData.IsEmpty())
        {
            PlayerUserDataJsonFormat userData = PlayerUserDataJsonFormat.FromUserData(eventArgs.Peer.UserData);
            PlayerController player = GetPlayerForOdinPeer(userData);
            if (player)
            {
                AttachOdinPlaybackToPlayer(player, room, eventArgs.PeerId, eventArgs.Media.Id);
            }
            else
            {
                Debug.Log("Spectator with user data joined");
                AttachSpectator(room, eventArgs.PeerId, eventArgs.Media.Id);                    
            }
        }
        else
        {
            Debug.Log("Spectator joined");
            AttachSpectator(room, eventArgs.PeerId, eventArgs.Media.Id);
        }
    }
}
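
One last detail: the OnMediaAdded callback above has to be registered with the OdinHandler, either in the inspector or in code. A minimal sketch, assuming the OnMediaAdded UnityEvent exposed by the ODIN Unity SDK’s OdinHandler singleton:

Registering the callback (sketch)
void Start()
{
    // Wire up the handler in code; alternatively, drag the method onto the
    // OnMediaAdded event in the OdinHandler inspector.
    OdinHandler.Instance.OnMediaAdded.AddListener(OnMediaAdded);
}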