Docs
Plug 'n' Play
Unity

Unity Plug 'n' Play SDK Docs

Installing the Charisma Unity Plug ‘n’ Play SDK

Project setup

First, open Unity and create a blank 3D project (preferably using the High Definition Render Pipeline).

Creating a new Unity project in Unity Hub

Then, you'll need to install both the Charisma Unity SDK and the Charisma Unity Plug 'n' Play SDK from GitHub via the links below:

Charisma Unity SDK (opens in a new tab)

Charisma Unity Plug 'n' Play SDK (opens in a new tab)

Install both plugins using the Unity package manager (UPM), via GitHub URL.

Copying a GitHub repository URL
Package Manager field select in the Unity Editor toolbar
Selecting GitHub URL option in Unity Package Manager

Then, in UPM, import the Plug 'n' Play samples into your project.

Importing samples into Unity Project in the Unity Package Manager

You now have both plugins installed and the example content imported into your Unity project.

Connecting your Unity project to your Charisma story

Charisma story setup

First, create your Charisma story. You will need to log in to your Charisma account and go to the Stories page.

Charisma Pro Story creation panel button

Then, create a "Pro Story", and choose "Plug 'n' Play" as the template for the story.

Charisma Pro Story creation menu, with Plug and Play template selected

The sample project contains a basic story with two characters and works out-of-the-box with our Plug 'n' Play SDKs, so you can get started learning the Charisma ecosystem.

Charisma story overview page

Engine connection to Charisma

Now, let's go back to Unity. Search the project files for the PlugNPlaySample scene and double-click it. This is the sample scene you imported in the previous section of the tutorial.

Project files search, with Plug and Play scene found in the files

In the PlugNPlaySample scene, take a look at the Hierarchy. Click on CharismaPlaythroughInstance and check out the Inspector. CharismaPlaythroughInstance is the object that controls the connection between the client and the Charisma server.

PlaythroughInstance object highlighted in the hierarchy and inspector

To connect, we will need four values: StoryID, StoryVersion, ApiKey, and StartGraphReferenceId. Let's go back to the Charisma story page in your web browser to find these values.

StoryID & ApiKey

These are the values that link your Unity SDK to the correct story on the Charisma server.

StoryID and ApiKey can be found on the Story Overview page of your story if you scroll down to the "Play API Access" section.

Click "Generate Key" to generate the API key.

Then copy the StoryID and API key, and paste them into the matching fields in the CharismaPlaythroughInstance component in Unity.

Panel in Charisma story page showing story ID and API key

StartGraphReferenceID

This value tells Charisma which graph your Charisma story should start from.

In our case, click the three dots on the "Start Subplot" in the left toolbar, then click "Edit Details".

In the popup that appears, copy the "Reference ID" value, and paste it into the matching field in Unity.

Charisma popup showing details of a subplot

StoryVersion

StoryVersion tells Charisma which version of the story you want to play.

In Unity, set this value to -1. This means you want to play the unpublished, current draft version of your story.

PlaythroughInstance connection parameters filled with example values

Later on, when your story is published, you can set this to 0 to point Charisma at the current published version of your story, or a value greater than 0 to point at a specific version of the story instead.

Test the connection to Charisma

If everything was set up correctly, you can now play the test story in your Unity editor.

Give it a try - run the game in Play mode and see your template story come to life!

In the next tutorial sections, you will learn how to modify various aspects of the Plug 'n' Play SDK with custom content.

Linking your Charisma character to a 3D character model

Creating new characters in Charisma

Our Plug 'n' Play SDK comes with a story that places two characters in the scene by default.

To add a new character, begin by adding one to your Charisma story. Go to "Characters" and click "Add new Character".

Character panel on Charisma website showing a list of existing characters

Give the character a unique name, select a voice, and Save.

Once you have created a character, give them some dialogue on the graph (the Start Subplot in the case of the Plug 'n' Play template).

Character node in Charisma story graph, with new character linked to the node

Creating new characters in engine

Now, switch over to Unity. Our Plug 'n' Play sample comes with six character models ready to use. These can be found in:

Packages > Charisma.ai Plug-N-Play > Resources > Prefabs > NPCs

Project files search, with various NPC prefabs found in search

Drag one of the characters into the scene.

To link the 3D model to your Charisma character, click on your newly added prefab and navigate to the Inspector, where you'll find the "Charisma Humanoid Actor" component with a CharacterID field. Write the name of your Charisma character into this field exactly as it appears in Charisma, and you are done - your 3D model is now linked to your Charisma character and will speak whenever the graph hits a relevant character node.

HumanoidActorComponent highlighted in the hierarchy and inspector
Character animations and default behaviour

Default animation behaviour

The Plug 'n' Play SDK includes a full animation logic system for the provided character models, and a suite of base layer, full body, head, and upper body animations for each model. Blinking and facial expressions are also handled by the animation system.

Overview of the animator component on an NPC character

Animations are split into the following key groups:

  • Idle: Animations played when a character is idle (not talking, walking, or performing a custom animation/mannerism)

  • Mannerism: Animations that can play intermittently during the idle state

  • Walking: Animations that play when a character is walking

  • Turning: Animations that play when a character is turning

  • Talking: Animations that play when a character is talking

These animations are governed automatically by the animation system, and the appropriate animation plays depending on what the character is doing.

The animation system is configured through the HumanoidNPCAnimationConfig ScriptableObject. You can find this configuration by searching the Project files, or by locating it attached to each character.

HumanoidNPCAnimationConfig in the project files
HumanoidNPCAnimationConfig highlighted in hierarchy and inspector

Inside this config, under the Animation Data list, you will find various settings that govern how the animations play.

Overview of the HumanoidNPCAnimationConfig in the inspector

Each animation has an AnimationNodeName, Animation Tags, and an Associated Charisma Emotion.

  • AnimationNodeName is a read-only name of the animation. You can auto-populate all the animations in the animation config by clicking Get Animation Data From Controller at the bottom of the config.

  • Animation Tags govern the conditions under which an animation plays (e.g. Talking animation tags: Talking, Standing)

  • Associated Charisma Emotion restricts the animation to play only when the character is influenced by that emotion in the Charisma graph (e.g. JOY)

Playing an animation

You can force an animation to play when you hit a specific node in a graph by attaching metadata to a Character node (opens in a new tab).

For example, you can attach the metadata [play-animation | ashby,IdleMannerismArmStretch] to play the IdleMannerismArmStretch animation from the animation config.

Key | Value | Function of Metadata
play-animation | [character name],[animation name] | Plays the specified animation on the target character
Meta data panel in Charisma story graph, with animation meta data values filled in
Character facial expressions

Default facial expression behaviour

Facial expressions in the Plug 'n' Play SDK are controlled through the Charisma Emotions system. You can apply emotions on a Character node (opens in a new tab) in the Charisma graph; these are then reflected in the character's face on the SDK side.

Emotions panel in Charisma story graph, showing Willow character with Joy values set

Each expression has an associated Scriptable Object with a list of target blendshape configurations that get applied to the character's face when an emotion is received from Charisma. You can find a list of facial expression scriptable objects under

Assets > Samples > Charisma.ai Plug-N-Play > [VERSION] > Example > Animation > Data > Emotions

Facial expressions in project files

These emotions are attached to the HumanoidNPCAnimationConfig under Facial Expressions.

Facial expression objects referenced inside the animation config

You can apply an Emotion to a Character on a Character node (opens in a new tab) in Charisma. Apply an emotion to a character with some intensity, and run the game to see how their facial expression is affected.

Charisma graph node with the Emotions button highlighted

Adding a new facial expression

Go to the folder with all the facial expressions, right click, and select Create > Charisma > Config > NPC Facial Expression.

Rename the expression as you see fit, and in the Associated Charisma Emotion field, write the name of the Charisma emotion you'd like to attach this facial expression to (in upper case, e.g. JOY).

Creating a new emotions asset with the right-click menu

Now you will need to set up the desired blendshape configuration for your new expression. To do this, we will modify the head blendshapes of a character in the scene and save them onto the facial expression object.

First find the "Head" object of an NPC in the scene. E.g. NPC_Willow > Female_NPC_4_2 > Female_NPC_01 > Head. With the Head selected, you will find a list of blendshapes stored under the "Skinned Mesh Renderer" component in the Inspector.

Head blendshapes found in hierarchy, and highlighted in the inspector

You can play around with the blendshapes here, and watch as the selected Head in the scene changes its expression.

Once you are happy with the result, select the Facial Expression config that you have created in the previous steps. Drag the modified Head from the Hierarchy, into the Reference Mesh field in the Inspector. Then, press "Get Current Facial Expression From Selection", and save the project.

Next, drop your new expression into the Facial Expressions list in the HumanoidNPCAnimationConfig.

Lipsync

Oculus lipsync support

Charisma Plug 'n' Play includes support for Oculus VR Lipsync. To get started, download the Oculus Lipsync OVR package from the official website:

Oculus Lipsync for Unity (opens in a new tab)

Once downloaded, unzip the folder and import the extracted package into Unity via the top toolbar: Assets > Import Package > Custom Package.

The imported package can be found in the Project files under Assets > Oculus.

Unity Assets menu highlighting the custom package import button

Oculus lipsync setup in engine

Now, to set up lipsync, select the "AudioSource" object in the hierarchy under one of your NPC characters. Then add a new component of type LipsyncHelperOVR.

Adding LipsyncHelperOVR component in the inspector, on the head object in the hierarchy

Now, you need to populate the Head Mesh Renderer and Head Audio Source fields in the newly added LipsyncHelperOVR component:

  • Head Audio Source - drag the Audio Source component on the same object into this field in LipsyncHelperOVR
  • Head Mesh Renderer - go down the hierarchy of your selected NPC and find the "Head" object. Drag it into this field in LipsyncHelperOVR
  • For the Ashby sample character, this can be found under NPC_Ashby > Male_NPC 1 > Male_NPC_01 > Head
Referenced audio source highlighted in the LipsyncHelperOVR component in hierarchy and inspector

Next up, go back to the LipsyncHelperOVR component and hit the Reset Blendshapes To Preset button. This will automatically populate the list of Viseme to Blend Targets, which determines the mouth shapes the NPC makes for particular sounds when talking.

Blendshape reset button on the LipsyncHelperOVR script

Finally, click the Add Lipsync Components button. This adds all the required Lipsync OVR components based on your LipsyncHelperOVR setup, and lipsync will work out of the box.

Lipsync component addition button in the LipsyncHelperOVR script

That's it! Your NPCs will now analyze the Audio Source data they are receiving in real time, and lipsync accordingly - try it in-game!

NOTE: You need to go through this process for every NPC that you want to lipsync.

Setting ‘move-to’ behaviour

NPC Movement

You've learned how to play an animation using metadata in the previous section of the tutorial; now let's take a look at how you can make your characters move within the game world.

First, you'll need to set up a "MoveTo" entity in your Unity project. Go into your Plug 'n' Play sample scene and take a look at the example NPC move points under Environment > Move Points > Table1 / Table2.

These 'Move Points' are empty objects at floor level, with a CharismaMoveToEntity script that has an EntityID. The EntityID is what you'll be sending through the graph to force the NPCs to walk to different locations.

Move point objects highlighted in the scene hierarchy

Go ahead and duplicate one of these move points, then give it a new name in the hierarchy and a new EntityID.

Then, in your Charisma graph, go to a node where you want to force character movement, and go into Metadata Manager.

Add new metadata [move-to | ashby,X], where X is the EntityID you assigned to a move point in Unity.

Key | Value | Function of Metadata
move-to | [character name],[move to target] | Makes specified character walk to target waypoint
Charisma meta data panel with new move-to meta data filled in

And that's it - your character should now walk to your new move point when you hit the character node with the metadata!
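
If you prefer to create move points from code rather than duplicating them in the editor, a minimal sketch might look like this (it assumes only that CharismaMoveToEntity exposes a public EntityID field, as shown in the Inspector; names may differ in your SDK version):

```csharp
using UnityEngine;

// Hypothetical helper that creates a move point at runtime.
// Assumes CharismaMoveToEntity exposes a public EntityID field, as seen in the sample scene.
public class MovePointSpawner : MonoBehaviour
{
    private void Start()
    {
        var movePoint = new GameObject("MovePoint_Table3");
        movePoint.transform.position = new Vector3(2f, 0f, 4f); // keep at floor level

        var moveToEntity = movePoint.AddComponent<CharismaMoveToEntity>();
        moveToEntity.EntityID = "table3"; // reference this ID from the graph: [move-to | ashby,table3]
    }
}
```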

Setting ‘look-at’ behaviour

NPC look-at targeting

The Charisma SDK also includes a look-at metadata function, which allows you to control who or what your character is looking at from within the Charisma graph.

To make use of this functionality, simply add [set-look-at | ashby,X] metadata to a node, where X is a character or other "Charisma object" you want Ashby to look at (you can also change Ashby to a different character - this is just an example).

Key | Value | Function of Metadata
set-look-at | [character name],[look at target] | Makes specified character look at a target for duration of node
Charisma meta data panel with the look-at meta data filled in

NOTE: Every object that is part of the Charisma system in the SDK can be targeted; anything that inherits from CharismaPlaythroughActor or CharismaPlaythroughEntity and has a valid ID is part of the Charisma system.

Object interact controls

The Plug 'n' Play SDK allows you to create interactable objects that affect story graphs in Charisma. Out of the box, the Plug 'n' Play also supports allowing/blocking interactions with interactables in the scene.

The Plug 'n' Play comes with three interactables in the sample scene - a telescope, a book, and a coffee cup. Play through the story and see how your interactions affect the Charisma graph.

Creating new interactables

Let's create a new interactable and test out the allow/block interact functionality. First, create a basic 3D cube in Unity with Right Click > 3D Object > Cube.

Then select your newly added cube and attach a CharismaInteractableEntity component to it. Give it a unique EntityId - this is what the graph will reference. You can also modify the Range to control how far away the player can be while still interacting with the object.

New interactable created and selected in the hierarchy, with CharismaInteractableEntity showing in the inspector

Then, you will need to populate the On Use() event. Drag the CharismaPlaythroughInstance component from the hierarchy into the field, click the No Function dropdown, and select PnpPlaythroughInstance > SetAction(string). After that, fill the text field with the EntityId you assigned to your interactable.

OnUse reference filled with PlaythroughInstance SetAction function, and 'new-interactable' string
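
If you'd rather hook this up from code than through the Inspector, a rough sketch of the same wiring might look like this (only the PnpPlaythroughInstance > SetAction(string) call comes from the steps above; the surrounding class and field are illustrative):

```csharp
using UnityEngine;

// Hypothetical alternative to wiring On Use() in the Inspector: call SetAction(string)
// from your own script instead. Only PnpPlaythroughInstance.SetAction(string) is taken
// from the Inspector dropdown above; everything else here is illustrative.
public class CubeInteraction : MonoBehaviour
{
    [SerializeField] private PnpPlaythroughInstance playthroughInstance; // assign in the Inspector

    // Call this from whatever triggers the interaction (e.g. the interactable's On Use() event).
    public void OnUse()
    {
        // Sends this interactable's EntityId to Charisma, just like the UnityEvent wiring.
        playthroughInstance.SetAction("new-interactable");
    }
}
```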

By default, interactable objects in our SDK can be interacted with when a playthrough begins, and further interactions are blocked once the player has interacted with an object. You can use the reset-interact metadata to re-enable interaction with an object.

In the graph, you can then add new metadata on a Node (opens in a new tab) to allow/block interactivity with your cube.

Key | Value | Function of Metadata
block-interact | [interactable name list (comma separated)] | Prevents interactions with target object
allow-interact | [interactable name list (comma separated)] | Enables interactions with target object
reset-interact | [interactable name list (comma separated)] | Re-enables interactions with target object once interacted with

Charisma graph routing based on interactions

If you want an object interaction to trigger a particular pathway in your Charisma graph, you can use an Action Node (opens in a new tab).

Add an Action Node at a point in the graph where you would like to divert to a different branch if the interaction has been triggered. In the text field of the Action Node, add the EntityId name that you have set on your interactable object.

Charisma story graph with multiple character nodes around an action node

Give it a playtest! Now, when you trigger that action in a Unity playthrough, your graph will divert through the Action Node to your new branch.

The player

Default player behaviour, and static players

The Charisma Plug 'n' Play SDK comes with two player controllers by default - movable and static. The default player controller (CharismaPlayerController) is the movable one, and can be found in the sample scene hierarchy on the PlayerController object.

If you would prefer to use the static player controller, you can swap it in. The static player controller prefab can be found under:

Packages > Charisma.ai Plug-N-Play > Resources > Prefabs > StaticPlayer

You can delete the existing PlayerController from the scene, and drag the StaticPlayer prefab into the scene instead.

StaticPlayer object added to hierarchy, with a reference to PlayerUI in the inspector in the player component

Now, to finish setting up the static player, select the StaticPlayer you added to the scene, and drag Canvas > PlayerUI from the hierarchy into the Charisma Reply UI field on your StaticPlayer object. That's it - now you have a static player controller!

Speech-To-Text

The primary way of interacting with Charisma is through language. In our Plug 'n' Play SDK, we provide a way to communicate with Charisma through typing, as well as a Speech-To-Text system; both work out of the box with our UI system (under Canvas > PlayerUI).

If you would like to make use of Speech-To-Text, make sure to enable it in your Charisma Pro Story under "Story Premium Features".

UI controls

UI prefabs and functionality

The Plug 'n' Play SDK provides a template UI that handles text and speech input, character dialogue bubbles, and a story restart screen fader.

To find the UI elements in the PnP, you can check out the objects under the Canvas in the sample scene, or in the project files under Packages > Charisma.ai Plug-N-Play > Resources > Prefabs > UI.

UI prefabs highlighted in the scene hierarchy
UI prefabs found in project files

You can modify any aspect of the PnP to fit your project needs. For example, if you would like to disable the speech bubbles, simply disable the Canvas > TextboxController object, and speech bubbles will no longer appear.

Player input interaction points

To add interaction points and allow the player to speak, you can use the set-player-speak metadata function.

Key | Value | Function of Metadata
set-player-speak | [no value required] | Brings up reply UI to allow player to speak or type a message

This metadata prepares and enables the text/speech input UI so that the player can send a response to Charisma. Once the player submits a response, the UI automatically disappears and no further action is needed - the response is sent to Charisma!

Charisma meta data panel with set-player-speak meta data filled in

Story end

To show the story end screen, toggle the End Story button on the final node in a Charisma graph. Once you reach that node in the graph, the Plug 'n' Play shows a story end screen prompting the user to press "R" to restart the story from the beginning.

Charisma panel node with the 'end story' button highlighted
Story ending screen prompting the player to press R to restart the story
Custom animations

Triggering animations manually

For animations, we leverage Unity's built-in Animator system. This allows us to create animation state machines for our NPCs using exported animation clips, and to control animation through script. The main female NPC animation controller can be found under:

Assets/Samples/Charisma.ai/Plug-n-Play/0.1.0/Example/Animation/Controllers/CharismaNPCFemale.controller

As the name implies, this controller targets the female NPC skeleton. Male animations are handled by the AnimatorOverrideController named CharismaNPCMale, in the same folder.

The female controller can be opened in the Animator view by double-clicking it. Inside, the animations are structured into several layers:

  • Base Layer - primary layer, meant for movement, turning and idle animations. Has IK enabled for head tracking.
  • Full Body - layer specifically for full body animations, primarily used for talking and also custom animations. Overrides the Base Layer if an animation clip is playing.
  • Head - layer specifically for head-only animations. Used for talking animations.
  • Upperbody - layer specifically for upper-body-only animations. Used for talking animations.

This section will go over the process of adding new animations to the existing layers. If you wish to extend the system, we recommend looking into the code base and making more in-depth changes according to your project's requirements.

Adding a new animation clip

  • With the female animation controller open in the Animator view, select the Full Body layer (or any non-base layer of your choice). Right-click and add a new empty state.
Adding an empty animation state to the Animation controller
  • Select your newly added state and switch to the Inspector tab.

  • Assign a unique name for your animation node, and set your animation clip. In our example, we will name this animation node “Wave” and assign an existing Wave_f animation.

Adding an animation clip to a newly added animation state
  • Once you’ve assigned your animation clip and set your name, make sure to connect your animation Node back to the main Idle node:

  • Right click your node > Make transition > Drag the connection into the Idle node

Adding a transition from TestWave state, to the center idle state

NOTE: You may need to add several transitions here with different conditions, depending on the layer you’re working in and the context of the animation. For talking animations in Full Body, for example, we exit when the animation finishes, when the NPC stops talking, or if the NPC starts walking. Please refer to the existing transitions for examples.

The animation controller features several parameters that you can check your transitions against, namely:

  • Rotate - integer value, represents how much the NPC needs to rotate in degrees
  • Walking - boolean value, only set to true when NPC needs to move to a destination
  • Talking - boolean value, only true when the NPC is actively outputting audio

With the clip added, make sure to update the male animator override file mentioned previously, by adding a male variant of the animation (if available).

Updating the Animation Configuration

Once you’ve added your animation Node, you will have to update the Humanoid Animation Configuration. This configuration file can be found in:

Assets/Samples/Charisma.ai/Plug-n-Play/0.1.0/Example/Animation/Data/HumanoidNPCAnimationConfig.asset

When you open this asset in the Inspector, you will see a detailed list of all the animation nodes present in the CharismaNPCFemale controller. Since new nodes have been added, this file needs to be updated by pressing the “Get Animation Data From Controller” button. This retains the data already set for existing layers and nodes, while adding any new nodes and removing any deleted ones.

Get Animation Data button shown inside of HumanoidNPCAnimationConfig

With the configuration file updated, you can now customize when your newly added animation should play by setting its Animation Tags and Associated Charisma Emotion.

HumanoidNPCAnimationConfig populated with new TestWave state from the anim controller

Depending on your animation needs, you may need to extend the AnimationTags, and possibly the HumanoidNPCAnimationController script directly, as this is what parses animation requests based on tags.

Custom metadata

Adding new metadata functions

To add new metadata functions, refer to the Unity Plug-n-Play Sample folder under:

Assets/Samples/Charisma.ai Plug-n-Play/0.1.0/Example/Scripts/Playthrough/Metadata

Inside, you can find several scripts supporting various Metadata functions. These all inherit from the base class “MetadataFunction”, which in turn inherits from Unity's ScriptableObject.

Meta data function scriptable objects found in project files

To add your own Metadata function, create a new C# script in your preferred location and make it inherit from MetadataFunction. Also make sure the class and file name end with the word Function. You will need to implement the base class members for this script to compile:

  • MetadataId - this value is the Metadata Key present in the Charisma Story. This acts as a filter; only metadata with this key will be parsed by the Execute function.

  • Execute - this handles the Metadata Value and is called whenever a message with a matching metadata entry is received. The body of this function depends on what you want the metadata to affect in the context of the playthrough, be it the player, entities, or Unity-side data.

TestFunction meta function class created in a code editor
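
As a rough illustration, a custom metadata function might look something like the following sketch (treat the exact override signatures as assumptions - check the MetadataFunction base class in your SDK version):

```csharp
using UnityEngine;

// Hypothetical example: responds to metadata such as [toggle-light | kitchen] in the graph.
// The exact override signatures depend on the MetadataFunction base class in your SDK version.
public class ToggleLightFunction : MetadataFunction
{
    // The Metadata Key this function listens for; acts as a filter.
    public override string MetadataId => "toggle-light";

    // Called with the Metadata Value whenever a matching metadata entry is received.
    public override void Execute(string value)
    {
        // The body depends on what you want the metadata to affect. Here we look up a
        // light in the scene by a naming convention of our own (purely illustrative).
        var lightObject = GameObject.Find($"Light_{value}");
        if (lightObject != null)
        {
            var sceneLight = lightObject.GetComponent<Light>();
            sceneLight.enabled = !sceneLight.enabled;
        }
        else
        {
            Debug.LogWarning($"ToggleLightFunction: no light found for '{value}'");
        }
    }
}
```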

Once you’ve created your new metadata class, you will have to create an instance of it as a ScriptableObject.


Referencing metadata functions

Make sure your newly added Metadata function contains the word Function in its name, e.g. YourMetadataFunction.cs. This applies to both the class name and the file name of the script.

Once the name has been set, open your Charisma tab and hit the “Create Playthrough Metadata” operation.

Charisma toolbar in Unity, highlighting 'Create Playthrough Metadata' button

This will automatically create a new ScriptableObject for your new MetadataFunction in the Sample folder under: Assets/Samples/Charisma.ai Plug-n-Play/X.X.X/Example/Data/Playthrough/Metadata

TestFunction scriptable object created in the project files

Your new metadata function also needs to be added to the CharismaPlaythroughInstance metadata list:

TestFunction added to the PlaythroughInstance metadata list
Troubleshooting

Q: I cannot see any Packages in the Project files!

In the Project window, click the Package Visibility button in the top-right corner (the crossed-out eye symbol).

Package Visibility button highlighted in Unity project files

Q: I cannot find any of the provided anims in the package!

When selecting assets using the Unity picker tool, make sure to toggle the Package Visibility button.

Package Visibility button highlighted in Unity reference browser

Q: My NPCs have green skin!

Select either of your NPCs in the Hierarchy view and navigate to their meshes. Select one of the materials on the meshes, under the 'Skinned Mesh Renderer' component.

Skinned mesh renderer in the hierarchy, selected and showing materials in the inspector
Selected material ArmsM1 shown in the project files

Once opened, navigate to the bottom of the Material, to the Skin Diffusion Profile. Make sure this is assigned to the SkinDiffusionProfile provided in the example folder.

Material skin diffusion profile populated with SkinDiffusionProfile object

NOTE: You will need to do this for all “skin” materials present on the characters. Re-importing the Plug-N-Play package and its sample may also fix this.

Q: I have a different issue

Be sure to check out our repositories; they might have an answer to your query.

Charisma Unity SDK (opens in a new tab)

Charisma Unity Plug 'n' Play SDK (opens in a new tab)

If you have a query that the above sections don't answer, send us a message from our website (opens in a new tab), or get in touch directly through our Discord (opens in a new tab).

List of metadata included

Metadata function list

Charisma’s Plug ‘n’ Play SDK comes with a number of basic metadata instructions that are ready to work out of the box. As soon as your Charisma story and characters are connected to our Plug ‘n’ Play sample project and 3D avatars, the metadata instructions in the table below will trigger the specified events in your project. Simply add the required piece of metadata to the Metadata Manager on a Charisma Character node (opens in a new tab) in your story graph – the Key in the left field and the Value, where required, in the right field.

Key | Value | Function of Metadata
set-player-speak | [no value required] | Brings up reply UI to allow player to speak or type a message
set-look-at | [character name], [look at target] | Makes specified character look at a target for duration of node
play-animation | [character name], [animation name] | Plays the specified animation on the target character
move-to | [character name], [move to target] | Makes specified character walk to target waypoint
block-interact | [interactable names] (comma separated) | Prevents interactions with target object
allow-interact | [interactable names] (comma separated) | Enables interactions with target object
reset-interact | [interactable names] (comma separated) | Re-enables interactions with target object once interacted with