Type Alias AI3DCharacterProps

AI3DCharacterProps: {
    apiKey?: AuthClientMessage["apiKey"];
    armsAnimationSmoothing?: number;
    bodyAnimationSmoothing?: number;
    cameraType?: "default" | "orthographic";
    className?: string;
    debugMode?: boolean;
    environment?: Environments;
    faceAnimationSmoothing?: number;
    handsAnimationSmoothing?: number;
    headAnimationSmoothing?: number;
    initialFocus?: FocusProps;
    legsAnimationSmoothing?: number;
    lowPerformanceMode?: boolean;
    onLoad: (client: SandaiClient) => void;
    raytrace?: boolean;
    showControls?: boolean;
    style?: React.CSSProperties;
    url: string;
    userId?: AuthClientMessage["userId"];
    voiceName?: VoiceNames;
    vrmUrl?: string;
}

Props for the AI3DCharacter component.
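A minimal sketch of filling in the required and common optional props. The types below are local stand-ins so the snippet is self-contained; the real `AI3DCharacterProps` and `SandaiClient` come from the SDK package, and the URL and API key shown are placeholders, not real values.

```typescript
// Local stand-in for SandaiClient (the real type comes from the SDK).
type SandaiClientStub = { readonly ready: boolean };

// Simplified local sketch of the props shape documented on this page.
interface AI3DCharacterPropsSketch {
  url: string;                                // required
  onLoad: (client: SandaiClientStub) => void; // required: fires once the client is ready
  apiKey?: string;                            // from the dashboard
  debugMode?: boolean;
  lowPerformanceMode?: boolean;
}

const props: AI3DCharacterPropsSketch = {
  url: "wss://example.invalid/character", // placeholder endpoint
  apiKey: "YOUR_API_KEY",                 // placeholder key
  onLoad: (client) => {
    // Keep a reference to the client here to drive the character later.
    console.log("character loaded:", client.ready);
  },
};

// Simulate the component invoking onLoad once the client is ready.
props.onLoad({ ready: true });
```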

Type declaration

  • Optional apiKey?: AuthClientMessage["apiKey"]

    Your API key. You can find it in the dashboard.

  • Optional armsAnimationSmoothing?: number

    Dampens the arm movements.

    A number between 0 and 1, where 0 (the default) applies the motions as-is and 1 effectively disables all bone movement.

  • Optional bodyAnimationSmoothing?: number

    Dampens the body movements.

    A number between 0 and 1, where 0 (the default) applies the motions as-is and 1 effectively disables all bone movement.

  • Optional cameraType?: "default" | "orthographic"

    The type of camera the scene should be rendered with: "default" is the standard perspective camera, and "orthographic" is an orthographic camera.

    Since the two camera types work differently, things like focus have different methods of calculation.

    For example, since an orthographic camera uses its z-axis position to control the relative size of what is in view, the zoom is calculated from that value instead.

    It still works with the out-of-the-box experience, but if you set focus with custom values and then switch the camera type, your carefully crafted values will likely look a bit off.

    If this annoys you, you can open an issue or send a pull request to the r3f-vrm package on GitLab.

  • Optional className?: string
  • Optional debugMode?: boolean

    Skips auth and load checks.

    Load checks are used to determine whether the user has interacted with the iframe yet (among other things), so this should be turned off in production.
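Since the checks skipped by debugMode matter in production, one way to wire it up is to derive the flag from the build environment. The NODE_ENV convention below is an assumption about the host app's bundler, not something the component requires:

```typescript
// Only skip auth/load checks outside production builds (sketch; the
// NODE_ENV convention is an assumption about the host app's setup).
function resolveDebugMode(nodeEnv: string | undefined): boolean {
  return nodeEnv !== undefined && nodeEnv !== "production";
}

console.log(resolveDebugMode("development")); // → true
console.log(resolveDebugMode("production"));  // → false
```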

  • Optional environment?: Environments
  • Optional faceAnimationSmoothing?: number

    Dampens the face movements.

    A number between 0 and 1, where 0 (the default) applies the motions as-is and 1 effectively disables all bone movement.

    Also handles face-expression smoothing.

    0 is the default and only applies the default transitory smoothing, meaning, for example, that when the mouth is maxed out at "aa" for 2 seconds and then goes to "ih", the transition is already smoothed out.

    1 essentially holds the vowel indefinitely.

    If the mouth movements feel too choppy, try setting this value to somewhere between 0.1 and 0.9.
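All of the `*AnimationSmoothing` props expect values in the [0, 1] range. If the values come from user-tunable sliders, clamping before passing them as props is a safe precaution (an assumption on my part; this reference does not document the component's behavior for out-of-range values):

```typescript
// Clamp a user-supplied smoothing value into the documented [0, 1] range.
const clampSmoothing = (value: number): number =>
  Math.min(1, Math.max(0, value));

console.log(clampSmoothing(1.4));  // → 1
console.log(clampSmoothing(0.35)); // → 0.35
```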

  • Optional handsAnimationSmoothing?: number

    Dampens the hand movements.

    A number between 0 and 1, where 0 (the default) applies the motions as-is and 1 effectively disables all bone movement.

  • Optional headAnimationSmoothing?: number

    Dampens the head movements.

    A number between 0 and 1, where 0 (the default) applies the motions as-is and 1 effectively disables all bone movement.

  • Optional initialFocus?: FocusProps

    The initial camera properties.

  • Optional legsAnimationSmoothing?: number

    Dampens the leg movements.

    A number between 0 and 1, where 0 (the default) applies the motions as-is and 1 effectively disables all bone movement.

  • Optional lowPerformanceMode?: boolean

    A catch-all switch that makes things run on less powerful hardware. A few subtle effects, like bloom and depth of field, are enabled by default, but they can be resource-intensive. On modern hardware they run fine, but if you need to support, say, older Android phones, turning this on is a good idea.

    Some effects may also cause issues on outdated hardware or software, so turning this on by default may be a good idea in general, but your mileage may vary.
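One possible runtime heuristic for deciding this flag is sketched below. `navigator.hardwareConcurrency` is a standard Web API, but the 4-core threshold is an arbitrary assumption; tune it for your audience:

```typescript
// Decide lowPerformanceMode from the reported logical core count
// (sketch; the threshold is an assumption, not a documented rule).
function shouldUseLowPerformanceMode(
  hardwareConcurrency: number | undefined,
): boolean {
  // Prefer the safe default when the core count is unknown or small.
  return hardwareConcurrency === undefined || hardwareConcurrency <= 4;
}

// In a browser you would call it as:
//   shouldUseLowPerformanceMode(navigator.hardwareConcurrency)
console.log(shouldUseLowPerformanceMode(2)); // → true
console.log(shouldUseLowPerformanceMode(8)); // → false
```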

  • onLoad: (client: SandaiClient) => void
  • Optional raytrace?: boolean

    Whether to use raytracing.

  • Optional showControls?: boolean

    Shows the debug interface for the built-in engine, giving you access to the predefined motions, emotions, and built-in voices as a neat overlay.

    Since this technology works with VRM models, and since those models only have 5 emotions by default, it can be useful for seeing how the emotions combine to produce the 27 emotions that Sandai supports.

  • Optional style?: React.CSSProperties
  • url: string
  • Optional userId?: AuthClientMessage["userId"]

    Your user ID. You can find it in the dashboard.

  • Optional voiceName?: VoiceNames
  • Optional vrmUrl?: string