Create a new instance of the IrisSetup class to interface with the IRIS Web SDK.

const iris = new IrisSetup();

Constructors

  • Create a new instance of the class and set options.

    Parameters

    Returns IrisSetup

    const iris = new IrisSetup({
      analyticsReportingInterval: INTERVAL,
      audioWorkletPath: PATH_TO_IRIS_AUDIO_WORKLET,
      key: key,
      license: license,
      team: TEAM_ID,
      company: COMPANY_ID,
      user: USER_ID,
      loggingLevel: 1,
      numStreams: 1,
    });

Properties

analyticsData: IrisNoiseAnalyticsData = ...
audioCtx: null | AudioContext = null
contexts: null | AudioContextConfig[] = null
irisNode: null | AudioWorkletNode = null
options: IrisConfig = DEFAULT_OPTIONS
processedReceiveStreamElement: null | HTMLAudioElement = null
receiveNode: null | AudioWorkletNode = null
receiveStreamElement: null | HTMLAudioElement = null
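
These properties start out as null (or as the defaults shown) and are populated by the SDK at runtime, so inspecting them is a quick way to sanity-check a setup. A minimal sketch, assuming the iris instance from the constructor example and that the audio context and node properties are filled in once initialisation has completed:

    // After initialisation the audio context and worklet node should no longer be null.
    if (iris.audioCtx !== null && iris.irisNode !== null) {
      console.log('IRIS ready, sample rate:', iris.audioCtx.sampleRate);
    }

    // Noise analytics accumulate here when analytics reporting is enabled.
    console.log(iris.analyticsData);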

Methods

  • Connect a media or audio element to the IRIS node.

    Parameters

    • element: HTMLMediaElement | HTMLAudioElement

      The media or audio element to connect.

    • node: IrisNodeValues

      The node leg to connect to.

    Returns MediaStreamAudioDestinationNode

    streamDest - The media stream destination node.

    const audioCtx = new AudioContext();
    const audioElement = document.querySelector<HTMLAudioElement>('audio')!;
    const processedNode = iris.connectMediaElement(audioElement, 'send');
    processedNode.connect(audioCtx.destination);
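
    The same call can be used on the receive leg, for example to route a remote-party audio element through IRIS before playback. A sketch, reusing audioCtx from the example above; the element selector is illustrative and 'receive' is the other IrisNodeValues leg described under crossFade:

    const remoteAudio = document.querySelector<HTMLAudioElement>('#remote-audio')!;
    const receiveDest = iris.connectMediaElement(remoteAudio, 'receive');
    receiveDest.connect(audioCtx.destination);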
  • Connect the receive stream (usually the incoming remote audio) to the processing chain.

    Parameters

    • stream: MediaStream

      A media stream (typically from a remote WebRTC endpoint).

    Returns MediaStream

    processedStream - The processed media stream.

    rtcPeerConnection.ontrack = (e: RTCTrackEvent) => {
      const remoteStream = e.streams[0];
      iris.connectReceiveStream(remoteStream);
    };
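
    The returned processed stream can then be played out locally, for example through an audio element. A sketch; the element selector is illustrative:

    rtcPeerConnection.ontrack = (e: RTCTrackEvent) => {
      // Process the incoming remote audio and play back the cleaned-up result.
      const processedStream = iris.connectReceiveStream(e.streams[0]);
      const speaker = document.querySelector<HTMLAudioElement>('#speaker')!;
      speaker.srcObject = processedStream;
      speaker.play();
    };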
  • Connect the send stream (usually microphone) to the processing chain.

    Parameters

    • stream: MediaStream

      A media stream (typically the microphone).

    Returns MediaStream

    streamDest - The processed media stream.

    navigator.mediaDevices
      .getUserMedia(constraints)
      .then((stream) => iris.connectStream(stream, 'send'));
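
    The processed stream that comes back is typically what you hand to WebRTC instead of the raw microphone stream. A sketch, assuming an existing rtcPeerConnection and following the call shape of the example above:

    navigator.mediaDevices
      .getUserMedia({ audio: true })
      .then((stream) => {
        // Forward the IRIS-processed audio to the remote party rather than the raw mic.
        const processedStream = iris.connectStream(stream, 'send');
        processedStream
          .getAudioTracks()
          .forEach((track) => rtcPeerConnection.addTrack(track, processedStream));
      });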
  • Cross fade between processed and unprocessed streams.

    Parameters

    • val: number

      The mix value, between 0 and 1.

    • node: IrisNodeValues

      The node leg to cross fade: 'send' or 'receive'.

    Returns void

    iris.crossFade(0.8, 'send');
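
    For example, to fade processing out gradually rather than switching it abruptly. A sketch using a simple timer; the step size, the interval, and the assumption that 1 means fully processed are all illustrative:

    // Ramp the 'send' mix from 1 down to 0 over roughly one second.
    let mix = 1;
    const fade = setInterval(() => {
      mix = Math.max(0, mix - 0.1);
      iris.crossFade(mix, 'send');
      if (mix === 0) clearInterval(fade);
    }, 100);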
    
  • Asynchronous method to initialise the SDK with an Audio Context.

    Parameters

    • audioCtx: AudioContext

      The audio context to initialise the SDK with.

    Returns Promise<boolean>

    A promise that resolves to a boolean.

    const audioCtx = new AudioContext();
    await iris.init(audioCtx);
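
    A fuller start-up sequence might look like the sketch below, reusing the iris instance from the constructor example. The ordering of the init, connect, and start calls, and the interpretation of the returned boolean as success, are assumptions; the call shapes follow the examples above:

    async function setupIris(): Promise<void> {
      const audioCtx = new AudioContext();

      // Load the audio worklet and prepare the SDK before connecting any streams.
      const ready = await iris.init(audioCtx);
      if (!ready) return;

      // Route the microphone through IRIS and start processing for the call.
      const micStream = await navigator.mediaDevices.getUserMedia({ audio: true });
      const processedSend = iris.connectStream(micStream, 'send');
      iris.start();
      // processedSend would then be added to the WebRTC peer connection.
    }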
    
  • Set parameter of a processor in the audio worklet.

    Parameters

    • contextId: string

      The unique ID of the created IRIS audio context.

    • paramId: string | number

      The unique ID of the processor parameter.

    • value: string | number | boolean

      The new value to set.

    Returns void

    iris.setAudioWorkletParameter('123456', 12345, true);
    
  • Set the wet/dry mix for the processed 'send' (microphone) stream.

    Parameters

    • val: number

      The mix value, between 0 and 1.

    Returns void

    iris.setMixLevelMic(0.8);
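
    A common pattern is to drive this from a UI control, for example a range input with min 0, max 1 and a small step. A sketch; the element selector is illustrative:

    const micSlider = document.querySelector<HTMLInputElement>('#mic-mix')!;
    micSlider.addEventListener('input', () => {
      iris.setMixLevelMic(Number(micSlider.value));
    });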
    
  • Set the wet/dry mix for the processed 'receive' (speaker) stream.

    Parameters

    • val: number

      The mix value, between 0 and 1.

    Returns void

    iris.setMixLevelSpeaker(0.8);
    
  • Initialise IRIS for the start of a call/audio playback.

    Returns void

    iris.start();
    
  • Stop processing. Should be called at hangup or at the end of a call.

    Returns void

    iris.stop();
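
    For example, tied to a WebRTC hangup. A sketch, assuming an existing rtcPeerConnection and a hang-up handler of your own:

    function hangUp(): void {
      // Close the call first, then stop IRIS processing.
      rtcPeerConnection.close();
      iris.stop();
    }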
    
  • Toggle noise analytics on or off.

    Parameters

    • value: boolean

      Whether noise analytics should be enabled.

    • Optional logInterval: number

      How often new analytics data should be reported.

    Returns void

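    Once analytics are enabled, the collected data is exposed on the analyticsData property (see Properties above). A sketch that simply polls it; the poll interval is illustrative and independent of analyticsReportingInterval:

    // Periodically read the latest noise analytics snapshot.
    const ANALYTICS_POLL_MS = 5000;
    setInterval(() => {
      console.log('IRIS noise analytics:', iris.analyticsData);
    }, ANALYTICS_POLL_MS);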