Audio Scripting

  • I'm looking at the Audio Scripting project to see how sounds are played.

    I want to play external sounds at runtime, and adding a URL works.

    But I can't control that sound from events to play it, stop it, or add FX, because it has no tag.

    How do I add a tag to an audio file in the script so it can be referenced in events and managed by tag like any other sound (play/stop/resume/FX)?

    // The AudioManager instance used for all audio playback.
    let audioManager = null;
    
    // References to loaded audio files as global variables
    let audioSfx5 = null;
    let audioSfx7 = null;
    let audioEpicArpg = null;
    
    runOnStartup(async runtime =>
    {
    	// Initialise the audio manager. See AudioManager.js for details.
    	audioManager = new AudioManager(runtime);
    	
    	// During the loading screen, load both sound files as
    	// AudioBuffers and the music track all in parallel, so
    	// they are ready for immediate playback on startup.
    	[audioSfx5, audioSfx7, audioEpicArpg] = await Promise.all([
    		audioManager.loadSound("Sfx5.webm"),
     		audioManager.loadSound("sfx7.webm"),
    		audioManager.loadMusic("epicArpg.webm")
    	]);
    });
    
    // These functions are called by the button click events.
    function PlaySfx5()
    {
    	audioManager.playSound(audioSfx5);
    }
    
    function PlaySfx7()
    {
    	audioManager.playSound(audioSfx7);
    }
    
    function PlayMusic()
    {
    	audioManager.playMusic(audioEpicArpg);
    }
    
  • The Audio Scripting example provides a thin layer on top of the browser Audio APIs in the form of the AudioManager.js script. The browser API has no concept of tags. Tags are an abstraction to make it easier to control multiple forms of audio in the event sheet, where audio objects cannot be placed in variables or collections.

    You need to store the references and call the required method with the reference to stop a sound. If you want tags, you can implement them yourself by adding the functionality to the AudioManager class. The AudioManager class in this example plays audio using two different methods: the WebAudio API and HTML Audio elements. Each has different advantages and different methods for manipulating playback.

    WebAudio API

    HTML Audio element
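
    To give a sense of what a do-it-yourself tag layer could look like, here is a minimal sketch: a wrapper that keeps a Map from tag to playback handle on top of an audio manager. The class name `TaggedAudioManager`, the `play`/`stop` methods, and the inner manager's `playSound`/`stopSound` API are assumptions for illustration, not the demo's actual interface (a stub manager stands in for AudioManager).

    ```javascript
    // Hypothetical tag registry layered over an audio manager.
    class TaggedAudioManager {
    	constructor(manager) {
    		this.manager = manager;
    		this.playing = new Map();   // tag -> playback handle
    	}
    	play(tag, buffer) {
    		// Stop anything already using this tag, then remember
    		// the new playback handle under the tag.
    		this.stop(tag);
    		this.playing.set(tag, this.manager.playSound(buffer));
    	}
    	stop(tag) {
    		const handle = this.playing.get(tag);
    		if (handle !== undefined) {
    			this.manager.stopSound(handle);
    			this.playing.delete(tag);
    		}
    	}
    }

    // Stub manager standing in for AudioManager, for demonstration only.
    const stub = {
    	played: [],
    	stopped: [],
    	playSound(buffer) { this.played.push(buffer); return buffer; },
    	stopSound(handle) { this.stopped.push(handle); }
    };

    const tagged = new TaggedAudioManager(stub);
    tagged.play("sfx", "Sfx5.webm");
    tagged.play("sfx", "sfx7.webm");  // stops and replaces the previous "sfx" sound
    tagged.stop("sfx");
    ```

    The same idea extends to FX: because the wrapper holds the live playback handle per tag, any effect method the manager exposes can be routed through the tag lookup as well.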

  • So, my question is:

    Is it possible to load an external sound at runtime and manage that audio like any other, using tags to add FX and so on?

    If so, can you provide an example, please?

  • There are quite a lot of plugins that don't expose anything to the scripting API, but that doesn't mean you can't work with them from scripts. In most cases you can create a function on the event sheet that, say, plays an audio file, then call that function from your script using runtime.callFunction. You can also do the opposite: declare a JS function in your script file, then insert an inline script block on the event sheet that calls it. A lot of people currently focus on using only scripting or only event sheets in their project, but in reality each has advantages and weaknesses relative to the other. It's easy to integrate one with the other, so it makes sense to use both where they are strongest.

    Here is an adjusted version of the audio scripting demo showing how to use this mixture of JS functions and event sheet functions to interface between the two systems.
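
    The two directions of that bridge can be sketched as follows. The `runtime` object below is a stub used so the snippet runs standalone; in a real Construct 3 project the runtime is supplied to runOnStartup, and callFunction is its method for invoking event sheet functions by name. The event sheet function names used here ("PlayAudioByTag", "StopAllAudio") are hypothetical.

    ```javascript
    // Stub standing in for Construct 3's runtime object, which provides
    // callFunction(name, ...params) to invoke event sheet functions.
    const calls = [];
    const runtime = {
    	callFunction(name, ...params) { calls.push([name, ...params]); }
    };

    // Direction 1: script -> event sheet.
    // Calls a (hypothetical) event sheet function named "PlayAudioByTag".
    function playByTag(tag) {
    	runtime.callFunction("PlayAudioByTag", tag);
    }

    // Direction 2: event sheet -> script.
    // Declare a plain JS function; an inline script block in an event
    // (e.g. "On button clicked") can simply call stopEverything().
    function stopEverything() {
    	runtime.callFunction("StopAllAudio");
    }

    playByTag("music");
    stopEverything();
    ```

    The pattern keeps each side doing what it does best: the event sheet owns the tagged audio actions, and the script decides when to trigger them.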
