How to Create an AI Chat with Gemini in Construct 3


Step 6: From Forgetful Bot to Conversational AI: Adding Memory

Our chat works, but it has the memory of a goldfish. Each prompt is treated as a completely separate, standalone interaction, making it impossible to ask follow-up questions or build a coherent dialogue. To turn it into a real conversation, we need to give it a memory.

The solution, as outlined in the official documentation, is both elegant and simple: with each new request, we send the entire conversation history, specifying the role of each message (user for our prompts and model for Gemini's replies). Think of it like catching a friend up on a long conversation before asking your next question. Instead of treating each query as a fresh start, we provide Gemini with the full transcript, giving it the context needed to hold a meaningful, multi-turn dialogue.
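
For instance, by the third turn the request already carries the whole exchange. The snippet below is a purely illustrative sketch of what the history array looks like at that point (the questions and answers are invented, not from the tutorial):

// A hypothetical three-message history, in the shape the generateContent API expects:
// roles alternate between "user" and "model", and the last entry is the new question.
const exampleContents = [
    { role: "user",  parts: [{ text: "Who wrote The Hobbit?" }] },
    { role: "model", parts: [{ text: "The Hobbit was written by J.R.R. Tolkien." }] },
    { role: "user",  parts: [{ text: "When was it published?" }] }
];

The follow-up "When was it published?" only makes sense because the two earlier messages travel along with it.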

Diving into the Code

First, let's modify main.ts to create and initialize a global variable that will hold the history. We'll take this opportunity to insert a "system prompt": an initial instruction that defines our AI's personality and behavior from the start, guiding the tone of all its future responses.

declare global {
    var runtime: IRuntime;
    // Each history entry is one message: its author ("user" or "model") and its text parts
    var chatHistory: { role: "user" | "model", parts: { text: string }[] }[];
}

runOnStartup(async (runtime: IRuntime) => {
    globalThis.runtime = runtime;
    globalThis.chatHistory = [
        {
            role: "user",
            parts: [{
                text: "You're a helpful and precise virtual assistant. Your responses are clear and concise."
            }]
        },
        {
            role: "model",
            parts: [{
                text: "Instructions received. I'm ready."
            }]
        }
    ];
});

To implement memory, we'll evolve our ask function into a new, more powerful one called chat. This function won't just send a single question but will manage the entire conversation history.

The workflow expands with a few key steps:

  1. Before sending the request, add the user's question (user) to the history (chatHistory).
  2. The request payload will no longer contain a single question, but the entire chatHistory array.
  3. After receiving a valid response, add it to the history with the model role.
  4. If the API call fails, we must remove the user's last question from the history. This is crucial for maintaining state integrity. It ensures our chatHistory accurately reflects the conversation the AI is aware of, preventing it from getting out of sync on the next turn.

Let's see how these steps translate into the complete chat function:

export async function chat(obj: {key: string, question: string, chatHistory: {role: string, parts: {text: string}[]}[], runtime: IRuntime}): Promise<string> {
    const {key, question, chatHistory} = obj;
    const url = `https://generativelanguage.googleapis.com/v1/models/${modelGemini}:generateContent?key=${key}`;
    // const url = `https://generativelanguage.googleapis.com/v1beta/models/${modelGemini}:generateContent?key=${key}`;

    // Optimistically add the user's new message to the history
    chatHistory.push({
        role: "user",
        parts: [{ text: question }]
    });

    const payload = {
        contents: chatHistory
    };

    try {
        const response = await fetch(url, {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(payload)
        });

        if (!response.ok) {
            throw new Error(`Server error: ${response.status}`);
        }

        const data = await response.json();

        const answer = data.candidates[0].content.parts[0].text;

        // Add the model's response to the history
        chatHistory.push({
            role: "model",
            parts: [{ text: answer }]
        });

        return answer;
    } catch (error) {
        // The call failed: remove the unanswered question so the history
        // stays in sync with what the model has actually seen
        chatHistory.pop();
        console.error("Error details: ", error);
        return `An error occurred. (${(error as Error).message})`;
    }
}
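
To understand why the code reads data.candidates[0].content.parts[0].text, it helps to picture the shape of a successful generateContent response. Here's an abridged sketch (the text is invented and real responses carry extra metadata, but the nesting shown is what that access path relies on):

// Abridged, illustrative shape of a successful response body
const exampleResponse = {
    candidates: [{
        content: {
            parts: [{ text: "It was first published in 1937." }],
            role: "model"
        },
        finishReason: "STOP"
    }]
};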

The final step is to update the Construct 3 event sheet: simply replace the call to Gemini.ask with our new Gemini.chat function, passing it the global chatHistory along with the other arguments. Once that's done, our system is complete!

const answer = question.trim().length > 0 ? await Gemini.chat({key, question, chatHistory, runtime}) : "Missing Question!";
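
For context, this is roughly how that line might sit inside the script action, assuming hypothetical names for the pieces built in the earlier steps (an ApiKey global variable, a TextInput object for the question, and a TextAnswer object for the reply); adjust them to match your own project:

// Sketch of the full script action; ApiKey, TextInput and TextAnswer are illustrative names
const key = runtime.globalVars.ApiKey;
const question = runtime.objects.TextInput.getFirstInstance().text;
const chatHistory = globalThis.chatHistory;  // the shared history initialized in main.ts

const answer = question.trim().length > 0 ? await Gemini.chat({key, question, chatHistory, runtime}) : "Missing Question!";

runtime.objects.TextAnswer.getFirstInstance().text = answer;  // display the reply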