Overview
GladeCore is a comprehensive system for integrating advanced, AI-driven NPC interactions into your game engine project. It allows players to engage in dynamic, unscripted conversations with non-player characters or companions using either text or voice. NPCs use local inference to respond in real time, with their personalities, knowledge, and even their voices defined by easy-to-use data assets designed and configured directly in the engine.
The plugin’s core features include:
- LLM-Powered Dialogue: Utilizes local LLM inference to generate dynamic NPC responses.
- Speech-to-Text (STT): Allows the player to use their microphone to speak to NPCs.
- Text-to-Speech (TTS): Enables NPCs to audibly speak their generated responses using either ElevenLabs’ cloud API or a local Piper TTS model.
- Data-Driven Personalities: NPC personas, backstories, and voice settings are configured through Data Assets, allowing for easy customization without changing code (see the sketch after this list).
- Retrieval Augmented Generation (Pro Subscribers Only): Store and retrieve knowledge to keep responses accurate, consistent, and grounded.
- Custom Model Fine-Tuning (Pro Subscribers Only): Fine-tune and integrate your own custom models, varying their size, emotional range, and training data for even more personalization.
- Multiplayer Support (Enterprise Only): The architecture is built with networking in mind and uses client-server RPCs to handle communication, but multiplayer is not implemented out of the box. We offer custom support to Enterprise clients to design multiplayer for their specific use case.
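As a rough illustration of the data-driven approach, the sketch below shows the kind of fields an NPC persona asset might carry. This is a minimal, hypothetical example in plain C++, not GladeCore's actual API or asset schema; in practice these values would be authored in an engine Data Asset rather than in code.

```cpp
// Hypothetical sketch only — not the plugin's actual classes or field names.
#include <string>
#include <vector>

struct NpcPersona {
    std::string displayName;                   // e.g. "Mira the Blacksmith"
    std::string backstory;                     // context injected into the LLM prompt
    std::vector<std::string> knowledgeTopics;  // topics the NPC is allowed to discuss
    std::string voiceId;                       // ElevenLabs voice ID or local Piper voice name
    float temperature = 0.7f;                  // sampling temperature for response generation
};

int main() {
    // Designers would configure these values in the editor; code shown only for illustration.
    NpcPersona mira{
        "Mira the Blacksmith",
        "A gruff but kind smith who has lived in the valley her whole life.",
        {"weapon repair", "village history"},
        "piper-en_US-amy-medium",
        0.6f
    };
    return 0;
}
```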
Select your Game Engine
Choose your game engine to start the setup now!
Try a Free Demo Now
Visit here to try a free demo now!
What LLM do we use?
Our base plugin supports Meta’s Llama 3.2 (1B or 3B) and ships with a default fine-tune targeting that model. You can swap in other models easily within our framework.
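To give a sense of what swapping models involves, here is a minimal sketch of a local-model configuration. The struct, file names, and paths are hypothetical and assume a GGUF-style checkpoint on disk; they are not GladeCore's documented settings.

```cpp
// Hypothetical sketch of pointing the local inference backend at a different model file.
#include <iostream>
#include <string>

struct LocalModelConfig {
    std::string modelPath;  // path to a GGUF checkpoint on disk (assumed format)
    int contextSize;        // prompt context window in tokens
};

int main() {
    // Default: the shipped Llama 3.2 fine-tune (illustrative file name).
    LocalModelConfig defaultModel{"Models/Llama-3.2-3B-Instruct-Q4_K_M.gguf", 4096};

    // Swap in your own model by pointing the config at a different checkpoint.
    // Per the Llama license, a model trained on Llama should keep "Llama" at the start of its name.
    LocalModelConfig customModel{"Models/Llama-3.2-1B-MyStudio-FT.gguf", 4096};

    std::cout << "Loading model: " << customModel.modelPath << "\n";
    return 0;
}
```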
Llama 3.2 is licensed under the Llama 3.2 Community License. Commercial use and redistribution are allowed, so long as you show “Built with Llama”, include the license and NOTICE when redistributing, and use “Llama” at the beginning of the model name if you distribute a model trained using Llama or its outputs.
- Llama Community License
- Llama Notice
- Built with Llama
Need Help or Have Questions?