Migration Guide: From LeMUR to LLM Gateway
LeMUR will be deprecated on March 31st, 2026. Migrate to LLM Gateway before that date for continued access to language model capabilities, with more models and better performance.
Overview
This guide helps you migrate from LeMUR to AssemblyAI’s LLM Gateway. While LeMUR was designed specifically for processing transcripts, LLM Gateway provides a more flexible, unified interface for working with multiple language models.
Key differences
At a glance, the main changes you will make are:
- Transcript handling: LeMUR resolved transcript text from transcript IDs; with LLM Gateway, you include the transcript text directly in your request.
- Model names: LLM Gateway uses different model identifiers.
- Response format: LLM Gateway returns OpenAI-compatible responses instead of LeMUR's simple response object.
- Endpoints: LLM Gateway uses different URLs than LeMUR.
Migration steps
Step 1: Replace transcript processing
Before (LeMUR): LeMUR automatically retrieved transcript text using transcript IDs.
After (LLM Gateway): You need to include the transcript text directly in your request.
Before: LeMUR
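As a minimal sketch (built offline, no live call), a LeMUR task request only needs transcript IDs, because LeMUR fetches the transcript text server-side. The `final_model` value shown is one example LeMUR identifier:

```python
import json

def build_lemur_request(transcript_id: str, prompt: str) -> dict:
    """Build the JSON body for LeMUR's generate/task endpoint."""
    return {
        "transcript_ids": [transcript_id],  # LeMUR resolves the text for you
        "prompt": prompt,
        "final_model": "anthropic/claude-3-5-sonnet",  # example LeMUR model name
    }

body = build_lemur_request("abc123", "Summarize this transcript.")
print(json.dumps(body, indent=2))
```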
After: LLM Gateway
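With LLM Gateway, you fetch the transcript text yourself and inline it in an OpenAI-style messages array. A minimal sketch (built offline; the model identifier is an assumption, substitute a current one):

```python
import json

def build_gateway_request(transcript_text: str, prompt: str, model: str) -> dict:
    """Build an OpenAI-style chat-completions body with the transcript inlined."""
    return {
        "model": model,
        "messages": [
            # The transcript text travels inside the message content
            {"role": "user", "content": f"{prompt}\n\nTranscript:\n{transcript_text}"},
        ],
    }

body = build_gateway_request(
    "Hello, thanks for calling support today...",
    "Summarize this transcript.",
    "claude-sonnet-4-5",  # assumed model identifier
)
print(json.dumps(body, indent=2))
```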
Step 2: Update model names
LLM Gateway uses different model identifiers than LeMUR, so update any hard-coded model names in your requests.
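One way to manage the rename is a small lookup table from LeMUR `final_model` values to Gateway model IDs. The identifiers below are placeholders, not a verified mapping; check the current model list for the exact names:

```python
# Placeholder mapping: LeMUR final_model value -> LLM Gateway model ID.
# These identifiers are illustrative assumptions, not verified values.
LEMUR_TO_GATEWAY = {
    "anthropic/claude-3-5-sonnet": "claude-sonnet-4-5",  # placeholder IDs
    "anthropic/claude-3-haiku": "claude-haiku-4-5",      # placeholder IDs
}

def gateway_model(lemur_model: str) -> str:
    """Translate a LeMUR model name, failing loudly on unmapped models."""
    try:
        return LEMUR_TO_GATEWAY[lemur_model]
    except KeyError:
        raise ValueError(f"No gateway equivalent configured for {lemur_model!r}")
```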
Step 3: Modify response handling
Before: LeMUR returned a simple response object. After: LLM Gateway returns an OpenAI-compatible response format.
Before: LeMUR
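LeMUR puts the model output in a top-level `response` field. A sketch with a sample payload (the field values are illustrative):

```python
# Sample LeMUR result; the "response" field holds the model output.
lemur_result = {
    "request_id": "req_123",  # illustrative value
    "response": "The customer expressed positive sentiment overall.",
}

text = lemur_result["response"]
print(text)
```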
After: LLM Gateway
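OpenAI-compatible responses nest the output under `choices[0].message.content` instead. A sketch with a sample payload (field values are illustrative):

```python
# Sample OpenAI-compatible result; the text lives inside choices[0].message.
gateway_result = {
    "id": "chatcmpl_123",  # illustrative value
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Positive sentiment overall."},
            "finish_reason": "stop",
        }
    ],
}

text = gateway_result["choices"][0]["message"]["content"]
print(text)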
Complete migration example
Here’s a complete example showing how to migrate a LeMUR sentiment analysis task:
Python SDK (Before: LeMUR)
Python SDK (After: LLM Gateway)
Python (Before: LeMUR)
Python (After: LLM Gateway)
JavaScript SDK (Before: LeMUR)
JavaScript SDK (After: LLM Gateway)
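The before/after pattern is the same across all three variants. As a minimal end-to-end Python sketch of the sentiment-analysis migration (no live calls; the endpoint URLs and model identifiers here are assumptions, substitute current values):

```python
# Offline sketch of migrating a sentiment-analysis task from LeMUR to LLM Gateway.
# Endpoint URLs and model IDs below are assumptions; check the current docs.
LEMUR_URL = "https://api.assemblyai.com/lemur/v3/generate/task"         # LeMUR endpoint
GATEWAY_URL = "https://llm-gateway.assemblyai.com/v1/chat/completions"  # assumed endpoint

PROMPT = "Classify the overall sentiment of this call as positive, negative, or neutral."

# Before: LeMUR takes transcript IDs and fetches the text server-side.
def lemur_sentiment_body(transcript_id: str) -> dict:
    return {
        "transcript_ids": [transcript_id],
        "prompt": PROMPT,
        "final_model": "anthropic/claude-3-5-sonnet",  # example LeMUR model name
    }

# After: LLM Gateway takes an OpenAI-style messages array with the transcript inlined.
def gateway_sentiment_body(transcript_text: str) -> dict:
    return {
        "model": "claude-sonnet-4-5",  # assumed model identifier
        "messages": [
            {"role": "user", "content": f"{PROMPT}\n\nTranscript:\n{transcript_text}"},
        ],
    }

# Response parsing changes too:
#   LeMUR:       result["response"]
#   LLM Gateway: result["choices"][0]["message"]["content"]
```

You would POST each body to its respective URL with your API key in the request headers; only the body shape and response parsing differ between the two services.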
Migration benefits
Moving to LLM Gateway provides several advantages:
More model choices
- 15+ models including Claude 4.5 Sonnet, GPT-5, and Gemini 2.5 Pro
- Better performance with newer, more capable models
Flexible input handling
- Multi-turn conversations for complex interactions
- Custom system prompts for better context control
Enhanced capabilities
- Tool calling for function execution
- Agentic workflows for multi-step reasoning
- OpenAI-compatible API for easier integration
Next steps
After migrating to LLM Gateway, explore additional capabilities:
- Multi-turn Conversations - Build conversational experiences
- Tool Calling - Enable function execution
- Agentic Workflows - Create multi-step reasoning
Need help?
If you encounter issues during migration:
- Check model availability - Ensure your chosen model is supported
- Verify API endpoints - LLM Gateway uses different URLs than LeMUR
- Update response parsing - Response format follows OpenAI standards
- Review token limits - Different models have different context windows
The LLM Gateway provides more flexibility and capabilities than LeMUR, but requires slightly more setup to include transcript content in requests.