Customize parameters

Learn how you can customize LeMUR parameters to alter the outcome.

Change the model type

LeMUR features the following LLMs:

  • Default
  • Claude 2.1
  • Basic

You can switch the model by specifying the final_model parameter.

  • Default (aai.LemurModel.default): LeMUR Default is best at complex reasoning. It offers more nuanced responses and improved contextual comprehension.
  • Claude 2.1 (aai.LemurModel.claude2_1): Claude 2.1 is similar to Default, with key improvements: it minimizes model hallucination, supports system prompts, has a larger context window, and performs better at citations.
  • Basic (aai.LemurModel.basic): LeMUR Basic is a simplified model optimized for speed and cost. It can complete requests up to 20% faster than Default.

You can find more information on pricing for each model.
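As a minimal sketch of model selection, the snippet below builds the JSON request body for the task endpoint with a chosen final_model. The allowed values ("default", "basic", "anthropic/claude-2-1") come from the API reference below; the helper name is illustrative.

```python
import json

# Allowed final_model values, per the API reference in this document.
ALLOWED_MODELS = {"default", "basic", "anthropic/claude-2-1"}

def build_task_body(prompt, transcript_ids, final_model="default"):
    """Build the JSON body for a LeMUR task request with a chosen model."""
    if final_model not in ALLOWED_MODELS:
        raise ValueError(f"unsupported final_model: {final_model}")
    return json.dumps({
        "transcript_ids": transcript_ids,
        "prompt": prompt,
        "final_model": final_model,
    })

body = build_task_body("Summarize the call.", ["TRANSCRIPT_ID1"],
                       final_model="anthropic/claude-2-1")
```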

Change the maximum output size

You can change the maximum output size in tokens by specifying the max_output_size parameter. Up to 4000 tokens are allowed.
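A client-side guard for this parameter can be sketched as follows, using the documented default of 2000 tokens and cap of 4000 tokens; the helper name is an illustration, not part of the SDK.

```python
# Documented limits: default 2000 tokens, maximum 4000 tokens.
DEFAULT_MAX_OUTPUT_SIZE = 2000
MAX_OUTPUT_SIZE_CAP = 4000

def resolve_max_output_size(requested=None):
    """Return a valid max_output_size, falling back to the default."""
    if requested is None:
        return DEFAULT_MAX_OUTPUT_SIZE
    if not 1 <= requested <= MAX_OUTPUT_SIZE_CAP:
        raise ValueError("max_output_size must be between 1 and 4000 tokens")
    return requested
```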

Change the temperature

You can change the temperature by specifying the temperature parameter, ranging from 0.0 to 1.0.

Higher values result in answers that are more creative, lower values are more conservative.
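The same kind of client-side check applies to temperature; this illustrative helper enforces the documented 0.0 to 1.0 inclusive range.

```python
def validate_temperature(temperature):
    """Check that temperature is within the allowed 0.0-1.0 inclusive range."""
    if not 0.0 <= temperature <= 1.0:
        raise ValueError("temperature must be between 0.0 and 1.0 inclusive")
    return temperature
```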

Send customized input

You can submit custom text inputs to LeMUR without transcript IDs. This allows you to customize the input. For example, you could include speaker labels for the LLM.

To submit custom text input, use the input_text parameter on aai.Lemur().task().
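One way to prepare such input is to join speaker-labeled turns into a single string before passing it as input_text. This is a sketch; the helper and the "Speaker A:" labeling convention are illustrative, not prescribed by the API.

```python
def format_speaker_turns(turns):
    """Join (speaker, text) pairs into a speaker-labeled transcript string
    that can be passed to LeMUR via the input_text parameter."""
    return "\n".join(f"Speaker {speaker}: {text}" for speaker, text in turns)

input_text = format_speaker_turns([
    ("A", "Thanks for calling. How can I help?"),
    ("B", "I'd like to check my order status."),
])
```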

Submit multiple transcripts

LeMUR can easily ingest multiple transcripts in a single API call.

You can feed in up to a maximum of 100 files or 100 hours, whichever is lower.
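A batch can be validated against these limits before submission; this is an illustrative pre-flight check, assuming you know each transcript's duration in hours.

```python
MAX_FILES = 100          # documented file-count cap
MAX_TOTAL_HOURS = 100.0  # documented combined-duration cap

def check_batch_limits(durations_hours):
    """Validate a batch of transcript durations against LeMUR's limits
    (up to 100 files or 100 hours, whichever is lower)."""
    if len(durations_hours) > MAX_FILES:
        raise ValueError("too many transcripts: the limit is 100 files")
    if sum(durations_hours) > MAX_TOTAL_HOURS:
        raise ValueError("combined duration exceeds the 100-hour limit")
    return True
```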

Delete data

You can delete the data for a previously submitted LeMUR request.

Response data from the LLM, as well as any context provided in the original request, will be removed.
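A deletion call can be sketched as a DELETE request against the LeMUR request ID. The exact route shown here is an assumption; confirm the authoritative path in the LeMUR API reference.

```python
# Assumed route: DELETE /lemur/v3/{request_id} -- verify against the API reference.
BASE_URL = "https://api.assemblyai.com/lemur/v3"

def build_delete_url(request_id):
    """Build the URL used to delete a previously submitted LeMUR request."""
    if not request_id:
        raise ValueError("request_id is required")
    return f"{BASE_URL}/{request_id}"
```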

API reference

Request

curl https://api.assemblyai.com/lemur/v3/generate/task \
  --header "Authorization: YOUR_API_KEY" \
  --header "Content-Type: application/json" \
  --data '{
    "transcript_ids": ["TRANSCRIPT_ID1", "TRANSCRIPT_ID2"],
    "prompt": "YOUR_PROMPT"
  }'
| Key | Type | Required | Allowed values | Default | Description |
| --- | --- | --- | --- | --- | --- |
| transcript_ids | string[] | No | N/A | None | A list of completed transcripts with text. Up to a maximum of 100 files or 100 hours, whichever is lower. Use either transcript_ids or input_text as input into LeMUR. |
| input_text | string | No | N/A | None | Custom formatted transcript data. Maximum size is the context limit of the selected model, which defaults to 100000. Use either transcript_ids or input_text as input into LeMUR. |
| prompt | string | Yes | N/A | None | Your text to prompt the model to produce a desired output, including any context you want to pass into the model. |
| final_model | string | No | default, basic, anthropic/claude-2-1 | default | The model that is used for the final prompt after compression is performed. |
| max_output_size | int | No | N/A | 2000 | Max output size in tokens. Up to 4000 allowed. |
| temperature | float | No | N/A | 0.0 | The temperature to use for the model. Higher values result in answers that are more creative, lower values are more conservative. Can be any value between 0.0 and 1.0 inclusive. |

Response

| Key | Type | Description |
| --- | --- | --- |
| response | string | The response of the LeMUR request. |
| request_id | string | The ID of the LeMUR request. |

You can find detailed information about all LeMUR API endpoints and parameters in the LeMUR API reference.