LeMUR

Run a task using LeMUR

POST https://api.assemblyai.com/lemur/v3/generate/task
Use the LeMUR task endpoint to input your own LLM prompt.

Request

This endpoint expects an object.
prompt
stringRequired
Your text to prompt the model to produce a desired output, including any context you want to pass into the model.
context
unionOptional
Context to provide the model. This can be a string or a free-form JSON value.
final_model
enumOptional
The model that is used for the final prompt after compression is performed. Defaults to "default".
Allowed values: default, basic, assemblyai/mistral-7b, anthropic/claude-2-1
input_text
stringOptional

Custom formatted transcript data. Maximum size is the context limit of the selected model, which defaults to 100000. Use either transcript_ids or input_text as input into LeMUR.

max_output_size
integerOptional
Maximum output size in tokens, up to 4000.
temperature
doubleOptional
The temperature to use for the model. Higher values result in answers that are more creative, lower values are more conservative. Can be any value between 0.0 and 1.0 inclusive.
transcript_ids
list of stringsOptional

A list of completed transcripts with text. Up to a maximum of 100 files or 100 hours, whichever is lower. Use either transcript_ids or input_text as input into LeMUR.
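Because transcript_ids and input_text are mutually exclusive, it can help to validate the request body before sending it. Below is a minimal sketch of such a builder in Python; the helper name build_task_request is ours for illustration and is not part of any AssemblyAI SDK.

```python
def build_task_request(prompt, transcript_ids=None, input_text=None,
                       context=None, final_model="default",
                       max_output_size=None, temperature=None):
    """Build a JSON body for the LeMUR task endpoint.

    Exactly one of transcript_ids or input_text must be supplied,
    matching the endpoint's request schema."""
    if (transcript_ids is None) == (input_text is None):
        raise ValueError("Provide exactly one of transcript_ids or input_text")
    body = {"prompt": prompt, "final_model": final_model}
    if transcript_ids is not None:
        body["transcript_ids"] = transcript_ids
    else:
        body["input_text"] = input_text
    # Optional fields are only included when set, so the request
    # falls back to the endpoint's documented defaults otherwise.
    if context is not None:
        body["context"] = context
    if max_output_size is not None:
        body["max_output_size"] = max_output_size
    if temperature is not None:
        body["temperature"] = temperature
    return body
```

For example, build_task_request("List all the locations affected by wildfires.", transcript_ids=["64nygnr62k-405c-4ae8-8a6b-d90b40ff3cce"], context="This is an interview about wildfires.", temperature=0) produces a body equivalent to the curl sample below.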

Response

This endpoint returns an object.
request_id
string
The ID of the LeMUR request.
response
string
The response generated by LeMUR.
usage
object
The usage numbers for the LeMUR request.
POST
curl -X POST https://api.assemblyai.com/lemur/v3/generate/task \
  -H "Authorization: <apiKey>" \
  -H "Content-Type: application/json" \
  -d '{
  "prompt": "List all the locations affected by wildfires.",
  "context": "This is an interview about wildfires.",
  "final_model": "default",
  "max_output_size": 3000,
  "temperature": 0,
  "transcript_ids": [
    "64nygnr62k-405c-4ae8-8a6b-d90b40ff3cce"
  ]
}'
200 Successful
{
  "request_id": "5e1b27c2-691f-4414-8bc5-f14678442f9e",
  "response": "Based on the transcript, the following locations were mentioned as being affected by wildfire smoke from Canada:\n\n- Maine\n- Maryland\n- Minnesota\n- Mid Atlantic region\n- Northeast region\n- New York City\n- Baltimore\n",
  "usage": {
    "input_tokens": 27,
    "output_tokens": 3
  }
}
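The same request can be made from Python using only the standard library. This is a sketch, not an official SDK snippet: the helper names run_lemur_task and summarize_usage are ours, and you would substitute your own API key.

```python
import json
import urllib.request

API_KEY = "<YOUR_API_KEY>"  # assumption: replace with your AssemblyAI API key
ENDPOINT = "https://api.assemblyai.com/lemur/v3/generate/task"

def run_lemur_task(body: dict) -> dict:
    """POST a request body to the LeMUR task endpoint and return the parsed JSON response."""
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": API_KEY, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def summarize_usage(result: dict) -> str:
    """Format the response object's request_id and token usage for logging."""
    usage = result["usage"]
    return (f"request {result['request_id']}: "
            f"{usage['input_tokens']} tokens in / {usage['output_tokens']} tokens out")
```

On the 200 response shown above, summarize_usage would report the request ID along with 27 input tokens and 3 output tokens.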