Prompt A Structured Q&A Response Using LLM Gateway
This cookbook demonstrates how to use AssemblyAI’s LLM Gateway to prompt an LLM for a structured question-and-answer response.
Getting Started
Before we begin, make sure you have an AssemblyAI account and an API key. You can sign up for an AssemblyAI account and get your API key from your dashboard.
Find more details on current LLM Gateway pricing on the AssemblyAI pricing page.
Step-by-Step Instructions
In this guide, we will prompt LLM Gateway with a structured Q&A format and generate an XML response.
First, let’s import the necessary libraries and set our API key.
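A minimal setup might look like the sketch below. The `requests` library is one common choice for calling the AssemblyAI REST API directly; replace the placeholder with your own API key.

```python
import requests

API_KEY = "<YOUR_API_KEY>"  # from your AssemblyAI dashboard

# Shared headers for every request to the AssemblyAI API.
headers = {
    "authorization": API_KEY,
    "content-type": "application/json",
}
```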
Next, we’ll use AssemblyAI to transcribe a file and save our transcript.
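A transcription sketch, using AssemblyAI's submit-and-poll pattern against the `v2/transcript` endpoint (the example audio URL is a placeholder):

```python
import time

import requests

API_KEY = "<YOUR_API_KEY>"
BASE_URL = "https://api.assemblyai.com/v2"
headers = {"authorization": API_KEY}


def transcribe(audio_url: str) -> str:
    """Submit a transcription job and poll until it finishes."""
    response = requests.post(
        f"{BASE_URL}/transcript",
        json={"audio_url": audio_url},
        headers=headers,
    )
    transcript_id = response.json()["id"]

    # Poll the job until it completes or errors out.
    while True:
        result = requests.get(
            f"{BASE_URL}/transcript/{transcript_id}", headers=headers
        ).json()
        if result["status"] == "completed":
            return result["text"]
        if result["status"] == "error":
            raise RuntimeError(result["error"])
        time.sleep(3)


# transcript_text = transcribe("https://example.com/your-audio-file.mp3")
```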
Write a helper that builds a formatted string for each question. It takes the question text, any context, an answer format, and any answer options, and returns the formatted string.
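One way to write that helper is sketched below. The dict keys (`question`, `context`, `answer_format`, `answer_options`) are this guide's own convention, not an API schema:

```python
def construct_question(question: dict) -> str:
    """Build one formatted question block from a question dict."""
    parts = [f"Question: {question['question']}"]
    if question.get("context"):
        parts.append(f"Context: {question['context']}")
    if question.get("answer_format"):
        parts.append(f"Answer Format: {question['answer_format']}")
    if question.get("answer_options"):
        parts.append("Options: " + ", ".join(question["answer_options"]))
    return "\n".join(parts) + "\n"
```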
Define a list of questions. For each question, you can define additional context and specify either an answer_format or a list of answer_options.
Construct the formatted question string for all the questions and build the LLM prompt.
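These two steps might look like the following self-contained sketch; the example questions and field names are hypothetical, so swap in your own:

```python
# Hypothetical example questions; each may carry optional context and
# either an answer_format or a list of answer_options.
questions = [
    {
        "question": "What is the overall sentiment of the call?",
        "context": "A customer support phone call",
        "answer_options": ["positive", "neutral", "negative"],
    },
    {
        "question": "What product was discussed?",
        "answer_format": "a short phrase",
    },
]


def format_one(q: dict) -> str:
    """Format a single question dict as a text block."""
    lines = [f"Question: {q['question']}"]
    if q.get("context"):
        lines.append(f"Context: {q['context']}")
    if q.get("answer_format"):
        lines.append(f"Answer Format: {q['answer_format']}")
    if q.get("answer_options"):
        lines.append("Options: " + ", ".join(q["answer_options"]))
    return "\n".join(lines)


# Join all formatted questions into the string that goes into the prompt.
question_str = "\n\n".join(format_one(q) for q in questions)
```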
Provide detailed instructions to prompt LLM Gateway to answer a series of questions. This also defines a structured XML template for the responses.
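A sketch of such a prompt is below. The wording and the XML template are illustrative, not a required format; the two placeholder variables stand in for the transcript and question string built in earlier steps:

```python
transcript_text = "..."  # transcript from the transcription step
question_str = "..."     # formatted questions from the previous step

prompt = f"""You are an expert at answering questions about a transcript.

Here is the transcript:
<transcript>
{transcript_text}
</transcript>

Answer each of the following questions:
{question_str}

Respond using only this XML template, with one <qa> block per question:
<response>
  <qa>
    <question>the question text</question>
    <answer>your answer</answer>
  </qa>
</response>
Do not include any text outside the XML."""
```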
Prompt the LLM Gateway model and return the response.
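A request sketch follows. The gateway endpoint URL and model name below are assumptions; check the AssemblyAI docs for the currently supported values:

```python
import requests

API_KEY = "<YOUR_API_KEY>"

# Assumed endpoint and model; verify against the AssemblyAI docs.
GATEWAY_URL = "https://llm-gateway.assemblyai.com/v1/chat/completions"


def prompt_llm(prompt: str, model: str = "claude-3-5-sonnet-20241022") -> str:
    """Send a chat-completion request to the LLM Gateway and return the text."""
    response = requests.post(
        GATEWAY_URL,
        headers={"authorization": API_KEY},
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]


# llm_output = prompt_llm(prompt)
```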
Clean the XML output and print the question and answer pairs.
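One way to do the cleanup and parsing, assuming the XML template shown earlier: strip any stray text around the `<response>` block with a regex, then parse it with the standard-library `xml.etree.ElementTree`:

```python
import re
import xml.etree.ElementTree as ET


def print_qa_pairs(llm_output: str) -> list[tuple[str, str]]:
    """Isolate the <response> XML, parse it, and print each Q&A pair."""
    match = re.search(r"<response>.*</response>", llm_output, re.DOTALL)
    if match is None:
        raise ValueError("no <response> block found in the model output")

    root = ET.fromstring(match.group(0))
    pairs = []
    for qa in root.findall("qa"):
        question = qa.findtext("question", default="").strip()
        answer = qa.findtext("answer", default="").strip()
        print(f"Q: {question}\nA: {answer}\n")
        pairs.append((question, answer))
    return pairs
```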