Deploy And Consume LLM App Using Azure ML Prompt Flow — Part 6

Shweta Lodha
4 min read · Feb 5, 2024

This article continues my previous series on Azure Machine Learning Prompt Flow. Here are the earlier parts:

Part 1 — Getting started with Azure Prompt Flow

Part 2 — Create, update and test Azure Prompt Flow locally

Part 3 — Chat With Custom Data — FAISS Index & Azure ML Prompt Flow

Part 4 — Integrating Azure AI Search with Azure Prompt Flow

Part 5 — Use multiple prompts to analyze LLM response — Prompt Variant

In this article, I’ll explain how to deploy a flow and how to consume the deployed endpoint.

Once we are done building and testing our flow, we may want to deploy it as an endpoint so that we can invoke it for real-time inference.

Deploy The Flow

To deploy a flow, click the Deploy button at the top of the Prompt Flow page. This opens a deployment dialog.

Furnish all the required details, such as the endpoint name and compute settings, and Azure ML will create an endpoint for us, which then appears under the Endpoints tab.
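Once the endpoint is up, we can consume it over REST. Below is a minimal Python sketch of that call. The scoring URI, API key, and the input name `question` are placeholders: your endpoint's actual scoring URI and key are shown on the endpoint's Consume tab in Azure ML studio, and the payload keys must match your flow's input names.

```python
import json
import urllib.request

# Hypothetical values -- replace with the scoring URI and key shown on
# your endpoint's Consume tab in Azure ML studio.
SCORING_URI = "https://my-endpoint.eastus.inference.ml.azure.com/score"
API_KEY = "<your-endpoint-key>"


def build_request(uri: str, key: str, payload: dict) -> urllib.request.Request:
    """Build an authenticated POST request for the scoring endpoint."""
    body = json.dumps(payload).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {key}",
    }
    return urllib.request.Request(uri, data=body, headers=headers, method="POST")


if __name__ == "__main__":
    # "question" is an assumed flow input name -- use your own flow's inputs.
    req = build_request(SCORING_URI, API_KEY, {"question": "What is Prompt Flow?"})
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read()))
```

The same request shape works from any HTTP client (curl, Postman, or another language); the key details are the JSON body and the `Authorization: Bearer <key>` header.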
