# 3 or 5 days POC VBD powered by: Azure AI Search + Azure OpenAI + Bot Framework + LangChain + Azure SQL + CosmosDB + Bing Search API + Document Intelligence SDK
[Open in GitHub Codespaces](https://codespaces.new/MSUSAzureAccelerators/Azure-Cognitive-Search-Azure-OpenAI-Accelerator?quickstart=1)
[Open in Dev Containers](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/MSUSAzureAccelerators/Azure-Cognitive-Search-Azure-OpenAI-Accelerator)
* The customer team and the Microsoft team must have Contributor permissions to this resource group so they can set everything up 2 weeks prior to the workshop
* A storage account must be created in the RG.
* Customer Data/Documents must be uploaded to the blob storage account, at least two weeks prior to the workshop date
* A Multi-Tenant App Registration (Service Principal) must be created by the customer (save the Client Id and Secret Value).
* Customer must provide the Microsoft team 10-20 questions (easy to hard) that they want the bot to answer correctly.
* For IDE collaboration and standardization during the workshop, AML compute instances with JupyterLab will be used; for this, an Azure Machine Learning Workspace must be deployed in the RG
* Note: Please ensure you have enough core compute quota in your Azure Machine Learning workspace
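The document-upload prerequisite above can be scripted. The sketch below only pairs local files with blob names; the commented upload loop, the `customer-docs` container name, and the use of `azure-storage-blob` are illustrative assumptions, not part of this repo.

```python
from pathlib import Path

def iter_blob_uploads(local_dir, prefix="documents"):
    """Yield (local_path, blob_name) pairs for every file under local_dir."""
    root = Path(local_dir)
    for path in sorted(root.rglob("*")):
        if path.is_file():
            # Preserve the local folder structure inside the blob container
            yield str(path), f"{prefix}/{path.relative_to(root).as_posix()}"

# Hypothetical upload loop (needs azure-storage-blob and a connection string):
# from azure.storage.blob import ContainerClient
# container = ContainerClient.from_connection_string(conn_str, "customer-docs")
# for local_path, blob_name in iter_blob_uploads("./docs"):
#     with open(local_path, "rb") as data:
#         container.upload_blob(blob_name, data, overwrite=True)
```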
* 3a. Azure SQL Database - contains COVID-related statistics in the US.
* 3b. API Endpoints - RESTful OpenAPI 3.0 API containing up-to-date statistics about Covid.
* 3c. Azure Bing Search API - provides access to the internet, allowing scenarios like QnA on public websites.
* 3d. Azure AI Search - contains AI-enriched documents from Blob Storage:
- 10,000 Arxiv Computer Science PDFs
- 90,000 Covid publication abstracts
    - 5 lengthy PDF books
* 3f. CSV Tabular File - contains COVID-related statistics in the US.
* 3g. Kraken broker API for currencies
4. The app retrieves the result from the source and crafts the answer.
5. The tuple (Question and Answer) is saved to CosmosDB as persistent memory and for further analysis.
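Step 5 above can be sketched as shaping a JSON item and upserting it. The field names, the partition-key choice, and the commented `azure-cosmos` call are illustrative assumptions, not this repo's exact schema.

```python
import uuid
from datetime import datetime, timezone

def make_memory_item(session_id, question, answer):
    """Shape the (Question, Answer) tuple as a CosmosDB-ready JSON item."""
    return {
        "id": str(uuid.uuid4()),   # CosmosDB requires a unique 'id' per item
        "session_id": session_id,  # natural partition-key candidate
        "question": question,
        "answer": answer,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Persisting is then one call with the azure-cosmos SDK (client setup assumed):
# container.upsert_item(make_memory_item("user-123", question, answer))
```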
- Uses [Bot Framework](https://dev.botframework.com/) and [Bot Service](https://azure.microsoft.com/en-us/products/bot-services/) to host the Bot API backend and expose it to multiple channels, including MS Teams.
- 100% Python.
- Uses [Azure Cognitive Services](https://azure.microsoft.com/en-us/products/cognitive-services/) to index and enrich unstructured documents: OCR over images, chunking, and automated vectorization.
- Uses Hybrid Search Capabilities of Azure AI Search to provide the best semantic answer (Text and Vector search combined).
- Uses [LangChain](https://langchain.readthedocs.io/en/latest/) as a wrapper for interacting with Azure OpenAI and vector stores, constructing prompts, and creating agents.
- Multi-Lingual (ingests, indexes and understands any language)
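The hybrid-search feature above maps to a single request against the Azure AI Search REST API: one keyword query and one vector query, scored together. A minimal sketch of the request body, assuming a vector field named `contentVector` and the 2023-11-01 API version; the query vector itself would come from the embeddings model.

```python
def hybrid_search_body(query_text, query_vector, k=5):
    """Request body combining keyword (BM25) and vector search in one call.

    POSTed to:
    https://<service>.search.windows.net/indexes/<index>/docs/search?api-version=2023-11-01
    """
    return {
        "search": query_text,           # keyword half of the hybrid query
        "vectorQueries": [{
            "kind": "vector",
            "vector": query_vector,     # embedding of the same question
            "fields": "contentVector",  # assumed name of the vector field
            "k": k,
        }],
        "top": k,
    }
```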
**Note**: (Pre-requisite) You need to have an Azure OpenAI service already created.
1. Fork this repo to your GitHub account.
2. In Azure OpenAI studio, deploy these models (models older than the ones stated below won't work):
- "gpt-35-turbo-1106 (or newer)"
- "gpt-4-turbo-1106 (or newer)"
- "text-embedding-ada-002 (or newer)"
3. Create a Resource Group where all the assets of this accelerator will live. Azure OpenAI can be in a different RG or a different Subscription.
4. Click below to create all the Azure infrastructure needed to run the notebooks (Azure AI Search, Cognitive Services, etc.):
[Deploy to Azure](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2Fpablomarin%2FGPT-Azure-Search-Engine%2Fmain%2Fazuredeploy.json)
**Note**: If you have never created an `Azure AI Services Multi-Service account` before, please create one manually in the Azure portal to read and accept the Responsible AI terms. Once this is deployed, delete it and then use the deployment button above.
5. Clone your forked repo to your AML Compute Instance. If your repo is private, see the Troubleshooting section below for how to clone a private repo.
6. Make sure you run the notebooks on a **Python 3.10 conda environment** or newer.
7. Install the dependencies on your machine (make sure you run the pip command below in the same conda environment where you are going to run the notebooks). For example, in an AML compute instance run:
```
conda activate azureml_py310_sdkv2
```
## **FAQs**
1. **Why use the Azure AI Search engine to provide context for the LLM instead of fine-tuning the LLM?**
A: Quoting the [OpenAI documentation](https://platform.openai.com/docs/guides/fine-tuning): "GPT-3 has been pre-trained on a vast amount of text from the open internet. When given a prompt with just a few examples, it can often intuit what task you are trying to perform and generate a plausible completion. This is often called "few-shot learning."
Fine-tuning improves on few-shot learning by training on many more examples than can fit in the prompt, letting you achieve better results on a wide number of tasks. Once a model has been fine-tuned, you won't need to provide examples in the prompt anymore. This **saves costs and enables lower-latency requests**"
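The alternative this FAQ argues for, retrieval-augmented generation, boils down to a prompt pattern: chunks retrieved by the search engine are injected as grounding context at query time, so no model weights ever change. A minimal sketch (the exact instruction wording is an assumption, not the repo's prompt):

```python
def build_rag_prompt(question, retrieved_chunks):
    """Ground the LLM on retrieved context instead of fine-tuned weights."""
    # Number the chunks so the model can cite which source it used
    context = "\n\n".join(
        f"[{i + 1}] {chunk}" for i, chunk in enumerate(retrieved_chunks)
    )
    return (
        "Answer ONLY using the numbered sources below. "
        "If the answer is not in the sources, say you don't know.\n\n"
        f"Sources:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
```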