Changed module 1 instructions to use CLI to upload file to Storage instead of Portal #49

New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Open — wants to merge 2 commits into `main`
13 changes: 10 additions & 3 deletions docs/modules/Module1/Lab-1.md
@@ -117,7 +117,7 @@ chmod +x ./tools/deploy/module0/aad-federated-cred.sh
* Click on the `Run workflow` button

* Configure the following **Application Settings** for the Azure Function by going to your `function app > Configuration > Application Settings`:
-    1. OPENAI_API_BASE - Azure OpenAI API Endpoint URL (e.g. https://openai-demo-ahmedbham.openai.azure.com/)
+    1. OPENAI_API_BASE - Azure OpenAI API Endpoint URL (e.g. https://<youraccountname>.openai.azure.com/)
2. OPENAI_API_KEY - Azure OpenAI API Key
3. OPENAI_API_MODEL - "text-davinci-003" (set it equal to the `model name` you provided when deploying the `text-davinci-003` **model** in Azure OpenAI Studio)
**Remember to click Save after adding the above settings**
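The same settings can also be applied from the CLI, in line with this PR's CLI-first approach. A sketch, assuming `$resourceGroupName` is set as elsewhere in this lab and `$functionAppName` holds your function app's name (a hypothetical variable, not defined by the lab scripts); the CLI applies the settings immediately, so no separate Save step is needed:

```shell
# Hypothetical: $functionAppName must be set to your function app's name first.
az functionapp config appsettings set \
  --name "$functionAppName" \
  --resource-group "$resourceGroupName" \
  --settings \
    "OPENAI_API_BASE=https://<youraccountname>.openai.azure.com/" \
    "OPENAI_API_KEY=<your-api-key>" \
    "OPENAI_API_MODEL=text-davinci-003"
```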
@@ -138,7 +138,14 @@ chmod +x ./tools/deploy/module0/aad-federated-cred.sh
## Testing Transaction Classification App

* Open the sample transaction file [25000_spend_dataset_current_25.csv](../../../tools/deploy/Module1/data/25000_spend_dataset_current_25.csv) and notice that the **classification** column is empty. This is the column that will be populated by the Azure Function by calling Azure OpenAI API.
-* Upload this file to the **classification** blob container: `portal > storage account > containers > classification > upload`
+* Upload this file to the **classification** blob container by running the following CLI commands:

```bash
storage="$(az storage account list --resource-group $resourceGroupName --query [0].name -o tsv)"
key="$(az storage account keys list -g $resourceGroupName -n $storage --query [0].value -o tsv)"
az storage blob upload --account-name $storage --account-key $key --container-name classification --file tools/deploy/module1/data/25000_spend_dataset_current_25.csv --name 25000_spend_dataset_current_25.csv
```
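To confirm the upload succeeded, the container's blobs can be listed (a sketch reusing the `$storage` and `$key` variables set by the commands above):

```shell
# Lists blob names in the classification container; the uploaded CSV should appear.
az storage blob list \
  --account-name "$storage" \
  --account-key "$key" \
  --container-name classification \
  --query "[].name" -o tsv
```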

* After a few seconds, download the updated file from the **output** blob container: `portal > storage account > containers > output > download`
* Open the file and notice the **classification** column is populated with the predicted category for each transaction.
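In keeping with this change's move from Portal to CLI, the processed file could also be downloaded without the Portal. A sketch reusing the `$storage` and `$key` variables from the upload step; it assumes the function writes the result to the **output** container under the same blob name:

```shell
# Downloads the processed CSV from the output container to the current directory.
az storage blob download \
  --account-name "$storage" \
  --account-key "$key" \
  --container-name output \
  --name 25000_spend_dataset_current_25.csv \
  --file ./25000_spend_dataset_current_25.csv
```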

@@ -147,5 +154,5 @@ chmod +x ./tools/deploy/module0/aad-federated-cred.sh
* Delete all resources created in this lab by deleting the resource group that was created in the first step of this lab.

```bash
-az group delete --name <resource-group-name> --yes
+az group delete --name $resourceGroupName --yes
```