[release/8.0] Support multiple deployments per cognitive service (#3448) #4022
Customer Impact
When customers try to deploy an Azure OpenAI resource with more than one "deployment" model (e.g. a chat model and a text embedding model), it fails during deployment.
This fixes the issue so that multiple models can be deployed in a single AppHost Azure OpenAI resource.
Note that multiple deployment models are used in dotnet/eShop.
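For context, this is roughly the AppHost pattern that previously failed and now works. It is a minimal sketch assuming the Aspire 8.0 AddAzureOpenAI/AddDeployment hosting APIs; the resource name, model names/versions, and the Projects.WebApp project reference are illustrative only.

```csharp
var builder = DistributedApplication.CreateBuilder(args);

// One Azure OpenAI resource with two deployment models (chat + text embedding).
// Before this fix, provisioning failed when more than one deployment was added.
var openAi = builder.AddAzureOpenAI("openai")
    .AddDeployment(new AzureOpenAIDeployment("chat", "gpt-35-turbo", "0613"))
    .AddDeployment(new AzureOpenAIDeployment("embedding", "text-embedding-3-small", "1"));

// Hypothetical consuming project; the connection information flows through WithReference.
builder.AddProject<Projects.WebApp>("webapp")
       .WithReference(openAi);

builder.Build().Run();
```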
Testing
Automated tests are passing in the repo. I have manually run eShop with a local build and provisioning worked. Also verified that azd init/azd up works correctly.
Risk
Low to medium. The change marks each deployment model as dependent on the previous one, so they get deployed one at a time, which is what Azure OpenAI requires.
Regression?
Yes; this regressed when we moved from manual Bicep to Azure Provisioning.