
[BUG] BlobClient.DownloadToAsync() does not work consistently in .NET 6. #25501

Closed

brianjscho opened this issue Nov 24, 2021 · 20 comments

Labels
  • Azure.Core
  • Client – This issue points to a problem in the data-plane of the library.
  • customer-reported – Issues that are reported by GitHub users external to the Azure organization.
  • needs-team-attention – Workflow: This issue needs attention from Azure service team or SDK team.
  • question – The issue doesn't require a change to the product in order to be resolved. Most issues start as that.
  • Service Attention – Workflow: This issue is responsible by Azure service team.
  • Storage – Storage Service (Queues, Blobs, Files).
Comments


brianjscho commented Nov 24, 2021

Describe the bug
I spent time today updating a small web app from .NET 5.0 to .NET 6.0. After the upgrade, downloads of small blob files (just a few KB) fail sporadically. Our app calls BlobClient.DownloadToAsync(Stream, CancellationToken).

This app ran on .NET 5.0 for over 6 months, and on .NET Core 3.1 for at least a year before that, and we never had an issue downloading blobs.

Sometimes the download succeeds and sometimes it fails. When it fails, I see a variety of errors, but always a problem processing the HTTP headers.

Actual behavior (include Exception or Stack Trace)

Here are a couple of different errors we've received. Both are related to processing the HTTP headers.

Error 1: Index was outside the bounds of the array.

at System.Net.Http.Headers.HttpHeaders.ReadStoreValues[T](Span`1 values, Object storeValue, HttpHeaderParser parser, Int32& currentIndex)

   at System.Net.Http.Headers.HttpHeaders.GetStoreValuesAsStringOrStringArray(HeaderDescriptor descriptor, Object sourceValues, String& singleValue, String[]& multiValue)

   at System.Net.Http.Headers.HttpHeaders.GetEnumeratorCore()+MoveNext()

   at Azure.Core.Pipeline.HttpClientTransport.GetHeaders(HttpHeaders headers, HttpContent content)+MoveNext()

   at Azure.Core.Pipeline.LoggingPolicy.FormatHeaders(IEnumerable`1 headers)

   at Azure.Core.Pipeline.LoggingPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)

   at Azure.Core.Pipeline.HttpPipelineSynchronousPolicy.<ProcessAsync>g__ProcessAsyncInner|4_0(HttpMessage message, ReadOnlyMemory`1 pipeline)

   at Azure.Core.Pipeline.RedirectPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)

   at Azure.Core.Pipeline.RetryPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)

   at Azure.Core.Pipeline.RetryPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)

   at Azure.Storage.Blobs.BlobRestClient.DownloadAsync(String snapshot, String versionId, Nullable`1 timeout, String range, String leaseId, Nullable`1 rangeGetContentMD5, Nullable`1 rangeGetContentCRC64, String encryptionKey, String encryptionKeySha256, Nullable`1 encryptionAlgorithm, Nullable`1 ifModifiedSince, Nullable`1 ifUnmodifiedSince, String ifMatch, String ifNoneMatch, String ifTags, CancellationToken cancellationToken)

   at Azure.Storage.Blobs.Specialized.BlobBaseClient.StartDownloadAsync(HttpRange range, BlobRequestConditions conditions, Boolean rangeGetContentHash, Int64 startOffset, Boolean async, CancellationToken cancellationToken)

   at Azure.Storage.Blobs.Specialized.BlobBaseClient.DownloadStreamingInternal(HttpRange range, BlobRequestConditions conditions, Boolean rangeGetContentHash, String operationName, Boolean async, CancellationToken cancellationToken)

   at Azure.Storage.Blobs.Specialized.BlobBaseClient.DownloadStreamingAsync(HttpRange range, BlobRequestConditions conditions, Boolean rangeGetContentHash, CancellationToken cancellationToken)

   at Azure.Storage.Blobs.PartitionedDownloader.DownloadToAsync(Stream destination, BlobRequestConditions conditions, CancellationToken cancellationToken)

   at Azure.Storage.Blobs.Specialized.BlobBaseClient.StagedDownloadAsync(Stream destination, BlobRequestConditions conditions, StorageTransferOptions transferOptions, Boolean async, CancellationToken cancellationToken)

   at Azure.Storage.Blobs.Specialized.BlobBaseClient.DownloadToAsync(Stream destination, BlobRequestConditions conditions, StorageTransferOptions transferOptions, CancellationToken cancellationToken)

   at Azure.Storage.Blobs.Specialized.BlobBaseClient.DownloadToAsync(Stream destination, CancellationToken cancellationToken)

Error 2: Collection was modified; enumeration operation may not execute.

at System.Collections.Generic.List`1.Enumerator.MoveNext()

   at System.Net.Http.Headers.HttpHeaders.ReadStoreValues[T](Span`1 values, Object storeValue, HttpHeaderParser parser, Int32& currentIndex)

   at System.Net.Http.Headers.HttpHeaders.GetStoreValuesAsStringOrStringArray(HeaderDescriptor descriptor, Object sourceValues, String& singleValue, String[]& multiValue)

   at System.Net.Http.Headers.HttpHeaders.GetEnumeratorCore()+MoveNext()

   at Azure.Core.Pipeline.HttpClientTransport.GetHeaders(HttpHeaders headers, HttpContent content)+MoveNext()

   at Azure.Core.Pipeline.LoggingPolicy.FormatHeaders(IEnumerable`1 headers)

   at Azure.Core.Pipeline.LoggingPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)

   at Azure.Core.Pipeline.HttpPipelineSynchronousPolicy.<ProcessAsync>g__ProcessAsyncInner|4_0(HttpMessage message, ReadOnlyMemory`1 pipeline)

   at Azure.Core.Pipeline.RedirectPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)

   at Azure.Core.Pipeline.RetryPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)

   at Azure.Core.Pipeline.RetryPolicy.ProcessAsync(HttpMessage message, ReadOnlyMemory`1 pipeline, Boolean async)

   at Azure.Storage.Blobs.BlobRestClient.GetPropertiesAsync(String snapshot, String versionId, Nullable`1 timeout, String leaseId, String encryptionKey, String encryptionKeySha256, Nullable`1 encryptionAlgorithm, Nullable`1 ifModifiedSince, Nullable`1 ifUnmodifiedSince, String ifMatch, String ifNoneMatch, String ifTags, CancellationToken cancellationToken)

   at Azure.Storage.Blobs.Specialized.BlobBaseClient.GetPropertiesInternal(BlobRequestConditions conditions, Boolean async, CancellationToken cancellationToken, String operationName)

   at Azure.Storage.Blobs.Specialized.BlobBaseClient.ExistsInternal(Boolean async, CancellationToken cancellationToken)

   at Azure.Storage.Blobs.Specialized.BlobBaseClient.ExistsAsync(CancellationToken cancellationToken)

To Reproduce
Steps to reproduce the behavior (include a code snippet, screenshot, or any additional information that might help us reproduce the issue)

Here's the code snippet that is attempting to download the blob. I've also included the Startup code for injecting the BlobServiceClient.

            services
                .AddAzureClients(builder =>
                {
                    builder
                        .AddBlobServiceClient(Configuration.GetConnectionString("BlobStorageConnection"));
                });
            using var destination = new MemoryStream();

            var containerClient = _client.GetBlobContainerClient(containerName.ToLower());
            var blobClient = containerClient.GetBlobClient(fileName.ToLower());
            if (!(await blobClient.ExistsAsync(cancellationToken).ConfigureAwait(false)).Value)
                throw new FileNotFoundException($"The specified blob for this attachment does not exist.  Container: {attachment.ContainerName}, File Name: {attachment.FileName}.");

            await blobClient.DownloadToAsync(
                destination: destination,
                cancellationToken: cancellationToken).ConfigureAwait(false);

Environment:

  • Name and version of the Library package used: [e.g. Azure.Storage.Blobs 12.9.0]
  • Windows Server 2019, running .NET 6.0.0 (6.0.0-rtm.21522.10)
  • Web app running in IIS in in-process mode.
@ghost added the needs-triage, customer-reported, and question labels Nov 24, 2021
@brianjscho
Author

I did some additional testing. If I use the synchronous version of this download method, BlobClient.DownloadTo(Stream), I don't experience the issue.

I should add that I'm not able to reproduce this in a console app. I'm only getting these random errors in my web app running in IIS.
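
For reference, a minimal sketch of the synchronous call shape mentioned above (connection string, container, and blob name are placeholders, not values from our app):

using System.IO;
using Azure.Storage.Blobs;

// Same client setup as the repro below; the only change is using the synchronous overload.
var client = new BlobServiceClient("UseDevelopmentStorage=true"); // placeholder connection string
var blobClient = client.GetBlobContainerClient("mycontainer").GetBlobClient("file.bin");

using var destination = new MemoryStream();
blobClient.DownloadTo(destination); // synchronous BlobBaseClient.DownloadTo(Stream) – no failures observed on this path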

@jsquire added the Client, CXP Attention, needs-team-attention, and Storage labels Nov 24, 2021
@ghost removed the needs-triage label Nov 24, 2021

ghost commented Nov 24, 2021

Thank you for your feedback. This has been routed to the support team for assistance.

@SaurabhSharma-MSFT added the Service Attention label Nov 24, 2021

ghost commented Nov 24, 2021

Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @xgithubtriage.


@amishra-dev
Contributor

@pakrym Pavel can you help look into this?

Contributor

pakrym commented Nov 29, 2021

@amishra-dev are you accessing a single response from multiple threads when doing the parallel download? This would explain the error and why it doesn't happen in the sync code path.

The Response object is not thread-safe.
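
To illustrate what that question is getting at, here is a hypothetical sketch (not code from the reporter's app) of sharing one Response across tasks versus giving each task its own call:

using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

var blobClient = new BlobClient("UseDevelopmentStorage=true", "mycontainer", "10k.bin"); // placeholders

// Unsafe: one Response instance shared by two tasks; its headers and content may be touched concurrently.
var shared = await blobClient.DownloadStreamingAsync();
var reader = Task.Run(() => shared.Value.Content.CopyTo(Stream.Null));
var sniffer = Task.Run(() => shared.GetRawResponse().Headers.TryGetValue("ETag", out _));
await Task.WhenAll(reader, sniffer);

// Safe: each task issues its own request and owns its own Response.
await Task.WhenAll(Enumerable.Range(0, 4).Select(async _ =>
{
    using var destination = new MemoryStream();
    await blobClient.DownloadToAsync(destination);
}));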

Contributor

pakrym commented Nov 29, 2021

Looking at the stack traces, that doesn't seem to be the case. I'll take a look.

@brianjscho
Author

This process can run concurrently across multiple threads; however, each process is independent of the others. The BlobServiceClient is injected into each process, and each process creates its own BlobContainerClient and BlobClient and then awaits its own separate download request to blob storage.

@brianjscho
Author

Any progress isolating the issue? I took a look at the source for Azure.Core and I saw some in-source framework checks such as:

#if NET5_0 and #if NETCOREAPP

If this library should support future versions of .NET beyond 5.0, shouldn't it use #if NET5_0_OR_GREATER (or maybe #if NET) instead of just #if NET5_0?
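
For illustration, a hypothetical snippet (not the actual Azure.Core source) showing how the two checks differ once a net6.0 target exists:

public static class FrameworkCheckExample
{
    public static string Describe()
    {
#if NET5_0
        // Matches only the net5.0 TFM; a net6.0 build silently falls out of this branch.
        return "compiled for net5.0 only";
#elif NET5_0_OR_GREATER
        // Matches net5.0, net6.0, and any later TFM – the intent behind "5.0 and up" checks.
        return "compiled for net5.0 or greater";
#else
        return "older target (netstandard / netcoreapp)";
#endif
    }
}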

Locally, when I add net6.0 to the list of TargetFrameworks in Azure.Core.csproj, Visual Studio shows .NET 6.0 following the same code path as .NET Standard rather than net5.0; net6.0 does not trigger #if NETCOREAPP the way net5.0 does, which seems incorrect. The global.json file at the root of the SDK pins 5.0.301; if I update it to 6.0.100, the project finally recognizes net6.0 as part of NETCOREAPP.

If none of this input is helpful, please disregard. This is what stood out to me looking through the source for Azure.Core.

Contributor

pakrym commented Dec 7, 2021

If this library should support future versions of .NET beyond 5.0, shouldn't it use #if NET5_0_OR_GREATER (or maybe #if NET) instead of just #if NET5_0?

You are correct; we haven't yet cross-compiled Azure.Core for .NET 6. But because of NuGet's framework-resolution rules, the .NET 5.0 build of the assembly will be used when running on .NET 6.
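
One way to confirm which build was actually resolved at runtime is to read the compiled-in TargetFrameworkAttribute off the loaded Azure.Core assembly. A quick sketch (HttpPipeline is just a convenient Azure.Core type to probe):

using System;
using System.Reflection;
using System.Runtime.Versioning;
using Azure.Core.Pipeline;

var tfm = typeof(HttpPipeline).Assembly
    .GetCustomAttribute<TargetFrameworkAttribute>()
    ?.FrameworkName;

// On a .NET 6 app referencing the current package, this should report the net5.0 build being used.
Console.WriteLine(tfm ?? "no TargetFrameworkAttribute found");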

Any progress isolating the issue?

Not really. I can't seem to repro your issue. This is the code I use:

using Azure.Storage.Blobs;

var client = new BlobServiceClient("DefaultEndpointsProtocol=https;AccountName=...;EndpointSuffix=core.windows.net");
var containerName = "myblobcontainer";
var fileName = new [] {"AllModules.zip", "10k.bin"}; // 35mb and 10k file
var cancellationToken = CancellationToken.None;

for (int i = 0; i < 10; i++)
{
    NewFunction(i);
}

while (true)
{
    Thread.Sleep(1000);
}

void NewFunction(int i)
{
    Task.Run(async () =>
    {
        try
        {
            using var destination = new MemoryStream();

            var containerClient = client.GetBlobContainerClient(containerName);
            var blobClient = containerClient.GetBlobClient(fileName[i % 2]);
            await blobClient.ExistsAsync(cancellationToken);
            await blobClient.DownloadToAsync(
                destination: destination,
                cancellationToken: cancellationToken).ConfigureAwait(false);
            Console.Write("+");
            NewFunction(i);
        }
        catch (Exception e)
        {
            Console.WriteLine(e);
            throw;
        }
    });
}

Are you able to catch this exception in the debugger or share a memory dump from when it happens? Do you have any other code in the process that might access the response in parallel (custom policies, logging integrations, etc.)?
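
For context, a custom policy that reads response headers would look roughly like this (hypothetical example, not code from this app; such a policy would be registered via BlobClientOptions.AddPolicy):

using System;
using Azure.Core;
using Azure.Core.Pipeline;

public class HeaderSniffingPolicy : HttpPipelineSynchronousPolicy
{
    public override void OnReceivedResponse(HttpMessage message)
    {
        // Runs on the response path, reading the same header store that
        // Azure.Core's logging policy enumerates.
        if (message.Response.Headers.TryGetValue("x-ms-request-id", out var requestId))
        {
            Console.WriteLine($"request id: {requestId}");
        }
    }
}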

Author

brianjscho commented Dec 14, 2021

So I did some more digging to try to isolate what's happening on our end. The issue only occurs in web apps running in IIS (where we host all our web apps) on Windows Server 2019 Datacenter. We run this app in 4 different environments, on a separate VM for each environment, and all of them are experiencing the same issue.

Here are the details about the version of windows we're running:

Edition: Windows Server 2019 Datacenter
Version: 1809
OS Build: 17763.2366

I created a small API web app from scratch with an endpoint that lets you specify the parameters for testing the blob download; it's attached below. When running it in IIS on the Windows version above, I encounter random download failures: if I attempt 100 downloads in a loop, 3 to 10 of them typically fail.

If I do a synchronous download using DownloadTo() instead of DownloadToAsync(), it works 100% of the time.

If I target .NET 5, it also works 100% of the time for both asynchronous and synchronous downloads.

If I host this app in IIS on my personal developer laptop, it's fine on .NET 6. It's only the Windows version specified above that gives us issues.

I should also mention that we tested running this small web app from the command line on the same VMs that are experiencing the issue. When run from the command line, the async blob downloads work fine on .NET 6. It is only when hosted in IIS that we experience the issue.

If there are any more details I can provide that will help isolate the issue, don't hesitate to ask.

TestBlobDownload.zip

Author

brianjscho commented Dec 23, 2021

I know it's the holidays and I don't expect immediate action. However, was the more specific information and sample code I provided above helpful?

Contributor

pakrym commented Dec 27, 2021

Hi @brianjscho, sorry for the long wait. Your analysis is extremely impressive! From the looks of it, this issue seems to be related to IIS/ASP.NET Core Module. I'll ping the team that owns that component when everyone's back in January.

Contributor

pakrym commented Jan 3, 2022

Hey @HaoK! We are observing a strange async issue that repros only when running in IIS/ANCM on a very specific version of Windows with .NET 6. Have you ever seen anything similar?

@brianjscho
Author

Any update on this? We're about 3+ months away from .NET 5 no longer being supported and would like to migrate to .NET 6 without having to code around this issue.


HaoK commented Jan 26, 2022

Sorry I missed this; we'll try to take a look at it in triage this week.

@brianjscho
Author

In case it matters, we've been downloading and installing the Windows Hosting Bundle from the official .NET download site on our VMs. We've done this for every version of .NET Core.


adityamandaleeka commented Jan 26, 2022

Triage: we don't believe this is ANCM-related based on the info in this issue so far.

It might be related to this: dotnet/runtime#61798

Concurrent HttpHeader access isn't safe, and a change in .NET 6 revealed a bunch of these types of issues: dotnet/runtime#54130
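
For anyone hitting this later, a rough illustration (not the New Relic code) of the kind of concurrent HttpHeaders access involved: one thread enumerating a header collection while another mutates it.

using System;
using System.Net.Http;
using System.Threading.Tasks;

var response = new HttpResponseMessage();
response.Headers.Add("x-sample", "seed");

var writer = Task.Run(() =>
{
    for (var i = 0; i < 10_000; i++) response.Headers.Add("x-sample", i.ToString());
});
var reader = Task.Run(() =>
{
    // Enumeration parses and materializes stored values; racing the writer can surface
    // IndexOutOfRangeException or "Collection was modified", as in the traces above.
    for (var i = 0; i < 10_000; i++) { foreach (var _ in response.Headers) { } }
});
await Task.WhenAll(writer, reader);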

Can someone from the Azure SDK team reassign this bug to fix the concurrent access?

@pakrym @amishra-dev

@christothes
Member

Hi @brianjscho - Would you be willing to capture a minidump of this issue using DebugDiag? It can be downloaded here and the following config can be imported as a starting point: DebugDiagConfig.zip

@brianjscho
Author

I worked on this over the weekend. Since I had to install this tool anyway, I spun up a new VM in Azure to test with a completely clean slate, and I was NOT able to reproduce the issue. That's when it clicked: the runtime issue linked above mentions that New Relic v9.2.0 contained a fix for this problem. We use New Relic to monitor our applications and were only running v9.0 in all our environments. I've updated the VMs in 2 of our non-production environments to the latest version of the monitoring agent (v9.4), and the issue is resolved on those VMs.

My apologies for not mentioning New Relic in the first place. It never occurred to me that New Relic might be the issue. I think we can probably close this issue unless you feel there is still something to dig into further.

@christothes
Member

My apologies for not mentioning New Relic in the first place. It never occurred to me that New Relic might be the issue. I think we can probably close this issue unless you feel there is still something to dig into further.

No worries. I don't think we connected the dots until @adityamandaleeka referenced the runtime issue either.

Thanks for confirming!
