
Read data from tables which are used by BulkInsert are causing the BulkInsert to fail #207

Closed
PassionateDeveloper86 opened this issue Jul 19, 2019 · 1 comment

@PassionateDeveloper86

Hi,

I have a data warehouse which is filled automatically every hour.
It takes around 30 seconds to insert 180K rows.
I wrap that into a transaction to be safe for the delete I do earlier; this is my code:

```csharp
try
{
    using (var transaction = await this.Context.Database.BeginTransactionAsync())
    {
        var newList = items.Select(x => new DwhOrderBudgetOverviewMainItem()
        {
            Address = x.Address,
            Amount = x.Amount,
            Contract = x.Contract,
            DamageTheme = x.DamageTheme,
            District = x.District,
            DocumentDate = x.DocumentDate,
            DocumentDateMonth = x.DocumentDateMonth,
            DocumentDateTitle = x.DocumentDateTitle,
            DocumentDateYear = x.DocumentDateYear,
            DocumentNumber = x.DocumentNumber,
            Type = (int)x.Type,
            VendorName = x.VendorName,
            VendorNumber = x.VendorNumber,
            ObjectAccountNumber = x.ObjectAccountNumber,
        }).ToList();

        var query = this.Context.DwhOrderBudgetOverviewMainItems.AsQueryable();

        if (fillOnlyThisYear)
        {
            // Where returns a new IQueryable; the result must be reassigned,
            // otherwise the filter is silently dropped
            query = query.Where(x => x.DocumentDateYear == DateTime.Now.Year);
        }

        await query.BatchDeleteAsync();

        var chunk = 10000;
        for (int i = 0; i < newList.Count; i += chunk)
        {
            var chunkedList = newList.Skip(i).Take(chunk).ToList();
            await this.Context.BulkInsertAsync(chunkedList);
        }

        transaction.Commit();

        return new DatabaseResultModel(newList.Count);
    }
}
catch (Exception ex)
{
    return new DatabaseResultModel(ex);
}
```

So every chunk takes around 2.5 seconds to insert.

When during those 2.5 seconds (I can reproduce it very reliably) someone tries to read the data with another method (and of course another context), the BulkInsert falls into the catch block and throws:

A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.) ---> System.ComponentModel.Win32Exception (10054): An existing connection was forcibly closed by the remote host

@borisdj
Owner

borisdj commented Jul 19, 2019

In BulkConfig you can try setting the SqlBulkCopyOptions enum with the TableLock flag.
Also, if the problem is some kind of deadlock, there is useful info in #46.
There is also a thread about this error, but I'm not sure it is connected with the Bulk use case:
SO-a-transport-level-error
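The TableLock suggestion above could look roughly like this (a minimal sketch, not tested against this issue's schema; `chunkedList` stands in for the caller's batch, and the `SqlBulkCopyOptions` namespace depends on which SqlClient package is referenced):

```csharp
using EFCore.BulkExtensions;

// Sketch: request a bulk-update table lock for the duration of each
// SqlBulkCopy operation, so concurrent readers cannot interleave with
// the insert and drop the connection mid-stream.
var bulkConfig = new BulkConfig
{
    SqlBulkCopyOptions = SqlBulkCopyOptions.TableLock
};

await this.Context.BulkInsertAsync(chunkedList, bulkConfig);
```

Note that TableLock trades concurrency for safety: readers without NOLOCK/snapshot isolation will block until the insert commits, which may be acceptable for an hourly warehouse load.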
