Updating child records (ADO.NET)

Something to keep in mind before rushing in and breaking other parts of the application: it is the growing number of entities attached to the context that slows down insertion, step by step. Here are a few measurements for my 560,000 entities. The behaviour in this first test is that performance is very non-linear and degrades dramatically over time.

[Figure: insertion times for 560,000 entities at batch sizes 1, 10, 100, and 1000]

You can see that speed increases when moving from a batch size of 1 to 10, and again from 10 to 100, but from 100 to 1000 insertion speed drops off again.
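For reference, the degradation comes from a loop shaped roughly like this: a single long-lived context that keeps every inserted entity attached. A minimal sketch, assuming hypothetical MyDbContext, MyEntity, sourceItems, and batchSize names (EF6-style API):

    // Slow pattern: one long-lived context. SaveChanges runs every
    // `batchSize` rows, but every inserted entity stays attached, so the
    // change tracker keeps growing and each save is slower than the last.
    using (var context = new MyDbContext())
    {
        int count = 0;
        foreach (var item in sourceItems)
        {
            context.MyEntities.Add(new MyEntity { Name = item });
            if (++count % batchSize == 0)
                context.SaveChanges();
        }
        context.SaveChanges(); // flush the final partial batch
    }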

So I focused on what happens when you reduce the batch size to a value somewhere between 10 and 100, and here are my results (I'm using different row contents, so my times are not directly comparable to the ones above):

Quantity | Batch size | Interval
1000     | 1          | 3
10000    | 1          | 34
100000   | 1          | ?
1000     | 5          | 1
10000    | 5          | 12
100000   | 5          | ?
1000     | 10         | 1
10000    | 10         | 11
100000   | 10         | ?
1000     | 20         | 1
10000    | 20         | 9
100000   | 20         | 92
1000     | 27         | 0
10000    | 27         | 9
100000   | 27         | 92
1000     | 30         | 0
10000    | 30         | 9
100000   | 30         | 92
1000     | 35         | 1
10000    | 35         | 9
100000   | 35         | 94
1000     | 50         | 1
10000    | 50         | 10
100000   | 50         | ?
1000     | 100        | 1
10000    | 100        | 14
100000   | 100        | 141

Based on my results, the actual optimum is a batch size of around 30. The problem is, I have no idea why 30 is optimal, nor have I found any logical explanation for it.
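For completeness, here is a minimal sketch of the batched insert these measurements reflect, with the batch size as a parameter. MyDbContext and MyEntity are hypothetical names, and the Configuration property is the EF6 API; the two key tricks are turning off automatic change detection and disposing/recreating the context between batches so the change tracker stays small:

    using System.Collections.Generic;

    public static class BatchInserter
    {
        public static void InsertInBatches(IEnumerable<string> sourceItems, int batchSize)
        {
            MyDbContext context = null;
            try
            {
                context = new MyDbContext();
                context.Configuration.AutoDetectChangesEnabled = false;

                int count = 0;
                foreach (var item in sourceItems)
                {
                    context.MyEntities.Add(new MyEntity { Name = item });

                    if (++count % batchSize == 0)
                    {
                        context.SaveChanges();
                        context.Dispose();           // drop all tracked entities
                        context = new MyDbContext(); // fresh, empty change tracker
                        context.Configuration.AutoDetectChangesEnabled = false;
                    }
                }
                context.SaveChanges(); // flush the final partial batch
            }
            finally
            {
                if (context != null) context.Dispose();
            }
        }
    }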

As other people have said, SqlBulkCopy is the way to go if you want really good insert performance.

It's a bit cumbersome to implement, but there are libraries that can help you with it.

I've inserted several million records with it before and it is extremely fast.
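To make the shape of the API concrete, here is a minimal SqlBulkCopy sketch. The table name dbo.MyEntities, its columns, and the connection string are placeholder assumptions:

    using System.Data;
    using System.Data.SqlClient;

    public static class BulkInserter
    {
        public static void BulkInsert(DataTable rows, string connectionString)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var bulkCopy = new SqlBulkCopy(connection))
                {
                    bulkCopy.DestinationTableName = "dbo.MyEntities";
                    bulkCopy.BatchSize = 5000;    // rows sent per round trip
                    bulkCopy.BulkCopyTimeout = 0; // no timeout for large loads
                    // Map source DataTable columns to destination columns.
                    bulkCopy.ColumnMappings.Add("Name", "Name");
                    bulkCopy.ColumnMappings.Add("Value", "Value");
                    bulkCopy.WriteToServer(rows);
                }
            }
        }
    }

WriteToServer also accepts an IDataReader, which lets you stream rows without building a large DataTable in memory; that is essentially the plumbing the helper libraries wrap for you.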

That said, unless you'll need to re-run this insert, it might be easier to just use EF.

I tried many of the solutions provided in this post, and SqlBulkCopy was by far the fastest.

Pure EF took 15 min, but with a mix of the solution and SqlBulkCopy I was able to get down to 1.5 min! And that's without any DB index. @Mustafa: yeah.

The one problem, of course, is if you need to insert related data.
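One hedged way around the related-data problem is to generate the primary keys on the client before the bulk copy, so child rows already carry valid foreign keys; SqlBulkCopyOptions.KeepIdentity tells SQL Server to keep the supplied values even for an identity column. The dbo.Parents and dbo.Children tables here are hypothetical:

    using System.Data;
    using System.Data.SqlClient;

    public static class RelatedBulkInserter
    {
        // `parents` rows carry client-assigned Id values; `children` rows
        // carry a ParentId column referencing those same values.
        public static void BulkInsertParentsAndChildren(
            DataTable parents, DataTable children, string connectionString)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var transaction = connection.BeginTransaction())
                {
                    const SqlBulkCopyOptions options = SqlBulkCopyOptions.KeepIdentity;

                    using (var bulkCopy = new SqlBulkCopy(connection, options, transaction))
                    {
                        bulkCopy.DestinationTableName = "dbo.Parents";
                        bulkCopy.WriteToServer(parents);
                    }
                    using (var bulkCopy = new SqlBulkCopy(connection, options, transaction))
                    {
                        bulkCopy.DestinationTableName = "dbo.Children";
                        bulkCopy.WriteToServer(children);
                    }
                    transaction.Commit(); // parents and children land atomically
                }
            }
        }
    }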
