New efcore provider: Implementing batched update support #35512
> Since you mentioned
At least in our terminology - yes. Bulk Insert is a highly optimized way of inserting data that doesn't allow retrieving any store-generated values from the inserted rows. Tracked by #27333
Yes, and this is a tricky part. For SQL Server, see SqlServerUpdateSqlGenerator.cs for the details of what we use.
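For a custom provider, the analogous hook is a subclass of the relational `UpdateSqlGenerator`. A minimal, hedged sketch (the class name is hypothetical, and the overridable members vary between EF Core versions, so verify against the version in use):

```csharp
using Microsoft.EntityFrameworkCore.Update;

// Hypothetical SQL generator for the custom provider. The overridable
// Append*Operation members are where a provider customizes how inserts,
// updates, and deletes (and the read-back of store-generated values)
// are emitted.
public class MyProviderUpdateSqlGenerator : UpdateSqlGenerator
{
    public MyProviderUpdateSqlGenerator(UpdateSqlGeneratorDependencies dependencies)
        : base(dependencies)
    {
    }

    // Override members such as AppendInsertOperation here when the store
    // needs non-default syntax for returning database-generated values.
}
```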
Yes, and choose whether to derive from ReaderModificationCommandBatch.cs or AffectedCountModificationCommandBatch.cs depending on how the number of rows affected is returned by the database.
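As a rough sketch of the second option, a provider batch deriving from `AffectedCountModificationCommandBatch` might look like the following. The class name and constructor shape are assumptions modeled on EF Core 8's relational layer; check the exact base-class signatures in the EF Core version you target:

```csharp
using Microsoft.EntityFrameworkCore.Update;

// Hypothetical batch for a custom provider whose database reports each
// mutation's outcome via an affected-row count.
public class MyProviderModificationCommandBatch : AffectedCountModificationCommandBatch
{
    public MyProviderModificationCommandBatch(
        ModificationCommandBatchFactoryDependencies dependencies,
        int maxBatchSize)
        : base(dependencies, maxBatchSize)
    {
    }

    // Providers typically override IsValid() to enforce store-specific
    // limits (parameter counts, total SQL length, and so on); returning
    // true accepts every batch the base class builds.
    protected override bool IsValid()
        => true;
}
```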
Yes, and I think I'll make sure we do that. Is this meant to handle any mutation (insert/update/delete)?
I believe this could work the same way, though one thing I'm actually curious about is ensuring database-assigned IDs are communicated back to the EF Core infrastructure correctly (if this is even a requirement). E.g., if I essentially have 10 inserts happening, it's easy enough to batch those by delimiting with semicolons. For each insert I can easily capture the newly assigned ID, or the ID and columns expected back (using affected rows), and I understand a bit more now as to why this currently just creates its own statement per 'batch'.
@AndriySvyryd Do you know if there are tests I can run that would show exactly what is happening with SQL Server, as far as: here are X inserts, here is the statement batch sent to SQL Server, and what it returns back? I found:

```sql
DECLARE @inserted0 TABLE ([Id] int);
INSERT INTO [dbo].[Ducks] ([Id])
OUTPUT INSERTED.[Id]
INTO @inserted0
VALUES (DEFAULT),
(DEFAULT);
SELECT [t].[Id], [t].[Computed] FROM [dbo].[Ducks] t
INNER JOIN @inserted0 i ON ([t].[Id] = [i].[Id]);
```

So essentially it declares a table variable, inserts into the target table while outputting the inserted IDs into that variable (in this case with default values), and ultimately returns the IDs in the order they were inserted?
Yes, once we implement the support for it in EF (not this year).
Yes, SQL Server needs to do it because the generated values are returned as a table. If you use a separate statement for each row, there should be no problem with the order.
It allows reading the results from a single resultset or single row per resultset or anything in between.
Take a look at https://github.com/dotnet/efcore/blob/44580352b37ed0df4fa02236c7d344b5bb003ef3/test/EFCore.SqlServer.FunctionalTests/BatchingTest.cs. You could use SSMS to look at the SQL that is sent.
Yes, pretty much; this is for the most complex case, where the keys are generated in addition to other columns.
Question
Greetings! I'm still plugging my way through an EF Core provider implementation for a proprietary database engine. I've made quite a lot of progress, but noticed I'm not seeing batch support happening automatically. I'm not sure this is explicit bulk insert, but rather sending multiple updates delimited by semicolons or something similar. I've just got a few questions on implementing this (see the article about batched updates).

Do I need to implement `IModificationCommandBatchFactory` to handle that? I've got a sample project where I'm simply trying to insert a number of records via `AddRangeAsync` and then calling `SaveChangesAsync`, and they seem to run as a bunch of different queries on the same connection. For this demo project I've also configured a minimum of 1 and a maximum of 50 for batched statement sizes. I just don't know if batched updates are only for updates, or if they should cover inserts as well.

Thank you for your time, and any pointers you might be able to give.
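For what it's worth, the factory surface itself is small: `IModificationCommandBatchFactory` exposes a single `Create()` method that EF Core calls each time it starts a new batch during `SaveChanges`. A hedged sketch, where every type name other than the EF Core interfaces is hypothetical:

```csharp
using Microsoft.EntityFrameworkCore.Update;

// Hypothetical factory for the custom provider. MyProviderModificationCommandBatch
// would be the provider's own class deriving from ReaderModificationCommandBatch
// or AffectedCountModificationCommandBatch.
public class MyProviderModificationCommandBatchFactory : IModificationCommandBatchFactory
{
    private readonly ModificationCommandBatchFactoryDependencies _dependencies;

    public MyProviderModificationCommandBatchFactory(
        ModificationCommandBatchFactoryDependencies dependencies)
        => _dependencies = dependencies;

    public ModificationCommandBatch Create()
        => new MyProviderModificationCommandBatch(_dependencies);
}
```

The factory is registered alongside the provider's other relational services (providers typically use `EntityFrameworkRelationalServicesBuilder.TryAdd` in their `AddEntityFramework...` setup); without a registered batch factory capable of holding more than one command, SaveChanges falls back to one statement per command.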
EF Core version: 8.0.11
Database provider: Custom
Target framework: .NET 8
Operating system: Windows 11
IDE: Visual Studio 2022