After some googling, I could not find a proper replacement for SqlBulkCopy from SqlClient in Npgsql. Can anybody suggest anything like SqlBulkCopy for PostgreSQL? I need to insert around 10 million rows. I know about making single-query insertions...
I am just doing a test for comparison purposes.
Any suggestion is appreciated.
Very old now, but note that Npgsql 3 includes optimized binary bulk copy functionality; see the docs (3.0).
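For anyone landing here later, the binary import API looks roughly like this. A minimal sketch, assuming a hypothetical table my_table (id integer, name text), a source collection items, and an existing connection string connString:

```csharp
using Npgsql;
using NpgsqlTypes;

using (var conn = new NpgsqlConnection(connString))
{
    conn.Open();

    // COPY ... FROM STDIN (FORMAT BINARY) streams rows without per-row round trips.
    using (var writer = conn.BeginBinaryImport(
        "COPY my_table (id, name) FROM STDIN (FORMAT BINARY)"))
    {
        foreach (var item in items)
        {
            writer.StartRow();
            writer.Write(item.Id, NpgsqlDbType.Integer);
            writer.Write(item.Name, NpgsqlDbType.Text);
        }
        writer.Complete(); // Npgsql 4+; on 3.x the import commits when the writer closes
    }
}
```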
I'm porting my old project. It previously used the Oracle.DataAccess.Client library to access an Oracle database.
Now I want to use the Npgsql library to access a PostgreSQL database, so I need to port the old C# code that uses Oracle library methods. For example, the old code uses the OracleCommand.ArrayBindCount property to insert multiple rows at once.
But I do not see a similar method or property to do the same thing.
I want to know if there are similar ways to achieve this.
No, there's nothing like ArrayBindCount in Npgsql/PostgreSQL.
However, you can batch two (or more) INSERT statements in the same CommandText (delimited by semicolons) and have all the parameters for all statements on that command. This would execute efficiently as a single batch.
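Something along these lines; the table, columns, and connString are made up for illustration:

```csharp
using Npgsql;

using (var conn = new NpgsqlConnection(connString))
using (var cmd = conn.CreateCommand())
{
    conn.Open();

    // Two INSERTs, one round trip; all parameters live on the same command.
    cmd.CommandText =
        "INSERT INTO my_table (id, name) VALUES (@id0, @name0); " +
        "INSERT INTO my_table (id, name) VALUES (@id1, @name1)";
    cmd.Parameters.AddWithValue("id0", 1);
    cmd.Parameters.AddWithValue("name0", "alpha");
    cmd.Parameters.AddWithValue("id1", 2);
    cmd.Parameters.AddWithValue("name1", "beta");
    cmd.ExecuteNonQuery();
}
```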
For really efficient bulk inserts, use COPY, which is by far the fastest option.
You can also refer to the post below:
Does PostgreSQL have the equivalent of an Oracle ArrayBind?
I have a C# console app that uses Entity Framework to insert about a million records into an Oracle 11 DB. The first run took over 12 hours and I finally had to kill it. I'm logging the inserts for the moment so I can see what's going on, and I'm sure that is slowing it down some. There were no errors; it was just taking forever to insert that many records.
Someone suggested looking at SQL*Loader for Oracle, but I'm new to Oracle and I'm not sure I can run that from inside a console app, and I would have to make sure it completed successfully before moving on to the next portion of the application, which creates an export file.
Any suggestions on what I can look at to make the inserting happen quicker?
You can't use EF for this kind of massive job. I mean, you can, but as you saw, it is not efficient.
The best way here is to use ODP.NET (http://www.oracle.com/technetwork/topics/dotnet/index-085163.html) and do a plain old PL/SQL insert.
Take a look at this answer for more details: Bulk Insert to Oracle using .NET
Or look at this one for a sample implementation: http://www.oracle.com/technetwork/issue-archive/2009/09-sep/o59odpnet-085168.html
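For a feel of what the array-bind approach looks like, here is a minimal sketch; the table, columns, and connString are placeholders:

```csharp
using System.Data;
using Oracle.DataAccess.Client; // Oracle.ManagedDataAccess.Client exposes the same API

int[] ids = { 1, 2, 3 };
string[] names = { "a", "b", "c" };

using (var conn = new OracleConnection(connString))
using (var cmd = conn.CreateCommand())
{
    conn.Open();
    cmd.CommandText = "INSERT INTO my_table (id, name) VALUES (:id, :name)";

    // One round trip binds and inserts every element of the arrays.
    cmd.ArrayBindCount = ids.Length;
    cmd.Parameters.Add(":id", OracleDbType.Int32, ids, ParameterDirection.Input);
    cmd.Parameters.Add(":name", OracleDbType.Varchar2, names, ParameterDirection.Input);
    cmd.ExecuteNonQuery();
}
```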
Obviously, you can still use EF for everything else; you just need to make a little arrangement to implement this step with plain old PL/SQL. That's the way I work with SQL Server, for example.
Hope it helps.
Seriously, look into SQL*Loader. Read the tutorials. It's insanely fast. Anything else is just wasted time, both runtime and yours. You can probably learn how to use it and insert all your data in the time it takes just to run any alternative solution.
You can use the Process class to start external processes from your console application.
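Something like this, with the credentials and .ctl/.log file names as placeholders:

```csharp
using System;
using System.Diagnostics;

// "scott/tiger@orcl" and the control/log file names below are placeholders.
var psi = new ProcessStartInfo
{
    FileName = "sqlldr",
    Arguments = "userid=scott/tiger@orcl control=load.ctl log=load.log",
    UseShellExecute = false,
    RedirectStandardOutput = true,
    RedirectStandardError = true
};

using (var process = Process.Start(psi))
{
    string output = process.StandardOutput.ReadToEnd();
    process.WaitForExit();

    // SQL*Loader reports problems through its exit code, so check it
    // before moving on to the export step.
    if (process.ExitCode != 0)
        throw new InvalidOperationException("SQL*Loader failed: " + output);
}
```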
Use either SQL*Loader or .NET's Oracle bulk loader mechanism, which relies on the ODP.NET data provider. SQL*Loader is more "native" to Oracle, but ODP.NET is pure managed code and does not require launching an external process.
Here's a good post to help introduce these topics further: Bulk Insert to Oracle using .NET
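As a rough illustration of the ODP.NET route, a minimal OracleBulkCopy sketch; note that OracleBulkCopy historically shipped only with the unmanaged ODP.NET provider, and the table name, DataTable, and connString are placeholders:

```csharp
using Oracle.DataAccess.Client; // OracleBulkCopy lives in the unmanaged ODP.NET provider

using (var conn = new OracleConnection(connString))
{
    conn.Open();

    using (var bulkCopy = new OracleBulkCopy(conn))
    {
        bulkCopy.DestinationTableName = "MY_TABLE"; // hypothetical target table
        bulkCopy.BatchSize = 10000;
        bulkCopy.WriteToServer(dataTable); // a DataTable whose columns match the target
    }
}
```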
I am working with Dapper.NET for bulk insert operations into SQL Server tables. I am thinking of using SqlBulkCopy with Dapper.NET but don't have any experience with how to use SqlBulkCopy alongside Dapper.NET.
Your help is highly appreciated.
It is not a good idea to use Dapper for bulk inserts, because it will not be fast. The better choice for this is the SqlBulkCopy class. But if you want to use Dapper for bulk inserts, you can find a solution here.
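For reference, the plain SqlBulkCopy pattern is short; the columns, table name, and connString below are made up:

```csharp
using System.Data;
using System.Data.SqlClient;

// Build a DataTable whose columns match the destination table.
var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Name", typeof(string));
table.Rows.Add(1, "alpha");
table.Rows.Add(2, "beta");

using (var conn = new SqlConnection(connString))
{
    conn.Open();

    using (var bulkCopy = new SqlBulkCopy(conn))
    {
        bulkCopy.DestinationTableName = "dbo.MyTable";
        bulkCopy.WriteToServer(table);
    }
}
```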
How do I do multiple inserts (50,000 records) as well as updates using Dapper.NET?
Is it possible to use SqlBulkCopy to achieve this? If yes, then how?
Is there a best way to implement multiple hierarchical inserts or updates using Dapper.NET?
Technologies: C#, SQL Server 2012, Dapper.NET
If you just want to insert: SqlBulkCopy should be fine; if you want an "upsert", I suggest table-valued parameters (which Dapper supports) and the MERGE T-SQL operation.
Dapper just simplifies ADO.NET; if you can think of a way to do it in ADO.NET, Dapper can probably make it easier for you; however, it sounds like multiple TVPs might suffice.
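A minimal sketch of the TVP + MERGE approach; dbo.MyTable, the table type dbo.MyRowType (which must be created on the server first, e.g. CREATE TYPE dbo.MyRowType AS TABLE (Id int, Name nvarchar(100))), and connString are all made up:

```csharp
using System.Data;
using System.Data.SqlClient;
using Dapper;

var rows = new DataTable();
rows.Columns.Add("Id", typeof(int));
rows.Columns.Add("Name", typeof(string));
rows.Rows.Add(1, "alpha");
rows.Rows.Add(2, "beta");

// MERGE updates matching rows and inserts the rest, all in one statement.
const string sql = @"
MERGE dbo.MyTable AS target
USING @rows AS source ON target.Id = source.Id
WHEN MATCHED THEN UPDATE SET target.Name = source.Name
WHEN NOT MATCHED THEN INSERT (Id, Name) VALUES (source.Id, source.Name);";

using (var conn = new SqlConnection(connString))
{
    conn.Execute(sql, new { rows = rows.AsTableValuedParameter("dbo.MyRowType") });
}
```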
If you are OK with segregating insert and update entities separately, then I would suggest the Dapper.Contrib library, provided by the Dapper.NET folks themselves. It is available via NuGet, and it has worked very efficiently for my project.
Here is the link to their GitHub project page.
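For illustration, a minimal Dapper.Contrib sketch; the entity, table name, and connString are made up, and Insert/Update accept lists as well as single entities:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using Dapper.Contrib.Extensions;

[Table("MyEntities")] // hypothetical table name
public class MyEntity
{
    [Key]
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class Persistence
{
    // Insert and Update each take a single entity or a list of entities.
    public static void Save(string connString,
        List<MyEntity> toInsert, List<MyEntity> toUpdate)
    {
        using (var conn = new SqlConnection(connString))
        {
            conn.Insert(toInsert);
            conn.Update(toUpdate);
        }
    }
}
```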
I'm providing MySQL compatibility for my program, which previously worked only with SQL Server. I used SqlBulkCopy and I would like to use it with MySQL as well. I know there is MySqlBulkLoader, which can be used to perform the same task. The difference, however, is that SqlBulkCopy works with a DataTable, so I prepared my DataTable and then performed the copy. MySqlBulkLoader, as far as I know, is used to copy an entire file into the database. But I am not dealing with a file here, and I would prefer to skip the extra steps of converting my DataTable into a temp file, performing the bulk copy, and then deleting the temp file.
Is there a way to make MySqlBulkLoader work with DataTables? Is there a trustworthy alternative to MySqlBulkLoader?
I assume that you're using MySQL Connector/NET, but which version of it?
Assuming that you're using the latest version (8.0 at the time of writing), a look at the MySQL Connector/NET 8.0 API Reference shows that there is no option other than importing your data from an existing file.
It seems like your proposed method is the only workaround for that...
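If you do go through a temp file, the detour is small. A sketch assuming a populated DataTable named table, a target table my_table, and a connString; note the naive comma join below does no CSV quoting or escaping:

```csharp
using System;
using System.Data;
using System.IO;
using System.Linq;
using MySql.Data.MySqlClient;

string tempFile = Path.GetTempFileName();
try
{
    // Flatten each DataRow into one CSV line (no quoting/escaping here).
    var lines = table.Rows.Cast<DataRow>()
        .Select(row => string.Join(",", row.ItemArray));
    File.WriteAllLines(tempFile, lines);

    using (var conn = new MySqlConnection(connString))
    {
        conn.Open();
        var loader = new MySqlBulkLoader(conn)
        {
            TableName = "my_table",          // hypothetical target table
            FileName = tempFile,
            FieldTerminator = ",",
            LineTerminator = Environment.NewLine,
            Local = true                     // the file lives on the client
        };
        loader.Load();
    }
}
finally
{
    File.Delete(tempFile);
}
```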