SQL bulk stored procedure call in C#

How do I call stored procedures in bulk? I would like to do something like a bulk copy.
All the stored procedure does is 8 selects to check unique constraints and 8 inserts, with no return value.

You cannot do that.
Bulk copy is a firehose dump of data into a table; you cannot call sprocs or anything else along the way, it just dumps the rows into an existing table.
What you can do, however, is dump the data using bulk copy into a temporary table with the right structure, and then afterwards call your sproc to move that data into the real tables, possibly by modifying existing data instead of inserting it.
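For example, a minimal sketch of that staging approach. The names dbo.StagingTable and dbo.uspProcessStagedData are hypothetical, and connectionString and sourceDataTable are assumed to exist:
// Sketch: bulk copy into a staging table, then let a sproc do the
// unique-constraint checks and the real inserts.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    // 1) Firehose the rows into the staging table
    using (var bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "dbo.StagingTable"; // hypothetical
        bulk.WriteToServer(sourceDataTable);
    }

    // 2) Let the sproc move the staged rows into the real tables
    using (var cmd = new SqlCommand("dbo.uspProcessStagedData", conn)) // hypothetical
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.ExecuteNonQuery();
    }
}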

If you are using SQL Server 2008, then table-valued parameters are a viable option.
First, you create a user-defined table type containing all of your expected columns and data types on the SQL Server side:
create type dbo.MyTableType as table
(
foo int,
bar varchar(100)
);
then use the above as the table type parameter for your stored procedure:
create procedure uspInsertMyBulkData
(
@myTable dbo.MyTableType readonly
)
as
/* now in here you can use the multi-row data from the passed-in table
parameter, @myTable, to do your selects and inserts */
Then, on the C#/.NET client side, call this stored procedure via ADO.NET and pass in either a DataTable, an object that inherits from DbDataReader (such as DataTableReader), or an object of type IEnumerable<SqlDataRecord>:
// create my source DataTable
object[] row1 = { 1, "a" };
object[] row2 = { 2, "b" };
var myDataTable = new DataTable();
myDataTable.Columns.Add(new DataColumn("foo", typeof(int)));
myDataTable.Columns.Add(new DataColumn("bar", typeof(string)));
myDataTable.LoadDataRow(row1, true);
myDataTable.LoadDataRow(row2, true);
// bulk send data to database
var conn = new SqlConnection(connectionString);
var cmd = new SqlCommand("uspInsertMyBulkData", conn)
{
    CommandType = CommandType.StoredProcedure
};
SqlParameter param = cmd.Parameters.AddWithValue("@myTable", myDataTable);
param.SqlDbType = SqlDbType.Structured;
conn.Open(); // the connection must be open before executing
cmd.ExecuteNonQuery();

If you want to bulk load data into a table (inserts), the SqlBulkCopy class is the way to go.
Alternatively, you can use the SqlDataAdapter. Set the InsertCommand to the stored procedure that will perform an insert, and map the datatable fields to the sproc parameters. If you have updated records in the datatable, you can also specify an UpdateCommand which will be fired for each updated row. Then call the Update method on the SqlDataAdapter passing it the datatable. You can set the UpdateBatchSize property to define how many records to send to the db in each roundtrip.
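A rough sketch of that adapter setup, assuming a hypothetical insert sproc dbo.uspInsertRow with @foo/@bar parameters matching the DataTable columns, and an existing connection conn:
// Sketch: SqlDataAdapter pushing DataTable rows through an insert sproc in batches.
var adapter = new SqlDataAdapter();
var insert = new SqlCommand("dbo.uspInsertRow", conn) // hypothetical sproc
{
    CommandType = CommandType.StoredProcedure,
    UpdatedRowSource = UpdateRowSource.None // required when batching
};
insert.Parameters.Add("@foo", SqlDbType.Int, 0, "foo");       // bound to DataTable column "foo"
insert.Parameters.Add("@bar", SqlDbType.VarChar, 100, "bar"); // bound to DataTable column "bar"
adapter.InsertCommand = insert;
adapter.UpdateBatchSize = 100;  // rows per roundtrip
adapter.Update(myDataTable);    // fires the sproc for each added row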

SqlServer stored procedures can accept xml, so you could prepare your bulk data as an xml file and pass it to a special-purpose stored procedure which would then call your original stored procedure for each row. You'd need the OPENXML function.
I hesitate to recommend the xml features of SqlServer, but this may be a case where they are appropriate.
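A minimal T-SQL sketch of what that could look like, assuming element-centric XML like <rows><row><foo>1</foo><bar>a</bar></row></rows> and a hypothetical target table dbo.MyTable:
-- Sketch: shred the XML parameter with OPENXML, then insert set-based
-- (or call the original sproc per row with a cursor instead).
CREATE PROCEDURE dbo.uspInsertFromXml
    @xmlData xml
AS
BEGIN
    DECLARE @hDoc int;
    EXEC sp_xml_preparedocument @hDoc OUTPUT, @xmlData;

    INSERT INTO dbo.MyTable (foo, bar)
    SELECT foo, bar
    FROM OPENXML(@hDoc, '/rows/row', 2)  -- 2 = element-centric mapping
         WITH (foo int, bar varchar(100));

    EXEC sp_xml_removedocument @hDoc;
END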

I'm not saying that I recommend it, but you could put an insert trigger on the table you are bulk copying into that inserts into those 8 separate tables instead of the original one. You may need to have a tempdb big enough to store all the data though...
CREATE TRIGGER TRG_REPLACETRIGGER
ON BULK_TABLE
INSTEAD OF INSERT
AS BEGIN
INSERT TABLE1 (ID, VALUE) SELECT ID, VALUE1 FROM INSERTED
INSERT TABLE2 (ID, VALUE) SELECT ID, VALUE2 FROM INSERTED
-- ... TABLE3-7
INSERT TABLE8 (ID, VALUE) SELECT ID, VALUE8 FROM INSERTED
END

Related

How to insert a DataTable into SQL Server database using UiPath database activities?

I have a datatable DT with 25 columns. I want to insert that data from DT into a SQL Server table SQDT with the same columns. I'm using an INSERT query for now.
Is there a better way to approach this problem than using an INSERT for each record? Can I bulk insert into the database using a stored procedure or some other efficient way? Inserting takes a long time because DT has 130,000 rows.
Yes, there is a better way. You need to create a type in SQL Server with the same definition as your DataTable.
Here you can use a user-defined table type:
CREATE TYPE [dbo].[MyTable] AS TABLE(
[Id] int NOT NULL
)
Define the parameter in the stored procedure:
CREATE PROCEDURE [dbo].[InsertTable]
@myTable MyTable readonly
AS
BEGIN
insert into [dbo].Records select * from @myTable
END
and send your DataTable as that parameter.
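A hedged sketch of the C# side, assuming the proc and type above and a connectionString variable:
// Sketch: pass the 25-column DataTable DT as a table-valued parameter.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.InsertTable", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    var p = cmd.Parameters.AddWithValue("@myTable", DT);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.MyTable"; // the user-defined table type above
    conn.Open();
    cmd.ExecuteNonQuery();
}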
Alternatively, you can use the built-in SqlBulkCopy class in C#. SqlBulkCopy can do bulk inserts in batches, which worked efficiently for me:
https://programmingwithmosh.com/net/using-sqlbulkcopy-for-fast-inserts/
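A minimal SqlBulkCopy sketch, assuming the destination table SQDT and the DataTable DT from the question:
// Sketch: batched SqlBulkCopy straight into the destination table.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "dbo.SQDT";
        bulk.BatchSize = 5000;      // rows sent per batch
        bulk.BulkCopyTimeout = 0;   // no timeout for large loads
        bulk.WriteToServer(DT);
    }
}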

Fastest way to bulk insert from a text file to SQL Server without Integration Services

I have the following requirement:
The customer will drop a file, which can be around 600 MB in size, into a folder. I need a Windows Service to poll the folder and, once a new file is dropped there, process it and insert it into SQL Server.
What would be the fastest recommended way? ADO.NET with an INSERT per row? What would you recommend?
Use the BULK INSERT feature of SQL Server.
See https://blogs.msdn.microsoft.com/nikhilsi/2008/06/11/bulk-insert-into-sql-from-c-app/
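A sketch of issuing BULK INSERT from the service. Note that the file path is read by the SQL Server machine, not the client, and the table and path names here are assumptions:
// Sketch: run BULK INSERT from C#; the path must be visible to the server.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    @"BULK INSERT dbo.TargetTable
      FROM 'C:\drop\incoming.txt'
      WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')", conn))
{
    cmd.CommandTimeout = 0; // a 600 MB file can take a while
    conn.Open();
    cmd.ExecuteNonQuery();
}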
I would recommend using ADO.NET, but don't INSERT per row. That can be incredibly taxing if you have 600MB of rows in a text file. Instead, create a User Defined Table Type
CREATE TYPE [dbo].[MyTable] AS TABLE
(
[Col1] INT NOT NULL,
[Col2] NVARCHAR(10) NOT NULL,
[Col3] NVARCHAR(250) NOT NULL
)
Then, in C#, create a DataTable.
var dt = new DataTable("MyTable");
dt.Columns.Add("Col1", typeof(int));
dt.Columns.Add("Col2", typeof(string));
dt.Columns.Add("Col3", typeof(string));
Load it using this syntax.
dt.Rows.Add(someVar1, someVar2, someVar3);
Load the data table in batches, and then submit the data table as a parameter to SQL like any other parameter.
command.Parameters.Add(new SqlParameter
{
    ParameterName = "@Values",
    SqlDbType = SqlDbType.Structured,
    TypeName = "dbo.MyTable",
    Value = dt
});
Your stored procedure should be expecting the parameter.
CREATE PROCEDURE [ive].[myProc]
@Values [dbo].[MyTable] READONLY
Then simply use it in your stored procedure like a table. You can wrap the process in a transaction in case a bulk insert fails at any step, and use some other logic to correct the issue.
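For example, a sketch of the transactional call, reusing the names from the snippets above (connectionString and the batched DataTable dt are assumed):
// Sketch: send the batched DataTable inside a transaction so a failed
// bulk insert rolls back cleanly.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var tran = conn.BeginTransaction())
    using (var cmd = new SqlCommand("[ive].[myProc]", conn, tran))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        var p = cmd.Parameters.AddWithValue("@Values", dt);
        p.SqlDbType = SqlDbType.Structured;
        p.TypeName = "dbo.MyTable";
        try
        {
            cmd.ExecuteNonQuery();
            tran.Commit();
        }
        catch
        {
            tran.Rollback();
            throw;
        }
    }
}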

How can I insert 10 million records in the shortest time possible?

I have a file (which has 10 million records) like below:
line1
line2
line3
line4
.......
......
10 million lines
So basically I want to insert 10 million records into the database.
So I read the file and upload it to SQL Server.
C# code
System.IO.StreamReader file =
    new System.IO.StreamReader(@"c:\test.txt");
string line;
while ((line = file.ReadLine()) != null)
{
    // insertion code goes here
    //DAL.ExecuteSql("insert into table1 values(" + line + ")");
}
file.Close();
but insertion will take a long time.
How can I insert 10 million records in the shortest time possible using C#?
Update 1:
Bulk INSERT:
BULK INSERT DBNAME.dbo.DATAs
FROM 'F:\dt10000000\dt10000000.txt'
WITH
(
ROWTERMINATOR =' \n'
);
My Table is like below:
DATAs
(
DatasField VARCHAR(MAX)
)
but I am getting following error:
Msg 4866, Level 16, State 1, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Below code worked:
BULK INSERT DBNAME.dbo.DATAs
FROM 'F:\dt10000000\dt10000000.txt'
WITH
(
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n'
);
Please do not create a DataTable to load via BulkCopy. That is an ok solution for smaller sets of data, but there is absolutely no reason to load all 10 million rows into memory before calling the database.
Your best bet (outside of BCP / BULK INSERT / OPENROWSET(BULK...)) is to stream the contents from the file into the database via a Table-Valued Parameter (TVP). By using a TVP you can open the file, read a row & send a row until done, and then close the file. This method has a memory footprint of just a single row. I wrote an article, Streaming Data Into SQL Server 2008 From an Application, which has an example of this very scenario.
A simplistic overview of the structure is as follows. I am assuming the same import table and field name as shown in the question above.
Required database objects:
-- First: You need a User-Defined Table Type
CREATE TYPE ImportStructure AS TABLE (Field VARCHAR(MAX));
GO
-- Second: Use the UDTT as an input param to an import proc.
-- Hence "Table-Valued Parameter" (TVP)
CREATE PROCEDURE dbo.ImportData (
@ImportTable dbo.ImportStructure READONLY
)
AS
SET NOCOUNT ON;
-- maybe clear out the table first?
TRUNCATE TABLE dbo.DATAs;
INSERT INTO dbo.DATAs (DatasField)
SELECT Field
FROM @ImportTable;
GO
C# app code to make use of the above SQL objects is below. Notice how rather than filling up an object (e.g. DataTable) and then executing the Stored Procedure, in this method it is the executing of the Stored Procedure that initiates the reading of the file contents. The input parameter of the Stored Proc isn't a variable; it is the return value of a method, GetFileContents. That method is called when the SqlCommand calls ExecuteNonQuery, which opens the file, reads a row and sends the row to SQL Server via the IEnumerable<SqlDataRecord> and yield return constructs, and then closes the file. The Stored Procedure just sees a Table Variable, @ImportTable, that can be accessed as soon as the data starts coming over (note: the data does persist for a short time, even if not the full contents, in tempdb).
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using Microsoft.SqlServer.Server;
private static IEnumerable<SqlDataRecord> GetFileContents()
{
SqlMetaData[] _TvpSchema = new SqlMetaData[] {
new SqlMetaData("Field", SqlDbType.VarChar, SqlMetaData.Max)
};
SqlDataRecord _DataRecord = new SqlDataRecord(_TvpSchema);
StreamReader _FileReader = null;
try
{
_FileReader = new StreamReader("{filePath}");
// read a row, send a row
while (!_FileReader.EndOfStream)
{
// You shouldn't need to call "_DataRecord = new SqlDataRecord" as
// SQL Server already received the row when "yield return" was called.
// Unlike BCP and BULK INSERT, you have the option here to create a string
// call ReadLine() into the string, do manipulation(s) / validation(s) on
// the string, then pass that string into SetString() or discard if invalid.
_DataRecord.SetString(0, _FileReader.ReadLine());
yield return _DataRecord;
}
}
finally
{
_FileReader.Close();
}
}
The GetFileContents method above is used as the input parameter value for the Stored Procedure as shown below:
public static void test()
{
SqlConnection _Connection = new SqlConnection("{connection string}");
SqlCommand _Command = new SqlCommand("ImportData", _Connection);
_Command.CommandType = CommandType.StoredProcedure;
SqlParameter _TVParam = new SqlParameter();
_TVParam.ParameterName = "@ImportTable";
_TVParam.TypeName = "dbo.ImportStructure";
_TVParam.SqlDbType = SqlDbType.Structured;
_TVParam.Value = GetFileContents(); // return value of the method is streamed data
_Command.Parameters.Add(_TVParam);
try
{
_Connection.Open();
_Command.ExecuteNonQuery();
}
finally
{
_Connection.Close();
}
return;
}
Additional notes:
With some modification, the above C# code can be adapted to batch the data in.
With minor modification, the above C# code can be adapted to send in multiple fields (the example shown in the "Streaming Data..." article linked above passes in 2 fields).
You can also manipulate the value of each record in the SELECT statement in the proc.
You can also filter out rows by using a WHERE condition in the proc.
You can access the TVP Table Variable multiple times; it is READONLY but not "forward only".
Advantages over SqlBulkCopy:
SqlBulkCopy is INSERT-only whereas using a TVP allows the data to be used in any fashion: you can call MERGE; you can DELETE based on some condition; you can split the data into multiple tables; and so on.
Due to a TVP not being INSERT-only, you don't need a separate staging table to dump the data into.
You can get data back from the database by calling ExecuteReader instead of ExecuteNonQuery. For example, if there was an IDENTITY field on the DATAs import table, you could add an OUTPUT clause to the INSERT to pass back INSERTED.[ID] (assuming ID is the name of the IDENTITY field). Or you can pass back the results of a completely different query, or both since multiple results sets can be sent and accessed via Reader.NextResult(). Getting info back from the database is not possible when using SqlBulkCopy yet there are several questions here on S.O. of people wanting to do exactly that (at least with regards to the newly created IDENTITY values).
For more info on why it is sometimes faster for the overall process, even if slightly slower on getting the data from disk into SQL Server, please see this whitepaper from the SQL Server Customer Advisory Team: Maximizing Throughput with TVP
In C#, the best solution is to let SqlBulkCopy read the file. To do this you need to pass an IDataReader directly to the SqlBulkCopy.WriteToServer method. Here is an example: http://www.codeproject.com/Articles/228332/IDataReader-implementation-plus-SqlBulkCopy
The best way is a mix between your 1st and 2nd solutions: create a DataTable, add rows to it in the loop, and then use SqlBulkCopy to upload it to the DB in one connection.
One other thing to pay attention to: bulk copy is a very sensitive operation where almost any mistake will void the copy. For example, if you declare the column name in the DataTable as "text" and in the DB it is "Text", it will throw an exception. Good luck.
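One way to take the name matching out of the equation is to declare the column mappings explicitly; a sketch, with table and column names assumed:
// Sketch: explicit mappings make SqlBulkCopy independent of column order
// and exact-name matching between the DataTable and the destination.
using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.MyTable"; // hypothetical
    bulk.ColumnMappings.Add("text", "Text");   // source column -> destination column
    bulk.WriteToServer(dataTable);
}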
If you want to insert 10 million records in the shortest time just to generate test data directly with a SQL query, you can use this:
CREATE TABLE TestData(ID INT IDENTITY (1,1), CreatedDate DATETIME)
GO
INSERT INTO TestData(CreatedDate) SELECT GetDate()
GO 10000000

passing parameter of list of values and tables to stored procedure

I'm at the point of implementing a C# application that needs to consume already existing stored procedures that receive IDs or values in params.
My task in charge is in two steps:
1- migrate the stored procedures in order to receive a list of IDs (int) and a list of the current params, meaning like a table
2- implement the layer that calls these procedures and will receive a List and KeyValuePair
What would be the best approach to do this?
Entity Framework to wrap the SPs, or no ORM at all?
How do I implement List and KeyValuePair params on the SP side? With table-valued params?
I'm on SQL Server 2012.
Thanks,
Try the user-defined table type functionality on the SQL side and pass a table as a parameter to the stored procedure.
For example:
CREATE TABLE Test
(
Id int NOT NULL IDENTITY (1, 1),
TestName varchar(50) NOT NULL,
Value int NULL
) ON [PRIMARY]
-- Create a table data type
CREATE TYPE [dbo].[TestType] As Table
(
--This type has structure similar to the DB table
TestName varchar(50) NOT NULL,
Value int NULL
)
--This is the Stored Procedure
CREATE PROCEDURE [dbo].[TestProcedure]
(
@Test As [dbo].[TestType] Readonly
)
AS
Begin
Insert Into Test(TestName,Value)
Select TestName, Value From @Test
End
C# code passing the data as follows:
DataTable dataTable = new DataTable("SampleDataType");
// We create column names as per the type in DB
dataTable.Columns.Add("TestName", typeof(string));
dataTable.Columns.Add("Value", typeof(Int32));
// And fill in some values
dataTable.Rows.Add("Metal", 99);
dataTable.Rows.Add("HG", null);
...
SqlParameter parameter = new SqlParameter();
// The parameter for the SP must be of SqlDbType.Structured
parameter.ParameterName = "@Test";
parameter.SqlDbType = System.Data.SqlDbType.Structured;
parameter.Value = dataTable;
command.Parameters.Add(parameter);
I dealt with this same issue just recently. The links in the comments above lay out how to do SPs with table valued parameters. I've used the TVP method and it was easy and clean.
When it comes to Entity Framework, you can make EF aware of the Stored Procedures and call them and get the results back into EF Objects. Here's a link:
http://msdn.microsoft.com/en-us/data/gg699321.aspx
It's quite a bit more work than just calling the SPs with ADO. A major consideration is whether the results returned by the SP map directly onto one of your objects. Suppose you're joining a couple of tables in a search. You'd have to make a new model for those results and map the SP to that model and for what? So everything will run slower.
If you're just reading data and the results don't map exactly to an existing model you should skip EF and use ADO directly. If, OTOH, you're doing reads and writes and you really want to keep everything in EF for consistency's sake, it is possible.
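For the read-only case, a minimal EF6-style sketch of mapping raw SP results onto a plain class (the context, proc, and result type names are hypothetical):
// Sketch: EF6's Database.SqlQuery<T> maps SP result columns onto a POCO by name.
public class SearchResult
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// ...

using (var context = new MyDbContext()) // hypothetical DbContext
{
    List<SearchResult> results = context.Database
        .SqlQuery<SearchResult>("EXEC dbo.SearchThings @term",
            new SqlParameter("@term", searchTerm))
        .ToList();
}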

How to send a big array to a stored procedure

I need to send several lists (about 20,000 id's) to a stored procedure, like this:
1391924, 6546510, 7419635, 6599910, 6546888, 1116510, 6546720, ...
I have this data in a List<int>.
How can I send this list to a stored procedure?
And then I need to insert the id's list into a temporary table.
You can use: Table-Valued Parameters
Table-valued parameters are a new parameter type in SQL Server 2008. Table-valued parameters are declared by using user-defined table types. You can use table-valued parameters to send multiple rows of data to a Transact-SQL statement or a routine, such as a stored procedure or function, without creating a temporary table or many parameters.
Table-valued parameters are like parameter arrays in OLE DB and ODBC, but offer more flexibility and closer integration with Transact-SQL. Table-valued parameters also have the benefit of being able to participate in set-based operations.
Eg.:
SQL Server:
Create Table-Valued Parameters:
CREATE TYPE IDsTableType AS TABLE
(
[Product] [varchar](10) NOT NULL
)
Pass it to Stored Procedure:
CREATE PROCEDURE GetIDs
(
@TableVariable IDsTableType READONLY
)
AS
BEGIN
-- Do Something
END
GO
C# Code for passing table valued parameter to Stored Procedure:
DataTable dataTable = GetData();
// Configure the SqlCommand and SqlParameter.
SqlCommand cmd = new SqlCommand("GetIDs", connection);
cmd.CommandType = CommandType.StoredProcedure;
SqlParameter tvpParam = cmd.Parameters.AddWithValue("@TableVariable", dataTable);
tvpParam.SqlDbType = SqlDbType.Structured;
Refer: Passing table-valued parameter data to a stored procedure
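To get the List<int> from the question into that DataTable shape, a sketch (the "Product" column comes from IDsTableType above; the id values and connectionString are assumed):
// Sketch: copy the id list into a one-column DataTable matching IDsTableType.
var ids = new List<int> { 1391924, 6546510, 7419635 /* ...about 20,000 ids */ };
var table = new DataTable();
table.Columns.Add("Product", typeof(string)); // IDsTableType declares varchar(10)
foreach (int id in ids)
    table.Rows.Add(id.ToString());

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("GetIDs", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    var p = cmd.Parameters.AddWithValue("@TableVariable", table);
    p.SqlDbType = SqlDbType.Structured;
    conn.Open();
    cmd.ExecuteNonQuery();
}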
If you write the array into a DataTable, you can bulk insert that DataTable into a table in the database. Then the stored proc could just read from that table.
