I need to send a large list (about 20,000 IDs) to a stored procedure, like this:
1391924, 6546510, 7419635, 6599910, 6546888, 1116510, 6546720, ...
I have this data in a List<int>.
How can I send this list to a stored procedure?
I then need to insert the list of IDs into a temporary table.
You can use: Table-Valued Parameters
Table-valued parameters are a new parameter type in SQL Server 2008. Table-valued parameters are declared by using user-defined table types. You can use table-valued parameters to send multiple rows of data to a Transact-SQL statement or a routine, such as a stored procedure or function, without creating a temporary table or many parameters.
Table-valued parameters are like parameter arrays in OLE DB and ODBC, but offer more flexibility and closer integration with Transact-SQL. Table-valued parameters also have the benefit of being able to participate in set-based operations.
Eg.:
SQL Server:
Create Table-Valued Parameters:
CREATE TYPE IDsTableType AS TABLE
(
[Product] [varchar](10) NOT NULL
)
Pass it to Stored Procedure:
CREATE PROCEDURE GetIDs
(
@TableVariable IDsTableType READONLY
)
AS
BEGIN
--Do Something
END
GO
C# Code for passing table valued parameter to Stored Procedure:
DataTable dataTable = GetData();
// Configure the SqlCommand and SqlParameter.
SqlCommand cmd = new SqlCommand("GetIDs", connection);
cmd.CommandType = CommandType.StoredProcedure;
SqlParameter tvpParam = cmd.Parameters.AddWithValue(
    "@TableVariable", dataTable);
tvpParam.SqlDbType = SqlDbType.Structured;
Refer: Passing table-valued parameter data to a stored procedure
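To connect this back to the original question (a List<int> of IDs), a minimal sketch might look like the following. It assumes the table type has a single int column named [ID]; match the column name and type to however you actually declared the type.

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Sketch: pass a List<int> of IDs via a table-valued parameter.
List<int> ids = new List<int> { 1391924, 6546510, 7419635 };

// Build a DataTable whose shape matches the user-defined table type
DataTable idTable = new DataTable();
idTable.Columns.Add("ID", typeof(int));
foreach (int id in ids)
{
    idTable.Rows.Add(id);
}

using (SqlConnection connection = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("GetIDs", connection))
{
    cmd.CommandType = CommandType.StoredProcedure;
    SqlParameter tvpParam = cmd.Parameters.AddWithValue("@TableVariable", idTable);
    tvpParam.SqlDbType = SqlDbType.Structured;
    connection.Open();
    cmd.ExecuteNonQuery();
}
```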
If you write the array into a DataTable, you can bulk insert the DataTable into the database. Then the stored proc could just read from that table.
Related
I have a datatable DT with 25 columns. I want to insert that data from DT into a SQL Server table SQDT with the same columns. I'm using an INSERT query for now.
Is there a better way to approach this problem instead of using INSERT on each record? Can I bulk insert into the database using a stored procedure or some other efficient way? It takes a lot of time to insert as DT has 130000 tuples.
Yes, there is a better way. You need to create a type in SQL Server with the same definition as your DataTable.
Here you can use a user-defined table type:
CREATE TYPE [dbo].[MyTable] AS TABLE(
[Id] int NOT NULL
)
Define the parameter in the SP:
CREATE PROCEDURE [dbo].[InsertTable]
@myTable MyTable readonly
AS
BEGIN
insert into [dbo].Records select * from @myTable
END
and send your DataTable as a parameter
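A sketch of that last step (the connection string and the destination table [dbo].Records are taken from the snippets above; the sample ID values are assumptions):

```csharp
using System.Data;
using System.Data.SqlClient;

// Build a DataTable whose columns match the [dbo].[MyTable] type
DataTable myTable = new DataTable();
myTable.Columns.Add("Id", typeof(int));
myTable.Rows.Add(1391924);
myTable.Rows.Add(6546510);

using (SqlConnection connection = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("[dbo].[InsertTable]", connection))
{
    cmd.CommandType = CommandType.StoredProcedure;
    // A table-valued parameter must be Structured and name its type
    SqlParameter p = cmd.Parameters.AddWithValue("@myTable", myTable);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.MyTable";
    connection.Open();
    cmd.ExecuteNonQuery();
}
```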
You can use the built-in SqlBulkCopy class in C#.
SqlBulkCopy can be used for bulk inserts in batches, which worked efficiently for me:
https://programmingwithmosh.com/net/using-sqlbulkcopy-for-fast-inserts/
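A minimal SqlBulkCopy sketch (the destination table name and batch size are assumptions; dataTable is assumed to have the same column layout as the destination table):

```csharp
using System.Data;
using System.Data.SqlClient;

using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "dbo.Records";
        bulkCopy.BatchSize = 5000;          // rows sent per batch
        bulkCopy.WriteToServer(dataTable);  // accepts DataTable, DataRow[] or IDataReader
    }
}
```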
I am very new to ADO.NET and SQL. I was playing around with Oracle and ADO.NET and have run into a problem that I feel is not documented well.
I have a stored procedure that uses a custom type called NUM_ARRAY, which is defined as follows:
create or replace
TYPE NUM_ARRAY
AS TABLE OF NUMBER(38, 0);
and my stored procedure is as follows:
PROCEDURE SAMPLE_PROCEDURE(
SAMPLE_ARRAY IN NUM_ARRAY)
AS
BEGIN
UPDATE RETURNLIST_PICKLIST_MAPPING
SET PICKLIST_ID = 5555555
WHERE RETURNLIST_ID
IN (SELECT * FROM TABLE(SAMPLE_ARRAY));
END SAMPLE_PROCEDURE;
I am trying to invoke this stored procedure using the following .NET code
idbInterface.Open();
idbInterface.AddParameters("RETURN_LIST_ID_IN", returnListId);
idbInterface.Parameters["RETURN_LIST_ID_IN"].DbType = System.Data.DbType.Int16;
idbInterface.Parameters["RETURN_LIST_ID_IN"].Direction = ParameterDirection.Input;
idbInterface.AddParameters("Cur_Picklist_report", null);
idbInterface.Parameters["Cur_Picklist_report"].DbType = System.Data.DbType.Object;
idbInterface.Parameters["Cur_Picklist_report"].Direction = ParameterDirection.Output;
DataSet dsPickListReportData= idbInterface.ExecuteDataSet(CommandType.StoredProcedure, procedureName);
Now the problem is that I am getting a "wrong number or type of arguments" error.
How can I send a datatype of type NUM_ARRAY from ADO.NET code?
I'm at the point of implementing a C# application that needs to consume already existing stored procedures that receive IDs or values in params.
My task is in two steps:
1- migrate the stored procedures so they receive a list of IDs (int) and a list of the current params, i.e. like a table
2- implement the layer that calls these procedures and will receive List and KeyValuePair parameters
What would be the best approach to do this?
Entity Framework to wrap the SPs, or no ORM at all?
How do I implement List and KeyValuePair params on the SP side? With table-valued params?
I'm on SQL Server 2012.
thanks,
On the SQL side, try the user-defined table type functionality and pass a table as a parameter to the stored procedure.
For example:
CREATE TABLE Test
(
Id int NOT NULL IDENTITY (1, 1),
TestName varchar(50) NOT NULL,
Value int NULL
) ON [PRIMARY]
-- Create a table data type
CREATE TYPE [dbo].[TestType] As Table
(
--This type has structure similar to the DB table
TestName varchar(50) NOT NULL,
Value int NULL
)
--This is the Stored Procedure
CREATE PROCEDURE [dbo].[TestProcedure]
(
@Test As [dbo].[TestType] Readonly
)
AS
Begin
Insert Into Test(TestName,Value)
Select TestName, Value From @Test
End
C# code passing the data as follows:
DataTable dataTable = new DataTable("SampleDataType");
// We create column names as per the type in DB
dataTable.Columns.Add("TestName", typeof(string));
dataTable.Columns.Add("Value", typeof(Int32));
// And fill in some values
dataTable.Rows.Add("Metal", 99);
dataTable.Rows.Add("HG", null);
...
// Create the command for the stored procedure
// (the connection object is assumed to exist already)
SqlCommand command = new SqlCommand("[dbo].[TestProcedure]", connection);
command.CommandType = CommandType.StoredProcedure;
SqlParameter parameter = new SqlParameter();
// The parameter for the SP must be of SqlDbType.Structured
parameter.ParameterName = "@Test";
parameter.SqlDbType = System.Data.SqlDbType.Structured;
parameter.Value = dataTable;
command.Parameters.Add(parameter);
I dealt with this same issue just recently. The links in the comments above lay out how to do SPs with table valued parameters. I've used the TVP method and it was easy and clean.
When it comes to Entity Framework, you can make EF aware of the Stored Procedures and call them and get the results back into EF Objects. Here's a link:
http://msdn.microsoft.com/en-us/data/gg699321.aspx
It's quite a bit more work than just calling the SPs with ADO. A major consideration is whether the results returned by the SP map directly onto one of your objects. Suppose you're joining a couple of tables in a search: you'd have to make a new model for those results and map the SP to that model, and for what? Everything will just run slower.
If you're just reading data and the results don't map exactly to an existing model you should skip EF and use ADO directly. If, OTOH, you're doing reads and writes and you really want to keep everything in EF for consistency's sake, it is possible.
I am trying to store rows in a SQL Server 2008 table for a junkyard, holding documentation on all vehicles; I will run this program once a month, when I receive a list with information for all vehicles. I know I can write a text file and do a "bulk insert" from a file.
However, I just wondered if there is a way to insert information stored in string arrays directly into SQL Server.
Right now I am doing a for loop and running 500 query commands to insert them; for 500 records the whole process takes only about 4 seconds.
I would like to know if there is a better way to insert the information from the arrays directly, without using a for loop 500 times?
The code below works perfectly fine for me; however, I would not like to use that kind of spaghetti code if there is a better way to do it. Thanks in advance!
for (int i = 0; i < 500; i++)
{
con.Open();
SqlCommand myCommand = new SqlCommand("INSERT INTO [TESTCATALOG].[dbo].[TITLES] VALUES ('" + TitleNum[i] + "','" + VIN[i] + "','" + DateIssued[i] + "')", con);
myCommand.ExecuteReader();
con.Close();
}
You can use table valued parameters (TVP) to insert lists directly through a stored procedure.
Table-valued parameters are a new parameter type in SQL Server 2008. Table-valued parameters are declared by using user-defined table types. You can use table-valued parameters to send multiple rows of data to a Transact-SQL statement or a routine, such as a stored procedure or function, without creating a temporary table or many parameters.
This will also avoid the SQL Injection vulnerability currently present in your code.
See Table-Valued Parameters in SQL Server 2008 (ADO.NET) on MSDN to see how to call a stored procedure with a TVP with C#:
SqlCommand insertCommand = new SqlCommand(sqlInsert, connection);
SqlParameter tvpParam = insertCommand.Parameters.AddWithValue(
    "@tvpNewCategories",
    addedCategories);
tvpParam.SqlDbType = SqlDbType.Structured;
tvpParam.TypeName = "dbo.CategoryTableType";
Yes, you can use LINQ to SQL or LINQ to Entities to achieve this in a single transaction.
You could use the SqlBulkCopy class. The SqlBulkCopy class requires your data to be available as a DataRow[], a DataTable, or an IDataReader.
From SQL Server 2008 onwards, you can use a table-valued parameter.
See "Passing a Table-Valued Parameter to a Parameterized SQL Statement" in MSDN.
You can use a DataAdapter for insert or update.
First add the rows to a DataTable in your for loop, then call DataAdapter.Update with the DataTable.
How do I call stored procedures in bulk? I would like to do something like a bulk copy.
All that the stored procedure does is 8 selects for unique constraints and 8 inserts, with no return value.
You cannot do that.
Bulk copy is a firehose dump of data into a table; you cannot call sprocs or do anything else beyond dumping the rows into an existing table.
What you can do, however, is dump the data using bulk copy into a temporary table with the right structure, and then afterwards call your sproc that moves that data into the real tables, possibly by modifying existing data instead of inserting it, or whatnot.
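A sketch of that pattern (the staging table and sproc names are assumptions, and sourceDataTable is assumed to match the staging table's structure):

```csharp
using System.Data;
using System.Data.SqlClient;

using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();

    // 1. Firehose the data into a staging table with the right structure
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "dbo.StagingTable";
        bulkCopy.WriteToServer(sourceDataTable);
    }

    // 2. Let a sproc move/merge the staged rows into the real tables
    using (SqlCommand cmd = new SqlCommand("dbo.uspProcessStagedRows", connection))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.ExecuteNonQuery();
    }
}
```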
If you are using SQL Server 2008, then Table-Valued Parameters is a viable option.
First, you create a user-defined table type containing all of your expected columns and data types on the SQL Server side:
create type dbo.MyTableType as table
(
foo int,
bar varchar(100)
);
then use the above as the table type parameter for your stored procedure:
create procedure uspInsertMyBulkData
(
@myTable dbo.MyTableType readonly
)
as
/* now in here you can use the multi-row data from the passed-in table
parameter, @myTable, to do your selects and inserts */
Then, on the C#/.NET client side, call this stored procedure via ADO.NET and pass in either a DataTable, an object that inherits from DbDataReader (such as DataTableReader), or an object of type IEnumerable<SqlDataRecord>:
// create my source DataTable
object [] row1 = {1, "a"};
object [] row2 = {2, "b"};
var myDataTable = new DataTable();
myDataTable.Columns.Add(new DataColumn("foo"));
myDataTable.Columns.Add(new DataColumn("bar"));
myDataTable.LoadDataRow(row1, true);
myDataTable.LoadDataRow(row2, true);
// bulk send data to database
var conn = new SqlConnection(connectionString);
var cmd = new SqlCommand("uspInsertMyBulkData", conn)
{
CommandType = CommandType.StoredProcedure
};
SqlParameter param = cmd.Parameters.AddWithValue("@myTable", myDataTable);
param.SqlDbType = SqlDbType.Structured;
conn.Open(); // the connection must be open before executing
cmd.ExecuteNonQuery();
If you want to bulk load data into a table (inserts), the SqlBulkCopy class is the way to go.
Alternatively, you can use the SqlDataAdapter. Set the InsertCommand to the stored procedure that will perform an insert, and map the datatable fields to the sproc parameters. If you have updated records in the datatable, you can also specify an UpdateCommand which will be fired for each updated row. Then call the Update method on the SqlDataAdapter passing it the datatable. You can set the UpdateBatchSize property to define how many records to send to the db in each roundtrip.
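A sketch of that SqlDataAdapter setup (the sproc name, parameter names, and batch size are assumptions; dataTable is assumed to hold the added rows):

```csharp
using System.Data;
using System.Data.SqlClient;

var adapter = new SqlDataAdapter();

var insert = new SqlCommand("dbo.uspInsertRow", connection)
{
    CommandType = CommandType.StoredProcedure,
    // Required when batching: don't map any result back onto the row
    UpdatedRowSource = UpdateRowSource.None
};
// Map DataTable columns onto the sproc's parameters
insert.Parameters.Add("@Id", SqlDbType.Int, 0, "Id");
insert.Parameters.Add("@Value", SqlDbType.VarChar, 50, "Value");
adapter.InsertCommand = insert;
adapter.UpdateBatchSize = 100; // rows sent to the db per round trip

// Fires InsertCommand for each row in the Added state
adapter.Update(dataTable);
```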
SQL Server stored procedures can accept XML, so you could prepare your bulk data as an XML document and pass it to a special-purpose stored procedure, which would then call your original stored procedure for each row. You'd need the OPENXML function.
I hesitate to recommend the xml features of SqlServer, but this may be a case where they are appropriate.
I'm not saying that I recommend it, but you could put an insert trigger on the table you are bulk copying into that inserts into those 8 separate tables instead of the original one. You may need to have a tempdb big enough to store all the data though...
CREATE TRIGGER TRG_REPLACETRIGGER
ON BULK_TABLE
INSTEAD OF INSERT
AS BEGIN
INSERT TABLE1 (ID, VALUE) SELECT ID, VALUE1 FROM INSERTED
INSERT TABLE2 (ID, VALUE) SELECT ID, VALUE2 FROM INSERTED
-- ... TABLE3-7
INSERT TABLE8 (ID, VALUE) SELECT ID, VALUE8 FROM INSERTED
END