Sending a DataTable as a parameter to stored procedure - c#

I'm trying to send a DataTable to a stored procedure using C#, .NET 2.0 and SQL Server 2012 Express.
This is roughly what I'm doing:
//define the DataTable
var accountIdTable = new DataTable("[dbo].[TypeAccountIdTable]");
//define the column
var dataColumn = new DataColumn { ColumnName = "[ID]", DataType = typeof(Guid) };
//add the column to the DataTable
accountIdTable.Columns.Add(dataColumn);
//feed it with the unique contact ids
foreach (var uniqueId in uniqueIds)
{
    accountIdTable.Rows.Add(uniqueId);
}
using (var sqlCmd = new SqlCommand())
{
    //define command details
    sqlCmd.CommandType = CommandType.StoredProcedure;
    sqlCmd.CommandText = "[dbo].[msp_Get_Many_Profiles]";
    sqlCmd.Connection = dbConn; //an open database connection
    //define the parameter
    var sqlParam = new SqlParameter();
    sqlParam.ParameterName = "@tvp_account_id_list";
    sqlParam.SqlDbType = SqlDbType.Structured;
    sqlParam.Value = accountIdTable;
    //add the parameter to the command
    sqlCmd.Parameters.Add(sqlParam);
    //execute the procedure
    rResult = sqlCmd.ExecuteReader();
    //print the results
    while (rResult.Read())
    {
        PrintRowData(rResult);
    }
}
But then I get the following error:
ArgumentOutOfRangeException: No mapping exists from SqlDbType Structured to a known DbType.
Parameter name: SqlDbType
Upon investigating further (on MSDN, SO and elsewhere), it appears that .NET 2.0 does not support sending a DataTable to the database (it lacks things such as SqlParameter.TypeName), but I'm still not sure, since I haven't seen anyone explicitly state that this feature is unavailable in .NET 2.0.
Is this true?
If so, is there another way to send a collection of data to the database?
Thanks in advance!

Out of the box, ADO.NET does not support this, with good reason. A DataTable could have just about any number of columns, which may or may not map to a real table in your database.
If I'm understanding what you want to do - upload the contents of a DataTable quickly to a pre-defined, real table with the same structure - I'd suggest you investigate SqlBulkCopy.
From the documentation:
Microsoft SQL Server includes a popular command-prompt utility named
bcp for moving data from one table to another, whether on a single
server or between servers. The SqlBulkCopy class lets you write
managed code solutions that provide similar functionality. There are
other ways to load data into a SQL Server table (INSERT statements,
for example), but SqlBulkCopy offers a significant performance
advantage over them.
The SqlBulkCopy class can be used to write data only to SQL Server
tables. However, the data source is not limited to SQL Server; any
data source can be used, as long as the data can be loaded to a
DataTable instance or read with an IDataReader instance.
SqlBulkCopy will fail when bulk loading a DataTable column of type
SqlDateTime into a SQL Server column whose type is one of the
date/time types added in SQL Server 2008.
However, you can define table-valued parameters in later versions of SQL Server and use them to send a table (DataTable) in the way you're asking. There's an example at http://sqlwithmanoj.wordpress.com/2012/09/10/passing-multipledynamic-values-to-stored-procedures-functions-part4-by-using-tvp/
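To make the SqlBulkCopy route concrete, a minimal sketch could look like this; SqlBulkCopy is available in .NET 2.0, and the destination table dbo.AccountIds (with a single uniqueidentifier ID column) is just an assumed name here - the stored procedure would then read from that table instead of taking a TVP:
//minimal sketch: bulk-copy the ids into a real (or staging) table first
var accountIdTable = new DataTable();
accountIdTable.Columns.Add("ID", typeof(Guid));
foreach (var uniqueId in uniqueIds)
{
    accountIdTable.Rows.Add(uniqueId);
}

using (var bulkCopy = new SqlBulkCopy(dbConn)) //dbConn: the open connection from the question
{
    bulkCopy.DestinationTableName = "dbo.AccountIds"; //assumed table name
    bulkCopy.ColumnMappings.Add("ID", "ID");
    bulkCopy.WriteToServer(accountIdTable);
}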

In my experience, if the code compiles in C#, ADO.NET supports that type; if it then fails when you execute it, the target database might not support it. In your case you mention SQL Server 2012 Express, so check whether your database's compatibility level supports it. User-defined table types have been supported since SQL Server 2008 as far as I understand, but you have to keep the database compatibility level at 100 or higher. I am 100% positive it works in SQL Server 2008, because I have used it, and still use it extensively, to do bulk updates through stored procedures using user-defined table types (a.k.a. UDTTs) as the in-parameter for the stored procedure. Again, you must keep the database compatibility level at 100 or higher to use the MERGE command for bulk updates.
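To sketch what the UDTT route looks like from the C# side (this needs .NET 3.5 or later, where SqlParameter.TypeName exists; the type, procedure and table names below are made up for illustration):
//T-SQL assumed to already exist on the server (hypothetical names):
//  CREATE TYPE dbo.AccountIdList AS TABLE (ID uniqueidentifier NOT NULL);
//  CREATE PROCEDURE dbo.msp_Upsert_Accounts @ids dbo.AccountIdList READONLY
//  AS
//      MERGE dbo.Accounts AS t
//      USING @ids AS s ON t.ID = s.ID
//      WHEN NOT MATCHED THEN INSERT (ID) VALUES (s.ID);
using (var cmd = new SqlCommand("dbo.msp_Upsert_Accounts", dbConn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    var p = cmd.Parameters.Add("@ids", SqlDbType.Structured);
    p.TypeName = "dbo.AccountIdList"; //this property does not exist in .NET 2.0
    p.Value = accountIdTable;         //the DataTable built in the question
    cmd.ExecuteNonQuery();
}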
And of course you can use SqlBulkCopy, but I'm not sure how reliable it is; that depends on the …

Related

Compare the data in two tables, one is Oracle, the other is SQL Server

I have two databases, one is Oracle 11g and the other is in Sql Server 2012. There is a table in the Oracle database that contains a column called BASE_ID. This is comprised of work order numbers.
In the Sql Server database, there is also a table that contains a column called WorkOrders. This also contains work order numbers. What I need to do is compare what is in the Oracle column to what is in the SQL Server column, and display what is in Oracle but not in Sql Server. I am a little stumped here, even for a starting point. I am kind of just taking pot shots. This mess is what I have so far:
private void compare()
{
    try
    {
        if (con.State != ConnectionState.Open)
            con.Open();
        DataTable t = new DataTable();
        OracleCommand oraCmd = new OracleCommand("SELECT BASE_ID FROM WORK_ORDER x WHERE NOT EXISTS(SELECT NULL FROM [SQLServer].[BACRTest].[WorkOrders] y WHERE y.WorkOrder = x.BASE_ID)", oraconn);
        OracleDataAdapter dt = new OracleDataAdapter(oraCmd);
        dt.Fill(t);
        dataGridView3.DataSource = t;
    }
    catch (Exception e)
    {
        MessageBox.Show(e.Message);
    }
}
This just throws an ORA-00903: invalid table name error, which is fair enough because I wasn't really expecting it to work.
Any suggestions would be very much appreciated.
As noted above, if you have the correct products installed (e.g. Oracle's Database Gateway for SQL Server, or Database Gateway for ODBC) you can make a direct connection between Oracle and SQL Server. If you don't have such products you can try exporting the SQL Server table to a comma-separated values (.csv) file, then set up an external table in Oracle to read the .csv. Not perhaps the most seamless method in the world, but it does represent one way to handle this.
Best of luck.
Because the entries are in different DBMSs, you cannot make a single query for this directly, so most likely you will need to store the entries in memory and compare them one by one. Maybe you could partition the tables into small sets and compare them in batches.
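If you end up doing it in memory, a rough sketch could be something like this (the SQL Server connection string is a placeholder, and I'm assuming oraconn is already open and that comparing the IDs as strings is acceptable):
//pull the work order ids from each side and diff them in memory
var oracleIds = new HashSet<string>();
using (var oraCmd = new OracleCommand("SELECT BASE_ID FROM WORK_ORDER", oraconn)) //oraconn: open OracleConnection
using (var oraReader = oraCmd.ExecuteReader())
{
    while (oraReader.Read())
        oracleIds.Add(oraReader.GetValue(0).ToString());
}

var sqlIds = new HashSet<string>();
using (var sqlConn = new SqlConnection(sqlConnectionString)) //placeholder connection string
using (var sqlCmd = new SqlCommand("SELECT WorkOrder FROM dbo.WorkOrders", sqlConn))
{
    sqlConn.Open();
    using (var sqlReader = sqlCmd.ExecuteReader())
    {
        while (sqlReader.Read())
            sqlIds.Add(sqlReader.GetValue(0).ToString());
    }
}

//work orders present in Oracle but not in SQL Server
oracleIds.ExceptWith(sqlIds);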

using LINQ on SQL Server with a linked server to an Oracle DB

I was wondering whether I can execute LINQ on an SQL server to speed up my query.
I will make a simple example. Currently I use this to fill my datatable:
using (var connection = new SqlConnection())
using (var da = new SqlDataAdapter())
using (da.SelectCommand = connection.CreateCommand())
{
    da.SelectCommand.CommandText = newcmd;
    da.SelectCommand.Connection.ConnectionString = connstring;
    da.SelectCommand.CommandTimeout = 0;
    DataTable ds = new DataTable(); //conn is opened by the data adapter
    da.Fill(ds);
}
with this command:
newcmd = "select * from openquery("LinkedServer", 'select * FROM tbl_oracle p ')";
And then, once I have the data in the DataTable, I use LINQ to manipulate the data as I see fit. However, this means I have to transfer the entire table!
Since this returns a lot of data in the real query, the query below (a simple sum example) turns out to be much faster (mainly because of interface/transfer rates).
newcmd = "select * from openquery("LinkedServer", 'select p.timestep, SUM (p.position)
FROM tbl_oracle p GROUP BY p.timestep ')";
Obviously in reality the data manipulation is more complex. So my question:
Can I somehow use LINQ on the Oracle DB, or on the linked server on SQL Server, and execute it on the server so that the data manipulation is done before the data is transferred to the desktop? I would really like the power of LINQ without transferring all the raw data.
UPDATE
I set up a view in SQL Server Management Studio over the linked Oracle server, as suggested in the answer below. I then ran a very simple query:
select * from view where ID=1
with an execution plan, and it shows that the entire Oracle table is scanned first (remote scan, 100% cost); the query is not executed on the Oracle server. The same query executes in a split second via OPENQUERY. This makes the approach unusable due to the size of the data involved. Any other suggestions would be appreciated.
You can create views on your tables of interest in SQL Server, and use EF or LINQ to SQL on those views. That way, the query will be transferred to the Oracle server.
EF and LINQ to SQL don't support specifying the server part of the fully qualified name of a table. But if you create a view like this:
create view MyView as SELECT * FROM LinkedServer.Database.Schema.Table
you can work on MyView as if it was a table in your local server, and the resulting SQL query will be executed directly on the linked Oracle server.
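As a rough illustration (MyDataContext and MyView here are hypothetical names for the LINQ to SQL context and entity generated over that view), the sum example from the question would become:
//hypothetical LINQ to SQL context mapped over MyView
using (var db = new MyDataContext(connstring))
{
    var sums = db.MyViews
        .GroupBy(p => p.Timestep)
        .Select(g => new { Timestep = g.Key, Position = g.Sum(p => p.Position) })
        .ToList(); //the GROUP BY / SUM is translated to SQL and run on the server
}
Whether that SQL is then pushed down to the Oracle side, or the whole remote table is scanned first, depends on the linked-server provider - which is exactly the limitation reported in the UPDATE above.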

Handle multiple DB updates from C# in SQL Server 2008

I'd like to find a way to handle multiple updates to a SQL database (with one single DB round trip). I read about table-valued parameters in SQL Server 2008 (http://www.codeproject.com/KB/database/TableValueParameters.aspx), which seem really useful. But it seems I need to create both a stored procedure and a table type to use them. Is that true? Perhaps due to security? I would like to simply run a text query like this:
var sql = "INSERT INTO Note (UserId, note) SELECT * FROM #myDataTable";
var myDataTable = ... some System.Data.DataTable ...
var cmd = new System.Data.SqlClient.SqlCommand(sql, conn);
var param = cmd.Parameters.Add("#myDataTable", System.Data.SqlDbType.Structured);
param.Value=myDataTable;
cmd.ExecuteNonQuery();
So
A) do I have to create both a stored procedure and a table type to use TVP's? and
B) what alternative method is recommended to send multiple updates (and inserts) to SQL Server?
Yes, you need to create the types.
Alternatives are sending a big SQL string batch or passing XML to sprocs.
The downside to big SQL string batches is that they can bloat the SQL plan cache and might cause SQL to recompile, especially if the batch is unique because the input data is part of that large string. By definition, each batch would be unique.
XML was the main alternative before TVPs. The one downside to XML is that, for at least a while, SQL Azure didn't support it (that might have changed?), so it limits your options.
TVPs seem to be the way to do this. Our project just converted to using TVPs.
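For what it's worth, once the table type exists the snippet from the question only needs a TypeName; the type name below is made up, and as far as I know the type is the only server-side object you strictly need (a parameterized text command can consume the TVP directly, no stored procedure required):
//one-time setup on the server (hypothetical type name):
//  CREATE TYPE dbo.NoteTableType AS TABLE (UserId int NOT NULL, Note nvarchar(max) NOT NULL);
var sql = "INSERT INTO Note (UserId, Note) SELECT UserId, Note FROM @myDataTable";
using (var cmd = new System.Data.SqlClient.SqlCommand(sql, conn))
{
    var param = cmd.Parameters.Add("@myDataTable", System.Data.SqlDbType.Structured);
    param.TypeName = "dbo.NoteTableType"; //tells SQL Server the shape of the TVP
    param.Value = myDataTable;            //DataTable columns must match the type
    cmd.ExecuteNonQuery();
}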
Hope that helps.

SqlBulkCopy calculated field

I am working on moving a database from MS Access to SQL Server. To move the data into the new tables, I have decided to write a sync routine, as the schema has changed quite significantly; it lets me run testing on programs that run off it and resync whenever I need new test data. Then eventually I will do one last sync and go live on the new SQL Server version.
Unfortunately I have hit a snag. My method for copying from Access to SQL Server is below:
public static void BulkCopyAccessToSQLServer(
    string sql, CommandType commandType, DBConnection sqlServerConnection,
    string destinationTable, DBConnection accessConnection, int timeout)
{
    using (DataTable dt = new DataTable())
    using (OleDbConnection conn = new OleDbConnection(GetConnection(accessConnection)))
    using (OleDbCommand cmd = new OleDbCommand(sql, conn))
    using (OleDbDataAdapter adapter = new OleDbDataAdapter(cmd))
    {
        cmd.CommandType = commandType;
        cmd.Connection.Open();
        adapter.SelectCommand.CommandTimeout = timeout;
        adapter.Fill(dt);
        using (SqlConnection conn2 = new SqlConnection(GetConnection(sqlServerConnection)))
        using (SqlBulkCopy copy = new SqlBulkCopy(conn2))
        {
            conn2.Open();
            copy.DestinationTableName = destinationTable;
            copy.BatchSize = 1000;
            copy.BulkCopyTimeout = timeout;
            copy.WriteToServer(dt);
            copy.NotifyAfter = 1000;
        }
    }
}
Basically this queries Access for the data using the input SQL string, which has all the correct field names, so I don't need to set column mappings.
This was working until I reached a table with a calculated field. SqlBulkCopy doesn't seem to know to skip the field and tries to update the column, which fails with the error "The column 'columnName' cannot be modified because it is either a computed column or is the result of a union operator."
Is there an easy way to make it skip the calculated field?
I am hoping not to have to specify a full column mapping.
There are two ways to dodge this:
use the ColumnMappings to formally define the column relationship (you note you don't want this)
push the data into a staging table - a basic table, not part of your core transactional tables, whose entire purpose is to look exactly like this data import; then use a TSQL command to transfer the data from the staging table to the real table (see the sketch at the end of this answer)
I always favor the second option, for various reasons:
I never have to mess with mappings - this is actually important to me ;p
the insert to the real table will be fully logged (SqlBulkCopy is not necessarily logged)
I have the fastest possible insert - no constraint checking, no indexing, etc
I don't tie up a transactional table during the import, and there is no risk of non-repeatable queries running against a partially imported table
I have a safe abort option if the import fails half way through, without having to use transactions (nothing has touched the transactional system at this point)
it allows some level of data-processing when pushing it into the real tables, without the need to either buffer everything in a DataTable at the app tier, or implement a custom IDataReader
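A rough sketch of the staging-table flow, reusing dt, timeout and the connection helper from your method (the staging table and column names are placeholders):
//1) bulk copy into the staging table (no computed columns there)
using (var conn2 = new SqlConnection(GetConnection(sqlServerConnection)))
{
    conn2.Open();
    using (var copy = new SqlBulkCopy(conn2))
    {
        copy.DestinationTableName = "dbo.MyTable_Staging"; //placeholder name
        copy.BulkCopyTimeout = timeout;
        copy.WriteToServer(dt);
    }

    //2) move the rows into the real table, listing only the writable columns
    //   so the computed column is never touched
    const string transfer =
        "INSERT INTO dbo.MyTable (ColA, ColB, ColC) " +
        "SELECT ColA, ColB, ColC FROM dbo.MyTable_Staging; " +
        "TRUNCATE TABLE dbo.MyTable_Staging;";
    using (var cmd = new SqlCommand(transfer, conn2))
    {
        cmd.ExecuteNonQuery();
    }
}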

Problem using SQLDataReader with Sybase ASE

We're developing a reporting application that uses ASP.NET MVC (.NET 4). We connect through DDTEK.Sybase middleware to a Sybase ASE 12.5 database.
We're having a problem pulling data into a DataReader (from a stored procedure). The stored procedure computes values (approximately 50 columns) by doing sums, counts, and calls to other stored procedures.
The problem we're experiencing is that certain columns (maybe 5% of them) come back with NULL or 0. If we debug and copy the SQL statement being used for the DataReader and run it in another SQL tool, we get valid values for all columns.
conn = new SybaseConnection
{
    ConnectionString = ConfigurationManager.ConnectionStrings[ConnectStringName].ToString()
};
conn.Open();
cmd = new SybaseCommand
{
    CommandTimeout = cmdTimeout,
    Connection = conn,
    CommandText = mainSql
};
reader = cmd.ExecuteReader();
// AT THIS POINT, IMMEDIATELY AFTER THE EXECUTEREADER CALL,
// THE READER CONTAINS THE BAD (NULL OR 0) DATA FOR THESE COLUMNS.
DataTable schemaTable = reader.GetSchemaTable();
// AT THIS POINT WE CAN VIEW THE DATATABLE FOR THE SCHEMA AND IT APPEARS CORRECT.
// THE COLUMNS THAT DON'T WORK HAVE SPECIFICATIONS IDENTICAL TO THE COLUMNS THAT DO WORK.
Has anyone had problems like this using Sybase and ADO?
Thanks,
John K.
Problem solved! The problem turned out to be a difference in the way nulls were handled in the SQL. We had several instances in the stored procedure that used non-ANSI null tests (x = null rather than x is null). The SQL tools I had used to test this problem defaulted SET ANSINULL to OFF, while our ADO code did not, so the ANSINULL setting was ON. Because of that setting, SQL code that tested for null with "=" would never evaluate to TRUE, allowing the null value to be returned.
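For anyone else hitting this: the proper fix is to change the non-ANSI tests in the stored procedure (x = null becomes x is null). As a stopgap you can also make the ADO session match the SQL tools before executing the command - a minimal sketch, assuming the DDTEK provider passes the SET statement through to the server unchanged:
//stopgap: make this session behave like the SQL tools (ANSINULL OFF),
//so the legacy "x = null" tests in the procedure evaluate as they did there
using (var setCmd = new SybaseCommand("SET ANSINULL OFF", conn))
{
    setCmd.ExecuteNonQuery();
}
reader = cmd.ExecuteReader(); //then run the original command as in the question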
