Which would be better for executing an INSERT statement against an MS SQL Server database: a SqlDataAdapter or a SqlCommand object?
Which of them is preferable when inserting only a single row, and which when inserting multiple rows?
A simple example of code usage:
SQL Command
string query = "insert into Table1(col1,col2,col3) values (@value1,@value2,@value3)";
int i;
SqlCommand cmd = new SqlCommand(query, connection);
// add parameters...
cmd.Parameters.Add("@value1", SqlDbType.VarChar).Value = txtBox1.Text;
cmd.Parameters.Add("@value2", SqlDbType.VarChar).Value = txtBox2.Text;
cmd.Parameters.Add("@value3", SqlDbType.VarChar).Value = txtBox3.Text;
connection.Open();
i = cmd.ExecuteNonQuery();
connection.Close();
SQL Data Adapter
DataSet dsTab = new DataSet("Table1");
SqlDataAdapter adp = new SqlDataAdapter("Select * from Table1", connection);
adp.Fill(dsTab, "Table1");
DataRow dr = dsTab.Tables["Table1"].NewRow();
dr["col1"] = txtBox1.Text;
dr["col2"] = txtBox5.Text;
dr["col3"] = "text";
dsTab.Tables["Table1"].Rows.Add(dr);
SqlCommandBuilder projectBuilder = new SqlCommandBuilder(adp);
DataSet newSet = dsTab.GetChanges(DataRowState.Added);
adp.Update(newSet, "Table1");
Updating a data source is more convenient with a DataAdapter: to make changes you just modify the DataSet and call Update.
There is probably no (or very little) difference in the performance between using DataAdapters vs Commands. DataAdapters internally use Connection and Command objects and execute the Commands to perform the actions (such as Fill and Update) that you tell them to do, so it's pretty much the same as using only Command objects.
I would use LINQ to SQL for single inserts and for most database CRUD requests. It is type safe and reasonably fast for uncomplicated queries such as the one above.
If you have many rows to insert (1000+) and you are using SQL Server 2008, I would use SqlBulkCopy, or pass your DataSet to a stored procedure and MERGE it into the destination table.
For complicated queries I recommend using Dapper in conjunction with stored procedures.
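Something along these lines for the Dapper route (a sketch only; the stored procedure dbo.InsertTable1Row and the connectionString variable are assumptions, not something defined above):
using System.Data;
using System.Data.SqlClient;
using Dapper;

// Dapper's Execute extension sends a parameterized call to the procedure.
using (var connection = new SqlConnection(connectionString))
{
    connection.Execute(
        "dbo.InsertTable1Row",
        new { col1 = txtBox1.Text, col2 = txtBox2.Text, col3 = txtBox3.Text },
        commandType: CommandType.StoredProcedure);
}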
I suggest keeping some level of control over your communication with the database. That means abstracting some of the code, and the CommandBuilder helps there because it automatically generates the CUD (create/update/delete) statements for you.
Even better is to combine that technique with a typed DataSet: then you get IntelliSense and compile-time checking on all your columns.
I was wondering whether I can execute LINQ on an SQL server to speed up my query.
Here is a simple example. Currently I use this to fill my DataTable:
using (var connection = new SqlConnection())
using (var da = new SqlDataAdapter())
using (da.SelectCommand = connection.CreateCommand())
{
da.SelectCommand.CommandText = newcmd;
da.SelectCommand.Connection.ConnectionString = connstring;
da.SelectCommand.CommandTimeout = 0;
DataTable dt = new DataTable(); // the connection is opened by the data adapter during Fill
da.Fill(dt);
}
with this command:
newcmd = "select * from openquery("LinkedServer", 'select * FROM tbl_oracle p ')";
And then, once I have the data in the DataTable, I use LINQ to manipulate it as I see fit. However, this means I have to transfer the entire table!
Since the real query returns a lot of data, the following (a simple SUM example) turns out to be much faster, mainly because of the interface/transfer rates.
newcmd = "select * from openquery("LinkedServer", 'select p.timestep, SUM (p.position)
FROM tbl_oracle p GROUP BY p.timestep ')";
Obviously in reality the data manipulation is more complex. So my question:
Can I somehow use LINQ against the Oracle database, or against the linked server on SQL Server, and have it execute on the server, so that the data manipulation is done before the data is transferred to the desktop? I would really like the power of LINQ without transferring all the raw data.
UPDATE
I set up a view in SQL Server Management Studio over the linked Oracle server, as suggested in the answer below. I then ran a very simple query:
select * from view where ID=1
with the execution plan enabled, and it shows that the entire Oracle table is scanned first (remote scan, 100% of the cost); the query is not executed on the Oracle server. The same query executes in a split second via OPENQUERY. This makes the approach unusable given the size of the data involved. Any other suggestions would be appreciated.
You can create views over the tables of interest on the SQL Server side and use EF or LINQ to SQL against those views. That way, the query will be transferred to the Oracle server.
EF and LINQ to SQL don't support specifying the server part of a table's fully qualified name. But if you create a view like this:
create view MyView as SELECT * FROM LinkedServer.Database.Schema.Table
you can work on MyView as if it was a table in your local server, and the resulting SQL query will be executed directly on the linked Oracle server.
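For example, with LINQ to SQL you could map the view like an ordinary table and query it. This is only a sketch under assumptions: the MyViewRow mapping, its Timestep/Position columns, and the connection string are hypothetical, and (as the UPDATE above notes) the linked-server plan may still end up scanning the remote table.
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

// Hypothetical attribute mapping for the view created above.
[Table(Name = "dbo.MyView")]
public class MyViewRow
{
    [Column] public int Timestep { get; set; }
    [Column] public decimal Position { get; set; }
}

public static class LinkedServerQuery
{
    public static void Run(string connectionString)
    {
        using (var db = new DataContext(connectionString))
        {
            // The GROUP BY / SUM is translated to SQL and sent to the server,
            // so the aggregation is meant to happen before any rows travel
            // back to the client.
            var totals = db.GetTable<MyViewRow>()
                           .GroupBy(r => r.Timestep)
                           .Select(g => new { Timestep = g.Key, Total = g.Sum(r => r.Position) })
                           .ToList();
        }
    }
}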
I am trying to copy a large DataTable (several columns, more than 1000 rows) that is created dynamically in the application into a MySQL table, using C# and WPF.
I have searched for various ways to do this but haven't managed to implement any of them. I think the MySqlDataAdapter class is what I should use, but I can't make it work. This is what I tried:
MySqlConnection con = new MySqlConnection(MyConString);
MySqlCommand comm = new MySqlCommand("Select * From kinectdata", con);
MySqlDataAdapter test1 = new MySqlDataAdapter(comm);
test1.Update(skelData);
The speed of this transfer is also important, so I would prefer not to issue an INSERT or UPDATE statement 1000 times.
Many thanks for your feedback!
M
You can build a single INSERT statement which inserts all 1000 rows.
INSERT INTO table VALUES (1,2,3), (4,5,6), (7,8,9);
1000 rows is not that much; in database terms it's nothing, and plain INSERTs should be very fast, no more than two seconds.
In your example you also have to give the adapter the commands it needs: the SELECT only defines the schema, and Update additionally requires an insert command (with its command text set), which a MySqlCommandBuilder can generate for you.
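A minimal sketch of that, assuming skelData is a DataTable whose columns match kinectdata and whose rows are still in the Added row state:
using System.Data;
using MySql.Data.MySqlClient;

// The adapter's SELECT supplies the schema, the command builder derives the
// INSERT command, and Update pushes all of the added rows in one call.
using (var con = new MySqlConnection(MyConString))
using (var adapter = new MySqlDataAdapter("SELECT * FROM kinectdata", con))
using (var builder = new MySqlCommandBuilder(adapter))
{
    adapter.InsertCommand = builder.GetInsertCommand();
    // adapter.UpdateBatchSize = 100; // optional, if your connector version supports batching
    adapter.Update(skelData);
}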
I am trying to store rows in a SQL Server 2008 table for a junkyard, tracking the documentation for all its vehicles. I will run this program once a month, when I receive a list with information about all the vehicles. I know I could write a text file and do a BULK INSERT from that file.
However, I just wondered whether there is a way to insert information stored in string arrays directly into SQL Server.
Right now I am running a for loop that issues 500 insert commands, and for 500 records the whole process only takes about 4 seconds.
I would like to know whether there is a better way to insert the information from the arrays directly, without looping 500 times.
The code below works fine for me, but I would rather not use that kind of spaghetti code if there is a better way to do it. Thanks in advance!
for (int i = 0; i < 500; i++)
{
con.Open();
SqlCommand myCommand = new SqlCommand("INSERT INTO [TESTCATALOG].[dbo].[TITLES] VALUES ('" + TitleNum[i] + "','" + VIN[i] + "','" + DateIssued[i] + "')", con);
myCommand.ExecuteReader();
con.Close();
}
You can use table valued parameters (TVP) to insert lists directly through a stored procedure.
Table-valued parameters are a new parameter type in SQL Server 2008. Table-valued parameters are declared by using user-defined table types. You can use table-valued parameters to send multiple rows of data to a Transact-SQL statement or a routine, such as a stored procedure or function, without creating a temporary table or many parameters.
This will also avoid the SQL Injection vulnerability currently present in your code.
See Table-Valued Parameters in SQL Server 2008 (ADO.NET) on MSDN to see how to call a stored procedure with a TVP with C#:
// Assumes addedCategories is a DataTable (or DbDataReader) whose schema
// matches the dbo.CategoryTableType user-defined table type.
SqlCommand insertCommand = new SqlCommand(sqlInsert, connection);
SqlParameter tvpParam = insertCommand.Parameters.AddWithValue(
    "@tvpNewCategories",
    addedCategories);
tvpParam.SqlDbType = SqlDbType.Structured;
tvpParam.TypeName = "dbo.CategoryTableType";
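Applied to the arrays in the question, the same pattern could look roughly like this (a sketch; the dbo.TitleTableType table type, the dbo.usp_InsertTitles procedure, and the connectionString variable are all hypothetical objects you would create first):
using System.Data;
using System.Data.SqlClient;

// Load the arrays into a DataTable whose columns match the table type,
// then pass it to the stored procedure as a single structured parameter.
var titles = new DataTable();
titles.Columns.Add("TitleNum", typeof(string));
titles.Columns.Add("VIN", typeof(string));
titles.Columns.Add("DateIssued", typeof(string));
for (int i = 0; i < TitleNum.Length; i++)
{
    titles.Rows.Add(TitleNum[i], VIN[i], DateIssued[i]);
}

using (var con = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.usp_InsertTitles", con))
{
    cmd.CommandType = CommandType.StoredProcedure;
    var tvp = cmd.Parameters.AddWithValue("@Titles", titles);
    tvp.SqlDbType = SqlDbType.Structured;
    tvp.TypeName = "dbo.TitleTableType";
    con.Open();
    cmd.ExecuteNonQuery();
}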
Yes, you can use LINQ to SQL or LINQ to Entities to achieve this in a single transaction.
You could use the SqlBulkCopy class. SqlBulkCopy requires your data to be available as a DataRow[], a DataTable, or an IDataReader.
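A sketch of that approach, assuming the arrays have been loaded into a DataTable called titles as in the earlier sketch, and that its column names match the TITLES table (otherwise add ColumnMappings):
using System.Data;
using System.Data.SqlClient;

// One bulk operation instead of 500 individual INSERT statements.
using (var con = new SqlConnection(connectionString))
using (var bulk = new SqlBulkCopy(con))
{
    con.Open();
    bulk.DestinationTableName = "[TESTCATALOG].[dbo].[TITLES]";
    bulk.WriteToServer(titles);
}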
From SQL Server 2008 onwards, you can use a table-valued parameter.
See "Passing a Table-Valued Parameter to a Parameterized SQL Statement" in MSDN.
You can use a DataAdapter for the insert or update.
First add the rows to a DataTable in your for loop, then call DataAdapter.Update with that DataTable, as sketched below.
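A rough sketch, assuming the three columns of the TITLES table and a connectionString variable (a SqlCommandBuilder generates the insert command from the SELECT):
using System.Data;
using System.Data.SqlClient;

// Fill an empty DataTable to get the schema, add the new rows, then let
// Update send the generated INSERT for each added row.
using (var con = new SqlConnection(connectionString))
using (var adapter = new SqlDataAdapter("SELECT * FROM [TESTCATALOG].[dbo].[TITLES] WHERE 1 = 0", con))
using (var builder = new SqlCommandBuilder(adapter))
{
    var table = new DataTable();
    adapter.Fill(table); // returns no rows, but brings back the column schema
    for (int i = 0; i < TitleNum.Length; i++)
    {
        table.Rows.Add(TitleNum[i], VIN[i], DateIssued[i]);
    }
    adapter.InsertCommand = builder.GetInsertCommand();
    adapter.Update(table);
}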
Being new to working with Data, I hope I'm asking this properly. How can I select what columns come in from a DataSet into a DataTable? I know I can fill a DataTable by using...
DataTable table = dataSet1.Tables[0];
but this brings in all the columns. How can I fill a DataTable with only certain columns?
I'm using .NET 3.5, C#, and a SQL CE 3.5 single table database.
Thanks.
The DataTable is actually filled via a DataAdapter when the DataSet is populated, so once your query has run, the columns in the DataTable are fixed. You can still use a DataView to apply an additional filter and a column reduction to the DataTable, but the cost of querying the database and pulling the data has already been paid, so consider making sure your query doesn't pull back more than you need. MSDN is a great resource.
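For the column-reduction part, a small sketch (the column names are placeholders):
using System.Data;

// DataView.ToTable projects only the columns you care about from a table
// that has already been filled.
DataTable full = dataSet1.Tables[0];
DataView view = full.DefaultView;
DataTable reduced = view.ToTable(false, "CustomerID", "CompanyName"); // false = keep duplicate rows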
Of course, if you're just now learning this, it bears mentioning that while ADO.NET is important foundational knowledge, there has lately been a lot of momentum away from raw ADO.NET towards things like Entity Framework. While SQL will never die, nor should it, you're going to write a whole lot more plumbing code with ADO.NET than you would with a nice ORM. Check out these posts for more info.
// Assumes that connection is a valid SqlConnection object.
string queryString = "SELECT CustomerID, CompanyName FROM dbo.Customers";
SqlDataAdapter adapter = new SqlDataAdapter(queryString, connection);
DataSet customers = new DataSet();
adapter.Fill(customers, "Customers");
DataTable table = customers.Tables[0];
Instead of "CustomerID, CompanyName" you can put the columns you want to select.
For further learning check this MSDN link.
I am working on moving a database from MS Access to SQL Server. To move the data into the new tables I have decided to write a sync routine, since the schema has changed quite significantly; it also lets me test the programs that run against the database and resync whenever I need fresh test data. Eventually I will do one last sync and go live on the new SQL Server version.
Unfortunately I have hit a snag. My method for copying from Access to SQL Server is below:
public static void BulkCopyAccessToSQLServer
(string sql, CommandType commandType, DBConnection sqlServerConnection,
string destinationTable, DBConnection accessConnection, int timeout)
{
using (DataTable dt = new DataTable())
using (OleDbConnection conn = new OleDbConnection(GetConnection(accessConnection)))
using (OleDbCommand cmd = new OleDbCommand(sql, conn))
using (OleDbDataAdapter adapter = new OleDbDataAdapter(cmd))
{
cmd.CommandType = commandType;
cmd.Connection.Open();
adapter.SelectCommand.CommandTimeout = timeout;
adapter.Fill(dt);
using (SqlConnection conn2 = new SqlConnection(GetConnection(sqlServerConnection)))
using (SqlBulkCopy copy = new SqlBulkCopy(conn2))
{
conn2.Open();
copy.DestinationTableName = destinationTable;
copy.BatchSize = 1000;
copy.BulkCopyTimeout = timeout;
copy.NotifyAfter = 1000; // must be set before WriteToServer to have any effect
copy.WriteToServer(dt);
}
}
}
Basically this queries Access for the data using the input SQL string; the query returns all the correct field names, so I don't need to set up column mappings.
This was working until I reached a table with a calculated field. SqlBulkCopy doesn't seem to know to skip that field; it tries to write to the column, which fails with the error "The column 'columnName' cannot be modified because it is either a computed column or is the result of a union operator."
Is there an easy way to make it skip the calculated field?
I am hoping not to have to specify a full column mapping.
There are two ways to dodge this:
use the ColumnMappings to formally define the column relationship (you note you don't want this)
push the data into a staging table - a basic table, not part of your core transactional tables, whose entire purpose is to look exactly like this data import; then use a TSQL command to transfer the data from the staging table to the real table (there is a sketch of this at the end of this answer)
I always favor the second option, for various reasons:
I never have to mess with mappings - this is actually important to me ;p
the insert to the real table will be fully logged (SqlBulkCopy is not necessarily logged)
I have the fastest possible insert - no constraint checking, no indexing, etc
I don't tie up a transactional table during the import, and there is no risk of non-repeatable queries running against a partially imported table
I have a safe abort option if the import fails half way through, without having to use transactions (nothing has touched the transactional system at this point)
it allows some level of data-processing when pushing it into the real tables, without the need to either buffer everything in a DataTable at the app tier, or implement a custom IDataReader
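A sketch of the staging-table route for the computed-column case (the staging table name, the real table name, and the column list are assumptions; the staging table has the same columns as the Access query, i.e. everything except the computed column):
using System.Data;
using System.Data.SqlClient;

static void LoadViaStaging(DataTable dt, string connectionString, int timeout)
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();

        // Bulk copy into a plain staging table with no computed columns.
        using (var copy = new SqlBulkCopy(conn))
        {
            copy.DestinationTableName = "dbo.MyTable_Staging";
            copy.BulkCopyTimeout = timeout;
            copy.WriteToServer(dt);
        }

        // The INSERT ... SELECT lists only the writable columns, so the
        // computed column is simply never mentioned.
        const string transfer = @"
            INSERT INTO dbo.MyTable (Col1, Col2, Col3)
            SELECT Col1, Col2, Col3 FROM dbo.MyTable_Staging;
            TRUNCATE TABLE dbo.MyTable_Staging;";

        using (var cmd = new SqlCommand(transfer, conn))
        {
            cmd.CommandTimeout = timeout;
            cmd.ExecuteNonQuery();
        }
    }
}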