I have a Windows Forms application and I am using an MS Access database for some data manipulation. I want to copy data from one database to another; the table name, schema, and data types are identical in both tables.
I am using the query below to bulk-insert data into the destination database by selecting data from the source database.
INSERT INTO [Table1] IN 'C:\Data\Users.mdf' SELECT * FROM [Table1]
After the data is inserted, I query the target table to fetch the inserted rows. I am using OleDbConnection for all of the database operations.
The issue I am facing is that when I execute the SELECT statement right after the INSERT query above, I get no data back. However, when I step through in debug mode, the data is there.
I noticed that if I wait some time after the INSERT statement executes, the data comes back correctly, so I assume the bulk insert operation needs some time (a delay?) to complete.
I tried adding Task.Delay(20000) after the INSERT query execution, but no luck. Could someone help me resolve this issue? Any help is highly appreciated.
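For what it's worth, one likely reason the Task.Delay attempt changed nothing is that Task.Delay only returns a task representing the delay; unless that task is awaited, execution continues immediately. A minimal sketch of the difference, independent of the database code:

```csharp
using System;
using System.Diagnostics;
using System.Threading.Tasks;

class DelayDemo
{
    static async Task Main()
    {
        var sw = Stopwatch.StartNew();

        _ = Task.Delay(500); // fire-and-forget: does NOT pause this method
        Console.WriteLine("after un-awaited delay: {0} ms", sw.ElapsedMilliseconds);

        await Task.Delay(500); // actually pauses for roughly 500 ms
        Console.WriteLine("after awaited delay: {0} ms", sw.ElapsedMilliseconds);

        // In synchronous code, System.Threading.Thread.Sleep(500) blocks instead.
    }
}
```

That said, a fixed sleep only masks the underlying timing issue; the polling workaround below at least checks for the data instead of guessing a delay.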
I didn't find a good way to handle this, but I did work around it. After the data is inserted into the table, I fire another query to check whether the target table contains any rows yet. This happens in a do..while loop, as follows. The table is dropped every time the operation completes.
var insertQuery = @"INSERT INTO [Table1] IN 'C:\Data\Users.mdf' SELECT * FROM [Table1]";
ExecuteQuery(insertQuery, connProd);

var count = 10;
do
{
    var selectQuery = "SELECT TOP 1 * FROM " + tableProdCopy;
    var dtTopRowData = GetQueryData(selectQuery, connOther);
    if (dtTopRowData != null && dtTopRowData.Rows.Count > 0)
    {
        count = 0;
        break;
    }

    System.Threading.Thread.Sleep(2000);
    count = count - 1;
} while (count > 0);
private DataTable GetQueryData(string query, OleDbConnection conn)
{
    using (OleDbCommand cmdOutput = new OleDbCommand(query, conn))
    {
        using (OleDbDataAdapter adapterOutput = new OleDbDataAdapter(cmdOutput))
        {
            var dtOutput = new DataTable();
            adapterOutput.Fill(dtOutput);
            return dtOutput;
        }
    }
}

private void ExecuteQuery(string query, OleDbConnection conn)
{
    using (OleDbCommand cmdInput = new OleDbCommand(query, conn))
    {
        cmdInput.ExecuteNonQuery();
    }
}
Related
I have data coming from an Excel spreadsheet, and I cannot change how it comes in. Is there a way to add IList<IList<Object>> values to a SQL Server database without looping over each row? I am hitting a limit with the current approach at around 5k rows.
I was also told that my current code is open to SQL injection, so alternatives that avoid it are welcome.
public static void Load_Table()
{
    // This function, on click, populates the datagrid named JumpTable
    // this.JumpTable.ItemsSource = null; // Clears the current datagrid before getting new data

    // Pulls in the 2d table from the client sheet
    IList<IList<Object>> client_sheet = Get(SetCredentials(), "$placeholder", "Client!A2:AY");
    DBFunctions db = new DBFunctions();
    db.postDBTable("DELETE FROM Client");
    foreach (var row in client_sheet)
    {
        string exe = "INSERT INTO Client ([Tracker ID],[Request ID]) VALUES('" + row[0].ToString() + "','" + row[1].ToString() + "')";
        db.postDBTable(exe);
    }
}
Database functions
public SqlConnection getDBConnection()
{
    // --------------< Function: Opens the connection to our database >-------------- \\
    string connectionString = Properties.Settings.Default.connection_string; // Gets the connection source from properties
    SqlConnection dbConnection = new SqlConnection(connectionString); // Creates the connection based off our source
    if (dbConnection.State != ConnectionState.Open) dbConnection.Open(); // If it's not already open then open the connection
    return dbConnection;
}

public DataTable getDBTable(string sqlText)
{
    // --------------< Function: Gets the table from our database >-------------- \\
    SqlConnection dbConnection = getDBConnection();
    DataTable table = new DataTable();
    SqlDataAdapter adapter = new SqlDataAdapter(sqlText, dbConnection);
    adapter.Fill(table);
    return table;
}

public void postDBTable(string sqlText)
{
    // --------------< Function: Post data to our database >-------------- \\
    SqlConnection dbConnection = getDBConnection();
    SqlCommand cmd = new SqlCommand(sqlText, dbConnection);
    cmd.ExecuteNonQuery();
}
I have worked with lots of bulk data loads in the past. There are two main ways to avoid individual inserts with SQL Server: a list of values inserted in one statement, or a bulk insert.
The first option, a list of values, looks like this:
INSERT INTO Foo (Bar,Baz)
VALUES ('bar','baz'),
('anotherbar','anotherbaz')
In C# you would loop through your list and build the VALUES content; however, doing this without creating a SQL injection vulnerability is difficult.
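One way to get the multi-row VALUES form without concatenating user data into the SQL text is to generate numbered parameter placeholders and bind the values separately. This is only a sketch: the table and column names (Foo, Bar, Baz) come from the example above, and it assumes exactly two columns per row.

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Text;

static class MultiRowInsert
{
    // Builds "INSERT INTO Foo (Bar,Baz) VALUES (@p0,@p1),(@p2,@p3)..."
    // plus the matching SqlParameter list, so no user data ever lands in the SQL text.
    public static (string Sql, List<SqlParameter> Parameters) Build(IList<IList<object>> rows)
    {
        var sb = new StringBuilder("INSERT INTO Foo (Bar,Baz) VALUES ");
        var parameters = new List<SqlParameter>();

        for (int r = 0; r < rows.Count; r++)
        {
            if (r > 0) sb.Append(",");
            var names = new List<string>();
            foreach (var value in rows[r])
            {
                string name = "@p" + parameters.Count;
                names.Add(name);
                parameters.Add(new SqlParameter(name, value ?? DBNull.Value));
            }
            sb.Append("(").Append(string.Join(",", names)).Append(")");
        }
        return (sb.ToString(), parameters);
    }
}
```

The command would then be created with `new SqlCommand(sql, connection)` and the parameters added via `cmd.Parameters.AddRange(...)`. One caveat: SQL Server caps a single command at 2100 parameters, so the rows have to be chunked into batches below that limit.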
The second option is a bulk insert with SqlBulkCopy and a DataTable. Before the code below runs, you would build a DataTable that holds all your data, then use SqlBulkCopy to insert the rows using SQL Server functionality that is optimized for inserting large amounts of data.
using (var bulk = new SqlBulkCopy(con))
{
    // bulkMapping is a SqlBulkCopyColumnMapping[]; it is not necessarily needed
    // if the DataTable matches your destination table exactly
    foreach (var i in bulkMapping)
    {
        bulk.ColumnMappings.Add(i);
    }

    bulk.DestinationTableName = "MyTable";
    bulk.BulkCopyTimeout = 600;
    bulk.BatchSize = 5000;
    bulk.WriteToServer(someDataTable);
}
These are the two methods included in the framework. There are other libraries that can help. Dapper is one, though I am not sure how it handles inserts on the back end. Entity Framework is another, but it does single inserts, so it just moves the problem from your code to someone else's.
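For the Excel-shaped IList<IList<Object>> data in the question, the DataTable that SqlBulkCopy needs can be built entirely in memory first. The column names below ([Tracker ID], [Request ID]) are the ones from the question's INSERT; the helper itself is an illustrative sketch:

```csharp
using System.Collections.Generic;
using System.Data;

static class SheetToTable
{
    // Copies a 2-D list of cell values into a DataTable whose columns
    // line up with the destination table, ready for SqlBulkCopy.
    public static DataTable Build(IList<IList<object>> sheet)
    {
        var table = new DataTable();
        table.Columns.Add("Tracker ID", typeof(string));
        table.Columns.Add("Request ID", typeof(string));

        foreach (var row in sheet)
        {
            table.Rows.Add(row[0]?.ToString(), row[1]?.ToString());
        }
        return table;
    }
}
```

A single `bulk.WriteToServer(SheetToTable.Build(client_sheet))` call would then replace the 5k individual INSERT round trips.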
I am trying to insert records into a SQL Server table from an array of JSON, and also to select records from the table. Executing the method below takes more than 10 minutes.
public async Task CreateTableAsync(string formsJson, string connectionString)
{
    SqlConnection con = new SqlConnection(connectionString);
    List<FormsJson> listOfformsJson = JsonConvert.DeserializeObject<List<FormsJson>>(formsJson);
    foreach (var form in listOfformsJson)
    {
        string formId = Guid.NewGuid().ToString();

        // insert into Forms table
        string formQuery = "insert into Forms([FormId]) values(@FormId)";
        using (var cmd = new SqlCommand(formQuery, con))
        {
            cmd.CommandTimeout = 120;
            // pass values to parameters
            cmd.Parameters.AddWithValue("@FormId", formId);
            if (con.State == System.Data.ConnectionState.Closed)
            {
                con.Open();
            }
            cmd.ExecuteNonQuery();
        }

        // relationship between Forms and ETypes: get all the ETypes and fill
        foreach (var typeOf in form.TypeOf)
        {
            // get all the eTypeIds for this typeOf field
            string query = "select Id from ETypes Where TypeOf = @typeOf";
            List<string> eTypeIdList = new List<string>();
            using (var sqlcmd = new SqlCommand(query, con))
            {
                sqlcmd.CommandTimeout = 120;
                // pass values to parameters
                sqlcmd.Parameters.AddWithValue("@typeOf", typeOf);
                if (con.State == System.Data.ConnectionState.Closed)
                {
                    con.Open();
                }
                SqlDataReader sqlDataReader = sqlcmd.ExecuteReader();
                while (sqlDataReader.Read())
                {
                    string eTypeId = sqlDataReader[0].ToString();
                    eTypeIdList.Add(eTypeId);
                }
                sqlDataReader.Close();
            }

            // insert into the Forms/ETypes relationship table
            string fe_query = "";
            foreach (var eTypeId in eTypeIdList)
            {
                fe_query = "insert into Forms_ETypes([Form_Id],[EType_Id]) values (@Form_Id,@EType_Id)";
                if (con.State == System.Data.ConnectionState.Closed)
                {
                    con.Open();
                }
                using (var fesqlcmd = new SqlCommand(fe_query, con))
                {
                    fesqlcmd.CommandTimeout = 120;
                    // pass values to parameters
                    fesqlcmd.Parameters.AddWithValue("@Form_Id", formId);
                    fesqlcmd.Parameters.AddWithValue("@EType_Id", eTypeId);
                    fesqlcmd.ExecuteNonQuery();
                }
            }
        }
    }
}
The outer foreach over listOfformsJson loops through hundreds of records, and the inner loop runs over hundreds of rows as well. In between there are CommandTimeout settings and checks that keep the connection open.
Any help with optimizing the time, and with removing or adding ADO.NET statements, would be appreciated.
The primary issue here is that you are pulling all of the data out of the database and then, row by row, inserting it back in. This is not optimal from the database's point of view. It is great at dealing with sets - but you are treating the set as lots of individual rows. Thus it becomes slow.
From a set-based standpoint, you have only two statements that you need to run:
Insert the Forms row
Insert the Forms_ETypes rows (as a set, not one at a time)
1) should be what you have now:
insert into Forms([FormId]) values(@FormId)
2) should be something like:
insert Forms_ETypes([Form_Id],[EType_Id]) SELECT @FormId, Id from ETypes Where TypeOf IN ({0});
using this technique to pass in your form.TypeOf values. Note this assumes you have fewer than 500 entries in form.TypeOf. If you have more than that, a user-defined table type (UDT) is a better approach (some information on UDTs suggests you need a stored procedure, but that isn't the case).
This will enable you to run just two SQL statements - the first, then the second (vs possibly thousands with your current solution).
This will save time for two reasons:
The database engine didn't need to pass the data over the wire twice (from your DB server to your application, and back again).
You enabled the database engine to do a large set based operation, rather than lots of smaller operations (with latency due to the request-response nature of the loop).
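The table-valued parameter (UDT) route mentioned above looks roughly like the sketch below. It assumes a user-defined table type has already been created on the server; the type name dbo.StringList and its Value column are illustrative, while the Forms_ETypes and ETypes names come from the question.

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class TvpExample
{
    // Inserts all Forms_ETypes rows for one form in a single round trip
    // using a table-valued parameter.
    // Assumes: CREATE TYPE dbo.StringList AS TABLE (Value NVARCHAR(100));
    public static void InsertFormETypes(SqlConnection con, string formId,
        IEnumerable<string> typeOfValues)
    {
        // Load the TypeOf values into a DataTable shaped like the table type.
        var tvp = new DataTable();
        tvp.Columns.Add("Value", typeof(string));
        foreach (var t in typeOfValues) tvp.Rows.Add(t);

        const string sql = @"
            INSERT INTO Forms_ETypes ([Form_Id], [EType_Id])
            SELECT @FormId, e.Id
            FROM ETypes e
            JOIN @TypeOfList t ON t.Value = e.TypeOf;";

        using (var cmd = new SqlCommand(sql, con))
        {
            cmd.Parameters.AddWithValue("@FormId", formId);
            var p = cmd.Parameters.AddWithValue("@TypeOfList", tvp);
            p.SqlDbType = SqlDbType.Structured;
            p.TypeName = "dbo.StringList";
            cmd.ExecuteNonQuery();
        }
    }
}
```

This folds the inner SELECT loop and the per-eTypeId INSERT loop into one set-based statement per form.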
I am creating a messaging application for a college project, and I have a database of users in Access. I have linked the database correctly and can execute statements, but I am struggling with one problem: how to count the number of rows in a data table.
In fact, all I want to do is count the total number of users, and my teacher told me to load the data into a DataTable and count the rows. However, no matter how many users I have in the database, it always returns 2.
int UserCount = 0;
using (OleDbConnection cuConn = new OleDbConnection())
{
    cuConn.ConnectionString = @"DATASOURCE";
    string statement = "SELECT COUNT(*) FROM Users";
    OleDbDataAdapter da = new OleDbDataAdapter(statement, cuConn);
    DataTable Results = new DataTable();
    da.Fill(Results);
    if (Results.Rows.Count > 0)
    {
        UserCount = int.Parse(Results.Rows[0][0].ToString());
    }
}
The above code is a copy of what I was sent by my teacher who said it would work. Any help would be appreciated.
Also, sorry if this is a waste of time, still getting used to this StackOverflow thing...
Try replacing Users with [Users]? Users may be a reserved word in the database.
Also, the simpler way to get aggregate numbers is the ExecuteScalar method.
using (OleDbConnection cuConn = new OleDbConnection())
{
    cuConn.ConnectionString = @"DATASOURCE";
    string statement = "SELECT COUNT(*) FROM [Users]";
    OleDbCommand cmd = new OleDbCommand(statement, cuConn);
    cuConn.Open();
    int count = (int)cmd.ExecuteScalar();
    if (count > 0)
    {
        //
    }
}
I successfully used your exact code (except for the connection string) with SQL Server, so maybe the problem is with your @"DATASOURCE" or with MS Access.
I have a SQL Server database which has a lot of information inside.
I want to select the top 50 rows in a single query (which I did with no problem), but then update a column from false to true so that the next select won't return the same rows. My code looks like this:
string Command = "UPDATE HubCommands SET [Alreadytaken] = 'true' FROM (SELECT TOP 50 [CommandId],[DeviceId],[Commandtext], [HashCommand],[UserId] FROM HubCommands) I WHERE [HubId] = '18353fe9-82fd-4ac2-a078-51c199d9072b'";
using (SqlConnection myConnection = new SqlConnection(SqlConnection))
{
    using (SqlDataAdapter myDataAdapter = new SqlDataAdapter(Command, myConnection))
    {
        DataTable dtResult = new DataTable();
        myDataAdapter.Fill(dtResult);
        foreach (DataRow row in dtResult.Rows)
        {
            Guid CommandId, DeviceId, UserId;
            Guid.TryParse(row["CommandId"].ToString(), out CommandId);
            Guid.TryParse(row["DeviceId"].ToString(), out DeviceId);
            Guid.TryParse(row["UserId"].ToString(), out UserId);
            Console.WriteLine("CommandId" + CommandId);
        }
    }
}
This code works and updates what I ask it to update, but I get nothing back in the DataTable; it is as if it always updates but never selects. If I do a plain SELECT, it works and returns data.
Does anyone have any idea how to update and get some data back in a single query?
So your question is:
How can I update a table in SQL Server using C# and return the truly updated rows as a DataTable?
First, you have multiple issues in your query. You should use 1 and 0, not 'true' or 'false': SQL Server has a bit datatype, not a Boolean.
Second, this is how your query should be constructed:
DECLARE @IDs TABLE
(
    [CommandId] uniqueidentifier
);

INSERT INTO @IDs
SELECT [CommandId] FROM HubCommands
WHERE [HubId] = '18353fe9-82fd-4ac2-a078-51c199d9072b' AND [Alreadytaken] = 0;

UPDATE HubCommands
SET [Alreadytaken] = 1
WHERE CommandId IN
(
    SELECT [CommandId] FROM @IDs
);

SELECT * FROM HubCommands
WHERE CommandId IN
(
    SELECT [CommandId] FROM @IDs
);
Wrap all of the above in a single string and use a SqlDataReader. No adapter is needed in your case (since we're mixing commands, unlike what the adapter usually does):
var sqlCommand = new SqlCommand(Command, myConnection);
SqlDataReader dataReader = sqlCommand.ExecuteReader();
DataTable dtResult = new DataTable();
dtResult.Load(dataReader);
I highly advise you to create a stored procedure accepting HubId as a parameter that does all the above work. It is neater and better for maintenance.
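As an aside, SQL Server can also claim and return the rows in one statement using an OUTPUT clause on the UPDATE itself, which avoids the table variable entirely. This is a sketch against the HubCommands columns from the question; note that UPDATE TOP (50) without an ORDER BY picks an arbitrary 50 matching rows.

```csharp
using System.Data;
using System.Data.SqlClient;

static class ClaimCommands
{
    // Marks up to 50 unclaimed rows as taken and returns them in the same round trip.
    public static DataTable ClaimBatch(SqlConnection conn)
    {
        const string sql = @"
            UPDATE TOP (50) HubCommands
            SET [Alreadytaken] = 1
            OUTPUT inserted.*
            WHERE [HubId] = '18353fe9-82fd-4ac2-a078-51c199d9072b'
              AND [Alreadytaken] = 0;";

        using (var cmd = new SqlCommand(sql, conn))
        using (var reader = cmd.ExecuteReader())
        {
            var result = new DataTable();
            result.Load(reader); // the OUTPUT rows arrive as a normal result set
            return result;
        }
    }
}
```

Because the update and the read happen atomically, this pattern also avoids two callers claiming the same rows between the SELECT and the UPDATE.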
The company that I work for has large databases, millions of records in a single table. I have written a C# program that migrates tables between remote servers.
I first create all the tables using SMO without copying data and then the data insertion is done after all the tables have been created.
During record insertion, since there are so many records, the console window stays blank until all the rows have been inserted. Because of the sheer volume of data this takes a long time.
What I want is a way to print "n rows updated" messages, like the MSSQL Import and Export Data wizard does.
The insert part is just a simple INSERT INTO ... SELECT * query.
It sounds like you might be using SqlCommand; if so, here is a sample:
using (SqlConnection connection = new SqlConnection(Connection.ConnectionString))
{
    using (SqlCommand command = new SqlCommand("insert into OldCustomers select * from customers", connection))
    {
        connection.Open();
        var numRows = command.ExecuteNonQuery();
        Console.WriteLine("Affected Rows: {0}", numRows);
    }
}
You definitely need to look at the OUTPUT clause. There are useful examples on MSDN.
using (SqlConnection conn = new SqlConnection(connectionStr))
{
    var sqlCmd = @"
        CREATE TABLE #tmp (
            InsertedId BIGINT
        );

        INSERT INTO TestTable
        OUTPUT Inserted.Id INTO #tmp
        VALUES ....

        SELECT COUNT(*) FROM #tmp;";

    using (SqlCommand cmd = new SqlCommand(sqlCmd, conn))
    {
        conn.Open();
        // ExecuteScalar returns the first column of the first result set,
        // which here is the SELECT COUNT(*) from the temp table
        var numRows = (int)cmd.ExecuteScalar();
        Console.WriteLine("Affected Rows: {0}", numRows);
    }
}
I also suggest using a stored procedure for such purposes.
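For the original goal of printing progress the way the Import/Export wizard does, SqlBulkCopy also exposes a NotifyAfter property and a SqlRowsCopied event. Since the migration already reads from a remote source server, something like the following sketch reports every n rows while streaming; the table name and connections are placeholders:

```csharp
using System;
using System.Data.SqlClient;

static class MigrateWithProgress
{
    // Streams a table from source to destination, printing progress as it goes.
    public static void Copy(SqlConnection source, SqlConnection destination, string table)
    {
        using (var cmd = new SqlCommand("SELECT * FROM " + table, source))
        using (var reader = cmd.ExecuteReader())
        using (var bulk = new SqlBulkCopy(destination))
        {
            bulk.DestinationTableName = table;
            bulk.NotifyAfter = 10000; // raise SqlRowsCopied every 10,000 rows
            bulk.SqlRowsCopied += (s, e) =>
                Console.WriteLine("{0} rows copied...", e.RowsCopied);

            bulk.WriteToServer(reader); // streams rows; no full DataTable in memory
        }
    }
}
```

Unlike a single INSERT INTO ... SELECT, this keeps the data flowing through the client, but it gives the periodic "n rows copied" feedback without waiting for the whole load to finish.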