My scenario is that I have an Excel spreadsheet that I want to upload to a SQL database and UPDATE the information based on the primary key value (which is locked and hidden within the Excel spreadsheet). I have the code below, which can insert new entries into the database, but I'm not sure how to adapt it to UPDATE:
string path = string.Concat((Server.MapPath("~/temp/" + FileUpload1.FileName)));
FileUpload1.PostedFile.SaveAs(path);
OleDbConnection OleDbcon =
new OleDbConnection(@"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + path +
";Extended Properties=\"Excel 8.0;HDR=Yes;IMEX=1\";");
OleDbCommand cmd = new OleDbCommand("select * from [Sheet1$]",
OleDbcon);
OleDbDataAdapter objAdapter1 = new OleDbDataAdapter(cmd);
OleDbcon.Open();
DbDataReader dr = cmd.ExecuteReader();
string con_str =
#"Data Source=****************mydatasource""""""""""";
SqlBulkCopy bulkInsert = new SqlBulkCopy(con_str);
bulkInsert.DestinationTableName = "TableName";
bulkInsert.WriteToServer(dr);
OleDbcon.Close();
Example Data:
ID StartDate EndDate OrderNumber
1 01/02/2015 NULL 100
2 02/02/2015 NULL 100
3 03/02/2015 NULL 101
4 04/02/2015 NULL 102
5 05/02/2015 NULL 103
When the data is first inserted into the database, the EndDate is NULL.
The end date is later added in the Excel sheet, and I then want the C# ASP.NET application to UPDATE this information in the SQL table, not insert it as a new row.
End Data
ID StartDate EndDate OrderNumber
1 01/02/2015 02/02/2015 100
2 02/02/2015 03/02/2015 100
3 03/02/2015 04/02/2015 101
4 04/02/2015 05/02/2015 102
5 05/02/2015 06/02/2015 103
Thanks for any help you can give me.
The code you already have provides an efficient way of loading a file into SQL Server, but it doesn't provide any flexibility. Your options are to rewrite your C# and do the INSERT/UPDATE one row at a time, or to change your approach slightly and bulk insert the new data into a working table, then call a stored procedure to do the actual INSERT/UPDATE from the working table into the real table. The latter would be my recommendation; a sketch of it follows.
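A minimal sketch of that approach, assuming a working table called dbo.OrdersStaging with the same columns as the Excel sheet (ID, StartDate, EndDate, OrderNumber) and reusing the con_str and dr variables from the question; the table and column names are illustrative only:

// 1) Bulk copy the Excel reader into the working (staging) table instead of the real table.
using (SqlConnection sqlCon = new SqlConnection(con_str))
{
    sqlCon.Open();
    using (SqlBulkCopy staging = new SqlBulkCopy(sqlCon))
    {
        staging.DestinationTableName = "dbo.OrdersStaging";
        staging.WriteToServer(dr);
    }
    // 2) Update the real table from the staging table, matching on the primary key.
    //    This statement could equally live inside the recommended stored procedure.
    string updateSql =
        "UPDATE t SET t.EndDate = s.EndDate " +
        "FROM dbo.TableName t INNER JOIN dbo.OrdersStaging s ON t.ID = s.ID;";
    using (SqlCommand update = new SqlCommand(updateSql, sqlCon))
    {
        update.ExecuteNonQuery();
    }
}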
I believe you will need to iterate over the data reader, test for the date value, and, when it is found, issue a single update statement, e.g.:
while (reader.Read())
{
//test if row exists
//then Update
//else Insert
}
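A fleshed-out version of that loop might look like the following. This is only a sketch: it assumes reader is the OleDbDataReader (dr) from the question, sqlCon is an open SqlConnection to the SQL database, the target table is dbo.TableName keyed on ID, and the columns match the example data. If ID is an identity column, the INSERT branch would need adjusting.

while (reader.Read())
{
    // UPDATE the row if the ID already exists, otherwise INSERT it.
    string upsertSql =
        "IF EXISTS (SELECT 1 FROM dbo.TableName WHERE ID = @ID) " +
        "    UPDATE dbo.TableName SET EndDate = @EndDate WHERE ID = @ID " +
        "ELSE " +
        "    INSERT INTO dbo.TableName (ID, StartDate, EndDate, OrderNumber) " +
        "    VALUES (@ID, @StartDate, @EndDate, @OrderNumber);";
    using (SqlCommand upsert = new SqlCommand(upsertSql, sqlCon))
    {
        upsert.Parameters.AddWithValue("@ID", reader["ID"]);
        upsert.Parameters.AddWithValue("@StartDate", reader["StartDate"]);
        upsert.Parameters.AddWithValue("@EndDate", reader["EndDate"]);
        upsert.Parameters.AddWithValue("@OrderNumber", reader["OrderNumber"]);
        upsert.ExecuteNonQuery();
    }
}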
I have a database with 8 tables. I fill my dataGridView and then save changes using the same block of code for each table, but I can only save 4 of the 8 tables; the other 4 tables give me the error
Concurrency violation: the UpdateCommand affected 0 of the expected 1 records when I try to save the changes in them.
This is how I fill my dataGridView:
{
string script = "SELECT * FROM hatr.rbt;";
mycon = new MySqlConnection(connect);
mycon.Open();
MySqlDataAdapter ms_data = new MySqlDataAdapter(script, connect);
SD.DataTable table = new SD.DataTable();
ms_data.Fill(table);
DataSet ds = new DataSet();
dataGridView1.DataSource = table;
mycon.Close();
ds.AcceptChanges();
}
And this is how I save the changes:
{
string script = "SELECT id, name, Model_preparation_R_Hr, Time_for_preparation_hr, Time_for_post_processing_hr, YZV_work FROM rbt;";
mycon = new MySqlConnection(connect);
mycon.Open();
MySqlDataAdapter ms_data = new MySqlDataAdapter(script, connect);
var cb = new MySqlCommandBuilder(ms_data);
cb.GetInsertCommand();
DataSet ds = new DataSet();
ms_data.Update(dataGridView1.DataSource as DataTable);
ms_data.InsertCommand = cb.GetInsertCommand();
ms_data.InsertCommand.CommandText += "; select * from rbt where id = last_insert_id();";
ms_data.InsertCommand.UpdatedRowSource = UpdateRowSource.FirstReturnedRecord;
}
I also know that the error occurs in the line:
ms_data.Update(dataGridView1.DataSource as DataTable);
Both blocks of code are the same for every table, but the second block, which is needed to save the dataGridView changes, for some reason gives me the UpdateCommand error mentioned above. I couldn't figure out why I get this error for some tables and not for others.
So maybe there is some other way to save changes made in a dataGridView? Because this is the only one I could come up with for now.
I finally found the reason for the Concurrency violation: the UpdateCommand affected 0 of the expected 1 records error. I created my database in MySQL Workbench and was working with it through a dataGridView in my app, and it turns out there is a conflict between FLOAT fields in MySQL Workbench and the dataGridView. In MySQL Workbench a float number is written with a dot, like "0.1", while in the dataGridView it appears with a comma, like "0,1", so the same float field is written in two different ways in the two places, and that causes the error. I may be wrong about some of the details, but I simply recreated all FLOAT fields as DOUBLE and now it works.
I'm writing an MVC application and currently working on being able to import employee data from an Excel spreadsheet. I have 2 non-null columns in the database that are not present in the excel file (employeestatus - bit, and datecreated - datetime).
Is there a way to set values for these columns when importing the others? If I create new columns in the spreadsheet for these fields and add them to the mapping, it works perfectly, but I want to avoid the client having to do that.
//where data gets stored in db
sqlBulk.DestinationTableName = "Employee";
//mappings
sqlBulk.ColumnMappings.Add("ID", "id");
sqlBulk.ColumnMappings.Add("LastName", "employeelastname");
sqlBulk.ColumnMappings.Add("FirstName","employeefirstname");
sqlBulk.ColumnMappings.Add("MI", "employeemi");
sqlBulk.ColumnMappings.Add("StreetAddress1", "streetaddress");
sqlBulk.ColumnMappings.Add("City", "employeecity");
sqlBulk.ColumnMappings.Add("State", "employeestate");
sqlBulk.ColumnMappings.Add("Zip", "employeezip");
sqlBulk.ColumnMappings.Add("Payroll", "payroll");
sqlBulk.ColumnMappings.Add("Salary", "salary");
sqlBulk.ColumnMappings.Add("POBox", "pobox");
sqlBulk.ColumnMappings.Add("POBoxCity", "poboxcity");
sqlBulk.ColumnMappings.Add("POBoxState", "poboxstate");
sqlBulk.ColumnMappings.Add("Org", "orgcode");
//write and close connection
sqlBulk.WriteToServer(dReader);
excelConnection.Close();
Ideally, it'd be like:
sqlBulk.ColumnMapping.Set("status", "1");
sqlBulk.ColumnMapping.Set("empdatecreated") = DateTime.Now();
My searches are taking me in circles of examples where all the data needed is already in the spreadsheet.
If I put this into a temp table first, could I set the column values for all the rows then?
Thank you
If your dReader is a data reader connected via the Jet/ACE driver to a spreadsheet:
var cmd = new OleDbCommand("SELECT * FROM [Sheet1$]", excelConnStr);
dReader = cmd.ExecuteReader();
You could add the columns you want to that SQL so they are available on the reader:
var cmd = new OleDbCommand("SELECT *, 1 as status, NOW() as empdatecreated FROM [Sheet1$]", excelConnStr);
dReader = cmd.ExecuteReader();
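If your SqlBulkCopy uses explicit column mappings like the ones in your question, you could then map those two extra reader columns as well (the destination column names below are taken from your description and may need adjusting):

sqlBulk.ColumnMappings.Add("status", "employeestatus");
sqlBulk.ColumnMappings.Add("empdatecreated", "datecreated");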
Simply put, I have an Excel file that I load into SQL Server 2008.
I want to insert the current date into the rows added from Excel as the data is transferred, so that a date is inserted automatically every time I add data, without ever losing the dates on the older rows. How can I do this?
string ssqltable = comboBox1.GetItemText(comboBox1.SelectedItem);
string myexceldataquery = "select * from [" + ssqltable + "$]";
try
{
OleDbConnection oconn = new OleDbConnection(@"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + imagepath + ";Extended Properties='Excel 12.0 Xml; HDR=YES;IMEX=1;';");
string ssqlconnectionstring = "Data Source=.;Initial Catalog=Bioxcell;Integrated Security=true";
OleDbCommand oledbcmd = new OleDbCommand(myexceldataquery, oconn);
oconn.Open();
SqlBulkCopy bulkcopy = new SqlBulkCopy(ssqlconnectionstring);
DataTable dt = new DataTable();
dt.Load(oledbcmd.ExecuteReader());
bulkcopy.DestinationTableName = ssqltable;
for (int i = 0; i < dt.Columns.Count; i++)
{
bulkcopy.ColumnMappings.Add(i, i);
}
bulkcopy.WriteToServer(dt);
oconn.Close();
}
I used this, but it inserts the date for only the last record.
I want to insert the current date in newly created rows without losing the dates on previous rows; for example, each time I insert data, the new rows should get the new date while the old rows keep their old dates.
SqlCommand Update6 = new SqlCommand("insert into Overseas (Date) Values('" + DateTime.Now.ToShortDateString() + "')", conn);
Update6.ExecuteScalar();
while using
SqlCommand Update6 = new SqlCommand("insert into Overseas (Date) Values (GETDATE())", conn);
Update6.ExecuteNonQuery();
the result was as shown in the screenshot I attached (image not reproduced here).
So what's the solution?
Set GETDATE() as the default constraint for the Date column of your Overseas table.
SQL Command to add constraint:
ALTER TABLE Overseas
ADD CONSTRAINT DF_Overseas_Date DEFAULT GETDATE() FOR [Date]
After this, you need not pass any value to this column while inserting. Just insert the remaining columns and the date will be filled in automatically, as in the sketch below.
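For illustration only, an insert that omits the Date column might look like this (the EnglishName and ProductCode column names are taken from the query further down; adjust them to your real column list):

// Date is omitted here, so the DF_Overseas_Date default fills it in automatically.
SqlCommand insertCmd = new SqlCommand(
    "INSERT INTO Overseas (EnglishName, ProductCode) VALUES (@name, @code)", conn);
insertCmd.Parameters.AddWithValue("@name", "Sample name");
insertCmd.Parameters.AddWithValue("@code", "P001");
insertCmd.ExecuteNonQuery();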
OR
Try this:
SqlCommand Update6 = new SqlCommand("insert into Overseas (Date) Values (GETDATE())", conn);
Update6.ExecuteScalar();
Based on your DB screenshot hope this query helps:
INSERT INTO Overseas (EnglishName,ProductCode,ProductName,TerritoryCode,TerritoryName,Salesvalue,CreditValue,NetSalesValue,Sales,Bonus,Bioxellbricks,BioxellTerritories,Date,ID)
SELECT EnglishName,ProductCode,ProductName,TerritoryCode,TerritoryName,Salesvalue,CreditValue,NetSalesValue,Sales,Bonus,Bioxellbricks,BioxellTerritories,GETDATE(),ID from [exceltablename]
Here Overseas is your DB table and exceltablename is your source sheet from Excel.
You can simply use the DateTime.Now property to get the current date.
Use the zzz format specifier to get the time zone offset as hours and minutes, and the HH format specifier to get the hours in 24-hour format.
DateTime.Now.ToString("yyyy-MM-ddTHH:mm:sszzz")
Result:
2011-08-09T23:49:58+02:00
Some culture settings use periods instead of colons for time, so you might want to use literal colons instead of the time separator:
DateTime.Now.ToString("yyyy-MM-ddTHH':'mm':'sszzz")
Custom Date and Time Format Strings
Hope this works for you. Thank you.
There is a function GETDATE() for this purpose.
Returns the current database system timestamp as a datetime value
without the database time zone offset. This value is derived from the
operating system of the computer on which the instance of SQL Server
is running.
https://learn.microsoft.com/en-us/sql/t-sql/functions/getdate-transact-sql
I have a task where I want to copy all data from one database to another database while skipping 2 tables. There are more than 200 tables.
I already have the table structure ready for my 2nd database.
So as a solution I created a page, and on a button click I run the code below:
DataSet ds = new DataSet();
string connectionString = "Data Source=COMP112\\MSSQLSERVER2014;Initial Catalog=HCMBL;Integrated Security=True;Persist Security Info=True";
SqlConnection con = new SqlConnection(connectionString);
//render table name from database
string sqlTable = "SELECT * FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE='BASE TABLE' and TABLE_Schema='" + Session["SchemaName"].ToString() + "' and TABLE_NAME!='ENTRY' and TABLE_NAME!='OT' and TABLE_NAME!='BL_ENTRY' and TABLE_NAME!='BL_OT'";
con.Open();
SqlDataAdapter da = new SqlDataAdapter();
SqlCommand cmd = new SqlCommand(sqlTable, con);
cmd.CommandType = CommandType.Text;
da.SelectCommand = cmd;
da.Fill(ds);
con.Close();
//render connection string from WebConfig file
string strcon = ConfigurationManager.ConnectionStrings["SPSchema"].ConnectionString;
for (int i = 0; i < ds.Tables[0].Rows.Count; i++)
{
if (!(ds.Tables[0].Rows[i]["TABLE_NAME"].ToString().Contains("Asp")))
{
string deleteQuery = "Truncate table " + Session["SchemaName"].ToString() + "." + ds.Tables[0].Rows[i]["TABLE_NAME"];
con.Open();
SqlCommand cmdDelete = new SqlCommand(deleteQuery, con);
cmdDelete.ExecuteNonQuery();
con.Close();
DataSet dataSet = new DataSet();
SqlConnection conn = new SqlConnection(strcon);
conn.Open();
string selectData = "select * from " + Session["SchemaName"].ToString() + "." + ds.Tables[0].Rows[i]["TABLE_NAME"];
SqlCommand command = new SqlCommand(selectData, conn);
DataTable dataTable = new DataTable();
SqlDataAdapter dataAdapter = new SqlDataAdapter(selectData, conn);
dataAdapter.FillSchema(dataSet, SchemaType.Mapped);
dataAdapter.Fill(dataSet);
dataTable = dataSet.Tables[0];
conn.Close();
if (dataSet.Tables[0].Rows.Count > 0)
{
//Connect to second Database and Insert row/rows.
SqlConnection conn2 = new SqlConnection(connectionString);
conn2.Open();
SqlBulkCopy bulkCopy = new SqlBulkCopy(conn2);
bulkCopy.DestinationTableName = Session["SchemaName"].ToString() + "." + ds.Tables[0].Rows[i]["TABLE_NAME"].ToString();
bulkCopy.WriteToServer(dataTable);
conn2.Close();
}
}
}
When I run the above code, after inserting data into fewer than 10 tables it throws an out of memory exception and the program crashes.
How do I handle this? I tried increasing the memory capacity of SQL Server but still get the same error.
Is there any other way to achieve this task?
What you are doing is very far from the best solution. You are using an ASP.NET MVC process to load all the data of your entire database into memory and then output it to another database. If your database is anything more than small and trivial, that will most definitely fill your process's allotted memory.
This type of task should never be done through the memory of a process, but rather using some form of Backup/Restore pattern.
You should look into SSIS projects and create an extract, transfer, and load (ETL) solution, which can be triggered from your ASP.NET MVC solution asynchronously.
An SSIS solution can be triggered from C# code in this way:
// Requires a reference to the Microsoft.SqlServer.ManagedDTS assembly and
// using Microsoft.SqlServer.Dts.Runtime;
var app = new Application();
var package = app.LoadPackage("compiled-package.dtsx", null);
var results = package.Execute();
See this question for a little more information (not specifically about duplicating databases, but has information about triggering SSIS packages from code): How to execute an SSIS package from .NET?
Alternatively
You also have the option of running a query against both databases at once, however this requires some additional plumbing to be done. The user account of your ASP.NET MVC solution needs to have access to both databases. If your databases are hosted on different servers, you also need to link one server to the other: Create linked servers
To perform an insert directly from the output of a select, consider this:
string source = "NAME_OF_SOURCE_DATABASE";
string target = "NAME_OF_TARGET_DATABASE";
string schema = Session["SchemaName"].ToString();
string table = ds.Tables[0].Rows[i]["TABLE_NAME"].ToString();
// Uncomment this if you need to deal with autoincrement columns
/*string idInsQuery = $"SET IDENTITY_INSERT {target}.{schema}.{table} ON";
var idInsCommand = new SqlCommand(idInsQuery, conn);
idInsCommand.ExecuteNonQuery();*/
string insQuery = $"INSERT INTO {target}.{schema}.{table} SELECT * FROM {source}.{schema}.{table}";
var insCommand = new SqlCommand(insQuery, conn);
insCommand.ExecuteNonQuery();
// Uncomment this if you need to deal with autoincrement columns
/*string idInsQuery2 = $"SET IDENTITY_INSERT {target}.{schema}.{table} OFF";
var idInsCommand2 = new SqlCommand(idInsQuery2, conn);
idInsCommand2.ExecuteNonQuery();*/
This will only work if the table structures are identical. There might be problems with autoincrement ids or columns with default values, too.
This will copy data from a table in database 1 to a table in database 2
Insert into db2.dbo.table2 (col1,col2)
Select col1,col2 from db1.dbo.table1
Run this SQL statement and the data will be copied without a round trip to your app.
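If you want to issue that statement from your button-click code rather than running it by hand, a minimal sketch (reusing the con SqlConnection from the question; the db/table/column names above are placeholders) could be:

string copySql = "INSERT INTO db2.dbo.table2 (col1, col2) " +
                 "SELECT col1, col2 FROM db1.dbo.table1;";
using (SqlCommand copyCmd = new SqlCommand(copySql, con))
{
    con.Open();
    copyCmd.ExecuteNonQuery(); // the copy runs entirely on the server; no data is pulled into the app
    con.Close();
}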
Let me know if you find my approach useful.
First of all, why write a whole application to do this job when SQL Server has built-in features to do it?
My approach would be to configure a linked server and then configure which tables you want to copy and which ones you don't.
https://learn.microsoft.com/en-us/sql/relational-databases/linked-servers/create-linked-servers-sql-server-database-engine
Secondly, you can just write a simple stored procedure and schedule it in your SQL Server to push the data into the other server's database on whatever schedule you like. This way you can control it in any number of ways, I mean controlling any dependencies (table level or business level).
To do this in t-sql, you can use the following system stored procedures to schedule a daily job. This example schedules daily at 1:00 AM. See Microsoft help for details on syntax of the individual stored procedures and valid range of parameters.
DECLARE @job_name NVARCHAR(128), @description NVARCHAR(512), @owner_login_name NVARCHAR(128), @database_name NVARCHAR(128);
SET @job_name = N'Some Title';
SET @description = N'Periodically do something';
SET @owner_login_name = N'login';
SET @database_name = N'Database_Name';
-- Delete job if it already exists:
IF EXISTS(SELECT job_id FROM msdb.dbo.sysjobs WHERE (name = @job_name))
BEGIN
EXEC msdb.dbo.sp_delete_job
@job_name = @job_name;
END
-- Create the job:
EXEC msdb.dbo.sp_add_job
@job_name=@job_name,
@enabled=1,
@notify_level_eventlog=0,
@notify_level_email=2,
@notify_level_netsend=2,
@notify_level_page=2,
@delete_level=0,
@description=@description,
@category_name=N'[Uncategorized (Local)]',
@owner_login_name=@owner_login_name;
-- Add server:
EXEC msdb.dbo.sp_add_jobserver @job_name=@job_name;
-- Add step to execute SQL:
EXEC msdb.dbo.sp_add_jobstep
@job_name=@job_name,
@step_name=N'Execute SQL',
@step_id=1,
@cmdexec_success_code=0,
@on_success_action=1,
@on_fail_action=2,
@retry_attempts=0,
@retry_interval=0,
@os_run_priority=0,
@subsystem=N'TSQL',
@command=N'EXEC my_stored_procedure; -- OR ANY SQL STATEMENT',
@database_name=@database_name,
@flags=0;
-- Update job to set start step:
EXEC msdb.dbo.sp_update_job
@job_name=@job_name,
@enabled=1,
@start_step_id=1,
@notify_level_eventlog=0,
@notify_level_email=2,
@notify_level_netsend=2,
@notify_level_page=2,
@delete_level=0,
@description=@description,
@category_name=N'[Uncategorized (Local)]',
@owner_login_name=@owner_login_name,
@notify_email_operator_name=N'',
@notify_netsend_operator_name=N'',
@notify_page_operator_name=N'';
-- Schedule job:
EXEC msdb.dbo.sp_add_jobschedule
@job_name=@job_name,
@name=N'Daily',
@enabled=1,
@freq_type=4,
@freq_interval=1,
@freq_subday_type=1,
@freq_subday_interval=0,
@freq_relative_interval=0,
@freq_recurrence_factor=1,
@active_start_date=20170101, --YYYYMMDD
@active_end_date=99991231, --YYYYMMDD (this represents no end date)
@active_start_time=010000, --HHMMSS
@active_end_time=235959; --HHMMSS
Let me know in case you need more details on this.
Thanks,
Ayan
I am working on an application which gets data from two different databases (i.e. Database1.Table1 and Database2.Table2), then compares the two tables (the comparison is done only on the primary key, i.e. ID) and inserts rows from Database1.Table1 into Database2.Table2 if they do not exist in Database2.Table2.
The problem is that there is a huge amount of data (about 0.8 million rows in both tables) and the comparison takes a lot of time. Is there any way to do this faster?
NOTE: I am using DataTables in C# to compare the tables. The code is given below.
DataTable Database1_Table1;// = method to get all data from Database1.Table1
DataTable Database2_Table2;// = method to get all data from Database2.Table2
foreach (DataRow row in Database1_Table1.Rows) //(var GoodClass in Staging_distinct2)
{
if (Database2_Table2.Select("ID=" + row["ID"]).Count() < 1)
{
sqlComm = new SqlCommand("Delete from Database1.Table1 where Id=" + row["ID"], conn);
sqlComm.ExecuteNonQuery();
sqlComm = new SqlCommand("INSERT INTO Database2.Table2 Values (#ID,#EmpName,#Email,#UserName)", conn);
sqlComm.Parameters.Add("#ID", SqlDbType.Int).Value = row["ID"];
sqlComm.Parameters.Add("#EmpName", SqlDbType.VarChar).Value = row["EmpName"];
sqlComm.Parameters.Add("#Email", SqlDbType.VarChar).Value = row["Email"];
sqlComm.Parameters.Add("#UserName", SqlDbType.VarChar).Value = row["UserName"];
sqlComm.ExecuteNonQuery();
totalCount++;
added++;
}
else
{
deleted++;
totalCount++;
}
}
Submit this SQL from your application to the database:
INSERT INTO Database1..Table1 (Key, Column1,Column2)
SELECT Key, Column1,Column2
FROM Database2..Table2
WHERE NOT EXISTS (
SELECT * FROM Database1..Table1
WHERE Database1..Table1.Key = Database2..Table2.Key
)
It will copy all rows from Database2..Table2 whose Key does not already exist in Database1..Table1.
It will do it on the database server. No needless round trip of data. No RBAR (Row By Agonising Row). The only downside is you can't get a progress bar - do it asynchronously.
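As one possible way to run it asynchronously from C# (a sketch only: it assumes the SQL above is stored in a string named insertMissingSql, a connection string named connectionString, and that this code sits inside an async method):

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(insertMissingSql, conn))
{
    cmd.CommandTimeout = 0;            // the set-based insert may take a while on 0.8 million rows
    await conn.OpenAsync();
    await cmd.ExecuteNonQueryAsync();  // runs on the server; the calling thread is not blocked
}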
Bulk update/insert is the fastest way (SqlBulkCopy).
http://www.jarloo.com/c-bulk-upsert-to-sql-server-tutorial/
The best way to handle this is to bulk insert into a temp table, then issue a MERGE statement from that temp table into your production table. I do this with millions of rows a day without issue. I have an example of the technique on my blog: C# Sql Server Bulk Upsert. A rough sketch of the pattern follows.
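This is only a sketch of the idea, not the blog's exact code: the connection string, temp table shape, and column types are assumptions based on the column names in the question, so adjust them to your real schema.

using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();

    // 1) Create a temp table with the same shape as the target and bulk copy the source rows into it.
    new SqlCommand(
        "CREATE TABLE #Staging (ID int, EmpName varchar(100), Email varchar(100), UserName varchar(100));",
        conn).ExecuteNonQuery();

    using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "#Staging";
        bulk.WriteToServer(Database1_Table1); // the DataTable loaded from Database1.Table1
    }

    // 2) MERGE from the temp table into the production table in a single server-side statement.
    string mergeSql =
        "MERGE Database2.dbo.Table2 AS target " +
        "USING #Staging AS source ON target.ID = source.ID " +
        "WHEN NOT MATCHED BY TARGET THEN " +
        "    INSERT (ID, EmpName, Email, UserName) " +
        "    VALUES (source.ID, source.EmpName, source.Email, source.UserName);";
    new SqlCommand(mergeSql, conn).ExecuteNonQuery();
}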