How to run a SQL file having a large number of lines? - C#

I have a SQL file (filenameScript) with more than 10k lines of code. Each block of SQL starts with GO and ends with GO. While executing the file from C#, I get an exception near the GO statements, but when I run the same file in SQL Server it works fine.
con.ConnectionString = sqlconn;
FileInfo file = new FileInfo(filenameScript);
string script = file.OpenText().ReadToEnd();
SqlCommand command = new SqlCommand(script, con);
con.Open();
command.ExecuteNonQuery();
con.Close();
I think ExecuteNonQuery is not able to handle so many \n, \r, and \t characters, since the file is read into a single string containing many \n and \r.
Is there any other method to do the same?
Thanks in advance.

No, the issue is not the length of the file, nor the presence of \r and/or \n characters. It is that executing SQL this way can only run a single batch, and the GO statements split the script into multiple batches. GO is not T-SQL; it is a batch separator understood by client tools such as SSMS and sqlcmd, which is why the script works there but fails through SqlCommand.
One possibility is to split the text on the GO separators and execute each batch individually (splitting on whole lines avoids breaking identifiers that merely contain the letters GO):
con.ConnectionString = sqlconn;
// Requires System.Linq and System.Text.RegularExpressions.
// Split on lines containing only GO so identifiers that merely contain "GO" are unaffected.
var commands = Regex.Split(File.ReadAllText(filenameScript), @"^\s*GO\s*$",
                           RegexOptions.Multiline | RegexOptions.IgnoreCase)
                    .Where(b => !string.IsNullOrWhiteSpace(b));
con.Open();
foreach (var batch in commands)
{
    SqlCommand command = new SqlCommand(batch, con);
    command.ExecuteNonQuery();
}
con.Close();
Additionally, you could wrap that in a Transaction to ensure all batches are executed atomically.
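A minimal sketch of that, reusing con and commands from the snippet above (note that a few statements, such as CREATE DATABASE, cannot run inside a transaction):
con.Open();
using (SqlTransaction tx = con.BeginTransaction())
{
    try
    {
        foreach (var batch in commands)
        {
            // Each command must be enlisted in the transaction explicitly.
            using (var command = new SqlCommand(batch, con, tx))
            {
                command.ExecuteNonQuery();
            }
        }
        tx.Commit();
    }
    catch
    {
        tx.Rollback(); // any failing batch undoes all previous ones
        throw;
    }
}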
An alternative is also provided in this SO Question: How do I execute a large SQL script (with GO commands) from c#
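For reference, the approach from that question uses SQL Server Management Objects (SMO), which understands GO separators natively. A sketch, assuming the SMO assemblies (e.g. the Microsoft.SqlServer.SqlManagementObjects package) are referenced:
using Microsoft.SqlServer.Management.Common;
using Microsoft.SqlServer.Management.Smo;

string script = File.ReadAllText(filenameScript);
using (var con = new SqlConnection(sqlconn))
{
    Server server = new Server(new ServerConnection(con));
    // ExecuteNonQuery here handles the GO batch separators for you.
    server.ConnectionContext.ExecuteNonQuery(script);
}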

Related

Script task in SSIS under a Foreach Loop container takes a lot of time to complete

I need to execute C# code for each record I get from a table.
I pass these records using a Foreach Loop container in SSIS, which calls the script task for each record.
In the script task I have C# code that executes some logic and loads the results into another table.
When I run it, I can see all the data loaded into the table quickly.
But the whole process does not complete: the package takes 2 hours to report success.
What could be the reason for this, and how can we solve it?
I tried closing the DB connection for each record when it inserts into the table:
SqlConnection conn = new SqlConnection(connectionString);
// Parameter placeholders in SqlCommand text use @, and must match the names added below.
string query = "insert into [TEMP_PRE_STG] (id, addressType, country, STATUS) values (@ZGPFTGID, @addressline, @group1, @group2)";
SqlCommand cmd = new SqlCommand(query, conn);
cmd.Parameters.AddWithValue("@ZGPFTGID", Dts.Variables["User::hid"].Value);
cmd.Parameters.AddWithValue("@addressline", address);
cmd.Parameters.AddWithValue("@group1", Dts.Variables["User::group1"].Value);
cmd.Parameters.AddWithValue("@group2", Dts.Variables["User::group2"].Value); // the original line was cut off here
conn.Open();
cmd.ExecuteNonQuery();
conn.Close();
The package should complete soon after the data is loaded into the table.
The issue is resolved. I created a Data Flow Task and called a Script Component inside it; this improved the performance.

Is there a method other than using SSIS to get data from SQL Server to Oracle?

The problem is that in SSIS I use an ADO source to an ADO destination, and this method only writes about 88 rows per second, which is very slow.
using Oracle.ManagedDataAccess.Client;
using System;
using System.Data;
using System.Data.SqlClient;

namespace SQLconnection
{
    internal static class Program
    {
        private static void Main(string[] args)
        {
            SqlConnection conn = new SqlConnection("Data Source=;Database=;Integrated Security=yes");
            conn.Open();
            SqlCommand cmd = new SqlCommand("SELECT * FROM TABLE", conn);
            SqlDataReader reader = cmd.ExecuteReader();
            while (reader.Read())
            {
                Console.WriteLine(reader.GetString(0) + ", " + reader.GetString(19));
            }
            conn.Close();
            conn.Dispose();
            Console.ReadLine();

            OracleConnection con = new OracleConnection("User Id=;Password=;Data Source=;");
            con.Open();
            OracleCommand cmd2 = con.CreateCommand();
            cmd2.CommandText = "SELECT 'Hello World!' FROM dual";
            OracleDataReader reader2 = cmd2.ExecuteReader();
            reader2.Read();
            Console.WriteLine(reader2.GetString(0));
            Console.WriteLine(con.ServiceName);
            Console.WriteLine(con.ServerVersion);
            Console.WriteLine(con.HostName);
            con.Close();
            Console.ReadLine();
        }
    }
}
Is there any way I can make a connection and pass the data via a console application? I feel that would be faster than 88 rows per second.
Yes, you can write a console app that uses a native Oracle provider to pass data to Oracle.
https://www.oracle.com/webfolder/technetwork/tutorials/obe/db/dotnet/GettingStartedNETVersion/GettingStartedNETVersion.htm
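For the data transfer itself, row-by-row inserts will be slow from any client. A sketch of a bulk approach, assuming an ODP.NET build that includes OracleBulkCopy (the unmanaged provider has long had it, and recent managed-driver versions include it too); the connection strings and table names are placeholders:
using System.Data.SqlClient;
using Oracle.ManagedDataAccess.Client;

using (var src = new SqlConnection(sqlServerConnStr))
using (var dest = new OracleConnection(oracleConnStr))
{
    src.Open();
    dest.Open();
    using (var cmd = new SqlCommand("SELECT * FROM SourceTable", src))
    using (SqlDataReader reader = cmd.ExecuteReader())
    using (var bulk = new OracleBulkCopy(dest))
    {
        bulk.DestinationTableName = "DEST_TABLE";
        bulk.BatchSize = 5000;      // send rows in batches instead of one at a time
        bulk.WriteToServer(reader); // streams the reader straight into Oracle
    }
}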
I have found that file-based operations are much quicker when doing bulk data transfers.
I would investigate using the BCP utility to export delimited text files from SQL Server. Have a read: https://learn.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-2017
Getting it into Oracle could possibly be a little harder (I have very limited Oracle experience). As per the following question, you can investigate using SQL*Loader scripts:
Oracle: Import CSV file
There are a couple of gotchas when using BCP, though (an example command follows these points):
Depending on the makeup of the data (do you have commas and carriage returns in text fields?), consider using custom delimiters for both fields and records. These can be specified in your BCP command with the -t and -r options.
Obviously, make sure the fields and the data match (or are at least comparable) between formats. You can use the queryout option in BCP to run custom queries, which gives you the ability to cast where needed and order the columns the way you want.
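For illustration, a hedged sketch of such an export command (server, database, and query are placeholders; -S names the server, -T uses Windows authentication, -c exports character data, and -t / -r set the custom field and row terminators):
bcp "SELECT id, name, amount FROM MyDb.dbo.MyTable" queryout mytable.dat -S MYSERVER -T -c -t"|~|" -r"|\n|"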
It might not be the sexiest solution, but it works, is very repeatable, and can sustain a high throughput of data. We have been doing this for a Sybase ASE to SQL Server ETL process and saw processing times drop to 10% of what they were with other database-to-database methods.
Obviously, though, YMMV, so test first.

Is it better to run 1000-10000 commands in a single SQLCLR context connection?

I am using a job to continuously watch the table data. Inside the job I call the SQLCLR SP. Before the while loop I open the SQL connection; inside the for loop I access the database 1000-10000 times on only one connection. I won't close the DB connection until my work is done.
try
{
    using (SqlConnection connection = new SqlConnection("context connection=true"))
    {
        connection.Open();
        SqlDataAdapter adp = new SqlDataAdapter("select * from tableName", connection);
        DataSet ds = new DataSet();
        adp.Fill(ds, "TableName");
        DataTable dt = ds.Tables[0];
        // dt.Rows.Count may range from 1000 to 10000.
        for (int i = 0; i < dt.Rows.Count; i++)
        {
            int id = int.Parse(dt.Rows[i][0].ToString());
            // Get the table1 rows that have IsParsed = 0, manipulate their data,
            // insert the results into table2, then mark the rows IsParsed = 1.
            SqlCommand select = new SqlCommand("select * from table1 where IsParsed = 0 and Id = @id", connection);
            select.Parameters.AddWithValue("@id", id);
            using (SqlDataReader r1 = select.ExecuteReader())
            {
                // ... read and manipulate the row data ...
            }
            SqlCommand insert = new SqlCommand("insert into table2 (...) values (...)", connection);
            insert.ExecuteNonQuery();
            SqlCommand update = new SqlCommand("update table1 set IsParsed = 1 where Id = @id", connection);
            update.Parameters.AddWithValue("@id", id);
            update.ExecuteNonQuery();
            // Run the billing logic here and insert into the Billing table.
            SqlCommand billing = new SqlCommand("insert into Billing (...) values (...)", connection);
            billing.ExecuteNonQuery();
        }
    }
}
catch (Exception ex)
{
    // Handle/log the exception rather than swallowing it.
}
Is there any problem with this approach? Is there any issue with using the connection like this? Please let me know and provide suggestions.
I have gone through this article:
better way to Execute multiple commands in single connection
Here I am using the context connection and executing thousands of commands on a single connection. Is there any consideration of connection pooling with the context connection? And how does the performance of one command per connection compare with executing multiple commands on a single connection?
I also want to know whether the context connection and a regular connection yield the same result in both cases, since the SP is deployed in the DB itself. If I am wrong, please correct me.
There is no problem in executing a large number of queries over a single connection. In any case, you are using a SQLCLR procedure with the context connection, and as MSDN states:
using the context connection typically results in better performance and less resource usage. The context connection is an in-process–only connection, so it can contact the server "directly" by bypassing the network protocol and transport layers to send Transact-SQL statements and receive results. The authentication process is bypassed, as well.
Please refer to this link for more information on context and regular connections.
No. It is fine to execute a large number of queries over a single connection.
Your code would likely perform worse if you were to open/close a connection to run those three or four SQL queries for each of those 1000+ rows; see the sketch below for reusing commands on the one connection.
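As an illustration only (a sketch, not the poster's exact code, reusing connection and dt from the question): the per-row commands can be created once outside the loop and reused with parameters, which avoids re-parsing the SQL text on every iteration while staying on the one context connection:
// Prepare once; only the parameter value changes per row.
SqlCommand update = new SqlCommand("update table1 set IsParsed = 1 where Id = @id", connection);
SqlParameter idParam = update.Parameters.Add("@id", SqlDbType.Int);
for (int i = 0; i < dt.Rows.Count; i++)
{
    idParam.Value = int.Parse(dt.Rows[i][0].ToString());
    update.ExecuteNonQuery();
}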

SQL 'Execute As' Login Command and Linq to SQL

I am trying to execute a SQL query as another login using the 'Execute As' command. I am using LINQ to SQL, so I've generated a DataContext class and I am using the ExecuteQuery method to run the 'Execute As' SQL command. I then call a LINQ to SQL command that succeeds. However, every subsequent query fails with the following error:
A severe error occurred on the current command. The results, if any, should be discarded.
Here is the code snippet that I have tried:
SummaryDataContext summary = new SummaryDataContext();
summary.ExecuteQuery<CustomPostResult>(#"Execute as Login='Titan\Administrator'");
var test = summary.Customers.First();
var test2 = summary.Products.ToList();
No matter what I run as the second query, I receive the error message above. Any help would be appreciated.
I managed to get around this issue in my application by executing the query using ADO.NET classes.
SqlCommand cmd = new SqlCommand("EXECUTE AS USER = 'operator'");
cmd.Connection = dc.Connection as SqlConnection;
cmd.Connection.Open();
cmd.ExecuteNonQuery();
// do the rest of the queries using linq to sql
You may have already ruled this out, but one possible workaround would be to simply create the data context with a different connection string.
To edit the connection string, you can set the DataContext.Connection.ConnectionString property. I've done it before in the partial method OnCreated(), which gets called when the data context is created. I haven't tested, but I think you could also do:
YourDataContext dc = new YourDataContext();
dc.Connection.ConnectionString = "connection string here";
Here's an article that describes this as well: http://www.mha.dk/post/Setting-DataContext-Connection-String-at-runtime.aspx
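A sketch of the OnCreated() variant (YourDataContext stands in for the generated LINQ to SQL context, and the connection string is a placeholder):
public partial class YourDataContext
{
    // Called by the generated constructor; a convenient hook for swapping credentials.
    partial void OnCreated()
    {
        this.Connection.ConnectionString =
            "Data Source=...;Initial Catalog=...;User ID=operator;Password=...";
    }
}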
I was having a similar issue, and by looking at ruskey's answer I was able to Execute as User, but I noticed that I was getting errors when running other queries after that. It was due to a missing Revert. So for anyone having a similar issue, this is how the code looks:
SqlCommand cmd = new SqlCommand("EXECUTE AS USER = 'domain\\user';");
OSSDBDataContext dc = new OSSDBDataContext();
cmd.Connection = dc.Connection as SqlConnection;
cmd.Connection.Open();
cmd.ExecuteNonQuery();
//Execute stored procedure code goes here
SqlCommand cmd2 = new SqlCommand("REVERT;");
cmd2.Connection = dc.Connection as SqlConnection;
cmd2.ExecuteNonQuery();

How do I pull data from my old server to my SQL server?

I want to write code that transfers data from one server to my SQL Server. Before I put the data in, I want to delete the current data, then copy the data from one server to the other. How do I do that? These are snippets from the code I have so far:
string SQL = ConfigurationManager.ConnectionStrings["SQLServer"].ToString();
string OLD = ConfigurationManager.ConnectionStrings["Server"].ToString();
SqlConnection SQLconn = new SqlConnection(SQL);
string SQLstatement = "DELETE FROM Data"; // T-SQL uses DELETE FROM, not DELETE *
SqlCommand SQLcomm = new SqlCommand(SQLstatement, SQLconn);
SQLconn.Open();
SQLcomm.ExecuteNonQuery(); // clear the existing rows first
OdbcConnection conn = new OdbcConnection(OLD);
string statement = "SELECT * FROM BILL.TRANSACTIONS ";
statement += "WHERE (TRANSACTION='NEW') ";
OdbcCommand comm = new OdbcCommand(statement, conn);
comm.CommandTimeout = 0;
conn.Open();
// Read from the old server's SELECT command, not the SQL Server DELETE command.
OdbcDataReader myDataReader = comm.ExecuteReader();
while (myDataReader.Read())
{
    //...
}
conn.Close();
SQLconn.Close();
SQLconn.Dispose();
Depending on which version of SQL Server you are using, the standard solution here is to use either DTS (2000 and before) or SSIS (2005 and on). You can turn it into an executable if you need to, schedule it straight from SQL Server, or run it manually. Both tools are fairly robust (although SSIS much more so) with methods to clear existing data, rollback in case of errors, transform data if necessary, write out exceptions, etc.
If at all possible I'd try and do it all in SQL Server. You can create a linked server to your other database server. Then simply use T-SQL to copy the data across - it would look something like...
INSERT INTO new_server_table (field1, field2)
SELECT x, y
FROM mylinkedserver.myolddatabase.myoldtable
If you need to do this on a regular basis or clear out the data first you can do this as part of a scheduled task using the SQL Agent.
If you only need to import the data once, and you have a lot of data, why not use the "BULK INSERT" command? Link
T-SQL allows you to insert data from a select query. It would look something like this:
insert into Foo
select * from Bar;
As long as the field types align, this will work; otherwise you will have to massage the data from Bar to fit the fields of Foo.
If you only need to do this once, take a look at the Database Publishing Wizard (just Google it) and generate a script which does everything.
