bulkCopy.WriteToServer does not copy - c#

I'm trying to use bulkCopy.WriteToServer() to copy data from my remote SQL Server Express to my local SQL Server Express, but no data is ever written to my local SQL Server Express database.
The code steps over the WriteToServer() call instantly... I have no idea whether it is failing internally without surfacing an error message, though.
I have read How to duplicate a SQL Server 2000 table programatically using .NET 2.0? and I am using very similar code, although my remote server runs SQL Server 2008 Express and my local one runs SQL Server 2014 Express:
using (SqlConnection remoteConnection = new SqlConnection(remoteConnectionString))
{
    var query = "SELECT * FROM information_schema.tables WHERE table_type = 'base table'";
    SqlCommand commandGetTables = new SqlCommand(query, remoteConnection);
    try
    {
        remoteConnection.Open();
        SqlDataReader results = commandGetTables.ExecuteReader();
        while (results.Read())
        {
            tables.Add(results.GetString(2));
        }
        results.Close();
    }
    catch (Exception ex)
    {
        //stuff
    }
    finally
    {
        remoteConnection.Close();
    }
    remoteConnection.Open();
    foreach (var table in tables)
    {
        // Get data from the source table as a SqlDataReader.
        var commandSourceData = new SqlCommand("SELECT * FROM " + table + ";", remoteConnection);
        var reader = commandSourceData.ExecuteReader();
        using (SqlConnection destinationConnection = new SqlConnection(destinationConnectionString))
        {
            destinationConnection.Open();
            using (var bulkCopy = new SqlBulkCopy(destinationConnection))
            {
                bulkCopy.DestinationTableName = table;
                try
                {
                    // Write from the source to the destination.
                    bulkCopy.WriteToServer(reader);
                }
                catch (Exception ex)
                {
                    //stuff
                }
                finally
                {
                    //stuff removed for this post
                }
            }
        }
    }
    remoteConnection.Close();
}
return true;
I know this could be subject to SQL injection, etc., but this app is only used by me, and that is not the issue here.
What am I doing wrong?
Edit
I checked the value of reader (var reader = commandSourceData.ExecuteReader();) and it has entries as I would expect, meaning the read from the remote server is fine.

bulkCopy.DestinationTableName = table;
bulkCopy.WriteToServer(reader);
These lines are wrong; they are supposed to look like this:
bulkCopy.DestinationTableName = "dbo." + DataTable.TableName;
bulkCopy.WriteToServer(DataTable);
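A minimal sketch of how that change might look inside the loop from the question, assuming the destination tables live in the dbo schema (sourceTable is a hypothetical local name; the rest reuses the question's variables):

foreach (var table in tables)
{
    var commandSourceData = new SqlCommand("SELECT * FROM " + table + ";", remoteConnection);
    // Load the source rows into a DataTable first.
    var sourceTable = new DataTable(table);
    sourceTable.Load(commandSourceData.ExecuteReader());
    using (SqlConnection destinationConnection = new SqlConnection(destinationConnectionString))
    {
        destinationConnection.Open();
        using (var bulkCopy = new SqlBulkCopy(destinationConnection))
        {
            // Assumption: destination tables are in the dbo schema.
            bulkCopy.DestinationTableName = "dbo." + sourceTable.TableName;
            bulkCopy.WriteToServer(sourceTable);
        }
    }
}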

Related

Copying data from Table on MySQL Server to the local MySql Database table in C#

I'm trying to copy some columns from one table located on the MySQL Server to a local MySQL database using C#. How can I do it?
I'm not sure if I should use MySqlDataAdapter or bulk copy (which I don't understand much).
Bulk copy is the easier method: it takes a previously created DataTable and uploads it to the database.
You can use the following code to help. The function below returns a DataTable:
public DataTable FillTable(string sql)
{
    DataTable dt = new DataTable();
    // sourceConnectionString is a placeholder: insert the connection details of the DB to transfer FROM.
    using (MySqlConnection conn = new MySqlConnection(sourceConnectionString))
    using (MySqlCommand cmd = new MySqlCommand(sql, conn))
    {
        try
        {
            conn.Open();
            dt.Load(cmd.ExecuteReader());
        }
        catch (MySqlException ex)
        {
            Console.WriteLine("FillTable: " + ex.Message);
        }
    }
    return dt;
}
Then using the following you can send the data over to your new database:
DataTable newTable = FillTable("SELECT * FROM MyOldTable");
using (MySqlConnection destConnection = new MySqlConnection(destinationConnectionString))
{
    destConnection.Open();
    // MySqlConnector's MySqlBulkCopy takes an open connection rather than a connection string.
    var destination = new MySqlBulkCopy(destConnection);
    destination.DestinationTableName = "myNewTable";
    try
    {
        destination.WriteToServer(newTable);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}
Reference: MSDN
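One caveat, assuming the MySqlBulkCopy above is the one from the MySqlConnector library: it is implemented over LOAD DATA LOCAL INFILE, so the destination connection string must allow local-infile, for example:

// Assumption: MySqlConnector is the driver in use; it requires AllowLoadLocalInfile=true
// on the destination connection string (and local_infile enabled on the server).
var destinationConnectionString =
    "Server=myhost;Database=mydb;Uid=user;Pwd=pass;AllowLoadLocalInfile=true";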

How to query a Foxpro .DBF file with .NDX index file using the OLEDB driver in C#

I have a FoxPro .DBF file. I am using the OLEDB driver to read the .DBF file. I can query the DBF and utilize its .CDX index file (because it is automatically opened). My problem is that I want to query it with the .NDX index file (which is not automatically opened when the .DBF is opened). How can I open the .NDX file in C# using the OLEDB driver? The DBF is really big to search for a record without the index. Thanks all! Here is the code I am using to read the DBF.
OleDbConnection oleDbConnection = null;
try
{
    DataTable resultTable = new DataTable();
    using (oleDbConnection = new OleDbConnection("Provider=VFPOLEDB.1;Data Source=P:\\Test\\DSPC-1.DBF;Exclusive=No"))
    {
        oleDbConnection.Open();
        if (oleDbConnection.State == ConnectionState.Open)
        {
            OleDbDataAdapter dataAdapter = new OleDbDataAdapter();
            OleDbCommand command = oleDbConnection.CreateCommand();
            string selectCmd = @"select * from P:\Test\DSPC-1 where dp_file = '860003'";
            command.CommandType = CommandType.Text;
            command.CommandText = selectCmd;
            dataAdapter.SelectCommand = command;
            dataAdapter.Fill(resultTable);
            foreach (DataRow row in resultTable.Rows)
            {
                //Write the data of each record
            }
        }
    }
}
catch (Exception e)
{
    Console.WriteLine(e.Message);
}
finally
{
    try
    {
        oleDbConnection.Close();
    }
    catch (Exception e)
    {
        Console.WriteLine("Failed to close Oledb connection: " + e.Message);
    }
}
.NDX files wouldn't be opened by default, and they are really a thing of the past; why wouldn't you simply add your index to your .CDX? If that is not an option, then the ExecScript suggestion by DRapp is what you can do. He was very close. Here is how you could do that:
string myCommand = #"Use ('P:\Test\DSPC-1') alias myData
Set Index To ('P:\Test\DSPC-1_Custom.NDX')
select * from myData ;
where dp_file = '860003' ;
into cursor crsResult ;
nofilter
SetResultset('crsResult')";
DataTable resultTable = new DataTable();
using (oleDbConnection = new OleDbConnection(#"Provider=VFPOLEDB;Data Source=P:\Test"))
{
oleDbConnection.Open();
OleDbCommand command = new OleDbCommand("ExecScript", oleDbConnection);
command.CommandType = CommandType.StoredProcedure;
command.Parameters.AddWithValue("code", myCommand);
resultTable.Load(cmd.ExecuteReader());
oleDbConnection.Close();
}
Your connection string should only reference the PATH where the .dbf files are located.
Then you query by just the table name.
new OleDbConnection("Provider=VFPOLEDB.1;Data Source=P:\\Test\\;Exclusive=No")
selectCmd = @"select * from DSPC-1 where dp_file = '860003'";
As for using the .NDX: how / where was that created? Is that an old dBASE file you are using the Visual FoxPro driver for?
If it is separate as described, you might need to go via ExecScript() to explicitly open the file WITH the index first, THEN run your query. This is just a SAMPLE WITH YOUR FIXED value. You would probably have to PARAMETERIZE it, otherwise you would be open to SQL injection.
cmd.CommandText = string.Format(
    @"EXECSCRIPT([
        USE DSPC-1 INDEX YourDSPC-1.NDX
        SELECT * from DSPC-1 where dp_file = '860003'
    ])");
Also, you might have an issue with your table names being hyphenated; you may need to wrap them in [square-brackets], but I am not positive it is an issue.
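If it does turn out to matter, the bracketed form would look like this (same fixed value as above):

// Hypothetical: escaping the hyphenated table name in square brackets
selectCmd = @"select * from [DSPC-1] where dp_file = '860003'";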

Single insert vs. multiple inserts

I'm iterating over a (binary) file and inserting some values into a table.
After receiving the values, I insert them row by row into the database.
Part of my code (example values):
string sColumn1 = Getcolumn1Value();
string sColumn2 = Getcolumn2Value();

string sqlIns = "INSERT INTO table (column1, column2) " +
                "VALUES (@column_1, @column_2)";

SqlCommand cmdInsPacket = new SqlCommand(sqlIns, connection);
cmdInsPacket.Parameters.AddWithValue("@column_1", sColumn1);
cmdInsPacket.Parameters.AddWithValue("@column_2", sColumn2);

cmdInsPacket.ExecuteNonQuery();
cmdInsPacket.Parameters.Clear();
Because there are millions of records, I was wondering whether it is better to insert them in groups of, for example, 500 instead of one by one.
Any good suggestions to fix this (and is my group-insert suggestion useful)?
(MS SQL Server 2008 R2 and C#)
You can use SqlBulkCopy to insert multiple records. You can put the records in a DataTable and save them to the database in a single call.
// newProducts is a DataTable that holds the rows to insert.
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
    bulkCopy.DestinationTableName = "dbo.BulkCopyDemoMatchingColumns";
    try
    {
        bulkCopy.WriteToServer(newProducts);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}
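Since the question specifically asks about groups of 500, note that SqlBulkCopy can also commit in batches via its BatchSize property; a minimal sketch reusing the names above (500 is just the questioner's example figure):

using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
    bulkCopy.DestinationTableName = "dbo.BulkCopyDemoMatchingColumns";
    bulkCopy.BatchSize = 500;     // rows sent to the server per batch
    bulkCopy.BulkCopyTimeout = 0; // no timeout, for large loads
    bulkCopy.WriteToServer(newProducts);
}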
If you have the data in DataTable form, use this simple function:
static void InsertDataIntoSQLServerUsingSQLBulkCopy(DataTable bulkData)
{
    using (SqlConnection dbConnection = new SqlConnection("Data Source=ProductHost;Initial Catalog=yourDB;Integrated Security=SSPI;"))
    {
        dbConnection.Open();
        using (SqlBulkCopy s = new SqlBulkCopy(dbConnection))
        {
            s.DestinationTableName = "Your table name";
            // Map every source column to the destination column of the same name.
            foreach (DataColumn column in bulkData.Columns)
                s.ColumnMappings.Add(column.ColumnName, column.ColumnName);
            s.WriteToServer(bulkData);
        }
    }
}
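Hypothetical usage, assuming the DataTable's column names match the destination table (FillProductsTable is a made-up loader):

DataTable bulkData = FillProductsTable(); // your own method that builds the rows
InsertDataIntoSQLServerUsingSQLBulkCopy(bulkData);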

My aim is to import data from SQL Server and export it to an Oracle database. There might be thousands of records, so I am trying bulk import

I have a stored procedure on the SQL Server side returning the required data. I need to export it to the Oracle DB. I am using bulk copy; here is the code.
string connectionString = System.Configuration.ConfigurationManager.AppSettings.Get("ConnectionString");
string ConnectionStringOracle = System.Configuration.ConfigurationManager.AppSettings.Get("ConnectionStringOracle");
using (SqlConnection connection = new SqlConnection(connectionString))
{
    using (SqlCommand command = new SqlCommand("[SPNAME]", connection))
    {
        connection.Open();
        SqlDataReader rdr = command.ExecuteReader();
        using (OracleConnection destinationConnection = new OracleConnection(ConnectionStringOracle))
        {
            destinationConnection.Open();
            try
            {
                using (Oracle.DataAccess.Client.OracleBulkCopy bulkCopy = new Oracle.DataAccess.Client.OracleBulkCopy(ConnectionStringOracle))
                {
                    bulkCopy.DestinationTableName = "DESTTABLLENAME";
                    //bulkCopy.ColumnMappings.Add(1, 1);
                    bulkCopy.WriteToServer(rdr);
                    bulkCopy.Close();
                    bulkCopy.Dispose();
                    destinationConnection.Dispose();
                    connection.Close();
                }
            }
            catch (OracleException ex)
            {
            }
        }
        // ...
    }
    // ...
}
I am getting the following error:
OracleException: external component has thrown an exception

How to speed up odbc insert

So what I'm doing is reading a lot of data from a remote Netezza database and inserting it into another remote Oracle database. For that I'm using an ODBC driver. The problem is that there is a lot of data and it takes too much time. How can I speed it up?
Here is what I do:
First I create connection and command for inserting:
String connect = "Driver={Microsoft ODBC for Oracle};CONNECTSTRING=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=myhost)(PORT=myprt))(CONNECT_DATA=(SERVICE_NAME=myname)));Uid=uid;Pwd=pass";
connection = new OdbcConnection(connect);
connection.Open();
String q = #"INSERT INTO TEST (id)
VALUES (?)";
myCommand = new OdbcCommand(q,connection);
Then I read the data from Netezza:
String connect = "Driver={NetezzaSQL};servername=server;port=5480;database=db; username=user;password=pass;
string query = #"SELECT T2.LETO_MESEC, T1.*
FROM data T1
JOIN datga2 T2 ON T2.ID = T1.EFT_ID
WHERE T2.LETO_MESEC = '" + mesec + #"'";
using (OdbcConnection connection = new OdbcConnection(connect))
{
try
{
OdbcCommand command = new OdbcCommand(query, connection);
connection.Open();
OdbcDataReader reader = command.ExecuteReader();
int counter=0;
while (reader.Read())
{
int id_first = reader.GetInt32(5);
insertOracle(id_first);
}
}
catch (Exception e)
{
Console.WriteLine("ne dela" + e.ToString());
}
}
And finally my insert:
public void insertOracle(int id_first)
{
    try
    {
        myCommand.Parameters.Clear();
        myCommand.Parameters.Add(new OdbcParameter("id", id_first));
        myCommand.ExecuteNonQuery();
    }
    catch (Exception e)
    {
        Console.WriteLine("does not work: " + e.ToString());
    }
}
I noticed that this commits on every row; how can I remove that and speed things up? Right now it takes about 10 minutes for 20,000 rows.
Single inserts are always going to be slow -- start processing the data in arrays, selecting a batch of IDs from the source system and loading an array into the target.
Here is an article which might be helpful. http://www.oracle.com/technetwork/issue-archive/2009/09-sep/o59odpnet-085168.html
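A rough sketch of what that array approach can look like with ODP.NET array binding, as the linked article describes (this assumes the Oracle.DataAccess.Client / Oracle.ManagedDataAccess.Client provider rather than the Microsoft ODBC driver used above; names are placeholders):

// Hypothetical batch insert: one round trip binds the whole array of ids.
static void InsertBatch(OracleConnection conn, int[] ids)
{
    using (OracleCommand cmd = conn.CreateCommand())
    {
        cmd.CommandText = "INSERT INTO TEST (id) VALUES (:id)";
        cmd.ArrayBindCount = ids.Length; // execute once for all rows
        cmd.Parameters.Add(new OracleParameter("id", OracleDbType.Int32)
        {
            Value = ids // the bound array of values
        });
        cmd.ExecuteNonQuery(); // then commit once per batch
    }
}

Batching a few hundred to a thousand IDs per call and committing once per batch removes the per-row round trips and commits the questioner observed.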
