Import SQL database table into DataTable - C#

I am using the code below to connect to my SQL Server and pull data into a DataTable, but no data is being added to the DataTable even though the table in the database contains rows.
// SQL Server Connection Strings
sqlConnectionString = "Data Source=.;Initial Catalog= " + databaseName + " ;Integrated Security=true";
queryString = "select * from " + sqlTableName;
using (var connection = new SqlConnection(sqlConnectionString))
using (var adapter = new SqlDataAdapter(queryString, connection))
{
sqlDataTable.Clear();
adapter.Fill(sqlDataTable);
}
return sqlDataTable;

There are four possible problems with your code:
1) Try adding the schema name (for example "dbo"):
queryString = string.Format("select * from [dbo].[{0}]", sqlTableName);
2) Unhandled Exception
From MSDN DataTable.Clear()
All rows in all tables are removed. An exception is generated if the table has any enforced child relations that would cause child rows to be orphaned.
I think your code is throwing an exception at sqlDataTable.Clear() (or somewhere else) that you are not handling with a try/catch block, so the DataTable stays empty.
3) The database table is empty.
4) It shouldn't matter here, but try adding connection.Open().
I can't say anything more until you show the complete code for this function and how you assign its result to your original DataTable.
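In the meantime, here is a minimal sketch with points 1), 2) and 4) applied; the method name and parameters are assumptions for illustration, and it needs System.Data and System.Data.SqlClient:
public DataTable GetTable(string databaseName, string sqlTableName)
{
    var sqlConnectionString = "Data Source=.;Initial Catalog=" + databaseName + ";Integrated Security=true";
    var queryString = string.Format("select * from [dbo].[{0}]", sqlTableName);
    var sqlDataTable = new DataTable();
    try
    {
        using (var connection = new SqlConnection(sqlConnectionString))
        using (var adapter = new SqlDataAdapter(queryString, connection))
        {
            connection.Open();           // Fill would open it anyway, but failing here is easier to diagnose
            adapter.Fill(sqlDataTable);
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex);           // surface the real error instead of silently returning an empty table
        throw;
    }
    return sqlDataTable;
}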

Try to begin a transaction and commit it after you have added the data. I don't know much about C#, but I see a BeginTransaction() method on the SqlConnection object.

Related

How to populate SQL Server database with list of list of objects instead of looping

I have data coming from an Excel spreadsheet and I cannot change how it comes in. Is there a way to add IList<IList<Object>> values to a SQL Server database without looping as I do now? I am hitting a limit at around 5k rows.
I was also told that my current approach is open to SQL injection, so any alternatives are welcome.
public static void Load_Table()
{
// This function on_click populates the datagrid named JumpTable
// this.JumpTable.ItemsSource = null; // Clears the current datagrid before getting new data
// Pulls in the 2d table from the client sheet
IList<IList<Object>> client_sheet = Get(SetCredentials(), "$placeholder", "Client!A2:AY");
DBFunctions db = new DBFunctions();
db.postDBTable("DELETE FROM Client");
foreach (var row in client_sheet)
{
string exe = "INSERT INTO Client ([Tracker ID],[Request ID]) VALUES('" + row[0].ToString() + "','" + row[1].ToString() + "')";
db.postDBTable(exe);
}
}
Database functions
public SqlConnection getDBConnection()
{
// --------------< Function: Opens the connection to our database >-------------- \\
string connectionString = Properties.Settings.Default.connection_string; // Gets the connection source from properties
SqlConnection dbConnection = new SqlConnection(connectionString); // Creates the connection based off our source
if (dbConnection.State != ConnectionState.Open) dbConnection.Open(); // If it's not already open then open the connection
return dbConnection;
}
public DataTable getDBTable(string sqlText)
{
// --------------< Function: Gets the table from our database >-------------- \\
SqlConnection dbConnection = getDBConnection();
DataTable table = new DataTable();
SqlDataAdapter adapter = new SqlDataAdapter(sqlText, dbConnection);
adapter.Fill(table);
return table;
}
public void postDBTable(string sqlText)
{
// --------------< Function: Post data to our database >-------------- \\
SqlConnection dbConnection = getDBConnection();
SqlCommand cmd = new SqlCommand(sqlText, dbConnection);
cmd.ExecuteNonQuery();
}
I have worked with lots of bulk data loads in the past. There are two main ways to avoid individual inserts with SQL Server: a list of values inserted in one statement, or bulk insert.
The first option, a list of values, looks like this:
INSERT INTO Foo (Bar,Baz)
VALUES ('bar','baz'),
('anotherbar','anotherbaz')
In C# you would loop through your list and build the VALUES content; however, doing this without a SQL injection vulnerability is difficult, so build the statement with parameters, as in the sketch below.
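A rough sketch of that approach, assuming the Client table and connection string from the question (uses System.Text.StringBuilder). Note that SQL Server caps a single statement at 1000 VALUES rows and 2100 parameters, so batch in chunks of a few hundred rows:
var sql = new StringBuilder("INSERT INTO Client ([Tracker ID],[Request ID]) VALUES ");
using (var con = new SqlConnection(Properties.Settings.Default.connection_string))
using (var cmd = new SqlCommand())
{
    int p = 0;
    foreach (var row in client_sheet)
    {
        if (p > 0) sql.Append(",");
        sql.AppendFormat("(@p{0},@p{1})", p, p + 1);       // one parameter pair per row
        cmd.Parameters.AddWithValue("@p" + p, row[0].ToString());
        cmd.Parameters.AddWithValue("@p" + (p + 1), row[1].ToString());
        p += 2;
    }
    cmd.Connection = con;
    cmd.CommandText = sql.ToString();
    con.Open();
    cmd.ExecuteNonQuery();                                 // one round trip instead of one per row
}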
The second option is to use bulk insert with SqlBulkCopy and a DataTable. Before getting to the code below, you would build a DataTable that holds all your data, then use SqlBulkCopy to insert the rows using SQL Server functionality that is optimized for inserting large amounts of data.
using (var bulk = new SqlBulkCopy(con))
{
//bulk mapping is a SqlBulkCopyColumnMapping[] and is not necessarily needed if the DataTable matches your destination table exactly
foreach (var i in bulkMapping)
{
bulk.ColumnMappings.Add(i);
}
bulk.DestinationTableName = "MyTable";
bulk.BulkCopyTimeout = 600;
bulk.BatchSize = 5000;
bulk.WriteToServer(someDataTable);
}
These are the two framework-included methods. There are other libraries that can help. Dapper is one, but I am not sure how it handles inserts under the hood. Entity Framework is another, but it does single inserts, so it just moves the problem from your code to someone else's.
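Applied to the question's IList<IList<Object>>, a minimal sketch of the bulk-copy route might look like this; the column names mirror the INSERT in the question, and the connection string is assumed to be the one already stored in Properties.Settings:
var table = new DataTable();
table.Columns.Add("Tracker ID", typeof(string));
table.Columns.Add("Request ID", typeof(string));
foreach (var row in client_sheet)
    table.Rows.Add(row[0].ToString(), row[1].ToString());
using (var con = new SqlConnection(Properties.Settings.Default.connection_string))
using (var bulk = new SqlBulkCopy(con))
{
    con.Open();
    bulk.DestinationTableName = "Client";
    bulk.ColumnMappings.Add("Tracker ID", "Tracker ID");   // map by name so column order doesn't matter
    bulk.ColumnMappings.Add("Request ID", "Request ID");
    bulk.WriteToServer(table);                             // single optimized bulk load
}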

SQL Bulk Insert in C# not inserting values

I'm completely new to C#, so I'm sure I'm going to get a lot of comments about how my code is formatted - I welcome them. Please feel free to throw any advice or constructive criticisms you might have along the way.
I'm building a very simple Windows Form App that is eventually supposed to take data from an Excel file of varying size, potentially several times per day, and insert it into a table in SQL Server 2005. Thereafter, a stored procedure within the database takes over to perform various update and insert tasks depending on the values inserted into this table.
For this reason, I've decided to use the SQL Bulk Insert method, since I can't know if the user will only insert 10 rows - or 10,000 - at any given execution.
The function I'm using looks like this:
public void BulkImportFromExcel(string excelFilePath)
{
excelApp = new Excel.Application();
excelBook = excelApp.Workbooks.Open(excelFilePath);
excelSheet = excelBook.Worksheets.get_Item(sheetName);
excelRange = excelSheet.UsedRange;
excelBook.Close(0);
try
{
using (SqlConnection sqlConn = new SqlConnection())
{
sqlConn.ConnectionString =
"Data Source=" + serverName + ";" +
"Initial Catalog=" + dbName + ";" +
"User id=" + dbUserName + ";" +
"Password=" + dbPassword + ";";
using (OleDbConnection excelConn = new OleDbConnection())
{
excelQuery = "SELECT InvLakNo FROM [" + sheetName + "$]";
excelConn.ConnectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + excelFilePath + ";Extended Properties='Excel 8.0;HDR=Yes'";
excelConn.Open();
using (OleDbCommand oleDBCmd = new OleDbCommand(excelQuery, excelConn))
{
OleDbDataReader dataReader = oleDBCmd.ExecuteReader();
using (SqlBulkCopy bulkImport = new SqlBulkCopy(sqlConn.ConnectionString))
{
bulkImport.DestinationTableName = sqlTable;
SqlBulkCopyColumnMapping InvLakNo = new SqlBulkCopyColumnMapping("InvLakNo", "InvLakNo");
bulkImport.ColumnMappings.Add(InvLakNo);
sqlQuery = "IF OBJECT_ID('ImportFromExcel') IS NOT NULL BEGIN SELECT * INTO [" + DateTime.Now.ToString().Replace(" ", "_") + "_ImportFromExcel] FROM ImportFromExcel; DROP TABLE ImportFromExcel; END CREATE TABLE ImportFromExcel (InvLakNo INT);";
using (SqlCommand sqlCmd = new SqlCommand(sqlQuery, sqlConn))
{
sqlConn.Open();
sqlCmd.ExecuteNonQuery();
while (dataReader.Read())
{
bulkImport.WriteToServer(dataReader);
}
}
}
}
}
}
}
catch(Exception ex)
{
MessageBox.Show(ex.ToString());
}
finally
{
excelApp.Quit();
}
}
The function runs without errors or warnings, and if I replace the WriteToServer with manual SQL commands, the rows are inserted; but the bulkImport isn't inserting anything.
NOTE: There is only one field in this example, and in the actual function I'm currently running to test; but in the end there will be dozens and dozens of fields being inserted, and I'll be doing a ColumnMapping for all of them.
Also, as stated, I am aware that my code is probably horrible - please feel free to give me any pointers you deem helpful. I'm ready and willing to learn.
Thanks!
I think it would be a very long and messy answer if I commented on your code and also gave pointer sample code in the same message, so I decided to divide them into two messages. Comments first:
You are using Excel automation to get what? As far as I can see you already have the sheet name, and worse, you are doing excelApp.Quit() at the end. Completely remove that automation code.
If you need some information from Excel (like sheet names or column names) then you could use OleDbConnection's GetOleDbSchemaTable method.
You might do the mapping basically in 2 ways:
Excel column ordinal to SQL table column name
Excel column name to SQL table column name
Either would do. In generic code, assuming the column names are the same in both sources but their ordinals and counts may differ, you could get the column names from the OleDbConnection schema table and do the mapping in a loop, as in the sketch below.
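A hedged sketch of such a loop, assuming excelConn is an open OleDbConnection to the workbook, bulk is the SqlBulkCopy instance, excelTableName is the sheet name including the trailing $ (e.g. "Sheet1$"), and the destination table uses the same column names:
// read the column names of one sheet from the OLE DB schema
var cols = excelConn.GetOleDbSchemaTable(OleDbSchemaGuid.Columns,
    new object[] { null, null, excelTableName, null });
foreach (DataRow col in cols.Rows)
{
    string name = col.Field<string>("COLUMN_NAME");
    bulk.ColumnMappings.Add(name, name);   // assumes identical names on the SQL Server side
}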
You are dropping and creating a table named "ImportFromExcel" for temporary data insertion, so why not simply create a temp SQL Server table by using a # prefix in the table name? That said, that piece of code is a little weird: it imports from "ImportFromExcel" if it exists, then drops it, creates a new one, and attempts to bulk import into that new one. On the first run SqlBulkCopy (SBC) would fill ImportFromExcel, and on the next run it would be copied to a table named (DateTime.Now ...) and then emptied via the drop and create again. BTW, this naming:
DateTime.Now.ToString().Replace(" ", "_") + "_ImportFromExcel"
doesn't feel right. While it looks tempting, it is not sortable; you would probably want something like this instead:
DateTime.Now.ToString("yyyyMMddHHmmss") + "_ImportFromExcel"
Or better yet:
"ImportFromExcel_" +DateTime.Now.ToString("yyyyMMddHHmmss")
so you would have names that sort correctly and can be selected for all the imports with a wildcard or a loop if you ever need that.
Then you are writing to the server inside a dataReader.Read() loop. That is not the way WriteToServer works. You wouldn't call Read() yourself but simply:
sbc.WriteToServer(reader);
In my next message I will give a simple schema-reading sample and a simple SBC sample from Excel into a temp table, as well as a suggestion for how you should do this instead.
Here is the sample for reading schema information from Excel (here we read the table names - sheet names with tables in them):
private IEnumerable<string> GetTablesFromExcel(string dataSource)
{
IEnumerable<string> tables;
using (OleDbConnection con = new OleDbConnection("Provider=Microsoft.ACE.OLEDB.12.0;" +
string.Format("Data Source={0};", dataSource) +
"Extended Properties=\"Excel 12.0;HDR=Yes\""))
{
con.Open();
var schemaTable = con.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);
tables = schemaTable.AsEnumerable().Select(t => t.Field<string>("TABLE_NAME"));
con.Close();
}
return tables;
}
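For example, the method can be used like this (the path is just a placeholder):
foreach (string table in GetTablesFromExcel(@"C:\Temp\Book1.xlsx"))
    Console.WriteLine(table);   // prints each sheet/table name found in the workbook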
And here is a sample that does SBC from excel into a temp table:
void Main()
{
string sqlConnectionString = @"server=.\SQLExpress;Trusted_Connection=yes;Database=Test";
string path = @"C:\Users\Cetin\Documents\ExcelFill.xlsx"; // sample excel sheet
string sheetName = "Sheet1$";
using (OleDbConnection cn = new OleDbConnection(
"Provider=Microsoft.ACE.OLEDB.12.0;Data Source="+path+
";Extended Properties=\"Excel 8.0;HDR=Yes\""))
using (SqlConnection scn = new SqlConnection( sqlConnectionString ))
{
scn.Open();
// create temp SQL server table
new SqlCommand(@"create table #ExcelData
(
[Id] int,
[Barkod] varchar(20)
)", scn).ExecuteNonQuery();
// get data from Excel and write to server via SBC
OleDbCommand cmd = new OleDbCommand(String.Format("select * from [{0}]",sheetName), cn);
SqlBulkCopy sbc = new SqlBulkCopy(scn);
// Mapping sample using column ordinals
sbc.ColumnMappings.Add(0,"[Id]");
sbc.ColumnMappings.Add(1,"[Barkod]");
cn.Open();
OleDbDataReader rdr = cmd.ExecuteReader();
// SqlBulkCopy properties
sbc.DestinationTableName = "#ExcelData";
// write to server via reader
sbc.WriteToServer(rdr);
if (!rdr.IsClosed) { rdr.Close(); }
cn.Close();
// Excel data is now in SQL server temp table
// It might be used to do any internal insert/update
// i.e.: Select into myTable+DateTime.Now
new SqlCommand(string.Format(@"select * into [{0}]
from [#ExcelData]",
"ImportFromExcel_" +DateTime.Now.ToString("yyyyMMddHHmmss")),scn)
.ExecuteNonQuery();
scn.Close();
}
}
While this would work, in the long run you need the column names, and their types may differ; it might be overkill to do this with SBC, and you might instead do it directly from MS SQL Server with OPENQUERY:
SELECT * into ... from OpenQuery(...)
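As a rough illustration of that idea (an assumption on my part, not code from the sample above): OPENROWSET, the ad-hoc relative of OPENQUERY, lets the server read the workbook itself, so nothing streams through the client. It assumes the ACE provider is installed on the SQL Server machine and 'Ad Hoc Distributed Queries' is enabled; the path, sheet name and destination table are placeholders, and sqlConnectionString is reused from the sample above.
string importSql = @"SELECT * INTO dbo.ImportFromExcel
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Temp\ExcelFill.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]')";
using (var scn = new SqlConnection(sqlConnectionString))
using (var cmd = new SqlCommand(importSql, scn))
{
    scn.Open();
    cmd.ExecuteNonQuery();   // the whole import happens server-side
}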
WriteToServer(IDataReader) is intended to perform the IDataReader.Read() operation internally, so drop your own Read() loop:
using (SqlCommand sqlCmd = new SqlCommand(sqlQuery, sqlConn))
{
sqlConn.Open();
sqlCmd.ExecuteNonQuery();
bulkImport.WriteToServer(dataReader);
}
You can check the MSDN doc on that function, has a working example: https://msdn.microsoft.com/en-us/library/434atets(v=vs.110).aspx

Copying data from one database to another - Out of Memory Exception

I have a task where I want to copy all the data from one database to another, skipping 2 tables. There are more than 200 tables.
I have the table structure ready in my 2nd database.
As a solution I created a page, and on a button click I have the code below:
DataSet ds = new DataSet();
string connectionString = "Data Source=COMP112\\MSSQLSERVER2014;Initial Catalog=HCMBL;Integrated Security=True;Persist Security Info=True";
SqlConnection con = new SqlConnection(connectionString);
//render table name from database
string sqlTable = "SELECT * FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE='BASE TABLE' and TABLE_Schema='" + Session["SchemaName"].ToString() + "' and TABLE_NAME!='ENTRY' and TABLE_NAME!='OT' and TABLE_NAME!='BL_ENTRY' and TABLE_NAME!='BL_OT'";
con.Open();
SqlDataAdapter da = new SqlDataAdapter();
SqlCommand cmd = new SqlCommand(sqlTable, con);
cmd.CommandType = CommandType.Text;
da.SelectCommand = cmd;
da.Fill(ds);
con.Close();
//render connection string from WebConfig file
string strcon = ConfigurationManager.ConnectionStrings["SPSchema"].ConnectionString;
for (int i = 0; i < ds.Tables[0].Rows.Count; i++)
{
if (!(ds.Tables[0].Rows[i]["TABLE_NAME"].ToString().Contains("Asp")))
{
string deleteQuery = "Truncate table " + Session["SchemaName"].ToString() + "." + ds.Tables[0].Rows[i]["TABLE_NAME"];
con.Open();
SqlCommand cmdDelete = new SqlCommand(deleteQuery, con);
cmdDelete.ExecuteNonQuery();
con.Close();
DataSet dataSet = new DataSet();
SqlConnection conn = new SqlConnection(strcon);
conn.Open();
string selectData = "select * from " + Session["SchemaName"].ToString() + "." + ds.Tables[0].Rows[i]["TABLE_NAME"];
SqlCommand command = new SqlCommand(selectData, conn);
DataTable dataTable = new DataTable();
SqlDataAdapter dataAdapter = new SqlDataAdapter(selectData, conn);
dataAdapter.FillSchema(dataSet, SchemaType.Mapped);
dataAdapter.Fill(dataSet);
dataTable = dataSet.Tables[0];
conn.Close();
if (dataSet.Tables[0].Rows.Count > 0)
{
//Connect to second Database and Insert row/rows.
SqlConnection conn2 = new SqlConnection(connectionString);
conn2.Open();
SqlBulkCopy bulkCopy = new SqlBulkCopy(conn2);
bulkCopy.DestinationTableName = Session["SchemaName"].ToString() + "." + ds.Tables[0].Rows[i]["TABLE_NAME"].ToString();
bulkCopy.WriteToServer(dataTable);
conn2.Close();
}
}
}
When I run the above code, it inserts data into fewer than 10 tables before it throws an out-of-memory exception and the program crashes.
How can I handle this? I tried increasing SQL Server's memory limit, but I still get the same error.
Is there any other way to achieve the task?
What you are doing is very far from the best solution. You are using an ASP.NET MVC process to pull your entire database into memory and then write it out to another database. If your database is anything more than small and trivial, that will most definitely fill your process's allotted memory.
This type of task should never be done through the memory of a process, but rather using some form of Backup/Restore pattern.
You should look into SSIS projects and create an extract, transform, and load (ETL) solution, which can be triggered from your ASP.NET MVC solution asynchronously.
An SSIS solution can be triggered from C# code in this way:
var app = new Application();
var package = app.LoadPackage("compiled-package.dtsx", null);
var results = package.Execute();
See this question for a little more information (not specifically about duplicating databases, but has information about triggering SSIS packages from code): How to execute an SSIS package from .NET?
Alternatively
You also have the option of running a query against both databases at once, however this requires some additional plumbing to be done. The user account of your ASP.NET MVC solution needs to have access to both databases. If your databases are hosted on different servers, you also need to link one server to the other: Create linked servers
To perform an insert directly from the output of a select, consider this:
string source = "NAME_OF_SOURCE_DATABASE";
string target = "NAME_OF_TARGET_DATABASE";
string schema = Session["SchemaName"].ToString();
string table = ds.Tables[0].Rows[i]["TABLE_NAME"].ToString();
// Uncomment this if you need to deal with autoincrement columns
/*string idInsQuery = $"SET IDENTITY_INSERT {target}.{schema}.{table} ON";
var idInsCommand = new SqlCommand(idInsQuery, conn);
idInsCommand.ExecuteNonQuery();*/
string insQuery = $"INSERT INTO {target}.{schema}.{table} SELECT * FROM {source}.{schema}.{table}";
var insCommand = new SqlCommand(insQuery, conn);
insCommand.ExecuteNonQuery();
// Uncomment this if you need to deal with autoincrement columns
/*string idInsQuery2 = $"SET IDENTITY_INSERT {target}.{schema}.{table} OFF";
var idInsCommand2 = new SqlCommand(idInsQuery2, conn);
idInsCommand2.ExecuteNonQuery();*/
This will only work if the table structures are identical. There might be problems with autoincrement ids or columns with default values, too.
This will copy data from a table in database 1 to a table in database 2
Insert into db2.dbo.table2 (col1,col2)
Select col1,col2 from db1.dbo.table1
Run this sql statement and the data will be copied without a round trip to your app.
Let me know if you find my approach useful.
First of all, why do you want to write a whole application for this job when SQL Server has built-in ways to do it?
My approach would be to configure a linked server and control which tables you want to copy and which ones not.
https://learn.microsoft.com/en-us/sql/relational-databases/linked-servers/create-linked-servers-sql-server-database-engine
Secondly, you can write a simple stored procedure and schedule it in SQL Server to push data into the other server's database on whatever schedule you need. That way you can control it in any number of ways, including any dependencies (table level or business level).
To do this in T-SQL, you can use the following system stored procedures to schedule a daily job. This example schedules it daily at 1:00 AM. See the Microsoft documentation for details on the syntax of the individual stored procedures and the valid parameter ranges.
DECLARE @job_name NVARCHAR(128), @description NVARCHAR(512), @owner_login_name NVARCHAR(128), @database_name NVARCHAR(128);
SET @job_name = N'Some Title';
SET @description = N'Periodically do something';
SET @owner_login_name = N'login';
SET @database_name = N'Database_Name';
-- Delete job if it already exists:
IF EXISTS(SELECT job_id FROM msdb.dbo.sysjobs WHERE (name = @job_name))
BEGIN
EXEC msdb.dbo.sp_delete_job
@job_name = @job_name;
END
-- Create the job:
EXEC msdb.dbo.sp_add_job
@job_name=@job_name,
@enabled=1,
@notify_level_eventlog=0,
@notify_level_email=2,
@notify_level_netsend=2,
@notify_level_page=2,
@delete_level=0,
@description=@description,
@category_name=N'[Uncategorized (Local)]',
@owner_login_name=@owner_login_name;
-- Add server:
EXEC msdb.dbo.sp_add_jobserver @job_name=@job_name;
-- Add step to execute SQL:
EXEC msdb.dbo.sp_add_jobstep
@job_name=@job_name,
@step_name=N'Execute SQL',
@step_id=1,
@cmdexec_success_code=0,
@on_success_action=1,
@on_fail_action=2,
@retry_attempts=0,
@retry_interval=0,
@os_run_priority=0,
@subsystem=N'TSQL',
@command=N'EXEC my_stored_procedure; -- OR ANY SQL STATEMENT',
@database_name=@database_name,
@flags=0;
-- Update job to set start step:
EXEC msdb.dbo.sp_update_job
@job_name=@job_name,
@enabled=1,
@start_step_id=1,
@notify_level_eventlog=0,
@notify_level_email=2,
@notify_level_netsend=2,
@notify_level_page=2,
@delete_level=0,
@description=@description,
@category_name=N'[Uncategorized (Local)]',
@owner_login_name=@owner_login_name,
@notify_email_operator_name=N'',
@notify_netsend_operator_name=N'',
@notify_page_operator_name=N'';
-- Schedule job:
EXEC msdb.dbo.sp_add_jobschedule
@job_name=@job_name,
@name=N'Daily',
@enabled=1,
@freq_type=4,
@freq_interval=1,
@freq_subday_type=1,
@freq_subday_interval=0,
@freq_relative_interval=0,
@freq_recurrence_factor=1,
@active_start_date=20170101, --YYYYMMDD
@active_end_date=99991231, --YYYYMMDD (this represents no end date)
@active_start_time=010000, --HHMMSS
@active_end_time=235959; --HHMMSS
Let me know in case you need more details on this.
Thanks,
Ayan

How to insert data into a single column (cell) of a new row in C#?

I have an .accdb file with four tables in it
Product_Particulars
Cust_Details
Variable_fields
Permanant_fields
The number of columns in the 'Variable_fields' table is not fixed (it is changed using an 'ALTER TABLE' OleDb non-query), but it has two fixed columns, 'Tranx_ID' and 'Tranx_time'.
I want to insert data into the 'Tranx_ID' column of a new row from a textBox, without caring about the other columns in the table (i.e. the other cells in that row), and save the row with data in only that one cell.
N.B.: I am actually using OleDb, and I will use the 'Tranx_ID' later to update that particular row with an OleDbCommand like:
"UPDATE Variable_fields " +
"SET [This column]='" +thistxtBx.Text +
"',[That column]='" +thattxtBx.Text +
"'WHERE ([Tranx_ID]='" +textBox.Text+ "')";
The exception is caused by the fact that one or more of the columns you don't insert cannot have NULL as a value. If you can remove that constraint and allow a NULL value, then your INSERT could work (or still fail for other reasons).
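As a hedged sketch, an INSERT that touches only Tranx_ID could look like this (OleDb uses positional ? placeholders, so the parameter name is only a label; use whatever connection string you already have):
string insert = "INSERT INTO Variable_fields ([Tranx_ID]) VALUES (?)";
using (OleDbConnection con = new OleDbConnection(....constringhere....))
using (OleDbCommand cmd = new OleDbCommand(insert, con))
{
    con.Open();
    cmd.Parameters.Add("@id", OleDbType.VarWChar).Value = textBox.Text;   // the new row's Tranx_ID
    cmd.ExecuteNonQuery();   // every other column is left NULL / at its default
}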
Also, you use string concatenation to build your query, and this is a well-known source of bugs and a vector for a hacking technique called SQL injection (I really suggest you read up on this nasty problem).
So your code could be the following
string query = @"UPDATE Variable_fields
SET [This column]=@this,
[That column]=@that
WHERE ([Tranx_ID]=@trans)";
using(OleDbConnection con = new OleDbConnection(....constringhere....))
using(OleDbCommand cmd = new OleDbCommand(query, con))
{
con.Open();
cmd.Parameters.Add("#this", OleDbType.VarWChar).Value = thisTextBox.Text;
cmd.Parameters.Add("#that", OleDbType.VarWChar).Value = thatTextBox.Text;
cmd.Parameters.Add("#trans", OleDbType.VarWChar).Value = transTextBox.Text;
int rowsInserted = cmd.ExecuteNonQuery();
if(rowsInserted > 0)
MessageBox.Show("Record added");
else
MessageBox.Show("Record NOT added");
}
Helpful links:
Sql Injection explained
Give me parameterized query or give me death
Using statement

Passing data from one database to another database table

I want to take a backup of my Access database programmatically.
After backing up all the data, I want to delete the data from the source database
(so that querying and filtering through the application will not take much time).
The source database name is Data.mdb
The destination database name is Backup.mdb
Both are protected by same password.
For this purpose I am writing a query in C# like this:
string conString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=Backup.mdb;Jet OLEDB:Database Password=12345";
OleDbConnection dbconn = new OleDbConnection();
OleDbDataAdapter dAdapter = new OleDbDataAdapter();
OleDbCommand dbcommand = new OleDbCommand();
try
{
if (dbconn.State == ConnectionState.Closed)
dbconn.Open();
string selQuery = "INSERT INTO [Bill_Master] SELECT * FROM [MS Access;DATABASE="+
"\\Data.mdb" + "; Jet OLEDB:Database Password=12345;].[Bill_Master]";
dbcommand.CommandText = selQuery;
dbcommand.CommandType = CommandType.Text;
dbcommand.Connection = dbconn;
int result = dbcommand.ExecuteNonQuery();
}
catch(Exception ex) {}
Everything works fine if I try it with a database file that has no password.
I think the error is in how the password is passed in the query statement.
When I try to execute it as an Access query, it says "Invalid argument".
Is there any other programming logic for doing this?
Thanks
prashant
YuvaDeveloper
Are Data.mdb and Backup.mdb identical in structure? If so, I wouldn't bother copying data via SQL; just copy the whole file.
Try removing the space between the ; and Jet …
So the format would be:
INSERT INTO [Bill_Master] SELECT * FROM [MS Access;DATABASE="+
"\\Data.mdb" + ";Jet OLEDB:Database Password=12345;].[Bill_Master]
You can copy and rename Data.mdb, and then truncate all the tables in Data.mdb. Far easier than trying to copy a table at a time; a minimal sketch follows below.
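A rough sketch of that approach (the paths are placeholders, the table name comes from the question, and Access has no TRUNCATE, so each table is emptied with DELETE):
// file-level copy keeps the structure and the password intact
System.IO.File.Copy(@"C:\Data\Data.mdb", @"C:\Data\Backup.mdb", true);
using (var con = new OleDbConnection(
    "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\\Data\\Data.mdb;Jet OLEDB:Database Password=12345"))
{
    con.Open();
    // repeat for every table you want emptied in the source file
    using (var cmd = new OleDbCommand("DELETE FROM [Bill_Master]", con))
        cmd.ExecuteNonQuery();
}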
Don't delete data. It becomes a lot more difficult in the future to do analysis or inquiries. If it's taking a long time, review your indexing or consider upsizing to SQL Server. The Express edition is free and can handle databases up to 4 GB.
