I have a U2/UniVerse database and need to copy the data from one table into a SQL Server table. The table in question has approximately 600,000 rows and just under 200 columns. I didn't create the table and can't change it.
For other tables, I'm looping through a UniDataSet one record at a time, adding each record to a DataTable, and then using SqlBulkCopy to copy the records to SQL Server. This works fine, but with the large table I seem to be running out of memory while building the DataTable.
DataTable dt = new DataTable("myTempTable");
dt.Columns.Add("FirstColumn", typeof(string));
dt.Columns.Add("SecondColumn", typeof(string));
... //adding a bunch more columns here
dt.Columns.Add("LastColumn", typeof(string));

U2Connection con = GetU2Con();
UniSession us1 = con.UniSession;
UniSelectList s1 = us1.CreateUniSelectList(0);
UniFile f1 = us1.CreateUniFile("MyU2TableName");
s1.Select(f1);
UniDataSet uSet = f1.ReadRecords(s1.ReadListAsStringArray());
foreach (UniRecord uItem in uSet)
{
    // split the multivalue record on the field mark (þ, char 254)
    List<String> record = new List<String>(uItem.Record.ToString().Split(new string[] { "þ" }, StringSplitOptions.None));
    DataRow row = dt.NewRow();
    row[0] = uItem.RecordID;
    row[1] = record[0];
    row[2] = record[1];
    ... //add the rest of the record
    row[50] = record[49];
    dt.Rows.Add(row);
}
con.Close();
So that copies the records from the UniDataSet into a DataTable. Then, I SqlBulkCopy the DataTable into a SQL table:
string SQLcon = GetSQLCon();
using (SqlBulkCopy sbc = new SqlBulkCopy(SQLcon))
{
    sbc.DestinationTableName = "dbo.MySQLTableName";
    sbc.BulkCopyTimeout = 0;
    sbc.BatchSize = 1000; //I've tried anywhere from 50 to 50000
    try
    {
        sbc.WriteToServer(dt);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}
This works just fine for my U2 tables that have around 50,000 rows, but it basically crashes the debugger (VS Express 2012) when the table has 500,000 rows. The PC I'm doing this on is Windows 7 x64 with 4 GB of RAM, and the VS process appears to use up to 3.5 GB of RAM before it crashes.
I'm hoping there's a way to write the UniDataSet straight to SQL Server using SqlBulkCopy, but I'm not too familiar with the U2 Toolkit for .NET.
The problem I face is that the UniDataSet records are multivalue, and I need to pick them apart before I can write them to SQL.
Thanks!
The DataTable grows far too large in memory before it is ever inserted into the database.
Why don't you split the bulk insert operation? For example, read the first 50,000 rows, insert them into the SQL Server database, clear the DataTable's memory, and start again with the next 50,000 rows.
if (dt.Rows.Count > 50000)
{
    //do SqlBulkCopy
    dt.Rows.Clear();
}
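Putting that together with the question's loop, a minimal sketch of the batched approach might look like this (the column setup and the GetU2Con/GetSQLCon helpers come from the question; the 50,000-row threshold and the WriteBatch helper name are my own placeholders):

const int BatchThreshold = 50000; // flush to SQL Server every 50,000 rows

U2Connection con = GetU2Con();
UniSession us1 = con.UniSession;
UniSelectList s1 = us1.CreateUniSelectList(0);
UniFile f1 = us1.CreateUniFile("MyU2TableName");
s1.Select(f1);
UniDataSet uSet = f1.ReadRecords(s1.ReadListAsStringArray());

foreach (UniRecord uItem in uSet)
{
    DataRow row = dt.NewRow();
    row[0] = uItem.RecordID;
    // ...populate the remaining columns as in the question...
    dt.Rows.Add(row);

    if (dt.Rows.Count >= BatchThreshold)
    {
        WriteBatch(dt);  // push the current batch to SQL Server
        dt.Rows.Clear(); // release the rows so memory stays bounded
    }
}
WriteBatch(dt); // flush whatever is left over
con.Close();

// hypothetical helper wrapping the SqlBulkCopy block from the question
static void WriteBatch(DataTable batch)
{
    if (batch.Rows.Count == 0) return;
    using (SqlBulkCopy sbc = new SqlBulkCopy(GetSQLCon()))
    {
        sbc.DestinationTableName = "dbo.MySQLTableName";
        sbc.BulkCopyTimeout = 0;
        sbc.WriteToServer(batch);
    }
}

This keeps peak memory proportional to the batch size rather than to the full 600,000-row table.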
In U2 Toolkit for .NET v2.1.0 we implemented Native Access. You can now create a DataSet/DataTable from a UniData/UniVerse file directly, and you can specify WHERE and SORT clauses too. You will see a performance improvement because it does not make one server trip per record ID: with 1,000 record IDs the old approach makes 1,000 server trips, whereas Native Access makes just one.
Please download U2 Toolkit for .NET v2.2.0 Hot Fix 1 and try the following code. For more information, please contact u2askus@rocketsoftware.com.
U2Connection con = GetU2Con();
U2Command cmd = con.CreateCommand();
cmd.CommandText = "Action=Select;File=MyU2TableName;Attributes=MyID,FirstColumn,SecondColumn,LastColumn;Where=MyID>0;Sort=MyID";
U2DataAdapter da = new U2DataAdapter(cmd);
DataSet ds = new DataSet();
da.Fill(ds);
DataTable dt = ds.Tables[0];
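Since the toolkit follows the ADO.NET provider model, it should also be possible to skip the intermediate DataSet entirely and stream rows straight into SqlBulkCopy, which accepts any IDataReader. This is a sketch under the assumption that U2Command.ExecuteReader returns a usable data reader for a Native Access command; only a batch of rows is buffered at a time:

U2Connection con = GetU2Con();
U2Command cmd = con.CreateCommand();
cmd.CommandText = "Action=Select;File=MyU2TableName;Attributes=MyID,FirstColumn,SecondColumn,LastColumn;Where=MyID>0;Sort=MyID";

using (var reader = cmd.ExecuteReader()) // stream rows instead of materializing them
using (var sbc = new SqlBulkCopy(GetSQLCon()))
{
    sbc.DestinationTableName = "dbo.MySQLTableName";
    sbc.BulkCopyTimeout = 0;
    sbc.BatchSize = 10000;
    sbc.WriteToServer(reader); // SqlBulkCopy pulls rows from the reader one at a time
}
con.Close();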
I wrote the following code to copy a DataTable's content into an MS Access table.
The problem is that the data set is very large: the copy takes a long time (more than 10 minutes) and stops when the file reaches 2 GB. The entire set of data is about 785 MB in RAM, for about 820,000 rows.
public static bool InsertmyDataTableDAO(string filePathName, DataTable myDataTable)
{
    string connectionString = string.Format(ConnectionParameters.MsAccessConnectionStringOledb, filePathName); // note: built but never used below
    DBEngine dbEngine = new DBEngine();
    Database db = dbEngine.OpenDatabase(filePathName);
    db.Execute("DELETE FROM " + myDataTable.TableName);
    Recordset rs = db.OpenRecordset(myDataTable.TableName);

    // cache the DAO Field objects once, indexed by column ordinal
    Field[] tableFields = new Field[myDataTable.Columns.Count];
    foreach (DataColumn column in myDataTable.Columns)
    {
        tableFields[column.Ordinal] = rs.Fields[column.ColumnName];
    }

    foreach (DataRow row in myDataTable.Rows)
    {
        rs.AddNew();
        foreach (DataColumn col in row.Table.Columns)
        {
            tableFields[col.Ordinal].Value = row[col.Ordinal];
        }
        rs.Update();
    }

    rs.Close();
    db.Close();
    return true;
}
Is there a faster way to copy data set from datatable to MS Access DB?
The max database size for Access is 2 GB; you can't bypass this limit:
https://support.office.com/en-us/article/access-specifications-0cf3c66f-9cf2-4e32-9568-98c1025bb47c?ui=en-US&rs=en-US&ad=US
I see you're using a DELETE statement to remove the rows beforehand. DELETE doesn't necessarily recover free space. Here's what I'd do:
1. Use your existing code to delete the data in the table.
2. Use the Access interop (Microsoft.Office.Interop.Access / DAO) to compact and repair the database; see the sketch after this list.
3. Run your above code to insert the DataTable.
I'd also add that you could probably use the Access interop to import the DataTable too: perhaps save it to a CSV file first, then import it that way rather than using INSERT statements.
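A minimal sketch of the compact step, using the same DAO DBEngine the question's code already uses (the temp-file name is a placeholder; DAO compacts into a new file, so the original is swapped out afterwards):

using System.IO;
using Microsoft.Office.Interop.Access.Dao; // same DAO interop as the question

static void CompactAccessDatabase(string filePathName)
{
    string tempPath = filePathName + ".compacting"; // hypothetical temp file
    if (File.Exists(tempPath))
        File.Delete(tempPath);

    var dbEngine = new DBEngine();
    dbEngine.CompactDatabase(filePathName, tempPath); // DAO compacts into a new file

    File.Delete(filePathName);         // replace the bloated file
    File.Move(tempPath, filePathName); // with the compacted copy
}

Run this after the DELETE and before the re-insert, and make sure the database is closed first, since CompactDatabase needs exclusive access to the file.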
I searched the web and Stack Overflow and found lots of descriptions on how to fill a DataGridView with the content of a DataTable. But still it does not work for me. My DataGridView shows the correct number of columns and rows, but they appear empty.
I use following method:
public void ShowDataInGrid(ref DataTable table)
{
    BindingSource sBind = new BindingSource();

    dbView.Columns.Clear();
    dbView.AutoGenerateColumns = false;
    sBind.DataSource = table;
    dbView.DataSource = sBind; //bind the table to the DataGridView
    dbView.Columns.Add("Date", "Date");
}
Before this I created a DataGridView named "dbView" via the designer. I'm not even sure whether I need sBind; without it I can bind the table directly to dbView, with the same bad result.
I suspect my table is the problem. It originates from a database (SQLite) and has several columns and rows (one of the columns is named "Date"). It is definitely filled with readable data.
I mainly read the table in using the following commands (after this I manipulate the data in several steps, like changing strings and adding numbers):
string sql = "select * from Bank";
SQLiteCommand command = new SQLiteCommand(sql, m_dbConnection);
SQLiteDataReader reader = command.ExecuteReader();
table.Load(reader);
reader.Close();
table.AcceptChanges();
I think the problem might be that the table entries are stored as objects rather than strings, and hence can't be shown. That's why I tried to force the content to be strings with the following change to my table:
DataTable dbTableClone = new DataTable();
dbTableClone.Load(reader);
reader.Close();
dbTableClone.AcceptChanges();

string[] dBHeader = ReadHeaderFromDataTable(dbTableClone); // own function, which reads the header

// first create the table as an empty clone, so I can set the DataType of each column
DataTable table = dbTableClone.Clone();
for (int col = 0; col < dBHeader.Length; col++) // set all columns to string
{
    table.Columns[col].DataType = typeof(string);
}
foreach (DataRow row in dbTableClone.Rows)
{
    table.ImportRow(row);
}
This did not help either.
Another idea: I found comments on similar problems that were apparently solved this way, quote: "I designed columns in the VS DataGridView designer. Not the column name, but the column DataPropertyName must match the fields in the database." Unfortunately I don't seem to be able to do/understand this.
Try fetching the data and binding it to the grid this way:

SQLiteConnection con = new SQLiteConnection(@"Data Source=MyDatabase.sqlite;Version=3;"); // adjust to your DB
con.Open();
SQLiteDataAdapter adap = new SQLiteDataAdapter("select * from Bank", con);
DataSet ds = new DataSet();
adap.Fill(ds);
dataGridView1.DataSource = ds.Tables[0];

Comment out everything you've done so far, try this, and let me know whether it works for you. Change the connection string according to your DB.
I solved the problem.
The DataTable was fine. The problem was the setup of my DataGridView dbView: I had set up dbView in the designer and somehow gave it a data source there. Once I set that data source to "none" (in "DataGridView Tasks"), my data appeared as intended.
Thanks to M Adeel Khalid for looking at my stuff. His assurance that my binding code was right is what eventually led me to the solution.
At the end I really only needed to use a single line:
dbView.DataSource = table;
I basically have a listbox that holds postcode areas, e.g. AE, CW, GU, etc.
The user selects one of these, a postback occurs, an SQL statement is built, the database query is executed, and the results are returned to a DataTable called tempstore.
So far so good. I then need to loop through this DataTable and copy the records to my main ViewState DataTable, which is the data source for the Google Maps API.
DataTable tempstore = GetData(querystring, "");

// check tempstore has rows, otherwise add the default customer, or the map will be blank
if (tempstore.Rows.Count == 0)
{
    tempstore = GetData("WHERE CUSTCODE='CD344'", "");
    infoalert.Visible = true;
    infoalert.InnerHtml = "No Results Returned For Selection";
}

foreach (DataRow row in tempstore.Rows)
{
    dtpc.ImportRow(row);
}
dtpc.AcceptChanges(); // once after the loop is enough
//database command
using (OleDbConnection con = new OleDbConnection(conString))
{
    using (OleDbCommand cmd = new OleDbCommand(query))
    {
        using (OleDbDataAdapter sda = new OleDbDataAdapter())
        {
            cmd.Connection = con;
            sda.SelectCommand = cmd;
            sda.Fill(dt5);
        }
    }
}
So my main DataTable can grow and grow as users add more postcodes. However, when it gets to around 500 rows or so, I get a huge memory spike only on postback, and then it settles back down. My RAM usage goes from 2 GB to 3 GB, and if even more postcodes are selected it maxes out the memory and crashes my PC.
If I remove the
dtpc.ImportRow(row);
the memory spike goes away completely, obviously because the main DataTable has no rows. I thought you only ran into memory issues with thousands of rows?
Any help would be much appreciated.
thank you
Do you really need all the rows at once?
A DataReader will access a single row at a time and keep your memory usage to a minimum; see the DataReader class.
If you do need all your data at once, create a class or struct for the data and hold the instances in a collection such as a List. A DataTable is a heavy object.
And if you are measuring memory via Task Manager, be aware that it is not very accurate.
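A minimal sketch of that approach, reusing the OleDb objects from the question (the PostcodeRow class and the column indexes are hypothetical stand-ins for whatever your maps data source actually needs):

// lightweight record type instead of a heavy DataTable
public class PostcodeRow
{
    public string CustCode { get; set; }
    public double Lat { get; set; }
    public double Lng { get; set; }
}

var results = new List<PostcodeRow>();
using (var con = new OleDbConnection(conString))
using (var cmd = new OleDbCommand(query, con))
{
    con.Open();
    using (OleDbDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read()) // only one row is held in memory at a time
        {
            results.Add(new PostcodeRow
            {
                CustCode = reader.GetString(0),
                Lat = reader.GetDouble(1),
                Lng = reader.GetDouble(2)
            });
        }
    }
}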
First off, make sure you're wrapping any SQL execution in the appropriate "using" clauses. This is most likely the cause of your problem.
using (var command = new SqlCommand())
{
    // Some code here
}
Like Blam said, DataTable is too heavy for your purposes.
You can convert your data rows into class objects quite easily:
var datasourceList = new List<YourCustomObject>();
foreach (DataRow row in tempstore.Rows)
{
    var newMapsObject = new YourCustomObject
    {
        Value1 = row.Field<String>("Value1ColumnName"),
        Value2 = row.Field<String>("Value2ColumnName")
    };
    datasourceList.Add(newMapsObject);
}
viewStateList.AddRange(datasourceList);
To bind a custom collection to a data display control (such as a Repeater), assign the list to the control's .DataSource property, then call .DataBind(). This works for nearly all ASP.NET data display controls.
repeater1.DataSource = viewStateList;
repeater1.DataBind();
I'm new to, and working with, a SQL CE database that will later be connected to a project where we send and receive information to a device over a SerialPort. This database will store information about every part of the communication. I'm a little stuck when it comes to updating a DataSet and committing the updated data to the database.
data = new DataSet();
adapter = new SqlCeDataAdapter("SELECT * FROM " + table_name);
builder = new SqlCeCommandBuilder(adapter);

adapter.Fill(data, table_name);
data.Tables[table_name].Rows[identity_value][a_column] = a_column_value;
adapter.Update(data, table_name);
Before I run this method I ensure that there is a record in the table at identity value 1. However, I'm getting a "There is no row at position 1" IndexOutOfRangeException before I even call adapter.Update(). I'm assuming I've misunderstood how to use and update a DataSet. Any advice?
I've tried looking into the DataSet prior to updating the row, but the debugger doesn't seem to let me peer into it. Is there a way to do this?
Try this:
var rows = data.Tables[table_name].Select(String.Format("[Id] = {0}", identity_value));
if (rows.Length == 1)
{
    rows[0][a_column] = a_column_value;
}
You are misinterpreting the Rows property:
data.Tables[table_name].Rows[ROWINDEX][COLUMNINDEX]
ROWINDEX is the positional index into the row collection, not the identity value.
A suggestion: if you have to use a DataSet, then load the table schema info too, with SqlCeDataAdapter.FillSchema (adapter.Fill on its own loads only the data):
adapter.FillSchema(data, SchemaType.Mapped, table_name);
adapter.Fill(data, table_name);
Then, if you have at least one column designated as a primary key column in the DataTable, you can get the row with the DataRowCollection.Find method:
DataRow foundRow = data.Tables[table_name].Rows.Find(identity_value);
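Putting those pieces together, a sketch of the corrected flow (table_name, identity_value, a_column, and a_column_value are the question's placeholders, and an open SqlCeConnection named connection is assumed):

var data = new DataSet();
var adapter = new SqlCeDataAdapter("SELECT * FROM " + table_name, connection);
var builder = new SqlCeCommandBuilder(adapter);

adapter.FillSchema(data, SchemaType.Mapped, table_name); // brings over the primary key
adapter.Fill(data, table_name);

DataRow foundRow = data.Tables[table_name].Rows.Find(identity_value); // lookup by key, not position
if (foundRow != null)
{
    foundRow[a_column] = a_column_value;
    adapter.Update(data, table_name);
}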
I have a DataTable in C# and would like to send it to my SQL CE 4 server. One thing that makes it a bit more complicated: when the insert encounters a duplicate, it should ignore it and move on to the next row in the DataTable. I've looked around, but a lot of the information I find doesn't seem to work with the CE version of SQL Server. What's an efficient way of doing this?
Filter your DataTable to exclude the duplicate rows before uploading, using the DataTable.Select method. For example:

DataTable table = DataSet1.Tables["Orders"];

// Presuming the DataTable has a column named Date.
string expression = "Date > #1/1/00#"; // you will need logic to remove your duplicates

// Use the Select method to find all rows excluding duplicates
DataRow[] foundRows = table.Select(expression);

// .NET 3.5 onwards
DataTable filteredDataTable = foundRows.CopyToDataTable();
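One way to implement that duplicate-removal logic is to pull the existing keys out of the SQL CE table once, then keep only the rows whose key isn't already present. This sketch assumes an integer Id column and an open SqlCeConnection named con (both stand-ins for your actual schema), and needs using System.Linq and System.Data:

// collect the keys that already exist in the destination table
var existingIds = new HashSet<int>();
using (var cmd = new SqlCeCommand("SELECT Id FROM Orders", con))
using (SqlCeDataReader reader = cmd.ExecuteReader())
{
    while (reader.Read())
        existingIds.Add(reader.GetInt32(0));
}

// keep only the rows whose Id is not already in the table
DataRow[] newRows = table.AsEnumerable()
    .Where(r => !existingIds.Contains(r.Field<int>("Id")))
    .ToArray();
DataTable filteredTable = newRows.Length > 0
    ? newRows.CopyToDataTable()
    : table.Clone(); // CopyToDataTable throws on an empty set

A single SELECT plus an in-memory HashSet lookup avoids making one round trip per row.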
Try this logic:

var dt = new DataTable(); // supposing this is your DataTable
foreach (DataRow row in dt.Rows)
{
    // 1. run a SELECT that checks whether the id is already in the database
    var find = MyFindMethod("Id");
    if (find.Rows.Count > 0)
    {
        // id exists: do nothing
    }
    else
    {
        // 2. id does not exist: insert it into SQL CE
        MyInsertMethod("Id");
    }
}
Regards