I basically have a listbox that contains postcode areas, e.g.: AE, CW, GU, etc.
The user makes a selection and a postback occurs: an SQL statement is built, a database query is performed, and the results are returned to a DataTable called tempstore.
So far so good. I then need to loop through this DataTable and copy the records to my main ViewState DataTable, which is the data source for the Google Maps API.
DataTable tempstore = GetData(querystring, "");
// Check tempstore has rows; otherwise query a default customer so the map isn't blank
if (tempstore.Rows.Count == 0)
{
    tempstore = GetData("WHERE CUSTCODE='CD344'", "");
    infoalert.Visible = true;
    infoalert.InnerHtml = "No Results Returned For Selection";
}
foreach (DataRow row in tempstore.Rows)
{
    dtpc.ImportRow(row);
}
dtpc.AcceptChanges(); // commit once after the loop rather than once per imported row
// Database command
using (OleDbConnection con = new OleDbConnection(conString))
{
    using (OleDbCommand cmd = new OleDbCommand(query))
    {
        using (OleDbDataAdapter sda = new OleDbDataAdapter())
        {
            cmd.Connection = con;
            sda.SelectCommand = cmd;
            sda.Fill(dt5);
        }
    }
}
So my main DataTable can grow and grow as users add more postcodes. However, when it reaches around 500 rows or so, I get a huge memory spike, only on postback, and then it settles back down. My RAM usage goes from 2 GB to 3 GB, and if even more postcodes are selected it maxes out the memory and crashes my PC.
If I remove the
dtpc.ImportRow(row);
the memory spike goes away completely, obviously because the main DataTable then has no rows. I thought you only ran into memory issues with thousands of rows?
Any help would be much appreciated.
Thank you.
Do you really need all the rows at once?
A DataReader will access a single row at a time and keep your memory usage to a minimum; see the DataReader class documentation.
If you need all your data at once, create a class or struct for the data and hold it in a collection such as a List<T>; DataTable is a heavy object.
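A minimal sketch of that idea, streaming the reader straight into a list of lightweight objects. The class, column names, and LoadPoints method are assumptions for illustration, not from the original post:
using System;
using System.Collections.Generic;
using System.Data.OleDb;

// Hypothetical lightweight row type; adjust the members to your schema
public class CustomerPoint
{
    public string CustCode { get; set; }
    public double Lat { get; set; }
    public double Lng { get; set; }
}

public static List<CustomerPoint> LoadPoints(string conString, string query)
{
    var points = new List<CustomerPoint>();
    using (var con = new OleDbConnection(conString))
    using (var cmd = new OleDbCommand(query, con))
    {
        con.Open();
        using (OleDbDataReader reader = cmd.ExecuteReader())
        {
            // Only one row is materialised at a time; no DataTable overhead
            while (reader.Read())
            {
                points.Add(new CustomerPoint
                {
                    CustCode = reader["CUSTCODE"].ToString(),
                    Lat = Convert.ToDouble(reader["LAT"]),
                    Lng = Convert.ToDouble(reader["LNG"])
                });
            }
        }
    }
    return points;
}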
And if you are measuring memory via Task Manager, be aware that it is not very accurate.
First off, make sure you're wrapping any SQL execution in the appropriate "using" clauses. This is most likely the cause of your problem.
using (var command = new SqlCommand())
{
    // Some code here
}
Like Blam said, DataTable is too heavy for your purposes.
You can convert your data rows into class objects quite easily:
var datasourceList = new List<YourCustomObject>();
foreach (DataRow row in tempstore.Rows)
{
    var newMapsObject = new YourCustomObject
    {
        Value1 = row.Field<String>("Value1ColumnName"),
        Value2 = row.Field<String>("Value2ColumnName")
    };
    datasourceList.Add(newMapsObject);
}
viewStateList.AddRange(datasourceList);
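Since the list lives in ViewState, the custom type must be serializable. A minimal sketch of what YourCustomObject might look like; the property names simply mirror the snippet above:
[Serializable]
public class YourCustomObject
{
    public string Value1 { get; set; }
    public string Value2 { get; set; }
}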
To bind a custom collection to a data display control (such as a Repeater), you assign the list to the control's .DataSource property, then call .DataBind(). This works for almost all ASP.NET data display controls.
repeater1.DataSource = viewStateList;
repeater1.DataBind();
I have a U2/UniVerse database, and need to copy the data in one table into a SQL Server table. The table in question has approx 600,000 rows and just under 200 columns. I didn't create the table, and can't change it.
For other tables, I'm looping through a UniDataSet one record at a time, adding each to a DataTable, and then using SqlBulkCopy to copy the records to SQL Server. This works fine, but with the large table I seem to run out of memory when creating the DataTable.
DataTable dt = new DataTable("myTempTable");
dt.Columns.Add("FirstColumn", typeof(string));
dt.Columns.Add("SecondColumn", typeof(string));
... //adding a bunch more columns here
dt.Columns.Add("LastColumn", typeof(string));
U2Connection con = GetU2Con();
UniSession us1 = con.UniSession;
UniSelectList s1 = us1.CreateUniSelectList(0);
UniFile f1 = us1.CreateUniFile("MyU2TableName");
s1.Select(f1);
UniDataSet uSet = f1.ReadRecords(s1.ReadListAsStringArray());
foreach (UniRecord uItem in uSet)
{
    List<String> record = new List<String>(uItem.Record.ToString().Split(new string[] { "þ" }, StringSplitOptions.None));
    DataRow row = dt.NewRow();
    row[0] = uItem.RecordID;
    row[1] = record[0];
    row[2] = record[1];
    ... //add the rest of the record
    row[50] = record[49];
    dt.Rows.Add(row);
}
con.Close();
So that copies the records from the UniDataSet into a DataTable. Then, I SqlBulkCopy the DataTable into a SQL table:
string SQLcon = GetSQLCon();
using (SqlBulkCopy sbc = new SqlBulkCopy(SQLcon))
{
    sbc.DestinationTableName = "dbo.MySQLTableName";
    sbc.BulkCopyTimeout = 0;
    sbc.BatchSize = 1000; //I've tried anywhere from 50 to 50000
    try
    {
        sbc.WriteToServer(dt);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}
This works just fine for my U2 tables that have 50,000 or so rows, but it basically crashes the debugger (VS Express 2012) when the table has 500,000 rows. The PC I'm doing this on runs Windows 7 x64 with 4 GB of RAM; the VS process uses up to about 3.5 GB of RAM before it crashes.
I'm hoping there's a way to write the UniDataSet right to SQL using SqlBulkCopy, but I'm not too familiar with the U2 .Net toolkit.
The problem I face is the UniDataSet records are multivalue, and I need to pick them apart before I can write them to SQL.
Thanks!
The DataTable simply gets too big in memory before it is inserted into the database. Why don't you split the bulk insert operation? For example, read the first 50,000 results, insert them into the SQL Server database, clear the DataTable's rows to free the memory, and start again with the next 50,000 rows:
if (dt.Rows.Count > 50000)
{
    //do SqlBulkCopy
    dt.Rows.Clear();
}
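Putting that together with the loop from the question, a sketch of the batched version might look like this (the batchSize value and the elided column assignments are placeholders):
const int batchSize = 50000;
using (SqlBulkCopy sbc = new SqlBulkCopy(GetSQLCon()))
{
    sbc.DestinationTableName = "dbo.MySQLTableName";
    sbc.BulkCopyTimeout = 0;
    foreach (UniRecord uItem in uSet)
    {
        List<String> record = new List<String>(uItem.Record.ToString().Split(new string[] { "þ" }, StringSplitOptions.None));
        DataRow row = dt.NewRow();
        row[0] = uItem.RecordID;
        // ... populate the remaining columns as in the question ...
        dt.Rows.Add(row);
        if (dt.Rows.Count >= batchSize)
        {
            sbc.WriteToServer(dt); // flush this batch to SQL Server
            dt.Rows.Clear();       // release the rows so memory stays bounded
        }
    }
    if (dt.Rows.Count > 0)
    {
        sbc.WriteToServer(dt); // write the final partial batch
    }
}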
In U2 Toolkit for .NET v2.1.0, we have implemented Native Access. You can now create a DataSet/DataTable from a UniData/UniVerse file directly, and you can specify WHERE and SORT clauses too. You will see a performance improvement, because Native Access does not make as many server trips to fetch IDs: if you have 1,000 record IDs, the old approach makes 1,000 server trips, whereas Native Access makes one.
Please download U2 Toolkit for .NET v2.2.0 Hot Fix 1 and try the following code. For more information, please contact u2askus@rocketsoftware.com.
U2Connection con = GetU2Con();
U2Command cmd = con.CreateCommand();
cmd.CommandText = string.Format("Action=Select;File=MyU2TableName;Attributes=MyID,FirstColumn,SecondColumn,LastColumn;Where=MyID>0;Sort=MyID");
U2DataAdapter da = new U2DataAdapter(cmd);
DataSet ds = new DataSet();
da.Fill(ds);
DataTable dt = ds.Tables[0];
I'm trying to create an object variable that will hold a collection from an Execute SQL Task. This collection will be used in multiple Script Tasks throughout the ETL package.
The problem is that after the first Fill in the first Script Task, the object variable becomes empty. Here's how I load the variable into a DataTable:
try
{
    DataTable dt = new DataTable();
    OleDbDataAdapter da = new OleDbDataAdapter();
    da.Fill(dt, Dts.Variables["reportMetrics"].Value);
    Dts.TaskResult = (int)ScriptResults.Success;
}
catch (Exception Ex)
{
    MessageBox.Show(Ex.Message);
    Dts.TaskResult = (int)ScriptResults.Failure;
}
Throughout the ETL package, Script Task components will have this piece of code. Since the variable becomes empty after the first Fill, I can't reuse the object variable.
I'm guessing that the Fill method has something to do with this.
Thanks!
It looks like your Dts.Variables["reportMetrics"].Value object holds a DataReader object. This object allows forward-only, read-only access to the data, so you cannot fill a DataTable twice from the same DataReader. To accomplish your task, you need to create one script task that performs exactly what you described here: it reads the reader into a DataTable object and stores that DataTable in another Dts.Variable of type Object.
Dts.Variables["reportMetricsTable"].Value = dt;
After that, all your subsequent script tasks should either create a copy of this table if they modify the data, or use it directly if they do not:
DataTable dtCopy = (Dts.Variables["reportMetricsTable"].Value as DataTable).Copy();
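Combining those two snippets, the caching script task's Main might look roughly like this, using the variable names from the question and answer:
public void Main()
{
    // Drain the forward-only reader once into a DataTable...
    DataTable dt = new DataTable();
    OleDbDataAdapter da = new OleDbDataAdapter();
    da.Fill(dt, Dts.Variables["reportMetrics"].Value);
    // ...and cache it in a second Object-typed variable for all later tasks
    Dts.Variables["reportMetricsTable"].Value = dt;
    Dts.TaskResult = (int)ScriptResults.Success;
}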
I had a similar situation. While I think you could do an Execute SQL Task with a SELECT COUNT(*) query and assign the result to an SSIS variable, what I did was create an int SSIS variable called totalCount with an initial value of 0. I expect the total count to be > 0 (otherwise I won't have anything to iterate over), so I created an if statement within my Script Task: if the value is zero, I assume totalCount has not been initialized yet, and I use the same code you are using (with the Fill method); otherwise (i.e., in later iterations), I skip that part and just use the totalCount variable. Here's the block of code. Hope it helps:
if ((int)Dts.Variables["User::totalCount"].Value == 0) // the total count variable has not been initialized yet
{
    System.Data.OleDb.OleDbDataAdapter da = new System.Data.OleDb.OleDbDataAdapter();
    DataTable stagingTablesQryResult = new DataTable();
    // Used for logging how many files we are iterating over. It may be more efficient to do a
    // COUNT(*) outside this script and save the total number of rows for the query,
    // but I made this as a proof of concept for future developments.
    da.Fill(stagingTablesQryResult, Dts.Variables["User::stagingTablesQryResultSet"].Value);
    Dts.Variables["User::totalCount"].Value = stagingTablesQryResult.Rows.Count;
}
Console.WriteLine("{0}. Looking for data file {0} of {1} using search string '{2}'.", counter, Dts.Variables["User::totalCount"].Value, fileNameSearchString);
Excellent! This has helped me around an issue in building my ETL platform.
Essentially, I execute a SQL task to build a dataset of tasks; some inline transformations and rules pull the relevant tasks to the fore, and for obvious reasons I only want to execute that once per run.
I then need to get the unique ProcessIDs from the dataset (to use in a For Each Loop).
Within the FEL, I then fetch the relevant records from the original dataset and push them through a further FEL process.
I was facing the same "empty data set" problem on the second execution against the dataset.
I thought I'd share my solution to assist others.
You'll need to add the namespace
using System.Data.OleDb;
to each of the scripts.
Get dataset
Execute SQL - Get your data and pass into a Variable Object
Pull Ds
Declare the Variable Objects
public void Main()
{
    DataTable dt = new DataTable();
    OleDbDataAdapter da = new OleDbDataAdapter();
    //Read the original table
    da.Fill(dt, Dts.Variables["Tbl"].Value);
    //Push to a replica
    Dts.Variables["TblClone"].Value = dt;
    Dts.TaskResult = (int)ScriptResults.Success;
}
Build Proc List
This gets a list of ProcessIDs (and Names) by filtering on a Rank field in the dataset
Declare the Variable Objects
public void Main()
{
    //Take a copy of the cloned dataset
    DataTable dtRead = (Dts.Variables["TblClone"].Value as DataTable).Copy();
    //Lock the output object variable
    Dts.VariableDispenser.LockForWrite("User::ProcTbl");
    //Create a data table to place the results into, which we can write to the output object once finished
    DataTable dtWrite = new DataTable();
    //Create the elements of the datatable programmatically
    //dtWrite.Clear();
    dtWrite.Columns.Add("ID", typeof(Int64));
    dtWrite.Columns.Add("Nm");
    //Start reading input rows
    foreach (DataRow dr in dtRead.Rows)
    {
        //Keep only the rows whose rank column is 1
        if (Int64.Parse(dr[9].ToString()) == 1) //P_Rnk = 1
        {
            DataRow newDR = dtWrite.NewRow();
            newDR[0] = Int64.Parse(dr[0].ToString());
            newDR[1] = dr[4].ToString();
            //Write the row
            dtWrite.Rows.Add(newDR);
        }
    }
    //Write the dataset back to the object variable
    Dts.Variables["User::ProcTbl"].Value = dtWrite;
    Dts.Variables.Unlock();
    Dts.TaskResult = (int)ScriptResults.Success;
}
Build TaskList from ProcList
Cycle round each ProcessID in a For Each Loop, configuring the loop's collection and mapping the variables.
Build TL Script
This will dynamically build the output for you. (NB: this works for me, although I haven't extensively tested it, so if it doesn't work, have a fiddle with it.)
You'll see I've commented out some debug stuff.
public void Main()
{
    //Clone the copied table
    DataTable dtRead = (Dts.Variables["TblClone"].Value as DataTable).Copy();
    //Read the var to filter the records by
    var ID = Int64.Parse(Dts.Variables["User::ProcID"].Value.ToString());
    //Lock the output object variable
    Dts.VariableDispenser.LockForWrite("User::SubTbl");
    //Debug: test the ProcID being passed
    //MessageBox.Show(@"Start ProcID = " + ID.ToString());
    //MessageBox.Show(@"TblCols = " + dtRead.Columns.Count);
    //Create a data table to place the results into, which we can write to the output object once finished
    DataTable dtWrite = new DataTable();
    //Create the elements of the datatable programmatically
    //dtWrite.Clear();
    foreach (DataColumn dc in dtRead.Columns)
    {
        dtWrite.Columns.Add(dc.ColumnName, dc.DataType);
    }
    MessageBox.Show(@"TblRows = " + dtRead.Rows.Count);
    //Start reading input rows
    foreach (DataRow dr in dtRead.Rows)
    {
        //If the 1st col from the read object = the ID var
        if (ID == Int64.Parse(dr[0].ToString()))
        {
            DataRow newDR = dtWrite.NewRow();
            //Dynamically copy data for each column
            foreach (DataColumn dc in dtRead.Columns)
            {
                newDR[dc.ColumnName] = dr[dc.ColumnName];
            }
            //Write the row
            dtWrite.Rows.Add(newDR);
            //Debug
            //MessageBox.Show(@"ProcID = " + newDR[0].ToString() + @" TaskID = " + newDR[1].ToString() + @" Name = " + newDR[4].ToString());
        }
    }
    //Write the dataset back to the object variable
    Dts.Variables["User::SubTbl"].Value = dtWrite;
    Dts.Variables.Unlock();
    Dts.TaskResult = (int)ScriptResults.Success;
}
For Each Loop Container
N.B. Don't forget to map the items in the Variable Mappings.
Now you can consume the records and do stuff with that data.
I included the Msg Loop script as an easy data check; in reality this will go off and trigger other processes, but to aid you in data checks I thought I'd include it.
Msg Loop Script
public void Main()
{
    MessageBox.Show("ID = " + Dts.Variables["User::ProcID"].Value + ", and val = " + Dts.Variables["User::TaskID"].Value, "Name = Result");
    Dts.TaskResult = (int)ScriptResults.Success;
}
Hope that helps somebody solve their issue (I've been trying to resolve this for a working day or so :/).
I have a table in my database in SQL Server 2012. I iterate through this table and add a new object to my binding list for each row. This list is then set as the DataSource for my DataGridView.
As I understand it, the DataGridView should create columns and fill the rows with data, but when I run the build I only see blank rows. Their count matches the count of rows in the table, and I have also debugged with breakpoints, so I know my data source really is filled with data, but I cannot figure out those blank rows.
This is the method I use for creating the dataset and filling the binding list:
public void selectCars()
{
    string connString = @"Data Source=POHJOLA\SQLEXPRESS;Initial Catalog=BlueCars;Integrated Security=True";
    using (SqlConnection connection = new SqlConnection(connString))
    {
        connection.Open();
        string query = "SELECT * FROM Car ORDER BY CarID ASC";
        SqlCommand command = new SqlCommand(query, connection);
        using (SqlDataAdapter adapter = new SqlDataAdapter(command))
        using (DataSet result = new DataSet())
        {
            adapter.Fill(result);
            foreach (DataRow row in result.Tables[0].Rows)
            {
                carsList.Add(new Car(Convert.ToInt32(row[0]), row[1].ToString(), row[2].ToString(), row[3].ToString(), Convert.ToDecimal(row[4]), Convert.ToInt32(row[5]), row[6].ToString(), row[7].ToString()));
            }
        }
    }
}
This is my initialization:
public managerCarForm()
{
    InitializeComponent();
    selectCars();
    carsGrid.DataSource = carsList;
}
Also, I should probably add that I created the columns manually in the designer and set each column's DataPropertyName to the corresponding property of the Car class.
I am not getting any exception or error here.
Thanks very much in advance!
I came across the exact same problem in VB.
I found out that the solution was this (I'll just write my code in VB; you can translate it):
Before setting the DataSource of the grid, you should clear the grid out.
carsGrid.DataSource = Nothing
carsGrid.Rows.Clear()
carsGrid.Columns.Clear()
Then set your grid's DataSource as usual. In my case:
carsGrid.DataSource = GetEmptyObject._Get()
Hope it helps.
foreach (DataRow row in result.Tables[0].Rows)
{
    carsList.Add(new Car(Convert.ToInt32(row[0]), row[1].ToString(), row[2].ToString(), row[3].ToString(), Convert.ToDecimal(row[4]), Convert.ToInt32(row[5]), row[6].ToString(), row[7].ToString()));
}
Please check your carsList by setting a breakpoint after the foreach loop, to verify that it contains at least a single data row. Also check your query.
If your application is an ASP.NET application, try modifying your code as below:
public managerCarForm()
{
    InitializeComponent();
    selectCars();
    carsGrid.DataSource = carsList;
    carsGrid.DataBind();
}
Normally this happens when you have manually added the columns at design time, or you have AutoGenerateColumns = false;.
If you use AutoGenerateColumns = true;, the columns will be auto-generated.
To solve this:
Right click on the grid -> Edit Columns.
Go to the property DataPropertyName.
Set it to the name of the property you bind to (the table column name in your case).
(You say you have done that, but the value here should exactly match what you have in your list. I made a DTO class, populated a List of my own via a loop, and set the column names to match the properties of that DTO. This should solve it for you; see the sketch below.)
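For illustration, a sketch of what the Car class needs to look like for the binding to work. The property names here are assumptions, since the question doesn't show the class; what matters is that DataGridView binding only sees public properties, never fields, and each designer column's DataPropertyName must match a property name exactly:
public class Car
{
    // Each designer column's DataPropertyName must exactly match
    // one of these public property names (assumed names, not from the question)
    public int CarID { get; set; }
    public string Make { get; set; }
    public string Model { get; set; }

    public Car(int carID, string make, string model)
    {
        CarID = carID;
        Make = make;
        Model = model;
    }
}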
I have a problem with a DataGridView in C#. I get the data via a query from a MySQL database, but for some reason only the first row of the result is displayed in the grid view. I use the following code:
MySqlCommand command = new MySqlCommand(query, Globals.Connection);
reader = command.ExecuteReader();
while (reader.Read())
{
    object[] dataset = new object[7];
    dataset[0] = reader["sto_name"];
    dataset[1] = reader["esss"];
    dataset[2] = reader["onl_name"];
    dataset[3] = reader["rpc_id"];
    if (reader["datum_aufstellung"].ToString() != "0")
    {
        dataset[4] = getDate(reader["datum_aufstellung"]);
    }
    else
    {
        dataset[4] = "Kein Datum gesetzt"; // German for "No date set"
    }
    if (reader["datum_abbau"].ToString() != "0")
    {
        dataset[5] = getDate(reader["datum_abbau"]);
    }
    else
    {
        dataset[5] = "Kein Datum gesetzt"; // German for "No date set"
    }
    dataset[6] = reader["id"];
    dataGridView1.Rows.Add(dataset);
}
It worked a few lines of code earlier. ^^
Do you have an idea what the problem is?
UPDATE:
The content of the while loop is executed only once. I was wondering about that fact before I asked my question here, but if I execute the query in a MySQL client, it returns more than one row.
UPDATE 2:
I've noticed that when the whole content of the while loop is commented out, the loop executes exactly as many times as there are rows in the query result. Any ideas?
UPDATE 3:
It seems that everything after
dataGridView1.Rows.Add(dataset);
is not executed. Not only in the loop, but in the whole function. But why?
PROBLEM SOLVED:
There was nothing wrong with the code posted here. I had an event elsewhere in the code that executed something when a row in the DGV is entered. I suppose the loop broke when that happened; after removing that event, the DGV was filled properly.
You should use a DataAdapter to populate a DataTable. Create a method for grabbing the data like this (shown with the MySQL connector classes, since the question uses MySQL):
public DataTable GetDataTable(string query)
{
    DataTable dt = new DataTable();
    using (MySqlConnection con = new MySqlConnection(@"YourConnectionString"))
    {
        using (MySqlCommand cmd = new MySqlCommand(query, con))
        {
            using (MySqlDataAdapter adapter = new MySqlDataAdapter())
            {
                adapter.SelectCommand = cmd;
                con.Open();
                adapter.Fill(dt);
                return dt;
            }
        }
    }
}
Then you can reference it from your DataGridView:
DataTable result = GetDataTable(query);
dataGridView1.DataSource = result;
Check the RowCount property on the grid. It may be set to 1; if so, increase it to the desired number of rows.
There is another easy way: use a DataTable instead of adding row arrays one by one, and set the DataGridView's DataSource to that DataTable.
It will solve your problem, and it also makes it easy to set the column headers.
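A minimal sketch of that approach, reusing the command from the question (the header rename is purely illustrative):
// Load the whole result set into a DataTable and bind it in one step
DataTable dt = new DataTable();
using (MySqlDataReader reader = command.ExecuteReader())
{
    dt.Load(reader);
}
dt.Columns["sto_name"].ColumnName = "Store"; // optional: friendlier column header
dataGridView1.DataSource = dt;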
I am using WinForms for my application and SQL Server as the database.
I want that, as soon as any text is typed into a textbox, results are immediately fetched/searched from the SQL Server database tables for the supplied text.
For this, I have the following code:
public partial class Form1 : Form
{
    SqlConnection conn = new SqlConnection();

    public Form1()
    {
        conn.ConnectionString = "Trusted_Connection=true";
        conn.Open();
        InitializeComponent();
    }

    private void textBox1_TextChanged(object sender, EventArgs e)
    {
        DataTable dt = new DataTable();
        // Parameterised to avoid SQL injection from the raw textbox input
        SqlCommand cmd = new SqlCommand("SELECT * FROM items WHERE item_name LIKE @search", conn);
        cmd.Parameters.AddWithValue("@search", textBox1.Text + "%");
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            dt.Load(reader);
        }
        dataGridView1.DataSource = dt;
    }
}
But as this fetches data from the database on every change, it takes more time, and I want a faster way. Should I use DataSets for this purpose, as DataSets are used for a disconnected environment?
OR
Should I first fetch the whole items table from the database into a grid when the form is opened, and then, when text is entered in the textbox, search in the grid rather than the SQL database? Would that be faster?
Which way would be more efficient? The items table has 3.4 million records.
How big is your items table?
If it's not big, it will do to just store it in a DataSet and search that from the same textbox.
If it's big, I would suggest using a timer: on each text change, restart a timer of maybe 0.5 seconds, and only query the database when the timer elapses. This prevents firing multiple queries while the user is still typing; a sketch follows.
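A sketch of that timer idea in WinForms (the 500 ms interval and the RunSearch method are assumptions, not from the question):
private readonly Timer searchTimer = new Timer { Interval = 500 };

public Form1()
{
    InitializeComponent();
    // Query only after the user has paused typing for half a second
    searchTimer.Tick += (s, e) =>
    {
        searchTimer.Stop();
        RunSearch(textBox1.Text); // hypothetical method that queries the database
    };
}

private void textBox1_TextChanged(object sender, EventArgs e)
{
    // Restart the countdown on every keystroke
    searchTimer.Stop();
    searchTimer.Start();
}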
Alternatively, you could read the whole table and assign it to the textbox's AutoCompleteCustomSource:
textBox1.AutoCompleteMode = AutoCompleteMode.SuggestAppend;
textBox1.AutoCompleteSource = AutoCompleteSource.CustomSource; // required for the custom source to be used
foreach (DataRow row in dt.Rows)
    textBox1.AutoCompleteCustomSource.Add(row["item_name"] as string);
Yes, using a DataSet and searching in it would be much faster. Since you are using WinForms, the memory footprint is probably not an issue unless you are fetching a huge number of rows from the database.
Also, you should probably not search on every text change, but wait for a small amount of time, say two seconds, during which there are no changes to the textbox, and only then fetch. Otherwise you would be fetching for every new character entered in the textbox.
A better approach would be to use a DataSet/DataTable: read all the data from the table on form load, keep it in the form, and search that in-memory copy, as sketched below.
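A sketch of that approach using a DataView filter over the cached table (the field itemsTable and the column name are assumptions; whether 3.4 million rows fit comfortably in memory is a separate question, and the timer approach above may be needed as well):
private DataTable itemsTable; // filled once on form load, e.g. by a data adapter

private void textBox1_TextChanged(object sender, EventArgs e)
{
    // Filter the in-memory copy instead of querying the server;
    // single quotes are escaped to keep the filter expression valid
    DataView view = itemsTable.DefaultView;
    view.RowFilter = string.Format("item_name LIKE '{0}%'",
        textBox1.Text.Replace("'", "''"));
    dataGridView1.DataSource = view;
}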