I have developed a WinForms C# application and am now adding a recovery option, so that if it closes unexpectedly everything can be recovered on the next run.
I have managed to recover almost everything (lists, ints, strings, etc.).
The only issue I am facing is restoring a DataTable. While my application runs, records are added to this DataTable, and at the end the user can export it to CSV.
I tried adding the DataTable to Properties.Settings.Default, but it does not work: on the next run it is always null.
Any suggestion on the best way to save and restore a DataTable, keeping in mind the records can exceed 10-15 k during a run?
Thank you
Properties.Settings can store string data, so you can serialize your DataTable and store it; later you can deserialize the string to get the DataTable back. You can use Json.NET like:
var serializedDt = JsonConvert.SerializeObject(dt);
//store the string
to retrieve back:
DataTable yourDataTable = JsonConvert.DeserializeObject<DataTable>(serializedDt);
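To carry it across runs, the serialized string can live in a user-scoped setting. A minimal sketch, assuming a string setting named DataTableBackup was added in the project's Settings designer:
// Save, e.g. in FormClosing or on a timer
// ("DataTableBackup" is an assumed setting name)
Properties.Settings.Default.DataTableBackup = JsonConvert.SerializeObject(dt);
Properties.Settings.Default.Save();

// Restore on startup
string saved = Properties.Settings.Default.DataTableBackup;
if (!string.IsNullOrEmpty(saved))
    dt = JsonConvert.DeserializeObject<DataTable>(saved);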
One more thing to add: if you are expecting large data, you may look at options to store the data in a client-side database, such as SQLite.
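For illustration, a sketch with the Microsoft.Data.Sqlite package (the database file, table, and columns are all assumed names):
using Microsoft.Data.Sqlite;

// Append each record as it is created, so a crash loses at most the current row.
// Assumes a Records table was created at startup; value1/value2 are the new row's fields.
using (var conn = new SqliteConnection("Data Source=recovery.db"))
{
    conn.Open();
    var cmd = conn.CreateCommand();
    cmd.CommandText = "INSERT INTO Records (Col1, Col2) VALUES ($c1, $c2)";
    cmd.Parameters.AddWithValue("$c1", value1);
    cmd.Parameters.AddWithValue("$c2", value2);
    cmd.ExecuteNonQuery();
}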
Serialize the object, but place it in a recovery file. During recovery startup, just read the file, and you won't have to worry about space.
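A minimal sketch of that idea, reusing the Json.NET serialization above (the file name is arbitrary):
using System.IO;

// On every change (or on a timer), write the table out
File.WriteAllText("recovery.json", JsonConvert.SerializeObject(dt));

// On the next run, restore it if the previous run left a file behind
if (File.Exists("recovery.json"))
    dt = JsonConvert.DeserializeObject<DataTable>(File.ReadAllText("recovery.json"));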
Related
I'm having trouble coming up with a way to generate reports (either via xlsx or MS Report Viewer). The biggest issue is that my dataset is dynamically created. Here's what I have:
One or more terminals periodically fill a cloud-hosted database with information: where the data is coming from, the global key, and the value corresponding to the pair (origin, key).
So I end up with a table like:
TERMINAL_NO | GLOBAL_KEY | DECIMAL_VALUE
============|============|==============
123         | 9876       |  1.00
123         | 9875       |  0.50
123         | 9872       | -4.00
234         | 9876       |  3.00
234         | 9875       |  5.45
234         | 9872       |  2.50
And I have made an app that transforms that into a DataSet with one column per TERMINAL_NO, where each row is a distinct GLOBAL_KEY, storing the corresponding DECIMAL_VALUE for each pair. However, that's the issue: I can't find a way to generate reports or xlsx files from dynamically created DataSets, as I understood both of them need typed DataSets to work with.
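For reference, the transformation I'm doing looks roughly like this (a simplified sketch; it assumes the raw rows sit in a DataTable called source with int/decimal columns):
using System.Data;
using System.Linq;

// Pivot (TERMINAL_NO, GLOBAL_KEY, DECIMAL_VALUE) rows into one column per terminal
var pivot = new DataTable("Pivot");
pivot.Columns.Add("GLOBAL_KEY", typeof(int));
foreach (var terminal in source.AsEnumerable()
                               .Select(r => r.Field<int>("TERMINAL_NO")).Distinct())
    pivot.Columns.Add(terminal.ToString(), typeof(decimal));

foreach (var keyGroup in source.AsEnumerable().GroupBy(r => r.Field<int>("GLOBAL_KEY")))
{
    var row = pivot.NewRow();
    row["GLOBAL_KEY"] = keyGroup.Key;
    foreach (var r in keyGroup)
        row[r.Field<int>("TERMINAL_NO").ToString()] = r.Field<decimal>("DECIMAL_VALUE");
    pivot.Rows.Add(row);
}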
Is there an easier way to gather that data, or am I doing this wrong?
08-15-20 EDIT
As per @jdweng's suggestion, I did try the method depicted there, but I can't make the ReportViewer display the actual data. Currently I have the following:
ReportDataSource rds = new ReportDataSource("DataSetStock", LoadData());
//LoadData() returns a filled datatable.
RView.ProcessingMode = ProcessingMode.Local;
RView.LocalReport.DataSources.Clear();
RView.LocalReport.DataSources.Add(rds);
RView.LocalReport.ReportPath = "StockReport.rdlc";
RView.RefreshReport();
Yet all I get is an empty ReportViewer inside the WindowsFormsHost control.
As per @nbk's suggestion, I went with the report-generator code made by Nadir (https://www.codeproject.com/Tips/888174/Dynamically-Creating-an-RDLC-Report-Just-Using-a-D), using the provided .cs files.
I just had to make a few changes to the XAML because I'm not using a logo image. I already had my DataSet (I just had to change the column names to be CLS-compliant) and it worked flawlessly. I didn't have to use the UserControl folder, as I'm working in WPF and already had a WindowsFormsHost control with a ReportViewer on it.
What is the best approach for storing information gathered locally in .csv files in a C#/.NET SQL database? My reasons for asking are:
1: The data I have to handle is massive (millions of rows in each CSV). 2: The data is extremely precise, since it describes measurements on a nanoscopic scale, and is therefore delicate.
My first thought was to store each row of the CSV in a corresponding row in the database. I did this using the DataTable class. When done, I felt that if something went wrong while parsing the .csv file, I would never notice.
My second thought is to upload the .csv files to the database in their .csv format and later parse them from the database into the local environment when the user asks for the data. If this is even possible in C#/.NET with Visual Studio 2013, how could it be done in an efficient and secure manner?
I used the .NET DataStreams library's CSV reader in my project. It uses the SqlBulkCopy class, though it is not free.
Example:
using (CsvDataReader csvData = new CsvDataReader(path, ',', Encoding.UTF8))
{
    // will read in first record as a header row and
    // name columns based on the values in the header row
    csvData.Settings.HasHeaders = true;

    csvData.Columns.Add("nvarchar");
    csvData.Columns.Add("float"); // etc.

    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "DestinationTable";
        bulkCopy.BulkCopyTimeout = 3600;

        // Optionally, you can declare column mappings using the bulkCopy.ColumnMappings property
        bulkCopy.WriteToServer(csvData);
    }
}
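If the CSV header names don't match the destination table's columns, the mappings mentioned in the comment would be added before the WriteToServer call, along these lines (the column names here are hypothetical):
// Map CSV header names to destination column names
bulkCopy.ColumnMappings.Add("csv_first_name", "FirstName");
bulkCopy.ColumnMappings.Add("csv_last_name", "LastName");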
It sounds like you are simply asking whether you should store a copy of the source CSV in the database, so if there was an import error you can check to see what happened after the fact.
In my opinion, this is probably not a great idea. It immediately makes me ask, how would you know that an error had occurred? You certainly shouldn't rely on humans noticing the mistake so you must develop a way to programmatically check for errors. If you have an automated error checking method you should apply that method when the import occurs and avoid the error in the first place. Do you see the circular logic here?
Maybe I'm missing something but I don't see the benefit of storing the CSV.
You should probably use BULK INSERT, with your .csv file as the source.
But this will only work if the file is accessible from the machine that is running your SQL Server.
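A minimal sketch of issuing it from C# (the table name and file path are placeholders, and the path is resolved on the server, not the client):
using System.Data.SqlClient;

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    @"BULK INSERT dbo.Measurements
      FROM 'C:\data\measurements.csv'
      WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)", conn))
{
    conn.Open();
    cmd.ExecuteNonQuery(); // FIRSTROW = 2 skips the header line
}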
Here you can find a nice solution as well. In short, it looks like this:
using (StreamReader file = new StreamReader(bulk_data_filename))
using (CsvReader csv = new CsvReader(file, true, ',')) // CsvReader from the linked solution; true = has header row
using (SqlBulkCopy copy = new SqlBulkCopy(conn))
{
    copy.DestinationTableName = tablename;
    copy.WriteToServer(csv);
}
I am doing a small project to learn how to use DataSet, but I have a small problem. Consider the following code:
foreach (DatabaseDataSet.ApplicationRow rowApplication in database.Application)
{
    if (rowApplication.AID.ToString() == lblIDApplication.Text)
    {
        rowApplication.Date = tbApplicationDatum.Text;
        rowApplication.Status = tbApplicationStatus.Text;
        applicationAdapter.Update(rowApplication);
        break;
    }
}
I don't know why, but the database doesn't get updated. The DataRow is being updated, as when I read the data again I see the new value. But when I re-run my application it's back to its old value again. Any help?
EDIT: I'm working with a strongly typed DataSet.
You need to call the Update method of your adapter to propagate the changes.
AcceptChanges only updates the changes in memory for the row and does not migrate them to the database.
MSDN
AcceptChanges and RejectChanges only apply to DataRow related changes
(that is, Add, Remove, Delete, and Modify). They are not applicable to
schema or structural changes.
Calling AcceptChanges will not replicate these changes back to the
data source if the DataSet was filled using a DataAdapter. In that
situation, call Update instead
See Updating Data Sources with DataAdapters for more information
It's important to remember that the DataSet is a 'local copy' of the data, not a 'live link' to the DB. If your DataSet is populated by an IDataAdapter (say, a TableAdapter), you need to call the DataAdapter's Update method, passing in the updated DataSet, to sync the results back to the underlying DB.
Also, I suspect you DON'T want to be doing 'new ApplicationTableAdapter()', because typically you would want to update with the same TableAdapter you populated with; at the least you would need to ensure you had the correct connection, query, etc. set up.
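A minimal sketch of the intended flow, reusing the names from the question (FindByAID is hypothetical; the typed DataSet only generates it when AID is the table's primary key):
// Locate and modify the row through the typed DataSet
var row = database.Application.FindByAID(int.Parse(lblIDApplication.Text));
row.Date = tbApplicationDatum.Text;
row.Status = tbApplicationStatus.Text;

// Push the pending changes back with the SAME adapter instance that filled the table
applicationAdapter.Update(database.Application);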
SOLUTION: It turns out that nothing was wrong with the code. I had two ConnectionStrings defined in App.config; I had forgotten to remove the first one after I removed a previous database that had errors in it. Upon removing the first ConnectionString, everything worked.
I've created a new project in .NET (Visual Studio 2010, .NET 4.0) and added an SDF data file. I've generated a DataSet and created a table in it (and I believe generated the Fill and other methods).
In code, I'm trying to add a row to the database.
eBureauScrubber.App_Data.matchingtempDataSet ds = new App_Data.matchingtempDataSet();
eBureauScrubber.App_Data.matchingtempDataSet.ctfFileRow row = ds.ctfFile.NewctfFileRow();
row.Address = "123 Main St.";
row.City = "Overland Park";
row.FirstName = "Matt";
row.LastName = "Dawdy";
row.rownum = 1;
EDIT: Added the next bit of code.
ds.ctfFile.Rows.Add(row);
ds.ctfFile.AcceptChanges();
ds.AcceptChanges();
eBureauScrubber.App_Data.matchingtempDataSetTableAdapters.ctfFileTableAdapter ctfa = new App_Data.matchingtempDataSetTableAdapters.ctfFileTableAdapter();
ctfa.Update(ds.ctfFile);
This runs fine. However, after the program completes, the data is not persisted in the database. What am I missing?
EDIT: I've tried all different combinations of AcceptChanges() on the datatable, the dataset, running update() before, after, etc. I'm missing something huge here. I'm not even sure it is connecting to the "right" database. Maybe that's my problem.
EDIT 2: Here's what I did to get this to work (it's still funky, though).
1. Change the properties of my DB file in App_Data to "Do Not Copy".
2. Manually copy that DB file to bin\debug\app_data.
3. Use the data adapter's Fill method to fill the ds.ctfFile data table.
4. Create a row (.NewctfFileRow()).
5. Set values on that row.
6. ds.ctfFile.Rows.Add(row)
7. ds.ctfFile.AcceptChanges();
8. ds.AcceptChanges();
9. Call the adapter's Update method.
Now the data is in my database file (in bin\debug\app_data), but I can't see it through the Data Sources connection. I'm still trying to find out how to do that.
It should have generated a TableAdapter class with a .Update() method that you have to call to save data in your database. See MSDN for some examples.
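For reference, a sketch of the save sequence using the names from the question; the key point is that AcceptChanges must not run before Update:
ds.ctfFile.Rows.Add(row);

// Update issues the INSERT and, on success, accepts the changes itself
var ctfa = new eBureauScrubber.App_Data.matchingtempDataSetTableAdapters.ctfFileTableAdapter();
ctfa.Update(ds.ctfFile);

// Do NOT call ds.AcceptChanges() before Update: it resets every row's
// RowState to Unchanged, so Update finds nothing to write.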
I have a list of maybe 50,000 entries that is populated into a DataGrid in WPF. Now I want to save the data in the list to a file, plain text or preferably CSV, but the list is too big. The problem is with my implemented method, whether simple text-file writing or copying the contents from the DataGrid to the clipboard, back to a string, and then from that string to a file using a StreamWriter: it consumes approximately 4-5 minutes, even running in a BackgroundWorker.
Is there any way I can save a huge list to a file quickly?
I am using a DataGrid in WPF.
CODE
dataGrid1.SelectAllCells();
dataGrid1.ClipboardCopyMode = DataGridClipboardCopyMode.IncludeHeader;
ApplicationCommands.Copy.Execute(null, dataGrid1);
String result = (string)Clipboard.GetData(DataFormats.CommaSeparatedValue);
// Never reaches the step below; the thread stays on the line above
dataGrid1.UnselectAllCells();
Clipboard.Clear();

using (StreamWriter file = new StreamWriter(SavePageRankToPDF.FileName))
{
    file.WriteLine(result);
}
Instead of using the clipboard, why not iterate through the DataTable and build the CSV file yourself?
Update
Here are some examples:
Convert DataTable to CSV stream
Converting DataSet\DataTable to CSV
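A minimal sketch of that approach (quoting kept deliberately simple; the method and path names are placeholders):
using System.Data;
using System.IO;
using System.Linq;

static void WriteCsv(DataTable table, string path)
{
    using (var writer = new StreamWriter(path))
    {
        // Header row
        writer.WriteLine(string.Join(",", table.Columns.Cast<DataColumn>()
                                               .Select(c => c.ColumnName)));

        // One line per row; quote fields so embedded commas don't break the format
        foreach (DataRow row in table.Rows)
            writer.WriteLine(string.Join(",", row.ItemArray
                .Select(f => "\"" + (f ?? "").ToString().Replace("\"", "\"\"") + "\"")));
    }
}
This streams straight to disk, so it avoids both the clipboard round-trip and building one giant string in memory.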
One thing that will help is to not load ALL of your data into the datagrid when using it for display purposes. It'd be a good idea to use paging: only load the data into the datagrid that will be needed for calculations or display purposes. If the user wants to see/use more data, go back to your data source and get more of the data. Not only will your app run faster, you'll use much less memory.
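A sketch of the paging idea (the Record type, page size, and names are illustrative):
using System.Collections.Generic;
using System.Linq;

// "Record" stands in for whatever row type the grid displays
const int PageSize = 500;

static IEnumerable<Record> GetPage(IEnumerable<Record> allRecords, int pageIndex)
{
    return allRecords.Skip(pageIndex * PageSize).Take(PageSize);
}

// Bind one page at a time; re-bind when the user pages forward
dataGrid1.ItemsSource = GetPage(records, 0).ToList();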