Import large XML file into SQL Server CE - c#

I am trying to import data from an XML file into a SQL Server CE database. I am using the ErikEJ SQL Server Compact Bulk Insert Library (from NuGet; this library on CodePlex). I create the database and a table, then read the XML into a DataTable and import that DataTable into the database table.
DataSet ds = new DataSet();
ds.ReadXml("myxml.xml");
DataTable table = new DataTable();
table = ds.Tables[0];
String connString = @"Data Source = test.sdf";
SqlCeBulkCopy bulkInsert = new SqlCeBulkCopy(connString);
bulkInsert.DestinationTableName = "testtable";
bulkInsert.WriteToServer(table);
It works with a small XML file, but when I use a large one (more than 1 GB) I get this error on ReadXml:
"System.OutOfMemoryException" in mscorlib.dll
How can I fix this?
Update: I know this error occurs because the XML file is so large. The question is how to optimize this algorithm, maybe by using a buffer or reading the XML part by part. Any ideas?

There is no simple library that will solve this for you.
You need to read the XML file in a streaming fashion (Reading Xml with XmlReader in C#) so the entire file is never loaded into memory. For each element you read, add the values to a List or DataTable until you have, say, 100,000 entries, then bulk insert those, dispose/clear any objects you no longer need, and continue until the entire file has been read.
In addition, calls to SqlCeBulkCopy should be wrapped in using statements to dispose of unmanaged resources:
using (SqlCeBulkCopy bulkInsert = new SqlCeBulkCopy(connString))
{
bulkInsert.DestinationTableName = "testtable";
bulkInsert.WriteToServer(table);
}
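A minimal sketch of that batched, streaming approach, assuming a flat layout with repeating <record> elements and two string columns called Name and Value (the element names, columns and batch size are placeholders; adjust them to your actual XML and the schema of testtable):
using System;
using System.Data;
using System.Xml;
using System.Xml.Linq;
using ErikEJ.SqlCe; // namespace of the SqlCeBulkCopy library; adjust if your version differs

class XmlImporter
{
    const int BatchSize = 100000;

    public static void Import(string xmlPath, string connString)
    {
        DataTable batch = CreateBatchTable();

        using (XmlReader reader = XmlReader.Create(xmlPath))
        {
            reader.MoveToContent();
            while (!reader.EOF)
            {
                if (reader.NodeType == XmlNodeType.Element && reader.Name == "record")
                {
                    // XNode.ReadFrom materializes only the current element, not the whole document.
                    XElement element = (XElement)XNode.ReadFrom(reader);
                    batch.Rows.Add(
                        (string)element.Element("Name"),
                        (string)element.Element("Value"));

                    if (batch.Rows.Count >= BatchSize)
                        Flush(batch, connString);
                }
                else
                {
                    reader.Read();
                }
            }
        }

        if (batch.Rows.Count > 0)
            Flush(batch, connString);
    }

    static DataTable CreateBatchTable()
    {
        DataTable table = new DataTable();
        table.Columns.Add("Name", typeof(string));
        table.Columns.Add("Value", typeof(string));
        return table;
    }

    static void Flush(DataTable batch, string connString)
    {
        using (SqlCeBulkCopy bulkInsert = new SqlCeBulkCopy(connString))
        {
            bulkInsert.DestinationTableName = "testtable";
            bulkInsert.WriteToServer(batch);
        }
        batch.Clear();
    }
}
Memory use now depends on the batch size rather than on the size of the XML file, so a 1 GB file can be imported in roughly constant memory.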

Related

How to update Entity Framework from DataSet that is reading XML?

I want to update my database over a WCF Data Service and the Entity Framework from an XML file read from disk.
I am using a DataSet to read the XML file, but I don't know how to update the database from the DataSet via the Entity Framework part.
I am using a DataSet to read the XML because it works so well, but if there is another way to read the XML into the database via Entity Framework, that would be great.
The code I want looks like the following.
private void AddToDBFromXMLFile(string xmlFile)
{
// this is reading my XML just fine, but i suspect there is another way to
// get the XML into the
// database, otherwise how can I get the XML into the database via the EF?
DataSet dataSet = new DataSet();
dataSet.ReadXml(xmlFile, XmlReadMode.InferSchema);
// this is working and can update and retrieve my data using the usual methods
var proxy = new TestApp.ServiceReference.TestEntities(new
Uri("http://localhost:4976/TestWcfDataService.svc/"));
// how do I update the database from the dataset?
//proxy << dataset
}
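One way to bridge the two is to walk the DataSet rows, build one entity per row, queue it on the generated proxy (which derives from DataServiceContext) and push everything with SaveChanges. This is only a hedged sketch: the Student entity, the "Students" entity set and the column names are hypothetical placeholders for whatever your service actually exposes.
using System;
using System.Data;

private void AddToDBFromXMLFile(string xmlFile)
{
    DataSet dataSet = new DataSet();
    dataSet.ReadXml(xmlFile, XmlReadMode.InferSchema);

    var proxy = new TestApp.ServiceReference.TestEntities(
        new Uri("http://localhost:4976/TestWcfDataService.svc/"));

    foreach (DataRow row in dataSet.Tables[0].Rows)
    {
        // Hypothetical entity type and column names -- map your own here.
        var student = new Student
        {
            Name = row.Field<string>("Name"),
            Age = row.Field<int>("Age")
        };
        proxy.AddObject("Students", student); // or the generated proxy.AddToStudents(student)
    }

    // Sends the queued inserts to the service (consider SaveChangesOptions.Batch
    // if the service supports batching).
    proxy.SaveChanges();
}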

Passing Xml output from Stored Procedure to AJAX client (web browser) with least overhead (memory, processing time) due to the Asp.Net middle tier?

I am using JQuery to perform async calls to a WebService written in C#. The WebService invokes a Stored Procedure (located in a SQL Server Database). The stored procedure returns results as XML (I use 'for xml' in the procedure to retrieve the result as XML).
The question is: how can I place the XML returned by the stored procedure (which is streamed by SQL Server) onto the HTTP response stream of the web service with the least memory usage and processing time?
At the moment, my web service returns an XmlDocument object. I would like to know if there is a way to reduce the overhead of creating XmlDocument or DataSet objects, which consume both memory and processor time.
Ideal solution would be:
Connect the result stream from SQL Server to the HTTP response stream. This way, whatever comes from SQL Server gets to the client without the content being touched. No new objects are created (like DataSet or XmlDocument), and XML parsing while building XmlDocuments is avoided, keeping the memory and processing footprint to a minimum. Also, whether the XML returned by the procedure is small or very large, since streams are used, memory usage will not grow (which is the case if XmlDocuments are created).
My reduced c# webservice method:
public XmlDocument GetXmlDataFromDB()
{
string connStr = System.Convert.ToString(
System.Web.Compilation.ConnectionStringsExpressionBuilder.GetConnectionString("DbConnectionString"),
System.Globalization.CultureInfo.CurrentCulture);
SqlConnection conn = new SqlConnection(connStr);
SqlCommand sqlCmd = new SqlCommand("stp_GetXmlData", conn);
sqlCmd.CommandType = CommandType.StoredProcedure;
SqlDataAdapter sda = new SqlDataAdapter(sqlCmd);
DataSet ds = new DataSet();
conn.Open();
XmlReader xmlReader = sqlCmd.ExecuteXmlReader();
while (xmlReader.Read())
{
ds.ReadXml(xmlReader);
}
conn.Close();
XmlDocument xmlDoc = new XmlDocument();
xmlDoc.InnerXml = ds.GetXml();
return xmlDoc;
}
Any suggestions to improve the performance of the above code?
In your code, you don't need the DataSet, the SqlDataAdapter, or InnerXml:
var doc = new XmlDocument();
...
var reader = cmd.ExecuteXmlReader();
if (reader.Read())
doc.Load(reader);
Alternatively, you can use reader.ReadOuterXml() to get the XML as a string, without constructing a document.
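Putting that together with the original method (a sketch only, reusing the connection-string lookup and stored procedure name from the question):
public XmlDocument GetXmlDataFromDB()
{
    string connStr = System.Convert.ToString(
        System.Web.Compilation.ConnectionStringsExpressionBuilder.GetConnectionString("DbConnectionString"),
        System.Globalization.CultureInfo.CurrentCulture);

    XmlDocument doc = new XmlDocument();
    using (SqlConnection conn = new SqlConnection(connStr))
    using (SqlCommand cmd = new SqlCommand("stp_GetXmlData", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        conn.Open();
        using (XmlReader reader = cmd.ExecuteXmlReader())
        {
            if (reader.Read())
                doc.Load(reader); // no DataSet, no SqlDataAdapter, no InnerXml
        }
    }
    return doc;
}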
You could also consider using WCF services for ASP.NET AJAX.
Obligatory warning:
"We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil" - Donald Knuth
Having said that, you could make SQL Server return a string instead of an xml type, like:
; with Query(XmlColumn) as
(
SELECT *
FROM YourTable
for xml auto
)
select cast(XmlColumn as varchar(max)) as StringColumn
from Query
The WITH construct (a common table expression) gives a name to the FOR XML result. You can stream the resulting varchar(max) column back to the client with Response.Write. This would avoid any parsing or object construction in the ASP.NET middle tier.
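Taking that one step further, the middle tier does not need to build any object at all: read the varchar(max) column in chunks and write the chunks straight to the response. A sketch (the procedure name stp_GetXmlAsString is a placeholder for a procedure built around the query above):
public void StreamXmlToResponse(HttpResponse response, string connStr)
{
    response.ContentType = "text/xml";
    using (SqlConnection conn = new SqlConnection(connStr))
    using (SqlCommand cmd = new SqlCommand("stp_GetXmlAsString", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        conn.Open();
        // SequentialAccess streams the large value instead of buffering it.
        using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
        {
            char[] buffer = new char[8192];
            while (reader.Read())
            {
                long offset = 0;
                long read;
                while ((read = reader.GetChars(0, offset, buffer, 0, buffer.Length)) > 0)
                {
                    response.Write(buffer, 0, (int)read);
                    offset += read;
                }
            }
        }
    }
}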

Can a .csv file be used as a data source in Visual Studio 2008?

I'm pretty new to C# and Visual Studio. I'm writing a small program that will read a .csv file and then write the records read to a SQL Server database table.
I can manually parse the .csv file, but I was wondering if it is possible to somehow "describe" the .csv file to Visual Studio so that I can use it as a data source? I should mention that the first two lines in the .csv file contain header information and the following lines are the actual comma-delimited data.
Also, I should mention that this program is a stand-alone console program with no user interface.
This is a great example of using the power of LINQ. Here's a quick reference with an example of how to do it.
The rundown is this: you can read your CSV into a string array, then use LINQ to query against that collection. As Reed points out, though, you'll have to code around your header lines, as they will throw off your query.
You can also use the TextFieldParser to handle escaped commas. Here's an example on thinqlinq that uses the TextFieldParser to parse the file and a LINQ query to get the results. It even has a unit test to make sure escaped commas are handled.
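A minimal sketch of that idea (it assumes the two-line header from the question, simple values with no embedded commas, and made-up column positions):
using System;
using System.IO;
using System.Linq;

string[] lines = File.ReadAllLines("data.csv");

var records = lines
    .Skip(2)                          // skip the two header lines
    .Select(line => line.Split(','))  // naive split; use TextFieldParser (below) for quoted fields
    .Select(fields => new
    {
        Name = fields[0],
        Amount = decimal.Parse(fields[1])
    });

foreach (var record in records)
{
    // e.g. build a DataRow or SqlCommand insert per record here
    Console.WriteLine("{0}: {1}", record.Name, record.Amount);
}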
If you have a 2 line header, it's not a standard CSV file.
In this case, the automatic tools won't work, and you'll have to revert to parsing the file manually.
If you want to remove one of the header lines, you might be able to use this technique of parsing CSV files into an ADO.NET DataTable.
If not, however, the TextFieldParser in the Microsoft.VisualBasic.dll assembly (usable from C# too) makes parsing CSV files very simple.
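For example (a sketch; add a reference to Microsoft.VisualBasic.dll and skip the two header lines from the question):
using System;
using Microsoft.VisualBasic.FileIO;

static void ParseCsv(string path)
{
    using (TextFieldParser parser = new TextFieldParser(path))
    {
        parser.TextFieldType = FieldType.Delimited;
        parser.SetDelimiters(",");
        parser.HasFieldsEnclosedInQuotes = true;

        // Throw away the two header lines.
        parser.ReadLine();
        parser.ReadLine();

        while (!parser.EndOfData)
        {
            string[] fields = parser.ReadFields();
            // map fields[...] onto a DataRow or SqlCommand parameters here
            Console.WriteLine(string.Join(" | ", fields));
        }
    }
}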
To parse it manually is very simple, and you could have a program that parses it, strips out the first two unnecessary lines and then feeds it directly to SSIS.
Here is a link for using LINQ to read it in:
http://blogs.msdn.com/wriju/archive/2009/05/24/linq-to-csv-getting-data-the-way-you-want.aspx
You can use the built-in OLE DB CSV parser via C# to parse a CSV file.
You can find a sample here
It basically lets you treat the CSV file like a database table.
The link in Development 4.0's post has disappeared. The code in that link was the following:
using System.Data;
using System.Data.OleDb;
using System.IO;

class CSVParser
{
    public static DataTable ParseCSV(string path)
    {
        if (!File.Exists(path))
            return null;

        string full = Path.GetFullPath(path);
        string file = Path.GetFileName(full);
        string dir = Path.GetDirectoryName(full);

        //create the "database" connection string
        string connString = "Provider=Microsoft.Jet.OLEDB.4.0;"
            + "Data Source=\"" + dir + "\\\";"
            + "Extended Properties=\"text;HDR=No;FMT=Delimited\"";

        //create the database query
        string query = "SELECT * FROM " + file;

        //create a DataTable to hold the query results
        DataTable dTable = new DataTable();

        //create an OleDbDataAdapter to execute the query
        OleDbDataAdapter dAdapter = new OleDbDataAdapter(query, connString);
        try
        {
            //fill the DataTable
            dAdapter.Fill(dTable);
        }
        catch (InvalidOperationException /*e*/)
        { }
        dAdapter.Dispose();
        return dTable;
    }
}

Search in DBF file using .idx file

I have a DBF file and an index file.
I want to read the index file and find the records that satisfy some condition.
(For example: find the records whose StudentName begins with "A", using Student.DBF and StudentName.idx.)
How do I do this programmatically?
It would be easiest to query via an OleDb connection:
using System.Data.OleDb;
using System.Data;
OleDbConnection oConn = new OleDbConnection("Provider=VFPOLEDB.1;Data Source=C:\\PathToYourDataDirectory");
OleDbCommand oCmd = new OleDbCommand();
oCmd.Connection = oConn;
oCmd.Connection.Open();
oCmd.CommandText = "select * from SomeTable where LEFT(StudentName,1) = 'A'";
// Create an OleDBAdapter to pull data down
// based on the pre-built SQL command and parameters
OleDbDataAdapter oDA = new OleDbDataAdapter(oCmd);
DataTable YourResults = new DataTable();
oDA.Fill(YourResults);
oConn.Close();
// then you can scan through the records to get whatever
String EachField = "";
foreach( DataRow oRec in YourResults.Rows )
{
EachField = oRec["StudentName"].ToString();
// but now, you have ALL fields in the table record available for you
}
I don't have the code off the top of my head, but if you do not want to use ODBC, you could look into reading ESRI shape files; they consist of three parts (or more): a .DBF file (what you are looking for), a .PRJ file and a .SHP file. It could take some work, but you should be able to dig out the code. Take a look at SharpMap on CodePlex. It's not a simple task to read a DBF without ODBC, but it can be done, and there is a lot of code out there for doing it. You have to deal with big-endian vs little-endian values, and a range of file versions as well.
If you go here you will find code to read a DBF file. Specifically, you would be interested in the public void ReadAttributes(Stream stream) method.

Reading Excel files from C#

Is there a free or open source library to read Excel files (.xls) directly from a C# program?
It does not need to be too fancy, just to select a worksheet and read the data as strings. So far, I've been using Export to Unicode text function of Excel, and parsing the resulting (tab-delimited) file, but I'd like to eliminate the manual step.
var fileName = string.Format("{0}\\fileNameHere", Directory.GetCurrentDirectory());
var connectionString = string.Format("Provider=Microsoft.Jet.OLEDB.4.0; data source={0}; Extended Properties=Excel 8.0;", fileName);
var adapter = new OleDbDataAdapter("SELECT * FROM [workSheetNameHere$]", connectionString);
var ds = new DataSet();
adapter.Fill(ds, "anyNameHere");
DataTable data = ds.Tables["anyNameHere"];
This is what I usually use. It is a little different because I usually stick an AsEnumerable() at the end of the Tables call:
var data = ds.Tables["anyNameHere"].AsEnumerable();
as this lets me use LINQ to search and build structs from the fields.
var query = data.Where(x => x.Field<string>("phoneNumber") != string.Empty).Select(x =>
new MyContact
{
firstName= x.Field<string>("First Name"),
lastName = x.Field<string>("Last Name"),
phoneNumber =x.Field<string>("Phone Number"),
});
If it is just simple data contained in the Excel file you can read the data via ADO.NET. See the connection strings listed here:
http://www.connectionstrings.com/?carrier=excel2007
or
http://www.connectionstrings.com/?carrier=excel
-Ryan
Update: then you can just read the worksheet via something like select * from [Sheet1$]
The ADO.NET approach is quick and easy, but it has a few quirks which you should be aware of, especially regarding how DataTypes are handled.
This excellent article will help you avoid some common pitfalls:
http://blog.lab49.com/archives/196
This is what I used for Excel 2003:
Dictionary<string, string> props = new Dictionary<string, string>();
props["Provider"] = "Microsoft.Jet.OLEDB.4.0";
props["Data Source"] = repFile;
props["Extended Properties"] = "Excel 8.0";
StringBuilder sb = new StringBuilder();
foreach (KeyValuePair<string, string> prop in props)
{
sb.Append(prop.Key);
sb.Append('=');
sb.Append(prop.Value);
sb.Append(';');
}
string properties = sb.ToString();
using (OleDbConnection conn = new OleDbConnection(properties))
{
conn.Open();
DataSet ds = new DataSet();
string columns = String.Join(",", columnNames.ToArray());
using (OleDbDataAdapter da = new OleDbDataAdapter(
"SELECT " + columns + " FROM [" + worksheet + "$]", conn))
{
DataTable dt = new DataTable(tableName);
da.Fill(dt);
ds.Tables.Add(dt);
}
}
How about Excel Data Reader?
http://exceldatareader.codeplex.com/
I've used it in anger, in a production environment, to pull large amounts of data from a variety of Excel files into SQL Server Compact. It works very well and it's rather robust.
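For reference, the codeplex-era API looks roughly like this (newer releases of the library have reworked this surface, so treat it as a sketch):
using System.Data;
using System.IO;
using Excel; // Excel Data Reader

static DataSet ReadWorkbook(string path)
{
    using (FileStream stream = File.Open(path, FileMode.Open, FileAccess.Read))
    using (IExcelDataReader reader = path.EndsWith(".xlsx")
        ? ExcelReaderFactory.CreateOpenXmlReader(stream)
        : ExcelReaderFactory.CreateBinaryReader(stream))
    {
        reader.IsFirstRowAsColumnNames = true;
        return reader.AsDataSet(); // one DataTable per worksheet
    }
}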
Here's some code I wrote in C# using .NET 1.1 a few years ago. Not sure if this would be exactly what you need (and may not be my best code :)).
using System;
using System.Data;
using System.Data.OleDb;
namespace ExportExcelToAccess
{
/// <summary>
/// Summary description for ExcelHelper.
/// </summary>
public sealed class ExcelHelper
{
private const string CONNECTION_STRING = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=<FILENAME>;Extended Properties=\"Excel 8.0;HDR=Yes;\";";
public static DataTable GetDataTableFromExcelFile(string fullFileName, ref string sheetName)
{
OleDbConnection objConnection = new OleDbConnection();
objConnection = new OleDbConnection(CONNECTION_STRING.Replace("<FILENAME>", fullFileName));
DataSet dsImport = new DataSet();
try
{
objConnection.Open();
DataTable dtSchema = objConnection.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);
if( (null == dtSchema) || ( dtSchema.Rows.Count <= 0 ) )
{
//raise exception if needed
}
if( (null != sheetName) && (0 != sheetName.Length))
{
if( !CheckIfSheetNameExists(sheetName, dtSchema) )
{
//raise exception if needed
}
}
else
{
//Reading the first sheet name from the Excel file.
sheetName = dtSchema.Rows[0]["TABLE_NAME"].ToString();
}
new OleDbDataAdapter("SELECT * FROM [" + sheetName + "]", objConnection ).Fill(dsImport);
}
catch (Exception)
{
//raise exception if needed
}
finally
{
// Clean up.
if(objConnection != null)
{
objConnection.Close();
objConnection.Dispose();
}
}
return dsImport.Tables[0];
#region Commented code for importing data from CSV file.
// string strConnectionString = "Provider=Microsoft.Jet.OLEDB.4.0;" +"Data Source=" + System.IO.Path.GetDirectoryName(fullFileName) +";" +"Extended Properties=\"Text;HDR=YES;FMT=Delimited\"";
//
// System.Data.OleDb.OleDbConnection conText = new System.Data.OleDb.OleDbConnection(strConnectionString);
// new System.Data.OleDb.OleDbDataAdapter("SELECT * FROM " + System.IO.Path.GetFileName(fullFileName).Replace(".", "#"), conText).Fill(dsImport);
// return dsImport.Tables[0];
#endregion
}
/// <summary>
/// This method checks if the user entered sheetName exists in the Schema Table
/// </summary>
/// <param name="sheetName">Sheet name to be verified</param>
/// <param name="dtSchema">schema table </param>
private static bool CheckIfSheetNameExists(string sheetName, DataTable dtSchema)
{
foreach(DataRow dataRow in dtSchema.Rows)
{
if( sheetName == dataRow["TABLE_NAME"].ToString() )
{
return true;
}
}
return false;
}
}
}
Koogra is an open-source component written in C# that reads and writes Excel files.
While you did specifically ask for .xls, implying the older file formats, for the OpenXML formats (e.g. xlsx) I highly recommend the OpenXML SDK (http://msdn.microsoft.com/en-us/library/bb448854.aspx)
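A short read-only sketch against the OpenXML SDK (DocumentFormat.OpenXml); it assumes the first sheet, assumes the workbook has a shared string table, and ignores cell types other than shared strings:
using System;
using System.Linq;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Spreadsheet;

static void DumpFirstSheet(string path)
{
    using (SpreadsheetDocument doc = SpreadsheetDocument.Open(path, false))
    {
        WorkbookPart workbookPart = doc.WorkbookPart;
        Sheet firstSheet = workbookPart.Workbook.Descendants<Sheet>().First();
        WorksheetPart worksheetPart = (WorksheetPart)workbookPart.GetPartById(firstSheet.Id);
        SharedStringTable sharedStrings = workbookPart.SharedStringTablePart.SharedStringTable;

        foreach (Row row in worksheetPart.Worksheet.Descendants<Row>())
        {
            foreach (Cell cell in row.Elements<Cell>())
            {
                string text = cell.InnerText;
                // Shared-string cells store an index into the shared string table.
                if (cell.DataType != null && cell.DataType.Value == CellValues.SharedString)
                    text = sharedStrings.ElementAt(int.Parse(text)).InnerText;
                Console.Write(text + "\t");
            }
            Console.WriteLine();
        }
    }
}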
I did a lot of reading from Excel files in C# a while ago, and we used two approaches:
The COM API, where you access Excel's objects directly and manipulate them through methods and properties
The ODBC driver that allows you to use Excel like a database.
The latter approach was much faster: reading a big table with 20 columns and 200 lines would take 30 seconds via COM, and half a second via ODBC. So I would recommend the database approach if all you need is the data.
Cheers,
Carl
ExcelMapper is an open source tool (http://code.google.com/p/excelmapper/) that can be used to read Excel worksheets as Strongly Typed Objects. It supports both xls and xlsx formats.
I want to show a simple method to read an xls/xlsx file with .NET. I hope the following will be helpful for you.
private DataTable ReadExcelToTable(string path)
{
//Connection String
string connstring = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + path + ";Extended Properties='Excel 8.0;HDR=NO;IMEX=1';";
//the Jet provider takes the same shape of connection string:
//string connstring = "Provider=Microsoft.JET.OLEDB.4.0;Data Source=" + path + ";Extended Properties='Excel 8.0;HDR=NO;IMEX=1';";
using(OleDbConnection conn = new OleDbConnection(connstring))
{
conn.Open();
//Get All Sheets Name
DataTable sheetsName = conn.GetOleDbSchemaTable(OleDbSchemaGuid.Tables,new object[]{null,null,null,"Table"});
//Get the First Sheet Name
string firstSheetName = sheetsName.Rows[0][2].ToString();
//Query String
string sql = string.Format("SELECT * FROM [{0}]",firstSheetName);
OleDbDataAdapter ada =new OleDbDataAdapter(sql,connstring);
DataSet set = new DataSet();
ada.Fill(set);
return set.Tables[0];
}
}
Code is from article: http://www.c-sharpcorner.com/uploadfile/d2dcfc/read-excel-file-with-net/. You can get more details from it.
Not free, but with the latest Office there's a very nice automation .NET API (there has been an API for a long while, but it was nasty COM). You can do everything you want/need in code, all while the Office app remains a hidden background process.
Forgive me if I am off base here, but isn't this what the Office PIAs are for?
Lately, partly to get better at LINQ, I've been using Excel's automation API to save the file as an XML Spreadsheet and then processing that file using LINQ to XML.
SpreadsheetGear for .NET is an Excel compatible spreadsheet component for .NET. You can see what our customers say about performance on the right hand side of our product page. You can try it yourself with the free, fully-functional evaluation.
SmartXLS is another excel spreadsheet component which support most features of excel Charts,formulas engines, and can read/write the excel2007 openxml format.
The .NET component Excel Reader .NET may satisfy your requirement. It's good enough for reading XLSX and XLS files. You can try it from:
http://www.devtriogroup.com/ExcelReader
I recommend the FileHelpers Library, which is a free and easy-to-use .NET library to import/export data from Excel, fixed-length or delimited records in files, strings or streams, and more.
The Excel Data Link Documentation Section
http://filehelpers.sourceforge.net/example_exceldatalink.html
You can try using this open source solution that makes dealing with Excel a lot cleaner.
http://excelwrapperdotnet.codeplex.com/
SpreadsheetGear is awesome. Yes, it's an expense, but compared to twiddling with these other solutions, it's worth the cost. It is fast, reliable, very comprehensive, and I have to say, after using this product in my full-time software job for over a year and a half, their customer support is fantastic!
The solution that we used needed to:
Allow Reading/Writing of Excel produced files
Be Fast in performance (not like using COMs)
Be MS Office Independent (needed to be usable without clients having MS Office installed)
Be Free or Open Source (but actively developed)
There are several choices, but we found NPOI (a .NET port of Java's long-established POI open source project) to be the best:
http://npoi.codeplex.com/
It also allows working with .doc and .ppt file formats
If it's just tabular data, I would recommend File Data Helpers by Marcos Melli, which can be downloaded here.
Late to the party, but I'm a fan of LinqToExcel
You could write an Excel spreadsheet that loads a given Excel spreadsheet and saves it as CSV (rather than doing it manually).
Then you could automate that from C#.
And once it's in CSV, the C# program can grok that.
(also, if someone asks you to program in excel, it's best to pretend you don't know how)
(edit: ah yes, rob and ryan are both right)
I know that people have been making an Excel "extension" for this purpose.
You more or less make a button in Excel that says "Export to Program X", and then export and send off the data in a format the program can read.
http://msdn.microsoft.com/en-us/library/ms186213.aspx should be a good place to start.
Good luck
Just did a quick demo project that required managing some excel files. The .NET component from GemBox software was adequate for my needs. It has a free version with a few limitations.
http://www.gemboxsoftware.com/GBSpreadsheet.htm
Excel Package is an open-source (GPL) component for reading/writing Excel 2007 files. I used it on a small project, and the API is straightforward. It works with XLSX only (Excel 2007), not with XLS.
The source code also seems well-organized and easy to get around (if you need to expand functionality or fix minor issues as I did).
At first, I tried the ADO.NET (Excel connection string) approach, but it was fraught with nasty hacks -- for instance, if the second row contains a number, it will return ints for all fields in the column below and quietly drop any data that doesn't fit.
We use ClosedXML in rather large systems.
Free
Easy to install
Straight forward coding
Very responsive support
The developer team is extremely open to new suggestions. Often new features and bug fixes are implemented within the same week.
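A small read-only sketch (ClosedXML handles xlsx, not the old xls format; the file name and column positions below are placeholders):
using System;
using ClosedXML.Excel;

static void ReadWithClosedXml(string path)
{
    using (XLWorkbook workbook = new XLWorkbook(path))
    {
        var worksheet = workbook.Worksheet(1);
        foreach (var row in worksheet.RowsUsed())
        {
            string name = row.Cell(1).GetString();
            string phone = row.Cell(3).GetString();
            Console.WriteLine("{0}: {1}", name, phone);
        }
    }
}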
Take.io Spreadsheet will do this work for you, and at no charge. Just take a look at this.
I just used ExcelLibrary to load an .xls spreadsheet into a DataSet. Worked great for me.
