Transfer data from DBF to SQL Server using C#

I need a function to transfer data from a .DBF file to SQL Server.
Here is what I do:
First step: use OleDbDataAdapter.Fill to read the .DBF file into a DataTable.
Second step: insert that table into SQL Server.
The first step takes 74 seconds for 90 columns × 80,000 rows.
Is there any way to speed this process up?
Also, if there is any way to communicate directly from .DBF to SQL Server, please guide me. BTW, I am using C# and SQL Server 2008.
My mistake, guys. I rarely post here.
Here is my code (it takes over 1 minute to transfer the DBF into a DataTable):
OleDbCommand oledbcommand = new OleDbCommand();
OleDbDataAdapter adp = new OleDbDataAdapter();
oledbcommand.Connection = oledbConnectOpen();
oledbcommand.CommandText = "SELECT * FROM " + filename;
adp.SelectCommand = oledbcommand;
adp.Fill(dt); // this is the step that consumes the time
dt.TableName = filename;
return dt;

In this scenario you would be better off using an OleDbDataReader rather than an OleDbDataAdapter, because the reader is optimised for forward-only, read-only access.
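A minimal sketch of the reader-based pattern, assuming the .DBF rows arrive through any IDataReader (with the real file this would be the OleDbDataReader returned by OleDbCommand.ExecuteReader). The SqlBulkCopy lines in the comment are one hedged way to push the same stream straight into SQL Server; the table name dbo.Target is a placeholder.

```csharp
using System;
using System.Data;

static class DbfStream
{
    // Single forward-only pass over an IDataReader: no DataTable is
    // materialised, so memory stays flat regardless of row count.
    //
    // SqlBulkCopy.WriteToServer(IDataReader) accepts exactly this shape,
    // which also answers the "communicate directly with SQL Server" part:
    //
    //   using (var bulk = new SqlBulkCopy(sqlConnectionString))
    //   {
    //       bulk.DestinationTableName = "dbo.Target"; // placeholder name
    //       bulk.WriteToServer(dbfReader);            // streams the rows
    //   }
    public static int CountRows(IDataReader reader)
    {
        int rows = 0;
        while (reader.Read())
            rows++;
        return rows;
    }
}
```

The same loop shape works for any per-row processing; the point is that rows are consumed as they arrive instead of being buffered by Fill().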

Related

Can't read file after 8th row EXCEL OLE DB

I can't read this Excel file of mine after the 8th row. I am using an OLEDB connection to access it from a C# Script Task inside an SSIS package:
strCoExcel = "Provider=Microsoft.ACE.OLEDB.12.0;Mode=Read;Data Source=" + Path.Combine((string)Dts.Variables["PathINPUT"].Value, Dts.Variables["FileNameForEach"].Value.ToString()) + ";Extended Properties=\"Excel 12.0 Xml;HDR=NO;ImportMixedTypes=Text;TypeGuessRows=0;IMEX=1;\"";
OleDbConnection coExcel = new OleDbConnection(strCoExcel);
coExcel.Open();
// Gathering data from the renamed sheet
OleDbDataAdapter adapter = new OleDbDataAdapter("Select * from [DataTQT$]", coExcel);
DataTable data = new DataTable();
adapter.Fill(data);
What is wrong:
Some Excel files open and everything is fine, but others produce no rows, or only 8.
I tried the following:
- HDR=NO / HDR=YES
- IMEX=1 doesn't change anything
- nor do ImportMixedTypes=Text and TypeGuessRows=0
- setting all the cells in the Excel file to standard or text format
Any help?
OK, so the final answer for me was not in this piece of code, as others suspected.
This data goes into a SQL database, and the columns were not big enough for the data I wanted to insert. I modified the table and now everything works fine.

How to read large Excel files containing more than 200,000 rows and load that data into a DataTable in C#

I am working on reading data from Excel and loading it into a DataTable. My problem is that it throws a System.OutOfMemoryException while loading large Excel files.
The columns in Excel are not fixed, so I can't load the data into a SQL table.
I need to do some manipulation on the data, so I am loading it into a DataTable.
Can anyone suggest how to resolve this issue?
I am doing it like this
OleDbConnection conn = new OleDbConnection(GetOleDbConnectionString(strFileType, strNewPath));
OleDbCommand cmd = new OleDbCommand();
OleDbDataAdapter da = new OleDbDataAdapter();
if (conn.State == ConnectionState.Closed) conn.Open();
DataTable dt = new DataTable();
string query = "SELECT * FROM [" + SpreadSheetName + "]";
cmd.Connection = conn;
cmd.CommandText = query;
da.SelectCommand = cmd;
da.Fill(dt);
da.Dispose();
conn.Close();
conn.Dispose();
Your problem is not enough memory: most likely your application runs as a 32-bit process, and everything you load is overflowing its address space.
Make it a 64-bit application (in the build settings of the executable project), and make sure you have physical memory adequate for a modern machine (8+ GB).
You are loading the whole Excel sheet into memory; never do that. Several environmental factors come into play, such as the available memory (which may change when the application is deployed on another machine) and how full the columns are: if every column holds lengthy text, a few thousand records can be enough to run out of memory.
A better way is to first fire a query that returns only the column names, like
query = "SELECT * FROM [" + SpreadSheetName + "] WHERE 1=2";
This gives you all the column names; use them to create a table in the database.
Once the table is created, load a few records at a time, do the manipulation on that limited batch, and repeat until the end.
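The batched loading described above can be sketched like this. The batch size and the processBatch callback are illustrative assumptions; with Excel the reader would come from an OleDbCommand over the sheet, and processBatch would do the manipulation and insert into the pre-created table.

```csharp
using System;
using System.Collections.Generic;
using System.Data;

static class BatchedLoad
{
    // Pulls rows from the reader in fixed-size batches, so only one
    // batch is ever in memory, instead of Fill()-ing the whole sheet.
    public static int ProcessInBatches(IDataReader reader, int batchSize,
                                       Action<List<object[]>> processBatch)
    {
        var batch = new List<object[]>(batchSize);
        int total = 0;
        while (reader.Read())
        {
            var row = new object[reader.FieldCount];
            reader.GetValues(row);       // copy the current row's values
            batch.Add(row);
            total++;
            if (batch.Count == batchSize)
            {
                processBatch(batch);     // e.g. manipulate, then insert
                batch.Clear();
            }
        }
        if (batch.Count > 0)
            processBatch(batch);         // final partial batch
        return total;
    }
}
```

Because each batch is cleared before the next one is filled, peak memory is bounded by batchSize rows rather than by the sheet size.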

Faster way to export large SQL Server CE database to XML file?

What is a faster way to export a large (25,000-row) SQL Server CE database to an XML file?
Here is what I currently use:
using (SqlCeConnection cn = new SqlCeConnection(strConnection))
{
if (cn.State == ConnectionState.Closed)
cn.Open();
using (SqlCeCommand cmd = new SqlCeCommand(strCommand, cn))
{
SqlCeDataAdapter da = new SqlCeDataAdapter(cmd);
DataSet ds = new DataSet();
da.Fill(ds, "item");
StreamWriter xmlDoc = new StreamWriter("Output.xml");
ds.WriteXml(xmlDoc);
xmlDoc.Close();
}
}
It takes about 60 seconds inside emulator (Windows Mobile).
Also.. I am using Compact Framework 3.5 with SQL Server CE 3.5.
Current performance:
60 seconds entire code
~20 seconds for everything without ds.WriteXml(xmlDoc);, leaving ~40 seconds for ds.WriteXml(xmlDoc);.
If the effort is worth it to you, the fastest method is probably to roll your own XML generation. The library implementation of XmlWriter() is loops within loops, with considerable generality.
Declare your own output stream, specifying a reasonably large (but still sane) buffer size. I have tested StringBuilder() before, appending to it millions of times with good performance. Other output options may not be as fast, but I would hope a StreamWriter gives good performance when the buffer size is appropriate, and it doesn't force you to build everything in memory.
Don't use da.Fill(); replace it with a SqlCeDataReader.
For each row, generate the XML using logic coded as close to the metal as possible: precalculate column index values instead of looking up column names inside the loop, and use hard-coded type conversions as needed. Don't loop through the columns generically; write each column out explicitly.
Also, test having the database generate the XML output. I don't expect this to be the fastest, but it is easy to try, and if it turns out to be fast you would never have discovered it without trying.
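The hand-rolled approach above can be sketched as a single forward pass that writes element strings directly, with the column names fetched once outside the row loop. The element names (items, item) are made up for illustration, and with CE the reader would be the SqlCeDataReader:

```csharp
using System;
using System.Data;
using System.IO;
using System.Security;

static class FastXmlExport
{
    // Hand-rolled XML generation: no XmlWriter, no DataSet. Column names
    // are precomputed once; each row is emitted as plain strings.
    public static void WriteXml(IDataReader reader, TextWriter writer)
    {
        int fields = reader.FieldCount;
        var names = new string[fields];
        for (int i = 0; i < fields; i++)
            names[i] = reader.GetName(i);   // outside the row loop

        writer.Write("<items>");
        var values = new object[fields];
        while (reader.Read())
        {
            reader.GetValues(values);
            writer.Write("<item>");
            for (int i = 0; i < fields; i++)
            {
                writer.Write('<'); writer.Write(names[i]); writer.Write('>');
                // escape <, >, & etc. in the cell value
                writer.Write(SecurityElement.Escape(Convert.ToString(values[i])));
                writer.Write("</"); writer.Write(names[i]); writer.Write('>');
            }
            writer.Write("</item>");
        }
        writer.Write("</items>");
    }
}
```

Note this assumes the column names are valid XML element names; if they are not, they would need sanitising first.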
Maybe something like this is more lightweight:
private void ExportDataTable(SqlCeCommand cmd, SqlCeConnection conn)
{
cmd.Connection = conn;
System.Data.DataTable table = new System.Data.DataTable();
table.Locale = CultureInfo.InvariantCulture;
table.Load(cmd.ExecuteReader());
using (StreamWriter xmlDoc = new StreamWriter("Output.xml"))
{
table.WriteXml(xmlDoc);
}
}

C# read open Excel file through OleDb

I need to connect to an open Excel 2003 file using .NET 3.5
It seems the OleDb connection I am trying to use wants the file exclusively, but I need to have this file open in Excel at the same time.
Is non-locking reading possible?
EDIT: I resolved this by copying the file before opening it.
The question seems to have no answer, and I can't delete it...
My solution was: run a macro on a timer to save the Excel file in question, while the C# app copied the file to another one and read the copy using OleDb.
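The copy-first workaround can be sketched like this, assuming the macro has already saved the workbook. The method name and the temp-file naming are illustrative; the OleDb connection string would then be built against the returned copy, leaving the original open in Excel.

```csharp
using System;
using System.IO;

static class ExcelSnapshot
{
    // Copies the (possibly open) workbook to a scratch file and returns
    // the copy's path; read the copy via OleDb instead of the original.
    public static string CopyForReading(string workbookPath)
    {
        string copyPath = Path.Combine(Path.GetTempPath(),
            Path.GetRandomFileName() + Path.GetExtension(workbookPath));
        File.Copy(workbookPath, copyPath, overwrite: true);
        return copyPath;
    }
}
```

Keeping the original extension on the copy matters, because the OleDb provider and Extended Properties are chosen per file type.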
This seems like a similar problem:
Writing into excel file with OLEDB
Does that work out for you?
What parameters are you passing in when you open the Excel document? Could you set the "ReadOnly" parameter in Workbook.Open() to true?
Refer to the code below for how to get the Excel data into an array. Then you can perform any validations on that Excel sheet.
var fileName = @"D:\Pavan\WorkDiployed.xlsx";
var connectionString = string.Format("Provider=Microsoft.ACE.OLEDB.12.0; data source={0}; Extended Properties=Excel 12.0;", fileName);
OleDbConnection con = new System.Data.OleDb.OleDbConnection(connectionString);
OleDbDataAdapter cmd = new System.Data.OleDb.OleDbDataAdapter("select * from [Sheet1$]", con);
con.Open();
System.Data.DataSet excelDataSet = new DataSet();
cmd.Fill(excelDataSet);
DataTable data = excelDataSet.Tables[0];
DataRow[] arrdata = data.Select();
foreach (DataRow rw in arrdata)
{
object[] cval = rw.ItemArray;
}
con.Close();
MessageBox.Show (excelDataSet.Tables[0].ToString ());

Search in DBF file using .idx file

I have a DBF file and an index file.
I want to read the index file and search for records satisfying some condition
(for example: records whose StudentName begins with "A", using Student.DBF and StudentName.idx).
How do I do this programmatically?
It would be easiest to query via OleDB Connection
using System.Data.OleDb;
using System.Data;
OleDbConnection oConn = new OleDbConnection("Provider=VFPOLEDB.1;Data Source=C:\\PathToYourDataDirectory");
OleDbCommand oCmd = new OleDbCommand();
oCmd.Connection = oConn;
oCmd.Connection.Open();
oCmd.CommandText = "select * from SomeTable where LEFT(StudentName,1) = 'A'";
// Create an OleDBAdapter to pull data down
// based on the pre-built SQL command and parameters
OleDbDataAdapter oDA = new OleDbDataAdapter(oCmd);
DataTable YourResults = new DataTable();
oDA.Fill(YourResults);
oConn.Close();
// then you can scan through the records to get whatever
String EachField = "";
foreach( DataRow oRec in YourResults.Rows )
{
EachField = oRec["StudentName"].ToString();
// but now, you have ALL fields in the table record available for you
}
I don't have the code off the top of my head, but if you do not want to use ODBC, you should look into reading ESRI shape files; they consist of three parts (or more): a .DBF file (what you are looking for), a .PRJ file and a .SHP file. It could take some work, but you should be able to dig out the code. Take a look at SharpMap on CodePlex. It's not a simple task to read a DBF without ODBC, but it can be done, and there is a lot of code out there for doing it. You have to deal with big-endian vs little-endian values, and a range of file versions as well.
If you go here you will find code to read a DBF file; specifically, you would be interested in the public void ReadAttributes(Stream stream) method.