bulk upload text file using Microsoft.Jet.OLEDB - c#

I have the following code. I have tried everything I found online, but I still cannot read any data from the text file.
string path = @"D:\New folder\abc.txt";
string pathOnly = Path.GetDirectoryName(path);
string fileName = Path.GetFileName(path);
string excelConnectionString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + pathOnly + @";Extended Properties=""text;HDR=YES;FMT=TabDelimited""";
OleDbConnection excelConnection = new OleDbConnection(excelConnectionString);
excelConnection.Open();
OleDbCommand cmd = excelConnection.CreateCommand();
cmd.CommandText = String.Format("SELECT * FROM [{0}]", fileName);
OleDbDataReader dReader;
dReader = cmd.ExecuteReader();
DateTime UploadedDate = DateTime.Now;
DataTable sourceData = new DataTable();
sourceData.Load(dReader);
DataColumn col = new DataColumn("UploadedDateCol", typeof(DateTime));
col.DefaultValue = UploadedDate;
sourceData.Columns.Add(col);
int x = sourceData.Rows.Count;
The value of x is always 0. My PC is 64-bit. Alternatively, is there another library I can use for bulk upload?
My .txt file looks like the line below; the values are separated by a tab or a pipe (|):
0421230424 3391542691 5295963551 2755344586 12345678

As far as tab-delimited files are concerned, you should have a look at the FileHelpers library.
Don't reinvent the wheel; this library is very mature.
The FileHelpers are a free and easy to use .NET library to import/export data from fixed length or delimited records in files, strings or streams.
The idea is pretty simple:
You can strongly type your flat file (fixed or delimited) simply by describing a class that maps to each record, and later read/write your file as a strongly typed .NET array.
The library also has support for importing/exporting data from different storages like Excel, Access, SQL Server, etc.
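For illustration, a minimal sketch of how a pipe-delimited record might be mapped with FileHelpers (the class and field names here are invented for the example, not taken from the question):
using FileHelpers;

// Hypothetical record layout for a pipe-delimited line such as:
// 0421230424|3391542691|5295963551|2755344586|12345678
[DelimitedRecord("|")]
public class PhoneRecord
{
    public string Phone1;
    public string Phone2;
    public string Phone3;
    public string Phone4;
    public string Code;
}

// Reading the file into a strongly typed array:
var engine = new FileHelperEngine<PhoneRecord>();
PhoneRecord[] records = engine.ReadFile(@"D:\New folder\abc.txt");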

To overcome this problem, the file layout has to be described in a Schema.ini file. If the data file is named abc.txt, a file named schema.ini has to be created in the same folder, containing a section named after the data file (for example [abc.txt]):
string iniFileMsg = "[" + newfileName + ".txt]";
StreamWriter sw = new StreamWriter(newFilePath + "/schema.ini", false);
sw.WriteLine(iniFileMsg);
sw.WriteLine("Format=Delimited(|)");
sw.WriteLine("ColNameHeader=True");
sw.Flush();
If you have several columns, each one needs to be added as follows:
List<string> columns = excelobj.GetCSVColumnNames(excelUploadedFullPath);
// e.g. sw.WriteLine("Col1=Phone Text Width 10");
for (int i = 0; i < columns.Count; i++)
{
    sw.WriteLine("Col" + (i + 1) + "=" + columns[i].Replace(" ", "_") + " Text Width 100");
}
sw.Flush();
sw.Close();
GetCSVColumnNames() returns the column names from the file; once schema.ini is in place, the data is uploaded with the same code as above.
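Putting the pieces together, here is a minimal sketch of the whole flow, assuming a pipe-delimited abc.txt with no header row (the column names and widths are illustrative only, and the Jet 4.0 provider is 32-bit, so the process must run as x86 or switch to the ACE provider):
using System.Data;
using System.Data.OleDb;
using System.IO;

string path = @"D:\New folder\abc.txt";
string folder = Path.GetDirectoryName(path);
string file = Path.GetFileName(path);

// Describe the file for the Jet/ACE text driver.
// The delimiter and header setting must match the actual file.
File.WriteAllLines(Path.Combine(folder, "schema.ini"), new[]
{
    "[" + file + "]",
    "Format=Delimited(|)",
    "ColNameHeader=False",
    "Col1=Phone1 Text Width 100",
    "Col2=Phone2 Text Width 100",
    "Col3=Phone3 Text Width 100",
    "Col4=Phone4 Text Width 100",
    "Col5=Code Text Width 100"
});

string connStr = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + folder +
                 @";Extended Properties=""text;HDR=NO;FMT=Delimited""";
DataTable sourceData = new DataTable();
using (OleDbConnection conn = new OleDbConnection(connStr))
using (OleDbCommand cmd = new OleDbCommand("SELECT * FROM [" + file + "]", conn))
{
    conn.Open();
    using (OleDbDataReader reader = cmd.ExecuteReader())
    {
        sourceData.Load(reader);
    }
}
// sourceData.Rows.Count should now reflect the number of rows in the file.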

Related

How to import Excel which is in HTML format

I have exported data from a database using HttpContext, formatting it with table, tr and td tags. Now I want to read the same file back and convert it into a DataTable.
<add name="Excel03ConString" connectionString="Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0};Extended Properties='HTML Import;HDR={1};IMEX=1'" />
<add name="Excel03ConString" connectionString="Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0};Extended Properties='Excel 8.0;HDR={1};IMEX=1'" />
private DataTable GetTableFromExcel()
{
DataTable dt = new DataTable();
try
{
if (exclFileUpload.HasFile)
{
string FileName = Path.GetFileName(exclFileUpload.PostedFile.FileName);
string Extension = Path.GetExtension(exclFileUpload.PostedFile.FileName);
string FolderPath = Server.MapPath(ConfigurationManager.AppSettings["FolderPath"]);
//string NewFileName = string.Format("{0}_{1}", DateTime.Now.ToString().Replace("/", "").Replace(" ", "").Replace(":", ""), FileName);
string FilePath = Path.Combine(string.Format("{0}/{1}", FolderPath, FileName));
exclFileUpload.SaveAs(FilePath);
string conStr = "";
switch (Extension)
{
case ".xls": //Excel 97-03
conStr = ConfigurationManager.ConnectionStrings["Excel03ConString"].ConnectionString;
break;
case ".xlsx": //Excel 07
conStr = ConfigurationManager.ConnectionStrings["Excel07ConString"].ConnectionString;
break;
}
conStr = String.Format(conStr, FilePath, true);
OleDbConnection connExcel = new OleDbConnection(conStr);
OleDbCommand cmdExcel = new OleDbCommand();
OleDbDataAdapter oda = new OleDbDataAdapter();
cmdExcel.Connection = connExcel;
connExcel.Open();
DataTable dtExcelSchema;
dtExcelSchema = connExcel.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);
string SheetName = dtExcelSchema.Rows[0]["TABLE_NAME"].ToString();
connExcel.Close();
connExcel.Open();
cmdExcel.CommandText = "SELECT * From [" + SheetName + "]";
oda.SelectCommand = cmdExcel;
oda.Fill(dt);
connExcel.Close();
File.Delete(FilePath);
}
}
catch (Exception ex)
{
}
return dt;
}
When using the second connection string I get the error "External table is not in the expected format" on connection.Open(). When using the first, I get an error while reading the sheet name.
Please tell me how to read the sheet, or how to read the data from the Excel file directly.
I think this third-party DLL (ExcelDataReader) may help solve your problem.
FileStream stream = File.Open(filePath, FileMode.Open, FileAccess.Read);
//1. Reading from a binary Excel file ('97-2003 format; *.xls)
IExcelDataReader excelReader = ExcelReaderFactory.CreateBinaryReader(stream);
//...
//2. Reading from a OpenXml Excel file (2007 format; *.xlsx)
IExcelDataReader excelReader = ExcelReaderFactory.CreateOpenXmlReader(stream);
//...
//3. DataSet - The result of each spreadsheet will be created in the result.Tables
DataSet result = excelReader.AsDataSet();
//...
//4. DataSet - Create column names from first row
excelReader.IsFirstRowAsColumnNames = true;
DataSet result = excelReader.AsDataSet();
//5. Data Reader methods
while (excelReader.Read())
{
//excelReader.GetInt32(0);
}
//6. Free resources (IExcelDataReader is IDisposable)
excelReader.Close();
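A consolidated sketch of how those fragments might fit together for a binary .xls file, using the same (older) ExcelDataReader API shown above:
using System.Data;
using System.IO;
using Excel; // namespace of the older ExcelDataReader releases

DataSet result;
using (FileStream stream = File.Open(filePath, FileMode.Open, FileAccess.Read))
using (IExcelDataReader excelReader = ExcelReaderFactory.CreateBinaryReader(stream))
{
    // Treat the first row as column names before materializing the DataSet.
    excelReader.IsFirstRowAsColumnNames = true;
    result = excelReader.AsDataSet();
}

// Each worksheet becomes one DataTable in result.Tables.
DataTable firstSheet = result.Tables[0];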
I found this online:
C# Excel file OLEDB read HTML IMPORT
Here they say:
Instead of using the sheetname, you must use the page title in the
select statement without the $. SELECT * FROM [HTMLPageTitle]
In that post they also link to this manual, which may come in handy, but is too long to copy here:
http://ewbi.blogs.com/develops/2006/12/reading_html_ta.html
If that doesn't work, I think you will have to recreate the original Excel file so that it stays a real Excel file rather than HTML (if that is at all possible in your scenario).
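As a sketch, combining the HTML Import connection string from the question with that advice (the page title "MyReport" is only a placeholder for whatever the HTML <title> actually is):
using System.Data;
using System.Data.OleDb;

// Same "Excel03ConString" (HTML Import variant) as in the question.
string conStr = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + FilePath +
                ";Extended Properties='HTML Import;HDR=YES;IMEX=1'";

DataTable dt = new DataTable();
using (OleDbConnection connExcel = new OleDbConnection(conStr))
using (OleDbCommand cmdExcel = new OleDbCommand("SELECT * FROM [MyReport]", connExcel)) // page title, no trailing $
using (OleDbDataAdapter oda = new OleDbDataAdapter(cmdExcel))
{
    connExcel.Open();
    oda.Fill(dt);
}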
You may be facing this issue for different reasons. There are several solutions; one of them is to build your solution as x86. Here is how to change it:
Right-click the solution in Visual Studio.
Click Configuration Manager.
Select x86 from Active solution platform, if available.
If it is not available, click New, select or type x86, and click OK.
Rebuild the solution and run your application.
If this does not solve your problem, you may have to install the 32-bit Office System Drivers. Here is a complete article explaining the problem.
After deep research I found the solution.
First convert the particular Excel file to an HTML page using the following code:
File.Move(Server.MapPath("~/Foldername/ExcelName.xls"), Path.ChangeExtension(Server.MapPath("~/Foldername/ExcelName.xls"), ".html"));
Then we have to download the HTML string and extract the content. The <table> tag contains <tr> and <td> tags, but they may carry style properties. So first we have to strip those style properties, and afterwards we can pull the required content out of the table.
string url = Server.MapPath("~/FolderName/Excelname.html");
WebClient wc = new WebClient();
string fileContent = wc.DownloadString(url);
Here we define patterns that match the HTML tags while ignoring any style properties:
const string msgFormat = "table[{0}], tr[{1}], td[{2}], a: {3}, b: {4}";
const string table_pattern = "<table.*?>(.*?)</table>";
const string tr_pattern = "<tr.*?>(.*?)</tr>";
const string td_pattern = "<td.*?>(.*?)</td>";
const string a_pattern = "<a.*?>(.*?)</a>";
const string b_pattern = "<b>(.*?)</b>";
By looping through the matches we can find the <tr> and <td> elements, and then get the content within the <td></td> tags using this method:
private static List<string> GetContents(string input, string pattern)
{
    MatchCollection matches = Regex.Matches(input, pattern, RegexOptions.Singleline);
    List<string> contents = new List<string>();
    foreach (Match match in matches)
        contents.Add(match.Value);
    return contents;
}
Then we can insert the imported records into the database row by row.
Reference link here
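As an illustrative sketch (not the original poster's code), the patterns and GetContents could be combined roughly like this to turn the first HTML table into a DataTable:
using System.Collections.Generic;
using System.Data;
using System.Text.RegularExpressions;

// Assumes fileContent, table_pattern, tr_pattern and td_pattern from above.
DataTable dt = new DataTable();
string table = GetContents(fileContent, table_pattern)[0]; // first <table> block
List<string> rows = GetContents(table, tr_pattern);

for (int r = 0; r < rows.Count; r++)
{
    List<string> cells = GetContents(rows[r], td_pattern);
    // Strip the tags themselves, keeping only the inner text.
    for (int c = 0; c < cells.Count; c++)
        cells[c] = Regex.Replace(cells[c], "<.*?>", "").Trim();

    if (r == 0)
    {
        // The first row of the export holds the column headers.
        foreach (string header in cells)
            dt.Columns.Add(header);
    }
    else if (cells.Count == dt.Columns.Count)
    {
        dt.Rows.Add(cells.ToArray());
    }
}
// dt can now be inserted into the database row by row, as described above.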

OleDB, Misses the first character of data

I maintain an ASP.NET application that contains CSV-reading code. The website has been running fine for 3 years now, and the CSV-reading code that uses Microsoft.Jet.OLEDB.4.0 does its job fine, except that once in a while a CSV with more than 4K-5K records causes a problem. Usually the problem is that a record at a random position [a random row] is missing its first character.
The CSV file is just a bunch of names and addresses, one per row, in ANSI format. The CSV is comma-separated, no field contains a comma, and fields are not enclosed in single or double quotes. Also, it doesn't happen often: we use the same code for, say, 70K-record uploads and they work fine, but over 3 years only about 3-4 files have shown this problem, and we upload about one file daily.
For those who want to see what I did:
using (System.Data.OleDb.OleDbConnection conn = new System.Data.OleDb.OleDbConnection(
    "Provider=Microsoft.Jet.OLEDB.4.0;Extended Properties='text;HDR=Yes;FMT=Delimited';Data Source=" + HttpContext.Current.Server.MapPath("/System/SaleList/")))
{
    string sql_select = "select * from [" + this.FileName + "]";
    System.Data.OleDb.OleDbDataAdapter da = new System.Data.OleDb.OleDbDataAdapter();
    da.SelectCommand = new System.Data.OleDb.OleDbCommand(sql_select, conn);
    DataSet ds = new DataSet();
    // Read the first line of the file to get the header
    string[] lines = System.IO.File.ReadAllLines(HttpContext.Current.Server.MapPath("/System/SaleList/") + FileName);
    string header = "";
    if (lines.Length > 0)
        header = lines[0];
    string[] headers = header.Split(',');
    CreateSchema(headers, FileName);
    da.Fill(ds, "ListData");
    DataTable dt = ds.Tables["ListData"];
}
And this code works fine except for the issue mentioned above. I cut out some unrelated parts, so it might not work if simply copy-pasted.
EDIT: More information
I tried ODBC with the Microsoft Text Driver, and I also tried the ACE driver with OleDB. The result is the same with all three drivers.
If I swap the problem record with the preceding row, both rows are read just fine, up until the next problem row (if more than one row in the original file has the problem); if that was the only problem row, the file works fine.
So from the above it looks like something is throwing off a character counter, but how to make it work reliably is still a puzzle.
EDIT 2: I have submitted this as a bug to Microsoft here: https://connect.microsoft.com/VisualStudio/feedback/details/811869/oledb-ace-driver-12-jet-4-0-or-odbc-text-driver-all-fail-to-read-data-properly-from-csv-text-file
I would suggest you examine a problem file with a hex editor - inspect the line that causes the problem and the line immediately preceding it.
In particular look at the line terminators (CR/LF? CR only? LF only?) and look for any non-printable characters.
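If a hex editor is not at hand, a quick sketch like the following could dump the raw bytes of the problem line and the line before it, so the terminators and any stray control characters become visible (the path and line number are placeholders):
using System;
using System.IO;
using System.Linq;

byte[] data = File.ReadAllBytes(@"C:\temp\problem.csv"); // placeholder path
int lineToInspect = 4211;                                // placeholder 1-based line number

int currentLine = 1, start = -1;
for (int i = 0; i < data.Length; i++)
{
    if (currentLine == lineToInspect - 1 && start < 0) start = i; // first byte of preceding line
    if (data[i] == 0x0A) currentLine++;                           // count LF as end of line
    if (currentLine > lineToInspect && start >= 0)
    {
        // Hex dump of the preceding line + problem line, terminators included.
        string hex = string.Join(" ", data.Skip(start).Take(i - start + 1)
                                          .Select(b => b.ToString("X2")));
        Console.WriteLine(hex); // 0D 0A = CR/LF, 0A = LF only, 0D = CR only
        break;
    }
}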
Try using ACE Driver instead of JET (it's available on x86 and x64 servers, JET is only x86!)
using (System.Data.OleDb.OleDbConnection conn = new System.Data.OleDb.OleDbConnection(
    "Provider=Microsoft.ACE.OLEDB.12.0;Extended Properties=\"Excel 12.0 Xml;HDR=YES\";Data Source=" + HttpContext.Current.Server.MapPath("/System/SaleList/")))
{
I got the same "OleDB misses characters of data" problem; see here:
The characters go missing because the Microsoft.Jet.OLEDB.4.0 driver tries to guess the column datatype. In my case it was treating the data as hexadecimal, not alphanumeric.
Problematic oledbProviderString:
oledbProviderString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\"{0}\";Extended Properties=\"Text;HDR=No;FMT=Delimited\"";
To fix the problem I added TypeGuessRows=0:
oledbProviderString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\"{0}\";Extended Properties=\"Text;HDR=No;FMT=Delimited;TypeGuessRows=0\"";
Repro:
Create a Book1.csv file with this content:
KU88,G6,CC
KU88,F7,CC
Step through this code:
private void button1_Click(object sender, EventArgs e)
{
string folder = @"G:\Developers\Folder";
ReproProblem(folder);
}
static string oledbProviderString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\"{0}\";Extended Properties=\"Text;HDR=No;FMT=Delimited\"";
private void ReproProblem(string folderPath)
{
using (OleDbConnection oledbConnection = new OleDbConnection(string.Format(oledbProviderString, folderPath)))
{
string sqlStatement = "Select * from [Book1.csv]";
//open the connection
oledbConnection.Open();
//Create an OleDbDataAdapter for our connection
OleDbDataAdapter adapter = new OleDbDataAdapter(sqlStatement, oledbConnection);
//Create a DataTable and fill it with data
DataTable table = new DataTable();
adapter.Fill(table);
//close the connection
oledbConnection.Close();
}
}
Why don't you just use something like this, reading the file directly and building the DataTable yourself instead of letting the OleDb text driver parse it:
string folder = HttpContext.Current.Server.MapPath("/System/SaleList/");
string[] lines = System.IO.File.ReadAllLines(folder + FileName);

DataTable mdt = new DataTable("ListData");

// The first line holds the header: create the columns from it.
string header = lines.Length > 0 ? lines[0] : "";
string[] headers = header.Split(',');
foreach (string h in headers)
    mdt.Columns.Add(h);

// The remaining lines are the data rows.
for (int i = 1; i < lines.Length; i++)
{
    string[] sep = lines[i].Split(',');
    mdt.Rows.Add(sep);
}

DataSet ds = new DataSet();
ds.Tables.Add(mdt);
CreateSchema(headers, FileName);
DataTable dt = mdt;
I didn't debug it; I hope there is no problem, but if there is, I'm here for you.
Thank you very much.

.XLSX files not generated with Worksheets having more than 8K records

I am using Excel 2007 and the OLE DB provider to generate .xlsx files. The file has 2 sheets: the first contains a graph and the second contains data.
I am using following connection string to generate the excel:
using (OleDbConnection conn = new OleDbConnection ("Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\\Test.xlsx;Extended Properties=\"Excel 12.0 XML;HDR=Yes;IMEX=0\""))
The Graph is generated using Excel-Interop, then the workbook is closed. Followed by this a new workflow starts that opens the same Excel file and inserts Data.
The Code is something like this.
using (OleDbConnection conn = new OleDbConnection("Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + strFilename + ";Extended Properties=\"Excel 12.0 XML;HDR=Yes;IMEX=0\""))
{
clsCommon.LogMessages("Excel Connection opening");
conn.Open();
clsCommon.LogMessages("Connection opened.");
using (OleDbCommand cmd_createTable = new OleDbCommand("CREATE TABLE [" + SheetName + "](" + sb.ToString() + ")", conn))
{
cmd_createTable.ExecuteNonQuery();
//cmd_createTable.CommandTimeout = 14000;
}
clsCommon.LogMessages("Excel Sheet Created.");
//write the data
List<string> lstRowData = new List<string>();
string rowdata = string.Empty;
foreach (DataRow row in dt.Rows)
{
for (int i = 0; i < dt.Columns.Count; i++)
{
if (row[i].ToString().Contains("'"))
{
rowdata = rowdata + "'" + row[i].ToString().Replace("'", "''") + "',";
}
else
{
rowdata = rowdata + "'" + row[i].ToString() + "',";
}
}
rowdata = rowdata.Remove(rowdata.Length - 1, 1);
lstRowData.Add(rowdata);
rowdata = string.Empty;
}
#region Creating Single Command to execute multiple statements.
using (OleDbCommand cmd_Insert = new OleDbCommand())
{
clsCommon.LogMessages("Data Rows for Sheet:" + SheetName + ", is: " + lstRowData.Count.ToString());
cmd_Insert.Connection = conn;
cmd_Insert.CommandTimeout = 14000;
foreach (string item in lstRowData)
{
cmd_Insert.CommandText = "Insert into [" + SheetName + "$] values(" + item + ")";
cmd_Insert.ExecuteNonQuery();
}
cmd_Insert.Dispose();
}
#endregion
conn.Close();
conn.Dispose();
}
The problem is that the Excel file is generated properly when the worksheets hold a smaller amount of data.
For worksheets with more than roughly 8K rows, the code above writes the data to the Excel file properly, but when the file is buffered to the web page, the sheets with the large record counts are missing.
The Code where the file is buffered to the webpage is as follows:
Response.Clear();
Response.ClearContent();
Response.ClearHeaders();
Response.BufferOutput = true;
Response.ContentType = "application/vnd.ms-excel";
Response.AppendHeader("Content-Disposition", string.Format("attachment; filename={0}", Path.GetFileName(newFilePath)));
Response.WriteFile(newFilePath);
if (Response.IsClientConnected)
{
Response.Flush();
}
try
{
Response.Close();
HttpContext.Current.ApplicationInstance.CompleteRequest();
}
catch
{
// Eat Exception
}
The Strange Fact:
The same code works fine on my local development environment and all the worksheets are populated properly. I have Excel 2007 installed on my machine and the server as well.
I have checked the permissions on the Server using DCOM Configuration Manager (Under Component Service) and they look correct.
Workarounds Tried
1) Changed the Connection-String
From
Extended Properties=\"Excel 12.0 XML;HDR=Yes;IMEX=0\""))
To
Extended Properties=\"Excel 12.0 ;HDR=Yes;IMEX=0\""))
without luck.
Could you please help me with this issue?
Do I need to change some setting of Excel 2007 on my server to enable large data exports?
Thanks for your quick assistance
--Shalabh Gupta
IMPORTANT: Interop (used for the first worksheet in your code) is NOT supported by MS in server scenarios (like ASP.NET or similar).
There are many options to read/edit/create Excel files without Interop:
MS provides the free OpenXML SDK V 2.0 - see http://msdn.microsoft.com/en-us/library/bb448854%28office.14%29.aspx (XLSX only). This can read and write MS Office files (including Excel).
Another free option is described at http://www.codeproject.com/KB/office/OpenXML.aspx (XLSX only).
If you need more, like handling older Excel versions (XLS, not only XLSX), rendering, creating PDFs, formulas etc., then there are various free and commercial libraries such as ClosedXML (free, XLSX only), EPPlus (free, XLSX only), Aspose.Cells, SpreadsheetGear, LibXL and Flexcel.
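For instance, a minimal sketch of writing a DataTable to an .xlsx file with ClosedXML instead of Interop/OLE DB (the sheet name and output path are placeholders):
using System.Data;
using ClosedXML.Excel;

// dt is the DataTable that is currently inserted row by row through OLE DB.
static void WriteSheet(DataTable dt, string path)
{
    using (var workbook = new XLWorkbook())
    {
        var worksheet = workbook.Worksheets.Add("Data"); // placeholder sheet name
        // Writes the whole table, headers included, starting at cell A1.
        worksheet.Cell(1, 1).InsertTable(dt);
        workbook.SaveAs(path);
    }
}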
I faced the same issue with more than 6K records; changing .xlsx to .xls worked for me.

C# VS2005 Import .CSV File into SQL Database

I am trying to import a .csv file into my database. I am able to import an Excel worksheet, but because the .csv format differs from .xls, I need to write an import function specifically for .csv.
Below is my code:
protected void Button1_Click(object sender, EventArgs e)
{
if (FileUpload1.HasFile)
{
// Get the name of the Excel spreadsheet to upload.
string strFileName = Server.HtmlEncode(FileUpload1.FileName);
// Get the extension of the Excel spreadsheet.
string strExtension = Path.GetExtension(strFileName);
// Validate the file extension.
if (strExtension != ".xls" && strExtension != ".xlsx" && strExtension != ".csv")
{
Response.Write("<script>alert('Failed to import DEM Conflicting Role Datasheet. Cause: Invalid Excel file.');</script>");
return;
}
// Generate the file name to save.
string strUploadFileName = @"C:\Documents and Settings\rhlim\My Documents\Visual Studio 2005\WebSites\SoD\UploadFiles\" + DateTime.Now.ToString("yyyyMMddHHmmss") + strExtension;
// Save the Excel spreadsheet on server.
FileUpload1.SaveAs(strUploadFileName);
// Create Connection to Excel Workbook
string connStr = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + strUploadFileName + ";Extended Properties=Text;";
using (OleDbConnection ExcelConnection = new OleDbConnection(connStr)){
OleDbCommand ExcelCommand = new OleDbCommand("SELECT [columns] FROM +userrolelist", ExcelConnection);
OleDbDataAdapter ExcelAdapter = new OleDbDataAdapter(ExcelCommand);
ExcelConnection.Open();
using (DbDataReader dr = ExcelCommand.ExecuteReader())
{
// SQL Server Connection String
string sqlConnectionString = "Data Source=<IP>;Initial Catalog=<DB>;User ID=<userid>;Password=<password>";
// Bulk Copy to SQL Server
using (SqlBulkCopy bulkCopy =
new SqlBulkCopy(sqlConnectionString))
{
bulkCopy.DestinationTableName = "DEMUserRoles";
bulkCopy.WriteToServer(dr);
Response.Write("<script>alert('DEM User Data imported');</script>");
}
}
}
}
else Response.Write("<script>alert('Failed to import DEM User Roles Data. Cause: No file found.');</script>");
}
The file is saved successfully, but then I get an error saying that the path of the file is not valid, even though the file was saved as .csv, so I cannot continue with importing the data into my database.
In conclusion, the error says that the path where the .csv file was saved is not valid, although the .csv file is saved successfully. I need some help from someone experienced. Thank you.
If you're reading a CSV file, your connection string should specify the directory containing your CSV file.
string connStr = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" +
Path.GetDirectoryName(strUploadFileName);
You then use the filename in your SELECT statement:
"SELECT * FROM [" + Path.GetFileName(strUploadFileName) + "]"
I think you have this problem because you use "/" instead of "\".
Try modifying the path C:\.....
You need to use backslashes (\) in the file path.
string strUploadFileName = #"C:\Documents and Settings\rhlim\My Documents\Visual Studio 2005\WebSites\SoD\UploadFiles\" + DateTime.Now.ToString("yyyyMMddHHmmss") + strExtension;
EDIT 1: I believe FileUpload1.SaveAs converts the / to \ internally to identify the correct location.
EDIT 2: It is a problem with your connection string: even though you are using the .csv format, you need to set Excel 8.0 or Excel 12.0 Xml as the Extended Properties.
Here is the sample:
string connStr = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + strUploadFileName + ";Extended Properties=Excel 12.0 Xml;";
For other types, check the OLEDB section of my article.
To avoid opening an OLE DB connection at all, you can do something like this:
// Read the CSV file name & file path
// I am using the Kendo UI Uploader here
string path = "";
string filenamee = "";
if (files != null)
{
    foreach (var file in files)
    {
        var fileName = Path.GetFileName(file.FileName);
        path = Path.GetFullPath(file.FileName);
        filenamee = fileName;
    }
    // Read the CSV file data
    StreamReader sr = new StreamReader(path);
    string line = sr.ReadLine();
    string[] value = line.Split(',');
    DataTable dt = new DataTable();
    DataRow row;
    foreach (string dc in value)
    {
        dt.Columns.Add(new DataColumn(dc));
    }
    while (!sr.EndOfStream)
    {
        value = sr.ReadLine().Split(',');
        if (value.Length == dt.Columns.Count)
        {
            row = dt.NewRow();
            row.ItemArray = value;
            dt.Rows.Add(row);
        }
    }
    sr.Close();
}
For more help you can also see this link.
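To finish the import, the resulting DataTable could then be pushed to SQL Server with SqlBulkCopy, for example (connection string and table name as in the question):
using System.Data;
using System.Data.SqlClient;

// dt is the DataTable built from the CSV above.
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlConnectionString))
{
    bulkCopy.DestinationTableName = "DEMUserRoles";
    bulkCopy.WriteToServer(dt); // WriteToServer also accepts a DataTable
}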

How can I save a DataTable to a .DBF?

I've been working on a program to read a dbf file, mess around with the data, and save it back to dbf. The problem that I am having is specifically to do with the writing portion.
private const string constring = "Driver={Microsoft dBASE Driver (*.dbf)};"
+ "SourceType=DBF;"
+ "DriverID=277;"
+ "Data Source=¿;"
+ "Extended Properties=dBASE IV;";
private const string qrystring = "SELECT * FROM [¿]";
public static DataTable loadDBF(string location)
{
string filename = ConvertLongPathToShort(Path.GetFileName(location));
DataTable table = new DataTable();
using(OdbcConnection conn = new OdbcConnection(RTN(constring, filename)))
{
conn.Open();
table.Load(new OdbcCommand(RTN(qrystring, filename), conn).ExecuteReader());
conn.Close();
}
return table;
}
private static string RTN(string stmt, string tablename)
{ return stmt.Replace("¿", tablename); }
[DllImport("Kernel32", CharSet = CharSet.Auto)]
static extern Int32 GetShortPathName(
String path, // input string
StringBuilder shortPath, // output string
Int32 shortPathLength); // StringBuilder.Capacity
public static string ConvertLongPathToShort(string longPathName)
{
StringBuilder shortNameBuffer;
int size;
shortNameBuffer = new StringBuilder();
size = GetShortPathName(longPathName, shortNameBuffer, shortNameBuffer.Capacity);
if (size >= shortNameBuffer.Capacity)
{
shortNameBuffer.Capacity = size + 1;
GetShortPathName(longPathName, shortNameBuffer, shortNameBuffer.Capacity);
}
return shortNameBuffer.ToString();
}
This is what I'm working with. I've tried a number of methods to write a new file, none of them productive. To be honest, while normally I would be an advocate of form and function, I just want the damn thing to work; this app is supposed to do one very specific thing, it's not going to simulate the weather.
-=# Edit #=-
I've since discontinued the app due to time pressure, but before I scrapped it I realised that the particular format of dbf I was working with had no primary key information. This of course meant that I had to essentially read the data out to DataTable, mess with it, then wipe all the records in the dbf and insert everything from scratch.
Screw that for a lark.
For people coming here in the future: I wrote this today and it works well. The file name is passed without the extension (.dbf). The path (used for the connection) is the directory path only (no file); in the code below, Path is a class-level string field holding that directory, not System.IO.Path. You can add your DataTable to a DataSet and pass it in. Also, some of my data types are FoxPro data types and may not be compatible with all DBF files. Hope this helps.
public static void DataSetIntoDBF(string fileName, DataSet dataSet)
{
ArrayList list = new ArrayList();
if (File.Exists(Path + fileName + ".dbf"))
{
File.Delete(Path + fileName + ".dbf");
}
string createSql = "create table " + fileName + " (";
foreach (DataColumn dc in dataSet.Tables[0].Columns)
{
string fieldName = dc.ColumnName;
string type = dc.DataType.ToString();
switch (type)
{
case "System.String":
type = "varchar(100)";
break;
case "System.Boolean":
type = "varchar(10)";
break;
case "System.Int32":
type = "int";
break;
case "System.Double":
type = "Double";
break;
case "System.DateTime":
type = "TimeStamp";
break;
}
createSql = createSql + "[" + fieldName + "]" + " " + type + ",";
list.Add(fieldName);
}
createSql = createSql.Substring(0, createSql.Length - 1) + ")";
OleDbConnection con = new OleDbConnection(GetConnection(Path));
OleDbCommand cmd = new OleDbCommand();
cmd.Connection = con;
con.Open();
cmd.CommandText = createSql;
cmd.ExecuteNonQuery();
foreach (DataRow row in dataSet.Tables[0].Rows)
{
string insertSql = "insert into " + fileName + " values(";
for (int i = 0; i < list.Count; i++)
{
insertSql = insertSql + "'" + ReplaceEscape(row[list[i].ToString()].ToString()) + "',";
}
insertSql = insertSql.Substring(0, insertSql.Length - 1) + ")";
cmd.CommandText = insertSql;
cmd.ExecuteNonQuery();
}
con.Close();
}
private static string GetConnection(string path)
{
return "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + path + ";Extended Properties=dBASE IV;";
}
public static string ReplaceEscape(string str)
{
str = str.Replace("'", "''");
return str;
}
What kind of dbf file are you working with? (There are several, e.g. dBase, FoxPro etc that are not 100% compatible.) I have gotten this to work with the Microsoft Visual FoxPro OleDB Provider from C#, you might give that a shot instead of using the dBase ODBC driver.
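A sketch of what that might look like, assuming the Visual FoxPro OLE DB provider (VFPOLEDB) is installed and the DBF lives in a folder such as C:\data (the path and table name are placeholders):
using System.Data;
using System.Data.OleDb;

// The data source is the folder containing the .dbf files;
// each .dbf is then addressed as a table by its file name (without extension).
string connStr = @"Provider=VFPOLEDB.1;Data Source=C:\data\";

DataTable table = new DataTable();
using (OleDbConnection conn = new OleDbConnection(connStr))
using (OleDbCommand cmd = new OleDbCommand("SELECT * FROM mytable", conn)) // mytable.dbf
{
    conn.Open();
    table.Load(cmd.ExecuteReader());
}
// Writing back could then be done with INSERT statements over the same connection.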
Using ADO.NET to read and write dbf files turns out to be really slow, so I would suggest an alternative approach.
One option would be to use the old DAO 3.6 library. This is much faster and just as compatible, but it depends on a COM object to work.
A better approach would be to use the open-source DBFExporter component. It may require some code to set up (you need a class with properties that describe your recordset, and the properties must have certain attributes set), but after that it works really well. It is fast, but it doesn't read dbf files. The component is licensed under the LGPL, so you should be able to use it in commercial code.
