Error in bulk upload to SQL Server from Excel 2007 using C#

I am trying to insert rows into SQL Server using bulk upload. It works fine when I run it locally, and it also works against the test live URL, but when I run it against the live server it breaks after uploading 10,000 rows.
Below is my code:
public bool ExportExcelToSql(DataTable DT)
{
    bool result = false;
    tableColumns = "*";
    try
    {
        using (var connection = new OleDbConnection(_excelConnectionString))
        {
            connection.Open();
            // Read the first sheet name from the workbook schema
            DataTable dt = connection.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);
            _excelSheetName = dt.Rows[0]["TABLE_NAME"].ToString();
            var command = new OleDbCommand("SELECT " + tableColumns + " FROM [" + _excelSheetName + "]", connection);
            command.CommandTimeout = 6000;
            using (DbDataReader dr = command.ExecuteReader())
            using (var sqlConn = new SqlConnection(_sqlConnectionString))
            {
                sqlConn.Open();
                using (var bulkCopy = new SqlBulkCopy(sqlConn))
                {
                    bulkCopy.BulkCopyTimeout = 6000;
                    bulkCopy.DestinationTableName = tableName.ToString();
                    // DT holds the source-to-destination column name pairs
                    for (int i = 0; i < DT.Rows.Count; i++)
                    {
                        bulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping(DT.Rows[i][0].ToString(), DT.Rows[i][1].ToString()));
                    }
                    bulkCopy.WriteToServer(dr);
                }
                result = true;
            }
        }
    }
    catch (Exception)
    {
        throw; // rethrow without resetting the stack trace
    }
    return result;
}

I would split the file into chunks and import them one at a time (a sketch of that approach is below). No, there is no built-in row limit like that. I would suspect the problem is either that the file does not match the expected format, or that you are using something like SQL Server Express, which caps the database size (4 GB, raised to 10 GB in 2008 R2).
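If you go the chunked route, here is a rough sketch of one way to do it. This is not the original poster's code; the WriteChunked name and the chunkSize value are made up for illustration. The idea is to pull a fixed number of rows from the open OleDbDataReader into a DataTable, bulk copy that chunk, clear it, and repeat.
// Sketch: copy rows from an open DbDataReader to SQL Server in fixed-size chunks.
// Requires System.Data, System.Data.Common and System.Data.SqlClient.
static void WriteChunked(DbDataReader dr, string sqlConnectionString, string destinationTable, int chunkSize = 10000)
{
    // Build an empty DataTable with the same columns as the reader
    var buffer = new DataTable();
    for (int i = 0; i < dr.FieldCount; i++)
        buffer.Columns.Add(dr.GetName(i), dr.GetFieldType(i));

    using (var sqlConn = new SqlConnection(sqlConnectionString))
    {
        sqlConn.Open();
        using (var bulkCopy = new SqlBulkCopy(sqlConn))
        {
            bulkCopy.BulkCopyTimeout = 6000;
            bulkCopy.DestinationTableName = destinationTable;

            while (dr.Read())
            {
                var row = buffer.NewRow();
                for (int i = 0; i < dr.FieldCount; i++)
                    row[i] = dr.GetValue(i);
                buffer.Rows.Add(row);

                if (buffer.Rows.Count >= chunkSize)
                {
                    bulkCopy.WriteToServer(buffer); // flush one chunk
                    buffer.Clear();
                }
            }

            if (buffer.Rows.Count > 0)
                bulkCopy.WriteToServer(buffer); // flush the remainder
        }
    }
}
Each WriteToServer call is a separate round trip, so if the upload dies part-way you can tell from the destination row count roughly where it stopped.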

Related

How to get Excel rows one by one in command text using a loop? (oledb, c#)

I am using OLEDB to read an Excel file; I then apply some conditions and save the data to the database, but I want to read the Excel rows one by one in a loop.
var cmd = conn.CreateCommand();
// I want to read these rows one by one in a loop, so that only one specific row at a
// time is selected and the operation below is performed before saving to the db.
// At the moment it selects all rows at once.
cmd.CommandText = "select * from [الحيازات$]";
using (var rdr = cmd.ExecuteReader())
{
    // LINQ query - when executed it will create an object for each row
    var query = (from DbDataRecord row in rdr
                 select row)
                .Select(x =>
                {
                    //dynamic item = new ExpandoObject();
                    Dictionary<string, string> item = new Dictionary<string, string>();
                    for (int i = 0; i < x.FieldCount; i++)
                    {
                        //code
                    }
                    return item;
                });
}
Please check this
private void ReadExcel()
{
    try
    {
        string connectionString = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + filePath + @";Extended Properties=""Excel 12.0;HDR=YES;IMEX=1;""";
        using (OleDbConnection connection = new OleDbConnection())
        {
            connection.ConnectionString = connectionString;
            using (OleDbCommand command = connection.CreateCommand())
            {
                command.CommandText = "SELECT * FROM [Sheet1$]";
                connection.Open();
                OleDbDataAdapter dataAdapter = new OleDbDataAdapter(command);
                DataSet dataSet = new DataSet();
                dataAdapter.Fill(dataSet);
                if (dataSet != null && dataSet.Tables[0] != null)
                {
                    foreach (DataRow row in dataSet.Tables[0].Rows)
                    {
                        string value = row["EmailColumnName"].ToString();
                    }
                }
            }
        }
    }
    catch (Exception ex)
    {
        //Handle Exception
    }
}
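If the goal is really to process one row at a time rather than fill a DataSet, a plain OleDbDataReader loop is enough. This is only a minimal sketch, reusing the connectionString from the snippet above; the sheet name and the per-row handling are placeholders:
// Sketch: read the sheet row by row with OleDbDataReader.Read().
using (var connection = new OleDbConnection(connectionString))
using (var command = connection.CreateCommand())
{
    command.CommandText = "SELECT * FROM [Sheet1$]";
    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read()) // one row per iteration
        {
            var row = new Dictionary<string, string>();
            for (int i = 0; i < reader.FieldCount; i++)
            {
                row[reader.GetName(i)] = Convert.ToString(reader.GetValue(i));
            }
            // apply your conditions to "row" here, then save it to the db
        }
    }
}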

Selecting all photographs from a SQL database

I want to export all photographs from our database into a DataTable. I will then loop through the table and save each image to disk. There are approximately 7,000 photos.
When I start the process I can retrieve around 4,000 photographs before I start to get error messages like "Exception of type 'System.OutOfMemoryException' was thrown."
If I change the SQL query to retrieve half as many photographs, say 3,500, then the process completes successfully.
While I have achieved what I wanted by modifying the SQL each time I run the code, I would like to improve the code so that all 7,000 photographs are returned. Could somebody please advise on a better process?
Here is my method
public static DataTable GetAllPhotos()
{
    DataTable dt = new DataTable();
    dt.Columns.Add("personId", typeof(string));
    dt.Columns.Add("Photo", typeof(Bitmap));
    string SQL = "";
    byte[] getImg = new byte[0];
    byte[] BitmapImg = new byte[0];
    string personId = "";
    SqlConnection conn = new SqlConnection();
    conn.ConnectionString = _connString;
    SQL = @"select per.person_id, pho.photo
            from person as per
            left join photo as pho on per.photo_id = pho.photo_id
            where photo is not null";
    conn.Open();
    SqlDataReader dr = null;
    SqlCommand cmd = new SqlCommand(SQL, conn);
    dr = cmd.ExecuteReader();
    while (dr.Read())
    {
        try
        {
            getImg = (byte[])dr["Photo"];
            personId = Convert.ToString(dr["person_id"]);
            MemoryStream str = new MemoryStream(getImg);
            Bitmap bitmap = new Bitmap(Image.FromStream(str));
            BitmapImg = ImageToByte(bitmap);
            dt.Rows.Add(personId, bitmap);
        }
        catch (Exception ex)
        {
            LogWriter.WriteLine(personId + ex.Message.ToString());
        }
    }
    conn.Close();
    return dt;
}
If your intent is to save the images to disk, why would you load them into an intermediate DataTable at all? Wouldn't it be better to write them straight to disk as you read them? If so, assuming those are .bmp files:
public static void DumpAllPhotos()
{
    string sql = @"select per.person_id, pho.photo
                   from person as per
                   inner join photo as pho on per.photo_id = pho.photo_id";
    string folder = @"c:\MyFolder"; // output folder
    using (SqlConnection con = new SqlConnection(_connString))
    using (SqlCommand cmd = new SqlCommand(sql, con))
    {
        con.Open();
        var rdr = cmd.ExecuteReader();
        while (rdr.Read())
        {
            var bytes = (byte[])rdr["photo"];
            var path = Path.Combine(folder, $"{rdr["person_id"]}.bmp");
            File.WriteAllBytes(path, bytes);
        }
        con.Close();
    }
}
Bitmap has to be disposed; you're holding on to too many handles.
So your while loop should be something like this:
while (dr.Read())
{
    try
    {
        getImg = (byte[])dr["Photo"];
        personId = Convert.ToString(dr["person_id"]);
        MemoryStream str = new MemoryStream(getImg);
        Bitmap bitmap = new Bitmap(Image.FromStream(str));
        BitmapImg = ImageToByte(bitmap);
        dt.Rows.Add(personId, bitmap);
        bitmap.Dispose(); // <--- DISPOSE!!
    }
    catch (Exception ex)
    {
        LogWriter.WriteLine(personId + ex.Message.ToString());
    }
}
or maybe even better:
while (dr.Read())
{
    try
    {
        getImg = (byte[])dr["Photo"];
        personId = Convert.ToString(dr["person_id"]);
        MemoryStream str = new MemoryStream(getImg);
        using (Bitmap bitmap = new Bitmap(Image.FromStream(str)))
        {
            BitmapImg = ImageToByte(bitmap);
            dt.Rows.Add(personId, bitmap);
        }
    }
    catch (Exception ex)
    {
        LogWriter.WriteLine(personId + ex.Message.ToString());
    }
}

Properly read the column data types in an MS Access database

How do I properly get the data types of an MS Access table's columns?
Below is my code:
var con = $"Provider=Microsoft.Jet.OLEDB.4.0;" +
$"Data Source={database};" +
"Persist Security Info=True;";
using (OleDbConnection myConnection = new OleDbConnection())
{
myConnection.ConnectionString = con;
myConnection.Open();
DataTable dt = myConnection.GetSchema("Tables");
var l = myConnection.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, new object[] { null, null, null, "TABLE" });
var isTableExist = false;
foreach (DataRow item in l.Rows)
{
var tbleName = item[2].ToString();
if (tbleName == tableName)
{
isTableExist = true;
break;
}
}
if (isTableExist)
{
string sql = string.Format($"SELECT * FROM {tableName}");
using (OleDbCommand cmd = new OleDbCommand(sql, myConnection))
{
using (OleDbDataReader reader = cmd.ExecuteReader())
{
if (reader != null)
{
for (int i = 0; i < reader.FieldCount; i++)
{
Console.WriteLine(reader.GetName(i));
Console.WriteLine(reader.GetDataTypeName(i));
}
}
}
}
}
}
Here is the structure of the mdb file I'm reading.
And this is what I'm receiving:
id          DBTYPE_I4
main_id     DBTYPE_I4
Document_ID DBTYPE_WVARCHAR
End_page    DBTYPE_WVARCHAR
No_pages    DBTYPE_I4
Why does the AutoNumber data type show up the same as the Number data type?
What I'm doing here is merging tables, and I want to skip the insert statement for AutoNumber columns, but I can't make that work because the result is the same as for Number columns. Thank you.
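The Jet/ACE provider reports both Number (Long Integer) and AutoNumber columns as DBTYPE_I4, so GetDataTypeName alone cannot tell them apart. One approach that may work is to read the reader's schema table and look at its IsAutoIncrement column. This is only a sketch; whether the Jet provider populates that flag for your mdb file is an assumption you should verify:
// Sketch: detect AutoNumber (auto-increment) columns via the reader's schema table.
string sql = $"SELECT * FROM {tableName}";
using (OleDbCommand cmd = new OleDbCommand(sql, myConnection))
using (OleDbDataReader reader = cmd.ExecuteReader(CommandBehavior.SchemaOnly | CommandBehavior.KeyInfo))
{
    DataTable schema = reader.GetSchemaTable();
    foreach (DataRow col in schema.Rows)
    {
        string name = col["ColumnName"].ToString();
        bool isAutoNumber = col["IsAutoIncrement"] != DBNull.Value && (bool)col["IsAutoIncrement"];
        Console.WriteLine($"{name}: auto-increment = {isAutoNumber}");
        // skip generating an INSERT value for columns where isAutoNumber is true
    }
}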

SQL Server: return a SELECT statement with an XML datatype column and convert it into a DataSet in C#, ASP.NET

In my database table UserDefinedSectionData, I have a column named 'SectionDatatable' which is of the XML datatype. In my C# code, after connecting to the database, I run a query to get SectionDatatable, which is in XML format. I need to take the XML value of 'SectionDatatable' and convert it into a DataSet. How can I do it? I have been stuck on this for a day; the following is my code.
SqlConnectionStringBuilder csb = new SqlConnectionStringBuilder();
csb.DataSource = @"CPX-XSYPQKSA91D\SQLEXPRESS";
csb.InitialCatalog = "DNN_Database";
csb.IntegratedSecurity = true;
string connString = csb.ToString();
string queryString = "select * FROM UserDefinedSectionData WHERE SectionDataTabId = @tabId";
using (SqlConnection connection = new SqlConnection(connString))
using (SqlCommand command = connection.CreateCommand())
{
    command.CommandText = queryString;
    command.Parameters.Add(new SqlParameter("@tabId", tabId));
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            string sectionData = reader["SectionDatatable"].ToString();
            int moduleId = Int32.Parse(reader["SectionDataModuleId"].ToString());
        }
    }
}
This is a simple example of converting an XML string to a DataSet. The sample also demonstrates processing all tables in the DataSet.
You need to replace the XML string in this sample with the XML output from your database; you can adapt the code as needed to access the data.
string RESULT_OF_SectionDatatable = "<note><to>Tove</to><from>Jani</from><heading>Reminder</heading><body>Don't forget me this weekend!</body></note>";
var xmlReader = XmlReader.Create(new StringReader(RESULT_OF_SectionDatatable));
DataSet ds = new DataSet();
ds.ReadXml(xmlReader);
foreach (DataTable table in ds.Tables)
{
    Console.WriteLine(table);
    Console.WriteLine();
    foreach (var row in table.AsEnumerable())
    {
        for (int i = 0; i < table.Columns.Count; ++i)
        {
            Console.WriteLine(table.Columns[i].ColumnName + "\t" + row[i]);
        }
        Console.WriteLine();
    }
}
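Tying that back to the original query loop, a minimal sketch (assuming each SectionDatatable value holds a complete, well-formed XML document) would be:
// Sketch: load each row's SectionDatatable XML straight into a DataSet.
using (SqlDataReader reader = command.ExecuteReader())
{
    while (reader.Read())
    {
        string sectionData = reader["SectionDatatable"].ToString();
        int moduleId = Int32.Parse(reader["SectionDataModuleId"].ToString());

        DataSet ds = new DataSet();
        using (var xmlReader = XmlReader.Create(new StringReader(sectionData)))
        {
            ds.ReadXml(xmlReader);
        }
        // ds.Tables now holds whatever tables this row's XML describes
    }
}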

SqlBulkCopy: run DataTable in batches

Currently I have a flat file of around 10 million records.
I first read the file and map its columns into a DataTable, then move the DataTable into the database using SqlBulkCopy. This is a slow process and it times out a lot.
I wonder if this process could be done better.
For example, rather than loading all 10 million records into a DataTable and then into the database, would it be better to load chunks of 1 million rows into the DataTable and push them into the database in a loop until all are processed? Any other suggestions on how to improve this process would be appreciated!
foreach (var line in File.ReadLines(fileLocation + @"\" + filename))
{
    var dataColumns = line.Split('\t');
    var dr = dt.NewRow();
    for (var i = 0; i < dataColumns.Length; i++)
    {
        dr[i] = dataColumns[i].Trim().Length == 0 ? null : dataColumns[i];
    }
    dt.Rows.Add(dr);
}
using (var destinationConnection = new SqlConnection(Settings.Default.ConnectionString))
{
    var tableName = filename.Substring(0, filename.IndexOf('.'));
    destinationConnection.Open();
    try
    {
        using (var createAndDropTableCommand = new SqlCommand("sp_DropAndCreateTable", destinationConnection))
        {
            createAndDropTableCommand.CommandType = CommandType.StoredProcedure;
            createAndDropTableCommand.Parameters.Add("@TableToCreate", SqlDbType.VarChar).Value = tableName;
            createAndDropTableCommand.ExecuteNonQuery();
        }
        using (var bulkCopy = new SqlBulkCopy(destinationConnection))
        {
            bulkCopy.DestinationTableName = "dbo." + tableName;
            bulkCopy.BatchSize = 10000;
            bulkCopy.ColumnMappings.Clear();
            foreach (DataColumn col in dt.Columns)
            {
                bulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping(col.ColumnName, col.ColumnName));
            }
            bulkCopy.WriteToServer(dt);
        }
    }
    finally
    {
        destinationConnection.Close();
    }
}
You are definitely on the right track.
SqlBulkCopy should be able to process over a million records in around 30 seconds. I have used DataSets with SqlBulkCopy in dbsourcetools.codeplex.com:
System.Data.DataSet oDataSet = new System.Data.DataSet();
oDataSet.ReadXmlSchema(strSchemaXml);
oDataSet.ReadXml(strTableXml);
if (oDataSet.Tables.Count > 0)
{
    using (System.Data.SqlClient.SqlConnection conn = new System.Data.SqlClient.SqlConnection(oTargetDatabase.ConnectionString))
    {
        conn.Open();
        System.Data.SqlClient.SqlTransaction oTran = conn.BeginTransaction();
        System.Data.SqlClient.SqlBulkCopy oSqlBulkCopy
            = new System.Data.SqlClient.SqlBulkCopy(conn, System.Data.SqlClient.SqlBulkCopyOptions.KeepIdentity, oTran);
        oSqlBulkCopy.BulkCopyTimeout = 600;
        oSqlBulkCopy.BatchSize = 1000;
        oSqlBulkCopy.DestinationTableName = strFullyQualifiedTableName;
        oSqlBulkCopy.NotifyAfter = 10000;
        oSqlBulkCopy.SqlRowsCopied += new System.Data.SqlClient.SqlRowsCopiedEventHandler(oSqlBulkCopy_SqlRowsCopied);
        foreach (System.Data.DataColumn ocol in oDataSet.Tables[0].Columns)
        {
            oSqlBulkCopy.ColumnMappings.Add(ocol.ColumnName, ocol.ColumnName);
        }
        oSqlBulkCopy.WriteToServer(oDataSet.Tables[0]);
        oTran.Commit();
        conn.Close();
    }
    System.Console.WriteLine("Wrote : " + oDataSet.Tables[0].Rows.Count.ToString() + " records ");
}
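To answer the chunking part of the question directly: yes, you can fill the DataTable in fixed-size chunks and write each chunk before reading more lines, so the full 10 million rows are never in memory at once. A rough sketch, reusing the names from your snippet (dt is assumed to already have the right columns and the destination table to already exist; the chunk size is arbitrary):
// Sketch: stream the flat file into SQL Server in chunks instead of one huge DataTable.
const int chunkSize = 1000000;

using (var destinationConnection = new SqlConnection(Settings.Default.ConnectionString))
{
    destinationConnection.Open();
    using (var bulkCopy = new SqlBulkCopy(destinationConnection))
    {
        bulkCopy.DestinationTableName = "dbo." + tableName;
        bulkCopy.BulkCopyTimeout = 0; // no timeout; each chunk is still a separate call
        bulkCopy.BatchSize = 10000;
        foreach (DataColumn col in dt.Columns)
            bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);

        foreach (var line in File.ReadLines(fileLocation + @"\" + filename))
        {
            var dataColumns = line.Split('\t');
            var dr = dt.NewRow();
            for (var i = 0; i < dataColumns.Length; i++)
                dr[i] = dataColumns[i].Trim().Length == 0 ? null : dataColumns[i];
            dt.Rows.Add(dr);

            if (dt.Rows.Count >= chunkSize)
            {
                bulkCopy.WriteToServer(dt); // push this chunk
                dt.Clear();                 // release the rows before reading more lines
            }
        }

        if (dt.Rows.Count > 0)
            bulkCopy.WriteToServer(dt); // push the remainder
    }
}
This keeps memory usage roughly constant; if timeouts persist, lowering chunkSize (and leaving BatchSize around 5,000-10,000) is usually the first knob to turn.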
