var bl = new MySqlBulkLoader(mycon);
bl.TableName = "tblspmaster";
bl.FieldTerminator = ",";
bl.LineTerminator = "\r\n";
bl.FileName = "E://31october//SP//sp_files_sample1//400k sp00 6-19 E.csv";
bl.NumberOfLinesToSkip = 1;
var inserted = bl.Load();
I am using this code to upload a CSV file into the database, but it does not throw any exception and inserted is always zero.
The .NET Connector for MySQL is already installed and the reference has been added.
Finally, I used the following code and it works for me:
string sql = @"load data infile 'E:/a1.csv' ignore into table tblspmaster fields terminated by '' enclosed by '' lines terminated by '\n' IGNORE 1 LINES (sp)";
MySqlCommand cmd = new MySqlCommand(sql, mycon);
cmd.CommandTimeout = 5000000;
cmd.ExecuteNonQuery();
The loader may not insert any data if the columns in your CSV file do not match the columns of the MySQL table.
For example, the MySQL table may have an auto-increment primary key column that does not exist in the CSV file.
In that case MySqlBulkLoader will not insert the data.
To solve this, use the Columns property and add the MySQL table's column names.
Here is a sample code:
public async Task<bool> MySqlBulkLoaderAsync(string csvFilePath)
{
    bool result = true;
    try
    {
        // AllowLoadLocalInfile=True is required when loading a local file
        using (var conn = new MySqlConnection(_connString + ";AllowLoadLocalInfile=True"))
        {
            var bl = new MySqlBulkLoader(conn)
            {
                TableName = "patientdetailstagings",
                Timeout = 600,
                FieldTerminator = ",",
                LineTerminator = "\n",
                FieldQuotationCharacter = '"',
                FileName = csvFilePath,
                NumberOfLinesToSkip = 1
            };
            // Map the CSV fields to the table columns, in the order they appear in the file
            bl.Columns.AddRange(new List<string>() { "Column1", "Column2" });
            var numberOfInsertedRows = await bl.LoadAsync();
        }
        System.IO.File.Delete(csvFilePath);
    }
    catch (Exception ex)
    {
        result = false;
        throw;
    }
    return result;
}
Note: the column mapping is done based on the order in which you add the names to the Columns property.
The loader reads the column values from the CSV file in that same order.
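As a minimal sketch (the column names and CSV layout below are made up for illustration), the list passed to Columns.AddRange has to follow the field order of the file:
// CSV layout assumed: FirstName,LastName,DateOfBirth
// The list order must match that field order, even if the table defines its columns in a different order.
bl.Columns.AddRange(new List<string> { "FirstName", "LastName", "DateOfBirth" });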
I have a function that inserts SQL data based on a spreadsheet. Certain rows trigger an exception because their data is being truncated, and I am trying to figure out which rows are causing the issue. (I have to query 3 different tables, so I am using a function that takes the SQL, command parameters, and values instead of writing the same function 3 times.)
The function works to insert the SQL data, except for the few rows that throw this exception message:
String or Binary data would be truncated. The statement has been
terminated
My question is: how do I print out the row number that causes the above message, so I can troubleshoot the data in the Excel sheet? The sheet has 100+ thousand rows, so I don't want to go through it row by row.
The function I have:
public static void insert_data(string[] cols, string[] vals, string sql)
{
    int exception_count = 0;
    List<string> rows = new List<string>();
    string connectionString = "Server = ; Database = ; User Id = ; Password = ";
    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand command = new SqlCommand(sql, connection))
    {
        connection.Open();
        for (int x = 0; x < cols.Length; x++)
            command.Parameters.AddWithValue(cols[x], vals[x]);
        try
        {
            command.ExecuteNonQuery();
        }
        catch (Exception ex)
        {
            exception_count += 1;
            Console.WriteLine(ex.Message);
            //rows.Add(--rows number--);
        }
    }
}
As I can see, the rows come from an array, so first use a lambda expression to find the rows that exceed the maximum length you want, or conversely, to keep only the rows at or below that length, whichever suits you.
public void test2()
{
    // ensure that there are no empty rows in the array... or it will throw an exception
    string[] vals = new string[7];
    int myMaxColumnDatabaseLength = 7;
    vals[0] = "length7";
    vals[1] = "length7";
    vals[2] = "length7";
    vals[3] = "length7";
    vals[4] = "length_8";
    vals[5] = "length__9";
    vals[6] = "length__10";
    Debug.WriteLine(vals.Count());
    vals = vals.Where(x => x.Length <= myMaxColumnDatabaseLength).ToArray();
    Debug.WriteLine(vals.Count());
}
If you want to make this dynamic, you may need to query SQL Server for the length definition of the specific columns; if you want that part, I will try to add more information about it.
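As a rough sketch of that idea (the helper name is made up, and it assumes SQL Server's INFORMATION_SCHEMA views), you could look up the declared maximum length of a column and flag any value that exceeds it before inserting:
// Returns the declared CHARACTER_MAXIMUM_LENGTH of a column (-1 means varchar(max)).
public static int GetMaxColumnLength(SqlConnection connection, string table, string column)
{
    string sql = @"SELECT CHARACTER_MAXIMUM_LENGTH
                   FROM INFORMATION_SCHEMA.COLUMNS
                   WHERE TABLE_NAME = @table AND COLUMN_NAME = @column";
    using (SqlCommand cmd = new SqlCommand(sql, connection))
    {
        cmd.Parameters.AddWithValue("@table", table);
        cmd.Parameters.AddWithValue("@column", column);
        object result = cmd.ExecuteScalar();
        return result == null || result == DBNull.Value ? 0 : Convert.ToInt32(result);
    }
}
Any value longer than the returned length is one that would be truncated, so logging its row index before calling ExecuteNonQuery points you at the offending rows.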
I've loaded a CSV into a DataTable, but some of the columns are being read as blank when they're not blank. I initially posted this question only having an issue with the header, but I'm now also seeing this issue in my data rows, so I need to re-ask: what is the problem with my data set, and why are some columns being read as blank?
Currently, this setup reads data for columns 1-7 (I don't need 8-10). Data is populated correctly for all columns except column 4. Strangely, I have two files I've tested, both similar in structure, but one of them has values in column 4 and the other doesn't. The full code loops through many files, checks for the start and end of the data, then loads it into SQL Server.
Sample CSV:
By OrgID/Location
As of: December 6, 2017 at 10:13 AM
Date Range: summaryYM 2017M08 to 2017M08
"orgid=13778 medType=' '"
"col1","col2","col3","col4","col5","col6","col7","col8","col9","col10"
13778,140242,"2A","2017M08",0,0.058,78,".",".",
13778,140242,"2B","2017M08",0,0.014,19,".",".",
13778,140242,"2C","2017M08",0,0.083,133,".",".",
13778,140242,"2ICU","2017M08",0,0.099,114,".",".",
13778,140242,"3 ICU","2017M08",0,0.076,88,".",".",
Code:
//open connection to csv
string connStrCsv = string.Format(@"Provider=Microsoft.Jet.OleDb.4.0; Data Source={0};Extended Properties=""Text;HDR=NO;FMT=Delimited"""
, Path.GetDirectoryName(file));
OleDbConnection connCsv = new OleDbConnection(connStrCsv);
connCsv.Open();
//store csv data in datatable
string readCsv = "select * from [" + Path.GetFileName(file) + "]";
OleDbDataAdapter adapter = new OleDbDataAdapter(readCsv, connCsv);
DataSet ds = new DataSet();
adapter.Fill(ds, "sheet1");
DataTable table = ds.Tables["sheet1"];
connCsv.Close();
//find header to define start of data
int start = 0;
StreamReader headerSearch = null;
int incr = 0;
headerSearch = new StreamReader(file);
while (!headerSearch.EndOfStream)
{
    incr++;
    string line = headerSearch.ReadLine();
    if (line.Contains("\"col1\",\"col2\",\"col3\",\"col4\",\"col5\",\"col6\""))
    {
        start = incr;
    }
}
headerSearch.Close();
//load each row of excel into SQL server until first empty row
string sqlConnStr = "Data Source=mysource;Initial Catalog=mydatabase;Trusted_Connection=Yes;Integrated Security=SSPI;";
SqlConnection connSql = new SqlConnection(sqlConnStr);
connSql.Open();
int end = start;
while (table.Rows[end][0].ToString().Length != 0)
{
    string sql = string.Format
        (@"
        delete from schema.table
        where ss_col1 = {0}
        and ss_col2 = '{1}'
        and ss_col3 = '{2}'
        and ss_col4 = '{3}';
        insert into schema.table
        values ({4}
        ,'{5}'
        ,'{6}'
        ,'{7}'
        , {8}
        ,'{9}'
        ,'{10}'
        ,getdate()
        ,user_name()
        ,getdate()
        ,user_name());"
        //delete statement variables
        , table.Rows[end][0].ToString()
        , table.Rows[end][2].ToString()
        , table.Rows[end][3].ToString()
        , infTypes[i]
        //insert statement variables
        , table.Rows[end][0].ToString()
        , table.Rows[end][2].ToString()
        , table.Rows[end][3].ToString()
        , infTypes[i]
        , table.Rows[end][4]
        , table.Rows[end][5].ToString()
        , table.Rows[end][6]
        );
    SqlCommand execSql = new SqlCommand(sql, connSql);
    execSql.ExecuteNonQuery();
    end++;
}
connSql.Close();
Do you have to load it into a DataTable?
If "header1","header2","header3","header4","header5","header6" are unique, would it not be easier just to read the CSV file until you find those?
Example:
StreamReader Reader = null;
string FilePath = "Your File Path";
try
{
    Reader = new StreamReader(FilePath);
    bool HeaderFound = false;       // declared outside the loop so it is not reset on every line
    while (Reader.Peek() >= 0)
    {
        string line = Reader.ReadLine();
        if (line == "What ever your headers are")
        {
            HeaderFound = true;
            continue;               // skip the header line itself
        }
        if (HeaderFound)
        {
            //Here is all your data you were looking for.
            //Do whatever you need to do with it now.
        }
    }
}
catch (Exception e)
{ /*Deal with the issues*/ }
finally
{
    if (Reader != null)
    {
        Reader.Close();
        Reader.Dispose();
    }
}
I've written a method which tries to delete a row from a DB table based on a primary key ID. The problem I have is that the try block always returns "Success", even if the record has already been deleted or doesn't exist.
public string delete_visit(int id)
{
    string deleteResponse = null;
    string cnn = ConfigurationManager.ConnectionStrings[connname].ConnectionString;
    using (SqlConnection connection = new SqlConnection(cnn))
    {
        string SQL = string.Empty;
        SQL = "DELETE FROM [" + dbname + "].[dbo].[" + tbname + "] WHERE VisitorNumber = @IDNumber ";
        using (SqlCommand command = new SqlCommand(SQL, connection))
        {
            command.Parameters.Add("@IDNumber", SqlDbType.Int);
            command.Parameters["@IDNumber"].Value = id;
            try
            {
                connection.Open();
                command.ExecuteNonQuery();
                deleteResponse = "Success";
            }
            catch (Exception ex)
            {
                deleteResponse = "There was a problem deleting the visit from the database. Error message: " + ex.Message;
            }
        }
    }
    return deleteResponse;
}
I want to be able to tell if the row was affected. I can do this in SQL Server Management Studio like so:
DELETE FROM Visits
WHERE VisitorNumber=88;
IF @@ROWCOUNT = 0
PRINT 'Warning: No rows were updated';
So I want to know: how do I plug the @@ROWCOUNT bit into my C# so that I can tell whether the row was deleted?
Thanks
ExecuteNonQuery() returns an int, indicating how many rows were affected.
So:
int rowsAffected = command.ExecuteNonQuery();
if (rowsAffected == 0)
{
deleteResponse = "No rows affected";
}
The problem is that this number can be influenced by what the query actually does. Triggers or stored procedure calls can change the reported number of affected rows. If you really must be certain, first execute a query that checks whether a record with the given ID exists.
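A minimal sketch of that existence check, reusing the table and connection variables from the question (adapt the names to your schema):
// Run before the DELETE (or before trusting the rows-affected count) to see if the row is there at all.
string existsSql = "SELECT COUNT(*) FROM [" + dbname + "].[dbo].[" + tbname + "] WHERE VisitorNumber = @IDNumber";
using (SqlCommand checkCmd = new SqlCommand(existsSql, connection))
{
    checkCmd.Parameters.Add("@IDNumber", SqlDbType.Int).Value = id;
    bool exists = (int)checkCmd.ExecuteScalar() > 0;
}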
My application is a WinForms app that relies on a database. At startup the application connects to a SQL database on the server and puts this information in a DataSet/DataTable.
If for some reason the database on the server is not accessible, the application has a built-in failover and gets its information from the local database instead.
If, in a normal scenario, I start the tool, it reads from the SQL database, and if that database has been updated on the server (a separate snippet checks this), the tool should make sure the local database is up to date; this is where the problem starts (see below).
This part works fine and is added as context; this is where we connect to the SQL database:
public static DataSet dtsTableContents;
public static DataTable CreateDatabaseSQLConnection()
{
    try
    {
        string strSqlConnectionString = "Data Source=MyLocation;Initial Catalog=MyCatalog;User=MyUser;Password=MyPassword;";
        SqlConnection scnConnectionToDatabase = new SqlConnection(strSqlConnectionString);
        string strQueryToTable = "SELECT * FROM " + strTableName;
        dtsTableContents = new DataSet();
        SqlCommand scmTableInformation = new SqlCommand(strQueryToTable, scnConnectionToDatabase);
        SqlDataAdapter sdaTableInformation = new SqlDataAdapter(scmTableInformation);
        scnConnectionToDatabase.Open();
        sdaTableInformation.Fill(dtsTableContents, strTableName);
        DataTable dttTableInformation = dtsTableContents.Tables[strTableName];
        scnConnectionToDatabase.Close();
        return dttTableInformation;
    }
    catch
    {
        return null;
    }
}
This snippet is part of the failover method that reads from my local database.
This part works fine and is added as context; this is where we connect to the MDB database:
public static DataTable CreateDatabaseConnection()
{
    try
    {
        string ConnectionString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=MyLocation;Persist Security Info=True;JET OLEDB:Database Password=MyPassword;";
        odcConnection = new OleDbConnection(ConnectionString);
        string strQueryToTable = "SELECT * FROM " + strTableName;
        DataSet dtsTableContents = new DataSet();
        OleDbCommand ocmTableInformation = new OleDbCommand(strQueryToTable, odcConnection);
        OleDbDataAdapter odaTableInformation = new OleDbDataAdapter(ocmTableInformation);
        odcConnection.Open();
        odaTableInformation.Fill(dtsTableContents, strTableName);
        DataTable dttTableInformation = dtsTableContents.Tables[strTableName];
        odcConnection.Close();
        return dttTableInformation;
    }
    catch
    {
        return null;
    }
}
From my CreateDatabaseSQLConnection() I have a DataSet, and this DataSet is verified to contain all the information from the server database. I have been googling around and found myself trying to use this code to update the local database, based on this article:
http://msdn.microsoft.com/en-us/library/system.data.common.dataadapter.update(v=vs.71).aspx
public static void UpdateLocalDatabase(string strTableName)
{
    try
    {
        if (CreateDatabaseConnection() != null)
        {
            string strQueryToTable = "SELECT * FROM " + strTableName;
            OleDbDataAdapter odaTableInformation = new OleDbDataAdapter();
            odaTableInformation.SelectCommand = new OleDbCommand(strQueryToTable, odcConnection);
            OleDbCommandBuilder ocbCommand = new OleDbCommandBuilder(odaTableInformation);
            odcConnection.Open();
            odaTableInformation.Update(dtsTableContents, strTableName);
            odcConnection.Close();
        }
    }
    catch { }
}
This snippet runs error-free, but it does not seem to change anything. Also, this step takes only milliseconds, and I would expect it to take longer.
I am using the DataSet I obtained from my SQL connection; this is the DataSet I am trying to write to my local database.
Might it be the fact that this is a DataSet from a SQL connection and that I can't write it to my MDB connection via my OleDbDataAdapter, or am I just missing the obvious here?
Any help is appreciated.
Thanks,
Kevin
For a straight backup from the external database to the internal backup:
I've just been messing around with an SDF file and a server-based SQL database and it has worked. It's by no means the finished product, but for one column I've got the program to read from the external database and then write straight away to the local .sdf in around 15 lines of code.
SqlConnection sqlCon = new SqlConnection(ExternalDatabaseConnectionString);
SqlCeConnection sqlCECon = new SqlCeConnection(BackUpConnectionString);
using (sqlCon)
{
    using (sqlCECon)
    {
        sqlCon.Open();
        sqlCECon.Open();
        SqlCommand get = new SqlCommand("Select * from [TableToRead]", sqlCon);
        SqlCeCommand save = new SqlCeCommand("Update [BackUpTable] set InfoColumn = @info where ID = @id", sqlCECon);
        SqlDataReader reader = get.ExecuteReader();
        if (reader.HasRows)
        {
            reader.Read();
            save.Parameters.AddWithValue("@id", reader.GetString(0));
            save.Parameters.AddWithValue("@info", reader.GetString(1));
            save.ExecuteNonQuery();
        }
    }
}
For one row of a database, backing up one column, it works. I'm assuming you will have some sort of auto-incremented key like ID?
I think a first step would be to reduce your reliance on static methods and fields.
If you look at your UpdateLocalDatabase method, you'll see that you are passing in strTableName, which is used by that method, but a method that UpdateLocalDatabase calls (CreateDatabaseConnection) refers to a different global static variable with the same name. Most likely the two strTableName variables contain different values and you are not seeing that they are not the same variable.
Also, you are trying to write out the global static data set dtsTableContents in UpdateLocalDatabase, but if you then look at CreateDatabaseConnection, it actually creates a local version of that variable; again, you have two variables with the same name, one global and one local.
I suspect that the two dtsTableContents variables are the problem.
My suggestion, again, would be to not have any static methods or variables and, for what you're doing here, to avoid global variables. Also, refactor and/or rename your methods to match what they actually do.
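As a rough sketch of that refactoring (the parameter names here are illustrative, not taken from the original code), the local-update method could take the data and table name explicitly instead of reading statics:
// Accept the server data and table name as parameters rather than using static fields.
public static void UpdateLocalDatabase(DataSet serverData, string tableName, string oleDbConnectionString)
{
    using (OleDbConnection connection = new OleDbConnection(oleDbConnectionString))
    {
        OleDbDataAdapter adapter = new OleDbDataAdapter("SELECT * FROM " + tableName, connection);
        OleDbCommandBuilder builder = new OleDbCommandBuilder(adapter); // generates the INSERT/UPDATE/DELETE commands
        connection.Open();
        adapter.Update(serverData, tableName); // only rows whose RowState has changed are written
    }
}
Passing the connection string in also avoids the two methods competing over the shared odcConnection field.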
After endlessly trying to use the DataAdapter.Update method I gave up and settled for using an .sdf file instead of an .mdb.
If I need to update my database, I remove the local database and create a new one using the same connection string. Then I loop over the tables in my DataSet, read the column names and types from it, and create tables based on that.
After this I loop over my DataSet and insert its contents, which I filled with the information from the server. Below is the code; this is just 'quick and dirty' as a proof of concept, but it works for my scenario.
public static void RemoveAndCreateLocalDb(string strLocalDbLocation)
{
    try
    {
        if (File.Exists(strLocalDbLocation))
        {
            File.Delete(strLocalDbLocation);
        }
        SqlCeEngine sceEngine = new SqlCeEngine(@"Data Source= " + strLocalDbLocation + ";Persist Security Info=True;Password=MyPass");
        sceEngine.CreateDatabase();
    }
    catch
    { }
}
public static void UpdateLocalDatabase(String strTableName, DataTable dttTable)
{
    try
    {
        // Opening the connection
        sceConnection = CreateDatabaseSQLCEConnection();
        sceConnection.Open();

        // Creating tables in the sdf file - checking headers and types and adding them to a query
        StringBuilder stbSqlGetHeaders = new StringBuilder();
        stbSqlGetHeaders.Append("create table " + strTableName + " (");
        int z = 0;
        foreach (DataColumn col in dttTable.Columns)
        {
            if (z != 0) stbSqlGetHeaders.Append(", ");
            String strName = col.ColumnName;
            String strType = col.DataType.ToString();
            if (strType.Equals("")) throw new ArgumentException("DataType Empty");
            if (strType.Equals("System.Int32")) strType = "int";
            if (strType.Equals("System.String")) strType = "nvarchar (100)";
            if (strType.Equals("System.Boolean")) strType = "nvarchar (15)";
            if (strType.Equals("System.DateTime")) strType = "datetime";
            if (strType.Equals("System.Byte[]")) strType = "nvarchar (100)";
            stbSqlGetHeaders.Append(strName + " " + strType);
            z++;
        }
        stbSqlGetHeaders.Append(" )");
        SqlCeCommand sceCreateTableCommand;
        string strCreateTableQuery = stbSqlGetHeaders.ToString();
        sceCreateTableCommand = new SqlCeCommand(strCreateTableQuery, sceConnection);
        sceCreateTableCommand.ExecuteNonQuery();

        StringBuilder stbSqlQuery = new StringBuilder();
        StringBuilder stbFields = new StringBuilder();
        StringBuilder stbParameters = new StringBuilder();
        stbSqlQuery.Append("insert into " + strTableName + " (");
        foreach (DataColumn col in dttTable.Columns)
        {
            stbFields.Append(col.ColumnName);
            stbParameters.Append("@" + col.ColumnName.ToLower());
            if (col.ColumnName != dttTable.Columns[dttTable.Columns.Count - 1].ColumnName)
            {
                stbFields.Append(", ");
                stbParameters.Append(", ");
            }
        }
        stbSqlQuery.Append(stbFields.ToString() + ") ");
        stbSqlQuery.Append("values (");
        stbSqlQuery.Append(stbParameters.ToString() + ") ");

        string strTotalRows = dttTable.Rows.Count.ToString();
        foreach (DataRow row in dttTable.Rows)
        {
            SqlCeCommand sceInsertCommand = new SqlCeCommand(stbSqlQuery.ToString(), sceConnection);
            foreach (DataColumn col in dttTable.Columns)
            {
                if (col.ColumnName.ToLower() == "ssma_timestamp")
                {
                    sceInsertCommand.Parameters.AddWithValue("@" + col.ColumnName.ToLower(), "");
                }
                else
                {
                    sceInsertCommand.Parameters.AddWithValue("@" + col.ColumnName.ToLower(), row[col.ColumnName]);
                }
            }
            sceInsertCommand.ExecuteNonQuery();
        }
    }
    catch { }
}
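A possible way to drive these two helpers from the DataSet filled by the server connection; the local .sdf path here is only an example:
// Rebuild the local .sdf, then recreate and refill every table from the server DataSet.
RemoveAndCreateLocalDb(@"C:\MyApp\LocalCache.sdf");  // example path
foreach (DataTable table in dtsTableContents.Tables)
{
    UpdateLocalDatabase(table.TableName, table);
}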
In my website I am reading a CSV file and parsing it. The CSV does not have column names; it is simply a raw list of comma-separated values.
I take this file and use the OdbcDataReader class to read the rows.
The problem is that when I retrieve the first value, it skips the first row of the CSV. This is probably because it considers the first row to be a column header, but in my case there are no column headers, so my first row is always skipped.
How can I retrieve the first row of my CSV?
Here is the screenshot of my CSV:
Here is the code that I am using to parse the CSV.
public string CsvParser()
{
    int _nNrRowsProccessed = 0;
    string connectionString = @"Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=" + ConfigurationManager.AppSettings["CSVFolder"] + ";";
    OdbcConnection conn = new OdbcConnection(connectionString);
    try
    {
        conn.Open();
        string strFileName = ConfigurationManager.AppSettings["CSVFile"];
        string strSQL = "Select * from " + strFileName;
        OdbcCommand cmd = new OdbcCommand();
        cmd.Connection = conn;
        cmd.CommandText = strSQL;
        cmd.CommandType = CommandType.Text;
        OdbcDataReader reader = cmd.ExecuteReader();
        string strLine = null;
        // MasterCalendar_DB.OpenMySQLConnection();
        while (reader.Read())
        {
            // insert data into mastercalendar
            strLine = reader[0].ToString();
            string strLine1 = reader[1].ToString();
            string strLine2 = reader[2].ToString();
            string strLine3 = reader[3].ToString();
            string[] arLine = strLine.Split(';');
            // string strAgencyPropertyID = arLine[0];
            // DateTime dt = DateTime.Parse(arLine[1]);
            // Int64 nDate = (Int64)Util.ConvertToUnixTimestamp(dt);
            // String strAvailability = (arLine[2]);
            _nNrRowsProccessed++;
            // MasterCalendar_DB.Insert(strAgencyPropertyID, nDate, strAvailability);
        }
    }
    catch (Exception ex)
    {
        throw ex;
    }
    finally
    {
        conn.Close();
        // MasterCalendar_DB.CloseMySQLConnection();
    }
    return "Success";
}
You want to have a look at the page for the Text Driver over at connectionstrings.org.
Basically, you create a schema.ini file in the same directory, which holds various options. One of them is the ColNameHeader option, which takes a boolean.
Example from the site:
[customers.txt]
Format=TabDelimited
ColNameHeader=True
MaxScanRows=0
CharacterSet=ANSI
Check this: Schema.ini File (Text File Driver)
You may need to set ColNameHeader=False.
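For a headerless CSV like the one in the question, the schema.ini entry might look roughly like this (the section name must match your actual CSV file name, and the Format/MaxScanRows/CharacterSet values are assumptions to adjust as needed):
[yourfile.csv]
Format=CSVDelimited
ColNameHeader=False
MaxScanRows=0
CharacterSet=ANSI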
Reference Microsoft.VisualBasic and you can use TextFieldParser, which almost certainly has fewer dependencies than your proposed method.
// requires a reference to Microsoft.VisualBasic and "using Microsoft.VisualBasic.FileIO;"
using (var parser =
    new TextFieldParser(@"c:\data.csv")
    {
        TextFieldType = FieldType.Delimited,
        Delimiters = new[] { "," }
    })
{
    while (!parser.EndOfData)
    {
        string[] fields;
        fields = parser.ReadFields();
        //go go go!
    }
}
Quick and dirty solution:
string strLine = reader.GetName(0);
string strLine1 = reader.GetName(1);
string strLine2 = reader.GetName(2);
string strLine3 = reader.GetName(3);
reader.GetName(int i); // Gets the name of specified column.