Here's the code I'm executing:
public static void Main(string[] args)
{
var connectionString = "Data Source=dbname;User Id=usrname;Password=pass;";
DataTable dt = new DataTable("BULK_INSERT_TEST");
dt.Columns.Add("N", typeof(double));
var row = dt.NewRow();
row["N"] = 1;
using (var connection = new OracleConnection(connectionString)){
connection.Open();
using(var bulkCopy = new OracleBulkCopy(connection, OracleBulkCopyOptions.UseInternalTransaction))
{
bulkCopy.DestinationTableName = dt.TableName;
bulkCopy.WriteToServer(dt);
}
}
using (var connection = new OracleConnection(connectionString)){
connection.Open();
var command = new OracleCommand("select count(*) from BULK_INSERT_TEST", connection);
var res = command.ExecuteScalar();
Console.WriteLine(res); // Here I'm getting 0
}
}
It uses OracleBulkCopy to insert one entry into the table and then counts the rows in that table.
Why am I getting 0 rows? Here's the structure of the table:
-- Create table
create table BULK_INSERT_TEST
(
n NUMBER
)
You haven't actually added the row to the table. You've used the table to create a new row with the right columns, but never attached the row to the table. You need:
dt.Rows.Add(row);
(before your first using statement, basically)
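With that line added, the setup portion reads:
DataTable dt = new DataTable("BULK_INSERT_TEST");
dt.Columns.Add("N", typeof(double));
var row = dt.NewRow();
row["N"] = 1;
dt.Rows.Add(row); // NewRow creates a detached row; Add attaches it to the table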
We have a big list of around 100,000 records and want to insert it into a SQL table.
What we are doing is converting that list into a DataTable and passing the DataTable to the SqlBulkCopy method.
This conversion from list to DataTable is taking a lot of time. We tried using Parallel, but since DataTable is not thread safe we avoided that.
Here is sample POC code which generates an integer list and inserts it into a temp table:
static void Main(string[] args)
{
List<int> valueList = GenerateList(100000);
Console.WriteLine("Starting with Bulk Insert ");
DateTime startTime = DateTime.Now;
int recordCount = BulkInsert(valueList);
TimeSpan ts = DateTime.Now.Subtract(startTime);
Console.WriteLine("Bulk insert for {0} records in {1} miliseconds.-> ", recordCount, ts.Milliseconds);
Console.WriteLine("Done.");
Console.ReadLine();
}
private static int BulkInsert(List<int> valueList)
{
var eventIdDataTable = CreateIdentityDataTable(valueList, "SqlTable", "Id");
return FillBulkPoundTable(eventIdDataTable, "#EventIds"); // must match the temp table created in FillBulkPoundTable
}
private static List<int> GenerateList(int size)
{
return Enumerable.Range(0, size).ToList();
}
private static DataTable CreateIdentityDataTable(List<int> ids, string dataTableName, string propertyName)
{
if (ids == null) return null;
// No using block here: the DataTable is returned, so the caller owns its lifetime
var dataTable = new DataTable(dataTableName);
dataTable.Locale = CultureInfo.CurrentCulture;
var dtColumn = new DataColumn(propertyName, Type.GetType("System.Int32"));
dataTable.Columns.Add(dtColumn);
foreach (int id in ids)
{
DataRow row = dataTable.NewRow();
row[propertyName] = id;
dataTable.Rows.Add(row);
}
return dataTable;
}
private static int FillBulkPoundTable(DataTable dataTable, string destinationTableName)
{
int totalInsertedRecordCount = 0;
using (SqlConnection _connection = new SqlConnection(CongifUtil.sqlConnString))
{
string sql =
@"If object_Id('tempdb..#EventIds') is not null drop table #EventIds
CREATE TABLE #EventIds(EvId int)";
_connection.Open();
using (var command = new SqlCommand(sql, _connection))
{
command.ExecuteNonQuery();
}
using (var sqlBulkCopy = new SqlBulkCopy(_connection))
{
sqlBulkCopy.BulkCopyTimeout = 0;
sqlBulkCopy.DestinationTableName = destinationTableName;
sqlBulkCopy.WriteToServer(dataTable);
}
using (var command = new SqlCommand("Select Count(1) as RecordCount from #EventIds", _connection))
using (SqlDataReader reader = command.ExecuteReader()) // dispose the reader when done
{
if (reader.HasRows)
{
while (reader.Read())
{
totalInsertedRecordCount = Convert.ToInt32(reader["RecordCount"]);
}
}
}
}
return totalInsertedRecordCount;
}
Currently it is taking around 8 seconds, but we need to make it faster. The reason is that our target is to insert 900,000 records, divided into batches of 100,000 each.
Can you give us any hint how we can make it faster?
PS. We tried Dapper inserts too, but they were not faster than BulkCopy.
First, convert your list into XML, something like:
List<int> Branches = new List<int>();
Branches.Add(1);
Branches.Add(2);
Branches.Add(3);
XElement xmlElements = new XElement("Branches", Branches.Select(i => new
XElement("branch", i)));
Then pass the XML to a stored procedure as a parameter and insert it directly into your table. Example:
DECLARE @XML XML
SET @XML = '<Branches>
<branch>1</branch>
<branch>2</branch>
<branch>3</branch>
</Branches>'
DECLARE @handle INT
DECLARE @PrepareXmlStatus INT
EXEC @PrepareXmlStatus = sp_xml_preparedocument @handle OUTPUT, @XML
SELECT * FROM OPENXML(@handle, '/Branches/branch', 2)
WITH (
branch varchar(100)
)
EXEC sp_xml_removedocument @handle
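On the C# side, the XML can be passed to the procedure as a parameter. A minimal sketch, assuming a hypothetical stored procedure named InsertBranches that wraps the OPENXML insert above:
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("InsertBranches", conn)) // hypothetical procedure name
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add(new SqlParameter("@XML", SqlDbType.Xml) { Value = xmlElements.ToString() });
    conn.Open();
    cmd.ExecuteNonQuery();
}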
Batch Size
From what I understand, you are inserting with a BatchSize of 100,000. Higher is not always better.
Try lowering this amount to 5,000 instead and check the performance difference.
You increase the number of database round-trips, but it may also go faster (too many factors, such as row size, are involved here).
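For example, a sketch reusing the variables from the question (the best value varies with row size and environment, so measure):
using (var sqlBulkCopy = new SqlBulkCopy(_connection))
{
    sqlBulkCopy.BatchSize = 5000;    // rows sent per round-trip
    sqlBulkCopy.BulkCopyTimeout = 0;
    sqlBulkCopy.DestinationTableName = destinationTableName;
    sqlBulkCopy.WriteToServer(dataTable);
}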
TableLock
Using SqlBulkCopyOptions.TableLock will improve your insert performance:
using (var sqlBulkCopy = new SqlBulkCopy(_connection, SqlBulkCopyOptions.TableLock, null))
I am running a SQL query which returns a count. The query is:
Select Count(numstudents) from classA
I am using C# to connect to SQL Server and execute this query, but my issue is: how do I get the actual number returned? My current method returns the number of rows in the DataTable, which by default will always be 1. I need to get the value returned by Count().
Here is full C# syntax:
private void GetData()
{
DataSet ds = new DataSet();
using (var con = new SqlConnection(connectionString))
{
using (var cmd = new SqlCommand("RunAStoredProc", con))
{
using (var da = new SqlDataAdapter(cmd))
{
cmd.CommandType = CommandType.StoredProcedure;
da.Fill(ds);
}
}
}
DataTable table1 = ds.Tables[0];
DataTable table2 = ds.Tables[1];
string numberreturned = table1.Rows.Count.ToString();
Console.WriteLine(numberreturned);
Console.ReadKey();
}
Stored procedure reads like such:
Alter Procedure [dbo].[RunAStoredProc]
As
Select Count(*) FROM classA
Select studentfirstname, studentlastname FROM classA
Where enrolled = 'Yes'
You don't need a SqlDataAdapter, and all the infrastructure required to work with it, if your stored procedure just returns a single value. Just use ExecuteScalar:
int count = 0;
using (var con = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("RunAStoredProc", con))
{
cmd.CommandType = CommandType.StoredProcedure;
con.Open(); // the connection must be opened before executing the command
count = (int)cmd.ExecuteScalar();
}
Console.WriteLine(count);
Console.ReadKey();
However, if you really want to use an adapter and a DataSet, you can find the result of your query by reading the value from the first row and first column of the returned table:
int count = Convert.ToInt32(table1.Rows[0][0]);
or even (without declaring the table1 variable)
int count = Convert.ToInt32(ds.Tables[0].Rows[0][0]);
To discover the difference between the result of the first select statement and the count of rows returned by the second select statement, you could write:
int allStudents = Convert.ToInt32(ds.Tables[0].Rows[0][0]);
int enrolledStudents = ds.Tables[1].Rows.Count;
int notEnrolledStudents = allStudents - enrolledStudents;
This application I'm developing will read in a bunch of records, validate / modify data, then send that data back to the DB.
I need to do this with an "All or None" approach; either update all of the rows, or don't modify any of the data in the DB.
I found the SqlBulkCopy class which has a method for writing all of the rows in a DataTable to a database but this is unacceptable for my purposes. I just wrote a quick app that'd write a few rows to the DB, populate a DataTable from that database, modify the data, then call WriteToServer(DataTable dt) again. Even though the DataTable has all of the Primary Key information, it simply added new rows with the data and new primary keys rather than updating the data for the rows that have matching primary keys.
So, how can I do bulk UPDATE statements against an MSSQL database from a C# WinForms application?
I don't think this is really relevant, but I'll include my code anyways. I'm not great when it comes to interacting with databases, so it's very possible I'm just doing something dumb and wrong.
static void Main(string[] args)
{
using (SqlConnection conn = new SqlConnection("myConnectionInfo"))
{
//DataTable dt = MakeTable();
DataTable dt = FillDataTable(conn);
using (SqlBulkCopy sbc = new SqlBulkCopy(conn))
{
sbc.DestinationTableName = "TestTbl";
try
{
for (int i = 0; i < dt.Rows.Count; i++)
{
dt.Rows[i][1] = i;
dt.Rows[i][2] = i+1;
dt.Rows[i][3] = i+2;
}
sbc.WriteToServer(dt);
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
finally
{
Console.WriteLine("Press Any Key To Continue");
Console.ReadKey();
}
}
}
}
private static DataTable MakeTable()
{
DataTable newProducts = new DataTable("TestTbl");
DataColumn TestPk = new DataColumn();
TestPk.DataType = System.Type.GetType("System.Int32");
TestPk.ColumnName = "TestPk";
TestPk.AllowDBNull = false;
TestPk.AutoIncrement = true;
TestPk.Unique = true;
newProducts.Columns.Add(TestPk);
DataColumn Col1 = new DataColumn();
Col1.DataType = System.Type.GetType("System.String");
Col1.ColumnName = "Col1";
Col1.AllowDBNull = false;
newProducts.Columns.Add(Col1);
// Add 2 more columns like the above block
DataRow row = newProducts.NewRow();
row["Col1"] = "CC-101-WH";
row["Col2"] = "Cyclocomputer - White";
row["Col3"] = DBNull.Value;
newProducts.Rows.Add(row);
// Add 2 more rows like the above block
newProducts.AcceptChanges();
return newProducts;
}
private static DataTable FillDataTable(SqlConnection conn)
{
string query = "SELECT * FROM NYMS.dbo.TestTbl";
using (SqlCommand cmd = new SqlCommand(query, conn))
{
conn.Open();
DataTable dt = new DataTable();
dt.Load(cmd.ExecuteReader());
return dt;
}
}
The call to MakeTable() is commented out because I used it when I first ran the application to insert some test data; afterwards, while trying to update that data, I use FillDataTable simply to populate the DataTable.
For inserting a huge amount of data into a database, I collect all the information into a list and convert the list into a DataTable. I then insert that table into the database via SqlBulkCopy.
I send my generated list LiMyList, which contains the information for all the bulk data I want to insert, and pass it to my bulk insertion operation:
InsertData(LiMyList, "MyTable");
Where InsertData is
public static void InsertData<T>(List<T> list,string TableName)
{
DataTable dt = new DataTable("MyTable");
clsBulkOperation blk = new clsBulkOperation();
dt = ConvertToDataTable(list);
ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.PerUserRoamingAndLocal);
using (SqlBulkCopy bulkcopy = new SqlBulkCopy(ConfigurationManager.ConnectionStrings["SchoolSoulDataEntitiesForReport"].ConnectionString))
{
bulkcopy.BulkCopyTimeout = 660;
bulkcopy.DestinationTableName = TableName;
bulkcopy.WriteToServer(dt);
}
}
public static DataTable ConvertToDataTable<T>(IList<T> data)
{
PropertyDescriptorCollection properties = TypeDescriptor.GetProperties(typeof(T));
DataTable table = new DataTable();
foreach (PropertyDescriptor prop in properties)
table.Columns.Add(prop.Name, Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);
foreach (T item in data)
{
DataRow row = table.NewRow();
foreach (PropertyDescriptor prop in properties)
row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
table.Rows.Add(row);
}
return table;
}
Now I want to do an update operation. Is there any way, just as inserting is done by SqlBulkCopy, to bulk-update data in the database from C#/.NET?
What I've done before is perform a bulk insert from the data into a temp table, and then use a command or stored procedure to update the destination table by joining it to the temp table. The temp table is an extra step, but when the number of rows is large, the bulk insert plus one set-based update can be a big performance gain compared to updating the data row by row.
Example:
public static void UpdateData<T>(List<T> list,string TableName)
{
DataTable dt = new DataTable("MyTable");
dt = ConvertToDataTable(list);
using (SqlConnection conn = new SqlConnection(ConfigurationManager.ConnectionStrings["SchoolSoulDataEntitiesForReport"].ConnectionString))
{
using (SqlCommand command = new SqlCommand("", conn))
{
try
{
conn.Open();
//Creating temp table on database
command.CommandText = "CREATE TABLE #TmpTable(...)";
command.ExecuteNonQuery();
//Bulk insert into temp table
using (SqlBulkCopy bulkcopy = new SqlBulkCopy(conn))
{
bulkcopy.BulkCopyTimeout = 660;
bulkcopy.DestinationTableName = "#TmpTable";
bulkcopy.WriteToServer(dt);
bulkcopy.Close();
}
// Updating destination table, and dropping temp table
command.CommandTimeout = 300;
command.CommandText = "UPDATE T SET ... FROM " + TableName + " T INNER JOIN #TmpTable Temp ON ...; DROP TABLE #TmpTable;";
command.ExecuteNonQuery();
}
catch (Exception ex)
{
// Handle exception properly
}
finally
{
conn.Close();
}
}
}
}
Notice that a single connection is used to perform the whole operation, in order to be able to use the temp table in each step, because the scope of the temp table is per connection.
In my personal experience, the best way to handle this situation is utilizing a Stored Procedure with a Table-Valued Parameter and a User-Defined Table Type. Just set up the type with the columns of the data table, and pass in said data table as a parameter in the SQL command.
Within the Stored Procedure, you can either join directly on some unique key (if all the rows you are updating exist), or, if you might run into a situation where you have to do both updates and inserts, use the SQL MERGE command within the stored procedure to handle both as applicable.
Microsoft has both a syntax reference and an article with examples for MERGE.
For the .NET piece, it's a simple matter of setting the parameter type to SqlDbType.Structured and setting the value of said parameter to the DataTable that contains the records you want to update.
This method provides the benefit of both clarity and ease of maintenance. While there may be ways that offer performance improvements (such as dropping the data into a temporary table and then iterating over that table), I think they're outweighed by the simplicity of letting .NET and SQL handle transferring the table and updating the records themselves. K.I.S.S.
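As a rough sketch of this approach (the type, procedure, table, and column names below are hypothetical placeholders, not from the original post):
-- One-time setup: the user-defined table type and the procedure
CREATE TYPE dbo.MyRowType AS TABLE (Id int PRIMARY KEY, SomeValue nvarchar(100));
GO
CREATE PROCEDURE dbo.UpdateMyTable @Rows dbo.MyRowType READONLY
AS
UPDATE T
SET T.SomeValue = R.SomeValue
FROM dbo.MyTable T
INNER JOIN @Rows R ON T.Id = R.Id;
GO
On the .NET side, the DataTable is passed as the structured parameter:
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.UpdateMyTable", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    var p = cmd.Parameters.Add("@Rows", SqlDbType.Structured);
    p.TypeName = "dbo.MyRowType";
    p.Value = dataTable; // DataTable columns must line up with the table type
    conn.Open();
    cmd.ExecuteNonQuery();
}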
Bulk Update:
Step 1: put the data you want to update, along with the primary key, in a list.
Step 2: pass this list and your connection string to the BulkUpdate method, as shown below.
Example:
//Method for Bulk Update the Data
public static void BulkUpdateData<T>(List<T> list, string connetionString)
{
DataTable dt = new DataTable("MyTable");
dt = ConvertToDataTable(list);
using (SqlConnection conn = new SqlConnection(connetionString))
{
using (SqlCommand command = new SqlCommand("CREATE TABLE
#TmpTable([PrimaryKey],[ColumnToUpdate])", conn))
{
try
{
conn.Open();
command.ExecuteNonQuery();
using (SqlBulkCopy bulkcopy = new SqlBulkCopy(conn))
{
bulkcopy.BulkCopyTimeout = 6600;
bulkcopy.DestinationTableName = "#TmpTable";
bulkcopy.WriteToServer(dt);
bulkcopy.Close();
}
command.CommandTimeout = 3000;
command.CommandText = "UPDATE P SET P.[ColumnToUpdate]= T.[ColumnToUpdate] FROM [TableName Where you want to update ] AS P INNER JOIN #TmpTable AS T ON P.[PrimaryKey] = T.[PrimaryKey] ;DROP TABLE #TmpTable;";
command.ExecuteNonQuery();
}
catch (Exception ex)
{
// Handle exception properly
}
finally
{
conn.Close();
}
}
}
}
Step 3: use the ConvertToDataTable method, as shown below.
Example:
public static DataTable ConvertToDataTable<T>(IList<T> data)
{
PropertyDescriptorCollection properties = TypeDescriptor.GetProperties(typeof(T));
DataTable table = new DataTable();
foreach (PropertyDescriptor prop in properties)
table.Columns.Add(prop.Name, Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);
foreach (T item in data)
{
DataRow row = table.NewRow();
foreach (PropertyDescriptor prop in properties)
row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
table.Rows.Add(row);
}
return table;
}
Note: wherever square brackets [] appear, substitute your own values.
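For instance, with a hypothetical table Employees keyed by EmployeeId and a Salary column to update, the two SQL statements would read:
CREATE TABLE #TmpTable([EmployeeId] int, [Salary] decimal(18,2))
UPDATE P SET P.[Salary] = T.[Salary] FROM [Employees] AS P INNER JOIN #TmpTable AS T ON P.[EmployeeId] = T.[EmployeeId]; DROP TABLE #TmpTable;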
Try out SqlBulkTools, available on NuGet.
Disclaimer: I'm the author of this library.
var bulk = new BulkOperations();
var records = GetRecordsToUpdate();
using (TransactionScope trans = new TransactionScope())
{
using (SqlConnection conn = new SqlConnection(ConfigurationManager
.ConnectionStrings["SqlBulkToolsTest"].ConnectionString))
{
bulk.Setup<MyTable>()
.ForCollection(records)
.WithTable("MyTable")
.AddColumn(x => x.SomeColumn1)
.AddColumn(x => x.SomeColumn2)
.BulkUpdate()
.MatchTargetOn(x => x.Identifier)
.Commit(conn);
}
trans.Complete();
}
Only 'SomeColumn1' and 'SomeColumn2' will be updated. More examples can be found here
I would insert the new values into a temporary table and then do a MERGE against the destination table, something like this:
MERGE [DestTable] AS D
USING #SourceTable S
ON D.ID = S.ID
WHEN MATCHED THEN
UPDATE SET ...
WHEN NOT MATCHED
THEN INSERT (...)
VALUES (...);
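Filled in with a hypothetical Name column alongside the ID key, that could read:
MERGE [DestTable] AS D
USING #SourceTable S
ON D.ID = S.ID
WHEN MATCHED THEN
UPDATE SET D.Name = S.Name
WHEN NOT MATCHED
THEN INSERT (ID, Name)
VALUES (S.ID, S.Name);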
You could try to build a single query that contains all the data, using a CASE expression. It could look like this:
update your_table
set some_column = case when id = 1 then 'value of 1'
when id = 5 then 'value of 5'
when id = 7 then 'value of 7'
when id = 9 then 'value of 9'
end
where id in (1,5,7,9)
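A sketch of building that statement from C#, assuming a hypothetical Dictionary<int, string> named values mapping each id to its new value (the values go through parameters; the integer keys are safe to inline):
var sql = new StringBuilder("update your_table set some_column = case");
using (var cmd = new SqlCommand())
{
    int i = 0;
    foreach (var pair in values)
    {
        sql.Append($" when id = {pair.Key} then @val{i}");
        cmd.Parameters.AddWithValue($"@val{i}", pair.Value);
        i++;
    }
    sql.Append($" end where id in ({string.Join(",", values.Keys)})");
    cmd.CommandText = sql.ToString();
    cmd.Connection = conn; // an already-open SqlConnection
    cmd.ExecuteNonQuery();
}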
I'd go for a TempTable approach because that way you aren't locking anything. But if your logic needs to live only in the front end and you need to use bulk copy, I'd try a Delete/Insert approach, inside the same SqlTransaction to ensure integrity, which would be something like this:
// ...
dt = ConvertToDataTable(list);
using (SqlConnection cnx = new SqlConnection(myConnectionString))
{
cnx.Open(); // the connection must be open before starting a transaction
using (SqlTransaction tran = cnx.BeginTransaction())
{
DeleteData(cnx, tran, list); // sketched below
using (SqlBulkCopy bulkcopy = new SqlBulkCopy(cnx, SqlBulkCopyOptions.Default, tran))
{
bulkcopy.BulkCopyTimeout = 660;
bulkcopy.DestinationTableName = TableName;
bulkcopy.WriteToServer(dt);
}
tran.Commit();
}
}
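DeleteData isn't shown in the original. A minimal sketch, assuming the list items expose an integer Id primary key and that TableName holds the target table name:
private static void DeleteData(SqlConnection cnx, SqlTransaction tran, List<MyItem> list) // MyItem is hypothetical
{
    // Remove the rows that are about to be re-inserted, inside the same transaction
    string ids = string.Join(",", list.Select(x => x.Id)); // integer keys, safe to inline
    using (var cmd = new SqlCommand($"DELETE FROM {TableName} WHERE Id IN ({ids})", cnx, tran))
    {
        cmd.ExecuteNonQuery();
    }
}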
Complete answer. Disclaimer: arrow code; this is mine, built from research, and published in SqlRapper. It uses custom attributes over properties to determine whether a key is primary. Yes, super complicated. Yes, super reusable. Yes, it needs to be refactored. Yes, it is a NuGet package. No, the documentation isn't great on GitHub, but it exists. Will it work for everything? Probably not. Will it work for simple stuff? Oh yeah.
How easy is it to use after setup?
public class Log
{
[PrimaryKey]
public int? LogId { get; set; }
public int ApplicationId { get; set; }
[DefaultKey]
public DateTime? Date { get; set; }
public string Message { get; set; }
}
var logs = new List<Log>() { log1, log2 };
success = db.BulkUpdateData(logs);
Here's how it works:
public class PrimaryKeyAttribute : Attribute
{
}
private static bool IsPrimaryKey(object[] attributes)
{
bool skip = false;
foreach (var attr in attributes)
{
if (attr.GetType() == typeof(PrimaryKeyAttribute))
{
skip = true;
}
}
return skip;
}
private string GetSqlDataType(Type type, bool isPrimary = false)
{
var sqlType = new StringBuilder();
var isNullable = false;
if (Nullable.GetUnderlyingType(type) != null)
{
isNullable = true;
type = Nullable.GetUnderlyingType(type);
}
switch (Type.GetTypeCode(type))
{
case TypeCode.String:
isNullable = true;
sqlType.Append("nvarchar(MAX)");
break;
case TypeCode.Int32:
case TypeCode.Int64:
case TypeCode.Int16:
sqlType.Append("int");
break;
case TypeCode.Boolean:
sqlType.Append("bit");
break;
case TypeCode.DateTime:
sqlType.Append("datetime");
break;
case TypeCode.Decimal:
case TypeCode.Double:
sqlType.Append("decimal");
break;
}
if (!isNullable || isPrimary)
{
sqlType.Append(" NOT NULL");
}
return sqlType.ToString();
}
/// <summary>
/// SqlBulkCopy is allegedly protected from Sql Injection.
/// Updates a list of simple sql objects that mock tables.
/// </summary>
/// <typeparam name="T"></typeparam>
/// <param name="rows">A list of rows to insert</param>
/// <param name="tableName">a Table name if your class isn't your table name minus s.</param>
/// <returns>bool success</returns>
public bool BulkUpdateData<T>(List<T> rows, string tableName = null)
{
var template = rows.FirstOrDefault();
string tn = tableName ?? template.GetType().Name + "s";
int updated = 0;
using (SqlConnection con = new SqlConnection(ConnectionString))
{
using (SqlCommand command = new SqlCommand("", con))
{
using (SqlBulkCopy sbc = new SqlBulkCopy(con))
{
var dt = new DataTable();
var columns = template.GetType().GetProperties();
var colNames = new List<string>();
string keyName = "";
var setStatement = new StringBuilder();
int rowNum = 0;
foreach (var row in rows)
{
dt.Rows.Add();
int colNum = 0;
foreach (var col in columns)
{
var attributes = row.GetType().GetProperty(col.Name).GetCustomAttributes(false);
bool isPrimary = IsPrimaryKey(attributes);
var value = row.GetType().GetProperty(col.Name).GetValue(row);
if (rowNum == 0)
{
colNames.Add($"{col.Name} {GetSqlDataType(col.PropertyType, isPrimary)}");
dt.Columns.Add(new DataColumn(col.Name, Nullable.GetUnderlyingType(col.PropertyType) ?? col.PropertyType));
if (!isPrimary)
{
setStatement.Append($" ME.{col.Name} = T.{col.Name},");
}
}
if (isPrimary)
{
keyName = col.Name;
if (value == null)
{
throw new Exception("Trying to update a row whose primary key is null; use insert instead.");
}
}
dt.Rows[rowNum][colNum] = value ?? DBNull.Value;
colNum++;
}
rowNum++;
}
setStatement.Length--; // trim the trailing comma
try
{
con.Open();
command.CommandText = $"CREATE TABLE [dbo].[#TmpTable]({String.Join(",", colNames)})";
//command.CommandTimeout = CmdTimeOut;
command.ExecuteNonQuery();
sbc.DestinationTableName = "[dbo].[#TmpTable]";
sbc.BulkCopyTimeout = CmdTimeOut * 3;
sbc.WriteToServer(dt);
sbc.Close();
command.CommandTimeout = CmdTimeOut * 3;
command.CommandText = $"UPDATE ME SET {setStatement} FROM {tn} as ME INNER JOIN #TmpTable AS T on ME.{keyName} = T.{keyName}; DROP TABLE #TmpTable;";
updated = command.ExecuteNonQuery();
}
catch (Exception ex)
{
if (con.State != ConnectionState.Closed)
{
sbc.Close();
con.Close();
}
//Logging the failure to SQL might not work here, so write to a local file instead.
_logger.Log($"Failed to Bulk Update to Sql: {rows.ToCSV()}", ex);
throw; // rethrow, preserving the original stack trace
}
}
}
}
return updated > 0;
}
I am trying to get data from the database using the code below. Even when there is no data in the table, execution still always reaches the statement table = ds.Tables[0];.
I am using the MySQL .NET connector, and this is a C# WinForms application.
public DataTable sales(DateTime startdate, DateTime enddate)
{
const string sql = #"SELECT memberAccTran_Source as Category, sum(memberAccTran_Value) as Value
FROM memberacctrans
WHERE memberAccTran_DateTime BETWEEN #startdate AND #enddate
GROUP BY memberAccTran_Source";
return sqlexecution(startdate, enddate, sql);
}
and the code below is the sqlexecution function:
private static DataTable sqlexecution(DateTime startdate, DateTime enddate, string sql)
{
var table = new DataTable();
using (var conn = new MySql.Data.MySqlClient.MySqlConnection(connectionstring))
{
conn.Open();
var cmd = new MySql.Data.MySqlClient.MySqlCommand(sql, conn);
var ds = new DataSet();
var parameter = new MySql.Data.MySqlClient.MySqlParameter("@startdate", MySql.Data.MySqlClient.MySqlDbType.DateTime);
parameter.Direction = ParameterDirection.Input;
parameter.Value = startdate; // pass the DateTime value directly rather than a formatted string
cmd.Parameters.Add(parameter);
var parameter2 = new MySql.Data.MySqlClient.MySqlParameter("@enddate", MySql.Data.MySqlClient.MySqlDbType.DateTime);
parameter2.Direction = ParameterDirection.Input;
parameter2.Value = enddate;
cmd.Parameters.Add(parameter2);
var da = new MySql.Data.MySqlClient.MySqlDataAdapter(cmd);
da.Fill(ds);
try
{
table = ds.Tables[0];
}
catch
{
table = null;
}
}
return table;
}
Even if there is no data, the process flow still goes to this line:
table = ds.Tables[0];
How can I avoid this? Would anyone please help with this?
If you are thinking that the catch block will execute when no rows are available, you are mistaken: once the SELECT query executes without an exception, it creates a DataTable with the columns but no rows, even when there is no data.
For this you can use the ds.Tables[0].Rows.Count property, which returns 0 when the DataTable has no rows:
if ( ds.Tables[0].Rows.Count > 0 )
table = ds.Tables[0];
else
table=null;
It returns an empty table; this is the normal behavior. If you want the table to be null, you should check the row count:
if (ds.Tables[0].Rows.Count > 0)
table = ds.Tables[0];
else
table = null;
I'm not really sure what you're asking here ... I assume you want it to skip the table = ds.Tables[0] line if there is no data?
If that's the case, a try/catch won't work, as no exception is thrown ... try something like this instead:
if(ds.Tables.Count > 0 && ds.Tables[0].Rows.Count >0)
{
table = ds.Tables[0];
}
else
{
table = null;
}