We have a number of stored procedures against each data layer. For example, we have an Employee table with 20 columns, and there are about seven stored procedures where this table is referenced. We have one data binding method used against all of the employee stored procedures. Every time I add a new column to the table, I have to add the column reference to all seven stored procedures (even though it is not required in all of them), which is a bit of a pain.
As we are using one data binding method, what would be the best way to make this process more efficient?
What if I add the column reference only to those stored procedures where it is required, and then check during data binding whether the column exists in the DataReader? I don't want to loop through each row and then loop through all columns to find out if a column exists; with 1000 rows and 20 columns that would be a 1000 x 20 loop, which is not very efficient.
Would it be okay to add the DataReader results to an ArrayList and then use its Contains method to find out whether a column exists in the ArrayList?
Here's an extension method I found a while back to check for column existence (I should note that it's not very efficient):
public static bool HasColumn(this IDataRecord dr, string columnName)
{
for (int i = 0; i < dr.FieldCount; i++)
{
if (dr.GetName(i).Equals(columnName, StringComparison.InvariantCultureIgnoreCase))
{
return true;
}
}
return false;
}
Perhaps you could use it on the first record and cache the results via some boolean values.
Something like the following:
public void test()
{
//DataBrokerSql is my own helper.
using (DataBrokerSql db = new DataBrokerSql(m_ConnString))
{
bool columnsChecked = false;
bool hasFirstName = false;
bool hasLastName = false;
using (DbDataReader reader = db.GetDataReader("Select * From Person"))
{
while (reader.Read())
{
//Only check for columns on the first row.
if (!columnsChecked)
{
hasFirstName = reader.HasColumn("FirstName");
hasLastName = reader.HasColumn("LastName");
columnsChecked = true;
}
if (hasFirstName)
{
//Read FirstName
var firstName = reader["FirstName"];
}
if (hasLastName)
{
//Read LastName
var lastName = reader["LastName"];
}
}
}
}
}
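As a variation on caching booleans per column, you could also collect the column names once into a HashSet before the read loop, so each per-row check is a constant-time lookup rather than a scan; a minimal sketch, reusing the reader from the example above:

// Build the set of column names once, before reading any rows.
var columns = new HashSet<string>(StringComparer.InvariantCultureIgnoreCase);
for (int i = 0; i < reader.FieldCount; i++)
{
    columns.Add(reader.GetName(i));
}
while (reader.Read())
{
    // Constant-time lookups per row instead of scanning all field names.
    if (columns.Contains("FirstName"))
    {
        var firstName = reader["FirstName"];
    }
    if (columns.Contains("LastName"))
    {
        var lastName = reader["LastName"];
    }
}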
I have this:
var productDetailsFromFile = (from row in dt.AsEnumerable()
select new ProductDetails
{
ItemNumber = row.Field<string>("Item Number"),
Cost = row.Field<string>("Cost").ToDecimal(),//custom method .ToDecimal
WHQtyList = new List<int>()
{
row.Field<string>("foo").ToInteger(),//custom method .ToInteger
row.Field<string>("bar").ToInteger(),
row.Field<string>("foo2").ToInteger(),
}
}).ToList();
It reads the info from a .csv file. What I am trying to achieve is an elegant way of checking whether the fields "foo", "bar", or "foo2" exist.
Right now the issue is that if I remove one of those columns from the CSV file, a "column not in DataTable" error pops up. I haven't been able to get this to work for two hours now.
What I am essentially after is how to check whether a column exists while I use it to initialize the list, and, if the column doesn't exist, to default the value to 0 for each row.
I did it through a method (below), but I was wondering if there is a way to do this with fewer additional lines of code, or at least fewer than I have now.
int ContainsColumn (string columnName, DataTable table, DataRow row)
{
DataColumnCollection columns = table.Columns;
if (columns.Contains(columnName))
{
return int.Parse(row.Field<string>(columnName));
}
else
{
return 0;
}
}
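One possible way to trim those extra lines is an extension method that falls back to a default when the column is missing; a minimal sketch (the helper name GetIntOrDefault is mine, not from the question):

public static class CsvRowExtensions
{
    // Returns the parsed int from the named column, or a default
    // when the column does not exist or the value cannot be parsed.
    public static int GetIntOrDefault(this DataRow row, string columnName, int defaultValue = 0)
    {
        if (!row.Table.Columns.Contains(columnName))
        {
            return defaultValue;
        }
        int value;
        return int.TryParse(row.Field<string>(columnName), out value) ? value : defaultValue;
    }
}

The projection could then read, for example, row.GetIntOrDefault("foo") in place of the helper call.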
I have a .csv file with around 200 columns and the order of columns changes all the time. I want to read each row from the file, identify the corresponding column names in the database, and write data to the table accordingly.
For this I could use a simple switch-case checking the column name, but since there are 200 columns, I'm wondering whether there is another way to do it.
Example:
public void ColName(string str, Type a)
{
SampleTableName obj = new SampleTableName();
obj."str" = a;
connection.AddSampleTableName(obj);
connection.savechanges();
}
/* SampleTableName has columns: [Name, Age] */
ColName("Name","XYZ");
Output:
Name Age
XYZ NULL
Any ideas please? Thanks.
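For clarity, the obj."str" = a line above is pseudocode for setting a property whose name is only known at run time; in C# that would need reflection. A minimal sketch, with the second parameter changed to object since a value is being assigned (SampleTableName and the connection calls are taken from the question):

public void ColName(string propertyName, object value)
{
    SampleTableName obj = new SampleTableName();
    // Look up the property by name at run time and assign the value.
    var property = typeof(SampleTableName).GetProperty(propertyName);
    if (property != null)
    {
        property.SetValue(obj, value, null);
    }
    connection.AddSampleTableName(obj);
    connection.savechanges();
}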
If the column names are the same, you can use SqlBulkCopy and add a list of column mappings. The column order doesn't matter, as long as the destination table name is set and the mappings are done by name.
DataTable table = CreateTable(rows);
using (var bulkCopy = new SqlBulkCopy(connectionString))
{
foreach (var col in table.Columns.OfType<DataColumn>())
{
bulkCopy.ColumnMappings.Add(
new SqlBulkCopyColumnMapping(col.ColumnName, col.ColumnName));
}
bulkCopy.BulkCopyTimeout = 600; // in seconds
bulkCopy.DestinationTableName = "<tableName>";
bulkCopy.WriteToServer(table);
}
If the column names are not the same, a dictionary to look up the different names could be used, as sketched below.
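A minimal sketch of that dictionary lookup, reusing the table and connectionString from the snippet above (the source/destination names in the map are made up for illustration):

// Maps source (CSV/DataTable) column names to destination (database) column names.
var columnNameMap = new Dictionary<string, string>
{
    { "Item Number", "ItemNumber" },
    { "Cost", "UnitCost" }
};
using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    foreach (var col in table.Columns.OfType<DataColumn>())
    {
        // Fall back to the same name when no mapping is defined.
        string destination;
        if (!columnNameMap.TryGetValue(col.ColumnName, out destination))
        {
            destination = col.ColumnName;
        }
        bulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping(col.ColumnName, destination));
    }
    bulkCopy.DestinationTableName = "<tableName>";
    bulkCopy.WriteToServer(table);
}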
To keep it simple for maintenance purposes, I went with a switch case (sigh). However, I wrote a small script to add all those field values to the table object.
This is my code right now:
private static MySqlConnection conn = null;
private static MySqlDataAdapter AccountsDa = null;
private static MySqlCommandBuilder AccountsCb = null;
AccountsDa = new MySqlDataAdapter("SELECT * FROM accounts", conn);
AccountsCb = new MySqlCommandBuilder(AccountsDa);
Accounts = new DataTable();
AccountsDa.Fill(Accounts);
I'm trying to figure out how to define the column default values without having to do it by hand. If I do it like this:
DataColumn col = new DataColumn();
col.ColumnName = "id";
col.AllowDBNull = false;
col.DataType = System.Type.GetType("System.Int32");
col.DefaultValue = 0;
Accounts.Columns.Add(col);
for every column, it works fine, but how do I have it automatically set the default values from the database when the table is filled? I'm hoping I don't have to define 30 columns by hand.
I tried AccountsDa.FillSchema(Accounts, SchemaType.Source), which sets up the allow-nulls and auto-increments but not the default values.
The problem arises when adding a row to the data table later: sometimes I only need to set the value for one column and let the rest of the columns fall back to their default values.
I could write 180 lines of code to manually define the default values for inserting rows, but there has to be a way to grab that from the database when creating/filling the data table.
I'm using in-memory data tables because there are times when data will only exist for, say, 2 minutes and then be deleted again, as this is a dedicated server for an online RTS game. To save hits on the database, I manipulate the data tables in memory and flush them every 10 minutes, so that I only have 1,000 hits to the database every 10 minutes instead of possibly 40,000.
Well, according to the MSDN gurus (after finally getting a response on their forums), it's not possible to get the default values. All you can do is load whether a value is allowed to be null and whether a column is auto-increment; even then you still have to set the auto-increment seed and step yourself, because it doesn't get those from the database either. They did, however, give a shorthand version that cuts it down to 30 lines of code instead of 180.
After calling FillSchema and then filling the data table, you can simply do the following, which cuts it down to one line instead of six:
Cities.Columns["wood"].DefaultValue = 0;
After a few replies, there is an even easier way to do this. It's not the way I wanted, but maybe it will help someone else down the same road: instead of one line for each column, this does them all in three lines:
foreach (DataColumn col in Cities.Columns) {
if (col.ColumnName != "id") col.DefaultValue = 0;
}
id is the primary key and can't have a default value.
So I was trying to do something similar to what you did (except I have no idea how to get the auto-increment information). I got the idea from https://stackoverflow.com/a/12731310/222897:
private void AssignMandatoryColumns([NotNull] DataTable structure, string tableName)
{
// find schema
string[] restrictions = new string[4]; // Catalog, Owner, Table, Column
restrictions[2] = tableName;
DataTable schemaTable = _dbCon.GetSchema("Columns", restrictions);
if (schemaTable == null) return;
// set values for columns
foreach (DataRow row in schemaTable.Rows)
{
string columnName = row["COLUMN_NAME"].ToString();
if (!structure.Columns.Contains(columnName)) continue;
if (row["IS_NULLABLE"].ToString() == "NO") structure.Columns[columnName].AllowDBNull = false;
//if (structure.Columns[columnName].AutoIncrement) continue; // there can be no default value
var valueType = row["DATA_TYPE"];
var defaultValue = row["COLUMN_DEFAULT"];
try
{
structure.Columns[columnName].DefaultValue = defaultValue;
if (!structure.Columns[columnName].AllowDBNull && structure.Columns[columnName].DefaultValue is DBNull)
{
Logger.DebugLog("Database column {0} is not allowed to be null, yet there is no default value.", columnName);
}
}
catch (Exception exception)
{
if (structure.Columns[columnName].AllowDBNull) continue; // defaultvalue is irrelevant since value is allowed to be null
Logger.LogWithoutTrace(exception, string.Format("Setting DefaultValue for {0} of type {1} {4} to {2} ({3}).", columnName, valueType, defaultValue, defaultValue.GetType(), structure.Columns[columnName].AllowDBNull ? "NULL" : "NOT NULL"));
}
}
}
The function takes the DataTable you want to set the values for (I get mine by querying the DB) and the name of the table.
For some reason the timestamp and date columns don't like their default value no matter what I do.
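For reference, a call to the function above might look roughly like this; the Accounts/AccountsDa names are borrowed from the earlier MySQL snippet purely as an illustration:

AccountsDa.FillSchema(Accounts, SchemaType.Source);   // nullability and auto-increment flags
AccountsDa.Fill(Accounts);
AssignMandatoryColumns(Accounts, "accounts");          // default values read from the schema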
I use a telerik:RadComboBox, like this:
<telerik:RadComboBox runat="server" ID="RadComboBox1" EnableLoadOnDemand="true"
ShowMoreResultsBox="true" EnableVirtualScrolling="true" CollapseDelay="0" Culture="ar-EG" ExpandDelay="0" Filter="StartsWith" ItemsPerRequest="100"
MarkFirstMatch="true" Skin="Outlook" ValidationGroup="L" Width="202px" EnableAutomaticLoadOnDemand="True"
EmptyMessage="-Enter user name-"
EnableItemCaching="true" >
<WebServiceSettings Path="../WebService/Employees.asmx" Method="LoadData" />
</telerik:RadComboBox>
and my web service:
[System.Web.Script.Services.ScriptService]
public class Employees : System.Web.Services.WebService
{
[WebMethod(EnableSession = true)]
public RadComboBoxData LoadData(RadComboBoxContext context)
{
RadComboBoxData result = new RadComboBoxData();
DataTable dt = FollowsDAL.GetAllEmployees();
var allEmployees = from r in dt.AsEnumerable()
orderby r.Field<string>("name")
select new RadComboBoxItemData
{
Text = r.Field<string>("name").ToString().TrimEnd()
};
string text = context.Text;
if (!String.IsNullOrEmpty(text))
{
allEmployees = allEmployees.Where(item => item.Text.StartsWith(text));
}
//Perform the paging:
// - first skip the items already populated
// - then take the next batch (100 items here)
int numberOfItems = context.NumberOfItems;
var employees = allEmployees.Skip(numberOfItems).Take(100);
result.Items = employees.ToArray();
int endOffset = numberOfItems + employees.Count();
int totalCount = allEmployees.Count();
//Check if all items are populated (this is the last page)
if (endOffset == totalCount)
result.EndOfItems = true;
//Initialize the status message
result.Message = String.Format("Items <b>1</b>-<b>{0}</b> out of <b>{1}</b>",
endOffset, totalCount);
return result;
    }
}
My problem is: although this control is very fast, every time I enter a specific name it first fetches all 20,000 employees into the DataTable dt, and it does this with every character typed.
My questions are:
How is it this fast despite this wasteful behavior?
Is there a way to fetch all the employees only once?
How can I improve the performance?
It is always better to use server-side filtering, because you do not need to retrieve 20,000 records to the web server just to return 10 or 20 items.
http://demos.telerik.com/aspnet-ajax/combobox/examples/populatingwithdata/autocompletesql/defaultcs.aspx
Your DAL should have a method to filter the results based on the text that is sent; you then add those items to the combo box. My DAL is Telerik OpenAccess ORM (Linq2SQL), but you could also write a stored procedure to filter the results.
Here is an example of one of my asmx services that populates a radcombobox:
[WebMethod]
public RadComboBoxData FindEmployee(RadComboBoxContext context)
{
RadComboBoxData comboData = new RadComboBoxData();
using (DataBaseContext dbc = new DataBaseContext())
{
IQueryable<Employee> Employees = dbc.FindEmployee(context.Text);
int itemOffset = context.NumberOfItems;
int endOffset = Math.Min(itemOffset + 10, Employees.Count());
List<RadComboBoxItemData> result = new List<RadComboBoxItemData>();
var AddingEmployees = Employees.Skip(itemOffset).Take(endOffset - itemOffset);
foreach (var Employee in AddingEmployees)
{
RadComboBoxItemData itemData = new RadComboBoxItemData();
itemData.Text = Employee.Person.FullName;
itemData.Value = Employee.EmployeeID.ToString();
result.Add(itemData);
}
comboData.EndOfItems = endOffset == Employees.Count();
comboData.Items = result.ToArray();
if (Employees.Count() <= 0)
comboData.Message = "No matches";
else
comboData.Message = String.Format("Items <b>1</b>-<b>{0}</b> out of <b>{1}</b>", endOffset, Employees.Count());
return comboData;
}
}
and in case you are wondering what my FindEmployee method is:
public IQueryable<Employee> FindEmployee(string SearchString, bool IncludeInactive = false)
{
return from e in this.Employees
where
(e.EmployeeID.ToString() == SearchString ||
e.Person.FirstName.Contains(SearchString) ||
e.Person.MiddleName.Contains(SearchString) ||
e.Person.LastName.Contains(SearchString) ||
(e.Person.FirstName + " " + e.Person.LastName).Contains(SearchString) ||
(e.Person.FirstName + " " + e.Person.MiddleName).Contains(SearchString) ||
(e.Person.FirstName + " " + e.Person.MiddleName + " " + e.Person.LastName).Contains(SearchString)) &&
((e.Inactive == false || e.Inactive == null) && IncludeInactive == false)
select e;
}
In my understanding, sending requests to the database over and over again for the same purpose is not good for the application's health.
There are basically two ways to make the process fast:
Bring the data from your database as a DataTable.
Bring the data from your database as a DataSet.
DataTable Approach
Fetch all the records from the database during your form load and preserve them in ViewState, not in Session (please take care of this point). Then access the ViewState, cast it back to a DataTable, and use the function below:
public static class GetFilteredData
{
public static DataTable FilterDataTable(this DataTable Dt, string FilterExpression)
{
using (DataView Dv = new DataView(Dt))
{
Dv.RowFilter = FilterExpression;
return Dv.ToTable();
}
}
}
DataTableObject.FilterDataTable("Search Expression or your string variable")
This returns the filtered DataTable. Reassign it to the control without any database trips, and execute this step whenever you have to filter the records.
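A small usage sketch, assuming the Employees property shown further below and the "name" column from the question (the filter string uses the DataView RowFilter expression syntax):

// Filter the preserved table on the typed text; no database trip involved.
DataTable filtered = Employees.FilterDataTable("name LIKE 'ab%'");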
DataSet Approach
This approach returns 26 DataTables from your database. I know it looks very heavy, but since you have already mentioned that the total will be about 25,000 records, all of them get divided among these tables. Here is the explanation.
The combo box's text column can start with 26 different characters, and you divide the records by that starting character: records starting with A go into the first table, records starting with B into the second, records starting with C into the third, and so on, up to records starting with Z in the 26th table.
Please note that your original query first inserts all records into a local temporary table; that temporary table is then read by 26 SELECT statements, one per starting character.
Below is the Sample Stored Proc.
Create Proc ProcName
As
Create Table #Temp
(
ColumnName Varchar(50)
)
Insert into #Temp(ColumnName)
Select ColumnName from YourTableName
Select ColumnName From #Temp Where ColumnName like 'a%'
Select ColumnName From #Temp Where ColumnName like 'b%'
Select ColumnName From #Temp Where ColumnName like 'c%'
--UpTo Z
Finally, you have 26 tables, and the data is returned as a DataSet from your BLL. Preserve it in ViewState only. When filtering the data, use the function below:
public static class GetFilteredData
{
public static DataTable FilterDataTable(this DataSet Dt, string FilterExpression)
{
string Lowercase = FilterExpression.ToLower();
Int16 TableID = 0;
if (Lowercase.StartsWith("a"))
{
TableID = 0;
}
else if (Lowercase.StartsWith("b"))
{
TableID = 1;
}
else if (Lowercase.StartsWith("c"))
{
TableID = 2;
}
//upTo Z
using (DataView Dv = new DataView(Dt.Tables[TableID]))
{
Dv.RowFilter = FilterExpression;
return Dv.ToTable();
}
}
}
What we gain from the DataSet technique is that the records are divided into smaller sub-tables, so your search expression is applied to a single slice of the DataSet rather than to the whole original set.
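As a side note, the 26-branch if/else chain in the function above could be collapsed by computing the index from the first character; a minimal sketch, assuming the filter text is non-empty and starts with an ASCII letter:

char first = char.ToLowerInvariant(FilterExpression[0]);
// 'a' maps to table 0, 'b' to table 1, ..., 'z' to table 25; anything else falls back to table 0.
int tableIndex = (first >= 'a' && first <= 'z') ? first - 'a' : 0;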
Code modifications for the original question
Add the following to your web application/website:
public static class GetFilteredData
{
public static DataTable FilterDataTable(this DataTable Dt, string FilterExpression)
{
using (DataView Dv = new DataView(Dt))
{
Dv.RowFilter = FilterExpression;
return Dv.ToTable();
}
}
}
Add the following property to the WebForm itself. It returns the result set from the database when ViewState is empty; otherwise it returns the data preserved in ViewState.
public DataTable Employees
{
get
{
if (ViewState["Employees"] == null)
{
return FollowsDAL.GetAllEmployees();
}
return (DataTable)ViewState["Employees"];
}
set
{
ViewState["Employees"] = value;
}
}
Now you can access this ViewState in the WebForm where you have the combo box control. In my understanding, you should go for the DataSet approach.
Please note that a web service is not required in this context.
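A wiring sketch of how the pieces above might fit together in the page; the event hookup and control ID are assumptions about how the load-on-demand callback is handled once the web service is dropped:

protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        // One database trip; afterwards the data lives in ViewState.
        Employees = FollowsDAL.GetAllEmployees();
    }
}

protected void RadComboBox1_ItemsRequested(object sender, RadComboBoxItemsRequestedEventArgs e)
{
    // Filter the preserved data on the typed text and rebind, with no database trip.
    DataTable filtered = Employees.FilterDataTable(
        string.Format("name LIKE '{0}%'", e.Text.Replace("'", "''")));
    RadComboBox1.DataSource = filtered;
    RadComboBox1.DataTextField = "name";
    RadComboBox1.DataBind();
}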
I would create a method that loads the values from your database and stores them in the cache; subsequent calls to this method then return the cached version. Then set the DataSource from this method. That should give you a very nice performance boost.
http://msdn.microsoft.com/en-us/library/system.web.caching.cache.aspx
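A minimal sketch of that idea using the ASP.NET cache; the cache key, the 10-minute lifetime, and the FollowsDAL call (taken from the question) are assumptions:

private DataTable GetAllEmployeesCached()
{
    // Return the cached table if present; otherwise load it once and cache it.
    DataTable employees = HttpRuntime.Cache["AllEmployees"] as DataTable;
    if (employees == null)
    {
        employees = FollowsDAL.GetAllEmployees();
        HttpRuntime.Cache.Insert(
            "AllEmployees",
            employees,
            null,                              // no cache dependency
            DateTime.UtcNow.AddMinutes(10),    // absolute expiration
            System.Web.Caching.Cache.NoSlidingExpiration);
    }
    return employees;
}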
I think your solution should be a mix of the answers by #PraVn and #nurgent: write a stored procedure that filters records by the search string, and have your DAL call this SP from a method that is in turn called from your existing web method public RadComboBoxData LoadData(RadComboBoxContext context).
I have a DataTable dt with two columns. The first column (call it CustomerId) is unique and doesn't allow nulls; the second one allows nulls and is not unique.
From a method I get a CustomerId, and I would like to either insert a new record if this CustomerId doesn't exist, or increment the value in the second column by 1 for that CustomerId if it does exist.
I'm not sure how to approach this. I wrote a select statement (which returns System.Data.DataRow objects), but I don't know how to test whether it returned anything.
Currently I have:
//I want to insert a new row
if (dt.Select("CustomerId ='" + customerId + "'") == null) //Always true :|
{
DataRow dr = dt.NewRow();
dr["CustomerId"] = customerId;
}
If the DataTable is being populated from a database, I would recommend making CustomerId an identity column. That way, when you add a new row, it will automatically get a new CustomerId which is unique and 1 greater than the previous id (depending on how you set up the identity column).
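If you want the DataTable itself to generate the id, the in-memory counterpart of an identity column is an auto-increment column; a minimal sketch, assuming CustomerId is an integer column:

DataColumn idColumn = dt.Columns["CustomerId"];
idColumn.AutoIncrement = true;       // the table generates the value when NewRow() is called
idColumn.AutoIncrementSeed = 1;      // first generated value
idColumn.AutoIncrementStep = 1;      // increment between generated values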
I would check the row count returned from the Select statement, and I would also use string.Format. So it would look like this:
var selectStatement = string.Format("CustomerId = {0}", customerId);
var rows = dt.Select(selectStatement);
if (rows.Length < 1)
{
    var dr = dt.NewRow();
    dr["CustomerId"] = customerId;
    dt.Rows.Add(dr); // don't forget to add the new row to the table
}
This is my method for solving a similar problem. You can modify it to fit your needs.
public static bool ImportRowIfNotExists(DataTable dataTable, DataRow dataRow, string keyColumnName)
{
string selectStatement = string.Format("{0} = '{1}'", keyColumnName, dataRow[keyColumnName]);
DataRow[] rows = dataTable.Select(selectStatement);
if (rows.Length == 0)
{
dataTable.ImportRow(dataRow);
return true;
}
else
{
return false;
}
}
The Select Method returns an array of DataRow objects. Just check if its length is zero (it's never null).
By the way, don't build such filter strings directly from user input as in this example. There's a technique for breaching your code's security called "SQL injection"; I encourage you to read the Wikipedia article. In brief, an experienced user could craft input that gets executed by your database and potentially do harmful things if you're taking customerId from the user as a string. I'm not experienced in database programming; this is just "general knowledge"...