How to get a DataSet with a specific value in ASP.NET? - C#

I have a DataSet with several columns. One column, named Is_Deleted, holds a bool value.
I am retrieving the full DataSet from SQL Server using ADO.NET code in a WCF service.
Now I want all the rows in the DataSet whose Is_Deleted column value is false.
Initially I was getting the desired result in the stored procedure itself, where I was selecting rows with Is_Deleted = false.
I want this same filtering to be done in C#.
So please tell me how I can proceed.

If you want to filter the DataSet/DataTable you can use LINQ to DataSet:

var nonDeletedRows = ds.Tables[0].AsEnumerable()
                       .Where(row => !row.Field<bool>("Is_Deleted"));

If you need to persist this query you could create a new DataTable via CopyToDataTable:

DataTable tableWithNonDeletedRows = nonDeletedRows.CopyToDataTable();

If the DataSet is strongly typed you can use the auto-generated column directly.
For example (assuming the table name is "TableName"):

tableWithNonDeletedRows = ds.TableName.Where(r => !r.Is_Deleted).CopyToDataTable();
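One caveat worth adding (an assumption about your data, not part of the original answer; it needs using System.Linq; and a reference to System.Data.DataSetExtensions): CopyToDataTable throws an InvalidOperationException when the sequence is empty, so if every row might be deleted, guard for it:

// Guard against an empty result; CopyToDataTable() throws on an empty sequence.
DataTable result = nonDeletedRows.Any()
    ? nonDeletedRows.CopyToDataTable()
    : ds.Tables[0].Clone(); // same schema, zero rows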

Related

User defined table type with dynamic columns in SQL Server

In my .NET (C#) WinForms application, I import an Excel file that contains a varying number of columns. I store the data in a DataTable in code, and I need to pass that DataTable to a SQL Server stored procedure for further processing.
Is there a proper way to pass my DataTable to a SQL Server stored procedure, and can a user-defined table type have a dynamic number of columns?
Is it possible to create a table type with a dynamic number of columns?

No, it's not possible.
My suggestion would be to write the data from the DataTable directly into a SQL Server table like this, and then use that table in your procedure.
using (var bulkCopy = new SqlBulkCopy(_connection.ConnectionString, SqlBulkCopyOptions.KeepIdentity))
{
    // My DataTable column names match my SQL column names, so I simply map
    // them one-to-one. If your names differ, pass the matching SQL column
    // name for each DataTable column in ColumnMappings.
    foreach (DataColumn col in table.Columns)
    {
        bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
    }
    bulkCopy.BulkCopyTimeout = 600;
    bulkCopy.DestinationTableName = destinationTableName;
    bulkCopy.WriteToServer(table);
}
Note: the DataTable column names should match your SQL table column names for the mapping above to work properly.
First of all, you cannot create a table type with a dynamic number of columns.
You can use JSON to store all your borrowers in a single column; you really don't need to create multiple columns for them. Simply create an NVARCHAR(MAX) column in the table, build a JSON document of all the borrowers, and store it there.
When retrieving data from the table, deserialize the JSON to get the original form.
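To make that concrete, here is a minimal sketch of the round trip using System.Text.Json (Newtonsoft.Json works the same way); the Borrower type and its properties are invented for illustration:

using System.Collections.Generic;
using System.Text.Json;

// Hypothetical model; substitute your actual borrower fields.
public class Borrower
{
    public string Name { get; set; }
    public decimal Amount { get; set; }
}

// Serialize all borrowers into one string for the NVARCHAR(MAX) column.
List<Borrower> borrowers = new List<Borrower>
{
    new Borrower { Name = "Alice", Amount = 1000m },
    new Borrower { Name = "Bob", Amount = 2500m }
};
string json = JsonSerializer.Serialize(borrowers);

// When reading the row back, deserialize to recover the original form.
List<Borrower> restored = JsonSerializer.Deserialize<List<Borrower>>(json);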

Most Efficient way of searching a List of values in an SQL Server table

I am working on a C# WinForms application which is connected to a SQL Server database. I have to parse data from an Excel sheet and search for each value in all columns of a table, then display the rows of the table whose values matched.
What I am doing now:
I have parsed the whole Excel sheet into a DataTable.
I have a stored procedure in SQL Server which takes a string input, searches for it in all columns of the table, and returns a row if any matched.
I pass each value of the DataTable (extracted from the Excel sheet) to the stored procedure for searching.
Kindly tell me whether this is an efficient way, or give me suggestions for doing it more efficiently.
You can use a table-valued parameter, and then in SQL you can use a cursor or loop over the passed table and search each column of the SQL table. Here is a simple example.
In the database, create a new type:
CREATE TYPE [dbo].[SearchInDB] AS TABLE(
    [Id] [int] NOT NULL
)
You will pass this type to the SP from C# code. Your SP will receive it like this:
ALTER PROCEDURE [dbo].[YourSPNameHere]
    @DataToSearch dbo.SearchInDB READONLY
AS
BEGIN
    -- Your SP logic here
END
In your code you will create a DataTable, fill it with values, and pass it to the SP as a parameter like this:
DataTable dt = new DataTable();
dt.Columns.Add("Id", typeof(int));

// Add rows with values to the DataTable
DataRow dr = dt.NewRow();
dr["Id"] = 10;
dt.Rows.Add(dr);

// Now pass this table to the SP as a parameter
SqlParameter parameter = new SqlParameter();
parameter.ParameterName = "@DataToSearch";
parameter.SqlDbType = System.Data.SqlDbType.Structured;
parameter.Value = dt;
Note:
I have added only one column; you will have to add other columns if needed. To read the values from the parameter passed to the SP, you will have to use a loop or a cursor.
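For completeness, here is a rough sketch of attaching that parameter to a SqlCommand and executing the procedure; the connection string and result handling are placeholders, and it assumes using System.Data; and using System.Data.SqlClient;. If you ever pass the TVP to ad-hoc SQL instead of a stored procedure, you must also set parameter.TypeName = "dbo.SearchInDB";.

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("dbo.YourSPNameHere", connection))
{
    command.CommandType = CommandType.StoredProcedure;

    // Attach the DataTable built above as the structured parameter.
    SqlParameter parameter = command.Parameters.AddWithValue("@DataToSearch", dt);
    parameter.SqlDbType = SqlDbType.Structured;

    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // Process each matched row here.
        }
    }
}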
You can use SQL Server SSIS packages to import the Excel file directly into a database table, and then search that table using a loop or cursor in a stored procedure.

Delete rows from DataTable at once without loop where a column has Null value

I'm populating a DataTable object from an Excel worksheet. I'm not counting on the user entering data correctly, and I'd like to delete rows that have a null value in column A. I've searched around quite a bit, and it looks like everyone is doing this with a for loop. Here's the problem: 757,000 rows. If the data is not formatted properly, the DataTable gets filled up to Excel's maximum of 1,048,575 rows. I don't need to check each value individually. In SQL Server you can write:
DELETE table
WHERE columnA IS NULL
I considered excluding nulls on the DataTable fill, but I couldn't get this to work reliably as I don't know the name of column A until it's in the DataTable and sometimes the column name has a \n character in the middle which throws syntax errors.
How can I accomplish the same sort of thing without a loop?
How about creating a new, clean DataTable from your first one?
DataTable t = new DataTable(); // actually your existing table with the bad records
DataTable newTable = t.Select().Where(x => !x.IsNull(0)).CopyToDataTable();
Just use LINQ (this needs a reference to System.Data.DataSetExtensions; t is your DataTable):

var newData = (from d in t.AsEnumerable()
               where !d.IsNull("columnA")
               select d).CopyToDataTable();

Then just use your newData variable, which holds your newly populated DataTable.
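If you'd rather avoid LINQ, a DataView row filter achieves the same thing without an explicit loop. A sketch, assuming t is the populated DataTable: it resolves the name of column A at runtime (since you may not know it up front) and escapes it for the filter expression, though a column name with an embedded newline is worth testing:

// Resolve the name of column A at runtime and escape it for the filter.
string colA = t.Columns[0].ColumnName;
string escaped = colA.Replace("\\", "\\\\").Replace("]", "\\]");

// Keep only the rows where column A is not null, then materialize them.
DataView view = new DataView(t) { RowFilter = "[" + escaped + "] IS NOT NULL" };
DataTable newTable = view.ToTable();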

How to retrieve server generated Identity values when using SqlBulkCopy

I know I can do a bulk insert into my table with an identity column by not specifying the SqlBulkCopyOptions.KeepIdentity as mentioned here.
What I would like to be able to do is get the identity values that the server generates and put them in my datatable, or even a list. I saw this post, but I want my code to be general, and I can't have a version column in all my tables. Any suggestions are much appreciated. Here is my code:
public void BulkInsert(DataTable dataTable, string DestinationTbl, int batchSize)
{
    // Get the DataTable
    DataTable dtInsertRows = dataTable;
    using (SqlBulkCopy sbc = new SqlBulkCopy(sConnectStr))
    {
        sbc.DestinationTableName = DestinationTbl;
        // Number of records to be processed in one go
        sbc.BatchSize = batchSize;
        // Add your column mappings here
        foreach (DataColumn dCol in dtInsertRows.Columns)
        {
            sbc.ColumnMappings.Add(dCol.ColumnName, dCol.ColumnName);
        }
        // Finally write to server
        sbc.WriteToServer(dtInsertRows);
    }
}
AFAIK, you can't.
The only way (that I know of) to get the value(s) of the identity field is by using either SCOPE_IDENTITY() when you insert row-by-row, or by using the OUTPUT approach when inserting an entire set.
The 'simplest' approach probably would be to SqlBulkCopy the records into the table and then fetch them back again later on. The problem is that it could be hard to properly (and quickly) fetch those rows from the server again (e.g. it would be rather ugly and slow to have a WHERE clause with IN (guid1, guid2, .., guid999998, guid999999) =)
I'm assuming performance is an issue here, as you're already using SqlBulkCopy, so I'd suggest going for the OUTPUT approach. In that case you'll first need a staging table to SqlBulkCopy your records into. The table should include some kind of batch identifier (GUID?) to allow multiple threads to run side by side. You'll need a stored procedure to INSERT <table> OUTPUT inserted.* SELECT the data from the staging table into the actual destination table, and also clean up the staging table again. The recordset returned by that procedure will match 1:1 with the original dataset responsible for filling the staging table, but of course you should NOT rely on its order. In other words: your next challenge will be matching the returned identity fields back to the original records in your application.
Thinking things over, I'd say that in all cases -- except the row-by-row & SCOPE_IDENTITY() approach, which is going to be dog-slow -- you'll need to have (or add) a 'key' to your data to link the generated ids back to the original data =/
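To sketch what that staging-table / OUTPUT flow might look like (all table and column names here are invented for illustration; the staging table dbo.Staging_MyTable is assumed to mirror the destination minus the identity column, plus a BatchId uniqueidentifier column):

// 1) Bulk copy into the staging table, tagging every row with a batch id.
Guid batchId = Guid.NewGuid();
dataTable.Columns.Add("BatchId", typeof(Guid));
foreach (DataRow row in dataTable.Rows)
    row["BatchId"] = batchId;

using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.Staging_MyTable";
    foreach (DataColumn col in dataTable.Columns)
        bulk.ColumnMappings.Add(col.ColumnName, col.ColumnName); // map by name, not ordinal
    bulk.WriteToServer(dataTable);
}

// 2) Move the batch into the real table, capturing the generated ids via
//    OUTPUT, and clean the staging rows up again.
const string moveSql = @"
    INSERT INTO dbo.MyTable (ColA, ColB)
    OUTPUT inserted.Id, inserted.ColA, inserted.ColB
    SELECT ColA, ColB FROM dbo.Staging_MyTable WHERE BatchId = @batch;
    DELETE FROM dbo.Staging_MyTable WHERE BatchId = @batch;";

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(moveSql, connection))
{
    command.Parameters.AddWithValue("@batch", batchId);
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            int generatedId = reader.GetInt32(0);
            // Match this row back to the original data here, e.g. on a
            // natural key -- as noted above, do not rely on the order.
        }
    }
}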
You can take a similar approach to the one described above by deroby, but instead of retrieving the rows back via a WHERE IN (guid1, etc...), you match them back up to the rows inserted in memory based on their order.
So I would suggest adding a column to the table to match the rows to a SqlBulkCopy transaction, and then doing the following to match the generated ids back to the in-memory collection of rows you just inserted:
Create a new Guid and set this value on all the rows in the bulk copy mapping to the new column.
Run the WriteToServer method of the BulkCopy object.
Retrieve all the rows that have that same key.
Iterate through this list, which will be in the order they were added; these will be in the same order as the in-memory collection of rows, so you will then know the generated id for each item.
This will give you better performance than giving each individual row a unique key. So after you bulk insert the data table you could do something like this (in my example I will have a list of objects from which I create the data table and then map the generated ids back to them):
List<myObject> myCollection = new List<myObject>();
Guid identifierKey = Guid.NewGuid();

// Do your bulk insert where all the inserted rows have identifierKey set
// on the new column. In this example you would create the data table
// based off the myCollection list.

// Identifier is a column specifically for matching a group of rows to a
// SQL bulk copy command.
var myAddedRows = myDbContext.DatastoreRows.AsNoTracking()
                  .Where(d => d.Identifier == identifierKey)
                  .ToList();

for (int i = 0; i < myAddedRows.Count; i++)
{
    var savedRow = myAddedRows[i];
    var inMemoryRow = myCollection[i];
    int generatedId = savedRow.Id;

    // Now that you know the generated id for the in-memory object, you
    // could set a property on it to store the value.
    inMemoryRow.GeneratedId = generatedId;
}

How can I deal with NULL values in data table turning into missing XML elements when inserting in SQL Server?

I want to insert data using XML in SQL Server 2005. I got a DataTable from the back end and passed it as follows:

DataSet dsItem = new DataSet();
DTItem.TableName = "ItemDetails"; // DTItem is the DataTable I got from the back end
dsItem.Tables.Add(DTItem);
My problem is that if any column contains a null value, the XML omits that column. For example, consider this DataTable:

JobOrderID   CustomerID
----------   ----------
4

Here CustomerID is null. When I attach a trace and inspect it, the DataTable shows empty instead of null.
So when I pass the DataTable, the XML does not include the null column. It comes out as follows:
<NewDataSet>
  <ItemDetails>
    <JobOrderID>4</JobOrderID>
  </ItemDetails>
</NewDataSet>
It does not include CustomerID, so the insert does not work. Why does the DataTable not show the null values in the column fields, and how can I pass the DataTable as XML when it contains a null value?
Any suggestions are appreciated.
I solved this problem by assigning -1 instead of null. I know this is not the right solution, but it worked for me.
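For reference, the workaround amounts to replacing DBNull with a sentinel before serializing, roughly like this (the column name comes from the question; -1 is only safe if it can never collide with a real CustomerID):

// Replace DBNull with a -1 sentinel so the element is emitted in the XML.
foreach (DataRow row in DTItem.Rows)
{
    if (row.IsNull("CustomerID"))
        row["CustomerID"] = -1;
}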
