How to select a varying number of columns with ExecuteStoreQuery - c#

I'm dynamically building a select query with a varying number of columns.
select
a as col0,
b as col1,
...,
c as coln
...
The query is supposed to retrieve a matrix of integers.
When executing the query with ObjectContext.ExecuteStoreQuery, I get the right number of rows, but each row seems to be empty.
Here is my code:
var lines = context.ExecuteStoreQuery<List<int>>(queryString).AsEnumerable();
How could I make it work?
I found here that I should be using ADO.NET for this kind of thing.

Unfortunately Entity Framework 6 does not have a lot of flexibility in the internal mapping code, so it is not able to map the SQL result to your List<int> or to any other collection of primitive types.
To understand why, you need to know that internally EF6 uses a DbDataReader to read the SQL results, then builds a ColumnMap from the expected generic result type (in your case a generic List), and performs a dynamic translation of the SQL results into a result object, using the ColumnMap to know which column maps to which property of the object.
So ExecuteStoreQuery is trying to map your result columns ("col0", "col1", … "coln") to properties of the List object, and since the List class has no properties matching your SQL result column names, it cannot map them.
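To see the contrast: if the column count were fixed and known up front, ExecuteStoreQuery could map the result to a small class whose property names match the column aliases. A minimal sketch, assuming a hypothetical three-column query and a hand-written RowDto:

// Hypothetical DTO; property names must match the column aliases in the SQL.
public class RowDto
{
    public int col0 { get; set; }
    public int col1 { get; set; }
    public int col2 { get; set; }
}

// EF6 can map this because "col0".."col2" line up with RowDto's properties.
var rows = context.ExecuteStoreQuery<RowDto>(
    "select a as col0, b as col1, c as col2 from SomeTable").ToList();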
These limitations make plain ADO.NET one of the simplest options for dynamic columns. You can use a DbDataReader as in the following code:
var arr = new List<int>();
using (var md = new Context())
{
    var conn = md.Database.Connection;
    conn.Open();
    using (IDbCommand cmd = conn.CreateCommand())
    {
        cmd.CommandText = "select Col1,Col2 from Entities";
        using (var reader = (DbDataReader)cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                for (int i = 0; i < reader.FieldCount; i++)
                {
                    arr.Add(reader.GetInt32(i));
                }
            }
        }
    }
}
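Since the question asks for a matrix of integers, the same reader loop can collect one list per row instead of a single flat list. A small variation on the code above, assuming the dynamically built queryString from the question and non-null integer columns:

var matrix = new List<List<int>>();
using (var md = new Context())
{
    var conn = md.Database.Connection;
    conn.Open();
    using (var cmd = conn.CreateCommand())
    {
        cmd.CommandText = queryString; // the dynamically built select
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                var row = new List<int>();
                for (int i = 0; i < reader.FieldCount; i++)
                {
                    row.Add(reader.GetInt32(i)); // assumes every column is a non-null int
                }
                matrix.Add(row);
            }
        }
    }
}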

Related

Insert data into database table using parameters for column names

I want to insert data into a table using a stored procedure, but the table's column names arrive as parameters. I want to use those parameters as column names in the SP's insert. Do you have any idea how to insert data when the column names come in as parameters?
cmd3.Connection = conn;
cmd3.CommandType = CommandType.StoredProcedure;
cmd3.CommandText = "CustomerInfoProcedure";
cmd3.Parameters.AddWithValue("@Col0", Col0);
cmd3.Parameters.AddWithValue("@Col1", Col1);
cmd3.Parameters.AddWithValue("@Col2", Col2);
// insert query used inside the SP, but it gives an error
insert into #AgentDetails (@Col0, @Col1, @Col2)
values (@eCol0, @eCol1, @eCol2);
To my knowledge (although I haven't worked with SQL Server for a while) you can't really do that directly. If you insist on having it this way, you have a couple of (bad) options:
Use sp_executesql and build your query dynamically by concatenating relevant strings (MSDN). This approach will likely result in fairly slow queries (since they can't be optimized before they are generated) and has huge security downsides (think SQL injection).
Have a set of prepared queries inside your SP that cover all possible combinations of the input parameters. This will alleviate the performance and security concerns, but it will leave you (depending on the number of field combinations you want to support) with huge and complicated code in your SP that will be hard to maintain.
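For completeness, here is a rough C# sketch of the first (dynamic SQL) option, built in the application rather than inside the procedure. The AgentDetails table, the whitelist, and the names Col0..Col2 (column names), eCol0..eCol2 (values) and conn are assumptions taken from the question; only whitelisted column names are concatenated, while the values stay parameterized:

// Hypothetical whitelist of column names that are allowed to appear in the INSERT.
var allowedColumns = new HashSet<string> { "field0", "field1", "field2" };
string[] requestedColumns = { Col0, Col1, Col2 };

if (!requestedColumns.All(allowedColumns.Contains))
    throw new ArgumentException("Unknown column name.");

// Column names are concatenated (after the whitelist check); values remain parameters.
string sql = string.Format(
    "insert into AgentDetails ({0}) values (@v0, @v1, @v2)",
    string.Join(", ", requestedColumns));

using (var cmd = new SqlCommand(sql, conn))
{
    cmd.Parameters.AddWithValue("@v0", eCol0);
    cmd.Parameters.AddWithValue("@v1", eCol1);
    cmd.Parameters.AddWithValue("@v2", eCol2);
    cmd.ExecuteNonQuery();
}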
UPD:
After seeing your comment: in your case it will be much better to handle column reordering in the code and only have a single signature for the SP. E.g.
var value0 = Col0 == "field0" ? eCol0 : Col1 == "field0" ? eCol1 : eCol2;
var value1 = ...
cmd3.Connection = conn;
cmd3.CommandType = CommandType.StoredProcedure;
cmd3.CommandText = "CustomerInfoProcedure";
cmd3.Parameters.AddWithValue("@value0", value0);
cmd3.Parameters.AddWithValue("@value1", value1);
cmd3.Parameters.AddWithValue("@value2", value2);
// insert query used inside the SP
insert into #AgentDetails (field0, field1, field2)
values (@value0, @value1, @value2);
UPD2
If you have a large number of variables, a similar approach works as long as all the values and the column-to-field mapping are stored in appropriate data structures. For example, say you have an array of column names in the order they appear in your spreadsheet, e.g. taken from the spreadsheet header:
string[] columnsNames; // ["field1", "field2", "field10", "field0", ...]
an array of values that you need to insert, per row, in the same order as columnsNames:
string[] values; // ["value1", "value2", "value10", "value0", ...]
and an array where the column names are listed in the order they need to be in for your SP parameters:
// This list can be made into a constant, but you need
// to keep it in sync with the SP signature
string[] parameterOrder = ["field0", "field1", "field2", ...];
In this case you can use logic like this one to add your data right into the correct place:
// This dictionary will be used for field position lookups
var columnOrderDict = new Dictionary<string, int>();
for (var i = 0; i < columnsNames.Length; i++)
{
    columnOrderDict[columnsNames[i]] = i;
}
cmd3.Connection = conn;
cmd3.CommandType = CommandType.StoredProcedure;
cmd3.CommandText = "CustomerInfoProcedure";
for (var j = 0; j < parameterOrder.Length; j++)
{
    var currentFieldName = parameterOrder[j];
    if (columnOrderDict.ContainsKey(currentFieldName))
    {
        cmd3.Parameters.AddWithValue(currentFieldName, values[columnOrderDict[currentFieldName]]);
    }
    else
    {
        cmd3.Parameters.AddWithValue(currentFieldName, DBNull.Value);
    }
}
This code is built on multiple assumptions, such as the column headers in your spreadsheet exactly matching the stored procedure parameter names, but I hope it gives you enough of a hint to build your own logic.
Also, don't forget proper validation - currently the only thing this code guards against is a field needed by the SP being missing from the input data. You also need to validate the data format, check that the number of values matches the number of headers, and so on.
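A minimal example of the kind of up-front validation mentioned above, assuming the same columnsNames, values, and parameterOrder arrays:

// Basic shape check: one value per column header.
if (values.Length != columnsNames.Length)
    throw new ArgumentException("Each row must have exactly one value per column header.");

// Every header must correspond to a known stored procedure parameter.
var unknown = columnsNames.Except(parameterOrder).ToList();
if (unknown.Count > 0)
    throw new ArgumentException("Unknown columns: " + string.Join(", ", unknown));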

Get many rows from a list of IDs

I'm using C# and SQL Server. I have a list of IDs for documents which corresponds to the primary key for a table in SQL Server that has a row for each document and the row contains (among other things) the ID and the document for that ID. I want to get the document in the row for each of the IDs. Currently, I execute a query for each ID, but since there are 10,000s of them, this runs a ton of queries and takes a very long time. It ends up being faster to simply load everything from the table into memory and then filter by the ids I have, but that seems inefficient and won't scale over time. If that doesn't make sense, hopefully the following code that takes a long time to run shows what I'm trying to do.
private static Dictionary<Guid, string> foo(IEnumerable<Guid> guids, SqlConnection conn)
{
    using (SqlCommand command = new SqlCommand(null, conn))
    {
        command.CommandText = "select document from Documents where id = @id";
        SqlParameter idParam = new SqlParameter("@id", SqlDbType.UniqueIdentifier);
        command.Parameters.Add(idParam);
        command.Prepare();
        var documents = new Dictionary<Guid, string>();
        foreach (var guid in guids)
        {
            idParam.Value = guid;
            object obj = command.ExecuteScalar();
            if (obj != null)
            {
                documents[guid] = (string)obj;
            }
        }
        return documents;
    }
}
I could programmatically construct query strings with a where clause like this: ".... where id in (ID1, ID2, ID3, ..., ID100)" to get 100 documents at a time or something like that, but this feels janky and it seems to me like there has to be a better way.
I'm sure I'm not the only one to run into this. Is there an accepted way to go about this?
You can use table-valued parameters, with no limit on the number of GUIDs.
In the code you create a SqlParameter that carries all the IDs you need.
First you need to create the table type in SQL Server:
CREATE TYPE IdTableType AS TABLE
(
Id uniqueidentifier
);
Then in the code
private static Dictionary<Guid, string> foo(IEnumerable<Guid> guids, SqlConnection conn)
{
    using (SqlCommand command = new SqlCommand(null, conn))
    {
        // Use the parameter as a normal table in the query; select the id as well
        // so each document can be keyed by its Guid.
        command.CommandText =
            "select d.id, d.document from Documents d inner join @AllIds a ON d.id = a.Id";

        // A DataTable is used as the value of the table-valued parameter
        DataTable allIds = new DataTable();
        allIds.Columns.Add("Id", typeof(Guid)); // Column name must match the one in the created type

        foreach (var id in guids)
            allIds.Rows.Add(id);

        SqlParameter idParam = new SqlParameter
        {
            ParameterName = "@AllIds",
            SqlDbType = SqlDbType.Structured, // Important for table-valued parameters
            TypeName = "IdTableType",         // Important! Name of the type must be provided
            Value = allIds
        };
        command.Parameters.Add(idParam);

        var documents = new Dictionary<Guid, string>();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                documents[reader.GetGuid(0)] = reader.GetString(1);
            }
        }
        return documents;
    }
}
You don't need to prepare the command any more. Besides, after the first execution subsequent queries will reuse the same compiled query plan, because the query text remains the same.
You can bunch the ids into batches and pass each batch into the query. With Dapper this looks a bit like:
connection.Query("select document from Documents where id in @ids", new { ids = guids });
Beware though: SQL Server caps the number of parameters per query (around 2,100), so you will need to batch up your reads.
By the way, I'd highly recommend looking at Dapper or another micro-ORM for this type of data access.
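A rough sketch of the batching idea with Dapper, assuming a hypothetical DocumentRow class and a batch size comfortably under the parameter limit:

// Hypothetical result shape; property names match the selected columns.
public class DocumentRow
{
    public Guid Id { get; set; }
    public string Document { get; set; }
}

var documents = new Dictionary<Guid, string>();
const int batchSize = 1000; // stay well below the per-query parameter limit

// Split the ids into fixed-size batches and run one IN (...) query per batch.
foreach (var batch in guids.Select((id, i) => new { id, i })
                           .GroupBy(x => x.i / batchSize, x => x.id))
{
    var rows = connection.Query<DocumentRow>(
        "select id, document from Documents where id in @ids",
        new { ids = batch.ToList() });

    foreach (var row in rows)
        documents[row.Id] = row.Document;
}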

C# Casting item from SqlConnection to expose available items

I have a SQL query which I'm calling directly in code using plain ADO.NET (not Entity Framework) due to the stored procedure containing a pivot table which can potentially return a variable number of columns depending on the parameter input.
I've got to the stage where I can get a list of row data from the query and while debugging, if I drill down deep enough on the base properties, I can see that the object contains the 2 properties that I need (_fieldNameLookup and _values).
I am trying to access these values but I am not sure how I'm meant to be casting the items to expose the properties that I need.
The C# query is as follows:
using (var conn = new SqlConnection(ConfigurationManager.ConnectionStrings["Services"].ToString()))
{
    conn.Open();
    var cmd = conn.CreateCommand();
    cmd.CommandText = "[SPROCQUERY]";
    cmd.CommandType = CommandType.StoredProcedure;
    var queryData = cmd.ExecuteReader();
    var crei = new List<Items>();
    while (queryData.Read())
    {
        for (int i = 0; i < queryData.FieldCount; i++)
        {
            var x = queryData[i]; // <-- I am trying to expose the values in this object
        }
    }
    queryData.Close();
    return null;
}
IntelliSense on var x only shows generic members such as Equals, GetType, and ToString.
Thanks
queryData is your DbDataReader. Indexing it (the default property) simply returns the value at that column position as an object.
To find out the column name, use the GetName() method of queryData, and to get the type, use GetFieldType():
Type t = queryData.GetFieldType(i);
To reference the properties of your object, you need to cast it to the appropriate type.
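Putting that together, a small sketch that walks every column of the current row and records its name, type, and value, using the same queryData reader from the question:

while (queryData.Read())
{
    // One dictionary per row: column name -> value (DBNull becomes null).
    var row = new Dictionary<string, object>();
    for (int i = 0; i < queryData.FieldCount; i++)
    {
        string name = queryData.GetName(i);    // column name produced by the pivot
        Type type = queryData.GetFieldType(i); // CLR type, e.g. typeof(int)
        object value = queryData.IsDBNull(i) ? null : queryData[i];
        row[name] = value;
    }
    // row can now be added to a List<Dictionary<string, object>> or similar.
}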

Convert C# SQL Loop to Linq

I have a list called ListTypes that holds 10 types of products. The code below loops over them, calls a stored procedure for each one to get every record for that product, and stores the results in the list ListIds. This is killing my SQL box since I have over 200 users executing this constantly all day.
I know it's not good architecture to loop a SQL statement, but this is the only way I could make it work. Any ideas how I can do this without looping? Maybe a LINQ statement? I've never used LINQ at this magnitude. Thank you.
protected void GetIds(string Type, string Sub)
{
    LinkedIds.Clear();
    using (SqlConnection cs = new SqlConnection(connstr))
    {
        for (int x = 0; x < ListTypes.Count; x++)
        {
            cs.Open();
            SqlCommand select = new SqlCommand("spUI_LinkedIds", cs);
            select.CommandType = System.Data.CommandType.StoredProcedure;
            select.Parameters.AddWithValue("@Type", Type);
            select.Parameters.AddWithValue("@Sub", Sub);
            select.Parameters.AddWithValue("@TransId", ListTypes[x]);
            SqlDataReader dr = select.ExecuteReader();
            while (dr.Read())
            {
                ListIds.Add(Convert.ToInt32(dr["LinkedId"]));
            }
            cs.Close();
        }
    }
}
Not a full answer, but this wouldn't fit in a comment. You can at least update your existing code to be more efficient like this:
protected List<int> GetIds(string Type, string Sub, IEnumerable<int> types)
{
    var result = new List<int>();
    using (SqlConnection cs = new SqlConnection(connstr))
    using (SqlCommand select = new SqlCommand("spUI_LinkedIds", cs))
    {
        select.CommandType = System.Data.CommandType.StoredProcedure;
        // Don't use AddWithValue! Be explicit about your DB types.
        // I had to guess here. Replace with the actual types from your database.
        select.Parameters.Add("@Type", SqlDbType.VarChar, 10).Value = Type;
        select.Parameters.Add("@Sub", SqlDbType.VarChar, 10).Value = Sub;
        var TransID = select.Parameters.Add("@TransId", SqlDbType.Int);
        cs.Open();
        foreach (int type in types)
        {
            TransID.Value = type;
            // Dispose each reader before calling ExecuteReader on the same command again
            using (SqlDataReader dr = select.ExecuteReader())
            {
                while (dr.Read())
                {
                    result.Add((int)dr["LinkedId"]);
                }
            }
        }
    }
    return result;
}
Note that this way you only open and close the connection once. Normally in ADO.Net it's better to use a new connection and re-open it for each query. The exception is in a tight loop like this. Also, the only thing that changes inside the loop this way is the one parameter value. Finally, it's better to design methods that don't rely on other class state. This method no longer needs to know about the ListTypes and ListIds class variables, which makes it possible to (among other things) do better unit testing on the method.
Again, this isn't a full answer; it's just an incremental improvement. What you really need to do is write another stored procedure that accepts a table valued parameter, and build on the query from your existing stored procedure to JOIN with the table valued parameter, so that all of this will fit into a single SQL statement. But until you share your stored procedure code, this is about as much help as I can give you.
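If you do go the table-valued-parameter route described above, the C# side would look roughly like this, reusing the connstr, Type, Sub, types and result names from the method above. The SQL type name and the procedure name (spUI_LinkedIdsBulk) are assumptions and must match whatever you actually create in SQL Server:

// Assumes, e.g.: CREATE TYPE IntList AS TABLE (TransId int);
// and a procedure that takes @Type, @Sub and @TransIds IntList READONLY.
var transIds = new DataTable();
transIds.Columns.Add("TransId", typeof(int));
foreach (int t in types)
    transIds.Rows.Add(t);

using (var cs = new SqlConnection(connstr))
using (var select = new SqlCommand("spUI_LinkedIdsBulk", cs))
{
    select.CommandType = CommandType.StoredProcedure;
    select.Parameters.Add("@Type", SqlDbType.VarChar, 10).Value = Type;
    select.Parameters.Add("@Sub", SqlDbType.VarChar, 10).Value = Sub;
    var tvp = select.Parameters.AddWithValue("@TransIds", transIds);
    tvp.SqlDbType = SqlDbType.Structured;
    tvp.TypeName = "dbo.IntList";

    cs.Open();
    using (var dr = select.ExecuteReader())
    {
        while (dr.Read())
            result.Add((int)dr["LinkedId"]);
    }
}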
Besides the improvements others wrote about, you could insert your IDs into a temp table and then run a single query:
SELECT * from WhatEverTable WHERE transid in (select transid from #tempTable)
On MS SQL Server this is really fast.
If you're not using MS SQL Server, it's possible that one big SELECT with joins is faster than a SELECT ... IN. You have to test these cases yourself on your DBMS.
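A sketch of that temp-table idea from C#, assuming the WhatEverTable name from the snippet above and the connstr and ListTypes names from the question; the temp table lives for the lifetime of the connection:

using (var cs = new SqlConnection(connstr))
{
    cs.Open();

    using (var create = new SqlCommand("create table #tempTable (transid int)", cs))
        create.ExecuteNonQuery();

    // Fill the temp table with the ids (one cheap insert per id).
    using (var insert = new SqlCommand("insert into #tempTable (transid) values (@id)", cs))
    {
        var p = insert.Parameters.Add("@id", SqlDbType.Int);
        foreach (int id in ListTypes)
        {
            p.Value = id;
            insert.ExecuteNonQuery();
        }
    }

    using (var select = new SqlCommand(
        "SELECT * from WhatEverTable WHERE transid in (select transid from #tempTable)", cs))
    using (var dr = select.ExecuteReader())
    {
        while (dr.Read())
        {
            // read the rows you need here
        }
    }
}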
According to your comment:
The idea is lets say I have a table and I have to get all records from the table that has this 10 types of products. How can I get all of this products? But this number is dynamic.
So... why use a stored procedure at all? Why not query the table?
//If [Type] and [Sub] arguments are external inputs - as in, they come from a user request or something - they should be sanitized. (remove or escape '\' and apostrophe signs)
//create connection
string queryTmpl = "SELECT LinkedId FROM [yourTable] WHERE [TYPE] = '{0}' AND [SUB] = '{1}' AND [TRANSID] IN ({2})";
string query = string.Format(queryTmpl, Type, Sub, string.Join(", ", ListTypes));
SqlCommand select = new SqlCommand(query, cs);
//and so forth
To use Linq-to-SQL you would need to map the table to a class. This would make the query simpler to perform.
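With a LINQ to SQL (or Entity Framework) mapping in place, the whole loop collapses into one query. A rough sketch, assuming a hypothetical LinkedIds table mapped on the data context:

// Contains() on a local list translates to an IN (...) clause in the generated SQL.
var ids = db.LinkedIds
            .Where(l => l.Type == Type && l.Sub == Sub && ListTypes.Contains(l.TransId))
            .Select(l => l.LinkedId)
            .ToList();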

How can I directly execute SQL queries in linq

In C# with VS 2008, I have a query in which I join more than one table, so I don't know the type. I want to know how to directly run a SQL query in LINQ.
IEnumerable<Type> results = db.ExecuteQuery<Type>("sql query");
The query above works fine, but I want to avoid specifying the type; I want to write:
var results = db.ExecuteQuery("sql query");
Is there any way to write it?
Thanks in advance.
var result = dataContext.ExecuteQuery<JobsDto>
("Select JobID,JobName From jobs");
but make sure JobsDto has two properties, JobID and JobName, and that their types match the table columns.
P.S. DTO stands for Data Transfer Object.
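For reference, the DTO for that call is nothing more than a pair of matching properties; a minimal sketch assuming JobID is an int and JobName a string in the jobs table:

// Property names (and compatible types) must match the selected column names.
public class JobsDto
{
    public int JobID { get; set; }
    public string JobName { get; set; }
}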
You need to specify the type to map to from the query results. You can use a System.Type object instead of statically specifying it as a generic type parameter:
var results = db.ExecuteQuery(typeof(Customer), "sql query ");
If you just want a plain ADO.NET DataReader you could use the DataContext.Connection property:
using (var cmd = db.Connection.CreateCommand())
{
    cmd.CommandText = "sql query";
    var results = cmd.ExecuteReader();
}
