C#: Passing parameters to a function via a 2D object array

I'm trying to write a neat little generic Sql method that will accept a SQL Query and a list of parameters, and return a result. I want to keep it neat enough that I can call it using one line from any other code.
Is there any really awesomely neat way of doing this? I don't want to create all the SqlParameters in the calling code, and I don't want to have to pass and split a string. In the past I've used a string[] array and accepted every odd member as a parameter name and every even as a param value but that's too easy to screw up when calling the method.
Ideally I'd love to do just this:
Data.SQL("Select * from Table where my_id = @my_id", { my_id = 1 });
I know that's a little unrealistic, So I tried this:
Data.SQL("Select * from Table where my_id = @my_id", new Object[,]{ { "my_id", 1 } });
However when I try and handle that on the other end, I get nothing but trouble:
public static Object SQL(String command, Object[,] parameters = null){
    [ ... reusable SQL code here... ]
    foreach(Object[] p in parameters){
        cmd.Parameters.Add(new SqlParameter(p[0].ToString(), p[1].ToString()));
    }
}
Looks fine, but throws an error on the foreach statement
foreach (Object[] p in parameters)
Unable to cast object of type 'System.String' to type 'System.Object[]'
But I didn't pass it an array of System.String. What I passed was a 2D System.Object[]! Wasn't it?
Maybe this is just some small code problem, something stupid I'm doing wrong. It usually is. But I'm figuring you guys know some even neater way to do the above.

Ideally I'd love to do just this:
Data.SQL("Select * from Table where my_id = @my_id", { my_id = 1 });
I know that's a little unrealistic,
Well, in exactly that form, yes... but try this instead:
Data.SQL("Select * from Table where my_id = @my_id", new { my_id = 1 });
That will use an anonymous type for the argument, which you can examine by reflection. You probably only need a single parameter (i.e. it would be SQL(string sql, object parameters)) because you would pass multiple parameters in a single object:
Data.SQL("Select * from Table where my_id = @my_id and name = @name",
    new { my_id = 1, name = "Jon" });
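For illustration, here is a minimal sketch (not Jon's actual code) of how such a Data.SQL implementation might unpack the anonymous object via reflection; the class and method names below are invented:

```csharp
using System.Collections.Generic;

static class ParameterReflector
{
    // Turns new { my_id = 1, name = "Jon" } into { "@my_id": 1, "@name": "Jon" }.
    public static Dictionary<string, object> ToParameterMap(object parameters)
    {
        var map = new Dictionary<string, object>();
        if (parameters == null) return map;

        // Each public property of the anonymous type becomes one parameter.
        foreach (var prop in parameters.GetType().GetProperties())
        {
            map["@" + prop.Name] = prop.GetValue(parameters, null);
        }
        return map;
    }
}
```

The SQL helper can then hand each name/value pair to a SqlParameter without the caller ever touching ADO.NET types.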
More alternatives:
If you're using C# 4, you might find dynamic typing useful; look at what Massive does for example.
As mentioned by Ray, you could pass in a Dictionary<string, object>; again, C# 3 makes this easier than otherwise:
Data.SQL("...", new Dictionary<string, object> {
{ "my_id", 1 },
{ "name", "Jon" }});
EDIT: As for the exact problem you're running into: you need to understand the difference between a rectangular array (e.g. Object[,]) and a jagged array (e.g. Object[][]). The latter is an array of arrays, which is how you're trying to use the parameter, but what you've actually got is a rectangular array. Changing your parameter type to Object[][] may well fix that immediate problem, but personally I'd move to one of the approaches above. I'd also try to avoid converting everything to a string, by the way.
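The difference the edit describes is easy to see in a tiny standalone program; this is just an illustration, not code from the question:

```csharp
using System;

class JaggedVsRectangular
{
    static void Main()
    {
        // Rectangular: a single array object. foreach enumerates every cell
        // as a plain object, which is why the question's cast to Object[] fails.
        object[,] rectangular = { { "my_id", 1 } };
        foreach (object cell in rectangular)
            Console.WriteLine(cell); // prints "my_id", then "1"

        // Jagged: an array of arrays. foreach yields each inner Object[],
        // which matches the loop the question was written for.
        object[][] jagged = { new object[] { "my_id", 1 } };
        foreach (object[] pair in jagged)
            Console.WriteLine("{0} = {1}", pair[0], pair[1]); // prints "my_id = 1"
    }
}
```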

Related

Dapper row to json

Question
We have a Dapper Row as a result from a Dapper Query, which I want to store as a json string in our database. Unfortunately, I can't seem to get that to work.
So, let's start with some background information.
Background info
We're doing a project where we extract Table Names from a table, to know which tables we have to address. We also want this to be as flexible as possible, so we decided to not use a particular POCO for our data.
We're using SQL Server 2014, so unfortunately we don't have the option 'FOR JSON' yet.
Code
Our code looks something like this, where GetData is our actual query:
var data = _queryHandler.Handle(new GetData(tableName.ToString(), 0), database);
The Handle technically just connects to the Database, calling
conn.QueryAsync(query, parameters)
GetData looks like this (simplified):
EXEC ('SELECT * FROM ' + @table)
Reasoning
Because the table name differs each time, we don't want to force a POCO on the output. Sometimes it's a user, other times a role, so to speak, so there's no predicting what output it returns.
Results
This works fine. We can extract the data in our variable and this looks like it's an IEnumerable, which should be fine. I gather we can just read them in a loop and extract the rows. So far, no problem.
The issue at hand
Next thing we want to do is to convert the data from said DapperRow to a json string, but I cannot seem to get the data to behave like a json string as JsonConvert.SerializeObject fails miserably. The DapperRow looks like this (again, simplified).
{{DapperRow, Id = '07501399-b385-4d8e-bacc-gad9d04c35f7', UserName = 'test8', ApplicationId = '4721fafb-12e6-4e3c-9298-etd82d18a0cb', IsApproved = 'True', IsLockedOut = 'False', CreateDate = '26-3-2019 07:52:55' }}
I've already looked into things like the SqlMapper.ITypeHandler, but I'm still not getting there. For instance with the TypeHandler, I get stuck on the destinationType as I don't want a particular type - well, except for a json object. But that's not an accepted type.
public class JsonObjectTypeHandler : SqlMapper.ITypeHandler
{
    public void SetValue(IDbDataParameter parameter, object value)
    {
        parameter.Value = (value == null)
            ? (object)DBNull.Value
            : JsonConvert.SerializeObject(value);
        parameter.DbType = DbType.String;
    }

    public object Parse(Type destinationType, object value)
    {
        return JsonConvert.DeserializeObject(value.ToString(), destinationType);
    }
}
The only other thing that crosses my mind is to extract each and every column and building a type object out of it, but as I said, we don't want to use a model/type for the data as we want to keep it flexible.
Could anyone point me in the right direction? I have the feeling I seem to overlook something simple.
If you are using the non-typed Query API, each returned row is also an IDictionary<string,object> (in addition to the dynamic API), which usually works fine with JsonConvert; for example, the following works OK for me:
var tables = (from row in await conn.QueryAsync("select top 5 * from sys.tables")
select (IDictionary<string, object>)row).AsList();
var json = JsonConvert.SerializeObject(tables, Formatting.Indented);
System.Console.WriteLine(json);
outputting:
[
  {
    "name": "spt_fallback_db",
    "object_id": 117575457,
    "principal_id": null,
    "schema_id": 1,
    "parent_object_id": 0,
    "type": "U ",
    "type_desc": "USER_TABLE",
    "create_date": "2003-04-08T09:18:01.557",
    "modify_date": "2017-08-22T19:40:40.763",
    "is_ms_shipped": true,
    "is_published": false,
    "is_schema_published": false,
    "lob_data_space_id": 0,
    "filestream_data_space_id": null,
    "max_column_id_used": 8,
    ... etc
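The same serialization behavior can be reproduced without a database, since the untyped rows are just IDictionary&lt;string, object&gt; underneath; this stand-in uses column names echoing the DapperRow sample above:

```csharp
using System;
using System.Collections.Generic;
using Newtonsoft.Json;

class RowToJsonDemo
{
    static void Main()
    {
        // Stand-in for what the untyped Dapper API returns: each row
        // implements IDictionary<string, object>, which Json.NET serializes
        // directly, no POCO required.
        var rows = new List<IDictionary<string, object>>
        {
            new Dictionary<string, object>
            {
                ["UserName"] = "test8",
                ["IsApproved"] = true,
                ["IsLockedOut"] = false
            }
        };

        string json = JsonConvert.SerializeObject(rows, Formatting.Indented);
        Console.WriteLine(json);
    }
}
```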

How to add multiple parameters to SQL command in one statement?

I have six lines of parameters like this:
cmd.Parameters.AddWithValue("@variable1", myvalue1);
cmd.Parameters.AddWithValue("@variable2", myvalue2);
cmd.Parameters.AddWithValue("@variable3", myvalue3);
and so on.
Is there any way to compress this a bit without directly inserting in the cmd.CommandText?
Edit: I guess I could have used a good old fashioned array. I've decided to stick with this though.
As far as I know, your code is already about as compact as it gets in terms of line count. However, you could use a List<SqlParameter> with the object-initializer syntax to build your parameter list in a single statement terminated by a semicolon, then pass that list as the array of parameters expected by the AddRange method:
List<SqlParameter> prm = new List<SqlParameter>()
{
    new SqlParameter("@variable1", SqlDbType.Int) { Value = myValue1 },
    new SqlParameter("@variable2", SqlDbType.NVarChar) { Value = myValue2 },
    new SqlParameter("@variable3", SqlDbType.DateTime) { Value = myValue3 },
};
cmd.Parameters.AddRange(prm.ToArray());
Notice that with this approach you need to define the datatype of each parameter correctly; in the example I have used some arbitrary types just to show the syntax.
A bit off-topic, but I think it is worth pointing out in this general context that AddWithValue should not be your first choice when you want the best possible performance.
The MSDN article How data access code affects database performance explains why one should avoid the AddWithValue method for performance reasons.
In short, AddWithValue can be a problem for the SQL Server optimizer because string parameters are passed with a size equal to the current length of the string. That forces the optimizer to discard the query plan created for a previous, otherwise identical call that used a string of a different length.
It is better to call the SqlParameter constructor specifying both the type and the size of the parameter, and not worry about compressing the calls.
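As a hedged sketch of that advice, the original three AddWithValue calls could become explicit-type calls like the following; the NVarChar sizes and the command text are assumptions for illustration, not values from the question:

```csharp
using System.Data;
using System.Data.SqlClient;

class ExplicitParameterSizes
{
    static void Main()
    {
        var cmd = new SqlCommand("dbo.SomeProcedure"); // hypothetical command

        // Explicit type and size: every call now produces an identically
        // shaped parameter list, so SQL Server can reuse the cached plan
        // regardless of how long the current string values happen to be.
        cmd.Parameters.Add(new SqlParameter("@variable1", SqlDbType.Int) { Value = 1 });
        cmd.Parameters.Add(new SqlParameter("@variable2", SqlDbType.NVarChar, 50) { Value = "myvalue2" });
        cmd.Parameters.Add(new SqlParameter("@variable3", SqlDbType.NVarChar, 200) { Value = "myvalue3" });

        System.Console.WriteLine(cmd.Parameters.Count); // 3
    }
}
```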
I took the question literally: "...in one statement" :)
Steve's code is nice, but it can be simplified a bit more using the most canonical SqlParameter constructor and implicit array declaration:
cmd.Parameters.AddRange(new []
{
    new SqlParameter("@variable1", myValue1),
    new SqlParameter("@variable2", myValue2),
    new SqlParameter("@variable3", myValue3),
});
I think this reads very nicely as a one-liner. Usage:
// One liner to create and set SqlCommand parameters
cmd.SetParameters(Parameter("@variable1", myvalue1), Parameter("@variable2", myvalue2), Parameter("@variable3", myvalue3));
To support the one-liner, you need a function that wraps the SqlParameter as a semantic bundle (tuple-like), as follows:
public SqlParameter Parameter(string name, object value)
{
    return new SqlParameter(name, value);
}
Create a static class with an extension method to give us the syntactic sugar we are looking for. Notice the use of the params keyword which allows the multiple parameters in the above call to SetParameters.
public static class SqlDataUtils
{
    public static void SetParameters(this SqlCommand command, params SqlParameter[] parameters)
    {
        command.Parameters.AddRange(parameters);
    }
}
This answer is inspired by the accepted answer to Key value pairs in C# Params by Bryan Watts
Just for argument's sake, using the code example you gave, where the stored proc variables are literally named variable1, variable2, etc., you could do something like this:
string[] myValues = new string[] { "myvalue1", "myvalue2", "myvalue3", "myvalue4", "myvalue5", "myvalue6" };
for (int i = 0; i < 6; i++) { cmd.Parameters.AddWithValue("@variable" + (i + 1), myValues[i]); }
2 lines of ugly code... LOL
A loop like this may come in handy if you had, say, 25-50 values, though I don't see that very often. You could also use two arrays, one for the variable names and one for the values; as long as the indexes match up, this would work:
string[] myVarNames = new string[] { "variable1", "variable2", "variableThree", "variable4our", "variableFIVE", "variableSIX" };
string[] myValues = new string[] { "myvalue1", "myvalue2", "myvalue3", "myvalue4", "myvalue5", "myvalue6" };
for (int i = 0; i < 6; i++)
{
    cmd.Parameters.AddWithValue("@" + myVarNames[i], myValues[i]);
}
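Another variant of the same idea, as a hedged sketch: keep names and values paired in a single Dictionary so the two collections cannot drift out of sync (the command text is hypothetical; the names echo the arrays above):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

class DictionaryParameters
{
    static void Main()
    {
        var cmd = new SqlCommand("dbo.SomeProcedure"); // hypothetical command

        // One structure instead of two parallel arrays: each entry is a
        // name/value pair, so a mismatch by index is impossible.
        var values = new Dictionary<string, object>
        {
            ["variable1"] = "myvalue1",
            ["variable2"] = "myvalue2",
            ["variableThree"] = "myvalue3"
        };

        foreach (var pair in values)
            cmd.Parameters.AddWithValue("@" + pair.Key, pair.Value);

        System.Console.WriteLine(cmd.Parameters.Count); // 3
    }
}
```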

Database.Query return type should be IEnumerable<dynamic>

I need help understanding this. Why, in the following code, are the variables ejes and habilidades resolved as dynamic, while the third, dificultades, is IEnumerable<dynamic>? This is affecting the code that runs next: an exception is thrown when I try to invoke the extension method Count(), because ejes and habilidades are not IEnumerable. Yet they are all results of the same method, Database.Query.
Here is the snippet:
var db = Database.Open("froned");
db.Execute("begin transaction");
try
{
    var asignacion = db.QuerySingle("select * from asignacion_avanza where id_asignacion = @0", id_asignacion);
    var ejes = db.Query(String.Format(@"
        select id_eje
        from asignatura_eje_nivel
        where id_nivel = {0}
        and id_asignatura = {1}",
        asignacion.id_nivel,
        asignacion.id_asignatura));
    var habilidades = db.Query(String.Format(@"
        select id_habilidad
        from asignatura_habilidad_nivel
        where id_nivel = {0}
        and id_asignatura = {1}",
        asignacion.id_nivel,
        asignacion.id_asignatura));
    var dificultades = db.Query("select id_dificultad from dificultad");
    var c_dif = dificultades.Count();
    var c_eje = ejes.Count();
    var c_habilidades = habilidades.Count();
I've attached an image of the debugger showing the runtime types of the variables.
asignacion.id_nivel and asignacion.id_asignatura are dynamic types.
When you pass a dynamic type into any method as an argument, the return type of the method becomes dynamic instead of whatever MSDN says it is. You cannot use extension methods on dynamic types. Count() is an extension method on Enumerable. That's why you get the exception.
There are two ways to solve the problem and revert the return type to Enumerable. The first is to define it explicitly:
IEnumerable<dynamic> data = db.Query(someSQL, parameterValue);
and the other is to cast the parameter to a non-dynamic type:
var data = db.Query(someSQL, (string)parameterValue);
And as Knox points out, you must use parameters rather than string.Format.
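The dynamic-infection effect is easy to reproduce without WebMatrix; this standalone sketch (an illustration, not code from the answer) mirrors the situation in the question:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.CSharp.RuntimeBinder;

class DynamicReturnDemo
{
    static IEnumerable<int> Numbers(int upTo)
    {
        return Enumerable.Range(1, upTo);
    }

    static void Main()
    {
        dynamic n = 3; // plays the role of asignacion.id_nivel

        // Because an argument is dynamic, the whole call is bound at runtime
        // and the compile-time type of 'result' is dynamic, not IEnumerable<int>.
        var result = Numbers(n);

        try
        {
            // Extension methods are compile-time sugar, so the runtime
            // binder cannot find Count() on a dynamic receiver.
            Console.WriteLine(result.Count());
        }
        catch (RuntimeBinderException)
        {
            Console.WriteLine("Count() not found on dynamic receiver");
        }

        // Either an explicit declaration or a cast restores the static type.
        Console.WriteLine(((IEnumerable<int>)result).Count()); // 3
    }
}
```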
The difference in the debugger is due to dificultades having already been evaluated (var c_dif = dificultades.Count();). The other two variables have not been evaluated yet (deferred LINQ execution), so the debugger knows more about dificultades.
Are you sure that habilidades and the other one have data? I think it might be that db.Query is returning a null, meaning no rows returned, and then habilidades.Count can't execute because null doesn't have an object to execute against. One way to test this outside the debugger is to do a db.QueryValue( "Select Count(*) from ...") and see if you're getting zero rows returned.
By the way, the database helpers build the variable handling in, in a slightly shorter way than what you have. You can write
var ejes = db.Query(@"
    Select *
    From asign
    Where id_nivel = @0",
    asignacion.id_nivel);
This technique is called parameterized SQL. I'm only showing one parameter, but you will need more. The @0, @1, @2, and so forth get replaced just like in a String.Format.

Using Dapper to map more than 5 types

I am currently building a SELECT query that joins 12 tables together. I've been using Dapper for all my other queries and it works great. The problem is, the generic methods only go up to five generic parameters.
I've previously modified the code to support up to 6 for another query, but now I really don't think I should be hacking 6 more levels of generics.
Is there a way to pass dapper an array of types, and it returns the results as an array of objects, which I can cast manually if I have to?
I also might be approaching the problem the wrong way! Any help will be appreciated!
In a project I worked on I saw something like this to get more than 7 types mapped. We used Dapper 1.38:
connection.Query<TypeOfYourResult>
(
    queryString,
    new[]
    {
        typeof(TypeOfArgument1),
        typeof(TypeOfArgument2),
        ...,
        typeof(TypeOfArgumentN)
    },
    objects =>
    {
        TypeOfArgument1 arg1 = objects[0] as TypeOfArgument1;
        TypeOfArgument2 arg2 = objects[1] as TypeOfArgument2;
        ...
        TypeOfArgumentN argN = objects[N] as TypeOfArgumentN;

        // do your processing here, e.g. arg1.SomeField = arg2, etc.
        // also initialize your result
        var result = new TypeOfYourResult(...);
        return result;
    },
    parameters,
    splitOn: "arg1_ID,arg2_ID, ... ,argN_ID"
);
The queryString is self-explanatory. The splitOn parameter tells Dapper how to split the columns of the SELECT statement so that everything can be mapped properly to the objects; you can read about it in the Dapper documentation.
You could use a dynamic query and map it afterwards. Something like this
var result = conn.Query<dynamic>(query).Select(x => new Tuple<Type1, Type2, Type3, Type4, Type5>(
    // type initialization here
    new Type1(x.Property1, x.Property2),
    new Type2(x.Property3, x.Property4),
    new Type3(x.Property5, x.Property6) // etc.
    ));
Edit: With a rather huge result set, another option might be to use multiple queries and then a GridReader. That might work for you.
Here's an example taken from the Dapper documentation:
var sql =
    @"
    select * from Customers where CustomerId = @id
    select * from Orders where CustomerId = @id
    select * from Returns where CustomerId = @id";

using (var multi = connection.QueryMultiple(sql, new { id = selectedId }))
{
    var customer = multi.Read<Customer>().Single();
    var orders = multi.Read<Order>().ToList();
    var returns = multi.Read<Return>().ToList();
    ...
}
This was answered a long time ago, but I would like to add my two cents here. Instead of manually modifying Dapper's source code, why not just create a POCO class with those fields and use your query like a table?
The mapping would work fine. I know it is also a pain to write that class definition, but it seems easier than dealing with Dapper updates later.
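As a hedged sketch of that suggestion, a flat POCO whose property names match the column aliases of the joined SELECT lets Dapper map the 12-table query like any single-table one; all names below are invented for illustration:

```csharp
// Hypothetical flat row type for the joined result set. Property names must
// match the SELECT's column aliases (e.g. "select o.Id as OrderId, ...").
public class OrderOverviewRow
{
    public int OrderId { get; set; }
    public string CustomerName { get; set; }
    public string ProductName { get; set; }
    public decimal Amount { get; set; }
}

// Usage would then be an ordinary single-type query:
// var rows = connection.Query<OrderOverviewRow>(joinSql, parameters);
```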

Unable to cast object of type 'System.Object[]' to type 'System.String[]'

I am developing a C# VS 2008 / SQL Server website application. I am a newbie to ASP.NET. I am getting the above error, however, on the last line of the following code. Can you give me advice on how to fix this? This compiles correctly, but I encounter this error after running it.
All that I am trying to do is to store the items from the second row of "dt" into string parameters. The first row is the header, so I don't want these values. The second row is the first row of values. My SQL stored procedure requires these values as strings. So I want to parse the second row of data and load into 2 string parameters. I added more of my code below.
DataTable dt;
Hashtable ht;
string[] SingleRow;
...
SqlConnection conn2 = new SqlConnection(connString);
SqlCommand cmd = conn2.CreateCommand();
cmd.CommandText = "dbo.AppendDataCT";
cmd.Connection = conn2;
SingleRow = (string[])dt.Rows[1].ItemArray;
SqlParameter sqlParam = cmd.Parameters.AddWithValue("@" + ht[0], SingleRow[0]);
sqlParam.SqlDbType = SqlDbType.VarChar;
SqlParameter sqlParam2 = cmd.Parameters.AddWithValue("@" + ht[1], SingleRow[1]);
sqlParam2.SqlDbType = SqlDbType.DateTime;
My error:
System.InvalidCastException was caught
Message="Unable to cast object of type 'System.Object[]' to type 'System.String[]'."
Source="App_Code.g68pyuml"
StackTrace:
at ADONET_namespace.ADONET_methods.AppendDataCT(DataTable dt, Hashtable ht) in c:\Documents and Settings\Admin\My Documents\Visual Studio 2008\WebSites\Jerry\App_Code\ADONET methods.cs:line 88
InnerException:
You can't cast an array of objects to an array of strings; you have to cast each item individually, since each item has to be checked to see whether it can be cast. You can use the Cast method for that:
SingleRow = dt.Rows[1].ItemArray.Cast<string>().ToArray();
string[] arr = Array.ConvertAll(dt.Rows[1].ItemArray, o => (string)o);
(you can also use Cast<T>().ToArray(), but this approach works on more framework versions, and gets the array size correct from the start, rather than resizing it)
Hmmm. You're trying to access properties of the DataTable dt, but it looks like you might be expecting that table to contain the results of the AppendDataCT query. It doesn't. Be careful with your row index accesses, too: Arrays in C# are 0-based, and dt.Rows[1] will retrieve the second row in the table.
That aside, review the documentation for DataRow.ItemArray. That method returns an array of objects, not an array of strings. Even if your row contains nothing but strings, you're still dealing with an array of objects, and you'll have to treat it that way. You can cast each individual item in the row to a string, if that's the proper datatype for that column:
foreach (string s in dt.Rows[1].ItemArray)
{
//...
}
EDIT: Ok, in response to your edit, I see what you're trying to do. There are many different ways to do this, and I'd particularly recommend that you move away from HashTables and towards generic equivalents like Dictionary - you'll save yourself much runtime casting grief. That said, here's the simplest adaption of your code:
DataRow dr = dt.Rows[1]; // second row
SqlParameter p1 = cmd.Parameters.AddWithValue((string)ht[0], (string)dr[0]);
SqlParameter p2 = ...
You don't need the leading "@"; ADO.NET will add that for you. The (string) casts will work as long as there's a string at key 0 (which is a fairly non-standard way to use hashtables; you'd usually have some kind of descriptive key), and if the first column in your datatable contains strings.
I recommend you look at typed datasets and generic collections. The lack of them here makes your code somewhat brittle.
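As a hedged sketch of that recommendation, here is the same parameter-building code with generic collections in place of the Hashtable; the parameter names and row values are hypothetical, not taken from the question:

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

class TypedParameterDemo
{
    static void Main()
    {
        // Dictionary<int, string> replaces the Hashtable, so no runtime
        // casts are needed when looking up the parameter names.
        var paramNames = new Dictionary<int, string>
        {
            [0] = "CustomerName", // hypothetical names for illustration
            [1] = "OrderDate"
        };

        // Stands in for dt.Rows[1].ItemArray: the second row's values.
        object[] secondRow = { "Jerry", new DateTime(2010, 1, 1) };

        var cmd = new SqlCommand("dbo.AppendDataCT");
        cmd.Parameters.AddWithValue(paramNames[0], secondRow[0]);
        cmd.Parameters.AddWithValue(paramNames[1], secondRow[1]);

        Console.WriteLine(cmd.Parameters.Count); // 2
    }
}
```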
Each column item has its own type, which can differ from String, so if you want to store each row's values in an array you have to convert them one by one with a loop, or with LINQ (I haven't tested this snippet but it should do the job):
(from columnVal in dt.Rows[1].ItemArray
select columnVal.ToString()).ToArray();
You can use LINQ for that:
var yourStringArray = dt.Rows[1].ItemArray
    .Select(o => (o ?? (object)String.Empty).ToString())
    .ToArray();
This converts each item of the array to a string by using ToString() and returns an empty string if the item is null.
How about this: make string[] SingleRow into object[] SingleRow, which will let the line
SingleRow = (object[])dt.Rows[1].ItemArray;
execute properly. Subsequently when you need to access its values as an array of strings, cast it or LINQ select it like this:
objectArray.Select<object, string>
(c => (c != null ? c.ToString() : "")).ToArray();
Make sure to add the following usings:
using System.Linq;
using System.Collections.Generic;
