I'm using Dapper to read data from SQL Server. I have a SQL statement that returns a long JSON result, but the result is split into 3 rows of at most 2,033 characters each, so Dapper can't parse it because each row on its own is invalid JSON.
How can I prevent this splitting, or make Dapper deal with it?
This is my code:
SqlMapper.ResetTypeHandlers();
SqlMapper.AddTypeHandler(new JsonTypeHandler<List<Product>>());
const string sql = @"SELECT
    *,
    (SELECT * FROM Balance b
     WHERE p.SKU = b.SKU
     FOR JSON PATH) AS [Balances]
FROM Product p
WHERE SKU IN @SKUs
FOR JSON PATH";
var connection = new SqlConnection("myconnection");
return connection.QuerySingleAsync<List<Product>>(sql, new { SKUs = new[] { "foo", "bar" } });
And the code of TypeHandler:
public class JsonTypeHandler<T> : SqlMapper.TypeHandler<T>
{
public override T Parse(object value)
{
return JsonConvert.DeserializeObject<T>(value.ToString());
}
public override void SetValue(IDbDataParameter parameter, T value)
{
parameter.Value = JsonConvert.SerializeObject(value);
}
}
And here is how I run this SQL in DataGrip
Edit:
Here is the error message:
Newtonsoft.Json.JsonSerializationException : Unexpected end when deserializing object. Path '[0].Balances[4].WarehouseId', line 1, position 2033.
My solution is writing another extension method that wraps the Query<string> method, like below:
public static T QueryJson<T>(this IDbConnection cnn, string sql, object param = null,
IDbTransaction transaction = null, bool buffered = true, int? commandTimeout = null,
CommandType? commandType = null) where T: class
{
var result = cnn.Query<string>(sql, param, transaction, buffered, commandTimeout, commandType).ToList();
if (!result.Any())
return default(T);
// Concats
var sb = new StringBuilder();
foreach (var jsonPart in result)
sb.Append(jsonPart);
var settings = new JsonSerializerSettings
{
// https://github.com/danielwertheim/jsonnet-contractresolvers
// I use this Contract Resolver to set data to private setter properties
ContractResolver = new PrivateSetterContractResolver()
};
// Using Json.Net to de-serialize objects
return JsonConvert.DeserializeObject<T>(sb.ToString(), settings);
}
This solution works quite well but is slower than the multiple-mapping method when querying large data sets (1,000 objects took 2.7 seconds compared to 1.3 seconds).
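An alternative to client-side concatenation is to wrap the FOR JSON query in a scalar subquery, so SQL Server returns the JSON as a single NVARCHAR(MAX) value instead of streaming it in ~2,033-character chunks. A minimal sketch against the question's schema (untested; the [Json] alias is arbitrary):

```csharp
// Wrapping the outer FOR JSON PATH in a scalar subquery makes SQL Server
// return the whole document as one NVARCHAR(MAX) cell, so no client-side
// concatenation is needed.
const string sql = @"SELECT (
    SELECT
        *,
        (SELECT * FROM Balance b
         WHERE p.SKU = b.SKU
         FOR JSON PATH) AS [Balances]
    FROM Product p
    WHERE SKU IN @SKUs
    FOR JSON PATH
) AS [Json]";

var json = await connection.QuerySingleAsync<string>(
    sql, new { SKUs = new[] { "foo", "bar" } });
var products = JsonConvert.DeserializeObject<List<Product>>(json);
```

Dapper's list expansion still applies to @SKUs here, and the single string can be fed straight to Json.NET.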
Related
I have a PostgreSQL function that takes in a parameter of type json. Using Dapper, how do I execute a call that passes the object to the PostgreSQL function such that PostgreSQL recognizes the type as json instead of as text?
Example PostgreSQL Function that takes in json type
CREATE OR REPLACE FUNCTION testfuncthattakesinjson(heroes json)
RETURNS SETOF characters
LANGUAGE 'sql'
STABLE
ROWS 100
AS $BODY$
SELECT c.*
FROM characters c
JOIN json_array_elements(heroes) j
ON c.first_name = j->>'first_name'
AND c.last_name = j->>'last_name';
$BODY$;
Broken Example C# Integration Test
[Test]
public void Query_CallFunctionThatTakesInJsonParameter_FunctionUsesJsonType()
{
using (var conn = new NpgsqlConnection(Db.GetConnectionStringToDatabase()))
{
var funcName = "testfuncthattakesinjson";
var expect = CharacterTestData.Where(character => character.id <= 3);
var jsonObj = JArray.FromObject(CharacterTestData
.Where(character => character.id <= 3)
.Select(character => new Hero(character))
.ToList());
SqlMapper.AddTypeHandler(new JArrayTypeHandler());
// Act
var results = conn.Query<Character>(funcName, new
{
heroes = jsonObj
}, commandType: CommandType.StoredProcedure);
// Assert
CollectionAssert.AreEquivalent(expect, results);
}
}
Supporting JArrayTypeHandler
internal class JArrayTypeHandler : SqlMapper.TypeHandler<JArray>
{
public override JArray Parse(object value)
{
throw new NotImplementedException();
}
public override void SetValue(IDbDataParameter parameter, JArray value)
{
parameter.Value = value;
}
}
In this current iteration, I've added a SqlMapper.TypeHandler. (At the moment, I'm only concerned with passing the JArray to PostgreSQL, hence the NotImplementedException for Parse.)
With this example, I get the following exception:
System.NotSupportedException: 'The CLR array type Newtonsoft.Json.Linq.JArray isn't supported by Npgsql or your PostgreSQL. If you wish to map it to an PostgreSQL composite type array you need to register it before usage, please refer to the documentation.'
In past iterations, I've also tried things like using a List<Hero> type handler and letting that type handler deal with the Json conversion.
I've also tried adding the Npgsql.Json.NET NuGet package extension for Npgsql and calling conn.TypeMapper.UseJsonNet() in my test method, but that didn't seem to have any effect.
And if I do anything to serialize the object to a JSON string, then I get a different error (below), which makes sense.
Npgsql.PostgresException: '42883: function testfuncthattakesinjson(heroes => text) does not exist'
So, is it possible to use Dapper to pass a JSON object as a PostgreSQL primitive to a function?
You can use Dapper's SqlMapper.ICustomQueryParameter interface.
public class JsonParameter : ICustomQueryParameter
{
private readonly string _value;
public JsonParameter(string value)
{
_value = value;
}
public void AddParameter(IDbCommand command, string name)
{
var parameter = new NpgsqlParameter(name, NpgsqlDbType.Json);
parameter.Value = _value;
command.Parameters.Add(parameter);
}
}
Then your Dapper call becomes:
var results = conn.Query<Character>(funcName, new
{
heroes = new JsonParameter(jsonObj.ToString())
}, commandType: CommandType.StoredProcedure);
I let ServiceStack OrmLite (5.1.1) create the table and persist an object that contains a TimeSpan:
// ...
public TimeSpan _Jobs_VehicleNotificationTime { get; set; }
// ...
When I try to read it back, I get this error:
System.InvalidCastException: 'Invalid cast from 'System.Int64' to 'System.TimeSpan'.'
The value seems to be persisted as a long, but I get the following error when using the FromObjectDictionary method:
at System.Convert.DefaultToType(IConvertible value, Type targetType, IFormatProvider provider)
at ServiceStack.PlatformExtensions.ObjectDictionaryFieldDefinition.SetValue(Object instance, Object value)
at ServiceStack.PlatformExtensions.FromObjectDictionary(IReadOnlyDictionary`2 values, Type type)
at tWorks.Core.CoreServerCommons.Handlers.OrmLiteDbHandler.<>c__DisplayClass65_1.<ReadObjects>b__1(Dictionary`2 x) in D:\[GIT]\Core\CoreServerCommons\Handlers\DbHandlers\OrmLite\OrmLiteDbHandler.cs:line 577
Is this a bug or am I missing something?
TimeSpans are stored as integer columns in OrmLite to ensure they maintain precision and behavior across all supported RDBMSs. If you're retrieving rows as a dynamic result set in an Object Dictionary, you only get the raw data reader values, which haven't gone through OrmLite's converters to be turned back into a TimeSpan. ServiceStack.Text's generic FromObjectDictionary() extension method does not use OrmLite's converters either, so it won't perform the conversion for you.
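Reading through OrmLite's typed API does run the converters; a minimal sketch, assuming a Customer POCO with a MyTimeSpan property mapped to the table:

```csharp
using (IDbConnection db = _dbFactory.Open())
{
    // The typed API runs OrmLite's converters, so the integer column
    // is converted back into a TimeSpan on the way out.
    Customer customer = db.SingleById<Customer>(id);
    TimeSpan ts = customer.MyTimeSpan;
}
```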
I think I have resolved the issue. Maybe @mythz can tell me if this is completely wrong, but it seems to work:
Implement your own TimeSpanAsIntConverter.
I first dismissed that idea, as I had wrongly read mythz's answer to mean that converters were not relevant, or were not executed, when using the untyped API. When I did implement the TimeSpan converter, it worked just as expected:
namespace Commons
{
public class MyTimeSpanConverter : TimeSpanAsIntConverter
{
public override string ColumnDefinition => "TIME";
public override DbType DbType => DbType.Time;
public override object ToDbValue(Type fieldType, object value)
{
TimeSpan timespan = (TimeSpan)value;
return timespan;
}
}
}
Then, when using that converter, the table is correctly created with the TIME type instead of bigint, and when persisted, it all looks OK:
Test code:
public void Test()
{
Customer c = new Customer() { Username = "TED ÅÄÖ", DeletedTime = DateTime.MaxValue, MyTimeSpan = new TimeSpan(1, 30, 0) };
CoreObject co = c;
long id;
using (IDbConnection db = _dbFactory.Open())
{
var typedApi = db.CreateTypedApi(co.GetType());
id = typedApi.Insert(co, selectIdentity: true);
};
using (IDbConnection db = _dbFactory.Open())
{
string tableName = co.GetType().GetModelMetadata().ModelName;
List<Dictionary<string, object>> results = db.Select<Dictionary<string, object>>($"SELECT * FROM {tableName} where id={id}");
List<CoreObject> coreObjects = results.Map(x => (CoreObject)x.FromObjectDictionary(co.GetType()));
}
}
Results:
This seems to resolve at least this issue for me - my TimeSpans work as expected.
I have the following extension function:
public static IEnumerable<T> Select<T>(this IDataReader reader,
Func<IDataReader, T> selector)
{
while (reader.Read())
{
yield return selector(reader);
}
}
which is being used like:
var readFields = dsReader.Select(r =>
{
var serviceResponse = myService.Decrypt<DateTime>(r.GetString(DATE_VALUE), r.GetInt32(DEK_ID));
if (serviceResponse.IsSuccessful)
{
return new DataField<DateFieldValue>
{
FieldValue = new DateFieldValue { Data = serviceResponse.Value }
};
}
return null;
});
if (!readFields.IsCollectionNullOrEmpty())
returnFinalFields.AddRange(readFields);
The problem I am facing here is that even if serviceResponse.IsSuccessful is false, readFields is not empty: it contains an enumerable with a single null item. Is there a way to return an empty collection here?
The real problem here is that you don't want to be selecting out a null value when the service doesn't have a successful response. You'll want to filter down to the successful responses as part of your query:
var readFields = from r in dsReader
let serviceResponse = myService.Decrypt<DateTime>(r.GetString(DATE_VALUE), r.GetInt32(DEK_ID))
where serviceResponse.IsSuccessful
select new DataField<DateFieldValue>
{
FieldValue = new DateFieldValue { Data = serviceResponse.Value }
};
The Select method could check the returned result and only yield its value when valid, for example when not null:
public static IEnumerable<T> Select<T>(this IDataReader reader, Func<IDataReader, T> selector)
where T:class
{
while (reader.Read())
{
var res = selector(reader);
if(res!=null)
yield return res;
}
}
Although, as stated by Servy, that would normally not belong in a regular Select. The method could be called something like SelectValidValues to avoid confusion.
Another way would be to have the lambda parameter return a Tuple containing both the result and whether it is valid.
Yet another way is to have an optional parameter (a value or an extra predicate function) that checks which values are valid.
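The predicate variant might look like this (a sketch; SelectValidValues is a hypothetical name):

```csharp
public static IEnumerable<T> SelectValidValues<T>(this IDataReader reader,
    Func<IDataReader, T> selector,
    Func<T, bool> isValid)
{
    while (reader.Read())
    {
        var res = selector(reader);
        // Only yield values the caller's predicate accepts.
        if (isValid(res))
            yield return res;
    }
}
```

It would then be called with an explicit validity check, e.g. dsReader.SelectValidValues(r => ..., df => df != null).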
Interesting (mis?)use of Select. Your issue is that you return null from the Select delegate when IsSuccessful is false. Since not returning a value from Select's delegate isn't an option, filter afterwards:
var readFields = dsReader.Select(r => {
var serviceResponse = myService.Decrypt<DateTime>(r.GetString(DATE_VALUE), r.GetInt32(DEK_ID));
if (serviceResponse.IsSuccessful)
return new DataField<DateFieldValue> {
FieldValue = new DateFieldValue { Data = serviceResponse.Value }
};
else
return null;
}).Where(df => df != null);
I am currently interacting with a legacy SDK, which is used to control hand reader hardware. It is a commercial SDK and product, and thus there is no way to access the hardware other than through this SDK.
I am attempting to run SQL queries on the hand reader's database, and the SDK only provides one method for doing so, with the following signature:
handReader.QueryDB(string sqlStatement);
This thus implies that any query can be passed to that method in the form of a string. While this is simple, it immediately points towards possible SQL injection attacks.
I do not have access to the method's inner workings, so I cannot see whether any SQL injection prevention goes on within it. Thus, I would like to find a way to parameterise the string, similar to parameterised queries using the SqlCommand class.
Is this possible?
I've implemented a string formatter that works with names instead of numbers.
Here's my sql string:
string sqlStr = @"select * from Table1 t1
where t1.field1 = @Field1 and IsActive = 1";
int F1 = 12;
sqlStr = sqlStr.NamedStringFormatSQL(new
{
Field1 = F1
});
And this is the code for the formatter:
public static Dictionary<string, object> AnonymousToDictionary(object value)
{
Dictionary<string, object> dic = new Dictionary<string, object>(StringComparer.OrdinalIgnoreCase);
if (value != null)
{
foreach (PropertyDescriptor descriptor in TypeDescriptor.GetProperties(value))
{
object obj2 = descriptor.GetValue(value);
dic.Add(descriptor.Name, obj2);
}
}
return dic;
}
public static string NamedStringFormatSQL(this string aString, object replacements)
{
Dictionary<string, object> keyValues = AnonymousToDictionary(replacements);
foreach (var item in keyValues)
{
string val = item.Value == null ? "null" : item.Value.ToString();
aString = aString.Replace("@" + item.Key, val);
}
return aString;
}
Given an array of values, I would like to create an anonymous object with properties based on these values. The property names would be simply "pN" where N is the index of the value in the array.
For example, given
object[] values = { 123, "foo" };
I would like to create the anonymous object
new { p0 = 123, p1 = "foo" };
The only way I can think of to do this would be to use a switch or if chain up to a reasonable number of parameters to support, but I was wondering if there was a more elegant way to do this:
object[] parameterValues = new object[] { 123, "foo" };
dynamic values = null;
switch (parameterValues.Length)
{
case 1:
values = new { p0 = parameterValues[0] };
break;
case 2:
values = new { p0 = parameterValues[0], p1 = parameterValues[1] };
break;
// etc. up to a reasonable # of parameters
}
Background
I have an existing set of methods that execute SQL statements against a database. The methods typically take a string for the SQL statement and a params object[] for the parameters, if any. The understanding is that if the query uses parameters, they will be named @p0, @p1, @p2, etc.
Example:
public int ExecuteNonQuery(string commandText, CommandType commandType, params object[] parameterValues) { .... }
which would be called like this:
db.ExecuteNonQuery("insert into MyTable(Col1, Col2) values (@p0, @p1)", CommandType.Text, 123, "foo");
Now I would like to use Dapper within this class to wrap and expose Dapper's Query<T> method, and do so in a way that would be consistent with the existing methods, e.g. something like:
public IEnumerable<T> ExecuteQuery<T>(string commandText, CommandType commandType, params object[] parameterValues) { .... }
but Dapper's Query<T> method takes the parameter values in an anonymous object:
var dog = connection.Query<Dog>("select Age = @Age, Id = @Id", new { Age = (int?)null, Id = guid });
leading to my question about creating the anonymous object to pass parameters to Dapper.
Adding code using the DynamicParameter class, as requested by @Paolo Tedesco.
string sql = "select * from Account where Id = @p0 and username = @p1";
dynamic values = new DynamicParameter(123, "test");
var accounts = SqlMapper.Query<Account>(connection, sql, values);
throws an exception at line 581 of Dapper's SqlMapper.cs file:
using (var reader = cmd.ExecuteReader())
and the exception is a SqlException:
Must declare the scalar variable "@p0".
and checking the cmd.Parameters property shows no parameters configured for the command.
You are misusing Dapper; you should never need to do this. Instead, either implement IDynamicParameters or use the specific, extremely flexible DynamicParameters class.
In particular:
string sql = "select * from Account where Id = @id and username = @name";
var values = new DynamicParameters();
values.Add("id", 1);
values.Add("name", "bob");
var accounts = SqlMapper.Query<Account>(connection, sql, values);
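Applied to the ExecuteQuery wrapper from the question, the @pN names can be generated from the params array (a sketch; the connection field name is assumed):

```csharp
public IEnumerable<T> ExecuteQuery<T>(string commandText, CommandType commandType,
    params object[] parameterValues)
{
    var parameters = new DynamicParameters();
    // Name each value p0, p1, ... to match the @p0, @p1 placeholders.
    for (int i = 0; i < parameterValues.Length; i++)
        parameters.Add("p" + i, parameterValues[i]);
    return connection.Query<T>(commandText, parameters, commandType: commandType);
}
```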
DynamicParameters can take in an anonymous class in the constructor. You can concat DynamicParameters using the AddDynamicParams method.
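For example (a sketch reusing the Account query from above):

```csharp
// Seed the bag from an anonymous object, then merge in more values.
var values = new DynamicParameters(new { id = 1 });
values.AddDynamicParams(new { name = "bob" });
var accounts = SqlMapper.Query<Account>(connection,
    "select * from Account where Id = @id and username = @name", values);
```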
Furthermore, there is no strict dependency on anonymous types. Dapper will accept concrete types as params, e.g.:
class Stuff
{
public int Thing { get; set; }
}
...
cnn.Execute("select @Thing", new Stuff { Thing = 1 });
Kevin had a similar question: Looking for a fast and easy way to coalesce all properties on a POCO. DynamicParameters works perfectly there as well, without any need for magic hoop jumping.
Not exactly an anonymous object, but what about implementing a DynamicObject which returns values for p0 ... pN based on the values in the array? Would that work with Dapper?
Example:
using System;
using System.Dynamic;
using System.Text.RegularExpressions;
class DynamicParameter : DynamicObject {
object[] _p;
public DynamicParameter(params object[] p) {
_p = p;
}
public override bool TryGetMember(GetMemberBinder binder, out object result) {
Match m = Regex.Match(binder.Name, @"^p(\d+)$");
if (m.Success) {
int index = int.Parse(m.Groups[1].Value);
if (index < _p.Length) {
result = _p[index];
return true;
}
}
return base.TryGetMember(binder, out result);
}
}
class Program {
static void Main(string[] args) {
dynamic d1 = new DynamicParameter(123, "test");
Console.WriteLine(d1.p0);
Console.WriteLine(d1.p1);
}
}
You cannot dynamically create anonymous objects, but Dapper should work with a dynamic object. For creating dynamic objects in a nice way, you could use Clay. It enables you to write code like:
dynamic New = new ClayFactory();
var person = New.Person();
person["FirstName"] = "Louis";
// person.FirstName now returns "Louis"