Dapper: With or without parameters? - c#

Does anyone know whether it makes any difference, and which approach is better with Dapper: sending the SQL already built, or sending the args separately?
string sql = string.Format($"DELETE FROM emp{utils.filial}.funcionariodependenteanexo WHERE id IN ({listaIds})");
await connection.ExecuteScalarAsync<int>(sql, args);
Thanks.
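For reference, a parameterized version of the same delete might look like the sketch below. Dapper expands an IEnumerable parameter into an IN (...) list; the names utils.filial and listaIds come from the question, and listaIds is assumed here to be a collection of ids rather than a pre-built comma-separated string. Note the schema/table name itself cannot be a parameter, so that part still has to be interpolated into the SQL string.

// Hedged sketch: identifiers still interpolated, values passed as parameters.
string sql = $"DELETE FROM emp{utils.filial}.funcionariodependenteanexo WHERE id IN @ids";
int affected = await connection.ExecuteAsync(sql, new { ids = listaIds });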

Related

PostgreSQL Log - parameter details [duplicate]

Is there a way to dump the generated sql to the Debug log or something? I'm using it in a winforms solution so the mini-profiler idea won't work for me.
I had the same issue and implemented some code after searching around but finding no ready-to-use solution. There is a package on NuGet, MiniProfiler.Integrations, that I would like to share.
Update V2: it now works with other database servers; for MySQL it requires MiniProfiler.Integrations.MySql.
Below are steps to work with SQL Server:
1. Instantiate the connection:
var factory = new SqlServerDbConnectionFactory(_connectionString);
using (var connection = ProfiledDbConnectionFactory.New(factory, CustomDbProfiler.Current))
{
    // your code
}
2. After all the work is done, write all the commands to a file if you want:
File.WriteAllText("SqlScripts.txt", CustomDbProfiler.Current.ProfilerContext.BuildCommands());
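Putting the two steps together with an actual Dapper call in the middle, the flow looks roughly like this (the query is just a placeholder):

var factory = new SqlServerDbConnectionFactory(_connectionString);
using (var connection = ProfiledDbConnectionFactory.New(factory, CustomDbProfiler.Current))
{
    var ids = connection.Query<int>("SELECT Id FROM SomeTable");   // any Dapper call
}

// everything executed through the profiled connection has been captured
File.WriteAllText("SqlScripts.txt", CustomDbProfiler.Current.ProfilerContext.BuildCommands());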
Dapper does not currently have an instrumentation point here. This is perhaps due, as you note, to the fact that we (as the authors) use mini-profiler to handle this. However, if it helps, the core parts of mini-profiler are actually designed to be architecture neutral, and I know of other people using it with winforms, wpf, wcf, etc - which would give you access to the profiling / tracing connection wrapper.
In theory, it would be perfectly possible to add some blanket capture-point, but I'm concerned about two things:
(primarily) security: since dapper doesn't have a concept of a context, it would be really really easy for malign code to attach quietly to sniff all sql traffic that goes via dapper; I really don't like the sound of that (this isn't an issue with the "decorator" approach, as the caller owns the connection, hence the logging context)
(secondary) performance: but... in truth, it is hard to say that a simple delegate-check (which would presumably be null in most cases) would have much impact
Of course, the other thing you could do is: steal the connection wrapper code from mini-profiler, and replace the profiler-context stuff with just: Debug.WriteLine etc.
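A full connection/command decorator is a fair amount of boilerplate, so as a much cruder sketch of the same idea, a hypothetical extension method (not part of Dapper) can log the SQL before handing it to Dapper:

using System.Collections.Generic;
using System.Data;
using System.Diagnostics;
using Dapper;

public static class DebugDapperExtensions
{
    // Hypothetical helper: writes the SQL (and the parameter object, if any)
    // to the debug output, then forwards the call to Dapper as usual.
    public static IEnumerable<T> QueryLogged<T>(this IDbConnection connection, string sql, object param = null)
    {
        Debug.WriteLine("Dapper SQL: " + sql);
        if (param != null)
            Debug.WriteLine("Dapper params: " + param);
        return connection.Query<T>(sql, param);
    }
}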
You should consider using SQL Server Profiler, found in SQL Server Management Studio under Tools → SQL Server Profiler (no Dapper extensions needed; this may also work with other RDBMSs if they have a profiler tool).
Then, start a new session.
You'll get something like this for example (you see all parameters and the complete SQL string):
exec sp_executesql N'SELECT * FROM Updates WHERE CAST(Product_ID as VARCHAR(50)) = @appId AND (Blocked IS NULL OR Blocked = 0)
AND (Beta IS NULL OR Beta = 0 OR @includeBeta = 1) AND (LangCode IS NULL OR LangCode IN (SELECT * FROM STRING_SPLIT(@langCode, '','')))',N'@appId nvarchar(4000),@includeBeta bit,@langCode nvarchar(4000)',@appId=N'fea5b0a7-1da6-4394-b8c8-05e7cb979161',@includeBeta=0,@langCode=N'de'
Try Dapper.Logging.
You can get it from NuGet. The way it works is you pass your code that creates your actual database connection into a factory that creates wrapped connections. Whenever a wrapped connection is opened or closed or you run a query against it, it will be logged. You can configure the logging message templates and other settings like whether SQL parameters are saved. Elapsed time is also saved.
In my opinion, the only downside is that the documentation is sparse, but I think that's just because it's a new project (as of this writing). I had to dig through the repo for a bit to understand it and to get it configured to my liking, but now it's working great.
From the documentation:
The tool consists of simple decorators for the DbConnection and DbCommand which track the execution time and write messages to the ILogger<T>. The ILogger<T> can be handled by any logging framework (e.g. Serilog). The result is similar to the default EF Core logging behavior.
The lib declares a helper method for registering the IDbConnectionFactory in the IoC container. The connection factory is SQL provider agnostic. That's why you have to specify the real factory method:
services.AddDbConnectionFactory(prv => new SqlConnection(conStr));
After registration, the IDbConnectionFactory can be injected into classes that need a SQL connection.
private readonly IDbConnectionFactory _connectionFactory;

public GetProductsHandler(IDbConnectionFactory connectionFactory)
{
    _connectionFactory = connectionFactory;
}
The IDbConnectionFactory.CreateConnection will return a decorated version that logs the activity.
using (DbConnection db = _connectionFactory.CreateConnection())
{
    //...
}
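For completeness, a filled-in version of that block might look like this (Product and the query text are placeholders, not part of Dapper.Logging):

using (DbConnection db = _connectionFactory.CreateConnection())
{
    // the open/close and the command below end up in ILogger<T>
    var products = db.Query<Product>("SELECT Id, Name FROM Products");
}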
This is not exhaustive and is essentially a bit of a hack, but if you have your SQL and want to see the parameter values it was called with, it's useful for basic debugging. Set up this extension method, then call it anywhere as desired.
public static class DapperExtensions
{
    public static string ArgsAsSql(this DynamicParameters args)
    {
        if (args is null) throw new ArgumentNullException(nameof(args));

        var sb = new StringBuilder();
        foreach (var name in args.ParameterNames)
        {
            var pValue = args.Get<dynamic>(name);
            var type = pValue.GetType();

            if (type == typeof(DateTime))
                sb.AppendFormat("DECLARE @{0} DATETIME ='{1}'\n", name, pValue.ToString("yyyy-MM-dd HH:mm:ss.fff"));
            else if (type == typeof(bool))
                sb.AppendFormat("DECLARE @{0} BIT = {1}\n", name, (bool)pValue ? 1 : 0);
            else if (type == typeof(int))
                sb.AppendFormat("DECLARE @{0} INT = {1}\n", name, pValue);
            else if (type == typeof(List<int>))
                sb.AppendFormat("-- REPLACE @{0} IN SQL: ({1})\n", name, string.Join(",", (List<int>)pValue));
            else
                sb.AppendFormat("DECLARE @{0} NVARCHAR(MAX) = '{1}'\n", name, pValue.ToString());
        }
        return sb.ToString();
    }
}
You can then just use this in the immediate or watch windows to grab the SQL.
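For example (the parameter names below are placeholders; args is whatever DynamicParameters instance you are about to hand to Dapper):

var args = new DynamicParameters();
args.Add("fiscalYear", "2016");
args.Add("includeBeta", false);

// evaluate this in the Immediate/Watch window (or Debug.WriteLine it) and
// paste the output ahead of your SQL in SSMS to replay the query
string declarations = args.ArgsAsSql();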
Just to add an update here, since I see this question still gets quite a few hits: these days I use either Glimpse (seems it's dead now) or Stackify Prefix, which both have SQL command trace capabilities.
It's not exactly what I was looking for when I asked the original question, but they solve the same problem.

Mongo. C#. How can i execute string as mongo query

Let's say I have a string: "db.getCollection("somecollection").find({})". Can I execute this string as a query in C#? I.e., I get a string and just want to execute it as a query, but from C#.
I just want something like this:
string query = "db.getCollection("somename")";
Mongo.execute(query);
No, the best you can do in this context is to use db.RunCommand<BsonDocument>("{ ping : 1 }") (C#), which is close to the shell's db.runCommand({ ping : 1 }).
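For illustration, a minimal end-to-end version with the official MongoDB.Driver package might look like this (the connection string and database name are placeholders):

using System;
using MongoDB.Bson;
using MongoDB.Driver;

var client = new MongoClient("mongodb://localhost:27017");
var db = client.GetDatabase("test");

// Command<TResult> converts implicitly from a JSON string, so this mirrors
// the shell's db.runCommand({ ping : 1 })
BsonDocument result = db.RunCommand<BsonDocument>("{ ping : 1 }");
Console.WriteLine(result);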
UPDATE:
You may also look at How to execute mongo commands through shell scripts?. I'm not familiar with this, and on Windows with a 5.0 server it doesn't work for me in most of the cases mentioned there, other than the simple one: mongo --eval "printjson(db.serverStatus())". But if you can make the suggested mongo < script.js approach (or something similar) work, for example in the shell, you can put your arbitrary query in that file (script.js) and then pass the file as an argument when creating a Process, similar to:
using (var process = new Process())
{
    // arguments below may require the whole path to the files
    process.StartInfo.Arguments = "script.js";
    process.StartInfo.FileName = "mongo";
    process.Start();
}
To read the results, you will need to analyze the process.StandardOutput / process.StandardError streams.
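For example, a variant of the Process snippet above that captures the output instead of letting mongo write to the console (file names are placeholders):

using (var process = new Process())
{
    process.StartInfo.FileName = "mongo";
    process.StartInfo.Arguments = "script.js";
    process.StartInfo.UseShellExecute = false;        // required for redirection
    process.StartInfo.RedirectStandardOutput = true;
    process.StartInfo.RedirectStandardError = true;
    process.Start();

    string output = process.StandardOutput.ReadToEnd();
    string errors = process.StandardError.ReadToEnd();
    process.WaitForExit();
}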

ARSoft.Tools.Net SpfValidator.CheckHost() not responding

I am trying to follow this example:
https://docs.ar-soft.de/arsoft.tools.net/#SPF%20SenderIP%20Validation.html
var validator = new SpfValidator()
{
    HeloDomain = DomainName.Parse("example.com"),
    LocalDomain = DomainName.Parse("receivingmta.example.com"),
    LocalIP = IPAddress.Parse("192.0.2.1")
};

SpfQualifier result = validator.CheckHost(IPAddress.Parse("192.0.2.200"),
    DomainName.Parse("example.com"), "sender@example.com").Result;
However, no matter what IPs and domains I use, the CheckHost() method never finishes.
Does anybody know the correct use, or example input parameters for which this would complete?
I would expect an exception if inputs are invalid.
You're using it the same way I'm using it, and it works perfectly for me. Maybe you have something in your firewall blocking it from performing the lookup queries?

Oracle parameter with Dapper issue

I am struggling with using ORACLE parameters via DAPPER. The error message received is "ORA-00942: table or view does not exist".
However the code works without the parameter, and I suspect that this is a simple Oracle parameter syntax issue. The code follows:
public List<ForecastData> GetByFiscalYear(string fiscalYear)
{
    List<ForecastData> queryResults = new List<ForecastData>();
    String sqlQuery = @"SELECT RES.FISCALYEAR year FROM RESOURCE_AVAILABILITY RES WHERE RES.FISCALYEAR = :p_fiscalYear";

    using (var oraCon = new OracleConnection(System.Configuration.ConfigurationManager.ConnectionStrings["Oracle_HRZD"].ToString()))
    {
        oraCon.Open();
        queryResults = oraCon.Query<ForecastData>(sqlQuery, new { p_fiscalYear = fiscalYear }).ToList();
    }

    return new List<ForecastData>(queryResults);
}
Any assistance will be greatly appreciated...
Usually, ORA-00942 is exactly what it says: it can't find the table/view (RESOURCE_AVAILABILITY) you are selecting from. So either it's not in the schema of the user you log on as, or that user has not been granted SELECT on the table/view if it belongs to another schema.
But you say that if you remove WHERE RES.FISCALYEAR = :p_fiscalYear, it works, so it seems like you do have SELECT permission on the table. Do you mean you removed the whole WHERE clause, or have you tested with a fixed value, as in WHERE RES.FISCALYEAR = '2016'?
My other top tip is to run Wireshark and look at what is really sent to the database; usually you connect on port 1521, so filter on that.
The answer was to use the fully-qualified database-object name, including the schema. Thanks for your assistance.
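In other words, something along these lines (HRZD is just a stand-in for the actual schema name):

String sqlQuery = @"SELECT RES.FISCALYEAR year
                    FROM HRZD.RESOURCE_AVAILABILITY RES
                    WHERE RES.FISCALYEAR = :p_fiscalYear";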

Enterprise library caching parameters on stored procs?

I'm trying to standardise some data access code with my colleagues. One of the aforementioned colleagues asserts that the EntLib Data Access Block tries to cache parameters on stored proc calls.
I've had a look in Reflector and there is some evidence that it could be caching them, but I don't think it does in the following situation.
public Dictionary<long, string> GetQueue(int maxItems)
{
    var sq = new SqlDatabase(_connString.ConnectionString);
    var result = new Dictionary<long, string>();

    using (var cmd = (SqlCommand)sq.GetStoredProcCommand("dbo.GetQueue"))
    {
        sq.AddInParameter(cmd, "maxItems", DbType.Int32, maxItems);
        var reader = cmd.ExecuteReader(CommandBehavior.CloseConnection);
        while (reader.Read())
        {
            long id = reader.GetInt64(reader.GetOrdinal("id"));
            string fileName = reader.GetString(reader.GetOrdinal("meta_data_filename"));
            result.Add(id, fileName);
        }
    }

    return result;
}
Can anyone confirm or deny this?
I'm using EntLib 4.1
It definitely used to; I ripped the code out and threw it in my library.
It used sp_help and parsed the output to determine the data types.
These days I've ripped that code back out, since .NET is much, much better about adding parameters:
cmd.Parameters.AddWithValue("@name", someValue);
In your example, if you keep Reflectoring, you will find it being done down the GetStoredProcCommand() path.
You will get a Command object back, already populated with parameters.
The EntLib code is copyrighted, but the code is almost identical to this:
http://code.google.com/p/dbdotnet/source/browse/trunk/ParameterCache.cs
As far as I can tell, it doesn't cache the parameters. Using the same instance of a Database object, I called DiscoverParameters multiple times while running a trace. Each time I called DiscoverParameters I could see a [sys].[sp_procedure_params_100_managed] call, so it looks like it's making the round trip every time.
Here's an example of how to do it yourself that seems like it might be alright:
http://davidhayden.com/blog/dave/archive/2006/11/03/CachingStoredProcedureParameters.aspx
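If you do want caching, a simplified sketch of that idea (not the EntLib or article code verbatim) is to derive the parameters once per stored procedure and clone them on subsequent calls; it assumes the command is a CommandType.StoredProcedure command with an open connection:

using System.Collections.Concurrent;
using System.Data;
using System.Data.SqlClient;

public static class ParameterCache
{
    private static readonly ConcurrentDictionary<string, SqlParameter[]> _cache =
        new ConcurrentDictionary<string, SqlParameter[]>();

    // Populates cmd.Parameters, hitting the database (DeriveParameters)
    // only the first time a given stored procedure is seen.
    public static void Populate(SqlCommand cmd)
    {
        var template = _cache.GetOrAdd(cmd.CommandText, _ =>
        {
            SqlCommandBuilder.DeriveParameters(cmd);   // round trip to SQL Server
            var discovered = new SqlParameter[cmd.Parameters.Count];
            cmd.Parameters.CopyTo(discovered, 0);
            cmd.Parameters.Clear();
            return discovered;
        });

        foreach (SqlParameter p in template)
            cmd.Parameters.Add(((ICloneable)p).Clone() as SqlParameter);
    }
}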
