PostgreSQL Log - parameter details [duplicate] - c#

Is there a way to dump the generated SQL to the debug log or something? I'm using Dapper in a WinForms solution, so the mini-profiler idea won't work for me.

I had the same issue and, after searching and finding no ready-to-use solution, implemented some code myself. There is a package on NuGet, MiniProfiler.Integrations, that I would like to share.
Update V2: it now supports other database servers; for MySQL it requires MiniProfiler.Integrations.MySql.
Below are steps to work with SQL Server:
1. Instantiate the connection:
var factory = new SqlServerDbConnectionFactory(_connectionString);
using (var connection = ProfiledDbConnectionFactory.New(factory, CustomDbProfiler.Current))
{
// your code
}
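Inside the using block you run your Dapper queries against the profiled connection as usual, and the profiler captures each command. For example (the Users query below is purely illustrative, not part of the package):
// hypothetical Dapper call; anything run on the profiled connection is captured
var activeUserIds = connection.Query<int>(
    "SELECT Id FROM Users WHERE IsActive = @isActive",
    new { isActive = true });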
2. After all the work is done, write all the commands to a file if you want:
File.WriteAllText("SqlScripts.txt", CustomDbProfiler.Current.ProfilerContext.BuildCommands());

Dapper does not currently have an instrumentation point here. This is perhaps due, as you note, to the fact that we (as the authors) use mini-profiler to handle this. However, if it helps, the core parts of mini-profiler are actually designed to be architecture neutral, and I know of other people using it with winforms, wpf, wcf, etc - which would give you access to the profiling / tracing connection wrapper.
In theory, it would be perfectly possible to add some blanket capture-point, but I'm concerned about two things:
(primarily) security: since dapper doesn't have a concept of a context, it would be really really easy for malign code to attach quietly to sniff all sql traffic that goes via dapper; I really don't like the sound of that (this isn't an issue with the "decorator" approach, as the caller owns the connection, hence the logging context)
(secondary) performance: but... in truth, it is hard to say that a simple delegate-check (which would presumably be null in most cases) would have much impact
Of course, the other thing you could do is: steal the connection wrapper code from mini-profiler, and replace the profiler-context stuff with just: Debug.WriteLine etc.
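If all you need is basic tracing, here is a minimal sketch of that last idea: a thin extension method that writes the SQL and parameter values to the debug output before handing the call to Dapper. This is not MiniProfiler's wrapper code, just an illustration; the method name and the reflection over an anonymous parameter object are my own assumptions.
using System.Collections.Generic;
using System.Data;
using System.Diagnostics;
using Dapper;

public static class DebugDapper
{
    // Logs the SQL text and the parameter values, then delegates to Dapper.
    public static IEnumerable<T> QueryLogged<T>(this IDbConnection connection, string sql, object param = null)
    {
        Debug.WriteLine("SQL: " + sql);
        if (param != null)
        {
            foreach (var prop in param.GetType().GetProperties())
                Debug.WriteLine("  @" + prop.Name + " = " + prop.GetValue(param));
        }
        return connection.Query<T>(sql, param);
    }
}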

You should consider using SQL Server Profiler, found in SQL Server Management Studio under Tools → SQL Server Profiler (no Dapper extensions needed; this may also work with other RDBMSs if they have a profiler tool).
Then, start a new session.
You'll get something like this for example (you see all parameters and the complete SQL string):
exec sp_executesql N'SELECT * FROM Updates WHERE CAST(Product_ID as VARCHAR(50)) = @appId AND (Blocked IS NULL OR Blocked = 0)
AND (Beta IS NULL OR Beta = 0 OR @includeBeta = 1) AND (LangCode IS NULL OR LangCode IN (SELECT * FROM STRING_SPLIT(@langCode, '','')))',N'@appId nvarchar(4000),@includeBeta bit,@langCode nvarchar(4000)',@appId=N'fea5b0a7-1da6-4394-b8c8-05e7cb979161',@includeBeta=0,@langCode=N'de'

Try Dapper.Logging.
You can get it from NuGet. The way it works is you pass your code that creates your actual database connection into a factory that creates wrapped connections. Whenever a wrapped connection is opened or closed or you run a query against it, it will be logged. You can configure the logging message templates and other settings like whether SQL parameters are saved. Elapsed time is also saved.
In my opinion, the only downside is that the documentation is sparse, but I think that's just because it's a new project (as of this writing). I had to dig through the repo for a bit to understand it and to get it configured to my liking, but now it's working great.
From the documentation:
The tool consists of simple decorators for the DbConnection and
DbCommand which track the execution time and write messages to the
ILogger<T>. The ILogger<T> can be handled by any logging framework
(e.g. Serilog). The result is similar to the default EF Core logging
behavior.
The lib declares a helper method for registering the
IDbConnectionFactory in the IoC container. The connection factory is
SQL Provider agnostic. That's why you have to specify the real factory
method:
services.AddDbConnectionFactory(prv => new SqlConnection(conStr));
After registration, the IDbConnectionFactory can be injected into
classes that need a SQL connection.
private readonly IDbConnectionFactory _connectionFactory;

public GetProductsHandler(IDbConnectionFactory connectionFactory)
{
    _connectionFactory = connectionFactory;
}
The IDbConnectionFactory.CreateConnection will return a decorated
version that logs the activity.
using (DbConnection db = _connectionFactory.CreateConnection())
{
//...
}
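For example, a Dapper query run through the wrapped connection is logged automatically, including (if enabled) the parameter values and the elapsed time. A hedged sketch; the Products query and the handler method are my assumptions, not from the library docs:
public async Task<IReadOnlyList<string>> GetProductNames(int categoryId)
{
    using (DbConnection db = _connectionFactory.CreateConnection())
    {
        // the decorated connection writes open/close, SQL text, parameters
        // and timing to the configured ILogger<T>
        var names = await db.QueryAsync<string>(
            "SELECT Name FROM Products WHERE CategoryId = @categoryId",
            new { categoryId });
        return names.ToList();
    }
}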

This is not exhaustive and is essentially a bit of a hack, but if you have your SQL and want to see how your parameters would be initialized, it's useful for basic debugging. Set up this extension method, then call it anywhere as desired.
public static class DapperExtensions
{
    // Renders a DynamicParameters bag as DECLARE statements so the SQL can be
    // pasted straight into SSMS for debugging.
    public static string ArgsAsSql(this DynamicParameters args)
    {
        if (args is null) throw new ArgumentNullException(nameof(args));
        var sb = new StringBuilder();
        foreach (var name in args.ParameterNames)
        {
            var pValue = args.Get<dynamic>(name);
            var type = pValue.GetType();
            if (type == typeof(DateTime))
                sb.AppendFormat("DECLARE @{0} DATETIME = '{1}'\n", name, pValue.ToString("yyyy-MM-dd HH:mm:ss.fff"));
            else if (type == typeof(bool))
                sb.AppendFormat("DECLARE @{0} BIT = {1}\n", name, (bool)pValue ? 1 : 0);
            else if (type == typeof(int))
                sb.AppendFormat("DECLARE @{0} INT = {1}\n", name, pValue);
            else if (type == typeof(List<int>))
                sb.AppendFormat("-- REPLACE @{0} IN SQL: ({1})\n", name, string.Join(",", (List<int>)pValue));
            else
                sb.AppendFormat("DECLARE @{0} NVARCHAR(MAX) = '{1}'\n", name, pValue.ToString());
        }
        return sb.ToString();
    }
}
You can then just use this in the immediate or watch windows to grab the SQL.
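For example, in the immediate window (the parameter names and values here are made up):
// build the parameters as you normally would for the Dapper call
var args = new DynamicParameters();
args.Add("UserId", 42);
args.Add("CreatedAfter", new DateTime(2024, 1, 1));

// dump them as DECLARE statements you can paste above your SQL in SSMS
Console.WriteLine(args.ArgsAsSql());
// DECLARE @UserId INT = 42
// DECLARE @CreatedAfter DATETIME = '2024-01-01 00:00:00.000'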

Just to add an update here since I see this question still gets quite a few hits: these days I use either Glimpse (it seems to be dead now) or Stackify Prefix, both of which have SQL command tracing capabilities.
It's not exactly what I was looking for when I asked the original question, but they solve the same problem.

Related

Attempted to read or write protected memory when using Entity Framework

An unhandled exception of type 'System.AccessViolationException' occurred in StatCentric.Tracker.Worker.dll
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
I've read numerous posts on both Stack Overflow and various blogs and can't seem to find a solution for this.
I'm doing something very basic:
public void Execute(ITrackerRequestModel model)
{
    PageviewRequest p = (PageviewRequest)model;
    using (var db = new StatCentricEntities())
    {
        db.SetTimeout(60);
        db.sp_Log_PageView2(p.SiteId, p.DateTimeUtc, p.vid, p.QueryString, p.p, p.t);
    }
}
But this error pops up every time I try to call db.sp_Log_PageView2. This only seems to happen inside my worker role (I'm using Windows Azure).
Also worthy of note is that I'm using the Windows Azure Emulator and I am on Windows 8.1.
I've tried the following:
Doing a Winsock reset
Disabling JIT debugging (native, script, managed)
Disabling JIT debugging on module load
Following some old posts to hotfixes that seem to be specific to .NET 2.0 and discontinued
Running a memory diagnostic, with no issues, to make sure it wasn't my hardware
I am running Visual Studio as administrator and connecting to a remote SQL Server Database hosted in Azure.
Any ideas on how to resolve or further diagnose this are appreciated.
This is not a real fix, but while waiting for a fix from Microsoft you can use this workaround.
I had the same problem and also tried everything to solve it. After a few days I gave up and used a manual workaround; it only took a few minutes to copy and convert the existing sproc calls to new ones.
Just ignore the auto-generated functions and call the stored procedures manually. You can still use the auto-generated classes for the returned data. Copy and modify the existing function and you will easily get the correct parameter names and types.
Just implement a partial class in a different file:
public partial class StatCentricEntities
{
    public virtual List<sp_Log_PageView2_Result> my_sp_Log_PageView2(
        Guid? siteId,
        DateTime time,
        string param3,
        string param4)
    {
        return Database.SqlQuery<sp_Log_PageView2_Result>(
            "sp_Log_PageView2 @siteId, @time, @param3, @param4",
            new SqlParameter("siteId", siteId),
            new SqlParameter("time", time),
            new SqlParameter("param3", param3),
            new SqlParameter("param4", param4)).ToList();
    }
}
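Calling the hand-written wrapper then looks just like the generated call (a sketch; the argument values are placeholders):
using (var db = new StatCentricEntities())
{
    // bypasses the auto-generated function that triggers the protected-memory crash
    var rows = db.my_sp_Log_PageView2(siteId, DateTime.UtcNow, "value3", "value4");
}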
I was getting this "Attempted to read or write protected memory exception" error while using a SQL Server stored procedure that had an output parameter of type 'Date'. I tried various things without success and, in the interest of time, settled on the following solution.
1) Remove the output parameter of type date
2) Return a string via a select statement
SELECT CONVERT(char(10), @AsOfDate, 20) AS AsOfDate
3) Convert the string returned from the stored procedure to a DateTime value in C#
DateTime asOfDate = DateTime.Now;
using (var context = new DA.MyEntities())
{
    var procResult = context.myStoredProcedure(myParameter).FirstOrDefault();
    DateTime.TryParse(procResult, out asOfDate);
}
I'm not super happy with this compromise, but it did allow me to move forward.

view SqlXmlCommand.ExecuteCommand() on SQL server

I am having an issue with a legacy application that uses SqlXmlCommand objects to get data from the database. There is an .xsd file that describes the tables being used, their fields, their relationships, etc. The issue we are having is that it works most of the time, but not always. I am wondering if there is a way to check what is actually being run on SQL Server. I don't have SQL Profiler installed, so that option is out.
The code looks like:
SqlXmlCommand xcmd = new SqlXmlCommand(DataAccess.OleDbConnectionString);
xcmd.CommandType = SqlXmlCommandType.XPath;
xcmd.SchemaPath = Path.GetFullPath(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, @"myXsd.xsd"));
xcmd.XslPath = Path.GetFullPath(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, String.Format(@"myXsl.xsl", ReportType)));
xcmd.CommandText = "id[@PK=$PK]";
SqlXmlParameter p = xcmd.CreateParameter();
p.Name = "@PK";
p.Value = Id;
using (Stream s = xcmd.ExecuteStream()) { ... }
This blows up at the ExecuteStream() call with the error:
SQLXML: error loading XML result (XML document must have a top level element.)
We believe that some data abnormality is causing the XML to not generate properly, and that is why we want to see exactly what is run.
Cheers
You can try the two queries below. You might need to tweak them a little, but to give you an idea: the first gives you a list of all requests, and the second gives you the detail of a request by its request id (session_id).
SELECT *
FROM sys.dm_exec_requests
DBCC INPUTBUFFER (12345)
Although I would personally rather try to debug the C# app first and view what's being sent over to the server from the VS debugger, before bothering with checking what's being run on SQL Server.
Also, DBCC INPUTBUFFER might give you something like EXECUTE dbo.MyStoredProc 'params...'. To dig deeper, or as a more straightforward query, you can run this:
SELECT r.session_id, r.[status], r.command, t.[text]
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.[sql_handle]) t

Why does my SqlCacheDependency HasChanged come back false but almost immediately after changes to true?

I cannot figure out why the HasChanged value of my SqlCacheDependency object comes back from the command execution as false, but almost immediately after the data comes back from the database, the value changes to true.
Sometimes this happens before the item is even inserted into the cache, causing the cache to discard it immediately; sometimes it's after the insert, and I can grab an enumerator that sees the key in the cache, but before I even loop to that item it has been deleted.
SPROC:
ALTER PROCEDURE [dbo].[ntz_dal_ER_X_Note_SelectAllWER_ID]
    @ER_ID int
AS
BEGIN
    SELECT
        ER_X_Note_ID,
        ER_ID,
        Note_ID
    FROM dbo.ER_X_Note e
    WHERE
        ER_ID = @ER_ID
END
The database is MS SQL Server 2008, broker service is enabled, and SOME output does cache and remain cached. For instance, this one works just fine:
ALTER PROC [dbo].[ntz_dal_GetCacheControllerByEntityName] (
    @Name varchar(50)
) AS
BEGIN
    SELECT
        CacheController_ID,
        EntityName,
        CacheEnabled,
        Expiration
    FROM dbo.CacheController cc
    WHERE EntityName = @Name
END
The code which calls the SPROC in question that fails:
DataSet toReturn;
Hashtable paramHash = new Hashtable();
paramHash.Add("ER_ID", _eR_ID.IsNull ? null : _eR_ID.Value.ToString());
string cacheName = BuildCacheString("ntz_dal_ER_X_Note_SelectAllWER_ID", paramHash);
toReturn = (DataSet)GetFromCache(cacheName);
if (toReturn == null)
{
// Set up parameters (1 input and 0 output)
SqlParameter[] arParms = {
new SqlParameter("@ER_ID", _eR_ID),
};
SqlCacheDependency scd;
// Execute query.
toReturn = _dbTransaction != null
? _dbConnection.ExecuteDataset(_dbTransaction, "dbo.[ntz_dal_ER_X_Note_SelectAllWER_ID]", out scd, arParms)
: _dbConnection.ExecuteDataset("dbo.[ntz_dal_ER_X_Note_SelectAllWER_ID]", out scd, arParms);
AddToCache(cacheName, toReturn, scd);
}
return toReturn;
Code that works
const string sprocName = "ntz_dal_GetCacheControllerByEntityName";
string cacheControlPrefix = "CacheController_" + CachePrefix;
CacheControl controller = (CacheControl)_cache[cacheControlPrefix];
if (controller == null)
{
try
{
SqlParameter[] arParms = {
new SqlParameter("@Name", CachePrefix),
};
SqlCacheDependency sqlCacheDependency;
// Execute query.
DataSet result = _dbTransaction != null
? _dbConnection.ExecuteDataset(_dbTransaction, sprocName, out sqlCacheDependency, arParms)
: _dbConnection.ExecuteDataset(sprocName, out sqlCacheDependency, arParms);
controller = result.Tables[0].Rows.Count == 0
? new CacheControl(false)
: new CacheControl(result.Tables[0].Rows[0]);
_cache.Insert(cacheControlPrefix, controller, sqlCacheDependency);
}
catch (Exception ex)
{
// if sproc retrieval fails, cache the result of false so we don't keep trying;
// this is the only case where it can be added with no expiration date
controller = new CacheControl(false);
// direct cache insert, no dependency, no expiration, never try again for this entity
if (HttpContext.Current != null && UseCaching && _cache != null) _cache.Insert(cacheControlPrefix, controller);
}
}
return controller;
The AddToCache method is overloaded and has more tests in it; the direct _cache.Insert in the working method is there to bypass those other tests. The working code helps determine whether db caching should happen at all.
You can see that when the "non-working" data is retrieved initially, all is OK (HasChanged is false).
But at some random point beyond that, in this instance just by stepping into the next method, HasChanged flips to true.
And yet the data is NOT changing at all; I'm the only one touching this instance of the database.
It was really, really simple, so simple I completely overlooked it.
In this article Creating a Query for Notification, which I DID scour multiple times, it clearly states:
SET Option Settings
When a SELECT statement is executed under a notification request, the
connection that submits the request must have the options for the
connection set as follows:
ANSI_NULLS ON
ANSI_PADDING ON
ANSI_WARNINGS ON
CONCAT_NULL_YIELDS_NULL ON
QUOTED_IDENTIFIER ON
NUMERIC_ROUNDABORT OFF
ARITHABORT ON
Well, I read and re-read and RE-re-read the sproc, and I had still missed that both ANSI_NULLS and QUOTED_IDENTIFIER were OFF, not ON, when the proc was created.
My dataset is now caching and retaining the data properly without false indicators of change.
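If you want to confirm this from code rather than re-reading the proc, both flags are stored per object and exposed by sys.sql_modules. A small sketch (the connection string is assumed):
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    @"SELECT uses_ansi_nulls, uses_quoted_identifier
      FROM sys.sql_modules
      WHERE object_id = OBJECT_ID(@proc)", conn))
{
    cmd.Parameters.AddWithValue("@proc", "dbo.ntz_dal_ER_X_Note_SelectAllWER_ID");
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        if (reader.Read())
            Console.WriteLine("ANSI_NULLS={0}, QUOTED_IDENTIFIER={1}", (bool)reader[0], (bool)reader[1]);
    }
}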
I have a hunch that the issue is with your _eR_ID. I think that you should try adding a local variable to the failing procedure that uses an impossible value for _eR_ID, such as -1. I never trust what is going to happen when nulls are involved and I think this could be the source of your problem.
Here is the modified version that I recommend trying:
DataSet toReturn;
Hashtable paramHash = new Hashtable();
int local_eR_ID = _eR_ID.IsNull ? -1 : _eR_ID.Value;
paramHash.Add("ER_ID", local_eR_ID.ToString());
string cacheName = BuildCacheString("ntz_dal_ER_X_Note_SelectAllWER_ID", paramHash);
toReturn = (DataSet)GetFromCache(cacheName);
if (toReturn == null)
{
// Set up parameters (1 input and 0 output)
SqlParameter[] arParms = {
new SqlParameter("@ER_ID", local_eR_ID),
};
SqlCacheDependency scd;
// Execute query.
toReturn = _dbTransaction != null
? _dbConnection.ExecuteDataset(_dbTransaction, "dbo.[ntz_dal_ER_X_Note_SelectAllWER_ID]", out scd, arParms)
: _dbConnection.ExecuteDataset("dbo.[ntz_dal_ER_X_Note_SelectAllWER_ID]", out scd, arParms);
AddToCache(cacheName, toReturn, scd);
}
return toReturn;
Important
While creating the above code, I think I discovered the source of your problem: when setting the stored proc parameter, you are using _eR_ID but when you set the paramHash you are using _eR_ID.Value.
The code rewrite above will work either way, but I suspect that this mismatch is the root of the problem.
Running into the same issue, and finding the same answers online without any help, I was researching the "invalid subscription" XML response from Profiler.
I found an example on the MSDN support site that had a slightly different order of code. When I tried it, I realized the problem: don't open your connection object until after you've created the command object and the cache dependency object. Here is the order you must follow, and all will be good:
Be sure to enable notifications (SqlCacheDependencyAdmin) and run SqlDependency.Start first
Create the connection object
Create the command object and assign command text, type, and connection object (any combination of constructors, setting properties, or using CreateCommand).
Create the sql cache dependency object
Open the connection object
Execute the query
Add item to cache using dependency.
If you follow this order, meet all the other requirements on your SELECT statement, and don't have any permissions issues, this will work!
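Putting those steps together in one place (a minimal sketch based on the list above; the connection string, cache key and @ER_ID value are placeholders):
SqlDependency.Start(connectionString);                  // 1. notifications enabled / started first

var connection = new SqlConnection(connectionString);   // 2. create the connection, but do NOT open it yet

var command = new SqlCommand(                           // 3. create and configure the command
    "SELECT ER_X_Note_ID, ER_ID, Note_ID FROM dbo.ER_X_Note WHERE ER_ID = @ER_ID",
    connection);
command.Parameters.AddWithValue("@ER_ID", erId);

var dependency = new SqlCacheDependency(command);       // 4. create the cache dependency

connection.Open();                                      // 5. only now open the connection

var result = new DataSet();                             // 6. execute the query
using (var adapter = new SqlDataAdapter(command))
{
    adapter.Fill(result);
}
connection.Close();

HttpRuntime.Cache.Insert(cacheKey, result, dependency); // 7. add the item to the cache with the dependency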
I believe the issue has to do with how the .NET framework manages the connection, specifically what settings are set. I tried overriding this in my sql command test but it never worked. This is only a guess - what I do know is changing the order immediately solved the issue.
I was able to piece it together from the following MSDN posts.
This post covers one of the more common causes of the invalid subscription, and shows how the .NET client sets connection properties that conflict with what notifications require.
https://social.msdn.microsoft.com/Forums/en-US/cf3853f3-0ea1-41b9-987e-9922e5766066/changing-default-set-options-forced-by-net?forum=adodotnetdataproviders
Then this post was from a user who, like me, had reduced his code to the simplest format. My original code pattern was similar to his.
https://social.technet.microsoft.com/Forums/windows/en-US/5a29d49b-8c2c-4fe8-b8de-d632a3f60f68/subscriptions-always-invalid-usual-suspects-checked-no-joy?forum=sqlservicebroker
Then I found this post, also a very simple reduction of the problem, only his was a simple issue: needing two-part names for tables. In his case the suggestion resolved the issue. After looking at his code I noticed the main difference was waiting to open the connection object until AFTER the command object AND the dependency object were created. My only assumption is that under the hood (I have not yet fired up Reflector to check, so it is only an assumption) the connection object is opened differently, or the order of events and commands happens differently, because of this association.
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/bc9ca094-a989-4403-82c6-7f608ed462ce/sql-server-not-creating-subscription-for-simple-select-query-when-using-sqlcachedependency?forum=sqlservicebroker
I hope this helps someone else in a similar issue.

SQL Server perform backup with C#

I've investigated the possibilities of creating database backups through SMO with C#.
The task is quite easy and the code straightforward. I've got only one question: how can I check whether the backup was really created?
SqlBackup.SqlBackup method returns no parameters and I don't even know if it throws any exceptions. (the only thing that I know is that it is blocking, because there's also SqlBackupAsync method)
I would appreciate any help.
You can do what you asked for; it's very possible.
Doing the backup itself using SMO is not very hard; the hard part is managing the backup and the restore.
All of the code won't fit here, so I will try my best to give you the lines you need.
SqlBackup.SqlBackup doesn't return any value; it's a void function.
But it takes one parameter, which is the Server. Try out the following code:
Server srvSql;
// Connect to the server using your authentication method and load the databases into srvSql
// THEN
Backup bkpDatabase = new Backup();
bkpDatabase.Action = BackupActionType.Database;
bkpDatabase.Incremental = true;  // will take an incremental backup
bkpDatabase.Incremental = false; // will take a full backup
bkpDatabase.Database = "your DB name";
BackupDeviceItem bDevice = new BackupDeviceItem("Backup.bak", DeviceType.File);
bkpDatabase.Devices.Add(bDevice);
bkpDatabase.PercentCompleteNotification = 1; // this is for progress reporting
bkpDatabase.SqlBackup(srvSql);
bkpDatabase.Devices.Clear();
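Since PercentCompleteNotification is set to 1 above, you can also subscribe to the progress events before calling SqlBackup (a small optional addition; both events come from SMO's backup/restore base class):
bkpDatabase.PercentComplete += (sender, e) =>
    Console.WriteLine("Backup progress: {0}%", e.Percent);
bkpDatabase.Complete += (sender, e) =>
    Console.WriteLine("Backup finished: " + e.Error.Message);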
I've investigated the problem using Reflector.NET (I suppose this is legal, since Red Gate is a Microsoft Gold Certified Partner and Reflector.NET opens .NET libraries out of the box). As I found out, the method throws two types of exceptions:
FailedOperationException - in most cases; other exceptions are "translated" (I suppose translating means creating a new FailedOperationException and setting InnerException to what was actually thrown)
UnsupportedVersionException - in one case, when log truncation is set to TruncateOnly and the server major version is greater than or equal to 10 (which is SQL Server 2008?)
This solves my problem partially, because I'm not 100% sure that if something goes wrong those exceptions will actually be thrown.
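Based on that, a defensive pattern is to treat SqlBackup as successful only if it returns without throwing, and optionally run a verify pass on the file afterwards. A hedged sketch reusing the objects from the earlier answer (Restore.SqlVerify issues RESTORE VERIFYONLY):
try
{
    bkpDatabase.SqlBackup(srvSql);

    // optional: ask the server to verify the backup file that was just written
    var verify = new Restore();
    verify.Devices.Add(new BackupDeviceItem("Backup.bak", DeviceType.File));
    bool isValid = verify.SqlVerify(srvSql);
    Console.WriteLine(isValid ? "Backup verified." : "Backup file failed verification.");
}
catch (FailedOperationException ex)
{
    // most failures are translated into this, with the cause in InnerException
    Console.WriteLine("Backup failed: " + (ex.InnerException ?? ex).Message);
}
catch (UnsupportedVersionException ex)
{
    // e.g. TruncateOnly log handling against SQL Server 2008 or later
    Console.WriteLine("Backup failed: " + ex.Message);
}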

Enterprise library caching parameters on stored procs?

I'm trying to standardise some data access code with my colleagues. One of the aforementioned colleagues asserts that the EntLib Data Access Block tries to cache parameters on stored proc calls.
I've had a look in reflector and there is some evidence that it could be caching them. But I don't think it does in the following situation.
public Dictionary<long, string> GetQueue(int maxItems)
{
    var sq = new SqlDatabase(_connString.ConnectionString);
    var result = new Dictionary<long, string>();
    using (var cmd = (SqlCommand)sq.GetStoredProcCommand("dbo.GetQueue"))
    {
        sq.AddInParameter(cmd, "maxItems", DbType.Int32, maxItems);
        var reader = cmd.ExecuteReader(CommandBehavior.CloseConnection);
        while (reader.Read())
        {
            long id = reader.GetInt64(reader.GetOrdinal("id"));
            string fileName = reader.GetString(reader.GetOrdinal("meta_data_filename"));
            result.Add(id, fileName);
        }
    }
    return result;
}
Can anyone confirm or deny this?
I'm using EntLib 4.1
It definitely used to. I ripped the code out and threw it in my library; it used sp_help and parsed the output to determine the data types.
These days I've ripped that code back out, since .NET is much better about adding parameters:
cmd.Parameters.AddWithValue("@name", somevalue);
In your example, if you keep reflectoring, you will find it being done down the GetStoredProcCommand() path: you get a Command object back that is already populated with parameters.
The EntLib code is copyrighted, but it is almost identical to this:
http://code.google.com/p/dbdotnet/source/browse/trunk/ParameterCache.cs
As far as I can tell it doesn't cache the parameters. Using the same instance of a Database object, I called DiscoverParameters multiple times while running a trace. Each time I call DiscoverParameters I can see a call to [sys].[sp_procedure_params_100_managed], so it looks like it's making the round trip every time.
Here's an example of how to do it yourself that's seems like it might be alright:
http://davidhayden.com/blog/dave/archive/2006/11/03/CachingStoredProcedureParameters.aspx
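If you do want parameter caching without relying on EntLib internals, a pattern in the spirit of that article is a small cache keyed by stored proc name that calls DeriveParameters once and clones the result for later commands. A hedged sketch (not the article's code; the command must be a stored proc command with an open connection when DeriveParameters runs):
public static class StoredProcParameterCache
{
    private static readonly ConcurrentDictionary<string, SqlParameter[]> _cache =
        new ConcurrentDictionary<string, SqlParameter[]>();

    public static void Populate(SqlCommand cmd)
    {
        var prototype = _cache.GetOrAdd(cmd.CommandText, _ =>
        {
            SqlCommandBuilder.DeriveParameters(cmd); // one round trip per proc name
            var discovered = new SqlParameter[cmd.Parameters.Count];
            cmd.Parameters.CopyTo(discovered, 0);
            cmd.Parameters.Clear();
            return discovered;
        });

        foreach (var p in prototype)
            cmd.Parameters.Add((SqlParameter)((ICloneable)p).Clone());
    }
}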
