I have an MVC5 project that is divided into a main XXXX.Site and a XXXX.Data DLL that uses EF6 to connect to a SQL Server 2014 database.
When I'm in the MVC controller and press F10 at the call that runs inside the XXXX.Data DLL, all goes well. If I step into the DLL code and put a breakpoint on the actual EF call... Visual Studio simply bombs out.
I tried different things, like re-adding EF 6.1.1 to both the DLL and the MVC site, but nothing works. I tried removing EF completely and adding it back. I tried a new project with just the code to run a simple stored proc. I even tried merging the Data DLL into the MVC site, moving all DB access into the MVC site (essentially deleting the DLL itself)... but nothing worked!
This is what I noticed so far:
1) If I put a breakpoint on the code I wrote to call this stored procedure, Visual Studio simply bombs out:
try
{
    using (MyDB db = new MyDB())
    {
        // IF BREAKPOINT IS ON LINE BELOW, EXECUTION STOPS ABRUPTLY
        db.MyStoredProc(value1, value2);
    }
}
catch (Exception ex)
{
    string s = ex.Message;
    return false;
}
No exception is raised; when it happens, the browser kind of flashes a couple of times and then this message appears in the Output window:
A first chance exception of type 'System.AccessViolationException' occurred in XXXX.Data.dll
2) If instead I put the breakpoint inside the actual auto-generated EF6 code, the program runs normally:
public virtual int MyStoredProc(string value1, string value2)
{
    var value1Parameter = value1 != null
        ? new ObjectParameter("Value1", value1)
        : new ObjectParameter("Value1", typeof(string));

    var value2Parameter = value2 != null
        ? new ObjectParameter("Value2", value2)
        : new ObjectParameter("Value2", typeof(string));

    // IF BREAKPOINT IS ON LINE BELOW, EXECUTION RUNS NORMALLY
    return ((IObjectContextAdapter)this).ObjectContext.ExecuteFunction("MyStoredProc", value1Parameter, value2Parameter);
}
Note that I have both the EntityFramework and EntityFramework.SqlServer DLLs in both bin folders (MVC site and Data DLL).
Questions:
Is it an issue with SQL2014? I don't have this happening when connecting to SQL2012.
Is there a setting that will display the actual exception that occurred?
Why is VS bombing out instead of displaying the actual error?
I was getting this "Attempted to read or write protected memory exception" error while using a SQL Server stored procedure that had an output parameter of type 'Date'. I tried various things without success and, in the interest of time, settled on the following solution.
1) Remove the output parameter of type date from the stored procedure.
2) Return a string via a select statement in the stored procedure instead.
SELECT CONVERT(char(10), @AsOfDate, 20) AS AsOfDate
3) Convert the string returned from the stored procedure to a DateTime value in C#.
DateTime asOfDate = DateTime.Now;
using (var context = new DA.MyEntities())
{
    var procResult = context.myStoredProcedure(myParameter).FirstOrDefault();
    DateTime.TryParse(procResult, out asOfDate);
}
I'm not super happy with this compromise, but it did allow me to move forward.
Using Visual Studio and C#, I recently added a new bound column to a data grid and modified the stored procedure to pull the extra field. When I debug it, the column shows up fine and displays the data as I expect. When I publish the website and copy the files to the web server, the column is no longer there. It's a pretty straightforward setup, and I know the file is being copied, etc. What am I missing?
In addition to the comments above, make sure the added field also exists in the production database.
If the query throws an exception that your code swallows, you would never know.
Example:
private bool SomeMethod(string cmdText)
{
    bool result = false;
    try
    {
        result = Query(cmdText);
    }
    catch (Exception)
    {
        // Error occurred and is silently swallowed here
    }
    return result;
}
If you had the code above and had an error, you would never know.
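For what it's worth, here is a minimal sketch of one way to surface the failure instead of hiding it (Query is the same placeholder method as above, and the Trace call is just one example of a logging sink):

private bool SomeMethod(string cmdText)
{
    try
    {
        return Query(cmdText);
    }
    catch (Exception ex)
    {
        // Record the real error somewhere visible (trace listener, log file, event log)
        // before deciding what to return to the caller.
        System.Diagnostics.Trace.TraceError("Query failed for '{0}': {1}", cmdText, ex);
        return false;
    }
}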
An unhandled exception of type 'System.AccessViolationException' occurred in StatCentric.Tracker.Worker.dll
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
I've read numerous posts on both Stack Overflow and various blogs and can't seem to find a solution for this.
I'm doing something very basic:
public void Execute(ITrackerRequestModel model)
{
    PageviewRequest p = (PageviewRequest)model;

    using (var db = new StatCentricEntities())
    {
        db.SetTimeout(60);
        db.sp_Log_PageView2(p.SiteId, p.DateTimeUtc, p.vid, p.QueryString, p.p, p.t);
    }
}
But this error pops up every time I try to call db.sp_Log_PageView2. This only seems to happen inside my worker role (I'm using Windows Azure).
Also worthy of note is that I'm using the Windows Azure Emulator and I am on Windows 8.1.
I've tried the following:
Doing a winsock reset
Disabling JIT debugging (native,script, managed)
Disabling JIT debugging on module load
Followed some old posts pointing to hotfixes that seem to be specific to .NET 2.0 and discontinued.
Did a memory diagnostic with no issues to make sure it wasn't my hardware.
I am running Visual Studio as administrator and connecting to a remote SQL Server Database hosted in Azure.
Any ideas on how to resolve or further diagnose this are appreciated.
This is not a real fix, but while waiting for a fix from Microsoft you can use this workaround.
I had the same problem and also tried everything to solve it. After a few days I gave up and used a manual workaround; it only took a few minutes to copy and convert the existing sproc calls to new ones.
Just ignore the auto-generated functions and call the stored procedures manually. You can still use the auto-generated classes for the returned data. Copy and modify the existing function and you will easily get the correct parameter names and types.
Just implement a partial class in a separate file:
public partial class StatCentricEntities
{
    public virtual List<sp_Log_PageView2_Result> my_sp_Log_PageView2(
        Guid? siteId,
        DateTime time,
        string param3,
        string param4)
    {
        return Database.SqlQuery<sp_Log_PageView2_Result>(
            "sp_Log_PageView2 @siteId, @time, @param3, @param4",
            new SqlParameter("siteId", siteId),
            new SqlParameter("time", time),
            new SqlParameter("param3", param3),
            new SqlParameter("param4", param4)).ToList();
    }
}
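For completeness, a rough sketch of how the call site from the question could switch over to this hand-written wrapper (the parameter list follows the four-parameter wrapper above; extend both to match your actual procedure signature):

using (var db = new StatCentricEntities())
{
    db.SetTimeout(60);
    // Call the manual wrapper instead of the generated function import.
    var rows = db.my_sp_Log_PageView2(p.SiteId, p.DateTimeUtc, p.QueryString, p.p);
}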
I have inherited an application that runs small reports locally using Microsoft Web ReportViewer. Our application allows you to "Preview/Print" a report by clicking on a specific button that routes the user to a URL that allows them to download the report as a PDF. We have recently received the requirement to save these PDFs to the document table in our database. I have been able to get this to work successfully on localhost; however, when I publish the application to our IIS server, I get the following error:
System.Data.SqlClient.SqlException: Login failed for user 'Domain\Servername$'.
I've reviewed all of the sites I could find involving this error (including this one). Most point to adding the server account to the SQL database; however, that shouldn't be the issue here, since the preview/print button is still functional and works as expected when the application is published, and all of the data is held in a local object that was previously pulled from the database (the model parameter below). The button and the auto-generation feature use the same two methods to create the PDF document (see below).
Here's some code:
public static byte[] CreatePDFDocument(DocumentTemplateType template, Request model)
{
    Warning[] warnings;
    string[] streamIds;
    string mimeType = string.Empty;
    string encoding = string.Empty;
    string extension = string.Empty;

    ReportViewer viewer = new ReportViewer();
    viewer.ProcessingMode = ProcessingMode.Local;
    viewer.LocalReport.ReportEmbeddedResource = "Xxx.Xxx.Bll.ReportViewerRDLCs." + template.RdlcFilename;

    switch ((eDocumentType)template.DocumentTypeId)
    {
        case eDocumentType.Report1:
            viewer.LocalReport.SetParameters(GetReport1Parameters(model));
            break;
        /**
         * Several other reports are in this switch. All reports have the
         * same issue - all but one are removed for brevity.
         */
    }

    byte[] bytes = viewer.LocalReport.Render("PDF", null, out mimeType, out encoding, out extension, out streamIds, out warnings);
    return bytes;
    //return new byte[5] {5,6,7,8,9}; - used for troubleshooting.
}
public static List<ReportParameter> GetReport1Parameters(Request model)
{
    List<ReportParameter> rptParams = new List<ReportParameter>();

    //Start comment
    rptParams.Add(new ReportParameter("EmployeeFullName", string.Format("{0:NN}", model.Employee)));
    rptParams.Add(new ReportParameter("EmployeePhoneNumber", string.Format("{0:(###) ###-####}", Convert.ToInt64(model.Employee.PhoneNumber))));
    rptParams.Add(new ReportParameter("HrchyShortDesc", model.Employee.HrchyShortDesc));
    rptParams.Add(new ReportParameter("RequestDate", model.RequestDate.ToShortDateString()));
    rptParams.Add(new ReportParameter("RequestRequested", model.RequestRequestType));
    rptParams.Add(new ReportParameter("ReasonForRequest", model.RequestRequestReason));
    rptParams.Add(new ReportParameter("LogNumber", model.CaseId));

    if (!string.IsNullOrWhiteSpace(model.TimeSensitiveReason)) rptParams.Add(new ReportParameter("TimeSensitiveReason", model.TimeSensitiveReason));

    var lastAction = model.LastActionOfType(WorkflowStateActionType.EmployeeConfirmation);
    if (lastAction != null)
    {
        rptParams.Add(new ReportParameter("TodaysDate", lastAction.ActionDate.ToShortDateString()));
        rptParams.Add(new ReportParameter("EmpConfirmed", "true"));
    }
    else rptParams.Add(new ReportParameter("TodaysDate", DateTime.Now.ToShortDateString()));
    //end comment

    return rptParams;
}
Through a lot of commenting in and out and pushes to our server, I've deduced the following:
From what I can tell, the error occurs on calling GetReport1Parameters. In the code above, I included a start and end comment - I've commented out everything in between, leaving only the list initialization and return statement (of an empty list) and still received the error.
I've commented out the call to GetReport1Parameters and returned a nonsensical byte array and didn't receive an Exception.
All functionality works fine on localhost and when I step through the functions, all of the variables seem to appear normal.
Things I've tried to do to remedy the situation:
1. Removed connection strings from the app.config, so that the application has to go to the web.config to get the correct strings (even though they were the same).
2. Commented in and out different sections of code to determine the problem area.
3. Tried calling the GetReport1Parameters method and returning null, leading to a null reference exception.
4. Tried calling the GetReport1Parameters with an empty parameter list, leading to the error mentioned above.
5. Tried running the report with no parameters (not even a blank list), got a ReportProcessingException for missing params.
Some additional information:
We use a service account for the application via an impersonated identity in the web.config. That line is commented out on localhost, but is active on IIS.
All other database interaction works correctly.
All of our database interaction is done using LINQ to SQL - model is an object based off of a database table, with some additional information that is calculated dynamically.
My desired outcome is that the auto-generated documents and the preview/print documents both work. I have a feeling this may be something simple that I'm overlooking, but I've already spent several hours today trying to fix it.
I can't think of any other pertinent information, but if you have questions I'll be more than happy to answer them.
Edit: Additional attempts to find solution:
Tried setting LINQ Deferred Loading equal to false. This caused more problems than it solved.
Implemented IReportServerCredentials and assigned the ReportViewer's ServerReport.ReportServerCredentials with the correct database credentials.
Assigned all pertinent report parameters to a Dictionary, and then called .ToString() on every object to ensure that it is pulled from the database. Then assigned those strings from the dictionary to the report parameters, so that ReportViewer should be receiving the data from the string pool, as opposed to pulling it from the database.
Even though you are using an ObjectDataSource to pass data to your report, Report Viewer will still invoke the Select method, which in turn could cause database access to occur. So even though it may seem that the login is unnecessary, you would need to dig into the data access methods you supplied with your ObjectDataSource to know for sure.
The error you are getting is caused by a bug in Report Viewer 2010 that is described in the following Microsoft Connect article:
ReportViewer.LocalReport.Render and ReportViewer.LocalReport.SetParameters changes ImpersonationLevel to None
Although the article mentions this problem should be fixed in Service Pack 1, it does not appear to be the case. I have not verified if this problem is fixed in Report Viewer 2012.
I worked around the problem by changing my data access layer to compare the current identity against the one in my HttpContext and restore it if necessary using the following code snippet:
// 'context' is presumably a WindowsImpersonationContext field held by the data access layer,
// so the impersonation can be undone later.
System.Security.Principal.IIdentity id = System.Web.HttpContext.Current.User.Identity;
if (id.Name != System.Security.Principal.WindowsIdentity.GetCurrent().Name)
{
    context = (id as System.Security.Principal.WindowsIdentity).Impersonate();
}
I do this right before I connect to the database and undo it as soon as the connection is open.
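For illustration only, a sketch of that pattern under my assumptions (the helper name and the SqlConnection parameter are mine, not from the original code):

using System.Data.SqlClient;
using System.Security.Principal;
using System.Web;

static void OpenUnderRequestIdentity(SqlConnection connection)
{
    WindowsImpersonationContext impersonation = null;

    // Re-impersonate the request identity if the thread has drifted back to the process account.
    var webIdentity = HttpContext.Current.User.Identity as WindowsIdentity;
    if (webIdentity != null && webIdentity.Name != WindowsIdentity.GetCurrent().Name)
    {
        impersonation = webIdentity.Impersonate();
    }

    try
    {
        connection.Open();
    }
    finally
    {
        // Undo as soon as the connection is open, as described above.
        if (impersonation != null) impersonation.Undo();
    }
}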
I am not exactly thrilled with this workaround, mainly because now my data access layer is referencing the UI layer (System.Web).
I have an MVC3 View that's posting back a set of input values via Ajax to my controller. My controller is then creating a new FieldTripRoute object on my context and attempting to insert it into the database.
I just can't figure out what's going on. I've triple checked my Designer schema and my DB schema and they match perfectly. So it can't be the normal issue of a column not existing or being nullable in one area but not another. However, I keep receiving a "Row Not Found or Changed" exception every time I attempt to submit changes.
The stack trace on the exception looks like this:
at System.Data.Linq.ChangeProcessor.SubmitChanges(ConflictMode failureMode)
at System.Data.Linq.DataContext.SubmitChanges(ConflictMode failureMode)
at System.Data.Linq.DataContext.SubmitChanges()
at ManageMAT.Controllers.FieldTripController.RouteAdd(Int32 id, FormCollection collection)
This is the code that's being called to add the new Route object from the Controller:
[HttpPost]
public ActionResult RouteAdd(int id, FormCollection collection)
{
    FieldTrip trip = context.FieldTrips.Single(ft => ft.ID == id);
    if (trip == null) return Json(new { success = false, message = "Field trip not found." });

    try
    {
        FieldTripRoute tripRoute = new FieldTripRoute();
        tripRoute.FieldTripID = trip.ID;
        tripRoute.Date = DateTime.Parse(collection["Date"]);
        tripRoute.ArrivalTime = DateTime.Parse(collection["ArrivalTime"] + " " + DateTime.Now.ToShortDateString());
        tripRoute.DepartureTime = DateTime.Parse(collection["DepartureTime"] + " " + DateTime.Now.ToShortDateString());
        tripRoute.Destination = collection["Destination"];
        tripRoute.PickupLocation = collection["PickupLocation"];
        tripRoute.RouteID = Convert.ToInt32(collection["RouteID"]);

        context.FieldTripRoutes.InsertOnSubmit(tripRoute);
        context.SubmitChanges();

        return Json(new { success = true, message = "Success!" });
    }
    catch (Exception ex)
    {
        return Json(new { success = false, message = ex.Message });
    }
}
My Designer schema and DB table columns match column for column (screenshots omitted).
I've also attempted to view the SQL this is outputting in both the logging available on the context object and in SQL Profiler, but it seems to be failing before it's even hitting the Database server.
Edit: Forgot to add one other thing, when I'm initially creating the new FieldTripRoute object at the beginning of the Add action I noticed that it's not retrieving the correct ID from the database identity series. Perhaps this is related?
I've also tried setting the Update Check on every field in the designer to Never just to see if it was some kind of bizarre concurrency collision going on, but I am still receiving the same error.
I'm really at a loss for what could be causing this issue. Any ideas are appreciated.
This message is thrown whenever the row is not inserted, for whatever reason. For DML statements, LINQ to SQL checks the number of affected rows that SQL Server returns and expects it to be exactly one.
The big question is why the count is zero and yet no error message is being sent by SQL Server. Start SQL Server Profiler and post the SQL that L2S generates. Run that SQL manually and see what happens. Does a row get inserted? Does its identity value get returned?
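If Profiler is hard to get at, LINQ to SQL can also echo the SQL it generates through the DataContext.Log property. A minimal sketch, assuming a hypothetical DataContext type name (substitute your own):

// Hypothetical helper showing the idea; substitute your actual DataContext type and insert logic.
static string CaptureGeneratedSql()
{
    var sqlLog = new System.IO.StringWriter();
    using (var context = new ManageMATDataContext())   // hypothetical type name
    {
        context.Log = sqlLog;   // LINQ to SQL writes every generated command to this TextWriter

        // ... InsertOnSubmit / SubmitChanges exactly as in the action above ...
    }
    return sqlLog.ToString();   // inspect or log this text to see the generated INSERT
}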
Edit: More debugging ahead: Shut down SQL Server just before you do the SubmitChanges to make sure that the database is not being hit. Let's make sure to cut this branch off the search tree.
Next, step into the LINQ to SQL source code to see what's up. If you have R#, this is easy: press Ctrl-Shift-T, search for ChangeProcessor, click it, and navigate to the "sources from symbol files". Find the SubmitChanges method and put a breakpoint in there. If you don't have R#, you need to dig out a tutorial on the web for this (it's going to take about 5 minutes).
Step through the source to find why the exception is thrown.
I cannot figure out why the HasChanged value of my SqlCacheDependency object is coming back originally from the command execution as false, but somewhere almost immediately after it comes back from the database, the value changes to true.
Sometimes this happens before the item is even inserted into the cache, causing the cache to discard it immediately, sometimes it's after the insert, and I can grab an enumerator which sees the key in the cache but before I even loop to that item in the cache it's been deleted.
SPROC:
ALTER PROCEDURE [dbo].[ntz_dal_ER_X_Note_SelectAllWER_ID]
    @ER_ID int
AS
BEGIN
    SELECT
        ER_X_Note_ID,
        ER_ID,
        Note_ID
    FROM dbo.ER_X_Note e
    WHERE
        ER_ID = @ER_ID
END
The database is MS SQL Server 2008, broker service is enabled, and SOME output does cache and remain cached. For instance, this one works just fine:
ALTER PROC [dbo].[ntz_dal_GetCacheControllerByEntityName] (
    @Name varchar(50)
) AS
BEGIN
    SELECT
        CacheController_ID,
        EntityName,
        CacheEnabled,
        Expiration
    FROM dbo.CacheController cc
    WHERE EntityName = @Name
END
The code which calls the SPROC in question that fails:
DataSet toReturn;
Hashtable paramHash = new Hashtable();
paramHash.Add("ER_ID", _eR_ID.IsNull ? null : _eR_ID.Value.ToString());
string cacheName = BuildCacheString("ntz_dal_ER_X_Note_SelectAllWER_ID", paramHash);

toReturn = (DataSet)GetFromCache(cacheName);
if (toReturn == null)
{
    // Set up parameters (1 input and 0 output)
    SqlParameter[] arParms = {
        new SqlParameter("@ER_ID", _eR_ID),
    };

    SqlCacheDependency scd;

    // Execute query.
    toReturn = _dbTransaction != null
        ? _dbConnection.ExecuteDataset(_dbTransaction, "dbo.[ntz_dal_ER_X_Note_SelectAllWER_ID]", out scd, arParms)
        : _dbConnection.ExecuteDataset("dbo.[ntz_dal_ER_X_Note_SelectAllWER_ID]", out scd, arParms);

    AddToCache(cacheName, toReturn, scd);
}

return toReturn;
Code that works
const string sprocName = "ntz_dal_GetCacheControllerByEntityName";
string cacheControlPrefix = "CacheController_" + CachePrefix;

CacheControl controller = (CacheControl)_cache[cacheControlPrefix];
if (controller == null)
{
    try
    {
        SqlParameter[] arParms = {
            new SqlParameter("@Name", CachePrefix),
        };

        SqlCacheDependency sqlCacheDependency;

        // Execute query.
        DataSet result = _dbTransaction != null
            ? _dbConnection.ExecuteDataset(_dbTransaction, sprocName, out sqlCacheDependency, arParms)
            : _dbConnection.ExecuteDataset(sprocName, out sqlCacheDependency, arParms);

        controller = result.Tables[0].Rows.Count == 0
            ? new CacheControl(false)
            : new CacheControl(result.Tables[0].Rows[0]);

        _cache.Insert(cacheControlPrefix, controller, sqlCacheDependency);
    }
    catch (Exception ex)
    {
        // if sproc retrieval fails, cache the result of false so we don't keep trying;
        // this is the only case where it can be added with no expiration date
        controller = new CacheControl(false);

        // direct cache insert, no dependency, no expiration, never try again for this entity
        if (HttpContext.Current != null && UseCaching && _cache != null) _cache.Insert(cacheControlPrefix, controller);
    }
}

return controller;
The AddToCache method is overloaded and has more tests in it; The direct _cache.Insert in the working method is to bypass those other tests. The working code helps determine if db caching should happen at all.
You can see that when the "non-working" data is retrieved initially, all is OK and HasChanged is false. But somewhere random beyond that point, in this instance just stepping into the next method, HasChanged flips to true.
And yet the data is NOT changing at all; I'm the only one touching this instance of the database.
It was really, really simple, so simple I completely overlooked it.
In this article Creating a Query for Notification, which I DID scour multiple times, it clearly states:
SET Option Settings
When a SELECT statement is executed under a notification request, the
connection that submits the request must have the options for the
connection set as follows:
ANSI_NULLS ON
ANSI_PADDING ON
ANSI_WARNINGS ON
CONCAT_NULL_YIELDS_NULL ON
QUOTED_IDENTIFIER ON
NUMERIC_ROUNDABORT OFF
ARITHABORT ON
Well, I read and re-read and RE-re-read the sproc, and I still didn't see that both ANSI_NULLS and QUOTED_IDENTIFIER were "OFF", not ON.
My dataset is now caching and retaining the data properly without false indicators of change.
I have a hunch that the issue is with your _eR_ID. I think that you should try adding a local variable to the failing procedure that uses an impossible value for _eR_ID, such as -1. I never trust what is going to happen when nulls are involved and I think this could be the source of your problem.
Here is the modified version that I recommend trying:
DataSet toReturn;
Hashtable paramHash = new Hashtable();

int local_eR_ID = _eR_ID.IsNull ? -1 : _eR_ID.Value;
paramHash.Add("ER_ID", local_eR_ID.ToString());
string cacheName = BuildCacheString("ntz_dal_ER_X_Note_SelectAllWER_ID", paramHash);

toReturn = (DataSet)GetFromCache(cacheName);
if (toReturn == null)
{
    // Set up parameters (1 input and 0 output)
    SqlParameter[] arParms = {
        new SqlParameter("@ER_ID", local_eR_ID),
    };

    SqlCacheDependency scd;

    // Execute query.
    toReturn = _dbTransaction != null
        ? _dbConnection.ExecuteDataset(_dbTransaction, "dbo.[ntz_dal_ER_X_Note_SelectAllWER_ID]", out scd, arParms)
        : _dbConnection.ExecuteDataset("dbo.[ntz_dal_ER_X_Note_SelectAllWER_ID]", out scd, arParms);

    AddToCache(cacheName, toReturn, scd);
}

return toReturn;
Important
While creating the above code, I think I discovered the source of your problem: when setting the stored proc parameter, you are using _eR_ID but when you set the paramHash you are using _eR_ID.Value.
The code rewrite will solve this problem, but I suspect that this is the root of the problem.
Running into the same issue and finding the same answers online without any help, I started researching the invalid subscription XML response shown in Profiler.
I found an example on the MSDN support site that had a slightly different order of code. When I tried it, I realized the problem: don't open your connection object until after you've created the command object and the cache dependency object. Here is the order you must follow and all will be good:
Be sure to enable notifications (SqlCacheDependencyAdmin) and run SqlDependency.Start first
Create the connection object
Create the command object and assign command text, type, and connection object (any combination of constructors, setting properties, or using CreateCommand).
Create the sql cache dependency object
Open the connection object
Execute the query
Add item to cache using dependency.
If you follow this order, and follow all other requirements on your select statement, don't have any permissions issues, this will work!
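To make that ordering concrete, here is a hedged sketch (the connection string, cache key, and parameter value are illustrative; the procedure is the one from the question):

using System.Data;
using System.Data.SqlClient;
using System.Web;
using System.Web.Caching;

static DataTable LoadControllerWithDependency(string connectionString)
{
    // Enable notifications once per app domain (e.g. in Application_Start):
    // SqlDependency.Start(connectionString);

    // 1. Create the connection object -- do NOT open it yet.
    using (var connection = new SqlConnection(connectionString))
    // 2. Create the command object and assign text, type, and connection.
    using (var command = new SqlCommand("dbo.ntz_dal_GetCacheControllerByEntityName", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.AddWithValue("@Name", "SomeEntity");

        // 3. Create the cache dependency BEFORE opening the connection.
        var dependency = new SqlCacheDependency(command);

        // 4. Open the connection and 5. execute the query.
        connection.Open();
        var table = new DataTable();
        table.Load(command.ExecuteReader());

        // 6. Add the item to the cache with the dependency attached.
        HttpRuntime.Cache.Insert("CacheController_SomeEntity", table, dependency);
        return table;
    }
}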
I believe the issue has to do with how the .NET framework manages the connection, specifically what settings are set. I tried overriding this in my sql command test but it never worked. This is only a guess - what I do know is changing the order immediately solved the issue.
I was able to piece it together from the following MSDN posts.
This post describes one of the more common causes of the invalid subscription, and shows how the .NET client sets connection options that conflict with what notification requires.
https://social.msdn.microsoft.com/Forums/en-US/cf3853f3-0ea1-41b9-987e-9922e5766066/changing-default-set-options-forced-by-net?forum=adodotnetdataproviders
Then this post was from a user who, like me, had reduced his code to the simplest format. My original code pattern was similar to his.
https://social.technet.microsoft.com/Forums/windows/en-US/5a29d49b-8c2c-4fe8-b8de-d632a3f60f68/subscriptions-always-invalid-usual-suspects-checked-no-joy?forum=sqlservicebroker
Then I found this post, also a very simple reduction of the problem, only his turned out to be a simple issue: the tables needed two-part names. In his case that suggestion resolved the issue. After looking at his code I noticed the main difference was waiting to open the connection object until AFTER the command object AND the dependency object were created. My only assumption is that under the hood (I have not yet fired up Reflector to check, so it is only an assumption) the connection object is opened differently, or the order of events and commands happens differently, because of this association.
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/bc9ca094-a989-4403-82c6-7f608ed462ce/sql-server-not-creating-subscription-for-simple-select-query-when-using-sqlcachedependency?forum=sqlservicebroker
I hope this helps someone else in a similar issue.