ASP.NET Web API internal server error 500 - C#

On my local server the API returns JSON, but when I publish it to the remote server I get a 500 Internal Server Error. I can't run the hosted site in debug mode; on localhost it seems fine. The remote server is IIS 8.5 (X-AspNet-Version: 4.0.30319) and my local server is IIS 8.
The HTML response is:
500 - Internal server error.
There is a problem with the resource you are looking for, and it cannot be displayed.
API controller:
public class loginapiController : ApiController
{
    [System.Web.Http.HttpGet]
    public UserProjects UserProjects(int id)
    {
        return new UserProjects(id);
    }
}
public class UserProjects
{
    public IEnumerable<Project> Projects = new List<Project>();
    public List<Project> ExistingUserProjects = new List<Project>();
    public int UserId { get; set; }

    public UserProjects(int userId)
    {
        UserId = userId;
        var context = new AuditConnection();
        var existingProjectIds = context.UserProjects.Where(up => up.UserId == userId).Select(p => p.ProjectId);
        foreach (var id in existingProjectIds)
        {
            var project = context.Projects.Where(p => p.ProjectId == id).First();
            ExistingUserProjects.Add(project);
        }
        var proj = context.Projects.ToList();
        Projects = proj.Except(ExistingUserProjects);
    }
}
You can look at the exception using this link http://sentry.landsea.com.au/api/loginapi/UserProjects/4

EDIT: given the error
"There is already an open DataReader associated with this Command which must be closed first."
I believe this answer will explain your issue. Your best bet is to let the reader finish the first command before starting another (you can do this by materializing the first query with ToList()), or to collapse the queries down to a single call (I believe you could do this by refactoring your queries into one that uses SelectMany()).
Keep in mind that turning on MARS may be covering up a bigger problem, and you should understand it fully before you go that route. For a web application especially, the lifetime scope of your DbContext should be set up as the life of a single HTTP request. Doing that is trivial if you're properly disposing of your context manually, or if you're using a dependency injection framework.
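As an illustrative sketch only (reusing the constructor from the question), the first option looks like this: materialize the id query with ToList() so only one DataReader is open at a time, and collapse the per-id loop into a single Contains query.

```csharp
public UserProjects(int userId)
{
    UserId = userId;
    using (var context = new AuditConnection())
    {
        // ToList() executes the query immediately and closes its DataReader,
        // so the later queries don't run while this one is still streaming.
        var existingProjectIds = context.UserProjects
            .Where(up => up.UserId == userId)
            .Select(up => up.ProjectId)
            .ToList();

        // Contains translates to a SQL IN clause, replacing the per-id loop.
        ExistingUserProjects = context.Projects
            .Where(p => existingProjectIds.Contains(p.ProjectId))
            .ToList();

        Projects = context.Projects.ToList().Except(ExistingUserProjects);
    }
}
```

The using block also disposes the context when the constructor finishes, which the original code never did.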

For me the URL shows a clear error message: "Message":"There is already an open DataReader associated with this Command which must be closed first." You can only have one DataReader open per connection at a time (without MARS), so the code has to be fixed.

Related

Connection string not correct on EF 6 Azure Function

I have a very simple Azure function which updates rows on a database.
This works fine locally via Postman.
This is the very simple call I have in my Azure function
string connectionString = ConfigurationManager.ConnectionStrings["CatsDBEntities"].ConnectionString;
using (var context = new CatsDBEntities(connectionString))
{
    // using (var db = new CatsDBEntities())
    Cat cat = new Cat
    {
        Name = "BengA",
        Id = id
    };
    context.Cats.Add(cat);
    context.SaveChanges();
    response = req.CreateResponse(HttpStatusCode.OK);
}
catch (System.Exception)
And here is my context
public CatsDBEntities(string connectionString) : base(connectionString) { }

protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    throw new UnintentionalCodeFirstException();
}

public virtual DbSet<Cat> Cats { get; set; }
Here is my connection string in local settings (I know this doesn't matter when deploying to Azure but I am using this as an example of my connection string):
metadata=res://*/CatModel.csdl|res://*/CatModel.ssdl|res://*/CatModel.msl;provider=System.Data.SqlClient;provider connection string='data source=x.database.windows.net;initial catalog=CatsDB;user id=x;password=!;MultipleActiveResultSets=True;App=EntityFramework'
and within Azure I am entering my connection string the same way (the screenshot of the Azure portal setting is not included here).
This works great locally through Postman (I get a 200), but when deployed to Azure I get a 500. This is my latest error, although the exact error means little at this point: I have tried several times now, and as I keep changing the string the error sometimes changes and sometimes doesn't.
I have added the EF6 database first project as a separate project to my Azure function and I have put a reference between the two projects.
As you mentioned in the post, the EF6 database-first model is added as a separate project, so:
This is caused by how EF model-first connection strings are generated.
The EF connection string builder requires a plain connection string in the constructor, not the metadata-wrapped form.
You can refer to this SO thread for more information.
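One hedged sketch of that advice: store a plain SQL connection string in the Function's application settings and wrap it into the metadata form yourself with EntityConnectionStringBuilder. The setting name and the CatModel resource paths below are assumptions based on the question.

```csharp
// Plain connection string from an Azure Function app setting (name assumed).
var plain = Environment.GetEnvironmentVariable("SqlConnectionString");

// Build the metadata-wrapped EF6 connection string at runtime.
var builder = new System.Data.Entity.Core.EntityClient.EntityConnectionStringBuilder
{
    Provider = "System.Data.SqlClient",
    ProviderConnectionString = plain,
    Metadata = "res://*/CatModel.csdl|res://*/CatModel.ssdl|res://*/CatModel.msl"
};

using (var context = new CatsDBEntities(builder.ToString()))
{
    // ... use the context as usual
}
```

This keeps the Azure portal setting free of the `metadata=res://...` prefix, which the portal's connection string fields tend to mangle.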

Getting 404 Not Found only when re-creating database in tests for 2nd API call

I'm trying to create API integration tests for legacy code I've inherited.
Currently I have a piece of testing code that:
recreate database (using Fluent Migrations)
starts a web app (Owin.Hosting)
makes api call to get auth token
makes api call to authorized endpoint
It works perfectly if I skip first step and do only 2) 3) 4).
It's a bit weird that I'm also able to do steps 1) 2) 3) (so the auth API call works with database recreation included).
I thought my Web API wasn't working properly, but the basic path works when I don't recreate the database. Then I thought maybe it wasn't working at all when I recreate the database, but I am able to authorize a user. I have no clue what to try next.
[Collection("Database Create collection")]
public class RoleControllerTests : IDisposable
{
    private readonly IDisposable _server;
    private readonly string _url = new Configuration().ServerUrl;

    public RoleControllerTests()
    {
        _server = WebApp.Start<Startup>(_url);
    }

    public void Dispose()
    {
        _server.Dispose();
    }

    [Fact]
    public async Task basic_roles_should_exist_in_the_database()
    {
        // Arrange
        var roleApi = RestClient.For<IRoleController>(_url);
        IAuthorize auth = new Authorize();
        roleApi.AuthenticationHeader = await auth.GetAuthenticationHeaderAsync();

        // Act
        var rolesData = await roleApi.List();

        // Assert
        rolesData.ShouldContain(x => x.Name == "User");
        rolesData.ShouldContain(x => x.Name == "Displayer");
    }
}
So I've switched the testing framework to NUnit and it's working.
I have no idea why; does xUnit have some issues with changing things at runtime?
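One thing worth checking before switching frameworks: in xUnit, per-collection setup such as database recreation normally lives in a collection fixture, so it runs exactly once before any test in the collection instead of interleaving with server startup. A minimal sketch (the fixture class name and the migration call are assumptions, not from the original code):

```csharp
// Runs once, before any test class in the collection is constructed.
public class DatabaseFixture : IDisposable
{
    public DatabaseFixture()
    {
        // Recreate the database here, e.g. run the Fluent Migrations runner.
    }

    public void Dispose()
    {
        // Optional teardown after the last test in the collection.
    }
}

[CollectionDefinition("Database Create collection")]
public class DatabaseCollection : ICollectionFixture<DatabaseFixture>
{
    // No code needed; this class only binds the fixture to the collection name
    // that RoleControllerTests already uses in its [Collection] attribute.
}
```

If the recreation step currently runs inside a test or a constructor, moving it into a fixture like this avoids racing the Owin server startup.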

MVC: EF6 Connection Pooling and SQL Server CONTEXT_INFO

In an ASP.NET MVC application, I'm trying to use SQL Server's CONTEXT_INFO to pass the currently logged in user so my audit triggers record not only the web server login, but also the login of the site.
I'm having trouble being certain that the current user will always be fed into the database server context though.
On the backend I have everything set up, a sproc to set the context, a function to pull it and DML triggers to record, no problem.
The app end is a bit more involved. I subscribe to the Database.Connection.StateChange event so I can catch each newly opened connection and set this context accordingly.
Additionally, to be able to retrieve the current login ID of the MVC site in the data layer (which has no access to the web project), I supply a delegate to the EF constructor that will return the user ID. This also means that any other peripheral projects I have set up require this dependency as well, and it keeps most of the implementation detail out of my hair during the web dev:
public class CoreContext : DbContext
{
    Func<int> _uidObtainer;

    public CoreContext(Func<int> uidObtainer) : base(nameof(CoreContext)) { construct(uidObtainer); }
    public CoreContext(Func<int> uidObtainer, string connection) : base(connection) { construct(uidObtainer); }

    void construct(Func<int> uidObtainer)
    {
        // disallow updates of the db from our models
        Database.SetInitializer<CoreContext>(null);
        // catch the connection state change so we can update for our userID
        _uidObtainer = uidObtainer;
        Database.Connection.StateChange += connectionStateChanged;
    }

    private void connectionStateChanged(object sender, System.Data.StateChangeEventArgs e)
    {
        // only act when the connection has just transitioned to Open
        if (e.OriginalState == System.Data.ConnectionState.Open ||
            e.CurrentState != System.Data.ConnectionState.Open)
        {
            return;
        }

        // set our context info for logging
        int uid = _uidObtainer();
        var conn = ((System.Data.Entity.Core.EntityClient.EntityConnection)sender).StoreConnection;
        var cmd = conn.CreateCommand();
        cmd.CommandText = "audit.SetContext";
        cmd.CommandType = System.Data.CommandType.StoredProcedure;
        cmd.Parameters.Add(new System.Data.SqlClient.SqlParameter("@DomainUserID", uid));
        cmd.ExecuteNonQuery();
    }

    // etc etc...
In my MVC project, I'll have code that looks like this:
context = new Data.CoreContext(() => AppService.UserID());
(making use of a readily accessible method to pass as delegate, which in turn reads from HttpContext.Current.User)
This is all shaping up nicely, except one unknown:
I know that it's possible for an EF context instance to span multiple logged-in users, as it lives as part of the IIS app pool and not per HttpContext.
What I don't know enough about is connection pooling and how connections are opened and re-opened, so I can't be sure that each time my StateChange handler runs I'll actually be retrieving the new UserID from the delegate.
Said differently: is it possible for a single connection to be open and used over the span of two separate HttpContext instances? I believe yes, seeing as how there's nothing to enforce otherwise (at least not that I'm aware of).
What can I do to ensure that each connection is getting the current HttpContext?
(possibly pertinent notes: There's no UoW/Repository pattern outside of EF itself, and data contexts are generally instantiated once per controller)
I see: one context per controller is generally incorrect. Instead I should be using one context per request, which (besides other advantages) ensures my scenario operates correctly as well.
I found this answer, which explains the reasoning behind it: One DbContext per web request... why?
And I found this answer, which explains quite succinctly how to implement via BeginRequest and EndRequest: One DbContext per request in ASP.NET MVC (without IOC container)
(code from second answer pasted below to prevent linkrot)
protected virtual void Application_BeginRequest()
{
    HttpContext.Current.Items["_EntityContext"] = new EntityContext();
}

protected virtual void Application_EndRequest()
{
    var entityContext = HttpContext.Current.Items["_EntityContext"] as EntityContext;
    if (entityContext != null)
        entityContext.Dispose();
}
And in your EntityContext class...
public class EntityContext
{
    public static EntityContext Current
    {
        get { return HttpContext.Current.Items["_EntityContext"] as EntityContext; }
    }
}
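With that wired up, code anywhere in the request pipeline can grab the shared per-request context through the static property. A hypothetical usage sketch (EntityContext is assumed to derive from DbContext and expose your DbSets; the Projects set is purely illustrative):

```csharp
public ActionResult Index()
{
    // Same instance for the entire request; disposed in Application_EndRequest,
    // so every StateChange during this request sees this request's user.
    var db = EntityContext.Current;
    var projects = db.Projects.ToList();
    return View(projects);
}
```

Because the context never outlives the request, the uidObtainer delegate can safely read HttpContext.Current.User without risking a stale user from a pooled connection.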

Entity Framework 6.1.3, 2 Web Applications 1 SQL Database, Browser Cache Issues

The Problem:
No matter how sure I am that my transactions are committed and that the application can read the absolute latest from the database, sometimes when I refresh the application the changes aren't displayed, and I suspect the data is being cached in the browser. This suspicion comes from loading the web application in another browser after the changes are made and seeing the changes there. I need to make sure that every time the page is refreshed, this data is not served from the browser cache.
Setup:
One Web Application simply reads from the database, while AJAX calls are made client side to a REST API that adds, deletes, and updates the data.
I have been doing a lot of research and the most valuable resource I have found on this was here: http://mehdi.me/ambient-dbcontext-in-ef6/
My code that reads from the database uses this pattern:
public IEnumerable<Link> GetLinks()
{
    using (var context = new MyContext())
    {
        foreach (var link in context.ChangeTracker.Entries())
        {
            link.Reload();
        }
        return context.Links.Where(x => x.UserId == this.UserId).ToList();
    }
}
An example of one of my operations that writes follows this pattern:
public int AddLink(string text, string url)
{
    using (var context = new MyContext())
    {
        Link linkresult;
        using (var contextTransaction = context.Database.BeginTransaction())
        {
            var link = new Link()
            {
                Text = text,
                Url = url,
                UserId = this.UserId
            };
            linkresult = context.Links.Add(link);
            context.SaveChanges();
            contextTransaction.Commit();
        }
        return linkresult.Id;
    }
}
Now as shown above, with context.SaveChanges() followed by contextTransaction.Commit(), I'm making sure that the data gets written to the database and is not cached at any level. I have confirmed this by using Server Explorer and watching the content get updated in real time.
I also think I have confirmed that my read will pull up the latest information from the database by loading the web application in another browser after the changes have been made, but I acknowledge that this may also be a caching issue that I am not aware of.
My last step is getting around the caching that happens in the browser. I know chrome allows you to clear your app hosted data, but I don't know how to make certain data is not cached so that every time a request happens this code executes.
More Details on the REST API:
The Controller for the above example looks something nearly identical to this:
public ActionResult AddLink(MyLink model)
{
    IntegrationManager manager = new IntegrationManager(System.Web.HttpContext.Current.User);
    model.Id = manager.AddLink(model.Text, model.Url);
    return Json(model);
}
The IntegrationManager is just a basic class; it does not implement IDisposable because the context is created and disposed of during each transaction. As can be seen, AddLink is a member of the IntegrationManager class.
More Details on the Web Application:
The model for the view creates an IntegrationManager in its constructor as a temporary variable to make the GetLinks call as follows:
public Home(IPrincipal user, Cache cache)
{
    this.HttpCache = cache;
    IntegrationManager _IntegrationManager = new IntegrationManager(user);
    this.Links = this.GetLinks(_IntegrationManager);
}
AJAX Call:
.on("click", "#btn-add-link", function (event) {
    var text = $("#add-link-text"),
        url = $("#add-link-url");
    if (text.val().length > 0 && url.val().length > 0) {
        var hasHttp = /^http.*$/.test(url.val());
        if (!hasHttp) {
            url.val("http://" + url.val());
        }
        $.ajax({
            url: addLinkUrl,
            type: "POST",
            data: { Text: text.val(), Url: url.val() }
        }).success(function (data) {
            var newLink = $('<li class="ui-state-default deletable" id="link-' + data.Id + '">' + data.Text + '</li>');
            $("#user-links").append(newLink);
            text.val("");
            url.val("");
        });
    }
});
Okay, so I have found out how to make sure no caching happens. The following is an attribute for the controller of the web application called NoCache. To use it, your controller will need the attribute like this:
using whatever.namespace.nocache.lives.in
[NoCache]
Here are the details of the attribute:
public class NoCacheAttribute : ActionFilterAttribute
{
    public override void OnResultExecuting(ResultExecutingContext filterContext)
    {
        filterContext.HttpContext.Response.Cache.SetExpires(DateTime.UtcNow.AddDays(-1));
        filterContext.HttpContext.Response.Cache.SetValidUntilExpires(false);
        filterContext.HttpContext.Response.Cache.SetRevalidation(HttpCacheRevalidation.AllCaches);
        filterContext.HttpContext.Response.Cache.SetCacheability(HttpCacheability.NoCache);
        filterContext.HttpContext.Response.Cache.SetNoStore();
        base.OnResultExecuting(filterContext);
    }
}
I'm still looking into whether I need everything that is included, because it significantly increases the time the page takes to load.

Access to disposed closure warning using Azure Transient Fault Handling Retry Policy

We have a worker role which processes records and sends Azure Service Bus messages as needed based on the results of a query; it is basically a queue-processing service. As part of SQL Azure best practices, we have wrapped all of our query statements in a retry policy (this detects transient errors and retries based on the defined policy). Note that we actually send the message from within the using statement, so there is no 'leak' of the db variable.
Inside of our using statement, ReSharper is throwing up the 'Access to Disposed Closure' warning, most likely because we are passing our DataContext as a func parameter of the retry policy.
My question is, am I OK in my assumption that ReSharper is not detecting this pattern correctly or are there alternative methods in how we write these functions in order to prevent the warning above?
The Code
The db variable in the retryPolicy.ExecuteAction is what is getting flagged
using (var db = new MyEntities())
{
    var thingsToUpdate = retryPolicy.ExecuteAction(() => db.QueueTable.Where(x => x.UpdateType == "UpdateType" && x.DueNext < DateTime.UtcNow).Take(30).ToList());
    if (!thingsToUpdate.Any())
    {
        return;
    }
    while (thingsToUpdate.Any())
    {
        var message = new ServiceMessage
        {
            Type = "UpdateType",
            Requests = thingsToUpdate.Select(x => new ServiceMessageRequest
            {
                LastRan = x.LastRan,
                ParentItemId = x.ThingId,
                OwnerId = x.Thing.ForiegnKeyid
            }).ToList()
        };
        SendMessage("UpdateType", message);
        foreach (var thing in thingsToUpdate)
        {
            thing.LastRan = DateTime.UtcNow;
            thing.DueNext = DateTime.UtcNow.AddMinutes(10);
        }
        retryPolicy.ExecuteAction(() => db.SaveChanges());
        thingsToUpdate = retryPolicy.ExecuteAction(() => db.QueueTable.Where(x => x.UpdateType == "UpdateType" && x.DueNext < DateTime.UtcNow).Take(30).ToList());
    }
}
Additional Information
I also posted this to the ReSharper forums for a broader audience and this particular issue was addressed in a little more detail over there. For posterity, you can find the question here.
I guess your ExecuteAction executes your lambda immediately. In that case you should annotate the lambda parameter of your ExecuteAction method with ReSharper's [InstantHandle] attribute.
For example:
public void ExecuteAction([InstantHandle] Action action)
{
    ...
}
You can either reference JetBrains.Annotations.dll to get this attribute, or just copy all of the attributes into your project. See more info on the JetBrains site here and here.
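If you go the copy-into-your-project route, a minimal local definition of the attribute looks roughly like this (a sketch; the complete annotated source with XML docs is available from JetBrains):

```csharp
using System;

namespace JetBrains.Annotations
{
    // Tells ReSharper that the delegate passed to this parameter is executed
    // while the method is on the stack, so captured variables (like the db
    // context in the using block) are not accessed after disposal.
    [AttributeUsage(AttributeTargets.Parameter)]
    public sealed class InstantHandleAttribute : Attribute { }
}
```

ReSharper matches the attribute by name and namespace, so a local copy suppresses the "Access to disposed closure" warning the same way the DLL reference does.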
