Reduce the number of queries in an ASP.NET controller for polling data - C#

I'm trying to make a request every 10 seconds to a Web API controller in ASP.NET. The controller has to check in the database (SQL Server) whether new data is available (using Entity Framework). Here is the controller side:
public class RemoteCommandController : ApiController
{
    public Command Get(string id)
    {
        Command command = null;
        ProxyCommand proxycommand = null;
        Device deviceSearch;
        using (var systemDB = new DB())
        {
            deviceSearch = systemDB.Devices.Include("CommandList").Where(d => d.Name.Equals(id)).SingleOrDefault();
            command = deviceSearch.CommandList.LastOrDefault();
        }
        if (command.IsExecuted == false)
        {
            proxycommand = ConvertingObjects.FromCommandToProxy(command);
        }
        return proxycommand;
    }
}
I want to avoid querying the DB every time this controller is called. Is there a way to improve this controller in order to reduce the number of queries?
I was thinking of using Session, but I don't think that's a good idea...

Just use output caching. Add this attribute to your controller:
[OutputCache(Duration=10, VaryByParam="none")]
public class RemoteCommandController : ApiController
{
//etc.....
Increase or decrease the duration as needed. If the command list never changes, you can set the duration absurdly high and the controller will only get called once.
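If the MVC [OutputCache] attribute turns out not to take effect on your ApiController (it was built for MVC controllers rather than Web API), you can get a similar effect with a small server-side cache keyed by device id, which also avoids serving one device's command to another. A rough sketch using System.Runtime.Caching.MemoryCache, reusing the types from the question; the 10-second window and the NoCommand sentinel are assumptions:
using System;
using System.Linq;
using System.Runtime.Caching;
using System.Web.Http;

public class RemoteCommandController : ApiController
{
    // Process-wide cache; each entry expires 10 seconds after it was added.
    private static readonly MemoryCache Cache = MemoryCache.Default;
    private static readonly object NoCommand = new object();

    public ProxyCommand Get(string id)
    {
        string cacheKey = "command:" + id;

        object hit = Cache.Get(cacheKey);
        if (hit != null)
        {
            // The NoCommand sentinel maps back to null, so "nothing new" is cached too.
            return hit as ProxyCommand;
        }

        ProxyCommand proxyCommand = null;
        using (var systemDB = new DB())
        {
            var device = systemDB.Devices
                .Include("CommandList")
                .SingleOrDefault(d => d.Name == id);
            var command = device != null ? device.CommandList.LastOrDefault() : null;
            if (command != null && !command.IsExecuted)
            {
                proxyCommand = ConvertingObjects.FromCommandToProxy(command);
            }
        }

        Cache.Add(cacheKey, (object)proxyCommand ?? NoCommand, DateTimeOffset.UtcNow.AddSeconds(10));
        return proxyCommand;
    }
}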

Related

How to add Health Checks to Swagger

After looking through many articles and not finding a clear answer, I would like to raise the topic of adding Health Checks to Swagger in ASP.NET Core one more time.
Firstly, I would like to ask whether it is a good idea to do that, and how to do it in the easiest way.
Thanks in advance for all answers.
First question: why do we need Health Checks?
When we create Health Checks, we can create very granular, specific checks for certain services, which helps us greatly when diagnosing issues with our application infrastructure, as we can easily see which service/dependency is performing poorly. Our application may still be up and running, but in a degraded state that we can't easily see by simply using the application, so having Health Checks in place gives us a better understanding of what a healthy state of our application looks like.
Instead of relying on our users reporting an issue with the application, we can monitor our application health constantly and be proactive in understanding where our application isn’t functioning correctly and make adjustments as needed.
Here is a simple demo of a database health check.
First, write a controller and inject HealthCheckService into it.
[Route("[controller]")]
[ApiController]
[AllowAnonymous]
public class HealthController : ControllerBase
{
private readonly HealthCheckService healthCheckService;
public HealthController(HealthCheckService healthCheckService)
{
this.healthCheckService = healthCheckService;
}
[HttpGet]
public async Task<ActionResult> Get()
{
HealthReport report = await this.healthCheckService.CheckHealthAsync();
var result = new
{
status = report.Status.ToString(),
errors = report.Entries.Select(e => new { name = e.Key, status = e.Value.Status.ToString(), description = e.Value.Description.ToString() })
};
return report.Status == HealthStatus.Healthy ? this.Ok(result) : this.StatusCode((int)HttpStatusCode.ServiceUnavailable, result);
}
}
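Calling GET /health then returns a body shaped like the anonymous object above, roughly (the values here are only an example):
{
  "status": "Unhealthy",
  "errors": [
    {
      "name": "sql",
      "status": "Unhealthy",
      "description": "There is something wrong in database."
    }
  ]
}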
Then, in Program.cs (.NET 6), configure the health check to test whether the database's query function works correctly:
//.....
string connectionString = builder.Configuration.GetConnectionString("default");
builder.Services.AddHealthChecks().AddCheck("sql", () =>
{
    string sqlHealthCheckDescription = "Tests that we can connect and select from the database.";
    string sqlHealthCheckUnHealthDescription = "There is something wrong in database.";
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        try
        {
            connection.Open();
            // You can specify the table to test, or test other functions in the database
            SqlCommand command = new SqlCommand("SELECT TOP(1) id from dbo.students", connection);
            command.ExecuteNonQuery();
        }
        catch (Exception ex)
        {
            //Log.Error(ex, "Exception in sql health check");
            return HealthCheckResult.Unhealthy(sqlHealthCheckUnHealthDescription);
        }
    }
    return HealthCheckResult.Healthy(sqlHealthCheckDescription);
});
//......
Result:
Swagger will expose this health check endpoint.
When the query function works fine in the database, it will return 200.
When there is something wrong in the database, it will return 503.
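For reference, the surrounding Program.cs wiring that makes Swagger pick up this controller is just the usual .NET 6 minimal-hosting setup; a sketch (the connection string name matches the snippet above, everything else is the default template):
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();

// Health check registration from the snippet above goes here:
// builder.Services.AddHealthChecks().AddCheck("sql", () => { ... });

var app = builder.Build();

app.UseSwagger();
app.UseSwaggerUI();

app.MapControllers(); // HealthController is discovered like any other [ApiController]

app.Run();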

Proper API calls and server-side data storage - Blazor

I'm trying to learn Blazor while bringing some of my ideas to life, as I think the best way to learn is by practice. I want to build a card viewer for a CCG (collectible card game) and I've stumbled upon a roadblock which I hope someone can help me with :)
As I don't want to store the card database myself, I'm using a 3rd-party source to download the required information - and I have a problem with its storage afterwards.
I made a server-side controller that fetches the mentioned DB (after some sorting with my own class to filter out some unnecessary data). What I wanted to achieve is a list of all the cards, and on click, fetching the info about a specific one by ID.
My controller looks like this:
[Route("[controller]")]
[ApiController]
public class CardController : ControllerBase
{
private List<Card> Cards = new();
[HttpGet("{id}")]
public ActionResult Get(int id)
{
var card = Cards.Find(o => o.Id == id);
if (card != null)
{
return Ok(card);
}
return NotFound("Card not found");
}
[HttpGet("List")]
public ActionResult List()
{
return Ok(Cards);
}
[HttpGet("GetAllCards")]
public async Task<List<Card>> GetAllCards()
{
if (Cards.Count() == 0)
{
Cards = await GetAllCardsPro();
}
return Cards;
}
public async Task<List<Card>> GetAllCardsPro()
{
var client = new HttpClient();
HttpResponseMessage response = await client.GetAsync("https://db.ygoprodeck.com/api/v7/cardinfo.php");
response.EnsureSuccessStatusCode();
string responseBody = await response.Content.ReadAsStringAsync();
YgoProMap.Rootobject obj = JsonConvert.DeserializeObject<YgoProMap.Rootobject>(responseBody);
if (obj != null)
{
foreach (var card in obj.data)
{
Cards.Add(new Card
{
Id = card.id,
Name = card.name,
Type = card.type,
Desc = card.desc,
Atk = card.atk,
Def = card.def,
Level = card.level,
Race = card.race,
Attribute = card.attribute,
Archetype = card.archetype,
Linkval = card.linkval,
Scale = card.scale,
Images = new string[] { card.card_images.First().image_url, card.card_images.First().image_url_small }
});
}
}
return Cards;
}
}
The problem occurs when I try to call Get(id) to get the info for a specific card: it always returns 404. If I manually populate the Cards list with a new Card(some data) I am able to get that one specific card's information. So I would assume that after the call to the 3rd-party API, the Cards list is only passed to the client once and remains empty on the server.
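To illustrate that suspicion: as far as I understand, controllers are created per request, so an instance field like Cards starts empty on every call. Something like a singleton store would probably have to hold the data instead; a rough sketch of what I mean (CardStore is just a name I made up):
// A singleton holder for the fetched cards, registered once for the app's lifetime.
public class CardStore
{
    public List<Card> Cards { get; } = new();
}

// Program.cs
builder.Services.AddSingleton<CardStore>();

// CardController would then take CardStore in its constructor instead of keeping its own List<Card>,
// so Get(id) and GetAllCards() see the same data across requests.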
So my 1st question here is: how should I store the data fetched from the 3rd-party API to make it accessible afterwards? Should it be somewhere in the server startup with a daily refresh? Is it even possible to store it as, for example, a JSON file?
My 2nd question is about best practices performance-wise, as at the moment I'm passing the whole list of Cards to the client (~10 thousand objects). Would it be better to pass a separate array that only stores card.Name and card.Id for it to be bound to a UI list/table, and then on click fetch that specific card's full data? Or are 10 thousand objects not a huge deal, and I should just pass the whole list and do all the specific-card operations client-side?
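To make the second idea concrete, the lighter payload I have in mind would be something like this (CardSummary is just a placeholder name):
public record CardSummary(int Id, string Name);

[HttpGet("List")]
public ActionResult List() => Ok(Cards.Select(c => new CardSummary(c.Id, c.Name)));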
Sorry for making it so long but I've tried to be as specific as I can :D

Is Parallel.ForEach the best way to improve the speed of executing this code?

I have code that I need to rewrite to improve the execution speed of the original:
Data class:
public class Data
{
    public string Id { get; set; }
    // ... other properties
}
Services (there are more than 2, I'm just giving you 2 as an example):
public class SomeService1
{
    public Result Process(Data data)
    {
        // Load data from different services here
    }
}
public class SomeService2
{
    public Result Process(Data data)
    {
        // Load data from different services here
    }
}
Actual method:
public void Calculate(List<Data> datas)
{
    Result result;
    SomeService1 someService1 = new SomeService1();
    SomeService2 someService2 = new SomeService2();
    // At this point the list has about 2000 elements
    foreach (var data in datas)
    {
        switch (data.Id)
        {
            case "1":
                result = someService1.Process(data);
                break;
            case "2":
                result = someService2.Process(data);
                break;
            default:
                result = null;
                break;
        }
        ProcesAndSaveDataToDatabase(result);
    }
}
The Calculate method takes a List as a parameter; for every element in this list it grabs data from an outside service (the service is determined by the Id in the Data class), then it processes this data and saves it to the database. For 2000 elements the whole operation takes about 8 minutes, and roughly 70% of that time is spent gathering data from the outside services. I have to reduce that time. I have only one idea for how to do it, but to be honest I can't test it with real data because the data exists only in the production environment (and testing on production is a bad idea). Can you look at my idea and advise me whether I'm going in the right direction?
Data class:
public class Data
{
    public string Id { get; set; }
    // ... other properties
}
Services (there are more than 2, I'm just giving you 2 as an example):
public class SomeService1 : IService
{
    public Result Process(Data data)
    {
        // Load data from different services here
    }
}
public class SomeService2 : IService
{
    public Result Process(Data data)
    {
        // Load data from different services here
    }
}
IService:
public interface IService
{
    Result Process(Data data);
}
Actual method:
public void Calculate(List<Data> datas)
{
    // Different lists split by Id
    var split = from data in datas group data by data.Id into newDatas select newDatas;

    Parallel.ForEach(split, new ParallelOptions { MaxDegreeOfParallelism = 4 }, datas =>
    {
        Result result;
        IService service = GetService(datas.FirstOrDefault().Id);
        if (service == null) return;
        foreach (var data in datas)
        {
            result = service.Process(data);
            ProcesAndSaveDataToDatabase(result);
        }
    });
}
private IService GetService(string id)
{
    IService service = null;
    if (id == null) return service;
    switch (id)
    {
        case "1":
            service = new SomeService1();
            break;
        case "2":
            service = new SomeService2();
            break;
    }
    return service;
}
The idea is to split the data for the different services across different threads. So if the list has 20 items with Id = 1 and 10 items with Id = 2, it should create 2 separate threads and process them independently, which should cut down the execution time. Is this a good way to do it? Are there any other ways to improve this code?
Thanks
Parallel.ForEach helps improve CPU-bound tasks, but you mention above that you are calling services in parallel, which is IO-bound. Whenever you do IO-bound work (like calling an external service) you are better off using async and await instead of Parallel.ForEach.
Parallel.ForEach will spin up multiple threads and block those threads until the work is done (approx. 8 min with all threads blocked).
Async and await will weave worker threads between service calls and effectively use IO completion ports to call back into your application. This avoids blocking multiple threads and allows you to use your computer's resources more efficiently.
More info on how to make parallel asynchronous calls here:
https://msdn.microsoft.com/en-us/library/mt674880.aspx
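To make that concrete, here is a rough sketch of the async shape, assuming the services can expose an asynchronous ProcessAsync (a hypothetical method; the original Process is synchronous) and that ProcesAndSaveDataToDatabase stays as it is:
public async Task CalculateAsync(List<Data> datas)
{
    // Group by Id so each group maps to one service, as in your proposed version.
    var groups = datas.GroupBy(d => d.Id);

    var groupTasks = groups.Select(async group =>
    {
        IService service = GetService(group.Key);
        if (service == null) return;

        foreach (var data in group)
        {
            // Await the IO-bound call instead of blocking a thread on it.
            Result result = await service.ProcessAsync(data); // hypothetical async variant of Process
            ProcesAndSaveDataToDatabase(result);
        }
    });

    // Run the groups concurrently and wait for all of them to finish.
    await Task.WhenAll(groupTasks);
}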
While you'll reap the benefits of using parallelism (Parallel.ForEach) in your application, that is not the only way of improving the execution speed of the code.
Also, since you are using LINQ in your application, and you might be using it extensively as well, you may want to use PLINQ (Parallel LINQ) wherever possible.
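For example, a PLINQ version of the grouping-and-processing step might look roughly like this (just a sketch of the idea, reusing the GetService helper from the question):
datas.GroupBy(d => d.Id)
    .AsParallel()
    .WithDegreeOfParallelism(4)
    .ForAll(group =>
    {
        var service = GetService(group.Key);
        if (service == null) return;
        foreach (var data in group)
        {
            // Still synchronous and still blocking threads, but spread across cores.
            ProcesAndSaveDataToDatabase(service.Process(data));
        }
    });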
I'd also suggest that you try profiling your code, to identify the hotspots and bottlenecks in your application, which might give you a better idea of understanding where and how you can improve the performance.
Also, as mentioned by Patrick you should try using async and await wherever possible.
Check out this article from MSDN that'll give you more insights https://msdn.microsoft.com/en-us/library/ff963552.aspx

Entity Framework 6.1.3, 2 Web Applications 1 SQL Database, Browser Cache Issues

The Problem:
No matter how sure I am that my transactions are committed and that the application can read the absolute latest from the database, sometimes when I refresh the application the page doesn't display the latest data, and I suspect the data is being cached in the browser! This suspicion comes from loading the web application in another browser after the changes are made and seeing the changes there. I need to make sure that every time the page is refreshed, this data is not served from the browser cache.
Setup:
One Web Application simply reads from the database, while AJAX calls are made client side to a REST API that adds, deletes, and updates the data.
I have been doing a lot of research and the most valuable resource I have found on this was here: http://mehdi.me/ambient-dbcontext-in-ef6/
My code that reads from the database uses this pattern:
public IEnumerable<Link> GetLinks()
{
    using (var context = new MyContext())
    {
        foreach (var link in context.ChangeTracker.Entries())
        {
            link.Reload();
        }
        return context.Links.Where(x => x.UserId == this.UserId).ToList();
    }
}
An example of one of my operations that writes follows this pattern:
public int AddLink(string text, string url)
{
    using (var context = new MyContext())
    {
        Link linkresult;
        using (var contextTransaction = context.Database.BeginTransaction())
        {
            var link = new Link()
            {
                Text = text,
                Url = url,
                UserId = this.UserId
            };
            linkresult = context.Links.Add(link);
            context.SaveChanges();
            contextTransaction.Commit();
        }
        return linkresult.Id;
    }
}
Now, as shown above, with the context.SaveChanges() and the contextTransaction.Commit() I'm making sure that the data gets written to the database and is not cached at any level. I have confirmed this by using Server Explorer and watching the content get updated in real time.
I also think I have confirmed that my read will pull up the latest information from the database by loading the web application in another browser after the changes have been made, but I acknowledge that this may also be a caching issue that I am not aware of.
My last step is getting around the caching that happens in the browser. I know Chrome allows you to clear your app-hosted data, but I don't know how to make certain that this data is not cached, so that this code executes every time a request happens.
More Details on the REST API:
The controller for the above example looks nearly identical to this:
public ActionResult AddLink(MyLink model)
{
    IntegrationManager manager = new IntegrationManager(System.Web.HttpContext.Current.User);
    model.Id = manager.AddLink(model.Text, model.Url);
    return Json(model);
}
The IntegrationManager is just a basic class that does not implement IDisposable because the context is created and disposed of during each transaction. As can be seen, the AddLink is a member of the IntegrationManager class.
More Details on the Web Application:
The model for the view creates an IntegrationManager in its constructor as a temporary variable to make the GetLinks call, as follows:
public Home(IPrincipal user, Cache cache)
{
    this.HttpCache = cache;
    IntegrationManager _IntegrationManager = new IntegrationManager(user);
    this.Links = this.GetLinks(_IntegrationManager);
}
AJAX Call:
.on("click", "#btn-add-link", function (event) {
var text = $("#add-link-text"),
url = $("#add-link-url");
if (text.val().length > 0 && url.val().length > 0) {
var hasHttp = /^http.*$/.test(url.val());
if (!hasHttp) {
url.val("http://" + url.val());
}
$.ajax({
url: addLinkUrl,
type: "POST",
data: { Text: text.val(), Url: url.val() }
}).success(function (data) {
var newLink = $('<li class="ui-state-default deletable" id="link-' + data.Id + '">' + data.Text + '</li>');
$("#user-links").append(newLink);
text.val("");
url.val("");
});
Okay, so I have found out how to make sure no caching happens. The following is an attribute for the controller of the web application called NoCache. To use it, your controller will need the attribute like this:
using whatever.namespace.nocache.lives.in
[NoCache]
Here is the details of the attribute:
public class NoCacheAttribute : ActionFilterAttribute
{
    public override void OnResultExecuting(ResultExecutingContext filterContext)
    {
        filterContext.HttpContext.Response.Cache.SetExpires(DateTime.UtcNow.AddDays(-1));
        filterContext.HttpContext.Response.Cache.SetValidUntilExpires(false);
        filterContext.HttpContext.Response.Cache.SetRevalidation(HttpCacheRevalidation.AllCaches);
        filterContext.HttpContext.Response.Cache.SetCacheability(HttpCacheability.NoCache);
        filterContext.HttpContext.Response.Cache.SetNoStore();
        base.OnResultExecuting(filterContext);
    }
}
I'm still looking into the details of whether or not I need everything that is included because it does increase the time the page takes to load significantly.
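If decorating the whole controller turns out to be too heavy-handed, the same attribute can be scoped to just the actions that serve the data you never want cached; a sketch (HomeController and Index here are placeholders for whatever renders the links):
public class HomeController : Controller
{
    [NoCache] // only this response gets the no-cache headers
    public ActionResult Index()
    {
        // ... build the model that calls GetLinks() ...
        return View();
    }
}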

Handling data access in multi tenant site

I would appreciate some pointers regarding data access/control in an MVC-based multi-tenant site:
Is there a better/more secure/more elegant way to make sure that in a multi-tenant site each user can handle only their own data?
There are a number of tenants using the same app: firstTenant.myapp.com, secondTenant.myapp.com...
//
// GET: /Customer/
// show this tenant's customer info only
public ViewResult Index()
{
    // get TenantID from the server cache
    int TenantID = Convert.ToInt16(new AppSettings()["TenantID"]);
    return View(context.Customers.Where(c => c.TenantID == TenantID).ToList());
}
If a user logs in for the first time and there is no server-side cache for this tenant/user, AppSettings checks the DB and stores the TenantID in the cache.
Each table in database contains the field TenantID and is used to limit access to data only to appropriate Tenant.
So, to come to the point: instead of checking in each action of each controller whether the data belongs to the current tenant, can I do something more 'productive'?
Example:
When a firstTenant admin tries editing some info for user 4, the URL is:
http://firstTenant.myapp.com/User/Edit/4
Let's say that the user with ID 2 belongs to secondTenant. An admin from firstTenant puts
http://firstTenant.myapp.com/User/Edit/2 in the URL, and tries to get info which is not owned by his company.
In order to prevent this, in the controller I check whether the info being edited is actually owned by the current tenant.
//
// GET: /User/Edit/
public ActionResult Edit(int id)
{
    // set tenant ID
    int TenantID = Convert.ToInt32(new AppSettings()["TenantID"]);
    // check if the requested info is actually owned by this tenant
    User user = context.Users.Where(u => u.TenantID == TenantID).SingleOrDefault(u => u.UserID == id);
    // in case this tenant doesn't have this user ID, i.e. the returned User == null,
    // something is wrong, so handle the bad request
    return View(user);
}
Basically this sort of check needs to be placed in every controller where there is access to any data. Is there a better way to handle this (filters, attributes...), and if so, how?
I chose to use action filters to do this. It may not be the most elegant solution, but it is the cleanest of the solutions we've tried so far.
I keep the tenant (in our case, it's a team) in the URL like this: https://myapp.com/{team}/tasks/details/1234
I use custom bindings to map {team} into an actual Team object so my action methods look like this:
[AjaxAuthorize, TeamMember, TeamTask("id")]
public ActionResult Details(Team team, Task id)
The TeamMember attribute verifies that the currently logged in user actually belongs to the team. It also verifies that the team actually exists:
public class TeamMemberAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        base.OnActionExecuting(filterContext);
        var httpContext = filterContext.RequestContext.HttpContext;
        Team team = filterContext.ActionParameters["team"] as Team;
        long userId = long.Parse(httpContext.User.Identity.Name);
        if (team == null || team.Members.Where(m => m.Id == userId).Count() == 0)
        {
            httpContext.Response.StatusCode = 403;
            ViewResult insufficientPermissions = new ViewResult();
            insufficientPermissions.ViewName = "InsufficientPermissions";
            filterContext.Result = insufficientPermissions;
        }
    }
}
Similarly, the TeamTask attribute ensures that the task in question actually belongs to the team.
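Following the same pattern, a TeamTask filter might look roughly like this (a sketch; the task's TeamId property and the exact shape of the check are assumptions):
public class TeamTaskAttribute : ActionFilterAttribute
{
    private readonly string parameterName;

    public TeamTaskAttribute(string parameterName)
    {
        this.parameterName = parameterName;
    }

    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        base.OnActionExecuting(filterContext);
        Team team = filterContext.ActionParameters["team"] as Team;
        Task task = filterContext.ActionParameters[parameterName] as Task;
        // Reject the request if the task doesn't belong to the team in the URL.
        if (team == null || task == null || task.TeamId != team.Id)
        {
            filterContext.HttpContext.Response.StatusCode = 403;
            filterContext.Result = new ViewResult { ViewName = "InsufficientPermissions" };
        }
    }
}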
Since my app is using subdomains (sub1.app.com, sub2.app.com, ...), I basically chose to:
a) use something like the following code to cache info about tenants, and
b) call an action filter on each controller, as suggested by Ragesh & Doc.
(The following code is from the blog at http://www.developer.com/design/article.php/10925_3801931_2/Introduction-to-Multi-Tenant-Architecture.htm)
/// <summary>
/// This class is used to manage the cached AppSettings
/// from the database
/// </summary>
public class AppSettings
{
    /// <summary>
    /// This indexer is used to retrieve AppSettings from memory
    /// </summary>
    public string this[string Name]
    {
        get
        {
            // See if we have an AppSettings cache item
            if (HttpContext.Current.Cache["AppSettings"] == null)
            {
                int? TenantID = 0;
                // Look up the URL and get the tenant info
                using (ApplContext dc = new ApplContext())
                {
                    Site result = dc.Sites
                        .Where(a => a.Host == HttpContext.Current.Request.Url.Host.ToLower())
                        .FirstOrDefault();
                    if (result != null)
                    {
                        TenantID = result.SiteID;
                    }
                }
                AppSettings.LoadAppSettings(TenantID);
            }

            Hashtable ht = (Hashtable)HttpContext.Current.Cache["AppSettings"];
            if (ht.ContainsKey(Name))
            {
                return ht[Name].ToString();
            }
            else
            {
                return string.Empty;
            }
        }
    }

    /// <summary>
    /// This method is used to load the app settings from the
    /// database into memory
    /// </summary>
    public static void LoadAppSettings(int? TenantID)
    {
        Hashtable ht = new Hashtable();
        // Now load the AppSettings
        using (ShoelaceContext dc = new ShoelaceContext())
        {
            // settings are turned off
            // no specific settings per user needed currently
            //var results = dc.AppSettings.Where(a =>
            //    a.in_Tenant_Id == TenantID);
            //foreach (var appSetting in results)
            //{
            //    ht.Add(appSetting.vc_Name, appSetting.vc_Value);
            //}
            ht.Add("TenantID", TenantID);
        }
        // Add it to the cache (have the cache expire after 1 hour)
        HttpContext.Current.Cache.Add("AppSettings",
            ht, null,
            System.Web.Caching.Cache.NoAbsoluteExpiration,
            new TimeSpan(1, 0, 0),
            System.Web.Caching.CacheItemPriority.NotRemovable, null);
    }
}
If you want to execute common code like this on every Action in the Controller, you can do this:
protected override void OnActionExecuting(ActionExecutingContext filterContext)
{
    base.OnActionExecuting(filterContext);
    // do your magic here, you can check the session and/or call the database
}
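For example, a base controller can resolve the tenant once per request and expose it to every action; a sketch that reuses the AppSettings cache shown above (and assumes a context field as in the question):
public abstract class TenantControllerBase : Controller
{
    // Resolved once per request and available to every action.
    protected int TenantID { get; private set; }

    protected override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        base.OnActionExecuting(filterContext);
        TenantID = Convert.ToInt32(new AppSettings()["TenantID"]);
    }
}

// Controllers then inherit from it and always filter by TenantID:
public class CustomerController : TenantControllerBase
{
    public ViewResult Index()
    {
        return View(context.Customers.Where(c => c.TenantID == TenantID).ToList());
    }
}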
We have developed a multi-tenant application using ASP.NET MVC as well, and including the tenant ID in every query is a completely acceptable and really necessary thing to do. I'm not sure where you are hosting your application, but if you can use SQL Azure they have a product called Federations that allows you to easily manage multi-tenant data. One nice feature is that when you open the connection you can specify the tenant ID, and all queries executed thereafter will only affect that tenant's data. It essentially includes the tenant ID in every request for you so you don't have to do it manually. (Note that federating data is not a new concept; Microsoft just released their own implementation of it recently.)
