I am developing an application in ASP.NET 4.5 with a web service. The web service is accessed by jQuery using a GET Ajax call. The method has an 'id' parameter and returns different content for each id.
I want to cache that data both on the server and on the client. On the server I already know how to use 'CacheDuration', but the problem is with the client side. I use HttpFox to inspect the headers of each response.
I've set up the following code at the beginning of the method:
// Requires: using System.Reflection;
HttpCachePolicy cache = HttpContext.Current.Response.Cache;
cache.SetCacheability(HttpCacheability.Private);
cache.SetExpires(DateTime.Now.AddMinutes(10));
cache.SetMaxAge(new TimeSpan(0, 10, 0));

// SetMaxAge alone is not always honoured for web service responses,
// so force the private _maxAge field to 10 minutes via reflection.
FieldInfo maxAgeField = cache.GetType().GetField(
    "_maxAge", BindingFlags.Instance | BindingFlags.NonPublic);
maxAgeField.SetValue(cache, new TimeSpan(0, 10, 0));
What I want to do is prevent the client from sending a request again for the same id, for, let's say, 10 minutes. So on the first call the request is made and we get a 200 response; on the second call the response status should be 304.
Right now the code above doesn't do that, and I want to know how to achieve it. Again, I am talking about client-side caching that returns 304, so the client won't go back to the server for the same 'id' until the cache has expired.
I know that it can be done in code, so please don't suggest IIS-based solutions.
I need your help to solve this.
You will need to implement the Last-Modified header:
Response.Cache.SetLastModified(lastWriteTime.Value.ToUniversalTime());
This will allow the client to check the last-modified value and not refetch.
You could in theory fake the last-modified time, rounding it to the nearest 10 minutes, by using:
public static DateTime Round10(this DateTime value)
{
    var ticksIn10Mins = TimeSpan.FromMinutes(10).Ticks;

    // Round up to the next 10-minute boundary...
    DateTime dtReturn = (value.Ticks % ticksIn10Mins == 0)
        ? value
        : new DateTime((value.Ticks / ticksIn10Mins + 1) * ticksIn10Mins);

    // ...then step back 10 minutes if that puts us in the future.
    if (dtReturn > DateTime.Now)
    {
        return dtReturn.AddMinutes(-10);
    }
    else
    {
        return dtReturn;
    }
}
Response.Cache.SetLastModified(DateTime.Now.Round10());
This code is untested though
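For the 304 itself, the server also has to honour the conditional request the browser sends once it holds a Last-Modified value. A rough, untested sketch of that check (the TrySend304 helper and the early-return pattern are purely illustrative, not a framework API):

// Hypothetical helper: call it at the top of the web method.
// Returns true when a 304 has been sent and the method can return early.
private static bool TrySend304(HttpContext context, DateTime lastModified)
{
    string ifModifiedSince = context.Request.Headers["If-Modified-Since"];
    DateTime clientDate;
    if (ifModifiedSince != null
        && DateTime.TryParse(ifModifiedSince, out clientDate)
        // allow 1s slack because HTTP dates only have second precision
        && clientDate.ToUniversalTime() >= lastModified.ToUniversalTime().AddSeconds(-1))
    {
        context.Response.StatusCode = 304;        // Not Modified
        context.Response.SuppressContent = true;  // headers only, no body
        return true;
    }

    // First request (or stale copy): advertise Last-Modified so the
    // browser can send If-Modified-Since next time.
    context.Response.Cache.SetCacheability(HttpCacheability.Private);
    context.Response.Cache.SetLastModified(lastModified);
    return false;
}

Calling it as something like if (TrySend304(HttpContext.Current, DateTime.Now.Round10())) return ...; before building the normal response should give the 200-then-304 sequence the question describes, at least under these assumptions.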
I have developed an application which sends SMS via Twilio.
I need to track SMS statuses (deliveries, failures, etc.), and for this reason I use the web hooks Twilio provides.
Of course I save each sent SMS into the database for various purposes and then also update the SMS status in the DB.
I achieve this using the smsId that Twilio returns when sending the SMS; this id is saved into the DB to track deliveries as mentioned above.
Actual question:
The web hook posts the status to my endpoint very fast, so fast that I don't even expect the SMS to have been saved in the DB yet.
I tried some optimizations, but nothing helped; the web hook is still faster, so the resource is not in the DB yet and of course I get a NullReferenceException.
Dealing with that, I came to a solution: try reading the resource multiple times with some delays, and eventually it gets the resource and everything is fine (a sketch of that retry loop follows the code sample below).
However, I don't like querying the database multiple times within such a short span of time.
What other solutions are there?
Code sample:
var response = _notificationSender.SendSms(new SendSmsRequest
{
Text = text,
ToNumber = contact.Mobile ?? contact.Phone
});
_db.Database.ExecuteSqlCommand(@"INSERT INTO [Table] ([field], ...) VALUES (Field1, ...)");
SendSms:
public SendSmsResponse SendSms(SendSmsRequest request)
{
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
TwilioClient.Init(_twilioConfiguration.AccountSid, _twilioConfiguration.Token);
var message = MessageResource.Create(
body: request.Text,
from: new Twilio.Types.PhoneNumber(FromNumber),
statusCallback: new Uri(_twilioConfiguration.StatusCallbackUrl),
to: new Twilio.Types.PhoneNumber(request.ToNumber)
);
return new SendSmsResponse
{
Ok = true,
From = _twilioConfiguration.FromNumbers.First().Value,
Sid = message.Sid,
ErrorCode = (TwillioErrorCode?)message.ErrorCode,
SmsStatus = message.Status
};
}
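For reference, the retry-with-delays workaround described above looks roughly like the sketch below. The SmsRecords set, the SmsRecord entity and the delay values are assumptions, since the real schema isn't shown; it needs using System.Linq; and using System.Threading.Tasks;.

// Illustrative only: in the status web hook, poll the DB a few times with a
// growing delay before giving up. Entity and property names are made up.
private async Task<SmsRecord> WaitForSmsAsync(string sid)
{
    for (var attempt = 0; attempt < 5; attempt++)
    {
        var sms = _db.SmsRecords.FirstOrDefault(s => s.Sid == sid);
        if (sms != null)
        {
            return sms;
        }

        // The INSERT from the sending code hasn't committed yet; back off briefly.
        await Task.Delay(TimeSpan.FromMilliseconds(200 * (attempt + 1)));
    }

    return null; // still missing after all retries
}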
I am using the Microsoft Graph SDK to create a batch request that contains individual requests for 20 different users. When I call GetNextLinkAsync() the result is always null. I have tried requesting 1000 different users using batch requests, each containing 20 individual requests. This works fine; the response is always returned in a single batch response.
I can't understand why the response is returned in a single batch response rather than giving me a link to fetch the next response.
Even though the next link is always null, how would I follow it using the Graph SDK? It is a string; it's not like a next-page request.
foreach (var batchRequest in batchRequests)
{
try
{
var responses = await PostBatchRequest(batchRequest.Request);
foreach (var id in batchRequest.RequestIds)
{
try
{
var user = await responses.GetResponseByIdAsync<User>(id);
users.Add(user.UserPrincipalName, user.Id);
} catch (ServiceException e)
{
logger.LogInformation(e.StatusCode.ToString());
}
}
} catch (ServiceException e)
{
logger.LogInformation(e.StatusCode.ToString());
}
}
I couldn't find proper documentation that explains how to follow the next link using the Graph SDK, or why it is always null for this type of request. Are there any special types of requests for which a next link is returned?
So a lot of this is framework stuff that's inside my wrapper classes, but the crux of the solution to your issue should be inside here:
var results = await batch.ParseAsync<ContactFolderContactsCollectionResponse, IContactFolderContactsCollectionPage>(response => {
    var page = response.Value;
    if (response?.AdditionalData != null && response.AdditionalData.ContainsKey("@odata.nextLink"))
    {
        page.InitializeNextPageRequest(Application.GraphConnection.Client, (string)response.AdditionalData["@odata.nextLink"]);
    }
    return page;
});
In this snippet I'm parsing out a ContactFolderContactsCollectionResponse from a batch with steps generated from a get request that would normally return a IContactFolderContactsCollectionPage. The ContactFolderContactsCollectionResponse is fetched by the wrapper internals using
GetResponseByIdAsync<ContactFolderContactsCollectionResponse>(id)
So it's pretty analogous to what you're doing, except that there is probably some kind of UserResponse type that you should be using instead of User.
In my case the ContactFolderContactsCollectionResponse contains the IContactFolderContactsCollectionPage I actually want in the Value property, hence:
var page = response.Value;
Now, IContactFolderContactsCollectionPage normally has a NextPageRequest property, but when you parse it directly from the ContactFolderContactsCollectionResponse, this is not filled in. Luckily, we can find the raw @odata.nextLink value in the ContactFolderContactsCollectionResponse's AdditionalData dictionary, and we can set it using the IContactFolderContactsCollectionPage.InitializeNextPageRequest method.
Hence the:
if (response?.AdditionalData != null && response.AdditionalData.ContainsKey("@odata.nextLink"))
{
    page.InitializeNextPageRequest(Application.GraphConnection.Client, (string)response.AdditionalData["@odata.nextLink"]);
}
Hopefully that gives you enough thread to pull on. Sorry if the rest of the syntax is confusing; as I said, a lot of it is operating in a wrapper framework I'm building and I don't have time to build and test a clean solution.
It's also possible that the whole thing is different anyway, on account of you having a batch with a thousand steps as opposed to a batch whose steps return thousands of objects, as in my case.
Either way, happy hunting.
I'm polling data into my Angular app from a C# Web API. Every time, all the data is polled, even though much of it hasn't changed. I would like to poll only the objects which have actually been updated in some way.
This is my code in my Controller.cs
//Get all details of the available vehicles
[HttpGet]
[Route("api/details")]
public object GetFleetStatusDetails()
{
var fmsData = this.fmsdb.Value.GetFleetStatusDetails();
var data = fmsData.Entries;
List<VehicleDetails> result = new List<VehicleDetails>();
foreach (var item in data)
{
if (item != null)
{
var details = ConvertVehicleDetail(item);
result.Add(details);
}
}
return result;
}
As you can see, I'm converting the data into VehicleDetails objects which I then add to my VehicleDetails list. The data I'm getting is in JSON format. Is there a way of comparing my last poll with the current poll without going much deeper into the database? If so, how would I do that?
I suggest saving the last state either through the Browser's local storage or using ngrx.
Without knowing what this.fmsdb.Value.GetFleetStatusDetails(); does, or the schema of the database you're calling, the best-guess answer would be to create two endpoints in the API.
The first would use this.fmsdb.Value.GetFleetStatusDetails(); to get the full set of results. Once the data has been downloaded from that endpoint, store the current date/time in the Angular view and pass it to the second endpoint...
The second endpoint would get only the records that have changed since the date/time you got the first set of data. This assumes that you are storing an updated date/time with each record (see the sketch after the endpoint code below).
[HttpGet]
[Route("api/updateddetails")]
public object GetUpdatedFleetStatusDetails([FromUri] DateTime date)
{
var fmsData = this.fmsdb.Value.GetUpdatedFleetStatusDetails(date);
return fmsData.Entries
.Where(x => x != null)
.Select(ConvertVehicleDetail);
}
I could give more details if I knew what was in this.fmsdb.Value.GetFleetStatusDetails().
PS: I haven't tested this code.
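For completeness, a possible shape for the data-access side of that second endpoint; the FleetContext, VehicleEntries set, FmsData type and UpdatedAt column are all assumptions about a schema we can't see:

// Hypothetical sketch: return only the rows changed since the supplied date.
// Assumes the writer keeps an UpdatedAt column current on every change.
public FmsData GetUpdatedFleetStatusDetails(DateTime since)
{
    using (var context = new FleetContext())
    {
        var entries = context.VehicleEntries
            .Where(e => e.UpdatedAt > since)
            .ToList();

        return new FmsData { Entries = entries };
    }
}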
I have a C# script task in an SSIS package designed to geocode data through my company's proprietary system. It currently works like this:
1) Pull a query of addresses and put them in a data table.
2) Loop through that table and, for each row, build a request, send it, wait for the response, then insert the result back into the database.
The issue is that each call takes forever to return, because before going out and getting a new address on the API side, it checks a current database (string match) to ensure the address does not already exist. If it doesn't exist, it then goes out and gets new data from a service like Google.
Because I'm doing one at a time, it's easy to keep the ID field with the record when I go back to insert it into the database.
Now comes the issue at hand... I was told to configure this as multi-threaded or asynchronous. Here is the page I was reading on here about this topic:
ASP.NET Multithreading Web Requests
var urls = new List<string>();
var results = new ConcurrentBag<OccupationSearch>();
Parallel.ForEach(urls, url =>
{
    WebRequest request = WebRequest.Create(url);
    string response = new StreamReader(request.GetResponse().GetResponseStream()).ReadToEnd();
    var result = new JsonSerializer().Deserialize<OccupationSearch>(new JsonTextReader(new StringReader(response)));
    results.Add(result);
});
Perhaps I'm thinking about this wrong, but if I send two requests (A and B) and let's say B actually returns first, how can I ensure that when I go back to update my database I'm updating the correct record? Can I send the ID with the API call and get it back in the response?
My thought is to create an array of requests, burn through them without waiting for a response, and return those values in another array that I then loop through in my insert statement.
Is this a good way of going about it? I've never used Parallel.ForEach, and all the info I find on it is too technical for me to visualize and apply to my situation.
Perhaps I'm thinking about this wrong, but if I send two requests (A and B) and let's say B actually returns first, how can I ensure that when I go back to update my database I'm updating the correct record? Can I send the ID with the API call and get it back in the response?
None of your code contains anything that looks like an "ID," but I assume everything you need is in the URL. If that is the case, one simple answer is to use a Dictionary instead of a Bag.
List<string> urls = GetListOfUrlsFromSomewhere();
var results = new ConcurrentDictionary<string, OccupationSearch>();
Parallel.ForEach(urls.Distinct(), url =>
{
WebRequest request = WebRequest.Create(url);
string response = new StreamReader(request.GetResponse().GetResponseStream()).ReadToEnd();
var result = new JsonSerializer().Deserialize<OccupationSearch>(new JsonTextReader(new StringReader(response)));
results.TryAdd(url, result);
});
After this code is done, the results dictionary will contain entries that correlate each response back to the original URL.
Note: you might want to use HttpClient instead of WebRequest, and you should take care to dispose of your disposable objects, e.g. StreamReader and StringReader.
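For what it's worth, here is a rough sketch of the same fetch using HttpClient with Task.WhenAll instead of Parallel.ForEach (often a better fit for I/O-bound work). OccupationSearch comes from the snippet above, while OccupationFetcher and FetchAllAsync are just names made up for the example:

using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

public static class OccupationFetcher
{
    // A single HttpClient instance is meant to be shared and reused.
    private static readonly HttpClient Http = new HttpClient();

    public static async Task<ConcurrentDictionary<string, OccupationSearch>> FetchAllAsync(
        IEnumerable<string> urls)
    {
        var results = new ConcurrentDictionary<string, OccupationSearch>();

        var tasks = urls.Distinct().Select(async url =>
        {
            // The URL is the key that correlates each response to its request.
            string json = await Http.GetStringAsync(url);
            var parsed = JsonConvert.DeserializeObject<OccupationSearch>(json);
            results.TryAdd(url, parsed);
        });

        await Task.WhenAll(tasks);
        return results;
    }
}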
I'm developing a web application with C# MVC and using Session to persist data between multiple requests.
Sometimes the session times out, so I looked for a way to keep it alive and found some solutions here on Stack Overflow. Being reluctant to simply copy-paste code into my project, I attempted to rewrite the code to fit my needs and understand it better.
At first I attempted to keep the session alive using the following code:
JS + jQuery - client side:
function keepAliveFunc(){
setTimeout("keepAlive()", 300000);
};
function keepAlive() {
$.get("/Account/KeepAlive", null, function () { keepAliveFunc(); });
};
$(keepAliveFunc());
C# - server side:
[HttpGet]
public bool KeepAlive()
{
return true;
}
This, however, did not seem to keep my session alive; it expired normally.
After a while of fiddling around I changed the code to:
JS + jQuery - client side:
function keepAliveFunc(){
setTimeout("keepAlive()", 10000);
};
function keepAlive() {
$.post("/Account/KeepAlive", null, function () { keepAliveFunc(); });
};
$(keepAliveFunc());
C# - server side:
[HttpPost]
public JsonResult KeepAlive()
{
return new JsonResult { Data = "Success" };
}
The latter worked well, which has me conclude, with some uncertainty, that the Session is kept alive because of the POST request instead of the GET. Which raises the question: why do I need to use POST when trying to keep my Session alive? What's the difference? Am I making some other mistake which I do not comprehend?
I've looked for answers, but I cannot seem to find any on this matter, merely solutions without much explanation. Reading up on Session on MSDN also didn't help me much. This makes me think there are some terms related to Session and this particular problem that I haven't encountered yet, which is why I can't google effectively.
With either GET or POST, the browser sends the SessionId cookie with the request, so for keep-alive purposes it doesn't matter which one you use. Most likely you are seeing the difference in behavior because of the different interval at which you are "pinging" the server.
With the GET request you pinged at an interval of 300,000 ms, while with the POST request you pinged at an interval of 10,000 ms.
Most likely, your server's session lifespan is somewhere between the two values.
You could, however, configure the session lifespan to fit your needs (i.e. increase it), but keep in mind that expiring sessions is a security feature, so try to find a value that is big enough to let your application work smoothly but still allows the session to expire within a safe interval of time.
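For reference, with the default in-process session state that lifespan is controlled by the timeout attribute (in minutes) of the sessionState element in web.config, for example:

<system.web>
  <!-- sessions expire after 30 minutes of inactivity -->
  <sessionState mode="InProc" timeout="30" />
</system.web>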