Passing Information from a Windows Service to a Web Page - c#

I have developed a Windows Service that runs every 10 seconds and collects real-time data from a couple of web services. That data is then collated and sent in an XML file to a web site outside my system.
I now have a requirement to display this real-time data on an internal web page, which can be viewed by users in our company.
One option would be to write this data to a database table and access it from the web page. However, as the data consists of real-time statistics, there is no requirement to store it long term.
Is there a good way to pass the data to the web page so that it can be displayed to the users, without making repeated calls to the database and putting a strain on the network?
Any thoughts or ideas would be welcome.
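One database-free approach, assuming the internal page lives in a classic ASP.NET web app: have the service also POST its XML snapshot to a lightweight handler in the web app, which keeps only the latest copy in memory. This is a minimal sketch; the handler name and the overall shape are illustrative, not an existing piece of the asker's system.

```csharp
// Hypothetical ASHX handler in the internal web app ("LatestStats.ashx").
// The Windows service POSTs its collated XML here every 10 seconds; browsers
// GET the same URL (e.g. via AJAX) to read the latest snapshot.
public class LatestStatsHandler : System.Web.IHttpHandler
{
    // Latest snapshot held in memory; volatile so readers see fresh writes.
    private static volatile string _latestXml = "<stats/>";

    public void ProcessRequest(System.Web.HttpContext context)
    {
        if (context.Request.HttpMethod == "POST")
        {
            // The service pushes the new snapshot.
            using (var reader = new System.IO.StreamReader(context.Request.InputStream))
                _latestXml = reader.ReadToEnd();
        }
        else
        {
            // The web page polls this endpoint for display.
            context.Response.ContentType = "text/xml";
            context.Response.Write(_latestXml);
        }
    }

    public bool IsReusable { get { return true; } }
}
```

Since only the most recent snapshot matters, losing data on an app-pool recycle is harmless: the next 10-second push repopulates it. The POST endpoint should of course be restricted (e.g. to the service's IP or a shared secret) so only the service can update it.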

Related

Web Site Security - 1 main site, 1 secondary site hosted within the main site

Scenario:
ASP.NET 5 / Razor Pages / C#
We have one main site, with very good security. In the background, all passwords are encrypted, and a log is kept of every logon: the username, the source IP address, and the time of access.
We have a second site that is hosted within the main site visually, on the front end, mostly via iframes, but not on the server: the two sites won't live together in the same web app.
Problem:
I need to ensure that access to the secondary site is secure, while relying on the fact that the user has already logged on successfully via the main website. I don't want the user to have to log on twice to two systems; rather, I want the single logon to fluidly allow access to the secondary site.
I have a method I am using now. It works, but I really want to delve in and see if I can improve it, given that I'm not heavy on experience in terms of website security. I'm sure there is a better way.
Options?
From a security point of view, when using iframes, the two sites are independent.
So you need to guarantee that security is enforced on both sides.
You have several possibilities, but the best, I think, is to revalidate the user in the "iframed" website.
You can use a token, generated by the main website and stored in a backend DB, and pass it in the iframe URL.
The endpoint of the iframe has to read the token, call a backend API to validate it, and allow access.
The main remaining problem is refreshing the token after a reasonable time, to ensure it stays valid throughout the use of the "iframed" website.
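The token handoff described above could be sketched as follows. Everything here is illustrative: `ITokenStore` is a hypothetical wrapper over the backend DB, and the URL is a placeholder.

```csharp
using System;
using System.Security.Cryptography;

// Hypothetical backend-DB wrapper; not a real API.
public interface ITokenStore
{
    void Save(string token, string userName, DateTime expiresUtc);
    bool TryConsume(string token, out string userName); // validates, checks expiry, single use
}

public class IframeTokenService
{
    private readonly ITokenStore _store;
    public IframeTokenService(ITokenStore store) { _store = store; }

    // Main site: mint a short-lived, random token and embed it in the iframe URL.
    public string BuildIframeUrl(string userName)
    {
        var token = Convert.ToBase64String(RandomNumberGenerator.GetBytes(32));
        _store.Save(token, userName, DateTime.UtcNow.AddMinutes(5));
        return "https://secondary.example.com/start?token=" + Uri.EscapeDataString(token);
    }

    // Secondary site: consume the token once, then issue its own session/cookie.
    public bool TryAuthenticate(string token, out string userName)
    {
        return _store.TryConsume(token, out userName);
    }
}
```

Making the token single-use and having the secondary site establish its own session after validation sidesteps most of the refresh problem: the URL token only has to survive the initial load of the iframe, not the whole visit.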

Architectural / Design question related to C# project, Azure and Office SharePoint 365

Need feedback and input. Here's a scenario. End goal is to update a custom list in Office 365 SharePoint site using data from external site. Note that SharePoint is not on-premise.
End users fill out a form online at a 3rd-party web site
-- Assume that your org does not want to drop this 3rd party web site for any reason
Form data is posted to an internal database
-- Assume DB is inaccessible from 3rd party site except via web form grid view and managed exports via CSV
Web Hook endpoint is configured to also send form data to receiver on MS Azure
Azure endpoint "receiver" gets and stores form data in Azure storage account queue
That's the first phase. How would you "feel" about the second phase being this scenario?
A service coded in C#, installed on a local server, that periodically wakes up
-- Yes, I know that you can also host a C# service in the Azure cloud. But let's assume that you were a frugal, penny-pinching accountant in a previous life.
-- Yes, I'm also aware that a local-service approach short-circuits the entire online path that the form data would otherwise take. But let's assume that at least one other person on the team agrees that locally we have much more control over what we can do with the data, and that its final destination as a new item in a SharePoint list is just the tip of the iceberg.
Connects to Azure online storage
Downloads all queue items for processing
Saves queue items as new items in Office 365 SharePoint List
Dequeue items in Azure queue
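The four steps above could be sketched like this, using the Azure.Storage.Queues SDK and SharePoint CSOM. Connection strings, the queue name, the list title, and the field mapping are all placeholders, and SharePoint Online credential setup is omitted.

```csharp
using Azure.Storage.Queues;        // NuGet: Azure.Storage.Queues
using Microsoft.SharePoint.Client; // NuGet: SharePoint CSOM package

public static class QueueToSharePoint
{
    // Sketch of the local service's poll loop, run on each wake-up.
    public static void ProcessQueueOnce(string storageConn, ClientContext spContext)
    {
        var queue = new QueueClient(storageConn, "form-submissions");
        var list = spContext.Web.Lists.GetByTitle("FormData");

        foreach (var msg in queue.ReceiveMessages(maxMessages: 32).Value)
        {
            // 1. Save the queue item as a new SharePoint list item first...
            var item = list.AddItem(new ListItemCreationInformation());
            item["Title"] = msg.Body.ToString();
            item.Update();
            spContext.ExecuteQuery();

            // 2. ...and only then dequeue, so a crash mid-loop re-delivers
            // the message rather than losing it (at-least-once semantics).
            queue.DeleteMessage(msg.MessageId, msg.PopReceipt);
        }
    }
}
```

Note the ordering: writing to SharePoint before deleting from the queue means a failure can produce a duplicate list item, but never a lost submission; the reverse ordering risks silent data loss.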
Note: In this scenario, we're not taking advantage of an Azure Queue Trigger, which would require the second phase of the project code to reside in Azure as well, either as an Azure Function or a Function App, whichever the correct nomenclature is. The strategy of this approach is to save the time of going through all the steps of coding authentication (OAuth or otherwise) to access the Office 365 SharePoint list. There are lots of steps there and various choices of SharePoint access methodology, i.e. REST, CSOM, etc.
Note 2: A local service app in the customer domain using local domain credentials (the credential is also added to the Office 365 SharePoint site) will have trusted access to the SharePoint list, so there's no need to configure AD or any certificates just to add a new item to a SharePoint list.
-- Assume you already have some C# plumbing for Queue Trigger and that you can successfully read items in the queue. The only missing piece at this point is sending the data to Office 365 SharePoint online.
So, yay or nay? To help with understanding: what are the flaws in the second-phase thinking/strategy, and how would it negatively impact you as a developer maintaining this solution after it goes into production, in the event that I am abducted by outer-space aliens or fall into a very deep sinkhole that winds up somewhere on the other side of the planet? Would you lay flowers on my cemetery headstone or spray-paint graffiti on it?
What would you prefer, if anything, to see as the second phase of the solution (that is, everything after storing the form data in the Azure Storage Queue)? (Keep in mind the monthly costs of Azure.)

Making 1000 ajax post requests with better performance

We have an application that is installed on some 600-odd servers. This application exposes a web API that returns version information for the application.
My requirement is to display the version of the application on each of the servers. I have achieved this in an asp.net application by:
Writing a web method in the aspx.cs page that takes the server name as a parameter. This method builds the web API URL, invokes the API, gets the response, builds an object, and returns it as a JSON string.
Writing a jQuery ajax post request for each server name to the above method. On success, it builds an HTML table row and appends it to the table, so that each response is shown to the user as it arrives.
This works absolutely fine for, say, 30-40 servers. But when the number increases, it takes a lot of time to process all the requests (30-40 minutes). And with multiple users using this asp.net app, we start getting errors.
Is there any other method to achieve this faster, and for multiple users, without errors?
How often does this information change? If the data is rarely updated, I suggest the following.
Have each server save this information into a shared database table when it starts. If the application version can change while the server is running, update the corresponding record in the database.
When your service needs the application versions, read them once from the database table instead of requesting them directly from every server.
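The per-server registration step could be sketched as a simple upsert, run at startup on each of the ~600 servers. The table and column names (`ServerVersions`, `ServerName`, `Version`, `UpdatedUtc`) are assumptions for illustration.

```csharp
using System.Data.SqlClient;

public static class VersionRegistrar
{
    // Called once at startup (and again after any in-place upgrade).
    public static void RegisterVersion(string connStr, string serverName, string version)
    {
        const string sql = @"
            MERGE ServerVersions AS t
            USING (SELECT @server AS ServerName) AS s
               ON t.ServerName = s.ServerName
            WHEN MATCHED THEN
                UPDATE SET Version = @version, UpdatedUtc = GETUTCDATE()
            WHEN NOT MATCHED THEN
                INSERT (ServerName, Version, UpdatedUtc)
                VALUES (@server, @version, GETUTCDATE());";

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@server", serverName);
            cmd.Parameters.AddWithValue("@version", version);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

The dashboard then becomes a single `SELECT * FROM ServerVersions` instead of 600 fan-out HTTP calls per page view, and the `UpdatedUtc` column doubles as a staleness indicator for servers that haven't checked in recently.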

Dropping a session variable (or the value of one) into a client's local application

I am trying to build a system where, when the user logs in to an online ASP.NET site, the User ID (which is placed in a session variable) is then stored in a local application (a C# application) for later use.
So far, the only way I can think of doing this is: when the user logs in, the User ID is stored in a text file on the client's machine. But this would not be an ideal solution, as the local client application would then have to check the file to make sure its contents have not changed (almost every second, as it is important that the client application always has the correct User ID).
Any suggestions?
When you say "the User ID is stored in a text file on the client's machine", I deduce you mean cookies, because you simply can't store files on client machines via web applications, unless some sort of ActiveX control is involved.
You can perfectly well store a cookie with the User ID on the client and access it from your console app, but this is not very reliable, as the user may have cookies disabled or may clear the cookies folder, and also because different browsers use different folders for storing cookies.
So my choice would rather be storing the currently logged-in users in a database and making the console app poll that info through a WCF service.
If you don't want to use a database, store an XML file on the server that can act as your database, and use, for example, LINQ to XML to retrieve the data via that WCF service.
Another option: instead of polling the info, you could use WCF duplex services and have the web service push that info to the client apps once a user logs in.
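The polling variant could look roughly like this on the client side. The service contract, endpoint URL, and the idea of keying the lookup by machine name are all assumptions made for the sketch.

```csharp
using System;
using System.ServiceModel;
using System.Threading;

// Hypothetical WCF contract exposed by the web server.
[ServiceContract]
public interface ILoginInfoService
{
    [OperationContract]
    string GetCurrentUserId(string machineName); // null if nobody is logged in
}

public static class LoginPoller
{
    public static void Run()
    {
        var factory = new ChannelFactory<ILoginInfoService>(
            new BasicHttpBinding(),
            new EndpointAddress("http://server.example.com/LoginInfoService.svc"));
        var client = factory.CreateChannel();

        string lastSeen = null;
        while (true)
        {
            // One cheap call every few seconds replaces the original idea of
            // re-reading a local text file almost every second.
            var userId = client.GetCurrentUserId(Environment.MachineName);
            if (userId != lastSeen)
            {
                lastSeen = userId;
                Console.WriteLine("Current user: " + (userId ?? "(none)"));
            }
            Thread.Sleep(5000);
        }
    }
}
```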

C# Forms authentication with form data

I have two applications, say app A and app B. App A sends form data (using the post method) to app B. B, however, is a web application and uses forms authentication. The post data is sent to a webpage (viewdocument.aspx) which is secured by forms authentication. But when the data is sent to viewdocument, the login page is displayed because the user isn't authenticated.
The point is, I want the post data to be read by viewdocument. How can I do this?
You can allow all users to access your viewdocument page (by setting authorization in your web.config), get the values of the post in your page load and then, manually do:
if (!User.Identity.IsAuthenticated)
{
    FormsAuthentication.RedirectToLoginPage();
    return; // RedirectToLoginPage doesn't end the response, so stop here
}
// Else continue with page display
This way, you will protect the display of your page but still be able to send data to the page as any user.
I hope it will help.
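The web.config authorization change mentioned above is a standard ASP.NET pattern: a `location` element that opens just the receiving page to anonymous requests while the rest of the app stays behind forms authentication. The path below is the page from the question.

```xml
<!-- In the root web.config of app B: allow anonymous access to
     viewdocument.aspx only; all other pages remain protected. -->
<location path="viewdocument.aspx">
  <system.web>
    <authorization>
      <allow users="*" />
    </authorization>
  </system.web>
</location>
```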
If your web app only needs to accept data, use web services.
I think you want to consider separating the two processes: accepting data from another web site, and displaying data to a user. That way you get a nice separation of logic, which can improve maintainability. I'm also not sure how you are going to POST data from one website to another, as a POST should go back to the originating page. I would do as #Kane suggested in his comment and use a service to accept the incoming data. It could be built to accept the current data, but would also be easily extensible if you ever need to receive data from other sites. Your page for displaying the data would then be a lot simpler and clearer for developers to work on.
