I'm building an ECM project, and my client wants both web and desktop clients. So I decided to build an ASP.NET MVC web app and a WPF desktop client. Both are going to communicate with the business layer through a Web API project (actually, I'm not sure whether to use Web API or WCF with MTOM as the service layer).
Since I expect the project to scale, I'm worried about memory consumption. I searched but couldn't figure out whether the server is going to use twice the memory when storing/retrieving files.
Suppose a user uploads a 500 MB file. MVC receives the file (therefore occupying 500 MB of memory for that file, at least that's how I think it works). Assuming MVC then delegates the request to the Web API (hosted on the same server), which takes care of storing the file, my question is:
Will the server consume twice the memory during this process?
If yes, are there any techniques to avoid this? (No need to write full examples; I can search for it once I know what to search for.)
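To make it concrete, the forwarding path I have in mind looks roughly like this (the URL and names are placeholders; from what I've read so far, streaming the body instead of buffering it seems to be the kind of technique I should be searching for):

    using System.Net.Http;
    using System.Threading.Tasks;
    using System.Web.Mvc;

    public class FilesController : Controller
    {
        private static readonly HttpClient client = new HttpClient();

        [HttpPost]
        public async Task<ActionResult> Upload()
        {
            // GetBufferlessInputStream (.NET 4.5+) avoids ASP.NET's default
            // request buffering, and StreamContent forwards the body chunk by
            // chunk, so neither hop should hold the whole 500 MB at once.
            // (maxRequestLength/maxAllowedContentLength still need raising in config.)
            using (var body = Request.GetBufferlessInputStream())
            using (var content = new StreamContent(body))
            {
                var response = await client.PostAsync("http://localhost/api/files", content);
                return new HttpStatusCodeResult(response.StatusCode);
            }
        }
    }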
Feel free to tell me if this is a bad design and to suggest different approaches for this scenario; I'm trying to figure out whether it's worth separating the API into a different project.
I'm a junior C# ASP.NET dev and have worked with Code First databases, RESTful APIs, MVC, and Vue (a frontend framework somewhat like React) to create websites.
However, at work and during my education, I've never handled deployment.
Right now I have a personal project. I have successfully hosted my relational MySQL database (managed through phpMyAdmin) and can update it from my local desktop.
My hosting site let me know they do not host C# or anything of the sort.
I found some posts suggesting Azure, AWS, and others, but for every post recommending one, I find just as many people protesting against it.
What is a good site to host my first REST API? I'm looking for something that can go beyond a minimum viable product, and I'd like to keep my website under the hosting service I'm currently using (so the API would not be hosted alongside the website).
What would the cost look like for an API that's deployed and being used by clients?
I realize this cost depends on the amount of traffic, but assume a basic API used for, let's say, posting orders in an online shop (whether through a website, an app, or whatever else, it would all communicate through the API).
Any tips are welcome, as I feel like I'm swimming in the dark researching this.
Thank you
Any hosting service that grants you real access to a machine will be able to run your API, as will services specialized in the .NET/.NET Core ecosystem.
I assume you know PHP, based on the phpMyAdmin service and the ecosystem of hosts that support PHP. Although cheaper, those hosts don't exactly give you access to the machine and probably won't support .NET/.NET Core, along with innumerable other tech stacks.
As a junior developer, I believe you should get a little practice in some deployment ecosystem, so I encourage you to try most of the big clouds (Azure, GCP, AWS), but also some smaller hosts, to gain experience and understand a bit more about the differences in deployment and ecosystem.
Azure will be really easy: you can create an account and publish your API at no cost using a free WebApp, and Visual Studio even has publishing tools that will handle 90% of the job. GCP will be a little trickier and will require you to know a bit about containers and clusters. If you go for a non-specialized host like DigitalOcean, you will need to understand more about the operating system and the associated servers/controllers to deploy and publish.
The cost part is a lot more difficult. It will depend on the host you are using and on the load (processing, memory, size, and throughput of data). In my experience, I've had some very small-scale APIs that required more processing or memory to accomplish tasks like PDF generation than a medium-scale API that only handled JSON data transactions.
I am writing a program that will pull data from two separate SaaS products via their REST APIs, concatenate it, and save the output as raw JSON. When this is complete, the program will be scheduled to run every night.
The overall concept isn't really complicated; it follows a very simple process:
Pull data from Software A via REST
Pull data from Software B via REST
Concatenate them
Save output as raw JSON
I started off by trying to develop this as a .NET Core console application. It's mostly straightforward: I use the appropriate libraries to send HTTP requests to both software APIs, pull the data, and use the IO library to save it to a JSON file.
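Stripped down, the console version is essentially this (the URLs and the naive concatenation are placeholders):

    using System;
    using System.IO;
    using System.Net.Http;
    using System.Threading.Tasks;

    class Program
    {
        static async Task Main()
        {
            using (var client = new HttpClient())
            {
                // Steps 1 and 2: pull raw JSON from both SaaS REST APIs.
                string dataA = await client.GetStringAsync("https://saas-a.example.com/api/data");
                string dataB = await client.GetStringAsync("https://saas-b.example.com/api/data");

                // Step 3: concatenate the two payloads into one JSON array.
                string combined = "[" + dataA + "," + dataB + "]";

                // Step 4: save the output as a raw JSON file, one per nightly run.
                File.WriteAllText($"export-{DateTime.UtcNow:yyyyMMdd}.json", combined);
            }
        }
    }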
However, I've been told that this actually needs to be a web service, not a console application. I have basic experience and knowledge of web services, but I need some help understanding how to lay out the entire solution from a high level.
I think I'm getting confused about how exactly web services work and how my original solution would fit into one. And how would my final solution be "deployed" and scheduled to run?
How would I structure the 4 step process above if I were to convert my console application to a web service?
I have a Windows Service written in C#. I have recently added CassiniDev to it to allow remote web administration and monitoring of the service. The integration went really well except for my inability to interact with the data layer of my Windows Service from the hosted ASP.NET pages.
I have tried putting everything of interest into a common assembly but the debugger shows there are two loaded assemblies with the same name but from different paths. Cassini runs ASP.NET off some temp folder so the assembly I am using is really "a different instance" in the address space of the same process.
I am not sure what is going on here. Probably some "application domain" separation stuff that I do not understand at this time.
So with the Windows Service and the web server running in the same process, how can I make them interact? Say I have some status in the service part that I want to report in the ASP.NET part. Any ideas how I could make this happen? Shared memory or TCP comes to mind, but it sounds like overkill for purely intra-process communication.
If security isn't an immediate concern, i.e. the data isn't highly sensitive and you're in a controlled environment, then you could have success using named pipes. A managed API for named pipes (System.IO.Pipes) is part of the framework, so you don't need to resort to native calls.
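A minimal sketch of what that could look like (the pipe name and message format here are made up):

    using System.IO;
    using System.IO.Pipes;

    static class StatusPipe
    {
        // Runs inside the Windows Service, e.g. looped on a background thread.
        public static void ServeStatus(string status)
        {
            using (var server = new NamedPipeServerStream("MyServiceStatus", PipeDirection.Out))
            {
                server.WaitForConnection();            // block until the web side connects
                using (var writer = new StreamWriter(server))
                {
                    writer.WriteLine(status);          // push the current status
                }
            }
        }

        // Called from the hosted ASP.NET page.
        public static string ReadStatus()
        {
            using (var client = new NamedPipeClientStream(".", "MyServiceStatus", PipeDirection.In))
            {
                client.Connect(1000);                  // give up after one second
                using (var reader = new StreamReader(client))
                {
                    return reader.ReadLine();
                }
            }
        }
    }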
I'm developing an automatic software update service in which clients get updated versions of the software over the Internet. We are considering downloads from a remote server over HTTP because it is immune to firewall restrictions.
The update server must be able to authenticate the request, check the license of the software, and provide the client with the correct update files (since there might be several versions of the software that need updating).
An ASP.NET web application might be able to do the job; however, I'm trying to avoid a web application because it needs to be installed in IIS. I am considering a basicHttp WCF library hosted in a Windows service with the Streamed transferMode, but I've read articles that say "it's not good practice to transfer files with WCF!" I wonder why WCF is not a technology for file transfers. What are the restrictions and alternatives?
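To be concrete, the streamed setup I have in mind is roughly this (the contract and paths are simplified placeholders):

    using System.IO;
    using System.ServiceModel;

    [ServiceContract]
    public interface IUpdateService
    {
        // With TransferMode.Streamed, a Stream return value is sent to the
        // client without being buffered whole on the server.
        [OperationContract]
        Stream GetUpdateFile(string version);
    }

    public class UpdateService : IUpdateService
    {
        public Stream GetUpdateFile(string version)
        {
            // authentication and license checks would go here
            return File.OpenRead(Path.Combine(@"C:\Updates", version, "update.zip"));
        }
    }

    // Hosted inside the Windows service, roughly:
    public static class UpdateHost
    {
        public static ServiceHost Start()
        {
            var binding = new BasicHttpBinding
            {
                TransferMode = TransferMode.Streamed,
                MaxReceivedMessageSize = int.MaxValue  // allow large update files
            };
            var host = new ServiceHost(typeof(UpdateService));
            host.AddServiceEndpoint(typeof(IUpdateService), binding, "http://localhost:8080/updates");
            host.Open();
            return host;
        }
    }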
Do you suggest a WCF Windows service for this job?
This doesn't exactly answer your question regarding WCF and transferring files; however, I built an auto-updater application not too long ago that had a small client front end to a WinForms app. It would get a list of local files, generate an MD5 hash of each one, and send them up to a web service for comparison against the web server's own list of files. The web service method returned the list of files that had changed.
The client would then loop through that list and call another web service method that returned a byte array for each file, dumping it out to the local hard drive.
When this was done, it would do a Process.Start on the exe of the updated app.
It's been running in production with a couple hundred active users for over a year with no issues.
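The client-side hash gathering was essentially this (reconstructed from memory; the names are illustrative, not the exact production code):

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Security.Cryptography;

    static class UpdateHasher
    {
        // Build a filename -> MD5 map to send to the comparison web service.
        public static Dictionary<string, string> HashLocalFiles(string appFolder)
        {
            var hashes = new Dictionary<string, string>();
            using (var md5 = MD5.Create())
            {
                foreach (string file in Directory.GetFiles(appFolder))
                {
                    using (var stream = File.OpenRead(file))
                    {
                        // Hex-encode the hash so it can be compared server-side.
                        hashes[Path.GetFileName(file)] = BitConverter.ToString(md5.ComputeHash(stream));
                    }
                }
            }
            return hashes;
        }
    }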
I am building my own web server and want to serve a Silverlight application from it. Is there anything else I have to do besides setting the MIME type and pushing the application through the wire?
It probably makes no difference, but the web-server is in C# (the micro-edition).
No, Silverlight all runs on the client, so unless you want to add some web services or the like, you needn't do anything other than set the MIME type.
It is really just a separate file that you serve to the client, just like any image, script, or CSS file.
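For a hand-rolled server, the Silverlight-specific piece is just the MIME lookup; a sketch (how you register it depends on your server, but the .xap entry is the one that matters):

    using System.Collections.Generic;
    using System.IO;

    static class MimeMap
    {
        static readonly Dictionary<string, string> Types = new Dictionary<string, string>
        {
            { ".html", "text/html" },
            { ".xap",  "application/x-silverlight-app" },  // the Silverlight package
            { ".xaml", "application/xaml+xml" }
        };

        public static string For(string path)
        {
            string type;
            return Types.TryGetValue(Path.GetExtension(path), out type)
                ? type
                : "application/octet-stream";  // safe fallback for anything else
        }
    }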
If you are developing a single Silverlight application that you want to deliver, then you need only serve the XAP.
However, if you are not the application developer, or you want to deliver multiple apps efficiently, then your web server also needs to be able to deliver other files that may come along with those apps. For example, the libraries may be delivered as zip files, and the apps may download external images and XML files. Still, this is all likely to be simple static content; you will not normally need to implement other services.
Note that if you are hosting an app referenced by an HTML file served from some other server, you need your site to respond with appropriate XML when Silverlight requests the clientaccesspolicy.xml file.
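A wide-open policy file follows the standard Silverlight format; you would tighten the domain and paths for anything beyond testing:

    <?xml version="1.0" encoding="utf-8"?>
    <access-policy>
      <cross-domain-access>
        <policy>
          <!-- Allow requests from any site; restrict the domain uri in production. -->
          <allow-from http-request-headers="*">
            <domain uri="*" />
          </allow-from>
          <grant-to>
            <resource path="/" include-subpaths="true" />
          </grant-to>
        </policy>
      </cross-domain-access>
    </access-policy>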