We have a desktop application that we need to install on client PCs and connect to a database on a remote server. Which method is better for connecting to the database (for speed and performance)?
1. Normal query method (specify the server name in the connection string).
2. Create a web service and get the data in XML or JSON format.
Both solutions have positive and negative points.
Direct query to the server -> implies that your client software knows the database schema. If you change the database schema, you need to retest its integration in the client app.
Web service -> a limited API means your database is known only to its data web service. The client app knows only the small web service API, so when the database evolves there is very little chance of negatively impacting the client code.
From an architectural point of view, it is good practice to limit the size of the contract between two pieces of technology.
From a development cost point of view, creating and maintaining such a service has a cost and may require a new set of technical skills on your team.
It depends on your requirements, budget and time constraints.
If there is any possibility that this desktop software will later be extended to mobile apps or other platforms, then go for creating web services, preferably with JSON.
Keeping the data access layer in the client desktop application saves a little development time, but makes testing, reusability and maintenance harder.
Also, the trend is toward SOA, so I'd always prefer creating web services. They're secure, reusable and very friendly to future modifications of the project.
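As a rough illustration of option 2, here is a minimal sketch of the desktop client side, assuming a .NET client and a hypothetical service URL and DTO (none of these names come from the question):

```csharp
using System;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

// Hypothetical DTO: the only shape the desktop client knows about.
// (Record deserialization via constructor requires .NET 5 or later.)
public record ProductDto(int Id, string Name, decimal Price);

public static class ProductServiceClient
{
    // Hypothetical service address; with option 1 this would instead be a
    // database connection string naming the server directly.
    private static readonly HttpClient Http = new HttpClient
    {
        BaseAddress = new Uri("https://api.example.com/")
    };

    public static async Task<ProductDto> GetProductAsync(int id)
    {
        // The client depends only on this small JSON contract,
        // not on the database schema behind the service.
        string json = await Http.GetStringAsync($"products/{id}");
        return JsonSerializer.Deserialize<ProductDto>(
            json, new JsonSerializerOptions { PropertyNameCaseInsensitive = true });
    }
}
```

The point of the sketch is the dependency direction: the desktop app compiles against `ProductDto`, not against any table, so schema changes stay behind the service.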
I'm trying to find the best way to create a mobile application that connects to a cloud database (or server).
So my first question is: is it good practice to communicate directly with a database from a mobile app?
If yes, which is the best database for this job? Azure? Oracle? Firebase...?
If no, which is the best service to communicate with first? And which protocol?
What is the most recommended way? Does it matter what OS (Windows? Mac?) my server will use?
The best approach is to build a REST API to communicate with your server. But if you don't want to do that, you should look at Firebase. Firebase is one of the best platforms for building mobile and web applications really fast. It provides you with authentication, a real-time database, storage, hosting, crash reporting and more. Using Firebase, you can set security rules on who can write to your database, which eliminates the risk of unauthorized access to the database.
Also keep in mind that if you use Firebase, you should structure your database so that the number of read and write requests is as small as possible, as Firebase can be a bit expensive if not used properly.
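For illustration only, here is a hedged sketch of what such Realtime Database security rules might look like; the orders path and per-user layout are assumptions, not taken from the question:

```json
{
  "rules": {
    "orders": {
      "$uid": {
        // Only the authenticated owner of this subtree can read or write it.
        ".read": "auth != null && auth.uid === $uid",
        ".write": "auth != null && auth.uid === $uid"
      }
    }
  }
}
```

Keying each user's orders under their own ID also supports the cost advice above: each client reads one shallow subtree instead of scanning the whole database.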
It's better to build an API in between. Doing queries directly in your app is risky.
You could use a Node.js framework for this, e.g. Sails.js or Express.js.
For a cloud database, Firebase is very easy to implement and work with. There are plenty of resources covering its setup and workflow.
I have an existing cloud-based solution with a web API that brokers data between the backend SQL database and client applications. All very standard. My web API is built using .NET Core. This is working well, and I use the existing web API with the various web clients that have been built. The architecture looks like this: [architecture diagram]
The current solution needs to be extended to support native mobile client applications using the web API (nothing out of the ordinary here; normally they'd call the web API the same as any other client), BUT I have to meet the requirement that these new client applications can be used in an offline scenario. This means I cannot expect a data connection to exist on the device every time I need to call my web API. I need to look at synchronising data so it is available offline and can be sent back to the server when needed.
Thinking about it, the data will be synchronised in one of two ways:
1. One-way sync: data goes from the server to the client, but no changes will be made to this data, e.g. system lookup tables.
2. Two-way sync: existing data is synchronised to the client, modified and sent back to the server, OR new data is created on the client and sent up, e.g. a new order.
The new architecture will be as follows: [updated architecture diagram]
So, getting to my question: does anyone know of a good design pattern to follow for synchronising data (one-way and two-way), OR maybe a NuGet package which has synchronisation code built in? I'm trying to avoid reinventing the wheel with regard to sync, if possible.
NOTE: Just for information purposes, the native mobile apps will be built using Xamarin in Visual Studio 2015.
I've not used it yet (I'm just starting to look into it, as I have a similar situation to yours), but it looks like Xamarin already has a facility for this (if you're using Azure, anyway):
Offline Data Sync in Azure Mobile Apps
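I haven't verified this end to end, but based on that documentation the client-side flow looks roughly like the sketch below; the Order model, database file name and backend URL are placeholders:

```csharp
using System.Threading.Tasks;
using Microsoft.WindowsAzure.MobileServices;             // Azure Mobile Apps client SDK
using Microsoft.WindowsAzure.MobileServices.SQLiteStore; // local SQLite sync store
using Microsoft.WindowsAzure.MobileServices.Sync;

// Hypothetical model covering the two-way sync case (new orders created offline).
public class Order
{
    public string Id { get; set; }
    public string Description { get; set; }
}

public class OrderSyncService
{
    private readonly MobileServiceClient _client =
        new MobileServiceClient("https://yourapp.azurewebsites.net"); // hypothetical backend

    public async Task<IMobileServiceSyncTable<Order>> InitializeAsync()
    {
        // The local SQLite store holds data while the device is offline.
        var store = new MobileServiceSQLiteStore("offline.db");
        store.DefineTable<Order>();
        await _client.SyncContext.InitializeAsync(store);
        return _client.GetSyncTable<Order>();
    }

    public async Task SyncAsync(IMobileServiceSyncTable<Order> orders)
    {
        // Push local changes up (two-way sync), then pull server changes down
        // (the one-way lookup-table case is just a pull with no local writes).
        await _client.SyncContext.PushAsync();
        await orders.PullAsync("allOrders", orders.CreateQuery());
    }
}
```

Reads and writes go against the sync table while offline, and `SyncAsync` reconciles with the server whenever a connection is available.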
We currently have a single database with users, customers, products and orders, logically separated by schemas. We then have several MVC.NET applications accessing the database via their own BLLs. Each of these applications has its own functionality and shares some aspects with some or all of the other applications.
Currently, some code is duplicated in these BLLs and it's a bit of a mess to maintain. It does, however, allow us to develop features quickly and deploy each application independently (assuming no major database work).
We have started to develop a single access layer, properly separated out, that sits above the database and is used by all of our MVC.NET applications. Logically this makes sense, as we can now share code between our applications. For example, application A can retrieve a customer record in the same way as application B. The issue comes when we want to deploy an application: we wouldn't be able to deploy just one application, we'd need to deploy them all.
What other architectural approaches could we consider that would allow us to share code between our applications and deploy those applications independently?
A common solution is to factor out services (based on a communication layer of your choice, e.g. REST, WCF or a message bus, with versioning) and deploy these services to your infrastructure as standalone services.
Now you can evolve, scale and deploy your services independently of the consumers. Instead of deploying all applications, you now only have to deploy the changed services (side by side with the old ones) and the updated application.
This adds quite a lot of complexity around service versioning, configuration management, integration testing, a little communication overhead, etc., so you have to balance the pros and cons. There are quite a few articles on the net about how to build such an architecture.
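As a hedged sketch (assuming ASP.NET Core; all names are illustrative, not from the question), a factored-out customer service with the version in the route might look like this:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

// Illustrative contract shared by every consuming MVC.NET application.
public record CustomerDto(int Id, string Name);

public interface ICustomerRepository
{
    Task<CustomerDto> FindAsync(int id);
}

// The version segment in the route lets v2 run side by side with v1,
// so each consuming application can migrate on its own schedule.
[ApiController]
[Route("api/v1/customers")]
public class CustomersController : ControllerBase
{
    private readonly ICustomerRepository _repository;

    public CustomersController(ICustomerRepository repository) => _repository = repository;

    // Application A and application B both fetch customers through this
    // single contract instead of duplicating BLL code.
    [HttpGet("{id}")]
    public async Task<IActionResult> Get(int id)
    {
        var customer = await _repository.FindAsync(id);
        return customer is null ? NotFound() : Ok(customer);
    }
}
```

Deploying a new version of this one service then doesn't force a redeploy of every application that consumes it.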
I'm coding a program for a small business. The application is going to be used for store-keeping and the ordering of parts for a mechanical shop. I need some type of backend to store orders, article numbers and prices, customers and so on. Right now I'm using a local MySQL server and running queries directly from the code. This is not ideal, because of the risk of a system meltdown or similar. I've thought about running a local MySQL server with a scheduled backup on a remote host, but I'm hoping there's a better solution. For previous applications I've written a PHP wrapper and used a web hotel to host the MySQL server, which isn't ideal either, for security reasons. I suppose I should mention the application is written with Windows Forms in the Visual Studio .NET environment (in C#).

My question is this: how do I set up a MySQL server (or another type of database system) on a remote host that I can run queries against and return the results to the application? Preferably I wouldn't want to manage the MySQL server myself, but outsource it. I don't mind renting a server from some host if it spares me the hassle of setting up a local server machine to run separately. Are there any solutions that can be rented for this purpose? I'm sure there must be tons of information about this on the web, but I can't find anything. I would be very thankful if anyone could give me some pointers!
One way you could do this is to rent a cheap VPS and host MySQL on it. I have been using DigitalOcean, and it is pretty good. For your needs, a $5 per month VPS would be enough.
Or you could use Azure. http://www.windowsazure.com/en-us/services/data-management/
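Whichever host you pick, the Windows Forms code mostly just needs a connection string pointing at the remote machine. A minimal sketch using the MySql.Data NuGet package (host name, database and credentials below are placeholders):

```csharp
using System;
using MySql.Data.MySqlClient; // from the MySql.Data NuGet package

class RemoteDbExample
{
    static void Main()
    {
        // Hypothetical host and credentials; SslMode=Required encrypts
        // queries crossing the public internet to the rented server.
        const string connectionString =
            "Server=db.example.com;Port=3306;Database=shop;" +
            "Uid=shop_app;Pwd=secret;SslMode=Required";

        using (var connection = new MySqlConnection(connectionString))
        {
            connection.Open();
            using (var command = new MySqlCommand(
                "SELECT COUNT(*) FROM orders", connection))
            {
                // ExecuteScalar returns the single value of the first row/column.
                var orderCount = Convert.ToInt64(command.ExecuteScalar());
                Console.WriteLine($"Orders on remote server: {orderCount}");
            }
        }
    }
}
```

The same code works against a VPS-hosted MySQL or a managed offering; only the connection string changes.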
I would like input on the design I currently have planned.
Basically, I have some number of external instruments, each of which should always be running and collecting specific data. My thought was to create a service for each one, always running, polling the instruments, performing logging, etc. There could be one instrument, or there could be 40.
However, I need one application to consume all this data, run some math on it, and do the charting, display, emailing, etc. The kicker is that even if this application is not running, the services should keep consuming data. Also, these services will almost always run on the same machines as the client application itself, but the ability to network them (like .NET Remoting used to do) would be an interesting feature.
My question is: is this the best design? If it is, how do I go about the communication between the services and the application? I've looked into WCF, but it seems to be geared towards request-response web services, not something that continually streams data to anything that might listen. Alternatively, should I have these services contact some other web service using WCF, which then compiles the data for use in a thin-client viewer that polls the web service often?
Any links and resources would be greatly appreciated, as would .NET namespaces for me to research. If I wasn't clear about something, let me know.
Just a thought, but have you considered adding a backend database? All the services could collate and persist their data; then the application that needs to process the information can just query the database, rather than setting up lots of IPC between the services.
WCF can handle streaming. It can also use MSMQ as a transport, which will ensure that no messages are lost, even if your instruments begin producing large quantities of data.
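On the streaming point: WCF's duplex contracts cover exactly the push style you describe, where a service sends readings to whoever is listening rather than answering request-response calls. A minimal sketch with hypothetical names:

```csharp
using System;
using System.ServiceModel;

// Hypothetical contract names; not from the original post.
// The callback contract lets the instrument service push readings
// to any subscribed client instead of waiting to be polled.
[ServiceContract(CallbackContract = typeof(IReadingCallback))]
public interface IInstrumentFeed
{
    [OperationContract(IsOneWay = true)]
    void Subscribe(); // client registers interest; service captures its callback channel
}

public interface IReadingCallback
{
    [OperationContract(IsOneWay = true)]
    void OnReading(string instrumentId, double value, DateTime timestampUtc);
}

[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerSession)]
public class InstrumentFeed : IInstrumentFeed
{
    private IReadingCallback _subscriber;

    public void Subscribe()
    {
        // Remember the caller so readings can be streamed back to it later.
        _subscriber = OperationContext.Current.GetCallbackChannel<IReadingCallback>();
    }

    // Called by the polling loop each time an instrument produces data.
    public void Publish(string instrumentId, double value) =>
        _subscriber?.OnReading(instrumentId, value, DateTime.UtcNow);
}
```

NetNamedPipeBinding fits the same-machine deployment, and swapping in NetTcpBinding networks the services, covering the .NET Remoting-style scenario mentioned in the question.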