App_Code DB Connection - c#

Having code on each page to open a connection and retrieve or insert data to/from the local database can be very time consuming if you decide to change something (you obviously have to change all of it so the code keeps the same layout).
My question is: is it okay to have the connection to the database in App_Code? Is there a risk of too many connections to the same source, or from the same location?
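For context, the kind of shared helper the question has in mind might look roughly like this; the class name, the "LocalDb" connection-string name and the use of SqlClient are assumptions, not from the question. Because ADO.NET pools connections behind the scenes, opening and closing one per call from a handful of pages does not by itself create a "too many connections" problem.

    // Minimal sketch of a data-access helper placed in App_Code.
    using System.Configuration;
    using System.Data;
    using System.Data.SqlClient;

    public static class Db
    {
        private static string ConnectionString
        {
            get { return ConfigurationManager.ConnectionStrings["LocalDb"].ConnectionString; }
        }

        // Opens a connection per call; connection pooling makes this cheap,
        // so every page can share this helper instead of duplicating the code.
        public static DataTable Query(string sql, params SqlParameter[] args)
        {
            using (var conn = new SqlConnection(ConnectionString))
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddRange(args);
                var table = new DataTable();
                new SqlDataAdapter(cmd).Fill(table);
                return table;
            }
        }
    }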

Related

BindingSource.EndEdit() vs TableAdapterManager.UpdateAll()

In the .NET Framework, in order to save data into a database item, one has to use:
    Me.Validate()
    Me.CustomersBindingSource.EndEdit()
    Me.TableAdapterManager.UpdateAll(Me.CustomerDataSet)
Can someone explain to me why? What's going on behind the scenes? If .EndEdit() "applies changes to the underlying data source", why isn't that enough to apply those changes?
It is enough to "apply those changes"... to the data source. The data source is a DataTable, which is an object in your application. The UpdateAll call is what saves the changes from that DataTable - in fact, all DataTables in your DataSet - to the database.
ADO.NET is based on a disconnected model. That means that your application is not directly connected to your database. Using ADO in VB6, changes you made to a Recordset were made directly to the database. Not so in ADO.NET. When you call Fill, a connection to the database is opened, data is copied from the database into a DataTable and then the connection is closed. Any changes you make locally affect only that local copy. When you call Update or UpdateAll, the connection is opened again and the local changes saved to the database.
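To illustrate that round trip, here is a minimal sketch; the table, column and connection string are placeholders, not from the question.

    using System.Data;
    using System.Data.SqlClient;

    class DisconnectedExample
    {
        static void Main()
        {
            string connectionString = "<your connection string>";

            // Fill: opens the connection, copies rows into a local DataTable, closes it.
            var adapter = new SqlDataAdapter("SELECT * FROM Customers", connectionString);
            var builder = new SqlCommandBuilder(adapter); // generates INSERT/UPDATE/DELETE for Update()
            var customers = new DataTable();
            adapter.Fill(customers);

            // Edits below touch only the in-memory copy...
            customers.Rows[0]["ContactName"] = "New name";

            // ...until Update reopens the connection and writes the changes back.
            adapter.Update(customers);
        }
    }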

Sync Framework - overwrite local database changes

I'm using Sync Framework to handle syncing between local and remote databases.
I've managed to get both upload and download working, but I would like to have any local changes made to a specific table be overwritten with the original remote values; a forced overwrite in a sense.
Is there any way that this can be accomplished?
Any changes made to the remote database's table are successfully syncing down to the local db's table, but in the event that a change is made locally, it must be overwritten.
SyncFx syncs incrementally (it only syncs changes made since the last sync). In your case, the remote values will not be re-sent to your local database if they didn't change.
You can do a dummy update on the remote rows to force them to be re-sent, but rather than doing it that way, why don't you just prevent edits on the local copy?
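If you do go the dummy-update route, the idea is just to touch the remote rows so change tracking marks them as modified. A rough sketch, with placeholder table and column names:

    using System.Data.SqlClient;

    // Touch remote rows so Sync Framework's change tracking sees them as
    // updated and re-sends them on the next sync. Names are placeholders.
    static void ForceResend(string remoteConnectionString)
    {
        const string sql = "UPDATE dbo.Products SET Name = Name"; // dummy update, values unchanged
        using (var conn = new SqlConnection(remoteConnectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }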

Can editing an item in a DataGridView change the value directly in the SQL Server table

I am binding a table to a DataGridView and I want the user to be able to edit the items in my form. Will the editing done here be updated in SQL Server if I am using a SQL connection string?
That is doable as long as you're handling a proper PropertyChanged event for this type.
That said, I would not recommend doing so, since it would require keeping the connection alive until all the editing is done, that is, one round trip to the database for each property that changes during the edit.
If you choose not to keep the connection open, it will cost even more, since you'll incur the connection instantiation and opening overhead each time, though your database engine may maintain a connection pool of its own.
ADO.NET favours a disconnected (offline) approach. That is, connect to the database to load the needed data, then close and dispose of the connection so that another user may use it. Meanwhile, on your side, the user makes the changes he needs; when he's done, he persists them to the database, and only one round trip is required for a whole batch of changes, which looks to me to be both more productive and more performant.
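A minimal sketch of that disconnected pattern with a DataGridView: load once, let the grid edit the in-memory DataTable, then push everything back in a single batch. The form, control and table names here are assumptions.

    using System.Data;
    using System.Data.SqlClient;
    using System.Windows.Forms;

    public partial class CustomersForm : Form
    {
        private SqlDataAdapter adapter;
        private DataTable customers = new DataTable();

        private void CustomersForm_Load(object sender, System.EventArgs e)
        {
            // One trip to the database: load everything, then the connection is closed.
            adapter = new SqlDataAdapter("SELECT * FROM Customers", "<connection string>");
            new SqlCommandBuilder(adapter); // builds the UPDATE/INSERT/DELETE commands
            adapter.Fill(customers);
            dataGridView1.DataSource = customers;   // edits now happen only in memory
        }

        private void saveButton_Click(object sender, System.EventArgs e)
        {
            // Second trip: push every pending grid edit back in one batch.
            dataGridView1.EndEdit();
            adapter.Update(customers);
        }
    }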

What are best practices for database setup from a winform client

I have written a WinForms application that connects to a database on our corporate network. I created the database as I was writing the application. Now it's time to document the schema and to provide a method of recreating it in the event that it is lost for whatever reason.
I have been considering that the client application should ask the user whether it should recreate the database or accept new connection parameters. Note: the current connection parameters are kept in an obfuscated text file that is included in the application setup.
What are best practices for recreating the database or storing the schema? Should the schema just be kept in a text file in the application directory, or should it be embedded in the application as a string resource?
Also, does anybody know of an open-source application that I could use to document the database?
Thanks for any assistance or direction you can provide
Probably the schema should be embedded within the application as a resource. I'm thinking this because you said the connection parameters are obfuscated, which suggests that you don't want users to have any real knowledge of the database. Providing the schema as a plain text file would allow them to make very reasonable guesses as to what the connection parameters are. Another easy way to recreate the database is to simply keep an empty copy of it embedded as an application resource. Instead of actually recreating the database, you can simply stream out this copy.
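As a rough sketch of the embedded-resource approach (the resource name "MyApp.Schema.sql" and the simple splitting on GO separators are assumptions):

    using System.Data.SqlClient;
    using System.IO;
    using System.Reflection;

    // Reads a schema script embedded as a resource and replays it to recreate
    // the database. SqlCommand runs one batch at a time, so a real script's
    // "GO" separators have to be split out first.
    static void RecreateDatabase(string connectionString)
    {
        string script;
        var asm = Assembly.GetExecutingAssembly();
        using (var stream = asm.GetManifestResourceStream("MyApp.Schema.sql"))
        using (var reader = new StreamReader(stream))
        {
            script = reader.ReadToEnd();
        }

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            foreach (var batch in script.Split(new[] { "\r\nGO\r\n" },
                                               System.StringSplitOptions.RemoveEmptyEntries))
            {
                using (var cmd = new SqlCommand(batch, conn))
                    cmd.ExecuteNonQuery();
            }
        }
    }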
As for documenting the schema (your second question, which in the future please ask as a separate question entirely), I'm not really sure what you mean. Are you just wanting to document the tables and rows within your database?

Data-Driven Websites for Very Small Businesses

I have a client who has a product-based website with hundreds of static product pages that are generated by Microsoft Access reports and pushed up to the ISP via FTP (it is an old design). We are thinking about getting a little more sophisticated and creating a data-driven website, probably using ASP.NET MVC.
Here's my question. Since this is a very small business (a handful of employees), I'd like to avoid enterprise patterns like web services if I can. How does one push updated product information to the website, batch-style? In a SQL Server environment, you can't just push up a new copy of the database, can you?
Clarification: The client already has a system at his facility where he keeps all of his product information and specifications. I would like to refresh the database at the ISP with this information.
You don't mention what exactly the data source is, but the implication is that it's not already in SQL Server. If that's the case, have a look at SSIS.
If the source data is in SQL Server, then I think you'd want to be looking at either transactional replication or log shipping to sync the two databases.
If you are modernizing, and it is a handful of employees, why would you push the product info out batch style?
I don't know exactly what you mean by "data driven", but why not allow the ASP.NET app to query the SQL Server product catalog database directly? Why generate static pages at all?
UPDATE: ok, I see, the real question is, how to update the SQL database running at the ISP.
You create an admin panel so the client can edit the data directly on the server. It is perfectly reasonable to have the client keep all their records on the server as long as the server is backed up nightly. Many cloud and virtual services offer easy ways to do replicated backups.
The additional benefit of this model is that more than one user can be adding or updating records at a time, making the workforce a lot more scalable. Likewise, the users can log in from anywhere they have a web browser to add new records, fix mistakes made in old records, etc.
EDIT: This approach assumes you can convince the client to abandon their current data entry system in favor of a centralized web-based management panel. Even if this isn't the case, the SQL database can be hosted on the server and the client's application could be made to talk to that so you're only ever using one database. From the sounds of it, it's a set of Access forms and macros which you should have source access to.
Assuming that there is no way to sync the data directly between your legacy system DB (is it in Access, or is Access just running the reports) and the SQL Server DB on the website (I'm not aware of any):
The problem with "pushing" the data directly into the SQL server will be that "old" (already in the DB) records won't be updated, but instead removed and then recreated. This is a big problem with foreign keys. Plus, I really don't like the idea of giving the client any access to the db at all.
So considering that, I find that the best approach is to write a relatively simple page that takes an uploaded file and updates the database. The file will likely be CSV, possibly XML. After a few iterations of writing these pages over the years, here's what I've come up with (a rough code sketch follows the list):
1. Show a file upload box.
2. On the next page load, save the file to a temp location.
3. Loop through each line (element, in XML) and validate all the data: foreign keys especially, but also business validations. You can also validate that the header row exists, etc. Don't update the database yet.
3a. If invalid data exists, save an error message to an array.
4. At the end of the loop, show the view.
4a. If there were errors, show the list of error messages and tell the user to re-upload the file.
4b. If there were no errors, create a link that has the file location from #2 and a confirmation flag.
5. After the file location and confirm flag have been submitted, run the loop from #3 again, but with an if (confirmed) {} statement that actually makes the updates to the database.
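A compressed ASP.NET MVC sketch of that flow; the controller and view names, and the way the temp path is handed back, are assumptions, and the real parsing and validation are elided. The point is that the same loop runs twice, and only the confirmed pass writes to the database.

    using System.Collections.Generic;
    using System.IO;
    using System.Web;
    using System.Web.Mvc;

    public class ProductImportController : Controller
    {
        [HttpPost]
        public ActionResult Upload(HttpPostedFileBase file)
        {
            var tempPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
            file.SaveAs(tempPath);                              // step 2: stash the file

            var errors = ProcessFile(tempPath, commit: false);  // step 3: validate only
            if (errors.Count > 0)
                return View("Errors", errors);                  // step 4a: ask for a re-upload

            return View("Confirm", (object)tempPath);           // step 4b: view links back with the temp path
        }

        [HttpPost]
        public ActionResult Confirm(string tempPath)
        {
            ProcessFile(tempPath, commit: true);                // step 5: same loop, now writing
            return RedirectToAction("Done");
        }

        private List<string> ProcessFile(string path, bool commit)
        {
            var errors = new List<string>();
            foreach (var line in System.IO.File.ReadLines(path))
            {
                // Validate foreign keys, business rules, header row, etc.
                // if (invalid) errors.Add("...");
                // if (commit && errors.Count == 0) { /* write the row to the DB */ }
            }
            return errors;
        }
    }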
EDIT: I saw your other post. One of the assumptions I made is that the databases won't be the same; i.e., the legacy app will have a table or two, maybe just products, but the new app will have orders, products, categories, etc. This will complicate "just uploading the file".
Why do you need to push anything?
You just need to create a product-management portion of the website and, secondly, a public-facing portion. Both portions would touch the same SQL Server database.
.NET has the ability to monitor a database and check for updates; then you can run a query to [push] the data elsewhere.
Or use SQL to push the data with a trigger on the table(s) in question.
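The "monitor a database" remark most likely refers to query notifications via SqlDependency; a minimal sketch under that assumption follows (placeholder table and connection string; Service Broker must be enabled on the database for notifications to fire).

    using System.Data;
    using System.Data.SqlClient;

    static void WatchProducts(string connectionString)
    {
        SqlDependency.Start(connectionString);

        var conn = new SqlConnection(connectionString);
        // Notification queries need explicit columns and two-part table names.
        var cmd = new SqlCommand("SELECT ProductId, Name, Price FROM dbo.Products", conn);

        var dependency = new SqlDependency(cmd);
        dependency.OnChange += (sender, e) =>
        {
            // A row changed: re-query and push/refresh the data elsewhere.
        };

        conn.Open();
        cmd.ExecuteReader().Close();   // executing the command registers the notification
    }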
Is this what you were looking for?
You can try Dynamic Data Web Application.
You should have a service that regularly updates the data in the target DB. It will probably run on your source data machine (where the Access DB is).
The service can use SSIS or ADO.NET to write the data. You can do this over the web, because I assume you have TCP/IP access to the server.
Check when the updates are done and how long they take. If you can run the updates during the night, you are fine. If not, check whether the site is still reachable during the import; that is sometimes not the case.
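One possible shape for that service, assuming the source really is an Access file: pull the products out with OleDb and bulk-load them into the web database. The file path, table names and the simple truncate-and-reload are assumptions for illustration.

    using System.Data;
    using System.Data.OleDb;
    using System.Data.SqlClient;

    static void PushProducts(string accessPath, string remoteConnectionString)
    {
        // Read the local Access table into memory.
        var products = new DataTable();
        using (var source = new OleDbConnection(
            "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + accessPath))
        {
            new OleDbDataAdapter("SELECT * FROM Products", source).Fill(products);
        }

        // Replace the remote copy over TCP/IP.
        using (var target = new SqlConnection(remoteConnectionString))
        {
            target.Open();
            using (var clear = new SqlCommand("TRUNCATE TABLE dbo.Products", target))
                clear.ExecuteNonQuery();                   // simple full reload

            using (var bulk = new SqlBulkCopy(target) { DestinationTableName = "dbo.Products" })
                bulk.WriteToServer(products);
        }
    }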
Use wget to push the new data file to the MVC app; once the data is received by the action, the MVC app invokes the processing/importing of the data (maybe in a worker process if you don't want long-running requests).
