I am planning to make a C# desktop app with SQL Server 2019 Express.
Because I am not so familiar with SQL Server, I'd appreciate any advice on the feasibility of my plan. Thanks in advance!
Here are my planned backup strategies:
This app is for personal use, I make it and I use it.
Any modification to the database will be wrapped as a transaction.
A full backup of the target database will be taken once a day, and a differential backup after every successfully committed transaction.
All backups are taken with the BACKUP statement, issued from C# code.
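Roughly, the code I have in mind looks like this (just a sketch; the database name and file paths are placeholders):

using System.Data.SqlClient;

// Sketch only: "MyAppDb" and the paths below are placeholders.
using (var conn = new SqlConnection(@"Server=.\SQLEXPRESS;Database=master;Integrated Security=true"))
{
    conn.Open();

    // Once a day: full backup, the base for all later differentials.
    using (var full = new SqlCommand(
        @"BACKUP DATABASE [MyAppDb] TO DISK = 'C:\Backups\MyAppDb_full.bak' WITH INIT", conn))
    {
        full.ExecuteNonQuery();
    }

    // After each committed transaction: differential backup. Each differential
    // contains all changes since the last full backup, so overwriting the
    // previous one (INIT) is fine.
    using (var diff = new SqlCommand(
        @"BACKUP DATABASE [MyAppDb] TO DISK = 'C:\Backups\MyAppDb_diff.bak' WITH DIFFERENTIAL, INIT", conn))
    {
        diff.ExecuteNonQuery();
    }
}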
Questions:
Because I have a backup for every modification, I can just use the SIMPLE recovery model and there is no need for transaction log backups. Is that right?
For the same reason, I don't need to take a backup every time I close the app, right?
I am asking because a differential backup takes about 560 KB even right after a full backup, with no changes made.
Do I need to back up system databases other than the one used by the app?
(I might change the hosting PC in the future. Can the backups be restored to another SQL Server instance without backups of the system databases?)
Thanks!
I am working on a project that uses Visual C# as the front end and SQL Server 2008 R2 Express as the backend.
Now, I know that a SQL Server Express database has a size limit of 10 GB, so I have written code to back up the database when the peak limit is reached, and I empty the database once the backup succeeds.
I want to know the best approach to restoring the backup file so that my application's current backend (which I emptied earlier) is not disturbed.
Is it okay to restore it into my current database? In that case, will it affect my application's operation? My application is close to real time and stores values in the database every 15 minutes.
Or do I need to write a separate utility for viewing the old data?
Around 50 MB of data is inserted into the database every day, so it will take around 8 months to reach the peak size (by my rough calculation). Given the nature of the application, users will not access the archived data frequently. Please consider this and suggest an approach.
Thanks in advance!
Hope I got your question right; consider the following way of working:
One database ("Current DB") stores the real-time data.
When it grows to the size limit, it is dumped (or its MDF+LDF files are copied) to an archive, stored with FROM-TO timestamps.
When old data is needed, the relevant MDF is attached as a separate "offline" database (you can use a connection string to attach an MDF file to a SQL Server instance), and that connection is used instead of the live one.
The application keeps running smoothly against the online database, while reading and loading of archived data is done from the temporarily attached (and later detached) database files.
Take a look at Connection String to Connect to .MDF for how to attach an MDF file to a SQL Server instance.
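For example (a sketch; the instance name, file path and table name are made up):

using System.Data.SqlClient;

// Sketch: attach an archived MDF as its own database and query it.
// The instance name, archive file path and table name are placeholders.
var cs = @"Server=.\SQLEXPRESS;" +
         @"AttachDbFilename=C:\Archive\Data_2012-01_2012-08.mdf;" +
         @"Integrated Security=true";

using (var conn = new SqlConnection(cs))
{
    conn.Open();
    // Read the archived data through this connection instead of the live one.
    using (var cmd = new SqlCommand("SELECT COUNT(*) FROM dbo.Measurements", conn))
    {
        int rows = (int)cmd.ExecuteScalar();
    }
}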
If you put the data in a whole new database server, your old queries won't work against the new one as-is. (Note that the SQL Server Express 10 GB limit is per database, not per server, so the archive does not strictly need its own server.)
You could create a new SQL Express server, link the two servers, and query across them with a linked server (see "How to create a linked server" on MSDN).
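Registering the second server on the first one could look roughly like this (a sketch; the server names, instance name and provider are assumptions for your setup, and it needs admin rights):

using System.Data.SqlClient;

// Sketch: register server2 as a linked server on server1. Run once, as admin.
// The server names, instance name and provider are placeholders.
using (var conn = new SqlConnection(@"Server=server1;Database=master;Integrated Security=true"))
{
    conn.Open();
    using (var cmd = new SqlCommand(
        @"EXEC sp_addlinkedserver @server = N'server2', @srvproduct = N'',
          @provider = N'SQLNCLI', @datasrc = N'server2\SQLEXPRESS';", conn))
    {
        cmd.ExecuteNonQuery();
    }
}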
You will need to adjust your queries.
If you query your data now like this:
SELECT em.Name, em.Telefone FROM employees AS em
you need to reference the linked server and database too:
SELECT em.Name, em.Telefone FROM [server1].[db1].dbo.employees AS em
for your current database, and
SELECT em.Name, em.Telefone FROM [server2].[backup].dbo.employees AS em
for the archive.
It is possible this way, but I would not advise it. If you have already exceeded 10 GB of data, you might have large tables. Each table queried on a linked server is copied completely to your server, which can cause serious network traffic and take quite some time to execute.
I would consider getting SQL Server Standard Edition instead.
I'm getting ready to develop an MVC 3 website with C#, Entity Framework and SQL Server.
This website is built for critical jobs, and data loss is absolutely not allowed! I have no prior experience with evolving a database, but I know this project should be able to evolve while using an incremental development methodology. Are there any guidelines to follow, and how do I evolve it without a single error, in terms of the initial database design or anything else? Zero data loss is the highest-priority requirement.
I need answers to these 2 questions and hope someone with experience can guide me:
How do I update the database (tables, columns) without affecting other data in the same table?
How do I update a remote database (for example, a C# Windows app where the database is not with me)?
For question 1 the database is located on my web server, but for question 2 the database sits at the user's end.
The answer is: it is your duty to design the upgrade process in such a way that your requirements are met. There is no magic that will do this for you.
The process usually involves creating an upgrade SQL script that modifies the database structure; if needed, it can also move data to temporary tables while the structure of the main tables is changed, so that no data is lost. You can also maintain the database version in a special table and check it before you run the update, to ensure the update script runs against the expected old version.
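A minimal sketch of that version check from C# (the table name, version numbers and the script itself are made up):

using System;
using System.Data.SqlClient;

// Sketch: only run the upgrade script if the DB is at the expected old version.
// "SchemaVersion", the version numbers and the script are placeholders.
string connectionString = @"Server=.\SQLEXPRESS;Database=MyDb;Integrated Security=true";
string upgradeScript = "ALTER TABLE dbo.Customers ADD Email nvarchar(200) NULL; " +
                       "UPDATE dbo.SchemaVersion SET Version = 4;";

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    using (var check = new SqlCommand("SELECT MAX(Version) FROM dbo.SchemaVersion", conn))
    {
        int current = (int)check.ExecuteScalar();
        if (current != 3)
            throw new InvalidOperationException(string.Format(
                "Expected schema version 3, found {0}.", current));
    }

    // The script alters the schema and bumps the version row in one go.
    using (var upgrade = new SqlCommand(upgradeScript, conn))
        upgrade.ExecuteNonQuery();
}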
There are tools like Red Gate SQL Compare and the Visual Studio database tools (Premium and Ultimate editions only) that can take an old database and a new database and create a difference script for you, so that the old database schema can be upgraded to the newer one. This works for most scenarios, but you must always test the result very carefully in your testing environment. It is best to test on a backup of your production database if possible.
How do you avoid data loss if anything goes wrong? There is only one very simple way: BACK UP THE DATABASE before you make any changes, and restore the old database if anything goes wrong. The backup can even be scripted in SQL. Without a successful backup, never touch your production database.
How do you upgrade the client-side database? Use the same process, but wrap it all in an installation package (an .msi, for example created with WiX).
I have written code to take a backup of a database; the application takes the backup automatically at a specified time. Now I want some help taking successive backups of the same database, rather than taking a complete database backup again and again. Can anyone help me?
How to make incremental backups with MS SQL Server in C#: http://www.mssqltips.com/tip.asp?tip=1849
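Note that restoring later means applying the last full backup plus the most recent differential, roughly like this (a sketch; the database name and file paths are placeholders):

using System.Data.SqlClient;

// Sketch: restore the full backup first, then the latest differential.
// Connect to master; the database name and file paths are placeholders.
using (var conn = new SqlConnection(@"Server=.\SQLEXPRESS;Database=master;Integrated Security=true"))
{
    conn.Open();

    // NORECOVERY leaves the database in a restoring state so the
    // differential can still be applied on top.
    new SqlCommand(@"RESTORE DATABASE [MyDb] FROM DISK = 'C:\Backups\MyDb_full.bak'
                     WITH NORECOVERY, REPLACE", conn).ExecuteNonQuery();

    // RECOVERY brings the database back online.
    new SqlCommand(@"RESTORE DATABASE [MyDb] FROM DISK = 'C:\Backups\MyDb_diff.bak'
                     WITH RECOVERY", conn).ExecuteNonQuery();
}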
I have a C# app (in Visual Studio 2010) that uses SQL Server 2005, accessed through TableAdapters in C#.
I haven't discovered a good way to manage DB changes. For example, my next release includes a bunch of schema changes. I made all of the changes in SQL Server Management Studio, but now I have to apply them manually to each production server in turn after deploying the new application code (slow and error prone).
Furthermore, if I decide to roll back the release to a previous version, I have to manually undo all my DB changes before I can deploy the old code (and now I am under time pressure, because the app is down). Again, this is very error prone.
Oh, and let's hope that none of my errors causes massive destruction to the production DB; otherwise I have to pull the most recent backup out of storage and try again (very time consuming).
I have heard of things like Migrations in Rails (and in ORMs like SubSonic). I think the newer ORM style (define your schema in C# code) helps alleviate a lot of this, but unfortunately, as I am using TableAdapters, I don't see how I could implement something like migrations.
How do people deal with this?
Release management for DBs usually involves migrating static data and running scripts to update or create programmability elements (sprocs, UDFs, triggers, etc.) and to modify existing schema definitions. It looks to me like you're missing the scripts. If you make changes manually to your development DB without creating scripts that mirror those changes, you will have to repeat the same manual steps against your test/production environments, which, as you say, is error prone and dangerous.
SQL Server Management Studio makes it easy to save scripts that reflect changes to any database object. The toolbar has an icon called "Generate change script", which gives you the option of saving the SQL file to disk. You can then use that file to perform the same change against another server. You can also script any or all stored procs, UDFs, triggers and so on manually, and run those against a server as well (just right-click on them).
As to rollback, that's normally achieved by restoring a backup of the database made just before the deployment process begins.
This whole process tends to be different for each company, but that's generally how it's done.
ORMs that auto-generate schemas have always seemed evil to me, not to mention pretty much impossible to use against a production box, but I guess that's also an option.
The easiest way to deal with this problem is to buy software that detects schema changes by comparing two databases and generates a change script that can update your target database. I use Visual Studio Ultimate 2010 for that, but there is also cheaper software that can do the same. This works for me 99% of the time (the only case where it did not work properly was when I renamed a table column).
If you don't have such a piece of software, it is crucial to write your SQL change scripts by hand. Whenever you make a change to the database schema, keep track of the SQL you used for that change and add it to one big file of schema changes for the next version of your software. It's a bit tedious at the beginning, but you'll get used to it pretty quickly.
Then when you are ready to deploy the software, proceed as follows:
Take the website offline.
Make a backup of your current production database.
Make a backup of your current production website.
Upload your new code to the server.
Run the DB change script you created earlier (either by hand or with the software mentioned above).
Take the website back online and check that it works. If it doesn't, and you can't fix the problem easily, revert to the previous website and DB version until you have fixed the bug.
All of these steps can easily be automated using batch files and SQL Server Agent or SQLCMD.
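For example, the change-script step can be driven from a small C# runner that executes the script batch by batch (a sketch; the GO splitting is deliberately naive and the paths are placeholders):

using System.Data.SqlClient;
using System.IO;
using System.Text.RegularExpressions;

// Sketch: run a hand-written change script against the target database.
// Splitting on GO only works when GO stands alone on a line; the script
// path and connection string are placeholders.
string script = File.ReadAllText(@"C:\Deploy\changes_v2.sql");
string[] batches = Regex.Split(script, @"^\s*GO\s*$",
                               RegexOptions.Multiline | RegexOptions.IgnoreCase);

using (var conn = new SqlConnection(@"Server=.\SQLEXPRESS;Database=MyDb;Integrated Security=true"))
{
    conn.Open();
    foreach (string batch in batches)
    {
        if (string.IsNullOrWhiteSpace(batch)) continue;
        new SqlCommand(batch, conn).ExecuteNonQuery();
    }
}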
Generally you should deploy to a staging server first, test your website very thoroughly there, and only then move on to the production server. This way you avoid long downtimes on your production server and minimize the risk of losing any vital data.
Here at Red Gate Software we're currently tackling this exact issue. Please take a look at our SSMS add-in, SQL Source Control, in combination with SQL Compare Pro. We're also working on a 'Migrations' feature, due out later this year, allowing custom migration scripts to be defined for specific version transitions. As we're still in the early stages of the project, there's still time to give us feedback and help us design a great solution. We'd love to speak to you further about your requirements!
SQL Server 2008, Visual Studio 08 and C#
The task is to create the same database on multiple servers. If it were only two or three tables I would have done it manually, but there are more than 50 tables in the database.
So I thought: why not create a backup and restore the backup file wherever needed?
The error: "the file is in use!" (The backup file is not in use; the error means the actual DB is in use. YES IT IS, I cannot shut the server down each time I want to restore the backup on another server!)
So what should I do? Please share your ideas.
Also note: whatever you suggest should also be achievable using SMO objects.
thank you
[REVISED - I need to learn to read better]
I'm not certain about SMO objects, but given that SMO can drive the same SQL Server functionality, what we do on our project is use a Database Project that deploys to our servers automatically. This probably requires the Database Edition of VS 2008 or higher (Premium in VS 2010). If you have that, creating a DB project is definitely a nice option. You then set it up to do a schema compare during deployment (and you might be able to do a data compare as well, if you need that). Auto-deployment is harder to set up initially, but once it's in place you get single-click deployments. We use TFS to deploy right now, but I hear good things about TeamCity:
http://www.jetbrains.com/teamcity/
Kevin
Use
EXEC usp_killDBConnections @DBName = 'DbName'
before you run the restore operation (usp_killDBConnections is a user-created stored procedure, not a built-in one). It kills all connections to the database before the restore, which is important because the restore cannot proceed while other connections are open.
Prior to running the restore, you can run the stored procedure sp_who2, which lists the current connections to the database. If you are going to overwrite the DB anyway, you can issue a KILL <spid> for each of those connections to forcibly close them.
I also recall there being a "close existing connections" option in the restore GUI in SQL Server Management Studio.
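Another common approach, which also works from plain C# (and should be scriptable via SMO as well, e.g. Server.KillAllProcesses), is to force the database into single-user mode right before the restore. A sketch, with the database name and backup path as placeholders:

using System.Data.SqlClient;

// Sketch: drop all other connections, restore, then reopen the database.
// "TargetDb" and the backup path are placeholders; connect via master.
using (var conn = new SqlConnection(@"Server=.\SQLEXPRESS;Database=master;Integrated Security=true"))
{
    conn.Open();

    // ROLLBACK IMMEDIATE kicks out every other connection at once.
    new SqlCommand("ALTER DATABASE [TargetDb] SET SINGLE_USER WITH ROLLBACK IMMEDIATE",
                   conn).ExecuteNonQuery();

    new SqlCommand(@"RESTORE DATABASE [TargetDb] FROM DISK = 'C:\Backups\TargetDb.bak'
                     WITH REPLACE", conn).ExecuteNonQuery();

    // Reopen the database for normal use after the restore.
    new SqlCommand("ALTER DATABASE [TargetDb] SET MULTI_USER", conn).ExecuteNonQuery();
}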