My console application uses the T-SQL BACKUP DATABASE statement to create a .bak file:
https://technet.microsoft.com/en-us/library/ms191304(v=sql.105).aspx
It works well, but the transaction log contains information about this backup (and restore). How can I delete this information from the transaction log programmatically? Or can I back up/restore a SQL database without adding this information to the transaction log?
No, you can't do any modification inside the transaction log.
If you use the SIMPLE recovery model, use CHECKPOINT to truncate the log.
If you are in FULL or BULK_LOGGED, back up your log (you may need to repeat this operation if the log is not truncated for some reason).
But what do you think it contains? There is no real data (regarding the backup) in there, only information about differential bitmap changes.
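A minimal sketch of both options, issued from the console application the question describes. The database name, log backup path, and the (already-open) connection are placeholders, not taken from the original post:

    using System.Data.SqlClient;

    // Sketch only: "MyDb" and the backup path are placeholders; conn is assumed
    // to be an open connection to the target database.
    static void TruncateLog(SqlConnection conn, bool simpleRecoveryModel)
    {
        string sql = simpleRecoveryModel
            // SIMPLE: a checkpoint lets the inactive part of the log be reused
            ? "CHECKPOINT;"
            // FULL / BULK_LOGGED: backing up the log is what allows truncation
            : @"BACKUP LOG [MyDb] TO DISK = N'C:\SqlBackups\MyDb.trn';";

        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.ExecuteNonQuery();
        }
    }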
I'm creating a desktop application that backs up a SQL Server database periodically and sends the backup as an attachment over email. After every iteration, I modify the file name to look like Backup, Backup_1, Backup_2 etc., to identify the files at the receiving end.
In the scenario where creating the backup is successful (say Backup_1), but sending the email has failed, I would like to delete the backup (Backup_1) so that I can reuse the same name in the next iteration. In this way there would not be any gaps at the receiving end.
Unfortunately, when I delete the backup file, I'm getting the exception: "The process cannot access the file 'D:\Backups\Backup_1.bak' because it is being used by another process." I'm not aware of any other process that can access this file.
Please advise on how to tackle this issue.
I tried running as administrator, closing SQL Server Management Studio, but nothing helped.
How do I find the process that's blocking me?
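For what it's worth, a common cause of this exception in a backup-then-email flow is the sending process itself: the MailMessage/Attachment still holds the .bak file open when the delete is attempted. A minimal sketch of the flow described above, assuming System.Net.Mail is used (addresses, host and path are placeholders):

    using System.IO;
    using System.Net.Mail;

    // Sketch only: dispose the attachment and message before deleting the file,
    // otherwise the .bak stays locked by this process itself.
    static void SendBackupOrCleanUp(string backupPath)
    {
        try
        {
            using (var message = new MailMessage("from@example.com", "to@example.com"))
            using (var attachment = new Attachment(backupPath))
            {
                message.Subject = "Database backup";
                message.Attachments.Add(attachment);

                using (var client = new SmtpClient("smtp.example.com"))
                {
                    client.Send(message);
                }
            }
        }
        catch (SmtpException)
        {
            // Sending failed: by the time we reach this handler the using blocks
            // above have disposed the attachment, so the file handle is released
            // and the backup can be deleted and its name reused next iteration.
            File.Delete(backupPath);
        }
    }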
I am using this command to back up my database in my WPF application. In this application I am using the ADO.NET Entity Data Model (database-first approach). After executing the command I am getting this error. Please guide me on why this happens.
"Access is Denied" means that the user the database service is running under does not have access to that backup file, which is probably only writable by your user since it's under your user directory.
Update the security settings of your file/folder so that the database process's user (probably SYSTEM, LOCAL SERVICE, NETWORK SERVICE or similar) can write to that file. Or host the backup file somewhere else where SQL Server has access.
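A minimal sketch of running the backup against a folder the SQL Server service account can write to, using a plain SqlCommand rather than the Entity Framework context. The connection string, database name and target path are placeholders:

    using System.Data.SqlClient;

    // Sketch only: back up to a folder the SQL Server service account can write to.
    static void BackupDatabase(string connectionString)
    {
        const string sql =
            @"BACKUP DATABASE [MyDb] TO DISK = N'C:\SqlBackups\MyDb.bak' WITH INIT";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.CommandTimeout = 0;   // backups can exceed the 30-second default
            conn.Open();
            cmd.ExecuteNonQuery();    // BACKUP DATABASE cannot run inside a transaction
        }
    }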
I am developing an ASP.NET web application and want to log my exceptions to a SQL DB. For this I am using the Log4Net AdoNetAppender to log info to the SQL DB. The problem occurs when the DB goes offline: Log4Net doesn't persist the log messages, so all messages sent while the DB was offline get lost. Is there any way to retain the messages until the DB comes online and then log all of them to the DB once it becomes available? Although by using reconnectonerror value="True" it starts logging again when the DB is available, all intermediate messages from while the DB was offline are not logged.
Or is there any other approach to log exceptions to a DB with offline support?
There is nothing that helps you out of the box with this. You can always log to a file (keep the last week or so) as well as the database. If for some reason there is a gap in the database logging, you can fall back on the file.
If you want the behavior you describe, you can always implement your own appender. Let it inherit from the AdoNetAppender and add your failover code. However, keep in mind that you do not want to create an appender which eats all your memory...
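A rough sketch of such an appender, assuming log4net's AdoNetAppender. The in-memory queue, its size cap and the reachability probe are illustrative additions of this sketch, not part of log4net:

    using System.Collections.Generic;
    using System.Data.SqlClient;
    using log4net.Appender;
    using log4net.Core;

    // Sketch only: queue events that could not be written and retry them on the
    // next flush. Keep reconnectOnError="true" in the configuration as well.
    public class FailoverAdoNetAppender : AdoNetAppender
    {
        private readonly Queue<LoggingEvent> _pending = new Queue<LoggingEvent>();
        private const int MaxPending = 10000;   // cap the backlog so it can't eat all memory

        protected override void SendBuffer(LoggingEvent[] events)
        {
            foreach (var e in events)
            {
                if (_pending.Count < MaxPending)
                    _pending.Enqueue(e);
            }

            if (!DatabaseIsReachable())
                return;                          // DB still offline: keep events queued

            var toSend = _pending.ToArray();
            _pending.Clear();
            base.SendBuffer(toSend);             // normal AdoNetAppender write
        }

        private bool DatabaseIsReachable()
        {
            try
            {
                using (var conn = new SqlConnection(ConnectionString))
                {
                    conn.Open();
                    return true;
                }
            }
            catch (SqlException)
            {
                return false;
            }
        }
    }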
I am trying to create a document manager for my winforms application. It is not web-based.
I would like to be able to allow users to "attach" documents to various entities (personnel, companies, work orders, tasks, batch parts etc) in my application.
After lots of research I have made the decision to use the file system to store the files instead of a blob in SQL. I will set up a folder to store all the files, but I will store the document information (filepath, uploaded by, changed by, revision etc) in parent-child relationship with the entity in an sql database.
I only want users to be able to work with the documents through the application, to prevent the files and database records from getting out of sync. I somehow need to protect the document folder from normal users but at the same time allow the application to work with it. My original thought was to set the application up with the only username and password that has access to the folder, and to use impersonation to log in to the folder and work with the files. From feedback in a recent thread I started, I now believe this was not a good idea, and working with impersonation has been a headache.
I also thought about using a web service, but some of our clients just run the application on their laptops with no Windows server. Most are using Windows Server or Citrix/Windows Server.
What would be the best way to set this up so that only the application handles the documents?
I know you said you read about blobs, but are you aware of the FILESTREAM option in SQL Server 2008 and onwards? Basically, rather than saving blobs into your database, which isn't always a good idea, you can instead save the blobs to the NTFS file system using transactional NTFS. This to me sounds like exactly what you are trying to achieve.
All the file access security would be handled through SQL Server (as it would be the only thing needing access to the folder), and you don't need to write your own logic for adding and removing files from the file system. To remove a file, you just delete the related record in the SQL Server table and it handles removing the file from the file system.
See:
http://technet.microsoft.com/en-us/library/bb933993.aspx
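With FILESTREAM the application still streams the file content, but it does so through SQL Server. A minimal sketch of reading one document with SqlFileStream; the table dbo.Documents, the FileData column and the connection string are hypothetical, and the column would be declared varbinary(max) FILESTREAM:

    using System.Data.SqlClient;
    using System.Data.SqlTypes;
    using System.IO;

    // Sketch only: FILESTREAM data is read inside a transaction via a path and
    // transaction context obtained from SQL Server.
    static byte[] ReadDocument(string connectionString, int documentId)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            {
                string path;
                byte[] txContext;

                using (var cmd = new SqlCommand(
                    "SELECT FileData.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT() " +
                    "FROM dbo.Documents WHERE DocumentId = @id", conn, tx))
                {
                    cmd.Parameters.AddWithValue("@id", documentId);
                    using (var reader = cmd.ExecuteReader())
                    {
                        reader.Read();
                        path = reader.GetString(0);
                        txContext = (byte[])reader[1];
                    }
                }

                using (var stream = new SqlFileStream(path, txContext, FileAccess.Read))
                using (var buffer = new MemoryStream())
                {
                    stream.CopyTo(buffer);
                    tx.Commit();
                    return buffer.ToArray();
                }
            }
        }
    }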
Option 1 (Easy): Security through Obscurity
Give everyone read (and write as appropriate) access to your document directories. Save your document 'path' as the full UNC path (\\servername\dir1\dir2\dir3\file.ext) so that your users can access the files, but they're not immediately obvious if someone goes wandering through their mapped drives.
Option 2 (Harder): Serve the File from SQL Server
You can use either a CLR function or SQLDMO to read the file from disk, present it as a varbinary field and reconstruct it on the client side. The upside is that your users will see a copy, not the real thing; that makes viewing safer, and editing and saving harder.
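A sketch of what the client side of option 2 could look like. dbo.GetDocumentBytes is a hypothetical stored procedure (backed by your CLR function) that reads the file on the server and returns it as varbinary(max):

    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    // Sketch only: the client receives bytes, never a path to the real file,
    // and works with a local temporary copy.
    static string FetchDocumentCopy(string connectionString, int documentId)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.GetDocumentBytes", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@DocumentId", documentId);

            conn.Open();
            var bytes = (byte[])cmd.ExecuteScalar();

            string tempCopy = Path.Combine(Path.GetTempPath(), "doc_" + documentId + ".tmp");
            File.WriteAllBytes(tempCopy, bytes);
            return tempCopy;
        }
    }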
Enjoy! ;-)
I'd go with these options, in no particular order.
Create a folder on the server that's not accessible to users. Have a web service running on the server (either using IIS or a standalone WCF app) that has methods to upload and download files. Your web service should manage the directory where the files are stored. The SQL database should have all the necessary metadata to find the documents. In this manner, only your app can get access to these files, so the users can only see the docs via the app.
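A sketch of what such a service's contract might look like in WCF; the interface and operation names are illustrative only:

    using System.ServiceModel;

    // Sketch only: the service owns the protected folder and the metadata rows,
    // so the client never touches the file system directly.
    [ServiceContract]
    public interface IDocumentService
    {
        [OperationContract]
        int UploadDocument(string fileName, byte[] content);   // stores the file, records metadata in SQL, returns an id

        [OperationContract]
        byte[] DownloadDocument(int documentId);                // looks up the path in SQL and returns the file content
    }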
I can see that you chose to store the documents on the file system. I wrote a similar system (e.g. attachments to customers/orders/sales people/etc...), except that I am storing them in SQL Server. It actually works pretty well. I initially worried that so much data was going to slow down the database, but that turned out not to be the case. It's working great. The only advice I can give if you take this route is to create a separate database for all your attachments. Why? Because if you want to get a copy of the RDBMS for your local testing, you do not want to be copying a 300GB database that's made up of 1GB of actual data and 299GB of attachments.
You mentioned that some of your users will be carrying laptops. In that case, they might not be connected to the LAN. If that is the case, I'd consider storing the files (and maybe metadata itself) in the cloud (EC2, Azure, Rackspace, etc...).
I have a dedicated server which hosts a Windows Service which does a lot of very heavy load stuff and populates a number of SQL Server database tables.
However, of all the database tables it populates and works with, I want only one to be synchronised with a remote SQL Azure DB table. This is because this table holds what I called Resolved data, which is the end result of the Windows Service's work.
I would like to keep a SQL Azure database table in sync with this database table.
As far as I understand, my options are:
Move everything onto Azure (but that involves a massive development overhead and risk)
Have another Windows service on the dedicated server which essentially looks at records changed since the last update and then manually updates the SQL Azure table
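A rough sketch of that second option. The table and column names (dbo.Resolved, LastModifiedUtc) and the connection strings are placeholders, and a rowversion column would work just as well as the change marker:

    using System;
    using System.Data.SqlClient;

    // Sketch only: copy rows changed since the last run from the on-premise
    // Resolved table into its SQL Azure copy.
    static void SyncResolvedSince(DateTime lastSyncUtc, string localCs, string azureCs)
    {
        using (var local = new SqlConnection(localCs))
        using (var azure = new SqlConnection(azureCs))
        {
            local.Open();
            azure.Open();

            var select = new SqlCommand(
                "SELECT Id, Payload, LastModifiedUtc FROM dbo.Resolved WHERE LastModifiedUtc > @since",
                local);
            select.Parameters.AddWithValue("@since", lastSyncUtc);

            using (var reader = select.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Upsert each changed row into the Azure table.
                    var upsert = new SqlCommand(
                        "MERGE dbo.Resolved AS target " +
                        "USING (SELECT @Id AS Id) AS source ON target.Id = source.Id " +
                        "WHEN MATCHED THEN UPDATE SET Payload = @Payload, LastModifiedUtc = @Modified " +
                        "WHEN NOT MATCHED THEN INSERT (Id, Payload, LastModifiedUtc) " +
                        "VALUES (source.Id, @Payload, @Modified);",
                        azure);
                    upsert.Parameters.AddWithValue("@Id", reader.GetInt32(0));
                    upsert.Parameters.AddWithValue("@Payload", reader.GetString(1));
                    upsert.Parameters.AddWithValue("@Modified", reader.GetDateTime(2));
                    upsert.ExecuteNonQuery();
                }
            }
        }
    }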
Microsoft has SQL Azure Data Sync, which is used with the Sync Framework:
http://www.microsoft.com/windowsazure/developers/sqlazure/datasync/
This can be used from T-SQL and the SQL Agent:
Link
Alternatively, you could have a daily backup performed on your main database tables that only backs up changes. Then you could transfer the backup file to the SQL Azure database host and restore from the file. At that point, you would have them in sync through daily scheduled tasks that shouldn't take very much effort. The BACKUP and RESTORE T-SQL commands should accomplish what you're looking for, and SQL Azure says it supports T-SQL.
Try SQL Examiner Suite
http://www.sqlaccessories.com/SQL_Examiner_Suite/
This tool can synchronize your local database with a SQL Azure database (both schema and data).