We have an application that uses a local SQL Server CE database file. When we open the database without making any changes to it, the database file is modified anyway:
using (var connection = new SqlCeConnection("Data Source='data.sdf';File Mode='Shared Read';Encrypt=FALSE;LCID=1033"))
{
connection.Open();
using (var context = new DataContext(connection))
{
}
}
This changes some bytes at the very beginning of the .sdf file.
Is there any way to prevent this?
Yes, you can enable read-only mode in the connection string. You may also need to specify a temp path in that case:
string connectionString = "...;Mode=Read Only;Temp Path=...;";
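A minimal sketch based on the snippet from the question (the temp path is a placeholder; point it at any writable folder):
using (var connection = new SqlCeConnection(
    "Data Source='data.sdf';Mode='Read Only';Temp Path='C:\\Temp';Encrypt=FALSE;LCID=1033"))
{
    connection.Open();
    // Reads work as usual; the .sdf file on disk is left untouched.
}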
I have a few thousand attachments saved in a SQL Server database in a column of type varbinary(max). I want to retrieve all of these files and save them to a local drive.
What is the best way to achieve that? I am not looking for code specifically, but trying to understand all the options so I can do this in C#.
Suggestions are appreciated. Thank you.
You can do either of the following:
Write a SQL script that reads the data from the table and saves each row's contents to disk.
Write a C# program that connects to the database, reads the data, and writes it out as files.
Use the Id of the table as part of the file name to make it unique if you are storing all the files in a single folder.
Here's example code.
Note: the System.Data.SqlClient namespace is not referenced by a .NET Core project by default, as it is in .NET Framework; you have to add the System.Data.SqlClient NuGet package to the project manually.
using System.Data.SqlClient;
var connectionString = "Data Source=localhost;Initial Catalog=MyDatabase;Integrated Security=True;";
var outputFolder = @"C:\temp\";
using var conn = new SqlConnection(connectionString);
conn.Open();
var query = "select DocumentId, Contents from DocumentFile where ID >= 1234";
using var cmd = new SqlCommand(query, conn);
using var reader = cmd.ExecuteReader();
if (!reader.HasRows) throw new Exception("No rows!");
while (reader.Read())
{
    // DocumentId keeps file names unique within the output folder.
    var fileName = $"{reader["DocumentId"]}.pdf";
    // Check for NULL before casting; casting DBNull to byte[] would throw.
    var contents = reader["Contents"];
    if (contents is DBNull) throw new Exception("Contents is null");
    var data = (byte[])contents;
    using var writer = new BinaryWriter(File.OpenWrite(Path.Combine(outputFolder, fileName)));
    writer.Write(data);
}
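One design note: the loop above buffers each attachment fully in memory, which is fine for typical documents. For very large blobs, ADO.NET can stream instead via CommandBehavior.SequentialAccess and SqlDataReader.GetStream. A sketch of a streaming variant of the loop, assuming DocumentId is an int in ordinal position 0 and Contents is ordinal 1:
using System.Data;

using var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess);
while (reader.Read())
{
    // With SequentialAccess, columns must be read in select-list order.
    var id = reader.GetInt32(0);          // DocumentId (assumed int)
    using var blob = reader.GetStream(1); // Contents, streamed from the server
    using var file = File.Create(Path.Combine(outputFolder, $"{id}.pdf"));
    blob.CopyTo(file);                    // copies in chunks, no full buffering
}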
I have a database in SQL Server 2008 R2 that uses the Simple recovery model.
The database contains a filegroup where the bulk of the data resides (>20GB of images). These images are not critical for the application.
I want to back up the database from C# using SQL Server SMO, but I only want to back up the database structure (the PRIMARY filegroup; everything except the non-essential images), in order to keep the backup size small.
In my C# code I set the backup action to BackupActionType.Files and include only the PRIMARY filegroup in the DatabaseFileGroups collection, so it should back up only the database structure and not the images.
But when I run the backup, I get this exception:
System.Data.SqlClient.SqlError: The primary filegroup cannot be backed up as a file backup because the database is using the SIMPLE recovery model. Consider taking a partial backup by specifying READ_WRITE_FILEGROUPS.
My question is: how can I specify READ_WRITE_FILEGROUPS from C# code using SQL Server SMO? The exception shows how to do so in T-SQL, but I want to do the same thing in C#.
Here is the code I am using:
class Program
{
static string DbName = PATH_TO_DATABASE;
static string connString = CONNECTION_STRING;
static void Main(string[] args)
{
ServerConnection serverConn = new ServerConnection();
serverConn.ConnectionString = connString;
Server server = new Server(serverConn);
Backup backup = new Backup() { Database = DbName };
backup.Action = BackupActionType.Files;
backup.DatabaseFileGroups.Add("PRIMARY");
backup.Devices.AddDevice("D:\\backup.bak", DeviceType.File);
backup.Initialize = true;
backup.ContinueAfterError = false;
backup.Incremental = false;
backup.Complete += (snd, e) => { Console.WriteLine("Complete"); };
backup.PercentComplete += (snd, e) => { Console.WriteLine("Percent " + e.Percent); };
backup.SqlBackup(server);
serverConn.Disconnect();
}
}
The solution is very simple.
In SQL Server Management Studio, right-click the database and, in the Properties window on the Options tab, change the Recovery model to Bulk-logged.
Second solution, using T-SQL:
USE [master]
GO
ALTER DATABASE [databasename] SET RECOVERY BULK_LOGGED WITH NO_WAIT
GO
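If you need to keep the SIMPLE recovery model instead, one workaround is to issue the partial-backup T-SQL that the error message suggests through the same SMO connection, since ServerConnection can execute raw statements. A sketch (the database name is a placeholder, the path comes from the question; note that READ_WRITE_FILEGROUPS includes every read-write filegroup, so the image filegroup would also need to be marked READ_ONLY to be excluded):
ServerConnection serverConn = new ServerConnection { ConnectionString = connString };
// Partial backup: PRIMARY plus all read-write filegroups.
serverConn.ExecuteNonQuery(
    @"BACKUP DATABASE [MyDatabase]
      READ_WRITE_FILEGROUPS
      TO DISK = 'D:\backup.bak' WITH INIT");
serverConn.Disconnect();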
I am trying to back up a database from a C# application. I found several tutorials on how to do that, copied one of the solutions into a new project, and ran it. Every time, I get a connection error like:
"backup failed for server ".
in this line:
source.SqlBackup(server);
Do you know how I can resolve this? I think the problem concerns the connection to the server (is it broken?).
Below you can see the Backup method:
public static void BackupDatabase(string backUpFile)
{
ServerConnection con = new ServerConnection(@".\SQLEXPRESS");
Server server = new Server(con);
Backup source = new Backup();
source.Action = BackupActionType.Database;
source.Database = "DB";
BackupDeviceItem destination = new BackupDeviceItem(backUpFile, DeviceType.File);
source.Devices.Add(destination);
source.SqlBackup(server);
con.Disconnect();
MessageBox.Show("Backup complete!");
}
A couple of things for you to try.
Make sure your database name is correct:
source.Database = "DB"; // Check the database name is actually 'DB'.
I had some issues in the past using ServerConnection with a connection string, even though the syntax allows you to do so. What I did was create a SqlConnection from the connection string and then give that to ServerConnection:
string connectionString = "Your connection string goes here";
SqlConnection sqlCon = new SqlConnection(connectionString);
ServerConnection connection = new ServerConnection(sqlCon);
I would also try initializing the backup object.
source.Initialize = true;
Adding Full Control for the PC's Users group to the backup folder on the C:\ drive helped! Thanks, everyone, for the help! Just one more question: how can I modify the C# code above so that the program creates the backup folder on C:\ itself and then copies the database? Currently I must do it manually.
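A minimal sketch of that change, assuming the folder is derived from the backup file path (requires System.IO; only the start of the method is shown, the rest stays as above):
public static void BackupDatabase(string backUpFile)
{
    // Create the target folder if it does not exist yet;
    // Directory.CreateDirectory is a no-op when it is already there.
    string folder = Path.GetDirectoryName(backUpFile);
    if (!string.IsNullOrEmpty(folder))
        Directory.CreateDirectory(folder);

    // ... the rest of the backup code from the method above ...
}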
Background:
I am using SQL statements to create a Temp database on a server, which will store data until it is needed by my client program.
Problem:
My SQL statement creates the database with all the required specifications when run through SQL Server Management Studio; when my program executes the same statement, however, it creates a database with the default settings for everything except the name.
Questions:
Why is this?
How can I make it create a database with my specifications?
SQL statement:
CREATE DATABASE Temp ON PRIMARY(
NAME = Temp
, FILENAME = 'C:\Temp.mdf'
, SIZE = 2MB
, FILEGROWTH = 10%) LOG ON (
NAME = Temp_Log
, FILENAME = 'C:\Temp.ldf'
, SIZE = 1MB, MAXSIZE = 70MB
, FILEGROWTH = 10%)
Code:
public void AcuConvert()
{
using (DestD)
{
SqlCommand command = new SqlCommand();
DestD.Open();
command.Connection = DestD;
foreach (var item in Entity.SqlDestinationQueries.ToList())
{
command.CommandText = item.Query;
command.ExecuteNonQuery(); //This is where the command is run
}
foreach (var item in Entity.SystemQueries.ToList())
{
command.CommandText = item.Query.Replace("#Sys", SysD.Database);
command.ExecuteNonQuery();
}
foreach (var item in Entity.InsertQueries.ToList())
{
command.CommandText = item.Query.Replace("#Source", SourceD.Database); ;
command.ExecuteNonQuery();
}
}
}
Have you tried using SQL Server Management Objects instead of a raw SQL statement?
For example:
using Microsoft.SqlServer.Management.Smo;
using Microsoft.SqlServer.Management.Common;
...
// Connect to the default instance
Server server = new Server();
// Establish new database details
Database database = new Database(server, "MyTempDB");
// Add primary filegroup details
database.FileGroups.Add(new FileGroup(database, "PRIMARY"));
// Set Primary datafile properties
DataFile primaryFile = new DataFile(database.FileGroups["PRIMARY"],
"MyTempDB_Data", "C:\\MyTempDB.mdf");
primaryFile.Size = 2048; // Sizes are in KB
primaryFile.GrowthType = FileGrowthType.Percent;
primaryFile.Growth = 10;
// Add to the Primary filegroup
database.FileGroups["PRIMARY"].Files.Add(primaryFile);
// Define the log file
LogFile logfile = new LogFile(database, "MyTempDB_Log", "C:\\MyTempDB_Log.ldf");
logfile.Size = 1024;
logfile.GrowthType = FileGrowthType.Percent;
logfile.Growth = 10;
logfile.MaxSize = 70 * 1024;
// Add to the database
database.LogFiles.Add(logfile);
// Create
database.Create();
database.Refresh();
You can connect to the server with specific credentials, too, of course:
http://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.smo.server.aspx
If, however, you're stuck with using text scripts to create your database, I'd ensure that your Initial Catalog is set correctly (i.e. to 'master') in your connection string and your user has the necessary permissions - CREATE DATABASE / CREATE ANY DATABASE / ALTER DATABASE. If this still doesn't give you any joy, try stripping out the rest of your C# code and run the create SQL independently of the other statements - it could be that there's a side-effect from a preceding query. Use Profiler to see exactly what's running as you add them back in.
Edit:
I tested your script against a local SQL Server instance (2012 SqlLocalDB) via a small C# program using SqlClient, and it ran just fine after I changed the file paths to ones I had write access to (the root of C:\ is protected by default). The only other amendment was that the primary file's size had to start at 3MB or more; any smaller and the default tables could not be created. This may be another avenue of investigation for you to explore.
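For reference, a stripped-down harness along those lines might look like this (the LocalDB instance name and file locations are assumptions; adjust them to your environment):
using System.Data.SqlClient;

// The script from the question, with the paths moved off the protected C:\ root
// and the primary size raised to 3MB as described above.
var createSql = @"CREATE DATABASE Temp ON PRIMARY(
    NAME = Temp, FILENAME = 'C:\Temp\Temp.mdf', SIZE = 3MB, FILEGROWTH = 10%)
    LOG ON (NAME = Temp_Log, FILENAME = 'C:\Temp\Temp.ldf',
    SIZE = 1MB, MAXSIZE = 70MB, FILEGROWTH = 10%)";

// CREATE DATABASE must run against master, hence the Initial Catalog.
using var conn = new SqlConnection(
    @"Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=master;Integrated Security=True");
conn.Open();
using var cmd = new SqlCommand(createSql, conn);
cmd.ExecuteNonQuery();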
An alternative option could be to use the Process class to run sqlcmd.exe.
Eg.
var process = Process.Start(WORKING_PATH, argument);
where WORKING_PATH is the path to the executable, e.g. "C:\tools\sql\sqlcmd.exe",
and argument is the input-file switch, e.g. "-i C:\scripts\Create.sql".
I use this strategy to dump test data into test environments when bootstrapping acceptance test fixtures.
Cheers
I have the code below trying to do a bulk copy from Oracle to SQL Server 2005, and it keeps timing out. How can I extend the Oracle connection timeout? From what I have read on the web, it seems I cannot.
OracleConnection source = new OracleConnection(GetOracleConnectionString());
source.Open();
SqlConnection dest = new SqlConnection(GetSQLConnectionString() );
dest.Open();
OracleCommand sourceCommand = new OracleCommand(@"select * from table", source); // the command needs its source connection assigned
using (OracleDataReader dr = sourceCommand.ExecuteReader())
{
using (SqlBulkCopy s = new SqlBulkCopy(dest))
{
s.DestinationTableName = "Defects";
s.NotifyAfter = 100;
s.SqlRowsCopied += new SqlRowsCopiedEventHandler(s_SqlRowsCopied);
s.WriteToServer(dr);
s.Close();
}
}
source.Close();
dest.Close();
here is my oracle connection string:
return "User Id=USER;Password=pass;Data Source=(DESCRIPTION=" +
"(ADDRESS=(PROTOCOL=TCP)(HOST=14.12.7.2)(PORT=1139))" +
"(CONNECT_DATA=(SID=QCTRP1)));";
You can set the s.BulkCopyTimeout property.
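For example (the value is illustrative; BulkCopyTimeout is measured in seconds, and 0 means no limit):
s.BulkCopyTimeout = 600; // allow up to 10 minutes for the whole copy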
In your connection string, there are 'Connection Lifetime' and 'Connection Timeout' parameters, which you can set accordingly; see the Oracle provider's connection string documentation for the full reference.
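A sketch applying that to the connection string from the question (the value is an assumption, and note that Connection Timeout governs how long establishing the connection may take, not how long a command may run):
return "User Id=USER;Password=pass;Connection Timeout=120;Data Source=(DESCRIPTION=" +
    "(ADDRESS=(PROTOCOL=TCP)(HOST=14.12.7.2)(PORT=1139))" +
    "(CONNECT_DATA=(SID=QCTRP1)));";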
BTW, I know you didn't ask this, but have you considered an ETL tool for migrating your DB records (e.g. Informatica, FME, etc.)? While your approach is valid, it isn't going to be very performant since you are hydrating all of the records from one DB to the client and then serializing them to another DB. For small bulk sets, this isn't a big issue, but if you were processing hundreds of thousands of rows, you might want to consider an official ETL tool.