Could we create a WCF service to merge SQL Server CE databases in a background thread? I have a GUI which selects a set of .sdf databases from various directories on a network, and an event handler that kicks off a merge of the databases in those directories for a given range of dates. The structure of the .sdf databases is the same in all the directories.
Okay... I was trying to be funny with my comment, but I will provide a better answer.
I don't know of any "magic" command to merge databases together with C# code. You could write your own logic to merge the databases. The code would look something like this:
Get a list of tables from the database.
Iterate through the list of tables and grab the metadata.
Add logic to skip any read-only columns, such as identity columns and calculated fields.
Get the data from the source table.
Write the data to the destination table.
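For what it's worth, here is a rough C# sketch of that outline for SQL Server CE, assuming every .sdf file shares the same schema. Connection strings, the date-range filter, and error handling are left out, and the paths are placeholders. One SQL CE-specific detail: its INFORMATION_SCHEMA.COLUMNS view exposes AUTOINC_INCREMENT, which is what lets you skip identity columns.

using System.Collections.Generic;
using System.Data.SqlServerCe;
using System.Linq;

static void MergeInto(string sourceSdf, string destSdf)
{
    using (var src = new SqlCeConnection("Data Source=" + sourceSdf))
    using (var dst = new SqlCeConnection("Data Source=" + destSdf))
    {
        src.Open();
        dst.Open();

        // 1. Get the list of tables from the source database
        var tables = new List<string>();
        using (var cmd = new SqlCeCommand(
            "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'TABLE'", src))
        using (var rdr = cmd.ExecuteReader())
            while (rdr.Read()) tables.Add(rdr.GetString(0));

        foreach (var table in tables)
        {
            // 2. Grab the metadata, skipping read-only (auto-increment) columns
            var cols = new List<string>();
            using (var cmd = new SqlCeCommand(
                "SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS " +
                "WHERE TABLE_NAME = @t AND AUTOINC_INCREMENT IS NULL", src))
            {
                cmd.Parameters.AddWithValue("@t", table);
                using (var rdr = cmd.ExecuteReader())
                    while (rdr.Read()) cols.Add(rdr.GetString(0));
            }

            string colList = string.Join(", ", cols);
            string paramList = string.Join(", ", cols.Select(c => "@" + c));

            // 3. Read from the source table and write to the destination table
            using (var read = new SqlCeCommand("SELECT " + colList + " FROM [" + table + "]", src))
            using (var rdr = read.ExecuteReader())
            using (var ins = new SqlCeCommand(
                "INSERT INTO [" + table + "] (" + colList + ") VALUES (" + paramList + ")", dst))
            {
                foreach (var c in cols)
                    ins.Parameters.Add(new SqlCeParameter("@" + c, null));
                while (rdr.Read())
                {
                    foreach (var c in cols)
                        ins.Parameters["@" + c].Value = rdr[c];
                    ins.ExecuteNonQuery();
                }
            }
        }
    }
}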
I would really need much more information to help any more than this. For example:
Will you be merging all the databases into one of the original databases, or will you be creating a completely new database which contains all the data from all databases?
Do you need to create the destination database first, or does it already exist?
If it already exists, will you need to clean up the data that is already in the database?
Once you write the logic to do the merge, how you call the code is your choice. .NET provides the capability to call the logic from a web service, and it provides the means to run the code in a background thread. You can implement this functionality in a variety of ways.
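To tie this back to the original question, here is a minimal sketch of one of those ways: a one-way WCF operation that returns to the GUI immediately and hands the merge to a background thread. The service and method names here are made up.

using System;
using System.ServiceModel;
using System.Threading.Tasks;

[ServiceContract]
public interface IMergeService
{
    [OperationContract(IsOneWay = true)]
    void StartMerge(string[] sdfPaths, DateTime from, DateTime to);
}

public class MergeService : IMergeService
{
    public void StartMerge(string[] sdfPaths, DateTime from, DateTime to)
    {
        // IsOneWay lets the caller return immediately; the merge runs in the background
        Task.Run(() => MergeDatabases(sdfPaths, from, to));
    }

    private void MergeDatabases(string[] sdfPaths, DateTime from, DateTime to)
    {
        // ... the merge logic outlined above ...
    }
}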
This is exactly what SQL Server Integration Services was designed to do. SSIS has source and destination adapters for connecting to SQL CE.
Related
I have a conceptual question.
I have two databases which have the same structure. One database already contains a lot of data. This data should be transferred to the other database via SELECT and INSERT.
How can I do this data migration with the highest performance?
My first approach was to sort all the tables into a list, where tables which contain foreign keys are placed after the tables they reference. But with this solution it is impossible to start parallel processing.
The second idea was to create a custom type which contains the table name, the names of the referenced tables, and a bool flag which stores whether the data in the table has been copied. An instance of this type is stored in a list for each table. Then I start a new thread for each table which checks, before copying, whether the referenced tables have already been copied. If not, I execute Thread.Sleep(), after which I check again.
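In code, the bookkeeping I have in mind looks roughly like this (CopyTable and the table list are placeholders, not a finished implementation):

using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class TableCopyInfo
{
    public string Name;
    public string[] ReferencedTables;
    public volatile bool Copied;
}

static void CopyTable(string name)
{
    // SELECT from the old database and INSERT into the new one (omitted)
}

static void CopyWhenReady(TableCopyInfo table, IList<TableCopyInfo> all)
{
    // Block until every table this one references has been copied
    while (table.ReferencedTables.Any(r => !all.First(t => t.Name == r).Copied))
        Thread.Sleep(250);

    CopyTable(table.Name);
    table.Copied = true;
}

// One task per table; tables without references start copying immediately:
// Task.WaitAll(tables.Select(t => Task.Run(() => CopyWhenReady(t, tables))).ToArray());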
Is there a well-performing approach to this problem?
Any suggestions will be helpful.
EDIT:
The old database is a SQLBase database.
The new database is a Microsoft SQL Server database.
You may use either:
1) SQL Server Replication
or
2) the SQL Server MERGE statement
You may use SQL Server linked servers to connect different database platforms (e.g. SQL Server, MySQL, DB2, ...).
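As a rough sketch of option 2 combined with a linked server, run from C#: the linked server name SQLBASESRV, the connection string, and the Customers table and columns below are all invented for illustration. The four-dot syntax is the form SQL Server uses for linked servers whose provider exposes no catalog or schema.

using System.Data.SqlClient;

string sql = @"
    MERGE INTO dbo.Customers AS target
    USING (SELECT Id, Name FROM SQLBASESRV...Customers) AS source
        ON target.Id = source.Id
    WHEN MATCHED THEN
        UPDATE SET target.Name = source.Name
    WHEN NOT MATCHED THEN
        INSERT (Id, Name) VALUES (source.Id, source.Name);";

using (var conn = new SqlConnection("Server=.;Database=NewDb;Integrated Security=true"))
using (var cmd = new SqlCommand(sql, conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();
}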
Best advice: stick with SQLBase.
Second best advice: use the SQLBase UNLOAD command via SQLTalk. This will write all the DDL statements required to recreate the database elsewhere, including triggers, stored procedures, indexes, etc., plus, if you use the right options, all the data to load, to an external file. This file can then optionally be edited programmatically to put it in a SQL Server format (not much difference). There are many options to the UNLOAD command which can't be covered in detail here, but here's a link to the syntax.
Note that SQLBase v12 has been released in recent months and performance has increased tenfold. With the right tuning, indexes, etc., it will outstrip SQL Server in terms of efficiency and performance. On a 100 GB database our response times have gone from 50 seconds to 3 seconds with no additional work.
I have two databases. One of them belongs to a CRM application and is the source.
The other one will be the destination used by a tool I'm developing.
The destination will contain a table ADDRESSES with a subset of the columns of a table of the same name in the source database.
What is the best (most efficient) way to copy the data between those databases? (By the way, they're on different SQL Server instances, if that's important.)
I could write a loop which does an INSERT into the destination for each row obtained from the source, but I don't think that's efficient.
My thoughts and information:
The data won't be altered on its way from source to destination
It will be altered on its way back
I don't have the complete structure of the source, but I know which fields I need and that they're guaranteed to be in the source (hence, accessing the rows obtained from the source by column index isn't possible).
I can't use LINQ.
Anything leading me in the right direction here is appreciated.
Edit:
I really need a C# way to copy the data. I also need to know how to merge the copied rows back into the source. Is it really necessary (or even best practice) to do this row by row?
Why write code to do this?
The single fastest and easiest way is just to use SQL Server's bcp.exe utility (bcp: Bulk Copy Program).
Export the data from the source server.
Zip it or tar it if it needs it.
FTP it over to where it needs to go, if you need to move it to another box.
Import it into the destination server.
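For reference, the two bcp calls might look like the following when driven from C# (the server, database, and table names are invented; -T uses a trusted connection and -n keeps the file in native format):

using System.Diagnostics;

static void RunBcp(string args)
{
    // bcp.exe ships with the SQL Server client tools and must be on the PATH
    using (var p = Process.Start(new ProcessStartInfo("bcp.exe", args) { UseShellExecute = false }))
    {
        p.WaitForExit();
    }
}

// Export from the source server...
RunBcp("SourceDb.dbo.ADDRESSES out addresses.dat -S SOURCESRV -T -n");
// ...then import into the destination server
RunBcp("DestDb.dbo.ADDRESSES in addresses.dat -S DESTSRV -T -n");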
You can accomplish the same thing via SQL Server Management Studio in a number of different ways. Once you've defined the task, it can be saved and it can be scheduled.
You can use SQL Server's PowerShell objects to do this as well.
If you're set on doing it in C#:
Write your SELECT query to get the data you want from the source server.
Execute that and populate a temp file with the output.
Execute SQL Server's BULK INSERT statement against the destination server to insert the data.
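If you'd rather skip the temp file, SqlBulkCopy can stream the reader straight into the destination table. A minimal sketch, with placeholder connection strings and columns (mapping by name also sidesteps your concern about column indexes):

using System.Data.SqlClient;

using (var src = new SqlConnection(sourceConnectionString))
using (var dst = new SqlConnection(destConnectionString))
{
    src.Open();
    dst.Open();

    using (var cmd = new SqlCommand("SELECT Id, Street, City, Zip FROM dbo.ADDRESSES", src))
    using (var reader = cmd.ExecuteReader())
    using (var bulk = new SqlBulkCopy(dst) { DestinationTableName = "dbo.ADDRESSES" })
    {
        // Map by name so the ordinal positions in the source don't matter
        bulk.ColumnMappings.Add("Id", "Id");
        bulk.ColumnMappings.Add("Street", "Street");
        bulk.ColumnMappings.Add("City", "City");
        bulk.ColumnMappings.Add("Zip", "Zip");
        bulk.WriteToServer(reader);
    }
}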
Note: For any of these techniques, you'll need to deal with identity columns if the target table has them. You'll also need to deal with key collisions. It is sometimes easier to bulk load the data into a perma-temp table first, and then apply the prerequisite transforms and manipulations to get it to where it needs to go.
According to your comment on Jwrit's answer, you want two-way syncs.
If so, you might want to look into Microsoft Sync Framework.
We use it to sync 200+ tables from on-premise SQL Server to SQL Azure and from SQL Azure to SQL Azure.
You can use it purely from C#. However, it might offer a lot more than you want, or it might be overkill for a small project.
I'm just saying this so that you have a different option for your project.
If these databases exist on two servers, you can set up a link between the servers by executing sp_addlinkedserver; there are instructions for setting this up here. This may come in handy if you plan on regularly "sharing" data.
http://msdn.microsoft.com/en-us/library/ff772782.aspx
Once the servers are linked, a simple INSERT ... SELECT statement can copy the rows from one table to another:
INSERT INTO db1.dbo.tblA (Field1, Field2, Field3)
SELECT Field1, Field2, Field3 FROM db2.dbo.tblB
If the databases are on the same instance, SQL similar to the above is all you need; across linked servers, prefix the names with the server (four-part names).
If this is a one-time job, the best bet is normally SSIS (SQL Server Integration Services). Unless there are complex data transformations, you can quickly and easily do the column mappings and have it done (reliably) in 15 minutes flat.
I am looking for an idea or some direction. I have to transfer data from one database to another; both are structurally and schema-wise the same.
It's a complete database with maybe 70 tables, with relationships between tables at different levels. Even though I'm going to mess up the identity values when I move across databases, as of now I am OK with that.
The idea I had was to load the required data from all the tables into XML, then open a connection to the second database and push the data from this XML. That is kind of repetitive and not the best way at all, so I am looking for direction.
Can I use Entity Framework for this somehow?
I cannot use SSIS for this; it has to be C#. Sorry.
You can create a linked server as stated in the comments to your question. You seemed to indicate that you know how to do this, but in case not, you can do this from SQL Server Management Studio by drilling down to "Server Objects > Linked Servers" beneath your source database on the source database server, then right-click, "New Linked Server", etc.
Then you would use a statement like this, for example, from your C# code:
insert into DestServer.DBName.dbo.TableName
select * from SourceServer.DBName.dbo.TableName
Assuming you are connected to 'SourceServer' and that 'SourceServer' maintains a linked server object pointing to 'DestServer'. Note: you don't actually need to use the fully-qualified name for the table on 'SourceServer', but I've put it there for clarity; i.e., you could also do this:
insert into DestServer.DBName.dbo.TableName
select * from TableName
Don't forget to set up the permissions properly in your linked server object so that your query can write to the table on the destination server. You can do this any number of ways, and often (because I work in a small environment that's maintained by just me and a couple of other folks) I just use the "sa" login.
Yes, you can use linked servers in .NET.
You just use the four-part name.
What you can do in T-SQL in SSMS, you can do in a .NET SqlCommand.
My experience is that you get better performance connecting to the server you are writing to.
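For example, something like this, following the advice to connect to the server you are writing to (the server, database, and table names are placeholders, and the destination server must hold the linked server object pointing at the source):

using System.Data.SqlClient;

using (var conn = new SqlConnection("Server=DestServer;Database=DBName;Integrated Security=true"))
using (var cmd = new SqlCommand(
    @"insert into dbo.TableName
      select * from SourceServer.DBName.dbo.TableName", conn))
{
    conn.Open();
    cmd.CommandTimeout = 0;   // large copies can blow past the 30-second default
    cmd.ExecuteNonQuery();
}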
I am using the Microsoft Sync Framework "collaboration" providers. Both ends of the sync will use SQL Express to begin with. When provisioned, the database contains a "_tracking" table for each "real" table in the database. My database is fairly large, and I don't want to transfer the entire thing via MSF on the first sync. Is there a way to use some other method to "jumpstart" the sync when both sides are known to contain the same data? In my testing, when both databases contain identical content, it looks like it downloads the entire scope, churns through the entire batch of "changes", and then uploads the entire scope back to the server, which then churns through the entire dataset again. Is there any way to update the _tracking tables (hopefully only on one side) to let the system know that the database contents are the same?
More information (edit):
From examining the contents of the tracking tables after doing an initial sync, it looks like the scope_update_peer_timestamp and local_create_peer_timestamp fields in every _tracking table need to be updated on both sides. In addition, the update_scope_local_id, scope_update_peer_key, and last_change_datetime need to be set on one of the two sides.
The last_change_datetime field is a datetime and is fairly self-explanatory.
The two _timestamp fields seem to use @@DBTS and are thus bigints that contain the equivalent of an editable timestamp column.
That still leaves a bunch of unknowns:
Does MSF track which peer the content of the timestamp columns come from?
Which peer (local or remote) drives the contents of the _timestamp fields?
What logic drives the contents of the update_scope_local_id and scope_update_peer_key fields?
More information on the environment (edit):
I have SQL Express/Std on both sides. The server side will eventually contain information for several clients (using multi-tenancy), so backups will not be useful since the server will contain information for multiple clients.
How are you initializing your databases? Are you provisioning databases that both contain the same set of data already?
The best way to initialize other replicas is to use the GenerateSnapshot method on the SqlCeSyncProvider, which creates an SDF file for initializing the other replicas, or to take a backup of the database (non-SDF, i.e. a SQL Server/Express database), restore it, and run PerformPostRestoreFixup before doing a sync.
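For the backup/restore route, the fixup step looks roughly like this. I'm recalling SqlSyncStoreRestore from the Sync Framework SQL Server provider (Microsoft.Synchronization.Data.SqlServer) from memory, so verify the class name against your version; the connection string is a placeholder.

using System.Data.SqlClient;
using Microsoft.Synchronization.Data.SqlServer;

using (var conn = new SqlConnection(restoredReplicaConnectionString))
{
    // Repairs the sync metadata after the provisioned backup was restored,
    // so the first sync doesn't re-send the whole scope
    var restore = new SqlSyncStoreRestore(conn);
    restore.PerformPostRestoreFixup();
}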
I am trying to replace a DTS Access exporter package with an exe we can call from our stored procedures (using xp_cmdshell).
We are in the middle of a transition between SQL 2000 and SQL 2005, so for the moment avoiding both DTS and SSIS would be the best option.
I believe I have the following options:
Using a SqlDataReader to read the SQL Server records and ADO.NET to insert them into Access.
I have implemented this and it is WAY too slow. This is not an option.
Setting up linked tables in Access, then getting Access to pull the data out of SQL Server.
If anyone has any experience doing this, I would be grateful for some code examples or pointers to some resources.
If there are any other options for transferring large amounts of data from SQL Server into an Access database, that would be awesome, but performance is a big issue, as we can be dealing with up to 1 million records per table.
Have you tried this?
Why not create a linked table in Access and pull data from SQL Server instead of pushing from SQL Server to Access?
I've done plenty of cases where I start with an Access database, attach to SQL Server, create a Create Table or Insert Querydef, and write some code to execute the querydef, possibly with arguments. But there are a lot of assumptions I would need to make about your problem and your familiarity with Access to go into more detail. How far can you get with that description?
I have ended up using Access interop; thanks to le dorfier for pointing me in the direction of the import function, which seems to be the simplest way.
I now have something along these lines:
// Requires a COM reference to the Microsoft Access Object Library
using System.Runtime.InteropServices;
using Access = Microsoft.Office.Interop.Access;

Access.ApplicationClass app = new Access.ApplicationClass();
Access.DoCmd doCmd = null;

// _args.Single(...) is our own helper: "o" = output mdb path, "s" = server, "d" = database
app.NewCurrentDatabase(_args.Single("o"));
doCmd = app.DoCmd;

// (elided here) Create a view on the server temporarily with the query I want to export
doCmd.TransferDatabase(Access.AcDataTransferType.acImport,
    "ODBC Database",
    string.Format("ODBC;DRIVER=SQL Server;Trusted_Connection=Yes;SERVER={0};Database={1}",
        _args.Single("s"), _args.Single("d")),
    Access.AcObjectType.acTable,
    viewName,          // the temporary view on the server
    exportDetails[0],  // the destination table name in the mdb
    false, false);

// (elided) Drop the view on the server
// Release the COM objects and exit properly
Marshal.ReleaseComObject(doCmd);
app.Quit(Access.AcQuitOption.acQuitSaveNone);
Marshal.ReleaseComObject(app);
Have you looked at bcp? It's a command-line utility that's supposed to work well for importing and exporting large amounts of data. I've never tried to make it play nice with Access, but it's a great lightweight alternative to DTS and/or SSIS.
Like others have said, the easiest way I know to get data into an Access mdb is to set things up in Access to begin with. Roughly speaking:
Create linked tables to the SQL Server data you want to export (in Access: File --> Get External Data --> Link Tables). This just gives you a connection to SQL Server.
Create a local table that represents the schema of the data you want to export (on the Tables tab, click the "New" button and follow your nose).
Create an append query that selects data from the linked tables (SQL Server) and appends rows to the local table (the Access mdb).
On the Macros tab, create a new macro that executes the query you just created above (I can't recall the exact "action" to use, but it's something like OpenQuery or RunQuery); name the macro "autoexec", which will cause it to run automatically when the mdb is opened.
Use a script (or whatever) to copy and open the mdb when appropriate; the autoexec macro will kick things off and the query will copy the data from SQL Server to the mdb.