Is there a way I can write C# code to write the structure of an SQL table and the contents of its data to text files? So far I figured out how to use BCP to copy the data, but I'm not sure how to copy the table creation script using C#.
Am I going down the right path or is there a better way to do this? I basically want to automate the table backup scripts that can be created in SQL Server. The ideal output would be two .sql files: one that creates the table and one that populates the data. BCP output isn't in SQL format, so you can't just execute it in SQL Server (which is what I want; it just makes things more user-friendly for the people I work for).
You can do this with SQL Server Management Objects (SMO), the successor to the older SQL-DMO library.
An example of scripting objects is here: http://msdn.microsoft.com/en-us/library/ms162153
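A minimal sketch of scripting a table's CREATE statement with SMO (the instance, database, table, and file names below are placeholders; you'll need references to Microsoft.SqlServer.Smo and Microsoft.SqlServer.ConnectionInfo):

```csharp
using System.IO;
using Microsoft.SqlServer.Management.Smo;

class ScriptTable
{
    static void Main()
    {
        var server = new Server(@"localhost\SQLEXPRESS"); // assumed instance name
        Database db = server.Databases["MyDatabase"];
        Table table = db.Tables["MyTable"];

        var options = new ScriptingOptions
        {
            ScriptDrops = false, // emit CREATE, not DROP
            Indexes = true,      // include index definitions
            DriAll = true        // include keys and constraints
        };

        // Table.Script returns the CREATE script line by line.
        using (var writer = new StreamWriter("create_table.sql"))
        {
            foreach (string line in table.Script(options))
                writer.WriteLine(line);
        }
    }
}
```

That covers the structure file; you already have BCP for the data. If I recall correctly, newer SMO versions can also emit INSERT statements (a `ScriptData` option on the `Scripter` class), which would give you the data file in SQL form as well, but check the documentation for your SMO version before relying on it.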
I need to export Snowflake data (millions of records in a single table) using a C# script and insert it into a SQL Server table.
The bulk export operation times out. How can I do this in batches in a C# script?
I have not done it before but what J.Salas is referring to (linking servers) looks like the best solution and is described here or here.
If you can't do the above for some reason, then it would be better to use Snowflake's COPY command to dump CSV files into a storage bucket, and then load them into your SQL Server database with one of its bulk-load utilities.
The main thing: don't build it yourself in C#. You'd just be reinventing something that already exists, and what exists is faster and better than what you could build yourself.
I have two databases. One of them belongs to a CRM software and is the source.
The other one will be the destination used by a tool I'm developing.
The destination will contain a table ADDRESSES with a subset of the columns of a table of the same name in the source database.
What is the best (most efficient) way to copy the data between those databases (btw: they're on different SQL Server instances if that's important).
I could write a loop which does INSERT into the destination for each row obtained from the source but I don't think that this is efficient.
My thoughts and information:
The data won't be altered on its way from source to destination
It will be altered on its way back
I don't have the complete structure of the source, but I know which fields I need and that they're guaranteed to be in the source (hence, I can't access the rows obtained from the source by column index, only by column name)
I can't use LINQ.
Anything leading me in the right direction here is appreciated.
Edit:
I really need a C# way to copy the data. I also need to know how to merge the copied rows back to the source. Is it really necessary (or even best practice) to do this row by row?
Why write code to do this?
The single fastest and easiest way is just to use SQL Server's bcp.exe utility (bcp: Bulk Copy Program).
Export the data from the source server.
Zip it or tar it if it needs it.
FTP it over to where it needs to go, if you need to move it to another box.
Import it into the destination server.
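The steps above can be scripted from C# by shelling out to bcp.exe (server, database, and table names are placeholders; -T uses Windows authentication and -n keeps SQL Server's native format):

```csharp
using System.Diagnostics;

class BcpExportImport
{
    // Runs a command line and waits for it to finish.
    static void Run(string fileName, string arguments)
    {
        var p = Process.Start(new ProcessStartInfo
        {
            FileName = fileName,
            Arguments = arguments,
            UseShellExecute = false
        });
        p.WaitForExit();
    }

    static void Main()
    {
        // 1. Export from the source server.
        Run("bcp", "SourceDb.dbo.ADDRESSES out addresses.bcp -n -S SourceServer -T");

        // 2./3. Compress and move the file here if the servers are on
        //       different boxes (zip/ftp not shown).

        // 4. Import into the destination server.
        Run("bcp", "DestDb.dbo.ADDRESSES in addresses.bcp -n -S DestServer -T");
    }
}
```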
You can accomplish the same thing via SQL Server Management Studio in a number of different ways. Once you've defined the task, it can be saved and it can be scheduled.
You can use SQL Server's PowerShell objects to do this as well.
If you're set on doing it in C#:
write your select query to get the data you want from the source server.
execute that and populate a temp file with the output.
execute SQL Server's bulk insert statement against the destination server to insert the data.
Note: For any of these techniques, you'll need to deal with identity columns if the target table has them. You'll also need to deal with key collisions. It is sometimes easier to bulk load the data into a perma-temp table first, and then apply the prerequisite transforms and manipulations to get it to where it needs to go.
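If you do go the C# route, ADO.NET's SqlBulkCopy can stream a reader from the source straight into the destination without the temp file. A minimal sketch (connection strings, column names, and table names are placeholders):

```csharp
using System.Data.SqlClient;

class CopyAddresses
{
    static void Main()
    {
        const string sourceCs = "Server=SourceInstance;Database=Crm;Trusted_Connection=True;";
        const string destCs   = "Server=DestInstance;Database=Tool;Trusted_Connection=True;";

        using (var source = new SqlConnection(sourceCs))
        using (var dest = new SqlConnection(destCs))
        {
            source.Open();
            dest.Open();

            // Select only the columns the destination needs, by name,
            // so the source table's column order doesn't matter.
            var cmd = new SqlCommand(
                "SELECT Id, Street, City, Zip FROM dbo.ADDRESSES", source);

            using (SqlDataReader reader = cmd.ExecuteReader())
            using (var bulk = new SqlBulkCopy(dest))
            {
                bulk.DestinationTableName = "dbo.ADDRESSES";
                bulk.BatchSize = 5000;      // commit in batches
                bulk.WriteToServer(reader); // streams rows; far faster than row-by-row INSERTs
            }
        }
    }
}
```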
According to your comment on Jwrit's answer, you want a two-way sync.
If so, you might want to look into Microsoft Sync Framework.
We use it to sync 200+ tables from on-premises SQL Server to SQL Azure, and from SQL Azure to SQL Azure.
It can be used purely from C#. However, it might offer a lot more than you need, and it might be overkill for a small project.
I'm just mentioning it so that you have different options for your project.
If these databases exist on two servers, you can set up a link between the servers by executing sp_addlinkedserver; there are instructions for setting this up here. This may come in handy if you plan on regularly "sharing" data.
http://msdn.microsoft.com/en-us/library/ff772782.aspx
Once the servers are linked, a simple INSERT ... SELECT can copy the rows from one table to the other. Note the four-part name for the linked server's table:
INSERT INTO db1.dbo.tblA ( Field1, Field2, Field3 )
SELECT Field1, Field2, Field3 FROM LinkedServer.db2.dbo.tblB
If the databases are on the same instance, you only need the three-part name (db2.dbo.tblB).
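Both the one-time setup and the copy can be driven from C# with plain SqlCommand calls. A sketch, with placeholder server and database names (sp_addlinkedserver normally requires elevated permissions):

```csharp
using System.Data.SqlClient;

class LinkedServerCopy
{
    static void Main()
    {
        const string cs = "Server=Server1;Database=db1;Trusted_Connection=True;";
        using (var conn = new SqlConnection(cs))
        {
            conn.Open();

            // One-time setup: register Server2 as a linked server.
            new SqlCommand(
                "EXEC sp_addlinkedserver @server = N'Server2', @srvproduct = N'SQL Server';",
                conn).ExecuteNonQuery();

            // Copy rows using the four-part name for the remote table.
            new SqlCommand(
                @"INSERT INTO db1.dbo.tblA (Field1, Field2, Field3)
                  SELECT Field1, Field2, Field3
                  FROM Server2.db2.dbo.tblB;", conn).ExecuteNonQuery();
        }
    }
}
```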
If this is a one-time job, the best bet is normally SSIS (SQL Server Integration Services). Unless there are complex data transformations, you can quickly and easily map the columns and have it done reliably in 15 minutes flat.
I've got a bunch of SQL dump files that I'd like to import into a dataset with C#. The machine this code will run on does not have SQL installed. Is there any way to do this short of parsing the text manually?
Thanks!
EDIT: The Dump file is just a long list of SQL statements specifying the schema and values of the database.
Is this a SQL dump of the backup sort that you create via a DUMP or BACKUP statement? If so the answer is pretty much no. It's essentially a snapshot of the physical structure of the database. You need to restore/load the dump/backup to an operable database before you can do anything with it.
I am pretty sure there is no way to turn a bunch of arbitrary SQL statements into an ADO.NET DataSet without running the SQL commands through a database engine that understands the SQL in the dump file. You can't even do it between databases; e.g., a MySQL dump file will not run on an MS SQL Server.
You don't have to go manual, once you know the format. Creating a DataSet can be done two ways:
Set up code to loop the file and create a dataset directly
Set up code to loop the file and create an XML document in dataset format
Either will work. The best depends on your familiarity with the above (HINT: If you have no familiarity with either, choose the dataset).
It is all a question of format. What type of database is the DUMP file from? MySQL? Postgress? other?
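If you do loop the file yourself, here is a minimal sketch of the first approach. It only handles a deliberately simplified format (one single-row INSERT per line, single-quoted values, no embedded commas or quotes), so treat it as a starting point, not a general SQL parser:

```csharp
using System.Data;
using System.Text.RegularExpressions;

static class DumpParser
{
    // Parses statements like:
    //   INSERT INTO Users VALUES ('alice', '42');
    // into rows of a DataTable. All values are kept as strings.
    public static DataTable Parse(string tableName, string[] lines, string[] columns)
    {
        var table = new DataTable(tableName);
        foreach (string col in columns)
            table.Columns.Add(col, typeof(string));

        var insertPattern = new Regex(
            @"INSERT\s+INTO\s+" + Regex.Escape(tableName) +
            @"\s+VALUES\s*\((?<values>.*)\)\s*;",
            RegexOptions.IgnoreCase);

        foreach (string line in lines)
        {
            Match m = insertPattern.Match(line);
            if (!m.Success) continue; // skip comments, DDL, blank lines, etc.

            string[] values = m.Groups["values"].Value.Split(',');
            DataRow row = table.NewRow();
            for (int i = 0; i < columns.Length && i < values.Length; i++)
                row[i] = values[i].Trim().Trim('\'');
            table.Rows.Add(row);
        }
        return table;
    }
}
```

The resulting DataTable can be dropped into a DataSet with `dataSet.Tables.Add(table)`.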
Is it possible to create database files on the fly that are independent of any SQL Server engine? The thing is that we want to create one database file per user in our application, and we want that database to be stored in the user's directory, but we don't want to connect to it through SQL Server. Pretty much like an .sdf file, but user-specific instead of app-specific.
So what we want to do is that every time a user is created in the app, generate their folder, and generate the database schema, which will be populated later.
Sounds like you're looking for SQL Server Compact. You can create any number of these databases at runtime and place them anywhere you want.
Sounds like you really want a combo of "Access" and the Jet engine.
It meets the database in a single file requirement and is reasonably performant for single user access.
Another possibility would be SQLite, which is IMHO a better database, but its C# support seems immature.
You can also keep an empty database file stored somewhere, and each time a new user is created, copy that empty file into the user's folder. The schema will already be there in the empty file (it works like a template, much like the model database in SQL Server).
SQL Server Express/Compact with SMO to programmatically create your DB.
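For the SQL Server Compact route, a sketch of creating a per-user .sdf file and its schema at runtime (requires a reference to System.Data.SqlServerCe; the folder, file, and table names are placeholders):

```csharp
using System.Data.SqlServerCe;
using System.IO;

class PerUserDb
{
    // Creates the user's folder, an empty .sdf database inside it,
    // and the schema the app will populate later.
    static void CreateUserDatabase(string userDir)
    {
        Directory.CreateDirectory(userDir);
        string dbFile = Path.Combine(userDir, "user.sdf");
        string cs = "Data Source=" + dbFile;

        using (var engine = new SqlCeEngine(cs))
            engine.CreateDatabase(); // writes the .sdf file to disk

        using (var conn = new SqlCeConnection(cs))
        {
            conn.Open();
            new SqlCeCommand(
                "CREATE TABLE Settings (Name NVARCHAR(50) PRIMARY KEY, Value NVARCHAR(200))",
                conn).ExecuteNonQuery();
        }
    }
}
```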
I am trying to replace a DTS Access exporter package with an exe we can call from our stored procedures (using xp_cmdshell).
We are in the middle of a transition between SQL 2000 and SQL 2005, and for the moment, a solution that uses neither DTS nor SSIS would be the best option.
I believe I have the following options:
Using a SQL data reader to read SQL records, and using ADO.NET to insert the read records into Access.
I have implemented this and it is WAY too slow. This is not an option.
Setting up linked tables in Access, then getting Access to pull the data out of SQL.
If anyone has any experience in doing this I would be grateful for some code examples or pointing out some resources?
If there are any other options for transferring large amounts of data from SQL into an Access database, that would be awesome, but performance is a big issue, as we can be dealing with up to 1 million records per table.
Have you tried this?
Why not create a linked table in Access and pull data from SQL Server, instead of pushing from SQL to Access?
I've done plenty of cases where I start with an Access database, attach to SQL Server, create a Create Table or Insert Querydef, and write some code to execute the querydef, possibly with arguments. But there are a lot of assumptions I would need to make about your problem and your familiarity with Access to go into more detail. How far can you get with that description?
I have ended up using Access interop, thanks to le dorfier for pointing me in the direction of the import function, which seems to be the simplest way.
I now have something along these lines:
Access.ApplicationClass app = new Access.ApplicationClass();
app.NewCurrentDatabase(_args.Single("o"));
Access.DoCmd doCmd = app.DoCmd;

// Create a view on the server temporarily with the query I want to export.

// Import the view's contents into the mdb over ODBC.
doCmd.TransferDatabase(Access.AcDataTransferType.acImport,
    "ODBC Database",
    string.Format("ODBC;DRIVER=SQL Server;Trusted_Connection=Yes;SERVER={0};Database={1}",
        _args.Single("s"), _args.Single("d")),
    Microsoft.Office.Interop.Access.AcObjectType.acTable,
    viewName,
    exportDetails[0], false, false);

// Drop the view on the server.
// Release the COM objects and exit properly.
Have you looked at bcp? It's a command-line utility that's supposed to work well for importing and exporting large amounts of data. I've never tried to make it play nice with Access, but it's a great lightweight alternative to DTS and/or SSIS.
Like others have said, the easiest way I know to get data into an Access mdb is to set things up in Access to begin with. Roughly speaking:
Create linked tables to the SQL data you want to export. (in Access: File --> Get External Data --> Link Tables) This just gives you a connection to SQL Server.
Create a local table that represents the schema of the data you want to export. (on the Tables tab, click the "New" button and follow your nose).
Create an append query that selects data from the linked tables (SQL Server) and appends rows to the local table (Access mdb).
On the Macros tab, create a new macro that executes the query you just created above (the "OpenQuery" action will run a saved query); name the macro "autoexec", which will cause it to automatically run when the mdb is opened.
Use a script (or whatever) to copy and open the mdb when appropriate; the autoexec macro will kick things off and the query will copy data from SQL server to the mdb.