I have a question, clearly since I'm here...
I have a source database (SQL Server 2008) and a destination database (also SQL Server 2008). Modifications need to be made along the way, which means a lot of the data has to pass through C# (converting coordinates, triangulation, etc.). But how would you do it?
I'm looking for a few things here. Right now I'm using a SqlDataReader, pulling the data into a DataTable, and manipulating it there before I push it into the destination database. What would be a better (faster and more memory-efficient) way of doing it?
Also, for data which does not need to be manipulated I'm still pulling it through C# in the same way; I assume there is a way of avoiding that round trip, which would be quicker?
Technical info:
DB: 2 x SQL Server 2008 (source/dest) - On the same server
Language: C# / .NET 3.5
OS: Windows
I would insert the source data into a temp (staging) table on the destination database.
Then I would use the SQL MERGE statement to update the destination:
MERGE (Transact-SQL)
I would completely minimize the role that C# plays.
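As a rough illustration of that pattern (not the answerer's exact code; the table and column names here are invented), the C# side reduces to a SqlBulkCopy into the staging table followed by a single MERGE command:

```csharp
// Sketch: bulk-load transformed rows into a staging table, then MERGE
// them into the destination in one set-based statement. Assumes the
// staging table dbo.CoordinatesStaging already exists.
using System.Data;
using System.Data.SqlClient;

static class StagingMerge
{
    public static void LoadAndMerge(DataTable transformedRows, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Stream the transformed rows to the server.
            using (var bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "dbo.CoordinatesStaging";
                bulk.WriteToServer(transformedRows);
            }

            // Let SQL Server do the insert/update work.
            const string mergeSql = @"
                MERGE dbo.Coordinates AS target
                USING dbo.CoordinatesStaging AS source
                   ON target.Id = source.Id
                WHEN MATCHED THEN
                    UPDATE SET target.X = source.X, target.Y = source.Y
                WHEN NOT MATCHED THEN
                    INSERT (Id, X, Y) VALUES (source.Id, source.X, source.Y);";

            using (var cmd = new SqlCommand(mergeSql, conn))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```

For the tables that need no manipulation at all, the same idea applies without materializing a DataTable: SqlBulkCopy can consume a SqlDataReader directly via bulk.WriteToServer(reader), which streams row by row and keeps memory flat.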
You could host the .NET code on the SQL Server itself using the SQL CLR, in order to remove the round-trip time between the database and the client machine.
You should use Integration Services for ETL tasks.
As stated in the title, how can I programmatically create a SQL Server CE 4.0 database from a remote SQL Server database?
I want my application to allow users to delete the .sdf file and create a new one based on the new remote database schema whenever there is a schema update, and then download the relevant data for offline use.
I have already read up on the SqlCeEngine part, but I am not good with SQL Server CE queries - they seem to give many syntax errors when I try them out in Management Studio.
I also tried Microsoft Sync Framework snapshot synchronization, but it feels too bulky, and the local cache database modifies my database schema and generates a lot of junk I do not need. Maybe a lower-level solution, like querying INFORMATION_SCHEMA or something similar, would work better?
Check out DMO. Using managed code, you can enumerate objects such as tables and columns on the SQL Server side.
http://msdn.microsoft.com/en-us/library/aa174487(v=sql.80).aspx
Here's a tutorial for SMO (SQL-DMO's successor) to get you started:
http://www.codeproject.com/KB/database/SMO_Tutorial_1.aspx
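For instance, a minimal SMO sketch for enumerating the remote schema might look like this (the server and database names are placeholders, and you need references to the Microsoft.SqlServer.Smo and Microsoft.SqlServer.ConnectionInfo assemblies):

```csharp
// Enumerate tables and columns on the source server with SMO.
using System;
using Microsoft.SqlServer.Management.Smo;

class EnumerateSchema
{
    static void Main()
    {
        var server = new Server(@".\SQLEXPRESS");   // example server name
        Database db = server.Databases["MySourceDb"]; // example database name

        foreach (Table table in db.Tables)
        {
            Console.WriteLine(table.Name);
            foreach (Column column in table.Columns)
            {
                Console.WriteLine("  {0} ({1})", column.Name, column.DataType);
            }
        }
    }
}
```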
Concerning the data, one option is the bcp utility
http://msdn.microsoft.com/en-us/library/aa337544.aspx
Those are good starting points if you want to extract and create a new database. For mirroring/sync, it's probably not a good path. If it's read-only data on the client and you just want to update the local data, then you can simply extract again and throw away the old 'data cache'.
You can use my scripting API and command line tools to do this: http://exportsqlce.codeplex.com - see for example this blog post: http://erikej.blogspot.com/2010/02/how-to-use-exportsqlce-to-migrate-from.html
This may be a more up-to-date way, using SQL only:
SELECT * FROM INFORMATION_SCHEMA.COLUMNS
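To illustrate how that combines with SqlCeEngine, here is a hedged sketch: all connection strings and names are placeholders, and turning the metadata into CREATE TABLE statements for the .sdf (the fiddly part) is left out.

```csharp
// Create an empty SQL Server CE database, then read the remote schema
// via INFORMATION_SCHEMA so it can be recreated locally.
using System;
using System.Data.SqlClient;
using System.Data.SqlServerCe;

class CreateLocalCache
{
    static void Main()
    {
        // Creates the .sdf file from scratch.
        new SqlCeEngine("Data Source=LocalCache.sdf").CreateDatabase();

        using (var conn = new SqlConnection(
            "Data Source=REMOTE;Initial Catalog=MyDb;Integrated Security=True"))
        {
            conn.Open();
            var cmd = new SqlCommand(
                @"SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE
                  FROM INFORMATION_SCHEMA.COLUMNS
                  ORDER BY TABLE_NAME, ORDINAL_POSITION", conn);

            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Feed these into your CREATE TABLE generation.
                    Console.WriteLine("{0}.{1} ({2})",
                        reader.GetString(0), reader.GetString(1), reader.GetString(2));
                }
            }
        }
    }
}
```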
I am writing software which stores all the information about a user's interactions in a global session object/class. I would like to store the values collected in persistent storage. However, I cannot use heavy databases such as SQL Server or MySQL on the target PC, as I need to keep the installer as small as possible.
I also need to retrieve values from the storage with simple LINQ queries, etc.
My question is what is the next best thing to databases which can be manipulated by C# code?
Probably either SQLite or SQL Server Compact Edition - these are both fairly full-featured database systems that run entirely in-process and are frequently used for these sorts of things (for example, Firefox uses SQLite to store bookmarks).
The next rung down the ladder of complexity would be either XML (using LINQ to XML) or just serialisable objects (using LINQ to Objects). You would of course incur performance penalties over a "proper" compact database like SQLite if you started storing a lot of data, but you would probably need to store more than you think before it became noticeable, and for small data sets the simplicity would even make this faster than SQLite (for example, you could restrict your application to storing the last 100 or so actions).
SQL Server CE and SQLite are popular for the scenario you are describing. XML is as well.
You could connect to Access MDB files. You don't need a SQL server for this, and the SQL syntax is largely the same.
You just need to use OleDb.
Example: DataEasy: Connect to MS Access (.mdb) Files Easily using C#
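A minimal sketch of the idea (the file path, table, and column names are invented):

```csharp
// Read rows out of an Access .mdb file through OleDb - no server needed.
using System;
using System.Data.OleDb;

class ReadMdb
{
    static void Main()
    {
        const string connStr =
            @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\store.mdb";

        using (var conn = new OleDbConnection(connStr))
        {
            conn.Open();
            var cmd = new OleDbCommand("SELECT Id, Name FROM Customers", conn);
            using (OleDbDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetString(1));
                }
            }
        }
    }
}
```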
Complete newbie question here: I'm just playing with C# for the first time and want to make a Windows Forms application which stores some info in a database structure, but obviously don't want to require something like MySQL to be installed on each client's computer. How do I go about this?
You can use SQLite. It doesn't require any installation or a server on the client's computers. Here is a blog post describing how to use it with .NET. It is easy to use; just add a reference to System.Data.SQLite.dll.
Here is an open source data provider for .NET: System.Data.SQLite
From homepage: "SQLite is a software library that implements a self-contained, serverless, zero-configuration, transactional SQL database engine. SQLite is the most widely deployed SQL database engine in the world. The source code for SQLite is in the public domain."
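As a small illustrative sketch (the database file and table names are invented), usage looks like this:

```csharp
// System.Data.SQLite: the database file is created on first open,
// with no server and no installer required.
using System;
using System.Data.SQLite;

class SqliteDemo
{
    static void Main()
    {
        using (var conn = new SQLiteConnection("Data Source=app.db"))
        {
            conn.Open();

            new SQLiteCommand(
                "CREATE TABLE IF NOT EXISTS Notes (Id INTEGER PRIMARY KEY, Text TEXT)",
                conn).ExecuteNonQuery();

            var insert = new SQLiteCommand("INSERT INTO Notes (Text) VALUES (@t)", conn);
            insert.Parameters.AddWithValue("@t", "hello");
            insert.ExecuteNonQuery();

            using (var reader = new SQLiteCommand("SELECT Id, Text FROM Notes", conn).ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine("{0}: {1}", reader.GetInt64(0), reader.GetString(1));
            }
        }
    }
}
```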
You use a database that doesn't require an install. There are a few out there - there's Microsoft SQL Server Compact, which frankly is badly named, as it doesn't support the more useful SQL features like stored procedures, views and so on. There's also VistaDB, which does support stored procs, but it requires a purchase if you want the Visual Studio plugins.
The answer is Embedded Databases. You've got quite a large list of Embedded databases that you can use:
Commercial:
VistaDB - This database is written completely in managed C#.
Open Source:
Firebird - .NET Driver
SQLite - .NET Driver
You could write your data to XML files, or you could take a look at the Sql Server Compact Edition.
You could also work with objects and serialize/deserialize these to disk as binaries.
Of course the type of storage you choose depends a lot on the kind of data you're storing (and the volume of it).
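As an illustration of the serialize-to-disk option, here is a minimal sketch using BinaryFormatter; the Session type is invented for the example:

```csharp
// Persist a [Serializable] object graph to disk and read it back.
using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
class Session
{
    public List<string> Actions = new List<string>();
}

static class Persistence
{
    public static void Save(Session session, string path)
    {
        using (FileStream stream = File.Create(path))
        {
            new BinaryFormatter().Serialize(stream, session);
        }
    }

    public static Session Load(string path)
    {
        using (FileStream stream = File.OpenRead(path))
        {
            return (Session)new BinaryFormatter().Deserialize(stream);
        }
    }
}
```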
Use SQL Server CE
An easy way to do it from .NET 3.5 onwards is to store your data in XML files and use LINQ to XML. This lets you run SQL-like queries over your data that are compiled into your application, so you get full IDE IntelliSense support and error checking.
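A small sketch of what that looks like (the Actions.xml layout is invented for the example):

```csharp
// Query an XML file with LINQ to XML; the query is checked at compile time.
using System;
using System.Linq;
using System.Xml.Linq;

class LinqToXmlDemo
{
    static void Main()
    {
        XDocument doc = XDocument.Load("Actions.xml");

        // Roughly: SELECT name FROM Action WHERE time > today
        var recent = from action in doc.Root.Elements("Action")
                     where (DateTime)action.Attribute("time") > DateTime.Today
                     select (string)action.Attribute("name");

        foreach (string name in recent)
            Console.WriteLine(name);
    }
}
```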
Perhaps you could serialise a DataSet and save it as XML. I'm a little confused as to why, if you're just playing around, you would need to install MySQL on every client's computer. You could perhaps look at using SQL Server Express, which is free?
Serialise Dataset:
http://blogs.msdn.com/yosit/archive/2003/07/10/9921.aspx
http://msdn.microsoft.com/en-us/magazine/cc163911.aspx
The easiest way will be SQL Server Compact, because it integrates directly into the Visual Studio IDE (I'm just hazarding a guess here that you use VS). Add a "Local Database" item, create your tables, and be sure to generate your table adapters with Select, Update, Insert and Delete methods. If during database creation you called your dataset "DS", you will be able to instantiate a table adapter object from the
DSTableAdapters
namespace, and use the GetData() or Fill() methods to retrieve your data, and Insert(), Update() and Delete() to manage it.
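As a hedged sketch of that pattern: every type below is designer-generated and depends on your actual dataset and table columns (here a hypothetical "Customers" table in a dataset named "DS"), so treat it purely as an illustration.

```csharp
// Using the designer-generated typed dataset and table adapter.
using DSTableAdapters;

class CompactEditionDemo
{
    static void Run()
    {
        var adapter = new CustomersTableAdapter();

        // Pull the current rows from the local .sdf file.
        DS.CustomersDataTable customers = adapter.GetData();

        // DBDirect methods issue the SQL for you; the parameter list
        // mirrors whatever columns your table actually has.
        adapter.Insert("ACME", "contact@acme.example");
    }
}
```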
VelocityDB works in a serverless mode but can also be combined with a server when there is a need for it. It outperforms all the other choices mentioned here by roughly an order of magnitude; see the comparison here. It allows you to use almost any .NET data structure persistently. The entire database engine and the optional server are implemented in C#.
I need to transfer around 2m records from 4 tables in SQL Server 2008 Express to MySQL.
In C# I can insert these records very quickly within a transaction using a table-valued parameter (around 50 seconds).
How can I do something similar for MySQL in C#?
Read the explanations in the MySQL Reference Manual. The best you can do is use LOAD DATA INFILE, disabling the indices beforehand and recreating them (and thus batch-calculating them) afterwards. There is more interesting advice there if that doesn't work out for you.
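If you want to stay in C#, the MySQL Connector/Net exposes LOAD DATA INFILE through the MySqlBulkLoader class. A rough sketch, assuming you first write the rows out to a CSV file (the table, file, and connection details are invented):

```csharp
// Bulk-load a CSV into MySQL via LOAD DATA INFILE from C#.
using MySql.Data.MySqlClient;

static class MySqlBulkInsert
{
    public static void Load(string connectionString)
    {
        using (var conn = new MySqlConnection(connectionString))
        {
            conn.Open();

            // Rebuilding indices once afterwards is cheaper than updating
            // them per row (DISABLE/ENABLE KEYS applies to MyISAM tables).
            new MySqlCommand("ALTER TABLE records DISABLE KEYS", conn).ExecuteNonQuery();

            var loader = new MySqlBulkLoader(conn)
            {
                TableName = "records",
                FileName = @"C:\temp\records.csv",
                FieldTerminator = ",",
                LineTerminator = "\n"
            };
            loader.Load();

            new MySqlCommand("ALTER TABLE records ENABLE KEYS", conn).ExecuteNonQuery();
        }
    }
}
```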
Are you locked into using C# for the migration? If you're not and only want to transfer the data you can use the MySQL Migration Toolkit to do the transfer for you.
I need to copy several tables from one DB to another in SQL Server 2000, using C# (VS 2005). The call needs to be parameterized - I need to be able to pass in the name of the database to which I am going to be copying these tables.
I could use DTS with parameters, but I can't find any sample code that does this from C#.
Alternatively, I could just use
drop table TableName
select * into TableName from SourceDB..TableName
and then reconstruct the indexes etc - but that is really kludgy.
Any other ideas?
Thanks!
For SQL Server 7.0 and 2000, we have SQL-DMO for this. For SQL Server 2005 there is SMO. These allow you to do pretty much everything related to administering the database, scripting objects, enumerating databases, and much more. This is better, IMO, than trying a "roll your own" approach.
SQL 2000:
Developing SQL-DMO Applications
Transfer Object
SQL 2005:
Here is the SMO main page:
Microsoft SQL Server Management Objects (SMO)
Here is the Transfer functionality:
Transferring Data
How to: Transfer Schema and Data from One Database to Another in Visual Basic .NET
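As a hedged sketch of the SMO route (SQL Server 2005+), the Transfer class can copy schema and data with the destination database name passed in as a parameter; the server and database names here are examples:

```csharp
// Copy all tables (schema + data) from one database to another with SMO.
using Microsoft.SqlServer.Management.Smo;

static class CopyTables
{
    public static void Run(string destinationDatabase)
    {
        var server = new Server("localhost");
        Database source = server.Databases["SourceDb"];

        var transfer = new Transfer(source)
        {
            CopyAllTables = true,
            CopySchema = true,
            CopyData = true,
            DestinationServer = "localhost",
            DestinationDatabase = destinationDatabase // the parameterized part
        };
        transfer.TransferData();
    }
}
```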
If the destination table is being dropped every time, then why not do SELECT INTO? It doesn't seem like a kludge at all.
If it works just fine and ticks all the requirement boxes, why create a day's worth of work writing code to do exactly the same thing?
Let SQL do all the heavy lifting for you.
You could put the (copy database) scripts found here
http://www.codeproject.com/KB/database/CreateDatabaseScript.aspx
into an application; just replace the destination. To actually move the entire database, follow
http://support.microsoft.com/kb/314546
But remember, the database has to be taken offline first.
Thanks