I have been asked to generate a backup script, which consists of a lot of INSERT statements.
However, I'm not sure how to go about this with LINQ to SQL. I have looked at the DataContext mapping, which is able to list all the tables, but I then need to go through each row and grab the data, which will be written into the backup script.
Any advice would be appreciated.
Thank you
Is it really a requirement to use LINQ to SQL? If not, I guess the following stored procedure will do.
This procedure generates INSERT statements using existing data from the given tables and views. Later, you can use these INSERT statements to regenerate the data. It's very useful when you have to ship or package a database application, and it also comes in handy when you have to send sample data to your vendor or technical support provider for troubleshooting purposes.
http://vyaskn.tripod.com/code.htm#inserts
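If it really does have to come out of C# rather than a stored procedure, a plain SqlDataReader loop is enough to script the rows. A minimal sketch, assuming one table at a time and a much-simplified literal format (the table name, connection string and quoting rules here are placeholders):

using System;
using System.Data.SqlClient;
using System.IO;
using System.Text;

class InsertScriptGenerator
{
    static void Main()
    {
        const string connectionString = "Server=.;Database=MyDb;Trusted_Connection=True;";
        const string tableName = "dbo.Customers"; // placeholder

        using (var connection = new SqlConnection(connectionString))
        using (var writer = new StreamWriter("backup.sql"))
        {
            connection.Open();
            var command = new SqlCommand("SELECT * FROM " + tableName, connection);
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    var values = new StringBuilder();
                    for (int i = 0; i < reader.FieldCount; i++)
                    {
                        if (i > 0) values.Append(", ");
                        values.Append(Literal(reader.GetValue(i)));
                    }
                    writer.WriteLine("INSERT INTO {0} VALUES ({1});", tableName, values);
                }
            }
        }
    }

    // Renders a CLR value as a T-SQL literal. Deliberately simplified:
    // no binary types, no explicit date formatting, etc.
    static string Literal(object value)
    {
        if (value == DBNull.Value) return "NULL";
        if (value is string || value is DateTime)
            return "'" + value.ToString().Replace("'", "''") + "'";
        return Convert.ToString(value, System.Globalization.CultureInfo.InvariantCulture);
    }
}

Identity columns would additionally need SET IDENTITY_INSERT handling, and an explicit column list is safer than the bare VALUES form used here.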
Before I posted this question, I did some Googling on how to create a database through C#; most results point to either SMO or SQL script files, and they date from the SQL Server 2005 and 2008 era.
So in this day and age, is there an easier way to create a database with empty tables, tables with default data in them, stored procedures and views?
I need a suggestion.
I think the answer is probably Entity Framework. You can do 'code first' and use database migrations, allowing you to write your C# code and use that to generate a lot of the database for you.
Ultimately though, 'easier' is subjective. I personally find EF great for the 'normal' stuff, but at the end of the day, if you need a stored procedure to do some custom logic, you need to write the custom logic in some fashion.
Maybe have a look and see if you think it fits your needs.
https://www.asp.net/mvc/overview/getting-started/getting-started-with-ef-using-mvc/creating-an-entity-framework-data-model-for-an-asp-net-mvc-application
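For a flavour of what code first looks like, here is a minimal sketch (the class and property names are invented for illustration):

using System.Data.Entity; // EF6

// EF creates the database and a Customers table from this model on first use.
public class Customer
{
    public int Id { get; set; }   // becomes the primary key by convention
    public string Name { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
}

With migrations enabled (Enable-Migrations, Add-Migration, Update-Database in the Package Manager Console), schema changes are scripted for you; views and stored procedures still have to be written by hand, as noted above.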
Look at the database projects in Visual Studio 2013. You create a database as a series of scripts using a familiar GUI. Changes are then published: this process creates a unique change script targeting the connection you define. For new databases the whole thing gets created, but publish against a partial or outdated version and the script created is a change script that brings it up to date.
You can even write unit tests against your database using specialist tools, although I do find them lacking a bit.
There is more on MSDN.
It depends. Right out of the gate, for stored procedures and views, your best shot is to create them directly in the database through a workbench. You can then capture the definitions and store them in a file to be replayed through C# (see the sketch below).
As for tables, there are many ORMs that can generate tables via C#. Look at Entity Framework's code-first examples.
I have generated tables using EF and it works fine. I then went into the database and created the views and stored procedures.
The trick is to migrate new views and stored procedures into your EF model; you can Google "Entity Framework code first adding views and SPs".
Worst case, you create the database entirely through a database workbench, create a script that can be replayed to recreate everything, and then use the EF database-first approach.
In either case you end up with a good set of autogenerated code to manage CRUD and object management, plus an abstracted data model.
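If you go the capture-and-replay route, note that ADO.NET does not understand the GO batch separator that workbench-generated scripts usually contain, so split the script into batches yourself. A rough sketch (the file name and connection string are placeholders):

using System.Data.SqlClient;
using System.IO;
using System.Text.RegularExpressions;

class ScriptRunner
{
    static void Main()
    {
        var script = File.ReadAllText("create_everything.sql");
        // GO is understood by SSMS/sqlcmd, not by SQL Server itself,
        // so each batch has to be executed separately.
        var batches = Regex.Split(script, @"^\s*GO\s*$",
                                  RegexOptions.Multiline | RegexOptions.IgnoreCase);

        using (var connection = new SqlConnection("Server=.;Database=MyDb;Trusted_Connection=True;"))
        {
            connection.Open();
            foreach (var batch in batches)
            {
                if (string.IsNullOrWhiteSpace(batch)) continue;
                new SqlCommand(batch, connection).ExecuteNonQuery();
            }
        }
    }
}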
I have two databases. One of them belongs to a CRM software and is the source.
The other one will be the destination used by a tool I'm developing.
The destination will contain a table ADDRESSES with a subset of the columns of a table of the same name in the source database.
What is the best (most efficient) way to copy the data between those databases? (BTW: they're on different SQL Server instances, if that's important.)
I could write a loop which does INSERT into the destination for each row obtained from the source but I don't think that this is efficient.
My thoughts and information:
The data won't be altered on its way from source to destination
It will be altered on its way back
I don't have the complete structure of the source, but I know which fields I need and that they're guaranteed to be in the source (hence, accessing the rows obtained from the source by column index isn't possible).
I can't use LINQ.
Anything leading me in the right direction here is appreciated.
Edit:
I really need a C# way to copy the data. I also need to know how to merge the copied rows back to the source. Is it really necessary (or even best practice) to do this row after row?
Why write code to do this?
The single fastest and easiest way is just to use SQL Server's bcp.exe utility (bcp: Bulk Copy Program).
Export the data from the source server.
Zip it or tar it if it needs it.
FTP it over to where it needs to go, if you need to move it to another box.
Import it into the destination server.
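For example, a round trip might look like this (server, database and table names are placeholders; -T means Windows authentication, -n keeps SQL Server's native format):

bcp Crm.dbo.Addresses out addresses.dat -S SourceServer -T -n
bcp Tool.dbo.Addresses in addresses.dat -S DestServer -T -n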
You can accomplish the same thing via SQL Server Management Studio in a number of different ways. Once you've defined the task, it can be saved and it can be scheduled.
You can use SQL Server's PowerShell objects to do this as well.
If you're set on doing it in C#:
write your select query to get the data you want from the source server.
execute that and populate a temp file with the output.
execute SQL Server's bulk insert statement against the destination server to insert the data.
Note: For any of these techniques, you'll need to deal with identity columns if the target table has them. You'll also need to deal with key collisions. It is sometimes easier to bulk load the data into a perma-temp table first, and then apply the prerequisite transforms and manipulations to get it to where it needs to go.
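A bare-bones sketch of those three steps (connection strings, names, path and delimiter are all assumptions; note that BULK INSERT opens the file from the destination server's point of view, so the path must be reachable from that machine):

using System.Data.SqlClient;
using System.IO;

class BulkCopyViaFile
{
    static void Main()
    {
        // 1. Export the rows we want from the source server.
        using (var source = new SqlConnection("Server=SourceServer;Database=Crm;Trusted_Connection=True;"))
        using (var writer = new StreamWriter(@"\\shared\temp\addresses.txt"))
        {
            source.Open();
            var cmd = new SqlCommand("SELECT Id, Street, City FROM dbo.Addresses", source);
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())   // breaks if text fields contain tabs/newlines
                    writer.WriteLine("{0}\t{1}\t{2}", reader[0], reader[1], reader[2]);
        }

        // 2. Bulk insert the file on the destination server.
        using (var dest = new SqlConnection("Server=DestServer;Database=Tool;Trusted_Connection=True;"))
        {
            dest.Open();
            new SqlCommand(
                @"BULK INSERT dbo.Addresses
                  FROM '\\shared\temp\addresses.txt'
                  WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')", dest).ExecuteNonQuery();
        }
    }
}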
According to your comment on Jwrit's answer, you want a two-way sync.
If so, you might want to look into the Microsoft Sync Framework.
We use it to sync 200+ tables from on-premise SQL Server to SQL Azure, and from SQL Azure to SQL Azure.
It can be used purely from C#. However, it might offer a lot more than you want, or it might be overkill for a small project.
I'm just saying so that you have another option for your project.
If these databases exist on two servers, you can set up a link between the servers by executing sp_addlinkedserver; there are instructions for setting this up here. This may come in handy if you plan on regularly sharing data.
http://msdn.microsoft.com/en-us/library/ff772782.aspx
Once the servers are linked, a simple INSERT ... SELECT statement can copy the rows from one table to the other; note the four-part name (server.database.schema.table) on the linked side, where LinkedSrv stands for whatever name you gave the linked server:
INSERT INTO db1.dbo.tblA (Field1, Field2, Field3)
SELECT Field1, Field2, Field3 FROM LinkedSrv.db2.dbo.tblB
If the databases are on the same instance, the same SQL works with three-part names (database.schema.table) and no linked server is needed.
If this is a one-time job, the best bet is normally SSIS (SQL Server Integration Services) unless there are complex data transformations; you can quickly and easily do the column mappings and have it done (reliably) in 15 minutes flat.
We have a WinForms app; all the SQL is inline and, unfortunately, no stored procedures are used.
What is the best way to keep a trail of which actions (INSERT, UPDATE or DELETE) are performed against a SQL table, i.e. to monitor and capture a record of the activity going on in a single table?
I know SQL Server Profiler is available, but ideally I don't want to keep the profiler running all the time; what I need is something running behind the scenes, capturing all the activity for one table.
I thought of using triggers, but there are some disadvantages to using them, so I was wondering if there are other options?
Thanks
If you're using SQL Server 2008 (and up), Change Data Capture sounds like a good place to start:
http://msdn.microsoft.com/en-us/library/bb522489(v=sql.105).aspx
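Roughly, enabling it for a single table can be driven from C# with plain commands like this (database and table names are placeholders; CDC needs SQL Server Agent running, and before SQL Server 2016 SP1 it was Enterprise/Developer-only):

using System.Data.SqlClient;

class EnableCdc
{
    static void Main()
    {
        using (var connection = new SqlConnection("Server=.;Database=MyDb;Trusted_Connection=True;"))
        {
            connection.Open();
            // Enable CDC for the database, then for the one table of interest.
            new SqlCommand("EXEC sys.sp_cdc_enable_db", connection).ExecuteNonQuery();
            new SqlCommand(
                @"EXEC sys.sp_cdc_enable_table
                      @source_schema = N'dbo',
                      @source_name   = N'Orders',
                      @role_name     = NULL", connection).ExecuteNonQuery();
            // Changes then accumulate in cdc.dbo_Orders_CT and can be read via
            // the cdc.fn_cdc_get_all_changes_dbo_Orders function.
        }
    }
}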
If you are using ORACLE, you can audit a table using this statement (if fireid
is the name of the user doing the updates):
AUDIT SELECT TABLE, UPDATE TABLE, INSERT TABLE, DELETE TABLE BY fireid BY ACCESS;
The results will be stored in the SYS.AUD$ table
(If you're using another database just search the documentation for auditing DML statements.)
SQL Server: there is a feature called SQL Audit. Is this what you're looking for?
http://www.bradmcgehee.com/2010/03/an-introduction-to-sql-server-2008-audit/
There are several ways. If you just want to do it in code (depending on your program), you can add the commands to a list and write them to a file.
If you are using Entity Framework, then I would log the data changes; using stored procs is really best when using EF.
A sample of your code or project would help. Transaction logs already capture that information.
And obviously you're using SQL Server; those Oracle people are just so jealous.
Also look at using data source views; they're a way to tell what is going on with the server.
I think SQL Server's change tracking feature is the best fit for your requirements.
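If you go that route, it is lighter-weight than Change Data Capture; a sketch of the setup (database and table names are placeholders):

using System.Data.SqlClient;

class EnableChangeTracking
{
    static void Main()
    {
        using (var connection = new SqlConnection("Server=.;Database=MyDb;Trusted_Connection=True;"))
        {
            connection.Open();
            new SqlCommand(
                @"ALTER DATABASE MyDb
                  SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON)",
                connection).ExecuteNonQuery();
            new SqlCommand("ALTER TABLE dbo.Orders ENABLE CHANGE_TRACKING",
                connection).ExecuteNonQuery();
            // Later, list what changed since a version number you saved earlier:
            // SELECT CT.Id, CT.SYS_CHANGE_OPERATION
            // FROM CHANGETABLE(CHANGES dbo.Orders, @last_sync_version) AS CT
        }
    }
}

Note that change tracking tells you which rows changed and how, but unlike CDC it does not keep the old column values.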
What is the fastest method for copying data between databases in C#?
I want to create a C# database procedure that accepts a query string as a parameter and inserts the data into another database (the SQL query and the destination table will have the same columns).
I use SqlBulkCopy right now, but it causes some problems.
I know you've asked about C# in your post, but you might want to consider a couple of other methods, using the stored procedure you are about to create. All of the below would be in the stored procedure itself.
Have you considered a T-SQL approach using OPENROWSET()? That way, you never have to pull the data back over to C# and send it back over to SQL Server. You will find some recommendations from Microsoft here,
and here is the specific link using OPENROWSET()
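A sketch of what that looks like when driven from C# (all server, database and column names are placeholders; the linked-server-less OPENROWSET form below also requires 'Ad Hoc Distributed Queries' to be enabled on the destination server):

using System.Data.SqlClient;

class CopyViaOpenRowset
{
    static void Main()
    {
        // Runs on the destination server, which pulls the rows itself, so the
        // data never round-trips through the C# process.
        using (var connection = new SqlConnection("Server=DestServer;Database=Tool;Trusted_Connection=True;"))
        {
            connection.Open();
            new SqlCommand(
                @"INSERT INTO dbo.Addresses (Id, Street, City)
                  SELECT Id, Street, City
                  FROM OPENROWSET('SQLNCLI',
                       'Server=SourceServer;Trusted_Connection=yes;',
                       'SELECT Id, Street, City FROM Crm.dbo.Addresses')",
                connection).ExecuteNonQuery();
        }
    }
}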
If you still want to use C#, you can also use SSIS, which has the ability to copy data in bulk mode between servers; you can call it from C# using the appropriate package you'd have to create (a visual workflow). Please see this thread for details.
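Since you say SqlBulkCopy causes problems: the usually trouble-free pattern is to stream a reader straight into WriteToServer with an explicit timeout and batch size. A sketch with placeholder names:

using System.Data.SqlClient;

class StreamingBulkCopy
{
    static void Main()
    {
        using (var source = new SqlConnection("Server=SourceServer;Database=Crm;Trusted_Connection=True;"))
        using (var dest = new SqlConnection("Server=DestServer;Database=Tool;Trusted_Connection=True;"))
        {
            source.Open();
            dest.Open();
            var cmd = new SqlCommand("SELECT Id, Street, City FROM dbo.Addresses", source);
            using (var reader = cmd.ExecuteReader())
            using (var bulk = new SqlBulkCopy(dest))
            {
                bulk.DestinationTableName = "dbo.Addresses";
                bulk.BulkCopyTimeout = 0;   // no timeout; big copies often die on the default 30s
                bulk.BatchSize = 10000;     // commit in chunks rather than one huge transaction
                bulk.WriteToServer(reader); // streams rows without buffering them all in memory
            }
        }
    }
}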
Is it possible, using NHibernate, or Entity Framework, or whatever, to generate the SQL necessary to INSERT or UPDATE an object I've mapped to a table? I'm looking at generating SQL scripts from these POCO classes instead of running directly against a database.
The idea here is I'm taking some 2000+ line SQL scripts and I've made a Python-based DSL that does a lot of the work for us. I've got a C# application now that loads IronPython and 'compiles' the Python script, generating all of the necessary objects. I've got a prototype script here that's roughly 100 lines. Now, I need to actually generate the SQL script.
I could use something like nVelocity, but if this project is successful, I want to keep the long-term door open for running against a database. I've got roughly 30 tables with a few dozen columns per table to script out.
I found this but it appears to be generating the database table schema, rather than writing inserts and updates: https://forum.hibernate.org/viewtopic.php?f=25&t=1000334
The documentation on SchemaExport doesn't give a lot of information.
Suggestions / ideas?
I think it is not possible. It can generate the schema, but the SQL for the CRUD is generated on the fly.
You can use tools that generate T-SQL for doing the CRUD against a table.
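If you end up scripting it yourself, a naive reflection-based generator for flat POCOs is only a few lines; everything below (table and column naming, literal quoting) is a deliberate simplification you would have to harden:

using System;
using System.Linq;

class Customer // invented sample POCO
{
    public int Id { get; set; }
    public string Name { get; set; }
}

static class InsertScripter
{
    // Table name = class name, columns = property names; both are
    // simplifications you would replace with your real mapping.
    public static string ToInsert(object entity)
    {
        var type = entity.GetType();
        var props = type.GetProperties();
        var columns = string.Join(", ", props.Select(p => p.Name));
        var values = string.Join(", ",
            props.Select(p => Literal(p.GetValue(entity, null))));
        return string.Format("INSERT INTO {0} ({1}) VALUES ({2});",
                             type.Name, columns, values);
    }

    static string Literal(object value)
    {
        if (value == null) return "NULL";
        if (value is string || value is DateTime)
            return "'" + value.ToString().Replace("'", "''") + "'";
        return Convert.ToString(value, System.Globalization.CultureInfo.InvariantCulture);
    }

    static void Main()
    {
        Console.WriteLine(ToInsert(new Customer { Id = 1, Name = "O'Neil" }));
        // -> INSERT INTO Customer (Id, Name) VALUES (1, 'O''Neil');
    }
}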