I'm running the same commands in ADO.NET C# and SQL Server Management Studio. The SQL that runs via C# performs significantly worse: memory usage is worse (it uses up all available memory) and thus causes the database execution time to increase. Management Studio isn't perfect (it too causes SQL Server to use up memory), but it's not as bad as via ADO.NET.
I am running: Windows 7, SQL Server 2008 R2 (10.50.1600), C# on .NET 3.5, and SQL Server Management Studio 2008 R2. All programs and databases are on my local dev machine.
The SQL I am running is 40 CREATE VIEW and 40 CREATE UNIQUE INDEX statements against 2 databases. I need to do this on the fly, as we are running a database compare between the 2 databases (for reasons that aren't relevant, we need to compare views and not tables), and since performance is an issue we cannot leave the views and indexes around all the time.
The SQL looks like this:
create view [dbo].[view_datacompare_2011106] with schemabinding as (
    SELECT t.[ID], t.[Column1], t.[Column2], t.[Column3] FROM dbo.Table t WHERE t.[ID] in ('1','2','3','4') )
go
create unique clustered index [index_datacompare_2011106] on [dbo].[view_datacompare_2011106] (ID)
go
...
The only difference is that the C# code does not call GO. Each CREATE command is wrapped in a using statement and executed via ExecuteNonQuery(), e.g.
using (SqlCommand cmd = new SqlCommand(sql, this.connectionActualDb))
{
    // Each DDL statement is sent as its own batch with a configured timeout.
    cmd.CommandTimeout = Int32.Parse(SqlResources.TimeoutSeconds);
    cmd.ExecuteNonQuery();
}
P.S. SET ARITHABORT must be ON when you are creating or changing indexes on computed columns or indexed views.
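For reference, a minimal sketch of setting that option from ADO.NET before the DDL runs (SET options are session-scoped, so it has to happen on the same connection; this reuses the connectionActualDb field from the snippet above):

// Run once per connection, before the CREATE VIEW / CREATE INDEX batches.
using (SqlCommand cmd = new SqlCommand("SET ARITHABORT ON;", this.connectionActualDb))
{
    cmd.ExecuteNonQuery();
}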
Use the Waits and Queues methodology to investigate the performance bottleneck. You'll find the root cause, and then we can advise accordingly. Most likely your C# application runs into concurrency problems due to locks, very likely held by the application itself. Typically one blames plan changes due to parameter sniffing, as in Slow in the Application, Fast in SSMS, but with DDL statements this is unlikely.
Why don't you put all the commands into a single script separated by GO and send it to the database in batches? It's called SQL batching. Note that GO is a client-side batch separator understood by SSMS, not by the server, so the C# code has to split the script on GO and execute each batch separately.
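A minimal sketch of that idea, assuming script holds the full SQL text and connection is an open SqlConnection (requires System.Text.RegularExpressions): since the server does not understand GO, the client splits on it and sends each batch separately.

// Split the script on client-side GO separators and run each batch.
string[] batches = Regex.Split(script, @"^\s*GO\s*$",
    RegexOptions.Multiline | RegexOptions.IgnoreCase);

foreach (string batch in batches)
{
    if (batch.Trim().Length == 0)
        continue; // skip empty fragments between consecutive separators

    using (SqlCommand cmd = new SqlCommand(batch, connection))
    {
        cmd.ExecuteNonQuery();
    }
}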
Related
I was wondering which is the best way to replicate some data from one database to another.
I have a database on one computer, and it receives some transactions. I need to send this data to another server (on the same local network) but with a modified value (I need to add 11 years to a Timestamp value).
So I was looking at some options for my case. I could develop a Windows service to do this, but I don't know if SQL Server replication can do this for me, or if there is another option, like some kind of magical trigger that can do that.
I'm using SQL Server 2005 on Windows Server 2003 R2.
This link should help you:
Selecting the Appropriate Type of Replication
Quoted summary from link:
Microsoft SQL Server offers three types of replication. Each type of replication is suited to different application requirements. Depending on the needs of your application, you can use one or more types of replication in a topology:
Snapshot replication
Transactional replication
Merge replication
I personally would replicate the database (transactional) and then use log shipping to update the replicated database (on your second server) with the latest data changes (from the primary server), then use a stored procedure running as a SQL Agent job to update the fields you need.
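If you instead go the Windows-service route mentioned in the question, the update step could look roughly like this (dbo.Transactions, OccurredAt, the Shifted flag, and the connection string are invented placeholders, and this assumes the "Timestamp" value is a datetime column; a rowversion column cannot be updated this way):

// Shift the copied rows by 11 years on the second server.
string sql = @"UPDATE dbo.Transactions
               SET OccurredAt = DATEADD(year, 11, OccurredAt)
               WHERE Shifted = 0;"; // hypothetical flag so rows are shifted only once

using (SqlConnection connection = new SqlConnection(secondServerConnectionString))
using (SqlCommand command = new SqlCommand(sql, connection))
{
    connection.Open();
    command.ExecuteNonQuery();
}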
I personally am not a fan of triggers, as you can end up having triggers activating other triggers, and something that takes milliseconds to run can take seconds; if you have large volumes of data, that can be painful (I manage a system that has exactly this issue, soon to be replaced, thankfully).
Hope this helps, and if you have some follow-up questions I'll be happy to help.
I am working on a project in which I have used Visual C# as the front end and SQL Server 2008 R2 Express for the back end.
Now, I know that a SQL Server Express database has a size limit of 10 GB, so I have written code to back up the database when the limit is reached, and I empty the database once the backup is successful.
I want to know the best approach to restoring the backup file so that my current application's back end (which I emptied earlier) is not disturbed.
Is it okay if I restore it into my current database? In that case, does it affect my application's working? My application is near real time and stores some values in the database every 15 minutes.
Or do I need to write some other utility for viewing the old data?
Every day around 50 MB of data is inserted into the database, so it will take around 8 months to reach the size limit (as per my rough calculations). As far as the nature of the application is concerned, users will not use the archived data frequently. Please consider this and suggest an approach.
Thanks in advance!
Hope I got your question right; consider the following suggestion for a working setup:
One database ("Current DB") stores the real-time data.
When it reaches the size limit, it is dumped (or its MDF+LDF files are copied) to an archive and stored with timestamps (FROM-TO).
When archived data is needed, the relevant MDF is attached as a new "offline" database (you can use a connection string to attach an MDF file to a SQL Server instance) and that connection is used instead of the live one.
The application can run smoothly against the online database, while reading and loading of old data is done from the temporarily attached (and later detached) database files.
Take a look at Connection String to Connect to .MDF for how to attach an MDF to a SQL Server instance.
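A rough sketch of such a connection string on SQL Server Express (the archive path and file name are illustrative):

// Attach an archived MDF as a separate database via the connection string,
// then run the read-only history queries against it.
string archiveConnectionString =
    @"Data Source=.\SQLEXPRESS;" +
    @"AttachDbFilename=C:\Archive\AppData_2012_01.mdf;" +
    @"Integrated Security=True;";

using (SqlConnection connection = new SqlConnection(archiveConnectionString))
{
    connection.Open();
    // Query the archived data here.
}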
If you put the data in a whole new database server, your old queries won't work on the new one, as the SQL Express limit is not per database, but per database server.
You could create a new SQL Express server, link your servers, and create a query against the linked server (How to create a linked server # msdn).
You will need to adjust your queries.
If you query your data now like this:
SELECT em.Name, em.Telefone FROM employees AS em
You need to reference the linked server and database too:
SELECT em.Name, em.Telefone FROM [server1].[db1].dbo.employees AS em
for your current database, and
SELECT em.Name, em.Telefone FROM [server2].[backup].dbo.employees AS em
for the backup database.
It is possible like this, but I would not advise it. If you have already exceeded 10 GB of data, then you probably have large tables. Each table queried on a linked server is copied completely to your server, which can cause serious network traffic and take quite some time to execute.
I would think about getting the SQL Standard edition instead.
From my experience using SQLite for my small applications, I always use sqliteadmin and its database cleanup function to remove unnecessary data from my database.
Now I want to create a method in my application that does the same thing as sqliteadmin's cleanup.
How do I do this?
Thanks and regards
using (SQLiteCommand command = m_connection.CreateCommand())
{
    // VACUUM rebuilds the database file and reclaims unused space.
    command.CommandText = "vacuum;";
    command.ExecuteNonQuery();
}
Here is the exact answer on how to execute VACUUM.
It seems you're looking for the VACUUM statement.
Interesting postscript: the VACUUM statement in SQLite copies the entire database to a temp file for rebuilding. If you plan on doing this "on demand" via a user action or some process, it can take a considerable amount of disk space and time to complete once your database gets above 100 MB, especially if you are looking at several GB.
In that case, you are better off enabling the auto_vacuum pragma when you create the database, and just deleting records instead of running VACUUM. So far, this is the only advantage I can find that SQL Server Compact has over SQLite: on-demand SHRINK of a SQL Server Compact database is extremely fast compared to SQLite's VACUUM.
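If you take the auto_vacuum route, a minimal sketch from C# (the pragma only has an effect while the database file is still empty, before any tables exist; file name is illustrative):

// Enable auto_vacuum on a brand-new database file, then create tables.
using (SQLiteConnection connection = new SQLiteConnection("Data Source=app.db"))
{
    connection.Open();
    using (SQLiteCommand command = connection.CreateCommand())
    {
        command.CommandText = "PRAGMA auto_vacuum = FULL;";
        command.ExecuteNonQuery();
    }
    // CREATE TABLE statements go here, after the pragma has been applied.
}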
I have been tasked with taking an existing WinForms application and modifying it to work in an "occasionally-connected" mode. This is to be achieved with SQL Server CE 3.5 on a user's laptop, syncing the server and client either via SQL Server Merge Replication or Microsoft's Sync Framework.
Currently, the application connects to our SQL Server and retrieves, inserts, updates data using stored procedures. I have read that SQL Server CE does not support stored procedures.
Does this mean that all my stored procedures will need to be converted to straight SQL statements, either in my code or as queries inside a TableAdapter?
If this is true, what are my alternatives?
Since SQL Server CE is considered to be an "application data store", it is assumed that any complex logic that you might normally implement in a SQL Server Stored Procedure will be implemented in the application itself. Many traditional database concepts are not supported in SQL CE, such as constraints, covering indexes, stored procs, UDFs... you name it, SQLCE doesn't have it!
Because SQL CE is single-user, this assumption more-or-less makes sense; you don't really need to worry about concurrency or atomicity issues when you have total control over everything that's happening at the DB level. It helps to not really think of SQL CE as a full-fledged database; it's more of an alternative to something like SQLite or MS Access.
Your only options are:
Rewrite your application to behave differently (i.e. use simple queries or direct table access) when operating in "disconnected" mode;
Disallow the application from performing the more complex operations unless it is "connected";
Switch to SQL Express instead, which has a much larger footprint but does support Stored Procedures and most of the other SQL Server goodness.
Yes, they are not supported, and the best way is to build them into parameterized queries in code. You can build your own kind of framework that accesses them like stored procedures, keyed by an enum, and keep them all in one clean place in code.
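A rough sketch of that enum-keyed idea (all names here are illustrative; it assumes System.Collections.Generic and System.Data.SqlServerCe):

// Keep every query in one place, keyed by an enum, and hand back
// commands that the caller parameterizes like a stored procedure call.
enum StoredQuery { GetCustomerById, InsertOrder }

static class QueryStore
{
    static readonly Dictionary<StoredQuery, string> Sql =
        new Dictionary<StoredQuery, string>
        {
            { StoredQuery.GetCustomerById,
              "SELECT * FROM Customers WHERE Id = @id" },
            { StoredQuery.InsertOrder,
              "INSERT INTO Orders (CustomerId, Total) VALUES (@customerId, @total)" }
        };

    public static SqlCeCommand Create(SqlCeConnection connection, StoredQuery query)
    {
        SqlCeCommand command = connection.CreateCommand();
        command.CommandText = Sql[query];
        return command;
    }
}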
Although if you plan on scaling SQL Compact at all (outer joins of multiple tables with thousands of rows), you will want to use SqlCeResultSet and the Seek method. It is extremely fast, and you can even open indexes directly and seek on them.
http://msdn.microsoft.com/en-us/library/system.data.sqlserverce.sqlceresultset(VS.80).aspx
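A hedged sketch of that pattern (table, index, and column names are made up; connection is an open SqlCeConnection, and CommandType comes from System.Data):

// Open the table directly through an index and seek on the key.
using (SqlCeCommand command = new SqlCeCommand("Customers", connection))
{
    command.CommandType = CommandType.TableDirect;
    command.IndexName = "IX_Customers_Id";

    using (SqlCeResultSet resultSet =
        command.ExecuteResultSet(ResultSetOptions.Scrollable))
    {
        // Seek positions the cursor; Read() then loads the row.
        if (resultSet.Seek(DbSeekOptions.FirstEqual, 42) && resultSet.Read())
        {
            string name = resultSet.GetString(resultSet.GetOrdinal("Name"));
        }
    }
}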
Another option is to use LINQ to DataSets. It can "store" stored-procedure-like methods for you. They are not stored in the database, but it gives you that illusion (though all of these methods need to be attached to a table and still need to be fairly simple).
Another alternative is VistaDB. It does support T-SQL Procs and all the same datatypes as SQL Server (more than SQL CE actually).
You may want to look at this SO post on advantages of VistaDB for more information.
To my mind, you can make a separate class for your database queries which can be accessed from the data access layer.
You can simply manage the parameter collection in the DAL and then pass it as an object to the query class method, which behaves like a SQL stored procedure and generates the query after concatenation with the desired variables.
After generating the script, it returns the query, which can then be passed to SQL CE to extract results.
I need to copy several tables from one DB to another in SQL Server 2000, using C# (VS 2005). The call needs to be parameterized - I need to be able to pass in the name of the database to which I am going to be copying these tables.
I could use DTS with parameters, but I can't find any sample code that does this from C#.
Alternatively, I could just use
drop table TableName
select * into TableName from SourceDB..TableName
and then reconstruct the indexes, etc. - but that is really kludgy.
Any other ideas?
Thanks!
For SQL Server 7.0 and 2000, we have SQL-DMO for this. For SQL Server 2005 there is SMO. These let you do pretty much everything related to administering the database: scripting objects, enumerating databases, and much more. This is better, IMO, than trying a "roll your own" approach.
SQL 2000:
Developing SQL-DMO Applications
Transfer Object
SQL 2005:
Here is the SMO main page:
Microsoft SQL Server Management Objects (SMO)
Here is the Transfer functionality:
Transferring Data
How to: Transfer Schema and Data from One Database to Another in Visual Basic .NET
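For the SMO route, a rough sketch of the Transfer object (server and database names are placeholders; the destination database name is exactly the kind of parameter you wanted to pass in):

// Copy tables (schema + data) from one database to another via SMO;
// assumes the Microsoft.SqlServer.Smo assemblies are referenced.
Server server = new Server("localhost");
Database sourceDb = server.Databases["SourceDb"];

Transfer transfer = new Transfer(sourceDb);
transfer.CopyAllTables = true;
transfer.CopySchema = true;
transfer.CopyData = true;
transfer.DestinationServer = "localhost";
transfer.DestinationDatabase = "TargetDb"; // pass this in as your parameter
transfer.DestinationLoginSecure = true;    // use integrated security

transfer.TransferData();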
If the destination table is being dropped every time, then why not do SELECT INTO? It doesn't seem like a kludge at all.
If it works just fine and ticks all the requirement boxes, why create a day's worth of work growing code to do exactly the same thing?
Let SQL do all the heavy lifting for you.
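Since database and table names cannot be passed as SqlParameters, a sketch of parameterizing the SELECT INTO approach splices the names into the batch (targetDb, sourceDb, and tableName are illustrative variables; validate them against a whitelist to avoid injection):

// Rebuild the destination table from the source database in one batch.
string sql = string.Format(
    "IF OBJECT_ID('[{0}]..[{1}]') IS NOT NULL DROP TABLE [{0}]..[{1}]; " +
    "SELECT * INTO [{0}]..[{1}] FROM [{2}]..[{1}];",
    targetDb, tableName, sourceDb);

using (SqlCommand command = new SqlCommand(sql, connection))
{
    command.ExecuteNonQuery();
}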
You could put the scripts (copy DB) found here:
http://www.codeproject.com/KB/database/CreateDatabaseScript.aspx
into an application; just replace the destination. To actually move the entire database, follow:
http://support.microsoft.com/kb/314546
But remember, the database has to be taken offline first.
Thanks