I have been tasked with taking an existing WinForms application and modifying it to work in an "occasionally-connected" mode. This is to be achieved with SQL Server CE 3.5 on the user's laptop, syncing the server and client either via SQL Server Merge Replication or Microsoft's Sync Framework.
Currently, the application connects to our SQL Server and retrieves, inserts, and updates data using stored procedures. I have read that SQL Server CE does not support stored procedures.
Does this mean that all my stored procedures will need to be converted to straight SQL statements, either in my code or as a query inside a TableAdapter?
If this is true, what are my alternatives?
Since SQL Server CE is considered to be an "application data store", the assumption is that any complex logic you might normally implement in a SQL Server stored procedure will be implemented in the application itself. Many traditional database concepts are not supported in SQL CE, such as check constraints, covering indexes, stored procs, UDFs... you name it, SQL CE doesn't have it!
Because SQL CE is single-user, this assumption more-or-less makes sense; you don't really need to worry about concurrency or atomicity issues when you have total control over everything that's happening at the DB level. It helps to not really think of SQL CE as a full-fledged database; it's more of an alternative to something like SQLite or MS Access.
Your only options are:
Rewrite your application to behave differently (i.e. use simple queries or direct table access) when operating in "disconnected" mode;
Disallow the application from performing the more complex operations unless it is "connected";
Switch to SQL Express instead, which has a much larger footprint but does support Stored Procedures and most of the other SQL Server goodness.
Yes, they are not supported, and the best alternative is to build them as parameterized queries in code. You can build your own small framework that addresses them like stored procedures by enum, keeping them all in one clean place in code (see the sketch below).
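A minimal sketch of that idea (the enum and class names here are hypothetical, not from any framework):

using System.Collections.Generic;
using System.Data.SqlServerCe;

// Hypothetical enum naming each query that used to be a stored procedure.
enum AppQuery
{
    GetCustomerById,
    UpdateCustomerName
}

static class QueryCatalog
{
    // All SQL lives in one clean place, keyed by enum, like a proc catalog.
    static readonly Dictionary<AppQuery, string> Sql =
        new Dictionary<AppQuery, string>
    {
        { AppQuery.GetCustomerById,    "SELECT Id, Name FROM Customers WHERE Id = @Id" },
        { AppQuery.UpdateCustomerName, "UPDATE Customers SET Name = @Name WHERE Id = @Id" }
    };

    public static SqlCeCommand Create(SqlCeConnection conn, AppQuery query,
                                      params SqlCeParameter[] args)
    {
        var cmd = new SqlCeCommand(Sql[query], conn);
        cmd.Parameters.AddRange(args);
        return cmd;
    }
}

Calling code then reads much like a proc call:

using (var cmd = QueryCatalog.Create(conn, AppQuery.GetCustomerById,
                                     new SqlCeParameter("@Id", 42)))
using (var reader = cmd.ExecuteReader())
{
    // consume the rows...
}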
Although if you plan on scaling SQL Compact at all (outer joins of multiple tables with thousands of rows), you will want to use SqlCeResultSets and the Seek method. It is extremely fast, and you can even open indexes directly and seek on them.
http://msdn.microsoft.com/en-us/library/system.data.sqlserverce.sqlceresultset(VS.80).aspx
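For example (a sketch only; it assumes a Customers table with an index named IX_Customers_Name):

using System.Data;
using System.Data.SqlServerCe;

using (var conn = new SqlCeConnection("Data Source=app.sdf"))
{
    conn.Open();
    SqlCeCommand cmd = conn.CreateCommand();
    cmd.CommandType = CommandType.TableDirect;  // open the table directly
    cmd.CommandText = "Customers";
    cmd.IndexName = "IX_Customers_Name";        // seek against this index

    using (SqlCeResultSet rs = cmd.ExecuteResultSet(ResultSetOptions.Scrollable))
    {
        // Jump straight to the first row whose indexed column equals "Smith".
        if (rs.Seek(DbSeekOptions.FirstEqual, "Smith") && rs.Read())
        {
            string name = rs.GetString(rs.GetOrdinal("Name"));
        }
    }
}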
Another option is to use LINQ to DataSets. It can "store" stored-procedure-like methods for you. They are not stored in the database, but it gives you that illusion (though all of these methods need to be attached to a table and still need to be fairly simple).
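For instance, with a reference to System.Data.DataSetExtensions, a "proc-like" method over a DataTable looks like this (the table and column names are made up):

using System.Collections.Generic;
using System.Data;
using System.Linq;

static class CustomerQueries
{
    // Reads like a GetCustomersByCity proc, but runs in memory over a DataTable.
    public static IEnumerable<DataRow> GetCustomersByCity(DataTable customers,
                                                          string city)
    {
        return customers.AsEnumerable()
                        .Where(r => r.Field<string>("City") == city);
    }
}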
Another alternative is VistaDB. It does support T-SQL procs and all the same data types as SQL Server (more than SQL CE, actually).
You may want to look at this SO post on advantages of VistaDB for more information.
To my mind, you can create a separate class for your database queries that is accessed from the Data Access Layer.
You can simply manage a parameter collection in the DAL and then pass it as an object to a method of this class, which behaves like a SQL stored procedure and generates the query text from the desired variables.
The generated query can then be passed to SQL CE to fetch the results.
I usually work with MySQL, but also with SQL Server, Oracle, and Access; the database structure is almost the same across them. My database stores configuration and recorded data for a SCADA application ("Supervisory Control And Data Acquisition").
Most of the tables are usually the same, but sometimes my teammates add fields or tables, or change some field types.
I'm writing an application that needs to load some config parameters from the db, then load data, process it, and store the new values back to the db. It also needs to add new records.
I have a class that, independently of the db type, given the correct connection params, gets an IDbConnection object. With some methods I can specify a SQL query and it gives me an IDataReader or a DataSet.
Now, how should I query data from the db, analyze and recalculate it, and finally store it again?
I'm a bit scared of building a detailed object mapping because of the possibility of changed fields. A simple DataSet/DataTable/DataRow should be OK, but I'd like to use LINQ to query the data extracted from the database in a simpler way.
Finally, my db has about 60 tables, but in this application I work with only a dozen of them. I have only a little time to build this application, so I need a fast way, even if it's not "very beautiful".
Thanks.
You should try an ORM that configures itself automatically according to the schema.
I have found this one. I haven't used similar things in C#, but they work nicely in other (dynamic) languages.
http://www.codeproject.com/Articles/117666/Kerosene-ORM
Using an ORM would most probably be the fastest. You could use NHibernate, which supports multiple DBs. NHibernate does have a learning curve, so something like a micro ORM could perhaps be easier to use. PetaPoco is a great micro ORM and supports SQL Server, SQL Server CE, MySQL, PostgreSQL, and Oracle.
These ORMs rely on a mapping for each DB you use (mapping files for NHibernate, POCO classes for PetaPoco) which needs to be updated or recreated when the schema changes.
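A rough sketch of the PetaPoco route (the table, columns, and connection details are invented for illustration):

using PetaPoco;  // single-file / NuGet micro ORM

// POCO mapped by convention to a "Parameter" table with an "Id" primary key.
public class Parameter
{
    public int Id { get; set; }
    public string Name { get; set; }
    public double Value { get; set; }
}

class Program
{
    static void Main()
    {
        // The provider name selects the backend: System.Data.SqlClient,
        // MySql.Data.MySqlClient, Oracle.DataAccess.Client, ...
        var db = new Database("Server=.;Database=Scada;Trusted_Connection=True",
                              "System.Data.SqlClient");

        foreach (var p in db.Fetch<Parameter>("SELECT * FROM Parameter"))
        {
            p.Value *= 2;   // recalculate
            db.Update(p);   // write the new value back
        }

        db.Insert(new Parameter { Name = "NewParam", Value = 1.0 });
    }
}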
I am migrating an existing .NET 2.0, SQL Server codebase to a .NET 4.0, SQL Server 2008 environment.
The design pattern is that all app calls to the database go through a stored procedure. So there's a get[object name] stored procedure that needs to be created or altered for most select statements.
So the disadvantage of this architecture is already evident to me: inconvenience.
What are the advantages of this highly enforced stored procedure design? (now in .NET 4.0 as opposed to using an ORM).
Actually - contrary to popular belief - performance isn't one of the advantages of stored procedures - not anymore. Properly written "inline" SQL queries with parameters are just as fast: they get "compiled" once by SQL Server (before first use) and remain in SQL Server's procedure cache just as long as any stored procedure.
But stored procedures do have advantages - two main ones I'd like to mention:
they shield the user from the underlying tables. Which also means: it's another layer in your security system. Your database users do not need access to the tables, and thus they won't be able to cause any grief on those tables either, by accessing them via Excel or Access or some other tool. This alone can be a huge benefit.
the second point is that a layer of stored procedures gives your DBA a place to optimize. You as a developer only call stored procedures, and the DBA can tweak them and fine-tune them to make them run faster. As long as the parameter list and the returned result set remain the same, you as a frontend developer won't even notice (at least not in a negative way!)
I take the approach of stored procs for INSERT/UPDATE/DELETE on objects and do SELECTs in application code (a sketch follows the lists below). Advantages:
Clear separation of business logic and data
Data security is better because it is controlled at the database layer.
Doing SELECTs in business logic is a compromise: anyone who gets the database login credentials can read table data, but they can't modify it (assuming you set up object-level permissions correctly, with tables read-only), and I don't have to write a stored proc for every variant of WHERE criteria.
It's easier to customize data operations when you write your own data adapters vs. ORMs.
ORMs are fine, but there's typically a lot of overhead in them, and I like the approach of my applications creating the least amount of work possible for the machines they run on. Plus I know exactly what is happening, and there's less "magic" happening behind the scenes.
Disadvantages:
You can end up writing a lot of code if you don't use ORMs, which means more to maintain.
It's fair to say that writing your own data adapters is reinventing the wheel; more control always comes with a cost.
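A minimal sketch of that split (the proc and table names are hypothetical):

using System.Data;
using System.Data.SqlClient;

class CustomerAdapter
{
    readonly string _connStr;
    public CustomerAdapter(string connStr) { _connStr = connStr; }

    // Writes go through a stored procedure...
    public void UpdateName(int id, string name)
    {
        using (var conn = new SqlConnection(_connStr))
        using (var cmd = new SqlCommand("dbo.UpdateCustomer", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@Id", id);
            cmd.Parameters.AddWithValue("@Name", name);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }

    // ...while reads are plain parameterized SELECTs built in code.
    public string GetName(int id)
    {
        using (var conn = new SqlConnection(_connStr))
        using (var cmd = new SqlCommand(
                   "SELECT Name FROM dbo.Customers WHERE Id = @Id", conn))
        {
            cmd.Parameters.AddWithValue("@Id", id);
            conn.Open();
            return (string)cmd.ExecuteScalar();
        }
    }
}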
Stored procedures' greatest benefit is execution time. If you have "heavy" SQL queries, you should use SPs.
I am writing a piece of software which stores all the information about a user's interactions in a global session object/class. I would like to store the values collected in persistent storage. However, I cannot use heavy databases such as SQL Server or MySQL on the target PC, as I need to keep the installer minimal in size.
I also need to retrieve values from the storage by passing simple LINQ queries, etc.
My question is what is the next best thing to databases which can be manipulated by C# code?
Probably either SQLite or SQL Server Compact Edition - these are both fairly full-featured database systems that run entirely in-process and are frequently used for these sorts of things (for example, Firefox uses SQLite to store bookmarks).
The next rung down the ladder of complexity would probably be either XML (using LINQ to XML) or just serialisable objects (using LINQ to Objects). You would of course incur performance penalties over a "proper" compact database like SQLite if you started storing a lot of data, but you would probably need to store more than you think before it became noticeable, and for small data sets the simplicity would even make this faster than SQLite (for example, you could restrict your application to storing the last 100 or so actions).
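A sketch of the XML option with LINQ to XML (the file name and element names are arbitrary):

using System;
using System.IO;
using System.Linq;
using System.Xml.Linq;

static class SessionLog
{
    public static void LogAction(string path, string name)
    {
        XDocument doc = File.Exists(path)
            ? XDocument.Load(path)
            : new XDocument(new XElement("Actions"));

        doc.Root.Add(new XElement("Action",
            new XAttribute("At", DateTime.UtcNow),
            new XAttribute("Name", name)));

        // Keep only the most recent 100 actions, as suggested above.
        foreach (var stale in doc.Root.Elements("Action")
                                      .Reverse().Skip(100).ToList())
            stale.Remove();

        doc.Save(path);
    }
}

Querying it back is then a one-liner: XDocument.Load(path).Root.Elements("Action").Where(a => (string)a.Attribute("Name") == "OpenedReport").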
SQL Server CE and SQLite are popular for the scenario you are describing. XML is as well.
You could connect to Access MDB files. You don't need a SQL Server for this, and it uses the same syntax.
You just need to use OleDb.
Example: DataEasy: Connect to MS Access (.mdb) Files Easily using C#
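A minimal sketch of the OleDb route (the .mdb path and table are placeholders):

using System.Data.OleDb;

// The Jet provider handles .mdb files; no database server required.
var connStr = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\app.mdb";

using (var conn = new OleDbConnection(connStr))
using (var cmd = new OleDbCommand("SELECT Name FROM Settings WHERE Id = ?", conn))
{
    cmd.Parameters.AddWithValue("?", 1);  // OleDb parameters are positional
    conn.Open();
    var name = (string)cmd.ExecuteScalar();
}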
Possible Duplicate:
What are the pros and cons to keeping SQL in Stored Procs versus Code
I was listening to the Hanselminutes podcast "Rise of The Micro ORM," where the guests (Sam Saffron and Rob Conery) outlined the classic reasons that DBAs insist on stored procedures:
They are pre-compiled, which gives them an execution speed advantage
They hide the underlying database schema, which allows a separation of interface and implementation that prevents brittleness.
A guest then said these aren't good arguments, and suggested the real reason DBAs insist on stored procs is that they simply want to protect themselves from the ignorance of middle-tier developers.
I found that statement to be a bit on the extreme side. Certainly I can agree that argument #2 is flawed, but I thought it was well known that sending arbitrary (uncompiled) SQL to the database was a performance hit. Is there something I'm missing that would explain why argument #1 is not really true?
My own answer, as just a guess, is that there is a performance hit - but it rarely matters. It is perhaps analogous to a developer who attempts to optimize every loop he writes, even though only 1% of the loops written ever benefit from the tuning. Am I capturing the thought correctly?
"but I thought it was well known that sending arbitrary (uncompiled) SQL to the database was a performance hit."
The distinction you're making between stored procs and other SQL statements regarding precompilation hasn't existed since SQL Server 6.5.
Stored Procedures and Execution Plans
In SQL Server version 6.5 and earlier, stored procedures were a way to partially precompile an execution plan. At the time the stored procedure was created, a partially compiled execution plan was stored in a system table. Executing a stored procedure was more efficient than executing an SQL statement because SQL Server did not have to compile an execution plan completely, it only had to finish optimizing the stored plan for the procedure. Also, the fully compiled execution plan for the stored procedure was retained in the SQL Server procedure cache, meaning that subsequent executions of the stored procedure could use the precompiled execution plan.
SQL Server 2000 and SQL Server version 7.0 incorporate a number of changes to statement processing that extend many of the performance benefits of stored procedures to all SQL statements. SQL Server 2000 and SQL Server 7.0 do not save a partially compiled plan for stored procedures when they are created. A stored procedure is compiled at execution time, like any other Transact-SQL statement. SQL Server 2000 and SQL Server 7.0 retain execution plans for all SQL statements in the procedure cache, not just stored procedure execution plans. The database engine uses an efficient algorithm for comparing new Transact-SQL statements with the Transact-SQL statements of existing execution plans. If the database engine determines that a new Transact-SQL statement matches the Transact-SQL statement of an existing execution plan, it reuses the plan. This reduces the relative performance benefit of precompiling stored procedures by extending execution plan reuse to all SQL statements.
http://msdn.microsoft.com/en-us/library/aa174792%28v=sql.80%29.aspx
In my experience, most DBAs could no more write a stored proc than they could fly the space shuttle. Everywhere I've worked, stored procs have been written by the application developers, who also designed and implemented the databases.
Having said that, stored procs are not innately faster than using, say, views, and may indeed be slower if written by inexperienced developers using stuff like cursors.
As for performance: either use stored procedures or precompiled (prepared) statements.
As for abstraction: either use a DAL/ORM or stored procedures.
Sure, stored procedures can do things that you can't do from the outside with the same performance. So, as usual, it depends.
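For example, a parameterized command gets its plan cached and reused just like a proc, and SqlCommand.Prepare makes the prepared-statement intent explicit (the Orders table here is hypothetical):

using System.Data;
using System.Data.SqlClient;

string connStr = "...";  // your connection string

using (var conn = new SqlConnection(connStr))
using (var cmd = new SqlCommand(
           "SELECT SUM(Total) FROM dbo.Orders WHERE CustomerId = @CustomerId", conn))
{
    cmd.Parameters.Add("@CustomerId", SqlDbType.Int);
    conn.Open();
    cmd.Prepare();  // explicitly prepare once...

    foreach (int id in new[] { 1, 2, 3 })
    {
        cmd.Parameters["@CustomerId"].Value = id;  // ...then execute many times
        object total = cmd.ExecuteScalar();
    }
}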
I need to copy several tables from one DB to another in SQL Server 2000, using C# (VS 2005). The call needs to be parameterized - I need to be able to pass in the name of the database to which I am going to be copying these tables.
I could use DTS with parameters, but I can't find any sample code that does this from C#.
Alternatively, I could just use
drop table TableName
select * into TableName from SourceDB..TableName
and then reconstruct the indexes etc - but that is really kludgy.
Any other ideas?
Thanks!
For SQL Server 7.0 and 2000, we have SQL-DMO for this. For SQL Server 2005 there is SMO. These allow you to do pretty much everything related to administering the database: scripting objects, enumerating databases, and much more. This is better, IMO, than trying a "roll your own" approach.
SQL 2000:
Developing SQL-DMO Applications
Transfer Object
SQL 2005:
Here is the SMO main page:
Microsoft SQL Server Management Objects (SMO)
Here is the Transfer functionality:
Transferring Data
How to: Transfer Schema and Data from One Database to Another in Visual Basic .NET
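A sketch of the SMO transfer (it assumes the SMO assemblies are referenced; the server and database names are placeholders):

using Microsoft.SqlServer.Management.Smo;

Server server = new Server("localhost");
Database source = server.Databases["SourceDb"];

Transfer transfer = new Transfer(source);
transfer.CopyAllTables = true;
transfer.CopySchema = true;
transfer.CopyData = true;
transfer.DestinationServer = "localhost";
transfer.DestinationDatabase = "DestDb";  // the destination you parameterize
transfer.DestinationLoginSecure = true;   // Windows auth on the destination
transfer.TransferData();                  // copies schema and rows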
If the destination table is being dropped every time, then why not use SELECT INTO? It doesn't seem like a kludge at all.
If it works just fine and ticks all the requirement boxes, why create a day's worth of work writing code to do exactly the same thing?
Let SQL do all the heavy lifting for you.
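If you go that way, the C# wrapper stays small. Note that identifiers (database and table names) cannot be passed as @parameters, so they have to be spliced into the SQL text; whitelist them first (this sketch assumes simple, unqualified names):

using System;
using System.Data.SqlClient;
using System.Text.RegularExpressions;

static void CopyTable(string connStr, string sourceDb, string table)
{
    // Identifiers can't be parameterized - validate before splicing them in.
    if (!Regex.IsMatch(sourceDb, @"^\w+$") || !Regex.IsMatch(table, @"^\w+$"))
        throw new ArgumentException("Unexpected identifier.");

    string sql = string.Format(
        "IF OBJECT_ID('{0}') IS NOT NULL DROP TABLE [{0}]; " +
        "SELECT * INTO [{0}] FROM [{1}]..[{0}];",
        table, sourceDb);

    using (SqlConnection conn = new SqlConnection(connStr))
    using (SqlCommand cmd = new SqlCommand(sql, conn))
    {
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}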
You could put the scripts (copy db) found here
http://www.codeproject.com/KB/database/CreateDatabaseScript.aspx
into an application; just replace the destination. To actually move the entire database, follow:
http://support.microsoft.com/kb/314546
But remember, the database has to be taken offline first.
Thanks