Memory/Performance of Using SQL Server CE - C#

I am working in an embedded environment and need to transfer all of the data from a SQL Server Compact Edition database (.sdf file) to a SQLite database. This would be a one-time operation, and my concern is that connecting to the CE database to retrieve the data will cause additional memory and CPU resources to be used and to stick around after I am done reading. My hope is that all the resources used by SQL Server CE will be garbage collected and freed once I have finished reading from the CE database; otherwise I may be forced to write my own .sdf parser/reader.
Does anyone know how the resources for SQL Server CE are handled after I am done connecting to the database, and whether I can safely assume they will be freed once I have read the data?

I believe you have no cause for concern here. The CE and Express editions of SQL Server have imposed utilization limits (1 GB of memory and 1 physical CPU), and since you are doing a one-time batch process, the memory will be reclaimed once the process exits.
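To make the one-time copy concrete, here is a rough sketch (not production code) that reads one table out of the .sdf and writes it into SQLite. The table and column names are placeholders, and the System.Data.SqlServerCe and System.Data.SQLite providers are assumed; the using blocks ensure the CE engine's resources are released as soon as the copy finishes.

using System.Data;
using System.Data.SqlServerCe;
using System.Data.SQLite;

class SdfToSqlite
{
    static void Main()
    {
        using (var src = new SqlCeConnection("Data Source=source.sdf"))
        using (var dst = new SQLiteConnection("Data Source=target.db"))
        {
            src.Open();
            dst.Open();

            // Placeholder schema; adjust to match the real table.
            using (var create = new SQLiteCommand(
                "CREATE TABLE IF NOT EXISTS Items (Id INTEGER PRIMARY KEY, Name TEXT)", dst))
            {
                create.ExecuteNonQuery();
            }

            // One transaction around the whole load: SQLite otherwise
            // commits per statement, which is far slower for bulk inserts.
            using (var tx = dst.BeginTransaction())
            {
                using (var read = new SqlCeCommand("SELECT Id, Name FROM Items", src))
                using (var reader = read.ExecuteReader())
                using (var write = new SQLiteCommand(
                    "INSERT INTO Items (Id, Name) VALUES (@id, @name)", dst, tx))
                {
                    var pId = write.Parameters.Add("@id", DbType.Int32);
                    var pName = write.Parameters.Add("@name", DbType.String);

                    while (reader.Read())
                    {
                        pId.Value = reader.GetInt32(0);
                        pName.Value = reader.GetString(1);
                        write.ExecuteNonQuery();
                    }
                }
                tx.Commit();
            }
        }
        // Once the connections are disposed, nothing from the CE engine
        // should linger; its working set is reclaimed with the process.
    }
}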

Related

MS Access or SQL Server CE for portable client application (performance basis)

I am using an MS Access 2007 database (.accdb) with password encryption in my client application, built in Visual Studio 2010 on the .NET Framework 3.5. My database is 1.0 GB and will grow to 1.7 GB at most. There are no forms/reports (macros etc.) in the database.
Several calculations are performed in tables, which degrades the performance of the application. It is portable client software, and I want to improve the performance of loading data (SELECT queries) from the database. I cannot use SQL Server Express (LocalDB etc.) since it does not support database encryption the way .accdb and .sdf do.
I have gone through several SO posts like SQL Server CE or MS Access for a portable database, which is aimed at storage of the DB, and Cases where MS Access is a better choice than SQL Server, which compares SQL Server and MS Access.
I am reading data from a table with millions of rows (the database is about 1 GB, mostly because of this table). Reading the data row by row (almost every field of each row) takes too much time, even though I have added indexes.
Should I go for SQL Server CE to increase performance (selecting data row by row)? If yes, then SQL Server CE 3.5 or SQL Server CE 4.0 (the database is not used for web purposes)?
Is there any RDBMS option available besides these two that supports database encryption the way .accdb and .sdf do?
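For what it's worth, SQL Server Compact encrypts the .sdf file itself when a password is supplied at creation time. A minimal sketch, assuming the System.Data.SqlServerCe provider; the file name and password are placeholders, and the exact encryption keywords can vary between CE versions.

using System.Data.SqlServerCe;

class CreateEncryptedSdf
{
    static void Main()
    {
        const string connStr =
            "Data Source=clientdata.sdf;Password=s3cret;Encrypt Database=True";

        // CreateDatabase builds the .sdf; with a password and
        // 'Encrypt Database=True' the file is encrypted on disk.
        using (var engine = new SqlCeEngine(connStr))
        {
            engine.CreateDatabase();
        }

        using (var conn = new SqlCeConnection(connStr))
        {
            conn.Open();
            using (var cmd = new SqlCeCommand(
                "CREATE TABLE Readings (Id INT IDENTITY PRIMARY KEY, Value FLOAT)", conn))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }
}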

.NET Embedded/Offline DB that works with EF

I've been researching different solutions for an offline database.
Basically I have a desktop application, and I would like it to communicate with a database without requiring an internet connection. So I am looking for a way to ship my app along with a database that the app can work with.
All my DB interaction will happen through Entity Framework, so I need a solution that is compatible with it. So far I have always used SQL Server Express for my DB, but as far as I know that requires SQL Server to be installed on the user's machine, which is obviously not what I need.
My DB will not need to handle huge amounts of data. (Worst-case would be something around 100,000 - 1,000,000 rows of data in the DB).
From what I am reading, SQLite, SQL Server CE, and a feature of SQL Server called LocalDB might do the job for me. (SQL Server CE is no longer supported by Microsoft, so I am guessing it's not a good idea.)
I was wondering if I am on the right track here? Is this the way to go, or is there a way to embed SQL Server Express into my app?
SQL Server Compact and SQLite can both run embedded and work with EF (SQL CE with considerably better EF support than SQLite).
LocalDB might also be an option; it requires admin access to install, though not at runtime.
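If you do pick SQLite on a current stack, EF Core's SQLite provider is the usual route rather than classic EF. A minimal sketch, assuming the Microsoft.EntityFrameworkCore.Sqlite package; entity and file names are placeholders.

using Microsoft.EntityFrameworkCore;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; } = "";
}

public class AppDbContext : DbContext
{
    public DbSet<Customer> Customers => Set<Customer>();

    // The database is a single file shipped alongside the app;
    // no server installation is needed on the user's machine.
    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlite("Data Source=app.db");
}

class Demo
{
    static void Main()
    {
        using (var db = new AppDbContext())
        {
            db.Database.EnsureCreated();  // creates the file and schema on first run
            db.Customers.Add(new Customer { Name = "Ada" });
            db.SaveChanges();
        }
    }
}

EnsureCreated builds the database file and schema the first time the app runs, so you can ship the application with no database at all and let it create one next to the executable.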

Detecting C# and SQL Server memory conflicts

I have a memory-intensive C# app that resides on the same server as SQL Server.
I have tweaks in my application to limit in-memory caching and I am aware of how to set (and have set) maximum memory limits on my SQL Server.
PROBLEM: When the C# app wants more memory than is available because of SQL Server's caching, my app slows down greatly, presumably because it starts paging to virtual memory. I can prevent this most of the time by eyeballing how much RAM is available and setting the appropriate values both in my custom C# code (whether to cache to RAM and/or to SQL Server) and in SQL Server's settings.
HOWEVER: Sometimes the machine's memory usage goes beyond my eyeballed boundaries due to other processes on the machine, typical OS needs fluctuating, etc.
I have noticed that SQL Server will often yield RAM to other processes, such as Chrome or MS Word, but it doesn't seem to do so for my process. I have a gut feeling that my C# app isn't actively using all of the data cached in SQL Server...
So, how do I detect when SQL Server won't yield the RAM to my application and/or how do I detect when my application cannot allocate additional bytes of physical RAM?
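One way to probe this from the application side, sketched under the assumption you are on Windows with the full .NET Framework: MemoryFailPoint asks the CLR whether a large allocation is likely to succeed without heavy paging, and the OS "Available MBytes" performance counter shows the physical RAM that SQL Server's cache is competing for. The 500 MB figure is just an illustrative threshold.

using System;
using System.Diagnostics;
using System.Runtime;

class MemoryProbe
{
    static bool CanAllocate(int megabytes)
    {
        try
        {
            // Throws InsufficientMemoryException if the CLR predicts the
            // allocation would fail or force the process into paging.
            using (new MemoryFailPoint(megabytes))
            {
                return true;
            }
        }
        catch (InsufficientMemoryException)
        {
            return false;
        }
    }

    static void Main()
    {
        using (var available = new PerformanceCounter("Memory", "Available MBytes"))
        {
            Console.WriteLine($"Available physical RAM: {available.NextValue()} MB");
        }
        Console.WriteLine($"Safe to cache 500 MB in-process: {CanAllocate(500)}");
    }
}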

SQL Server Express Database hosted on Network Share - Is it Possible?

I've started work on a project that requires an SQL Server Database. I will be building a front end application in c# .Net 3.5, that will use LINQ to SQL.
I need to host the database on a network share so that a group of users can all gain access to the database, mainly for read only.
I know that SQL Server Compact is designed to run on the local machine, and my company is not willing to front the cost of a full-blooded SQL Server.
Is there a way of achieving what I need to do via SQL Server Express?
If so, which are the best guides on how to set this up?
Thanks
If you go with the (free) SQL Server Express, it will do what you need, but you don't access it through a network shared drive: the server is located by an IP address (or equivalent DNS name).
Your C# application would be talking to a service (SQL Server), not reading from or writing to a database file. The service handles all interaction with the database. Only the SQL Server service needs to know where the file actually is; your client machines won't know and shouldn't care.
If your background is only with file-based databases such as MS Access, you need to change your mindset a bit about how SQL Server works.
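A small sketch of what "talking to the service" looks like in practice: the client never touches the .mdf file, only a server/instance name. The server, database, and table names below are placeholders.

using System;
using System.Data.SqlClient;

class Client
{
    static void Main()
    {
        // SERVERNAME is the machine hosting the Express instance;
        // SQLEXPRESS is the default instance name for SQL Server Express.
        const string connStr =
            @"Server=SERVERNAME\SQLEXPRESS;Database=ProjectDb;Integrated Security=True";

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Orders", conn))
            {
                Console.WriteLine(cmd.ExecuteScalar());
            }
        }
    }
}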
You can install a SQL Server Express instance and install SQL Server Management Studio Express for all users who need access to the database. The Express edition is a standard SQL Server with limits on the number of processors used, the maximum amount of memory used, and the maximum database size. If these limitations don't bother you, it should work fine for you.
Using a network share as database storage so that several clients access the db files directly is a bad idea: the SQL Server instance should always be the only thing directly accessing the database, for both read and write access. Configuring several instances of SQL Server against the same database files will probably not work, and if it does work, it will probably wreak havoc in your database files.

Using C# Web Services, what is the recommended way to upload records from a local database to a remote one

The remote Central Office SQL Server 2008 database is only available via web services, and I need to collect data located in various tables from each Sub-Office SQL Server 2008 database on a scheduled basis.
What is the "recommended way" to prepare and upload the data at the client office in order to optimize the web service call that uploads the data to the Central Office?
The language used for the WebService is C#.
The Database used on both ends is SQL Server 2008.
I can only presume you are transferring large data sets across a domain boundary.
If you cannot use SQL technologies to do this for you (SSIS, mirroring), then (depending on your volumes) you may need to create a WCF streamed service.
http://msdn.microsoft.com/en-us/library/ms789010.aspx
By streaming the data over the wire you will avoid out-of-memory errors and potential Large Object Heap fragmentation.
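As a rough outline of the streamed approach (all names are placeholders, and the binding must be configured with transferMode="Streamed"): the contract exposes a single Stream parameter, so neither side ever buffers the whole batch in memory.

using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IBatchUpload
{
    // With a streamed binding, WCF hands the service a live stream
    // instead of a fully buffered message.
    [OperationContract]
    void UploadRows(Stream rows);
}

public class BatchUploadService : IBatchUpload
{
    public void UploadRows(Stream rows)
    {
        // Read and stage the data in chunks (e.g. feed SqlBulkCopy)
        // rather than materializing the whole upload at once.
        var buffer = new byte[81920];
        int read;
        while ((read = rows.Read(buffer, 0, buffer.Length)) > 0)
        {
            // ... parse the chunk and stage it for insertion ...
        }
    }
}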
