Export SDF database contents as JSON file - C#

I have a C# Windows Forms project which uses a SQL Server Compact (.SDF) database only for retrieving data. My application does not update the database, so the database is static.
I recently read somewhere that for this kind of static data it is best to use XML or JSON, as they reduce the I/O time spent on connecting to the database, retrieving data, and closing the connection. Is this true?
If so, is there a way I can directly convert my database contents to a JSON file? It currently has 7 tables (relations) and a total of 850 rows (tuples) of data. The data is in the Kannada language, not English (if that makes any difference).

Yes, with 850 read-only records that you don't join, reading from a file rather than a database is way faster.
What you want to do is save your data in files (one per table will come in handy) and read them once at program start into a list of classes that look like your respective table structure. You can then operate on those lists a lot faster than on a database. You can use JSON as a file format, or XML or CSV if your data is simple, or anything else you can come up with.
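As a rough starting point, here is a minimal sketch of both halves, assuming the Json.NET (Newtonsoft.Json) library; the table and class names (Words, WordRow) are invented placeholders for your own schema:

    // Sketch: export one SDF table to JSON once, then load the JSON at startup.
    using System.Collections.Generic;
    using System.Data.SqlServerCe;   // SQL Server Compact provider
    using System.IO;
    using Newtonsoft.Json;

    public class WordRow             // mirror one table's columns here
    {
        public int Id { get; set; }
        public string Kannada { get; set; }
        public string Meaning { get; set; }
    }

    public static class WordStore
    {
        // One-time export: read every row of the table and dump it as JSON.
        public static void ExportToJson(string sdfPath, string jsonPath)
        {
            var rows = new List<WordRow>();
            using (var con = new SqlCeConnection("Data Source=" + sdfPath))
            using (var cmd = new SqlCeCommand("SELECT Id, Kannada, Meaning FROM Words", con))
            {
                con.Open();
                using (var reader = cmd.ExecuteReader())
                    while (reader.Read())
                        rows.Add(new WordRow
                        {
                            Id = reader.GetInt32(0),
                            Kannada = reader.GetString(1),
                            Meaning = reader.GetString(2)
                        });
            }
            // UTF-8 handles Kannada text without any extra work.
            File.WriteAllText(jsonPath, JsonConvert.SerializeObject(rows, Formatting.Indented));
        }

        // At program start: read the file once into a list and query it in memory.
        public static List<WordRow> LoadFromJson(string jsonPath)
        {
            return JsonConvert.DeserializeObject<List<WordRow>>(File.ReadAllText(jsonPath));
        }
    }

With everything in a List<WordRow>, lookups can then be done with LINQ instead of SQL.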
How you do this in detail is far too broad a question for this site. Read a few tutorials and ask a more detailed question once you run into problems.

Related

C# upload CSV file to Netezza

So my team is looking into connecting to Netezza with C#, and we plan on loading data into Netezza, pulling data from Netezza, and writing update queries, all in C#.
From my research, I see that it's possible to connect to Netezza using C#, and I'm wondering if you can do all of the above using C#, so that we can decide whether we can do just about anything with Netezza from C#. We'd like to know before we commit to anything. The type of data we would be loading is CSV files.
Are there any good resources on this? I haven't been able to find any.
We also have Aginity client tools, so maybe it's possible to incorporate Aginity into this (not that I would want to, but if it's easier I'd like to know about it)?
Retrieving data is straightforward and can be done through the usual channels (loop over a cursor to get results) but loading can take a bit longer.
Netezza is not a fan of individual INSERT queries; it doesn't support multi-row inserts, so loading a large number of records one INSERT at a time will take a long time.
When loading multiple records, most people write their data out to a .csv file and use the external-table syntax to perform the insert.
Within an application, we prefer to load/unload our data via a named pipe so that we don't have to write/read the data to disk first.
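For illustration, a rough sketch of the CSV external-table route from C# over ODBC; the DSN, credentials, table name, file path, and the exact USING options are assumptions to verify against your Netezza version's data-loading guide:

    // Sketch: client-side external-table load over ODBC.
    using System.Data.Odbc;

    class NetezzaLoader
    {
        static void Main()
        {
            using (var con = new OdbcConnection("DSN=NZSQL;UID=admin;PWD=secret"))
            {
                con.Open();
                // REMOTESOURCE 'ODBC' tells the server the CSV lives on the
                // client machine, so the driver streams it up in one pass --
                // vastly faster than row-by-row INSERTs.
                var sql = @"INSERT INTO target_table
                            SELECT * FROM EXTERNAL 'C:\loads\data.csv'
                            USING (DELIMITER ',' REMOTESOURCE 'ODBC' SKIPROWS 1)";
                using (var cmd = new OdbcCommand(sql, con))
                    cmd.ExecuteNonQuery();
            }
        }
    }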

Write binary file database C#

I'm trying to make a program that will show, let's say, images of cars to create a collection. I'm using Windows Forms and I have made the graphical interface I want.
The thing is, I have an XML file (like a database) with 300 cars (elements), and each car has some child elements, one of them being the "carname.png" that I use to find the image file to show. My question is whether there is a way to convert this XML file into a binary one that my program will be able to read much faster. This is for efficiency and self-educational purposes, as I am new to programming. Thanks in advance for your time and thoughts.
Simply import your XML file into a suitable database and query that instead. If the data naturally fits into relational form, use SQLite as the first choice. In the Windows environment, as a learning experience, you might prefer to use SQL Server, but it's overkill.
If the data does not fit naturally into relational form, then use a NoSQL database like MongoDB and store your XML records (or JSON or whatever) in it. You get fast access and XML flexibility together.
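As a sketch of the relational route, assuming the System.Data.SQLite package and an XML shape like <cars><car><name>...</name><image>...png</image></car></cars> (adjust the element names to your actual file):

    // Sketch: import XML records into a SQLite table once, then query SQLite.
    using System.Data.SQLite;
    using System.Xml.Linq;

    class XmlToSqlite
    {
        static void Main()
        {
            using (var con = new SQLiteConnection("Data Source=cars.db"))
            {
                con.Open();
                using (var create = new SQLiteCommand(
                    "CREATE TABLE IF NOT EXISTS Cars (Name TEXT, Image TEXT)", con))
                    create.ExecuteNonQuery();

                using (var tx = con.BeginTransaction())   // one transaction = fast bulk insert
                using (var cmd = new SQLiteCommand(
                    "INSERT INTO Cars (Name, Image) VALUES (@n, @i)", con))
                {
                    cmd.Transaction = tx;
                    cmd.Parameters.Add("@n", System.Data.DbType.String);
                    cmd.Parameters.Add("@i", System.Data.DbType.String);
                    foreach (var car in XDocument.Load("cars.xml").Descendants("car"))
                    {
                        cmd.Parameters["@n"].Value = (string)car.Element("name");
                        cmd.Parameters["@i"].Value = (string)car.Element("image");
                        cmd.ExecuteNonQuery();
                    }
                    tx.Commit();
                }
            }
        }
    }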

C# Winform database

I am building a WinForms application that needs a database.
The database needs to save an array of items of a custom class:
Name
Date
Duration
Artist
Genre
One option is to build the database as a file that I save every time the array grows. Would there be a noticeable wait when saving an array of 300 or so items?
The second option is to use SQL.
What is the difference between them, and which should I use?
As someone mentioned in a comment, SQLite should work very well for this type of scenario.
If you think your data set will remain fairly small, you might consider XML, or a file, or something else if you think that would be quicker/easier.
In any case, I would strongly recommend that you hide your storage logic behind an interface and call only that from the WinForms part of your application. This way you will be able to replace your storage solution later if you need to.
Update in response to comment: The reason for using SQLite instead of another DB system is that SQLite can be integrated directly into your application. Other DBMSs will typically be external systems that you just connect to from within your app.
A quick Google search will provide you with lots of info, such as this short article about using SQLite within a C# application.
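To make the interface idea concrete, here is a sketch, assuming the System.Data.SQLite package and a hypothetical Track class built from the fields in the question; the form code only ever sees ITrackStore, so the SQLite implementation can be swapped for an XML- or file-backed one later without touching the UI:

    using System;
    using System.Collections.Generic;
    using System.Data.SQLite;

    public class Track
    {
        public string Name { get; set; }
        public DateTime Date { get; set; }
        public TimeSpan Duration { get; set; }
        public string Artist { get; set; }
        public string Genre { get; set; }
    }

    public interface ITrackStore
    {
        void Save(Track track);
        IList<Track> LoadAll();
    }

    public class SqliteTrackStore : ITrackStore
    {
        private const string Cs = "Data Source=tracks.db";

        public SqliteTrackStore()
        {
            Execute(@"CREATE TABLE IF NOT EXISTS Tracks
                      (Name TEXT, Date TEXT, Duration INTEGER, Artist TEXT, Genre TEXT)");
        }

        public void Save(Track t)
        {
            using (var con = new SQLiteConnection(Cs))
            using (var cmd = new SQLiteCommand(
                "INSERT INTO Tracks VALUES (@n, @d, @u, @a, @g)", con))
            {
                con.Open();
                cmd.Parameters.AddWithValue("@n", t.Name);
                cmd.Parameters.AddWithValue("@d", t.Date.ToString("o"));  // ISO 8601 text
                cmd.Parameters.AddWithValue("@u", t.Duration.Ticks);      // store as integer ticks
                cmd.Parameters.AddWithValue("@a", t.Artist);
                cmd.Parameters.AddWithValue("@g", t.Genre);
                cmd.ExecuteNonQuery();
            }
        }

        public IList<Track> LoadAll()
        {
            var result = new List<Track>();
            using (var con = new SQLiteConnection(Cs))
            using (var cmd = new SQLiteCommand(
                "SELECT Name, Date, Duration, Artist, Genre FROM Tracks", con))
            {
                con.Open();
                using (var r = cmd.ExecuteReader())
                    while (r.Read())
                        result.Add(new Track
                        {
                            Name = r.GetString(0),
                            Date = DateTime.Parse(r.GetString(1)),
                            Duration = TimeSpan.FromTicks(r.GetInt64(2)),
                            Artist = r.GetString(3),
                            Genre = r.GetString(4)
                        });
            }
            return result;
        }

        private static void Execute(string sql)
        {
            using (var con = new SQLiteConnection(Cs))
            using (var cmd = new SQLiteCommand(sql, con))
            {
                con.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }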
I think you have to consider the future size of your data.
If you know that the data will grow significantly in the future, I think you have to use a database system like SQL Server.
Otherwise, if it is only for a few records, you can use an XML file instead.
If you are using an MS SQL database, you can open a connection while saving your data and write it into the database with a SqlDataAdapter.
If you are using an XML file instead, you can use the XmlSerializer class to serialize your own business objects.
File vs. database? It's simple. What is a database? It is a file, only with an engine that knows how to manipulate that file.
If you use a file, you suddenly need to think, "what if?" What if the file gets corrupted during a write? Or what if the computer shuts down in the middle of a write? A DBMS takes care of these issues with all sorts of mechanisms, such as uncommitted data files. With a plain file, you would need to provide these mechanisms yourself.
This is why you should write only non-critical data to files. For example, some user settings: if you lose that file, the user can resize controls again, but no data is lost. A log file is another good use: if you lose a log, you can live without it. But if you lose months' worth of data...
In your case, I don't know how important the user history is. An array of 300 items is not large. You can use XML by creating a class, marking its properties with XML attributes, and then using an XML serializer to serialize your history to XML:
http://msdn.microsoft.com/en-us/library/system.xml.serialization.xmlserializer.aspx
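For example, a minimal sketch of that approach (the HistoryItem class and its properties are placeholders to shape after your own class; note that XmlSerializer does not handle TimeSpan properties directly, so a duration would need to be stored as ticks or seconds):

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Xml.Serialization;

    public class HistoryItem
    {
        [XmlAttribute] public string Name { get; set; }
        [XmlAttribute] public DateTime Date { get; set; }
        public string Artist { get; set; }
        public string Genre { get; set; }
    }

    public static class HistoryFile
    {
        static readonly XmlSerializer Serializer =
            new XmlSerializer(typeof(List<HistoryItem>));

        public static void Save(List<HistoryItem> items, string path)
        {
            using (var stream = File.Create(path))
                Serializer.Serialize(stream, items);
        }

        public static List<HistoryItem> Load(string path)
        {
            using (var stream = File.OpenRead(path))
                return (List<HistoryItem>)Serializer.Deserialize(stream);
        }
    }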
But if it is going to grow, and you are not planning to age out and delete some of it, look into an RDBMS.

system architecture for real-time data

The company I work for is running a C# project that crawls data from around 100 websites, saves it to the DB, and runs some procedures and calculations on that data.
Each of those 100 websites has around 10,000 events, and each event is saved to the DB.
After that, the saved data is processed and aggregated into XML, so each of those 10,000 saved events is now represented as an XML file in the DB.
The design looks like this:
1) Crawl 100 websites to collect the data and save it to the DB.
2) Collect the saved data from the DB and generate an XML file for each event.
3) Save the XML files to the DB.
The main issue for this post is the retrieval of the saved XML files.
Each XML is about 1MB, and considering the fact that there are around 10,000 events, I am not sure SQL Server 2008 R2 is the right option.
I tried to use Redis, and saving works very well (and fast!), but the query to get those XMLs back works very slowly (even locally, so network traffic won't be an issue).
I was wondering what your thoughts are. Please take into consideration that it is a real-time system, so caching is not an option here.
Any idea will be welcomed.
Thanks.
Instead of using a DB you could try a cloud-based system (Azure blobs or Amazon S3); it seems to be a perfect solution. See this post: azure blob storage effectiveness. It is the same situation, except you have XML files instead of images. You can use a DB for storing the metadata, i.e. the source and event type of the XML and its path in the cloud, but not the data itself.
You may also zip the files. I don't know the exact method offhand, but it can surely be handled on the client side. Static data is often sent to the client in zipped format by default.
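One way to handle the zipping on the client side, as a sketch using GZipStream from System.IO.Compression:

    using System.IO;
    using System.IO.Compression;
    using System.Text;

    static class XmlZip
    {
        public static byte[] Compress(string xml)
        {
            using (var output = new MemoryStream())
            {
                using (var gzip = new GZipStream(output, CompressionMode.Compress))
                {
                    byte[] raw = Encoding.UTF8.GetBytes(xml);
                    gzip.Write(raw, 0, raw.Length);
                }   // disposing flushes the gzip footer
                return output.ToArray();
            }
        }

        public static string Decompress(byte[] data)
        {
            using (var input = new GZipStream(new MemoryStream(data), CompressionMode.Decompress))
            using (var reader = new StreamReader(input, Encoding.UTF8))
                return reader.ReadToEnd();
        }
    }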
Your question is missing some details, such as how long your data needs to remain in the database…
I'd avoid storing XML in the database if you already have the raw data. Why not have an application that queries the database and generates XML reports on demand? This will save you a lot of space.
10 GB of data per day is something SQL Server 2008 R2 can handle with the right hardware and good structure optimization. You'll need to investigate whether Standard Edition will be enough or whether you'll have to use Enterprise or Datacenter licenses.
In any case, the answer is yes: SQL Server is capable of handling this amount of data, but I'd check other solutions as well to see if it's possible to reduce the costs in any way.
Your basic architecture doesn't seem to be at fault; it's the way you've used Redis. Basically, if you design your key=>value scheme right, there is no way retrieval from Redis could be slow.
For example, let's say I have to store 1 million objects in Redis, each stored against an id that is nothing but a GUID. The save will be really quick, but when it comes to retrieval, do I know the key? If I know the key, it'll be fast; but if I don't know it, or I am trying to retrieve my data not by key but by some value inside my objects, then of course it'll be slow.
The point is: when it comes to retrieval, you should only ever work against the key and nothing else, so design your key as a pre-calculated value in itself. Then, when I need to get some data from Redis/memcached, I can construct the key and make a single hit to get the data.
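As a sketch of that idea, using the StackExchange.Redis client for illustration (the event:siteId:eventId key scheme is invented; the point is the single-GET access pattern):

    using StackExchange.Redis;

    public class EventXmlStore
    {
        private readonly IDatabase _db =
            ConnectionMultiplexer.Connect("localhost").GetDatabase();

        // Build the key from values known at read time, never by scanning.
        private static string Key(int siteId, long eventId)
        {
            return "event:" + siteId + ":" + eventId;
        }

        public void Save(int siteId, long eventId, string xml)
        {
            _db.StringSet(Key(siteId, eventId), xml);
        }

        public string Load(int siteId, long eventId)
        {
            return _db.StringGet(Key(siteId, eventId));   // one round trip, O(1) lookup
        }
    }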
If you could put more details, we'll be able to help you better.

Programmatically saving a SQL Server database to xml files and restoring it again

I want to save a whole MS SQL 2008 database into XML files, using ASP.NET.
Now I am a bit lost here... what would be the best method to achieve this? DataSets?
And I need to restore the database again later, using these XML files. I am thinking about using DataSets for reading the tables and writing them to XML, and using the SqlBulkCopy class to restore the database again. But I am not sure whether this would be the right approach...
Any clues and tips for me?
If you need to restore it on the same server type (I mean SQL Server 2008 or higher) and don't care about the ability to see the actual data inside the XML, do the following:
Programmatically backup the DB using "BACKUP DATABASE" T-SQL
Compress the backup
Convert the backup to Base64
Place the backup as the content of the XML file (like: <database name="..." compressionmethod="..." compressionlevel="...">the Base64 content here</database>)
On the server where you need to restore it, download the XML, extract the Base64 content, use the attributes to know what compression was used. Decompress and restore using T-SQL "RESTORE" command.
Would that approach work?
For sure, if you need to see the content of the database, you would need to develop an XML schema, go through each table, etc. But then you won't have SPs/views and other items backed up.
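For illustration, a rough sketch of the backup / compress / Base64 / wrap-in-XML steps described above, with invented names and paths; note that BACKUP runs under the SQL Server service account and writes the .bak to a path on the database server, not the web server:

    using System;
    using System.Data.SqlClient;
    using System.IO;
    using System.IO.Compression;
    using System.Xml.Linq;

    class DbToXml
    {
        static void Main()
        {
            const string bakPath = @"C:\backups\mydb.bak";

            // 1) BACKUP DATABASE via T-SQL
            using (var con = new SqlConnection("Server=.;Database=master;Integrated Security=true"))
            using (var cmd = new SqlCommand("BACKUP DATABASE [MyDb] TO DISK = @path", con))
            {
                cmd.Parameters.AddWithValue("@path", bakPath);
                con.Open();
                cmd.ExecuteNonQuery();
            }

            // 2) compress the backup
            byte[] compressed;
            using (var ms = new MemoryStream())
            {
                using (var gzip = new GZipStream(ms, CompressionMode.Compress))
                using (var bak = File.OpenRead(bakPath))
                    bak.CopyTo(gzip);
                compressed = ms.ToArray();
            }

            // 3) Base64-encode and 4) wrap it in the XML envelope
            new XElement("database",
                new XAttribute("name", "MyDb"),
                new XAttribute("compressionmethod", "gzip"),
                Convert.ToBase64String(compressed))
                .Save("mydb.xml");
        }
    }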
Because you are talking about a CMS, I'm going to assume you are deploying into hosted environments where you might not have command line access.
Now, before I give you the link, I want to state that this is a BAD idea. XML is far too verbose for transferring large amounts of data. Further, although it is relatively easy to pull the data out, putting it back in will be difficult and a very time-consuming development project in itself.
Next alert: as Denis suggested, you are going to miss all of your stored procedures, functions, etc. Your best bet is to use the normal SQL Server backup/restore process. (Incidentally, I upvoted his answer.)
Finally, the last time I dealt with XML and SQL Server we noticed interesting issues that cropped up when data exceeded a 64KB boundary. Basically, at 63.5KB, the queries ran very quickly (200ms). At 64KB, the query times jumped to over a minute and sometimes quite a bit longer. We didn't bother testing anything over 100KB as that was taking 5 minutes on a fast/dedicated server with zero load.
http://msdn.microsoft.com/en-us/library/ms188273.aspx
See this for putting it back in:
How to insert FOR AUTO XML result into table?
For kicks, here is a link talking about pulling the data out as json objects: http://weblogs.asp.net/thiagosantos/archive/2008/11/17/get-json-from-sql-server.aspx
You should also read (not for the faint of heart): http://www.simple-talk.com/sql/t-sql-programming/consuming-json-strings-in-sql-server/
Of course, the commenters all recommend building something using a CLR approach, but that's probably not available to you in a shared database hosting environment.
At the end of the day, if you are truly insistent on this madness, you might be better served by simply iterating through your table list and exporting all the data to standard CSV files, then iterating the CSV files to load the data back in, as in C# - is there a way to stream a csv file into database?
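If you do go down the CSV route, here is a rough sketch of the export side (the quoting is naive and there is no header row; a real CSV writer should also handle embedded newlines and culture-specific formatting):

    using System.Collections.Generic;
    using System.Data.SqlClient;
    using System.IO;

    class CsvExporter
    {
        static void Main()
        {
            using (var con = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
            {
                con.Open();

                // List the base tables from the standard metadata views.
                var tables = new List<string>();
                using (var cmd = new SqlCommand(
                    "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'", con))
                using (var r = cmd.ExecuteReader())
                    while (r.Read())
                        tables.Add(r.GetString(0));

                // Dump each table to its own .csv file.
                foreach (var table in tables)
                {
                    using (var cmd = new SqlCommand("SELECT * FROM [" + table + "]", con))
                    using (var r = cmd.ExecuteReader())
                    using (var w = new StreamWriter(table + ".csv"))
                    {
                        while (r.Read())
                        {
                            var fields = new string[r.FieldCount];
                            for (int i = 0; i < r.FieldCount; i++)
                            {
                                string value = r.IsDBNull(i) ? "" : r.GetValue(i).ToString();
                                fields[i] = "\"" + value.Replace("\"", "\"\"") + "\"";
                            }
                            w.WriteLine(string.Join(",", fields));
                        }
                    }
                }
            }
        }
    }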
Bear in mind that ALL of the above methods suffer from:
long processing times due to the data overhead, which leads to
a high potential for failure due to the various timeouts (page processing, command, connection, etc.); and,
if your data model changes between the time it was exported and reimported, then you're back to writing custom translation code and are ultimately screwed anyway.
So, only do this if you really really have to and are at least somewhat of a masochist at heart. If the purpose is simply to transfer some data from one installation to another, you might consider using one of the tools like SQL Compare and SQL Data Compare from RedGate to handle the transfer.
I don't care how much (or little) you make, the $1500 investment in their developer bundle is much cheaper than the months of time you are going to spend doing this, fixing it, redoing it, fixing it again, etc. (for the record I do NOT work for them. Their products are just top notch.)
Red Gate's SQL Packager lets you package a database into an exe or a VS project, so you might want to take a look at that. You can specify which tables you want to include data for.
Is there any specific reason you want to do this using XML?
