C#: method for storing large and complex data? [closed]

I am trying to store a software "Calendar" (complex logs that are saved and browsed by date).
I tried both .ini and XML, but when the application reads the entire file to find the info for one specific day out of roughly 100 days, it takes almost 9 seconds to pull 5 variables out of 500. The file will eventually grow to more than 40 variables per day.
I would also rather not create a file for each day; that seems a little unprofessional and messy.
I am asking whether there is an alternative that keeps things fast and neat. The data includes variables of different types and in different amounts. I know I am overdoing it with the logging, but the program needs these logs to do its work.

If the data must be stored, it has to be either a file or a database (local or remote). I'd go for SQLite: it ends up as a single file, but you can query the data with SELECT, JOIN, etc.
EDIT:
You can use SQLite3 from C# if you include this package:
https://www.nuget.org/packages/System.Data.SQLite/
You'll need to learn some SQL, but after that you'll just use something like:
select Message from Logs where Date > '2015-11-01' and Date < '2015-11-25';
which is easier, faster and clearer than messing with XML, and it will not load the whole file.
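For instance, a minimal C# sketch using that package (the database file name, the Logs table and its columns are hypothetical, mirroring the query above):

using System;
using System.Data.SQLite;

class LogReader
{
    static void Main()
    {
        // The whole database lives in this single file.
        using (var conn = new SQLiteConnection("Data Source=calendar.db"))
        {
            conn.Open();
            using (var cmd = new SQLiteCommand(
                "select Message from Logs where Date > @from and Date < @to", conn))
            {
                // Parameters keep the values out of the SQL text itself.
                cmd.Parameters.AddWithValue("@from", "2015-11-01");
                cmd.Parameters.AddWithValue("@to", "2015-11-25");

                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine(reader.GetString(0));
                }
            }
        }
    }
}

Only the rows matching the date range are read; with an index on Date, the engine walks the index instead of scanning the whole file.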

As mentioned above, SQLite is a great option. Neither you nor, probably, most people here will be able to write a database management system as efficient as the ones already out there.
https://www.sqlite.org

The whole point of using an RDBMS is that it's far more efficient than dealing with files.
SQLite is lightweight and easier to deploy. But remember that:
SQLite only supports a single writer at a time (meaning the execution of an individual transaction). SQLite locks the entire database when it needs a lock (either read or write) and only one writer can hold a write lock at a time. Due to its speed this actually isn't a problem for low to moderate size applications, but if you have a higher volume of writes (hundreds per second) then it could become a bottleneck.
Reference this question
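If you ever do approach that write volume, one common mitigation (a minimal sketch, assuming the System.Data.SQLite package linked above) is to switch the database to write-ahead-log mode:

using System.Data.SQLite;

class WalSetup
{
    public static void EnableWal(string dbPath)
    {
        using (var conn = new SQLiteConnection("Data Source=" + dbPath))
        {
            conn.Open();
            using (var cmd = new SQLiteCommand("PRAGMA journal_mode=WAL;", conn))
            {
                // In WAL mode readers no longer block the single writer,
                // and the writer no longer blocks readers.
                cmd.ExecuteNonQuery();
            }
        }
    }
}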
If this is an enterprise-level application requirement, I would go for an Azure Table storage based solution, which is ideal for this sort of scenario.

Related

Using a JSON file as a database in an ASP.NET app (C#) [closed]

I am developing a lightweight web app in ASP.NET with C# and I would like to store some data, such as site analytics.
Instead of using a SQL Server database, could I use a JSON file instead? I would open the JSON file and write into it when done. Could there be problems with simultaneous connections?
Is this standard/common practice?
Any help will be highly appreciated.
It is probably a VERY bad idea. I would only do this if it was some text file to display a wee bit of information.
If you need updating? Then what happens if two users are on the site? The last person to update will overwrite the text-based JSON file.
I suppose you could write out an "id" and a JSON file for each record you edit; that way, saving one record's value would not overwrite the others. But putting more than one record in a JSON file, with users editing those values? No, it will not work, since a simple save of that text JSON file will overwrite any changes made by any other user who happens to hit the same button.
It is VERY hard to find web site hosting, even the super-cheap kind at less than $10 per month, that does not include a database server system. And be it MySQL, SQL Server, PostgreSQL and more? They all have free versions you can use.
And I suppose you could consider the file-based SQLite. That would be even better. While it is not considered thread safe, it can work if you only have a few users, say 2-4, working at the same time.
Because you have OH SO MANY choices here? The only reason to use a text-based JSON file is if no other option exists, and boatloads of options exist in nearly all cases.
And if no database were available? Then I would simply include SQLite in the project and use that.
I'm not super experienced, but I would like to explain why I would never do that.
First of all, almost anything (if you are brave enough) can be a database, as long as it is some kind of file that gives persistence to your data. Databases are basically single-purpose software optimized for storing data. The main problem with your solution is that you would need to read the file, load it into memory as an object, write data to it as you would with a static factory object, and then serialize it back to JSON when you are done. Personally I don't like this idea because it is very prone to human error (like deleting the file by accident during maintenance), and it can get harder to debug once it accumulates a sizeable chunk of data. There are very lightweight data-persistence solutions built on SQLite (a database for small applications), such as PocketBase. Since you are already developing a backend, it would take you next to no effort to add a little table to store the analytics.
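To make that read-modify-write cycle concrete, here is a minimal sketch using System.Text.Json (the file name and dictionary shape are hypothetical). Note that the lock only serializes writers within one process, which is exactly why simultaneous users across processes make this approach fragile:

using System.Collections.Generic;
using System.IO;
using System.Text.Json;

class AnalyticsStore
{
    private static readonly object _gate = new object();
    private const string FilePath = "analytics.json";

    public static void RecordHit(string page)
    {
        lock (_gate) // prevents two threads in THIS process from clobbering each other
        {
            var hits = File.Exists(FilePath)
                ? JsonSerializer.Deserialize<Dictionary<string, int>>(File.ReadAllText(FilePath))
                : new Dictionary<string, int>();

            hits[page] = hits.TryGetValue(page, out var n) ? n + 1 : 1;

            // Whole-file rewrite: a second process doing the same thing
            // at the same time would still lose one of the updates.
            File.WriteAllText(FilePath, JsonSerializer.Serialize(hits));
        }
    }
}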

Save millions of files on Windows [closed]

My system will save ~20-40 million image files
Each file is 150-300KB
My application will run on Windows Server 2012 R2 and the files will be saved on storage (I don't know which one yet)
My application is written in C#
My requirements are:
- The system will constantly delete old files and save new files (around 100K files per day)
- The most recent images will be automatically displayed to users in web and WPF applications
- I need fast access to recent files (last week) for report purposes
What is the best practice for storing / organizing this amount of files?
Broad questions much? If you're asking how to organize them for efficient access, that's a bit harder to answer without knowing why you're storing that many files.
Let me explain:
Let's say you're storing a ton of log files. Odds are your users will be most interested in the logs from the last week or so, so storing your data on disk in a way that lets you easily access the files by day (e.g. yyyy-mm-dd.log) will speed up access to a specific day's log.
Now instead think of it like a phone book where you're looking up people's names. Storing entries by the time you inserted them really isn't going to get you to the result you want quickly; you'd better come up with a better sort order.
Essentially, look at how your data will be accessed and try to sort it in a logical manner so that you can run a binary search, or something better, over it.
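Applied to the images in the question, a quick C# sketch of that date-based layout (the folder convention here is just one reasonable illustration, not a prescription):

using System;
using System.IO;

class ImageStore
{
    // e.g. D:\images\2015\11\25\<id>.jpg: keeps any one folder small
    // and makes "last week" just a handful of directories to scan.
    public static string SaveImage(string root, Guid id, byte[] data, DateTime takenUtc)
    {
        string dir = Path.Combine(root,
            takenUtc.Year.ToString("D4"),
            takenUtc.Month.ToString("D2"),
            takenUtc.Day.ToString("D2"));

        Directory.CreateDirectory(dir); // no-op if it already exists
        string file = Path.Combine(dir, id + ".jpg");
        File.WriteAllBytes(file, data);
        return file;
    }
}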
I'd highly recommend rewording your question so it is clearer though.

Do frequent INSERT/UPDATE queries compromise database safety? [closed]

Hi, I am relatively new to using databases in a web application and I am trying to develop a dynamic application for my users.
So, I was wondering how safe is it to have your application frequently (say every 2 seconds) execute an INSERT/UPDATE query from the same user.
I'm aware that INSERT/UPDATE queries are quite slow to execute (I've read it's 12 INSERTs per second), but that's not my question. My question concerns the safety of my database.
The frequency of executing INSERTs or UPDATEs against your database is in no way related to its safety. But it might impact its performance; that's possible.
I think the OP's concern is more about the robustness of the database, its ability to handle load without becoming corrupt, rather than issues of SQL injection. (Valuable comments to take note of, though.) The volumes suggested per user should be no concern.
Database integrity for any DB should be checked with regular maintenance. Make sure you are doing the following and you should have no problems with performance and reliability.
Back up your DB. Full backup and transaction log backups.
Rebuild and/or reorganize indexes
Delete old data backups and check free space often.
Check the maintenance logs for issues on your DB.
Then monitor performance for sufficient Memory, DISK I/O, and CPU.
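As for the injection concern noted above: as long as each write is a parameterized command, an INSERT every couple of seconds is routine. A sketch assuming SQL Server via System.Data.SqlClient (the table and column names are hypothetical):

using System.Data.SqlClient;

class EventWriter
{
    public static void Insert(string connectionString, int userId, string action)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var cmd = new SqlCommand(
                "INSERT INTO UserEvents (UserId, Action, CreatedAt) VALUES (@u, @a, GETDATE())",
                conn))
            {
                // Parameters keep user input out of the SQL text entirely.
                cmd.Parameters.AddWithValue("@u", userId);
                cmd.Parameters.AddWithValue("@a", action);
                cmd.ExecuteNonQuery();
            }
        }
    }
}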
Without a "with(nolock)" hint after the table name, the table being inserted into or updated is always locked.
If so, how can it perform well?

Creating a program for volunteers: how should I store the data? (XML, SQL, etc.) [closed]

Disclaimer: Yes, I am fairly new to storing data in files. But this is something I'm willing to learn, and to take the time to learn.
I am working on creating a program for someone who keeps track of volunteers and how much time they spend working, and what they are working on.
For each volunteer I will have their Address Information, email, phone, etc. I will also have what show they worked, how long they worked it for, and what position they were working.
Later, the user will be able to access a way to print out monthly reports of this information.
I am wondering if anyone is willing to give me a nudge in the right direction. How should I store all this data? I've heard of XML, SQL, JSON, and other things here and there. I need something that can handle large amounts of data, as there are about 200 volunteers right now and data will need to be constantly added to the file (or files). Are there any suggestions? If you need me to clarify something, please just ask.
Also, I am using Windows Forms Application, C#.
There are different ways to accomplish this; the most preferred would be either a SQL database (MySQL, MSSQL, SQLite, etc.), JSON storage (GZipped, for good measure; saves storage, you know), or binary storage (i.e. knowing the complete size of the data you're storing, such as how many Int32s, and reading it as bytes from a file).
I'd go with the JSON storage if you're not planning on accessing this data from more than 1 computer.
If portability is what you want, you're gonna want to go with SQL.
If you're just looking for a simple way to store data, then it'd be either JSON or binary saving.
These are probably the best choices you have for how to save data.
Edit: Seeing how you've specified large amounts of data, your best shot would be either SQL or binary. Binary, because you can jump around in files, since you know the size of each object, so you won't have to read all of them; SQL, because it has this capability built into its core. JSON simply won't be a very good choice for large amounts of data.
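A sketch of that "jump around" idea with fixed-size records; the 64-byte record layout here is made up purely for illustration:

using System.IO;

class RecordFile
{
    private const int RecordSize = 64; // fixed size: e.g. 16 Int32 fields per record

    // Reads only record N instead of parsing the whole file.
    public static int[] ReadRecord(string path, long index)
    {
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        using (var reader = new BinaryReader(fs))
        {
            fs.Seek(index * RecordSize, SeekOrigin.Begin);

            var values = new int[RecordSize / sizeof(int)];
            for (int i = 0; i < values.Length; i++)
                values[i] = reader.ReadInt32();
            return values;
        }
    }
}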
XML, you're gonna want to avoid entirely, unless you need to export data because other applications require it in XML format. XML simply uses too much storage space, due to the whole structure of the language.
SQL can handle pretty much any amount of data. It is easy for a beginner to use and there is a lot of support for it.

ElasticSearch vs SQL Full Text Search [closed]

I want to use full text search in my project... Can anyone explain to me the difference between ElasticSearch and SQL Full Text Search?
Or
why is SQL Full Text Search better (or worse) than Elastic?
Documentation, presentations, schemas...
Define "better"... sql full text search is fairly trivial to get working (indexing and query) - but it has penalties:
very little (virtually no) control over how things are indexed (what the index keys are; what the lexers/stemmers/etc are; etc)
runs on the sql server - which is usually your least scalable infrastructure
Elastic search requires more work; you need to setup and maintain a dedicated cluster of nodes, and then provide code that performs the actual index operations, which may also involve a scheduled job that works from a change-log (processing new / edited data), building the fragments to be indexed; equally, you need to then take more time building the query. But you get a lot of control over the index and query, and scalability (a cluster can be whatever size you need). If it helps any, Stack Overflow grew up on sql full text search, then moved into elastic search when the limitations (both features and performance) proved prohibitive.
The answer depends on what goal you're trying to achieve and what resources you have to reach it. SQL server fulltext search is lower admin but limited in functionalities. Elastic search is at the other end of the spectrum.
SQL server fulltext search:
can prove efficient if your data is not growing considerably and/or the schema is not changing over time
requires less effort to maintain and less of a learning curve/need for new competence
Elasticsearch:
needs data ingestion if your master DB has frequent incremental updates (Logstash and other alternatives)
scales better horizontally
ability to use advanced features to improve performance for very large data sets (e.g. routing)
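For a sense of the lower-admin end of that spectrum, here is a sketch of querying SQL Server full text search from C# (it assumes a full-text index already exists on the column; the table and column names are hypothetical):

using System;
using System.Data.SqlClient;

class FullTextSearch
{
    public static void Search(string connectionString, string term)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            // CONTAINS requires a full-text index on Articles(Body).
            using (var cmd = new SqlCommand(
                "SELECT Title FROM Articles WHERE CONTAINS(Body, @term)", conn))
            {
                cmd.Parameters.AddWithValue("@term", term);
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine(reader.GetString(0));
                }
            }
        }
    }
}

With Elasticsearch you would instead stand up a cluster and issue queries to it over HTTP, trading that setup cost for the control and scalability described above.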
