I have thousands of images in a directory, divided into subfolders. I want to take all these images out of the file system and put them in a database. I don't think this kind of data is a good fit for a normal database like SQL Server. Is there a database out there that is good for holding thousands, if not millions, of small high-definition thumbnails? I would like to query this database by ID and have it return the image.
There are a few guidelines:
If the files are small (say, under 4 KB) and you have a lot of them (say, over 10,000), you could see an improvement from storing the files in the database;
If you want to simplify backup by keeping everything in one database, and get replication for free (assuming you already have replication set up for your normal database), the database has an advantage;
If your files are large (say, over 100 KB), storing them in the database will very likely not be a good idea (SQL databases are not built for that). If you still want to store them in a database, look at something else (CouchDB, etc.);
One big disadvantage of storing them in the database is that it becomes more difficult to access the images. Reading images from disk benefits from file system caching and from optimized paths for streaming files directly from disk over the Internet. You lose all of that, which can cause problems in some situations (again, especially when your files are large);
When your files do not change often (and images usually don't), a database is rarely a good fit. Databases are good at storing and mutating small pieces of data; large static blobs do not fit this model well.
The filesystem is usually the best way to store big blobs of binary data, like images, that can be easily identified by an ID, URL, or anything unique.
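As a concrete illustration of that filesystem approach, here is a minimal C# sketch of a store keyed by ID. The directory fan-out scheme and the .jpg extension are assumptions made for illustration, not part of the answer above:

```csharp
using System.IO;

public class ImageStore
{
    private readonly string _root;

    public ImageStore(string root)
    {
        _root = root;
    }

    // Fan the files out across subdirectories so no single folder
    // ends up holding millions of entries, e.g. 000/123/123456789.jpg.
    private string PathFor(long id)
    {
        return Path.Combine(_root,
                            (id / 1000000000).ToString("D3"),
                            (id / 1000000 % 1000).ToString("D3"),
                            id + ".jpg");
    }

    public void Save(long id, byte[] data)
    {
        string path = PathFor(id);
        Directory.CreateDirectory(Path.GetDirectoryName(path));
        File.WriteAllBytes(path, data);
    }

    public byte[] Load(long id)
    {
        return File.ReadAllBytes(PathFor(id));
    }
}
```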
I'm implementing a SOAP web service for sending thousands of emails and storing thousands of XML response records in a local database (C#/.NET, Visual Studio 2012).
I would like to make my service consumer as fast and lightweight as possible.
I need to know some of the considerations. I always have a feeling that my code should run faster than it does.
E.g.
I've read that using datasets increases overhead. So should I use lists of objects instead?
Does using an ORM slow my code down?
Is a console application faster than a WinForms app, since the user needs no GUI to deal with? There are simply some parameters sent to the app that invoke some methods.
What are the most efficient ways to deal with a SOAP web service?
Make it work, then worry about making it fast. If you try to guess where the bottlenecks will be, you will probably guess wrong. The best way to optimize something is to measure real code before and after.
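To make "measure real code" concrete, here is a minimal sketch using the standard .NET Stopwatch; the workload method name is a placeholder for whatever you are actually timing:

```csharp
using System;
using System.Diagnostics;

class Program
{
    static void Main()
    {
        // Time the real work instead of guessing where the time goes.
        var sw = Stopwatch.StartNew();
        SendEmailsAndStoreResponses();   // placeholder for the real workload
        sw.Stop();
        Console.WriteLine("Elapsed: {0} ms", sw.ElapsedMilliseconds);
    }

    static void SendEmailsAndStoreResponses()
    {
        // ... the code under test ...
    }
}
```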
Datasets, ORMs, WinForms apps, and console apps can all run plenty fast. Use the technologies that suit you, then tune for speed if you actually need to.
Finally, if you do have a performance problem, changing your choice of algorithms to better suit your problem will likely yield a much greater improvement than changing any of the technologies you mentioned.
Based on my personal experience with SOAP, in this scenario your main concern should be how you retrieve this information from your database (procedures, views, triggers, indexes, etc.).
The difference between a console app, a WinForms app, and a web app isn't that relevant.
After the app is done, run a thorough stress test on it to see where your performance problem lies, if it exists at all.
I'm looking for an existing datamap and/or MySQL script that can migrate data between TYPO3 and Joomla. Obviously, both CMSes use MySQL, which is great, but I'm wondering if there is already a document/script that takes the elements from the TYPO3 database and puts them in the Joomla database.
I'm planning to write my own migration program in C# so if anyone has some code snippets for that, it would also be helpful.
Forget it; there are too many differences to make it possible. TYPO3 instances are typically built with many different extensions, pages, content elements, etc. And although Joomla's initial structure is quite simple (as far as I remember), as you add new plugins it changes in many different ways. The conclusion is simple: if somebody wrote such a 'mapper' between his TYPO3 and his Joomla, it most probably won't work for any other combination.
How many pages are there in the TYPO3 site? If fewer than 100, I wouldn't waste time on any program; just copy/paste them in ordinary editing mode.
If more, I think the best solution would be to write a TYPO3 extension (PHP) that gets all the required data from each page using TYPO3's parsers, configs, etc., and converts it to JSON (or even plain SQL INSERT statements) with a structure Joomla understands. Keep in mind that many elements look quite different in the database than they do on the client side. The best example is internal links, which are saved in the DB just as the IDs of the pages they point to. You need to resolve these on the TYPO3 side, or you'll need to learn how relations are built in TYPO3's database to do the same task in your C# application.
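If you do go the C# route, here is a rough sketch of what resolving those internal links might look like. The table and column names match a typical TYPO3 4.x schema, and the `<link 123>` syntax is how that era's rich-text editor stored page references, but verify all of this against your own installation:

```csharp
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;
using MySql.Data.MySqlClient;

class Typo3LinkResolver
{
    // Load uid -> title for every live page in the TYPO3 database.
    static Dictionary<long, string> LoadPageTitles(string connectionString)
    {
        var titles = new Dictionary<long, string>();
        using (var conn = new MySqlConnection(connectionString))
        {
            conn.Open();
            var cmd = new MySqlCommand(
                "SELECT uid, title FROM pages WHERE deleted = 0", conn);
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    titles[Convert.ToInt64(reader["uid"])] =
                        reader["title"].ToString();
                }
            }
        }
        return titles;
    }

    // Replace "<link 123>text</link>" with an ordinary anchor. The href
    // built from the page title is purely illustrative; a real migration
    // would map the uid to the corresponding Joomla article URL.
    static string ResolveLinks(string bodytext, Dictionary<long, string> titles)
    {
        return Regex.Replace(bodytext, @"<link (\d+)>(.*?)</link>", m =>
        {
            long uid = long.Parse(m.Groups[1].Value);
            string target = titles.ContainsKey(uid) ? titles[uid] : "unknown";
            return string.Format("<a href=\"/{0}\">{1}</a>",
                                 target, m.Groups[2].Value);
        });
    }
}
```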
I'm currently working on a to-do-list system that acts like a calendar: it stores your tasks and lets you check them off once you are done with them. We should also be able to undo changes we have made.
Currently, my project mate suggests that we store the data in separate files named by date; when we want to find a particular task, we would just check whether the file exists, then edit and manipulate the files directly as needed.
However, I feel that it is better to store the data in one large file and load it into memory (possibly as a list of tasks) when our program is executed. I can't explain why, though.
Does OOP come into the picture when dealing with this?
Sorry if I am a bit confused as I am still learning.
This is a perfect task for a database solution. I suggest you use the SQL Server database that comes with Visual Studio for this.
Store each task as a row in a table; select dates and subjects for the calendar view, and all of a task's values when editing. VS has some pretty good tools for building such an application in a few minutes (for an experienced user).
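A minimal sketch of what that might look like with plain ADO.NET; the table and column names are made up for illustration:

```csharp
// Assumed schema:
//   CREATE TABLE Tasks (
//       Id      INT IDENTITY PRIMARY KEY,
//       Subject NVARCHAR(200) NOT NULL,
//       DueDate DATE NOT NULL,
//       Done    BIT NOT NULL DEFAULT 0
//   );
using System;
using System.Data.SqlClient;

class TaskStore
{
    private readonly string _connectionString;

    public TaskStore(string connectionString)
    {
        _connectionString = connectionString;
    }

    // Dates and subjects for one month of the calendar view.
    public void PrintMonth(int year, int month)
    {
        using (var conn = new SqlConnection(_connectionString))
        {
            conn.Open();
            var cmd = new SqlCommand(
                "SELECT DueDate, Subject, Done FROM Tasks " +
                "WHERE YEAR(DueDate) = @y AND MONTH(DueDate) = @m " +
                "ORDER BY DueDate", conn);
            cmd.Parameters.AddWithValue("@y", year);
            cmd.Parameters.AddWithValue("@m", month);
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0:d}  [{1}] {2}",
                        reader.GetDateTime(0),
                        reader.GetBoolean(2) ? "x" : " ",
                        reader.GetString(1));
                }
            }
        }
    }
}
```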
Handling files is always a mess when several people need to edit the data at the same time.
The best practice always depends on the work you are doing, and for a to-do list you have to perform multiple operations on the data.
So it is better to use client-side storage such as an .sdf file (SQL Server Compact) instead of making your own files: the .sdf file works as a database, and it is lightweight yet handles large amounts of data more easily than flat files.
Firstly, this is a persistence problem and should be solved using well-known patterns. You could use a database and the repository pattern. NoSQL databases are also an option, as they are easy to set up and lack the overhead associated with SQL databases.
But if flat files are your choice, then holding all the data in memory has a flaw: when an exception occurs or the program shuts down, you lose all your data. Persist as you go, using create/read/update/delete (CRUD) cycles.
That way you write out small chunks as you go, and you only lose a small amount of data if the program crashes.
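A minimal sketch of that idea, assuming a simple append-only log; the file name and record format here are invented:

```csharp
using System;
using System.IO;

class TaskLog
{
    private const string LogFile = "tasks.log";

    // Append every change immediately, so a crash loses at most
    // the single entry being written.
    public static void Append(string action, string task)
    {
        // AppendAllText opens, writes, and closes the file on each
        // call, so the record is on disk before the method returns.
        File.AppendAllText(LogFile,
            string.Format("{0:o}\t{1}\t{2}{3}",
                DateTime.UtcNow, action, task, Environment.NewLine));
    }
}

// Usage: TaskLog.Append("ADD", "Buy milk");
//        TaskLog.Append("DONE", "Buy milk");
```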
I'm writing a C# (WPF) application for Windows. I need to store data for this application (in a database, I think), and I need to protect this data from modification. I plan to distribute this application, so filesystem protection doesn't fit this situation. I would also like to use a NoSQL database. My question is: which NoSQL database supports data protection and can be installed with minimal effort as a database embedded in the application?
Update: So, which is the best single-user NoSQL DB, security aside, for a redistributable WPF application?
When the data is on the user's hard drive, there is no way to prevent the user from accessing it.
Storing it in an obscure format, like some uncommon database, wouldn't make it much harder. Even encrypting the database wouldn't be an insurmountable barrier: you would have to store the decryption key somewhere inside your application, where a determined hacker will find it if they search long enough.
When you really want to protect your data, there is no way around storing it on your own server and letting the program access it via the Internet. I would recommend using a web service for that (a client-side sketch follows the list below).
This will, of course, mean additional cost for you, as you have to pay for the server. But you also get some additional perks:
You can give each user their own username and password. That makes it a lot harder to pirate your product (too many IPs from different networks using the same username = likely software pirates).
Updating your database doesn't require any deployment on the users' machines.
You can write "cloud" all over your marketing material :)
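For illustration, here is a hedged sketch of the client side of that server-based approach, using the standard HttpClient (available since .NET 4.5). The endpoint URL and the use of HTTP Basic auth are placeholders for whatever scheme you choose:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class ProtectedDataClient
{
    public static async Task<string> FetchAsync(string user, string password)
    {
        using (var client = new HttpClient())
        {
            // Per-user credentials sent with every request over HTTPS;
            // the server validates them and can log the calling IP.
            var token = Convert.ToBase64String(
                Encoding.UTF8.GetBytes(user + ":" + password));
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Basic", token);

            return await client.GetStringAsync(
                "https://example.com/api/data");   // placeholder endpoint
        }
    }
}
```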
I am designing a CMS in C# and need to decide where to save configuration settings for the site. I'm also considering defining my base HTML templates and then processing them server-side to create the pages ahead of time.
So what is generally faster / lower overhead for the server: reading an XML file, or querying the same information from a local database?
It seems unlikely to me that this will be a bottleneck in your code. How often are you planning on reading configuration settings? Configuration is typically fairly small, and read infrequently. There are more important things to consider about where and how to store it:
Bootstrapping: you can probably rely on your app having access to the local file system, and you can hard-code a configuration filename... but if all your configuration is in the database, where do you configure which database to talk to?
Ease of tweaking and deployment: editing a configuration file on the server by hand may be faster than making a change in the database... but if you have multiple servers, do you want to tweak a file on every one of them?
Simplicity of code reading/processing the configuration: what would your configuration look like? Is it naturally hierarchical? If so, XML is likely to be a good fit. If it's more like a set of name/value pairs, then a simple table is a good fit. Of course, you can store XML within a database - you don't have to tie the storage-location and storage-format decisions together. Editing an XML document in the database may well be harder than either editing an XML file or changing individual values... but you can always make life easier with tools for this sort of thing.
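Whichever location you choose, reading a small XML configuration once at startup is only a few lines with LINQ to XML. The file layout and element names in this sketch are invented for illustration:

```csharp
using System.Xml.Linq;

class SiteConfig
{
    public string ConnectionString { get; private set; }
    public string TemplateRoot { get; private set; }

    public static SiteConfig Load(string path)
    {
        var doc = XDocument.Load(path);   // e.g. "site.config.xml"
        var root = doc.Root;              // <siteConfig> ... </siteConfig>
        return new SiteConfig
        {
            ConnectionString = (string)root.Element("connectionString"),
            TemplateRoot     = (string)root.Element("templateRoot")
        };
    }
}
```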
Just for server settings - it really doesn't matter. You're only going to read them once. Even if it took a couple of seconds, it would still be unnoticeable.
First measure, then optimize.
How long is a piece of string? I can write a database query that's much slower than reading the same data from an XML file, but I can also write an XML file that's much slower to query than reading a database.
I would say if you're displaying "mostly" static content and you're worried about performance, then it's probably a better idea to implement it in whatever way you think would be the simplest, then use a caching mechanism to make it performant - this way, the first access might be "slow" but subsequent accesses will be much, much faster.
Typically, if you're generating HTML content, write the completed HTML to disk and send that to the browser, instead of populating it from the database/XML files on every request. If your backend process deletes the cached files whenever it updates the content, the server can automatically detect that a file doesn't exist and re-generate it.
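A minimal sketch of that cache-to-disk pattern; GeneratePage stands in for your real template-rendering code:

```csharp
using System.IO;

class PageCache
{
    public static string GetPage(string name)
    {
        string cached = Path.Combine("cache", name + ".html");
        if (!File.Exists(cached))          // regenerated automatically after
        {                                  // the backend deletes the file
            Directory.CreateDirectory("cache");
            File.WriteAllText(cached, GeneratePage(name));
        }
        return File.ReadAllText(cached);
    }

    static string GeneratePage(string name)
    {
        // ... populate the template from the database/XML files ...
        return "<html>...</html>";
    }
}
```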
This depends on the strategy you are going to employ to access the data.
If you go the database route, are you going to cache your results? There could be a lot of network chatter if you constantly pull the details out of the DB.
In terms of simplicity, you really can be agnostic to the data source by using LINQ.
Faster?
Once the stuff is in memory there should be no difference. Configuration information, as another poster pointed out, is typically fairly static. Why not create a console app and quantify the differences using a profiler?
http://www.red-gate.com/products/ants_performance_profiler/
If your most important point is speed, then use a database; XML is very slow.
But if your data is very "complicated", or has many different relations and attributes, consider using XML.
I'd just use a database by default here.
It's faster and requires less code.