I have a lot of small pictures to store, and they are pictures users can change very often.
The images average 50 KB - 150 KB. Let's say I have 5,000 of these images. Will FTP get unmanageable in the end, or will an MS SQL database take too much load, given that a normal web page might use 25 of these images?
What technique should I choose? By the way, in my case I'm using a hosting solution consisting of a web farm.
You might want to look at hosting these images on a CDN or similar. Take what SO uses as a good example: imgur.
I'm thinking the performance and user experience would take you a long time to match. You might want to store some reference to the images as well.
Of course, this may not be an option for you; in that case I'd still put your images on a subdomain of your site that is set up to deliver static content. Again, look at how SO does it for a detailed example.
My team and I (all students, this is a college project) are creating a piece of software that is basically two frontends (a website and an app) connected to a backend through an API. We have a user entity in this software, and one of its properties is its image. Since we expect to have a lot of users, we don't think storing the images directly in the database is the best approach.
So in this kind of case, what should we do? What is the best practice? (All the components are going to be deployed, so local storage isn't really an option.)
(We had in mind to use some third-party service where we could upload the images and just store the link in the DB, but is this possible? Is this fine?)
You can consider using cloud storage like
Amazon S3 (https://aws.amazon.com/s3/)
Google Cloud Bucket (https://cloud.google.com/storage/docs/creating-buckets)
Microsoft Azure Storage (https://learn.microsoft.com/en-us/azure/storage/common/storage-introduction)
They all offer some free capacity for experiments/learning, but you also need to be aware of the fees charged once you pass certain usage limits. They run on cloud infrastructure, so you don't need to worry much about performance.
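If you go the cloud route, the usual pattern is exactly what you described: upload the file to the storage service and keep only the resulting URL (or blob name) in your database. Here is a minimal sketch assuming the Azure Blob Storage SDK (Azure.Storage.Blobs); the container name, user ID, stream and connection string are placeholders for your own values:

    using Azure.Storage.Blobs;

    // Assumes the Azure.Storage.Blobs NuGet package; names below are placeholders.
    var container = new BlobContainerClient(connectionString, "user-images");
    await container.CreateIfNotExistsAsync();

    BlobClient blob = container.GetBlobClient($"{userId}.jpg");
    await blob.UploadAsync(imageStream, overwrite: true);

    // Store only the URL (or just the blob name) in your users table.
    string imageUrl = blob.Uri.ToString();

The other providers (S3, Google Cloud Storage) follow the same upload-then-store-the-link idea, only with their own SDKs.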
Another choice is to build your own image server and serve images directly to your website. The benefits could be:
Having the same connection directly to your website via your own domain (sometimes faster than the cloud)
Being able to flexibly control image sizes/types according to the incoming requests. For example, you can convert your images to WebP or compress them to fit users' devices
Of course, with this approach you need to know how to build and run an image server.
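As a rough illustration of that second benefit, here is a sketch of resizing an uploaded image and converting it to WebP, assuming the SixLabors.ImageSharp library; the paths and target width are placeholders:

    using SixLabors.ImageSharp;
    using SixLabors.ImageSharp.Processing;

    // Assumes the SixLabors.ImageSharp NuGet package; paths and width are placeholders.
    using (Image image = Image.Load("uploads/photo.jpg"))
    {
        // Resize to a width of 800px, keeping the aspect ratio (height 0 = auto).
        image.Mutate(x => x.Resize(800, 0));
        image.SaveAsWebp("cache/photo.webp");
    }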
There are many different cloud storage services that you can use to host images for your project. The most popular of them are ImgBB and Imgur. I've been using them for a long time and they work great. Both have APIs which are very simple to work with. If it's your first time using them, you might want to watch a tutorial on YouTube. Hopefully this helps.
I have a situation where I need to display an image to an end user using the method below.
When the user requests the image from a URL, the C# code-behind should start by looking in an Azure blob/CDN to see if the image is there. If the image is there and less than x days old, it should pass the image to the end user in the most efficient way, preferably without spending too many resources (memory & CPU) passing it to the user.
If the image is not there, or is more than a week old, the image will be generated based on the parameters supplied in the URL the user requested, after which it is stored in the blob/CDN and displayed to the end user.
My problem is how I can, in the most efficient way on Azure, generate a lot of images simultaneously and still serve the data from the CDN, while first checking whether the image is too "old" and needs to be regenerated, or isn't there and needs to be generated, before it is displayed to the user. The second I pass the image through the C# code, I lose the CDN's strengths.
There are many ways to do things on azure. You will need to look through the extensive azure documentation and determine what will best fit your needs.
If you want to get something working quickly, Azure blobs can be served directly to a client (straight to their browser) - see the documentation, but keep in mind that anonymous access is not secure.
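One way to keep the CDN doing the heavy lifting is to let your C# endpoint only check and (re)generate the blob, and then redirect the browser to the blob/CDN URL instead of streaming the bytes itself. A rough sketch using Azure.Storage.Blobs; the container name is a placeholder, and the generateImage delegate stands in for your own generation code:

    using System;
    using System.Threading.Tasks;
    using Azure.Storage.Blobs;
    using Azure.Storage.Blobs.Models;

    // Returns the CDN URL to redirect the browser to, regenerating the blob first if needed.
    // connectionString, cdnBaseUrl and generateImage are placeholders for your own code/config.
    static async Task<string> EnsureImageAsync(
        string connectionString, string cdnBaseUrl, string blobName, Func<byte[]> generateImage)
    {
        var blob = new BlobClient(connectionString, "images", blobName);

        bool regenerate = true;
        if ((await blob.ExistsAsync()).Value)
        {
            BlobProperties props = await blob.GetPropertiesAsync();
            regenerate = DateTimeOffset.UtcNow - props.LastModified > TimeSpan.FromDays(7);
        }

        if (regenerate)
        {
            await blob.UploadAsync(new BinaryData(generateImage()), overwrite: true);
        }

        // The controller can now issue an HTTP redirect to this URL, so the bytes
        // are served by the blob/CDN endpoint rather than by your C# code.
        return cdnBaseUrl + blobName;
    }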
I am creating an application which has a database that users can't edit. It just holds data and shows it to the user. I found that I don't have to use local storage; I can just add the database to the project and it is read-only. So I did that and everything is OK. Now I want to add images to my app and I am not sure which way is better. I can add every image to a folder and store the path to the image in the database; that is very easy and I can do it now. Or, I found that I can store the image in the database as an image (byte[]). Which is better? Would the images in the database be smaller? Would loading the images be faster? If images in the database are the better solution, how can I easily add images to my existing database? Is there any article on doing this in WinForms? Thanks
Edit:
I'm glad that my question has so many answers and opinions. I want to explain my needs a bit more. My application will have about 150 pictures, each sized about 150px by 100px. I want the app to work without an internet connection. It could connect for updates, but that's all. Again, thanks for all the opinions :)
For our application, we found a hybrid approach worked best. We configured our SQL environment to support FILESTREAM and then imported all of our images. That gives us the flexibility of having the images 'in SQL' while still storing the actual image data on disk. It's a fast solution that may work well for you, too.
There are a lot of 'ifs', 'buts' and 'maybes' about where you put your images, and I don't think there is a right or wrong way.
One thing worth mentioning is that if the application is business critical with high availability, I would store them in the database, simply because they can be backed up with the rest of the data. Databases can be mirrored etc., so having all the images treated as 'data' can be beneficial. Also, if things get big and a web farm with load balancers etc. is employed, it helps when the images live in one place.
For me, I'd go with the database. But it all really depends on the scale of your application.
I personally would store all the images in isolated storage, and in the database I would store the path to each image.
I would store all the images as a byte[] in the isolated storage.
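A minimal sketch of that approach using the standard .NET isolated storage APIs; the file name is a placeholder, and the database would then hold only that name:

    using System.IO;
    using System.IO.IsolatedStorage;

    // Saves image bytes to isolated storage; the database stores only fileName.
    static void SaveImage(string fileName, byte[] imageBytes)
    {
        using (IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForAssembly())
        using (var file = new IsolatedStorageFileStream(fileName, FileMode.Create, store))
        {
            file.Write(imageBytes, 0, imageBytes.Length);
        }
    }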
Nothing wrong with the answers already here but it really depends on what you are trying to accomplish and how it is currently set up.
Best would be to use a content delivery network for static images to ease the load of your web server.
You can serve them from the database or as resource files in the file structure, to your liking. If the images are static, don't forget to add appropriate caching, which for static content would be far-future expiry.
If you're rendering images I would definitely keep the images in the database to make it possible to access the images from any number of web servers.
If you want, you can develop this further by keeping the most recently used images in memory or on the local filesystem, so you don't have to fetch the file from the database every time.
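The in-memory side of that can be a simple cache in front of the database lookup. A sketch using System.Runtime.Caching, where LoadImageFromDatabase is a placeholder for your own data access code:

    using System;
    using System.Runtime.Caching;

    class ImageCacheExample
    {
        static readonly MemoryCache Cache = MemoryCache.Default;

        public static byte[] GetImage(int imageId)
        {
            string key = "img:" + imageId;
            if (Cache.Get(key) is byte[] cached)
                return cached;

            byte[] bytes = LoadImageFromDatabase(imageId);
            // Keep it in memory for 30 minutes so repeat requests skip the database.
            Cache.Set(key, bytes, DateTimeOffset.UtcNow.AddMinutes(30));
            return bytes;
        }

        // Placeholder for your real data access code.
        static byte[] LoadImageFromDatabase(int imageId) => new byte[0];
    }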
Web folder. Also think about loading them remotely from a cloud service like Amazon S3 to free up access to your own server.
The best option is to store your images in a web folder; if you save them in the DB, you will waste time retrieving them from the DB because it takes more time.
I am creating a site whose content is dynamic and has images in it.
What/How much performance hit will my DB (MSSQL) take if I save content/Images in DB?
I am just trying to understand what kind of problems I may run into.
I appreciate any responses.
Thanks!
If you keep content (e.g. images) outside of the database, you can let IIS serve this content directly without calling ASP.NET at all (and as a consequence, no database access is needed).
You can even put static content on a different server if you have a huge load (like here on StackOverflow).
So if you need to scale in any way, keep static content outside of both ASP.NET and database.
In the company where I'm employed we are using a custom-made CMS.
It renders content and controls for a page dynamically. The content is stored in one table for all pages (each page has one main content block), and other tables store information regarding the UserControls, the path to each one, and which properties should be set to which values via reflection.
The performance is good, even for, let's say, 10 dynamically created controls. Our biggest client's page has about 70k hits a day and there is no performance problem. The page renders really fast.
Storing the images in your database can also work.
Just keep in mind that you need to use server-side caching for your images (e.g. get them via a generic handler *.ashx and use caching there) and hope that your image URL gets recognized for client-side caching.
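As a sketch of what such a handler might look like (LoadImageFromDatabase is a placeholder for your own data access or server-side cache; the response headers are what enable client-side caching):

    using System;
    using System.Web;

    public class ImageHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            int id = int.Parse(context.Request.QueryString["id"]);
            byte[] bytes = LoadImageFromDatabase(id); // placeholder for your DB/cache lookup

            context.Response.ContentType = "image/jpeg";
            // Ask browsers and proxies to cache the response so repeat hits never reach the DB.
            context.Response.Cache.SetCacheability(HttpCacheability.Public);
            context.Response.Cache.SetExpires(DateTime.UtcNow.AddDays(30));
            context.Response.BinaryWrite(bytes);
        }

        public bool IsReusable { get { return true; } }

        static byte[] LoadImageFromDatabase(int id) { return new byte[0]; } // placeholder
    }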
If you want to be sure, expose your images directly on a dedicated image application. (e.g. www.foobar.com is your URL, then you can create images.foobar.com and store all your images there)
I would definitely advise storing often-used images there, like images for the layout, or user pictures (if you are building a forum, or some kind of web application that uses several pictures all the time). But there is nothing wrong with storing rarely used pictures in the database (user-related uploads et cetera).
If you store your images in the DB, the database size will increase and this will result in slower DB queries. It is better to store them on other media and have the DB only keep references to them.
I am implementing an eCommerce application using ASP.Net. I would like to know whether a custom Google search is sufficient, or, if we plan to implement our own search functionality, how we should go about doing it.
Ideas, suggestions, and best practices are most welcome.
Regards,
Abdel Olakara
If you don't plan on using Google Search then you really have 2 options:
If you are using SQL Server, you can put all of your site text into ntext or varbinary fields so that it is searchable. Then, if you have files like PDFs etc., you can put the files into a table as varbinary and create a Full Text Catalog to search them. For PDFs you will need to have an iFilter installed; it is part of the free Adobe Reader package. There are other iFilters. Check out http://www.ifilter.org/ for more info on them.
If you are using ASP.NET hosted on a Windows server, you can use the Windows Indexing Service and put any of the data that you want into text files or any other file format. You might still need iFilters for those formats.
I would suggest option 1 if you don't go with Google. It can be a little more complex, but option 2 can have the issue that everything starts to look like it is being saved in one giant heap. You can also do some combination of 1 and 2.
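To give an idea of option 1: once a full-text catalog exists on the relevant columns, the query side is plain ADO.NET. The table and column names below are only illustrative, and a full-text index on Products(Description) is assumed:

    using System;
    using System.Data.SqlClient;

    string connectionString = "<your connection string>"; // placeholder
    const string sql = @"SELECT ProductId, Name
                         FROM Products
                         WHERE CONTAINS(Description, @searchTerms)";

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("@searchTerms", "\"coffee\" AND \"organic\"");
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                Console.WriteLine("{0}: {1}", reader["ProductId"], reader["Name"]);
        }
    }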
You might like to look at Lucene.NET http://lucene.apache.org/lucene.net/
I have spent many years implementing search engines and using an established 3rd Party tool like Lucene will save you a lot of heartache. There are many, many gotchas and edge cases with searching. These have been dealt with to a large degree in Lucene.
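For orientation, indexing and querying product text with Lucene.NET looks roughly like this. This is a sketch against the 4.8 beta API (names differ slightly in older 3.x releases), and the field values are made up:

    using System;
    using Lucene.Net.Analysis.Standard;
    using Lucene.Net.Documents;
    using Lucene.Net.Index;
    using Lucene.Net.QueryParsers.Classic;
    using Lucene.Net.Search;
    using Lucene.Net.Store;
    using Lucene.Net.Util;

    const LuceneVersion version = LuceneVersion.LUCENE_48;
    var analyzer = new StandardAnalyzer(version);

    using (var dir = FSDirectory.Open("product-index"))
    {
        // Index one product (values are made up).
        using (var writer = new IndexWriter(dir, new IndexWriterConfig(version, analyzer)))
        {
            var doc = new Document
            {
                new StringField("id", "42", Field.Store.YES),
                new TextField("name", "Stainless steel travel mug", Field.Store.YES)
            };
            writer.AddDocument(doc);
        }

        // Query the index.
        using (var reader = DirectoryReader.Open(dir))
        {
            var searcher = new IndexSearcher(reader);
            var query = new QueryParser(version, "name", analyzer).Parse("travel mug");
            foreach (ScoreDoc hit in searcher.Search(query, 10).ScoreDocs)
                Console.WriteLine(searcher.Doc(hit.Doc).Get("name"));
        }
    }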
As I read it, Custom Google Search is about searching web pages in your site. Contrast this with searching database contents such as lists of products or reviews of items.
What are your requirements? My guess is that when I go to an eCommerce site (e.g. Amazon or something such as TripAdvisor) I want to search content that is stored in databases. So I'm doubtful that this particular Google capability fits my expectations for eCommerce.