I am creating a Zillow-like iOS app, using Azure SQL (S0 tier) and C# for my backend. I store user data, including each user's avatar, and the trips the person has been on (in another table), where each user can have multiple trip pictures. I am using varbinary(max) to store those images, so each user entity has an average of 3 pictures.
My C# app retrieves all users with all their images and trip info, using lazy loading in my LINQ to SQL query.
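Roughly, the retrieval looks like this (a simplified sketch; the DataContext, entity, and property names are illustrative, not my actual schema):

    using System.Linq;

    // Simplified sketch of the current retrieval path.
    using (var db = new AppDataContext())    // illustrative LINQ to SQL context
    {
        var users = db.Users.ToList();       // every user row, avatar blob included

        foreach (var user in users)
        {
            var avatar = user.Avatar;         // varbinary(max) already pulled over the wire
            var trips  = user.Trips.ToList(); // lazy load: an extra round trip per user,
                                              // each trip dragging its picture blobs along
        }
    }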
The issue is naturally the time it takes to load all that data before the user can consume it in the iPhone client (written in Swift). The end user needs all the data to scroll through the images: he first loads a user, to see his images and read about him, and then scrolls up and sees the trips that user has gone on. It won't make sense for him to see only avatars and then wait for all the trips, for example, every time he calls.
What strategy should I use to minimize the load time?
The thoughts I have had so far:
Use Azure Blob storage for the images and consume them right from my Swift clients, to see if it will be any faster. I ditched that option because of the extra, slow retrieval of files: I would have to first access the image file from my C# code and turn it into binary so I could transmit it to my iPhone client app.
Instead of LINQ to SQL, use good old ADO.NET to cut out the extra LINQ to SQL layer. I didn't see much difference, not even a second.
Optimize my LINQ query by pushing as much as possible into the where clause, avoiding loops, and making it all one select statement. The SQL emitted by LINQ to SQL looks right; it didn't change much, because the real issue is the huge binary blobs being retrieved.
As I hinted above, use silent calls: load only the user-related image first, and while the user reads the summary, start loading all those trips behind the scenes (roughly the shape sketched below). But the user will quickly want to check the trips and will be confronted with having to wait again to see them, and I would have to initiate new calls to the DB, which is expensive. Very undesirable!
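For illustration, the staged approach looks roughly like this (a hypothetical Web API 2 controller; the context, entity, and property names are made up):

    using System.Linq;
    using System.Web.Http;

    public class UsersController : ApiController
    {
        private readonly AppDataContext db = new AppDataContext(); // illustrative context

        [HttpGet]
        public IHttpActionResult GetUserSummary(int id)
        {
            // Project only what the first screen needs; trip blobs are never touched.
            var summary = db.Users
                .Where(u => u.Id == id)
                .Select(u => new { u.Id, u.Name, u.Bio, u.Avatar })
                .SingleOrDefault();

            return summary == null ? (IHttpActionResult)NotFound() : Ok(summary);
        }

        [HttpGet]
        public IHttpActionResult GetUserTrips(int id)
        {
            // The "silent" follow-up call, fired while the user reads the summary.
            var trips = db.Trips
                .Where(t => t.UserId == id)
                .Select(t => new { t.Id, t.Title, t.Picture })
                .ToList();

            return Ok(trips);
        }
    }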
What other suggestions should I try?
I am developing a lightweight web app in ASP.NET with C#, and I would like to store some data, such as site analytics.
Instead of using a SQL Server database, could I use a JSON file? I would open the JSON file and write into it when done. Could there be problems with simultaneous connections?
Is this standard/common practice?
Any help will be highly appreciated.
It is probably a VERY bad idea. I would only do this if it were some text file to display a wee bit of information.
If you need updating? Then what happens if two users are on the site? The last person to update will overwrite the text-based JSON file.
I suppose you could write out an "id" and a JSON file for each record you edit, and that way saving one record's value would not overwrite the others. But putting more than one record in a JSON file, with users editing those values? No, it will not work, since a simple save of that text JSON file will overwrite any other changes made by any other user who happens to hit the same button, as the sketch below shows.
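To make the failure mode concrete, here is a minimal sketch of the naive read-modify-write pattern (the file name and record shape are made up for illustration):

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Text.Json;

    public class PageHit
    {
        public string Path { get; set; }
        public DateTime At { get; set; }
    }

    public static class Analytics
    {
        // Naive read-modify-write on one shared JSON file. If two requests
        // interleave between the read and the write, the last WriteAllText
        // wins and the other request's hit is silently lost.
        public static void RecordHit(string path)
        {
            const string file = "analytics.json"; // illustrative file name

            var hits = File.Exists(file)
                ? JsonSerializer.Deserialize<List<PageHit>>(File.ReadAllText(file))
                : new List<PageHit>();

            hits.Add(new PageHit { Path = path, At = DateTime.UtcNow });
            File.WriteAllText(file, JsonSerializer.Serialize(hits));
        }
    }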
It is VERY hard to find web site hosting, even the super-cheap kind at less than $10 per month, that does not include a database server system. And be it MySQL, SQL Server, PostgreSQL or others, they all have free versions you can use.
And I suppose you could consider the file-based SQLite. That would be even better. While it is not considered thread safe, it can work if you only have a few users, say 2-4, working at the same time.
Because you have OH SO MANY choices here, the only reason to use a text-based JSON file is if no other options exist, and boatloads of options exist in nearly all cases.
And if some database were not available? Then I would simply include SQLite in the project and use that.
I'm not super experienced, but I would like to explain why I would never do that.
First of all, anything (if you are brave enough) can be a database, as long as it is some kind of file that gives persistency to your data. Databases are basically single-purpose software optimized for storing data. The main problem with your solution is that you would need to read the file and load it into memory as an object, write data to it like you would with a static factory object, and then serialize it back to JSON when you are done. Personally I don't like this idea, because I think it is very prone to human error (like deleting the file by accident during maintenance) and it can get harder to debug once it accumulates a sizeable chunk of data. There are very lightweight data-persistence solutions that build on SQLite, a database for small applications; PocketBase is one. Since you are already developing a backend, it would take you next to no effort to add a little table to store the analytics.
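As a sketch of how little effort that is, assuming the Microsoft.Data.Sqlite package and a made-up table name:

    using System;
    using Microsoft.Data.Sqlite;

    public static class AnalyticsStore
    {
        // One small table in a single local file; table and column
        // names are made up for illustration.
        public static void RecordHit(string path)
        {
            using var conn = new SqliteConnection("Data Source=analytics.db");
            conn.Open();

            var create = conn.CreateCommand();
            create.CommandText =
                "CREATE TABLE IF NOT EXISTS Hits (Path TEXT NOT NULL, At TEXT NOT NULL)";
            create.ExecuteNonQuery();

            var insert = conn.CreateCommand();
            insert.CommandText = "INSERT INTO Hits (Path, At) VALUES ($path, $at)";
            insert.Parameters.AddWithValue("$path", path);
            insert.Parameters.AddWithValue("$at", DateTime.UtcNow.ToString("o"));
            insert.ExecuteNonQuery();
        }
    }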
I am trying to store a software "Calendar" (complex logs that are saved and browsed by date).
I tried using both .ini and XML, but when the application read the entire file to find the info for 1 specific day out of the 100 or so days, it took almost 9 seconds to get 5 variables out of 500. The actual file might eventually hold more than 40 variables per day.
Also, I would rather not make a file for each day; that seems a little unprofessional and messy.
I am asking to find out whether there is an alternative that keeps things fast and neat. The data includes different types of variables in different amounts. I know I am kind of overdoing it with the logging, but the program needs the logs to do its work.
If the data must be stored, it has to be a file or a database (local or remote). I'd go for SQLite: it all ends up in a single file, but you can query the data with SELECT, JOIN, etc.
EDIT:
You can use SQLite3 from C# if you include this package:
https://www.nuget.org/packages/System.Data.SQLite/
You'll need to learn some SQL, but after that you'll just use something like:
SELECT Message FROM Logs WHERE Date > '2015-11-01' AND Date < '2015-11-25';
which is easier, faster and clearer than messing with XML, and it will not load the whole file.
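For instance, a minimal sketch using that package (the database file name is a placeholder; the Logs table and columns are taken from the query above):

    using System;
    using System.Data.SQLite;  // from the System.Data.SQLite NuGet package

    class LogReader
    {
        static void Main()
        {
            // Pull one date range; SQLite reads only the pages it needs,
            // not the whole file.
            using (var conn = new SQLiteConnection("Data Source=calendar.db"))
            {
                conn.Open();
                using (var cmd = new SQLiteCommand(
                    "SELECT Message FROM Logs WHERE Date > @from AND Date < @to", conn))
                {
                    cmd.Parameters.AddWithValue("@from", "2015-11-01");
                    cmd.Parameters.AddWithValue("@to", "2015-11-25");

                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                            Console.WriteLine(reader.GetString(0));
                    }
                }
            }
        }
    }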
As mentioned above, SQLite is a great option here. You (and, frankly, most people out there) are unlikely to write a database management system as efficient as the ones that already exist.
https://www.sqlite.org
The whole point of using an RDBMS is that it's far more efficient than dealing with files.
SQLite is lightweight and easy to deploy. But remember that:
SQLite only supports a single writer at a time (meaning the execution
of an individual transaction). SQLite locks the entire database when
it needs a lock (either read or write) and only one writer can hold a
write lock at a time. Due to its speed this actually isn't a problem
for low to moderate size applications, but if you have a higher volume
of writes (hundreds per second) then it could become a bottleneck.
Reference this question
If this is an enterprise-level application requirement, I would go for an Azure Table storage based solution, which is ideal for this sort of scenario.
I have a UI which calls Web APIs (Web API 2.0). The Web APIs are basically LINQ queries (to an MS SQL database) plus some processing logic for the data. I want to evaluate the performance of the complete flow (a click in the UI, the call to the API, and back to the UI to display the data) against a large DB with 30K-60K records in it.
How can it be done? Let me know the methods/tools used for this.
Currently I am tracking the time taken in the Chrome debug window, which shows the total time for each network call.
Wow. This is a subject in its own right, but here's an approach:
The bits are independent, so break it down. Measure your LINQ queries without any of the logic or Web API stuff getting in the way. If the LINQ runs against stored procedures, measure those first. Then measure the cost of the logic, then the cost of sending X rows of data over Web API; leave out the cost of actually retrieving the rows from the database so you're checking just the connectivity. I'd also consider writing a browserless test client (i.e. raw GETs/POSTs) to eliminate the browser as a variable.
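For example, a minimal browserless timing harness might look like this (the endpoint URL is a placeholder):

    using System;
    using System.Diagnostics;
    using System.Net.Http;
    using System.Threading.Tasks;

    class ApiTimer
    {
        static async Task Main()
        {
            using (var client = new HttpClient())
            {
                var url = "https://localhost:5001/api/records"; // placeholder endpoint

                // Time the raw HTTP call, with no browser rendering in the way.
                var sw = Stopwatch.StartNew();
                var response = await client.GetAsync(url);
                response.EnsureSuccessStatusCode();
                var body = await response.Content.ReadAsStringAsync();
                sw.Stop();

                Console.WriteLine($"{body.Length} bytes in {sw.ElapsedMilliseconds} ms");
            }
        }
    }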
Now you've got a fairly good picture of where the time gets spent. You know if you've got DB issues, query issues, network issues or application server issues.
Assuming it all goes well, now add a bunch of instances to your test harness so you're testing concurrent access, load and the like. Often, if you've got something wrong, you can't surface it with a single user, so this is important.
Break it down into chunks and have a data set that you can consistently bring back to a known state.
As for tools, it really depends on what you use. VS comes with a bunch of useful things but there are tons of third party ones too. If you have a dedicated test team this might be part of their setup. SQL Server has a huge chunk of monitoring capability. Ask your DBAs. If you've got to find your own way, just keep in mind that you want to be able to do this by pressing a button, not by setting up a complex environment.
I created a WPF browser application and hosted it on a live server. It is an inventory management application which uses the database heavily for saving, updating and selecting data. Save and select operations take a very long time, and in the meantime (while saving or selecting data) the application is unresponsive. How can I make the application fast when dealing with the database, and how can I keep it responsive during those operations? I am totally stuck; it would be very kind if anyone has the answer.
Does using stored procedures instead of raw SQL bring better performance or not?
Edit
My application interacts with the database using raw SQL statements.
Responsive Application
In order to make your application responsive you need to run every interaction with the database in a new thread or task. You could also use the C# 5.0 features async and await.
Your current application blocks because the main thread is handling database operations as well as the user interface, so what you need to do is handle these two separate functionalities in separate threads.
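A minimal sketch of the async/await route, inside the window's code-behind (the connection string, query, and button name are placeholders):

    using System.Data.SqlClient;
    using System.Windows;

    // The UI thread is released at each await, so the window stays
    // responsive while the database works.
    private async void SaveButton_Click(object sender, RoutedEventArgs e)
    {
        SaveButton.IsEnabled = false; // hypothetical button; prevent double-clicks
        try
        {
            using (var conn = new SqlConnection("<connection string>"))
            {
                await conn.OpenAsync();
                using (var cmd = new SqlCommand(
                    "UPDATE Inventory SET Quantity = Quantity - 1 WHERE Id = @id", conn))
                {
                    cmd.Parameters.AddWithValue("@id", 42); // illustrative value
                    await cmd.ExecuteNonQueryAsync();
                }
            }
        }
        finally
        {
            SaveButton.IsEnabled = true;
        }
    }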
Fast Database Access
I don't know how you are dealing with the database right now, but you could definitely use some kind of ORM like Entity Framework or NHibernate.
For better performance, at the cost of much less readable and maintainable code (which really matters), you could use raw SQL.
Try this to access the DB faster in .NET: I am sure the problem is not with C# itself, but you need a better mechanism for querying and retrieving from the DB.
I want to store images of my employees along with their profile details in a SQL Server database, but I have the following reservations:
Should I compress the images or not? If yes, can the community provide me sample code or direct me to a guide that can assist me with this?
How should I retrieve the images efficiently? I am afraid of ASP.NET application performance issues; I think that with thousands of employee records, the system may halt or slow down.
I would suggest storing the path of the image in your SQL table, and actually storing the image in a secure folder. This has worked perfectly for me.
SQL Server is well equipped to handle large amounts of binary data, but as always, the devil is in the details and the design:
To avoid the performance issues that can arise in reporting or in access from other applications due to SQL Server's paging (all DB engines use pages as the discrete unit of data manipulation), I would place the image data in a separate table linked one-to-one to your employee table. Doing so keeps the binary data from being moved around unnecessarily by employee queries issued from other parts of your system. Your image display logic will, of course, have to join the two tables to match up employee and image records.
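As a sketch, the split might look like this in Entity Framework terms (class and property names are illustrative):

    // The Employee row stays small, so everyday queries never drag
    // the image pages along.
    public class Employee
    {
        public int Id { get; set; }
        public string Name { get; set; }
        // ... other profile details ...

        public virtual EmployeePhoto Photo { get; set; } // one-to-one navigation
    }

    // The image bytes live in their own table, joined only when
    // actually displaying a picture.
    public class EmployeePhoto
    {
        public int EmployeeId { get; set; } // PK, and FK back to Employee
        public byte[] Image { get; set; }   // maps to varbinary(max)
    }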
First thing: ten thousand records is nothing. As long as you have the right indexes in place, SQL Server will handle that with ease.
However, consider just using the file system to store things like images. The SQL Server record would contain only a pointer (i.e. a path) to the file, and the ASP.NET application would use that to read the file from disk and stream it to the browser.
Most image formats (such as JPEG or PNG) already employ a compression algorithm, so unless your source is a RAW image or an uncompressed bitmap, additional compression won't help much. What might help is limiting the size of the image, say to a maximum of 400 x 400 pixels. I don't know the intended purpose, so I don't know whether this will work well for you. Obviously, if you are storing the data directly from a digital camera, the images will be very large and you might want to scale them down first to something more reasonable.
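For example, a rough scaling sketch with System.Drawing (the 400-pixel limit is just the example figure above):

    using System;
    using System.Drawing;          // System.Drawing.Common on newer .NET
    using System.Drawing.Imaging;
    using System.IO;

    public static class ImageScaler
    {
        // Scale so neither side exceeds maxSide, then re-encode as JPEG
        // before storing the bytes.
        public static byte[] ScaleToMax(byte[] original, int maxSide = 400)
        {
            using (var input = new MemoryStream(original))
            using (var source = Image.FromStream(input))
            {
                double factor = Math.Min(1.0,
                    (double)maxSide / Math.Max(source.Width, source.Height));

                int w = Math.Max(1, (int)(source.Width * factor));
                int h = Math.Max(1, (int)(source.Height * factor));

                using (var resized = new Bitmap(source, w, h))
                using (var output = new MemoryStream())
                {
                    resized.Save(output, ImageFormat.Jpeg);
                    return output.ToArray();
                }
            }
        }
    }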
If you are on SQL Server 2008 or above, you can use the FILESTREAM feature, which is like the best of both worlds: it stores the image in the file system but keeps it under transactional control, and it is also included in your backups when they are taken.
If you are not on 2008 or above, I would say to keep the images in the DB; see this old Microsoft white paper for my reasons why:
http://arxiv.org/pdf/cs.DB/0701168