My client wanted me to write a simple desktop app that works like this:
The user selects a date. The program then traverses all files on the Windows PC and creates an Excel report of every file created before the selected date.
It seems like a really simple application. However, my client told me he has petabytes of data, which made me think the app could run for hours or even days, so I need a really smart solution. For instance, to cope with a crash or an unexpected error, I am planning to split the report into parts and create a new Excel file for every 50,000 records. In addition, the application should continue from where it left off. In other words, if the application is closed, it should not start scanning all over again. What is the most logical way for the program to work efficiently and resume where it stopped? What comes to mind is to put a tick on the folders that have already been scanned so they are not scanned again. What is the best way to implement this logic in C#? Is it efficient to keep a separate list of visited folders and check whether a folder is already in that list? Which data structures would you use for this problem, and how would you implement it?
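To make the idea concrete, this is roughly what I am imagining for the checkpoint part (the class name and file name are placeholders I made up):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Hypothetical sketch: persist the set of fully-scanned folders so a restart
// can skip them. Class name and checkpoint file are placeholders.
public class ScanCheckpoint
{
    private readonly string _checkpointFile;
    private readonly HashSet<string> _completedFolders;

    public ScanCheckpoint(string checkpointFile)
    {
        _checkpointFile = checkpointFile;
        _completedFolders = File.Exists(checkpointFile)
            ? new HashSet<string>(File.ReadAllLines(checkpointFile), StringComparer.OrdinalIgnoreCase)
            : new HashSet<string>(StringComparer.OrdinalIgnoreCase);
    }

    // O(1) lookup instead of scanning a list of visited folders.
    public bool IsDone(string folder) => _completedFolders.Contains(folder);

    // Append one line per finished folder so progress survives a crash.
    public void MarkDone(string folder)
    {
        if (_completedFolders.Add(folder))
            File.AppendAllText(_checkpointFile, folder + Environment.NewLine);
    }
}
```

My thinking is that a HashSet gives constant-time lookups, whereas checking a plain List would get slower and slower with millions of folders.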
Thanks.
Okay so my overall goal is to create a UWP notes app that doesn't require the end-user to manually save each note they write; this would be done automatically for them.
So what I'm looking to do is create a C# class that detects changes to the document the user is currently writing and constantly updates the underlying text file (this will eventually be written to a row in a database, but I hear it is less efficient to constantly update records in a DB than to work with text files for this?).
This is pretty much what apps like OneNote do in the background, so the user never has to worry about saving the file or losing data if the computer loses power or the app terminates unexpectedly.
So if I create a class that detects changes to the document and then updates the underlying file, is the WHOLE file rewritten, or just the particular parts (bytes?) that were changed within (or appended to) the text?
I'm just looking for the most efficient way to constantly update a file because if a user is a fast typist, the system will have to be able to keep up with every single keystroke input.
Lastly, would the entire file have to be rewritten if the user makes changes at random locations in the text (rather than appending to the end of the file)? Does any of this even make sense? I tend to write a lot to ask a simple question. I have problems...
I would do a timer tick event and have it automatically save every 3 to 5 seconds. I do this a lot. I understand what you're doing, but automatically saving on every keystroke would put a lot of stress on the program.
I would automatically save every few seconds on a conditional basis:
if a change is detected, then it saves. Think about this answer: it would have been saved almost 100 times already if done by keystroke.
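Something along these lines, for example (a rough sketch; the AutoSaver name and the 5-second interval are just placeholders, and you would pass in whatever save routine you already have):

```csharp
using System;
using System.Threading.Tasks;
using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;

// Rough sketch of the "save every few seconds, only if dirty" idea for UWP.
public class AutoSaver
{
    private readonly DispatcherTimer _timer;
    private readonly Func<Task> _save;   // your existing save logic
    private bool _isDirty;

    public AutoSaver(TextBox editor, Func<Task> save)
    {
        _save = save;
        editor.TextChanged += (s, e) => _isDirty = true;   // mark dirty on every keystroke

        _timer = new DispatcherTimer { Interval = TimeSpan.FromSeconds(5) };
        _timer.Tick += async (s, e) =>
        {
            if (!_isDirty) return;   // nothing changed since the last save
            _isDirty = false;
            await _save();           // write the file (or DB row) here
        };
        _timer.Start();
    }
}
```

The dirty flag means an idle user costs you nothing, and a fast typist still only triggers one write every few seconds.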
I would like to check with you all about running multiple instances of the same program (on different PCs) with a running (non-duplicate) number field on it.
Program outline
- Program gets the running number from the DB (e.g., LoanID from the LoanRef table)
- User inputs info and clicks Save.
- Program saves the information to the DB (e.g., the Loan table)
Currently, the program retrieves the latest running number from the DB (SQL Server 2012) and puts it in a read-only text field on the form.
During my development testing, I only tested with one program on one PC and it worked fine.
When I tested with multiple PCs accessing it at (almost) the same time, the running number became a problem.
Is there any way to prevent this kind of running-number issue (while still allowing multiple instances of the same program)?
I thought of generating the number at the end of the 'Save' process, but the problem is that the user wants to see the number before saving, so I need to get it when the form loads.
And I also can't block multiple instances with a Mutex, because more than one person needs to use this program.
Any idea what functions or features I should use or check?
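To make it concrete, this is roughly what the current logic looks like on form load (simplified, using the LoanRef/LoanID names from above; the MAX()+1 read is my guess at the pattern, but the race is the same either way: two PCs can both read the "latest" number before either of them has saved):

```csharp
using System.Data.SqlClient;

// Simplified sketch of the current (race-prone) approach.
public static class LoanNumber
{
    // Called on form load on each PC: both can get the same value
    // because neither has inserted its row yet.
    public static int GetNextDisplayNumber(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT ISNULL(MAX(LoanID), 0) + 1 FROM LoanRef", conn))
        {
            conn.Open();
            return (int)cmd.ExecuteScalar();
        }
    }
}
```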
I have a program that overwrites a certain set of files required for my website. However, traffic on the site has increased so much that I now get a 'File in use' error, which means the program cannot update the file.
This program runs every 5 minutes to update the specified files.
The reason I let this separate program handle writing the file, rather than the website itself, is that I also need to push the file to a different web server (through FTP). This way I also ensure the file gets updated every 5 minutes, instead of only when a user views the page.
My question therefore is: can I tell IIS 7.5 to cache the file for (say) 5 seconds to 1 minute after it has been updated? That should ensure that the next time the program runs to update the file, it won't run into any problems.
The simplest solution would be to change the program that refreshes the file so it stores the new information in a database instead of the filesystem.
But if you can't use a database, I would take a different approach: store the file contents in System.Web.Caching.Cache together with the time the file was last modified, and then check whether the file has changed. If it hasn't, use the cached version; if it has, store the new contents and time in the same cache entry.
Of course, you will have to check that you can read the file before refreshing the cache contents; if you cannot read the file, simply serve the last version from the cache.
The initial read of the file would have to happen in Application_Start to ensure the cache is initialized, and there you will have to wait until the file is readable so you can store it in the cache for the first time.
The best way to check that you can read from the file is to catch the exception, because a lock can appear after your check; see this post: How to check for file lock?
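A rough sketch of that idea (the cache key and the Tuple layout are placeholders; adapt to your own file and types):

```csharp
using System;
using System.IO;
using System.Web;
using System.Web.Caching;

// Sketch: serve file contents from the cache, refreshing only when the file
// has changed AND can actually be read; otherwise fall back to the cached copy.
public static class CachedFile
{
    private const string CacheKey = "MyFeedFile";   // placeholder key

    public static string GetContents(string path)
    {
        var cache = HttpRuntime.Cache;
        var cached = cache[CacheKey] as Tuple<DateTime, string>;
        var lastWrite = File.GetLastWriteTimeUtc(path);

        // Unchanged since we cached it: no need to touch the file at all.
        if (cached != null && cached.Item1 == lastWrite)
            return cached.Item2;

        try
        {
            // Reading may fail while the external program is rewriting the file.
            var contents = File.ReadAllText(path);
            cache[CacheKey] = Tuple.Create(lastWrite, contents);
            return contents;
        }
        catch (IOException)
        {
            // File is locked right now: serve the last good version if we have one.
            if (cached != null)
                return cached.Item2;
            throw;   // first-ever read and the file is locked: let the caller retry
        }
    }
}
```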
I am developing a website using VS 2008 (C#). My current mission is to develop a module that should perform the following tasks:
Every 15 minutes, a process needs to query the database to find out whether a new user has been added to the "User" table through registration.
If it finds a new entry, it should add that entry to an XML file (say, NewUsers18Jan2009.xml).
To achieve this, which of the following is the most appropriate?
Threads
Windows Service
Other
Are there any samples available to demonstrate this?
Separate this task from your website. Everything the website does goes through the web server. Put the logic into a class library (so you can reuse it in the future if you ever need on-demand checking), and use that class in a console application. Use the Windows "Scheduled Tasks" feature and set the console app to run every 15 minutes. This is a far better solution than running a scheduled task via IIS.
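A rough skeleton of what that could look like (NewUserExporter is a made-up name standing in for the class-library logic; it is shown here only as a stub):

```csharp
using System;

// Skeleton: console app that Scheduled Tasks runs every 15 minutes.
// NewUserExporter would live in the class library so the website can
// reuse it later for on-demand checks.
public class NewUserExporter
{
    public int ExportNewUsers()
    {
        // query the User table for rows added since the last run
        // and write them to the XML file (see the query sketches below)
        return 0;   // number of users exported
    }
}

public static class Program
{
    public static int Main()
    {
        try
        {
            int exported = new NewUserExporter().ExportNewUsers();
            Console.WriteLine("Exported {0} new user(s).", exported);
            return 0;
        }
        catch (Exception ex)
        {
            Console.Error.WriteLine(ex);
            return 1;   // non-zero exit code so Scheduled Tasks can flag the failure
        }
    }
}
```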
It doesn't sound like there's any UI part to your task. If that's the case, use either a Windows service or a scheduled application. I would go with a service, because it's easier to control remotely. I fail to see a connection to a website here...
Why in the world would an admin need to get pinged every 15 minutes? Poor admin!
I would just put a timestamp on each entry in your users table and create a quick report so the admin can query the data whenever they need it.
I think your approach is wrong. You should do one of two things:
Add a user to the XML file as the last step in creating the user.
Generate the XML file on demand when it is requested. This will give you real-time information with very little overhead.
A Windows service would give you a good solution - or, if you're using SQL Server, you could fire this kind of processing from a SQL Agent job.
If you want the code to be 'part' of your web application, you could always fire the logic from a heartbeat page that runs your task(s) whenever the URL is called - you can then poll the URL from a service or agent job.
The simplest approach is to create a System.Threading.Timer in your app in Application_Start and put it in a static field so it does not get collected. That way your web app can poll the database without needing an external process. Of course, if your app goes down, so does the timer.
For the polling logic, just keep the last userId (if you have an incrementing policy) and check for newly added users by filtering WHERE id > lastId.
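Something like this, for example (a sketch for Global.asax.cs; the connection string and the Id/Name columns are assumptions, adjust to your schema):

```csharp
using System;
using System.Data.SqlClient;
using System.Threading;

// Sketch: an in-process poll every 15 minutes.
// The static field keeps the timer from being garbage collected.
public class Global : System.Web.HttpApplication
{
    private static Timer _pollTimer;
    private static int _lastUserId;

    protected void Application_Start(object sender, EventArgs e)
    {
        _pollTimer = new Timer(
            _ => CheckForNewUsers(),
            null,
            TimeSpan.Zero,
            TimeSpan.FromMinutes(15));
    }

    private static void CheckForNewUsers()
    {
        // connection string and column names are placeholders
        using (var conn = new SqlConnection("...your connection string..."))
        using (var cmd = new SqlCommand(
            "SELECT Id, Name FROM [User] WHERE Id > @lastId ORDER BY Id", conn))
        {
            cmd.Parameters.AddWithValue("@lastId", _lastUserId);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    _lastUserId = reader.GetInt32(0);
                    // append this user to the day's XML file here
                }
            }
        }
    }
}
```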
Create a new table (or text file) that stores the last time you did a "new user export", or just look at the modified/creation date of the last export file. When your script hits the DB, run a SQL command to get all users created after the last export time, and spit out new XML for each user.
Set this script to run as a Windows scheduled task / cron job / maybe even a database trigger.
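A sketch of the query-and-export side of that (the ExportLog table and the column names are invented for the example; substitute your own schema):

```csharp
using System;
using System.Data.SqlClient;
using System.Xml.Linq;

// Sketch: read the time of the last export, pull users created after it,
// write them to a dated XML file, then record the new export time.
public static class NewUserExport
{
    public static void Run(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            DateTime lastExport;
            using (var cmd = new SqlCommand(
                "SELECT ISNULL(MAX(ExportedAt), '1900-01-01') FROM ExportLog", conn))
            {
                lastExport = (DateTime)cmd.ExecuteScalar();
            }

            var doc = new XElement("NewUsers");
            using (var cmd = new SqlCommand(
                "SELECT Id, Name, CreatedAt FROM [User] WHERE CreatedAt > @since", conn))
            {
                cmd.Parameters.AddWithValue("@since", lastExport);
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        doc.Add(new XElement("User",
                            new XAttribute("Id", reader.GetInt32(0)),
                            new XAttribute("Name", reader.GetString(1)),
                            new XAttribute("CreatedAt", reader.GetDateTime(2))));
                    }
                }
            }
            doc.Save("NewUsers" + DateTime.Today.ToString("ddMMMyyyy") + ".xml");

            using (var cmd = new SqlCommand(
                "INSERT INTO ExportLog (ExportedAt) VALUES (@now)", conn))
            {
                cmd.Parameters.AddWithValue("@now", DateTime.Now);
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```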
Since you seem to be adding all users for a given day to a single XML file (as per your post), why do you need to do this every 15 minutes? Wouldn't it be sufficient to do this once after midnight?
When you do that, preferably in a Windows service (if it has to run every 15 minutes) or in a command-line app scheduled to run once at, say, 0:15, you just need to check the "sign up" date for the users; if any signed up in the past day, add them to your list and export that list to the XML file at the end of processing the table.
Marc
While I also voted for creating a Windows service to perform this function for you, the simplest way I could think to do it would be to put a trigger on your "Users" table that creates the XML file for you when a user is inserted.
I second Chuck's answer - you could pull this data from the database using an XML query and send it directly; no need to muck around creating files on the system.
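For example, something along these lines could return the XML straight from SQL Server without touching the disk (column names are guesses):

```csharp
using System.Data.SqlClient;
using System.Xml;
using System.Xml.Linq;

// Sketch: let SQL Server build the XML (FOR XML) and stream it back directly,
// so there is no intermediate file on the server.
public static class UserXml
{
    public static XDocument GetNewUsersXml(string connectionString, int lastId)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT Id, Name FROM [User] WHERE Id > @lastId " +
            "FOR XML PATH('User'), ROOT('NewUsers')", conn))
        {
            cmd.Parameters.AddWithValue("@lastId", lastId);
            conn.Open();
            using (XmlReader reader = cmd.ExecuteXmlReader())
            {
                return XDocument.Load(reader);
            }
        }
    }
}
```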