log4net rolling file is rolling up to the minute and not rolling by second - c#

I have a log file that I want to roll based on both size and datetime. If the size exceeds 2 MB, a new file is created with the same filename_datetime.seq name, where seq is the sequence number (1, 2, ...). When rolling occurs in the next minute, a new file with the new minute in its name is created as shown below, instead of the existing file (which still has enough space) being reused. The rolling style used is Composite.
filename_10_18_2023_125501.log
filename_10_18_2023_125501.1.log
filename_10_18_2023_125501.2.log
filename_10_18_2023_125625.log
filename_10_18_2023_125625.1.log
filename_10_18_2023_125625.2.log
As seen above, the 125501 file rolled twice because the maximum size was reached; then at 125625 it rolled twice again, and so on. I have not found anything online where the rolling happens by second, e.g.:
filename_10_18_2023_125501.log
filename_10_18_2023_125521.log
filename_10_18_2023_125545.log
filename_10_18_2023_125621.log
filename_10_18_2023_125625.log
filename_10_18_2023_125658.log
Does anyone know whether this can be implemented using another mechanism, or whether this is a known limitation of log4net?
This is similar to the question below:
log4net RollingFileAppender is not rolling by date every second
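The question does not show the actual configuration, but a setup like the one described (Composite rolling, a 2 MB size limit, and a date pattern that goes down to the second) presumably looks roughly like the following programmatic sketch; the file path, layout, and backup count here are assumptions, not values taken from the question:

```csharp
// A hedged sketch of what the described setup presumably looks like, shown as
// programmatic configuration; the path, layout, and backup count are assumed.
using log4net.Appender;
using log4net.Config;
using log4net.Layout;

var layout = new PatternLayout("%date [%thread] %-5level %logger - %message%newline");
layout.ActivateOptions();

var appender = new RollingFileAppender
{
    File = "logs/filename_",                                   // assumed base path
    StaticLogFileName = false,
    AppendToFile = true,
    RollingStyle = RollingFileAppender.RollingMode.Composite,  // roll by date and size
    DatePattern = "MM_dd_yyyy_HHmmss'.log'",                   // seconds appear in the name
    MaximumFileSize = "2MB",
    MaxSizeRollBackups = 10,                                   // assumed
    Layout = layout
};
appender.ActivateOptions();
BasicConfigurator.Configure(appender);
```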

Related

When updating a text file via code, is the entire file re-saved, or just the parts that changed?

Okay so my overall goal is to create a UWP notes app that doesn't require the end-user to manually save each note they write; this would be done automatically for them.
So what I'm looking to do is create a C# class that will detect changes to the document the user is currently writing and constantly update the underlying text file (this will eventually be written to a row within a database, but I hear it is less efficient to constantly update records within a DB than to deal with text files for this?).
But yeah, this is pretty much what apps like OneNote do in the background for the user, so the user never has to worry about saving the file or losing data when the computer loses power or the app terminates unexpectedly.
So if I created a class that detected changes to the document and then updated the underlying file, is the WHOLE file rewritten, or just the particular parts (bytes?) that were changed within (or appended to) the text?
I'm just looking for the most efficient way to constantly update a file, because if a user is a fast typist, the system will have to be able to keep up with every single keystroke.
Last, would the entire file have to be rewritten if the user makes random changes to the text at random locations (rather than appending to the end of the file)? Does any of this even make sense? I tend to write a lot to ask a simple question. I have problems...
I would use a timer tick event and have it automatically save every 3 to 5 seconds. I do this a lot. I understand what you're doing, but automatically saving on every keystroke would put a lot of stress on the program.
I would automatically save every few seconds on a conditional basis:
if a change is detected, then it saves. Think about this answer: it would have been saved almost 100 times if done by keystroke.
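A minimal sketch of that timer-based autosave, assuming a UWP page with a TextBox named NoteBox and a StorageFile already chosen for the current note (both names are assumptions for illustration):

```csharp
// Save on a timer only when something changed, instead of on every keystroke.
using System;
using Windows.Storage;
using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;

public sealed partial class NotePage : Page
{
    private DispatcherTimer _autoSaveTimer;
    private StorageFile _noteFile;   // assumed to be set when the note is opened
    private bool _isDirty;

    private void StartAutoSave()
    {
        _autoSaveTimer = new DispatcherTimer { Interval = TimeSpan.FromSeconds(5) };
        _autoSaveTimer.Tick += async (s, e) =>
        {
            if (!_isDirty) return;                                 // save only if something changed
            _isDirty = false;
            await FileIO.WriteTextAsync(_noteFile, NoteBox.Text);  // note: rewrites the whole file
        };
        _autoSaveTimer.Start();
    }

    private void NoteBox_TextChanged(object sender, TextChangedEventArgs e)
    {
        _isDirty = true;   // mark dirty on every keystroke; the timer does the actual save
    }
}
```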

Keeping the last X minutes of a recording video

I'm developing an app that requires me to record a lot of video data, but I don't need to store all of it.
What I need to do is keep just the last X minutes of the recorded stream.
This means that I need a way to remove the oldest sample every time I need to store a new one.
So I started working with this article: http://msdn.microsoft.com/library/windowsphone/develop/hh394041%28v=vs.105%29.aspx
The first idea that I had was to just call the StopVideoRecording() and then the StartVideoRecording() using a timer each X minutes.
Now, at first this made sense, but it won't work.
The problem is that doing it this way deletes the previous data every X minutes.
This means that if we record 12 minutes and need to keep the last 5, following this idea we'll delete the first 5 and then the second 5, leaving just the last 2 minutes, and that is not what I was looking for.
I moved then my attention to the VideoSink class because of the OnSample method.
This seems pretty simple: we intercept every sample and store it in a fixed-size byte array (the size depends on the needed length and the sample's size).
When the buffer is full we just shift everything on the left before adding the new sample.
The problem is that a test video of just 1 minute generated something like 2GB of samples and this makes this way really hard to manage.
I know that those samples are uncompressed, but wouldn't it be hard for a smartphone to take a sample, compress it, shift a big array, insert the sample, and write the array to a file, and to do this for EVERY sample received?
Yeah, I'm talking about writing the array to a file because we need to persist this video somehow. The battery might stop working, and having it just in RAM would make us lose everything we recorded!
The last idea that came to mind was to use a combo of VideoSink and FileSink.
While the FileSink does the compression magic (I even decompiled this class to understand what it does but there's no code inside!), we use the VideoSink's OnSample method to manually remove the unneeded data from the mp4 file used by the FileSink.
This one sounds quite hard because I don't know if I can write to the file with both FileSink and VideoSink without concurrency issues, and I've not found a good C# library to help me work with mp4 files without having to deal with their structure.
The only library that I found is this one http://basemedia.codeplex.com/ but it totally lacks documentation (each link in the documentation page gives a 404 error).
I'm starting to think that this is something that can't be done, but I'd like to see if there's someone here who can point me in the right direction.
EDIT:
Just to be clear, I used the word "recording" and not "recorded" because I'm talking about trimming the video while it's still recording!
This is not about editing it once it has been saved, but something more like removing stuff from the stream while I'm writing it to disk.
I cannot provide code, just an idea, because you have these requirements:
Keep just the last X minutes of the recorded stream.
The target platform is Windows Phone 8.
I want to add some modifications to your first idea (a rough sketch follows this list):
Write each minute of the video stream to a separate file.
Also keep one video file more than the number of minutes you need. For instance, if you need 5 minutes you should always keep 6 files, because the last file may not be full.
By using DirectShow you will be able to join these files into one. Be ready to use C++ (as an alternative, you can use some service or build your own back end for this).
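A rough sketch of the "one file per minute, keep N + 1 segments" idea above; starting and stopping the actual capture is left to whatever StartVideoRecording / StopVideoRecording plumbing the app already has, and the class and file names are placeholders:

```csharp
// Only the segment-rotation logic is shown: hand out a new file path each minute
// and delete the oldest segment once more than minutesToKeep + 1 files exist.
using System;
using System.Collections.Generic;
using System.IO;

class SegmentRotator
{
    private readonly Queue<string> _segments = new Queue<string>();
    private readonly string _folder;
    private readonly int _minutesToKeep;

    public SegmentRotator(string folder, int minutesToKeep)
    {
        _folder = folder;
        _minutesToKeep = minutesToKeep;
    }

    // Call once per minute from a timer: returns the path for the next segment.
    public string NextSegment()
    {
        string path = Path.Combine(_folder,
            string.Format("segment_{0:yyyyMMdd_HHmmss}.mp4", DateTime.UtcNow));
        _segments.Enqueue(path);

        while (_segments.Count > _minutesToKeep + 1)
        {
            string oldest = _segments.Dequeue();
            if (File.Exists(oldest)) File.Delete(oldest);
        }
        return path;   // point the recorder's FileSink at this path for the next minute
    }
}
```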

XML file data is lost when a sudden shutdown occurs

I have an application that stores data in an XML file every 500 ms using the XElement object's .Save("path") method.
The problem is: when a sudden shutdown occurs, the content of the file is lost, so on the next run of the application the file cannot be used.
How can I prevent that / make sure the data will not be lost?
P.S.: I'm using C# (.NET, Visual Studio 2010) under Windows 7.
I've made an experiment: instead of writing to the same data.xml file, I created a new file each time (by copying from the original file), and when the power went off while copying from data.xml, it corrupted all previously created files?!
Let's assume your file is data.xml. Instead of writing to data.xml all the time, write to a temporary file data.xml.tmp, and when finished, rename it to data.xml. But renaming will not work if you already have a data.xml file, so you will need to delete it first and then rename the temporary file.
That way, data.xml will contain the last safe data. If you have a sudden shutdown, the incomplete file will be the temporary data.xml.tmp. If your program tries to read the file later on and there is no data.xml file, that means the shutdown happened between the delete and rename operations, so you will have to read the temporary file instead. We know it is safe because otherwise there would be a data.xml file.
You can use a 2-phase commit:
Write the new XML to a file with a different name
Delete the old file
Rename the new file to the old name
This way, there will always be at least one complete file.
If you restart, and the standard name doesn't exist, check for the different name.
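A minimal sketch of the write-to-a-temp-file-then-swap pattern both answers describe, assuming the data is an XElement and the target file is data.xml; the helper name is a placeholder:

```csharp
// Write the complete new version to a temporary file first, then swap it into place,
// so data.xml always holds the last complete save.
using System.IO;
using System.Xml.Linq;

static void SaveSafely(XElement doc, string path)
{
    string tmp = path + ".tmp";
    doc.Save(tmp);                        // write the whole new version to the temp file

    if (File.Exists(path))
        File.Replace(tmp, path, null);    // swap the temp file over the old file
    else
        File.Move(tmp, path);             // first ever save: just rename into place
}
```

On startup, if data.xml is missing, fall back to reading the .tmp file, exactly as described in the first answer.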
This one could be a lifesaver, but it takes a little more effort. There should be a separate process which does the following:
It takes a backup into its stash automatically whenever the file gets updated.
It internally maintains two versions in a linked list.
When the file gets updated, the latest version becomes HEAD via linkedList.AddFirst(), and the oldest version, pointed to by TAIL, is removed via linkedList.RemoveLast().
And of course, it should scan the stash and load the latest version available in it during startup.
In the hard shutdown scenario, when the system starts up the next time, this process should check whether the file is valid or corrupted. If it is corrupted, it restores the latest version from HEAD and subscribes to FileChanged notifications using a simple FileSystemWatcher.
This approach is well tested.
Problems seen
What if the hard shutdown happens while updating HEAD?
-- Well, there is another version in the stash, next to HEAD.
What if the hard shutdown happens while updating HEAD when the stash is empty? -- We know that the file was valid while HEAD was being updated. The process shall try copying it again at the next startup, since it is not corrupted.
What if the stash is empty and the file has been corrupted? -- This is the death pit, and no solution is available for it. But this scenario occurs only if you deploy this recovery process after the file corruption has already happened.
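A rough sketch of the stash idea in this answer, assuming a FileSystemWatcher on the data file and a simple validity check; the class name, stash layout, and IsValidXml helper are placeholders, not part of the original answer:

```csharp
// Keep the last two known-good copies of the watched file in a linked list
// (HEAD = newest, TAIL = oldest) and refresh the stash whenever the file changes.
using System;
using System.Collections.Generic;
using System.IO;
using System.Xml.Linq;

class XmlStash
{
    private readonly LinkedList<string> _versions = new LinkedList<string>();
    private readonly string _stashDir;

    public XmlStash(string watchedFile, string stashDir)
    {
        _stashDir = stashDir;
        Directory.CreateDirectory(stashDir);

        var watcher = new FileSystemWatcher(
            Path.GetDirectoryName(watchedFile), Path.GetFileName(watchedFile));
        watcher.Changed += (s, e) => Backup(e.FullPath);
        watcher.EnableRaisingEvents = true;
    }

    private void Backup(string source)
    {
        if (!IsValidXml(source)) return;                       // only stash valid versions

        string copy = Path.Combine(_stashDir,
            Path.GetFileName(source) + "." + DateTime.UtcNow.Ticks);
        File.Copy(source, copy, true);

        _versions.AddFirst(copy);                              // newest becomes HEAD
        if (_versions.Count > 2)
        {
            File.Delete(_versions.Last.Value);                 // drop the TAIL version
            _versions.RemoveLast();
        }
    }

    private static bool IsValidXml(string path)
    {
        try { XElement.Load(path); return true; }
        catch { return false; }
    }
}
```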

C# overwrite file, "file in use" error from IIS 7.5

I have a program that overwrites a certain set of files required by my website; however, the traffic on my website has increased so much that I now get a "file in use" error, which means the program is unable to update the file.
This program runs every 5 minutes to update the specified files.
The reason I let this program handle the writing of the file, rather than the website itself, is that I also need to upload it to a different web server (through FTP). This way I also ensure that the file gets updated every 5 minutes, instead of only when a user looks at the page.
My question therefore is: can I tell IIS 7.5 to cache the file for a while (say 5 seconds to 1 minute) after it has been updated? That should ensure that the next time the program runs to update the file it won't encounter any problems.
The simplest solution would be to change the program that refreshes the file so that it stores the new information in a database, not in the filesystem.
But if you can't use a database, I would take a different approach: store the file contents in System.Web.Caching.Cache along with the time the file was last modified, then check whether the file has changed; if not, use the cached version, and if it has changed, store the new contents and time in the same cache entry.
Of course you will have to check that you can read the file, and only then refresh the cache contents; if you cannot read the file, you can simply serve the last version from the cache.
The initial read of the file should happen in Application_Start to ensure that the cache has been initialized; there you will have to wait until you can read the file in order to store it in the cache for the first time.
The best way to check that you can read from the file is to catch the exception, because a lock can appear after your check; see this post: How to check for file lock?
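A rough sketch of that caching fallback, assuming an ASP.NET site; the cache key and file path are placeholders:

```csharp
// Serve the file from HttpRuntime.Cache and only re-read it from disk when its
// last-write time changes; if the updater currently holds the file, fall back
// to the cached copy instead of failing.
using System;
using System.IO;
using System.Web;

static class CachedFile
{
    public static string Read(string path)
    {
        var cached = HttpRuntime.Cache["myfile"] as Tuple<DateTime, string>;
        DateTime lastWrite = File.GetLastWriteTimeUtc(path);

        if (cached != null && cached.Item1 == lastWrite)
            return cached.Item2;                              // unchanged: use the cache

        try
        {
            string contents = File.ReadAllText(path);         // may throw while the updater writes
            HttpRuntime.Cache["myfile"] = Tuple.Create(lastWrite, contents);
            return contents;
        }
        catch (IOException)
        {
            // The file is locked right now: serve the last good version if we have one.
            return cached != null ? cached.Item2 : string.Empty;
        }
    }
}
```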

How can I catch the event when a log file rolls over in Log4Net so that I can do some further action?

Basically, this is what I am trying to do: I am using Log4Net, rolling over the log files each day, and appending the date to the file name.
I'd like to catch the event that begins the rolling over action so that I can perform another task right after the rollover finishes.
Specifically, I am trying to encrypt the file immediately after it rolls over.
Any help would be greatly appreciated. Thanks in advance!
If you're doing nothing but leaving the encrypted file on the server then I see a flaw in your setup. You will always have the current day's data sitting in plain text on your server. Since it is all just streams I bet it would be easy to inherit from the RollingFileAppender and replace its output stream with something wrapped in an encrypted stream.
You can always employ a file watcher on a separate thread for your log directory. When it sees a new file, encrypt the previous one. I can't find any rollover event for log4net, but this should be quick enough for your needs.
http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.aspx
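A minimal sketch of that watcher approach; EncryptFile is a placeholder for whatever encryption routine you already have:

```csharp
// Watch the log directory and, when a new log file appears (i.e. a rollover has
// happened), encrypt the file that was previously being written.
using System.IO;

static class RolloverWatcher
{
    private static string _currentLog;

    public static void Watch(string logDir)
    {
        var watcher = new FileSystemWatcher(logDir, "*.log");
        watcher.Created += (s, e) =>
        {
            if (_currentLog != null && _currentLog != e.FullPath)
                EncryptFile(_currentLog);      // the previous file has rolled over
            _currentLog = e.FullPath;
        };
        watcher.EnableRaisingEvents = true;
    }

    private static void EncryptFile(string path)
    {
        // placeholder for the encryption step the question asks about
    }
}
```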
