I would like to get the username of whoever accessed a file (add, delete, rename, ...).
Currently I use a FileSystemWatcher to monitor file access, and I have enabled object-access auditing on a directory to get the user information from the event log. This solution is not ideal, because there are a lot of file events and the event-log messages are not very detailed: there is just one event ID for "write data", and it is used for adding a file, renaming, moving, and every other write. Additionally, I have to cross-check that the event-log message matches the FileSystemWatcher event. I would prefer to handle this better, so I have spent a lot of time googling and reading. I know there is another post on Stack Overflow,
Get username of opened file
but I think there must be a possible solution, because the Windows event log is able to record the username.
While reading a few pages I discovered that there should be a solution using netapi32.dll. The example code at
http://vbcity.com/forums/t/133307.aspx?PageIndex=2
doesn't work for me. I was unable to get the file ID, so I changed the code to:
private ulong GetFileIdFromPath(string filePath)
{
    WinAPI.BY_HANDLE_FILE_INFORMATION objectFileInfo = new WinAPI.BY_HANDLE_FILE_INFORMATION();
    Thread.Sleep(200);
    FileInfo fi = new FileInfo(filePath);
    FileStream fs = fi.Open(FileMode.Open, FileAccess.Read, FileShare.Read);
    WinAPI.GetFileInformationByHandle(fs.Handle, out objectFileInfo);
    fs.Close();
    // the 64-bit file index is split across two 32-bit fields
    ulong fileIndex = ((ulong)objectFileInfo.FileIndexHigh << 32) + (ulong)objectFileInfo.FileIndexLow;
    return fileIndex;
}
With this code I can get the file ID, but even with the file ID and the example code I'm still unable to get the username.
In my last program (two weeks ago) I was asked to audit changes to files, including the user name.
The solution was to use a FileSystemWatcher and, after an event fires, go to the Windows event log and search it via XPath to find which user performed the action.
public static EventUnit DisplayEventAndLogInformation(string fileToSearch, DateTime actionTime)
{
    StringBuilder sb = new StringBuilder();
    const string queryString = @"<QueryList>
        <Query Id=""0"" Path=""Security"">
            <Select Path=""Security"">*</Select>
        </Query>
    </QueryList>";
    EventLogQuery eventsQuery = new EventLogQuery("Security", PathType.LogName, queryString);
    eventsQuery.ReverseDirection = true;
    EventLogReader logReader = new EventLogReader(eventsQuery);
    EventUnit e = new EventUnit();
    bool isStop = false;
    for (EventRecord eventInstance = logReader.ReadEvent(); null != eventInstance; eventInstance = logReader.ReadEvent())
    {
        foreach (var property in eventInstance.Properties)
            if (property.Value.ToString().ToLower().Contains(fileToSearch.ToLower())
                && actionTime.ToString("d/M/yyyy HH:mm:ss") == eventInstance.TimeCreated.Value.ToString("d/M/yyyy HH:mm:ss"))
            {
                foreach (var prop in eventInstance.Properties)
                    sb.AppendLine(prop.Value.ToString());
                e.Message = sb.ToString();
                // for an object-access (4663) event, property index 1 is typically the account name
                e.User = (eventInstance.Properties.Count > 1) ? eventInstance.Properties[1].Value.ToString() : "n/a";
                e.File = fileToSearch;
                isStop = true;
                break;
            }
        if (isStop) break;
        try
        {
            // Console.WriteLine("Description: {0}", eventInstance.FormatDescription());
        }
        catch (Exception e2)
        {
        }
    }
    return e;
}
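A rough sketch of how the FileSystemWatcher side might call into this (purely illustrative; it assumes the DisplayEventAndLogInformation method above and an EventUnit class exposing File, User and Message):

// Illustrative only: watch a folder and, for each change, look up the user in the Security log.
var watcher = new FileSystemWatcher(@"C:\WatchedFolder")
{
    IncludeSubdirectories = true,
    EnableRaisingEvents = true
};
watcher.Changed += (sender, e) =>
{
    // e.FullPath is the file that changed; the method scans the Security log for it
    EventUnit unit = DisplayEventAndLogInformation(e.FullPath, DateTime.Now);
    Console.WriteLine("{0} was changed by {1}", unit.File, unit.User);
};

Note that object-access auditing (event ID 4663) still has to be enabled on the directory, as described above, or the Security log will contain nothing to match.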
I am working on a C# console application that saves some data into a text file. Every time the program runs it appends the data to that file without overwriting it. Now I want to save the data into a new file each time I send a request / run the program.
var result = XmlDecode(soapResult);
XDocument doc = XDocument.Parse(result);
XmlReader read = doc.CreateReader();
DataSet ds = new DataSet();
ds.ReadXml(read);
read.Close();
if (ds.Tables.Count > 0 && ds.Tables["Reply"] != null && ds.Tables["Reply"].Rows.Count > 0)
{
    string refNo = string.Empty;
    string uniqueKey = string.Empty;
    string meterNo = string.Empty;
    List<string> ls = new List<string>();
    if (ds.Tables["Reply"].Rows[0][0].ToString().ToUpper() == "OK")
    {
        if (ds.Tables["Names"] != null && ds.Tables["Names"].Rows.Count > 0)
        {
            uniqueKey = ds.Tables["Names"].Rows[0]["name"].ToString();
        }
        if (ds.Tables["NameType"] != null && ds.Tables["NameType"].Rows.Count > 0)
        {
            refNo = ds.Tables["NameType"].Rows[0]["name"].ToString();
        }
        if (ds.Tables["Meter"] != null && ds.Tables["Meter"].Rows.Count > 0)
        {
            if (ds.Tables["Meter"].Columns.Contains("mRID"))
            {
                meterNo = ds.Tables["Meter"].Rows[0]["mRID"].ToString();
                processedRec++;
            }
        }
    }
    log = uniqueKey + " | " + refNo + " | " + meterNo + " | " + Environment.NewLine;
    ls.Add(log);
}
File.AppendAllText(filePath, log);
How can I create a new file every time?
Any help would be highly appreciated.
Make a unique filePath. Something like this:
var filePath = $@"{folderPath}\txtFile_{Guid.NewGuid()}";
This will make the file name unique every time. The GUID can be replaced with something more meaningful, like a Unix timestamp, as well.
Create a custom file name every time and use File.WriteAllText (which creates a new file, writes the contents to the file, and then closes it; if the target file already exists, it is overwritten) instead of File.AppendAllText.
In your case the filePath should be dynamic, and can be constructed like this:
string basePath = ""; // this should be the path to the directory in which you want to create the output files
string extension = ".xml";
string fileName = String.Format("{0}{1}{2}", "MyFile", DateTime.Now.ToString("ddMMyy_hhmmss"), extension);
string filePath = Path.Combine(basePath, fileName);
In the above snippet, DateTime.Now.ToString("ddMMyy_hhmmss") is the current time at the moment the code executes, which differs on each execution, so the file name will differ on each run. At some later point you can also search or group the files based on this common pattern.
One more thing:
In your code you have a variable List<string> ls which is populated with all the log lines, yet you are writing the contents of log to the file, which contains only the last record. The statement for writing the content should therefore be:
File.WriteAllText(filePath, String.Join("\n", ls));
or even more simply:
File.WriteAllLines(filePath, ls);
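Putting both pieces together, a minimal sketch (assuming ls is the List<string> of log lines built in the loop; the folder name is illustrative) could be:

// Write each run's log lines to a new, timestamp-named file.
string basePath = @"C:\Logs"; // assumed output directory
string fileName = string.Format("MyFile_{0}.txt", DateTime.Now.ToString("ddMMyy_HHmmss"));
string filePath = Path.Combine(basePath, fileName);
File.WriteAllLines(filePath, ls); // a fresh file per run, since the name changes each time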
Simply make your filePath unique, for example by using Ticks:
var filePath = $"app-log-{DateTime.Now.Ticks:X}.log";
I want to upload an Excel file of roughly 2,500 records. This process currently takes more than 5 minutes, and I want to optimize it to under a minute.
What is the best way to optimize this code? I am using DbGeography for storing the location. The file is uploaded and the data is saved properly; everything works, but I need it optimized.
public ActionResult Upload(HttpPostedFileBase FileUpload)
{
    if (FileUpload.ContentLength > 0)
    {
        string fileName = Path.GetFileName(FileUpload.FileName);
        string ext = fileName.Substring(fileName.LastIndexOf('.'));
        Random r = new Random();
        int ran = r.Next(1, 13);
        string path = Path.Combine(Server.MapPath("~/App_Data/uploads"), DateTime.Now.Ticks + "_" + ran + ext);
        try
        {
            FileUpload.SaveAs(path);
            ProcessCSV(path);
            ViewData["Feedback"] = "Upload Complete";
        }
        catch (Exception ex)
        {
            ViewData["Feedback"] = ex.Message;
        }
    }
    return View("Upload", ViewData["Feedback"]);
}
Here is the other method, which reads the uploaded file and saves the data to the database:
private void ProcessCSV(string filename)
{
    Regex r = new Regex(",(?=(?:[^\"]*\"[^\"]*\")*(?![^\"]*\"))");
    StreamReader sr = new StreamReader(filename);
    string line = null;
    string[] strArray;
    int iteration = 0;
    while ((line = sr.ReadLine()) != null)
    {
        if (iteration == 0)
        {
            iteration++;
            continue; // skip the heading row
        }
        strArray = r.Split(line);
        StoreLocation store = new StoreLocation();
        store.Name = strArray[0];
        store.StoreName = strArray[1];
        store.Street = strArray[2];
        store.Street2 = strArray[3];
        store.St_City = strArray[4];
        store.St_St = strArray[5];
        store.St_Zip = strArray[6];
        store.St_Phone = strArray[7];
        store.Office_Phone = strArray[8];
        store.Fax = strArray[9];
        store.Ownership = strArray[10];
        store.website = strArray[11];
        store.Retailer = check(strArray[12]);
        store.WarehouseDistributor = check(strArray[13]);
        store.OnlineRetailer = check(strArray[14]);
        string temp_Add = store.Street + "," + store.St_City;
        try
        {
            var point = new GoogleLocationService().GetLatLongFromAddress(temp_Add);
            string points = string.Format("POINT({0} {1})", point.Latitude, point.Longitude);
            store.Location = DbGeography.FromText(points); // lat_long
        }
        catch (Exception e)
        {
            continue;
        }
        db.StoreLocations.Add(store);
        db.SaveChanges();
    }
}
The obvious place to start would be your external call to the Geo Location service, which it looks like you are doing for each iteration of the loop. I don't know that service, but is there any way you can build up a list of addresses in memory, and then hit it once with multiple addresses, then go back and amend all the records you need to update?
The call to db.SaveChanges(), which updates the database, is happening for every line:
while (...)
{
...
db.StoreLocations.Add(store);
db.SaveChanges(); // Many database updates
}
Instead, just call it once at the end:
while (...)
{
...
db.StoreLocations.Add(store);
}
db.SaveChanges(); // One combined database update
This should speed up your code nicely.
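Applied to the ProcessCSV method from the question, that means collecting the rows first and persisting them in one go; a rough sketch (assuming Entity Framework 6, where DbSet exposes AddRange):

var stores = new List<StoreLocation>();
while ((line = sr.ReadLine()) != null)
{
    // ... parse the line and build the StoreLocation as before ...
    stores.Add(store);
}
db.StoreLocations.AddRange(stores); // register all entities in one pass
db.SaveChanges();                   // one round trip instead of one per row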
I've written a set of web services using Web Api 2. They eventually end up calling a CMD program that fires up an OpenEdge Progress client, passes it a formatted XML string, and then inserts records into an OpenEdge Progress database by calling a .p procedure (WebSpeed is not an option).
The .p file contains a set of business logic that runs against a Progress application. On completion it generates an XML file containing an <Error> node. If this node is empty, it worked; if the file doesn't exist or the node contains text, it failed. I then read this XML file and pass the contents of the <Error> node back to the client in Web Api.
At the moment, there is a static delay of 10 seconds from calling the CMD/Progress applet, to attempting to read the XML file, to give the server time to run the .P file and create said XML file. This isn't great, though, and occasionally an error is returned to the client because it can't find the file, yet, the file was created 1 second after the response was returned because of abnormally high server loads. Alternatively, people are forced to wait 10 seconds when the response could have been handled in 2 seconds.
I need to come up with a way to "check until file exists" until a timeout period has elapsed. I've done some research and can't find anything suitable for a Web Api environment. Does anyone have any suggestions?
Code below - forgive me. I've very much been learning as I've been going along and am very new to this!
Controller
// the request date/time
DateTime requestDate = DateTime.Now;
// list of validation errors
List<string> ohValidation = new List<string>();
...
WebExtensions.callInsertProgram(xml, "JOBLOG");
ohValidation = XmlExtensions.ReadProgressXmlFileWithArray(job.logjob.placeref, requestDate, "joblogging");
CallInsertProgram
public static void callInsertProgram(string xml, string program)
{
    try
    {
        using (Process p = new Process())
        {
            p.StartInfo.FileName = @"C:\Rubixx\runProgress.exe";
            p.StartInfo.WorkingDirectory = @"C:\Rubixx";
            // stop windows from appearing on the server
            p.StartInfo.UseShellExecute = false;
            p.StartInfo.CreateNoWindow = true;
            // the program name and xml are passed in as arguments,
            // wrapped in escaped quotes to stop spaces from being treated as a separator
            p.StartInfo.Arguments = "\"" + program + "," + xml + "\"";
            p.Start();
        }
    }
    catch (Exception e)
    {
        throw new OpenHousingException(e.Message.ToString());
    }
}
ReadProgressXMLWithArray
public static List<string> ReadProgressXmlFileWithArray(string reference, DateTime requestDateTime, string folder)
{
    // new empty list
    List<string> output = new List<string>();
    // wait X seconds before doing anything
    // to ensure the XML file has time to be created
    Delay_Start(fileDelay);
    string filename = fullFileName(reference, folder, requestDateTime);
    string filepath = getFullFilepath(filename, folder);
    if (checkXmlFileExists(filepath))
    {
        // if so, check for the existence of an error message
        output = getXmlErrorArray(filepath);
    }
    else
    {
        // if no file is found, the call to Progress hasn't executed, so tell the end user
        throw new OpenHousingException("No OpenHousing file could be found");
    }
    return output;
}
Delay_Start
private static void Delay_Start(int Seconds)
{
    DateTime StartTime;
    DateTime EndTime;
    StartTime = DateTime.Now;
    EndTime = StartTime.AddSeconds(Seconds);
    // busy-waits (spins) until the end time is reached
    do
    {
        StartTime = DateTime.Now;
    } while (StartTime < EndTime);
}
FullFileName (needed because I can't be sure of the XML filename until it is created. The file format is UniqueReference_DateTimeFileCreated.xml (e.g. xxxxxxxx_20160401-1100.xml), so I have to wildcard-search a folder for the unique reference.)
public static string fullFileName(string jobNo, string folder, DateTime createdDate)
{
    string fileName = string.Empty;
    string folderPath = fileLocation + folder;
    DirectoryInfo dir = new DirectoryInfo(folderPath);
    FileInfo[] files = dir.GetFiles(jobNo + "*", SearchOption.TopDirectoryOnly)
                          .Where(f => f.CreationTimeUtc > createdDate || f.LastWriteTimeUtc > createdDate)
                          .ToArray();
    foreach (var item in files)
    {
        fileName = item.Name;
    }
    if (string.IsNullOrEmpty(fileName))
        throw new OpenHousingException("No OpenHousing file could be found");
    return fileName;
}
GetFullFilePath (Can probably be consolidated into fullFileName)
private static string getFullFilepath(string filename, string folder)
{
    return fileLocation + folder + @"\" + filename;
}
CheckXMLFileExists
private static bool checkXmlFileExists(string filepath)
{
    bool fileExists = false;
    if (File.Exists(filepath))
    {
        fileExists = true;
    }
    return fileExists;
}
GetXMLErrorArray
private static List<string> getXmlErrorArray(string filepath)
{
    List<string> output = new List<string>();
    // read the text from the XML file
    using (TextReader txtReader = new StreamReader(filepath))
    {
        XmlSerializer xs = new XmlSerializer(typeof(JobError));
        // de-serialise the xml text to a strongly typed object
        JobError result = (JobError)xs.Deserialize(txtReader);
        // if the xml file contains an error - return it to the client
        if (!string.IsNullOrEmpty(result.ErrorText))
            output.Add(result.ErrorText);
        // check for SoR errors that are created under a different node
        if (result.LineError != null)
        {
            List<LineError> lineErrs = result.LineError.ToList();
            foreach (LineError le in lineErrs)
            {
                output.Add(le.SorCode + ":" + le.Error);
            }
        }
    }
    return output;
}
OK - so I think I was overcomplicating the problem.
Instead of waiting for the file to exist, I added a line to my callInsertProgram method as below (as suggested by RB):
public static void callInsertProgram(string xml, string program)
{
    try
    {
        using (Process p = new Process())
        {
            p.StartInfo.FileName = @"C:\Rubixx\runProgress.exe";
            p.StartInfo.WorkingDirectory = @"C:\Rubixx";
            // stop windows from appearing on the server
            p.StartInfo.UseShellExecute = false;
            p.StartInfo.CreateNoWindow = true;
            // the program name and xml are passed in as arguments,
            // wrapped in escaped quotes to stop spaces from being treated as a separator
            p.StartInfo.Arguments = "\"" + program + "," + xml + "\"";
            p.Start();
            // ADDED
            p.WaitForExit(60000);
        }
    }
    catch (Exception e)
    {
        throw new OpenHousingException(e.Message.ToString());
    }
}
This ensures that the Progress cmd applet has completed before moving on to the next line, at which point the XML will have been created (or not, if it failed). Initial testing is working well. Can anyone foresee any problems with this approach?
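For reference, the original "check until the file exists or a timeout elapses" idea can also be written as a small polling helper instead of the fixed Delay_Start; a rough sketch (names are illustrative, not part of the existing code):

// Polls until the file appears or the timeout elapses; returns true if it was found in time.
private static bool waitForXmlFile(string filepath, TimeSpan timeout)
{
    DateTime deadline = DateTime.UtcNow + timeout;
    while (DateTime.UtcNow < deadline)
    {
        if (File.Exists(filepath))
            return true;
        Thread.Sleep(250); // yields the CPU between checks, unlike the spinning loop in Delay_Start
    }
    return File.Exists(filepath);
}

The catch is that fullFileName has to find the file by wildcard before its exact name is known, so in practice the polling would need to wrap the directory search rather than a single File.Exists call.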
My program goes through all files in a folder, reads them, and without altering their information moves them to another location under a different name. However I cannot use the File.Move method because I get the following IOException:
The process cannot access the file because it is being used by another
process.
This is how I am reading the file and adding all its lines to a List<string>:
List<string> lines = null;
using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (var sr = new StreamReader(fs, Encoding.Default))
{
    lines = new List<string>();
    while (!sr.EndOfStream)
        lines.Add(sr.ReadLine());
}
And this is the function with which I move the file:
public static bool ArchiveFile(string filePath, string archiveFolderLocation)
{
    if (!Directory.Exists(archiveFolderLocation))
        Directory.CreateDirectory(archiveFolderLocation);
    try
    {
        string timestamp = string.Format("{0:yyyy-MM-dd HHmmss}", DateTime.Now);
        string newFileName = Path.GetFileNameWithoutExtension(filePath) + " " + timestamp;
        string destination = string.Format("{0}\\{1}{2}", archiveFolderLocation, newFileName, Path.GetExtension(filePath));
        File.Move(filePath, destination);
        return true;
    }
    catch (Exception ex)
    {
        return false;
    }
}
I thought the using statement was supposed to garbage-collect and release the file after use. How can I release the file so I can move it, and why does my file stay locked?
Solved:
Got it. Somewhere between these two calls I was opening a TextReader object without disposing it.
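In other words, the fix was simply to make sure that reader is disposed before moving the file, e.g. (illustrative):

// Wrapping the reader in a using block releases the file handle
// before File.Move / ArchiveFile is called.
using (TextReader reader = new StreamReader(filePath))
{
    string contents = reader.ReadToEnd();
    // ... work with contents ...
}
ArchiveFile(filePath, archiveFolderLocation); // succeeds now that no handle is open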
I thought using the using statement is supposed to garbage-collect and
release the file after being used. How can I release the file so I can
move it and why my file stays locked?
Not really. A using statement is essentially nothing but:
var resource = new SomeResource();
try { /* use the resource */ }
finally { resource.Dispose(); /* which is not GC.Collect() */ }
It works fine so it looks like your file is opened from some other place in your code...
P.S.
By the way, you can just do:
List<string> lines = File.ReadAllLines(filePath).ToList();
You could use:
string dpath = "D:\\Destination\\";
string spath = "D:\\Source";
string[] flist = Directory.GetFiles(spath);
foreach (string item in flist)
{
    File.Move(item, dpath + new FileInfo(item).Name);
}
Replace D:\\Source & D:\\Destination\\ with the required source and destination paths, respectively.
I'm having issues with a bit of code that I am writing in C#.
I am sending a document using the MailMessage and SMTP components. I copy the files that I wish to send to a temp directory such as c:\temp, loop through the documents and attach them to the email.
The email sends fine, however when I try to delete the files from the temp directory, I get the following error:
The process cannot access the file because it is being used by another process
I can't understand why this is happening. Below is the code that processes the documents
public void sendDocument(String email, string barcode, int requestid)
{
    string tempDir = @"c:\temp";
    // first we get the document information from the database
    Database db = new Database(dbServer, dbName, dbUser, dbPwd);
    List<Document> documents = db.getDocumentByID(barcode);
    int count = 0;
    foreach (Document doc in documents)
    {
        string tempPath = tempDir + "\\" + doc.getBarcode() + ".pdf";
        string sourcePath = doc.getMachineName() + "\\" + doc.getFilePath() + "\\" + doc.getFileName();
        // we now copy the file from the source location to the new target location
        try
        {
            // this copies the file to the folder
            File.Copy(sourcePath, tempPath, false);
        }
        catch (IOException ioe)
        {
            count++;
            // the file has failed to copy, so we add a number to the file to make it unique and try
            // to copy it again
            tempPath = tempDir + "\\" + doc.getBarcode() + "-" + count + ".pdf";
            File.Copy(sourcePath, tempPath, false);
        }
        // we now need to update the filename to match the new location
        doc.setFileName(doc.getBarcode() + ".pdf");
    }
    // we now email the document to the user
    this.sendEmail(documents, email, null);
    updateSentDocuments(documents, email);
    // now we update the request table
    db.updateRequestTable(requestid);
    // now we clean up the documents from the temp folder
    foreach (Document doc in documents)
    {
        string path = @"c:\temp\" + doc.getFileName();
        File.Delete(path);
    }
}
I would have thought that the this.sendEmail() method would have sent the email before returning to sendDocument, and I think it is the SMTP object that is causing the deletes to fail.
This is the sendEmail method:
public void sendEmail(List<Document> documents, String email, string division)
{
    String SMTPServer = null;
    String SMTPUser = null;
    String SMTPPwd = null;
    String sender = "";
    String emailMessage = "";
    // first we get all the app settings used to send the email to the users
    Database db = new Database(dbServer, dbName, dbUser, dbPwd);
    SMTPServer = db.getAppSetting("smtp_server");
    SMTPUser = db.getAppSetting("smtp_user");
    SMTPPwd = db.getAppSetting("smtp_password");
    sender = db.getAppSetting("sender");
    emailMessage = db.getAppSetting("bulkmail_message");
    DateTime date = DateTime.Now;
    MailMessage emailMsg = new MailMessage();
    emailMsg.To.Add(email);
    if (division == null)
    {
        emailMsg.Subject = "Document(s) Request - " + date.ToString("dd-MM-yyyy");
    }
    else
    {
        emailMsg.Subject = division + " Document Request - " + date.ToString("dd-MM-yyyy");
    }
    emailMsg.From = new MailAddress(sender);
    emailMsg.Body = emailMessage;
    bool hasAttachements = false;
    foreach (Document doc in documents)
    {
        String filepath = @"c:\temp\" + doc.getFileName();
        Attachment data = new Attachment(filepath);
        emailMsg.Attachments.Add(data);
        hasAttachements = true;
    }
    SmtpClient smtp = new SmtpClient(SMTPServer);
    // we try and send the email and throw an exception if it all goes wrong
    try
    {
        if (hasAttachements)
        {
            smtp.Send(emailMsg);
        }
    }
    catch (Exception ex)
    {
        throw new Exception("EmailFailure");
    }
}
How do I get around this problem of a process holding onto the file I wish to delete? I can delete the file(s) once the application finishes.
Your email message isn't being disposed; try disposing it after you've sent it:
try
{
    if (hasAttachements)
    {
        smtp.Send(emailMsg);
    }
}
catch ...
finally
{
    emailMsg.Dispose();
}
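Alternatively, wrapping the message (and, in .NET 4.0 and later, the SmtpClient as well) in using blocks gives the same effect without an explicit finally; a minimal sketch:

// MailMessage and SmtpClient both implement IDisposable here, so the
// attachment file handles are released automatically at the end of the block.
using (var emailMsg = new MailMessage())
using (var smtp = new SmtpClient(SMTPServer))
{
    // ... set To/From/Subject/Body and add the attachments as above ...
    smtp.Send(emailMsg);
}
// the files in c:\temp can be deleted safely after this point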
The first step is to figure out what process is holding onto the file in question. I would grab the SysInternals toolkit and use the handle.exe command to determine what process is holding onto the file.
Without knowing what process has the file open, there is no way to fix this problem.
Sysinternals Link: http://technet.microsoft.com/en-us/sysinternals/default.aspx
Here's a trick I just discovered, which will hopefully be of use to other folk.
Before adding attachments to a message, create the attachment inside a using block. This ensures it gets disposed of correctly, allowing the file to be deleted successfully. I'm not sure if the send also needs to be inside this block; when I tested, emails didn't get through at first, then half an hour later I got flooded, so I decided to leave further testing for a time when the network was a little calmer.
using (Attachment attachment = new Attachment(filename))
{
    message.Attachments.Add(attachment);
    client.SendAsync(message, string.Empty);
}
File.Delete(filename);
Works OK for me anyway :)
Good luck,
JB