I would like to be able to run commands in the command prompt on a Windows PC from an Android phone.
I am currently doing it with a WPF app on Windows which listens for files with .dev in their file name.
This method is very annoying to use on a phone, as I have to type out the command in a .bat file and then copy the file to D:\Shared using an app like File Manager to transfer the file over a Windows share.
using Microsoft.Win32;
using System.Diagnostics;
using System.IO;
using System.Threading;
using System.Windows.Forms;
namespace Dev
{
public partial class Dev : ApplicationContext
{
class Var
{
public static string temp = Path.GetTempPath();
}
public Dev()
{
RegistryKey key = Registry.CurrentUser.OpenSubKey("SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\RunOnce", true);
key.SetValue("Remote Dev", Application.ExecutablePath);
// Create a new FileSystemWatcher and set its properties.
FileSystemWatcher watcher = new()
{
Path = "D:\\Shared",
/* Watch for changes in LastAccess and LastWrite times, and
the renaming of files or directories. */
NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite
| NotifyFilters.FileName | NotifyFilters.DirectoryName,
// Only watch files with ".dev." in their name.
Filter = "*.dev.*"
};
// Add event handlers.
watcher.Created += new FileSystemEventHandler(Create);
// Begin watching.
watcher.EnableRaisingEvents = true;
//This method is called when a file is created.
static void Create(object source, FileSystemEventArgs e)
{
string tempPath = Var.temp + Path.GetFileName(e.FullPath);
if (File.Exists(tempPath)) File.Delete(tempPath);
File.Move(e.FullPath, tempPath);
Thread.Sleep(1000);
Process.Start("explorer", tempPath);
}
}
}
}
I would like to create an app that can automate the process of creating the .bat file and copying it to D:\Shared without the need for any third-party app.
Please help me with the MAUI part, as I am new to it.
Thanks
I am new to C# and have to develop a Windows Forms application in C#. This application should track the following things:
Monitor the CD / DVD drives both external and internal.
Monitor the files which are created, modified and deleted on the CD/DVD drives.
I am able to get system notification for CD/DVD drive insertion by RegisterNotification and by tracking WM_DEVICECHANGE messages in the WndProc method.
The above implementation lets me know when a new device has been attached to the PC.
The problem I am facing is how to track the file changes which happen on the CD/DVD (writes / modifications). One option is to poll for the files on the CD/DVD as a background job, but that will be the last option.
I have found IMAPI, through which we can write to CDs/DVDs, but I just need to monitor the file changes for audit purposes.
Kindly point me in the right direction on how to receive CD/DVD file change notifications in my program.
I have tried FileSystemWatcher but it doesn't seem to work with CD/DVD drives.
Updated on 07-Feb-2018:
Another approach I could find was via WMI queries attached to WMI events. I have found the question Best way to detect dvd insertion in drive c#, which could also hold the answer. I wanted to know whether detecting DVD file system modifications is feasible in WMI, and whether any experts can share the query for it. I hope Arshad would be able to help in this area.
Approach 1: Using FileSystemWatcher
public void ListOpticalDiscDrivesAndWatchRootDirectory()
{
var drives = DriveInfo.GetDrives();
foreach (var drive in drives)
{
if (drive.IsReady && drive.DriveType == DriveType.CDRom)
{
var rootDirectory = drive.RootDirectory.ToString();
Console.WriteLine(rootDirectory);
Watch(rootDirectory);
}
}
}
private void Watch(string path)
{
var watcher = new FileSystemWatcher
{
Path = path,
NotifyFilter = NotifyFilters.Attributes |
NotifyFilters.CreationTime |
NotifyFilters.DirectoryName |
NotifyFilters.FileName |
NotifyFilters.LastAccess |
NotifyFilters.LastWrite |
NotifyFilters.Security |
NotifyFilters.Size,
Filter = "*.*",
EnableRaisingEvents = true
};
watcher.Changed += new FileSystemEventHandler(OnChanged);
}
private void OnChanged(object source, FileSystemEventArgs e)
{
Console.WriteLine("Something changed!");
}
Approach 2: Using WMI
There's a Code Project sample (VBScript) describing how to use WMI for file system monitoring. I used the query from that sample in the C# snippet below:
using System;
using System.Management;
public class OpticalDriveWatcher
{
private ManagementEventWatcher _wmiWatcher = new ManagementEventWatcher();
public ManagementEventWatcher WmiWatcher
{
get { return _wmiWatcher; }
}
public void OnWmiEventReceived(object sender, EventArrivedEventArgs e)
{
Console.WriteLine("WMI event!");
}
public void WatchWithWMI(string path)
{
string queryString = "Select * From __InstanceOperationEvent "
+ "WITHIN 2 "
+ "WHERE TargetInstance ISA 'CIM_DataFile' "
+ $"And TargetInstance.Drive='{path}'";
WqlEventQuery wmiQuery = new WqlEventQuery(queryString);
WmiWatcher.Query = wmiQuery;
WmiWatcher.Start();
}
}
The catch is that CIM_DataFile returns only instances of files on local fixed disks. You can call this as follows:
var watcher = new OpticalDriveWatcher();
var drive = "I:"; // You could get the optical drive you want to watch with DriveInfo as described in approach 1
watcher.WmiWatcher.EventArrived += watcher.OnWmiEventReceived;
watcher.WatchWithWMI(drive);
Both approaches worked fine for me when I tested with a DVD-RAM.
I have an application that requires two files to process data: a zip file containing the actual data, and a control file that says what to do with said data.
These files are downloaded via sftp to a staging directory. Once the zip file is complete, I need to check whether the control file is there as well. They share a naming prefix only (e.g. 100001_ABCDEF_123456.zip is paired with 100001_ABCDEF_control_file.ctl).
I am trying to find a way to wait for the zip file to finish downloading and then move the files on the fly, while maintaining the directory structure, as that is important for the next step in processing.
Currently I am waiting until the sftp worker finishes and then calling robocopy to move everything. I would like a more polished approach.
I have tried several things and I get the same results: files download but never move. For some reason I just cannot get the compare to work correctly.
I have tried using a FileSystemWatcher to look for the rename from filepart to zip, but it seems to miss several downloads, and for some reason the function dies when I get to my foreach that searches the directory for the control file.
Below is the FileSystemWatcher event; I am calling this for both Created and Changed.
Also below is the setup for the FileSystemWatcher.
watcher.Path = @"C:\Sync\";
watcher.IncludeSubdirectories = true;
watcher.EnableRaisingEvents = true;
watcher.Filter = "*.zip";
watcher.NotifyFilter = NotifyFilters.Attributes |
NotifyFilters.CreationTime |
NotifyFilters.FileName |
NotifyFilters.LastAccess |
NotifyFilters.LastWrite |
NotifyFilters.Size |
NotifyFilters.Security |
NotifyFilters.DirectoryName;
watcher.Created += Watcher_Changed;
watcher.Changed += Watcher_Changed;
private void Watcher_Changed(object sender, FileSystemEventArgs e)
{
var dir = new DirectoryInfo(e.FullPath.Substring(0, e.FullPath.Length - e.Name.Length));
var files = dir.GetFiles();
FileInfo zipFile = new FileInfo(e.FullPath);
foreach (FileInfo file in files)
{
MessageBox.Show(file.Extension);
if (file.Extension == "ctl" && file.Name.StartsWith(e.Name.Substring(0, (e.Name.Length - 14))))
{
file.CopyTo(@"C:\inp\");
zipFile.CopyTo(@"C:\inp\");
}
}
}
Watcher_Changed is going to get called for all sorts of things, and not every time it's called will you want to react to it.
The first thing you should do in the event handler is try to exclusively open zipFile. If you cannot, ignore this event and wait for another one. If this is an FTP server, you'll get a Changed event every time a new chunk of data is written to disk. You could also put something on a "retry" queue or use some other mechanism to check whether the file is available at a later time. I have a similar need in our system, and we try every 5 seconds after we notice the first change. Only once we can exclusively open the file for writing do we allow it to move on to the next step.
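A minimal sketch of that exclusive-open check (IsFileReady is just a made-up helper name; the retry policy is up to you):
using System.IO;

static class FileReadiness
{
    // True only if the file can be opened exclusively, i.e. nothing else
    // (such as the sftp transfer) still has it open for writing.
    public static bool IsFileReady(string path)
    {
        try
        {
            using (var stream = new FileStream(path, FileMode.Open, FileAccess.ReadWrite, FileShare.None))
            {
                return true;
            }
        }
        catch (IOException)
        {
            // Still locked - ignore this event and retry later (e.g. every 5 seconds).
            return false;
        }
    }
}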
I would tighten up your assumptions about what the filename looks like. You're limiting the search to *.zip, but don't depend on only your .zip files existing in that target directory. Validate that the parsing you're doing of the filename isn't hitting unexpected values. You may also want to check dir.Exists() before calling dir.GetFiles(); that could be throwing exceptions.
As to missing events, see this good answer on buffer overflows: FileSystemWatcher InternalBufferOverflow
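As a rough illustration of mitigating those overflows (assuming watcher is the FileSystemWatcher from your setup; 64 KB is only an example value):
// Enlarge the watcher's internal buffer and surface overflow errors, so that
// missed events are at least detected and can trigger a manual rescan.
watcher.InternalBufferSize = 64 * 1024;
watcher.Error += (s, e) =>
{
    if (e.GetException() is InternalBufferOverflowException)
    {
        // Too many changes arrived at once - fall back to rescanning the directory.
    }
};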
The FileSystemWatcher class is notoriously tricky to use correctly, because you will get multiple events for a single file that is being written to, moved or copied, as @WillStoltenberg also mentioned in his answer.
I have found that it is much easier just to set up a task that runs periodically (e.g. every 30 seconds). For your problem, you could easily do something like the below. Note that a similar implementation using a Timer, instead of Task.Delay, may be preferable.
public class MyPeriodicWatcher
{
private readonly string _watchPath;
private readonly string _searchMask;
private readonly Func<string, string> _commonPrefixFetcher;
private readonly Action<FileInfo, FileInfo> _pairProcessor;
private readonly TimeSpan _checkInterval;
private readonly CancellationToken _cancelToken;
public MyPeriodicWatcher(
string watchPath,
string searchMask,
Func<string, string> commonPrefixFetcher,
Action<FileInfo, FileInfo> pairProcessor,
TimeSpan checkInterval,
CancellationToken cancelToken)
{
_watchPath = watchPath;
_searchMask = string.IsNullOrWhiteSpace(searchMask) ? "*.zip" : searchMask;
_pairProcessor = pairProcessor;
_commonPrefixFetcher = commonPrefixFetcher;
_cancelToken = cancelToken;
_checkInterval = checkInterval;
}
public async Task Watch()
{
while (!_cancelToken.IsCancellationRequested)
{
try
{
foreach (var file in Directory.EnumerateFiles(_watchPath, _searchMask))
{
var pairPrefix = _commonPrefixFetcher(file);
if (!string.IsNullOrWhiteSpace(pairPrefix))
{
var match = Directory.EnumerateFiles(_watchPath, pairPrefix + "*.ctl").FirstOrDefault();
if (!string.IsNullOrEmpty(match) && !_cancelToken.IsCancellationRequested)
_pairProcessor(
new FileInfo(Path.Combine(_watchPath, file)),
new FileInfo(Path.Combine(_watchPath, match)));
}
if (_cancelToken.IsCancellationRequested)
break;
}
if (_cancelToken.IsCancellationRequested)
break;
await Task.Delay(_checkInterval, _cancelToken).ConfigureAwait(false);
}
catch (OperationCanceledException)
{
break;
}
}
}
}
You will need to provide it with
the path to monitor
the search mask for the first file (i.e. *.zip)
a function delegate that gets the common file name prefix from the zip file name
an interval
the delegate that will perform the moving and receives the FileInfo for the pair to be processed / moved.
and a cancellation token to cleanly cancel monitoring.
In your pairProcessor delegate, catch IO exceptions, and check for a sharing violation (which likely means writing the file has not yet completed).
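For illustration, a hedged usage sketch (the paths, the prefix parsing and the destination folder are made-up examples, not part of the class above):
using System;
using System.IO;
using System.Threading;

var cts = new CancellationTokenSource();

var periodicWatcher = new MyPeriodicWatcher(
    watchPath: @"C:\Sync",
    searchMask: "*.zip",
    // e.g. "100001_ABCDEF_123456.zip" -> "100001_ABCDEF_"
    commonPrefixFetcher: zipPath =>
    {
        var name = Path.GetFileNameWithoutExtension(zipPath);
        var lastUnderscore = name.LastIndexOf('_');
        return lastUnderscore > 0 ? name.Substring(0, lastUnderscore + 1) : string.Empty;
    },
    pairProcessor: (zip, ctl) =>
    {
        try
        {
            // Move the pair once both files are present.
            zip.MoveTo(Path.Combine(@"C:\inp", zip.Name));
            ctl.MoveTo(Path.Combine(@"C:\inp", ctl.Name));
        }
        catch (IOException)
        {
            // Sharing violation: the download has likely not finished yet,
            // so leave the pair for the next pass.
        }
    },
    checkInterval: TimeSpan.FromSeconds(30),
    cancelToken: cts.Token);

var monitoring = periodicWatcher.Watch(); // store or await the task as appropriate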
By default, a script written by the machine saves the files to both a local and a server path folder, but due to network issues the two folders are not in sync. I have written a C# Windows service program using FileSystemWatcher, DiffEngine, System.Timers and PingService, as in the code below, to handle this.
It monitors a local folder's OnChange event and pings the server IP to check for success/failure before comparing/copying to the server path. When the ping fails, the file goes to a logtemp folder; the system timer handles this and pings again before re-dumping the logtemp files.
I do not know how to use threading for this. Where should my system timer code go for when the ping fails?
protected override void OnStart(string[] args)
{
    // All watcher config here.
    watcher.Path = "path";
    watcher.NotifyFilter = NotifyFilters.LastWrite;
    watcher.Filter = "filename_company-Pg1_Product*";
    watcher.Changed += new FileSystemEventHandler(LogFileSystemChanges);
    watcher.EnableRaisingEvents = true;
}
private void LogFileSystemChanges(object sender, FileSystemEventArgs e)
{
FileInfo sourcepath = new FileInfo(e.FullPath);
FileInfo destPath = new FileInfo(Path.Combine(dFile, e.Name));
FileInfo _tempPath = new FileInfo(Path.Combine(tempPath, e.Name));
if (PingService())
//PingService Bool Type....Ping Specific IP Before Compare/Copy
{
if (!destPath.Exists)
{
LogEvent(destPath + " DOES NOT EXIST!! ");
CopyFunction.CopyFile(sourcepath, destPath, true, true);
}
else
{
if (BinaryDiff(sFile, Path.Combine(dFile, e.Name)))
//DiffEngine If Source & Diff are Different is TRUE.
{
CopyFunction.CopyFile(sourcepath, destPath, true, true);
}
}
string msg = string.Format("Filename {0} are {1} now at {2} ",
    e.Name, e.ChangeType, DateTime.Now.ToString());
LogEvent(msg);
}
else
{
CopyFunction.CopyFile(sourcepath, _tempPath, true, true);
}
}
Use NServiceBus (with what they call a Saga).
It's an open-source project that lets you foolproof your code against cases where the network is down.
Or are you just asking how to make a thread?
If so, see the MSDN examples on creating threads.
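If it is the latter, a rough sketch of the timer-based retry the question describes could look like this inside the service class (PingService, CopyFunction, tempPath and dFile are the question's own names, assumed to be in scope; the 60-second interval is arbitrary):
// Retry timer: while the ping keeps failing, files pile up in the logtemp
// folder; once the server answers again, re-dump them and clean up.
private readonly System.Timers.Timer _retryTimer = new System.Timers.Timer(60000);

private void StartRetryTimer()
{
    _retryTimer.Elapsed += (s, e) =>
    {
        if (!PingService()) return; // still offline - wait for the next tick

        foreach (string file in Directory.GetFiles(tempPath))
        {
            FileInfo source = new FileInfo(file);
            FileInfo dest = new FileInfo(Path.Combine(dFile, source.Name));
            CopyFunction.CopyFile(source, dest, true, true);
            source.Delete();
        }
    };
    _retryTimer.AutoReset = true;
    _retryTimer.Start();
}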
I want to monitor a log file of our PBX for changes. I made a small program that does just that with a FileSystemWatcher.
Now it's getting strange: the FileSystemWatcher never fires the Changed event when I simply start the program, despite the fact that the log file really has changed. But when I open the directory where the log file is located in Windows Explorer, the program works as expected. But only as long as the Explorer window stays open... what the..?
Operating System: Windows Server 2008 R2
EDIT: Sorry, here is the code:
class Program
{
static void Main(string[] args)
{
new LogFileWatcher(@"C:\PBX\Dial.log");
System.Console.Read();
}
}
public class LogFileWatcher
{
public string LogFilePath { get; private set; }
private DateTime _lastLogFileWriteTime;
public LogFileWatcher(string path)
{
LogFilePath = path;
var directoryName = Path.GetDirectoryName(LogFilePath);
var fileName = Path.GetFileName(LogFilePath);
var fsw = new FileSystemWatcher { Path = directoryName, Filter = fileName };
fsw.Changed += fsw_Changed;
fsw.EnableRaisingEvents = true;
}
private void fsw_Changed(object sender, FileSystemEventArgs e)
{
// Get and fix the last write time of the log file
var fixLastWriteTime = File.GetLastWriteTime(LogFilePath);
// Don't do anything when file didn't change since last time
if (fixLastWriteTime == _lastLogFileWriteTime) return;
Console.WriteLine("File changed on: {0} - ID:{1}", DateTime.Now.ToLongTimeString(), Guid.NewGuid());
// Save last write time of the log file
_lastLogFileWriteTime = fixLastWriteTime;
}
}
EDIT2: Maybe this is important: the log file is in use by the PBX Windows service! I can open it with Notepad, though.
For optimization reasons, the FileStream.Flush() method doesn't flush metadata anymore (on Vista and later Microsoft operating systems). Therefore the FileSystemWatcher gets no file notification and won't fire the Changed event.
http://connect.microsoft.com/VisualStudio/feedback/details/94772/filestream-flush-does-not-flush-the-file-in-the-correct-way-to-work-with-filesystemwatcher-or-native-readdirectorychangesw
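Since the Changed notification cannot be relied on here, one possible workaround (a sketch, not part of the linked report) is to poll the log file on a timer and ask an open handle for its length, which is not affected by the stale directory metadata:
using System;
using System.IO;
using System.Timers;

public class PollingLogWatcher
{
    private readonly string _path;
    private readonly Timer _timer;
    private long _lastLength;

    public event Action<string> LogChanged;

    public PollingLogWatcher(string path, double intervalMs = 2000)
    {
        _path = path;
        _timer = new Timer(intervalMs);
        _timer.Elapsed += (s, e) => Check();
        _timer.Start();
    }

    private void Check()
    {
        try
        {
            // Opening the file and querying the handle reports the real size even
            // when the directory metadata has not been flushed yet.
            using (var fs = new FileStream(_path, FileMode.Open, FileAccess.Read,
                                           FileShare.ReadWrite | FileShare.Delete))
            {
                if (fs.Length != _lastLength)
                {
                    _lastLength = fs.Length;
                    LogChanged?.Invoke(_path);
                }
            }
        }
        catch (IOException)
        {
            // File locked or temporarily unavailable - try again on the next tick.
        }
    }
}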
I am using the FileSystemWatcher to be notified when new files get created in the network directory. We process the text files (about 5 KB in size) and delete them immediately when a new file gets created in the directory. If the FileSystemWatcher Windows service stops for some reason, we have to look for the unprocessed files after it gets back up and running. How can I handle a new file arriving while processing the old files from the directory? Any examples, please?
Thank you,
Here is the code example I have, with a simple form.
public partial class Form1 : Form
{
private System.IO.FileSystemWatcher watcher;
string tempDirectory = @"C:\test\";
public Form1()
{
InitializeComponent();
CreateWatcher();
GetUnprocessedFiles();
}
private void CreateWatcher()
{
//Create a new FileSystemWatcher.
watcher = new FileSystemWatcher();
watcher.Filter = "*.txt";
watcher.NotifyFilter = NotifyFilters.FileName;
//Subscribe to the Created event.
watcher.Created += new FileSystemEventHandler(watcher_FileCreated);
watcher.Path = #"C:\test\";
watcher.EnableRaisingEvents = true;
}
void watcher_FileCreated(object sender, FileSystemEventArgs e)
{
//Parse text file.
FileInfo objFileInfo = new FileInfo(e.FullPath);
if (!objFileInfo.Exists) return;
ParseMessage(e.FullPath);
}
void ParseMessage(string filePath)
{
// Parse text file here
}
void GetUnprocessedFiles()
{
// Put all txt files into an array. GetFiles returns full paths.
string[] array1 = Directory.GetFiles(@"C:\test\", "*.txt");
foreach (string name in array1)
{
    ParseMessage(name);
}
}
}
When the process starts, do the following (a rough sketch follows the list):
first get the contents of the folder
process every file (and delete them as you already do now)
repeat until no files are in the folder (check again here, since a new file could have been placed in the folder).
start the watcher
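A minimal sketch of that start-up sequence (ProcessFile is a hypothetical stand-in for the parse-and-delete step; the path and filter match the question's example):
string folder = @"C:\test\";

// Drain the folder first, repeating until it is empty, because a new file
// may have arrived while the previous batch was being processed.
string[] pending;
do
{
    pending = Directory.GetFiles(folder, "*.txt");
    foreach (var file in pending)
    {
        ProcessFile(file); // parse, then delete, as you already do
    }
} while (pending.Length > 0);

// Only now start watching for new files.
var watcher = new FileSystemWatcher(folder, "*.txt");
watcher.Created += (s, e) => ProcessFile(e.FullPath);
watcher.EnableRaisingEvents = true;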
For any of our services that use the FileSystemWatcher, we always process all the files that exist in the directory first, prior to starting the watcher. After the watcher has been started, we then start a timer (with a fairly long interval) to handle any files that appear in the directory without triggering the watcher (it does happen from time to time). That usually covers all the possibilities.