Deadlock while writing to Process.StandardInput - c#

I'm developing an application and I have a problem with a deadlock.
My code looks like this:
Process p = new Process(); // That uses another application
Then I'm sending an .xml file to this process:
XmlSerializer xs = new XmlSerializer(data.GetType());
using (var ms = new MemoryStream())
{
var sw = new StreamWriter(ms);
XmlWriter xmlwriter = XmlWriter.Create(sw, xmlWriterSettings);
xmlwriter.WriteProcessingInstruction("PipeConfiguratorStyleSheet", processing);
xs.Serialize(xmlwriter, data);
xmlwriter.Flush();
ms.Position = 0;
var sr = new StreamReader(ms);
while (!sr.EndOfStream)
{
String line = sr.ReadLine();
p.StandardInput.WriteLine(line);
Console.WriteLine(line);
p.BeginOutputReadLine();
p.CancelOutputRead();
}
}
So I can send part of my .xml file to the process, but at some point I get a deadlock.
I guess I don't know how to use BeginOutputReadLine() correctly.

First off, why don't you use the Process.StandardInput property directly as your target, like
var process = new Process
{
// all your init stuff
};
var xmlSerializer = new XmlSerializer(data.GetType());
var xmlwriter = XmlWriter.Create(process.StandardInput, xmlWriterSettings);
xmlSerializer.Serialize(xmlwriter, data);
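Note that Process.StandardInput is only available once the process has been started with standard input redirected, so the "init stuff" would at minimum contain something like the following sketch (the executable name is a placeholder, not from the question):
var process = new Process
{
StartInfo = new ProcessStartInfo("yourTool.exe") // placeholder executable
{
UseShellExecute = false,
RedirectStandardInput = true, // required before process.StandardInput can be used
RedirectStandardOutput = true
}
};
process.Start();
// process.StandardInput is now a StreamWriter that XmlWriter.Create can wrap;
// after serializing, flush the XmlWriter and close StandardInput so the child sees end-of-input.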
Otherwise, the MSDN entry gives a clear how-to for using Process.BeginOutputReadLine(), which you can remodel to
var autoResetEvent = new AutoResetEvent(false); // this event acts as our bouncer for the reading part
var process = new Process
{
// all your init stuff
};
process.StartInfo.UseShellExecute = false;
process.StartInfo.RedirectStandardOutput = true;
process.OutputDataReceived += (sender, args) => {
// TODO you could read the content here with args.Data
autoResetEvent.Set();
};
process.Start();
process.BeginOutputReadLine(); // start the asynchronous output read once; calling it again while a read is already active throws an InvalidOperationException
using (var memoryStream = new MemoryStream())
{
using (var streamWriter = new StreamWriter(memoryStream))
{
var xmlSerializer = new XmlSerializer(data.GetType());
var xmlwriter = XmlWriter.Create(streamWriter, xmlWriterSettings);
xmlSerializer.Serialize(xmlwriter, data);
}
memoryStream.Position = 0;
using (var streamReader = new StreamReader(memoryStream))
{
while (!streamReader.EndOfStream)
{
var line = streamReader.ReadLine();
process.StandardInput.WriteLine(line);
Console.WriteLine(line);
autoResetEvent.WaitOne(); // block until the OutputDataReceived handler signals that a line arrived
}
}
}
// TODO closing the process.StandardInput, exiting process, ...
Anyway - I know this should be a comment - is there a specific reason why you are waiting for your process to write something?
The StandardOutput stream can be read synchronously or asynchronously.
Methods such as Read, ReadLine, and ReadToEnd perform synchronous read
operations on the output stream of the process. These synchronous read
operations do not complete until the associated Process writes to its
StandardOutput stream, or closes the stream. In contrast,
BeginOutputReadLine starts asynchronous read operations on the
StandardOutput stream. This method enables a designated event handler
for the stream output and immediately returns to the caller, which can
perform other work while the stream output is directed to the event
handler.
Which means that if your process does not write anything (and you are waiting), you will be waiting for a response endlessly ...
EDIT
You should additionally add a handler to Process.ErrorDataReceived like
process.StartInfo.RedirectStandardError = true;
process.ErrorDataReceived += (sender, args) => {
// TODO do something with the response of args.Data
autoResetEvent.Set();
};
and start the error read once, right after process.Start():
process.Start();
process.BeginOutputReadLine();
process.BeginErrorReadLine(); // like BeginOutputReadLine, call this only once
while (!streamReader.EndOfStream)
{
var line = streamReader.ReadLine();
process.StandardInput.WriteLine(line);
Console.WriteLine(line);
autoResetEvent.WaitOne();
}
to handle error-cases as well (whatever that may mean).
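For completeness, here is a sketch of a common way to avoid the stdin/stdout deadlock altogether (placeholder executable name, not the OP's actual setup): start the asynchronous reads exactly once, write all input, close StandardInput, then wait for the process to exit.
var process = new Process
{
StartInfo = new ProcessStartInfo("yourTool.exe") // placeholder
{
UseShellExecute = false,
RedirectStandardInput = true,
RedirectStandardOutput = true,
RedirectStandardError = true
}
};
process.OutputDataReceived += (sender, args) => { if (args.Data != null) Console.WriteLine(args.Data); };
process.ErrorDataReceived += (sender, args) => { if (args.Data != null) Console.Error.WriteLine(args.Data); };
process.Start();
process.BeginOutputReadLine(); // start both asynchronous reads exactly once
process.BeginErrorReadLine();
using (var xmlwriter = XmlWriter.Create(process.StandardInput, xmlWriterSettings))
{
new XmlSerializer(data.GetType()).Serialize(xmlwriter, data);
}
process.StandardInput.Close(); // tell the child there is no more input
process.WaitForExit(); // output is drained asynchronously, so this cannot deadlock on a full pipe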

Related

JToken.WriteToAsync does not write to JsonWriter

I'm trying to create a middleware that changes the request in a certain way. I am able to read it and change the content, but I cannot figure out how to correctly set up the stream writers to create a new body. When I call normalized.WriteToAsync(jsonWriter) the MemoryStream remains empty and consequently I receive the "A non-empty request body is required." exception. What am I missing here? This is what I have so far:
public async Task Invoke(HttpContext context)
{
if (context.Request.ContentType == "application/json" && context.Request.ContentLength > 0)
{
using var scope = _logger.BeginScope("NormalizeJson");
try
{
using var requestReader = new HttpRequestStreamReader(context.Request.Body, Encoding.UTF8);
using var jsonReader = new JsonTextReader(requestReader);
var json = await JToken.LoadAsync(jsonReader);
var normalized = _normalize.Visit(json); // <-- Modify json and return JToken
// Create new Body
var memoryStream = new MemoryStream();
var requestWriter = new StreamWriter(memoryStream);
var jsonWriter = new JsonTextWriter(requestWriter);
await normalized.WriteToAsync(jsonWriter); // <-- At this point the MemoryStream has still 0 length.
var content = new StreamContent(memoryStream.Rewind()); // <-- Use helper extension to Seek.Begin = 0
context.Request.Body = await content.ReadAsStreamAsync();
}
catch (Exception e)
{
_logger.Scope().Exceptions.Push(e);
}
}
await _next(context);
}
Demo for LINQPad etc.:
async Task Main()
{
var token = JToken.FromObject(new User { Name = "Bob" });
var memoryStream = new MemoryStream();
var requestWriter = new StreamWriter(memoryStream);
var jsonWriter = new JsonTextWriter(requestWriter);
await token.WriteToAsync(jsonWriter);
memoryStream.Length.Dump(); // <-- MemoryStream.Length = 0
}
public class User
{
public string Name { get; set; }
}
You need to properly flush and close your JsonTextWriter and StreamWriter in order to fully populate the memoryStream, like so:
var memoryStream = new MemoryStream();
// StreamWriter implements IAsyncDisposable
// Leave the underlying stream open
await using (var requestWriter = new StreamWriter(memoryStream, leaveOpen: true))
{
var jsonWriter = new JsonTextWriter(requestWriter); // But JsonTextWriter does not implement IAsyncDisposable, only IDisposable!
try
{
await token.WriteToAsync(jsonWriter);
}
finally
{
await jsonWriter.CloseAsync();
}
}
Demo fiddle #1 here.
Or, since you're writing to a MemoryStream, there's really no need to use async at all, and instead you can do:
var memoryStream = new MemoryStream();
using (var requestWriter = new StreamWriter(memoryStream, leaveOpen: true)) // Leave the underlying stream open
using (var jsonWriter = new JsonTextWriter(requestWriter))
{
token.WriteTo(jsonWriter);
}
Demo fiddle #2 here.
Notes:
Note the use of await using for the StreamWriter. This syntax guarantees that the StreamWriter will be flushed and closed asynchronously, and can be used on any object that implements IAsyncDisposable. (This only really matters if you were writing to a file stream or other non-memory stream.)
It seems that neither JsonTextWriter nor the base class JsonWriter implement IAsyncDisposable, so I had to asynchronously close the JSON writer manually rather than via a using statement. The outer await using should ensure that the underlying StreamWriter is not left open in the event of an exception.
JSON RFC 8259 specifies that implementations MUST NOT add a byte order mark (U+FEFF) to the beginning of a networked-transmitted JSON text. Thus, when constructing a StreamWriter, it is recommended to pass an encoding such as new UTF8Encoding(false) that does not prepend a BOM. Alternatively, if you just want UTF-8, the StreamWriter constructors default to a UTF-8 encoding without a byte order mark (BOM) when you do not specify one yourself, as is done in the code above.
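As a minimal sketch of that recommendation (the bufferSize value is arbitrary and only needed to reach the leaveOpen parameter on older frameworks):
await using (var requestWriter = new StreamWriter(memoryStream, new UTF8Encoding(false), bufferSize: 1024, leaveOpen: true))
{
var jsonWriter = new JsonTextWriter(requestWriter);
try
{
await token.WriteToAsync(jsonWriter);
}
finally
{
await jsonWriter.CloseAsync();
}
}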

C# PSExec to execute multiple path from StreamReader parsing

I'm trying to learn how C# can read and parse multiple lines from a text file using StreamReader and afterwards process each line with PSExec.
cocomand.txt contains multiple lines, for example:
c:/command1.cmd
c:/command2.bat
c:/command3.cmd
private static void calleachline()
{
string pathx = @"c:\cocomand.txt";
using (StreamReader reader = new StreamReader(new FileStream(pathx, FileMode.Open, FileAccess.Read, FileShare.ReadWrite), Encoding.ASCII))
{
while ((!reader.EndOfStream))
{
System.Diagnostics.Process cmd = new System.Diagnostics.Process();
cmd.StartInfo.FileName = @"psexec.exe";
cmd.StartInfo.Arguments = @"\\localhost";
cmd.StartInfo.UseShellExecute = false;
cmd.StartInfo.RedirectStandardOutput = true;
cmd.Start();
if (!cmd.WaitForExit(cmd2))
{
ExecutePSKill(cmd);
}
else
{
//
}
}
}
}
I've tried to understand this from a few threads, but with my lack of knowledge it still doesn't work.
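For reference, a minimal sketch of what this loop could look like, under a few assumptions (psexec.exe is on the PATH, each line of cocomand.txt is passed to it as the remote command, and a hypothetical timeoutMs constant replaces the undefined cmd2 variable):
private static void CallEachLine()
{
const string pathx = @"c:\cocomand.txt";
const int timeoutMs = 60000; // assumed timeout in milliseconds; the original cmd2 value is not shown
foreach (var line in File.ReadLines(pathx))
{
if (string.IsNullOrWhiteSpace(line)) continue;
using (var cmd = new Process())
{
cmd.StartInfo.FileName = @"psexec.exe";
cmd.StartInfo.Arguments = @"\\localhost " + line; // pass the command path read from the file
cmd.StartInfo.UseShellExecute = false;
cmd.StartInfo.RedirectStandardOutput = true;
cmd.Start();
Console.WriteLine(cmd.StandardOutput.ReadToEnd()); // drain the output so the child cannot block on a full pipe
if (!cmd.WaitForExit(timeoutMs))
{
// TODO kill the hung process here (the original ExecutePSKill helper is not shown)
}
}
}
}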

Writing to Filestream and copying to MemoryStream

I want to overwrite or create an xml file on disk, and return the xml from the function. I figured I could do this by copying from FileStream to MemoryStream. But I end up appending a new xml document to the same file, instead of creating a new file each time.
What am I doing wrong? If I remove the copying, everything works fine.
public static string CreateAndSave(IEnumerable<OrderPage> orderPages, string filePath)
{
if (orderPages == null || !orderPages.Any())
{
return string.Empty;
}
var xmlBuilder = new StringBuilder();
var writerSettings = new XmlWriterSettings
{
Indent = true,
Encoding = Encoding.GetEncoding("ISO-8859-1"),
CheckCharacters = false,
ConformanceLevel = ConformanceLevel.Document
};
using (var fs = new FileStream(filePath, FileMode.OpenOrCreate, FileAccess.ReadWrite))
{
try
{
XmlWriter xmlWriter = XmlWriter.Create(fs, writerSettings);
xmlWriter.WriteStartElement("PRINT_JOB");
WriteXmlAttribute(xmlWriter, "TYPE", "Order Confirmations");
foreach (var page in orderPages)
{
xmlWriter.WriteStartElement("PAGE");
WriteXmlAttribute(xmlWriter, "FORM_TYPE", page.OrderType);
var outBound = page.Orders.SingleOrDefault(x => x.FlightInfo.Direction == FlightDirection.Outbound);
var homeBound = page.Orders.SingleOrDefault(x => x.FlightInfo.Direction == FlightDirection.Homebound);
WriteXmlOrder(xmlWriter, outBound, page.ContailDetails, page.UserId, page.PrintType, FlightDirection.Outbound);
WriteXmlOrder(xmlWriter, homeBound, page.ContailDetails, page.UserId, page.PrintType, FlightDirection.Homebound);
xmlWriter.WriteEndElement();
}
xmlWriter.WriteFullEndElement();
MemoryStream destination = new MemoryStream();
fs.CopyTo(destination);
Log.Progress("Xml string length: {0}", destination.Length);
xmlBuilder.Append(Encoding.UTF8.GetString(destination.ToArray()));
destination.Flush();
destination.Close();
xmlWriter.Flush();
xmlWriter.Close();
}
catch (Exception ex)
{
Log.Warning(ex, "Unhandled exception occured during create of xml. {0}", ex.Message);
throw;
}
fs.Flush();
fs.Close();
}
return xmlBuilder.ToString();
}
Cheers
Jens
FileMode.OpenOrCreate is causing the file contents to be overwritten without shortening, leaving any 'trailing' data from previous runs. If FileMode.Create is used the file will be truncated first. However, to read back the contents you just wrote you will need to use Seek to reset the file pointer.
Also, flush the XmlWriter before copying from the underlying stream.
See also the question Simultaneous Read Write a file in C Sharp (3817477).
The following test program seems to do what you want (less your own logging and Order details).
using System;
using System.IO;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Xml;
using System.Threading.Tasks;
namespace ReadWriteTest
{
class Program
{
static void Main(string[] args)
{
string filePath = Path.Combine(
Environment.GetFolderPath(Environment.SpecialFolder.Personal),
"Test.xml");
string result = CreateAndSave(new string[] { "Hello", "World", "!" }, filePath);
Console.WriteLine("============== FIRST PASS ==============");
Console.WriteLine(result);
result = CreateAndSave(new string[] { "Hello", "World", "AGAIN", "!" }, filePath);
Console.WriteLine("============== SECOND PASS ==============");
Console.WriteLine(result);
Console.ReadLine();
}
public static string CreateAndSave(IEnumerable<string> orderPages, string filePath)
{
if (orderPages == null || !orderPages.Any())
{
return string.Empty;
}
var xmlBuilder = new StringBuilder();
var writerSettings = new XmlWriterSettings
{
Indent = true,
Encoding = Encoding.GetEncoding("ISO-8859-1"),
CheckCharacters = false,
ConformanceLevel = ConformanceLevel.Document
};
using (var fs = new FileStream(filePath, FileMode.Create, FileAccess.ReadWrite))
{
try
{
XmlWriter xmlWriter = XmlWriter.Create(fs, writerSettings);
xmlWriter.WriteStartElement("PRINT_JOB");
foreach (var page in orderPages)
{
xmlWriter.WriteElementString("PAGE", page);
}
xmlWriter.WriteFullEndElement();
xmlWriter.Flush(); // Flush from xmlWriter to fs
xmlWriter.Close();
fs.Seek(0, SeekOrigin.Begin); // Go back to read from the beginning
MemoryStream destination = new MemoryStream();
fs.CopyTo(destination);
xmlBuilder.Append(Encoding.UTF8.GetString(destination.ToArray()));
destination.Flush();
destination.Close();
}
catch (Exception ex)
{
throw;
}
fs.Flush();
fs.Close();
}
return xmlBuilder.ToString();
}
}
}
For the optimizers out there: the StringBuilder is unnecessary because the string is formed whole, and the MemoryStream can be avoided by simply wrapping fs in a StreamReader. That would make the code as follows.
public static string CreateAndSave(IEnumerable<string> orderPages, string filePath)
{
if (orderPages == null || !orderPages.Any())
{
return string.Empty;
}
string result;
var writerSettings = new XmlWriterSettings
{
Indent = true,
Encoding = Encoding.GetEncoding("ISO-8859-1"),
CheckCharacters = false,
ConformanceLevel = ConformanceLevel.Document
};
using (var fs = new FileStream(filePath, FileMode.Create, FileAccess.ReadWrite))
{
try
{
XmlWriter xmlWriter = XmlWriter.Create(fs, writerSettings);
xmlWriter.WriteStartElement("PRINT_JOB");
foreach (var page in orderPages)
{
xmlWriter.WriteElementString("PAGE", page);
}
xmlWriter.WriteFullEndElement();
xmlWriter.Close(); // Flush from xmlWriter to fs
fs.Seek(0, SeekOrigin.Begin); // Go back to read from the beginning
var reader = new StreamReader(fs, writerSettings.Encoding);
result = reader.ReadToEnd();
// reader.Close(); // This would just flush/close fs early (which would be OK)
}
catch (Exception ex)
{
throw;
}
}
return result;
}
I know I'm late, but there seems to be a simpler solution. You want your function to generate xml, write it to a file and return the generated xml. Apparently allocating a string cannot be avoided (because you want it to be returned), and the same goes for writing to a file. But reading from a file (as in your and SensorSmith's solutions) can easily be avoided by simply "swapping" the operations: generate the xml string first, then write it to the file. Like this:
var output = new StringBuilder();
var writerSettings = new XmlWriterSettings { /* your settings ... */ };
using (var xmlWriter = XmlWriter.Create(output, writerSettings))
{
// Your xml generation code using the writer
// ...
// You don't need to flush the writer, it will be done automatically
}
// Here the output variable contains the xml, let's take it...
var xml = output.ToString();
// write it to a file...
File.WriteAllText(filePath, xml);
// and we are done :-)
return xml;
IMPORTANT UPDATE: It turns out that the XmlWriter.Create(StringBuilder, XmlWriterSettings) overload ignores the Encoding from the settings and always uses "utf-16", so don't use this method if you need another encoding.
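A common workaround (a sketch, not part of the original answer, using the same namespaces and writerSettings as the code above) is a StringWriter subclass that reports the desired encoding, since XmlWriter takes the declared encoding from the TextWriter it writes to. The in-memory string itself is still UTF-16; only the XML declaration and any later file write use the chosen encoding:
public sealed class StringWriterWithEncoding : StringWriter
{
private readonly Encoding _encoding;
public StringWriterWithEncoding(Encoding encoding) { _encoding = encoding; }
public override Encoding Encoding { get { return _encoding; } }
}
// usage:
var output = new StringWriterWithEncoding(Encoding.GetEncoding("ISO-8859-1"));
using (var xmlWriter = XmlWriter.Create(output, writerSettings))
{
// ... same xml generation code as in the answers above ...
xmlWriter.WriteStartElement("PRINT_JOB");
xmlWriter.WriteFullEndElement();
}
var xml = output.ToString(); // the declaration now reads encoding="ISO-8859-1"
File.WriteAllText(filePath, xml, Encoding.GetEncoding("ISO-8859-1"));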

C# concurrent filestream read/write eof

I have a thread producing lines in a log file:
var t1 = Task.Factory.StartNew(() =>
{
using (var fileStream = File.Open(file, FileMode.Create, FileAccess.Write, FileShare.Read))
using (var streamWriter = new StreamWriter(fileStream))
{
for (var i = 0; i < 10; i++)
{
streamWriter.WriteLine(i);
streamWriter.Flush();
Thread.Sleep(1000);
}
}
File.Delete(file);
});
And I have a thread reading lines from the same log file:
// Reads lines from the log file.
var t2 = Task.Factory.StartNew(() =>
{
Thread.Sleep(500); // Horrible wait to ensure file existence in this test case.
using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Delete | FileShare.Read | FileShare.Write))
using (var streamReader = new StreamReader(fileStream))
{
string line;
while ((line = streamReader.ReadLine()) != null) Console.WriteLine(line);
// FIXME: The stream reader stops instead of doing a continuous read.
Console.WriteLine("End of file");
}
});
The reader is supposed to read all written lines, so it should wait for more data instead of stopping the first time it encounters EOF. I do not mind if the reader is never 'finished'; as long as the file continues to exist, the reader is allowed to continue reading. How can I achieve this? Full code for reproduction purposes:
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
namespace PlayGround
{
internal static class Program
{
private static void Main()
{
const string file = "test.log";
// Writes lines to the log file.
var t1 = Task.Factory.StartNew(() =>
{
using (var fileStream = File.Open(file, FileMode.Create, FileAccess.Write, FileShare.Read))
using (var streamWriter = new StreamWriter(fileStream))
{
for (var i = 0; i < 10; i++)
{
streamWriter.WriteLine(i);
streamWriter.Flush();
Thread.Sleep(1000);
}
}
File.Delete(file);
});
// Reads lines from the log file.
var t2 = Task.Factory.StartNew(() =>
{
Thread.Sleep(500); // Horrible wait to ensure file existence in this test case.
using (
var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read,
FileShare.Delete | FileShare.Read | FileShare.Write))
using (var streamReader = new StreamReader(fileStream))
{
string line;
while ((line = streamReader.ReadLine()) != null) Console.WriteLine(line);
// FIXME: The stream reader stops instead of doing a continuous read.
Console.WriteLine("End of file");
}
});
Task.WaitAll(t1, t2);
}
}
}
EDIT: As a practical example, this is useful for a scenario where a third party process is producing log entries which need to be read and processed. You could see this as a log file reader if that makes the application and use clearer.
You could perform a wait when line == null by checking the streamReader.EndOfStream property. Using Thread.Sleep(1000) is not an ideal solution and a bit hacky, and I guess there are better alternative solutions out there. :-)
using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Delete | FileShare.Read | FileShare.Write))
using (var streamReader = new StreamReader(fileStream))
{
string line;
bool running = true; // we may want to terminate this loop in some condition.
while (running)
{
line = streamReader.ReadLine();
if (line != null)
{
Console.WriteLine(line);
}
else // as per edit, the whole else block can be omitted.
{
while (streamReader.EndOfStream)
{
Thread.Sleep(1000); // wait around for some time. This could end up in an infinite loop if the file is not written to anymore.
}
}
}
// FIXME: The stream reader stops instead of doing a continuous read.
Console.WriteLine("End of file");
}
EDIT: You can do without the else block, i.e. drop this part:
else
{
while (streamReader.EndOfStream)
{
Thread.Sleep(1000);
}
}
Like this:
while (running)
{
line = streamReader.ReadLine();
if (line != null)
{
Console.WriteLine(line);
}
}
You need a synchronization mechanism; in this case I use an AutoResetEvent.
The required changes based on your code are:
const string file = "test.log";
// Added line:
AutoResetEvent signal = new AutoResetEvent(false);
streamWriter.Flush();
// Added line:
signal.Set();
File.Delete(file);
// Added line:
signal.Set();
// Replace this:
Thread.Sleep(500);
// with:
signal.WaitOne();
// Replace this:
while ((line = streamReader.ReadLine()) != null) Console.WriteLine(line);
// with:
while ((line = streamReader.ReadLine()) != null)
{
signal.WaitOne();
Console.WriteLine(line);
}
Full code
const string file = "test.log";
AutoResetEvent signal = new AutoResetEvent(false);
// Writes lines to the log file.
var t1 = Task.Factory.StartNew(() =>
{
using (var fileStream = File.Open(file, FileMode.Create, FileAccess.Write, FileShare.Read))
{
using (var streamWriter = new StreamWriter(fileStream))
{
for (var i = 0; i < 10; i++)
{
streamWriter.WriteLine(i);
streamWriter.Flush();
signal.Set();
Thread.Sleep(10);
}
}
}
File.Delete(file);
signal.Set();
});
// Reads lines from the log file.
var t2 = Task.Factory.StartNew(() =>
{
signal.WaitOne();
using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Delete | FileShare.Read | FileShare.Write))
{
using (var streamReader = new StreamReader(fileStream))
{
string line;
while ((line = streamReader.ReadLine()) != null)
{
signal.WaitOne();
Console.WriteLine(line);
}
// FIXME: The stream reader stops instead of doing a continuous read.
Console.WriteLine("End of file");
}
}
});
Task.WaitAll(t1, t2);

2 Threads, 1 File

Since this can't be solved so easily, how can I implement one thread that writes strings to a file/buffer line by line using Console.WriteLine(), and another thread that reads those strings from the same file/buffer, also line by line? I guess I need to:
redirect Console to the file/buffer
read the file/buffer thread-safely; when a line is written it must be read by the other thread
make that asynchronous (no ReadToEnd(), it must be live)
Try memory-mapped files; they allow you to read and write one shared file from multiple threads. As to redirecting the console, try:
Console.SetIn
Console.SetOut
I would like to do that with a buffer though.
Solution with file:
class Program
{
private static bool terminated = false;
private static void listen()
{
StreamReader file = new StreamReader(new FileStream("C:/test.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite));
while (!terminated || !file.EndOfStream)
if (!file.EndOfStream)
{
string text = file.ReadLine();
MessageBox.Show(text); // display it
}
}
static void Main(string[] args)
{
StreamWriter sw = new StreamWriter(new FileStream("C:/test.txt", FileMode.Create, FileAccess.Write, FileShare.Read));
sw.AutoFlush = true;
Console.SetOut(sw);
new Thread(new ThreadStart(listen)).Start();
for (int i = 0; i < 10; i++)
{
Thread.Sleep(250);
Console.Out.WriteLine("hello world - " + i);
}
terminated = true;
}
}
Works line by line and doesn't miss one.
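Since the follow-up asks for a buffer instead of a file, here is a minimal in-memory sketch (not from the original answer) that redirects Console.Out to a hypothetical QueueWriter pushing completed lines into a BlockingCollection, which the second thread consumes:
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Text;
using System.Threading;
class QueueWriter : TextWriter
{
private readonly BlockingCollection<string> lines;
private readonly StringBuilder current = new StringBuilder();
public QueueWriter(BlockingCollection<string> lines) { this.lines = lines; }
public override Encoding Encoding { get { return Encoding.UTF8; } }
public override void Write(char value)
{
// the base TextWriter funnels WriteLine calls through Write(char), so collect until '\n'
if (value == '\n') { lines.Add(current.ToString().TrimEnd('\r')); current.Clear(); }
else current.Append(value);
}
}
class Program
{
static void Main()
{
var lines = new BlockingCollection<string>();
var originalOut = Console.Out; // keep the real console for the reader thread
Console.SetOut(new QueueWriter(lines));
var reader = new Thread(() =>
{
foreach (var line in lines.GetConsumingEnumerable())
originalOut.WriteLine("read: " + line); // display it on the real console
});
reader.Start();
for (int i = 0; i < 10; i++)
{
Thread.Sleep(250);
Console.Out.WriteLine("hello world - " + i);
}
lines.CompleteAdding(); // lets GetConsumingEnumerable finish
reader.Join();
}
}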
