I know this question has probably been answered multiple times across this site, but even after looking at those solutions I still don't understand why my program isn't writing to the text file I specified. Here is the code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.IO;

namespace Main
{
    class Program
    {
        public static void Main(string[] args)
        {
            using (StreamWriter writer = new StreamWriter("test.txt"))
            {
                writer.WriteLine("Hello, World!");
            }
        }
    }
}
My program throws no exceptions and exits with 0, so I don't understand why this still isn't working properly. If someone could please provide an answer along with an explanation of why this doesn't work, I would really appreciate it.
EDIT: Okay, I fixed the code after playing around a bit. It turns out that when I read the file, the text I wrote is actually there. So a clearer statement of the problem would be:
The text writes to the file, but it is not visible to me when I open the file from my project in Visual Studio. I am not sure why, and this is causing some confusion.
Your sample is correct. The file is saved to the build output path (the folder where yourApp.exe is).
You can use an absolute path to control where the file will be saved, for example StreamWriter writer = new StreamWriter(@"c:\test\test.txt").
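To see exactly where the file ends up, a small sketch that makes the location explicit might look like this (nothing here beyond the standard AppDomain.CurrentDomain.BaseDirectory property and Path.Combine):

using System;
using System.IO;

class Program
{
    static void Main(string[] args)
    {
        // A relative path like "test.txt" resolves against the current working
        // directory, which is normally the build output folder (e.g. bin\Debug).
        // Building an absolute path removes the guesswork entirely.
        string folder = AppDomain.CurrentDomain.BaseDirectory; // folder containing the exe
        string fullPath = Path.Combine(folder, "test.txt");
        Console.WriteLine("Writing to: " + fullPath);
        using (StreamWriter writer = new StreamWriter(fullPath))
        {
            writer.WriteLine("Hello, World!");
        }
    }
}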
So far I haven't found anything that would allow my program to access text files that are in the same folder as it.
For example:
If my file is in C:/testingfolder, I would need to use C:/testingfolder/filenames.txt to access the other files. The problem with this is that it won't always be in C:/testingfolder; it might instead be in E:/importantfiles or F:/backup, and it needs to run from all of those.
If anyone could explain or give code that shows how to turn a longer path into a "same folder" path, that would answer my question.
With Environment.CurrentDirectory you can get the directory your process is executing in; then use the System.IO.Path.Combine() method to concatenate that path with the name of your file, which gives you the absolute location of your file.
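A minimal sketch of that suggestion, using the filenames.txt example from the question:

using System;
using System.IO;

class Program
{
    static void Main(string[] args)
    {
        // Folder the process is running from (C:/testingfolder, E:/importantfiles, ...)
        string folder = Environment.CurrentDirectory;
        // Absolute path to a file sitting in that same folder
        string filePath = Path.Combine(folder, "filenames.txt");
        Console.WriteLine("Looking for: " + filePath);
    }
}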
You need to use System.IO and System.Text
using System;
using System.IO;
using System.Text;
then
static void Main(string[] args)
{
    string line = "";
    // look for the file "myfile.txt" in application root directory
    using (StreamReader sr = new StreamReader("myfile.txt"))
    {
        while ((line = sr.ReadLine()) != null)
        {
            Console.WriteLine(line);
        }
    }
    Console.ReadKey();
}
After copying a file to a temporary directory, I am unable to delete the copy because of an UnauthorizedAccessException. The idea here is to get a copy of the file, zip it and then delete the copy, but even after removing all the code between File.Copy and File.Delete I am still getting the exception. Exiting the program frees the lock and allows me to delete the copy without issue.
Is there a way to copy without causing this persistent lock (and preserve file metadata like LastModified)? Or a way to release the lock? Should there even be a lock on the copied file after File.Copy finishes?
I am using Visual C# 2010 SP1 targeting .NET Framework 4.0.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Net;
using System.IO;
using System.Xml;

namespace Test
{
    class Program
    {
        static void Main(string[] args)
        {
            String FileName = "C:\\test.txt";
            // Generate temporary directory name
            String directory = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
            // Temporary file path
            String tempfile = Path.Combine(directory, Path.GetFileName(FileName));
            // Create directory in file system
            Directory.CreateDirectory(directory);
            // Copy input file to the temporary directory
            File.Copy(FileName, tempfile);
            // Delete file in temporary directory
            File.Delete(tempfile);
        }
    }
}
Check whether your "C:\\test.txt" file is read-only or not.
Based on your comments, this may be why you can copy the file but can't delete the copy.
Try the below:
// Clear the ReadOnly flag (and any other attributes) copied over from the source file
File.SetAttributes(tempfile, FileAttributes.Normal);
File.Delete(tempfile);
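If you also need to preserve metadata such as LastModified on the copy (as the question asks), a rough sketch on top of the above, using the standard File.GetLastWriteTime/File.SetLastWriteTime calls, might be:

// FileName and tempfile as in the question's code
File.Copy(FileName, tempfile);
DateTime lastModified = File.GetLastWriteTime(FileName); // capture the timestamp
File.SetAttributes(tempfile, FileAttributes.Normal);     // clear ReadOnly so Delete works
File.SetLastWriteTime(tempfile, lastModified);           // re-stamp it, in case anything touched it
// ... zip tempfile here ...
File.Delete(tempfile);                                   // now succeeds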
I see you've already found an answer, but I'll add this for reference anyway: a possible alternative approach might be to create the copy in a memory stream instead of copying the file to your hard drive.
Using the DotNetZip library, you could do something like this:
using (var ms = new MemoryStream())
{
    using (var zip = new ZipFile())
    {
        zip.AddEntry(fileName, data);
        zip.Save(ms);
    }
}
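The fileName and data variables in that snippet aren't defined above; they would presumably come from the original file, along these lines:

// Hypothetical setup for the snippet above: read the source file straight
// into memory, so there is no on-disk copy to delete afterwards.
string fileName = Path.GetFileName(FileName); // entry name inside the archive
byte[] data = File.ReadAllBytes(FileName);    // file contents as a byte[]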
I'm trying to embed an XML file into a C# console application by right-clicking the file -> Build Action -> Embedded Resource.
How do I then access this embedded resource?
XDocument XMLDoc = XDocument.Load(???);
Edit: Hi all, despite all the bashing this question received, here's an update.
I managed to get it working by using
XDocument.Load(new System.IO.StreamReader(System.Reflection.Assembly.GetExecutingAssembly().GetManifestResourceStream("Namespace.FolderName.FileName.Extension")))
It didn't work previously because the folder name containing the resource file within the project was not included (none of the examples I found seemed to have that).
Thanks everyone who tried to help.
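If anyone else hits the same naming problem: you can list every resource name the compiled assembly actually contains with the standard GetManifestResourceNames method, which takes the guesswork out of it:

using System;
using System.Reflection;

class Program
{
    static void Main(string[] args)
    {
        // Prints names like "Namespace.FolderName.FileName.Extension",
        // exactly the string GetManifestResourceStream expects.
        foreach (string name in Assembly.GetExecutingAssembly().GetManifestResourceNames())
        {
            Console.WriteLine(name);
        }
    }
}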
Something along these lines:
using System.IO;
using System.Reflection;
using System.Xml;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            // The resource name is "<default namespace>.<file name>"
            using (Stream stream = Assembly.GetExecutingAssembly().GetManifestResourceStream("ConsoleApplication1.XMLFile1.xml"))
            using (StreamReader reader = new StreamReader(stream))
            {
                XmlDocument doc = new XmlDocument();
                doc.LoadXml(reader.ReadToEnd());
            }
        }
    }
}
Here is a link to the Microsoft doc that describes how to do it. http://support.microsoft.com/kb/319292
I am writing a simple web service using .NET. One method is used to send a chunk of a file from the client to the server; the server opens a temp file and appends this chunk. The files are quite large, 80 MB. The network I/O seems fine, but the append write to the local file slows down progressively as the file gets larger.
The following is the code that slows down, running on the server, where aFile is a string and aData is a byte[]:
using (StreamWriter lStream = new StreamWriter(aFile, true))
{
    BinaryWriter lWriter = new BinaryWriter(lStream.BaseStream);
    lWriter.Write(aData);
}
Debugging this process, I can see that exiting the using statement gets slower and slower.
If I run this code in a simple standalone test application, the writes are the same speed every time, about 3 ms; note the buffer (aData) is always the same size, about 0.5 MB.
I have tried all sorts of experiments with different writers and system copies to append scratch files; all slow down when running under the web service.
Why is this happening? I suspect the web service is trying to cache access to local file system objects. How can I turn this off for specific files?
More information -
If I hard-code the path, the speed is fine, like so:
using (StreamWriter lStream = new StreamWriter("c:\\test.dat", true))
{
    BinaryWriter lWriter = new BinaryWriter(lStream.BaseStream);
    lWriter.Write(aData);
}
But then it is slow copying this scratch file to the final file destination later on -
File.Copy("c:\\test.dat", aFile);
If I use any variable in the path it gets slow again, so for example -
using (StreamWriter lStream = new StreamWriter("c:\\test" + someVariable, true))
{
    BinaryWriter lWriter = new BinaryWriter(lStream.BaseStream);
    lWriter.Write(aData);
}
It has been commented that I should not use StreamWriter; note that I tried many ways to open the file using FileStream, none of which made any difference when the code runs under the web service. I tried WriteThrough etc.
It's the strangest thing. I even tried this -
Write the data to file a.dat
Spawn system "cmd" "copy /b b.dat + a.dat b.dat"
Delete a.dat
This slows down the same way!
Makes me think the web server is running in some protected file I/O environment, catching all file operations in this process and its child processes. I could understand this if I were generating a file that might later be served to a client, but I am not; what I am doing is storing large binary blobs on disk, with an index/pointer to them stored in a database. If I comment out the write to the file, the whole process flies, with no performance issues at all.
I started reading about web server caching strategies, which makes me wonder: is there a web.config setting to mark a folder as uncached? Or am I completely barking up the wrong tree?
A long shot: is it possible that you need to close some resources when you have finished?
If the file is binary, then why are you using a StreamWriter, which is derived from TextWriter? Just use a FileStream.
Also, BinaryWriter implements IDisposable, so you need to put it into a using block.
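A minimal sketch of both points together, with aFile and aData as in the question:

// Append raw bytes with a FileStream; no text-encoding layer involved.
using (FileStream lStream = new FileStream(aFile, FileMode.Append, FileAccess.Write))
using (BinaryWriter lWriter = new BinaryWriter(lStream))
{
    lWriter.Write(aData);
}
// (For a plain byte[], lStream.Write(aData, 0, aData.Length) would do just as well.)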
Update: I replicated the basic code, no database, kept it simple, and it seems to work fine, so I suspect there is another reason. I will sleep on it over the weekend...
Here is the replicated server code -
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Services;
using System.IO;

namespace TestWS
{
    /// <summary>
    /// Summary description for Service1
    /// </summary>
    [WebService(Namespace = "http://tempuri.org/")]
    [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
    [System.ComponentModel.ToolboxItem(false)]
    // To allow this Web Service to be called from script, using ASP.NET AJAX, uncomment the following line.
    // [System.Web.Script.Services.ScriptService]
    public class Service1 : System.Web.Services.WebService
    {
        private string GetFileName()
        {
            if (File.Exists("index.dat"))
            {
                using (StreamReader lReader = new StreamReader("index.dat"))
                {
                    return lReader.ReadLine();
                }
            }
            else
            {
                using (StreamWriter lWriter = new StreamWriter("index.dat"))
                {
                    string lFileName = Path.GetRandomFileName();
                    lWriter.Write(lFileName);
                    return lFileName;
                }
            }
        }

        [WebMethod]
        public string WriteChunk(byte[] aData)
        {
            Directory.SetCurrentDirectory(Server.MapPath("Data"));
            DateTime lStart = DateTime.Now;
            using (FileStream lStream = new FileStream(GetFileName(), FileMode.Append))
            {
                BinaryWriter lWriter = new BinaryWriter(lStream);
                lWriter.Write(aData);
            }
            DateTime lEnd = DateTime.Now;
            return lEnd.Subtract(lStart).TotalMilliseconds.ToString();
        }
    }
}
And the replicated client code -
static void Main(string[] args)
{
    Service1 s = new Service1();
    byte[] b = new byte[1024 * 512];
    for (int i = 0; i < 160; i++)
    {
        Console.WriteLine(s.WriteChunk(b));
    }
}
Based on your code, it appears you're using the default handling inside of StreamWriter for files, which means synchronous and exclusive locks on the file.
Based on your comments, it seems the issue you really want to solve is the return time from the web service -- not necessarily the write time for the file. While the write time is the current gating factor as you've discovered, you might be able to get around your issue by going to an asynchronous-write mode.
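As a rough sketch of what an asynchronous write could look like, assuming you can target .NET 4.5 or later for WriteAsync (on .NET 4.0 the BeginWrite/EndWrite pair would be the equivalent); the method name here is made up:

// Requires: using System.IO; using System.Threading.Tasks;
static async Task AppendChunkAsync(string aFile, byte[] aData)
{
    // FileOptions.Asynchronous opens the handle for overlapped I/O.
    using (var stream = new FileStream(aFile, FileMode.Append, FileAccess.Write,
                                       FileShare.None, 4096, FileOptions.Asynchronous))
    {
        await stream.WriteAsync(aData, 0, aData.Length);
    }
}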
Alternatively, I prefer completely decoupled asynchronous operations. In that scenario, the inbound byte[] of data is saved to its own file (or some other structure), then appended to the master file by a secondary process. More complex to operate, but also less prone to failure.
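A hedged sketch of that decoupled idea; the chunkDir, chunkIndex and masterFile names here are made up for illustration:

// 1) In the web method: drop each incoming chunk into its own numbered file
//    and return immediately (chunkIndex would be supplied by the client).
string chunkPath = Path.Combine(chunkDir, chunkIndex.ToString("D6") + ".chunk");
File.WriteAllBytes(chunkPath, aData);

// 2) In a secondary process, once all chunks have arrived: stitch them
//    together in order, then clean up.
using (var master = new FileStream(masterFile, FileMode.Create, FileAccess.Write))
{
    string[] chunks = Directory.GetFiles(chunkDir, "*.chunk");
    Array.Sort(chunks); // zero-padded names sort into arrival order
    foreach (string chunk in chunks)
    {
        byte[] bytes = File.ReadAllBytes(chunk);
        master.Write(bytes, 0, bytes.Length);
        File.Delete(chunk);
    }
}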
I don't have enough points to vote up an answer, but jro has the right idea. We do something similar in our service; each chunk is saved to its own temp file, then as soon as all chunks are received they're reassembled into a single file.
I'm not certain of the underlying process for appending data to a file using StreamWriter, but I would assume it has to at least read to the end of the current file before attempting to write whatever is in the buffer. So as the file gets larger, it would have to read more and more of the existing file before writing the next chunk.
Well, I found the root cause: "Microsoft Forefront Security". Group Policy has this running real-time scanning, and I could see the process go to 30% CPU usage when I closed the file. After killing this process, everything runs at the same speed, outside and inside the web service!
Next task: find a way to add an exclusion to MFS!