I'm overwriting a file using C# in Windows Phone 7. When I do this a seemingly random character is added to the start of each line.
Why is this happening?
Code:
public static bool overwriteFile(string filename, string[] inputArray)
{
try
{
IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication();
FileStream stream = store.OpenFile(filename, FileMode.Create);
BinaryWriter writer = new BinaryWriter(stream);
foreach (string input in inputArray)
{
writer.Write(input + "\n");
}
writer.Close();
return true;
}
catch (IOException ex)
{
return false;
}
}
Loading Code:
public static Idea[] getFile(string filename)
{
try
{
IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication();
string fileContents = null;
if (store.FileExists(filename)) // Check if file exists
{
IsolatedStorageFileStream save = new IsolatedStorageFileStream(filename, FileMode.Open, store);
StreamReader streamReader = new StreamReader(save);
fileContents = streamReader.ReadToEnd();
save.Close();
}
string[] lines = null;
if (fileContents != null)
{
lines = fileContents.Split('\n');
}
Idea[] ideaList = null;
if (lines != null)
{
ideaList = new Idea[lines.Length];
for (int i = 0; i < lines.Length; i++)
{
ideaList[i] = new Idea(lines[i].TrimEnd('\r'));
}
}
return ideaList;
}
catch (IOException ex)
{
return null;
}
}
The "random" character is a length prefix: BinaryWriter.Write(String) writes the string's length as a 7-bit-encoded integer before the characters; see http://msdn.microsoft.com/en-us/library/yzxa6408.aspx.
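For illustration (a minimal sketch, not part of the original answer), you can see that prefix by writing one string to a MemoryStream and dumping the bytes:
// Sketch: BinaryWriter.Write(string) emits a 7-bit-encoded length before the characters.
using (var ms = new MemoryStream())
{
    using (var writer = new BinaryWriter(ms))
    {
        writer.Write("Hello\n");
    }
    // Prints 06-48-65-6C-6C-6F-0A: the leading 0x06 is the length prefix, not part of your text.
    Console.WriteLine(BitConverter.ToString(ms.ToArray()));
}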
You should be using some type of TextWriter to write strings to the file, NOT a BinaryWriter.
A StreamWriter is probably best, and then you could use its WriteLine method.
Instead of using '\n', try using Environment.NewLine
You are using a BinaryWriter to write, and a TextReader to read. Change your write code to use a StreamWriter (which is a TextWriter) instead of a BinaryWriter. This will also get you the WriteLine method that Naveed recommends.
Try changing this:
writer.Write(input + "\n");
to
writer.WriteLine(input);
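Putting those suggestions together, the write method might look like this (a sketch that keeps the isolated-storage setup from the question; the loading code already splits on '\n' and trims '\r', so WriteLine's line endings are fine):
public static bool overwriteFile(string filename, string[] inputArray)
{
    try
    {
        IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication();
        using (IsolatedStorageFileStream stream = store.OpenFile(filename, FileMode.Create))
        using (StreamWriter writer = new StreamWriter(stream)) // text writer: no length prefixes
        {
            foreach (string input in inputArray)
            {
                writer.WriteLine(input); // appends a newline for you
            }
        }
        return true;
    }
    catch (IOException)
    {
        return false;
    }
}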
Related
I am trying to create a list using FileStream/StreamReader. Everything works fine except that the price calculation is reset every time a new line is added.
I believe the issue is with the save method. I am sure it is not caused by functions in my classes, since the price displays properly; something seems to go wrong when the string is saved.
This is my read method:
public static List<Customer> ReadCustomers()
{
// create an empty customer list
List<Customer> customerList = new List<Customer>();
// new Filestream
FileStream fs = null;
// new StreamReader
StreamReader sr = null;
Customer c; // for reading
string line;
string[] fields;
try
{
fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Read);
sr = new StreamReader(fs);
while (!sr.EndOfStream)// while there is data
{
line = sr.ReadLine();
fields = line.Split(','); // split sections by commas
c = new Customer(); // initializes customer object
c.AccountNo = Convert.ToInt32(fields[0].Trim());
c.CustomerName = Convert.ToString(fields[1].Trim());
c.CustomerType = Convert.ToChar(fields[2].Trim());
c.CustomerCharge = Convert.ToDecimal(fields[3].Trim());
customerList.Add(c);
}
}
catch (Exception ex)
{
throw ex;
}
finally // always execute
{
if (fs != null) fs.Close(); // close file
}
return customerList;
}
This is where I try to save the string...
public static void SaveCustomers(List<Customer> list)
{
FileStream fs = null;
StreamWriter sw = null;
string line;
try
{
fs = new FileStream(path, FileMode.Create, FileAccess.Write);
sw = new StreamWriter(fs);
foreach (Customer c in list) // for each customer in the list
{
line = c.AccountNo.ToString() + ", " + c.CustomerName.ToString() + ", " +
c.CustomerType.ToString() + ", " + c.CustomerCharge.ToString(); // make a line with data
sw.WriteLine(line); // and write it to the file
}
}
catch(Exception ex)
{
throw ex;
}
finally
{
if (sw != null) sw.Close(); // stream writer close
if (fs != null) fs.Close();
}
}
Calculation:
public override decimal CalculateCharge()
{
decimal peak;
decimal offpeak;
if (Kwh1 <= INDUST_BASE_HOURS)
{
peak = KWH_PEAK_BASE_PRICE;
}
else
{
peak = ((Kwh1 - INDUST_BASE_HOURS) * KWH_INDUST_PEAK) + KWH_PEAK_BASE_PRICE;
}
if (Kwh2 <= INDUST_BASE_HOURS)
{
offpeak = KWH_OFF_PEAK_BASE_PRICE;
}
else
{
offpeak = ((Kwh2 - INDUST_BASE_HOURS) * KWH_INDUST_OFFPEAK) + KWH_OFF_PEAK_BASE_PRICE;
}
return peak + offpeak;
}
In SaveCustomers(), are you sure you want to open the file like this:
fs = new FileStream(path, FileMode.Create, FileAccess.Write);
You may want:
fs = new FileStream(path, FileMode.Append, FileAccess.Write);
FileMode.Create will destroy the file if it exists.
FileMode.Append will append to an existing file if it exists (and create it if it does not).
For clarity while testing, you might also output to a different file from the one you read in, as in the sketch below.
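For example (a sketch; the debug file name is just an illustration), the open lines in SaveCustomers() could temporarily become:
// Sketch: write to a separate file while testing so the file you read from stays intact.
string debugPath = Path.Combine(Path.GetDirectoryName(path), "customers_debug.txt"); // hypothetical name
fs = new FileStream(debugPath, FileMode.Create, FileAccess.Write);
sw = new StreamWriter(fs);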
Try using the StreamWriter constructor overload that takes an append parameter:
new StreamWriter("c:\\file.txt", true);
http://msdn.microsoft.com/en-us/library/36b035cb.aspx
Or see this related question, which covers a similar problem:
C# add text to text file without rewriting it?
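If all you need is to add lines to the end of an existing text file, File.AppendAllText does it in one call; a minimal sketch (reusing the path and line variables from the question):
// Sketch: append a single line without rewriting the rest of the file.
File.AppendAllText(path, line + Environment.NewLine);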
I am trying to read all the text of a TXT file that is inside a ZIP file. The unzipped file is about 1 GB.
The following code does not throw any errors but never finishes; is there any way to speed up the process?
var fileText = string.Empty;
using (var file = File.OpenRead(System.Configuration.ConfigurationManager.AppSettings["zipPath"]))
using (var zip = new ZipArchive(file, ZipArchiveMode.Read))
{
using (var stream = zip.Entries.First().Open())
{
using (var streamReader = new StreamReader(stream))
{
try
{
while (streamReader.Peek() >= 0)
{
var line = streamReader.ReadLine();
fileText = fileText + line;
}
}
catch (Exception ex)
{
Console.WriteLine(ex);
}
}
}
}
Try:
var fileText = string.Empty;
using (var file = File.OpenRead(System.Configuration.ConfigurationManager.AppSettings["zipPath"]))
using (var zip = new ZipArchive(file, ZipArchiveMode.Read))
{
using (var stream = zip.Entries.First().Open())
{
using (var streamReader = new StreamReader(stream))
{
try
{
while (!streamReader.EndOfStream)
{
var line = streamReader.ReadLine();
fileText = fileText + line;
}
}
catch (Exception ex)
{
Console.WriteLine(ex);
}
}
}
}
However, consider using ReadAllLines for the easiest solution.
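If the whole entry really does need to end up in a single string, reading it in one call also avoids the repeated string concatenation inside the loop, which can dominate the time on a 1 GB file. A minimal sketch (note it still holds roughly 1 GB of text in memory):
var fileText = string.Empty;
using (var file = File.OpenRead(System.Configuration.ConfigurationManager.AppSettings["zipPath"]))
using (var zip = new ZipArchive(file, ZipArchiveMode.Read))
using (var streamReader = new StreamReader(zip.Entries.First().Open()))
{
    fileText = streamReader.ReadToEnd(); // one read, no per-line string concatenation
}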
I want to overwrite or create an xml file on disk, and return the xml from the function. I figured I could do this by copying from FileStream to MemoryStream. But I end up appending a new xml document to the same file, instead of creating a new file each time.
What am I doing wrong? If I remove the copying, everything works fine.
public static string CreateAndSave(IEnumerable<OrderPage> orderPages, string filePath)
{
if (orderPages == null || !orderPages.Any())
{
return string.Empty;
}
var xmlBuilder = new StringBuilder();
var writerSettings = new XmlWriterSettings
{
Indent = true,
Encoding = Encoding.GetEncoding("ISO-8859-1"),
CheckCharacters = false,
ConformanceLevel = ConformanceLevel.Document
};
using (var fs = new FileStream(filePath, FileMode.OpenOrCreate, FileAccess.ReadWrite))
{
try
{
XmlWriter xmlWriter = XmlWriter.Create(fs, writerSettings);
xmlWriter.WriteStartElement("PRINT_JOB");
WriteXmlAttribute(xmlWriter, "TYPE", "Order Confirmations");
foreach (var page in orderPages)
{
xmlWriter.WriteStartElement("PAGE");
WriteXmlAttribute(xmlWriter, "FORM_TYPE", page.OrderType);
var outBound = page.Orders.SingleOrDefault(x => x.FlightInfo.Direction == FlightDirection.Outbound);
var homeBound = page.Orders.SingleOrDefault(x => x.FlightInfo.Direction == FlightDirection.Homebound);
WriteXmlOrder(xmlWriter, outBound, page.ContailDetails, page.UserId, page.PrintType, FlightDirection.Outbound);
WriteXmlOrder(xmlWriter, homeBound, page.ContailDetails, page.UserId, page.PrintType, FlightDirection.Homebound);
xmlWriter.WriteEndElement();
}
xmlWriter.WriteFullEndElement();
MemoryStream destination = new MemoryStream();
fs.CopyTo(destination);
Log.Progress("Xml string length: {0}", destination.Length);
xmlBuilder.Append(Encoding.UTF8.GetString(destination.ToArray()));
destination.Flush();
destination.Close();
xmlWriter.Flush();
xmlWriter.Close();
}
catch (Exception ex)
{
Log.Warning(ex, "Unhandled exception occured during create of xml. {0}", ex.Message);
throw;
}
fs.Flush();
fs.Close();
}
return xmlBuilder.ToString();
}
Cheers
Jens
FileMode.OpenOrCreate causes the file contents to be overwritten without truncating the file, leaving any 'trailing' data from previous runs. If FileMode.Create is used, the file is truncated first. However, to read back the contents you just wrote you will need to use Seek to reset the file pointer.
Also, flush the XmlWriter before copying from the underlying stream.
See also the question Simultaneous Read Write a file in C Sharp (3817477).
The following test program seems to do what you want (less your own logging and Order details).
using System;
using System.IO;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Xml;
using System.Threading.Tasks;
namespace ReadWriteTest
{
class Program
{
static void Main(string[] args)
{
string filePath = Path.Combine(
Environment.GetFolderPath(Environment.SpecialFolder.Personal),
"Test.xml");
string result = CreateAndSave(new string[] { "Hello", "World", "!" }, filePath);
Console.WriteLine("============== FIRST PASS ==============");
Console.WriteLine(result);
result = CreateAndSave(new string[] { "Hello", "World", "AGAIN", "!" }, filePath);
Console.WriteLine("============== SECOND PASS ==============");
Console.WriteLine(result);
Console.ReadLine();
}
public static string CreateAndSave(IEnumerable<string> orderPages, string filePath)
{
if (orderPages == null || !orderPages.Any())
{
return string.Empty;
}
var xmlBuilder = new StringBuilder();
var writerSettings = new XmlWriterSettings
{
Indent = true,
Encoding = Encoding.GetEncoding("ISO-8859-1"),
CheckCharacters = false,
ConformanceLevel = ConformanceLevel.Document
};
using (var fs = new FileStream(filePath, FileMode.Create, FileAccess.ReadWrite))
{
try
{
XmlWriter xmlWriter = XmlWriter.Create(fs, writerSettings);
xmlWriter.WriteStartElement("PRINT_JOB");
foreach (var page in orderPages)
{
xmlWriter.WriteElementString("PAGE", page);
}
xmlWriter.WriteFullEndElement();
xmlWriter.Flush(); // Flush from xmlWriter to fs
xmlWriter.Close();
fs.Seek(0, SeekOrigin.Begin); // Go back to read from the beginning
MemoryStream destination = new MemoryStream();
fs.CopyTo(destination);
xmlBuilder.Append(Encoding.UTF8.GetString(destination.ToArray()));
destination.Flush();
destination.Close();
}
catch (Exception ex)
{
throw;
}
fs.Flush();
fs.Close();
}
return xmlBuilder.ToString();
}
}
}
For the optimizers out there, the StringBuilder was unnecessary because the string is formed whole and the MemoryStream can be avoided by just wrapping fs in a StreamReader. This would make the code as follows.
public static string CreateAndSave(IEnumerable<string> orderPages, string filePath)
{
if (orderPages == null || !orderPages.Any())
{
return string.Empty;
}
string result;
var writerSettings = new XmlWriterSettings
{
Indent = true,
Encoding = Encoding.GetEncoding("ISO-8859-1"),
CheckCharacters = false,
ConformanceLevel = ConformanceLevel.Document
};
using (var fs = new FileStream(filePath, FileMode.Create, FileAccess.ReadWrite))
{
try
{
XmlWriter xmlWriter = XmlWriter.Create(fs, writerSettings);
xmlWriter.WriteStartElement("PRINT_JOB");
foreach (var page in orderPages)
{
xmlWriter.WriteElementString("PAGE", page);
}
xmlWriter.WriteFullEndElement();
xmlWriter.Close(); // Flush from xmlWriter to fs
fs.Seek(0, SeekOrigin.Begin); // Go back to read from the beginning
var reader = new StreamReader(fs, writerSettings.Encoding);
result = reader.ReadToEnd();
// reader.Close(); // This would just flush/close fs early(which would be OK)
}
catch (Exception ex)
{
throw;
}
}
return result;
}
I know I'm late, but there seems to be a simpler solution. You want your function to generate xml, write it to a file, and return the generated xml. Allocating a string cannot be avoided (because you want it returned), and the same goes for writing to a file. But reading from a file (as in your and SensorSmith's solutions) can easily be avoided by simply "swapping" the operations: generate the xml string first, then write it to the file. Like this:
var output = new StringBuilder();
var writerSettings = new XmlWriterSettings { /* your settings ... */ };
using (var xmlWriter = XmlWriter.Create(output, writerSettings))
{
// Your xml generation code using the writer
// ...
// You don't need to flush the writer, it will be done automatically
}
// Here the output variable contains the xml, let's take it...
var xml = output.ToString();
// write it to a file...
File.WriteAllText(filePath, xml);
// and we are done :-)
return xml;
IMPORTANT UPDATE: It turns out that the XmlWriter.Create(StringBuilder, XmlWriterSettings) overload ignores the Encoding from the settings and always uses "utf-16", so don't use this method if you need another encoding.
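If a specific encoding declaration is required, one workaround (a sketch, not from the original answer) is to write to a MemoryStream with the desired settings and decode the bytes yourself:
// Sketch: XmlWriter.Create(Stream, settings) does honour settings.Encoding.
var encoding = Encoding.GetEncoding("ISO-8859-1");
var settings = new XmlWriterSettings { Indent = true, Encoding = encoding };
string xml;
using (var ms = new MemoryStream())
{
    using (var xmlWriter = XmlWriter.Create(ms, settings))
    {
        xmlWriter.WriteStartElement("PRINT_JOB");
        // ... your xml generation code ...
        xmlWriter.WriteFullEndElement();
    } // disposing the writer flushes it into the stream
    xml = encoding.GetString(ms.ToArray()); // declaration now reads ISO-8859-1
}
File.WriteAllText(filePath, xml, encoding);
return xml;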
How do I zip a folder using ICSharpCode.SharpZipLib?
Is there any way I can add an encryption password while zipping it?
Using any other DLL is not an option; I have to use only SharpZipLib.
Currently I am using this code block
private static void CompressFiles(string folderPath) {
string zipOutput = @"C:\temp\myoutput.zip";
try {
using (ZipOutputStream zs = new ZipOutputStream(File.Create(zipOutput))) {
zs.SetLevel(9); // 0-9 (9 being best compression)
foreach (string file in Directory.GetFiles(folderPath)) {
ZipEntry entry = new ZipEntry(Path.GetFileName(file));
entry.DateTime = DateTime.Now;
using (FileStream fs = File.OpenRead(file)) {
byte[] buffer = new byte[fs.Length];
fs.Read(buffer, 0, buffer.Length);
entry.Size = buffer.Length; // This is very important
zs.PutNextEntry(entry);
zs.Write(buffer, 0, buffer.Length);
}
}
zs.Finish();
zs.Close();
}
}
catch { throw; }
}
It can zip all of the files in the folder, but what I want is to zip the whole folder, so that the folders inside that folder are also included in the zip file.
Thanks in advance
Use the FastZip object.
ICSharpCode.SharpZipLib.Zip.FastZip z = new ICSharpCode.SharpZipLib.Zip.FastZip();
z.CreateEmptyDirectories = true;
z.CreateZip("F:\\ZipTest.zip", "F:\\ZipTest\\", true, "");
if (File.Exists("F:\\ZipTest.zip"))
Console.WriteLine("Done");
else
Console.WriteLine("Failed");
I use the following code:
public static bool ZipIt(string sourcePath, string destinationPath)
{
List<string> ListOfFiles = GetListOfFiles(sourcePath);
try
{
string OutPath = destinationPath + ".zip";
int TrimLength = (Directory.GetParent(sourcePath)).ToString().Length;
TrimLength += 1;
//remove '\'
FileStream ostream;
byte[] obuffer;
ZipOutputStream oZipStream = new ZipOutputStream(System.IO.File.Create(OutPath));
oZipStream.Password = EncodePassword("Password");
oZipStream.SetLevel(9);
// 9 = maximum compression level
ZipEntry oZipEntry;
foreach (string Fil in ListOfFiles.ToArray()) // for each file, generate a zipentry
{
oZipEntry = new ZipEntry(Fil.Remove(0, TrimLength));
oZipStream.PutNextEntry(oZipEntry);
if (!Fil.EndsWith(@"/")) // if an entry ends with '/' it is a directory
{
ostream = File.OpenRead(Fil);
obuffer = new byte[ostream.Length];
ostream.Read(obuffer, 0, obuffer.Length);
oZipStream.Write(obuffer, 0, obuffer.Length);
ostream.Close();
}
}
oZipStream.Finish();
oZipStream.Close();
return true;
}
catch (Exception ex)
{
return false;
}
}
public static string EncodePassword(string originalPassword)
{
Byte[] encodedBytes;
encodedBytes = ASCIIEncoding.Default.GetBytes(originalPassword);
return BitConverter.ToString(encodedBytes);
}
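GetListOfFiles is not shown in that answer; a hypothetical sketch of such a helper would walk the folder recursively and mark directories with a trailing '/', which is what ZipIt's EndsWith check expects:
// Hypothetical helper (not from the original answer): collect every subdirectory and file.
public static List<string> GetListOfFiles(string sourcePath)
{
    var result = new List<string>();
    foreach (string dir in Directory.GetDirectories(sourcePath, "*", SearchOption.AllDirectories))
    {
        result.Add(dir + "/"); // trailing '/' marks a directory entry
    }
    foreach (string file in Directory.GetFiles(sourcePath, "*", SearchOption.AllDirectories))
    {
        result.Add(file);
    }
    return result;
}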
public byte[] GetFile(string filename)
{
FileStream aStream = File.Open(filename, FileMode.Open, FileAccess.Read);
BinaryReader binReader = new BinaryReader(aStream);
binReader.BaseStream.Position = 0;
byte[] binFile = binReader.ReadBytes(Convert.ToInt32(binReader.BaseStream.Length));
binReader.Close();
return binFile;
}
I run this method for a number of file paths. The problem is that whenever a file cannot be accessed with File.Open (because it is being used by another process) I get:
'aStream.Position' threw an exception of type 'System.ObjectDisposedException'
on the following line:
binReader.BaseStream.Position = 0;
And occasionally I get
{System.IO.IOException: The process cannot access the file '\folder\file.txt' because it is being used by another process.}
This is the exception I want. So why is the object disposed most of the time?
Note: I originally had the FileStream line in a using statement but removed it because I thought that might be what disposed the object; the problem remains either way.
Edit: Using Compact Framework, which doesn't have ReadAllBytes.
Perhaps part of the time your FileStream throws the IOException because the file is in use, and at other times you get the ObjectDisposedException because your array is never initialized.
Obviously, I cannot test this theory.
See if you can copy-n-paste this one with good results:
public byte[] GetFile(string filename)
{
byte[] binFile = null;
try
{
using (var aStream = File.Open(filename, FileMode.Open, FileAccess.Read))
{
BinaryReader binReader = new BinaryReader(aStream);
binFile = new byte[binReader.BaseStream.Length];
binReader.BaseStream.Position = 0; // <= this step should not be necessary
binFile = binReader.ReadBytes((int)binReader.BaseStream.Length); // ReadBytes takes an int, so cast the length
binReader.Close();
}
} catch (IOException err) {
// file is being used by another process.
} catch (ObjectDisposedException err) {
// I am guessing you would never see this because your binFile is not disposed
}
return binFile;
}
Be sure to check the return value for null!
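A quick usage sketch (the path here is only an example):
byte[] data = GetFile(@"C:\temp\somefile.bin"); // hypothetical path
if (data == null)
{
    // the file was locked, missing, or unreadable; handle it here
}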
EDIT:
I wrote (what I think is) a simpler version. I tested it, and it seems to work OK. I also prefer the Read() overload to ReadBytes(), because it tells me how much data was actually read.
First is the test function that calls the method for every image in my Pictures folder:
public void Test() {
DirectoryInfo dir = new DirectoryInfo(Environment.GetFolderPath(Environment.SpecialFolder.Personal));
foreach (var subDir in dir.GetDirectories()) {
if (-1 < subDir.Name.ToLower().IndexOf("pictures")) {
foreach (var file in subDir.GetFiles()) {
byte[] data = GetFile(file.FullName);
if (data != null) {
Console.WriteLine(data.Length);
}
}
}
}
}
public byte[] GetFile(string filename) {
byte[] result = null;
try {
if (File.Exists(filename)) {
int len = 0;
FileInfo file = new FileInfo(filename);
byte[] data = new byte[file.Length];
using (BinaryReader br = new BinaryReader(file.Open(FileMode.Open, FileAccess.Read))) {
len = br.Read(data, 0, data.Length);
br.Close();
}
if (0 < len) {
if (len == data.Length) {
return data;
} else {
// this section of code was never triggered in my tests;
// however, it is good to keep it as a backup.
byte[] dat2 = new byte[len];
Array.Copy(data, dat2, len);
return dat2;
}
}
}
} catch (IOException err) {
// file is being used by another process.
} catch (ObjectDisposedException err) {
// I am guessing you would never see this because your binFile is not disposed
}
return result;
}
I don't see any reason why these would not work, unless you are hitting an int overflow.
Just use this:
byte[] contents = File.ReadAllBytes(filename);
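Since the asker is on the Compact Framework, which lacks File.ReadAllBytes, a rough equivalent sketch is:
// Sketch: read an entire file into a byte array without File.ReadAllBytes.
byte[] contents;
using (FileStream fs = File.Open(filename, FileMode.Open, FileAccess.Read))
{
    contents = new byte[(int)fs.Length];
    int offset = 0;
    while (offset < contents.Length)
    {
        int read = fs.Read(contents, offset, contents.Length - offset);
        if (read == 0) break; // unexpected end of file
        offset += read;
    }
}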
Why don't you simply use
public byte[] GetFile(string filename)
{
try { return File.ReadAllBytes(filename); }
catch { return null; }
}
Just for fun, you could even define an extension method
public static class Extensions
{
public static byte[] GetFile(this string filename)
{
try { return File.ReadAllBytes(filename); }
catch { return null; }
}
}
so you could do byte[] myfile = filename.GetFile();.
Remember you must check that the return value is not null before proceeding:
if (myfile != null)
{
// Do what you need
}