I integrated this EPUB Reader into my project and it is working fine. Now I want to load the file from the SD card instead of the device's isolated storage.
To open a file from isolated storage we have IsolatedStorageFileStream, like this:
IsolatedStorageFileStream isfs;
using (IsolatedStorageFile isf = IsolatedStorageFile.GetUserStoreForApplication())
{
try
{
isfs = isf.OpenFile([Path to file], FileMode.Open);
}
catch
{
return;
}
}
ePubView.Source = isfs;
For a file on the SD card I tried this:
ExternalStorageDevice sdCard = (await ExternalStorage.GetExternalStorageDevicesAsync()).FirstOrDefault();
// If the SD card is present, get the file from the SD card.
if (sdCard != null)
{
ExternalStorageFile file = await sdCard.GetFileAsync(_sdFilePath);
// _sdFilePath is a string holding the path of the file on the SD card
// Create a stream for the file.
Stream stream = await file.OpenForReadAsync();
// Read the file data.
ePubView.Source = stream;
}
Here I am getting a System.IO.EndOfStreamException.
If you want to try it, here is my project sample link.
Question: How can I set my file as the source of the epubView control?
Is this the proper way? Please give me a suggestion.
Thanks
I haven't tried your approach, so I can't say exactly where the error is (maybe the file from the SD card is read asynchronously and that's why you get the EndOfStreamException; also keep in mind that, as stated on the EPUB Reader site, it's under heavy development). Check whether, after copying the file to isolated storage, you are able to use it. In this case I would first try copying from the SD card to a MemoryStream, like this:
ExternalStorageDevice sdCard = (await ExternalStorage.GetExternalStorageDevicesAsync()).FirstOrDefault();
if (sdCard != null)
{
MemoryStream newStream = null;
using (ExternalStorageFile file = await sdCard.GetFileAsync(_sdFilePath))
using (Stream SDfile = await file.OpenForReadAsync())
newStream = await ReadToMemory(SDfile);
ePubView.Source = newStream;
}
And ReadToMemory:
private async Task<MemoryStream> ReadToMemory(Stream streamToRead)
{
MemoryStream targetStream = new MemoryStream();
const int BUFFER_SIZE = 1024;
byte[] buf = new byte[BUFFER_SIZE];
int bytesread = 0;
while ((bytesread = await streamToRead.ReadAsync(buf, 0, BUFFER_SIZE)) > 0)
{
targetStream.Write(buf, 0, bytesread);
}
// Rewind so the reader starts from the beginning of the copied data
targetStream.Seek(0, SeekOrigin.Begin);
return targetStream;
}
Maybe it will help.
There's a bug with the stream returned from ExternalStorageFile. There are two options to get around it:
If the file is small then you can simply copy the stream to a MemoryStream:
Stream s = await file.OpenForReadAsync();
MemoryStream ms = new MemoryStream();
s.CopyTo(ms);
ms.Position = 0; // rewind before handing the copy to the reader
However, if the file is too large you'll run into memory issues, so the following stream wrapper class can be used to work around Microsoft's bug (in future versions of Windows Phone you'll need to disable this fix once the bug has been fixed):
using System;
using System.IO;
namespace WindowsPhoneBugFix
{
/// <summary>
/// Stream wrapper to circumnavigate buggy Stream reading of stream returned by ExternalStorageFile.OpenForReadAsync()
/// </summary>
public sealed class ExternalStorageFileWrapper : Stream
{
private Stream _stream; // Underlying stream
public ExternalStorageFileWrapper(Stream stream)
{
if (stream == null)
throw new ArgumentNullException("stream");
_stream = stream;
}
// Workaround described here - http://stackoverflow.com/a/21538189/250254
public override long Seek(long offset, SeekOrigin origin)
{
ulong uoffset = (ulong)offset;
ulong fix = ((uoffset & 0xffffffffL) << 32) | ((uoffset & 0xffffffff00000000L) >> 32);
return _stream.Seek((long)fix, origin);
}
public override bool CanRead
{
get { return _stream.CanRead; }
}
public override bool CanSeek
{
get { return _stream.CanSeek; }
}
public override bool CanWrite
{
get { return _stream.CanWrite; }
}
public override void Flush()
{
_stream.Flush();
}
public override long Length
{
get { return _stream.Length; }
}
public override long Position
{
get
{
return _stream.Position;
}
set
{
_stream.Position = value;
}
}
public override int Read(byte[] buffer, int offset, int count)
{
return _stream.Read(buffer, offset, count);
}
public override void SetLength(long value)
{
_stream.SetLength(value);
}
public override void Write(byte[] buffer, int offset, int count)
{
_stream.Write(buffer, offset, count);
}
}
}
Code is available here to drop in to your project:
https://github.com/gavinharriss/ExternalStorageFileWrapper-wp8
Example of use:
ExternalStorageFile file = await device.GetFileAsync(filename); // device is an instance of ExternalStorageDevice
Stream streamOriginal = await file.OpenForReadAsync();
ExternalStorageFileWrapper streamToUse = new ExternalStorageFileWrapper(streamOriginal);
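Tying this back to the question, a minimal sketch of feeding the wrapped stream to the EPUB control could look like this (assuming ePubView.Source accepts any readable, seekable Stream, as it does for the isolated-storage stream):
ExternalStorageDevice sdCard = (await ExternalStorage.GetExternalStorageDevicesAsync()).FirstOrDefault();
if (sdCard != null)
{
    ExternalStorageFile file = await sdCard.GetFileAsync(_sdFilePath);
    Stream streamOriginal = await file.OpenForReadAsync();
    // Wrap the buggy SD-card stream before handing it to the reader
    ePubView.Source = new ExternalStorageFileWrapper(streamOriginal);
}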
I'm coming from Node.js, so I'm using some Node.js examples to get the concept across.
Temp directory: I'm using .NET Core so the app can run on Mac, Windows, or Linux, and my confusion is about where the temp directory is on each operating system, since that is where this image will be downloaded. (In Node.js you can find the temp dir on any OS with os.tmpDir().)
File format unknown: the file format is unknown; it isn't necessary to know it in order to download and save the image, but it is necessary when sending it to the browser in the headers, and it can be determined using magic numbers. Reference
Download Image from Web
using (WebClient webClient = new WebClient())
{
byte [] data = webClient.DownloadData("https://fbcdn-sphotos-h-a.akamaihd.net/hphotos-ak-xpf1/v/t34.0-12/10555140_10201501435212873_1318258071_n.jpg?oh=97ebc03895b7acee9aebbde7d6b002bf&oe=53C9ABB0&__gda__=1405685729_110e04e71d9");
using (MemoryStream mem = new MemoryStream(data))
{
using (var yourImage = Image.FromStream(mem))
{
// how to save in the temp directory for all operating systems
}
}
}
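For the temp-directory part, a minimal sketch using Path.GetTempPath(), which resolves to the platform's temp location on Windows, macOS, and Linux (the URL is a placeholder, and the file is written without an extension since the extension can be added later from the magic number; tempPathForImage is the same path the next snippet reads from):
using System;
using System.IO;
using System.Net;

using (WebClient webClient = new WebClient())
{
    // Placeholder URL; Path.GetTempPath() returns the OS temp directory on any platform.
    byte[] data = webClient.DownloadData("https://example.com/image");
    string tempPathForImage = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("N"));
    File.WriteAllBytes(tempPathForImage, data);
    Console.WriteLine("Saved to {0}", tempPathForImage);
}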
Send the image to browser
byte[] ar;
using (FileStream fstream = new FileStream(tempPathForImage, FileMode.Open, FileAccess.Read))
{
ar = new byte[fstream.Length];
fstream.Read(ar, 0, ar.Length);
}
sw.WriteLine("Content-Type: "); // image/jpeg unknown, check first 4 bytes
sw.WriteLine("Content-Length: {0}", ar.Length); //Let's
sw.WriteLine();
sw.BaseStream.Write(ar, 0, ar.Length);
Magic Numbers:
Magic numbers are basically the first 4 bytes of a file, which can help identify the file type/extension. I've done something similar in Node.js: right before the image gets streamed, I get a chance to read the first four bytes, set the header (content type), and then continue streaming. It's important to set the header before the image starts streaming; that's why the following code checks for writeStream == null.
response.on('data', function(chunk){
if(writeStream == null) {
url += '.' + getExtension(chunk.toString('hex', 0, 4));
writeStream = fs.createWriteStream(url);
writeStream.on('error', reject);
writeStream.on('finish', function(){
data.file = url;
resolve(data);
});
}
writeStream.write(chunk);
});
Some file formats and their magic numbers.
"ffd8ffDB": "jpg",
"ffd8ffe0": "jpg",
"ffd8ffe1": "jpg",
"ffd8ffe2": "jpg",
"ffd8ffe3": "jpg",
"ffd8ffe8": "jpg",
"ffd8ffdb": "jpg",
"89504e47": "png",
"47494638": "gif",
Question
1. Solve the problem of finding the temp directory.
2. How to read the first four bytes while streaming, set the content-type header (once) just before sending, and then continue streaming.
You can create a specialized stream that wraps another stream, then override Read() to handle that detection. Here's a starting point. Most of this is boilerplate that defers to the wrapped stream.
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
class ImageDetectionStream : Stream
{
public string Mime { get; private set; }
private readonly Stream _stream;
private readonly byte[] _consideredBytes = new byte[MaxMagicNumberSize];
private int _consideredPosition;
private static readonly IDictionary<byte[], string> Magics = new Dictionary<byte[], string>
{
[new byte[] { 0xff, 0xd8, 0xff, 0xdb }] = "image/jpeg",
[new byte[] { 0xff, 0xd8, 0xff, 0xe0 }] = "image/jpeg",
[new byte[] { 0xff, 0xd8, 0xff, 0xe1 }] = "image/jpeg",
// and so on...
};
private static readonly int MaxMagicNumberSize = Magics.Keys.Max(x => x.Length);
public ImageDetectionStream(Stream stream)
{
_stream = stream ?? throw new ArgumentNullException(nameof(stream));
}
public override int Read(byte[] buffer, int offset, int count)
{
var value = _stream.Read(buffer, offset, count);
if (Mime != null) return value;
// Accumulate the first bytes across reads until enough are available to test.
var toCopy = Math.Min(value, _consideredBytes.Length - _consideredPosition);
Array.Copy(buffer, offset, _consideredBytes, _consideredPosition, toCopy);
_consideredPosition += toCopy;
if (_consideredPosition < MaxMagicNumberSize) return value;
foreach (var magic in Magics)
{
var possibleMagic = _consideredBytes.Take(magic.Key.Length).ToArray();
if (possibleMagic.SequenceEqual(magic.Key))
{
Mime = magic.Value;
break;
}
}
return value;
}
// boilerplate
public override void Flush()
{
_stream.Flush();
}
public override long Seek(long offset, SeekOrigin origin)
{
return _stream.Seek(offset, origin);
}
public override void SetLength(long value)
{
_stream.SetLength(value);
}
public override void Write(byte[] buffer, int offset, int count)
{
_stream.Write(buffer, offset, count);
}
public override bool CanRead => _stream.CanRead;
public override bool CanSeek => _stream.CanSeek;
public override bool CanWrite => _stream.CanWrite;
public override long Length => _stream.Length;
public override long Position
{
get => _stream.Position;
set => _stream.Position = value;
}
}
Example use -
using (var fs = File.OpenRead("\\path\\to\\image\\file"))
using (var imageStream = new ImageDetectionStream(fs))
{
var bytes = new byte[128];
var bytesRead = imageStream.Read(bytes, 0, bytes.Length);
Console.WriteLine($"Image has {imageStream.Mime} type.");
}
Outputs:
Image has image/jpeg type.
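To cover the second part of the question (setting the content-type header once before the body goes out), a rough sketch could look like this, where sw is the StreamWriter from the send-to-browser snippet above and the 128-byte first read is an arbitrary size that comfortably covers the magic number:
using (var fs = File.OpenRead("\\path\\to\\image\\file"))
using (var imageStream = new ImageDetectionStream(fs))
{
    // Read an initial chunk so the wrapper can sniff the magic number.
    var firstChunk = new byte[128];
    var firstRead = imageStream.Read(firstChunk, 0, firstChunk.Length);

    // Set the headers exactly once, before any body bytes are sent.
    sw.WriteLine("Content-Type: {0}", imageStream.Mime ?? "application/octet-stream");
    sw.WriteLine("Content-Length: {0}", fs.Length);
    sw.WriteLine();
    sw.Flush();

    // Send the chunk already read, then stream the remainder.
    sw.BaseStream.Write(firstChunk, 0, firstRead);
    imageStream.CopyTo(sw.BaseStream);
}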
End goal:
Users are uploading a large number of files of different sizes to my web site, and I don't want duplicate files on the disk.
The solution I have been using is a simple SHA1 hash of the file when it is uploaded, with code like this:
public static string HashFile(string FileName)
{
using (FileStream stream = File.OpenRead(FileName))
{
SHA1Managed sha = new SHA1Managed();
byte[] checksum = sha.ComputeHash(stream);
string sendCheckSum = BitConverter.ToString(checksum).Replace("-",string.Empty);
return sendCheckSum;
}
}
This "works" fine for smaller files, but its a big pain when the file is 30gb. So i would like to hash the file as im reciving it from the client. I get the file from the client in "chunks" and size of the chunk is not always static.
Code that recives the file.
int chunk = context.Request["chunk"] != null ? int.Parse(context.Request["chunk"]) : 0;
int chunks = context.Request["chunks"] != null ? int.Parse(context.Request["chunks"]) : 0;
string fileName = context.Request["name"] != null ? context.Request["name"] : string.Empty;
HttpPostedFile fileUpload = context.Request.Files[0];
string fullFilePath = Path.Combine(SiteSettings.UploadTempFolder, fileName);
using (var fs = new FileStream(fullFilePath, chunk == 0 ? FileMode.Create : FileMode.Append))
{
var buffer = new byte[fileUpload.InputStream.Length];
fileUpload.InputStream.Read(buffer, 0, buffer.Length);
fs.Write(buffer, 0, buffer.Length);
// Here I want the hash, while I have the file data in memory.
}
You can always create your own stream :)
public class ActionStream : Stream
{
private readonly Stream _innerStream;
private readonly Action<byte[], int, int> _readAction;
public ActionStream(Stream innerStream, Action<byte[], int, int> readAction)
{
_innerStream = innerStream;
_readAction = readAction;
}
public override bool CanRead => true;
public override bool CanSeek => false;
public override bool CanWrite => false;
public override long Length => _innerStream.Length;
public override long Position
{
get { return _innerStream.Position; }
set { throw new NotSupportedException(); }
}
public override void Flush() { }
public override int Read(byte[] buffer, int offset, int count)
{
var bytesRead = _innerStream.Read(buffer, offset, count);
_readAction(buffer, offset, bytesRead);
return bytesRead;
}
public override long Seek(long offset, SeekOrigin origin)
{
throw new NotSupportedException();
}
protected override void Dispose(bool disposing)
{
if (disposing)
{
_innerStream.Dispose();
}
base.Dispose(disposing);
}
public override void SetLength(long value) { throw new NotSupportedException(); }
public override void Write(byte[] buffer, int offset, int count)
{
throw new NotSupportedException();
}
}
This allows you to bind together the two stream operations you're doing:
using (var fs = new FileStream(path, chunk == 0 ? FileMode.Create : FileMode.Append))
{
var actionStream = new ActionStream(fileUpload.InputStream,
(buffer, offset, bytesRead) =>
{
fs.Write(buffer, offset, bytesRead);
});
var sha = new SHA1Managed();
var checksum = sha.ComputeHash(actionStream);
}
This assumes that SHA1Managed reads through every single byte of the input stream in order - you should check that. I'm pretty sure that is how it works, though :)
This is a cut and paste from:
Compute a hash from a stream of unknown length in C#
MD5, like other hash functions, does not require two passes.
To start:
HashAlgorithm hasher = ..;
hasher.Initialize();
As each block of data arrives:
byte[] buffer = ..;
int bytesReceived = ..;
hasher.TransformBlock(buffer, 0, bytesReceived, null, 0);
To finish and retrieve the hash:
hasher.TransformFinalBlock(new byte[0], 0, 0);
byte[] hash = hasher.Hash;
This pattern works for any type derived from HashAlgorithm, including MD5CryptoServiceProvider and SHA1Managed.
HashAlgorithm also defines a method ComputeHash which takes a Stream object; however, this method will block the thread until the stream is consumed. Using the TransformBlock approach allows an "asynchronous hash" that is computed as data arrives without using up a thread.
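Applied to the scenario in the question, a minimal sketch of the pattern could look like this (the chunks are simulated here by reading an ordinary stream in pieces; in the real upload handler the same hasher instance would have to be kept alive across chunk requests):
using System;
using System.IO;
using System.Security.Cryptography;

public static string HashIncrementally(Stream source)
{
    using (HashAlgorithm hasher = new SHA1Managed())
    {
        hasher.Initialize();
        byte[] buffer = new byte[64 * 1024];
        int bytesReceived;
        while ((bytesReceived = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Feed each chunk into the hash without keeping the whole file in memory.
            hasher.TransformBlock(buffer, 0, bytesReceived, null, 0);
        }
        hasher.TransformFinalBlock(new byte[0], 0, 0);
        return BitConverter.ToString(hasher.Hash).Replace("-", string.Empty);
    }
}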
I'm trying to get the transfer speed of an FTP upload, but I don't know where I should measure it.
Code snippet:
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(job.GetDestinationFolder() + "\\" + fileOnlyName);
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential(Manager._user, Manager._password);
using (var requestStream = request.GetRequestStream())
{
using (var input = File.OpenRead(file))
{
//input.CopyToAsync()
input.CopyTo(requestStream);
//IS HERE ANY METHOD OR ATTRIBUTE, WHICH SHOWS THE SENT BYTES ?
}
}
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
Console.WriteLine("Upload File Complete, status {0}", response.StatusDescription);
response.Close();
I already read that this code
public static void CopyStream(Stream input, Stream output)
{
byte[] buffer = new byte[32768];
int read;
while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
{
output.Write (buffer, 0, read);
}
}
isn't really efficient, according to the comment that was left:
Note that this is not the fastest way to do it. In the provided code snippet, you have to wait for the Write to complete before a new block is read. When doing the Read and Write asynchronously this waiting will disappear. In some situation this will make the copy twice as fast. However it will make the code a lot more complicated so if speed is not an issue, keep it simple and use this simple loop.
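For reference, an overlapped copy along the lines of that comment could look roughly like this (a sketch only; it double-buffers so the next block is read while the previous one is being written):
using System.IO;
using System.Threading.Tasks;

public static async Task CopyStreamAsync(Stream input, Stream output)
{
    byte[] current = new byte[32768];
    byte[] next = new byte[32768];
    int read = await input.ReadAsync(current, 0, current.Length);
    while (read > 0)
    {
        // Start writing the block we already have...
        Task writeTask = output.WriteAsync(current, 0, read);
        // ...and read the following block at the same time.
        int nextRead = await input.ReadAsync(next, 0, next.Length);
        await writeTask;
        // Swap buffers so the block just read is written on the next pass.
        byte[] tmp = current; current = next; next = tmp;
        read = nextRead;
    }
}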
How can I show the transfer speed, like a download in Chrome or Firefox?
EDIT:
This is what I tried before you (Tien Dinh) answered:
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(job.GetDestinationFolder() + "\\" + fileOnlyName);
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential(Manager._user, Manager._password);
using (var requestStream = request.GetRequestStream())
{
using (var input = File.OpenRead(file))
{
Console.WriteLine(input.Length);//bGroundWorker.ReportProgress(request.)
Console.WriteLine(input.Position);
while (input.Position != input.Length)
{
input.CopyToAsync(requestStream);
Console.WriteLine(input.Position);
//bGroundWorker.ReportProgress( (int) input.Position);
}
Console.WriteLine(input.Length + "(length)");
Console.WriteLine(input.Position + "(sent)");
//e.Result = input.Position;
}
}
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
Console.WriteLine("Upload File Complete, status {0}", response.StatusDescription);
response.Close();
As you can see, there is a BackgroundWorker, which is why I used CopyToAsync.
You could build your own stream wrapper class that reports the number of bytes written in a defined interval:
public class StreamWithProgress : Stream
{
private readonly TimeSpan interval;
private readonly long sourceLength;
private readonly Stopwatch stopwatch = Stopwatch.StartNew();
private readonly BackgroundWorker worker;
private int bytesInLastInterval;
private long bytesTotal;
private Stream innerStream;
public override bool CanRead
{
get { return this.innerStream.CanRead; }
}
public override bool CanSeek
{
get { return this.innerStream.CanSeek; }
}
public override bool CanWrite
{
get { return this.innerStream.CanWrite; }
}
public override long Length
{
get { return this.innerStream.Length; }
}
public override long Position
{
get { return this.innerStream.Position; }
set { this.innerStream.Position = value; }
}
public StreamWithProgress(Stream stream, BackgroundWorker worker, long sourceLength, TimeSpan? interval = null)
{
if (stream == null)
{
throw new ArgumentNullException("stream");
}
if (worker == null)
{
throw new ArgumentNullException("worker");
}
this.interval = interval ?? TimeSpan.FromSeconds(1);
this.innerStream = stream;
this.worker = worker;
this.sourceLength = sourceLength;
}
public override void Flush()
{
this.innerStream.Flush();
}
public override int Read(byte[] buffer, int offset, int count)
{
return this.innerStream.Read(buffer, offset, count);
}
public override int ReadByte()
{
return this.innerStream.ReadByte();
}
public override long Seek(long offset, SeekOrigin origin)
{
return this.innerStream.Seek(offset, origin);
}
public override void SetLength(long value)
{
this.innerStream.SetLength(value);
}
public override void Write(byte[] buffer, int offset, int count)
{
this.innerStream.Write(buffer, offset, count);
this.ReportProgress(count);
}
public override void WriteByte(byte value)
{
this.innerStream.WriteByte(value);
this.ReportProgress(1);
}
protected override void Dispose(bool disposing)
{
if (disposing && this.innerStream != null)
{
this.innerStream.Dispose();
this.innerStream = null;
}
base.Dispose(disposing);
}
private void ReportProgress(int count)
{
this.bytesInLastInterval += count;
this.bytesTotal += count;
if (this.stopwatch.Elapsed > this.interval)
{
double speed = this.bytesInLastInterval / (this.stopwatch.Elapsed.Ticks / (double) this.interval.Ticks);
double progress = this.bytesTotal / (double) this.sourceLength;
var progressPercentage = (int) (progress * 100);
this.worker.ReportProgress(progressPercentage, speed);
this.bytesInLastInterval = 0;
this.stopwatch.Restart();
}
}
}
You would use it like this:
BackgroundWorker worker = (BackgroundWorker)sender;
WebRequest request = WebRequest.Create("SOME URL");
WebResponse response = request.GetResponse();
using (Stream stream = response.GetResponseStream())
using (var dest = new StreamWithProgress(File.OpenWrite("PATH"), worker, response.ContentLength))
{
stream.CopyTo(dest);
}
The BackgroundWorker will be called repeatedly with the current progress and speed. You could refine that example using a queue that stores the last n speeds and reports a mean value.
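For the upload case in the question, the same wrapper can sit around the FTP request stream instead; roughly (inside the BackgroundWorker's DoWork handler, with request, file and worker being the objects from the snippets above):
using (var input = File.OpenRead(file))
using (var dest = new StreamWithProgress(request.GetRequestStream(), worker, input.Length))
{
    // Every Write on dest reports progress and speed through the worker.
    input.CopyTo(dest);
}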
You already have a CopyStream method; you just need to improve its performance. BufferedStream is great for this - see below.
I believe you can also improve it further by using the async methods in .NET 4.
public static void CopyStream(Stream input, Stream output, Action<int> totalSent)
{
BufferedStream inputBuffer = new BufferedStream(input);
BufferedStream outputBuffer = new BufferedStream(output);
byte[] buffer = new byte[32768];
int read;
int total = 0;
while ((read = inputBuffer.Read(buffer, 0, buffer.Length)) > 0)
{
outputBuffer.Write (buffer, 0, read);
total += read;
totalSent(total);
}
outputBuffer.Flush();
}
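One possible way to wire the callback into the FTP upload from the question and turn the running total into a speed figure (the Stopwatch-based average is just one option):
var stopwatch = System.Diagnostics.Stopwatch.StartNew();
using (var requestStream = request.GetRequestStream())
using (var input = File.OpenRead(file))
{
    CopyStream(input, requestStream, total =>
    {
        // Rough average speed since the upload started.
        double bytesPerSecond = total / stopwatch.Elapsed.TotalSeconds;
        Console.WriteLine("{0:F0} B/s ({1}/{2} bytes sent)", bytesPerSecond, total, input.Length);
    });
}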
I'm porting my published Windows Phone app to Windows 8. While trying to write to the IsolatedStorage equivalent, ApplicationDataContainer, I get an exception that says:
Error: The size of the state manager setting has exceeded the limit
I'm not sure if this is the correct way of using ApplicationDataContainer.
public void WriteToIsolatedStorage()
{
try
{
ApplicationDataContainer localSettings = ApplicationData.Current.LocalSettings;
ApplicationDataCompositeValue composite = new ApplicationDataCompositeValue();
if (localSettings.Containers.ContainsKey("LoveCycleSetting"))
{
localSettings.DeleteContainer("LoveCycleSetting");
}
composite["GetWeekStart"] = m_bWeekStart;
composite["iHistCount"] = m_iHistCount;
composite["dtHistory"] = this.DateTimeToString(m_dtHistory);
composite["avgCycleTime"] = m_iAvgCycleTime;
}
catch (Exception)
{
// handle or log the exception here
}
}
The exception occurs at the second-to-last line. m_dtHistory is a string array of size 400. So does the ApplicationDataCompositeValue have a fixed size? Or do I have to write the m_dtHistory array into a file? In Windows Phone I could write the array directly into IsolatedStorageSettings.
It would be really helpful if someone could guide me on this or give links.
Alfah
Yes, ironically settings storage is easier on the phone than WinRT. You can just serialize to a file instead. Here is what I did (partially copied from the code already in SuspensionManager.cs), which works for both value and reference types.
internal static async Task<bool> SaveSetting(string Key, Object value)
{
var ms = new MemoryStream();
DataContractSerializer serializer = new DataContractSerializer(value.GetType());
serializer.WriteObject(ms, value);
await ms.FlushAsync();
// Uncomment this to preview the contents being written
/*char[] buffer = new char[ms.Length];
ms.Seek(0, SeekOrigin.Begin);
var sr = new StreamReader(ms);
sr.Read(buffer, 0, (int)ms.Length);*/
ms.Seek(0, SeekOrigin.Begin);
StorageFile file = await ApplicationData.Current.LocalFolder.CreateFileAsync(Key, CreationCollisionOption.ReplaceExisting);
using (Stream fileStream = await file.OpenStreamForWriteAsync())
{
await ms.CopyToAsync(fileStream);
await fileStream.FlushAsync();
}
return true;
}
// Necessary to pass back both the result and status from an async function since you can't pass by ref
internal class ReadResults
{
public bool Success { get; set; }
public Object Result { get; set; }
}
internal async static Task<ReadResults> ReadSetting<T>(string Key, Type t)
{
var rr = new ReadResults();
try
{
var ms = new MemoryStream();
DataContractSerializer serializer = new DataContractSerializer(t);
StorageFile file = await ApplicationData.Current.LocalFolder.GetFileAsync(Key);
using (IInputStream inStream = await file.OpenSequentialReadAsync())
{
rr.Result = (T)serializer.ReadObject(inStream.AsStreamForRead());
}
rr.Success = true;
}
catch (FileNotFoundException)
{
rr.Success = false;
}
return rr;
}
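A possible usage, tied to the data from the question (the key name is just an example):
// Save the 400-entry history array to its own file-backed "setting".
await SaveSetting("dtHistory", m_dtHistory);

// Later, read it back; Success is false if the file does not exist yet.
ReadResults rr = await ReadSetting<string[]>("dtHistory", typeof(string[]));
if (rr.Success)
{
    string[] history = (string[])rr.Result;
}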
The name of each setting can be 255 characters in length at most. Each setting can be up to 8K bytes in size and each composite setting can be up to 64K bytes in size.
https://msdn.microsoft.com/library/windows/apps/windows.storage.applicationdata.localsettings.aspx
I read somewhere (but lost the reference) that the size limit is 64 KB. You can work around it by splitting the content across several settings values:
// AppLocalSettings is assumed to stand for ApplicationData.Current.LocalSettings
public static void StoreConfig(string content)
{
// Split breaks the string into chunks of at most 2000 characters (a possible implementation is sketched below)
IEnumerable<string> strs = Split(content, 2000);
int i = 1;
foreach(var s in strs)
{
AppLocalSettings.Values["test" + (i++)] = s;
}
AppLocalSettings.Values["test_count"] = i-1 +"";
}
public static string ReadConfig()
{
string s = "";
int count = Convert.ToInt32(AppLocalSettings.Values["test_count"]);
for(int i = 1; i<=count; i++)
{
s += Convert.ToString(AppLocalSettings.Values["test" + (i)]);
}
return s;
}
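The Split helper isn't shown above; a straightforward version of it might look like this:
// Breaks a string into consecutive chunks of at most chunkSize characters.
private static IEnumerable<string> Split(string value, int chunkSize)
{
    for (int i = 0; i < value.Length; i += chunkSize)
    {
        yield return value.Substring(i, Math.Min(chunkSize, value.Length - i));
    }
}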
I am trying to develop an application in which a SIP call is established and I then capture the RTP audio packets. Since they are encoded, I need to decode them and save them as a .wav file. I tried using NAudio but it didn't work. Is there any solution using NAudio or any other library to solve this problem?
The code I used is as follows; data is the byte array containing the RTP packet payload.
System.IO.MemoryStream stream = new System.IO.MemoryStream(data);
RawSourceWaveStream rsws = new RawSourceWaveStream(stream, WaveFormat.CreateMuLawFormat(8000,1));
WaveStream conversionStream = WaveFormatConversionStream.CreatePcmStream(rsws);
WaveStream blockAlignedStream = new BlockAlignReductionStream(conversionStream);
byte[] buffer = new byte[udpHeader.Data.Length];
blockAlignedStream.Read(buffer, 0, udpHeader.Data.Length);
writer.WriteData(buffer, 0, buffer.Length); // writer: presumably a WaveFileWriter created elsewhere
Thanks in Advance.
Better late than never!
Use the following helper class by Mark Heath
from
Using NAudio to decode mu-law audio
public class RawSourceWaveStream : WaveStream
{
private Stream sourceStream;
private WaveFormat waveFormat;
public RawSourceWaveStream(Stream sourceStream, WaveFormat waveFormat)
{
this.sourceStream = sourceStream;
this.waveFormat = waveFormat;
}
public override WaveFormat WaveFormat
{
get { return this.waveFormat; }
}
public override long Length
{
get { return this.sourceStream.Length; }
}
public override long Position
{
get
{
return this.sourceStream.Position;
}
set
{
this.sourceStream.Position = value;
}
}
public override int Read(byte[] buffer, int offset, int count)
{
return sourceStream.Read(buffer, offset, count);
}
}
Get the latest NAudio.dll and NAudio.WindowsMediaFormat.dll
from
http://naudio.codeplex.com/
Then do the following to convert from U-Law or Mu-Law to Wav:
Stream tmpMemStream = new FileStream(Server.MapPath("/input/") + "input.voc", FileMode.Open, FileAccess.Read);
var waveFormat = WaveFormat.CreateMuLawFormat(8000, 1); // Feel free to tweak this number
var reader = new RawSourceWaveStream(tmpMemStream, waveFormat);
using (WaveStream convertedStream = WaveFormatConversionStream.CreatePcmStream(reader))
{
WaveFileWriter.CreateWaveFile(Server.MapPath("/output/") + "output.wav", convertedStream);
}
tmpMemStream.Close();
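Adapting the same approach to the in-memory RTP data from the question (assuming data holds the accumulated mu-law payload bytes):
using (var memStream = new MemoryStream(data))
{
    var waveFormat = WaveFormat.CreateMuLawFormat(8000, 1);
    var reader = new RawSourceWaveStream(memStream, waveFormat);
    using (WaveStream pcmStream = WaveFormatConversionStream.CreatePcmStream(reader))
    {
        WaveFileWriter.CreateWaveFile("output.wav", pcmStream);
    }
}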
Just remember to respect the privacy of the owners of those audio files!