I have an application on the Compact Framework that has some large embedded resources (some of them are several megabytes). I am calling assembly.GetManifestResourceStream(...) which returns a Stream object. However, I noticed that on some devices this call not only takes quite a while but also causes the device to run out of available memory. Eventually I used Reflector to look at the code for this method on the Compact Framework, and it uses an internal method to get a byte[] of the resource data. It then returns this data wrapped in a MemoryStream.
Is there any way to retrieve a resource without using this call since it will always read everything into memory? Ideally I'd like to work with a Stream that I can get random access to without having to read the whole thing into memory (similar to how a FileStream works). It would be pretty neat if I could simply open a FileStream on the assembly and start reading at the appropriate offset, but I doubt this is how resources are embedded.
Don't use an embedded resource. Add it as a content file and open it off disk with a file stream.
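To illustrate the suggestion above, here is a minimal sketch assuming the data ships as a content file on the device instead of an embedded resource (the path and the ReadChunk helper name are placeholders, not anything from the original post). A FileStream reads on demand and supports Seek, so the file is never pulled into memory as a whole:

```csharp
using System;
using System.IO;

static class ContentFileReader
{
    // Read `count` bytes starting at `offset` without loading the whole file.
    public static byte[] ReadChunk(string path, long offset, int count)
    {
        using (FileStream fs = new FileStream(path, FileMode.Open,
                                              FileAccess.Read, FileShare.Read))
        {
            fs.Seek(offset, SeekOrigin.Begin);   // random access, no full read
            byte[] buffer = new byte[count];
            int read = fs.Read(buffer, 0, count);
            // Note: a single Read may return fewer bytes than requested;
            // loop on Read if you need the buffer completely filled.
            return buffer;
        }
    }
}
```

This gives you exactly the FileStream-like random access the question asks for, at the cost of shipping the data outside the assembly.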
I found an open source tool that exposes a lot of the assembly's metadata and allowed me to peek into the resource manually:
http://www.jbrowse.com/products/asmex/
I'm using the Ionic Zip Library to do some zipping.
When it's time to zip up, I want to call their ZipFile.Save(Stream outputStream) method.
On the Telligent side, to save the content of a stream to a file you use their ICentralizedFileStorageProvider.AddUpdateFile(string path, string fileName, Stream contentStream) method.
As you can see, you give the Ionic Zip Library a stream for it to write to, but to save a file, Telligent does not give you a stream to write to; you must give them a stream for them to read from.
Sure, I could use a MemoryStream: fill it by passing it to the Ionic Zip Library, then drain it by passing it to the Telligent API. But that would load the entire zip file into memory at once. I know the final zip is going to be huge, so holding it entirely in memory is not an option; I need some sort of buffering.
How do I reconcile these two APIs? How do I build a bridge between them where data can flow without hogging memory? Any ideas?
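One way to bridge a "write to this stream" API and a "give me a stream to read" API without holding the payload in memory is a temporary file. The sketch below is an assumption about how you might wire it up; the delegates stand in for ZipFile.Save and AddUpdateFile, and BridgeViaTempFile is a hypothetical helper name:

```csharp
using System;
using System.IO;

static class StreamBridge
{
    // producer writes into the temp file; consumer reads it back.
    // Memory use stays flat regardless of how large the zip gets.
    public static void BridgeViaTempFile(Action<Stream> producer,
                                         Action<Stream> consumer)
    {
        string tempPath = Path.GetTempFileName();
        try
        {
            using (FileStream writeSide = new FileStream(tempPath,
                       FileMode.Create, FileAccess.Write))
                producer(writeSide);     // e.g. zipFile.Save(writeSide)

            using (FileStream readSide = new FileStream(tempPath,
                       FileMode.Open, FileAccess.Read))
                consumer(readSide);      // e.g. provider.AddUpdateFile(path, name, readSide)
        }
        finally
        {
            File.Delete(tempPath);       // no clutter left behind
        }
    }
}
```

The trade-off is one extra round trip through the disk, but the OS file cache usually makes the read-back cheap.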
How do I read from a stream (a BufferedStream) without using heavy locks? I'm reading audio data from a file using NAudio. Since NAudio streams lock in their Read methods, I used a BufferedStream as a wrapper. I only need to read data. However, I lost thread safety. How can I make the stream thread-safe without using locks?
Unfortunately I cannot read data in large blocks, because the interface I use exposes a method float GetSample(long) which reads just one sample at a time.
I've never used that library, but if I understand you correctly, your disk file is locked because you are reading it, so no other application can read it. Is that correct?
In that case, it's not your app that's become multi-threaded, and you shouldn't need locks within your application. You do, however, have to make sure that the file is opened for read-only purposes, allowing read-only access to other applications.
If NAudio offers the full set of parameters when opening a file, you'll be able to open it with sharing allowed; otherwise, if NAudio can be passed a stream to read from, you can open the file however you want and then pass the stream to NAudio.
https://msdn.microsoft.com/en-us/library/y973b725%28v=vs.110%29.aspx
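Assuming NAudio will accept a caller-supplied Stream (as the answer above suggests checking), a lock-free approach is to give each reader thread its own read-only FileStream opened with sharing enabled. Independent streams have independent positions, so readers never contend; the helper name below is illustrative:

```csharp
using System.IO;

static class SharedReader
{
    // Each thread opens its own stream: FileShare.Read lets any number of
    // read-only handles coexist, in this process or others. No locks needed
    // because no state is shared between the streams.
    public static FileStream OpenForSharedReading(string path)
    {
        return new FileStream(path, FileMode.Open, FileAccess.Read,
                              FileShare.Read, 8192);
    }
}
```

Each thread calls OpenForSharedReading and seeks/reads on its own handle; per-stream positions never interfere with each other.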
I have a C library with a .NET wrapper (it's Shapelib in this case) that writes files (Shapefiles) to the filesystem using a path like C:\Path\To\Things\Filename.shp. However, writing the files to the filesystem isn't actually what I need. Once they're written, I have to read them back into streams anyway to either deliver them via the web, add them to a zip file, or perform some other task. Writing them to the filesystem means I have to track the clutter and inevitably clean it up somehow.
I'm not sure if there's anything like PHP's stream wrapper registry where the path could be like stream://output.shp...
Is it possible to intercept the filesystem writing and handle this entire task in memory? Even if this can be done, is it horrible practice? Thanks!
The consensus is that this is "virtually impossible." If you really need to ensure that this is done in RAM, your best bet is to install a RAM disk driver. Do a Google search for [windows intercept file output]. Or check out Intercept outputs from a Program in Windows 7.
That said, it's quite possible that much, perhaps most, of the data that you write to disk will be buffered in memory, so turning right around and reading the data from disk won't be all that expensive. You still have the cleanup problem, but it's really not that tough to solve: just use a try/finally block:
try
{
    // do everything: generate the files, read them back, zip or serve them
}
finally
{
    // clean up: delete the temporary files, e.g. with File.Delete
}
I am building an ASP.NET web application that creates PowerPoint presentations on the fly. I have the basics working but it creates actual physical files on the hard disk. That doesn't seem like a good idea for a large multi-user web application. It seems like it would be better if the application created the presentations in memory and then streamed them back to the user. Instead of manipulating files should I be working with the MemoryStream class? I am not exactly sure I understand the difference between working with Files and working with Streams. Are they sort of interchangeable? Can anyone point me to a good resource for doing file type operations in memory instead of on disk? I hope I have described this well enough.
Corey
You are trying to make a decision that you think impacts the performance of your application based on a "doesn't seem like a good idea" measurement, which is barely scientific. It would be better to implement both and compare, but first you should list your concerns about either implementation.
Here are some ideas to start:
there is really not much difference between temporary files and in-memory streams. Both keep their content in physical memory if they are small enough, and both will hit the disk under memory pressure. Consider using temporary delete-on-close files if cleaning files up is the main concern.
the OS already does a very good job of managing large files with caching; a pure in-memory solution would need to at least match it.
MemoryStream is not the best implementation for reasonably sized streams due to its "all data is in a single byte array" contract (see my answer at https://stackoverflow.com/a/10424137/477420).
managing multiple large in-memory streams (i.e. for multiple users) is fun on the x86 platform, less of a concern on x64.
some APIs simply don't provide a way to work with Stream-based classes and require a physical file.
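The delete-on-close temporary file mentioned in the first point can be sketched like this (the helper name is illustrative); FileOptions.DeleteOnClose makes the OS remove the file when the stream is closed, so cleanup cannot be forgotten even if the request aborts:

```csharp
using System.IO;

static class TempFiles
{
    // While open this behaves like any FileStream; when the handle closes,
    // the OS deletes the file automatically.
    public static FileStream CreateSelfCleaningTempFile()
    {
        string path = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        return new FileStream(path, FileMode.CreateNew, FileAccess.ReadWrite,
                              FileShare.Read, 4096, FileOptions.DeleteOnClose);
    }
}
```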
Files and streams are similar, yes. Both essentially stream a byte array...one from memory, one from the hard drive. If the API you are using allows you to generate a stream, then you can easily do that and serve it out to the user using the Response object.
The following code will take a PowerPoint memory object (you'll need to modify it for your own API, but you can get the general idea), save it to a MemoryStream, then set the proper headers and write the stream to the Response (which will then let the user save the file to their local computer):
SaveFormat format = SaveFormat.PowerPoint2007;
Slideshow show = PowerPointWriter.Generate(report, format);
MemoryStream ms = new MemoryStream();
show.Save(ms, format);                 // serialize the presentation into memory
Response.Clear();
Response.Buffer = true;
Response.ContentType = "application/vnd.ms-powerpoint";
Response.AddHeader("Content-Disposition", "attachment; filename=\"Slideshow.ppt\"");
Response.BinaryWrite(ms.ToArray());    // send the bytes to the client
Response.End();
Yes, I would recommend the MemoryStream. Typically any time you access a file, you are doing so with a stream. There are many kinds of streams (e.g. network streams, file streams, and memory streams) and they all implement the same basic interface. If you are already creating the file in a file stream, instead of something like a string or byte array, then it should require very few code changes to switch to a MemoryStream.
Basically, a stream is simply a way of working with large amounts of data where you don't have to, or can't, load all the data into memory at once. So, rather than reading or writing the entire set of data into a giant array, you open a stream, which gives you the equivalent of a cursor. You can move your current position to any spot in the stream and read or write from that point.
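The cursor idea can be shown in a few lines (the ReadWordAt helper is made up for the example): Seek moves the current position, and Read picks up from wherever the cursor points.

```csharp
using System.IO;
using System.Text;

static class CursorDemo
{
    // Read `length` bytes starting at `offset` and decode them as ASCII.
    public static string ReadWordAt(Stream s, long offset, int length)
    {
        s.Seek(offset, SeekOrigin.Begin);        // move the cursor
        byte[] buffer = new byte[length];
        int n = s.Read(buffer, 0, length);       // read from the cursor onward
        return Encoding.ASCII.GetString(buffer, 0, n);
    }
}
```

For example, calling ReadWordAt on a MemoryStream over "Hello, streams!" with offset 7 and length 7 returns "streams"; the same call works unchanged on a FileStream, which is exactly the interchangeability the answer describes.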
I want to stream a large data file for a game continuously from the disk of an iOS device.
The question is whether anyone has streamed such files (blocks of 20 MB) before using a System.IO.FileStream. I have no iOS device to test it on myself, and I don't expect to get one any time soon.
There are 2 questions:
Is the file streamed without loading it fully (the behaviour I expect from a stream, but I'm unsure how MonoTouch handles it), and how is the memory usage while streaming it?
How is the performance of the loading process, especially when loading different files at once?
Thank you for any information.
The MonoTouch base class libraries (BCL) come from Mono, so a lot of the code is available as open source. In the case of FileStream you can see the code on GitHub.
Is the file streamed without loading it fully (the behaviour I expect from a stream, but I'm unsure how MonoTouch handles it)
You're right, it won't be fully loaded. You'll control what's being read.
and how is the memory usage while streaming it?
The above link shows that the default buffer size is 8192 bytes (8 KB), but several constructors allow you to use a different size if you wish.
But that buffer is only the internal buffer. You provide your own buffer when you call methods like Read, so you will, again, be in control of how much memory is being used.
How is the performance of the loading process, especially when loading different files at once?
That's difficult to predict and will largely depend on your application (e.g. number of files, total memory required...). You can use FileStream's asynchronous methods, like BeginRead, to get better performance if required.
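Putting the two points together, here is a sketch of streaming a large file with an explicit internal buffer size and caller-controlled chunks (the helper name and sizes are illustrative). Peak memory stays at roughly the internal buffer plus one chunk, regardless of file size:

```csharp
using System.IO;

static class BigFileStreaming
{
    // Reads the whole file in `chunkSize` pieces and returns the byte count.
    // Replace the loop body with your per-chunk processing.
    public static long StreamThrough(string path, int chunkSize)
    {
        long total = 0;
        using (FileStream fs = new FileStream(path, FileMode.Open,
                   FileAccess.Read, FileShare.Read, 64 * 1024)) // 64 KB internal buffer
        {
            byte[] chunk = new byte[chunkSize];
            int read;
            while ((read = fs.Read(chunk, 0, chunk.Length)) > 0)
                total += read;           // process chunk[0..read) here
        }
        return total;
    }
}
```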