I have a Windows Phone 8.1 Silverlight app that renders .gif files to .mp4, using the Windows.Media.Editing.MediaComposition class.
Certain files randomly cause the RenderToFileAsync method to crash. There are at least two different error messages you can receive; one states there is insufficient memory.
Does anyone have any ideas for a workaround, or some insider knowledge on how this is supposed to work?
Repro:
Create a new C# WP8.1 Silverlight blank app project in VS2013.
Add the usings and the OnNavigatedTo override to MainPage.xaml.cs as below.
Run in the 512 MB emulator and observe the crash (most of the time). Lower the loop count i to see it work properly.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Navigation;
using Microsoft.Phone.Controls;
using Microsoft.Phone.Shell;
using System.Windows.Media.Imaging;
using System.IO;
using Windows.Media.Editing;
using System.Diagnostics;
// ...
protected async override void OnNavigatedTo(NavigationEventArgs e)
{
    base.OnNavigatedTo(e);

    SystemTray.ProgressIndicator = new ProgressIndicator();
    SystemTray.ProgressIndicator.IsVisible = true;
    SystemTray.ProgressIndicator.IsIndeterminate = true;

    var comp = new MediaComposition();
    var r = new Random();

    for (int i = 0; i < 190; i++)
    {
        var wb = new WriteableBitmap(576, 300);
        for (int iPix = 0; iPix < wb.Pixels.Length; iPix++)
        {
            wb.Pixels[iPix] = r.Next();
        }

        string filename = "file" + i.ToString() + ".jpg";
        var file = await Windows.Storage.ApplicationData.Current.LocalFolder.CreateFileAsync(filename, Windows.Storage.CreationCollisionOption.ReplaceExisting);
        using (var curr = await file.OpenStreamForWriteAsync())
        {
            wb.SaveJpeg(curr, wb.PixelWidth, wb.PixelHeight, 0, 95);
        }

        var clip = await MediaClip.CreateFromImageFileAsync(file, TimeSpan.FromMilliseconds(60));
        comp.Clips.Add(clip);
    }
    // Ensure the app declares the capability to write to the video library AND ID_CAP_MEDIALIB_PHOTO,
    // then change the folder below to Windows.Storage.KnownFolders.VideosLibrary to see the output in the Videos app.
    var destFolder = Windows.Storage.ApplicationData.Current.LocalFolder;
    var destFile = await destFolder.CreateFileAsync("test.mp4", Windows.Storage.CreationCollisionOption.ReplaceExisting);

    Debug.WriteLine("Mem use before render to disk: " + Windows.System.MemoryManager.AppMemoryUsage.ToString("N0"));
    await comp.RenderToFileAsync(destFile);
    Debug.WriteLine("Mem use after render to disk: " + Windows.System.MemoryManager.AppMemoryUsage.ToString("N0"));

    SystemTray.ProgressIndicator.IsVisible = false;
    MessageBox.Show("Done OK");
}
I ran your code through the Windows Phone application analysis memory profiler. I can confirm that your app is running into the system memory limit of about 150 MB. The MediaComposition engine can require large amounts of memory depending on the input and output formats. In your case you are adding a large number of clips, and the number of clips that can be added is limited by the memory available for decoding.
Quite honestly, MediaComposition was not designed to handle such a large number of clips; the expected average number of clips hovers around five.
Unfortunately I was not able to get the only workaround I could think of to work. I think the approach is feasible, but I can't spend any more time on it: you might be able to create multiple output files, each from a smaller number of clips. For instance, you could create one output file from images one through twenty and a second file from images twenty-one through forty, and then join those files together.
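Roughly, the batching approach I had in mind looks like the sketch below. It is untested; imageFiles is assumed to be a List<Windows.Storage.StorageFile> holding the JPEGs generated in your repro loop, and both the batch size of 20 and the use of MediaClip.CreateFromFileAsync to stitch the intermediate renders back together are assumptions I have not verified.
// Untested sketch: render the clips in batches of 20, then combine the
// intermediate .mp4 files into the final composition.
var localFolder = Windows.Storage.ApplicationData.Current.LocalFolder;
var partFiles = new List<Windows.Storage.StorageFile>();

for (int batch = 0; batch * 20 < imageFiles.Count; batch++)
{
    var partComp = new MediaComposition();
    foreach (var image in imageFiles.Skip(batch * 20).Take(20))
    {
        partComp.Clips.Add(await MediaClip.CreateFromImageFileAsync(image, TimeSpan.FromMilliseconds(60)));
    }

    var partFile = await localFolder.CreateFileAsync("part" + batch + ".mp4",
        Windows.Storage.CreationCollisionOption.ReplaceExisting);
    await partComp.RenderToFileAsync(partFile);
    partFiles.Add(partFile);
}

// Join the intermediate renders and render once more.
var finalComp = new MediaComposition();
foreach (var partFile in partFiles)
{
    finalComp.Clips.Add(await MediaClip.CreateFromFileAsync(partFile));
}
var finalFile = await localFolder.CreateFileAsync("test.mp4",
    Windows.Storage.CreationCollisionOption.ReplaceExisting);
await finalComp.RenderToFileAsync(finalFile);
Each intermediate render should only need to decode 20 clips at a time, which is what would keep the memory usage bounded.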
I hope this helps,
James
I had basically the same problem, and after much trial and error I finally got rid of my "Value does not fall within the expected range" error, which caused many (or all) frames to go missing.
I added GC.Collect(); right before every MediaClip.CreateFromImageFileAsync call. It doesn't seem to affect performance or cause any problems, and it was the only way I could fix it. Hope this can help others.
Here is some of my code so you can see what I'm talking about:
foreach (var thisFrame in frames)
{
    GC.Collect();
    try
    {
        MediaClip clip = await MediaClip.CreateFromImageFileAsync(thisFrame, TimeSpan.FromMilliseconds(36 * speed));
        app.mediaComposition.Clips.Add(clip);
    }
    catch (Exception ex)
    {
        Debug.WriteLine(ex.Message + ex.StackTrace);
    }
}
Related
I'm trying to write a script to merge ~10,000 PDFs into a single file using iText 7 and C#.
My test files are ~5 MB each, and at around the 270 mark I start getting System.OutOfMemoryExceptions, even though I can see from Task Manager that I'm using less than 25% of the available memory.
Here's the code:
string sourceFolder = @"C:\Work\Generated5\";
string outputPath = @"C:\Work\MergeTest.pdf";
int i = 0;

string[] files = Directory.GetFiles(sourceFolder, "*.pdf");
if (files.Length > 0)
{
    Array.Sort(files);
    using (PdfDocument pdf = new PdfDocument(new PdfWriter(outputPath)))
    {
        foreach (var file in files)
        {
            try
            {
                using (var reader = new PdfReader(file))
                {
                    using (PdfDocument sourceDoc = new PdfDocument(reader))
                    {
                        sourceDoc.CopyPagesTo(1, sourceDoc.GetNumberOfPages(), pdf);
                    }
                    reader.Close();
                }
            }
            catch (Exception e)
            {
                e.Message.Dump(file);
            }

            i++;
            if (i % 200 == 0)
            {
                // Desperate attempt to free some memory - doesn't really help.
                GC.Collect(3);
            }
        }
    }
}
I've found many examples online and here on Stack Overflow of doing this type of thing. However, the documentation and previous answers I have found are out of date, written for iTextSharp or iText 5; the classes they use no longer seem to be supported in iText 7, and the only example I've been able to find is "How NOT to merge..".
Some things I've tried:
Enabling smart mode on the PdfWriter
Enabling compression on the PdfWriter (got to about 1,600 PDFs before throwing exceptions)
Using PdfMerger instead of PdfDocument.CopyPagesTo (rough sketch below)
Forcing garbage collection every X documents
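The PdfMerger variant looked roughly like this (a reconstructed sketch, so the details may differ slightly from what I actually ran; PdfMerger lives in iText.Kernel.Utils, and each source document is closed as soon as its pages have been merged):
using (PdfDocument target = new PdfDocument(new PdfWriter(outputPath)))
{
    var merger = new PdfMerger(target);
    foreach (var file in files)
    {
        // Open, merge and immediately close each source document.
        using (PdfDocument sourceDoc = new PdfDocument(new PdfReader(file)))
        {
            merger.Merge(sourceDoc, 1, sourceDoc.GetNumberOfPages());
        }
    }
}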
I have several URLs stored in a text file, each of them is a link leading to a Facebook emoji, like https://www.facebook.com/images/emoji.php/v5/u75/1/16/1f618.png
I'm trying to download these images and store them on my disk. I'm using WebClient with DownloadFileAsync, something like this:
using (var client = new WebClient())
{
    client.DownloadFileAsync(imgURL, imgName);
}
My problem is that even if the number of URLs is small, say 10, some of the images download fine while others give me a "file corrupt" error. So I thought I needed to wait until each file had finished downloading, and added a DownloadFileCompleted handler, like this:
using System;
using System.ComponentModel;
using System.Collections.Generic;
using System.Linq;
using System.Net;

class Program
{
    static Queue<string> q;

    static void Main(string[] args)
    {
        q = new Queue<string>(new[] {
            "https://www.facebook.com/images/emoji.php/v5/u51/1/16/1f603.png",
            "https://www.facebook.com/images/emoji.php/v5/ud2/1/16/1f604.png",
            "https://www.facebook.com/images/emoji.php/v5/ud4/1/16/1f606.png",
            "https://www.facebook.com/images/emoji.php/v5/u57/1/16/1f609.png",
            "https://www.facebook.com/images/emoji.php/v5/u7f/1/16/1f60a.png",
            "https://www.facebook.com/images/emoji.php/v5/ufb/1/16/263a.png",
            "https://www.facebook.com/images/emoji.php/v5/u81/1/16/1f60c.png",
            "https://www.facebook.com/images/emoji.php/v5/u2/1/16/1f60d.png",
            "https://www.facebook.com/images/emoji.php/v5/u75/1/16/1f618.png",
            "https://www.facebook.com/images/emoji.php/v5/u1e/1/16/1f61a.png"
        });

        DownloadItem();

        Console.WriteLine("Hit return after 'finished' has appeared...");
        Console.ReadLine();
    }

    private static void DownloadItem()
    {
        if (q.Any())
        {
            var uri = new Uri(q.Dequeue());
            var file = uri.Segments.Last();
            var webClient = new WebClient();
            webClient.DownloadFileCompleted += DownloadFileCompleted;
            webClient.DownloadFileAsync(uri, file);
        }
        else
        {
            Console.WriteLine("finished");
        }
    }

    private static void DownloadFileCompleted(object sender, AsyncCompletedEventArgs e)
    {
        DownloadItem();
    }
}
It didn't help, so I took a closer look at the files that were corrupted.
It turned out they were not actually image files at all, but HTML pages, which either contained some JavaScript redirecting to an image or were full HTML pages saying that my browser was not supported.
So my question is: how do I actually wait until an image file has been fully generated and is ready to be downloaded?
EDIT: I have also tried removing the using statement, but that did not help either.
Nothing's being corrupted by your download - it's simply Facebook deciding (sometimes, which is odd) that it doesn't want to serve the image to your client.
It looks like it's the lack of a user agent that causes the problem. All you need to do is specify the user agent, and that looks like it fixes it:
webClient.Headers.Add(HttpRequestHeader.UserAgent,
"Mozilla/5.0 (compatible; http://example.org/)");
I'm trying to combine 4 sets of 15 .txt files into 4 large .txt files in order to make them easier to import into another app.
Here's my code:
using System;
using System.IO;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace AggregateMultipleFiles
{
    class AggMultiFilestoOneFile
    {
        /* This program can reduce multiple input files, grouping results into one file for easier app loading. */
        static void Main(string[] args)
        {
            TextWriter writer = new StreamWriter("G:/user/data/yr2009/fy09_filtered.txt");
            int linelen = 495;
            char[] buf = new char[linelen];
            int line_num = 1;

            for (int i = 1; i <= 15; i++)
            {
                TextReader reader = File.OpenText("G:/user/data/yr2009/fy09_filtered" + i + ".txt");
                while (true)
                {
                    int nin = reader.Read(buf, 0, buf.Length);
                    if (nin == 0)
                    {
                        Console.WriteLine("File ended");
                        break;
                    }
                    writer.Write(new String(buf));
                    line_num++;
                }
                reader.Close();
            }

            Console.WriteLine("done");
            Console.WriteLine(DateTime.Now);
            Console.ReadLine();
            writer.Close();
        }
    }
}
My problem is somewhere around detecting the end of a file. The program doesn't finish writing the last line of a file; it then starts writing the first line of the next file halfway through the last line of the previous one.
This is throwing off all of my columns and data in the app it imports into.
Someone suggested that perhaps I need to pad the end of each line of each of the 15 files with a carriage return and line feed, \r\n.
Why doesn't what I have work?
Would padding work instead? How would I write that?
Thank you!
I strongly suspect this is the problem:
writer.Write(new String(buf));
You're always creating a string from all of buf, rather than just the first nin characters. So the final chunk written for each file is padded with whatever was left in the buffer from the previous read (or with "null" Unicode characters, i.e. U+0000, on the very first read), and those leftover characters may be seen as garbage or string terminators by the importing app.
There's no need even to create a string - just use:
writer.Write(buf, 0, nin);
(I would also strongly suggest using using statements instead of manually calling Close, by the way.)
It's also worth noting that there's nothing to guarantee that you're really reading a line at a time. You might as well increase your buffer size to something like 32K in order to read the files in potentially fewer chunks.
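Putting those suggestions together, the chunked copy could look something like this (a sketch; the 32K buffer size is arbitrary):
using (var writer = File.CreateText("G:/user/data/yr2009/fy09_filtered.txt"))
{
    char[] buf = new char[32 * 1024];
    for (int i = 1; i <= 15; i++)
    {
        using (var reader = File.OpenText("G:/user/data/yr2009/fy09_filtered" + i + ".txt"))
        {
            int nin;
            while ((nin = reader.Read(buf, 0, buf.Length)) > 0)
            {
                // Only write the characters that were actually read in this chunk.
                writer.Write(buf, 0, nin);
            }
        }
    }
}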
Additionally, if the files are small enough, you could read each one into memory completely, which would make your code simpler:
using (var writer = File.CreateText("G:/user/data/yr2009/fy09_filtered.txt"))
{
    for (int i = 1; i <= 15; i++)
    {
        string inputName = "G:/user/data/yr2009/fy09_filtered" + i + ".txt";
        writer.Write(File.ReadAllText(inputName));
    }
}
I'm creating a simple program that takes a string, sends it to Google's text-to-speech server, and saves the resulting speech to an mp3/wav file on the computer. I have the code below, but it only works with up to 100 characters (Google's limit). How can I make a loop that cuts the string into 100-character parts and then saves everything in one mp3/wav file on the computer? I know this is possible with JavaScript and ActionScript (as I have seen them), but how can I do this in C#?
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Net;
using System.Threading;

namespace TestCSharp
{
    class Program
    {
        static void Main(string[] args)
        {
            WebClient web = new WebClient();
            web.Headers.Add(HttpRequestHeader.UserAgent, "Mozilla/4.0 (compatible; MSIE 9.0; Windows;)");

            string encstr = string.Empty;
            string filename = "tts.mp3"; // could also be tts.wav
            string s = "This string cannot be more than 100 characters.";

            encstr = Uri.EscapeDataString(s);
            Console.WriteLine(encstr);

            web.DownloadFile("http://translate.google.com/translate_tts?tl=en&q=" + encstr, ".\\" + filename);
        }
    }
}
This is not a direct answer, but I think splitting the text is not ideal, because TTS applies sentence-level intonation as well as word-level intonation. Instead, I recommend you use the SpeechSynthesizer class with a free TTS engine. However, I don't know which free TTS engine is good or where to get it; if I find a good one, I'll post it.
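For example, the built-in SpeechSynthesizer on desktop .NET can write straight to a WAV file with no 100-character limit (a minimal sketch; it needs a reference to System.Speech.dll and uses whatever voices are installed on the machine):
using System.Speech.Synthesis; // reference System.Speech.dll

using (var synth = new SpeechSynthesizer())
{
    // Write the synthesized speech to a WAV file instead of the speakers.
    synth.SetOutputToWaveFile("tts.wav");
    synth.Speak("This text can be a lot longer than 100 characters.");
}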
UPDATED
MP3 files can simply be concatenated without a problem, according to this question.

The asker then commented: "well, before I get to concatenating the mp3 files, how would the while loop look to first get those mp3 files onto the computer? If I go through my loop, the tts.mp3 file would be overwritten and I would be left with only the last 100-character string that was received."

You can merge two files with the code below; at the end, fs1 will contain all the content.
string tts1 = "tts1.mp3";
string tts2 = "tts2.mp3";
FileStream fs1 = null;
FileStream fs2 = null;
try
{
    fs1 = File.Open(tts1, FileMode.Append);
    fs2 = File.Open(tts2, FileMode.Open);
    byte[] fs2Content = new byte[fs2.Length];
    fs2.Read(fs2Content, 0, (int)fs2.Length);
    fs1.Write(fs2Content, 0, (int)fs2.Length);
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message + " : " + ex.StackTrace);
}
finally
{
    fs1.Close();
    fs2.Close();
}
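Combining the two parts, a rough, untested sketch of the whole loop might look like the code below (it also needs using System.IO; splitting at the last space inside each 100-character window is my own assumption, just to keep words intact):
string text = "..."; // the full text to speak
string output = "tts.mp3";
if (File.Exists(output)) File.Delete(output);

var web = new WebClient();
web.Headers.Add(HttpRequestHeader.UserAgent, "Mozilla/4.0 (compatible; MSIE 9.0; Windows;)");

int pos = 0;
while (pos < text.Length)
{
    int len = Math.Min(100, text.Length - pos);
    if (pos + len < text.Length)
    {
        // Break at the last space inside the window so words stay intact.
        int lastSpace = text.LastIndexOf(' ', pos + len - 1, len);
        if (lastSpace > pos)
        {
            len = lastSpace - pos;
        }
    }
    string chunk = text.Substring(pos, len).Trim();
    pos += len;

    // Download this chunk to a temporary file...
    web.DownloadFile("http://translate.google.com/translate_tts?tl=en&q="
        + Uri.EscapeDataString(chunk), "chunk.mp3");

    // ...and append its bytes to the combined output file.
    using (var outStream = File.Open(output, FileMode.Append))
    {
        byte[] chunkBytes = File.ReadAllBytes("chunk.mp3");
        outStream.Write(chunkBytes, 0, chunkBytes.Length);
    }
}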
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;
using System.IO;
using System.Drawing;
using System.Drawing.Imaging;

namespace TrainSVM
{
    class Program
    {
        static void Main(string[] args)
        {
            FileStream fs = new FileStream("dg.train", FileMode.OpenOrCreate, FileAccess.Write);
            StreamWriter sw = new StreamWriter(fs);
            String[] filePathArr = Directory.GetFiles("E:\\images\\");

            foreach (string filePath in filePathArr)
            {
                if (filePath.Contains("HBP"))
                {
                    sw.Write("1 ");
                    Console.Write("1 ");
                }
                else
                {
                    sw.Write("1 ");
                    Console.Write("1 ");
                }

                using (Bitmap originalBMP = new Bitmap(filePath))
                {
                    /***********************/
                    Bitmap imageBody;
                    ImageBody.ImageBody im = new ImageBody.ImageBody(originalBMP);
                    using (imageBody = im.GetImageBody(-1))
                    {
                        /* white coat */
                        Bitmap whiteCoatBitmap = Rgb2Hsi.Rgb2Hsi.GetHuePlane(imageBody);
                        float WhiteCoatPixelPercentage = Rgb2Hsi.Rgb2Hsi.GetWhiteCoatPixelPercentage(whiteCoatBitmap);
                        //Console.Write("whiteDone\t");
                        sw.Write("1:" + WhiteCoatPixelPercentage + " ");
                        Console.Write("1:" + WhiteCoatPixelPercentage + " ");

                        /******************/
                        Quaternion.Quaternion qtr = new Quaternion.Quaternion(-15);
                        Bitmap yellowCoatBMP = qtr.processImage(imageBody);
                        //yellowCoatBMP.Save("yellowCoat.bmp");
                        float yellowCoatPixelPercentage = qtr.GetYellowCoatPixelPercentage(yellowCoatBMP);
                        //Console.Write("yellowCoatDone\t");
                        sw.Write("2:" + yellowCoatPixelPercentage + " ");
                        Console.Write("2:" + yellowCoatPixelPercentage + " ");

                        /**********************/
                        Bitmap balckPatchBitmap = BlackPatchDetection.BlackPatchDetector.MarkBlackPatches(imageBody);
                        float BlackPatchPixelPercentage = BlackPatchDetection.BlackPatchDetector.BlackPatchPercentage;
                        //Console.Write("balckPatchDone\n");
                        sw.Write("3:" + BlackPatchPixelPercentage + "\n");
                        Console.Write("3:" + BlackPatchPixelPercentage + "\n");
                    }
                }
                sw.Flush();
            }
            sw.Dispose();
            fs.Dispose();
        }
    }
}
There are some Bitmap instances there that you aren't disposing. You should really try to get in the habit of using a using block rather than disposing manually, to stop these things slipping through the net.
If you're getting the exception on this line:
using (Bitmap originalBMP = new Bitmap(filePath))
then that may mean you're trying to load an invalid or corrupted image file. For reasons known to no man, the OutOfMemoryException is what's thrown in this case. It actually has nothing to do with really being out of memory.
Try googling "bitmap.fromfile outofmemoryexception".
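One way to handle that is to catch the exception per file and skip anything GDI+ refuses to load (a sketch of the outer loop from your code):
foreach (string filePath in filePathArr)
{
    Bitmap originalBMP;
    try
    {
        originalBMP = new Bitmap(filePath);
    }
    catch (OutOfMemoryException)
    {
        // GDI+ throws this for unreadable/corrupt image files, not real memory pressure.
        Console.WriteLine("Skipping unreadable image: " + filePath);
        continue;
    }

    using (originalBMP)
    {
        // ... existing processing ...
    }
}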
Shouldn't you be disposing imageBody as well?
I see it's opened:
Bitmap imageBody;
ImageBody.ImageBody im = new ImageBody.ImageBody(originalBMP);
imageBody = im.GetImageBody(-1);
But I don't see you disposing of it / setting it to null?
Maybe you are bitten by this bug:
https://connect.microsoft.com/VisualStudio/feedback/details/521147/large-object-heap-fragmentation-causes-outofmemoryexception
In this case, adding
GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced);
GC.WaitForPendingFinalizers();
GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced);
in the loop might help.
Or use sos.dll (http://www.codeproject.com/KB/dotnet/Memory_Leak_Detection.aspx) to see where you are leaking memory.
You have a lot of calls to various classes: ImageBody, Rgb2Hsi, BlackPatchDetection, etc.
I assume these are your own code. Any of them could be holding on to resources.
I would suggest you grab a profiler and run some tests.
Most of them have trial versions that give you a couple of days to evaluate them.
Best .NET memory and performance profiler?
Try wrapping Bitmap originalBMP = new Bitmap(filePath); in a using block:
using (Bitmap originalBMP = new Bitmap(filePath))
{
    // your code....
    sw.Flush();
}
Everything declared in the head of the using block is definitely disposed of when the block is left.
You could also set your variables to null after they are disposed.
You should use the using statement instead of calling Dispose() wherever possible. That way, you can see right at the declaration that the instance you just created will be freed.
Which is better?
Bitmap bmp = new Bitmap(filePath);
// .. pages of code goes here ..
bmp.Dispose(); // hopefully not forgotten
or
using (Bitmap bmp = new Bitmap(filePath))
{
    // .. pages of code goes here ..
}
The using statement also ensures that all instances are freed even if you leave the current block/method prematurely with return, break, or even an exception.
Note that you can put multiple declarations into the head of a single using statement!
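For example (the variables must all be of the same type, and the file names here are just placeholders):
using (Bitmap first = new Bitmap("one.bmp"), second = new Bitmap("two.bmp"))
{
    // Both bitmaps are disposed when the block exits, even on an exception.
}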
As with any garbage collection issue, my approach would be to start commenting things out and seeing if I can still reproduce the memory leak. Things to try out:
Instantiate only one instance of Bitmap in the loop to see if the memory usage changes.
Dispose/not Dispose that one instance of Bitmap to see if disposing makes any difference.
From what I can see in the latest version of the code, whiteCoatBitmap, yellowCoatBitmap and blackPatchBitmap are not being disposed. Surround those with a using block as well.
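For instance, the inner part of the loop from the question could become something like this (a sketch reusing the same helper classes; only the Bitmap lifetimes change):
using (Bitmap whiteCoatBitmap = Rgb2Hsi.Rgb2Hsi.GetHuePlane(imageBody))
{
    float WhiteCoatPixelPercentage = Rgb2Hsi.Rgb2Hsi.GetWhiteCoatPixelPercentage(whiteCoatBitmap);
    sw.Write("1:" + WhiteCoatPixelPercentage + " ");
}

Quaternion.Quaternion qtr = new Quaternion.Quaternion(-15);
using (Bitmap yellowCoatBMP = qtr.processImage(imageBody))
{
    float yellowCoatPixelPercentage = qtr.GetYellowCoatPixelPercentage(yellowCoatBMP);
    sw.Write("2:" + yellowCoatPixelPercentage + " ");
}

using (Bitmap balckPatchBitmap = BlackPatchDetection.BlackPatchDetector.MarkBlackPatches(imageBody))
{
    float BlackPatchPixelPercentage = BlackPatchDetection.BlackPatchDetector.BlackPatchPercentage;
    sw.Write("3:" + BlackPatchPixelPercentage + "\n");
}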