In my application (.NET Framework 4.5) I'm rendering some RDLC reports (50-60) in order to export them to a single PDF.
Unfortunately there seems to be a serious memory leak: basically, every LocalReport never gets disposed.
This is my code:
public void ProcessReport(ReportDataSource[] reportDS, string reportPath)
{
    const string format = "PDF";
    string deviceInfo = null;
    string encoding = String.Empty;
    string mimeType = String.Empty;
    string extension = String.Empty;
    Warning[] warnings = null;
    string[] streamIDs = null;
    Byte[] pdfArray = null;

    using (var report = new LocalReport())
    {
        report.EnableExternalImages = true;
        report.ReportEmbeddedResource = reportPath;
        report.Refresh();

        foreach (var rds in reportDS)
        {
            report.DataSources.Add(rds);
        }
        report.Refresh();

        try
        {
            pdfArray = report.Render(format, deviceInfo, out mimeType, out encoding,
                                     out extension, out streamIDs, out warnings);
        }
        catch (Exception ex)
        {
            // guard against a null InnerException
            Console.WriteLine(ex.InnerException != null ? ex.InnerException.Message : ex.Message);
            throw;
        }

        report.ReleaseSandboxAppDomain();
        report.Dispose(); // redundant inside the using block, but left in while chasing the leak

        // Add pdfArray to MemoryStream and then to PDF - doesn't leak
    }
}
I found the memory leak just by looking at the Visual Studio memory panel: every time report.Render gets called it adds 20-30 MB, and the memory never goes down until I close the application. I'm sure the MemoryStream is not the issue, because even with that code commented out I still end up with 200-250 MB in memory that never gets released. This is bad because after running the export 3-4 times the process exceeds 1 GB and eventually won't run at all. I also tried calling the garbage collector manually, but that didn't work. The application is 32-bit.
What can I do to fix this?
I have a real solution and can explain why!
It turns out that LocalReport uses .NET Remoting to dynamically create a sub-AppDomain and run the report there, in order to avoid a leak somewhere internally. We then noticed that the report does eventually release all the memory, but only after 10 to 20 minutes. For people generating a lot of PDFs, that isn't going to work. The key, though, is that they are using .NET Remoting. One of the central parts of Remoting is something called "leasing": the marshalled object is kept around for a while, since Remoting is usually expensive to set up and the object is probably going to be used more than once. LocalReport/RDLC abuses this.
By default, the lease time is... 10 minutes! Also, each call into the object adds another 2 minutes to the wait time, so the delay can land anywhere between 10 and 20 minutes depending on how the calls line up. Luckily, you can change how long this timeout is. Unluckily, you can only set it once per AppDomain, so if you need Remoting for anything other than PDF generation you will probably have to split one of them into a separate service in order to change the defaults. To do this, all you need to do is run these four lines of code at startup:
using System.Runtime.Remoting.Lifetime;

// Shorten the Remoting lease defaults so the report's sandbox AppDomain
// objects are reclaimed within seconds instead of 10-20 minutes.
LifetimeServices.LeaseTime = TimeSpan.FromSeconds(5);
LifetimeServices.LeaseManagerPollTime = TimeSpan.FromSeconds(5);
LifetimeServices.RenewOnCallTime = TimeSpan.FromSeconds(1);
LifetimeServices.SponsorshipTimeout = TimeSpan.FromSeconds(5);
You'll see the memory use start to rise, and then within a few seconds you should see it start coming back down. It took me days with a memory profiler to really track this down and realize what was happening.
You can't wrap ReportViewer in a using statement (Dispose crashes), but you should be able to if you use LocalReport directly. After that disposes, you can call GC.Collect() if you want to be doubly sure you are doing everything you can to free up that memory.
Hope this helps!
Edit
Apparently you should call GC.Collect(0) after generating a PDF report; otherwise memory use can still climb for some reason.
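For illustration, here is a minimal sketch of that pattern under the same LocalReport API the question uses (the data source and rendering details are elided):

using (var report = new LocalReport())
{
    report.ReportEmbeddedResource = reportPath;
    // ... add data sources and render exactly as in the question ...
    byte[] pdf = report.Render("PDF");
    report.ReleaseSandboxAppDomain();
} // Dispose runs here

// Belt and braces, per the edit above: collect gen 0 after each report.
GC.Collect(0);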
Related
I am generating a large report PDF using PDF Clown, with data from a database.
The process takes a very long time and eventually runs out of memory as the page count nears 150, taking up more than 1.5 GB of RAM and failing with the error:
A first chance exception of type 'System.OutOfMemoryException' occurred in PDFClown.dll
Since I will need to regularly generate reports of over 1500 pages, this is a major problem.
Is there anything I can do to:
Not run out of memory (Necessary)
Speed up the file creation (Ideally)
Please note: the reports generated (with smaller data sets) are accurate, although the file size is rather large.
Here is a sample of my code:
protected void PopulateReport()
{
    foreach (Page page in _lstPages)
    {
        if (page != _Titlepage)
        {
            PrimitiveComposer composer = new PrimitiveComposer(page);
            BlockComposer blockComposer = new BlockComposer(composer);
            DataRow dataRow; // renamed from drInspection so the TryGetValue calls below compile
            if (_mapPage1Rows.TryGetValue(page, out dataRow))
            {
                GeneratePage1(page, composer, blockComposer, dataRow);
            }
            else if (_mapPage2Rows.TryGetValue(page, out dataRow))
            {
                GeneratePage2(page, composer, blockComposer, dataRow);
            }
        }
    }
}
// parameters added to match the call in PopulateReport above
protected void GeneratePage1(Page page, PrimitiveComposer composer, BlockComposer blockComposer, DataRow dataRow)
{
    composer.BeginLocalState();
    composer.SetFont(ReportFonts.GetFont(GetPDFDocument(), bIsFieldName, false), nFontSize);
    blockComposer.Begin(GetRectangle(fMarginX, fMarginY, fWidth, nFontSize), AlignX, AlignY);
    int nIndex = blockComposer.ShowText(strText, false);
    blockComposer.End();
    ....
    composer.End();
    composer.Flush();
}
Screenshot: sample report page (redacted for client-privacy reasons).
The function ReportFonts.GetFont(...) was creating a new font every single time it was called. Each font was then held in the DLL's memory space and embedded in the final file, which is what was taking up so much space.
Using a Dictionary<> to cache the fonts solved the issue: not only is memory use now clean and the file size acceptable, but the execution time is also much improved.
Moving to 64-bit also helped somewhat.
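For reference, a minimal sketch of the Dictionary<> cache described above. CreateFont stands in for whatever PDF Clown call originally built the font; the helper and key layout are illustrative, and this assumes a single Document per report run, since PDF Clown fonts are bound to a document:

using System.Collections.Generic;
using org.pdfclown.documents;
using org.pdfclown.documents.contents.fonts;

public static class ReportFonts
{
    // One cached font per style combination, instead of a new font per call.
    private static readonly Dictionary<string, Font> _cache = new Dictionary<string, Font>();

    public static Font GetFont(Document document, bool isFieldName, bool isBold)
    {
        string key = isFieldName + "|" + isBold;
        Font font;
        if (!_cache.TryGetValue(key, out font))
        {
            font = CreateFont(document, isFieldName, isBold); // the original creation call
            _cache.Add(key, font);
        }
        return font;
    }

    private static Font CreateFont(Document document, bool isFieldName, bool isBold)
    {
        // Hypothetical stand-in for the project's actual font construction.
        return new StandardType1Font(document, StandardType1Font.FamilyEnum.Helvetica, isBold, false);
    }
}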
I have a Windows service that converts XML files to PDFs via LocalReport (RDLC reports). It works very well and is fast, but I'm struggling with memory leak problems similar to those described here.
The .Render("PDF") method appears to be where the problem lies. A profiling session in VS2015 shows a huge amount of memory used by ServerIdentity; this doesn't appear if I comment out .Render("PDF").
Specs
Application is .NET 4.5
Application uses ReportViewer library from VS2012 (I've tried 2015, too, and same behavior)
Built for 'Any CPU', runs as x86 presumably
Reports aren't huge, but we run quite a few so after a few weeks memory gets used up (1 page reports, maybe a dozen or more a day).
Code
public static byte[] GenerateReport(string name, IEnumerable<ReportDataSource> dataSources)
{
    try
    {
        byte[] pdfBytes;
        using (Stream reportStream = Assembly.GetExecutingAssembly().GetManifestResourceStream(name))
        {
            LocalReport localReport = new LocalReport();
            localReport.LoadReportDefinition(reportStream);
            foreach (var dataSource in dataSources)
            {
                localReport.DataSources.Add(dataSource);
            }
            pdfBytes = localReport.Render("PDF");

            // Cleanup!
            localReport.DataSources.Clear();
            localReport.ReleaseSandboxAppDomain();
            localReport.Dispose();
        }
        return pdfBytes;
    }
    catch (Exception ex)
    {
        throw new Exception("Error generating report at " + name, ex);
    }
}
Other notes
If I never call .Render("PDF"), the memory never grows beyond 20 MB or so.
Similarly, if I never render, there are no ServerIdentity or Microsoft.ReportingServices.OnDemandProcessing.* objects in the heap. That is where 90%+ of the memory is consumed, and I have no way to GC it.
Change the compilation target of your application to x64 and call
report.ReleaseSandboxAppDomain();
after the Render method. Remember to Dispose the report object.
Using Rotativa 1.6.4 from NuGet, I have noticed the following issue with the code below:
ActionAsPdf hangs randomly for an indeterminate amount of time.
The code that hangs:
var pdfResult = new ActionAsPdf("Report", new {id = Request.Params["id"]})
{
Cookies = cookieCollection,
FormsAuthenticationCookieName = FormsAuthentication.FormsCookieName,
CustomSwitches = "--load-error-handling ignore"
};
Background info that may help:
The CustomSwitches value is there to work around a documented issue when calling wkhtmltopdf.exe via ActionAsPdf; it suppresses errors only in the wkhtmltopdf call, not in the application code.
Observations, usage and testing:
It works, but when running the application (whether or not stepping through code), anywhere from 10 seconds to about 4 minutes can pass between hitting pdfResult = new ActionAsPdf and finally entering the "Report" action. I can't discern anything actually happening in the Visual Studio output window, and no errors are thrown that I have found; it's just a randomly slow transition into the Reports() action.
I can run the Reports() action directly via URL and it never slows like this; PDF generation is quite fast. I am using ActionAsPdf to obtain the binary so I can save it to the file system and send it via email, which is the prescribed method of doing so for this library.
The behavior exists on both a local Windows 10 dev box and a remote Server 2008 R2 test box, with .NET 4.5.1 and default IIS on each.
Questions I have:
Any idea what might cause this slowdown, and how to remedy it?
I ended up using UrlAsPdf() instead of ActionAsPdf(), and it works. There seem to be some issues with ActionAsPdf(), and I have filed a bug with the Rotativa project on GitHub. ActionAsPdf() is still marked as beta, so hopefully it gets fixed in future versions or by the community.
In my case, I had to make a few more tweaks along with using UrlAsPdf(). I narrowed the issue down to the cookie collection I was adding, so I tried adding only the cookie I actually needed, and the issue was resolved. Here is the sample code I used:
var report = new UrlAsPdf(url);
Dictionary<string, string> cookieCollection = new Dictionary<string, string>();
foreach (var key in Request.Cookies.AllKeys)
{
    if (Crypto.Hash("_user").Equals(key))
    {
        cookieCollection.Add(key, Request.Cookies.Get(key).Value);
        break;
    }
}
report.Cookies = cookieCollection;
report.FormsAuthenticationCookieName = FormsAuthentication.FormsCookieName;
I have a WinForms/OCX application that consumes a QlikView document. We have gotten a patch from QlikView so that RefreshDocument in the OCX works the way RefreshDocument does in the QlikView application. But the QlikView application shows a nice enabled button when the document has been reloaded on the server.
Does anyone know what needs to be done to detect that, either in C#, in macro code, or via the Management API?
This is the ReloadDocument code:
private void button2_Click(object sender, EventArgs e)
{
    var myBloodybookmarkHack = "dynaBookmark" + Guid.NewGuid().ToString().Replace("-", "");
    axQlikOCX1.ActiveDocument.CreateUserBookmark(myBloodybookmarkHack, true);
    //axQlikOCX1.OpenDocument(@"qvp://qvSeverName/path/MyDocument.qvw?bookmark=Server\dynaBookmarkb5aa82ae467540fdb0d18bb499044ed9");
    axQlikOCX1.RefreshDocument();
    axQlikOCX1.ActiveDocument.RecallUserBookmark(myBloodybookmarkHack);
    axQlikOCX1.ActiveDocument.RemoveUserBookmark(myBloodybookmarkHack);
}
By suppressing the paint event I got this to run pretty well. The next patch will also make it keep the selections (to be fixed in 11.2 service release 6).
You need to detect whether CreateUserBookmark was successful, and skip restoring the bookmark if the creation failed.
This code works in QV 11.2 service release 5.
The file system reports a new modified time when the .qvw file is rewritten after a reload (assuming the data portion of this application is not broken out from the .qvw file), so you could likely come very close to accomplishing this by checking for new timestamps. Alternatively, if logging is enabled in the .qvw document, you could read the log text file* that QlikView generates to accomplish the same thing.
*The text-file writes are sometimes delayed, so your file might be refreshed a little before the log says it is.
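A minimal sketch of that timestamp idea, assuming this process can see the .qvw on the server's file system (the path is hypothetical and the polling cadence is up to you):

using System;
using System.IO;

// Fields on the form: remember the last write time we saw.
private string _qvwPath = @"\\qvServerName\path\MyDocument.qvw"; // hypothetical path
private DateTime _lastSeen = DateTime.MinValue;

// Call this from a timer tick; it returns true once after each server reload.
private bool QvwWasRewritten()
{
    DateTime current = File.GetLastWriteTimeUtc(_qvwPath);
    if (current <= _lastSeen)
        return false;
    _lastSeen = current;
    return true; // e.g. enable your "refresh" button when this fires
}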
We ended up using the QV Management API to get the last task reload time.
Download the QV Management API demo from QlikView.
This code shows how to get the tasks for a document; through it you can find when the last document reload task finished.
private DateTime GetLastDocumentRun(string documentName)
{
    string QMS = "http://MyQlikviewserver:4799/QMS/Service";
    var client = new QMSClient("BasicHttpBinding_IQMS", QMS);
    string key = client.GetTimeLimitedServiceKey();
    ServiceKeyClientMessageInspector.ServiceKey = key;
    var taskStatusFilter = new TaskStatusFilter();
    var clientTaskStatuses = client.GetTaskStatuses(taskStatusFilter, TaskStatusScope.All);
    foreach (var taskStatus in clientTaskStatuses)
    {
        Trace.WriteLine(taskStatus.General.TaskName);
        if (taskStatus.General.TaskName.ToLower().Contains(documentName.ToLower()))
        {
            string fin = taskStatus.Extended.FinishedTime + "";
            DateTime finishedTime;
            if (DateTime.TryParse(fin, out finishedTime))
                return finishedTime;
            Logger.ErrMessage("QvManagementApi.GetLastDocumentRun",
                new Exception("Task finished time did not return a valid datetime value: " + fin));
            return DateTime.MinValue;
        }
    }
    return DateTime.MinValue;
}
This is slow, so you should run it on a different thread.
Also, this does not tell you whether the task reloaded successfully. We haven't fixed that yet, but taskStatus.Extended exposes the last log, which you can parse as text to see whether the reload succeeded.
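Since the QMS call blocks, here is a small sketch of pushing it onto a background thread with Task.Run (available in .NET 4.5; the document name is illustrative):

using System.Threading.Tasks;

Task.Run(() =>
{
    DateTime lastRun = GetLastDocumentRun("MyDocument"); // method from above
    // Marshal back to the UI thread here if you need to update controls.
});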
If I understand correctly, you want to know when a document has finished reloading on the QlikView server, right?
If your OCX application has a constant connection, you could evaluate the ReloadTime() function in the document, which tells you when the document was last reloaded. If you poll that function while issuing a DocumentRefresh, you will see the timestamp change once the newly reloaded document becomes available on the server.
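For what it's worth, a hedged sketch of that polling idea. It assumes the OCX's ActiveDocument exposes the standard QlikView automation Evaluate() call, returning the expression result as text the way it does in macro code; verify this against your OCX version:

// Compare ReloadTime() before and after issuing the refresh; a changed
// timestamp means the newly reloaded document is available on the server.
string before = axQlikOCX1.ActiveDocument.Evaluate("ReloadTime()").ToString();
axQlikOCX1.RefreshDocument();
string after = axQlikOCX1.ActiveDocument.Evaluate("ReloadTime()").ToString();
bool reloaded = !string.Equals(before, after);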
The code you posted does not reload a QlikView document, at least not in QlikView lingo; it just opens the document on the server.
Please elaborate if I have misunderstood you.
Regards Torber
I need to find out when a file was created. I have tried using:
FileInfo fi = new FileInfo(FilePath);
var creationTime = fi.CreationTimeUtc;
and
var creationTime = File.GetCreationTimeUtc(FilePath);
Both methods generally return the wrong creation time; I guess it is being cached somewhere.
The file is deleted and re-created with the same name, and I need to know when/if it has been re-created. I had planned to detect this by checking whether the file creation time had changed, but I have found this to be inaccurate.
I'm working on Win 7 and if I check File Explorer it shows the new file creation time correctly.
I have also tried using the FileSystemWatcher but it doesn't entirely work for my use case. E.g. if my program is not running, the FileSystemWatcher is not running, so when my program starts up again I don't know if the file has been deleted and recreated or not.
I've seen the MSDN page http://msdn.microsoft.com/en-us/library/system.io.file.getcreationtime.aspx, where it says:
This method may return an inaccurate value, because it uses native functions whose values may not be continuously updated by the operating system.
But I have also tried their alternative suggestion of setting the creation time explicitly after creating a new file, and I find that this doesn't work either. See the test below:
[Test]
public void FileDateTimeCreatedTest()
{
    var binPath = System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().GetName().CodeBase);
    var fullFilePath = Path.Combine(binPath, "Resources", "FileCreatedDatetimeTest.txt");
    var fullFilePathUri = new Uri(fullFilePath);

    var dateFormatted = "2013-08-17T15:31:29.0000000Z"; // this is a UTC string
    DateTime expectedResult = DateTime.MinValue;
    if (DateTime.TryParseExact(dateFormatted, "o", CultureInfo.InvariantCulture,
        DateTimeStyles.AssumeUniversal, out expectedResult)) // we expect the saved datetime to be in UTC.
    {
    }

    File.Create(fullFilePathUri.LocalPath);
    Thread.Sleep(1000); // give the file creation a chance to release any lock
    File.SetCreationTimeUtc(fullFilePathUri.LocalPath, expectedResult); // physically check what time this puts on the file. It should get the local time 16:31:29 local
    Thread.Sleep(2000);

    var actualUtcTimeFromFile = File.GetCreationTimeUtc(fullFilePathUri.LocalPath);
    Assert.AreEqual(expectedResult.ToUniversalTime(), actualUtcTimeFromFile.ToUniversalTime());

    // clean up
    if (File.Exists(fullFilePathUri.LocalPath))
        File.Delete(fullFilePathUri.LocalPath);
}
Any help much appreciated.
You need to use Refresh:
FileSystemInfo.Refresh takes a snapshot of the file from the current file system. Refresh cannot correct the underlying file system even if the file system returns incorrect or outdated information. This can happen on platforms such as Windows 98.
Calls must be made to Refresh before attempting to get the attribute information, or the information will be outdated.
The key bits from MSDN are that Refresh "takes a snapshot" and that without it the attribute information "will be outdated".
Try using FileInfo and its Refresh method:
fileInfo.Refresh();
var created = fileInfo.CreationTime;
This should work.
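In other words, something like this (a small sketch; the helper name is illustrative). Refresh() matters most when you keep one FileInfo instance around, since it discards the cached attribute snapshot:

using System;
using System.IO;

static DateTime GetFreshCreationTimeUtc(string path)
{
    var fileInfo = new FileInfo(path);
    fileInfo.Refresh(); // re-read attributes from the file system
    return fileInfo.CreationTimeUtc;
}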
File.Create(fullFilePathUri.LocalPath);
Thread.Sleep(1000); // give the file creation a chance to release any lock
That is not how you do it. File.Create returns a FileStream, which should be closed to release the lock; no waiting is required. If you find yourself using Thread.Sleep, you will often find that you are doing something wrong.
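For example, the test's file creation could be written like this (same names as the test above); disposing the returned FileStream releases the lock immediately, so the Thread.Sleep calls become unnecessary:

using (File.Create(fullFilePathUri.LocalPath))
{
    // The stream is closed and the lock released when the using block exits.
}
File.SetCreationTimeUtc(fullFilePathUri.LocalPath, expectedResult);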
If the file described in the path parameter does not exist, this method returns 12:00 midnight, January 1, 1601 A.D. (C.E.) Coordinated Universal Time (UTC), adjusted to local time.
https://learn.microsoft.com/en-us/dotnet/api/system.io.file.getcreationtime?view=netframework-4.8
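Given that documented sentinel value, a small guard like this (illustrative helper name) avoids mistaking the year-1601 value for a real timestamp:

using System;
using System.IO;

static DateTime? GetCreationTimeUtcOrNull(string path)
{
    // GetCreationTimeUtc returns 1601-01-01 (adjusted to local time) for a
    // missing file, so test for existence instead of trusting the result.
    return File.Exists(path) ? (DateTime?)File.GetCreationTimeUtc(path) : null;
}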