I have made a custom report viewer control that utilizes the output of an SSRS render request. I am having difficulty matching the speed of the SSRS ReportViewer control. The report is 2.5 MB.
Watching the requests and responses over the wire, I can see it takes roughly 2.8 seconds to get the buffer of contents from SSRS. However, it takes 10-15 seconds to show the content inside an HTML element in my current view.
The odd part is that when I log into the SSRS manager directly and render the report, it takes just as long to process; however, it only takes 3 seconds to render in the SSRS ReportViewer control.
The original design was to place the HTML as a payload on the AJAX response. Then I thought that the JSON serialization/deserialization could be the bottleneck, so I added the ability to save the output to a temp folder on the server as HTML and call .load(url) instead... no improvement.
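The server-side piece of that temp-folder variant is essentially this (simplified sketch; GetSsrsOutput stands in for the actual SSRS render call, and the folder name is made up):
// Sketch of an MVC controller action for the temp-folder variant.
// GetSsrsOutput is a hypothetical stand-in for the SSRS render request.
public ActionResult RenderReport(int reportId)
{
    byte[] rendered = GetSsrsOutput(reportId); // hypothetical SSRS render call
    string fileName = Guid.NewGuid().ToString("N") + ".html";
    string path = Server.MapPath("~/TempReports/" + fileName);
    System.IO.File.WriteAllBytes(path, rendered);

    // Hand the client a URL so it can call $('#reportContent').load(url)
    return Json(new { content = Url.Content("~/TempReports/" + fileName), isUrl = true });
}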
I am using the code below to load the content in my page from the Ajax.BeginForm success function:
function setReportContent(content, isUrl) {
    if (isUrl)
        $('#reportContent').load(content);
    else
        $('#reportContent').html(content);

    showReportWaitIndicator(false);
    $('#reportContent').show();
}
I can't figure out why there is such a huge discrepancy in rendering times between the SSRS ReportViewer control and the way I am loading the content.
Is there a more efficient technique for loading large content than .load(url)?
Edit: I downloaded the HTML file and opened it directly in the browser, and the markup rendered instantly. Why does appending the contents to the DOM take up to 20 seconds?
Thanks
I found one solution, though it uses an IFrame and I am iffy on that; but it matches the speed of the SSRS ReportViewer control. In a nutshell, loading content into an IFrame avoids some of the overhead of placing the contents in the DOM. I made a small adjustment to the setup and it works.
function setReportContent(content, isUrl, renderInIFrame) {
    if (isUrl) {
        if (renderInIFrame)
            $('#reportContent').html("<iframe id='reportFrame' scrolling='yes' onload='resizeReportFrame();' src='" + content + "'></iframe>");
        else
            $('#reportContent').load(content);
    }
    else {
        if (renderInIFrame) {
            $('#reportContent').html("<iframe id='reportFrame' scrolling='yes' onload='resizeReportFrame();'></iframe>");
            $('#reportFrame').contents().find('html').html(content);
        }
        else
            $('#reportContent').html(content);
    }

    showReportWaitIndicator(false);
    $('#reportContent').show();
}
In my ASP.NET page (runtime version 3.5) there is a requirement to display a PDF in the middle of the page along with other content.
I found a solution for this using an IFrame, but the PDF has to be displayed on a LinkButton click event.
HTML part
<div id='Pdfviewer'>
<iframe id="ifrDisplay" runat="server" scrolling="auto" width="600" height="700"></iframe>
</div>
C# code part
protected void LinkButton1_Click(object sender, EventArgs e)
{
    string pdfUrl = "https://test-folder.com/rxdata//P09/201305240039MA9_000/Data/00000010.pdf";
    ifrDisplay.Attributes.Add("src", pdfUrl);
}
The issue is that even if the content of that particular PDF changes, the page still loads the PDF with the older content.
So it is clearly a caching-related issue.
Please let me know how to bypass the cache that keeps serving the older content of the PDF file.
--Sarthak
To avoid caching you can append a unique value to the URL so that the browser does not cache the response:
string pdfUrl = "https://test-folder.com/rxdata//P09/201305240039MA9_000/Data/00000010.pdf?nocache=" + DateTime.Now.Ticks;
The second option is for whoever serves the PDF to set the response's cache headers (for example, an expiry in the past) so it does not get cached in the browser.
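If you serve the PDF from your own page or handler, that can look roughly like this (a sketch; pdfBytes stands in for however you load the PDF from disk or the DB):
// Inside the ASPX page/handler that streams the PDF: disable caching
// so the browser always fetches fresh content. pdfBytes is a placeholder.
Response.Clear();
Response.ContentType = "application/pdf";
Response.Cache.SetCacheability(HttpCacheability.NoCache); // Cache-Control: no-cache
Response.Cache.SetNoStore();
Response.Cache.SetExpires(DateTime.UtcNow.AddDays(-1));   // expiry in the past
Response.BinaryWrite(pdfBytes);
Response.End();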
Is there any way to convert an HTML table to an image file? I have one HTML table which contains controls like Labels, GridViews, and CheckBoxes. How can I convert this table into an image file for creating a PDF?
This is not a trivial task. Ultimately, you need an HTML renderer to convert the HTML to pixels.
Fortunately, you don't need to write your own. You can use the WebBrowser control.
You can see some code I wrote to extract a thumbnail image from a given URL in the article Creating Website Thumbnails in ASP.NET.
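Not the article's code, but a minimal sketch of the WebBrowser route (it assumes an STA thread and a fixed viewport size; the URL and output path are placeholders):
using System;
using System.Drawing;
using System.Windows.Forms;

static class HtmlToImage
{
    [STAThread]
    static void Main()
    {
        var browser = new WebBrowser
        {
            ScrollBarsEnabled = false,
            ScriptErrorsSuppressed = true,
            Size = new Size(1024, 768) // fixed viewport for the capture
        };
        browser.DocumentCompleted += (s, e) =>
        {
            // DrawToBitmap is inherited from Control; it works for WebBrowser
            // in most cases, though it is not officially supported for it.
            using (var bmp = new Bitmap(browser.Width, browser.Height))
            {
                browser.DrawToBitmap(bmp, new Rectangle(0, 0, browser.Width, browser.Height));
                bmp.Save(@"C:\temp\page.png", System.Drawing.Imaging.ImageFormat.Png);
            }
            Application.ExitThread(); // stop the message loop once saved
        };
        browser.Navigate("http://example.com/table.html");
        Application.Run(); // pump messages until DocumentCompleted fires
    }
}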
If this is a one-off, load it up in your browser and take a screenshot (Alt+PrtScr on Windows captures just the current application).
If you need to do this repeatedly or in an automated way, you can either automate a browser control or use a headless browser.
You should also look into WebKit2Png, which renders the page in the same manner as Chrome and other WebKit browsers and then saves it as a PNG. It can optionally simulate JavaScript, etc., too.
Similarly, there's wkhtmltopdf, which works on all platforms.
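From C#, shelling out to either tool is straightforward. A rough sketch (paths are placeholders; the wkhtmltopdf project also ships a wkhtmltoimage binary for PNG output, and it must be installed and on the PATH):
// Sketch: shell out to wkhtmltoimage to turn an HTML file into a PNG.
var psi = new System.Diagnostics.ProcessStartInfo
{
    FileName = "wkhtmltoimage",
    Arguments = "\"C:\\temp\\table.html\" \"C:\\temp\\table.png\"",
    UseShellExecute = false,
    CreateNoWindow = true
};
using (var process = System.Diagnostics.Process.Start(psi))
{
    process.WaitForExit(); // block until the PNG has been written
}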
Converting HTML Table into Image File
$("#selfAssessmentPowerPointDownload").on('mouseover', function () {
var element = $("#html-content-holder"); // global variable
var getCanvas; // global variable
html2canvas(element, {
onrendered: function (canvas) {
getCanvas = canvas;
var imgageData = getCanvas.toDataURL("image/png");
var newData = imgageData.replace(/^data:image\/png/, "data:application/octet-stream");
$("#selfAssessmentPowerPointDownload").attr("download", "gridData.png").attr("href", newData);
}
});
});
<a id="selfAssessmentPowerPointDownload" href="#">Download</a>
<table id="html-content-holder"></table>
DataTables is a powerful jQuery plugin for working with tables. One of the options DataTables offers is exporting your table to PDF or Excel. You can find many examples on its website.
I am trying to display HTML files (containing local images) in a WebBrowser control. The user can select a file using an OpenFileDialog, after which it is displayed in the control.
But I have these problems I am struggling to solve:
Ever since I added the control to my Form, it has been loading really slowly. It takes almost 10 seconds for the form to instantiate.
WebBrowser.Navigate only works the first time. When I try to load the second file, nothing happens. I have tried calling Refresh, Update, OpenNew, and opening about:blank between two files, but I just don't have a clue how to do it properly. Only the initially opened file remains shown; no exceptions or warnings ever pop up when I try to navigate to a different page.
Am I doing something wrong? For example, Lutz Roeder's Writer starts instantly and loads subsequent files without problems, but it uses lots of interop (and is editable), so I am trying to avoid all that stuff.
The way I have been loading local .html files into a WebBrowser is like so:
OpenFileDialog ofd = new OpenFileDialog();
// Do filtering here
if (ofd.ShowDialog() == System.Windows.Forms.DialogResult.OK)
{
    webBrowser1.DocumentText = System.IO.File.ReadAllText(ofd.FileName);
}
It can load files one after another with no problems. If you are trying to load a big HTML file while the form is initializing or opening, that could explain the 10-second loading time. My form loaded almost instantly with a WebBrowser control on it.
Hope this helps!
Edit: Try setting the stream of the WebBrowser instead:
System.IO.Stream s = System.IO.File.OpenRead(ofd.FileName);
webBrowser1.DocumentStream = s; // the stream must stay open until the document finishes loading
I'm serving up PDFs from a SQL db and presenting them in the browser. I'm trying to figure out a way to embed a number in the PDF dynamically so that the recordID for that PDF's SQL record is available to me when the user submits the XML form data. The user hits the submit button on the form and the form submits its XML data to my submission page. If there were some way of changing the submission URL on the fly, I could use a query string to pass myself the recordID. I'm not generating the PDF in code; it's being created by hand and then uploaded to my site.
Edit
The user is given a link, someServer.com/pdfLink.aspx?formID=5; they go there and that page pulls a PDF from the DB and displays it. This pulls up Acrobat in the browser full size, so my aspx page isn't in control of submitting the completed form; Acrobat is. The user fills out the form and hits the submit button in the form. This submit button was set up at form design time to point to another page, someSite.com/pdfSubmit.aspx. The submit button posts the XML data to that page and I can process it. I need the recordID in the query string for the someSite.com/pdfSubmit.aspx page. To do this I would need to modify the PDF to either add the recordID and query string to the submit button's submit URL, or embed it in the PDF elsewhere. The big question is how I modify the PDF, just before I display it via someServer.com/pdfLink.aspx?formID=5, to do either of those two things.
Embedding a number in a PDF is not exactly kosher, but there are some things you can do that will honor the spec.
The current PDF spec says that the last line of the file shall contain only the end-of-file marker:
%%EOF
but there is some wiggle room: the implementation notes say it doesn't technically have to be the last line of the file, only that it has to appear in the last 1K. Generally speaking, if you don't muck with things too much, most compliant readers won't even blink. If I had to do this, I would be inclined to add a newline (if there isn't one), then a % (which is a PDF comment), a marker to let me know it's mine, and finally the number. So something like:
// assume we already know it ends with %%EOF
void AppendNumberToPdf(Stream stm, int number, bool addNewline)
{
    stm.Seek(0, SeekOrigin.End); // go to EOF
    StreamWriter writer = new StreamWriter(stm, new ASCIIEncoding(), 1024);
    writer.WriteLine(string.Format("{0}% {1} {2}", (addNewline ? "\n" : ""), kMyMarkerString, number));
    writer.Flush();
}
kMyMarkerString should be something like "MyApplicationDocumentIdentifier:" or some such thing that will make it easy to identify your tracks.
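If you later need to pull the number back out, something along these lines should work against the format above (an untested sketch):
// Sketch: scan the last 1K of the file for the marker comment and
// parse the number that follows it. Returns null if no marker is found.
int? ReadNumberFromPdf(Stream stm, string marker)
{
    int tail = (int)Math.Min(1024, stm.Length);
    stm.Seek(-tail, SeekOrigin.End);
    byte[] buffer = new byte[tail];
    stm.Read(buffer, 0, tail);
    string text = Encoding.ASCII.GetString(buffer);
    int index = text.LastIndexOf(marker);
    if (index < 0)
        return null;
    string rest = text.Substring(index + marker.Length).Trim();
    int number;
    return int.TryParse(rest.Split()[0], out number) ? (int?)number : null;
}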
The querystring is read-only, so you cannot dynamically change it at runtime. However, you can:
1. Add the recordID to the form at the time the submit page is initially rendered.
2. Process the submitted form and then do a Response.Redirect or Server.Transfer to the correct page with the recordid parameter added to the querystring (sketched below).
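A rough sketch of that second option (ExtractRecordId is a made-up placeholder for however you pull the id out of the posted XML):
// In the pdfSubmit.aspx code-behind: read the posted XML, find the
// record id, and forward on. ExtractRecordId is hypothetical.
protected void Page_Load(object sender, EventArgs e)
{
    string xml = new System.IO.StreamReader(Request.InputStream).ReadToEnd();
    int recordId = ExtractRecordId(xml); // substitute your own parsing
    Server.Transfer("processForm.aspx?recordid=" + recordId);
}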
While trying @plinth's suggestion I realized I had to change from XML submission (since his approach put the data on the PDF directly). So I changed the form to submit as XDP, which has the XML data plus an embedded PDF reference. When I viewed the raw XDP that the form submitted, I ran across this:
<?xml version="1.0" encoding="UTF-8" ?>
<?xfa generator="XFA2_4" APIVersion="3.0.8262.0"?>
<xdp:xdp xmlns:xdp="http://ns.adobe.com/xdp/" timeStamp="2010-05-04T15:15:00Z" uuid="6d0944c8-1573-442c-9c85-11e372bd38c3">
  <xfa:datasets xmlns:xfa="http://www.xfa.org/schema/xfa-data/1.0/">
    <xfa:data>
      <form1>
        <TextField1>TestMe</TextField1>
      </form1>
    </xfa:data>
  </xfa:datasets>
  <pdf href="ViewPDF.aspx?formID=10" xmlns="http://ns.adobe.com/xdp/pdf/" />
</xdp:xdp>
Notice the second-to-last line: it automatically includes the PDF's URL, which has the formID value I needed. So all I had to do was have the form submit XDP instead of pure XML, and it gives me everything I need.
I'm making an ASP.NET web page that generates pages from SQL when they're not cached. Their loading time can be anywhere from 300 ms to 1.5 seconds (no "fix your database" replies, please).
I personally find these values too long, and I am looking for a way to inform the user that the page they are visiting will take a moment to load.
I was looking for a solution that could work from the Page_Load function, if that's possible. The perfect solution for me would show the user either an animated GIF or text saying the page is being generated.
On a side note, I come mostly from programming Windows applications.
Here is an example of how to use the Response object to flush content to the browser and continue processing:
using System;
using System.Web.UI;
using System.Threading;

public partial class _Default : Page
{
    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);

        Response.Write("<h1>please wait...</h1>");
        Response.Flush();

        // simulate load time
        Thread.Sleep(2000);

        Response.Write("<h1>finished</h1>");
    }
}
You can start rendering the page and then flush the buffer by calling Response.Flush(), which will send the contents of the buffer to the browser. You will then need to turn off the graphic once the page has loaded.
Another option would be to use AJAX to load the images: load the entire page without the images, then initiate another request to get them. This might be easier than trying to render a partial page.
1.5 seconds isn't bad for a page load; are you sure this is worth your time and effort?
You're going to want to first output the loading graphic and then flush the output buffer, using Response.Flush(), so that the content so far is sent to the user's browser.
When you output the rest of the content, you will need a bit of JavaScript in there to remove the elements sent earlier so that the loading graphic goes away.
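A rough sketch of the whole pattern (the element id and spinner image are made up):
// In Page_Load or OnLoad: write the indicator, flush, do the slow work,
// then emit a small script that hides the indicator.
Response.Write("<div id='loadingIndicator'><img src='spinner.gif' /> Generating page...</div>");
Response.Flush();

// ... expensive SQL/page generation happens here ...

Response.Write("<script>document.getElementById('loadingIndicator').style.display = 'none';</script>");
Response.Flush();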