Dynamic web pages within a C# WinForms app

I have a winforms app that I would like to add some new features to.
Ideally I would have a new form with an embedded browser control. I want the pages in the browser to be 'served' from the app itself (as opposed to a remote server).
The pages will be created dynamically, depending on data from within the app.
Also, how do I cater for references to assets like CSS, JavaScript, and image files? Ideally these would be handled by the application as well.
How do I do this?

I use this technique in my application. I host a WebBrowser control and populate it as follows:
public void DisplayHtml(HtmlGenerator gen)
{
    webBrowser.DocumentText = gen.GenerateHtmlString();
}
Using this method, I don't have to actually generate a file on my file system with the HTML content.

Use the WebBrowser control.
You can set the DocumentText property to the HTML you want to display (thanks @Anton Semenov).
Alternatively, you can feed it local file URLs from files that your application creates.
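For example, here is a minimal sketch combining both approaches (the class name and temp-folder location are illustrative, not from the answers above):

using System;
using System.IO;
using System.Windows.Forms;

public class BrowserForm : Form
{
    private readonly WebBrowser webBrowser = new WebBrowser { Dock = DockStyle.Fill };

    public BrowserForm()
    {
        Controls.Add(webBrowser);
    }

    // Option 1: in-memory HTML; no file on disk is needed.
    public void ShowHtml(string html)
    {
        webBrowser.DocumentText = html;
    }

    // Option 2: write the page next to its assets (CSS, JS, images) and
    // navigate via a file:// URL so relative references resolve.
    public void ShowGeneratedPage(string html)
    {
        string dir = Path.Combine(Path.GetTempPath(), "MyAppPages"); // illustrative location
        Directory.CreateDirectory(dir);
        string pagePath = Path.Combine(dir, "index.html");
        File.WriteAllText(pagePath, html);
        webBrowser.Navigate(new Uri(pagePath));
    }
}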

In my test management tool (this one, if I'm allowed to add a link), I have written my own "mini ASP" by having HTML pages with C# code inside and processing them dynamically: each page is converted into C#, the code is compiled, and then it is executed.
Be aware that this can clutter the application domain over time, since you cannot unload the dynamically loaded script code.
An excerpt from such an HTML file looks like this:
<div id="title">
<img src="../_Shared/images/32x32/component_blue_view.png" />
<h1>Test case "[$=tc.Title$]" - Details</h1>
</div>
Here, the [$= and $] are the equivalents of <%= and %>.
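A minimal sketch of the translation step might look like the following. This is not the author's actual engine; the Render signature, the TestCase type, and the tc parameter are assumptions based on the excerpt above:

using System.Text;
using System.Text.RegularExpressions;

static string TranslateTemplate(string template)
{
    // Turn literal HTML into output.Write("...") calls and [$= expr $]
    // markers into output.Write(expr); the result is compiled and executed.
    var code = new StringBuilder();
    code.AppendLine("void Render(System.IO.TextWriter output, TestCase tc)");
    code.AppendLine("{");
    int pos = 0;
    foreach (Match m in Regex.Matches(template, @"\[\$=(.*?)\$\]"))
    {
        string literal = template.Substring(pos, m.Index - pos);
        code.AppendLine("    output.Write(@\"" + literal.Replace("\"", "\"\"") + "\");");
        code.AppendLine("    output.Write(" + m.Groups[1].Value.Trim() + ");");
        pos = m.Index + m.Length;
    }
    code.AppendLine("    output.Write(@\"" + template.Substring(pos).Replace("\"", "\"\"") + "\");");
    code.AppendLine("}");
    return code.ToString(); // compile with CodeDOM/Roslyn, then invoke Render
}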
In another project I did something similar with the Microsoft VBScript interpreter; instead of compiling the code to C#, I compile it as VBScript and let the VBScript engine of the Microsoft Scripting Host execute it.
To handle resources like images and CSS, you can simply ship your own integrated web server. I did this successfully in several projects by including this CodePlex project.
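If you'd rather roll your own than pull in a dependency, here is a minimal sketch of such an in-process server (the prefix, class name, and folder layout are illustrative):

using System;
using System.IO;
using System.Net;
using System.Text;
using System.Threading.Tasks;

class MiniAssetServer
{
    private readonly HttpListener listener = new HttpListener();
    private readonly string assetRoot; // folder holding the generated HTML, CSS, JS, images

    public MiniAssetServer(string prefix, string assetRoot) // e.g. "http://localhost:8081/"
    {
        this.assetRoot = assetRoot;
        listener.Prefixes.Add(prefix);
    }

    public async Task RunAsync()
    {
        listener.Start();
        while (listener.IsListening)
        {
            HttpListenerContext ctx = await listener.GetContextAsync();
            // Map the request path to a file under assetRoot. A real server
            // should also guard against ".." path traversal; omitted here.
            string path = Path.Combine(assetRoot,
                ctx.Request.Url.AbsolutePath.TrimStart('/').Replace('/', Path.DirectorySeparatorChar));
            byte[] body = File.Exists(path)
                ? File.ReadAllBytes(path)
                : Encoding.UTF8.GetBytes("<h1>Not found</h1>");
            ctx.Response.StatusCode = File.Exists(path) ? 200 : 404;
            await ctx.Response.OutputStream.WriteAsync(body, 0, body.Length);
            ctx.Response.Close();
        }
    }
}

The embedded browser then simply navigates to, say, http://localhost:8081/page.html, and relative CSS/JS/image URLs resolve against the same server.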

Related

Xamarin Forms - WebView - Remote Website - Load included JavaScript file

I have a simple Xamarin Forms project that has a WebView which is used to load an ASP.NET website on our intranet. Currently there is only an Android implementation in the solution.
The website being loaded includes several CSS and JavaScript files. The CSS files are linked via link rel tags, while the JavaScript files are linked using script src tags.
I have put alert statements in a script on the page itself, as well as in the linked JavaScript file. Using a browser on my computer, both alert statements show up; however, in the WebView, only the alert for the page shows up.
I've also tried using the WebView.Eval method to call a JavaScript method in the linked file as well as one defined on the page itself. Calling the method defined on the page worked, but the one in the linked JavaScript file didn't.
All that has led me to the conclusion that for some reason the WebView isn't loading the JavaScript files indicated by the script src tags.
From research I have done, there is mention of having to include the JavaScript file itself in the Android and iOS projects, but those seemed to be for situations where the WebView source was being set to a constructed website, not pointed at an existing website.
Here are samples of how the files are linked:
<link rel="stylesheet" href="/App_Themes/wms.min.css" />
<script src="/Scripts/wms.js"></script>
I have tried using absolute paths, which didn't work either:
<script src="http://path.to.site/Scripts/wms.js"></script>
What am I doing wrong? How do I get the linked JavaScript files to load and be usable?
Edit: Update, apparently the file was being linked and loaded correctly; it was just being cached by the WebView and/or the Android device or emulator.
Because of that, when I made changes to the JavaScript file and tried to view those changes via the WebView, they weren't reflected, since a new copy of the JavaScript file wasn't pulled from the website.
However, when I gave the JavaScript file a new name and linked to that, the WebView got the new file, all my changes were there, and it behaved as expected.
So, I will be using the clear-cache method outlined here to clear the cache when the app starts: http://www.tipsabc.com/2015/xamarin-froms-how-to-clear-webview-cached-files/
That way, every time the app starts it will force the WebView to fetch a fresh copy of all the files.
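For reference, a minimal sketch of the Android side of that approach, as a custom renderer (the namespace and class names are illustrative, not from the linked article):

using Android.Content;
using Xamarin.Forms;
using Xamarin.Forms.Platform.Android;

[assembly: ExportRenderer(typeof(WebView), typeof(MyApp.Droid.CacheClearingWebViewRenderer))]
namespace MyApp.Droid
{
    public class CacheClearingWebViewRenderer : WebViewRenderer
    {
        // The Context constructor requires Xamarin.Forms 2.5 or later.
        public CacheClearingWebViewRenderer(Context context) : base(context) { }

        protected override void OnElementChanged(ElementChangedEventArgs<WebView> e)
        {
            base.OnElementChanged(e);
            if (Control != null)
            {
                // true clears the on-disk cache as well as the in-memory one,
                // so linked JS/CSS files are re-fetched from the site.
                Control.ClearCache(true);
            }
        }
    }
}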

Where to download the defaultss.xsl or the official URL (not res://msxml.dll/defaultss.xsl)

When I am actually interacting with IE through a .aspx file, I can refer to the default XSLT as res://msxml.dll/defaultss.xsl:
Dim objDefaultXSL As New System.Xml.XmlDocument
objDefaultXSL.Load("res://msxml.dll/defaultss.xsl")
But if I am not using IE, which URL should I use to accomplish the same?
Thanks
When I am actually interacting with IE through a .aspx file
I have no idea what you mean by this. ASPX is used server-side to dynamically create HTML or other pages and serve them with a web server to the client side. The client side can be Internet Explorer, but the ASPX code does not "interact" with the client.
But if I am not using IE, which URL should I use to accomplish the same? Thanks
res://msxml.dll/defaultss.xsl is the location of a resource inside a DLL, in this case msxml.dll. This is a Microsoft-specific DLL, used when people use XML or XSLT technologies in Internet Explorer.
When you are not targeting IE, this DLL will not be loaded, so you cannot access that file. But if, for some reason, you want that resource, simply load it and save it to disk once, then reuse it in your other code. Whether this makes any sense, I doubt it, because that file is specifically meant for IE.
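A minimal sketch of that workaround, assuming you have already saved a local copy of the stylesheet (for example by opening the res:// URL in IE once and saving the result; the path is illustrative):

using System.Xml;

class DefaultXslLoader
{
    static XmlDocument LoadDefaultStylesheet()
    {
        // Load the locally saved copy instead of the res:// URL, which only
        // resolves inside Internet Explorer / MSXML.
        var defaultXsl = new XmlDocument();
        defaultXsl.Load(@"C:\MyApp\defaultss.xsl");
        return defaultXsl;
    }
}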

Can people find and download the Code File linked to your aspx page?

So I imported this aspx page done by a former dev who worked for the company I'm in now. I found that the aspx page left by him doesn't have a code-behind file, so I assumed this wasn't the source code. I can't find the source, so I added a code file and tried to work it out on my own. But my main concern is this: clients can't access the code-behind, right? Is a manually added code file subject to the same protection?
The code-behind file is there as a place to put your server-side code. However, it's technically not necessary to have one, since you can put the code in the .aspx file using C# script tags. It's recommended to put it in the code-behind file, though, for better separation between markup and code.
It does not matter whether you add it yourself or Visual Studio adds it for you; it changes nothing in terms of access. In all events it executes on the server.
If your server is properly configured to run ASP.NET applications - which I believe it is - then IIS will not serve .cs files to a client. These will normally be accessible only through FTP. Try it yourself by browsing to any .cs file in your application :)
Also notice that what you get when you browse to an .aspx file is not the very same code you'd see in Visual Studio, but the result of that code being processed: IIS serves the resulting HTML. So even if you have server-side code in the ASPX file, it won't be visible to an end user browsing through your application.
Sounds like a web application project; in this case, the code is in the code-behind file as @TGH mentioned, and it would be compiled into the web application's DLL. Therefore, the only way to get that code is to use a tool like Telerik JustDecompile and decompile that DLL to grab the source code for EVERY file in the project. It would be much better to have the source, as these decompile tools do not recover everything in that code-behind file.

Get elements using JavaScript and XPath in Internet Explorer

I am writing a tool in C# which has a WebBrowser control in it, and it uses IE 8. Now I want to write a JavaScript snippet which I can run on the webpage inside the WebBrowser to get some elements.
As document.evaluate does not work in Internet Explorer, is there any other way I can achieve this? I have seen some other similar posts, however they relate to writing the JavaScript directly into the HTML using some libraries. I am new to all this, so can someone please tell me whether there is any way I can achieve this through C# code?
Problem solved. Google's Wicked-good-xpath library worked.
https://code.google.com/p/wicked-good-xpath/
So to explain it: as it's a WebBrowser control, we need to add the .js file's contents in a script tag of the current HTML in the WebBrowser and then call the install method as explained in the library's script.
If you are shipping your own JavaScript, you may need to include the wicked-good-xpath library's file contents or, the standard way, ship the library alongside your script and reference it in your code. (In my case I was supposed to send only one .js file, so I added the library contents to the same file.)
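For anyone trying to reproduce this, here is a minimal sketch of the injection step. It assumes a COM reference to Microsoft.mshtml, that the document has finished loading, and that wgxpath.install.js ships with the app; the class and file names are illustrative:

using System.IO;
using System.Windows.Forms;
using mshtml;

static class XPathInjector
{
    public static void InstallWickedGoodXpath(WebBrowser browser)
    {
        HtmlElement head = browser.Document.GetElementsByTagName("head")[0];
        HtmlElement scriptEl = browser.Document.CreateElement("script");
        // HtmlElement won't let you set script text directly, so drop down
        // to the underlying mshtml DOM element.
        var domScript = (IHTMLScriptElement)scriptEl.DomElement;
        domScript.text = File.ReadAllText("wgxpath.install.js") + "\nwgxpath.install();";
        head.AppendChild(scriptEl);
    }

    public static object EvaluateXPath(WebBrowser browser, string xpath)
    {
        // Run document.evaluate through eval and return the string value.
        string js = "document.evaluate(\"" + xpath + "\", document, null, " +
                    "XPathResult.STRING_TYPE, null).stringValue";
        return browser.Document.InvokeScript("eval", new object[] { js });
    }
}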
Thanks minitech for the pointers!

Parsing HTML generated from a Legacy ASP Application to create ASP.NET 2.0 Pages

One of my friends is working on a good solution for generating .aspx pages out of the HTML pages produced by a legacy ASP application.
The idea is to run the legacy app, capture the HTML output, clean the HTML using some tool (say HTML Tidy), and parse/transform it into .aspx (using XSLT or a custom tool) so that existing HTML elements, divs, images, styles, etc. get converted neatly to an .aspx page (too much ;) ).
Any existing tools/scripts/utilities to do the same?
Here's what you do.
1. Define what the legacy app is supposed to do. Write down the scenarios of getting pages, posting forms, navigating, etc.
2. Write unit-test-like scripts for the various scenarios.
3. Use the Python HTTP client library to exercise the legacy app in your various scripts.
4. If your scripts work, you (a) actually understand the legacy app, (b) can make it do the various things it's supposed to do, and (c) can reliably capture the HTML response pages.
5. Update your scripts to capture the HTML responses.
6. You have the pages. Now you can think about what you need for your ASPX pages. You can either:
   - edit the HTML by hand to make it into ASPX;
   - write something that uses Beautiful Soup to massage the HTML into a form suitable for ASPX (this might be some replacement of text or tags with <asp:... tags); or
   - create some other, more useful data structure out of the HTML -- one that reflects the structure and meaning of the pages, not just the HTML tags -- and generate the ASPX pages from that more useful structure.
Just found the HTML Agility Pack to be useful enough, as they understand C# better than Python.
I know this is an old question, but in a similar situation (50k+ legacy ASP pages that needed to display in a .NET framework), I did the following.
1. Created a rewrite engine (HttpModule) which catches all incoming requests and looks for anything that is from the old site.
2. (In a separate class - keep things organized!) Use WebClient or HttpWebRequest, etc., to open a connection to the old server and download the rendered HTML.
3. Use the HTML Agility Pack (very slick) to extract the content I'm interested in - in our case, this is always inside of a div with the class "bdy" (see the sketch after this list).
4. Throw this into a cache - a SQL table in this example.
5. Each hit checks the cache and either (a) retrieves the page and builds the cache entry, or (b) just gets the page from the cache.
6. An .aspx page built specifically for displaying legacy content receives the rewrite request and displays the relevant content from the legacy page inside an asp:Literal control.
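A minimal sketch of steps 2-3 above (assuming the HtmlAgilityPack NuGet package; the method name and use of HttpClient rather than WebClient are illustrative):

using System.Net.Http;
using System.Threading.Tasks;
using HtmlAgilityPack;

static class LegacyFetcher
{
    public static async Task<string> FetchLegacyBodyAsync(string legacyUrl)
    {
        using (var http = new HttpClient())
        {
            // Step 2: download the rendered HTML from the old server.
            string html = await http.GetStringAsync(legacyUrl);

            // Step 3: extract the content of the <div class="bdy"> element.
            var doc = new HtmlDocument();
            doc.LoadHtml(html);

            // SelectSingleNode returns null when the node is missing, so guard.
            var bdy = doc.DocumentNode.SelectSingleNode("//div[@class='bdy']");
            return bdy != null ? bdy.InnerHtml : html; // fall back to the whole page
        }
    }
}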
The cache is there for performance - since the first request for a given page involves a minimum of two hits - one from the browser to the new server, one from the new server to the old server - I store cacheable data on the new server so that subsequent requests don't have to go back to the old server. We also cache images, CSS, scripts, etc.
It gets messy when you have to handle forms, cookies, etc., but these can all be stored in your cache and passed through to the old server with each request if necessary. I also store content-expiration dates and other headers that I get back from the legacy server and make sure to pass those back to the browser when rendering the cached page. Just remember to take as content-agnostic an approach as possible. You're effectively building an in-page web proxy that lets IIS render old ASP the way it wants, and manipulating the output.
Works very well - I have all of the old pages working seamlessly within our ASP.NET app. This saved us a solid year of development time that would have been required had we touched every legacy ASP page.
Good luck!
