WinForms WebBrowser custom fonts not showing - C#

I'm using the C# WebBrowser control (WinForms) and am passing markup into it via the .DocumentText property.
The document links to some CSS that uses the @font-face rule, which works when running locally (and from a server), but not when being consumed by the WebBrowser control.
The css is pretty basic, it looks like this:
@font-face {
font-family: FontName;
src: url("/fonts/fontname.ttf") format("truetype");
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
I've tried the various registry suggestions from here (to no avail): https://weblog.west-wind.com/posts/2011/May/21/Web-Browser-Control-Specifying-the-IE-Version#FeatureDelegationviaRegistryHacks
Another common suggestion is to include a meta tag in the mark-up (which I have done) like this:
<meta http-equiv="X-UA-Compatible" content="IE=edge">
My font is already marked as installable (and works fine in IE), so there is obviously something specific about the WebBrowser control that's the problem - but having spent hours trying to figure it out, I'm at a loss as to what that could be.
Does anyone know what I'm missing here?
Thanks!

It turns out that there were two problems here. I finally managed to work my way around them, so I'll share the problems and their solutions here in the hope I can save someone else some time in future.
Issue #1
The browser control was presenting security warnings (which weren't being suppressed because .ScriptErrorsSuppressed was set to false). After looking into how to automatically suppress the security errors, I came across Sheng Jiang's excellent blog entry on the subject here - https://jiangsheng.net/2013/07/17/howto-ignoring-web-browser-certificate-errors-in-webbrowser-host/
Issue #2
After getting frustrated, I looked into a couple of commercial alternatives, and the one from Essential Objects looked promising (https://www.essentialobjects.com) - though this still wasn't rendering my page (their sample app did, however).
After digging into their WebView control a bit, I hooked all of the JavaScript console events it exposed and got a useful error message out when accessing my page:
Access to Font at 'https://localhost:1234/fonts/foo.ttf' from origin 'https://localhost:1234' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'null' is therefore not allowed access.
Finally, something to work with! I temporarily allowed any origin on my server and the page started rendering properly (Note - You really don't want to do this on a production server - this was just to test).
This also worked for the Winforms control (albeit sub-classed with Sheng Jiang's interop to solve the first issue) so the solution here was just to configure the CORS policy properly for my font assets.
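For reference, here's a minimal sketch of what that configuration could look like in classic ASP.NET (the Global.asax approach, the origin, and the file extensions below are assumptions; only the font requests need the header, and "*" should stay test-only):

using System;
using System.Web;

// Global.asax.cs sketch: emit a CORS header for font assets only, so @font-face
// can load them cross-origin. Assumes the font requests pass through the managed
// pipeline; otherwise configure the header in IIS instead.
public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        string path = Request.Url.AbsolutePath;
        if (path.EndsWith(".ttf", StringComparison.OrdinalIgnoreCase) ||
            path.EndsWith(".woff", StringComparison.OrdinalIgnoreCase))
        {
            // Lock this down to the real origin; "*" was only for the test above.
            Response.AppendHeader("Access-Control-Allow-Origin", "https://localhost:1234");
        }
    }
}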

Related

Improper Neutralization of Script-Related HTML Tags in a Web Page (Basic XSS) CWE ID 80

I ran my application through the Veracode tool and am struggling with some issues.
One of the issues I face is Improper Neutralization of Script-Related HTML Tags in a Web Page (Basic XSS) (CWE ID 80).
This happens in many screens in my application.
In particular, it is flagged on the following line:
NewDivButton.Style["display"] = SearchParameters.NewDivButtonVisibility;
Does anyone have any suggestion on how to fix this issue?
Welcome, Manikandan. The best answer to this question would depend on the language/framework you're using; could you share that?
One thing to note, is that there are many things you could do that would make the warning "go away", but wouldn't make your app any more secure. For that reason, it's best to understand the core of the problem, and then apply the standard fix for the language/framework you're working in. If in doubt, check with a security professional.
In general, XSS is a set of issues where you (potentially) render user input as part of your output.
In this example, suppose I send you a link like yoursite.com?NewDivButtonVisibility="><script>SendYourPrivateInfoSomewhereBad();</script>
If you click a link like this, and the site blindly inserts the parameter value into the page, the script could run and steal data.
The best protection is often to validate input, only allowing known-valid input through.
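For the flagged line above, a minimal allow-list sketch could look like this (the helper class and the set of allowed values are assumptions; adjust them to whatever your UI actually needs):

using System;
using System.Collections.Generic;

// Only let through display values we expect; anything else falls back to a safe default.
static class DisplayValue
{
    private static readonly HashSet<string> Allowed =
        new HashSet<string>(StringComparer.OrdinalIgnoreCase) { "none", "block", "inline", "inline-block" };

    public static string Sanitize(string requested)
    {
        return requested != null && Allowed.Contains(requested) ? requested : "none";
    }
}

// Usage at the flagged line:
// NewDivButton.Style["display"] = DisplayValue.Sanitize(SearchParameters.NewDivButtonVisibility);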
Another common approach is to HTML-encode the unknown value being displayed. However, more care is needed depending on where the output is rendered (e.g. if it's already within a script tag).
There's much more general information on this type of issue here: https://cheatsheetseries.owasp.org/cheatsheets/Cross_Site_Scripting_Prevention_Cheat_Sheet.html#cross-site-scripting-prevention-cheat-sheet

Open AngleSharp document in Chrome

I am using AngleSharp in C# to simulate a web browser. For debugging purposes, sometimes I want to see the page I am traversing. I am asking if there is an easy way to show the current document in a web browser (preferably the system's default browser) and if possible with current cookie states.
I am very late to the party, but hopefully someone will find my answer useful: the short answer is no; the long answer is yes, with some work it is possible in a limited way.
How to make it possible? By injecting some code into AngleSharp that opens a (local) webserver. The content from this webserver could then be inspected in any web browser (e.g., the system's default browser).
The injected local webserver would serve the current document at its root (e.g., http://localhost:9000/), along with all auxiliary information in HTTP headers (e.g., cookie states). The problem with this approach is that we either transport the document's original source or a serialization of the DOM as seen by AngleSharp. Therefore, there could be some deviations and it may not be what you want. Alternatively, the server could emit JS code that replicates what AngleSharp currently sees (however, then standard debugging seems more viable).
Any approach, however, requires some (tedious?) work and therefore needs to be justified. Since you want to "see" the page I guess a CSS renderer would be more interesting (also it could be embedded in any application or made available in form of a VS extension).
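That said, if all you need is a quick look at the markup AngleSharp currently holds, a much smaller sketch (no cookie state; the names below are illustrative) is to serialize the document to a temporary file and open it with the system's default browser:

using System.Diagnostics;
using System.IO;
using AngleSharp;
using AngleSharp.Dom;

static class DebugView
{
    // Writes the DOM as AngleSharp currently sees it to a temp file and opens the default browser.
    public static void ShowInBrowser(IDocument document)
    {
        string path = Path.Combine(Path.GetTempPath(), "anglesharp-debug.html");
        File.WriteAllText(path, document.ToHtml());
        Process.Start(new ProcessStartInfo(path) { UseShellExecute = true });
    }
}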
Hope this helps!

IInternetSecurityManager URLACTION_CROSS_DOMAIN_DATA and Asynchronous Pluggable Protocol for cross domain XMLHTTP requests in webbrowser control

I've implemented an asynchronous pluggable protocol in a .NET 2.0 application using C# which loads HTML files stored on the local machine into a MemoryStream.
When I load the HTML files normally in the WebBrowser control using their local file paths, XMLHttpRequest works fine, but when the files are loaded through the protocol, any attempt to use XMLHttpRequest returns an access-denied error.
I presume that this behavior is due to the WebBrowser control no longer knowing that the HTML files are stored on the local machine, and it is loading them in an untrusted Internet zone.
Even though I'm returning S_OK for URLACTION_CROSS_DOMAIN_DATA inside IInternetSecurityManager's ProcessUrlAction (which I checked with a breakpoint to make sure it was fired), my IInternetSecurityManager's return value for this action is being ignored.
I've tried setting pdwZone to tagURLZONE.URLZONE_LOCAL_MACHINE in IInternetSecurityManager's MapUrlToZone for my protocol URLs and played around a little with GetSecurityId, although I'm not sure exactly what I was doing with it, and I broke other things like allowing scripts to load, etc. Nothing seems to work to allow cross-domain XMLHttpRequest.
Does anyone have any idea how I can get this to work?
Not really an answer, but it may help to isolate the problem. I'd first implement this APP handler in C++ and test it with some robust unmanaged WebBrowser ActiveX host sample, like Lucian Wischik's Webform:
http://www.wischik.com/lu/programmer/webform.html
If I could get it working reliably with the unmanaged host, I'd proceed with C# implementation.
I'd also try setting FEATURE_BROWSER_EMULATION to 8000 or less, to impose emulation of legacy IE behavior, just to check if it works that way.
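For reference, a minimal sketch of that registry tweak from managed code (HKCU, so no admin rights are needed; the executable name is an assumption, use your host EXE's name):

using Microsoft.Win32;

static class BrowserEmulation
{
    // 8000 = IE8 emulation; must be set before the WebBrowser control is created.
    public static void Apply()
    {
        using (RegistryKey key = Registry.CurrentUser.CreateSubKey(
            @"Software\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BROWSER_EMULATION"))
        {
            key.SetValue("MyApp.exe", 8000, RegistryValueKind.DWord);
        }
    }
}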
That said, I wouldn't hold my hopes high. I've done my share of WebBrowser/MSHTML integration in the past, and I have a feeling that APP support hasn't been regression-tested since IE9, in favor of newer IE features aimed at embracing open web standards.
Updated: MSDN vaguely mentions this:
"Upon successful completion, pbSecurityId contains the scheme, domain, and zone information, as well as whether the specified pwszUrl was derived from a Mark of the Web."
Here's the format which worked for me long ago (perhaps, way before "Mark of the Web" was introduced):
static const char security[] = "https:www.mysite.com\2\0\0"; // C++ puts the termination \0 for us
I believe 2 stands for the "Trusted Sites" zone. Other zones can be found here:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings\Lockdown_Zones
Hope this helps.
Maybe I'm wrong, but have you tried sending Access-Control-Allow-Origin: * in your protocol's response headers?

Custom functions unavailable from BrowserInteropHelper.HostScript in an XBAP

I'm attempting to use an XBAP to acquire TWAIN images, but I'm not even getting that far. I can't seem to get BrowserInteropHelper.HostScript to allow me to talk back to the JavaScript on the host page.
I am running the XBAP in an iframe.
I tried full trust (even though that should be a requirement).
I'm testing on IE9, .NET Framework 4.0
BrowserInteropHelper.HostScript is not null, and I can use the normal window methods, like .Close().
My code looks like this:
Index.html:
<p id="someP">My cat's breath smells like cat food.</p>
<script type="text/javascript">
function WorkDamnit() {
$('#someP').hide();
}
</script>
<iframe src="#Url.Content("~/XBAPs/WPFBrowserApplication1.xbap")" style="border: none;" width="500" height="500" />
Page1.xaml.cs:
private void Button_Click(object sender, RoutedEventArgs e)
{
if (BrowserInteropHelper.HostScript == null)
throw new ApplicationException("hostscript is null");
else
BrowserInteropHelper.HostScript.WorkDamnit();
}
I get:
System.MissingMethodException: Method '[object Window].WorkDamnit' not found.
New Info:
This scenario works properly from other PCs on the network and I found out why. IE9 has a setting turned on by default called "Display intranet sites in Compatibility View". This causes the page to render in "IE7 mode" and the javascript function is available from the XBAP. If I click Document Mode: (under the F12 developer tools console) and switch to IE9 mode then it no longer works (as above). On my own PC it uses IE9 mode by default (As it should) and it does not work unless I manually switch to IE7 mode.
In IE9, window methods are available to you, so you could try setTimeout
BrowserInteropHelper.HostScript.setTimeout("WorkDamnit()",0);
It's due to IE security settings.
IE by default blocks scripts if the page is opened from a local drive.
IE should display a warning message like "To help protect your security, Internet Explorer has restricted...".
However, you can change the settings:
Check the checkbox "Allow active content to run in files on My Computer".
IE9 does not expose BrowserInteropHelper.HostScript unless it is in compatibility mode.
This is pretty far after the fact but I wanted to chime in because I've dealt with this problem and it's an ugly one.
As you have observed, the machines it runs on work because they are running in compatibility mode. You can instruct IE9 to render your page using compatibility mode by adding the following tag to your html documents:
<meta http-equiv="X-UA-Compatible" content="IE=8"/>
I'm confident in saying that adding this tag is your only 'solution'. I've spent weeks looking for a better one and this is the one I'm still stuck with.
For whatever reason, BrowserInteropHelper is rife with problems in IE9. It hasn't gotten a lot of attention from Microsoft, likely because it's obscure functionality. That said, particularly because IE7 and IE8 have all sorts of quirks, it's extraordinarily annoying that you have to run in compatibility mode. There are a number of MSDN articles that discuss the issue further.
If someone is still wondering about this problem, I found a solution. Basically, via the document object you can communicate properly without any hacks in the latest browsers with an HTML5 doctype.
I wrote a blog post that contains the codes and examples:
XBAP and Javascript with IE9 or newer - Solution for MissingMethodException problems
Quickly explained, this works properly:
Javascript
document.ResponseData = function (responseData) {
alert(responseData);
}
C# XBAP
var hostScript = BrowserInteropHelper.HostScript;
hostScript.document.ResponseData("Hello World");
It can then be used to pass a C# object to JavaScript and use it as a "proxy" object. Check that blog post for details on how it can be used.
BrowserInteropHelper.HostScript.setTimeout("WorkDamnit()",0);
It's working, but if I want to send callback functions to JS, how can I pass an object with setTimeout?
When using IE8, it works fine:
dynamic host = BrowserInteropHelper.HostScript;
host.sampleJSFunction(new CallbackObject(this));
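To expand on that, a hedged sketch of what such a callback object might look like (the class, method, and page names are made up; the key point is that the class is public and ComVisible so the page's script can call back into it):

using System;
using System.Runtime.InteropServices;

[ComVisible(true)]
public class CallbackObject
{
    private readonly Page1 _page;

    public CallbackObject(Page1 page)
    {
        _page = page;
    }

    // Called from JavaScript, e.g. obj.Done("result");
    public void Done(string result)
    {
        // Marshal back onto the XBAP's UI thread before touching any controls.
        _page.Dispatcher.Invoke((Action)(() => { /* handle the result */ }));
    }
}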

Making my ASP.NET website compatible with Firefox?

I have an ASP.NET 2.0 website ( http://www.erate.co.za ).
When someone opens my website in Firefox everything looks different.
Why is that and how can I make it compatible?
Please help!
Etienne
The problems don't have anything to do with ASP.NET / C# specifically.
They have to do with your understanding of web design / HTML / CSS and how you can make a cross-browser compatible UI.
I'd suggest you look at http://www.w3schools.com/ for some information on good web design practices.
Some obvious problems with the source that you need to address are:
No common CSS stylesheets
Styles applied inline on lots of elements
Long strings of &nbsp; used to align text
The underlying server technology should not have any impact on your website's appearance as long as you are just producing HTML.
What you need to do is make sure that your HTML and CSS works as intended in all browsers. A good way to start is to make sure that you only output standards compliant code.
The issue at hand is that some styles you are using don't work in Firefox, such as cursor:hand versus cursor:pointer; both work in IE but only pointer works in Firefox. A quick recommendation would be to run the resulting page through the W3C validator at: http://validator.w3.org/
ASP.NET produces vanilla HTML/JavaScript, so there's nothing wrong with the technology per se.
Focus on the HTML and try to stay as close as possible to the W3C standards; it should help a lot.
Firebug, an incredible web dev extension for Firefox, should also help you a lot in debugging your CSS.
It might be a painful task, especially if your site is old and big. Good luck!
