I know I can use AJAX and Silverlight with my ASP.NET web page, but what do you think about using Flash with ASP.NET? Can this be done, and how? Would you recommend using Flash with ASP.NET at all? I will NOT be using web services, just a plain ASP.NET website.
Thanks in advance!
EDIT: What about performance issues?
I have used Flash in ASP.NET websites plenty of times.
Software should always boil down to the best tool for the job; if Flash is the way you need to go for your RIA, then so be it.
Remember, ASP.NET is ultimately nothing "new" or "different"; it is just a fancy HTML generator.
Therefore, to use Flash, you simply use the plain old HTML OBJECT and EMBED tags to place the Flash movie on the page.
The benefit of using ASP.NET (or any other framework) is that you can encapsulate that embed logic, for example in a control that uses SWFObject.
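As a minimal sketch of what that encapsulation might look like on the ASP.NET side (the helper name, SWF path, and dimensions below are made up, and SWFObject's JavaScript-based embedding is the more robust route in practice):

    using System.Web.UI;

    // Hypothetical helper that wraps the classic OBJECT markup so pages don't repeat it.
    public static class FlashEmbed
    {
        public static Control Create(string swfUrl, int width, int height)
        {
            string markup = string.Format(
                "<object type=\"application/x-shockwave-flash\" data=\"{0}\" width=\"{1}\" height=\"{2}\">" +
                "<param name=\"movie\" value=\"{0}\" />" +
                "<param name=\"wmode\" value=\"transparent\" />" +
                "</object>",
                swfUrl, width, height);

            return new LiteralControl(markup);
        }
    }

    // Usage from a page or user control:
    // flashPlaceHolder.Controls.Add(FlashEmbed.Create("/media/banner.swf", 400, 300));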
Flash is client-side; what you use server-side has very little impact on it.
Given Flash's high market penetration (98%+), I think Flash is a great way to go regardless of the underlying platform.
But, as with everything, it depends on what you want to do. If you want to deliver a rich user interface via Flash, you should consider using Flex.
There are several tools to help integrate a Flash/Flex application with ASP.NET. One of these that I recommend is WebORB.
It certainly can be done! We've done entire Flash-based websites in the past that rely on data generated by a CMS and read by Flash via XML. There are of course lots of gotchas (loading HTML text, multilingual characters), but once you've done it a few times you'll get the hang of it.
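As a rough illustration of that XML-feed pattern (the handler name and the hard-coded data are made up; a real site would pull the content from the CMS database):

    using System.Web;

    // Hypothetical generic handler that the Flash movie points its XML loader at.
    public class ContentFeed : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            context.Response.ContentType = "text/xml";
            // Placeholder data; in a real site this would be generated from the CMS.
            context.Response.Write(
                "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
                "<content><item title=\"Welcome\">Hello from the CMS</item></content>");
        }

        public bool IsReusable { get { return true; } }
    }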
Flex is probably a better option.
We are using Html Agility Pack to scrape data from an HTML-based site; is there any DLL like Html Agility Pack for scraping Flash-based sites?
It really depends on the site you are trying to scrape. There are two types of sites in this regard:
If the site has the data inside the SWF file, then you'll have to decompile the SWF file and read the data inside. With enough work you can probably do it programmatically. However, if this is the case, it might be easier to just gather the data manually, since it probably isn't going to change much.
In most cases, however, especially with sites that have a lot of data, the Flash file is actually contacting an external API. In that case you can simply ignore the Flash altogether and go to the API directly. If you're not sure, just activate Firebug's Net panel and start browsing. If the site is using an external API, it should become obvious.
Once you find that API, you could probably reverse engineer how to manipulate it to give you whatever data you need.
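For example, once Firebug reveals an endpoint, hitting it directly is straightforward (the URL below is purely hypothetical):

    using System;
    using System.Net;

    class ApiScraper
    {
        static void Main()
        {
            using (WebClient client = new WebClient())
            {
                // Pretend the Flash movie was observed calling this endpoint in Firebug.
                string payload = client.DownloadString("http://example.com/api/data?page=1");
                Console.WriteLine(payload); // usually XML or JSON you can parse directly
            }
        }
    }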
Also note that if it's a big enough site, there are probably non-Flash ways to get at the same data (a short sketch of the user-agent trick follows the list):
It might have a mobile site (with no Flash) - try accessing the site with an iPhone user-agent.
It might have a site for crawlers (like Googlebot) - try accessing the site with a Googlebot user-agent.
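Here is a minimal sketch of that user-agent trick in C# (the URL and user-agent strings are only representative examples):

    using System;
    using System.IO;
    using System.Net;

    class UserAgentFetch
    {
        static string Fetch(string url, string userAgent)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
            request.UserAgent = userAgent;
            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                return reader.ReadToEnd();
            }
        }

        static void Main()
        {
            // Pretend to be an iPhone to get the mobile (non-Flash) site.
            string mobileHtml = Fetch("http://example.com/",
                "Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) AppleWebKit/536.26");

            // Pretend to be Googlebot to get the crawler-friendly version.
            string botHtml = Fetch("http://example.com/",
                "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)");

            Console.WriteLine(mobileHtml.Length + " / " + botHtml.Length);
        }
    }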
EDIT:
If you're talking about crawling (getting data from any random site) rather than scraping (getting structured data from a specific site), then there's not much you can do; even Googlebot isn't scraping Flash content. That's mostly because, unlike HTML, Flash doesn't have a standardized syntax from which you can immediately tell what is text, what is a link, etc.
You won't have much luck with the HTML Agility Pack. One method would be to use something like FiddlerCore to proxy HTTP requests to/from a Flash site. You would start the FiddlerCore proxy, then use something like the C# WebBrowser control to go to the URL you want to scrape. As the page loads, all of those HTTP requests get proxied and you can inspect their contents. However, you wouldn't get most of the text, since that is often static within the Flash. Instead, you'd mostly get the larger content (videos, audio, and maybe images) that is usually stored separately. This will be slow compared to more traditional scraping/crawling because you actually have to execute/run the page in a browser.
If you're familiar with all of those YouTube Downloader type of extensions, they work on this same principle, except that they intercept HTTP requests directly from Firefox (for example) rather than using a separate proxy.
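As a rough, hedged sketch of that FiddlerCore approach (the method names below are from older FiddlerCore builds and may differ in the version you use):

    using System;
    using Fiddler;

    class FlashSiteSniffer
    {
        static void Main()
        {
            // Log every completed request/response that passes through the proxy.
            FiddlerApplication.AfterSessionComplete += delegate(Session session)
            {
                Console.WriteLine(session.fullUrl);
                // Inspect session.GetResponseBodyAsString() here to pick out the
                // XML/JSON/media requests the SWF makes.
            };

            // port, register as system proxy, decrypt HTTPS
            FiddlerApplication.Startup(8877, true, false);

            // Now drive the Flash site in a WebBrowser control (or any browser using
            // the proxy) and watch its requests stream past.
            Console.ReadLine();

            FiddlerApplication.Shutdown();
        }
    }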
I believe that Google and some of the big search engines have a special arrangement with Adobe/Flash and are provided with some software that lets their search engine crawlers see more of the text and things that Google relies on. Same goes for PDF content. I don't know if any of this software is publicly available.
Scraping Flash content would be quite involved, and the reliability of any component that claims to do so is questionable at best. However, if you wish to "crawl" or follow hyperlinks in a Flash animation on some web page, you might have some luck with Infant. Infant is a free Java library for web crawling, and offers limited / best-effort Flash content hyperlink following abilities. Infant is not open source, but is free for personal and commercial use. No registration required!
How about capturing the whole page as an image and running OCR on it to read the data?
I am developing a web-based Pokemon Online game. Since it is online, I would like to optimize it to run as quickly as possible.
I've installed Firebug, and Page Speed suggests minifying my HTML output. I'm using VS2008, ASP.NET 3.5, AJAX, and IIS 7.5, along with URL rewriting.
I want to minify my HTML, JavaScript, and CSS. Ideally, the minifying process would happen at compile time. I've spent hours looking online but couldn't find a decent solution; can you help me? Thank you.
Firstly, you should read the Yahoo best practices for speeding up webpages.
You will probably find that minifying the HTML won't make much difference (also see this question), but a lot of the other suggestions in that article will.
There are a couple of ways to achieve this. You can configure GZip compression in IIS 7 if you have access to it. If you don't (e.g. you are using a hosting provider), it is possible to activate compression from within your code.
See this SO Post for further reading.
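As an example, here is a minimal sketch of turning on compression from code in Global.asax (only worth doing when you cannot change the IIS configuration yourself):

    using System;
    using System.IO.Compression;
    using System.Web;

    public class Global : HttpApplication
    {
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            HttpContext context = HttpContext.Current;
            string acceptEncoding = context.Request.Headers["Accept-Encoding"] ?? string.Empty;

            // Only compress when the browser says it can handle it.
            if (acceptEncoding.Contains("gzip"))
            {
                context.Response.Filter = new GZipStream(context.Response.Filter, CompressionMode.Compress);
                context.Response.AppendHeader("Content-Encoding", "gzip");
            }
            else if (acceptEncoding.Contains("deflate"))
            {
                context.Response.Filter = new DeflateStream(context.Response.Filter, CompressionMode.Compress);
                context.Response.AppendHeader("Content-Encoding", "deflate");
            }
        }
    }

Test this carefully with ASP.NET AJAX script resources (WebResource.axd/ScriptResource.axd), which can misbehave when compressed twice.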
UPDATE:
To perform this at build time rather than run time see this blog post.
Instead of minifying your .aspx files, consider dynamic compression, which sends compressed data to the browser. Since you are using IIS 7.5, dynamic compression comes built in; you just have to enable it.
I want to make a program that will simulate a user browsing a site and clicking on links. Cookies and JavaScript have to be enabled. I've successfully done this in Python, but I want to write it in a compiled language (Python IDEs don't cut it). The links on the site are generated with JavaScript and are dynamic. With Python I used PAMIE (a third-party module that uses win32com) to launch an instance of Internet Explorer, scrape the generated HTML for the links, then navigate to one of them. The point is for the whole process to be transparent to the server. What's the best (compiled) language and method to do this? I was thinking C# with the WebBrowser control, but I don't want to spend a lot of time learning something if it isn't going to work. Any help is appreciated!
You might want to look at automated browser-testing suites (a rough WatiN sketch follows the links):
http://www.teknologika.com/blog/the-holy-grail-net-automated-web-gui-testing-for-internet-explorer/
http://watin.sourceforge.net/
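Purely as an illustration, and assuming WatiN 2.x class names (IE, Find.ByText), driving the real browser means cookies and JavaScript just work:

    using System;
    using WatiN.Core;

    class BrowseLikeAUser
    {
        [STAThread]
        static void Main()
        {
            using (IE browser = new IE("http://example.com/"))
            {
                // Find a JavaScript-generated link by its text and click it,
                // just as a user would.
                browser.Link(Find.ByText("Next page")).Click();

                Console.WriteLine(browser.Title);
            }
        }
    }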
I wrote a blog post on this a while back: Web scraping in .NET. That discusses cookies but not JavaScript; I don't know if that would require additional coding.
It might be worth having a look at Selenium.
We use it for web testing in a C# ASP.NET environment.
The documentation isn't too bad.
What is the best way to let other parties use your website's content as their own, with their own styling?
We have built a small website for a customer (ASP.NET, .NET Framework 3.0). Now the customer wants other parties to be able to use our website within their own websites while maintaining the styling of their own sites.
I have done nothing like this before and don't even know what to google, so any help is appreciated.
I know you can do this with SharePoint, but using SharePoint for such a small site seems like overkill.
It sounds like "portlets" is a good name for what they want, but googling portlets draws me into the world of Java and doesn't give me much info on what the other parties would have to do to make it work.
A simple iframe would probably take me a long way, but how do you get the styling right within an iframe?
Web parts also sound interesting, but they seem more geared towards sharing within a project than letting people use them in their own sites.
It is a small website and the logic and backend communication are well contained, so a complete rework of the front-end is not a big problem.
Once again, any help is appreciated !
Omar Al Zabir has a book on how to build a portal in ASP.NET; here is a link to his website.
You might want to look at his Dropthings portal.
Yes, portlets are pretty much Java-only, despite any talk of standards.
If you can encapsulate your pages in web parts or user controls, that would make it easy to style them, and even to parameterize them: a web part can be configured.
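As a small, hedged sketch of that idea (the class and property names are made up), a web part can expose a configurable property that each hosting page or user can set through the standard editor UI:

    using System.Web.UI;
    using System.Web.UI.WebControls.WebParts;

    public class GreetingPart : WebPart
    {
        private string _headingText = "Welcome";

        // Exposed for configuration: persisted by the web part framework and editable in the browser.
        [Personalizable(PersonalizationScope.Shared), WebBrowsable(true)]
        public string HeadingText
        {
            get { return _headingText; }
            set { _headingText = value; }
        }

        protected override void RenderContents(HtmlTextWriter writer)
        {
            writer.Write("<h2>" + HeadingText + "</h2>");
        }
    }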
You can use Kalitte Dynamic Dashboards for creating professional dashboards and portlets.
More information can be found at www.dynamicdashboards.net
I have an ASP.NET 2.0 website ( http://www.erate.co.za ).
When someone opens my website in Firefox, everything looks different.
Why is that and how can I make it compatible?
Please help!
Etienne
The problems don't have anything to do with ASP.NET / C# specifically.
They have to do with your understanding of web design / HTML / CSS and how you can make a cross-browser compatible UI.
I'd suggest you look at http://www.w3schools.com/ for some information on good web design practices.
Some obvious problems with the source that you need to address are:
No common CSS stylesheets
Styles applied inline on lots of elements
Long runs of non-breaking spaces (&nbsp;) used to align text
The underlying server technology should not have any impact on your website's appearance as long as you are just producing HTML.
What you need to do is make sure that your HTML and CSS works as intended in all browsers. A good way to start is to make sure that you only output standards compliant code.
The issue at hand is that some of the styles you are using don't work in Firefox, such as cursor:hand versus cursor:pointer; both work in IE, but only pointer works in Firefox. A quick recommendation would be to run the resulting page through the W3C validator at http://validator.w3.org/
ASP.NET produces vanilla HTML/JavaScript, so there's nothing wrong with the technology per se.
Focus on the HTML and try to stay as close as possible to the W3C standards; it should help a lot.
Firebug, an incredible web dev extension for Firefox, should also help you a lot in debugging your CSS.
It might be a painful task, especially if your site is old and big. Good luck!