I am working with image generation from a browser: I take a snapshot of the browser from code in a Windows Forms application. But if the browser has not loaded within the given time (say 15 seconds), I get a blank snapshot. Can anyone help me with this?
I'm not entirely sure if I understand exactly what you're trying to do, but I'll take a stab at it: It sounds like you're trying to open whatever program is set as the user's default web browser, and then do something like a BitBlt to take a screenshot of it.
However, as you've noticed, it's difficult to just wait a pre-defined interval and hope the browser has completely loaded. Instead, you could try calling WaitForInputIdle after starting the process, which suspends the calling thread until the process has finished initializing and is idle (waiting for user input). This should allow the browser to finish loading before you proceed with taking the snapshot.
Something like the following code:
//start the web browser
System.Diagnostics.Process proc = System.Diagnostics.Process.Start("iexplore.exe");
//wait for it to completely finish loading
proc.WaitForInputIdle();
//take your screenshot, or whatever
//...
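Putting those pieces together, a minimal sketch of the snapshot step (the output path is a placeholder, and it assumes the browser window is visible on the primary screen):

```csharp
using System.Diagnostics;
using System.Drawing;
using System.Windows.Forms;

class BrowserSnapshot
{
    static void Main()
    {
        // Start the browser and wait until it has finished initializing.
        Process proc = Process.Start("iexplore.exe");
        proc.WaitForInputIdle();

        // Capture the primary screen; assumes the browser window covers it.
        Rectangle bounds = Screen.PrimaryScreen.Bounds;
        using (Bitmap bmp = new Bitmap(bounds.Width, bounds.Height))
        using (Graphics g = Graphics.FromImage(bmp))
        {
            g.CopyFromScreen(bounds.Location, Point.Empty, bounds.Size);
            bmp.Save("snapshot.png"); // hypothetical output path
        }
    }
}
```

Note that WaitForInputIdle only guarantees the process's UI thread is idle, not that the page itself has rendered; for slow pages you may still need an additional check.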
I have a problem that has been bothering me for a few days.
I need to do the following:
start a process (with some arguments), find the window of the process, take a picture of that window, and kill the process,
and I need to repeat this operation X times.
I can start the process and, using user32, find its main window handle, get the window size (so I know the size of the image), set the window position to 0,0 and make it topmost, and then use the CopyFromScreen method to capture the image from 0,0 to the size of the window.
So it looks like everything is OK, but there is one problem I can't solve:
when the process is started, I noticed that some time has to pass before I can use the SetWindowPos native function, so I use Thread.Sleep(x seconds), but
that time is different on every system, and that is the problem. I would have to know how long a delay to use on every system, and I can't afford a big unnecessary delay because the process repeats hundreds of times and every millisecond is important.
If you have a solution, I will be happy to hear it.
The process is the Opera Mobile emulator.
Another option would be to use SendMessage and change the URL instead of starting the process over and over, so
I would only have to wait once, but that doesn't work either:
using Spy++ I tried to find details about the process, such as its class name, and it reports an invalid window,
so I probably can't use SendMessage either.
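The loop described above can be sketched as follows, replacing the fixed Thread.Sleep with WaitForInputIdle (the emulator path and window size here are placeholders):

```csharp
using System;
using System.Diagnostics;
using System.Drawing;
using System.Runtime.InteropServices;

class CaptureLoop
{
    [DllImport("user32.dll")]
    static extern bool SetWindowPos(IntPtr hWnd, IntPtr hWndInsertAfter,
        int x, int y, int cx, int cy, uint flags);

    static readonly IntPtr HWND_TOPMOST = new IntPtr(-1);
    const uint SWP_SHOWWINDOW = 0x0040;

    static void Main()
    {
        for (int i = 0; i < 10; i++) // repeat X times
        {
            Process proc = Process.Start("OperaMobileEmu.exe"); // placeholder path
            proc.WaitForInputIdle(); // wait until the UI thread is idle
            proc.Refresh();          // refresh the cached MainWindowHandle

            // Move the window to 0,0 and make it topmost before capturing.
            SetWindowPos(proc.MainWindowHandle, HWND_TOPMOST,
                0, 0, 480, 800, SWP_SHOWWINDOW); // size is a placeholder

            using (Bitmap bmp = new Bitmap(480, 800))
            using (Graphics g = Graphics.FromImage(bmp))
            {
                g.CopyFromScreen(0, 0, 0, 0, bmp.Size);
                bmp.Save("capture" + i + ".png");
            }
            proc.Kill();
        }
    }
}
```

This removes the system-dependent guess, though WaitForInputIdle still only covers initialization, not page rendering, so some emulators may need an extra readiness check.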
Browsers are not designed for this kind of usage. If you want to reliably know when some browser is ready for rendering a new page and want to grab the result, try something that actually integrates into the rendering engine / core of the browser.
For a complete framework, see http://phantomjs.org/ For your own solution, try embedding a browser widget directly in your application instead of interfacing with another process; I'm sure .NET has the right controls for it. This way you can hook into the relevant events directly, instead of guessing the right delays.
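As a sketch of the embedded approach in WinForms: the WebBrowser control raises DocumentCompleted when the page has finished loading, so no delay guessing is needed (the URL and output path below are placeholders, and DrawToBitmap is a simple capture that works for basic pages):

```csharp
using System;
using System.Drawing;
using System.Windows.Forms;

class EmbeddedCapture
{
    [STAThread]
    static void Main()
    {
        var browser = new WebBrowser { Width = 1024, Height = 768, ScrollBarsEnabled = false };

        // Fires once the document (including frames) has finished loading.
        browser.DocumentCompleted += (sender, e) =>
        {
            using (var bmp = new Bitmap(browser.Width, browser.Height))
            {
                browser.DrawToBitmap(bmp, new Rectangle(0, 0, browser.Width, browser.Height));
                bmp.Save("page.png"); // hypothetical output path
            }
            Application.Exit();
        };

        browser.Navigate("http://example.com/"); // placeholder URL
        Application.Run(); // pump messages so the control can load the page
    }
}
```

The event replaces the sleep entirely: capture happens exactly when loading finishes.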
I have created an HTML5 page that provides important server-side functionality. Unfortunately, it must be run in an HTML5 browser (Chrome, IE9, or Firefox) with a canvas to produce the results I need. It is completely self-contained, taking the needed parameters through the URL, and is ready to be closed once the OnLoad event has fired. So far so good.
The following process needs to be automated (no human eyes or interaction) and will be run from within a web service (not run from within a browser). Ideally, I don't want to waste extra cycles with busy wait, or delay the result by waiting for long time periods simply hoping the process has finished. I need to:
Open a browser (preferably Chrome) with a URL, using C#.
Wait for the page to completely finish loading - ideally receiving a callback of some kind.
Close the browser page when finished, again with C#.
We've tried using IE9. There is C# support to launch IE9, wait until it is not busy, and gracefully close the browser; however, the page loads resources asynchronously (there is no way around this), so we get the not-busy signal during the resource load, instead of when the page has actually finished. Adding a busy wait would consume valuable server-side CPU cycles.
A simple CreateProcess call would be nice, but it would only work if the browser could close itself from some HTML; thanks to security measures in the browsers, I can't find a reliable way to use HTML/JavaScript to close a browser that was launched from the command line (I did see that you can close tabs spawned from an already opened page, Firefox only, but this doesn't help).
Does anyone know how I can accomplish this goal? Again - there is no human involvement in any of the process, no human eyes will ever see the page or interact with it in any way. The page only runs on the server machine, and will never be deployed to a client machine.
I would suggest using the WebBrowser control to load the HTML. Once you get the data back, use an ObjectForScripting to call a C# method that notifies you when it's done.
See http://www.codeproject.com/Tips/130267/Call-a-C-Method-From-JavaScript-Hosted-in-a-WebBro
You don't really have to even show the WebBrowser control.
Let me know if you have any questions. Hope it helps!
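A minimal sketch of that wiring (the `Done` method name and the URL are assumptions; the page would call `window.external.Done(...)` when its canvas work is finished):

```csharp
using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

// Exposed to the page as window.external; the page calls Done() when finished.
[ComVisible(true)]
public class ScriptBridge
{
    public void Done(string result)
    {
        Console.WriteLine("Page reported: " + result);
        Application.Exit(); // stop the message loop once the callback arrives
    }
}

class HiddenBrowser
{
    [STAThread]
    static void Main()
    {
        var browser = new WebBrowser
        {
            ObjectForScripting = new ScriptBridge()
        };
        browser.Navigate("http://example.com/job?param=1"); // placeholder URL
        Application.Run(); // the control works without ever being shown
    }
}
```

This gives you the callback the question asks for: no busy wait, no fixed delay.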
Automating the browser - that's what Selenium does. I think it will be a good fit for the task, and there's good C# support. It can even run the browser on a remote machine using the Selenium RC server.
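A sketch with the Selenium WebDriver C# bindings, driving Chrome as the question prefers (the URL and the `window.jobDone` completion flag are assumptions; the page would set that flag when its asynchronous resources are done):

```csharp
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class SeleniumJob
{
    static void Main()
    {
        using (IWebDriver driver = new ChromeDriver())
        {
            // GoToUrl blocks until the browser considers the page loaded.
            driver.Navigate().GoToUrl("http://example.com/job?param=1"); // placeholder URL

            // Since resources load asynchronously, poll a flag the page sets
            // when it is truly finished (flag name is an assumption).
            var js = (IJavaScriptExecutor)driver;
            while (!(bool)js.ExecuteScript("return window.jobDone === true;"))
                System.Threading.Thread.Sleep(100);

            driver.Quit(); // close the browser cleanly from C#
        }
    }
}
```

Open, wait for real completion, and close - all three requirements, all from C#.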
I am using ASP.NET MVC 3 with C#, and I have a process that takes about 10 minutes to finish. I need some help with how to show some interface (a progress bar, etc.).
What would happen if user turned the browser off? My process should not be stopped.
What would happen if another user tried to open the process page?
I started searching for a jQuery progress bar, but ran into those questions and am looking for some help.
Thank you
If your process takes 10 minutes to finish, then you must do the work in the background and keep the result somewhere so you can show it.
First question: what happens if the user closes the browser? To solve this, you need a system that does the work in the background and lets the browser carry on. If you can't build a full scheduler class for your jobs, a simple thread can do the same thing, though it is less flexible.
Second question: how to avoid starting a new process page. You can solve this with a mutex. You create a mutex with a specific name and release it when the job is done, after the 10 minutes. If in the meantime another user tries to re-run the same process, you see that the mutex is held and show him a message asking him to wait.
You need somewhere to keep the result information. For example, say you run a 10-minute job: store the results somewhere so the user can see them once they are generated, and can re-run the procedure if he likes.
With this approach you don't need to fully disable the page; just show a message that the job is still running, or refresh the page automatically every 30 seconds to see if the results are done.
I'm trying to make a web browser that tells me when a specific video is done. I already know how to get the video's length in seconds. I just need to know how to delay the next step without blocking the WebBrowser control (which needs to keep playing the Flash video).
Any ideas? I've been looking into the Timer class, but can't figure out how to apply it to this problem.
Thanks.
Well, you could try Thread.Sleep() (you're not clear about which threads are running what), except that this idea is doomed to failure. I mean, you are going to let the user pause the video, aren't you?
(Yes, you are!)
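For the non-blocking part, a System.Windows.Forms.Timer fires on the UI thread without freezing the WebBrowser control; a minimal sketch:

```csharp
using System;
using System.Windows.Forms;

class VideoWatcher
{
    // Schedule an action after the video's duration without blocking the
    // UI thread, so the WebBrowser control keeps playing the video.
    public static void NotifyWhenDone(int videoSeconds, Action onDone)
    {
        var timer = new Timer { Interval = videoSeconds * 1000 };
        timer.Tick += (sender, e) =>
        {
            timer.Stop(); // fire only once
            onDone();
        };
        timer.Start();
    }
}
```

Usage would be something like `VideoWatcher.NotifyWhenDone(120, () => MessageBox.Show("Video finished"));` - but the caveat above still applies: if the user pauses the video, a fixed duration no longer matches reality.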
I wrote a web crawler that calls a web page in a do-while loop, about every 3 seconds;
in total there are 7000 sites. I parse the data and save it in my DB.
Sometimes, because the script takes a long time to load, I get a timeout in the browser,
but in the background it continues; I can see that in my database.
Can I prevent this? At the moment the only way is to stop the web server.
Thank you and best regards.
Your web page is kicking off a server-side process. Killing your browser or closing it is not going to stop this. It sounds to me like a web page to control this is the wrong approach, and you should be looking at a connected form of application like a WinForms/WPF app. There would be ways to get this to work with ASP.NET, but they are not going to be simple. I think you have just chosen the wrong technology.
Starting an intensive, long running process like this from a web page is almost never a good idea. There are lots of reasons, but the main ones are:
1) If you get a timeout in the browser (this is your scenario) the data you have harvested may not be displayed.
2) What happens if you hit refresh in the browser? Will it attempt to start the whole process again? This is an easy target for an attacker if he wants to tie up all your server resources.
3) Is the data you are crawling really likely to change to such an extent that you need "live" crawling? 99% of cases would be served just as well with a background timed job running the crawl, and your front end just displaying the contents of the database.
I would seriously recommend you rethink your crawling strategy to something more controllable and stable.
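The background timed job from point 3 can be sketched as a long-lived console app or Windows service using System.Threading.Timer, fully decoupled from any browser request (the one-hour interval is an assumption):

```csharp
using System;
using System.Threading;

class CrawlScheduler
{
    static void Main()
    {
        // Run the crawl on a timer in a long-lived process (e.g. a Windows
        // service or console app), independent of any web request.
        using (var timer = new Timer(_ => CrawlAll(), null,
            TimeSpan.Zero, TimeSpan.FromHours(1))) // interval is an assumption
        {
            Console.WriteLine("Crawler running; press Enter to stop.");
            Console.ReadLine();
        }
    }

    static void CrawlAll()
    {
        // Placeholder: fetch each of the 7000 sites, parse, and save to the DB.
        Console.WriteLine("Crawl pass started at " + DateTime.Now);
    }
}
```

The web front end then only reads the database, so a browser timeout or refresh can never start or duplicate a crawl.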