How to pause execution yet keep WebBrowser control loading in C#

I'm trying to make a web browser that tells me when a specific video is done. I already know how to get the video's length in seconds. I just need to know how to wait before executing the next step without stopping the WebBrowser control (which needs to keep playing the Flash video).
Any ideas? I've been looking into the Timer class, but can't figure out how to apply it to this problem.
Thanks.

Well, you could try Thread.Sleep() (you're not clear about which threads are running what...), but that idea is doomed to failure: sleeping the UI thread freezes the WebBrowser control, and a fixed wait breaks the moment the user pauses the video. (And you are going to let them pause the video, aren't you?)
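For what it's worth, a System.Windows.Forms.Timer avoids the blocking problem: it raises its Tick event on the UI thread without freezing the message loop, so the WebBrowser control keeps playing while you wait. A minimal sketch (videoLengthSeconds and OnVideoFinished are placeholder names, and it shares the caveat above: a fixed countdown goes wrong the moment the user pauses the video):

using System;
using System.Windows.Forms;

public class BrowserForm : Form
{
    private Timer videoTimer;

    // Start a one-shot countdown for the known length of the video.
    private void StartVideoCountdown(int videoLengthSeconds)
    {
        videoTimer = new Timer();
        videoTimer.Interval = videoLengthSeconds * 1000; // Interval is in milliseconds
        videoTimer.Tick += (sender, e) =>
        {
            videoTimer.Stop();   // fire once, not repeatedly
            OnVideoFinished();
        };
        videoTimer.Start();      // the UI thread keeps pumping messages meanwhile
    }

    private void OnVideoFinished()
    {
        MessageBox.Show("Video finished.");
    }
}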

Related

MessageBox.Show makes DirectShow go

Does anyone have any insight as to what can cause a scenario where if you were to call
mediaControl.Run();
that it won't actually start rendering video until this is called
MessageBox.Show("");
I couldn't figure out why the media wasn't running even though GetState() reported that it was running, and the HRESULT agreed that, sure, it was running... but the video only appears in the window when a message box is shown.
The weird thing is that if you dismiss the message box, rendering pauses... but if you call it again, say 5 seconds later, it continues to work.
Furthermore... if you dismiss the message box and have a continuous loop right after it that sleeps the thread indefinitely, it continues to render.
I have been banging my head all day trying to figure out why a message box is the key... but ultimately, I can't have a message box displayed.
Any random ideas would be helpful too... thanks.
Cheers.
The main thing MessageBox.Show() does is run the message loop. You may be able to replace it with a simulated DoEvents() (see this answer).
But there probably is a serious problem in your code surrounding this.
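As a rough illustration of the message-loop point (a sketch only; it assumes a DirectShow interop layer such as DirectShow.NET, where IMediaControl.Run() returns the HRESULT as an int):

// Start the graph as before.
int hr = mediaControl.Run();

// Pump the message loop briefly so the video renderer's window can
// process its pending messages -- this is the work MessageBox.Show("")
// was doing as a side effect.
for (int i = 0; i < 10; i++)
{
    System.Windows.Forms.Application.DoEvents();
    System.Threading.Thread.Sleep(50);
}

Needing this at all usually points to a deeper problem, though, such as the filter graph being built on a thread that never pumps messages.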

Using C# to take a screenshot of the Opera Mobile emulator

I have a problem that has been bothering me for a few days.
I need to do the following:
start a process (with some arguments), find the window of the process, take a picture of that window, and kill the process,
and I need to repeat this operation X times.
I can start the process and, using user32, find its main window handle, get the window size (so I know the size of the image), set the window position to 0,0 and make it topmost, and then use the CopyFromScreen method to capture the image from 0,0 to the size of the window.
So it looks like everything is OK, but there is one problem that I can't solve:
when the process is started, I've noticed that some time has to pass before I can use the SetWindowPos native function, so I use Thread.Sleep(x seconds), but
that time is different on every system, and that is the problem: I would have to know how long a delay to use on every system, and I can't afford a big unnecessary delay because the process repeats hundreds of times and every millisecond counts.
Please, if you have a solution, I will be happy to hear it.
The process is the Opera Mobile emulator.
Another solution would be to use SendMessage and change the URL instead of starting the process over and over, so
I would only have to wait once, but that doesn't work either:
using Spy++ I tried to find details about the process, like its class name, and it says the window is invalid,
so I probably can't use SendMessage either.
Browsers are not designed for this kind of usage. If you want to reliably know when some browser is ready for rendering a new page and want to grab the result, try something that actually integrates into the rendering engine / core of the browser.
For a complete framework, see http://phantomjs.org/ For your own solution, try embedding a browser widget directly in your application instead of interfacing with another process. I'm sure .NET has the right controls for it. This way you can hook into the relevant events directly, instead of guessing the right delays.
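A minimal sketch of that suggestion using WinForms' built-in WebBrowser control: hook DocumentCompleted instead of guessing a delay, then render the control to a bitmap. (The URL and file name are placeholders, and DrawToBitmap's support for WebBrowser content varies by Windows/IE version, so treat this as a starting point.)

using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.Windows.Forms;

class CaptureForm : Form
{
    private readonly WebBrowser browser = new WebBrowser();

    public CaptureForm()
    {
        browser.Size = new Size(800, 600);
        browser.ScrollBarsEnabled = false;
        Controls.Add(browser);
        browser.DocumentCompleted += OnDocumentCompleted;
        browser.Navigate("http://example.com/");  // placeholder URL
    }

    // Fires when the page has finished loading -- no fixed delay needed.
    private void OnDocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
    {
        using (var bitmap = new Bitmap(browser.Width, browser.Height))
        {
            browser.DrawToBitmap(bitmap, new Rectangle(Point.Empty, browser.Size));
            bitmap.Save("snapshot.png", ImageFormat.Png);
        }
    }
}

static class Program
{
    [STAThread]  // the WebBrowser control requires a single-threaded apartment
    static void Main()
    {
        Application.Run(new CaptureForm());
    }
}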

Long process with browser interface

I am using ASP.NET MVC 3 with C# and I have a process that takes about 10 minutes to finish. I need some help with how I could show some interface (progress bar, etc.).
What would happen if the user closed the browser? My process should not be stopped.
What would happen if another user tried to open the process page?
I started looking into the jQuery progress bar but ran into those questions, and am looking for some help.
Thank you
If your process takes 10 minutes to finish, then you must do the work in the background and keep the result somewhere so you can show it.
First question: what happens if the user closes the browser? To solve this, you need a system that does the work in the background and lets the browser move on. If you cannot build a full scheduling class for your jobs, a simple thread can do the same thing, but it is less flexible.
Second question: how to avoid starting the process a second time. You can solve this by using a mutex. You create a mutex with a specific name, and you release it when the job is done, after the 10 minutes. If, in the meantime, some user tries to re-run the same process, you see that the mutex is held and show him a message asking him to wait.
You need somewhere to keep the result information: let's say you run a 10-minute job, then store the results somewhere so the user sees them when they are generated, and if he likes he can re-run the procedure.
With this approach you don't need to fully disable the page: just show a message that the job is still running, or automatically refresh the page every 30 seconds to see if the results are done.
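A rough sketch of both ideas together: a background thread does the ten-minute job, and a named Mutex acts as the "already running" flag. (All the names here are made up for illustration, and this does not survive an app-pool recycle.)

using System.Threading;

public static class LongJobRunner
{
    private const string MutexName = "MyApp.LongJob";

    // Returns true if the job was started, false if one is already running.
    public static bool TryStart()
    {
        bool createdNew;
        var mutex = new Mutex(false, MutexName, out createdNew);
        if (!createdNew)
        {
            mutex.Close();
            return false;  // the name is taken: tell the user to wait
        }

        new Thread(() =>
        {
            try
            {
                DoTenMinuteJob();        // the long-running work
                SaveResultsSomewhere();  // persist so any later request can read them
            }
            finally
            {
                mutex.Close();           // releases the name when the job is done
            }
        }) { IsBackground = true }.Start();

        return true;
    }

    private static void DoTenMinuteJob() { /* ... */ }
    private static void SaveResultsSomewhere() { /* ... */ }
}

The page itself then only needs to call LongJobRunner.TryStart() and, when it returns false, show the "still running, please wait" message, refreshing every 30 seconds or so as described above.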

How to know if the current browser is loaded with C# code

I am working on generating an image of a browser: I take a snapshot of the browser from code, in a Windows Forms app. But if the browser is not loaded within a specific time (say, 15 seconds), I get a blank snapshot. Can anyone help me with this?
I'm not entirely sure if I understand exactly what you're trying to do, but I'll take a stab at it: It sounds like you're trying to open whatever program is set as the user's default web browser, and then do something like a BitBlt to take a screenshot of it.
However, as you've noticed, it's difficult to just wait a pre-defined interval and hope the browser has completely loaded. Instead, you could try something like WaitForInputIdle after starting the process, which will suspend your thread's execution until the process has finished initializing and is idle (waiting for user input). This should allow the browser to finish loading before you proceed with taking the snapshot.
Something like the following code:
//start the web browser
System.Diagnostics.Process proc = System.Diagnostics.Process.Start("iexplore.exe");
//wait until the process has finished initializing and is idle
proc.WaitForInputIdle();
//take your screenshot, or whatever
//...

Script does not stop when I close the browser or click Abort

I wrote a web crawler which calls a web page in a do-while loop, roughly every 3 seconds;
in total there are 7000 sites. I parse the data and save it in my DB.
Sometimes, because the script keeps loading for a long time, I get a timeout in the browser,
but in the background it continues; I can see that in my database.
Can I prevent this? Right now the only way is to stop the web server.
Thank you and best regards.
Your web page is kicking off a server-side process. Killing your browser or closing it is not going to stop this. It sounds to me like a web page to control this is the wrong approach, and you should be looking at a connected form of application like a WinForms/WPF app. There would be ways to get this to work with ASP.NET, but they are not going to be simple. I think you have just chosen the wrong technology.
Starting an intensive, long running process like this from a web page is almost never a good idea. There are lots of reasons, but the main ones are :
1) If you get a timeout in the browser (this is your scenario) the data you have harvested may not be displayed.
2) What happens if you hit refresh in the browser? Will it attempt to start the whole process again? This is an easy target for an attacker if he wants to tie up all your server resources.
3) Is the data you are crawling really likely to change to such an extent that you need "live" crawling? 99% of cases would be served just as well with a background timed job running the crawl, and your front end just displaying the contents of the database.
I would seriously recommend you rethink your crawling strategy to something more controllable and stable.
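To make point 3 concrete, here is a minimal sketch of such a background job as a separate console process (CrawlNextBatch and the 3-second interval are assumptions based on the question; a Windows service or scheduled task would work the same way):

using System;
using System.Threading;

class CrawlerJob
{
    static void Main()
    {
        // Fire every 3 seconds, independent of any browser or web server.
        using (var timer = new Timer(_ => CrawlNextBatch(), null,
                                     TimeSpan.Zero, TimeSpan.FromSeconds(3)))
        {
            Console.WriteLine("Crawler running; press Enter to stop.");
            Console.ReadLine();
        }
    }

    static void CrawlNextBatch()
    {
        // Fetch the next of the ~7000 pages, parse it, and save it to the DB.
        // (If a fetch can take longer than 3 seconds, guard against
        // overlapping callbacks here.)
    }
}

The web page then just reads whatever the job has stored in the database, so closing the browser never orphans a crawl.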
