The API doesn't officially support threading (see below) or provide a way to close an active document. That said, a workaround for closing an active document is to call...
SendKeys.SendWait("^{F4}");
...from a separate thread. That works fine, except that I need to loop through opening and closing several documents. If I put any code at all after the thread call, it runs before the previous document has closed. I have tried a number of standard threading callback methods, including...
Task.Factory.StartNew(() =>
ThreadPool.QueueUserWorkItem(new WaitCallback
AutoResetEvent.WaitOne()
with no luck. And Thread.Sleep() just stalls the error/crash. Does anyone have any ideas?
"Revit's internals make use of multiprocessing in only a few select isolated locations. None of these locations currently encompass the code in the Revit API, or any part of it. Thus Autodesk does not recommend making any calls to the Revit API from within simultaneously executing parallel threads. It may be that some part of the Revit API is isolated enough to be able to execute successfully from within such threading code in a test environment; this should not be taken to be a guarantee that the same source code will function for any model or situation, or that a future change in Revit will not cause this code to cease to function."
public void OpenFile()
{
    for (int i = 0; i < 3; i++)
    {
        uiApp.OpenAndActivateDocument(TargetPath(i));
        ThreadPool.QueueUserWorkItem(CloseDocProc);
        //any code here at all opens the next doc without closing the last
    }
}

public void CloseDocProc(object stateInfo)
{
    SendKeys.SendWait("^{F4}");
    //can run code here
}
The problem was the threading, just like they said. Using any of the callback methods, it would freeze right at that point. And you can only do a limited number of things in the thread; it would not let me open a document, no matter what!
The answer was to use a single-threaded timer, System.Windows.Forms.Timer, calling my Open() method every 10 seconds or so, then stopping the timer and running the last bit of code once a counter reached a certain point.
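Roughly, the timer plumbing looks like this (Open() is the method that opens and closes one document; RunFinalCode() stands in for the last bit of code; the interval and the counter limit of 3 are illustrative):

private System.Windows.Forms.Timer _timer;
private int _counter = 0;

public void StartProcessing()
{
    _timer = new System.Windows.Forms.Timer();
    _timer.Interval = 10000;        // roughly 10 seconds between documents
    _timer.Tick += OnTimerTick;
    _timer.Start();
}

private void OnTimerTick(object sender, EventArgs e)
{
    if (_counter >= 3)
    {
        _timer.Stop();              // counter reached its limit
        RunFinalCode();             // placeholder for "the last bit of code"
        return;
    }

    Open();                         // open/close one document, as described above
    _counter++;
}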
Not sure if it could do the trick, but you can perhaps use this technique: http://adndevblog.typepad.com/autocad/2012/06/use-thread-for-background-processing.html
It's for AutoCAD, but I think it could work with Revit. The Revit API, like the AutoCAD one, does not support multithreading. You should only call the API functions from the main thread.
You need, in fact, to marshal the call to the main thread. The simplest way to achieve that is to create a System.Windows.Forms.Control object on the main thread and call its Invoke() method from the separate thread where you're closing the document.
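A minimal sketch of that marshalling idea (the class and method names are illustrative; the control must be constructed, and its handle created, on the main thread):

using System;
using System.Windows.Forms;

class MainThreadMarshaller
{
    private readonly Control _marshalControl;

    public MainThreadMarshaller()
    {
        // Construct on the main thread; forcing the handle ties Invoke() to that thread.
        _marshalControl = new Control();
        IntPtr forceHandle = _marshalControl.Handle;
    }

    // Call this from the worker thread; the action runs back on the main thread.
    public void RunOnMainThread(Action action)
    {
        _marshalControl.Invoke(action);
    }
}

// From the background thread that closes the document:
// marshaller.RunOnMainThread(() => SendKeys.SendWait("^{F4}"));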
Or you can also use the Idle event in a creative way...
Create a state machine in your app's idle event handler that interacts with your thread and handles the Revit calls.
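A rough sketch of such a state machine, assuming the add-in subscribes to the UIApplication.Idling event and reusing uiApp/TargetPath(i) from the code above (the states and the document count are illustrative):

private enum JobState { OpenNext, CloseCurrent, Done }
private JobState _state = JobState.OpenNext;
private int _docIndex = 0;

// All Revit API calls stay on the main thread, because Idling is raised there.
private void OnIdling(object sender, Autodesk.Revit.UI.Events.IdlingEventArgs e)
{
    switch (_state)
    {
        case JobState.OpenNext:
            if (_docIndex >= 3) { _state = JobState.Done; break; }
            uiApp.OpenAndActivateDocument(TargetPath(_docIndex++));
            _state = JobState.CloseCurrent;
            break;

        case JobState.CloseCurrent:
            SendKeys.SendWait("^{F4}");   // same close workaround as above
            _state = JobState.OpenNext;
            break;

        case JobState.Done:
            uiApp.Idling -= OnIdling;     // detach once the work is finished
            break;
    }
}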
Background
I faced this problem a couple of years ago and got this very helpful answer from Stephen Cleary. The problem with VSTO add-ins is that they do not set a SynchronizationContext, so async calls do not resume on the UI thread, causing all sorts of cross-thread access troubles. The solution, as he mentioned, is to manually call SetSynchronizationContext before calling any async function.
I have been using this technique since then and thought that was all there was to it. But today I have run into a situation where even manually setting the context does not force execution to resume on the calling thread.
Situation
My VSTO add-in contains a WPF pane (inside a CustomTaskPane) which is bound to its ViewModel, which contains several AsyncRelayCommand properties (from the WCT). One of these commands calls my Data Service, which in turn calls RestSharp methods to fetch data from the API server.
All these calls use async/await, and all of them use ConfigureAwait(false) except the one at the top level (i.e. the command itself). Here is a snapshot of what the call site looks like:
As you can see, I have manually called SetSynchronizationContext before the await call. It also shows that SynchronizationContext.Current is set after it resumes from the await, but somehow the code is still running on the worker thread. I also verified that the code was running on the UI thread when it hit line 259, before drilling down into the await call.
I have already spent a lot of time and effort on this and can't make any sense of it. Can anyone help me figure out if I'm missing something obvious?
You need to use an instance of the WindowsFormsSynchronizationContext class instead.
The WindowsFormsSynchronizationContext class provides a synchronization mechanism for Windows Forms.
The DispatcherSynchronizationContext class is for WPF applications, which use two threads: a background thread for rendering and a thread for the UI. UI elements owned by one thread are not accessible from other UI threads, so Microsoft introduced the Dispatcher, which is responsible for coordinating interaction between multiple UI threads.
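A minimal sketch of what that looks like at the call site (GetItemsAsync and the Items property are placeholders for the RestSharp-backed service call and the bound ViewModel property):

using System.Threading;
using System.Threading.Tasks;
using System.Windows.Forms;

private async Task LoadItemsAsync()
{
    // Install a WinForms context on the UI thread before the first await,
    // so the continuation is posted back to that thread.
    if (!(SynchronizationContext.Current is WindowsFormsSynchronizationContext))
    {
        SynchronizationContext.SetSynchronizationContext(
            new WindowsFormsSynchronizationContext());
    }

    var items = await _dataService.GetItemsAsync();   // deeper calls keep ConfigureAwait(false)
    // Back on the add-in's UI thread here, so bound properties can be updated safely.
    Items = items;
}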
Does anyone know why this code is slowing down the UI:
Thread trdGenerateTrajectory = new Thread(() => HeavyMethod());
trdGenerateTrajectory.Start();
trdGenerateTrajectory.Join();
This is supposed to run in a separate thread from the main thread, am I right? If so, why does running it slow down/freeze the UI?
EDIT: Thanks for your comments. I removed Join(), but it still freezes the UI. Any idea?
UPDATE: The HeavyMethod() method calls a method from a Matlab DLL that I created. The method in the DLL generates a manipulation trajectory for a robot. My project is a heavy robotics project that communicates with lots of hardware/devices. The project has 12 BackgroundWorkers and one timer. The timer is responsible for updating the UI and all of the text/colors/images on it. I haven't had any issue so far with the BackgroundWorkers and the timer, and no matter how heavy the tasks they run are, I have never seen any delay or stall in the timer and UI updates. However, when I call this specific method in the Matlab DLL, the UI stops updating completely until the method completes. This is what I experienced:
I used threads (above code) with no luck.
I then moved the method, and the processing that follows it, into another BackgroundWorker, again with no luck.
Then I realized that only some of the textboxes on my form are experiencing this issue: the ones that get their values from another method of the same Matlab DLL. That was when I realized that this issue has nothing to do with threads/BackgroundWorkers and is related to the way Matlab runs methods. Maybe it is single threaded!? Anyway, I thought it might help if I made a separate DLL for this specific method that generates the trajectory, so I created another DLL, but I experienced the exact same issue. It seems like a Matlab DLL can only run one method at a time, no matter whether you call them from different threads or even from separate DLLs. I believe I should ask this in a separate SO question and I will, but in the meantime, do you have any comment on this? (Update: I didn't receive any reply so I posted a new question: Calling two functions from a single Matlab dll at the same time)
Although you're running your computation on a background thread, your Join call causes your UI thread to block anyway until your computation completes.
The proper way to do this today would be using the async and await keywords. If you really want to restrict yourself to threads, you can use an Invoke call within the thread to dispatch control back to the UI thread once the computation completes:
Thread trdGenerateTrajectory = new Thread(() =>
{
    HeavyMethod();
    this.Invoke(new Action(() =>
    {
        // Update UI here.
    }));
});
trdGenerateTrajectory.Start();
// trdGenerateTrajectory.Join(); <- do not block
Edit: Assuming that you want to run your computation in response to some button click (or any other event), you can use the async pattern in the event handler like so:
private async void myButton_Click()
{
    await Task.Run(HeavyMethod);
    // Update UI here.
}
You're starting the thread and immediately joining it. Joining the thread means that you are waiting for it to end.
Normally, user interfaces are single threaded. This means that your method needs to return for another event to be processed.
Help with ideas for redesigning the C# program below would be greatly appreciated. I am trying to pick between implementing multithreading using 1) TAP, 2) coarse-grained threads containing spinner loops that terminate when their bools are set to false, or 3) the same threads using signalling instead of those bools. I will explain the program below to make the case clear.
The Program
The program is a game automation application in C# that I am developing as a fun way to learn the language and C# (5.0) features better. It has a UI that needs to remain responsive while the app runs.
When a particular tab in the UI is opened, the program fires up a new thread called "Scan" that, in a new method in another class, scans various memory locations and updates labels in the UI with these quickly changing values using the UI's SynchronizationContext. This goes on in a while(scanning) loop for as long as the scanning bool is true (usually the full lifetime of the program).
When the user clicks the Start button on this tab, the program fires up two new threads that do the following: thread "Run" moves the character around following a particular path, and thread "Action" hits particular buttons and performs actions while the player runs the path. If a certain scenario occurs, the program should stop the run thread and the action thread temporarily, run a method, and when it finishes, go back to the running and action'ing.
When the user clicks the Stop button on this tab, the automation should halt and threads terminate.
The Challenge
I have already created a working version using continuous spinner loops in each thread that take care of the various work. The spinners run using a while(myBool). For the three threads the bools are: scanning, running and actioning.
When I want to stop a thread I set the bool to false, and use a Thread.Join to wait for the thread to terminate gracefully before proceeding. The threads can, as mentioned, be stopped by the user clicking the Stop button, or automatically by the program as part of its functionality. In the latter case a thread is stopped, Joined, and then at a later stage restarted.
After having done a lot of reading and research on threading and the new async programming tools in C# 5.0, I have realized that the way I am currently doing it might be very clumsy and unprofessional. It creates lots of synchronization/thread-safety issues, and as the goal of all of this is to learn more about C#, I wanted to get your take on whether I should change it to a fine-grained asynchronous programming approach instead, using TAP with async and await as appropriate.
Does this sound like a case where Tasks with cancellation tokens could be useful? The threads are after all long-running operations, so I was concerned that using the thread pool (Task.Run) would cause bad hygiene in the thread pool (over-subscription). If async programming seems like a bad match here, what about using threads as I have done, but instead use signalling to start and stop the threads?
Any thoughts greatly appreciated.
No. The TPL was designed to run shorter tasks, where allocating new threads all the time would hurt performance. It has quite nice features like job queues and work stealing (a TPL thread can take jobs from another thread). It can of course run longer tasks, but you won't get as many benefits from that. On the contrary, you force the TPL to allocate new threads.
However, the question is a bit general in the sense that we need more information about your actual implementation to know what you should use. For the Scan thread it's quite obvious that it should run in a single thread.
But for the others it's hard to know. Do they do work all the time or periodically? If they work all the time, you should keep them in separate threads.
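If you do keep dedicated threads, a CancellationTokenSource can still replace the raw bools as the stop signal. A rough sketch for the Scan thread (ScanMemoryAndPostToUi() is a placeholder for your existing scan work):

private CancellationTokenSource _scanCts = new CancellationTokenSource();
private Thread _scanThread;

private void StartScan()
{
    _scanThread = new Thread(() =>
    {
        while (!_scanCts.Token.IsCancellationRequested)
        {
            ScanMemoryAndPostToUi();    // placeholder for the existing scan loop body
        }
    });
    _scanThread.IsBackground = true;
    _scanThread.Start();
}

private void StopScan()
{
    _scanCts.Cancel();                  // replaces "scanning = false"
    _scanThread.Join();                 // wait for a graceful exit, as before
}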
As for the thread synchronization, there is another alternative. You could use a ConcurrentQueue to queue up everything that has to be drawn. That way you do not need any locking: just let the UI thread check the queue and draw anything in it, while the producers continue to do their work.
In fact, in that way you can move anything not related to UI drawing to other threads. That should also improve the responsiveness in your application.
// System.Collections.Concurrent: a thread-safe queue shared by producers and the UI thread.
private readonly ConcurrentQueue<IItemRender> _drawQueue = new ConcurrentQueue<IItemRender>();

public void ActionRunnerThreadFunc()
{
    // Producer thread: hand off a render item without any locking.
    _drawQueue.Enqueue(new SpaceShipRenderer(x, y));
}

public void UIThreadFunc()
{
    // UI thread: drain whatever has been queued and draw it.
    IItemRender item;
    while (_drawQueue.TryDequeue(out item))
        item.Draw(drawContext);
}
I am having problems closing an application that uses WaitForSingleObject() with an INFINITE timeout.
The full picture is this. I am doing the following to allow my application to handle the device wakeup event:
Register the event with:
CeRunAppAtEvent("\\\\.\\Notifications\\NamedEvents\\WakeupEvent",
NOTIFICATION_EVENT_WAKEUP);
Start a new thread to wait on:
Thread waitForWakeThread = new Thread(new ThreadStart(WaitForWakeup));
waitForWakeThread.Start();
Then do the following in the target method:
private void WaitForWakeup()
{
    IntPtr handle = CreateEvent(IntPtr.Zero, 0, 0, "WakeupEvent");
    while (true)
    {
        WaitForSingleObject(handle, INFINITE);
        MessageBox.Show("Wakey wakey");
    }
}
This all works fine until I try to close the application when, predictably, WaitForSingleObject continues to wait and does not allow the app to close properly. We only allow one instance of our app to run at a time and we check for this on startup. It appears to continue running until the device is soft reset.
Is there a way to kill the handle that WaitForSingleObject is waiting for, to force it to return?
Many thanks.
Use WaitForMultipleObjects instead, and pass two handles: the existing one, and one for an event called something like 'exit'. During app shutdown, call SetEvent on the exit event; WaitForMultipleObjects will then return and you can get the thread to exit gracefully.
You need to switch on the return value of WaitForMultipleObjects to perform the appropriate behaviour depending on which of the handles was signalled.
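A minimal sketch, assuming WaitForMultipleObjects and SetEvent are P/Invoked the same way as CreateEvent and WaitForSingleObject in the question, and that _exitHandle is an event created at startup and signalled with SetEvent during shutdown:

private void WaitForWakeup()
{
    IntPtr wakeHandle = CreateEvent(IntPtr.Zero, 0, 0, "WakeupEvent");
    IntPtr[] handles = { wakeHandle, _exitHandle };

    while (true)
    {
        uint signalled = WaitForMultipleObjects(2, handles, false, INFINITE);
        if (signalled == 0)            // WAIT_OBJECT_0: the wakeup event fired
        {
            MessageBox.Show("Wakey wakey");
        }
        else                           // WAIT_OBJECT_0 + 1 (or a failure): exit requested
        {
            return;                    // leave the loop so the thread ends gracefully
        }
    }
}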
Possibly, also, you can set the thread to be a background thread. This will prevent it from stopping your application from shutting down when the main thread terminates.
See:
http://msdn.microsoft.com/en-us/library/system.threading.thread.isbackground.aspx
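In code, that is a one-line change before starting the thread:

Thread waitForWakeThread = new Thread(new ThreadStart(WaitForWakeup));
waitForWakeThread.IsBackground = true;   // a background thread cannot keep the process alive
waitForWakeThread.Start();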
This is what I would do...
Use the EventWaitHandle class instead of calling CreateEvent directly. There shouldn't be any need to use the Windows API other than CeRunAppAtEvent (and API calls make code ugly...). Get this working first.
Before creating the thread, create a ManualResetEvent variable that is not initially flagged. Call it "TerminateEvent".
Replace the WaitForSingleObject API call with WaitHandle.WaitAny(WaitHandle[]) and pass an array containing "TerminateEvent" and the EventWaitHandle class wrapping the CeRunAppAtEvent notification.
Your loop can use the return value of WaitAny to determine what to do. The return value is the array index of the wait handle that unblocked the thread, so you can determine whether to continue the loop or not.
To cleanly end the thread, you can call "Set" on your "TerminateEvent" and then "Join" the thread to wait for it to terminate.
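A minimal sketch of those steps (field names are illustrative; the named-event constructor of EventWaitHandle is assumed to be available on the target platform):

private ManualResetEvent _terminateEvent = new ManualResetEvent(false);
private EventWaitHandle _wakeEvent =
    new EventWaitHandle(false, EventResetMode.AutoReset, "WakeupEvent");

private void WaitForWakeup()
{
    WaitHandle[] handles = { _terminateEvent, _wakeEvent };
    while (true)
    {
        int signalled = WaitHandle.WaitAny(handles);
        if (signalled == 0)                // _terminateEvent: stop the loop
            return;
        MessageBox.Show("Wakey wakey");    // the wakeup notification fired
    }
}

// On shutdown:
// _terminateEvent.Set();
// waitForWakeThread.Join();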
'This all works fine until I try to close the application when, predictably, WaitForSingleObject continues to wait and does not allow the app to close properly.'
Any app can close, no matter what its threads are doing. If you call ExitProcess(0) from any thread in your app, the app will close, even if there are threads waiting INFINITE on some API/synchronisation object, sleeping, running on another processor, whatever. The OS will change the state of all threads that are not running to 'never run again' and use its interprocessor driver to hard-interrupt any other processors that are actually running your thread code. Once all the threads are stopped, the OS frees handles, segments, etc., and your app no longer exists.
Problems arise when developers try to 'cleanly' shut down threads that are stuck - like yours, when the app is closing. So..
Do you have a TThread.WaitFor, or similar, in an OnClose/OnCloseQuery handler, FormDestroy or destructor? If you have, and have no vital reason to ensure that the thread is terminated, just comment it out!
This allows the main form to close, and so your code will finally reach the ExitProcess() it has been trying to get to since you clicked the red cross button.
You could, of course, just call ExitProcess() yourself, but this may leave you with resources leaked in other processes - database connections, for example.
'216/217 errors on close if I don't stop the threads'. This often happens because developers have followed the, er... 'unfortunate' Delphi thread examples and communicate with threads by directly exchanging data between secondary-thread fields and main-thread fields (e.g. TThread.Synchronize). This just sucks and is hell-bent on causing problems, even while the app runs, never mind at shutdown when a form has been destroyed and a thread is trying to write to it, or a thread has been destroyed and a main-thread form is trying to call methods on it. It is much safer to communicate asynchronously with threads by means of queueing/PostMessaging objects that outlive both of them, e.g. objects created in the thread/form and freed in the form/thread, or a (thread-safe) pool of objects created in an initialization section. Forms can then close/free safely while associated threads may continue to pointlessly fill up objects for handling until the main form closes, ExitProcess() is reached, and the OS annihilates the threads.
'My form handle is invalid because the form has closed, but my thread tries to post a message to it'. If the PostMessage call fails, exit your thread. A better way is similar to the approach above - only post messages to a window that outlives all forms. Create one in an initialization section with a trivial WndProc that handles only one const message number, which all threads use for posting. You can use wParam to pass the TWinControl instance that the thread is trying to communicate with (usually a form variable), while lParam passes the object being communicated. When it gets a message from a thread, the WndProc calls Perform on the TWinControl passed, and the TWinControl receives the comms object in a message handler. A simple global boolean, 'AppClosing' say, can stop the WndProc calling Perform() on TWinControls that are freeing themselves during shutdown. This approach also avoids problems arising when the OS recreates your form window with a different handle - the Delphi form handle is not used, and Windows will not recreate/change the handle of the simple window created in initialization.
I have followed these approaches for decades and do not get any shutdown problems, even with apps with dozens of threads slinging objects around on queues.
Rgds,
Martin
Of course the preferable way to solve this is to use WaitForMultipleObjects, or any other suitable function that can wait on multiple criteria (MsgWaitForMultipleObjects, etc.).
However, if you have no control over which function is used, there are some tricky methods to solve this.
You may hook functions imported from a system DLL by altering, in memory, the import table of any module. Since WaitForSingleObject is imported from kernel32.dll, this is feasible.
Using this technique you can redirect the call into your own code, and there you will be able to use WaitForMultipleObjects.
Can Application.DoEvents() be used in C#?
Is this function a way to allow the GUI to catch up with the rest of the app, in much the same way that VB6's DoEvents does?
Hmya, the enduring mystique of DoEvents(). There's been an enormous amount of backlash against it, but nobody ever really explains why it is "bad". The same kind of wisdom as "don't mutate a struct". Erm, why do the runtime and the language support mutating a struct if that's so bad? Same reason: you shoot yourself in the foot if you don't do it right. Easily. And doing it right requires knowing exactly what it does, which in the case of DoEvents() is definitely not easy to grok.
Right off the bat: almost any Windows Forms program actually contains a call to DoEvents(). It is cleverly disguised, however with a different name: ShowDialog(). It is DoEvents() that allows a dialog to be modal without it freezing the rest of the windows in the application.
Most programmers want to use DoEvents to stop their user interface from freezing when they write their own modal loop. It certainly does that; it dispatches Windows messages and gets any paint requests delivered. The problem however is that it isn't selective. It not only dispatches paint messages, it delivers everything else as well.
And there's a set of notifications that cause trouble. They come from about 3 feet in front of the monitor. The user could for example close the main window while the loop that calls DoEvents() is running. That works, user interface is gone. But your code didn't stop, it is still executing the loop. That's bad. Very, very bad.
There's more: The user could click the same menu item or button that causes the same loop to get started. Now you have two nested loops executing DoEvents(), the previous loop is suspended and the new loop is starting from scratch. That could work, but boy the odds are slim. Especially when the nested loop ends and the suspended one resumes, trying to finish a job that was already completed. If that doesn't bomb with an exception then surely the data is scrambled all to hell.
Back to ShowDialog(). It executes DoEvents(), but do note that it does something else. It disables all the windows in the application, other than the dialog. Now that 3-feet problem is solved, the user cannot do anything to mess up the logic. Both the close-the-window and start-the-job-again failure modes are solved. Or to put it another way, there is no way for the user to make your program run code in a different order. It will execute predictably, just like it did when you tested your code. It makes dialogs extremely annoying; who doesn't hate having a dialog active and not being able to copy and paste something from another window? But that's the price.
Which is what it takes to use DoEvents safely in your code. Setting the Enabled property of all your forms to false is a quick and efficient way to avoid problems. Of course, no programmer ever actually likes doing this. And doesn't. Which is why you shouldn't use DoEvents(). You should use threads. Even though they hand you a complete arsenal of ways to shoot your foot in colorful and inscrutable ways. But with the advantage that you only shoot your own foot; it won't (typically) let the user shoot hers.
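For completeness, a minimal sketch of the Enabled-based mitigation mentioned above, for a single-form app (workFinished is a placeholder for whatever condition the modal loop is waiting on):

this.Enabled = false;                    // no user input: no re-entrancy, no surprise close
try
{
    while (!workFinished)
    {
        Application.DoEvents();          // keep paint and other messages flowing
        System.Threading.Thread.Sleep(10);
    }
}
finally
{
    this.Enabled = true;                 // give the user their window back
}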
The next versions of C# and VB.NET will provide a different gun with the new await and async keywords. Inspired in small part by the trouble caused by DoEvents and threads but in large part by WinRT's API design that requires you to keep your UI updated while an asynchronous operation is taking place. Like reading from a file.
It can be, but it's a hack.
See Is DoEvents Evil?.
Direct from the MSDN page that thedev referenced:
Calling this method causes the current thread to be suspended while all waiting window messages are processed. If a message causes an event to be triggered, then other areas of your application code may execute. This can cause your application to exhibit unexpected behaviors that are difficult to debug. If you perform operations or computations that take a long time, it is often preferable to perform those operations on a new thread. For more information about asynchronous programming, see Asynchronous Programming Overview.
So Microsoft cautions against its use.
Also, I consider it a hack because its behavior is unpredictable and side-effect prone (this comes from experience trying to use DoEvents instead of spinning up a new thread or using a background worker).
There is no machismo here - if it worked as a robust solution I would be all over it. However, trying to use DoEvents in .NET has caused me nothing but pain.
Yes, there is a static DoEvents method in the Application class in the System.Windows.Forms namespace. System.Windows.Forms.Application.DoEvents() can be used to process the messages waiting in the queue on the UI thread when performing a long-running task in the UI thread. This has the benefit of making the UI seem more responsive and not "locked up" while a long task is running. However, this is almost always NOT the best way to do things.
According to Microsoft calling DoEvents "...causes the current thread to be suspended while all waiting window messages are processed." If an event is triggered there is a potential for unexpected and intermittent bugs that are difficult to track down. If you have an extensive task it is far better to do it in a separate thread. Running long tasks in a separate thread allows them to be processed without interfering with the UI continuing to run smoothly. Look here for more details.
Here is an example of how to use DoEvents; note that Microsoft also provides a caution against using it.
From my experience I would advise great caution with using DoEvents in .NET. I experienced some very strange results when using DoEvents in a TabControl containing DataGridViews. On the other hand, if all you're dealing with is a small form with a progress bar then it might be OK.
The bottom line is: if you are going to use DoEvents, then you need to test it thoroughly before deploying your application.
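As a rough sketch of the separate-thread alternative (the control names and DoSliceOfLongTask() are placeholders), a BackgroundWorker keeps the UI responsive and marshals progress updates back to the UI thread for you:

var worker = new BackgroundWorker { WorkerReportsProgress = true };

worker.DoWork += (s, e) =>
{
    for (int i = 0; i <= 100; i += 10)
    {
        DoSliceOfLongTask();            // part of the long-running work
        worker.ReportProgress(i);       // raised on the UI thread via ProgressChanged
    }
};

worker.ProgressChanged += (s, e) => progressBar1.Value = e.ProgressPercentage;
worker.RunWorkerCompleted += (s, e) => statusLabel.Text = "Done";

worker.RunWorkerAsync();                // returns immediately; no DoEvents needed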
Yes.
However, if you need to use Application.DoEvents, this is mostly an indication of a bad application design. Perhaps you'd like to do some work in a separate thread instead?
I saw jheriko's comment above and initially agreed that I couldn't find a way to avoid using DoEvents if you end up spinning your main UI thread waiting for a long-running asynchronous piece of code on another thread to complete. But based on Matthias's answer, a simple Refresh of a small panel on my UI can replace the DoEvents (and avoid a nasty side effect).
More detail on my case ...
I was doing the following (as suggested here) to ensure that a progress bar type splash screen (How to display a "loading" overlay...) updated during a long running SQL command:
IAsyncResult asyncResult = sqlCmd.BeginExecuteNonQuery();
while (!asyncResult.IsCompleted) //UI thread needs to Wait for Async SQL command to return
{
    System.Threading.Thread.Sleep(10);
    Application.DoEvents(); //to make the UI responsive
}
The bad: For me calling DoEvents meant that mouse clicks were sometimes firing on forms behind my splash screen, even if I made it TopMost.
The good/answer: Replace the DoEvents line with a simple Refresh call to a small panel in the centre of my splash screen, FormSplash.Panel1.Refresh(). The UI updates nicely and the DoEvents weirdness others have warned of was gone.
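For reference, here is the same wait loop with that one-line swap (FormSplash.Panel1 being the small panel in the centre of the splash screen):

IAsyncResult asyncResult = sqlCmd.BeginExecuteNonQuery();
while (!asyncResult.IsCompleted)        // UI thread still waits for the async SQL command
{
    System.Threading.Thread.Sleep(10);
    FormSplash.Panel1.Refresh();        // repaint the panel instead of pumping every message
}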
I've seen many commercial applications using the "DoEvents hack". Especially when rendering comes into play, I often see this:
while (running)
{
    Render();
    Application.DoEvents();
}
They all know about the evil of that method. However, they use the hack because they don't know any other solution. Here are some approaches taken from a blog post by Tom Miller:
- Set your form to have all drawing occur in WmPaint, and do your rendering there. Before the end of the OnPaint method, make sure you do a this.Invalidate(). This will cause the OnPaint method to be fired again immediately.
- P/Invoke into the Win32 API and call PeekMessage/TranslateMessage/DispatchMessage. (DoEvents actually does something similar, but you can do this without the extra allocations.)
- Write your own forms class that is a small wrapper around CreateWindowEx, and give yourself complete control over the message loop.
- Decide that the DoEvents method works fine for you and stick with it.
Check out the MSDN Documentation for the Application.DoEvents method.
DoEvents does allow the user to click around or type and trigger other events, and background threads are a better approach.
However, there are still cases where you may run into issues that require flushing event messages. I ran into a problem where the RichTextBox control was ignoring the ScrollToCaret() method when the control had messages in queue to process.
The following code blocks all user input while executing DoEvents:
using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

namespace Integrative.Desktop.Common
{
    static class NativeMethods
    {
        #region Block input

        [DllImport("user32.dll", EntryPoint = "BlockInput")]
        [return: MarshalAs(UnmanagedType.Bool)]
        private static extern bool BlockInput([MarshalAs(UnmanagedType.Bool)] bool fBlockIt);

        public static void HoldUser()
        {
            BlockInput(true);
        }

        public static void ReleaseUser()
        {
            BlockInput(false);
        }

        public static void DoEventsBlockingInput()
        {
            HoldUser();
            Application.DoEvents();
            ReleaseUser();
        }

        #endregion
    }
}
Application.DoEvents can create problems if something other than graphics processing is put in the message queue.
It can be useful for updating progress bars and notifying the user of progress in something like MainForm construction and loading, if that takes a while.
In a recent application I made, I used DoEvents to update some labels on a loading screen every time a block of code executed in the constructor of my MainForm. In this case the UI thread was occupied with sending an email through an SMTP server that didn't support SendAsync() calls. I could probably have created a different thread with Begin() and End() methods and called Send() from there, but that approach is error-prone and I would prefer the main form of my application not to throw exceptions during construction.
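A rough sketch of that pattern (the label, the email step, and the other loading steps are illustrative):

lblLoadingStatus.Text = "Sending startup e-mail...";
Application.DoEvents();                 // let the label repaint before the blocking call
SendStartupEmail();                     // synchronous SMTP send; no SendAsync available

lblLoadingStatus.Text = "Loading main form...";
Application.DoEvents();
LoadMainFormData();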