I'm working on a Windows CE project and have run into a strange problem. When I display a MessageBox, the physical keys stop working. The project relies solely on key input, which works fine when changing focus between objects on the form, but because I can't select an option from the message box, the application essentially locks up.
I initially thought my KeyDown handler was to blame, but I see the same issue regardless of the handler (it doesn't work even in a new project).
I've been looking around online, but the only thing I've come across involves changing the MessageBox buttons. The most prominent solution is to create your own message boxes, which seems like overkill for what I need. Is there a more efficient way of solving this problem?
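For reference, a minimal sketch of the custom-dialog workaround mentioned above, assuming the .NET Compact Framework; the class name and layout are just illustrative:

using System.Windows.Forms;

// A small owned form with ordinary controls keeps key handling inside
// the application instead of going through the native MessageBox.
public class KeyFriendlyMessageBox : Form
{
    public KeyFriendlyMessageBox(string text)
    {
        var message = new Label { Text = text, Dock = DockStyle.Fill };
        var ok = new Button { Text = "OK", DialogResult = DialogResult.OK, Dock = DockStyle.Bottom };
        Controls.Add(message);
        Controls.Add(ok);
    }
}

// Usage:
// using (var box = new KeyFriendlyMessageBox("Save changes?"))
//     box.ShowDialog();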
I am a fairly experienced WinForms developer. I have an MDI application that used to work well. However, the main shell of the application, for which we use the ComponentOne RibbonForm, was recently updated in a big way. This update affected some of our other third-party components, which we established was due to ComponentOne's use of DoEvents() in their event code. I thought I had cleaned up all of the code causing problems, but I have now found another...
When I have multiple MdiChildren open and select one of them in code from a button click event on the ribbon form via
document.Activate();
document.EditorControl.Select();
document.EditorControl.Focus();
the other open MdiChildren documents still have focus; that is, their forms stay highlighted and input focus is not set on the document I activated in code. Two questions:
How can I ensure that the Form I want to make active is the only one that is active?
Linked to the above: setting one form as active using form.Activate() should deactivate the other MdiChildren, but it does not - how can I deactivate the other windows in code?
Thanks for your time.
[Too long for a comment]
I am sick to the back teeth of fighting C1, especially the Ribbon. I have confirmed with their support that they do use DoEvents() to yield on their GUI threads. I am now going to switch to DevExpress, which should be straightforward for my MVC application...
C1's use of DoEvents() messes up the normal flow of your application. DoEvents() pumps the message queue on the spot, so any pending events (clicks, activation and focus changes, and so on) are processed re-entrantly before your own procedure has finished its sequential statements; calling it in the middle of such a procedure can therefore cause a huge disturbance. This is what I think we are seeing when we perform our MDI operations, but we can never be sure without the C1 source code.
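As a sketch of one possible mitigation (an assumption on my part, not something C1 documents): posting the activation back onto the message queue with BeginInvoke lets any DoEvents-driven re-entrancy in the ribbon's handler finish before the MDI child is activated. The handler name is illustrative, and document is the child form from the question.

private void RibbonButton_Click(object sender, EventArgs e)
{
    // Defer the activation to a fresh message so it runs after the
    // ribbon's own event code (and any DoEvents it pumps) has returned.
    BeginInvoke(new Action(() =>
    {
        document.Activate();
        document.EditorControl.Select();
        document.EditorControl.Focus();
    }));
}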
I hope this helps.
We have a small problem with our .NET 4.0 WPF application. If the user locks the Windows screen and unlocks it again a few seconds later (resuming work), the application window turns black.
The user cannot click on any control or do anything. The curious thing is that this only happens when the user is on particular tabs where we use BeginInvoke and work a lot with threading. I could not find anything about this bug, other than that BeginInvoke can cause such problems.
I also could not find anything in the Windows event logs.
EDIT
I use multiple monitors. If my application is black, unplugging the monitor cable and reconnecting it makes the application work again.
The problem only seems to appear when using the WPF DataGrid. Switching from this grid to the Telerik grid solves the problem, but I could not figure out which part causes it.
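For what it's worth, a minimal workaround sketch (an assumption on our side, not a confirmed fix for this bug): listen for the session unlock and ask WPF to re-render the window once the desktop comes back. MainWindow here stands in for whatever window goes black.

using Microsoft.Win32;
using System.Windows;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();
        SystemEvents.SessionSwitch += OnSessionSwitch;
    }

    private void OnSessionSwitch(object sender, SessionSwitchEventArgs e)
    {
        // SystemEvents can fire on a non-UI thread, so marshal back to
        // the dispatcher before touching the visual tree.
        if (e.Reason == SessionSwitchReason.SessionUnlock)
            Dispatcher.BeginInvoke(new System.Action(InvalidateVisual));
    }
}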
I'm just starting out with MonoMac in Xamarin Studio, and I've run into the strangest problem:
I have a window with an NSButton and an NSTextField on it. By this point I've cut out the event handler on the button, so it doesn't DO anything except highlight when I click it. The button creation code looks like this:
nsButton = new NSButton(new System.Drawing.RectangleF(0, 0, 100, 100));
nsButton.BezelStyle = NSBezelStyle.RoundRect;
nsButton.Font = NSFont.SystemFontOfSize(
NSFont.SystemFontSizeForControlSize(NSControlSize.Regular));
nsButton.StringValue = text;
...and then it gets added to the window like so:
nsView.AddSubview(control.Handle as NSView);
(because in this part of the code, control.Handle is typed as object, and nsView is the main view on the window).
All runs and works fine at first. But, if I click repeatedly on that button, eventually the window just closes. No error, no exception, and the app itself doesn't quit; menus continue to respond and cheerfully log messages when I use them. But the window is simply -- gone.
It's extremely repeatable: it happens after 21 clicks. If I add an event handler that updates the NSTextField (e.g. hello.Caption="Foo";), then it happens after 19 clicks. It doesn't matter whether I click fast or slow; it's always the same number of clicks. Note that there is no code in the project to close the window, and the window doesn't even have a close box; I know of no legitimate way to close it short of quitting the app.
I am baffled here, and don't know how to debug this further. Does Xamarin have some sort of evaluation limit that closes your windows after so-many events? Is it a framework bug? Any insight will be greatly appreciated.
But, if I click repeatedly on that button, eventually the window just
closes. No error, no exception, and the app itself doesn't quit; menus
continue to respond and cheerfully log messages when I use them. But
the window is simply -- gone.
This "disappearing without a trace" sometimes occurs when an application crashes in native code badly enough. This can occur due to bugs in the binding code or mistakes made in calling the native APIs that corrupt internal cocoa state. I believe you are using MonoMac, and that this particular issue has been fixed in Xamarin.Mac.
You can sometimes get more information from the output window or by attaching lldb to your process.
This turned out to be the same issue as this one, in a slightly different guise.
In short, I wasn't keeping a reference to the NSWindow object; I was letting it go out of scope. The GUI window would stick around for a while, but eventually (after some number of events, or after other code created enough behind-the-scenes garbage) the managed wrapper was noticed and disposed of by the garbage collector, and the window was torn down.
It's all perfectly reasonable once you think of it, and happens under both Xamarin and MonoMac (just at slightly different times).
The simple solution, of course, is to retain a reference to the window until you're truly done with it. Problem solved!
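A minimal sketch of what that looks like, assuming MonoMac (the controller class and field names are just illustrative):

using System.Drawing;
using MonoMac.AppKit;

public class MainWindowController
{
    // Holding the NSWindow in a field roots the managed wrapper, so the
    // garbage collector can't dispose it while the window is still on screen.
    NSWindow mainWindow;

    public void CreateWindow()
    {
        mainWindow = new NSWindow(
            new RectangleF(200, 200, 400, 300),
            NSWindowStyle.Titled | NSWindowStyle.Resizable,
            NSBackingStore.Buffered,
            false);
        mainWindow.MakeKeyAndOrderFront(null);
    }
}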
(And yes, I feel a bit sheepish, but hopefully this question will get found by future Mac C# developers, and save them some grief.)
I'm working on a rather big project with my team, and after a while we ran into a big problem.
When we minimize the main window of the application, clicking on the taskbar to restore it results in a "bing" sound (the one Windows plays when you try to interact with a background window while a modal dialog is open on it). I can't restore the window unless I press the ENTER key (after clicking on it first, obviously).
We are using XNA to render inside a WindowsFormsHost component in our WPF application, and the problem shows up when we change something that is not directly connected with WPF (something inside the rendering engine, so it only involves XNA).
I can't post any code because I don't own the rights to it, and it would be meaningless anyway because the project is quite big.
So my question is: what kinds of things can produce a problem like this (the "unable to restore the window" sound) when you click on the taskbar?
At least then I would understand where to search for this bug, because right now I don't even know where to get my hands dirty.
Important notes: I'm using a splash screen, and the problems come up when I do something on a second window (so not the main one directly) which is not modal.
Thanks for any suggestions.
We solved the problem by finding out that there is a bug in an external component that we use inside our application. I still don't understand how this bug came about; if someone posts a better answer I'll mark it.
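For anyone hitting the same symptom: that sound generally means some top-level window in the process is disabled, as if a modal dialog owned it. One thing worth checking (an assumption, not something we confirmed for this project) is that any secondary window is shown non-modally and with an explicit owner, roughly like this (SecondaryWindow is a placeholder name):

// Shown with an owner and non-modally, so closing or hiding it cannot
// leave the main window disabled behind it.
var secondary = new SecondaryWindow();            // placeholder window class
secondary.Owner = Application.Current.MainWindow; // System.Windows.Application
secondary.Show();                                 // Show(), not ShowDialog()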
We have a C# .NET 3.5 UI client application that runs in a multiple monitor desktop environment (typically 4 screens) on Windows 7. Every so often, after running several of these applications, the screen stops redrawing.
Controls remain responsive to clicks or keypresses and values can be updated programmatically, but the form is never redrawn to reflect any changes. For example, buttons that are enabled/disabled based on state may remain grayed out but still react to clicks, or vice versa. Buttons do not animate when clicked.
Workaround: minimizing and restoring the window appears to clear the problem. After this, the application begins to draw correctly.
The most frustrating aspect of this problem is that, programmatically, everything appears to be running normally. No exceptions are caught in our logs, and nothing was visible in the system event logs. We have not yet found a way to detect that this condition is happening.
Other miscellaneous aspects: logging uses log4net, server communication layer uses ZMQ
Update:
Calling the form's Invalidate() and Update() does not fix the problem.
When dragging the window between screens, it shows different values on each screen.
Minimize/restore still resolves the issue.
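A minimal sketch of automating the minimize/restore workaround described above (a stopgap, not a fix):

// Toggle the window state in code to force a full repaint, mirroring
// the manual minimize/restore workaround.
static void ForceRedraw(System.Windows.Forms.Form form)
{
    form.WindowState = System.Windows.Forms.FormWindowState.Minimized;
    form.WindowState = System.Windows.Forms.FormWindowState.Normal;
}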
I can't be sure of anything without seeing the app and the code, but my best guess is someone calls .SuspendLayout() before a complicated update, and an exception (probably swallowed) prevents the code from ever reaching the corresponding .ResumeLayout(). To test this, try adding a button that calls .ResumeLayout() for the form.
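If that turns out to be the cause, a sketch of the safer pattern is below: wrap the batch update so the resume always runs, even when something throws.

form.SuspendLayout();
try
{
    // ...complicated batch update of the form's controls...
}
finally
{
    // Always resume layout, even if the update above throws.
    form.ResumeLayout(true);
}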
It seems the solution is here:
1) http://blogs.msdn.com/b/alejacma/archive/2009/08/11/controls-won-t-get-resized-once-the-nesting-hierarchy-of-windows-exceeds-a-certain-depth-x64.aspx
2) http://support.microsoft.com/kb/2664641/en-us