For my UWP app (in C#) I'd like to have a button so that when the user presses it, the on-screen keyboard opens in inking mode (i.e. where the user can start handwriting and have their handwriting recognized and turned into typed text input).
If I have a textbox and put the focus on it, the on-screen keyboard opens, but in text mode. What I want is to have it come up in inking mode right from the start, instead of bringing it up in text mode and having the user switch it to ink mode.
Does anyone know how I can go about doing this?
You need to use one of the supported components of the Windows Ink platform (for example, an InkCanvas).
There is also a sample, but keep in mind that a bunch of its code is commented out.
If you want to tie this to a button press, you can, for example, move focus to that component when the button is pressed.
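As far as I know you can't force the touch keyboard itself to open in handwriting mode, so the ink components are the way to go. A minimal sketch of the button-driven approach, assuming a page with an InkCanvas named inkCanvas, a TextBox named outputBox, and two buttons wired to these handlers (all of those names are mine, for illustration):

using System;
using System.Linq;
using Windows.UI.Core;
using Windows.UI.Input.Inking;
using Windows.UI.Xaml;

// Page code-behind members. The recognizer turns ink strokes into text candidates.
private readonly InkRecognizerContainer recognizer = new InkRecognizerContainer();

private void InkButton_Click(object sender, RoutedEventArgs e)
{
    // Accept pen, touch and mouse so handwriting works without a stylus.
    inkCanvas.InkPresenter.InputDeviceTypes =
        CoreInputDeviceTypes.Pen | CoreInputDeviceTypes.Touch | CoreInputDeviceTypes.Mouse;
    inkCanvas.Visibility = Visibility.Visible;
}

private async void RecognizeButton_Click(object sender, RoutedEventArgs e)
{
    var strokes = inkCanvas.InkPresenter.StrokeContainer;
    if (strokes.GetStrokes().Count == 0)
        return;

    // Recognize all strokes and keep the top candidate for each recognized word.
    var results = await recognizer.RecognizeAsync(strokes, InkRecognitionTarget.All);
    outputBox.Text = string.Join(" ", results.Select(r => r.GetTextCandidates()[0]));
    strokes.Clear();
}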
Related
I would like to know if it is possible, while running a WPF window application in Visual Studio, to wait for the user to click anywhere on the screen (not necessarily inside my application's window; for the purposes of my application the click would most likely occur inside a browser page) and then gather information about the click, such as which application's window was clicked, or the selector of the HTML element that was clicked. I know this question might be very confusing, but this is basically my last resort, since researching on the Internet hasn't helped me much. To give a better idea of what I'm after: it's like what the 'Extract Structured Data' activity does in UiPath. Oh, and I'm using C#, by the way.
You can try the external library called GlobalMouseHook.
This library allows you to tap keyboard and mouse input, and to detect and record their activity even when your application is inactive and running in the background.
Here is what you can do with this library:
Mouse coordinates
Mouse buttons clicked
Mouse drag actions
Mouse wheel scrolls
Key presses and releases
Special key states
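A minimal subscription sketch, assuming the Gma.System.MouseKeyHook NuGet package (the usual distribution of this library). Note that the hook only gives you screen coordinates; to find out which application's window was clicked you still need a Win32 call such as WindowFromPoint, and identifying the HTML element inside a browser page is a separate problem (UI Automation or browser tooling):

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
using Gma.System.MouseKeyHook;

class ClickWatcher : IDisposable
{
    [StructLayout(LayoutKind.Sequential)]
    struct POINT { public int X; public int Y; }

    [DllImport("user32.dll")]
    static extern IntPtr WindowFromPoint(POINT point);

    [DllImport("user32.dll")]
    static extern uint GetWindowThreadProcessId(IntPtr hWnd, out uint processId);

    readonly IKeyboardMouseEvents hook = Hook.GlobalEvents();

    public ClickWatcher()
    {
        hook.MouseDownExt += OnMouseDown;  // fires even when our app is in the background
    }

    void OnMouseDown(object sender, MouseEventExtArgs e)
    {
        // Find the window under the click and the process that owns it.
        IntPtr hwnd = WindowFromPoint(new POINT { X = e.X, Y = e.Y });
        GetWindowThreadProcessId(hwnd, out uint pid);
        Debug.WriteLine($"Click at ({e.X}, {e.Y}) in {Process.GetProcessById((int)pid).ProcessName}");
    }

    public void Dispose()
    {
        hook.MouseDownExt -= OnMouseDown;
        hook.Dispose();
    }
}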
Hope this helps.
I have a very simple scenario where I can focus an editable text box, and the cursor appears inside the field, but the keyboard will not show.
I have replicated this in a small sample app (Windows Phone 8.1 - Universal App). Very easy to recreate.
Create an 8.1 universal app. In the MainPage for phone add a text box and a button. The code for the button just sets the textbox to NOT read only. The default state of the textbox is ReadOnly.
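For concreteness, the button handler in the repro is nothing more than this (control names are assumed):

// Repro sketch: the Edit button only clears IsReadOnly on the TextBox.
private void EditButton_Click(object sender, RoutedEventArgs e)
{
    myTextBox.IsReadOnly = false;
}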
Working scenario: run the app, select Edit and then select the field. The cursor is present and the keyboard opens. Close the app.
Failure scenario:
Open the app and touch the read-only text field. Note: no cursor appears in the box, as it is read-only.
Select the Edit button, then tap the text field. The cursor is focused into the field, but the keyboard does not appear. I have a sample app with this behavior.
Any resolutions?
This is a known issue in Windows Phone 8.1 which is fixed in current builds of Windows 10 Mobile.
Unfortunately I don't see any good workarounds for this on Windows Phone 8.1 other than "don't do that". Instead of switching a TextBox into and out of IsReadOnly mode, try swapping between two TextBoxes (or a TextBox and a TextBlock).
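A sketch of that swap, assuming a TextBlock named displayBlock and a TextBox named editBox laid out in the same spot (names are mine, for illustration):

// Workaround sketch: swap a read-only TextBlock for an editable TextBox
// instead of toggling IsReadOnly on a single TextBox.
private void EditButton_Click(object sender, RoutedEventArgs e)
{
    editBox.Text = displayBlock.Text;
    displayBlock.Visibility = Visibility.Collapsed;
    editBox.Visibility = Visibility.Visible;
    // The user's next tap lands on a TextBox that was never read-only,
    // so the keyboard shows as normal.
}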
How can I change the on-screen keyboard's position in a Windows 8 Metro application? When it appears on focusing a text box, it always appears at the bottom of the screen. Can anyone help?
The on-screen keyboard appears automatically when a text box gets focus; you cannot control its position from the app. You can, however, listen for its Showing and Hiding events: http://code.msdn.microsoft.com/windowsapps/keyboard-events-sample-866ba41c
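You cannot move the keyboard, but you can react to it. A minimal sketch using the InputPane Showing/Hiding events covered in that sample; contentRoot is an assumed name for your root panel:

using Windows.UI.ViewManagement;
using Windows.UI.Xaml;

// Pad the layout by the space the keyboard occludes, and undo it on hide.
var inputPane = InputPane.GetForCurrentView();
inputPane.Showing += (s, e) =>
    contentRoot.Margin = new Thickness(0, 0, 0, e.OccludedRect.Height);
inputPane.Hiding += (s, e) =>
    contentRoot.Margin = new Thickness(0);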
How can I place/position an element just above the on-screen keyboard in a Windows Phone application (Silverlight, C#) so that it looks like part of the keyboard? I also need to account for the automatic scrolling Windows does when the keyboard would cover the focused element.
No, you cannot put any control above the on-screen keyboard on Windows Phone; the on-screen keyboard is a system object and you cannot access it.
I have a TagVisualizer covering the entire screen. Nested inside it is a fullscreen ScatterView. Inside one of my ScatterViewItems is a UserControl which lists n SurfaceTextBoxes. The UserControl and ScatterViewItems are always visible.
E.g.
<TagVisualizer Panel.ZIndex="1">
    <TagVisualizer.Definitions>
        ...
    </TagVisualizer.Definitions>
    <ScatterView Panel.ZIndex="2">
        <ScatterViewItem />
        <ScatterViewItem />
        <ScatterViewItem>
            <UserControl />
        </ScatterViewItem>
        ...
    </ScatterView>
</TagVisualizer>
The problem I have is that when no tags are being recognized on the table, touch input works and the keyboard pops up and functions correctly. However, when a tag on the table is being recognized, the keyboard no longer focuses on the text box and won't react to user touch input. I thought layering the content with ZIndex would solve this, but I was wrong...
What is frustrating is that my code works with the Surface Input Simulator tools (the tag and touch tool options, not mouse) but not on the device itself.
The tags are precisely cut out and do not appear as blobs / finger touches on the table.
Dragging and dropping of content from the ScatterView to LibraryBars etc. all works as intended.
Other buttons and touch elements are all working as intended.
The Source for each tag has IsHitTestVisible=false and contains no buttons etc. (only an Ellipse which shows me where the tag is when it is being recognized).
Only the keyboard functionality is not working as I thought it was intended.
I believe the reason behind this is that the tag captures the mouse context and won't let go, while the default keyboard functionality needs not touch input but a single mouse event. How would I go about making the keyboard work on a SurfaceTextBox, inside a UserControl, while a tag is being recognized?
I'm new to WPF and this problem is causing me a minor headache... I have looked at the SDK examples and I cannot find a solution for this. Is there something stupid I am doing, or something simple I am not doing or have missed?
Any help would be appreciated :)
After researching more about the keyboard structure of Surface 2.0 programs, and again following the Shopping Cart example: you have to use the Surface 2.0 keyboard, not the Tablet one (the one you can toggle on/off in the Control Panel). I made the foolish error of thinking that both keyboards operate the same... nope.
The Surface 2.0 keyboard accepts touch input and works with TagVisualizers. This is because the keyboard is bound to specific SurfaceTextBoxes and takes touch input even when the mouse has not clicked on it.
The default Tablet (Windows) keyboard does not work with touch input unless a mouse input has activated it (when there is no other input, the last and current active touch contact is promoted to a mouse). This keyboard always shows when no other keyboard is specified and it is enabled in the Control Panel. It is tedious to use with TagVisualizers: when there is no mouse/touch input but there is a recognized tag, the tag will steal the contact and keep it until you take the tag off the table.
This problem does not exist with the Surface 2.0 keyboard. When you develop against the Surface 2.0 keyboard, the Tablet keyboard will take its place (annoyingly) until you run your software via the Surface Shell...
To develop your program and see the Surface 2.0 keyboard, you will need a development environment set up on Surface hardware, e.g. a Samsung SUR40.
Testing in Windows mode: http://msdn.microsoft.com/en-us/library/ff727875.aspx
Testing/debugging for Surface: http://msdn.microsoft.com/en-us/library/ff727840.aspx
Surface Keyboard: http://msdn.microsoft.com/en-us/library/ff727766.aspx
P.S. I believe this is correct... I am still researching this issue, and if I find a way around it I'll update my answer...
Edit:
To extend this answer: you should code the application to use the Surface Shell. Via the launcher, you should launch your application from the selection of available applications, and/or set the shell to single-application mode if you're programming for a store environment.