Place element just above on-screen keyboard in WPF - C#

How can I place/position an element just above the on-screen keyboard in a Windows Phone application (Silverlight, C#) so that it looks like part of the keyboard? I also need to account for the automatic scrolling Windows performs when the keyboard would cover the focused element.

No, you cannot put any control above the on-screen keyboard in Windows Phone; the on-screen keyboard is a system object, and you cannot access it.

Related

How can I add a custom toolbar to the touch keyboard in UWP apps?

I have an input form with multiple TextBox elements, and I would like to add a "Next" button to the touch keyboard that allows the user to advance to the next TextBox.
I've tried experimenting with a CommandBar as BottomAppBar (hiding/showing it based on touch keyboard visibility), but I don't think it really fits the purpose. What's the right way to do this?
It is not possible to customize the software keyboard in a UWP app.
I think you've made a good decision to use a CommandBar as BottomAppBar and to hide/show it based on touch keyboard visibility. There is no facility for defining custom keyboard layouts in an application, so a bottom CommandBar is the best choice in this scenario under the current conditions.
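A minimal sketch of that approach, assuming hypothetical control names (`NextBar`, `FirstBox`, `SecondBox`): listen to the `InputPane`'s `Showing`/`Hiding` events to toggle the bar, and move focus from the button's click handler.

```csharp
// Sketch: show a CommandBar with a "Next" button while the touch keyboard
// is visible, and advance focus to the next TextBox when it is tapped.
// Control names (NextBar, FirstBox, SecondBox) are hypothetical.
using Windows.UI.ViewManagement;
using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;

public sealed partial class FormPage : Page
{
    public FormPage()
    {
        InitializeComponent();

        var inputPane = InputPane.GetForCurrentView();
        inputPane.Showing += (s, e) => NextBar.Visibility = Visibility.Visible;
        inputPane.Hiding += (s, e) => NextBar.Visibility = Visibility.Collapsed;
    }

    // Wired to the AppBarButton's Click in XAML.
    private void OnNextClicked(object sender, RoutedEventArgs e)
    {
        if (FirstBox.FocusState != FocusState.Unfocused)
            SecondBox.Focus(FocusState.Programmatic);
    }
}
```

Because the bar only appears while the keyboard is up, it reads to the user as an extension of the keyboard even though it is ordinary app UI.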

How to display on-screen number pad instead of full keyboard in WPF desktop application?

The task is simple. I have a standard desktop Windows 10 application with a TextBox control. When it gets focus, I want only numbers to be allowed. Filtering the keyboard input is easy, but how do I make the application display the number pad on touch devices?
I've read http://brianlagunas.com/showing-windows-8-touch-keyboard-wpf/ .
The described solution is of no use to me; my application depends on touch events, so I have no problem showing the on-screen keyboard manually. However, I have a numeric input field, and it is very user-unfriendly to show an on-screen keyboard on which the user can't even enter numbers directly.
I can't build this app on UWP because it depends heavily on unmanaged, architecture-dependent DLLs.
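One approach worth sketching (an assumption, not a verified answer from this thread) is WPF's `InputScope` property, which hints to the Windows 10 touch keyboard that the field is numeric, so it can present a number-pad layout. `AmountBox` is a hypothetical TextBox name here:

```csharp
// Sketch: hint the Windows touch keyboard to show a numeric layout
// and filter non-digit input. Assumes a WPF TextBox passed in by the caller.
using System.Text.RegularExpressions;
using System.Windows.Controls;
using System.Windows.Input;

public partial class MainWindow
{
    private void SetupNumericBox(TextBox box)
    {
        // InputScopeNameValue.Number asks the touch keyboard for a number pad;
        // whether it honors the hint depends on the Windows version.
        var scope = new InputScope();
        scope.Names.Add(new InputScopeName(InputScopeNameValue.Number));
        box.InputScope = scope;

        // Reject any non-digit keyboard input at the source.
        box.PreviewTextInput += (s, e) => e.Handled = !Regex.IsMatch(e.Text, "^[0-9]+$");
    }
}
```

The same hint can be set in XAML with `InputScope="Number"` on the TextBox; the filtering still has to be done in code either way.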

UWP - Programmatically bring up the on-screen keyboard in ink mode

For my UWP app (in C#), I'd like to have a button so that when the user presses it, the on-screen keyboard opens in inking mode (i.e. where the user can start handwriting and have it recognized and turned into typed text input).
If I have a text box and put focus on it, the on-screen keyboard opens, but in text mode. What I want is for it to come up in inking mode right from the start, instead of bringing it up in text mode and having the user switch to ink mode.
Does anyone know how I can go about doing this?
You need to use one of the supported components of the Windows Ink platform.
There is also a sample, but keep in mind that much of its code is commented out.
If you want to tie this to a button press, you can, for example, move focus to that component when the button is pressed.
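A sketch of the button-driven approach the answer hints at, assuming Windows 10 1803 or later, where `TextBox` exposes a `HandwritingView` (the control names `InkBox` and `OnInkButtonClick` are hypothetical):

```csharp
// Sketch: open the handwriting (ink) input panel for a TextBox when a
// button is pressed, instead of the text-mode keyboard. Assumes the
// HandwritingView API is available (Windows 10 1803+).
using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;

public sealed partial class InkPage : Page
{
    public InkPage()
    {
        InitializeComponent();
        InkBox.IsHandwritingViewEnabled = true;   // allow the ink panel at all
    }

    private void OnInkButtonClick(object sender, RoutedEventArgs e)
    {
        // Focus the box, then ask its handwriting view to open directly.
        InkBox.Focus(FocusState.Programmatic);
        InkBox.HandwritingView.TryOpen();
    }
}
```

`TryOpen` returns a bool, so you can fall back to the regular keyboard if the ink panel is unavailable on the device.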

Windows 8 metro app keyboard position change

How do I change the on-screen keyboard position in a Windows 8 Metro application? When it appears after focusing a text box, it always appears at the bottom of the screen. Can anyone help?
The on-screen keyboard appears automatically when a text box gets focus; you cannot control its position from the app. You can, however, listen for its Showing and Hiding events: http://code.msdn.microsoft.com/windowsapps/keyboard-events-sample-866ba41c?
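You cannot move the keyboard itself, but a sketch of using those Showing/Hiding events to shift your own layout out from under it (`RootPanel` is a hypothetical name for the page's root element; this belongs in the page's code-behind):

```csharp
// Sketch: react to the on-screen keyboard's Showing/Hiding events and
// shift the page content by the keyboard's height so the focused text box
// stays visible. RootPanel is a hypothetical root element name.
using Windows.UI.ViewManagement;
using Windows.UI.Xaml.Media;

var inputPane = InputPane.GetForCurrentView();

inputPane.Showing += (pane, args) =>
{
    // args.OccludedRect reports how much of the app the keyboard covers.
    RootPanel.RenderTransform = new TranslateTransform { Y = -args.OccludedRect.Height };
    args.EnsuredFocusedElementInView = true;   // suppress the default auto-scroll
};

inputPane.Hiding += (pane, args) =>
{
    RootPanel.RenderTransform = null;          // restore the layout
};
```

Setting `EnsuredFocusedElementInView` tells the framework you have handled visibility yourself, so the system does not also pan the app.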

TagVisualizer - when active - blocks Textbox touch Keyboard focus inside a ScatterViewItem

I have a TagVisualizer covering the entire screen. Nested inside it is a fullscreen ScatterView. Inside one of my ScatterViewItems is a UserControl which lists n SurfaceTextBoxes. The UserControl and ScatterViewItems are always visible.
E.g.
<TagVisualizer Panel.ZIndex="1">
    <TagVisualizer.Definitions>
        ...
    </TagVisualizer.Definitions>
    <ScatterView Panel.ZIndex="2">
        <ScatterViewItem />
        <ScatterViewItem />
        <ScatterViewItem>
            <UserControl />
        </ScatterViewItem>
        ...
    </ScatterView>
</TagVisualizer>
The problem I have: when no tags are being recognized on the table, touch input works and the keyboard pops up and functions correctly. However, when a tag on the table is being recognized, the keyboard no longer focuses on the text box and won't react to user touch input. I thought layering the content's ZIndex would solve this, but I was wrong...
What is frustrating is that my code works with the Surface Input Simulator tools but not on the device itself (the tag and touch tool options, not mouse).
- The tags are precisely cut out and do not appear as blobs / finger touches on the table.
- Dragging and dropping content from the ScatterView to LibraryBars etc. all works as intended.
- Other buttons and touch elements are all working as intended.
- The Source for each tag has IsHitTestVisible="false" and contains no buttons etc. (only an Ellipse which shows me where the tag is when it is recognized).
- Only the keyboard functionality is not working as I thought it was intended.
I believe the reason behind this is that the tag captures the mouse context and won't let go, and the default keyboard functionality needs not touch input but a single mouse event. How would I go about making the keyboard work on a SurfaceTextBox, inside a UserControl, while a tag is being recognized?
I'm new to WPF and this problem is causing me a minor headache... I have looked at the SDK examples and cannot find a solution. Is there something stupid I am doing, or something simple I am not doing or have missed?
Any help would be appreciated :)
After researching more about the keyboard structure of Surface 2.0 programs, and again following the Shopping Cart example: you have to use the Surface 2.0 keyboard, not the Tablet one (the one you can toggle on/off in Control Panel). I made the foolish error of thinking both keyboards operate the same... nope.
The Surface 2.0 keyboard accepts touch input and works with TagVisualizers. This is because the keyboard is bound to specific SurfaceTextBoxes and takes touch input even when the mouse has not clicked on it.
The default Windows Tablet keyboard does not work with touch input unless mouse input has touched it (when there is no other input, the last and current active touch input is promoted to a mouse). This keyboard always shows when no other keyboard is specified and it is enabled in Control Panel. It is tedious to use with TagVisualizers: when there is no mouse/touch input but there is a recognized tag, the tag steals the contact until you take it off.
This problem does not exist with the Surface 2.0 keyboard. When you develop with the Surface 2.0 keyboard, the Tablet keyboard will take its place (annoyingly) until you run your software via the Surface Shell...
To develop your program and see the Surface 2.0 keyboard, you will need a development environment set up for, and on, Surface hardware, e.g. developing on a Samsung SUR40.
http://msdn.microsoft.com/en-us/library/ff727875.aspx Testing in Windows mode
http://msdn.microsoft.com/en-us/library/ff727840.aspx Testing/debugging for surface
http://msdn.microsoft.com/en-us/library/ff727766.aspx Surface Keyboard
P.S. I believe this is correct... I am still researching this issue, and if I find a way around it, I'll update my answer...
Edit:
To extend this answer: code the application to use the Surface Shell. From the launcher, either launch your application from the selection of available applications, and/or set the shell to single-application mode if you're programming for a store environment.
