When clicking on a text field, automatically open up the Windows 10 touch keyboard if the device is in 'Tablet Mode'.
How to Know device is in 'Tablet Mode'?
How to open keyboard in WPF?
I think this is a misunderstanding. WPF doesn't run in tablet mode at all and doesn't have an on-screen keyboard. It's strictly a desktop-oriented technology. I'm assuming that you are building an application in XAML and that you want the same application to run in both places without having to create two separate projects.
WPF is a technology for running XAML-based applications in desktop mode, whereas a different technology, called WinRT, is required to run XAML applications in tablet mode on Windows 10.
If you want to build a single application that is capable of running in both modes, you need to consider adopting an MVVM "framework" to help you out with this task.
The purpose of using MVVM in this context is that it lets you keep your cross-platform business logic in classes called "ViewModels", while the UI-specific code lives in a XAML file, with a separate XAML file created for each platform you support (for example WPF, Silverlight, Windows Phone, WinRT).
Here are some MVVM frameworks that can help you with this:
Caliburn.Micro: http://caliburnmicro.com/documentation/windows-runtime
MvvmCross: https://mvvmcross.com/
Catel: https://catel.codeplex.com/
Simple MVVM: https://simplemvvmtoolkit.codeplex.com/
MVVMLight: http://www.mvvmlight.net/
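Whichever framework you pick, the core idea is the same: the ViewModel contains no platform-specific types, so it can be compiled into a shared library and bound to from a WPF view and a WinRT view alike. A minimal sketch (the class and property names here are made up for illustration):

```csharp
using System.ComponentModel;
using System.Runtime.CompilerServices;

// Platform-neutral ViewModel: no WPF or WinRT types, so the same class
// can be referenced from a WPF project and a WinRT project alike.
public class ScoreViewModel : INotifyPropertyChanged
{
    private string _playerName = "";

    public string PlayerName
    {
        get { return _playerName; }
        set
        {
            if (_playerName != value)
            {
                _playerName = value;
                OnPropertyChanged();   // notifies whichever XAML view is bound
            }
        }
    }

    public event PropertyChangedEventHandler PropertyChanged;

    protected void OnPropertyChanged([CallerMemberName] string name = null)
    {
        PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(name));
    }
}
```

Each platform then supplies its own XAML view that binds to `PlayerName`; only the views differ per platform.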
Related
I'm building a set of small internal apps for our business; the majority are web-based tools, so that all our staff have easy access to them. Previously I built small Windows applications, which are in use currently. However, I'm looking to upgrade these to Universal Apps, as we can better distribute future applications through our Microsoft Business Store.
Currently, when placing a WebView into the UWP app, the app loads and functions as expected in terms of loading the WebView. However, I cannot seem to find a way to implement Back/Forward navigation (previously this was a toggled option in the properties group) when a user navigates using the Back/Forward buttons on their mouse, etc. Any pointers would be greatly appreciated. No doubt it's something quite obvious I haven't spotted.
I was expecting basic back/forward navigation to work out of the box (as has been the case in my experience when developing for other platforms).
The app I'm currently working on uses UWP with XAML and C#, and currently has the WebView implemented using:
<WebView x:Name="webView" Source="https://app.domain.com"/>
within MainPage.xaml and no further modifications made to this app as it stands.
I regularly use SO as a guest to find answers to questions but I couldn't see a direct answer to this scenario.
Thanks
The WebView class has built-in GoBack and GoForward methods (plus CanGoBack/CanGoForward properties). You can add some buttons and call them in the buttons' click events.
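A minimal sketch of the idea, reusing the `webView` name from the question (the button names and layout are made up for illustration):

```xml
<!-- MainPage.xaml: two navigation buttons above the existing WebView -->
<StackPanel>
    <StackPanel Orientation="Horizontal">
        <Button x:Name="backButton" Content="Back" Click="BackButton_Click"/>
        <Button x:Name="forwardButton" Content="Forward" Click="ForwardButton_Click"/>
    </StackPanel>
    <WebView x:Name="webView" Source="https://app.domain.com" Height="600"/>
</StackPanel>
```

```csharp
// MainPage.xaml.cs: guard each call so an empty history doesn't throw
private void BackButton_Click(object sender, RoutedEventArgs e)
{
    if (webView.CanGoBack)
        webView.GoBack();
}

private void ForwardButton_Click(object sender, RoutedEventArgs e)
{
    if (webView.CanGoForward)
        webView.GoForward();
}
```

To react to the mouse's Back/Forward buttons as well, you would additionally need to handle pointer input on the page and route those button presses to the same two methods.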
I'm going to develop a desktop application which is a sort of background service listening for events. The requirements for this application are:
tray icon in background always visible
show desktop notifications when an event occurs (for example, when you receive an email you are notified with a popup)
The main problem is that the application must run on Windows, macOS, and Linux.
I decided to develop the main business logic in .NET Core, but I have a few problems with the front-end part; with Windows.Forms.NotifyIcon it's easy, but obviously this solution isn't valid for the other platforms.
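For reference, the Windows-only baseline with Windows.Forms.NotifyIcon looks roughly like this (all names and texts are illustrative):

```csharp
using System;
using System.Windows.Forms;

// Windows-only sketch: tray icon plus a balloon notification.
class TrayApp
{
    [STAThread]
    static void Main()
    {
        var icon = new NotifyIcon
        {
            Icon = System.Drawing.SystemIcons.Information,
            Visible = true,
            Text = "Event listener"
        };

        // Balloon notification, e.g. when a new email event arrives.
        icon.ShowBalloonTip(3000, "New event",
            "You have received an email.", ToolTipIcon.Info);

        Application.Run();   // keep the message loop alive for the tray icon
    }
}
```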
So I found the GtkSharp project; in particular, the StatusIcon object seems to do exactly what I need, except for the notification part. What I need is a sort of balloon tip, which is already part of Windows.Forms.NotifyIcon but not of Gtk. Does anyone know of a library or component that allows me to meet my requirements in a cross-platform context?
[UPDATE]
I asked directly in the GtkSharp channel and they told me that it is not possible to do what I need with GtkSharp (moreover, StatusIcon is deprecated and has been removed in GTK 4).
I found another interesting possible way: it seems that the Electron framework supports all my requirements. It is a different approach, but it covers everything I need and works well on macOS, Windows, and Linux.
A friend and I are working on a relatively big project, which is basically a darts scoreboard and tournament-scheme application with stats tracking, etc. The problem is that the tournament-scheme part has been done in Windows Forms, and that is a very large part of the project, but the scoreboard part has to be done in WPF because that UI needs a little style tweaking. The question is how to get these two working in conjunction: starting the whole thing from the WinForms end and having it exchange data with the scoreboard (the WPF app). I need very basic things, like passing a string from a label in the form to somewhere in the WPF window. I would move the tournament scheme to WPF, but we just don't have enough time at this point.
You can use controls created in a WPF application in your Windows Forms application; for an example, see Walkthrough: Hosting a WPF Composite Control in Windows Forms.
Or you can use a control created in Windows Forms in your WPF application, for example:
<WindowsFormsHost xmlns:wf="clr-namespace:System.Windows.Forms;assembly=System.Windows.Forms">
    <wf:MaskedTextBox x:Name="mtbDate" Mask="00/00/0000"/>
</WindowsFormsHost>
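The first direction (which matches starting from the WinForms end, as in the question) uses ElementHost from WindowsFormsIntegration; a sketch, where `ScoreboardControl` is a hypothetical WPF UserControl from the WPF project:

```csharp
using System.Windows.Forms;
using System.Windows.Forms.Integration;

// Windows Forms side: host a WPF control inside a form via ElementHost.
public partial class MainForm : Form
{
    private ScoreboardControl scoreboard;   // hypothetical WPF UserControl

    public MainForm()
    {
        InitializeComponent();

        var host = new ElementHost { Dock = DockStyle.Fill };
        scoreboard = new ScoreboardControl();
        host.Child = scoreboard;            // WPF control lives inside the form
        Controls.Add(host);
    }
}
```

Passing a string from a WinForms label into the WPF side is then just a matter of exposing a public property (or method) on the WPF control and setting it from the form, e.g. `scoreboard.PlayerName = lblPlayer.Text;` (property name illustrative).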
Sorry, I don't have much experience in programming or in English, so my question might be awkward. It relates to Windows 10 applications for classic desktops and touch screens.
I want to add a nice-looking touch-oriented Modern-style UI to my non-Visual Studio application. It is possible to call an external .NET assembly from my app and use its public constructors, methods, properties, and handle its events.
So I've created a new WPF User Control Library project in Visual Studio, replaced the default UserControl with a new Window, and built a DLL. Now I can use it in my app as described above. Everything seems to work fine.
Here are the questions:
Am I doing it right? Is building a DLL from a WPF project the only way to make a UI DLL that can be used in external applications?
How can I make the UI look and act like a Windows Universal App (at least I want it to respond to screen rotation)? As I understand it, I cannot build a DLL from Universal App projects, so I have to create a WPF window, make it full screen, and add the appropriate UI elements.
Is it possible to use Windows 10 contracts and extensions in such a UI DLL?
If you want to share your code between different programs, you have to make it a shared library, which on Windows is a DLL. I don't know the answers to the other two questions.
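As a sketch of that approach: the class library exposes a public Window type that the external host can instantiate through its public constructor, as the question describes. All names here are illustrative, and the full-screen styling is only a rough stand-in for a "Modern" look:

```csharp
using System.Windows;

namespace MyUiLibrary
{
    // Public window type the external (non-Visual Studio) host application
    // can create and show via the assembly's public API.
    public class TouchWindow : Window
    {
        public TouchWindow()
        {
            Title = "Touch UI";
            WindowStyle = WindowStyle.None;     // borderless, full-screen look
            WindowState = WindowState.Maximized;
        }

        // Example public method the host can call to push data into the UI.
        public void ShowStatus(string text)
        {
            Title = text;
        }
    }
}
```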
I'm an automation test engineer performing automation using SilkTest.
Is there any way to get the active Metro window's elements (e.g. name, style, coordinates, sub-elements)?
The latest SilkTest version, 13.5, does not see Metro objects at all :(.
I could not figure out from the comments whether you are looking at options other than SilkTest. If you are, then have a look at RIATest, which does support Metro applications.
Starting from version 6.0, a number of features in RIATest are specifically targeted at Metro-style application automation, particularly:
Ability to stay on top of Metro UI to allow you to simultaneously see your Metro application and RIATest IDE and minimize switching from Desktop to Metro screens when automating a Metro application.
Recording of actions performed on native Windows GUI elements (including Metro GUI). The recorder analyses your script code and reuses your variable names to generate cleaner recorded scripts, similar to how you would hand-code an automation script.
Seamless workaround for bugs like this in Windows UI Automation implementation in Metro UI.
Disclaimer: I work for Cogitek, the RIATest company.
From the SilkTest 13.5 release notes