I'm implementing a Windows 8.1 application and I'd like to show menus on edge gestures: Menu A on the top edge gesture and Menu B on the bottom. I found out that this is probably not possible.
In the following code
void EdgeGesture_Completed(object sender, EdgeGestureEventArgs e)
{
if (e.Kind == EdgeGestureKind.Touch)
{
Scenario1OutputText.Text = "Invoked with touch.";
}
else if (e.Kind == EdgeGestureKind.Keyboard)
{
Scenario1OutputText.Text = "Invoked with keyboard.";
}
else if (e.Kind == EdgeGestureKind.Mouse)
{
Scenario1OutputText.Text = "Invoked with right-click.";
}
}
we can recognize whether the gesture was invoked by touch, keyboard, or right-click, but EdgeGestureEventArgs doesn't contain any other information.
Do you have any idea how to distinguish edge gestures, i.e. whether a gesture came from the top or the bottom edge?
The standard behaviour is to show both the top and the bottom together. If you use the built-in AppBar control then you'll get this automatically.
If you want to separate the top and bottom app bars then it's trickier and you'll need to implement that yourself. There isn't any direct way to tell whether the edge gesture was triggered from the top or the bottom, but you can track pointer events, and if the EdgeGestureKind is Touch then you can guess based on the pointer location.
There is no difference if the edge gesture was triggered by keyboard or mouse, since those gestures aren't location dependent.
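For the touch case, a minimal sketch of that guessing approach might look like this (the field and handler names are illustrative, and it assumes you register a PointerMoved handler on your root element):
Point _lastPointerPosition; // last known pointer position, updated below

// Registered on the root element, e.g. rootGrid.PointerMoved += Root_PointerMoved;
void Root_PointerMoved(object sender, PointerRoutedEventArgs e)
{
    _lastPointerPosition = e.GetCurrentPoint(null).Position;
}

void EdgeGesture_Completed(object sender, EdgeGestureEventArgs e)
{
    if (e.Kind == EdgeGestureKind.Touch)
    {
        // Guess the originating edge from the last known pointer position.
        bool fromTop = _lastPointerPosition.Y < Window.Current.Bounds.Height / 2;
        // Show Menu A for a top gesture, Menu B for a bottom gesture.
    }
}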
Also note that the standard appbar and charms behaviour is different in the Windows 10 Technical Preview than on Windows 8.1, so if you implement it yourself your app's behaviour may end up farther from standard than you intend.
I am building a game for Windows PCs and I need to change the cursor icon when the user is over clickable UI elements. I have found this command Cursor.SetCursor(texture2d, vector2) but it has some lag.
When the mouse is over the UI elements it changes but it has some delay which is really annoying (This is what other users said about that anyway).
After some reading, I learned that Unity basically just changes the cursor icon at the software level, i.e. it hides the cursor, displays my image, and makes it follow the cursor position.
My question is: how can I change the icon at the hardware level, again, in Windows builds only?
When I searched for "changing mouse cursor in C#", I found the Windows.Forms option (which doesn't work in Unity) and some C++ code, but it wasn't complete (only method names), and I don't know how to call it from C#…
SetCursor does not work well in every Windows app, but it is the right way to do it.
Cursor.SetCursor(cursorTexture, hotSpot, cursorMode);
Another way is to fake it a little by hiding the mouse cursor and drawing a GUI cursor for your desired case. You can add more conditions for each event you'd like to customize.
var OnMouseEnterCursor : Texture2D;
var cursorSizeX : int = 32; // your cursor width
var cursorSizeY : int = 32; // your cursor height
var MouseEnterCond = false;

function OnMouseEnter()
{
    MouseEnterCond = true;
    Screen.showCursor = false; // hide the real cursor while we draw our own
}

function OnMouseExit()
{
    MouseEnterCond = false;
    Screen.showCursor = true;
}

function OnGUI()
{
    if (MouseEnterCond)
    {
        // Draw the custom cursor centered on the mouse position
        // (GUI y-coordinates run top-down, so flip the y-axis).
        GUI.DrawTexture(Rect(Input.mousePosition.x - cursorSizeX / 2,
                             (Screen.height - Input.mousePosition.y) - cursorSizeY / 2,
                             cursorSizeX, cursorSizeY),
                        OnMouseEnterCursor);
    }
}
If there is a way to enforce hardware cursors in Unity, the world has not found it yet.
Unity's own documentation explains it pretty well:
A hardware cursor is preferred on supported platforms through the function you have already found (remember to set the cursor mode to CursorMode.Auto), but it will automatically, and uncontrollably, fall back to a software cursor on unsupported platforms and resolutions.
An important thing to note is that there is a special field (which the reference only peripherally mentioned) in the project settings/player settings menu called the "Default Cursor".
This is the only supported hardware cursor on e.g. windows store apps.
And remember to set your textures to type "Cursor" in their import settings.
Finally, keep in mind that Windows only supports the 32x32 px size for hardware cursors. This may force Unity's hand when selecting the render type.
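As a concrete illustration of the hardware-preferred path, here is a minimal sketch (the component and field names are made up; the texture is assumed to be imported with type "Cursor" at 32x32):
using UnityEngine;

public class CursorSwitcher : MonoBehaviour
{
    [SerializeField] Texture2D cursorTexture; // imported with Texture Type "Cursor", 32x32

    void OnMouseEnter()
    {
        // CursorMode.Auto prefers the hardware cursor and silently falls
        // back to a software cursor where hardware cursors are unsupported.
        Cursor.SetCursor(cursorTexture, Vector2.zero, CursorMode.Auto);
    }

    void OnMouseExit()
    {
        // Passing null restores the default cursor.
        Cursor.SetCursor(null, Vector2.zero, CursorMode.Auto);
    }
}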
The Default Cursor setting in the player settings worked in the editor, but the cursor did not show up in a build, and it also limits the texture size to 32x32. I resolved this by using the Cursor.SetCursor method in a script to change the cursor.
[SerializeField] Texture2D cursor;

void Start()
{
    // ForceSoftware makes the custom cursor show up reliably in builds.
    Cursor.SetCursor(cursor, Vector2.zero, CursorMode.ForceSoftware);
}
I have two UWP apps, and after testing them with Continuum I noticed that the OS app bar (the bar with the Start button) at the bottom of the screen (it can be at any of the four edges, of course) was covering part of my app.
Now, I'm already using ApplicationView.GetForCurrentView().SetDesiredBoundsMode(ApplicationViewBoundsMode.UseVisible) before calling Window.Current.Activate(), but that doesn't seem to solve the issue.
1) Why is it that setting the DesiredBoundsMode property doesn't seem to work here? Shouldn't that automatically resize the window content to the visible bounds (i.e. excluding system overlays like the navigation bar or the app bar)?
The workaround I'm using for now on Windows 10 Mobile devices is to subscribe to the VisibleBoundsChanged event and then manually adjust the margins of my Window.Current.Content item to make sure it doesn't show anything behind covered areas of the screen.
Basically, I use the Window.Current.Bounds property and the ApplicationView.VisibleBounds property to calculate the occluded areas on the different edges of the app window, and increase the margins from there.
2) Is there a proper/better way to do this?
I mean, I'm quite sure there's another method that should be used to avoid this issue (considering there are tons of different situations like Continuum, navigation bar etc... that I don't think are supposed to be manually handled one by one).
Thank you for your help!
Subscribe to the VisibleBoundsChanged event; that is the best solution I have found. You can also control how the system overlays behave in full-screen mode:
var curr = ApplicationView.GetForCurrentView();
if (curr.IsFullScreenMode)
{
    // Launch full screen and show only a minimal system overlay on demand.
    ApplicationView.PreferredLaunchWindowingMode = ApplicationViewWindowingMode.FullScreen;
    curr.FullScreenSystemOverlayMode = FullScreenSystemOverlayMode.Minimal;
}
else
{
    ApplicationView.PreferredLaunchWindowingMode = ApplicationViewWindowingMode.Auto;
    curr.FullScreenSystemOverlayMode = FullScreenSystemOverlayMode.Standard;
}
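For completeness, a hedged sketch of the VisibleBoundsChanged workaround described in the question (the method names are illustrative; it compares ApplicationView.VisibleBounds with Window.Current.Bounds and pads the root content accordingly):
void SubscribeToVisibleBounds()
{
    var view = ApplicationView.GetForCurrentView();
    view.VisibleBoundsChanged += (sender, args) => AdjustContentMargins(sender);
    AdjustContentMargins(view);
}

void AdjustContentMargins(ApplicationView view)
{
    Rect bounds = Window.Current.Bounds;
    Rect visible = view.VisibleBounds;

    // Pad each edge by the amount the system overlays occlude it.
    var root = Window.Current.Content as FrameworkElement;
    if (root != null)
    {
        root.Margin = new Thickness(
            visible.Left - bounds.Left,
            visible.Top - bounds.Top,
            bounds.Right - visible.Right,
            bounds.Bottom - visible.Bottom);
    }
}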
I have a GridView in a Windows Store project that contains some big squares, and inside those I have a list of user images. When I tap one of those images, a flyout appears on the right showing some information.
The problem is that my GridView extends to the edges of the screen and beyond, and when that happens I get this situation: I press a user item near the edge of the screen and the flyout appears on the left.
My flyout placement is set to Right, and I'm guessing that since the element I press is near the edge, it follows the fallback order, which is Right > Left > Top > Bottom.
What I would like to know is how to detect when this happens, so I can adjust my flyout position, or find another viable alternative :)
After searching through properties and Dependency properties on Flyout and FlyoutBase, I haven't found a way to simply get the actual placement of the Flyout (which is unfortunate because I think that can be important, as we see in your case). Perhaps you can try what was proposed here by implementing a method that compares the desired size of the Flyout with the available space.
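A rough sketch of that size-versus-space comparison might look like this (FitsOnRight and flyoutWidth are hypothetical names; the idea is simply to measure the free space before choosing a placement):
// Hypothetical pre-check: is there enough room for the flyout on the right?
bool FitsOnRight(FrameworkElement target, double flyoutWidth)
{
    // The target's right edge in window coordinates.
    GeneralTransform transform = target.TransformToVisual(Window.Current.Content);
    Point topLeft = transform.TransformPoint(new Point(0, 0));
    double rightEdge = topLeft.X + target.ActualWidth;

    // Compare the remaining space with the flyout's expected width.
    return Window.Current.Bounds.Width - rightEdge >= flyoutWidth;
}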
You can subscribe to FlyOut.Opened event and compare absolute coordinates of the flyout and the element you're showing it at. Here is an example for top/bottom flyout placement (easily extendable to check for left/right as well):
private void FlyOut_Opened(object sender, object e)
{
    // flyOut and showAtElement are assumed fields: the Flyout itself and
    // the FrameworkElement it was shown at.
    GeneralTransform flyOutTransform =
        flyOut.Content.TransformToVisual(Window.Current.Content);
    Point flyOutPosition =
        flyOutTransform.TransformPoint(new Point(0, 0));

    GeneralTransform showAtElementTransform =
        showAtElement.TransformToVisual(Window.Current.Content);
    Point showAtElementPosition =
        showAtElementTransform.TransformPoint(new Point(0, 0));

    if (flyOutPosition.Y < showAtElementPosition.Y)
    {
        // top
    }
    else
    {
        // bottom
    }
}
I am developing a Windows 10 universal app using C#. My question refers to the case when a user runs my application on a PC/desktop.
I want my app to be full screen; this can be done by calling
Windows.UI.ViewManagement.ApplicationView.TryEnterFullScreenMode()
Now we are in full screen, but when the user moves the mouse pointer to the lower edge of the screen, the Windows taskbar appears, and I don't want it to appear.
I've tried setting the FullScreenSystemOverlayMode property of the corresponding ApplicationView to FullScreenSystemOverlayMode.Minimal, along with setting SuppressSystemOverlays to true.
Neither has helped. What can I do?
When I do something like that I use:
var view = ApplicationView.GetForCurrentView();
view.SuppressSystemOverlays = true;
// Minimal shows only a minimal overlay affordance instead of the full taskbar.
view.FullScreenSystemOverlayMode = FullScreenSystemOverlayMode.Minimal;
view.TryEnterFullScreenMode();
Maybe it's just semantics, but sometimes those semantics matter.
I am a complete beginner at C# programming.
The previous program I made gets touch input as a click event on the Surface Pro, so it cannot receive multi-touch input.
How can I get multi-touch input (positions on the screen) on the Windows Surface Pro when writing the application in C#?
I heard that I can use the Touch class, but I don't know how to use it…
I have been struggling with this for two weeks but could not make any progress.
Can anyone explain specifically how to use the Touch class? Or does anyone have other suggestions for getting touch input values from the Surface Pro touch panel?
Assuming you are programming in WPF:
Register for the UIElement.TouchDown event. For example, in your MainWindow's constructor, add
this.TouchDown += MainWindow_TouchDown;
If multiple fingers are touching the screen at the same time, the TouchDown event is fired once for each finger.
You can get the position of the touch (relative to the screen) from the TouchEventArgs that is passed to the event handler.
void MainWindow_TouchDown(object sender, TouchEventArgs e)
{
    // Relative to the top-left corner of the window.
    TouchPoint pointFromWindow = e.GetTouchPoint(this);

    // Translate to coordinates relative to the top-left corner of the screen.
    Point locationFromScreen = this.PointToScreen(
        new Point(pointFromWindow.Position.X, pointFromWindow.Position.Y));
}
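If you need to track each finger separately, here is a hedged sketch using e.TouchDevice.Id as the key (it assumes the same MainWindow setup and additionally handles TouchMove and TouchUp; it needs System.Collections.Generic and System.Windows.Input):
// Current position of each active finger, keyed by touch device id.
readonly Dictionary<int, Point> activeTouches = new Dictionary<int, Point>();

public MainWindow()
{
    InitializeComponent();
    this.TouchDown += (s, e) => activeTouches[e.TouchDevice.Id] = e.GetTouchPoint(this).Position;
    this.TouchMove += (s, e) => activeTouches[e.TouchDevice.Id] = e.GetTouchPoint(this).Position;
    this.TouchUp += (s, e) => activeTouches.Remove(e.TouchDevice.Id);
}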
Note: .NET 4.0 is required; Windows 8 shipped with .NET 4.0 pre-installed. If you are running on Windows 7, you have to ensure it is installed.