I am developing an FPS game using the FPSController that comes with Standard Assets, and I am adding gamepad controls for the mobile version of the game.
After configuring all of the controller's inputs, I want to replace the following code:
if (!m_Jump)
{
    m_Jump = CrossPlatformInputManager.GetButtonDown("Jump");
}
with this:
if (Application.platform == RuntimePlatform.Android)
{
    if (!m_Jump)
    {
        m_Jump = Input.GetButtonDown("Joystick A");
    }
}
else
{
    if (!m_Jump)
    {
        m_Jump = CrossPlatformInputManager.GetButtonDown("Jump");
    }
}
What is the best way to modify the FPS Controller scripts and add these inputs?
Thanks in advance.
You can work around this in a way that does not require you to edit or add any additional code.
Under Edit > Project Settings > Input you can find the Input Manager and its axes. Here you will find all the pre-defined input axes and their corresponding buttons. If you unfold the "Jump" axis, you'll see it has a field called "Positive Button", which is Space by default, and an empty alternative button field. If you add your desired gamepad button to the alternative button field, pressing that button on your gamepad will trigger the CrossPlatformInputManager.GetButtonDown("Jump") action.
If you prefer, you can also extend the list and add your own axes.
The Input Manager offers out-of-the-box support for keyboard and mouse, joystick, and gamepad input; more information can be found in the docs here.
The advantage of this approach is that it requires no additional code or conditions, which is probably the cleanest way to handle it.
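For example (a minimal sketch, assuming your gamepad's A button reports as "joystick button 0"; check which button index your gamepad actually uses), once that button is set as the Jump axis's alternative positive button, the original FPS controller code works unchanged on every platform:

// FirstPersonController excerpt - no platform check needed.
// "joystick button 0" is an assumed mapping, set as the Jump axis's
// alternative positive button under Edit > Project Settings > Input.
if (!m_Jump)
{
    m_Jump = CrossPlatformInputManager.GetButtonDown("Jump");
}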
I am making a game using the new input system, but I have gotten stuck when trying to have a button do two things.
Currently, I have an action map for gameplay, and the mouse click acts as the "fire" event.
When the player clicks on a UI element (for example, an inventory slot), I don't want the "fire" event to occur, but rather another suitable method.
I tried to use two action maps (gameplay and UI), but I don't want the player to stop moving (due to gameplay input not being picked up on the UI action map).
I'm sure there is a simple solution to this, but there is a lack of extensive documentation on the new Input System. I am thinking I may have to add the IPointerEnterHandler/IPointerExitHandler interfaces to the MonoBehaviour script and have the input script decide which action is best suited at the time the button is pressed. Is there a more efficient way to do this?
Put the responsibility on the firing script to check that the player state is valid. You can expose player-state information in order to check whether the player is in a menu or in some other state that does not allow firing.
CanFire is a property in the player state composed from any number of checks.
public class PlayerState
{
    // ... other properties like IsInMenu, IsInCutscene, IsDead, etc. ...

    public bool CanFire => !IsInMenu && !IsInCutscene && !IsDead;
}
Then, in your firing script, when the event fires, check the CanFire boolean on the player state.
if (playerState.CanFire) Fire();
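For example, a minimal sketch (the class, method, and field names here are hypothetical, and it assumes the fire action is wired to OnFire through the Input System's Unity events or C# callbacks):

using UnityEngine;
using UnityEngine.InputSystem;

public class PlayerFiring : MonoBehaviour
{
    // Reference to the shared player state; wire this up however your
    // project passes dependencies around.
    public PlayerState playerState;

    // Bound to the "fire" action of the gameplay action map.
    public void OnFire(InputAction.CallbackContext context)
    {
        if (!context.performed)
            return;

        // Ignore the click while the player is in a menu, cutscene, or dead.
        if (playerState.CanFire)
            Fire();
    }

    void Fire()
    {
        // Actual firing logic goes here.
    }
}

Your UI code (for example, the script that opens the inventory) is then responsible for setting IsInMenu, so the firing script never has to know anything about the UI itself.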
We have a Xamarin app (Android) that at one stage opens up a web view (Webkit.Webview not Forms.Webview). This directs the user to a page on a third party site which has been set up for us.
Firstly, on certain input fields the keyboard that shows up is the wrong one; we are expecting a dismissible keyboard (i.e. "Done" in the bottom corner, not "Submit"). I know this can be changed, but I'm not sure what the correct way to do it is. Does the metadata of the text inputs on the web page have to be changed? If so, what needs to be modified per text box in the page's HTML? Just the type? i.e.:
<input type="email">
Secondly, rather than wait for the third party to fix the page, is there a way we can force the webview to always open a certain keyboard type?
At the minute we have the option of intercepting keyboard key presses and dismissing the keyboard when Return is pressed, but we would prefer not to put in a hack that intercepts every key press.
Appreciate the help, not sure what the way forward is here.
Thanks
From the comments: to your second question about forcing a keyboard button, you can check out this link, which describes how to override OnCreateInputConnection to specify the keyboard's Enter button type.
public class MyWebView : WebView {
    ...
    public override IInputConnection OnCreateInputConnection (EditorInfo outAttrs) {
        var inputConnection = base.OnCreateInputConnection (outAttrs);
        // outAttrs.ImeOptions in Xamarin only allows ImeFlags, but it should also allow ImeActions
        outAttrs.ImeOptions = outAttrs.ImeOptions | (ImeFlags)ImeAction.Next;
        return inputConnection;
    }
}
That will not dismiss your keyboard when tapped, though, since ImeAction.Next is meant to take the user to the next input. Hopefully someone else can come along and either provide a better answer or give a good way to dismiss the keyboard in this situation without hacking something together.
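As a rough sketch (the subclass name is hypothetical, and whether a "Done" key actually appears still depends on the IME and on how the page's inputs are set up), you could request a "Done" action instead and expose a helper that hides the soft keyboard via Android's InputMethodManager:

using Android.Content;
using Android.Views.InputMethods;
using Android.Webkit;

public class DoneKeyWebView : WebView   // hypothetical subclass name
{
    // Additional (Context, IAttributeSet) constructors may be needed
    // if the view is inflated from an AXML layout.
    public DoneKeyWebView(Context context) : base(context) { }

    public override IInputConnection OnCreateInputConnection(EditorInfo outAttrs)
    {
        var inputConnection = base.OnCreateInputConnection(outAttrs);
        // Ask the IME for a "Done" action instead of "Next"/"Go".
        // (ImeOptions is typed as ImeFlags in Xamarin, hence the cast.)
        outAttrs.ImeOptions |= (ImeFlags)ImeAction.Done;
        return inputConnection;
    }

    // Helper you can call (e.g. from a button or a JavaScript interface)
    // to dismiss the soft keyboard explicitly.
    public void HideKeyboard()
    {
        var imm = (InputMethodManager)Context.GetSystemService(Context.InputMethodService);
        imm?.HideSoftInputFromWindow(WindowToken, HideSoftInputFlags.None);
    }
}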
void OnMouseDown() {
    SceneManager.LoadScene("Scene2");
}
I have tried every conceivable method. The method posted has worked for me using GameObjects with colliders. Instead, this time I am using a button on a 2D canvas. It does not work in this context.
How do I load a new scene using a button in a canvas? I have tried so many different things. This should be simple.
Thanks for any advice.
Here (link: Unity page) you can find a video tutorial on how to use a Button on a canvas in Unity's UI system. It's for Unity 4.6, but it's really similar in the newest version (5.3.1).
It's quite simple. You can make a script with a public method, e.g.:
public void LoadScene2()
{
    SceneManager.LoadScene("Scene2");
}
Attach this script to some GameObject, e.g. Controller, and add an OnClick event in the Button's Inspector that calls it.
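Putting that together, a minimal sketch of such a script (the class name SceneChanger is just an example) looks like this:

using UnityEngine;
using UnityEngine.SceneManagement;   // needed for SceneManager since Unity 5.3

public class SceneChanger : MonoBehaviour
{
    // Hook this method up in the Button's OnClick() list in the Inspector.
    public void LoadScene2()
    {
        SceneManager.LoadScene("Scene2");
    }
}

Remember that "Scene2" must also be listed under File > Build Settings > Scenes In Build, otherwise SceneManager.LoadScene will not find it.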
In my opinion there is a better solution than the one shown by @Paweł Marecki.
I use this in my projects.
OK, so you simply create a script called ButtonManager, and inside it you make a method like this:
public void ChangeToScene(string sceneName)
{
    // Application.LoadLevel(sceneName) is the older, now-deprecated API;
    // from Unity 5.3 onwards use SceneManager.LoadScene instead.
    SceneManager.LoadScene(sceneName);
}
Now select your canvas button and look for the "Event Trigger" component; add one if it isn't there, then add a new Pointer Down event.
Create an empty GameObject in your Scene, name it "ButtonManager", attach the ButtonManager script to it, and drag it onto the event's object field.
Now click the dropdown list and find your ChangeToScene method.
You will see that a text field appears below; type your desired scene name and hit Play.
This way you can always reuse this script when you want to change scenes.
You can add other methods and more functionality, but the nice part is that you don't need to create a new method every time the scene name changes.
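If you would rather wire the button up from code than through the Event Trigger component, a small sketch (the SceneButton class is hypothetical; ButtonManager is the script from above, assigned in the Inspector) could look like this:

using UnityEngine;
using UnityEngine.UI;

[RequireComponent(typeof(Button))]
public class SceneButton : MonoBehaviour   // hypothetical helper component
{
    public ButtonManager buttonManager;    // the ButtonManager from above, assigned in the Inspector
    public string sceneName = "Scene2";

    void Awake()
    {
        // Register the click handler on the Button on this GameObject.
        GetComponent<Button>().onClick.AddListener(
            () => buttonManager.ChangeToScene(sceneName));
    }
}

Functionally this is equivalent to the Event Trigger setup; it just keeps the wiring in code.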
I am trying to make my game respond to right mouse button clicks in web builds.
void MouseCheck()
{
    if (Input.GetMouseButton(1))
    {
        // my code
    }
}
But the click doesn't get detected, because when I right-click in the web build it shows some default options such as "Fullscreen".
This is a bit tricky, but not impossible.
You need to make changes to the build's HTML. This is how you would do it:
var params =
{
disableContextMenu: true,
};
This parameter tells the Unity Web Player not to display the context menu, which in turn allows your game to check for right-mouse clicks.
Here is a link to all the customization you can do to the HTML file: Unity WebPlayer's Behaviours.
You still need the C# code that actually performs the right-click check:
if (Input.GetMouseButtonDown(1))
{
    // Code here.
}
(This last part was added for future readers who might benefit from it.)
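For completeness, a minimal MonoBehaviour sketch that performs this check every frame (the class and method names are just examples):

using UnityEngine;

public class RightClickHandler : MonoBehaviour
{
    void Update()
    {
        // Mouse button 1 is the right button; with the context menu
        // disabled in the page, this fires in the web build as well.
        if (Input.GetMouseButtonDown(1))
        {
            OnRightClick();
        }
    }

    void OnRightClick()
    {
        // Your right-click logic goes here.
        Debug.Log("Right click detected");
    }
}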
I need to show, and input some text into, an Xbox-like on-screen keyboard. Sadly, when I call Guide.BeginShowKeyboardInput, only a plain text box is shown and I must fill it in via the keyboard. I know that on PC it is perfectly normal to use the keyboard, but in my case I MUST enter text via gamepad, using the Xbox on-screen keyboard.
Is there any way to achieve this? Can the Xbox on-screen keyboard be invoked on PC?
If you need one for, say, a touchscreen monitor, you could add
using System.Diagnostics;
using System.IO;
and then use this call:
Process.Start(Environment.GetFolderPath(Environment.SpecialFolder.System) + Path.DirectorySeparatorChar + "osk.exe");
but if you need it to work with something like an Xbox controller, you will probably need to build your own.
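As a sketch, you could wrap that call in a small helper (osk.exe only exists on Windows, and the call can throw if the executable is missing or blocked, so it is guarded with a try/catch):

using System;
using System.Diagnostics;
using System.IO;

static class OnScreenKeyboard
{
    // Tries to launch the Windows on-screen keyboard; returns false on failure.
    public static bool TryShow()
    {
        try
        {
            string oskPath = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.System),
                "osk.exe");
            Process.Start(oskPath);
            return true;
        }
        catch (Exception)
        {
            // Not on Windows, osk.exe missing, or blocked by policy.
            return false;
        }
    }
}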
No. It was a design decision (documented here) to give the end user control of the keyboard being invoked. Therefore, the end user has to touch a text box (or the like) to invoke the virtual on-screen keyboard.
Check this text from that link:
User-driven invocation
The invocation model of the touch keyboard is designed to put the user in control of the keyboard. Users indicate to the system that they want to input text by tapping on an input control instead of having an application make that decision on their behalf. This reduces to zero the scenarios where the keyboard is invoked unexpectedly, which can be a painful source of UI churn because the keyboard can consume up to 50% of the screen and mar the application's user experience. To enable user-driven invocation, we track the coordinates of the last touch event and compare them to the location of the bounding rectangle of the element that currently has focus. If the point is contained within the bounding rectangle, the touch keyboard is invoked.
This means that applications cannot programmatically invoke the touch keyboard via manipulation of focus. Big culprits here in the past have been webpages - many of them set focus by default into an input field but have many other experiences available on their page for the user to enjoy. A good example of this is msn.com. The website has a lot of content for consumption, but happens to have a Bing search bar on the top of its page that takes focus by default. If the keyboard were automatically invoked, all of the articles located below that search bar would be occluded by default, thus ruining the website's experience on tablets. There are certain scenarios where it doesn't feel great to have to tap to get the keyboard, such as when a user has started a new email message or has opened the Search pane. However, we feel that requiring the user to tap the input field is an acceptable compromise.
Check out: http://classes.soe.ucsc.edu/cmps020/Winter08/lectures/controller-keyboard-input.pdf
You might find your answer in there; it covers gamepad input handling and the like.
There is a good guide on how to do this here:
static public string GetKeyboardInput()
{
    // Discard the pending result if the player backs out with B.
    if (HandleInput.currentState.IsButtonDown(Buttons.B))
    {
        useKeyboardResult = false;
    }

    if (KeyboardResult == null && !Guide.IsVisible)
    {
        // No dialog is open yet: show the on-screen keyboard asynchronously.
        string title = "Name";
        string description = "Pick a name for this game";
        string defaultText = "Your name here";
        pauseType = PauseType.pauseAll;
        KeyboardResult = Guide.BeginShowKeyboardInput(HandleInput.playerIndex, title,
            description, defaultText, null, null);
        useKeyboardResult = true;
        pauseType = PauseType.pauseAll;
    }
    else if (KeyboardResult != null && KeyboardResult.IsCompleted)
    {
        // The dialog has been closed: collect the text the player entered.
        pauseType = PauseType.none;
        KeyboardInputRequested = false;
        string input = Guide.EndShowKeyboardInput(KeyboardResult);
        KeyboardResult = null;
        if (useKeyboardResult)
        {
            return input;
        }
    }
    return null;
}
And your Update method should contain something like this:
if (KeyboardInputRequested)
{
    string result = GetKeyboardInput();
    if (result != null)
    {
        // use result here
    }
}
It is unlikely that the on-screen keyboard was packaged into the XNA DLLs for PC, but if you really want to find out, you could use a free .NET decompiler (such as ILSpy) and look through them.
Also, you can't use the chatpad. I would recommend either making your own on-screen keyboard that is usable with a controller or, if you can, using the MonoGame framework (an open-source implementation of XNA) and modifying it to show an on-screen keyboard on Windows.
You could also make a separate program that acts as a virtual keyboard controlled by the controller.
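A very rough sketch of the "build your own" route (XNA-style, with all names hypothetical and the drawing code left out), handling only D-pad navigation over a character grid plus A to type and X to delete:

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Input;

public class PadKeyboard
{
    const string Chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 ";
    const int Columns = 10;                // characters per row when drawn

    int selected;                          // index of the highlighted character
    GamePadState previous;                 // last frame's pad state, for edge detection
    string text = "";

    public string Text { get { return text; } }

    // Call once per frame from Update(); rendering with SpriteBatch is omitted.
    public void Update()
    {
        GamePadState current = GamePad.GetState(PlayerIndex.One);

        if (Pressed(current, previous, Buttons.DPadRight)) selected++;
        if (Pressed(current, previous, Buttons.DPadLeft))  selected--;
        if (Pressed(current, previous, Buttons.DPadDown))  selected += Columns;
        if (Pressed(current, previous, Buttons.DPadUp))    selected -= Columns;

        if (selected < 0) selected = 0;
        if (selected >= Chars.Length) selected = Chars.Length - 1;

        if (Pressed(current, previous, Buttons.A))
            text += Chars[selected];
        if (Pressed(current, previous, Buttons.X) && text.Length > 0)
            text = text.Substring(0, text.Length - 1);

        previous = current;
    }

    // True only on the frame the button goes from up to down.
    static bool Pressed(GamePadState now, GamePadState before, Buttons button)
    {
        return now.IsButtonDown(button) && before.IsButtonUp(button);
    }
}

The drawing side (laying the characters out in a grid and highlighting the selected one) is omitted, but input handling along these lines is what would stand in for Guide.BeginShowKeyboardInput.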