Mobile app to work alongside a C# WinForms application

I've made a Windows Forms application and everything is working fine.
What I'm trying to do is give people a mobile app (that I'll end up making) that lets them be away from the PC, push a button on the mobile app, and send some form of command to the WinForms application to trigger a button click event.
E.g.:
The WinForms app is open on the PC and has a button that plays or pauses music.
I go downstairs to get a drink and want to pause the music from the mobile app.
I push the button on the mobile app; it connects to the WinForms application and triggers the pause button's event.
If anyone can help me out with this or point me in the right direction, that would be great. Thank you.
Ben

Basically, what you just described is that you need to build a mobile application.
There are several tools that allow you to move your application to the web. I used to work with these guys; they were called Artinsoft at the time but are now called Mobilize.net, and they have a tool for making this conversion: www.mobilize.net/press/topic/convert-windows-to-web
However, in your case it seems like what you are trying to build is a remote control. In that case you need to either expose an endpoint for the mobile app to connect to, or create a centralized server that both the remote and the WinForms app connect to.

I suggest using a TCP/IP connection. I don't know much about mobile development, so I'll focus on .NET; the concepts should carry over.
System.Net.Sockets contains TCP classes for opening connections between hosts and clients. I'd recommend that the desktop app act as the host while the mobile app is the client.
There are plenty of online articles on the subject, so take one and explore its code until you understand it. Then prototype your task on a single desktop machine using two different .NET programs: one for the host (mimicking the desktop program in the final project) and the other as the client (mimicking the mobile app). Finally, learn the mobile platform's networking API and port what you did in the test client onto the mobile project.
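The host/client prototype described above can be sketched as follows. This is Python for brevity, not the .NET code you'd ultimately write; the same pattern maps onto System.Net.Sockets. The port number and the "PAUSE" command are assumptions for illustration.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 9500  # hypothetical port for the prototype

def run_host(commands, ready):
    """Desktop side: accept one connection and read a one-line command.
    In the real app, receiving "PAUSE" would raise the pause button's click event."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((HOST, PORT))
    srv.listen(1)
    ready.set()                       # signal that the host is accepting connections
    conn, _ = srv.accept()
    with conn:
        cmd = conn.recv(1024).decode().strip()
        commands.append(cmd)          # record the command instead of pausing music
        conn.sendall(b"OK\n")         # acknowledge so the client knows it worked
    srv.close()

def send_command(cmd):
    """Mobile side: connect, send one command, and return the acknowledgement."""
    with socket.create_connection((HOST, PORT)) as c:
        c.sendall(cmd.encode() + b"\n")
        return c.recv(1024).decode().strip()
```

Running the host in one thread (standing in for the desktop program) and calling `send_command("PAUSE")` from another (standing in for the mobile app) exercises the whole round trip on a single machine, exactly as the answer suggests.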

Related

Connecting two devices with different scenes

I'm implementing a game in Unity and I have a problem that I need advice on. I want to connect a tablet and a computer. The scene on the computer is supposed to send an event once an object is collected, so that a special scene can be shown on the tablet. So, based on what is happening on the computer, a different scene is shown on the tablet.
I just have no idea how to achieve that.
First, you need to create two different applications (desktop and mobile). Second, you need to decide on the protocol connecting the desktop app and the mobile app. The easiest way to implement this, I think, is by using WebSockets over the internet rather than plain HTTP. Yes, you need to write code for a backend, so it's like creating three different apps, but you can use Firebase to simplify this process.
I recommend you use Firestore from Firebase. Connect both apps to Firestore. When the desktop app sends data to Firestore, Firebase will automatically notify the mobile app and deliver that data. But you need to figure out for yourself how to pair a desktop app with a mobile app; otherwise your Firestore will receive data from every connected desktop app and send the same data to every connected mobile app.
This tutorial shows how to use Firestore:
https://www.youtube.com/watch?v=b5h1bVGhuRk
But before that, you need to add Firebase to your project before adding Firestore:
https://firebase.google.com/docs/unity/setup
You can use any wireless protocol available. You can even use Bluetooth if you want your app to work offline, but I don't know how to use Bluetooth...
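The pairing problem mentioned above can be illustrated independently of Firebase: tag every message with a pair ID and have each mobile client listen only for its own ID. A minimal in-memory Python sketch (the `PairRouter` class and its method names are hypothetical, not a Firebase API; in Firestore this corresponds to writing to a per-pair document and having each mobile app subscribe to that document only):

```python
class PairRouter:
    """Delivers a desktop app's messages only to the mobile client
    that shares the same pair ID, so unrelated clients see nothing."""

    def __init__(self):
        self.subscribers = {}  # pair_id -> callback for that mobile client

    def subscribe(self, pair_id, callback):
        """A mobile client registers interest in exactly one pair ID."""
        self.subscribers[pair_id] = callback

    def publish(self, pair_id, data):
        """A desktop app sends data; only the matching subscriber is notified."""
        cb = self.subscribers.get(pair_id)
        if cb is not None:
            cb(data)
```

The pair ID itself could be a code the user types into both apps, a QR code, or an account ID, depending on how you want pairing to feel.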

Show lock status of windows pc as a tile in the home app on iOS

I already made a service that sends a push notification to my iPhone whenever my PC locks/unlocks, which is pretty simple. But it would be really nice if I could see the lock status in my Home app. I guess I will need some kind of "bridge" running as a service on my PC.
So, I was wondering if any of you have tried something similar and can point me in the right direction, if it's even possible, of course.
Screenshot of current service notifications
Well, it definitely is possible. I'll give you a short outline:
Get the HAP-Spec: https://developer.apple.com/support/homekit-accessory-protocol/
You need to find a suitable representation for your PC inside the HomeKit ecosystem. That is, think about what Services your PC should expose and how you want to map them to the Services defined by Apple. The problem is that, while the Home app can display any accessory - even if it is a custom one, it will only display some generic information if it doesn't know the accessory's services. I suggest using the Lock Mechanism or Switch service. Choose the one that results in the best-fitting graphical representation inside the Home app. (Check out the spec, there are a lot more services).
Implement your HomeKit accessory: I suggest using https://homebridge.io or https://github.com/homebridge/HAP-NodeJS directly. You may also want to take a look at https://www.npmjs.com/package/homebridge-http-switch, which wouldn't require any HomeKit related coding from your side.
Host your HomeKit accessory: The thing here is that the Home app is going to request the accessory's current state each time you open it. Thus, hosting the bridge on your PC is maybe not the best idea (if your PC is locked and sleeping or powered off, the accessory either won't answer, or you need Wake-on-LAN, and then the PC fans and monitor turn on... depends on your setup). You probably want to host your bridge on a Raspberry Pi that is always running.
Write a script that sends an HTTP message to your bridge whenever the PC is locked/unlocked.
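The last step might look like this sketch (Python; the host, port, and URL paths are assumptions, and the exact URL shape depends on how you configure your homebridge plugin, so check its config rather than copying these paths):

```python
from urllib import request

BRIDGE = "http://raspberrypi.local:8080"  # hypothetical bridge endpoint

def build_status_url(locked):
    """One URL per state; the /pc/... paths are made up for this sketch."""
    state = "locked" if locked else "unlocked"
    return "{}/pc/{}".format(BRIDGE, state)

def report(locked):
    """Fire-and-forget notification to the bridge when the PC locks/unlocks."""
    try:
        request.urlopen(build_status_url(locked), timeout=2)
    except OSError:
        pass  # bridge unreachable; the Home app just shows the last known state
```

On the PC side you would call `report(...)` from whatever mechanism your existing notification service already uses to detect lock/unlock events.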

Send commands remotely to mobile app in Xamarin C#

I am developing a mobile app (Xamarin.iOS) and a desktop app (C# WinForms) for internal use at our small start-up. I want to be able to send commands (not remote notifications) to the client (the mobile app) through our main desktop app.
For example, I want all our employees using our mobile app to perform a certain task, let's say, show a popup window with specific text when they open the app. Any ideas about how to implement this kind of thing? Is it technically possible to tell a device to remotely execute certain management commands?
Thanks in advance.
There are multiple ways of doing this.
A Microsoft MVP, Mark Arteaga, suggests using Azure Functions HTTP triggers to do this. He presented this at multiple Microsoft events and also has a GitHub repo where he implements it in a sample mobile app.
So whenever you need to, you would send commands through Azure Functions to the client in a similar way. If you want to go one step further, you can even look into implementing it with backgrounding using Shiny.
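With an HTTP-triggered function, the mobile app typically fetches a small command payload from the endpoint and interprets it locally. A Python sketch of the client-side interpretation (the JSON shape with `"command"` and `"text"` fields is an assumption, not part of Mark Arteaga's sample):

```python
import json

def parse_command(payload):
    """Turn the function's JSON response into an action the app understands.
    Unknown commands fall through to a no-op so old clients stay safe."""
    data = json.loads(payload)
    if data.get("command") == "show_popup":
        return ("popup", data.get("text", ""))
    return ("noop", None)
```

The app would call the function's URL on launch (or on a background schedule via Shiny), pass the response body through `parse_command`, and show the popup when told to.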

How to notify C# GUI from PHP

I am writing a RESTful web service with PHP. When I change data, it is very simple to update a GUI in iOS or Android with push notifications on mobile devices, but I haven't found a good solution for a C# desktop application on a computer. The user should get these changes very quickly (1–2 seconds is OK).
My question is: is there a way to update the GUI in C# that is similar to push notifications?
PS: I don't want to use a Windows web service!

How to get programmatic access to controls of Windows CE app running in Emulator using Coded UI or MSAA?

I am trying to automate an application developed for Windows CE. I have created a simple form consisting of a Button and a TextBox.
I run the application in the Pocket PC 2003 Emulator, and once I record the test it can automate the process: it can click the Button and, using the keyboard in the emulator, it can also type in the TextBox.
Now my concern is validation: I want to get the TextBox as a control in my client application and validate the text inside it.
With Coded UI, I am unable to "Add Assertions" because the focus does not go inside the Emulator, as shown below.
The window of the Emulator shows it was developed with the MSAA technology. I want to add verification points in my test, and in order to do that I need programmatic access to the controls in the Emulator.
Please guide me.
The emulator is a virtual machine, meaning that to your PC it's a completely separate machine. The host PC doesn't have access to the windows or controls of the hosted machine in any way. You could probably script some sort of location-based input to simulate clicks on the VM, but there's no way you can get a window handle and do things like put text into or read text out of a control. You're better off creating a test proxy that runs on a device (which could be the emulator or a physical device) and communicating with that proxy via the network, RAPI, the Remote Tools Framework, or something along those lines.
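The proxy idea can be sketched like this: a tiny command server runs inside the CE app and answers queries about its controls, while the host-side test talks to it over the network. This is Python over plain TCP purely for illustration; the `GET <control>` protocol and the names are made up, and a real proxy would be .NET Compact Framework code reading actual control properties.

```python
import socket
import threading

def run_proxy(controls, port, ready):
    """Device-side proxy: answers one 'GET <control>' request with that
    control's current text. `controls` stands in for the real form's widgets."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    ready.set()                       # signal that the proxy is accepting
    conn, _ = srv.accept()
    with conn:
        req = conn.recv(1024).decode().strip()
        _, name = req.split(" ", 1)   # e.g. "GET textBox1" -> "textBox1"
        conn.sendall(controls.get(name, "").encode())
    srv.close()

def query_control(port, name):
    """Host-side test helper: ask the proxy for a control's current text."""
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall("GET {}".format(name).encode())
        return c.recv(1024).decode()
```

The host test can then assert on `query_control(...)` instead of trying to reach into the emulator's window handles, which is exactly what Coded UI cannot do here.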
