Working with multiple screens in XNA 4.0 - C#

I'm struggling to find information on how to detect available screens in XNA 4.0.
What I'm trying to do is get a list of available "screens", select one, and start my XNA app in full screen on that screen.
Specifically, I want to open the XNA window in full-screen mode on a Full HD TV connected via an HDMI cable.
Thanks,
Stefan

This article explains how to get XNA working on multiple monitors.
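In short, the usual XNA 4.0 approach is to enumerate GraphicsAdapter.Adapters and choose the adapter inside the GraphicsDeviceManager's PreparingDeviceSettings handler. A minimal sketch of that idea; the adapter index and resolution are placeholders for whichever adapter corresponds to the HDMI-connected TV:

    using System;
    using Microsoft.Xna.Framework;
    using Microsoft.Xna.Framework.Graphics;

    public class MyGame : Game
    {
        readonly GraphicsDeviceManager graphics;

        public MyGame()
        {
            graphics = new GraphicsDeviceManager(this);

            // List every adapter/screen XNA can see (e.g. to log or show a picker).
            foreach (GraphicsAdapter adapter in GraphicsAdapter.Adapters)
                Console.WriteLine(adapter.Description);

            // Force the graphics device to be created on a specific adapter.
            graphics.PreparingDeviceSettings += (sender, e) =>
            {
                // Index 1 is just a placeholder; check Adapters.Count and pick
                // the entry that matches the TV in practice.
                e.GraphicsDeviceInformation.Adapter = GraphicsAdapter.Adapters[1];
            };

            graphics.IsFullScreen = true;
            graphics.PreferredBackBufferWidth = 1920;   // Full HD
            graphics.PreferredBackBufferHeight = 1080;
        }
    }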

Related

How to take high quality photo from integrated camera in WPF

I'm implementing a WPF app where I need to take high-quality photos from the integrated camera. So far I've been successful with capturing video and taking frames from it (described, for example, here: Wpf and C# capture webcam and network cameras).
But this is not what I want, because the video frame quality is not that great. I have a Microsoft Surface Pro 4, which has an 8 Mpx camera with Full HD video support, and with the above method I can only get a Full HD frame from it. I would like the full 8 Mpx picture, like the one the native Windows Camera app can take.
In UWP I would probably have been successful with the CameraCaptureUI class, but I haven't found any clues for WPF.
Does anyone have an idea how this could be implemented?
I've found out that XAML Islands work with .NET Framework 4.8, so I was able to implement a WPF solution using the UWP components MediaCapture and CaptureElement. With that I can take photos at full resolution, which was my goal.
A simple sample project can be found here: https://github.com/ondrasvoboda/WPFCamera; consider it just a proof of concept.
If your app will run on Windows 10 or above, you can now use most of the APIs from Windows 10 in a WPF application.
https://blogs.windows.com/windowsdeveloper/2019/04/30/calling-windows-10-apis-from-a-desktop-application-just-got-easier/
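For reference, a minimal sketch of the capture step along those lines. It assumes the Windows 10 APIs are available to the WPF project (for example via XAML Islands as above, or the Microsoft.Windows.SDK.Contracts NuGet package); the folder/file handling is illustrative only:

    using System.Threading.Tasks;
    using Windows.Media.Capture;
    using Windows.Media.MediaProperties;
    using Windows.Storage;

    public static class PhotoCapture
    {
        // Captures a single JPEG photo from the default camera into the given folder.
        public static async Task CapturePhotoAsync(StorageFolder folder, string fileName)
        {
            var capture = new MediaCapture();
            await capture.InitializeAsync();   // default camera, default settings

            var file = await folder.CreateFileAsync(
                fileName, CreationCollisionOption.ReplaceExisting);

            // Uses the photo pipeline rather than grabbing video frames; the photo
            // resolution can be tuned via capture.VideoDeviceController if needed.
            await capture.CapturePhotoToStorageFileAsync(
                ImageEncodingProperties.CreateJpeg(), file);
        }
    }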

Windows 8 / Windows 10 C++/C#: have multiple virtual desktops active at the same time

I'm developing a VR application for Windows 8/10 so that users can not only see and interact in VR with the existing monitors plugged into their GPU, but also create a new virtual monitor, or open an application outside of the desktop/screen in VR space.
After some research, I figured out that I can show existing desktops on monitors with the Desktop Duplication API, and also create/show/switch a virtual workspace. But how do I have multiple desktop objects active at the same time with only one monitor, so that my Unity program can create textures from those desktop objects via the Desktop Duplication API and show them in VR?
Also, is it possible to have an application window without/outside of a desktop?
My target is Windows 10, but it would be best if it also works on Windows 8! Thank you in advance!
Sorry if my description sounds vague; I'm experienced in Unity but have just started with Windows development.
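For context on the Desktop Duplication part mentioned in the question, here is a rough C# sketch using the SharpDX bindings. It only duplicates an existing monitor's desktop into a D3D11 texture; it does not address having multiple desktops active at once, which remains the open question:

    using SharpDX.Direct3D11;
    using SharpDX.DXGI;

    public static class DesktopDuplicationSketch
    {
        public static void CaptureOneFrame()
        {
            using (var factory = new Factory1())
            using (var adapter = factory.GetAdapter1(0))                  // first GPU
            using (var device = new SharpDX.Direct3D11.Device(adapter))
            using (var output = adapter.GetOutput(0))                     // first monitor
            using (var output1 = output.QueryInterface<Output1>())
            using (var duplication = output1.DuplicateOutput(device))
            {
                // Wait up to 500 ms for the next desktop frame.
                duplication.AcquireNextFrame(500, out var frameInfo, out var resource);

                using (var desktopTexture = resource.QueryInterface<Texture2D>())
                {
                    // desktopTexture now holds the desktop image; copy it into a
                    // staging/shared texture here before releasing the frame.
                }

                resource.Dispose();
                duplication.ReleaseFrame();
            }
        }
    }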

Image Capture in Windows 8.1 C# Application

I have a desktop-based piece of touch-screen software. As part of the process it captures a photo of the user to store later on. The application works perfectly fine on all versions of Windows other than Windows 8.1 on a Surface Pro 3 (it seems to affect machines with Metro on them).
On Windows 8.1 the application captures the first image with no issues. When you then go back to the page to capture again, it is unable to grab the camera to take another photo.
The Camera app built into Windows 8.1 can still use the camera with no issues. Restarting the Surface Pro allows the camera to work in the application again for one capture, and then it stops again.
I've tried scouring the web but I'm not having much luck so far. We're using the DirectShow lib.
Any advice/ideas/help would be greatly appreciated.
Thanks
Paul

Touch-enabled WPF web browser control

I have to create an x64 WPF app with an embedded web browser. The app should rotate/scale to face all four edges of a touch-capable screen that will be built into a table. Suggest some alternatives to the native WPF WebBrowser control; tutorials would be much appreciated.
If you need to rotate things, Awesomium (as mentioned in @Noeratto's comment) is the only real game in town.
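A minimal sketch of why that matters, assuming the Awesomium.NET WPF package is referenced (the URL and angle are placeholders): Awesomium's WebControl renders off-screen, so ordinary WPF transforms such as RotateTransform apply to it, which the native HWND-hosted WebBrowser does not allow.

    using System;
    using System.Windows;
    using System.Windows.Media;
    using Awesomium.Windows.Controls;

    public class BrowserWindow : Window
    {
        public BrowserWindow()
        {
            var browser = new WebControl
            {
                Source = new Uri("https://example.com")   // placeholder URL
            };

            // Rotate the browser surface to face another edge of the table.
            browser.LayoutTransform = new RotateTransform(90);

            Content = browser;
        }
    }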

C# XNA touch panel - Surface Pro touchscreen not recognized

I used some tutorials from the internet that teach how to use touch input in XNA on Windows computers with a touch screen. However, I'm using a Surface Pro, and XNA doesn't even recognize it as a touch screen. I tested by printing the status of TouchPanel's IsConnected, and it was false. The maximum touch count returned 0 (I think the Surface supports 10-point multitouch).
Is anyone familiar with this? I was trying to do some simple tasks, like receiving multiple finger taps on the screen.
XNA 4.0 doesn't support touch on any device other than Windows Phone 7. See this blog post for full details.
That linked blog post provides some alternative methods for receiving touch input (basically: use the Windows APIs).
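To illustrate the Windows-APIs route in broad strokes: on Windows 7/8 you can register the game window for WM_TOUCH messages via P/Invoke and read the touch points yourself. A rough sketch, with the message-hooking part only outlined in comments (constants and signatures follow the Win32 documentation):

    using System;
    using System.Runtime.InteropServices;
    using System.Windows.Forms;

    static class TouchInterop
    {
        public const int WM_TOUCH = 0x0240;

        [DllImport("user32.dll")]
        static extern bool RegisterTouchWindow(IntPtr hWnd, uint ulFlags);

        public static Control Enable(IntPtr gameWindowHandle)
        {
            // Ask Windows to deliver WM_TOUCH instead of synthesized mouse input.
            RegisterTouchWindow(gameWindowHandle, 0);

            // The XNA game window is a WinForms control underneath; attach a
            // NativeWindow or message filter to it to intercept WM_TOUCH and
            // call GetTouchInputInfo for the individual touch points.
            return Control.FromHandle(gameWindowHandle);
        }
    }

In a Game subclass this would be called with Window.Handle once the window exists (for example in Initialize).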
