VMR9 vs EVR: How to use an ImageCompositor with EVR? - C#

I've coded a PixelShader compiler/tester that works live on the image or video source being played back via DirectShow.NET + VMR9.
Everything was fine until I decided to turn it into a real video player and started adjusting every bit of it to work as it should (titles, etc.).
Then I found out that the video is very pixelated (badly interpolated) on Windows 7 with ATI GPUs. The solution was to switch to EVR, and I did. Aside from some glitches with background flickering and resize slowness, which I'll try to solve with a custom presenter, it all looked good.
BUT...
I lost the ability to apply pixel shading to the output video because there is no SetImageCompositor method on the EVR FilterConfig interface.
This is the EVR interface:
[SuppressUnmanagedCodeSecurity]
[Guid("83E91E85-82C1-4ea7-801D-85DC50B75086")]
[InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
public interface IEVRFilterConfig
{
    void GetNumberOfStreams(out int pdwMaxStreams);
    void SetNumberOfStreams(int dwMaxStreams);
}
This is the VMR9 interface:
[Guid("5a804648-4f66-4867-9c43-4f5c822cf1b8")]
[InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
[SuppressUnmanagedCodeSecurity]
public interface IVMRFilterConfig9
{
int GetNumberOfStreams(out int pdwMaxStreams);
int GetRenderingMode(out VMR9Mode Mode);
int GetRenderingPrefs(out VMR9RenderPrefs pdwRenderFlags);
int SetImageCompositor(IVMRImageCompositor9 lpVMRImgCompositor);
int SetNumberOfStreams(int dwMaxStreams);
int SetRenderingMode(VMR9Mode Mode);
int SetRenderingPrefs(VMR9RenderPrefs dwRenderFlags);
}
I have been using this approach with a custom image compositor to apply pixel shaders:
IVMRFilterConfig9 filterConfig = (IVMRFilterConfig9)vmr9;
// frameManager is my custom class implementing IVMRImageCompositor9
hr = filterConfig.SetImageCompositor(frameManager);
DsError.ThrowExceptionForHR(hr);
Now I cannot...
Using: VS2010, C#, DirectShow.NET, MediaFoundation.NET, Managed DX9.
What is the solution to this problem? Any guidelines on how to do it with EVR?
Thank you very much!

Since nobody had suggestions, I dug around a little and think I have found what could be the solution:
http://msdn.microsoft.com/en-us/library/bb530107(v=vs.85).aspx
This should now be done in a custom presenter...
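For anyone following along, the wiring (as opposed to the presenter itself) would look roughly like this. This is a hedged sketch: EnhancedVideoRenderer comes from DirectShow.NET, IMFVideoRenderer from MediaFoundation.NET's EVR interfaces, and myCustomPresenter is a hypothetical class implementing the presenter interfaces described in the linked article, where the pixel-shader pass would live.
// Hedged sketch: replace the EVR's default presenter with a custom one.
// myCustomPresenter is a placeholder for a class implementing
// IMFVideoPresenter (plus IMFVideoDeviceID and IMFTopologyServiceLookupClient),
// as described in the MSDN article linked above.
IBaseFilter evr = (IBaseFilter)new EnhancedVideoRenderer();
IMFVideoRenderer videoRenderer = (IMFVideoRenderer)evr;

// null keeps the standard EVR mixer; the second argument swaps in the
// custom presenter. The pixel-shader pass then happens inside the presenter,
// where you own the Direct3D 9 device and receive the video samples,
// analogous to what the VMR9 image compositor did.
videoRenderer.InitializeRenderer(null, myCustomPresenter);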

Related

How to use CommandBuffer.IssuePluginCustomBlit on Unity Engine?

In the Unity Engine, I'm trying to convert an image effect (from the Asset Store) from the camera event OnRenderImage(RenderTexture source, RenderTexture destination) to a camera command buffer (UnityEngine.Rendering.CommandBuffer), in order to control the rendering order of the whole effects stack.
The image effect's C# rendering method is pretty complicated and I would rather not modify it (redoing the effect from scratch might be faster).
So the ideal solution would be calling the effect's C# rendering method from an event triggered by a command buffer.
CommandBuffer.IssuePluginCustomBlit looks like a solution, but I can't find any example of how to set it up.
It takes a callback method, a source render texture and a destination render texture.
There are examples of the method call (especially in the VRWorks plugin):
buffer.IssuePluginCustomBlit(PluginExtGetIssueEventCallback(), (UInt32)command, source, dest, commandParam, commandFlags);
[DllImport("GfxPluginVRWorks32", CharSet = CharSet.Ansi, CallingConvention = CallingConvention.StdCall)]
private static extern IntPtr PluginExtGetIssueEventCallback();
Nonetheless, I have no clue how PluginExtGetIssueEventCallback is built or how it handles the source and destination render textures.
If someone has an example to share of how to use CommandBuffer.IssuePluginCustomBlit and how to create the callback method, it would be appreciated.
Thanks!
I would suggest CommandBuffer.Blit() instead. You can easily use custom shaders with it.
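For example, a minimal sketch of that approach (the component and the effectMaterial field are placeholders, not from the original question):
using UnityEngine;
using UnityEngine.Rendering;

public class BlitEffect : MonoBehaviour
{
    // Any material whose shader does the image-effect work.
    public Material effectMaterial;

    void OnEnable()
    {
        var cmd = new CommandBuffer { name = "My image effect" };

        // Copy the camera target into a temporary RT, run the effect shader
        // over it, and write the result back to the camera target.
        int tempId = Shader.PropertyToID("_TempEffectRT");
        cmd.GetTemporaryRT(tempId, -1, -1, 0, FilterMode.Bilinear);
        cmd.Blit(BuiltinRenderTextureType.CameraTarget, tempId);
        cmd.Blit(tempId, BuiltinRenderTextureType.CameraTarget, effectMaterial);
        cmd.ReleaseTemporaryRT(tempId);

        // Insert at whatever camera event fits your effect ordering.
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.AfterImageEffects, cmd);
    }
}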
To create a native render plugin you would need to write separate implementations of it for each 3D API (D3D11, D3D12, Vulkan, Metal) on each platform you want to support (Windows, Android, iOS), all in C++ except on Apple platforms, where you would rather use Objective-C. You would also have to write your own shaders for each 3D API separately.
Just look at this small example here:
https://github.com/Unity-Technologies/NativeRenderingPlugin
Bear in mind that Unity's native render plugin API is very poorly documented. You're pretty much on your own.
I struggled to find it, but this seems to work:
In your C#:
...
cmd.IssuePluginCustomBlit(GetCustomBlitCallback(), 0, source, destination, 0, 0);
...
private const string PluginName = "UnityRendering";
[DllImport(PluginName, CallingConvention = CallingConvention.StdCall)]
private static extern IntPtr GetCustomBlitCallback();
In your native plugin:
#include "Unity/IUnityRenderingExtensions.h"
extern "C" UNITY_INTERFACE_EXPORT void UNITY_INTERFACE_API CustomBlit(unsigned int command, UnityRenderingExtCustomBlitParams* iParams)
{
...
}
typedef void (UNITY_INTERFACE_API* UnityRenderingCustomBlit)(
    unsigned int command,
    UnityRenderingExtCustomBlitParams* iParams);

extern "C" UNITY_INTERFACE_EXPORT UnityRenderingCustomBlit UNITY_INTERFACE_API
GetCustomBlitCallback()
{
    return CustomBlit;
}

Setting wallpaper through Active Desktop in UWP app changes fit mode

I am using the Active Desktop interface in C# to change desktop wallpaper in Windows. I am only using the IActiveDesktop.SetWallpaper method and never use IActiveDesktop.SetWallpaperOptions, so I would expect only the wallpaper image to change and not its fit (tile, stretch, fill, etc.).
When I compile my code as a .NET desktop app, this behaves as expected. However, when I use Desktop Bridge to compile my app as a UWP app for the Windows Store, the wallpaper fit changes and not just the image. I don't understand why running my code as a UWP app should make the Active Desktop interface behave any differently.
For example, if I select "Span" in the Windows 10 Settings app to make the wallpaper stretch across my two monitors, my UWP app does not respect this setting. When it changes the wallpaper image, the fit also changes to show the image separately on each monitor. But the .NET desktop version of my app respects the wallpaper fit setting and does not change it.
I have included the relevant part of my code below. The entire file can be found here.
[ComImport]
[Guid("F490EB00-1240-11D1-9888-006097DEACF9")]
[InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
public interface IActiveDesktop
{
    [PreserveSig]
    int ApplyChanges(AD_Apply dwFlags);

    [PreserveSig]
    int SetWallpaper([MarshalAs(UnmanagedType.LPWStr)] string pwszWallpaper, int dwReserved);
}

public class WallpaperChanger
{
    public static readonly Guid CLSID_ActiveDesktop =
        new Guid("{75048700-EF1F-11D0-9888-006097DEACF9}");

    public static IActiveDesktop GetActiveDesktop()
    {
        Type typeActiveDesktop = Type.GetTypeFromCLSID(WallpaperChanger.CLSID_ActiveDesktop);
        return (IActiveDesktop)Activator.CreateInstance(typeActiveDesktop);
    }

    public static void SetWallpaper(string imagePath)
    {
        IActiveDesktop iad = GetActiveDesktop();
        iad.SetWallpaper(imagePath, 0);
        iad.ApplyChanges(AD_Apply.ALL | AD_Apply.FORCE | AD_Apply.BUFFERED_REFRESH);
    }
}
Note: I've tried using the SetWallpaperAsync function available in the Windows UWP library, and it has the same problem. Also, this problem is not specific to multiple monitors; the same thing happens with just a single one.

Implement an interface in a Xamarin.Forms PCL with C#

I just downloaded the XamJam.Screen plugin and have to use it; the problem is that it exposes an interface.
That said, I've never implemented interfaces in C#, so I wonder if anyone can advise me on how to implement it. I tried the following code, but without success:
namespace Fimap.WebPart
{
    public partial class HomePage : ContentView
    {
        public HomePage()
        {
            InitializeComponent();
            Screen xyz = new getScreen();
            var w = xyz.Size.Width;
        }

        public class getScreen : Screen
        {
            public ScreenSize Size
            {
                get
                {
                    return Size;
                }
            }
        }
    }
}
The problem is that it just returns a size of 0.
My goal is to get the width and height of the device.
You do not need to implement the interface yourself. If you installed the package through NuGet, you only need to call
var size = Plugin.XamJam.Screen.CrossScreen.Current.Size;
You can then access the width and height of the screen with size.Width and size.Height.
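Put into the page from the question, that might look like this (a sketch based on the call above; no custom implementation of Screen is needed):
using Plugin.XamJam.Screen;
using Xamarin.Forms;

namespace Fimap.WebPart
{
    public partial class HomePage : ContentView
    {
        public HomePage()
        {
            InitializeComponent();

            // The plugin already provides the Screen implementation.
            var size = CrossScreen.Current.Size;
            var width = size.Width;
            var height = size.Height;
        }
    }
}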
Please visit the project's GitHub home page and check out the additional disclaimers from the author on how the library works.
As you can see, the library is of limited use. There is probably a better way to achieve whatever it is that you are trying to do. Consider researching your actual use case / problem further.

How to programmatically disable the sound?

I have a console program which needs a way to mute the sound on my computer.
Here's code that does this, but it was written for WinForms. Is there a simpler analogue for a console program?
using System;
using System.Windows.Forms;
using System.Runtime.InteropServices;

namespace VolumeOff
{
    public partial class Form1 : Form
    {
        public Form1()
        {
            InitializeComponent();
        }

        [DllImport("winmm.dll", SetLastError = true, CharSet = CharSet.Auto)]
        public static extern uint waveOutGetVolume(uint hwo, ref uint dwVolume);

        [DllImport("winmm.dll", SetLastError = true, CallingConvention = CallingConvention.Winapi)]
        public static extern int waveOutSetVolume(uint uDeviceID, uint dwVolume);

        private void button1_Click(object sender, EventArgs e)
        {
            uint volume = 0;
            uint hWO = 0;
            waveOutGetVolume(hWO, ref volume);
            textBox1.Text = volume.ToString();
        }

        private void button2_Click(object sender, EventArgs e)
        {
            uint hWO = 0;
            waveOutSetVolume(hWO, Convert.ToUInt32(textBox1.Text));
        }
    }
}
Edit 1:
By the way, this code doesn't quite work correctly. The volume slider in the system tray (the one that adjusts the Windows volume) does not move when I change the volume this way. It seems that my application is regulating the volume of something other than the system.
Presumably you want to mute or change the volume of a device rather than of a specific application. Or you may have meant all devices; you were not specific enough. Either way, the choice of API is an unlucky one: what you want instead is the Core Audio APIs.
From MSDN:
Audio applications that use the MMDevice API and WASAPI typically use the ISimpleAudioVolume interface to manage stream volume levels on a per-session basis. In rare cases, a specialized audio application might require the use of the IAudioEndpointVolume interface to control the master volume level of an audio endpoint device. A client of IAudioEndpointVolume must take care to avoid the potentially disruptive effects on other audio applications of altering the master volume levels of audio endpoint devices. Typically, the user has exclusive control over the master volume levels through the Windows volume-control program, Sndvol.exe.
This question is a lead to the APIs/interfaces of interest: Where in the .NET class library is IAudioEndpointVolume located? Also, here is another one for you: Mute/unmute, Change master volume in Windows 7 x64 with C#
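For instance, using the NAudio library (one common managed wrapper over the Core Audio APIs; a hedged sketch, other wrappers work too), muting the default playback device from a console program looks roughly like this:
using NAudio.CoreAudioApi;

class Program
{
    static void Main()
    {
        // The default render (playback) endpoint is the device whose volume
        // the tray slider controls.
        var enumerator = new MMDeviceEnumerator();
        var device = enumerator.GetDefaultAudioEndpoint(DataFlow.Render, Role.Multimedia);

        // Mute the master volume (NAudio wraps IAudioEndpointVolume here).
        device.AudioEndpointVolume.Mute = true;

        // Or set the master level instead of muting, e.g. to 50%:
        // device.AudioEndpointVolume.MasterVolumeLevelScalar = 0.5f;
    }
}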

GUI In Cosmos: Help in C#

I have been using Cosmos in Microsoft Visual C# 2008 to make primitive, text-mode (TUI) operating systems. I'm wondering how to make a GUI in Cosmos. I know it's possible; I just want to know how. Constructive criticism appreciated, insults not! Please reply with code (and comments in the code), because I am an absolute beginner with only some knowledge of basic C# commands. Thanks!
I do not know what milestone you're using, but I think this might work for you. You need this class-level variable:
Cosmos.Hardware.VGAScreen screen;
And in your Init method:
screen = new Cosmos.Hardware.VGAScreen();
screen.SetMode300x200x8();
screen.Clear(0);
//done init vga screen
After that last comment, in your code, you can use this to set the color of a pixel:
screen.SetPixel300x200x8(uint x, uint y, uint color);
The color parameter is the color of the pixel in 256-color format (numbers 0 through 255). That's all you need to make a GUI. You need lots of math skills to make shapes, though.
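For example (a hedged sketch reusing the method name from above; the exact signature depends on your Cosmos milestone):
// Draw a short horizontal line of pixels in palette color 15,
// using the VGAScreen set up in the Init method above.
for (uint x = 10; x < 60; x++)
{
    screen.SetPixel300x200x8(x, 20, 15);
}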
There are also GUI APIs with functions to make shapes.
Search on Google/YouTube or visit the discussion page on the Cosmos CodePlex site:
http://cosmos.codeplex.com/discussions
This is for 2020, because there is now a much better way to do this in Cosmos: the Cosmos Graphics Subsystem (CGS).
using System;
using System.Drawing;
using Cosmos.System.Graphics;
using Sys = Cosmos.System;

namespace Graphics
{
    public class Kernel : Sys.Kernel
    {
        Canvas canvas;

        protected override void BeforeRun()
        {
            canvas = FullScreenCanvas.GetFullScreenCanvas();
            canvas.Clear(Color.Black);
        }

        protected override void Run()
        {
            Pen pen = new Pen(Color.White);
            // DRAW stuff, see https://www.gocosmos.org/docs/cosmos-graphic-subsystem/
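            // For example (a hedged sketch; the exact Canvas method signatures
            // depend on the Cosmos version, see the CGS docs linked above):
            canvas.DrawPoint(pen, 100, 100);
            canvas.DrawLine(pen, 10, 10, 120, 10);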
        }
    }
}
