Getting Camera Resolutions Using Accord.Video.DirectShow - C#

I am evaluating the Accord.NET Framework (https://github.com/accord-net/framework/) for use in an imaging application. At the moment I have some basic requirements - capture video from a USB camera to display on the UI and view/change all camera properties.
Accord.Video.DirectShow.VideoCaptureDevice.DisplayPropertyPage works well for showing the camera properties, such as brightness, contrast, hue etc. but does not show available camera resolutions.
Accord.Video.DirectShow.VideoCaptureDevice.VideoCapabilities is returning only one resolution but I was expecting several more.
I have tried the VideoCapx (http://videocapx.com/) ActiveX control and, using its ShowVideoFormatDlg method, I can display a dialog which shows all available resolutions, frame rates, etc. I understand this is a dialog provided by the manufacturer and accessed via OLE/COM. What I am looking for is a way of accessing this via .NET, hopefully through the Accord framework.
I understand the additional resolutions might be properties of a transform filter however I am new to DirectShow and COM interfaces in .NET so I am looking for some pointers.

I used to wrap DirectShow code for .NET.
With DirectShow it is certainly possible to get and set A/V source capabilities.
Have you tried using the IAMStreamConfig interface to set the output format on capture and compression filters?
I use this code to get resolutions and set them on different sources (where m_pVCap is the source filter):
hr = m_pBuilder->FindInterface(&PIN_CATEGORY_CAPTURE, &MEDIATYPE_Interleaved,
                               m_pVCap, IID_IAMVideoCompression, (void **)&m_pVC);
if (hr != S_OK)
    hr = m_pBuilder->FindInterface(&PIN_CATEGORY_CAPTURE, &MEDIATYPE_Video,
                                   m_pVCap, IID_IAMVideoCompression, (void **)&m_pVC);
// !!! What if this interface isn't supported?
// We use this interface to set the frame rate and get the capture size.
hr = m_pBuilder->FindInterface(&PIN_CATEGORY_CAPTURE, &MEDIATYPE_Interleaved,
                               m_pVCap, IID_IAMStreamConfig, (void **)&m_pVSC);
if (hr != NOERROR)
{
    hr = m_pBuilder->FindInterface(&PIN_CATEGORY_CAPTURE, &MEDIATYPE_Video,
                                   m_pVCap, IID_IAMStreamConfig, (void **)&m_pVSC);
    if (hr != NOERROR)
    {
        LogDXError(hr, false, FILELINE);
    }
}
To get the current source format:
hr = m_pVSC->GetFormat(&pmt);
// DV capture does not use a VIDEOINFOHEADER
if (hr == NOERROR)
{
    if (pmt->formattype == FORMAT_VideoInfo)
    {
        VIDEOINFOHEADER *pvi = (VIDEOINFOHEADER *)pmt->pbFormat;
        pvi->AvgTimePerFrame = (LONGLONG)(10000000 / m_FrameRate);
        hr = m_pVSC->SetFormat(pmt);
        if (hr != NOERROR)
            (NotifyNewError)(FILELINE, "", LOG_ALL, ERR_GRAVE, false,
                             "Cannot set frame rate for capture");
        hr = m_pVSC->GetFormat(&pmt);
        pvi = (VIDEOINFOHEADER *)pmt->pbFormat;
        pvi->bmiHeader.biWidth = g_SizeOutput.cx;
        pvi->bmiHeader.biHeight = g_SizeOutput.cy;
        pvi->bmiHeader.biSizeImage = DIBSIZE(pvi->bmiHeader);
        hr = m_pVSC->SetFormat(pmt);
        if (hr != NOERROR)
        {
            char ErrTxt[MAX_ERROR_TEXT_LEN];
            AMGetErrorText(hr, ErrTxt, MAX_ERROR_TEXT_LEN);
            wsprintf(szError, "Error %x: %s\nCannot set frame rate (%d) for prev",
                     hr, ErrTxt, m_FrameRate);
            (NotifyNewError)(FILELINE, "", LOG_ALL, ERR_GRAVE, false, szError);
        }
    }
    DeleteMediaType(pmt);
}
To get a source's capabilities you can use IAMStreamConfig::GetNumberOfCapabilities and then IAMStreamConfig::GetStreamCaps.
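To show what that enumeration looks like in practice, here is a hedged C++ sketch (Windows-only; it assumes m_pVSC is the IAMStreamConfig pointer obtained with FindInterface above, and DeleteMediaType from the DirectShow base classes):

// Sketch: enumerate the formats a capture pin advertises via IAMStreamConfig.
int iCount = 0, iSize = 0;
hr = m_pVSC->GetNumberOfCapabilities(&iCount, &iSize);
if (SUCCEEDED(hr) && iSize == sizeof(VIDEO_STREAM_CONFIG_CAPS))
{
    for (int i = 0; i < iCount; i++)
    {
        VIDEO_STREAM_CONFIG_CAPS caps;
        AM_MEDIA_TYPE *pmtCaps = NULL;
        hr = m_pVSC->GetStreamCaps(i, &pmtCaps, (BYTE *)&caps);
        if (hr == S_OK)
        {
            if (pmtCaps->formattype == FORMAT_VideoInfo)
            {
                VIDEOINFOHEADER *pvi = (VIDEOINFOHEADER *)pmtCaps->pbFormat;
                // pvi->bmiHeader.biWidth / biHeight is one supported resolution;
                // caps.MinOutputSize / caps.MaxOutputSize give the supported range.
            }
            DeleteMediaType(pmtCaps);
        }
    }
}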

Related

Unity: CopyTexture to External texture2D

I need to expose a Unity Texture/RenderTexture to a native plugin, which requires the "D3D11_RESOURCE_MISC_SHARED" flag on the texture.
Textures created by Unity don't have this flag, so I created the texture on the plugin side, then created a reference texture within Unity using CreateExternalTexture, and copied the contents into the native texture using Graphics.CopyTexture.
The two textures have the same dimensions, the same size, the same format, and the same mipCount (0).
The problem is that when I display the texture in Unity (for debugging purposes), I see nothing, and no error occurs.
By the way, if I copy with ReadPixels instead, an error occurs:
ReadPixels called on undefined image 0 (valid values are 0 - -1
If I create the texture using the Unity API, CopyTexture succeeds and the result can be seen, but then I lose the "D3D11_RESOURCE_MISC_SHARED" flag.
So maybe the texture I created is not valid?
My code:
D3D11_TEXTURE2D_DESC desc = { 0 };
desc.Width = width;
desc.Height = height;
desc.MipLevels = 0;
desc.ArraySize = 1;
desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM; // Can the format be adjusted here? For example, is the A8 channel needed?
desc.SampleDesc.Count = 1;
desc.Usage = D3D11_USAGE_DEFAULT;
desc.BindFlags = D3D11_BIND_SHADER_RESOURCE; // plain resource
//desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ; // CPU access should not be needed, neither read nor write
desc.MiscFlags = D3D11_RESOURCE_MISC_SHARED; // for the "OpenSharedHandle" D3D11 API
HRESULT hr = E_FAIL;
if (FAILED(hr = pDevice->CreateTexture2D(&desc, nullptr, &g_unityEquirectTexture)))
{
Log(" Create Shared Texture Failed!");
return NULL;
}
Log("CreateSharedTexture success");
//return g_unityEquirectTexture;
Unity CopyTexture Code:
if (output == null)
{
Debug.Log($"limit = {QualitySettings.masterTextureLimit}");
//output = new Texture2D(equirect.width, equirect.height, TextureFormat.RGBA32,false);//uncomment this line and then copyTexture below succeeds
IntPtr externalTextureData = CGDKInterOp.cgdk_c_CreateExternalTexture(equirectLeft.GetNativeTexturePtr(), equirectLeft.width * 2, equirectLeft.height);
if (externalTextureData != IntPtr.Zero)
{
output = Texture2D.CreateExternalTexture(equirectLeft.width * 2, equirectLeft.height, TextureFormat.RGBA32, false, true, externalTextureData);
}
}
if (output == null)
{
Debug.LogError("create texture from external failed!");
return;
}
//RenderTexture.active = equirect;
//output.ReadPixels(new Rect(0, 0, equirect.width, equirect.height), 0, 0);
//RenderTexture.active = null;
Graphics.CopyTexture(equirect, output);
OK, I solved it myself.
The problem was that MipLevels == 0: this causes D3D11 to create a texture with 0 B of memory allocated! Changing MipLevels to 1 solved the problem.
Note: in the Unity inspector you can see the memory allocated by each texture. I found that my texture showed 0 B in the inspector, searched using that clue, and found the solution.
There might be a better solution for this but the last time I needed to copy over texture data from a native side to managed (Unity), I did it through marshalling of the data.
You essentially just need to expose a method in the native plugin that passes you the texture data as an array, and then have a method in the C# code fetch the data (by calling that method), releasing the pointer to native memory when you're done. You can find information about marshalling and interop in Microsoft's documentation (e.g. https://learn.microsoft.com/en-us/dotnet/framework/interop/marshalling-different-types-of-arrays).
If your texture is always guaranteed to be of the same size and format, it's easier - but if you need to know some additional parameters so that you know how the Texture should be represented in managed-land, you can always pass yourself the additional data through the same method.

DirectShow memory doesn't release properly

I'm stuck with a memory issue using DirectShow and, more specifically, the DirectShow.NET library.
The goal is playing back an AVI file, and everything works fine except that the memory doesn't release at all after calling Marshal.ReleaseComObject.
A few steps I'm following when creating the graph:
Create IFilterGraph2 object
Create ICaptureGraphBuilder2
Set Filtergraph
Adding the source filter using IFilterGraph2.AddSourceFilter
Create and configuring ISampleGrabber
Adding ISampleGrabber
Render the stream
At this point, if I remove the filters from the graph and call Marshal.ReleaseComObject on the ISampleGrabber, IFilterGraph2, and ICaptureGraphBuilder2 objects to free them up, the memory is still retained and never unloads.
The problem is that if I play many files without shutting down the whole process, I eventually receive a 0x80004001 (E_NOTIMPL) error from the COM objects, which sounds a bit weird.
Thanks in advance for reading and for any further suggestions.
EDIT: Adding code from the DirectShow.NET library sample, which shows the same behaviour:
private void SetupGraph(Control hWin, string FileName)
{
int hr;
// Get the graphbuilder object
m_FilterGraph = new FilterGraph() as IFilterGraph2;
// Get a ICaptureGraphBuilder2 to help build the graph
ICaptureGraphBuilder2 icgb2 = new CaptureGraphBuilder2() as ICaptureGraphBuilder2;
try
{
// Link the ICaptureGraphBuilder2 to the IFilterGraph2
hr = icgb2.SetFiltergraph(m_FilterGraph);
DsError.ThrowExceptionForHR( hr );
// Add the filters necessary to render the file. This function will
// work with a number of different file types.
IBaseFilter sourceFilter = null;
hr = m_FilterGraph.AddSourceFilter(FileName, FileName, out sourceFilter);
DsError.ThrowExceptionForHR( hr );
// Get the SampleGrabber interface
m_sampGrabber = (ISampleGrabber) new SampleGrabber();
IBaseFilter baseGrabFlt = (IBaseFilter) m_sampGrabber;
// Configure the Sample Grabber
ConfigureSampleGrabber(m_sampGrabber);
// Add it to the filter
hr = m_FilterGraph.AddFilter( baseGrabFlt, "Ds.NET Grabber" );
DsError.ThrowExceptionForHR( hr );
// Connect the pieces together, use the default renderer
hr = icgb2.RenderStream(null, null, sourceFilter, baseGrabFlt, null);
DsError.ThrowExceptionForHR( hr );
// Now that the graph is built, read the dimensions of the bitmaps we'll be getting
SaveSizeInfo(m_sampGrabber);
// Configure the Video Window
IVideoWindow videoWindow = m_FilterGraph as IVideoWindow;
ConfigureVideoWindow(videoWindow, hWin);
// Grab some other interfaces
m_mediaEvent = m_FilterGraph as IMediaEvent;
m_mediaCtrl = m_FilterGraph as IMediaControl;
}
finally
{
if (icgb2 != null)
{
Marshal.ReleaseComObject(icgb2);
icgb2 = null;
}
}
}

Adding an ISampleGrabber filter to my current graph

I am currently trying to add an ISampleGrabber filter to my program. The program currently captures video and displays a preview in my Windows Forms app; however, when I try to add my own ISampleGrabber filter based on other people's examples, the webcam section of the program stops working completely.
IVideoWindow videoWindow = null;
IMediaControl mediaControl = null;
IMediaEventEx mediaEventEx = null;
IGraphBuilder graphBuilder = null;
ICaptureGraphBuilder2 captureGraphBuilder = null;
IBaseFilter baseFilterForSampleGrabber;
ISampleGrabber sampleGrabber;
AMMediaType mediaType;
VideoInfoHeader videoInfoHeader;
public void capturePreview()
{
int hr = 0;
IBaseFilter baseFilter = null;
try
{
interfaces();
hr = this.captureGraphBuilder.SetFiltergraph(this.graphBuilder);
DsError.ThrowExceptionForHR(hr);
baseFilter = getListOfDevices();
hr = this.graphBuilder.AddFilter(baseFilter, "Webcam");
DsError.ThrowExceptionForHR(hr);
sampleGrabber = new SampleGrabber() as ISampleGrabber;
baseFilterForSampleGrabber = (IBaseFilter)new SampleGrabber();
if (baseFilterForSampleGrabber == null)
{
Marshal.ReleaseComObject(sampleGrabber);
sampleGrabber = null;
}
mediaType = new AMMediaType();
mediaType.majorType = MediaType.Video;
mediaType.subType = MediaSubType.RGB24;
mediaType.formatType = FormatType.VideoInfo;
//int width = videoInfoHeader.BmiHeader.Width;
//int height = videoInfoHeader.BmiHeader.Height;
//int size = videoInfoHeader.BmiHeader.ImageSize;
//mediaType.formatPtr = Marshal.AllocCoTaskMem(Marshal.SizeOf(videoInfoHeader));
//Marshal.StructureToPtr(videoInfoHeader, mediaType.formatPtr, false);
hr = sampleGrabber.SetMediaType(mediaType);
DsUtils.FreeAMMediaType(mediaType);
hr = graphBuilder.AddFilter(baseFilterForSampleGrabber, "ISampleGrabber Filter");
DsError.ThrowExceptionForHR(hr);
hr = this.captureGraphBuilder.RenderStream(PinCategory.Preview, MediaType.Video, baseFilter, baseFilterForSampleGrabber, null);
DsError.ThrowExceptionForHR(hr);
Marshal.ReleaseComObject(baseFilter);
videoWindowSetup();
hr = sampleGrabber.SetBufferSamples(true);
DsError.ThrowExceptionForHR(hr);
hr = this.mediaControl.Run();
DsError.ThrowExceptionForHR(hr);
}
catch
{
MessageBox.Show("Error...Try restart");
}
}
The above code contains my current graph along with the initial ISampleGrabber code I see repeated in every example; however, when I add the commented-out code, this is when the program stops working. I do not know where the issue is, and I presume I should at least get the basics sorted before continuing to build up the graph.
If I resolve this problem, any further help on what else I need to complete this graph would be very welcome. I aim to convert the captured frames into bitmaps so I can edit them immediately (for example, add a crosshair) and show them in the Windows Forms UI straight away.
Any help is appreciated :)
The commented-out code does not initialize videoInfoHeader correctly. You need to initialize all the members there (well, some may be left as zeros, but you have to supply values for the mandatory ones): biCompression and biBitCount, to say the least. Also, your code never initializes those members and yet reads the uninitialized values back.
This is the wrong approach anyway. Most samples suggest that you don't initialize format and formatPtr, for a reason. With just the major type and subtype set, the Sample Grabber "hints" to Intelligent Connect what format you want the data in (24-bit RGB here, and typically). Yes, this is what you can do, and it works out well. However, there is no flexibility there to specify a resolution or frame rate, and not even every pixel format works out. That is, whatever you are trying to do here is likely incorrect: you are supposed to be happy with a partial media type (major type and subtype only).
DxScan from the DirectShow.NET samples adds the Sample Grabber and shows how to set it up:
private void ConfigureSampleGrabber(ISampleGrabber sampGrabber)
{
AMMediaType media;
int hr;
// Set the media type to Video/RGB24
media = new AMMediaType();
media.majorType = MediaType.Video;
media.subType = MediaSubType.RGB24;
media.formatType = FormatType.VideoInfo;
hr = sampGrabber.SetMediaType( media );
DsError.ThrowExceptionForHR( hr );
DsUtils.FreeAMMediaType(media);
media = null;
// Choose to call BufferCB instead of SampleCB
hr = sampGrabber.SetCallback( this, 1 );
DsError.ThrowExceptionForHR( hr );
}

DirectShow USB webcam changing video source

Hey all, I am trying to find the setting to change my video source to "Composite" on my webcam. It seems that if I unplug the USB cable, plug it back in, and run the code, I just get a blank screen. But once I change the video source (in another program) and then go back and run my code again, the image shows up.
So I need something that will let me change that setting within my own app, without having to start another program that has that feature.
When I pull the USB cable out, put it back in, and run the source code, the app's PictureBox is black.
The "other program" I use to change the video source (which seems to bring the image back up):
After I use that "other program", I go back to the source code, run it, and this is what I get:
I am using the C# code called dot Net Webcam Library from here: enter link description here
It seems to use DirectShow from here: enter link description here
I have noticed that its source lists different types of video inputs (found in AXExtend.cs):
public enum PhysicalConnectorType
{
Video_Tuner = 1,
Video_Composite,
Video_SVideo,
Video_RGB,
Video_YRYBY,
Video_SerialDigital,
Video_ParallelDigital,
Video_SCSI,
Video_AUX,
Video_1394,
Video_USB,
Video_VideoDecoder,
Video_VideoEncoder,
Video_SCART,
Video_Black,
Audio_Tuner = 0x1000,
Audio_Line,
Audio_Mic,
Audio_AESDigital,
Audio_SPDIFDigital,
Audio_SCSI,
Audio_AUX,
Audio_1394,
Audio_USB,
Audio_AudioDecoder,
}
But I am unsure how to invoke that from the code here:
Device selectedDevice = device as Device;
imageCapture.Device = selectedDevice as Device;
imageCapture.PerformAutoScale();
imageCapture.Refresh();
imageCapture.Start();
So I am guessing that "Video_Composite" is what I may need in order to do that?
Any help would be great! Thanks!
David
Code update
foreach (Device device in Device.FindDevices())
{
if (device.ToString() == "BackupCamera")
{
Device selectedDevice = device as Device;
IGraphBuilder graphBuilder = new FilterGraph() as IGraphBuilder;
DsDevice device1 = DsDevice.GetDevicesOfCat(FilterCategory.VideoInputDevice)[1]; // <<--- Your Device
Guid baseFilterIdentifier = typeof(IBaseFilter).GUID;
object videoSourceObject;
device1.Mon.BindToObject(null, null, ref baseFilterIdentifier, out videoSourceObject);
IBaseFilter videoSourceBaseFilter = videoSourceObject as IBaseFilter;
graphBuilder.AddFilter(videoSourceBaseFilter, "Source");
ICaptureGraphBuilder2 captureGraphBuilder = new CaptureGraphBuilder2() as ICaptureGraphBuilder2;
captureGraphBuilder.SetFiltergraph(graphBuilder);
object crossbarObject;
captureGraphBuilder.FindInterface(FindDirection.UpstreamOnly, null, videoSourceBaseFilter, typeof(IAMCrossbar).GUID, out crossbarObject);
IAMCrossbar crossbar = crossbarObject as IAMCrossbar;
int inputPinCount, outputPinCount;
crossbar.get_PinCounts(out inputPinCount, out outputPinCount); // <<-- In/Out Pins
// Pin Selection: Physical Input 2 (e.g. Composite) to Capture Pin 0
crossbar.Route(0, 2);
imageCapture.Device = selectedDevice as Device;
imageCapture.PerformAutoScale();
imageCapture.Refresh();
imageCapture.Start();
}
}
Before running the filter graph, you need to obtain the crossbar interface. You typically use ICaptureGraphBuilder2::FindInterface for this. The crossbar requires an additional filter, and the FindInterface method is useful specifically for this reason:
Supporting Filters. If a capture device uses a Windows Driver Model (WDM) driver, the graph may require certain filters upstream from the WDM Video Capture filter, such as a TV Tuner filter or an Analog Video Crossbar filter. If the pCategory parameter does not equal NULL, this method automatically inserts any required WDM filters into the graph.
Having done this, you will have the IAMCrossbar interface, and the IAMCrossbar::Route method is how you switch between inputs.
See also: Crossbar filter change current input to Composite
Code snippet:
IGraphBuilder graphBuilder = new FilterGraph() as IGraphBuilder;
DsDevice device = DsDevice.GetDevicesOfCat(FilterCategory.VideoInputDevice)[1]; // <<--- Your Device
Guid baseFilterIdentifier = typeof(IBaseFilter).GUID;
object videoSourceObject;
device.Mon.BindToObject(null, null, ref baseFilterIdentifier, out videoSourceObject);
IBaseFilter videoSourceBaseFilter = videoSourceObject as IBaseFilter;
graphBuilder.AddFilter(videoSourceBaseFilter, "Source");
ICaptureGraphBuilder2 captureGraphBuilder = new CaptureGraphBuilder2() as ICaptureGraphBuilder2;
captureGraphBuilder.SetFiltergraph(graphBuilder);
object crossbarObject;
captureGraphBuilder.FindInterface(FindDirection.UpstreamOnly, null, videoSourceBaseFilter, typeof(IAMCrossbar).GUID, out crossbarObject);
IAMCrossbar crossbar = crossbarObject as IAMCrossbar;
int inputPinCount, outputPinCount;
crossbar.get_PinCounts(out inputPinCount, out outputPinCount); // <<-- In/Out Pins
// Pin Selection: Physical Input 2 (e.g. Composite) to Capture Pin 0
crossbar.Route(0, 2);

IBasicVideo GetCurrentImage catastrophic failure (DirectShow.NET)

I'm trying to grab an image from a webcam using DirectShow.NET and IBasicVideo::GetCurrentImage, but I get a catastrophic failure on the second call to GetCurrentImage.
What I'm doing, specifically:
IBasicVideo bv = (IBasicVideo)graph;
IntPtr bvp = new IntPtr();
int size = 0;
int hr = bv.GetCurrentImage(ref size, IntPtr.Zero);
DsError.ThrowExceptionForHR(hr);
bvp = Marshal.AllocCoTaskMem(size);
hr = bv.GetCurrentImage(ref size, bvp);
DsError.ThrowExceptionForHR(hr);
Bitmap image = new Bitmap(480, 320, 480 * (24 / 8), System.Drawing.Imaging.PixelFormat.Format24bppRgb, bvp);
image.Save(path);
What am I doing wrong?
Pretty much all I have:
IGraphBuilder graph = null;
IMediaEventEx eventEx = null;
IMediaControl control = null;
ICaptureGraphBuilder2 capture = null;
IBaseFilter srcFilter = null;
public IVideoWindow videoWindow = null;
IntPtr videoWindowHandle = IntPtr.Zero;
public void GetPreviewFromCam()
{
graph = (IGraphBuilder)(new FilterGraph());
capture = (ICaptureGraphBuilder2)(new CaptureGraphBuilder2());
eventEx = (IMediaEventEx)graph;
control = (IMediaControl)graph;
videoWindow = (IVideoWindow)graph;
videoWindowHandle = hVideoWindow;
eventEx.SetNotifyWindow(hVideoWindow, WM_GRAPHNOTIFY, IntPtr.Zero);
int hr;
// Attach the filter graph to the capture graph
hr = capture.SetFiltergraph(graph);
DsError.ThrowExceptionForHR(hr);
// Find capture device and bind it to srcFilter
FindCaptureDevice();
// Add Capture filter to our graph.
hr = graph.AddFilter(srcFilter, "Video Capture");
DsError.ThrowExceptionForHR(hr);
// Render the preview pin on the video capture filter
// Use this instead of graph->RenderFile
hr = capture.RenderStream(PinCategory.Preview, MediaType.Video, srcFilter, null, null);
DsError.ThrowExceptionForHR(hr);
hr = control.Run();
DsError.ThrowExceptionForHR(hr);
}
IBasicVideo::GetCurrentImage does not have to succeed unconditionally. What it does is forward the call to the video renderer in your graph (it fails if you don't have one, or if you have some odd non-renderer filter that unexpectedly implements the interface); the renderer then attempts to get you the image. The renderer might fail if it is operating in an incompatible mode (windowless video renderers don't expose IBasicVideo, so it might fail here), or if it has not yet received any video frame whose copy it could deliver to you, i.e. the call is premature.
Additionally, there might be a handful of other issues caused by plain bugs: you did not put the graph into an active state, you are under a wrong impression about the topology you have, you are using the wrong interface, your code has threading issues, etc.
With a spectrum of possible causes this wide, start with a simple question: at the time of the call, is the video frame already being presented to you visually?
