Below is my filter graph. I am trying to insert the "ffdshow video encoder" into the filter graph, but I am unable to do so.
Here is my code that tries to connect the compressor after the filter graph has been built:
public void setFileName(string pFileName)
{
    int hr;
    IBaseFilter _infinitePinTeeFilter = null;
    graph.FindFilterByName("Infinite Pin Tee Filter", out _infinitePinTeeFilter);

    mediaControl.Stop();

    hr = captureGraphBuilder.SetOutputFileName(MediaSubType.Avi, pFileName, out mux, out sink);
    checkHR(hr, "Can't set SetOutputFile");

    hr = captureGraphBuilder.RenderStream(null, MediaType.Video, _infinitePinTeeFilter, _videoCompressor, mux);
    checkHR(hr, "Can't Render Output File");

    mediaControl.Run();
}
Any help would be appreciated... Thanks.
ICaptureGraphBuilder2::SetOutputFileName is not a good choice of API for setting the graph up. It does the job well for simple graphs, but because it forwards errors back to you without a good description of what failed and at which stage, every time something goes wrong you have a hard time understanding why.
The problem might be caused by the absence of frame rate information in the media type on the output of the video compressor, but at the building stage shown in your screenshot you don't even have this media type available yet, so you cannot troubleshoot and inspect it.
Use IGraphBuilder::AddFilter, IGraphBuilder::Connect and IFileSinkFilter::SetFileName instead to configure the pipeline reliably.
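A minimal sketch of that manual approach in DirectShowLib terms, reusing the names from the code above (graph, _infinitePinTeeFilter, _videoCompressor, pFileName, checkHR); the AVI Mux and File Writer CLSIDs are the stock Windows values, but verify them and the pin indices against your setup:

// Build the file-writing branch by hand instead of SetOutputFileName,
// so each step returns its own HRESULT and failures are easy to localize.
Guid CLSID_AviMux = new Guid("E2510970-F137-11CE-8B67-00AA00A3F1A6");
Guid CLSID_FileWriter = new Guid("8596E5F0-0DA5-11D0-BD21-00A0C911CE86");
IBaseFilter mux = (IBaseFilter)Activator.CreateInstance(Type.GetTypeFromCLSID(CLSID_AviMux));
IBaseFilter writer = (IBaseFilter)Activator.CreateInstance(Type.GetTypeFromCLSID(CLSID_FileWriter));

hr = graph.AddFilter(_videoCompressor, "ffdshow video encoder");
checkHR(hr, "Can't add compressor");
hr = graph.AddFilter(mux, "AVI Mux");
checkHR(hr, "Can't add mux");
hr = graph.AddFilter(writer, "File Writer");
checkHR(hr, "Can't add writer");

// Point the writer at the target file before connecting it.
hr = ((IFileSinkFilter)writer).SetFileName(pFileName, null);
checkHR(hr, "Can't set output file name");

// Connect tee -> compressor -> mux -> writer explicitly.
hr = graph.Connect(DsFindPin.ByDirection(_infinitePinTeeFilter, PinDirection.Output, 0),
                   DsFindPin.ByDirection(_videoCompressor, PinDirection.Input, 0));
checkHR(hr, "Can't connect tee to compressor");
hr = graph.Connect(DsFindPin.ByDirection(_videoCompressor, PinDirection.Output, 0),
                   DsFindPin.ByDirection(mux, PinDirection.Input, 0));
checkHR(hr, "Can't connect compressor to mux");
hr = graph.Connect(DsFindPin.ByDirection(mux, PinDirection.Output, 0),
                   DsFindPin.ByDirection(writer, PinDirection.Input, 0));
checkHR(hr, "Can't connect mux to writer");

Because each Connect call fails on its own, the first failing HRESULT immediately tells you which junction (and hence which media type negotiation) is the problem.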
I'm creating a webcam control using DirectShow.NET. I want to render the video of the camera into a WPF window. What is happening currently is that the IVMRWindowlessControl9 doesn't seem to be going into windowless mode and is not being parented to the window that I'm specifying, even though I'm calling the appropriate methods.
Why are these methods not being invoked? Is there something else that I'm not doing?
Below is a snippet of the relevant code:
IGraphBuilder graphBuilder = (IGraphBuilder) new FilterGraph();
ICaptureGraphBuilder2 captureGraphBuilder = (ICaptureGraphBuilder2)new CaptureGraphBuilder2();
IMediaControl mediaControl = (IMediaControl) this.graphBuilder;
IBaseFilter renderFilter = (IBaseFilter) new VideoMixingRenderer9();
hr = this.captureGraphBuilder.SetFiltergraph(this.graphBuilder);
DsError.ThrowExceptionForHR(hr);
IBaseFilter sourceFilter = FindCaptureDevice();
hr = this.graphBuilder.AddFilter(sourceFilter, "Video Capture");
DsError.ThrowExceptionForHR(hr);
SetCaptureResolution();
IVMRFilterConfig9 filterConfig = (IVMRFilterConfig9)renderFilter;
hr = filterConfig.SetNumberOfStreams(1);
DsError.ThrowExceptionForHR(hr);
hr = filterConfig.SetRenderingMode(VMR9Mode.Windowless);
DsError.ThrowExceptionForHR(hr);
windowlessControl = (IVMRWindowlessControl9)renderFilter;
hr = this.graphBuilder.AddFilter(renderFilter, "Video Capture");
DsError.ThrowExceptionForHR(hr);
Window window = Window.GetWindow(this);
var wih = new WindowInteropHelper(window);
IntPtr hWnd = wih.Handle;
hr = windowlessControl.SetVideoClippingWindow(hWnd);
DsError.ThrowExceptionForHR(hr);
hr = windowlessControl.SetAspectRatioMode(VMR9AspectRatioMode.LetterBox);
DsError.ThrowExceptionForHR(hr);
hr = this.captureGraphBuilder.RenderStream(PinCategory.Capture, MediaType.Video, sourceFilter, null, null);
DsError.ThrowExceptionForHR(hr);
Marshal.ReleaseComObject(sourceFilter);
hr = this.mediaControl.Run();
DsError.ThrowExceptionForHR(hr);
Here is an image of what is happening (I made the background green to make it easier to see):
This is a diagram of the filter graph:
To answer a potential question (because I've had this issue previously), yes, hWnd is getting set/has a value - so the windowlessControl does have a pointer to the window.
A popup "ActiveMovie Window" appearing when you run a filter graph is a symptom of a video renderer filter inserted into the pipeline and running in its default mode, without being configured as part of your UI (embedded as a child window etc.).
The diagram of your resulting graph sheds light on what is going on:
You insert and set up one video renderer filter, but then another one is added by the API and connected to your input. While the one embedded into your UI is the first one, it remains idle, and the other one renders video into the popup.
The code line which gives the problem is:
hr = this.captureGraphBuilder.RenderStream(PinCategory.Capture,
MediaType.Video, sourceFilter, null, null);
The problem is quite typical for those who build graphs by inserting a video renderer and expecting it to be picked up and connected automatically, especially since this sometimes works and such code snippets can be found online.
MSDN says:
If the pSink parameter is NULL, the method tries to use a default renderer. For video it uses the Video Renderer, and for audio it uses the DirectSound Renderer.
The call added another renderer instance, while you expected your existing one to be connected. RenderStream is a powerful call that does filter magic to get things connected, but it is easy to end up with it doing something the wrong way.
If you already have your video renderer in the graph, pass it as the sink argument in this call. Or avoid RenderStream entirely and add and connect filters incrementally, to make sure everything is built according to your expectations. Another option is the IFilterGraph2::RenderEx call with the AM_RENDEREX_RENDERTOEXISTINGRENDERERS flag:
...the method attempts to use renderers already in the filter graph. It will not add new renderers to the graph. (It will add intermediate transform filters, if needed.) For the method to succeed, the graph must contain the appropriate renderers, and they must have unconnected input pins.
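A minimal sketch of the first option against the code in the question (same variable names; a hedged example, not a drop-in fix):

// Pass the renderer you already added as the sink, so RenderStream
// connects the capture pin to *your* VMR-9 instead of creating a
// default Video Renderer in a floating "ActiveMovie Window".
hr = this.captureGraphBuilder.RenderStream(
    PinCategory.Capture, MediaType.Video,
    sourceFilter,   // source: the capture filter
    null,           // no intermediate filter
    renderFilter);  // sink: the VMR-9 configured for windowless mode
DsError.ThrowExceptionForHR(hr);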
I am using Windows Media Foundation to keep track of all the mics and cameras in an application. I am getting event type MEError instead of MECaptureAudioSessionDeviceRemoved when I unplug a mic. I have tried unplugging mics connected via USB and via the audio jack, and I always get an event with id MEError. The issue is not seen with a video capture device (webcam), as I get the expected MEVideoCaptureDeviceRemoved event type.
The mics are getting initialized correctly, as I can hear the audio correctly.
I have found zero information on this particular issue (unplugging a mic with Media Foundation) on the internet. On top of this, I am a newbie C# dev. I am curious to understand why I am not getting MECaptureAudioSessionDeviceRemoved but getting MEError instead. Is this something the mic driver developer did not implement, or is it expected if an error exists in my code?
Here's my code for getting the event type (not strictly necessary for my question). The class this function belongs to implements IMFAsyncCallback:
HRESULT MicCaptureSession::Invoke(IMFAsyncResult* pAsyncResult)
{
    ComPointerCustom<IMFMediaEvent> pEvent;
    HRESULT hr = S_OK;
    HRESULT hrStatus = S_OK;
    MediaEventType eventType;
    UINT32 TopoStatus = MF_TOPOSTATUS_INVALID;

    std::lock_guard<std::mutex> lock(m_critSec);

    if (pAsyncResult == 0)
    {
        return E_UNEXPECTED;
    }

    hr = m_localSession->EndGetEvent(pAsyncResult, &pEvent);

    if (pEvent == NULL)
        return E_UNEXPECTED;

    hr = pEvent->GetType(&eventType); // <------ Y U NO WORK ??
    if (FAILED(hr))
    {
        return E_UNEXPECTED;
    }

    hr = pEvent->GetStatus(&hrStatus);
    if (FAILED(hr))
    {
        return E_UNEXPECTED;
    }

    /* ----- MORE CODE ----- */
}
I cannot say the exact reason, but I can advise you to check further event invokes. Audio capture is different from video capture: video capture usually has about 33 ms between frames, while audio capture delivers buffers roughly every 5-10 ms, so the source can generate MEError before the Windows audio driver generates MECaptureAudioSessionDeviceRemoved.
Also, MF sources usually generate a chain of errors. Try checking the subsequent invokes from the audio capture source as well.
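A hedged sketch of what that looks like inside Invoke, following the code in the question (m_localSession and the variable names come from that snippet; this is not a complete handler):

// After handling one event, inspect its extended status and immediately
// re-subscribe, so events that follow the first MEError are not lost.
if (eventType == MEError)
{
    // hrStatus (from IMFMediaEvent::GetStatus) carries the failure code.
    // Log it and keep listening instead of tearing the session down;
    // a device-removal-related HRESULT here may precede, or stand in
    // for, the dedicated removal event.
}

// Request the next event; without this call Invoke fires only once and
// any MECaptureAudioSessionDeviceRemoved queued later is never seen.
hr = m_localSession->BeginGetEvent(this, NULL);
if (FAILED(hr))
{
    return hr;
}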
I am using DirectShowLib.net in C# and I am trying to change my ALLOCATOR_PROPERTIES settings, as I am using a live video source and changes are not "immediately" visible on screen.
When constructing a filter graph in GraphStudioNext the ALLOCATOR_PROPERTIES show for the upstream and the downstream pin, although only after connection.
I'd like to set the ALLOCATOR_PROPERTIES using IAMBufferNegotiation, but when trying to get the interface from my capture filter (AV/C Tape Recorder/Player) I get an E_UNEXPECTED (0x8000ffff) error. Here is the relevant C# code:
DS.IAMBufferNegotiation iamb = (DS.IAMBufferNegotiation)capturePin;
DS.AllocatorProperties allocatorProperties = new DS.AllocatorProperties();
hr = iamb.GetAllocatorProperties(allocatorProperties);
DS.DsError.ThrowExceptionForHR(hr);
When I used the downstream video decoder input pin, I get a System.InvalidCastException as the interface is not supported.
How can I change the cBuffers value of ALLOCATOR_PROPERTIES?
Changing the number of buffers is not going to help you here. The number of buffers is negotiated between filters and is, basically, not to be changed externally. In your case, however, there is no real buffering in the pipeline: once a video frame is available on the first output pin, it immediately goes through and reaches the video renderer. If you see a delay there, it means either that the DV player has internal latency, or that it is time stamping frames "late" and the video renderer has to wait before presentation. You can troubleshoot the latter case by inserting a Smart Tee Filter in between and connecting its Preview output pin downstream to the video renderer - if this helps, the issue is frame time stamping on the source. The number of buffers does not cause any presentation lag here.
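A hedged DirectShowLib sketch of that Smart Tee experiment, reusing the DS alias and capturePin variable from the question (the graph variable and the Smart Tee CLSID are assumptions to verify; the Preview pin delivers samples without time stamps, so the renderer presents them immediately):

// Insert a Smart Tee and route its Preview pin (which strips time
// stamps) to the renderer; if the lag disappears, the source's
// time stamping is the culprit.
Guid CLSID_SmartTee = new Guid("CC58E280-8AA1-11D1-B3F1-00AA003761C5");
DS.IBaseFilter smartTee =
    (DS.IBaseFilter)Activator.CreateInstance(Type.GetTypeFromCLSID(CLSID_SmartTee));
hr = graph.AddFilter(smartTee, "Smart Tee");
DS.DsError.ThrowExceptionForHR(hr);

// Source -> Smart Tee input.
hr = graph.Connect(capturePin,
    DS.DsFindPin.ByDirection(smartTee, DS.PinDirection.Input, 0));
DS.DsError.ThrowExceptionForHR(hr);

// Smart Tee "Preview" pin -> downstream; Render() lets the graph
// builder complete the decoder/renderer chain automatically.
hr = graph.Render(DS.DsFindPin.ByName(smartTee, "Preview"));
DS.DsError.ThrowExceptionForHR(hr);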
I am trying to figure out how I can get bitmap data from a filter.
I am using the DirectShowNet wrapper to get an image from my web camera.
My current code is:
public partial class Form1 : Form
{
    public IGraphBuilder gb;
    public ICaptureGraphBuilder2 cgb;
    public IBaseFilter filter;

    public Form1()
    {
        InitializeComponent();

        DsDevice[] videoInputDevices = DsDevice.GetDevicesOfCat(FilterCategory.VideoInputDevice);
        object obj = null;
        Guid iid = typeof(IBaseFilter).GUID;
        videoInputDevices[1].Mon.BindToObject(null, null, ref iid, out obj);
        filter = (IBaseFilter)obj;
        ((IAMCameraControl)filter).Set(CameraControlProperty.Exposure, 0, CameraControlFlags.Auto);

        gb = (IGraphBuilder)new FilterGraph();
        cgb = (ICaptureGraphBuilder2)new CaptureGraphBuilder2();
        cgb.SetFiltergraph(gb);
        gb.AddFilter(filter, "First Filter");
        cgb.RenderStream(PinCategory.Preview, MediaType.Video, filter, null, null);

        ((IVideoWindow)gb).put_Owner(this.panel1.Handle);
        ((IVideoWindow)gb).put_WindowStyle(WindowStyle.Child | WindowStyle.ClipChildren);
        ((IVideoWindow)gb).put_Visible(OABool.True);
        ((IVideoWindow)gb).SetWindowPosition(0, 0, this.panel1.Width, this.panel1.Height);
        ((IMediaControl)gb).Run();
    }
}
This simple code just renders the web camera output to a panel control. I tried using a timer and the SaveToBitmap function to copy the image from the panel to a bitmap, but the bitmap is blank after that.
I read something about a Grabber filter, but my attempt did not work; it returned a null pointer to the buffer/sample.
I would like to ask what I should add to be able to read the image data.
Thank you very much.
The standard behavior of the DirectShow pipeline is that filters pass data from one to another without exposing it to the controlling application, so there is no direct way to access the data.
You typically do one of the following:
You add a Sample Grabber Filter at a certain position in your pipeline and set it up so that the Sample Grabber calls you back every time data goes through it
You grab a copy of the currently displayed video from the video renderer
Both methods are documented, popular and discussed multiple times including on StackOverflow:
Efficiently grabbing pixels from video
take picture from webcam c#
Here's a detailed example of exactly this:
Working with raw video data from webcam in C# and DirectShowNet
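A hedged sketch of the Sample Grabber route against the code in the question (gb, cgb and filter come from that snippet; GrabberCallback is a hypothetical class you would write yourself, implementing ISampleGrabberCB and copying the frame bytes in its BufferCB method):

// Insert a Sample Grabber before rendering, so every frame passes
// through the callback where its bytes can be copied into a Bitmap.
ISampleGrabber grabber = (ISampleGrabber)new SampleGrabber();

AMMediaType mt = new AMMediaType();
mt.majorType = MediaType.Video;
mt.subType = MediaSubType.RGB24;   // ask for uncompressed RGB frames
grabber.SetMediaType(mt);
DsUtils.FreeAMMediaType(mt);

grabber.SetBufferSamples(false);
grabber.SetCallback(new GrabberCallback(), 1); // 1 = deliver via BufferCB

gb.AddFilter((IBaseFilter)grabber, "Sample Grabber");

// Let the builder place the grabber between the camera and renderer.
cgb.RenderStream(PinCategory.Preview, MediaType.Video, filter,
    (IBaseFilter)grabber, null);

Requesting RGB24 up front is what makes the callback buffer directly usable as bitmap data; without SetMediaType the grabber may connect with a compressed type.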
I've got a C# control wrapped around the DirectShow libraries. Though I'm not certain it's relevant, I'm running on Windows CE 6.0R3. When trying to play a WMA audio file using the control, the following code throws an exception of "No such interface supported":
m_graph = new DShowGraph(mediaFile);
m_graphBuilder = m_graph.Open();
m_videoWindow = (IVideoWindow)m_graph.GetVideoWindow();

if (m_videoWindow == null)
{
    // this is not hit
}

try
{
    m_videoWindow.put_WindowStyle((int)(WS.CHILD | WS.VISIBLE | WS.CLIPSIBLINGS));
}
catch (Exception ex)
{
    // I end up here
}
The Open call looks like this (error handling, etc. trimmed):
private IGraphBuilder _graphBuilder;

internal IGraphBuilder Open()
{
    object filterGraph = ClassId.CoCreateInstance(ClassId.FilterGraph);
    _graphBuilder = (IGraphBuilder)filterGraph;
    _graphBuilder.RenderFile(_input, null);
    return _graphBuilder;
}
The GetVideoWindow call simply looks like this:
public IVideoWindow GetVideoWindow()
{
    if (_graphBuilder == null)
        return null;
    return (IVideoWindow)_graphBuilder;
}
Strangely, this all works just fine with the same control DLL, same application and same media file when run under Windows CE 5.0.
My suspicion is that it might have something to do with the fact we're playing an audio-only file (checking to see if the same problem occurs with a video file now), but I'm not overly versed in Direct Show, so I'd like to understand exactly what's going on here.
One of the large challenges in debugging this is that I don't have the failing hardware in my office - it's at a customer's site, so I have to make changes, send them and wait for a reply. While that doesn't affect the question, it does affect my ability to quickly follow up with suggestions or follow on questions anyone might have.
EDIT1
Playing a WMV file works fine, so it is related to the file being audio-only. We can't test MP3 to see if it's a WMA codec issue because the device OEM does not include the MP3 codec in the OS due to their concerns over licensing.
The graph's IVideoWindow is nothing but a forward to the underlying IVideoWindow of the video rendering filter. With an audio-only pipeline you don't have a video renderer (obviously), so IVideoWindow does not make much sense. The interface is still available, but once you try to call its methods there is nothing to forward to, hence the error.
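A hedged sketch of how the calling code could guard against this, reusing the member names from the snippets above (the hasVideo flag and the probe-by-calling approach are assumptions; the cast itself succeeds even on audio-only graphs, so only an actual call reveals the missing renderer):

// Treat IVideoWindow as optional: probe it with a real call and fall
// back to audio-only behavior when the forward fails.
m_videoWindow = (IVideoWindow)m_graph.GetVideoWindow();
bool hasVideo = true;
try
{
    m_videoWindow.put_WindowStyle((int)(WS.CHILD | WS.VISIBLE | WS.CLIPSIBLINGS));
}
catch (Exception)
{
    hasVideo = false; // no video renderer to forward to
}
if (hasVideo)
{
    // position and show the video window, etc.
}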