I am using DirectShowLib.net in C# and I am trying to change my ALLOCATOR_PROPERTIES settings, as I am using a live video source and changes are not immediately visible on screen.
When constructing a filter graph in GraphStudioNext, the ALLOCATOR_PROPERTIES are shown for both the upstream and the downstream pin, although only after connection.
I'd like to set the ALLOCATOR_PROPERTIES using IAMBufferNegotiation, but when trying to get the interface from my capture filter (AV/C Tape Recorder/Player) I get an E_UNEXPECTED (0x8000ffff) error. Here is the relevant C# code:
DS.IAMBufferNegotiation iamb = (DS.IAMBufferNegotiation)capturePin;
DS.AllocatorProperties allocatorProperties = new DS.AllocatorProperties();
hr = iamb.GetAllocatorProperties(allocatorProperties);
DS.DsError.ThrowExceptionForHR(hr);
When I use the downstream video decoder's input pin instead, I get a System.InvalidCastException, as the interface is not supported.
How can I change the cBuffers value of ALLOCATOR_PROPERTIES?
Changing the number of buffers is not going to help you here. The number of buffers is negotiated between filters and is, basically, not to be changed externally. In your case, however, there is no real buffering in the pipeline: once a video frame is available on the first output pin, it immediately goes through and reaches the video renderer. If you see a delay, it means that either the DV player has internal latency, or it is time-stamping frames "late" and the video renderer has to wait before presentation. You can troubleshoot the latter case by inserting a Smart Tee filter in between and connecting its Preview output pin downstream to the video renderer; if this helps, the issue is frame time stamping on the source. The number of buffers does not cause any presentation lag here.
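To make that experiment concrete, here is a minimal sketch in C++ (the question uses DirectShowLib in C#, but the interfaces map one-to-one). The pin IDs "Input" and "Preview" are what Smart Tee reports on typical systems; verify them with EnumPins on yours:

```cpp
#include <dshow.h>

// Sketch only: insert a Smart Tee after the source and render its Preview
// pin, so frames are presented without waiting on their time stamps.
// Assumes pGraph and the source's output pin pSourceOut already exist.
HRESULT RenderThroughSmartTeePreview(IGraphBuilder *pGraph, IPin *pSourceOut)
{
    IBaseFilter *pTee = nullptr;
    HRESULT hr = CoCreateInstance(CLSID_SmartTee, nullptr, CLSCTX_INPROC_SERVER,
                                  IID_PPV_ARGS(&pTee));
    if (FAILED(hr)) return hr;

    hr = pGraph->AddFilter(pTee, L"Smart Tee");

    IPin *pTeeIn = nullptr, *pTeePreview = nullptr;
    if (SUCCEEDED(hr)) hr = pTee->FindPin(L"Input", &pTeeIn);
    if (SUCCEEDED(hr)) hr = pGraph->Connect(pSourceOut, pTeeIn);
    if (SUCCEEDED(hr)) hr = pTee->FindPin(L"Preview", &pTeePreview);
    if (SUCCEEDED(hr)) hr = pGraph->Render(pTeePreview); // builds the renderer chain

    if (pTeePreview) pTeePreview->Release();
    if (pTeeIn) pTeeIn->Release();
    pTee->Release();
    return hr;
}
```

If the preview now updates promptly, the source is stamping frames late; if not, the latency is internal to the DV device.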
I have a DX11 Unity application, which loads a native C++ DLL. The DLL creates its own D3D11 device. I would like to take a texture from Unity and use it in my C++ DLL, ideally without copying to CPU memory.
On the Unity side I do this:
MyNativeLib.SetBuffers11(srcTexture.GetNativeTexturePtr());
In this case, srcTexture is a RenderTexture.
In the native DLL, I do this:
void SetBuffers11(ID3D11Resource* colorRes)
{
D3D11_TEXTURE2D_DESC texDesc;
ID3D11Texture2D* tempColor;
colorRes->QueryInterface(__uuidof(ID3D11Texture2D), (void**)&tempColor);
tempColor->GetDesc(&texDesc); //This works
ID3D11Resource* tempResource;
HRESULT openResult = m_device->OpenSharedResource1(
colorRes, __uuidof(ID3D11Resource), (void**)&tempResource); //This fails
tempResource->QueryInterface(__uuidof(ID3D11Texture2D), (void**)(&tempColor));
tempColor->GetDesc(&texDesc);
}
If I just query the texture description, I get a correct D3D11_TEXTURE2D_DESC. But when I try to access the texture data, I get MISCELLANEOUS CORRUPTION #18: CORRUPTED_PARAMETER. So I tried to use OpenSharedResource. It also failed, but it told me the pointer is probably an NT handle (I am not actually sure about that) and that I should use OpenSharedResource1. So I did, and I get this:
D3D11 ERROR: ID3D11Device::OpenSharedResource1:
Returning E_INVALIDARG, meaning invalid parameters were passed.
[ STATE_CREATION ERROR #381: DEVICE_OPEN_SHARED_RESOURCE_INVALIDARG_RETURN]
OpenSharedResource1 returns E_INVALIDARG and the pointer is set to 0. I am not sure which function to call for textures created by Unity in DX11 mode.
When I query for texture description, I get this:
Format : 0x09 (DXGI_FORMAT_R16G16B16A16_TYPELESS)
Bind Flags : 0x20 | 0x08 (D3D11_BIND_SHADER_RESOURCE and D3D11_BIND_RENDER_TARGET)
CPU ACCESS : 0
Any help appreciated
I finally managed to get this working. I had several issues with my original approach.
To share a texture between two DX11 devices, the texture needs to have both D3D11_RESOURCE_MISC_SHARED_NTHANDLE and D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX set in MiscFlags. This is the recommended approach since Windows 8. There are also some other restrictions when creating a shareable texture; check the docs.
As far as I know, Unity won't let me specify these flags (maybe in a native plugin), but I can capture the original DX11 device, create a new shareable texture, and copy the Unity texture to this new texture on the original Unity DX11 device.
ID3D11Device* otherDevice;
ID3D11Device1* otherDevice1;
unityTexture->GetDevice(&otherDevice);
otherDevice->QueryInterface(__uuidof(ID3D11Device1), (void**)&otherDevice1);
otherDevice->Release(); // throw away the original device
otherDevice1->GetImmediateContext1(&otherContext);
// you can now create the shareable texture on otherDevice1
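Creating the shareable texture itself looks roughly like this; width, height and format are placeholders and must match the Unity texture you will CopyResource from:

```cpp
#include <d3d11_1.h>

// Sketch: create the shareable copy target on the Unity device
// (otherDevice1). Both MiscFlags are required for NT-handle sharing
// with a keyed mutex.
ID3D11Texture2D *CreateShareableCopyTarget(ID3D11Device1 *otherDevice1,
                                           UINT width, UINT height,
                                           DXGI_FORMAT format)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = format;
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    desc.MiscFlags = D3D11_RESOURCE_MISC_SHARED_NTHANDLE |
                     D3D11_RESOURCE_MISC_SHARED_KEYEDMUTEX;

    ID3D11Texture2D *tex = nullptr;
    HRESULT hr = otherDevice1->CreateTexture2D(&desc, nullptr, &tex);
    return SUCCEEDED(hr) ? tex : nullptr;
}
```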
Notice that ID3D11Device1 is needed, since it contains methods for working with shared resources.
Now that you have a shareable texture, you first need to share it on the original device by creating a shared handle from it:
IDXGIResource1* shareableResource;
//sharedTexture is the resource you want to share
sharedTexture->QueryInterface(__uuidof(IDXGIResource1),(void**)&shareableResource);
HANDLE sharedHandle;
HRESULT createSharedRes = shareableResource->CreateSharedHandle(NULL,
DXGI_SHARED_RESOURCE_READ,//other device will only READ
nullptr,
&sharedHandle);
shareableResource->Release();
This shared handle can now be passed to the other DX11 device, which can open it:
m_device->OpenSharedResource1(sharedHandle, __uuidof(ID3D11Texture2D), (void**)&sharedTextureLocal);
Now sharedTextureLocal contains a pointer to an ID3D11Texture2D on your local device, which you can use. However, when you want to move texture data from one device to another, you need to synchronize access to these resources, otherwise you will get garbage data. You need to synchronize every time you access the shared resources, on both devices. In my case I needed to synchronize twice: once on the original Unity device when copying from the Unity texture to the shareable texture, and then on my local device when using the local shared texture:
IDXGIKeyedMutex* keyedMutex;
sharedTexture->QueryInterface(__uuidof(IDXGIKeyedMutex), (void**)&keyedMutex);
keyedMutex->AcquireSync(0, INFINITE);
otherContext->CopyResource(dst, src);//use the correct context
keyedMutex->ReleaseSync(0);
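This acquire/copy/release discipline has to run on every access, on both devices, and is easy to get wrong once several code paths touch the shared texture; wrapping it in a scope guard helps. Here is a portable sketch of the pattern, where std::timed_mutex stands in for IDXGIKeyedMutex purely for illustration; with the real interface the lock/unlock calls become AcquireSync(key, timeout) and ReleaseSync(key):

```cpp
#include <chrono>
#include <mutex>
#include <stdexcept>

// Scope guard mirroring AcquireSync/ReleaseSync: acquire on construction
// (with a timeout), release on destruction, even on early return or throw.
class KeyedScope {
public:
    KeyedScope(std::timed_mutex &m, std::chrono::milliseconds timeout) : m_(m) {
        if (!m_.try_lock_for(timeout))
            throw std::runtime_error("AcquireSync timed out");
    }
    ~KeyedScope() { m_.unlock(); } // the "ReleaseSync" always runs
private:
    std::timed_mutex &m_;
};

// Usage: guard the copy so the other device never sees a half-written frame.
// The element-wise copy stands in for CopyResource.
bool CopyFrame(std::timed_mutex &sharedGuard, int *dst, const int *src, int n) {
    KeyedScope lock(sharedGuard, std::chrono::milliseconds(100));
    for (int i = 0; i < n; ++i) dst[i] = src[i];
    return true;
}
```

The guard guarantees the release happens on every exit path, which matters because a leaked acquire deadlocks the other device's next AcquireSync.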
An alternative title could be: What happened to PIN_CATEGORY_STILL?
I am currently comparing images that were captured using DirectShow and PIN_CATEGORY_STILL with images that were captured using UWP MediaCapture.
On the device I am testing and playing around with DirectShow and MediaCapture, DirectShow detects a PIN_CATEGORY_STILL, but I am not able to initialize an instance of MediaCapture with anything other than PhotoCaptureSource.VideoPreview.
MediaCaptureInitializationSettings settings = new()
{
VideoDeviceId = someDeviceId,
PhotoCaptureSource = PhotoCaptureSource.Photo
};
MediaCapture capture = new();
// this throws an exception
// "The capture source does not have an independent photo stream."
await capture.InitializeAsync(settings);
At this point I'm not even sure if PhotoCaptureSource.Photo is meant to be used as an equivalent to PIN_CATEGORY_STILL.
Images captured with PIN_CATEGORY_STILL are way brighter in a dark environment and have much better quality (in both file size and resolution), which is clear to me, since I am using PhotoCaptureSource.VideoPreview for MediaCapture.
Considering the resource Win32 and COM for UWP apps, it seems that UWP MediaCapture does not use DirectShow underneath but Media Foundation (which is meant to be the successor to DirectShow).
This led me to the StackOverflow question Media Foundation is incorrectly marking still image capture stream descriptors as video capture, which basically states that Media Foundation has no PIN_CATEGORY_STILL but instead reports 1 FPS as a video capability for such devices (or profiles).
Since I am not using Media Foundation or C++ directly, I tried to test this by querying GetAvailableMediaStreamProperties:
private void Foo()
{
var videoRecordProperties = GetEncodingProperties(MediaStreamType.VideoRecord);
var photoProperties = GetEncodingProperties(MediaStreamType.Photo);
var videoPreviewProperties = GetEncodingProperties(MediaStreamType.VideoPreview);
}
private List<VideoEncodingProperties> GetEncodingProperties(MediaStreamType streamType)
{
// MediaCapture was previously initialized with PhotoCaptureSource.VideoPreview
return MediaCapture.VideoDeviceController
.GetAvailableMediaStreamProperties(streamType)
.OfType<VideoEncodingProperties>()
.ToList();
}
None of these returns a VideoEncodingProperties with only 1 FPS.
To test MediaCapture further, I tried some of the sample applications from the UWP samples repository: CameraAdvancedCapture, CameraOpenCV and CameraManualControls. The results were not nearly as good as good old PIN_CATEGORY_STILL.
What happened to PIN_CATEGORY_STILL?
Is there any way to capture images without DirectShow/PIN_CATEGORY_STILL while keeping this level of quality?
Any enlightenment is much appreciated.
I'm using the Canon SDK 2.1 and I am trying to take a picture with the camera from C# code.
I started a session (EdsOpenSession) and everything works fine with this line of code:
EDSDK.EdsSendCommand(cameraDev, EDSDK.CameraCommand_TakePicture, 0);
the camera takes a picture and stores it on the memory card.
The problem is this: if there is an autofocus (AF) error (e.g. the lens cap is on), the camera goes 'BUSY' and never comes back.
Also, if I try to shut down the EDSDK with EdsCloseSession or EdsTerminateSDK, they block. The only way to get it working again is to restart the application and the camera.
I'm using an EOS 100D.
What can I do to ignore these AF errors and try to take another picture?
I have also just had this issue.
I solved it by sending a half button press to focus, followed by a full button press to take the photo if that succeeds.
try
{
EDSDK.EdsSendCommand(cameraDev, EDSDK.CameraCommand_PressShutterButton, 1); // Half
EDSDK.EdsSendCommand(cameraDev, EDSDK.CameraCommand_PressShutterButton, 3); // Completely
}
finally
{
EDSDK.EdsSendCommand(cameraDev, EDSDK.CameraCommand_PressShutterButton, 0); // Off
}
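In C++ against the raw EDSDK, the same sequence looks roughly like this; checking the result of the half-press lets you skip the full press when autofocus fails (the constant names are from EDSDKTypes.h and correspond to the 1/3/0 values above):

```cpp
#include "EDSDK.h"

// Half-press to focus; only fully press if focusing succeeded, and always
// return the button to OFF so the camera cannot get stuck in BUSY.
EdsError TakePictureWithAfCheck(EdsCameraRef camera)
{
    EdsError err = EdsSendCommand(camera, kEdsCameraCommand_PressShutterButton,
                                  kEdsCameraCommand_ShutterButton_Halfway);
    if (err == EDS_ERR_OK) {
        err = EdsSendCommand(camera, kEdsCameraCommand_PressShutterButton,
                             kEdsCameraCommand_ShutterButton_Completely);
    }
    // Release the button even after an AF error.
    EdsSendCommand(camera, kEdsCameraCommand_PressShutterButton,
                   kEdsCameraCommand_ShutterButton_OFF);
    return err;
}
```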
I have the same problem with a Canon EOS 1100D, but I've found http://digicamcontrol.com which is open source. They've managed to make autofocus work, but I haven't found what exactly they did. Maybe you can find it. If you do, please share the solution.
I'm trying to control the volume of an AVPlayer in my iPhone app.
I receive an "unrecognized selector sent to instance" error when trying to simply get the volume value, or even set it, through the AVPlayer.Volume property:
AVPlayer myAVPlayer = new AVPlayer();
var volume = myAVPlayer.Volume;
Any ideas how to make this work?
That's likely because you're running an older iOS version on the device (or simulator). The Volume property was added in iOS 7.
There are other ways to set the volume - but you'll need to tell us more about what you're trying to accomplish.
Following is my filter graph. I am trying to insert the "ffdshow video encoder" into the filter graph, but I am unable to do so.
Following is my code for trying to connect the compressor after the filter graph has been generated:
public void setFileName(string pFileName)
{
int hr;
IBaseFilter _infinitePinTeeFilter = null;
graph.FindFilterByName("Infinite Pin Tee Filter", out _infinitePinTeeFilter);
mediaControl.Stop();
hr = captureGraphBuilder.SetOutputFileName(MediaSubType.Avi, pFileName, out mux, out sink);
checkHR(hr, "Can't set SetOutputFile");
hr = captureGraphBuilder.RenderStream(null, MediaType.Video, _infinitePinTeeFilter, _videoCompressor, mux);
checkHR(hr, "Can't Render Output File");
mediaControl.Run();
}
Any help would be appreciated... Thanks.
ICaptureGraphBuilder2::SetOutputFileName is not a good choice of API for setting the graph up. It does the job well for simple graphs, but because it forwards errors back without a good description or an indication of the stage at which the error actually happened, every time something fails you have a hard time understanding what went wrong.
The problem might be caused by the absence of frame rate information in the media type on the output of the video compressor, but at the building stage shown in your screenshot you don't even have this media type available yet, so you cannot troubleshoot and get this information.
Use IGraphBuilder::AddFilter, IGraphBuilder::Connect and IFileSinkFilter::SetFileName instead to reliably configure the pipeline.
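A sketch of that manual route in C++ (your code is C#, but DirectShowLib exposes the same interfaces; GetUnconnectedPin is a helper written here for illustration, and error-path Release calls are omitted for brevity):

```cpp
#include <dshow.h>

// Helper: return the first unconnected pin of the given direction.
static HRESULT GetUnconnectedPin(IBaseFilter *pFilter, PIN_DIRECTION dir,
                                 IPin **ppPin)
{
    *ppPin = nullptr;
    IEnumPins *pEnum = nullptr;
    HRESULT hr = pFilter->EnumPins(&pEnum);
    if (FAILED(hr)) return hr;
    IPin *pPin = nullptr;
    while (pEnum->Next(1, &pPin, nullptr) == S_OK) {
        PIN_DIRECTION thisDir;
        pPin->QueryDirection(&thisDir);
        IPin *pPeer = nullptr;
        if (thisDir == dir && pPin->ConnectedTo(&pPeer) == VFW_E_NOT_CONNECTED) {
            pEnum->Release();
            *ppPin = pPin; // caller releases
            return S_OK;
        }
        if (pPeer) pPeer->Release();
        pPin->Release();
    }
    pEnum->Release();
    return VFW_E_NOT_FOUND;
}

// Sketch: add the AVI Mux and File Writer yourself, set the file name via
// IFileSinkFilter before connecting, then connect pins one by one, so each
// failing HRESULT points at the exact step that went wrong.
HRESULT BuildFileOutput(IGraphBuilder *pGraph, IPin *pCompressorOut,
                        const wchar_t *fileName)
{
    IBaseFilter *pMux = nullptr, *pWriter = nullptr;
    IFileSinkFilter *pSink = nullptr;
    IPin *pMuxIn = nullptr, *pMuxOut = nullptr, *pWriterIn = nullptr;

    HRESULT hr = CoCreateInstance(CLSID_AviDest, nullptr, CLSCTX_INPROC_SERVER,
                                  IID_PPV_ARGS(&pMux));
    if (SUCCEEDED(hr)) hr = pGraph->AddFilter(pMux, L"AVI Mux");
    if (SUCCEEDED(hr))
        hr = CoCreateInstance(CLSID_FileWriter, nullptr, CLSCTX_INPROC_SERVER,
                              IID_PPV_ARGS(&pWriter));
    if (SUCCEEDED(hr)) hr = pGraph->AddFilter(pWriter, L"File Writer");
    if (SUCCEEDED(hr)) hr = pWriter->QueryInterface(IID_PPV_ARGS(&pSink));
    if (SUCCEEDED(hr)) hr = pSink->SetFileName(fileName, nullptr);

    if (SUCCEEDED(hr)) hr = GetUnconnectedPin(pMux, PINDIR_INPUT, &pMuxIn);
    if (SUCCEEDED(hr)) hr = pGraph->Connect(pCompressorOut, pMuxIn);
    if (SUCCEEDED(hr)) hr = GetUnconnectedPin(pMux, PINDIR_OUTPUT, &pMuxOut);
    if (SUCCEEDED(hr)) hr = GetUnconnectedPin(pWriter, PINDIR_INPUT, &pWriterIn);
    if (SUCCEEDED(hr)) hr = pGraph->Connect(pMuxOut, pWriterIn);

    if (pWriterIn) pWriterIn->Release();
    if (pMuxOut) pMuxOut->Release();
    if (pMuxIn) pMuxIn->Release();
    if (pSink) pSink->Release();
    if (pWriter) pWriter->Release();
    if (pMux) pMux->Release();
    return hr;
}
```

Because each Connect call is explicit, a failing HRESULT immediately tells you which connection (and therefore which media type negotiation) is the problem.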