What determines the surface texture width and height for camera? - c#

Here's a snippet from my implementation of my custom camera page.
For some reason, I keep getting a lower textureView/SurfaceTexture width and height than I expect. I want to keep the maximum native camera resolution (or picture size) that my phone's camera normally uses, but with my current setup the resolution of my textureView is lower than the native camera resolution, and I'm wondering why. When I debug, OnSurfaceTextureAvailable() seems to be setting the width and height incorrectly. Where does it get them from?
public class CameraPage : PageRenderer, TextureView.ISurfaceTextureListener, Android.Views.View.IOnTouchListener
{
    global::Android.Hardware.Camera camera;
    Activity activity;
    CameraFacing cameraType;
    TextureView textureView;
    SurfaceTexture surfaceTexture;

    public void OnSurfaceTextureAvailable(SurfaceTexture surface, int width, int height)
    {
        GetCameraInstance();
        if (camera != null)
        {
            textureView.LayoutParameters = new FrameLayout.LayoutParams(width, height);
            surfaceTexture = surface;
            camera.SetPreviewTexture(surface);
            PrepareAndStartCamera();
        }
    }

    private void GetCameraInstance()
    {
        try
        {
            camera = global::Android.Hardware.Camera.Open((int)CameraFacing.Back);
        }
        catch (Exception e)
        {
            // ignore any exception
        }
    }

    public bool OnSurfaceTextureDestroyed(SurfaceTexture surface)
    {
        StopCameraPreviewAndRelease();
        return true;
    }

    private void StopCameraPreview()
    {
        try
        {
            if (camera != null)
                camera.StopPreview();
        }
        catch { }
    }

    private void StopCameraPreviewAndRelease()
    {
        try
        {
            if (camera != null)
            {
                StopCameraPreview();
                camera.SetPreviewCallback(null);
                camera.Release();
                camera = null;
            }
        }
        catch { }
    }

    public void OnSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height)
    {
        PrepareAndStartCamera();
    }

    public void OnSurfaceTextureUpdated(SurfaceTexture surface)
    {
    }

    private void PrepareAndStartCamera()
    {
        var flashMode = GetFlashMode();
        SetCameraParameters(flashMode);
        StopCameraPreview();
        var display = activity.WindowManager.DefaultDisplay;
        if (display.Rotation == SurfaceOrientation.Rotation0)
        {
            camera.SetDisplayOrientation(90);
        }
        if (display.Rotation == SurfaceOrientation.Rotation270)
        {
            camera.SetDisplayOrientation(180);
        }
        if (flashOn)
            toggleFlashButton.SetBackgroundResource(Resource.Drawable.flash_on);
        else
            toggleFlashButton.SetBackgroundResource(Resource.Drawable.flash_off);
        camera.StartPreview();
    }
}
This is how I'm setting my textureView:
textureView = view.FindViewById<TextureView> (Resource.Id.textureView);
textureView.SurfaceTextureListener = this;
And the textureView in my cameraLayout:
<TextureView
    android:id="@+id/textureView"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_marginTop="-110dp"
    android:layout_marginLeft="0dp"
    android:backgroundTint="#99b4d1ff" />
That is, I'm expecting a 2576x1932 image when the photo is taken and saved, but I'm getting 1451x720 instead. It seems the saved size was determined by the textureView size, whereas I want the native camera resolution.
EDIT: Here's how the photo is being taken:
private async void TakePhotoButtonTapped(object sender, EventArgs e)
{
    try
    {
        try
        {
            StopCameraPreview();
        }
        catch (Exception ex)
        {
            camera.Reconnect();
            PrepareAndStartCamera();
            StopCameraPreview();
        }
        // Note: this captures the TextureView's bitmap, which is only
        // as large as the view itself, not the camera's picture size.
        var image = textureView.Bitmap;
        var imageQuality = AppState.ApplicationInfo.AndroidImageCompressionFactor;
        using (var imageStream = new MemoryStream())
        {
            await image.CompressAsync(Bitmap.CompressFormat.Jpeg, imageQuality, imageStream);
            image.Recycle();
            imageBytes = imageStream.ToArray();
        }
        count += 1;
        textView.Text = Convert.ToString(count);
        _images.Add(imageBytes);
        camera.StartPreview();
    }
    catch (Exception ex)
    {
    }
}
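For what it's worth, the width and height passed to OnSurfaceTextureAvailable are the TextureView's laid-out size in pixels, not the camera's capture size, and textureView.Bitmap can never be larger than the view. To save at the sensor's native picture size with the old Camera API, the usual approach is to set Camera.Parameters.SetPictureSize and capture with Camera.TakePicture instead of reading the view's bitmap. A minimal sketch, not the poster's code (jpegCallback is a hypothetical object implementing Camera.IPictureCallback; requires using System.Linq;):
// Pick the largest supported picture size (e.g. 2576x1932) before capturing.
var parameters = camera.GetParameters();
var largest = parameters.SupportedPictureSizes
    .OrderByDescending(s => s.Width * s.Height)
    .First();
parameters.SetPictureSize(largest.Width, largest.Height);
camera.SetParameters(parameters);
// OnPictureTaken(byte[] data, Camera camera) on jpegCallback then receives
// the full-resolution JPEG bytes, independent of the TextureView size.
camera.TakePicture(null, null, jpegCallback);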

Related

Camera preview face detection failed

In my Xamarin.Forms Android app, I am using IFaceDetectionListener to detect faces in a custom camera preview. When I try to open my custom camera page, I get this error:
Java.Lang.RuntimeException: start face detection failed
My stack trace is:
JniEnvironment+InstanceMethods.CallNonvirtualVoidMethod (Java.Interop.JniObjectReference instance, Java.Interop.JniObjectReference type, Java.Interop.JniMethodInfo method, Java.Interop.JniArgumentValue* args)
JniPeerMembers+JniInstanceMethods.InvokeNonvirtualVoidMethod (System.String encodedMember, Java.Interop.IJavaPeerable self, Java.Interop.JniArgumentValue* parameters)
Camera.StartFaceDetection ()
CameraPreviewRenderer+<>c__DisplayClass8_0.<OnElementChanged>b__1 ()
Thread+RunnableImplementor.Run ()
mono.java.lang.RunnableImplementor.run RunnableImplementor.java:30
android.os.Handler.handleCallback Handler.java:883
This error occurs in OnElementChanged of my custom renderer, CameraPreviewRenderer:
public class CameraPreviewRenderer : ViewRenderer<App.Views.Clocking.CustomCamera.CameraPreview, App.Droid.CustomRender.Clocking.CameraPreview>, Camera.IFaceDetectionListener, Camera.IPictureCallback, Camera.IShutterCallback
{
    CameraPreview cameraPreview;
    String Picture_Name = "";
    private CameraFacing camerainfo = CameraFacing.Front;
    int DetectedFaceCount = 0;

    [get: Android.Runtime.Register("getMaxNumDetectedFaces", "()I", "GetGetMaxNumDetectedFacesHandler", ApiSince = 14)]
    public virtual int MaxNumDetectedFaces { get; }

    public CameraPreviewRenderer(Context context) : base(context)
    {
    }

    [Obsolete]
    protected override void OnElementChanged(ElementChangedEventArgs<Centraverse.Views.Clocking.CustomCamera.CameraPreview> e)
    {
        try
        {
            base.OnElementChanged(e);
            if (Control == null)
            {
                try
                {
                    cameraPreview = new CameraPreview(Context);
                    SetNativeControl(cameraPreview);
                }
                catch (Exception ex)
                {
                }
            }
            if (e.OldElement != null)
            {
            }
            if (e.NewElement != null)
            {
                try
                {
                    if (Control == null)
                    {
                        cameraPreview = new CameraPreview(Context);
                        SetNativeControl(cameraPreview);
                    }
                    Control.Preview = Camera.Open((int)e.NewElement.Camera);
                    Control.CameraID = 1;
                    var CameraParaMeters = cameraPreview.camera.GetParameters();
                    if (CameraParaMeters != null)
                    {
                        if (CameraParaMeters.MaxNumDetectedFaces > 0)
                        {
                            Device.BeginInvokeOnMainThread(() =>
                            {
                                // Crashes here:
                                Control.Preview.SetFaceDetectionListener(this);
                                Control.Preview.StartFaceDetection();
                            });
                        }
                    }
                }
                catch (Exception ex)
                {
                }
            }
        }
        catch (Exception ex)
        {
        }
    }

    protected override void Dispose(bool disposing)
    {
        try
        {
            if (disposing)
            {
                Control.Preview.Release();
                MessagingCenter.Unsubscribe<Object>(this, "CaptureClick");
                MessagingCenter.Unsubscribe<Object>(this, "FlipClick");
            }
            Device.BeginInvokeOnMainThread(base.Dispose);
        }
        catch (Exception ex)
        {
        }
    }

    [Obsolete]
    public void OnFaceDetection(Camera.Face[] faces, Camera camera)
    {
        try
        {
            DetectedFaceCount = faces.Length;
        }
        catch (Exception ex)
        {
        }
    }

    private void takepicture()
    {
        try
        {
            Control.Preview.TakePicture(this, this, this);
        }
        catch (Exception ex)
        {
        }
    }

    public void OnPictureTaken(byte[] data, Camera camera)
    {
        try
        {
            // Managing camera capture
        }
        catch (Exception ex)
        {
        }
    }

    public void OnShutter() { }
}
I am not able to isolate the issue. The app crashes with the error "start face detection failed". Since I am using the older Camera API, could that be causing this? Please help me fix this issue. For convenience, I have a sample project here: sample Project
EDIT
I am able to catch the exception like this:
try
{
    Control.Preview.SetFaceDetectionListener(this);
    Control.Preview.StartFaceDetection();
}
catch (Java.Lang.RuntimeException ex)
{
}
I tested your sample, and when I don't use a modal page to open the CameraPage, it works.
You could try changing this in your MainPage.xaml.cs:
await Navigation.PushModalAsync(new CameraPage());
to:
await Navigation.PushAsync(new CameraPage());
like this:
private async void Open_Camera(object sender, EventArgs e)
{
    try
    {
        var PhotoRequeststatus = await Permissions.RequestAsync<Permissions.Camera>();
        var StorageRequStatus = await Permissions.RequestAsync<Permissions.StorageWrite>();
        if (PhotoRequeststatus != Xamarin.Essentials.PermissionStatus.Granted || StorageRequStatus != Xamarin.Essentials.PermissionStatus.Granted)
        {
            await DisplayAlert("Enable Permission", "Please allow camera permission", "Close");
        }
        else
        {
            await Navigation.PushAsync(new CameraPage());
        }
    }
    catch (Exception ex)
    {
    }
}
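A related point, hedged: the Android documentation for Camera.StartFaceDetection notes that it must be called after the preview has started, and that it throws a RuntimeException otherwise (or if face detection is already running). So it may help to defer the call until right after StartPreview rather than invoking it from OnElementChanged. A rough sketch using the names from the renderer above (exact placement is an assumption):
// Only start face detection once the preview is actually running.
Control.Preview.StartPreview();
if (Control.Preview.GetParameters().MaxNumDetectedFaces > 0)
{
    Control.Preview.SetFaceDetectionListener(this);
    Control.Preview.StartFaceDetection();
}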

How to take a snapshot with MediaElement in UWP

I have a UWP project and want to take a snapshot from a MediaElement while it is playing video.
Does anyone know any useful links, or how to tackle this task?
For your requirement, you could realize it with custom video effects, because you can get each frame in the ProcessFrame method. You can then use a static property to store the current frame and pass it to your image control. The following is the RExampleVidoEffect class.
public sealed class RExampleVidoEffect : IBasicVideoEffect
{
    private static SoftwareBitmap Snap;

    public void SetEncodingProperties(VideoEncodingProperties encodingProperties, IDirect3DDevice device)
    {
    }

    public void ProcessFrame(ProcessVideoFrameContext context)
    {
        var inputFrameBitmap = context.InputFrame.SoftwareBitmap;
        Snap = inputFrameBitmap;
    }

    public static SoftwareBitmap GetSnapShot()
    {
        return Snap;
    }

    public void Close(MediaEffectClosedReason reason)
    {
    }

    public void DiscardQueuedFrames()
    {
    }

    public bool IsReadOnly
    {
        get { return true; }
    }

    public IReadOnlyList<VideoEncodingProperties> SupportedEncodingProperties
    {
        get { return new List<VideoEncodingProperties>(); }
    }

    public MediaMemoryTypes SupportedMemoryTypes
    {
        get { return MediaMemoryTypes.Cpu; }
    }

    public bool TimeIndependent
    {
        get { return true; }
    }

    public void SetProperties(IPropertySet configuration)
    {
    }
}
Usage
private async void VideoPlayer_Loaded(object sender, RoutedEventArgs e)
{
    var videoFile = await Package.Current.InstalledLocation.GetFileAsync("big_buck_bunny.mp4");
    MediaClip clip = await MediaClip.CreateFromFileAsync(videoFile);
    var videoEffectDefinition = new VideoEffectDefinition(typeof(RExampleVidoEffect).FullName);
    clip.VideoEffectDefinitions.Add(videoEffectDefinition);
    MediaComposition compositor = new MediaComposition();
    compositor.Clips.Add(clip);
    this.VideoPlayer.SetMediaStreamSource(compositor.GenerateMediaStreamSource());
}

private async void Button_Click(object sender, RoutedEventArgs e)
{
    var bitmap = RExampleVidoEffect.GetSnapShot();
    if (bitmap.BitmapPixelFormat != BitmapPixelFormat.Bgra8 ||
        bitmap.BitmapAlphaMode == BitmapAlphaMode.Straight)
    {
        bitmap = SoftwareBitmap.Convert(bitmap, BitmapPixelFormat.Bgra8, BitmapAlphaMode.Premultiplied);
    }
    var source = new SoftwareBitmapSource();
    await source.SetBitmapAsync(bitmap);
    img.Source = source;
}
Effect: (screenshot of the resulting snapshot omitted)

UWP Camera CaptureElement displaying on 50% of screen

I am writing a UWP application. It's a simple custom camera with a Take Photo button. It consists of a XAML view with a CaptureElement and its code-behind page.
The issue is that the camera opens on 50% of the screen on the Lumia 950 and Lumia 950XL, as if the Grid containing the CaptureElement were divided into two columns; on other devices it displays correctly.
I based it on the Microsoft UWP camera sample:
Microsoft UWP Samples: Camera Sample
namespace InfoMedia.Views.Posting
{
    public sealed partial class PostingCameraView : BasePage
    {
        private readonly DisplayInformation _displayInformation = DisplayInformation.GetForCurrentView();
        private readonly SimpleOrientationSensor _orientationSensor = SimpleOrientationSensor.GetDefault();
        private SimpleOrientation _deviceOrientation = SimpleOrientation.NotRotated;
        private DisplayOrientations _displayOrientation = DisplayOrientations.Portrait;
        private static readonly Guid RotationKey = new Guid("C380465D-2271-428C-9B83-ECEA3B4A85C1");
        private StorageFolder _captureFolder = null;
        private readonly DisplayRequest _displayRequest = new DisplayRequest();
        private readonly SystemMediaTransportControls _systemMediaControls = SystemMediaTransportControls.GetForCurrentView();

        // MediaCapture and its state variables
        private MediaCapture _mediaCapture;
        private bool _isInitialized;
        private bool _isPreviewing;
        private bool _mirroringPreview;
        private bool _externalCamera;
        private bool _isRecording;
        private PostingAdViewModel postingAdViewModel;

        #region Constructor, lifecycle, and navigation
        public PostingCameraView()
        {
            this.InitializeComponent();
            if (ApplicationHelper.GetInstance().DetectDeviceFamily() == CommonServices.Enums.DeviceType.Phone)
            {
                NavigationCacheMode = NavigationCacheMode.Disabled;
                this.Loaded += PostingCameraView_Loaded;
            }
        }

        private async void PostingCameraView_Loaded(object sender, RoutedEventArgs e)
        {
            postingAdViewModel.IsCameraInitializing = true;
            await SetupUiAsync();
            await InitializeCameraAsync();
            postingAdViewModel.IsCameraInitializing = false;
        }
        #endregion Constructor, lifecycle, and navigation

        #region Event handlers
        private async void SystemMediaControls_PropertyChanged(SystemMediaTransportControls sender, SystemMediaTransportControlsPropertyChangedEventArgs args)
        {
            await Dispatcher.RunAsync(CoreDispatcherPriority.Normal, async () =>
            {
                // Only handle this event if this page is currently being displayed
                if (args.Property == SystemMediaTransportControlsProperty.SoundLevel && Frame.CurrentSourcePageType == typeof(MainPage))
                {
                    if (sender.SoundLevel == SoundLevel.Muted)
                    {
                        await CleanupCameraAsync();
                    }
                    else if (!_isInitialized)
                    {
                        await InitializeCameraAsync();
                    }
                }
            });
        }

        private void OrientationSensor_OrientationChanged(SimpleOrientationSensor sender, SimpleOrientationSensorOrientationChangedEventArgs args)
        {
            if (args.Orientation != SimpleOrientation.Faceup && args.Orientation != SimpleOrientation.Facedown)
            {
                _deviceOrientation = args.Orientation;
            }
        }

        private async void DisplayInformation_OrientationChanged(DisplayInformation sender, object args)
        {
            _displayOrientation = sender.CurrentOrientation;
            if (_isPreviewing)
            {
                await SetPreviewRotationAsync();
            }
        }
        #endregion Event handlers

        #region MediaCapture methods
        private async Task InitializeCameraAsync()
        {
            Debug.WriteLine("InitializeCameraAsync");
            if (_mediaCapture == null)
            {
                // Attempt to get the back camera if one is available, but use any camera device if not
                var cameraDevice = await FindCameraDeviceByPanelAsync(Windows.Devices.Enumeration.Panel.Back);
                if (cameraDevice == null)
                {
                    Debug.WriteLine("No camera device found!");
                    return;
                }
                // Create MediaCapture and its settings
                _mediaCapture = new MediaCapture();
                // Register for a notification when video recording has reached the maximum time and when something goes wrong
                _mediaCapture.RecordLimitationExceeded += MediaCapture_RecordLimitationExceeded;
                _mediaCapture.Failed += MediaCapture_Failed;
                var settings = new MediaCaptureInitializationSettings { VideoDeviceId = cameraDevice.Id };
                // Initialize MediaCapture
                try
                {
                    await _mediaCapture.InitializeAsync(settings);
                    _isInitialized = true;
                }
                catch (UnauthorizedAccessException)
                {
                    Debug.WriteLine("Access to the camera is denied. You can change the permissions in mobile settings for camera", "Alert!");
                }
                // If initialization succeeded, start the preview
                if (_isInitialized)
                {
                    // Figure out where the camera is located
                    if (cameraDevice.EnclosureLocation == null || cameraDevice.EnclosureLocation.Panel == Windows.Devices.Enumeration.Panel.Unknown)
                    {
                        // No information on the location of the camera, assume it's an external camera, not integrated on the device
                        _externalCamera = true;
                    }
                    else
                    {
                        // Camera is fixed on the device
                        _externalCamera = false;
                        // Only mirror the preview if the camera is on the front panel
                        _mirroringPreview = (cameraDevice.EnclosureLocation.Panel == Windows.Devices.Enumeration.Panel.Front);
                    }
                    await StartPreviewAsync();
                    UpdateCaptureControls();
                }
            }
        }

        private async Task StartPreviewAsync()
        {
            // Prevent the device from sleeping while the preview is running
            _displayRequest.RequestActive();
            // Set the preview source in the UI and mirror it if necessary
            PreviewControl.Source = _mediaCapture;
            PreviewControl.FlowDirection = _mirroringPreview ? FlowDirection.RightToLeft : FlowDirection.LeftToRight;
            // Start the preview
            await _mediaCapture.StartPreviewAsync();
            _isPreviewing = true;
            // Initialize the preview to the current orientation
            if (_isPreviewing)
            {
                await SetPreviewRotationAsync();
            }
        }

        private async Task FocusCameraLens()
        {
            if (_mediaCapture != null)
            {
                if (_isPreviewing)
                {
                    // test if focus is supported
                    if (_mediaCapture.VideoDeviceController.FocusControl.Supported)
                    {
                        // get the focus control from the mediaCapture object
                        var focusControl = _mediaCapture.VideoDeviceController.FocusControl;
                        // try to get full range, but settle for the first supported one.
                        var focusRange = focusControl.SupportedFocusRanges.Contains(AutoFocusRange.FullRange)
                            ? AutoFocusRange.FullRange
                            : focusControl.SupportedFocusRanges.FirstOrDefault();
                        var focusMode = focusControl.SupportedFocusModes.Contains(FocusMode.Auto)
                            ? FocusMode.Auto
                            : focusControl.SupportedFocusModes.FirstOrDefault();
                        focusControl.Configure(
                            new FocusSettings
                            {
                                Mode = focusMode,
                                AutoFocusRange = focusRange
                            });
                        try
                        {
                            // finally wait for the camera to focus
                            await focusControl.FocusAsync();
                        }
                        catch (Exception _focusExp)
                        {
                            // Ignore
                        }
                    }
                }
            }
        }

        private async Task SetPreviewRotationAsync()
        {
            // Only need to update the orientation if the camera is mounted on the device
            if (_externalCamera) return;
            // Calculate which way and how far to rotate the preview
            int rotationDegrees = ConvertDisplayOrientationToDegrees(_displayOrientation);
            // The rotation direction needs to be inverted if the preview is being mirrored
            if (_mirroringPreview)
            {
                rotationDegrees = (360 - rotationDegrees) % 360;
            }
            // Add rotation metadata to the preview stream to make sure the aspect ratio / dimensions match when rendering and getting preview frames
            var props = _mediaCapture.VideoDeviceController.GetMediaStreamProperties(MediaStreamType.VideoPreview);
            int MFVideoRotation = ConvertVideoRotationToMFRotation(VideoRotation.Clockwise90Degrees);
            // props.Properties.Add(RotationKey, rotationDegrees);
            props.Properties.Add(RotationKey, PropertyValue.CreateInt32(MFVideoRotation));
            await _mediaCapture.SetEncodingPropertiesAsync(MediaStreamType.VideoPreview, props, null);
        }

        int ConvertVideoRotationToMFRotation(VideoRotation rotation)
        {
            int MFVideoRotation = 0; // MFVideoRotationFormat::MFVideoRotationFormat_0 in Mfapi.h
            switch (rotation)
            {
                case VideoRotation.Clockwise90Degrees:
                    MFVideoRotation = 90; // MFVideoRotationFormat::MFVideoRotationFormat_90;
                    break;
                case VideoRotation.Clockwise180Degrees:
                    MFVideoRotation = 180; // MFVideoRotationFormat::MFVideoRotationFormat_180;
                    break;
                case VideoRotation.Clockwise270Degrees:
                    MFVideoRotation = 270; // MFVideoRotationFormat::MFVideoRotationFormat_270;
                    break;
            }
            return MFVideoRotation;
        }
        #endregion MediaCapture methods

        #region Helper functions
        private async Task SetupUiAsync()
        {
            DisplayInformation.AutoRotationPreferences = DisplayOrientations.Portrait;
            if (ApiInformation.IsTypePresent("Windows.UI.ViewManagement.StatusBar"))
            {
                await Windows.UI.ViewManagement.StatusBar.GetForCurrentView().HideAsync();
            }
            _displayOrientation = _displayInformation.CurrentOrientation;
            if (_orientationSensor != null)
            {
                _deviceOrientation = _orientationSensor.GetCurrentOrientation();
            }
            RegisterEventHandlers();
            var picturesLibrary = await StorageLibrary.GetLibraryAsync(KnownLibraryId.Pictures);
            _captureFolder = picturesLibrary.SaveFolder ?? ApplicationData.Current.LocalFolder;
            this.UpdateLayout();
        }

        private void UpdateCaptureControls()
        {
            postingAdViewModel.IsCaptureButtonEnabled = _isPreviewing;
            if (_isInitialized && !_mediaCapture.MediaCaptureSettings.ConcurrentRecordAndPhotoSupported)
            {
                postingAdViewModel.IsCaptureButtonEnabled = !_isRecording;
                PhotoButton.Opacity = postingAdViewModel.IsCaptureButtonEnabled ? 1 : 0;
            }
        }

        private void RegisterEventHandlers()
        {
            if (ApiInformation.IsTypePresent("Windows.Phone.UI.Input.HardwareButtons"))
            {
                HardwareButtons.CameraPressed += HardwareButtons_CameraPressed;
            }
            if (_orientationSensor != null)
            {
                _orientationSensor.OrientationChanged += OrientationSensor_OrientationChanged;
            }
            _displayInformation.OrientationChanged += DisplayInformation_OrientationChanged;
            _systemMediaControls.PropertyChanged += SystemMediaControls_PropertyChanged;
        }

        private void UnregisterEventHandlers()
        {
            if (ApiInformation.IsTypePresent("Windows.Phone.UI.Input.HardwareButtons"))
            {
                HardwareButtons.CameraPressed -= HardwareButtons_CameraPressed;
            }
            if (_orientationSensor != null)
            {
                _orientationSensor.OrientationChanged -= OrientationSensor_OrientationChanged;
            }
            _displayInformation.OrientationChanged -= DisplayInformation_OrientationChanged;
            _systemMediaControls.PropertyChanged -= SystemMediaControls_PropertyChanged;
        }

        private static async Task<DeviceInformation> FindCameraDeviceByPanelAsync(Windows.Devices.Enumeration.Panel desiredPanel)
        {
            var allVideoDevices = await DeviceInformation.FindAllAsync(DeviceClass.VideoCapture);
            DeviceInformation desiredDevice = allVideoDevices.FirstOrDefault(x => x.EnclosureLocation != null && x.EnclosureLocation.Panel == desiredPanel);
            return desiredDevice ?? allVideoDevices.FirstOrDefault();
        }

        private static async Task ReencodeAndSavePhotoAsync(IRandomAccessStream stream, StorageFile file, PhotoOrientation photoOrientation)
        {
            using (var inputStream = stream)
            {
                var decoder = await BitmapDecoder.CreateAsync(inputStream);
                using (var outputStream = await file.OpenAsync(FileAccessMode.ReadWrite))
                {
                    var encoder = await BitmapEncoder.CreateForTranscodingAsync(outputStream, decoder);
                    var properties = new BitmapPropertySet { { "System.Photo.Orientation", new BitmapTypedValue(photoOrientation, PropertyType.UInt16) } };
                    await encoder.BitmapProperties.SetPropertiesAsync(properties);
                    await encoder.FlushAsync();
                }
            }
        }
        #endregion Helper functions

        #region Rotation helpers
        private SimpleOrientation GetCameraOrientation()
        {
            if (_externalCamera)
            {
                return SimpleOrientation.NotRotated;
            }
            var result = _deviceOrientation;
            if (_displayInformation.NativeOrientation == DisplayOrientations.Portrait)
            {
                switch (result)
                {
                    case SimpleOrientation.Rotated90DegreesCounterclockwise:
                        result = SimpleOrientation.NotRotated;
                        break;
                    case SimpleOrientation.Rotated180DegreesCounterclockwise:
                        result = SimpleOrientation.Rotated90DegreesCounterclockwise;
                        break;
                    case SimpleOrientation.Rotated270DegreesCounterclockwise:
                        result = SimpleOrientation.Rotated180DegreesCounterclockwise;
                        break;
                    case SimpleOrientation.NotRotated:
                        result = SimpleOrientation.Rotated270DegreesCounterclockwise;
                        break;
                }
            }
            if (_mirroringPreview)
            {
                switch (result)
                {
                    case SimpleOrientation.Rotated90DegreesCounterclockwise:
                        return SimpleOrientation.Rotated270DegreesCounterclockwise;
                    case SimpleOrientation.Rotated270DegreesCounterclockwise:
                        return SimpleOrientation.Rotated90DegreesCounterclockwise;
                }
            }
            return result;
        }

        private static int ConvertDeviceOrientationToDegrees(SimpleOrientation orientation)
        {
            switch (orientation)
            {
                case SimpleOrientation.Rotated90DegreesCounterclockwise:
                    return 90;
                case SimpleOrientation.Rotated180DegreesCounterclockwise:
                    return 180;
                case SimpleOrientation.Rotated270DegreesCounterclockwise:
                    return 270;
                case SimpleOrientation.NotRotated:
                default:
                    return 0;
            }
        }

        private static int ConvertDisplayOrientationToDegrees(DisplayOrientations orientation)
        {
            switch (orientation)
            {
                case DisplayOrientations.Portrait:
                    return 90;
                case DisplayOrientations.LandscapeFlipped:
                    return 180;
                case DisplayOrientations.PortraitFlipped:
                    return 270;
                case DisplayOrientations.Landscape:
                default:
                    return 0;
            }
        }

        private static PhotoOrientation ConvertOrientationToPhotoOrientation(SimpleOrientation orientation)
        {
            switch (orientation)
            {
                case SimpleOrientation.Rotated90DegreesCounterclockwise:
                    return PhotoOrientation.Rotate90;
                case SimpleOrientation.Rotated180DegreesCounterclockwise:
                    return PhotoOrientation.Rotate180;
                case SimpleOrientation.Rotated270DegreesCounterclockwise:
                    return PhotoOrientation.Rotate270;
                case SimpleOrientation.NotRotated:
                default:
                    return PhotoOrientation.Normal;
            }
        }
        #endregion Rotation helpers
    }
}
What is causing this issue?
Solved:
var boolean = ApplicationView.GetForCurrentView().TryEnterFullScreenMode();
It hides the soft navigation bar, the aspect ratio of the CaptureElement is maintained, and the camera works fine.
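A hedged sketch of where the call could go, e.g. at the top of SetupUiAsync (the placement is an assumption, not from the original post; requires using Windows.UI.ViewManagement;):
private async Task SetupUiAsync()
{
    // Entering full-screen mode hides the soft navigation bar, so the
    // CaptureElement can use the whole screen on the Lumia 950/950XL.
    ApplicationView.GetForCurrentView().TryEnterFullScreenMode();
    DisplayInformation.AutoRotationPreferences = DisplayOrientations.Portrait;
    // ... rest of the method as above
}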

MediaClip is not rendered with custom IBasicVideoEffect

I am trying to apply a saturation video effect implemented with IBasicVideoEffect, following the https://github.com/aarononeal/media-contrib example.
When I add the VideoEffectDefinition to the MediaClip, the preview source is set, but nothing is rendered (the video disappears).
private void AddSaturationEffectButton_Click(object sender, RoutedEventArgs e)
{
    var clip = _composition.Clips[0];
    clip.VideoEffectDefinitions.Add(new VideoEffectDefinition(typeof(SaturationVideoEffect).FullName));
    SetupMediaStreamSource();
}
SaturationVideoEffect is implemented in a separate Windows Runtime Component (Universal Windows) project.
using System;
using System.Collections.Generic;
using Windows.Foundation.Collections;
using Windows.Graphics.DirectX.Direct3D11;
using Windows.Media.Effects;
using Windows.Media.MediaProperties;
using Microsoft.Graphics.Canvas;
using Microsoft.Graphics.Canvas.Effects;

namespace VideoEffectsLibrary
{
    public sealed class SaturationVideoEffect : IBasicVideoEffect
    {
        private VideoEncodingProperties _currentEncodingProperties;
        private CanvasDevice _canvasDevice;
        private IPropertySet _configuration;

        private float Saturation
        {
            get
            {
                if (_configuration != null && _configuration.ContainsKey("Saturation"))
                    return (float)_configuration["Saturation"];
                else
                    return 0.5f;
            }
            set
            {
                _configuration["Saturation"] = value;
            }
        }

        public void ProcessFrame(ProcessVideoFrameContext context)
        {
            using (CanvasBitmap inputBitmap = CanvasBitmap.CreateFromDirect3D11Surface(_canvasDevice, context.InputFrame.Direct3DSurface))
            using (CanvasRenderTarget renderTarget = CanvasRenderTarget.CreateFromDirect3D11Surface(_canvasDevice, context.OutputFrame.Direct3DSurface))
            using (CanvasDrawingSession ds = renderTarget.CreateDrawingSession())
            {
                var saturation = new SaturationEffect()
                {
                    Source = inputBitmap,
                    Saturation = this.Saturation
                };
                ds.DrawImage(saturation);
            }
        }

        public void SetEncodingProperties(VideoEncodingProperties encodingProperties, IDirect3DDevice device)
        {
            _currentEncodingProperties = encodingProperties;
            _canvasDevice = CanvasDevice.CreateFromDirect3D11Device(device);
            CanvasDevice.DebugLevel = CanvasDebugLevel.Error;
        }

        public void SetProperties(IPropertySet configuration)
        {
            _configuration = configuration;
        }

        public bool IsReadOnly { get { return false; } }
        public MediaMemoryTypes SupportedMemoryTypes { get { return MediaMemoryTypes.Gpu; } }
        public bool TimeIndependent { get { return false; } }

        public IReadOnlyList<VideoEncodingProperties> SupportedEncodingProperties
        {
            get
            {
                return new List<VideoEncodingProperties>()
                {
                    // NOTE: Specifying width and height is only necessary due to bug in media pipeline when
                    // effect is being used with Media Capture.
                    // This can be changed to "0, 0" in a future release of FBL_IMPRESSIVE.
                    VideoEncodingProperties.CreateUncompressed(MediaEncodingSubtypes.Argb32, 800, 600)
                };
            }
        }

        public void Close(MediaEffectClosedReason reason)
        {
            // Clean up devices
            if (_canvasDevice != null)
                _canvasDevice.Dispose();
        }

        public void DiscardQueuedFrames()
        {
            // No cached frames to discard
        }
    }
}
How can I make this effect render properly? Thanks for any help.
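One thing that may be worth trying (an assumption, not a confirmed fix): the snapshot effect earlier on this page returns an empty list from SupportedEncodingProperties, which lets the pipeline keep the clip's own format, whereas hard-coding uncompressed 800x600 Argb32 forces a conversion that may not hold for every clip:
// Hedged experiment: advertise support for any encoding instead of
// forcing an uncompressed 800x600 Argb32 format.
public IReadOnlyList<VideoEncodingProperties> SupportedEncodingProperties
{
    get { return new List<VideoEncodingProperties>(); }
}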

How to use a thread in a playback function

I want to play back a sound file on two or three external sound cards at the same time, and I think using threads is the solution, but I don't know how to use them in the playback code.
This is the event handler for the Play button:
public partial class PlaybackForm : Form
{
    IWavePlayer waveOut;
    string fileName = null;
    WaveStream mainOutputStream;
    WaveChannel32 volumeStream;
    int _deviceNum;
    int _deviceNum1;
    Thread t1;
    Thread t2;

    public PlaybackForm(int deviceNum, int deviceNum1)
    {
        InitializeComponent();
        _deviceNum = deviceNum;
        _deviceNum1 = deviceNum1;
    }

    private void buttonPlay_Click(object sender, EventArgs e)
    {
        if (waveOut != null)
        {
            if (waveOut.PlaybackState == PlaybackState.Playing)
            {
                return;
            }
            else if (waveOut.PlaybackState == PlaybackState.Paused)
            {
                waveOut.Play();
                return;
            }
        }
        // we are in a stopped state
        // TODO: only re-initialise if necessary
        if (String.IsNullOrEmpty(fileName))
        {
            toolStripButtonOpenFile_Click(sender, e);
        }
        if (String.IsNullOrEmpty(fileName))
        {
            return;
        }
        try
        {
            CreateWaveOut();
        }
        catch (Exception driverCreateException)
        {
            MessageBox.Show(String.Format("{0}", driverCreateException.Message));
            return;
        }
        mainOutputStream = CreateInputStream(fileName);
        trackBarPosition.Maximum = (int)mainOutputStream.TotalTime.TotalSeconds;
        labelTotalTime.Text = String.Format("{0:00}:{1:00}", (int)mainOutputStream.TotalTime.TotalMinutes,
            mainOutputStream.TotalTime.Seconds);
        trackBarPosition.TickFrequency = trackBarPosition.Maximum / 30;
        try
        {
            waveOut.Init(mainOutputStream);
        }
        catch (Exception initException)
        {
            MessageBox.Show(String.Format("{0}", initException.Message), "Error Initializing Output");
            return;
        }
        // not doing Volume on IWavePlayer any more
        volumeStream.Volume = volumeSlider1.Volume;
        waveOut.Play();
    }
And this is how the waveOut is created:
private void CreateWaveOut()
{
    CloseWaveOut();
    int latency = (int)comboBoxLatency.SelectedItem;
    //if (radioButtonWaveOut.Checked)
    {
        //WaveCallbackInfo callbackInfo = checkBoxWaveOutWindow.Checked ?
        //    WaveCallbackInfo.NewWindow() : WaveCallbackInfo.FunctionCallback();
        WaveCallbackInfo callbackInfo = WaveCallbackInfo.FunctionCallback();
        WaveOut outputDevice = new WaveOut(callbackInfo);
        outputDevice.DesiredLatency = latency;
        outputDevice.DeviceNumber = _deviceNum;
        waveOut = outputDevice;
    }
}
I declared two device numbers, but so far I can play sound on only one device; that's why I want to use threads.
Can you help me please? Thank you in advance.
Do something like this:
using System.Threading;
...

private void buttonPlay_Click(object sender, EventArgs e)
{
    ThreadPool.QueueUserWorkItem(new WaitCallback(this.PlaySound), 1);
    ThreadPool.QueueUserWorkItem(new WaitCallback(this.PlaySound), 2);
}

private void PlaySound(object obj)
{
    int deviceNumber = (int)obj;
    // Do the stuff you used to do in buttonPlay_Click
    WaveOut myWaveOut = CreateWaveOut(deviceNumber);
    ...
}

private WaveOut CreateWaveOut(int deviceNumber)
{
    ...
    WaveOut outputDevice = new WaveOut(callbackInfo);
    outputDevice.DesiredLatency = latency;
    outputDevice.DeviceNumber = deviceNumber; // use the parameter, not the shared field
    return outputDevice;
}
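As an alternative to managing threads by hand, NAudio's WaveOutEvent drives playback from its own background thread, so two outputs can simply be started back to back. A minimal sketch, assuming each device gets its own reader (the helper name is illustrative):
using NAudio.Wave;

void PlayOnDevice(string fileName, int deviceNumber)
{
    // Each output needs its own stream; two outputs cannot share one reader.
    var reader = new AudioFileReader(fileName);
    var output = new WaveOutEvent { DeviceNumber = deviceNumber };
    output.Init(reader);
    output.PlaybackStopped += (s, e) => { output.Dispose(); reader.Dispose(); };
    output.Play(); // returns immediately; playback runs on a background thread
}

// Usage:
// PlayOnDevice(fileName, _deviceNum);
// PlayOnDevice(fileName, _deviceNum1);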
