UWP Kinect V2 keep frame rate constant (30fps) - c#

I am processing frames received from a Kinect v2 (Color and IR) in UWP. The program runs on a remote machine (Xbox One S). The main goal is to capture frames and write them to disk at 30 fps for both Color and IR, to process them further later.
I am using the following code to check the frame rate:
public MainPage()
{
    this.InitializeComponent();
    this.InitialiseFrameReader(); // initialises MediaCapture for IR and Color
}
const int COLOR_SOURCE = 0;
const int IR_SOURCE = 1;
private async void InitialiseFrameReader()
{
    await CleanupMediaCaptureAsync();
    var allGroups = await MediaFrameSourceGroup.FindAllAsync();
    if (allGroups.Count == 0)
    {
        return;
    }

    _groupSelectionIndex = (_groupSelectionIndex + 1) % allGroups.Count;
    var selectedGroup = allGroups[_groupSelectionIndex];
    var kinectGroup = selectedGroup;

    try
    {
        await InitializeMediaCaptureAsync(kinectGroup);
    }
    catch (Exception exception)
    {
        _logger.Log($"MediaCapture initialization error: {exception.Message}");
        await CleanupMediaCaptureAsync();
        return;
    }

    // Set up frame readers, register event handlers and start streaming.
    var startedKinds = new HashSet<MediaFrameSourceKind>();
    foreach (MediaFrameSource source in _mediaCapture.FrameSources.Values
        .Where(x => x.Info.SourceKind == MediaFrameSourceKind.Color
                 || x.Info.SourceKind == MediaFrameSourceKind.Infrared))
    {
        MediaFrameSourceKind kind = source.Info.SourceKind;
        MediaFrameSource frameSource = null;
        int frameIndex = COLOR_SOURCE;
        if (kind == MediaFrameSourceKind.Infrared)
        {
            frameIndex = IR_SOURCE;
        }

        // Ignore this source if we already have a source of this kind.
        if (startedKinds.Contains(kind))
        {
            continue;
        }

        MediaFrameSourceInfo frameInfo = kinectGroup.SourceInfos[frameIndex];
        if (_mediaCapture.FrameSources.TryGetValue(frameInfo.Id, out frameSource))
        {
            // Create a frame reader based on the source stream.
            MediaFrameReader frameReader = await _mediaCapture.CreateFrameReaderAsync(frameSource);
            frameReader.FrameArrived += FrameReader_FrameArrived;
            _sourceReaders.Add(frameReader);

            MediaFrameReaderStartStatus status = await frameReader.StartAsync();
            if (status == MediaFrameReaderStartStatus.Success)
            {
                startedKinds.Add(kind);
            }
        }
    }
}
private async Task InitializeMediaCaptureAsync(MediaFrameSourceGroup sourceGroup)
{
    if (_mediaCapture != null)
    {
        return;
    }

    // Initialize MediaCapture with the source group.
    _mediaCapture = new MediaCapture();
    var settings = new MediaCaptureInitializationSettings
    {
        SourceGroup = sourceGroup,
        SharingMode = MediaCaptureSharingMode.SharedReadOnly,
        StreamingCaptureMode = StreamingCaptureMode.Video,
        MemoryPreference = MediaCaptureMemoryPreference.Cpu
    };
    await _mediaCapture.InitializeAsync(settings);
}
private void FrameReader_FrameArrived(MediaFrameReader sender, MediaFrameArrivedEventArgs args)
{
    using (var frame = sender.TryAcquireLatestFrame())
    {
        if (frame != null)
        {
            //Settings.cameraframeQueue.Enqueue(null, frame.SourceKind.ToString(), frame.SystemRelativeTime.Value); // Add to queue to process frame
            Debug.WriteLine(frame.SourceKind.ToString() + " : " + frame.SystemRelativeTime.ToString());
        }
    }
}
I am debugging the application to check the frame rate, so I have removed all further processing.
I am not sure whether I am calculating the rate incorrectly or something else is wrong.
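For reference, the fps figures below are derived roughly like this: count the frames whose SystemRelativeTime falls within each one-second window (a minimal sketch; the helper fields are not in the code above, and one pair would be needed per source kind):
// Hedged sketch: derive fps by counting FrameArrived callbacks per
// one-second window of SystemRelativeTime.
private TimeSpan _windowStart = TimeSpan.Zero;
private int _framesInWindow = 0;

private void CountFrame(MediaFrameReference frame)
{
    TimeSpan timestamp = frame.SystemRelativeTime ?? TimeSpan.Zero;
    if (timestamp - _windowStart >= TimeSpan.FromSeconds(1))
    {
        Debug.WriteLine($"{frame.SourceKind}: {_framesInWindow} fps");
        _windowStart = timestamp;
        _framesInWindow = 0;
    }
    _framesInWindow++;
}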
For example, SystemRelativeTime from 04:37:06 to 04:37:48 gives:
IR - fps (occurrences): 31 (1), 30 (36), 29 (18), 28 (4)
Color - fps (occurrences): 30 (38), 29 (18), 28 (3)
I want the frame rate to be constant (30 fps), with IR and Color aligned so that both produce the same number of frames over the same period.
The numbers above do not include any additional code. As soon as I add a processing queue or any other work, the fps drops and ranges from 15 to 30.
Can anyone please help me with this?
Thank you.
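For the disk-writing goal, one common pattern is to keep FrameArrived as cheap as possible and drain frames on a background task; a minimal sketch (the queue field and the processing loop are assumptions, not part of the code above):
// Hedged sketch: hand frames off to a background loop so the
// FrameArrived handler returns immediately.
private readonly ConcurrentQueue<MediaFrameReference> _frameQueue =
    new ConcurrentQueue<MediaFrameReference>();

private void FrameReader_FrameArrived(MediaFrameReader sender, MediaFrameArrivedEventArgs args)
{
    MediaFrameReference frame = sender.TryAcquireLatestFrame();
    if (frame != null)
    {
        _frameQueue.Enqueue(frame); // disposed by the consumer, not here
    }
}

private async Task DrainQueueAsync(CancellationToken token)
{
    while (!token.IsCancellationRequested)
    {
        if (_frameQueue.TryDequeue(out MediaFrameReference frame))
        {
            using (frame)
            {
                // Write frame.VideoMediaFrame?.SoftwareBitmap to disk here.
            }
        }
        else
        {
            await Task.Delay(1, token); // idle briefly when no frame is queued
        }
    }
}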
UPDATE:
After some testing, I have noticed that the PC produces 30 fps, but the Xbox One (the remote device) produces a very low frame rate in debug mode. This does improve when running in release mode, but the memory allocated to UWP apps on Xbox is quite limited: the Xbox One makes at most 1 GB available to apps and 5 GB to games.
https://learn.microsoft.com/en-us/windows/uwp/xbox-apps/system-resource-allocation
On the PC the fps stays at 30, as memory there has no such restriction. This restriction is what causes the frame rate to drop on the Xbox; the fps did improve when running in release mode or when published to the MS Store.


Mixed Reality WebRTC - Screen capturing with GraphicsCapturePicker

Setup
Hey,
I'm trying to capture my screen and send the stream via MR-WebRTC. Communication between two PCs, or between a PC and a HoloLens, worked with webcams for me, so I thought the next step could be streaming my screen. So I took the UWP application I already had working with my webcam and tried to adapt it:
The UWP app is based on the example UWP app from MR-WebRTC.
For capturing, I'm using the instructions from MS about screen capturing via GraphicsCapturePicker.
So now I'm stuck in the following situation:
I get a frame from the screen capture, but its type is Direct3D11CaptureFrame. You can see it in the code snippet below.
MR-WebRTC takes a frame of type I420AVideoFrame (also in a code snippet below).
How can I "connect" them?
I420AVideoFrame wants a frame in the I420A format (YUV 4:2:0).
When configuring the frame pool I can set the DirectXPixelFormat, but it has no YUV 4:2:0 option.
I found this post on SO saying that it is possible.
Code snippet, frame from Direct3D:
_framePool = Direct3D11CaptureFramePool.Create(
    _canvasDevice,                             // D3D device
    DirectXPixelFormat.B8G8R8A8UIntNormalized, // Pixel format
    3,                                         // Number of frames
    _item.Size);                               // Size of the buffers

_session = _framePool.CreateCaptureSession(_item);
_session.StartCapture();

_framePool.FrameArrived += (s, a) =>
{
    using (var frame = _framePool.TryGetNextFrame())
    {
        // Here I would take the frame and call the MR-WebRTC method LocalI420AFrameReady
    }
};
Code snippet, frame from WebRTC:
// This is the way with the webcam: LocalI420AFrameReady was subscribed to
// the event I420AVideoFrameReady and got the frame from there.
_webcamSource = await DeviceVideoTrackSource.CreateAsync();
_webcamSource.I420AVideoFrameReady += LocalI420AFrameReady;

// Enqueue the newly captured video frames into the bridge,
// which will later deliver them when the Media Foundation
// playback pipeline requests them.
private void LocalI420AFrameReady(I420AVideoFrame frame)
{
    lock (_localVideoLock)
    {
        if (!_localVideoPlaying)
        {
            _localVideoPlaying = true;

            // Capture the resolution into local variables usable from the lambda below.
            uint width = frame.width;
            uint height = frame.height;

            // Defer UI-related work to the main UI thread.
            RunOnMainThread(() =>
            {
                // Bridge the local video track with the local media player UI.
                int framerate = 30; // assumed, for lack of an actual value
                _localVideoSource = CreateI420VideoStreamSource(width, height, framerate);
                var localVideoPlayer = new MediaPlayer();
                localVideoPlayer.Source = MediaSource.CreateFromMediaStreamSource(_localVideoSource);
                localVideoPlayerElement.SetMediaPlayer(localVideoPlayer);
                localVideoPlayer.Play();
            });
        }
    }

    // Enqueue the incoming frame into the video bridge; the media player will
    // dequeue it as soon as it's ready.
    _localVideoBridge.HandleIncomingVideoFrame(frame);
}
I found a solution for my problem by creating an issue on the GitHub repo. The answer was provided by KarthikRichie:
You have to use the ExternalVideoTrackSource.
You can convert from the Direct3D11CaptureFrame to an Argb32VideoFrame.
// Setting up the external video track source.
_screenshareSource = ExternalVideoTrackSource.CreateFromArgb32Callback(FrameCallback);

struct WebRTCFrameData
{
    public IntPtr Data;
    public uint Height;
    public uint Width;
    public int Stride;
}

public void FrameCallback(in FrameRequest frameRequest)
{
    try
    {
        if (FramePool != null)
        {
            using (Direct3D11CaptureFrame _currentFrame = FramePool.TryGetNextFrame())
            {
                if (_currentFrame != null)
                {
                    WebRTCFrameData webRTCFrameData = ProcessBitmap(_currentFrame.Surface).Result;
                    frameRequest.CompleteRequest(new Argb32VideoFrame()
                    {
                        data = webRTCFrameData.Data,
                        height = webRTCFrameData.Height,
                        width = webRTCFrameData.Width,
                        stride = webRTCFrameData.Stride
                    });
                }
            }
        }
    }
    catch (Exception ex)
    {
    }
}

private async Task<WebRTCFrameData> ProcessBitmap(IDirect3DSurface surface)
{
    SoftwareBitmap softwareBitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(surface, Windows.Graphics.Imaging.BitmapAlphaMode.Straight);

    byte[] imageBytes = new byte[4 * softwareBitmap.PixelWidth * softwareBitmap.PixelHeight];
    softwareBitmap.CopyToBuffer(imageBytes.AsBuffer());

    WebRTCFrameData argb32VideoFrame = new WebRTCFrameData();
    argb32VideoFrame.Data = GetByteIntPtr(imageBytes);
    argb32VideoFrame.Height = (uint)softwareBitmap.PixelHeight;
    argb32VideoFrame.Width = (uint)softwareBitmap.PixelWidth;

    var buffer = softwareBitmap.LockBuffer(BitmapBufferAccessMode.Read);
    int planeCount = buffer.GetPlaneCount();
    var planeDescription = buffer.GetPlaneDescription(planeCount - 1);
    argb32VideoFrame.Stride = planeDescription.Stride;

    return argb32VideoFrame;
}

private IntPtr GetByteIntPtr(byte[] byteArr)
{
    IntPtr intPtr = System.Runtime.InteropServices.Marshal.UnsafeAddrOfPinnedArrayElement(byteArr, 0);
    return intPtr;
}
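One caveat with the snippet above (not from the original answer): Marshal.UnsafeAddrOfPinnedArrayElement does not actually pin the array, so the GC is free to move imageBytes while native code still holds the pointer. A safer variant pins the buffer explicitly with GCHandle; a sketch, with the field name being my own addition:
// Hedged sketch: pin the managed buffer so its address stays valid while
// WebRTC reads it; the handle must be freed once the frame has been consumed.
private GCHandle _pinnedFrame; // assumed field, freed after CompleteRequest

private IntPtr PinBytes(byte[] byteArr)
{
    if (_pinnedFrame.IsAllocated)
    {
        _pinnedFrame.Free(); // release the previous frame's buffer
    }
    _pinnedFrame = GCHandle.Alloc(byteArr, GCHandleType.Pinned);
    return _pinnedFrame.AddrOfPinnedObject();
}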

How can a custom audio effect produce some sound after an AudioFileInputNode has finished playing

I'm developing an audio application in C# and UWP using the AudioGraph API.
My AudioGraph setup is the following:
AudioFileInputNode --> AudioSubmixNode --> AudioDeviceOutputNode.
I attached a custom echo effect to the AudioSubmixNode.
If I play the AudioFileInputNode, I can hear some echo.
But when the AudioFileInputNode playback finishes, the echo stops abruptly. I would like it to fade out gradually over a few seconds instead.
If I use the EchoEffectDefinition from the AudioGraph API, the echo is not cut off when the sample playback finishes.
I don't know whether the problem comes from my effect implementation or whether it's a strange behavior of the AudioGraph API... The behavior is the same in scenario 6 of the "AudioCreation" sample in the SDK.
Here is my custom effect implementation:
public sealed class AudioEchoEffect : IBasicAudioEffect
{
    public AudioEchoEffect()
    {
    }

    private readonly AudioEncodingProperties[] _supportedEncodingProperties = new AudioEncodingProperties[]
    {
        AudioEncodingProperties.CreatePcm(44100, 1, 32),
        AudioEncodingProperties.CreatePcm(48000, 1, 32),
    };

    private AudioEncodingProperties _currentEncodingProperties;
    private IPropertySet _propertySet;
    private readonly Queue<float> _echoBuffer = new Queue<float>(100000);
    private int _delaySamplesCount;

    private float Delay
    {
        get
        {
            if (_propertySet != null && _propertySet.TryGetValue("Delay", out object val))
            {
                return (float)val;
            }
            return 500.0f;
        }
    }

    private float Feedback
    {
        get
        {
            if (_propertySet != null && _propertySet.TryGetValue("Feedback", out object val))
            {
                return (float)val;
            }
            return 0.5f;
        }
    }

    private float Mix
    {
        get
        {
            if (_propertySet != null && _propertySet.TryGetValue("Mix", out object val))
            {
                return (float)val;
            }
            return 0.5f;
        }
    }

    public bool UseInputFrameForOutput { get { return true; } }

    public IReadOnlyList<AudioEncodingProperties> SupportedEncodingProperties { get { return _supportedEncodingProperties; } }

    public void SetProperties(IPropertySet configuration)
    {
        _propertySet = configuration;
    }

    public void SetEncodingProperties(AudioEncodingProperties encodingProperties)
    {
        _currentEncodingProperties = encodingProperties;

        // Compute the number of samples for the delay.
        _delaySamplesCount = (int)MathF.Round((this.Delay / 1000.0f) * encodingProperties.SampleRate);

        // Fill empty samples in the buffer according to the delay.
        for (int i = 0; i < _delaySamplesCount; i++)
        {
            _echoBuffer.Enqueue(0.0f);
        }
    }

    unsafe public void ProcessFrame(ProcessAudioFrameContext context)
    {
        AudioFrame frame = context.InputFrame;
        using (AudioBuffer buffer = frame.LockBuffer(AudioBufferAccessMode.ReadWrite))
        using (IMemoryBufferReference reference = buffer.CreateReference())
        {
            ((IMemoryBufferByteAccess)reference).GetBuffer(out byte* dataInBytes, out uint capacity);
            float* dataInFloat = (float*)dataInBytes;
            int dataInFloatLength = (int)buffer.Length / sizeof(float);

            // Read parameters once.
            float currentWet = this.Mix;
            float currentDry = 1.0f - currentWet;
            float currentFeedback = this.Feedback;

            // Process audio data.
            float sample, echoSample, outSample;
            for (int i = 0; i < dataInFloatLength; i++)
            {
                // Read values.
                sample = dataInFloat[i];
                echoSample = _echoBuffer.Dequeue();

                // Compute the output sample.
                outSample = (currentDry * sample) + (currentWet * echoSample);
                dataInFloat[i] = outSample;

                // Compute the delay sample.
                echoSample = sample + (currentFeedback * echoSample);
                _echoBuffer.Enqueue(echoSample);
            }
        }
    }

    public void Close(MediaEffectClosedReason reason)
    {
    }

    public void DiscardQueuedFrames()
    {
        // Reset the delay buffer.
        _echoBuffer.Clear();
        for (int i = 0; i < _delaySamplesCount; i++)
        {
            _echoBuffer.Enqueue(0.0f);
        }
    }
}
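For completeness, a custom effect like this is typically attached to the submix node through an AudioEffectDefinition carrying the parameters in a PropertySet; a minimal sketch (the submixNode variable and the property values are assumptions):
// Hedged sketch: attach the custom effect to the AudioSubmixNode with its
// parameters in a PropertySet (names and values here are assumptions).
var echoProperties = new PropertySet
{
    { "Delay", 500.0f },
    { "Feedback", 0.5f },
    { "Mix", 0.5f }
};
var echoDefinition = new AudioEffectDefinition(typeof(AudioEchoEffect).FullName, echoProperties);
submixNode.EffectDefinitions.Add(echoDefinition);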
EDIT:
I changed my audio effect to mix the input samples with a sine wave. The ProcessFrame method runs continuously before and after the sample playback (while the effect is active), so the sine wave should be heard before and after the sample playback. But the AudioGraph API seems to ignore the effect output when there is no active playback... (A screen capture of the audio output was attached here.)
So my question is: how can the built-in EchoEffectDefinition output sound after playback has finished? Access to the EchoEffectDefinition source code would be a great help...
If you loop the file input node infinitely, it will always provide an input frame until the audio graph stops. Of course we do not want to hear the file loop, so we can listen for the FileCompleted event of AudioFileInputNode. When the file finishes playing, the event fires and we just set the OutgoingGain of the AudioFileInputNode to zero. The file thus plays back once, but then keeps looping silently, passing input frames with no audio content to which the echo can still be added.
Still using scenario 4 in the AudioCreation sample as an example: in scenario 4 there is a property named fileInputNode1. As mentioned above, please add the following code for fileInputNode1 and test again with your custom echo effect.
fileInputNode1.LoopCount = null; // null makes it loop infinitely
fileInputNode1.FileCompleted += FileInputNode1_FileCompleted;

private void FileInputNode1_FileCompleted(AudioFileInputNode sender, object args)
{
    fileInputNode1.OutgoingGain = 0.0;
}

Take several photos with PhotoCapture Unity C#

I have a script that should be able to take more than one photo. I am using PhotoCapture and get an error that makes it unable to capture a second photo. I get the error "Value cannot be null" on the photoCaptureObject.StartPhotoModeAsync(cameraParameters, result => ...) line, but I don't understand why.
I have commented out the photoCaptureObject = null; line so that photoCaptureObject should not be null. The line if (photoCaptureObject == null) return; also proves that photoCaptureObject is not null.
PhotoCapture photoCaptureObject = null;
Texture2D targetTexture = null;
public string path = "";
CameraParameters cameraParameters = new CameraParameters();

private void Awake()
{
    var cameraResolution = PhotoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();
    targetTexture = new Texture2D(cameraResolution.width, cameraResolution.height);

    // Create a PhotoCapture object.
    PhotoCapture.CreateAsync(false, captureObject =>
    {
        photoCaptureObject = captureObject;
        cameraParameters.hologramOpacity = 0.0f;
        cameraParameters.cameraResolutionWidth = cameraResolution.width;
        cameraParameters.cameraResolutionHeight = cameraResolution.height;
        cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;
    });
}
private void Update()
{
    // If not initialized yet, don't take input.
    if (photoCaptureObject == null) return;

    if (Input.GetKeyDown(KeyCode.K) || Input.GetKeyDown("k"))
    {
        Debug.Log("k was pressed");
        VuforiaBehaviour.Instance.gameObject.SetActive(false);

        // Activate the camera.
        photoCaptureObject.StartPhotoModeAsync(cameraParameters, result =>
        {
            if (result.success)
            {
                // Take a picture.
                photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
            }
            else
            {
                Debug.LogError("Couldn't start photo mode!", this);
            }
        });
    }
}
There is some code in between that changes the photo taken and so on, but I don't think that code is part of the problem.
private void OnStoppedPhotoMode(PhotoCapture.PhotoCaptureResult result)
{
    // Shut down the photo capture resource.
    VuforiaBehaviour.Instance.gameObject.SetActive(true);
    photoCaptureObject.Dispose();
    //photoCaptureObject = null;
    Debug.Log("Photomode stopped");
}
So what else could be null? Is it the StartPhotoModeAsync somehow? How can I fix this issue?
Thanks!
Okay, I understand now thanks to Henrik's comment.
Unity specifically says this about StartPhotoModeAsync:
"Only one PhotoCapture instance can start the photo mode at any given time."
I had focused more on the sentence after that, which says one should always use PhotoCapture.StopPhotoModeAsync because having photo mode on takes more power, so it never occurred to me that the instance wouldn't start again after being stopped and disposed.
Now I only call TakePhotoAsync in Update on key press and do not stop the photo mode, because the app I am making should always be able to capture photos.
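Concretely, the fix looks roughly like this (a minimal sketch; the _photoModeStarted flag is my own addition): start photo mode once, right after the capture object is created, and then only call TakePhotoAsync on each key press.
// Hedged sketch: start photo mode once and keep it running.
// _photoModeStarted is an assumed helper flag, not part of the original script.
private bool _photoModeStarted = false;

private void Awake()
{
    PhotoCapture.CreateAsync(false, captureObject =>
    {
        photoCaptureObject = captureObject;
        // ... configure cameraParameters as before ...
        photoCaptureObject.StartPhotoModeAsync(cameraParameters, result =>
        {
            _photoModeStarted = result.success;
        });
    });
}

private void Update()
{
    if (!_photoModeStarted) return;
    if (Input.GetKeyDown(KeyCode.K))
    {
        photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
    }
}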

Video feed flickering/flying in C# using EmguCV

I am creating a program that can dump individual frames from a video feed for a drone competition. I am having a problem whereby the wireless video stream coming from the drone is flickering and flying all over the place.
I am using this code to capture the video stream:
Capture _capture;
Emgu.CV.Image<Emgu.CV.Structure.Bgr, byte> frame;

void StartCamera()
{
    _capture = null;
    _capture = new Capture((int)nudCamera.Value);
    _capture.SetCaptureProperty(Emgu.CV.CvEnum.CAP_PROP.CV_CAP_PROP_FPS, FrameRate);
    _capture.SetCaptureProperty(Emgu.CV.CvEnum.CAP_PROP.CV_CAP_PROP_FRAME_HEIGHT, FrameHeight);
    _capture.SetCaptureProperty(Emgu.CV.CvEnum.CAP_PROP.CV_CAP_PROP_FRAME_WIDTH, FrameWidth);
    webcam_frm_cnt = 0;
    cam = 1;
    Video_seek = 0;
    System.Windows.Forms.Application.Idle += ProcessFrame;
}
private void ProcessFrame(object sender, EventArgs arg)
{
    try
    {
        Framesno = _capture.GetCaptureProperty(Emgu.CV.CvEnum.CAP_PROP.CV_CAP_PROP_POS_FRAMES);
        frame = _capture.QueryFrame();
        if (frame != null)
        {
            pictureBox1.Image = frame.ToBitmap();
            if (cam == 0)
            {
                Video_seek = (int)(Framesno);
                double time_index = _capture.GetCaptureProperty(Emgu.CV.CvEnum.CAP_PROP.CV_CAP_PROP_POS_MSEC);
                //Time_Label.Text = "Time: " + TimeSpan.FromMilliseconds(time_index).ToString().Substring(0, 8);
                double framenumber = _capture.GetCaptureProperty(Emgu.CV.CvEnum.CAP_PROP.CV_CAP_PROP_POS_FRAMES);
                //Frame_lbl.Text = "Frame: " + framenumber.ToString();
                Thread.Sleep((int)(1000.0 / FrameRate));
            }
            if (cam == 1)
            {
                //Frame_lbl.Text = "Frame: " + (webcam_frm_cnt++).ToString();
            }
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message.ToString());
    }
}
Is there a setting somewhere that I am missing?
This video stream flickering seems to happen in other programs too; however, it fixes itself when you fiddle with the video settings (NTSC/PAL).
Edit: So I need to be able to put the video stream into NTSC/M mode. Is this possible with EmguCV? If so, how do I do it?
Edit 2: All documents that I have read point towards it being completely and utterly impossible to change the video type, and there is no documentation on this topic. I would love to be proved wrong :)
Thanks in advance.
This sourceforge link tells you how to set the video mode.
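For what it's worth, OpenCV exposes a backend-specific "mode" capture property that some capture drivers use to select the video standard; whether it accepts an NTSC/M mode depends entirely on the device driver, so the following is only a sketch to experiment with, not a confirmed fix:
// Hedged sketch: CV_CAP_PROP_MODE is backend-specific and ignored by many
// drivers; the value to pass (0 here) is device-dependent, not a known constant.
double currentMode = _capture.GetCaptureProperty(Emgu.CV.CvEnum.CAP_PROP.CV_CAP_PROP_MODE);
_capture.SetCaptureProperty(Emgu.CV.CvEnum.CAP_PROP.CV_CAP_PROP_MODE, 0);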

mediaEnded event doesn't always fire

I am using a design that plays wav files one after another. The design is based on the MediaEnded event and works fine. When applying this design to very short wav files, however, the MediaEnded event isn't always raised.
Is there a solution to this problem?
Thanks.
I implemented a workaround using a timer to periodically check for two situations:
Video playback position has not moved since the last check - indicates the video is stuck.
Video playback position is beyond the end of the video - indicates the video is "playing" but usually just a black screen is displayed.
If either condition exists, force the next video to start playing.
Note that this is an extract of code from a class file.
namespace XXXX
{
    public partial class VideoWindow : Window
    {
        private DispatcherTimer mediaPositionTimer; // We check the play position of the video each time this fires to see if it has hung
        double _lastPosition = -1;
        int _video_count;
        int videoDuration;
        DateTime lastVideoStartTime;

        public VideoWindow()
        {
            adMediaElement.LoadedBehavior = MediaState.Manual;
            adMediaElement.MediaEnded += adMediaElement_MediaEnded;
            adMediaElement.MediaOpened += adMediaElement_MediaOpened;
            adMediaElement.MediaFailed += adMediaElement_MediaFailed;
        }
        // Increment the counter every time we open a video.
        void adMediaElement_MediaOpened(object sender, RoutedEventArgs e)
        {
            Log("Media opened");
            _video_count++;
        }

        // If we have a failure then just start the next video.
        void adMediaElement_MediaFailed(object sender, ExceptionRoutedEventArgs e)
        {
            Log("Media failed");
            this.Dispatcher.BeginInvoke(System.Windows.Threading.DispatcherPriority.Normal, new Action(delegate ()
            {
                PlayNextMedia();
            }));
        }

        // When a video ends, start the next one.
        private void adMediaElement_MediaEnded(object sender, RoutedEventArgs e)
        {
            Log("MediaEnded called");
            this.Dispatcher.BeginInvoke(System.Windows.Threading.DispatcherPriority.Normal, new Action(delegate ()
            {
                PlayNextMedia();
            }));
        }
        // Stops and closes any existing video.
        // Increments the file pointer index.
        // Switches to display ads if there are no more videos,
        // else starts the next video playing.
        private void PlayNextMedia()
        {
            // Log(String.Format("PlayNextMedia called"));
            // Close the existing file and stop the timer.
            EndVideo2();
            adMediaElement.Dispatcher.Invoke(System.Windows.Threading.DispatcherPriority.Normal, new Action(delegate ()
            {
                adMediaElement.Stop();
                adMediaElement.Close();
            }));

            currentMediaFileIndex++;
            if (currentMediaFileIndex > (mediaFileCount - 1))
            {
                Log(String.Format(" switching to Ads, currentMediaFileIndex = {0}", currentMediaFileIndex));
                // Now set up and then run static ads for 10 minutes.
                currentMediaFileIndex = 0;
                StartAds();
            }
            else
            {
                Log(String.Format(" switching media, index = {0}", currentMediaFileIndex));
                adMediaElement.Dispatcher.BeginInvoke(System.Windows.Threading.DispatcherPriority.Normal, new Action(delegate ()
                {
                    StartVideo2();
                }));
            }
        }
        // Stops the mediaPositionTimer; must be called in conjunction with an adMediaElement.Pause() or Stop().
        private void EndVideo2()
        {
            // Log("EndVideo2() called");
            _is_playing = false;

            // Stop the timer.
            if (mediaPositionTimer != null)
            {
                mediaPositionTimer.Stop();
            }
        }

        // Load the media file.
        // Set the volume.
        // Set the lastVideoStartTime variable.
        private void StartVideo2()
        {
            // Log("StartVideo2() called");
            loadMediaFile(currentMediaFileIndex);
            adMediaElement.Volume = Properties.StationSettings.Default.VolumeMedia;
            lastVideoStartTime = DateTime.Now; // Record the time we started.

            // Stop the timer if it exists, otherwise create a new one.
            if (mediaPositionTimer == null)
            {
                mediaPositionTimer = new DispatcherTimer();
                // Set up the timer.
                mediaPositionTimer.Interval = TimeSpan.FromSeconds(10);
                mediaPositionTimer.Tick += new EventHandler(positionTimerTick);
            }
            else
            {
                mediaPositionTimer.Stop();
            }

            // Start it running.
            adMediaElement.Dispatcher.BeginInvoke(System.Windows.Threading.DispatcherPriority.Normal, new Action(delegate ()
            {
                mediaPositionTimer.Start();
                _is_playing = true;
                adMediaElement.Play();
            }));
        }
        private void RestartVideo2()
        {
            // Log("Restart the video");
            adMediaElement.Dispatcher.BeginInvoke(System.Windows.Threading.DispatcherPriority.Normal, new Action(delegate ()
            {
                mediaPositionTimer.Start();
                _is_playing = true;
                adMediaElement.Play();
            }));
        }
        // Check that the playback position of the mediaElement has moved since the last check;
        // if not, then the video has hung.
        // Also check whether the playback position exceeds the video duration;
        // if so, it has also hung.
        // If the video has hung, force the next one to start and report a warning.
        private void positionTimerTick(object sender, EventArgs e)
        {
            double duration;
            double currentPosition = adMediaElement.Position.TotalSeconds;
            if (adMediaElement.NaturalDuration.HasTimeSpan)
            {
                duration = adMediaElement.NaturalDuration.TimeSpan.TotalSeconds;
            }
            else
            {
                duration = 5 * 60; // Default to 5 minutes if the video does not report a duration.
            }

            if ((currentPosition == _lastPosition) || currentPosition > (duration + 30))
            {
                //Log("*** Video position has exceeded the end of the media or video playback position has not moved ***");
                if (_is_playing)
                {
                    String logString = String.Format("*** Video {0} has frozen ({1}c:{2}l:{3}d) so forcing the next one to start ***", mediaFiles[currentMediaFileIndex], currentPosition, _lastPosition, duration);
                    Log(logString);
                    PlayNextMedia();
                    // Send a message indicating we had to do this.
                    mainWindow.SendHeartbeat(MainWindow.STATUS_WARNING, logString);
                }
            }
            _lastPosition = currentPosition;
        }
    }
}
