I have a MediaStreamSource implementation that can open files with multiple audio streams. In the OpenMediaAsync method I deliver a MediaStreamDescription for every video and audio stream, but MediaElement detects only one audio stream. I have also tested the following logic, which works:
detect 2 audio streams
report only the first or the second audio MediaStreamDescription to ReportOpenMediaCompleted
But of course I want to report both the first and the second audio stream to ReportOpenMediaCompleted, so that the resulting MediaElement has 2 audio streams. I have also discovered the StreamId field inside the MediaStreamSource class, but it doesn't have a set accessor, and when reporting streams with ReportOpenMediaCompleted every MediaStreamDescription has StreamId == 0.
OpenMediaAsync code:
protected override void OpenMediaAsync()
{
    this.streamDesc = new Dictionary<int, MediaStreamDescription>();
    List<MediaStreamDescription> availableStreams = new List<MediaStreamDescription>();
    for (int i = 0; i < this.parser.StreamCount; i++)
    {
        Dictionary<MediaStreamAttributeKeys, string> streamAttributes = new Dictionary<MediaStreamAttributeKeys, string>();
        MediaStreamDescription msd = null;
        var type = this.parser.GetStreamType(i);
        streamAttributes[MediaStreamAttributeKeys.CodecPrivateData] = this.parser.GetCodecPrivateData(i);
        if (type == ParserComponent.StreamType.Video)
        {
            streamAttributes[MediaStreamAttributeKeys.VideoFourCC] = this.parser.GetCodecID(i);
            streamAttributes[MediaStreamAttributeKeys.Width] = this.parser.GetWidth(i).ToString();
            streamAttributes[MediaStreamAttributeKeys.Height] = this.parser.GetHeight(i).ToString();
            msd = new MediaStreamDescription(MediaStreamType.Video, streamAttributes);
        }
        else if (type == ParserComponent.StreamType.Audio)
        {
            msd = new MediaStreamDescription(MediaStreamType.Audio, streamAttributes);
        }
        if (msd != null)
        {
            if (i == this.parser.CurrentAudioStreamIndex || i == this.parser.CurrentVideoStreamIndex)
            {
                this.parser.SetStreamActive(i, true);
                // quick fix for multilanguage videos to submit only 1 audio stream
                // availableStreams.Add(msd);
            }
            this.streamDesc.Add(i, msd);
            availableStreams.Add(msd);
        }
    }
    Dictionary<MediaSourceAttributesKeys, string> sourceAttributes = new Dictionary<MediaSourceAttributesKeys, string>();
    sourceAttributes[MediaSourceAttributesKeys.CanSeek] = this.parser.Seekable.ToString();
    sourceAttributes[MediaSourceAttributesKeys.Duration] = this.parser.Duration.Ticks.ToString();
    ReportOpenMediaCompleted(sourceAttributes, availableStreams);
}
Maybe it's a good idea to use the Player Framework; it supports multiple audio streams.
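For completeness: Silverlight's plain MediaElement does expose AudioStreamCount and AudioStreamIndex, but in practice they only switch tracks when the pipeline itself surfaces several audio streams (for example a multi-language ASF/WMV source); with a custom MediaStreamSource, MediaElement is limited to the single audio stream it detected. A minimal sketch (mediaElement is assumed to be a MediaElement already playing such a source):

if (mediaElement.AudioStreamCount > 1)
{
    // AudioStreamIndex is a nullable int; select the second audio track
    mediaElement.AudioStreamIndex = 1;
}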
Related
I am working on an application in C# using XAudio2 from SharpDX. I know it's outdated, but unlike CSCore it offers the better reverb. The aim itself is simple: load a WAV file (48 kHz, 24 bit, 1 channel), run it through the XAudio2 reverb with several reverb presets, and play it back.
I managed to implement everything for normal playback and also built in the reverb. The audio file plays back with the default reverb settings, but as soon as I change the preset or a specific value (RoomFilterFreq), the file isn't audible anymore, and I don't know why this happens. Does anyone have a clue? I worked through the few examples that are still online but could not find a reason for this behaviour.
I'm coding in Visual Studio 2019 with .NET 4.7.2. The code looks as follows:
xaudio2 = new XAudio2();
xaudio2.StartEngine();
var masteringVoice = new MasteringVoice(xaudio2, 1, 48000);
var stream = new SoundStream(File.OpenRead(soundFilepath));
SharpDX.Multimedia.WaveFormat waveFormat = stream.Format;
var buffer = new AudioBuffer
{
    Stream = stream.ToDataStream(),
    AudioBytes = (int)stream.Length,
    Flags = BufferFlags.EndOfStream
};
stream.Close();
sourceVoice = new SourceVoice(xaudio2, waveFormat, true);
// Set Loop
if (f1.loopcheckBox.Checked == true)
{
    buffer.LoopBegin = buffer.PlayBegin;
    buffer.LoopLength = buffer.PlayLength;
    buffer.LoopCount = AudioBuffer.LoopInfinite;
}
// Set Reverb
if (f1.reverbenableButton.BackColor == Color.LightGreen)
{
    var reverb = new Reverb(xaudio2);
    var reverbparameters = (ReverbParameters)ReverbI3DL2Parameters.Presets.Room;
    var effectDescriptor = new EffectDescriptor(reverb);
    if (waveFormat.Channels == 1)
    {
        effectDescriptor.OutputChannelCount = 1;
    }
    else if (waveFormat.Channels == 2)
    {
        effectDescriptor.OutputChannelCount = 2;
    }
    else
    {
        MessageBox.Show("Channelrate not supported!");
        return sourceVoice = null;
    }
    sourceVoice.SetEffectChain(effectDescriptor);
    sourceVoice.SetEffectParameters(0, reverbparameters);
    sourceVoice.EnableEffect(0);
}
sourceVoice.SubmitSourceBuffer(buffer, stream.DecodedPacketsInfo);
sourceVoice.Start();
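If it helps the investigation: creating the engine with the debug flag makes XAudio2 log parameter-validation failures (for example a RoomFilterFreq outside the documented 20-20000 Hz range) to the native debug output instead of failing silently. A sketch, assuming SharpDX's (XAudio2Flags, ProcessorSpecifier) constructor overload:

// Debug builds only: validation errors from SetEffectParameters then show
// up in the Visual Studio output window.
xaudio2 = new XAudio2(XAudio2Flags.DebugEngine, ProcessorSpecifier.DefaultProcessor);
xaudio2.StartEngine();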
I'm using the following code to read a Range Header from an HttpRequestMessage object to determine how to read a video file and send the data back (so it streams):
private static bool GetHttpRange(RangeItemHeaderValue rangeItemHeaderValue, long contentLength, out HttpRange httpRange)
{
    httpRange = new HttpRange();
    long actualContentLength = contentLength - 1;
    if (rangeItemHeaderValue.From != null)
    {
        httpRange.From = rangeItemHeaderValue.From.Value;
        if (rangeItemHeaderValue.To != null)
        {
            httpRange.To = rangeItemHeaderValue.To.Value;
        }
        else
        {
            httpRange.To = actualContentLength;
        }
    }
    else
    {
        httpRange.To = actualContentLength;
        if (rangeItemHeaderValue.To != null)
        {
            // Suffix range ("bytes=-N"): To holds the suffix length
            httpRange.From = contentLength - rangeItemHeaderValue.To.Value;
        }
        else
        {
            httpRange.From = 0;
        }
    }
    return httpRange.From < contentLength && httpRange.To < contentLength;
}
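For illustration, here is how the helper maps the three typical Range header shapes, assuming HttpRange is a simple type with long From/To properties and a 1000-byte resource:

// "bytes=0-"      -> From = 0,   To = 999 (open-ended)
// "bytes=500-999" -> From = 500, To = 999 (explicit)
// "bytes=-500"    -> From = 500, To = 999 (suffix: the last 500 bytes)
var range = RangeHeaderValue.Parse("bytes=-500").Ranges.First();
if (GetHttpRange(range, 1000, out HttpRange httpRange))
{
    Console.WriteLine($"{httpRange.From}-{httpRange.To}"); // prints 500-999
}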
This streams the video perfectly! However, iPhones and Apple TVs issue seemingly random range requests, and a 200 MB video ends up reading 5 GB of data from Blob Storage.
Is there any way to stop this from happening?
I am working on a UWP app for the HoloLens to read single frames from the device's camera. I want to use the camera mode with the lowest resolution available.
I took a look at the following links and examples and tried to create a minimal working app:
https://learn.microsoft.com/en-us/windows/uwp/audio-video-camera/set-media-encoding-properties
https://learn.microsoft.com/en-us/windows/uwp/audio-video-camera/use-opencv-with-mediaframereader
https://github.com/Microsoft/Windows-universal-samples/tree/master/Samples/CameraResolution
https://github.com/microsoft/Windows-universal-samples/tree/master/Samples/CameraOpenCV
This is the code snippet from MainPage.xaml.cs:
public async Task<int> Start()
{
    // Find the sources
    var allGroups = await MediaFrameSourceGroup.FindAllAsync();
    var sourceGroups = allGroups.Select(g => new
    {
        Group = g,
        SourceInfo = g.SourceInfos.FirstOrDefault(i => i.SourceKind == MediaFrameSourceKind.Color)
    }).Where(g => g.SourceInfo != null).ToList();
    if (sourceGroups.Count == 0)
    {
        // No camera sources found
        return 0;
    }
    var selectedSource = sourceGroups.FirstOrDefault();
    // Initialize MediaCapture
    _mediaCapture = new MediaCapture();
    var settings = new MediaCaptureInitializationSettings()
    {
        SourceGroup = selectedSource.Group,
        SharingMode = MediaCaptureSharingMode.ExclusiveControl,
        StreamingCaptureMode = StreamingCaptureMode.Video,
        MemoryPreference = MediaCaptureMemoryPreference.Cpu
    };
    await _mediaCapture.InitializeAsync(settings);
    // Query all properties of the device
    IEnumerable<StreamResolution> allVideoProperties = _mediaCapture.VideoDeviceController.GetAvailableMediaStreamProperties(MediaStreamType.VideoRecord).Select(x => new StreamResolution(x));
    // Order them by resolution, then frame rate
    allVideoProperties = allVideoProperties.OrderBy(x => x.Height * x.Width).ThenBy(x => x.FrameRate);
    await _mediaCapture.VideoDeviceController.SetMediaStreamPropertiesAsync(MediaStreamType.VideoRecord, allVideoProperties.ElementAt(0).EncodingProperties);
    // Create the frame reader
    MediaFrameSource frameSource = _mediaCapture.FrameSources[selectedSource.SourceInfo.Id];
    _reader = await _mediaCapture.CreateFrameReaderAsync(frameSource, MediaEncodingSubtypes.Bgra8);
    _reader.FrameArrived += ColorFrameReader_FrameArrivedAsync;
    await _reader.StartAsync();
    return 1;
}

private async void ColorFrameReader_FrameArrivedAsync(MediaFrameReader sender, MediaFrameArrivedEventArgs args)
{
    var frame = sender.TryAcquireLatestFrame();
    if (frame != null)
    {
        var inputBitmap = frame.VideoMediaFrame?.SoftwareBitmap;
    }
}
On my local machine (a MacBook Pro with a Boot Camp partition) this code works using the webcam. It detects three supported video modes. I can change the resolution of the bitmap image in FrameArrivedAsync by changing the index from 0 to 1 or 2 in:
_mediaCapture.VideoDeviceController.SetMediaStreamPropertiesAsync(MediaStreamType.VideoRecord, allVideoProperties.ElementAt(0).EncodingProperties);
On the HoloLens this code does not work. It detects the different modes as explained here (https://learn.microsoft.com/en-us/windows/mixed-reality/locatable-camera), but setting the MediaStreamProperties does not change anything about the received bitmap image. The bitmap is always 1280x720.
Just in case, we want to share how we set up the capture profile; you can refer to the following annotated code to modify your project for testing. If in doubt, please feel free to add comments.
private async void SetupAndStartMediaCapture()
{
    string deviceId = string.Empty;
    _mediaCapture = new MediaCapture();
    DeviceInformationCollection devices = await DeviceInformation.FindAllAsync(DeviceClass.VideoCapture);
    foreach (var device in devices)
    {
        if (MediaCapture.IsVideoProfileSupported(device.Id))
        {
            // The video device for which supported video profile support is queried
            deviceId = device.Id;
            break;
        }
    }
    MediaCaptureInitializationSettings mediaCapSettings = new MediaCaptureInitializationSettings
    {
        VideoDeviceId = deviceId
    };
    IReadOnlyList<MediaCaptureVideoProfile> profiles = MediaCapture.FindAllVideoProfiles(deviceId);
    // Select the profile with the required resolution from all available profiles
    var profileMatch = (
        from profile in profiles
        from desc in profile.SupportedRecordMediaDescription
        where desc.Width == 896 && desc.Height == 504 && desc.FrameRate == 24 // HL1
        select new { profile, desc }
    ).FirstOrDefault();
    if (profileMatch != null)
    {
        mediaCapSettings.VideoProfile = profileMatch.profile;
        mediaCapSettings.RecordMediaDescription = profileMatch.desc;
    }
    await _mediaCapture.InitializeAsync(mediaCapSettings); // Initializes the MediaCapture object
}
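As a quick diagnostic, the same APIs can dump every profile/description pair the device reports, which tells you whether the resolution you are asking for exists at all. A small sketch reusing the deviceId from above:

foreach (var profile in MediaCapture.FindAllVideoProfiles(deviceId))
{
    foreach (var desc in profile.SupportedRecordMediaDescription)
    {
        // e.g. "896x504@24" on HoloLens 1
        Debug.WriteLine($"{desc.Width}x{desc.Height}@{desc.FrameRate}");
    }
}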
I am streaming an RTP audio stream via a Barix InStreamer 100 with the format:
PCM 16Bit 8kHz Mono (Little endian)
I am trying to play that stream in "real time" via a MediaElement using a custom MediaStreamSource. The problem is that I get a delay of 2 seconds while playing the stream; with VLC on my PC there is "no" delay.
I found out that MediaStreamSource has an AudioBufferLength property, which can be set to a value between 15 ms and 1000 ms. But when the value is too small, I get an exception in GetSampleAsync -> ReportGetSampleCompleted.
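For reference, that property is typically set once in the source's constructor, before the source is handed to the MediaElement. A minimal sketch (the class name RtpMediaStreamSource is hypothetical, and the stub overrides only exist so the sketch compiles; the real implementations are shown below):

public class RtpMediaStreamSource : MediaStreamSource
{
    private readonly Stream _stream;

    public RtpMediaStreamSource(Stream stream)
    {
        _stream = stream;
        // Valid range is 15-1000 ms; smaller values reduce latency but
        // leave less headroom before playback starves.
        this.AudioBufferLength = 50;
    }

    // Stub overrides so the sketch compiles; see the real code below.
    protected override void OpenMediaAsync() { }
    protected override void SeekAsync(long seekToTime) { ReportSeekCompleted(seekToTime); }
    protected override void GetSampleAsync(MediaStreamType mediaStreamType) { }
    protected override void SwitchMediaStreamAsync(MediaStreamDescription mediaStreamDescription) { }
    protected override void GetDiagnosticAsync(MediaStreamSourceDiagnosticKind diagnosticKind) { }
    protected override void CloseMedia() { }
}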
That's my code:
protected override void GetSampleAsync(MediaStreamType mediaStreamType)
{
    Debug.WriteLine("GetSampleAsync called.");
    // Start with one second of data, rounded up to the nearest block.
    var blocksize = (uint)AlignUp(_wavParser.WaveFormatEx.AvgBytesPerSec, _wavParser.WaveFormatEx.BlockAlign);
    var chunkSize = Math.Min(blocksize, (uint)_stream.Length - _currentPosition);
    var chunkDuration = _currentPosition * 10000000L / 8000; //// _wavParser.WaveFormatEx.AudioDurationFromBufferSize((uint)chunkSize);
    // Send out the next sample
    var chunkSample = new MediaStreamSample(_mediaStreamDescription, _stream, _currentPosition, chunkSize, _currentTimeStamp, chunkDuration, _emptySampleDict);
    // Move our timestamp and position forward
    _currentPosition += chunkSize;
    _currentTimeStamp += chunkDuration;
    ReportGetSampleCompleted(chunkSample); // <-- NullReferenceException here when the chunk is too small
}
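One thing worth checking before shrinking AudioBufferLength further: the documented Silverlight convention for end of stream is to report a sample whose stream is null, not a zero-length chunk. A hedged sketch of the guard, reusing the fields from the code above, placed before the sample is created:

// At the end of the data, report an empty (null-stream) sample instead of
// a zero-length chunk, which can surface as exceptions in ReportGetSampleCompleted.
if (chunkSize == 0)
{
    ReportGetSampleCompleted(
        new MediaStreamSample(_mediaStreamDescription, null, 0, 0, 0, _emptySampleDict));
    return;
}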
Here is how I initially open the media:
protected override void OpenMediaAsync()
{
    try
    {
        // Create a parser
        _wavParser = new WavParser(_stream);
        // Parse the header
        _wavParser.ParseWaveHeader();
        _wavParser.WaveFormatEx.ValidateWaveFormat();
        _startPosition = _currentPosition = _wavParser.DataPosition;
        // Init
        _streamAttributes = new Dictionary<MediaStreamAttributeKeys, string>();
        _sourceAttributes = new Dictionary<MediaSourceAttributesKeys, string>();
        var availableStreams = new List<MediaStreamDescription>();
        // Stream Description
        _streamAttributes[MediaStreamAttributeKeys.CodecPrivateData] = _wavParser.WaveFormatEx.ToHexString();
        _mediaStreamDescription = new MediaStreamDescription(MediaStreamType.Audio, _streamAttributes);
        availableStreams.Add(_mediaStreamDescription);
        _sourceAttributes[MediaSourceAttributesKeys.Duration] = _wavParser.Duration.ToString();
        ReportOpenMediaCompleted(_sourceAttributes, availableStreams);
    }
    catch (Exception exception)
    {
        Debug.WriteLine("Error while opening media source." + exception.Message);
    }
}
Can someone give me a hint? Or are there better ways to play an RTP/PCM stream in real time on a Windows Phone (8)? Please save my ass :)
I want to enable drag and drop from our Windows Forms based application to Windows Explorer. The big problem: the files are stored in a database, so I need to use delayed data rendering. There is an article on codeproject.com, but the author uses an HGLOBAL object, which leads to memory problems with files bigger than approx. 20 MB. I haven't found a working solution that uses an IStream object instead. I think this must be possible to implement, because this isn't an unusual case (an FTP program needs such a feature too, for example).
Edit: Is it possible to get an event when the user drops the file? Then I could, for example, copy it to a temp folder and Explorer could fetch it from there. Maybe there is an alternative approach to my problem...
AFAIK, there is no working article about this for .NET, so you will have to write it yourself. This is somewhat complicated, because the .NET DataObject class is limited. I have a working example of the opposite task (accepting delayed-rendering files from Explorer), but that one is easier, because I did not need my own IDataObject implementation.
So your task will be:
Find a working IDataObject implementation in .NET. I recommend you look here (Shell Style Drag and Drop in .NET (WPF and WinForms)).
You also need an IStream wrapper for a managed stream; it is relatively easy to implement, and a sketch follows after this list.
Implement delayed rendering using the information from MSDN (Shell Clipboard Formats).
This is the starting point, and in general enough information to implement such a feature. With a bit of patience and several unsuccessful attempts, you will do it :)
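For the IStream wrapper mentioned above, a minimal read-only sketch could look like this (the class name ManagedIStream is hypothetical; only the members needed for read-only delayed rendering are implemented):

using System;
using System.IO;
using System.Runtime.InteropServices;
using System.Runtime.InteropServices.ComTypes;

// Exposes a managed Stream as a COM IStream so it can be handed out
// in a STGMEDIUM with TYMED_ISTREAM during delayed rendering.
public class ManagedIStream : IStream
{
    private readonly Stream _stream;

    public ManagedIStream(Stream stream)
    {
        _stream = stream;
    }

    public void Read(byte[] pv, int cb, IntPtr pcbRead)
    {
        int bytesRead = _stream.Read(pv, 0, cb);
        if (pcbRead != IntPtr.Zero)
            Marshal.WriteInt32(pcbRead, bytesRead);
    }

    public void Seek(long dlibMove, int dwOrigin, IntPtr plibNewPosition)
    {
        // dwOrigin values (0, 1, 2) match SeekOrigin Begin/Current/End
        long position = _stream.Seek(dlibMove, (SeekOrigin)dwOrigin);
        if (plibNewPosition != IntPtr.Zero)
            Marshal.WriteInt64(plibNewPosition, position);
    }

    public void Stat(out System.Runtime.InteropServices.ComTypes.STATSTG pstatstg, int grfStatFlag)
    {
        pstatstg = new System.Runtime.InteropServices.ComTypes.STATSTG
        {
            cbSize = _stream.Length,
            type = 2 // STGTY_STREAM
        };
    }

    // The remaining members are not needed for read-only rendering.
    public void Write(byte[] pv, int cb, IntPtr pcbWritten) { throw new NotSupportedException(); }
    public void SetSize(long libNewSize) { throw new NotSupportedException(); }
    public void CopyTo(IStream pstm, long cb, IntPtr pcbRead, IntPtr pcbWritten) { throw new NotSupportedException(); }
    public void Commit(int grfCommitFlags) { }
    public void Revert() { }
    public void LockRegion(long libOffset, long cb, int dwLockType) { throw new NotSupportedException(); }
    public void UnlockRegion(long libOffset, long cb, int dwLockType) { throw new NotSupportedException(); }
    public void Clone(out IStream ppstm) { throw new NotSupportedException(); }
}

Explorer typically reads the content sequentially and releases the stream when done, so letting the write and locking members throw is usually safe.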
Update: The following code lacks many necessary methods and functions, but the main logic is here.
// ...
private static IEnumerable<IVirtualItem> GetDataObjectContent(System.Windows.Forms.IDataObject dataObject)
{
    if (dataObject == null)
        return null;
    List<IVirtualItem> Result = new List<IVirtualItem>();
    bool WideDescriptor = dataObject.GetDataPresent(ShlObj.CFSTR_FILEDESCRIPTORW);
    bool AnsiDescriptor = dataObject.GetDataPresent(ShlObj.CFSTR_FILEDESCRIPTORA);
    if (WideDescriptor || AnsiDescriptor)
    {
        IDataObject NativeDataObject = dataObject as IDataObject;
        if (NativeDataObject != null)
        {
            object Data = null;
            if (WideDescriptor)
                Data = dataObject.GetData(ShlObj.CFSTR_FILEDESCRIPTORW);
            else if (AnsiDescriptor)
                Data = dataObject.GetData(ShlObj.CFSTR_FILEDESCRIPTORA);
            Stream DataStream = Data as Stream;
            if (DataStream != null)
            {
                Dictionary<string, VirtualClipboardFolder> FolderMap =
                    new Dictionary<string, VirtualClipboardFolder>(StringComparer.OrdinalIgnoreCase);
                BinaryReader Reader = new BinaryReader(DataStream);
                int Count = Reader.ReadInt32();
                for (int I = 0; I < Count; I++)
                {
                    VirtualClipboardItem ClipboardItem;
                    if (WideDescriptor)
                    {
                        FILEDESCRIPTORW Descriptor = ByteArrayHelper.ReadStructureFromStream<FILEDESCRIPTORW>(DataStream);
                        if (((Descriptor.dwFlags & FD.FD_ATTRIBUTES) > 0) && ((Descriptor.dwFileAttributes & FileAttributes.Directory) > 0))
                            ClipboardItem = new VirtualClipboardFolder(Descriptor);
                        else
                            ClipboardItem = new VirtualClipboardFile(Descriptor, NativeDataObject, I);
                    }
                    else
                    {
                        FILEDESCRIPTORA Descriptor = ByteArrayHelper.ReadStructureFromStream<FILEDESCRIPTORA>(DataStream);
                        if (((Descriptor.dwFlags & FD.FD_ATTRIBUTES) > 0) && ((Descriptor.dwFileAttributes & FileAttributes.Directory) > 0))
                            ClipboardItem = new VirtualClipboardFolder(Descriptor);
                        else
                            ClipboardItem = new VirtualClipboardFile(Descriptor, NativeDataObject, I);
                    }
                    string ParentFolder = Path.GetDirectoryName(ClipboardItem.FullName);
                    if (string.IsNullOrEmpty(ParentFolder))
                        Result.Add(ClipboardItem);
                    else
                    {
                        VirtualClipboardFolder Parent = FolderMap[ParentFolder];
                        ClipboardItem.Parent = Parent;
                        Parent.Content.Add(ClipboardItem);
                    }
                    VirtualClipboardFolder ClipboardFolder = ClipboardItem as VirtualClipboardFolder;
                    if (ClipboardFolder != null)
                        FolderMap.Add(PathHelper.ExcludeTrailingDirectorySeparator(ClipboardItem.FullName), ClipboardFolder);
                }
            }
        }
    }
    return Result.Count > 0 ? Result : null;
}
// ...
public class VirtualClipboardFile : VirtualClipboardItem, IVirtualFile
{
    // ...
    public Stream Open(FileMode mode, FileAccess access, FileShare share, FileOptions options, long startOffset)
    {
        if ((mode != FileMode.Open) || (access != FileAccess.Read))
            throw new ArgumentException("Only open file mode and read file access supported.");
        System.Windows.Forms.DataFormats.Format Format = System.Windows.Forms.DataFormats.GetFormat(ShlObj.CFSTR_FILECONTENTS);
        if (Format == null)
            return null;
        FORMATETC FormatEtc = new FORMATETC();
        FormatEtc.cfFormat = (short)Format.Id;
        FormatEtc.dwAspect = DVASPECT.DVASPECT_CONTENT;
        FormatEtc.lindex = FIndex;
        FormatEtc.tymed = TYMED.TYMED_ISTREAM | TYMED.TYMED_HGLOBAL;
        STGMEDIUM Medium;
        FDataObject.GetData(ref FormatEtc, out Medium);
        try
        {
            switch (Medium.tymed)
            {
                case TYMED.TYMED_ISTREAM:
                    IStream MediumStream = (IStream)Marshal.GetTypedObjectForIUnknown(Medium.unionmember, typeof(IStream));
                    ComStreamWrapper StreamWrapper = new ComStreamWrapper(MediumStream, FileAccess.Read, ComRelease.None);
                    // Seek from beginning
                    if (startOffset > 0)
                    {
                        if (StreamWrapper.CanSeek)
                            StreamWrapper.Seek(startOffset, SeekOrigin.Begin);
                        else
                        {
                            // Fall back to reading and discarding bytes until the offset is reached
                            byte[] Null = new byte[256];
                            int Readed = 1;
                            while ((startOffset > 0) && (Readed > 0))
                            {
                                Readed = StreamWrapper.Read(Null, 0, (int)Math.Min(Null.Length, startOffset));
                                startOffset -= Readed;
                            }
                        }
                    }
                    StreamWrapper.Closed += delegate(object sender, EventArgs e)
                    {
                        ActiveX.ReleaseStgMedium(ref Medium);
                        Marshal.FinalReleaseComObject(MediumStream);
                    };
                    return StreamWrapper;
                case TYMED.TYMED_HGLOBAL:
                    byte[] FileContent;
                    IntPtr MediumLock = Windows.GlobalLock(Medium.unionmember);
                    try
                    {
                        long Size = FSize.HasValue ? FSize.Value : Windows.GlobalSize(MediumLock).ToInt64();
                        FileContent = new byte[Size];
                        Marshal.Copy(MediumLock, FileContent, 0, (int)Size);
                    }
                    finally
                    {
                        Windows.GlobalUnlock(Medium.unionmember);
                    }
                    ActiveX.ReleaseStgMedium(ref Medium);
                    Stream ContentStream = new MemoryStream(FileContent, false);
                    ContentStream.Seek(startOffset, SeekOrigin.Begin);
                    return ContentStream;
                default:
                    throw new ApplicationException(string.Format("Unsupported STGMEDIUM.tymed ({0})", Medium.tymed));
            }
        }
        catch
        {
            ActiveX.ReleaseStgMedium(ref Medium);
            throw;
        }
    }
    // ...
// ...
Googlers may find this useful: download a file using a Windows IStream.