I am using the MediaCapture class to record voice in AAC-LC/M4A format. I am trying to play the same audio data in a MediaElement. It does not play the M4A audio. However, if I record the voice in WAV format, MediaElement plays the audio without any issues. I tried all the plausible MIME types for AAC-LC/M4A audio.
Here is the player code:
var inMemoryRas = new InMemoryRandomAccessStream();
var writeStream = inMemoryRas.AsStreamForWrite();
await writeStream.WriteAsync(audioData, 0, audioData.Length);
await writeStream.FlushAsync();
inMemoryRas.Seek(0);
mMediaPlayer_.AudioCategory = AudioCategory.ForegroundOnlyMedia;
String mimeType_ = "";
mMediaPlayer_.SetSource(inMemoryRas, mimeType_);
mMediaPlayer_.AutoPlay = true;
mMediaPlayer_.Play();
I tried the following set of MIME types, still with no luck:
audio/mpeg, audio/mp4, audio/aac, video/mp4, audio/m4a
Again, if I record the audio in WAV format, it plays without any issues.
Here is the recorder code:
MediaEncodingProfile recordProfile = null;
recordProfile = MediaEncodingProfile.CreateM4a(Windows.Media.MediaProperties.AudioEncodingQuality.Low);
mRecordingStream_ = new InMemoryRandomAccessStream();
await m_mediaCaptureMgr.StartRecordToStreamAsync(recordProfile, mRecordingStream_);
I would appreciate any help on this. Thanks in advance.
Related
I'm trying to take one video stream (like a webcam) with MediaCapture, apply a chroma-key effect to it, overlay the result on a second live video stream (like another webcam, also from MediaCapture), and record the result to an MP4 file, all live.
I've tried a custom video effect. The ChromaKeyEffect works, but I've failed to get the secondary input stream into my effect class to use as the background.
I've also tried MediaComposition, but it doesn't work with MediaCapture: it only uses MediaClips, which are not live.
I don't know how I should do this.
Edit:
In my page
VideoEffectDefinition definition = new VideoEffectDefinition(typeof(AlphaOnMediaCaptureEffect).FullName);
await mediaCapture.AddVideoEffectAsync(definition, MediaStreamType.VideoRecord);
In AlphaOnMediaCaptureEffect : IBasicVideoEffect:
public void ProcessFrame(ProcessVideoFrameContext context)
{
    // ProcessFrame is synchronous in IBasicVideoEffect; declaring it
    // "async void" would let the frame be returned before drawing finishes.
    using (CanvasBitmap inputBitmap = CanvasBitmap.CreateFromDirect3D11Surface(canvasDevice, context.InputFrame.Direct3DSurface))
    using (CanvasRenderTarget renderTarget = CanvasRenderTarget.CreateFromDirect3D11Surface(canvasDevice, context.OutputFrame.Direct3DSurface, 96))
    using (CanvasDrawingSession ds = renderTarget.CreateDrawingSession())
    {
        ChromaKeyEffect chromaKeyEffect = new ChromaKeyEffect
        {
            Source = inputBitmap,
            Color = Color.FromArgb(255, 0, 255, 0),
            Feather = true,
            Tolerance = 0.3f
        };
        CompositeEffect compositeEffect = new CompositeEffect
        {
            Mode = CanvasComposite.SourceIn,
            Sources = { otherInputBitmap, chromaKeyEffect }
        };
        ds.DrawImage(compositeEffect);
    }
}
I need another CanvasBitmap like inputBitmap (the otherInputBitmap above) that contains frames from a second MediaCapture. I said "I've failed" because I couldn't find a way to code this. I think my overall approach to obtaining a live video overlay on a live video with a ChromaKeyEffect is wrong. I'm not sure that IBasicVideoEffect is the best way to do this.
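One direction that might work (an untested sketch, not an established pattern): read frames from the second MediaCapture with a MediaFrameReader and hand the latest Direct3D surface to the effect through the IPropertySet that VideoEffectDefinition accepts. The key name "BackgroundSurface" and the overall wiring are my own assumptions.

```csharp
// On the page: pump frames from the second capture into a shared PropertySet.
var frameSource = secondMediaCapture.FrameSources.Values.First(
    s => s.Info.MediaStreamType == MediaStreamType.VideoRecord);
var frameReader = await secondMediaCapture.CreateFrameReaderAsync(frameSource);

var sharedState = new PropertySet();
frameReader.FrameArrived += (sender, args) =>
{
    using (var frame = sender.TryAcquireLatestFrame())
    {
        if (frame?.VideoMediaFrame?.Direct3DSurface != null)
            sharedState["BackgroundSurface"] = frame.VideoMediaFrame.Direct3DSurface;
    }
};
await frameReader.StartAsync();

// Pass the shared state to the effect when adding it; the effect receives
// it in SetProperties(IPropertySet) and can read "BackgroundSurface" in
// ProcessFrame via CanvasBitmap.CreateFromDirect3D11Surface.
var definition = new VideoEffectDefinition(
    typeof(AlphaOnMediaCaptureEffect).FullName, sharedState);
await mediaCapture.AddVideoEffectAsync(definition, MediaStreamType.VideoRecord);
```

The surface handed over this way is only valid while the producing frame is alive, so a real implementation would likely need to copy it (e.g. into a CanvasBitmap owned by the effect) rather than store the raw surface.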
Is it possible to play audio in Xamarin.Forms (Android only, no iOS required) that is received from the network over TCP or UDP? In WPF, I'm using NAudio and I have something like this:
// set output device
var provider = new BufferedWaveProvider(codec.RecordFormat);
outputDevice = new WaveOut();
outputDevice.DeviceNumber = 0;
outputDevice.Init(provider);
outputDevice.Play();
Data is received from TCP connection:
if (outputDevice != null)
{
byte[] decoded = codec.Decode(data, 0, data.Length);
provider.AddSamples(decoded, 0, decoded.Length);
}
In this case, data is a byte[]: it's added to a circular buffer, and WaveOut treats it as a stream, playing it continuously. That solution works great.
I need the same thing in Xamarin. I guess I need some kind of wrapper around AudioTrack, since it apparently supports playing from a byte stream. How should I do this, and what is the "best" or preferred way? Basically, how do I play streamed audio received over a plain TCP/UDP socket?
Google's Android ExoPlayer can stream that media format properly.
The following code is a really simple example of ExoPlayer, but it will show you that it does play that stream:
var mediaUrl = "http://api-streaming.youscribe.com/v1/products/2919465/documents/3214936/audio/stream";
var mediaUri = Android.Net.Uri.Parse(mediaUrl);
var userAgent = Util.GetUserAgent(this, "ExoPlayerDemo");
var defaultHttpDataSourceFactory = new DefaultHttpDataSourceFactory(userAgent);
var defaultDataSourceFactory = new DefaultDataSourceFactory(this, null, defaultHttpDataSourceFactory);
var extractorMediaSource = new ExtractorMediaSource(mediaUri, defaultDataSourceFactory, new DefaultExtractorsFactory(), null, null);
var defaultBandwidthMeter = new DefaultBandwidthMeter();
var adaptiveTrackSelectionFactory = new AdaptiveTrackSelection.Factory(defaultBandwidthMeter);
var defaultTrackSelector = new DefaultTrackSelector(adaptiveTrackSelectionFactory);
exoPlayer = ExoPlayerFactory.NewSimpleInstance(this, defaultTrackSelector);
exoPlayer.Prepare(extractorMediaSource);
exoPlayer.PlayWhenReady = true;
Note:
1. exoPlayer is a class-level variable of type SimpleExoPlayer;
2. this example uses the Xamarin.Android binding libraries from the Xam.Plugins.Android.ExoPlayer package.
ExoPlayer Docs:
https://developer.android.com/guide/topics/media/exoplayer
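For the original raw-socket scenario (decoded PCM bytes rather than an HTTP stream), Android's AudioTrack in streaming mode is the closest counterpart to NAudio's BufferedWaveProvider. A minimal sketch, assuming 16-bit mono PCM at 8 kHz; the actual values must match codec.RecordFormat:

```csharp
using Android.Media;

// Buffer size must be at least the minimum for the chosen format.
int bufferSize = AudioTrack.GetMinBufferSize(8000, ChannelOut.Mono, Encoding.Pcm16bit);

var audioTrack = new AudioTrack(
    Android.Media.Stream.Music,  // audio stream type
    8000,                        // sample rate, must match codec.RecordFormat
    ChannelOut.Mono,
    Encoding.Pcm16bit,
    bufferSize,
    AudioTrackMode.Stream);      // feed data incrementally, like a circular buffer

audioTrack.Play();

// In the TCP/UDP receive handler, mirroring the NAudio version:
// byte[] decoded = codec.Decode(data, 0, data.Length);
// audioTrack.Write(decoded, 0, decoded.Length);  // blocks until the data is queued
```

Because Write blocks when the internal buffer is full, it should be called from a background thread, much like the receive loop feeding BufferedWaveProvider in the WPF version.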
I want to stream youtube video through MediaPlayerElement.
MediaElement.Source = MediaSource.CreateFromUri(new Uri("https://www.youtube.com/embed/QTYVJhy04rs"));
I tried this code but it didn't work.
I want something similar to videos in MSPowerUser app.
I achieved this by using the MyToolkit.Extended library for UWP apps, with the YouTube class in the MyToolkit.Multimedia namespace.
string youtubeVideoID = "INSERT VIDEO ID"; // Set video id
YouTubeQuality youtubeQuality = YouTubeQuality.Quality720P; // Set video quality
YouTubeUri videoUri = await YouTube.GetVideoUriAsync(youtubeVideoID, youtubeQuality);
// You can get Uri using videoUri.Uri and set the source to a media player
MyToolKit.Extended: https://github.com/MyToolkit/MyToolkit
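To finish the example, the resolved Uri can be assigned to the MediaPlayerElement from the question (this assumes YouTubeUri exposes the direct stream address as videoUri.Uri, as the comment above suggests):

```csharp
// Resolve the direct stream Uri for the video, then hand it to the player.
YouTubeUri videoUri = await YouTube.GetVideoUriAsync(youtubeVideoID, youtubeQuality);
MediaElement.Source = MediaSource.CreateFromUri(videoUri.Uri);
```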
I used these instructions to capture a video:
http://msdn.microsoft.com/en-us/library/windows/apps/hh452791.aspx
Here is the part where the file is written to the Videos library:
var storageFile = await Windows.Storage.KnownFolders.VideosLibrary.CreateFileAsync("cameraCapture.mp4", Windows.Storage.CreationCollisionOption.GenerateUniqueName);
await _mediaCapture.StartRecordToStorageFileAsync(_profile, storageFile);
How can I access the file on another page of the app?
I want to play this video within a MediaElement (VideoLeinwand):
var stream = ???
VideoLeinwand.AutoPlay = true;
VideoLeinwand.SetSource(stream, ???);
VideoLeinwand.Play();
Thank you very much!
If I understood correctly and you have succeeded in recording a file cameraCapture.mp4, then you should be able to play it, for example, like this:
StorageFile mediafile = await Windows.Storage.KnownFolders.VideosLibrary.GetFileAsync("cameraCapture.mp4");
var stream = await mediafile.OpenAsync(Windows.Storage.FileAccessMode.Read);
VideoLeinwand.SetSource(stream, mediafile.ContentType);
VideoLeinwand.Play();
As the XNA SoundEffect is no longer available in the Windows Runtime API (for developing Universal App), I need something similar to play multiple audio streams at the same time.
Requirements:
Play the same audio file multiple times, simultaneously.
Previous Silverlight implementation with SoundEffect:
// Play sound 10 times, sound can be played together.
// i.e. First sound continues playing while second sound starts playing.
for (int i = 0; i < 10; i++)
{
Stream stream = TitleContainer.OpenStream("sounds/Ding.wav");
SoundEffect effect = SoundEffect.FromStream(stream);
FrameworkDispatcher.Update();
effect.Play();
// Wait a while before playing again.
Thread.Sleep(500);
}
SoundEffect supports multiple SoundEffectInstances (up to 16, I think) playing simultaneously.
The standard MediaElement API only supports 1 audio stream for Windows Phone 8.1.
I bumped into this: https://github.com/rajenki/audiohelper which uses the XAudio2 API but it doesn't seem to support simultaneous audio either.
Solved. I used SharpDX. Huge thanks to the author here: http://www.hoekstraonline.net/2013/01/13/how-to-play-a-wav-sound-file-with-directx-in-c-for-windows-8/?utm_source=rss&utm_medium=rss&utm_campaign=how-to-play-a-wav-sound-file-with-directx-in-c-for-windows-8
Here is the code to the solution:
Initialization:
xAudio = new XAudio2();
var masteringVoice = new MasteringVoice(xAudio);
var nativeFileStream = new NativeFileStream("Assets/Ding.wav", NativeFileMode.Open, NativeFileAccess.Read, NativeFileShare.Read);
stream = new SoundStream(nativeFileStream);
waveFormat = stream.Format;
buffer = new AudioBuffer
{
Stream = stream.ToDataStream(),
AudioBytes = (int)stream.Length,
Flags = BufferFlags.EndOfStream
};
Event handler:
var sourceVoice = new SourceVoice(xAudio, waveFormat, true);
sourceVoice.SubmitSourceBuffer(buffer, stream.DecodedPacketsInfo);
sourceVoice.Start();
The code officially provided in SharpDX's sample does not use NativeFileStream, but it is required to make this work.
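To meet the original requirement (the same sound overlapping with itself), each playback can get its own SourceVoice while all of them reuse the same AudioBuffer. A sketch on top of the initialization above; disposal of finished voices is omitted for brevity:

```csharp
for (int i = 0; i < 10; i++)
{
    // One SourceVoice per playback; they all share the same buffer and format.
    var voice = new SourceVoice(xAudio, waveFormat, true);
    voice.SubmitSourceBuffer(buffer, stream.DecodedPacketsInfo);
    voice.Start();          // overlaps with voices that are still playing

    await Task.Delay(500);  // start the next instance half a second later
}
```

In a real app the voices should be tracked and destroyed once their buffers finish (note that XAudio2 forbids calling DestroyVoice from inside its own callbacks).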