I have a project using the Raspberry Pi Camera V2. One PC encodes the captured video in MJPEG format and sends it over the serial port.
My PC receives the data, saves it to a .mjpeg file, and plays it back with an MJPEG to MP4 converter.
I am trying to save the data in these lines:
byte[] data= new byte[100];
serialPort.Read(data,0,100);
BinaryWriter videoFile = new BinaryWriter(File.Open("video.mjpeg",FileMode.Create));
string dataAscii;
dataAscii = System.Text.Encoding.UTF8.GetString(data); //bytearray to string
videoFile.Write(dataAscii); // which is received
It works in that it creates a .mjpeg file, but I couldn't get it to play with the converter. Maybe I should save the data frame by frame or save it in a different way; I have no idea what I am doing wrong.
Any ideas, many thanks!
Kane
Why are you converting the byte array into a string before writing it? That's your problem. Just write the byte array directly to the file stream.
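For example, a minimal sketch of that fix (assuming serialPort is an open System.IO.Ports.SerialPort):

byte[] data = new byte[100];
int bytesRead = serialPort.Read(data, 0, data.Length);   // may return fewer than 100 bytes

using (FileStream videoFile = File.Open("video.mjpeg", FileMode.Create))
{
    videoFile.Write(data, 0, bytesRead);   // write the raw bytes; no Encoding/string step
}

In practice you would open the FileStream once and loop, writing each chunk as it arrives, rather than recreating the file for every read.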
I need your help.
I am creating an application in C# that converts data from an IP camera to an image (JPEG).
I was able to convert the image using the code below:
string hex = "FFD8FFDB008400130D0F1........."; // supply this with the attached hex dump
byte[] image = HexString2Bytes(hex);
File.WriteAllBytes("visio.png", image);
Process.Start("visio.png");
private static byte[] HexString2Bytes(string hexString)
{
    int bytesCount = hexString.Length / 2;
    byte[] bytes = new byte[bytesCount];
    for (int x = 0; x < bytesCount; ++x)
    {
        bytes[x] = Convert.ToByte(hexString.Substring(x * 2, 2), 16);
    }
    return bytes;
}
Sometimes I get the image as expected: https://ibb.co/pxrwn6p
But sometimes I get a distorted image after converting: https://ibb.co/9twx5ZT
I was wondering if there is a problem with the conversion or with the way I save the image, because according to the supplier, all I need to do is directly save the image from the stream.
But since I receive it as bytes and still need to convert it, maybe there is something wrong with my code.
The image also starts with ÿØÿÛ (FF D8 FF DB) and ends with ÿÙÿÿÿÿ (FF D9 FF FF FF FF).
Here's the hex dump from their sample app:
https://drive.google.com/file/d/1CMlQ0xaVjM0jfU5A4MB-_HwK54dUMTOr/view?usp=sharing
Using their test application, the image can be captured and converted perfectly.
Captured image using their application: https://ibb.co/2KgyLTc
Using the hex from the network sniff and converting it with my code:
Converted image using my code: https://ibb.co/G0WMjht
Sample source code (please bear with my code; currently this is only my test app before integrating this feature into another app):
https://drive.google.com/file/d/1Ux7zsR39IVNyd1wrBxQPQKA6yM4YnwJN/view?usp=sharing
Thank You in advance.
Looking at the hex dump, it looks like some kind of XML file with embedded image data. Trying to convert this directly to an image will most likely not work; you would need to parse the XML data to extract the actual image file. But it looks like you have a valid JPEG header, so I would guess you have at least found the start of the image. You probably also need to check the length property from the XML data to find the length of the image-data block.
However, the data block looks like it contains large sections of zeros, which should not be present in a JPEG file, so it might indicate some data corruption, possibly from the way the network data is captured.
I would expect cameras to use some higher-level protocol than raw TCP, such as Real Time Streaming Protocol (RTSP), GigE Vision, or MJPEG over HTTP. I have not seen any camera that requires you to process a raw TCP stream. But since you do not show how the data is fetched, it is difficult to tell whether there are any mistakes in that code.
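As a rough illustration of that, here is a hedged sketch (not the camera's documented protocol) that extracts a JPEG from a larger buffer by scanning for the SOI (FF D8) and EOI (FF D9) markers; a robust version should use the length field from the surrounding metadata instead of trusting the last FF D9 it finds:

private static byte[] ExtractJpeg(byte[] buffer)
{
    int start = -1, end = -1;
    // find the first SOI marker (FF D8)
    for (int i = 0; i < buffer.Length - 1; i++)
    {
        if (buffer[i] == 0xFF && buffer[i + 1] == 0xD8) { start = i; break; }
    }
    // find the last EOI marker (FF D9), searching backwards
    for (int i = buffer.Length - 2; i >= 0; i--)
    {
        if (buffer[i] == 0xFF && buffer[i + 1] == 0xD9) { end = i + 2; break; }
    }
    if (start < 0 || end <= start)
        throw new InvalidOperationException("No complete JPEG found in buffer");
    byte[] jpeg = new byte[end - start];
    Array.Copy(buffer, start, jpeg, 0, jpeg.Length);
    return jpeg;
}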
I am trying to create a WAV object from a WMA byte array. I've tried the following code, but the software exits, and in the dump file I found the following exception when I try to create fileReader:
"The thread tried to read from or write to a virtual address for which it does not have the appropriate access."
System.IO.File.WriteAllBytes("wmatemp.wma", data);
WMAFileReader fileReader = new WMAFileReader("wmatemp.wma");
WaveStream waveStream = WaveFormatConversionStream.CreatePcmStream(fileReader);
WAV wav = new WAV(AudioMemStream(waveStream).ToArray());
I know I should not save the .wma to the HDD, but I don't know how to proceed differently. Any help?
I want to send a byte array to the speaker, something like this:
byte[] bt = {12,32,43,74,23,53,24,54,234,253,153};// example array
var ms = new MemoryStream(bt);
var sound = new System.Media.SoundPlayer();
sound.Stream = ms;
sound.Play();
but I get this exception:
Picture of my problem: http://8pic.ir/images/g699b52xe5ap9s8yf0pz.jpg
The first bytes of a WAV stream contain info about length, etc.
You have to send this "WAV-Header" as well in the first few bytes.
See http://de.wikipedia.org/wiki/RIFF_WAVE
As you'll see, it's perfectly possible to compose these few header bytes yourself and send them before your raw audio data.
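A minimal sketch of that idea, assuming the raw bytes are 8-bit mono PCM at 8000 Hz (change the format values to match whatever your data really is):

static MemoryStream WrapPcmInWav(byte[] pcm)
{
    // Assumed format: 8000 Hz, 8-bit, mono PCM; adjust to match your data.
    int sampleRate = 8000;
    short bitsPerSample = 8;
    short channels = 1;
    short blockAlign = (short)(channels * bitsPerSample / 8);
    int byteRate = sampleRate * blockAlign;

    var ms = new MemoryStream();
    var w = new BinaryWriter(ms);
    w.Write(System.Text.Encoding.ASCII.GetBytes("RIFF"));
    w.Write(36 + pcm.Length);                              // RIFF chunk size
    w.Write(System.Text.Encoding.ASCII.GetBytes("WAVE"));
    w.Write(System.Text.Encoding.ASCII.GetBytes("fmt "));
    w.Write(16);                                           // fmt chunk size for PCM
    w.Write((short)1);                                     // audio format 1 = PCM
    w.Write(channels);
    w.Write(sampleRate);
    w.Write(byteRate);
    w.Write(blockAlign);
    w.Write(bitsPerSample);
    w.Write(System.Text.Encoding.ASCII.GetBytes("data"));
    w.Write(pcm.Length);                                   // data chunk size
    w.Write(pcm);
    w.Flush();
    ms.Position = 0;                                       // rewind so SoundPlayer reads from the start
    return ms;
}

// usage with the example array from the question
var sound = new System.Media.SoundPlayer();
sound.Stream = WrapPcmInWav(bt);
sound.Play();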
You can use a library for reading data from the microphone or playing it through the speakers.
I worked successfully with:
NAudio - http://naudio.codeplex.com/
I would not recommend building a WAV file yourself; it may be more effort than it is worth here.
Note that this library (and probably some others; Bass - http://www.un4seen.com is also widely used) also has built-in functionality for saving and reading WAV files.
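For reference, a hedged sketch of the same playback done with NAudio instead of a hand-built header (the format values are assumptions; adjust them to your data):

// Assumed format: 8000 Hz, 8-bit, mono. Requires the NAudio package.
byte[] bt = { 12, 32, 43, 74, 23, 53, 24, 54, 234, 253, 153 };
using (var ms = new System.IO.MemoryStream(bt))
using (var raw = new NAudio.Wave.RawSourceWaveStream(ms, new NAudio.Wave.WaveFormat(8000, 8, 1)))
using (var output = new NAudio.Wave.WaveOutEvent())
{
    output.Init(raw);
    output.Play();
    while (output.PlaybackState == NAudio.Wave.PlaybackState.Playing)
        System.Threading.Thread.Sleep(50);   // block until playback finishes
}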
NAudio is the best fit for this functionality. Use the sample app provided; it may help.
I have a device which is sending RGB32 encoded color image frames as byte arrays at 30 FPS.
I'd like to send these frames to the browser in the most performant manner possible via a websocket connection.
I'm assuming that the best way to get the image data through the websocket is as a base64 string that I interpret as a URI in the browser, but base64 encoding raw RGB32 data obviously doesn't work, as it's not interpretable by the browser.
Therefore, I need to encode the RGB32 data as jpg before encoding into base64 and sending down the pipe, but all of the solutions I have been able to find for encoding byte arrays involve saving to the disk, which is obviously a performance killer.
Can anyone explain to me a performant manner to convert these RGB32 byte arrays to jpgs on the fly so I can push them down the pipe?
Alternatively, if anyone has ideas on another high-performance way to get these RGBA frames down a websocket to a browser, I'd love to hear them.
Thanks!
Code looks something like this:
private void ColorFrameReady(object sender, ColorImageFrameReadyEventArgs e)
{
    using (ColorImageFrame frame = e.OpenColorImageFrame())
    {
        if (frame != null)
        {
            byte[] pixelData = new byte[frame.PixelDataLength];
            frame.CopyPixelDataTo(pixelData);

            // display in WPF
            this._ColorImageBitmap.WritePixels(this._ColorImageBitmapRect, pixelData, this._ColorImageStride, 0);

            // send via websocket to Chrome
            // how do we create a base64 encoded jpg here and pass it off to Chrome?
            MainWindow.Broadcast(DataForChromeGoesHere);
        }
    }
}
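One hedged approach (a sketch, not a tested implementation): use WPF's JpegBitmapEncoder to compress the pixel buffer into a MemoryStream, then base64-encode the result. This assumes the frame data is BGR32 and that MainWindow.Broadcast accepts a string:

// inside the if (frame != null) block, after CopyPixelDataTo
BitmapSource source = BitmapSource.Create(
    frame.Width, frame.Height, 96, 96,
    PixelFormats.Bgr32, null, pixelData, this._ColorImageStride);

var encoder = new JpegBitmapEncoder();              // System.Windows.Media.Imaging
encoder.Frames.Add(BitmapFrame.Create(source));

using (var ms = new MemoryStream())
{
    encoder.Save(ms);                               // JPEG bytes, entirely in memory
    string base64Jpeg = Convert.ToBase64String(ms.ToArray());
    MainWindow.Broadcast("data:image/jpeg;base64," + base64Jpeg);
}

Whether the data-URI framing is the right choice is a separate question; sending the raw JPEG bytes as a binary websocket frame avoids the roughly 33% base64 overhead.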
I have a Flash app which sends the raw data for a JPEG image to a particular URL, Send.aspx. In Send.aspx I am using Request.BinaryRead() to get the total request length and then read the data into a byte array.
Then I am writing the data to the server as a JPEG file. The code is given below:
FileStream f = File.Create(Server.MapPath("~") + "/plugins/handwrite/uploads/" + filename);
byte[] data = Request.BinaryRead(Request.TotalBytes);
f.Write(data, 0, data.Length);
f.Close();
The file is getting created, but there is no image in it; it always shows up as empty in any graphic viewer. What part am I missing? Am I supposed to use JPEG encoding before writing it to the file? Thanks in advance.
Well, you should use a using statement for your file stream, but other than that it looks okay to me.
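A hedged sketch of that first point, wrapping the stream in a using block so it is flushed and closed even if the write throws:

byte[] data = Request.BinaryRead(Request.TotalBytes);
string path = Server.MapPath("~") + "/plugins/handwrite/uploads/" + filename;
using (FileStream f = File.Create(path))
{
    f.Write(data, 0, data.Length);
}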
A few suggestions for how to proceed...
Is it possible that the client isn't providing the data properly? Perhaps it's providing it as base64-encoded data?
Have you already read some data from the request body? (That could mess things up.)
I suggest you look closely at what you end up saving vs the original file:
Are they the same length? If not, which is longer?
If they're the same length, do their MD5 sums match?
If you look at both within a binary file editor, do they match at all? Any obvious differences?