I'm trying to create a Desktop Recording Application. When I record the full screen, the program works as it is supposed to, but in some cases when I select a specific region from the desktop to record I get an error at: int result = AVIStreamSetFormat(psCompress, 0, ref bi, (Int32)bi.biSize);
Error in VideoStreamSetFormat: -2147205016.
I'm using the Xvid MPEG-4 Codec to create the AVI video. I think the problem might be that the Xvid MPEG-4 Codec does not accept certain image sizes (width and height). I'm not sure, and I'm stuck on this problem, so I'm asking if somebody can help me understand why it is not working.
private void SetFormat(IntPtr psCompress)
{
    BITMAPINFOHEADER bi = new BITMAPINFOHEADER();
    bi.biSize = (uint)Marshal.SizeOf(bi);
    bi.biWidth = (Int32)_width;
    bi.biHeight = (Int32)_height;
    bi.biPlanes = 1;
    bi.biBitCount = 24;
    bi.biCompression = 0; // 0 = BI_RGB
    bi.biSizeImage = _stride * _height;

    int result = AVIStreamSetFormat(psCompress, 0, ref bi, (Int32)bi.biSize);
    if (result != 0)
    {
        throw new Exception("Error in VideoStreamSetFormat: " + result.ToString());
    }
}
I found what the problem was. When taking screenshots of selected regions on the desktop, I had to make sure that the height and width are divisible by 2. It seems that the Xvid MPEG-4 Codec does not accept just any image size.
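For example, before setting up the stream I now round the selected region down to even dimensions, something like this (a minimal sketch; selectedRegion stands for whatever rectangle the user picked, and _width/_height are the fields later used in SetFormat):

// Clear the lowest bit so both dimensions are even; odd sizes made
// AVIStreamSetFormat fail with the Xvid codec for me.
int width = selectedRegion.Width & ~1;
int height = selectedRegion.Height & ~1;

_width = (uint)width;
_height = (uint)height;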
I had the same problem.
In my case I was setting the scale and rate to 0.
Make sure you are specifying the frame rate of the AVI correctly before calling that function.
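For illustration, roughly what I mean (a sketch only; it assumes you already have the usual AVISTREAMINFO/RECT struct and AVIFileCreateStream P/Invoke declarations for avifil32.dll, aviFile is the handle returned by AVIFileOpen, and 25 fps is just an example value):

// Fill in dwScale/dwRate (fps = dwRate / dwScale) before creating the stream;
// leaving them at 0 made the later AVIStreamSetFormat call fail for me.
var streamInfo = new AVISTREAMINFO();
streamInfo.fccType = 0x73646976;                // mmioFOURCC('v','i','d','s')
streamInfo.dwScale = 1;                         // must not be left at 0
streamInfo.dwRate = 25;                         // 25 fps
streamInfo.dwSuggestedBufferSize = (uint)(_width * _height * 3);
streamInfo.rcFrame.right = (int)_width;
streamInfo.rcFrame.bottom = (int)_height;

IntPtr psRaw;
int hr = AVIFileCreateStream(aviFile, out psRaw, ref streamInfo);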
I would like to create a video file from a list of Bitmaps in C#, in any format that Media Player can open.
I have tried AForge and an AVI file wrapper, but unfortunately they only work in x86, and I have quite a lot of dependencies, so I cannot change the project type. It has to be x64.
All my Bitmaps are in a list (around 50 of them or so):
public List<Bitmap> tv_ImageData = new List<Bitmap>();
I am new to C# and don't know my way around much. I have googled and could find no solution. I'd be grateful if someone could point me in the right direction (or to a library).
(I feel like this would be better as a comment, but I don't have the reputation for that yet. I'm sorry if this is bad practice!)
Since your only problem with AForge seems to be that it is compiled for x86, I'll mention that it looks like you can recompile it yourself for an x64 target.
https://github.com/andrewkirillov/AForge.NET
A quick search found this link to a recompile of AForge that includes a 64-bit version:
https://archive.codeplex.com/?p=aforgeffmpeg
I wouldn't know if that's up to date or not, so I might recommend compiling it yourself.
I hope that helps!
After some tinkering with SharpAvi I solved my problem.
I had a list declared as
List<ushort[]> tv_Data = new List<ushort[]>();
which contained the frames as raw data (values in the 0-255 range).
I tried to use the example supplied by the documentation, but it gave me an upside-down AVI (I guess it's because SharpAvi expects DIB bitmaps). So I changed it a bit and borrowed a bit from here (How to create bitmap from byte array?) to get a working solution.
Here is my function:
using SharpAvi;
using SharpAvi.Output;
This may not be the best way to do it but it works. Hope someone will find it useful.
private void SaveAsVideo(object sender, RoutedEventArgs e)
{
    if (loadedFileName != "")
    {
        try
        {
            var writer = new AviWriter(string.Format("{0}.avi", fullPath))
            {
                FramesPerSecond = (decimal)VI.FrameRate,
                EmitIndex1 = true
            };
            var stream = writer.AddVideoStream();
            stream.Width = (int)VI.VideoWidth;
            stream.Height = (int)VI.VideoHeight;
            stream.Codec = KnownFourCCs.Codecs.Uncompressed;
            stream.BitsPerPixel = BitsPerPixel.Bpp8;

            foreach (ushort[] data in tv_Data)
            {
                // The raw values are already in the 0-255 range, so a narrowing cast is safe.
                byte[] byteData = Array.ConvertAll(data, v => (byte)v);
                // Flip the rows so SharpAvi's bottom-up (DIB) layout comes out the right way up.
                byte[] newbytes = PadLines(byteData, stream.Height, stream.Width);
                stream.WriteFrame(true, newbytes, 0, newbytes.Length);
            }
            writer.Close();
            MessageBox.Show("Video file saved.");
        }
        catch (Exception ex)
        {
            MessageBox.Show(string.Format("Failed to save video. \n {0}", ex.Message));
        }
    }
}
// Flips the frame vertically: each row is reversed, then the whole buffer is
// reversed, which restores the byte order within rows but inverts the row order.
static byte[] PadLines(byte[] bytes, int rows, int columns)
{
    int currentStride = columns;
    int newStride = columns;
    byte[] newBytes = new byte[newStride * rows];
    byte[] tempBytes = new byte[newStride];
    for (int i = 0; i < rows; i++)
    {
        Buffer.BlockCopy(bytes, currentStride * i, tempBytes, 0, currentStride);
        Array.Reverse(tempBytes);
        Buffer.BlockCopy(tempBytes, 0, newBytes, newStride * i, currentStride);
    }
    Array.Reverse(newBytes);
    return newBytes;
}
I am trying to do something very specific here. Here is the code:
else if (cmd == "streams")
{
    Console.WriteLine("Hello, thank you for testing out the streams beta.\nBefore you start, there are some things you should know.\n First off, all screenshots are saved to your documents folder. They are named streams and streams_green.\nTo stop recording, close the software.");
    Console.WriteLine("Please enter your monitors resolution (RECORD WITH YOUR GAME IN FULLSCREEN WINDOWED)");
    Console.WriteLine("X:");
    int xres = Convert.ToInt32(Console.ReadLine());
    Console.WriteLine("Y:");
    int yres = Convert.ToInt32(Console.ReadLine());
    p.Send(maxfps);
    Console.WriteLine("Thank you, recording now started.");
    Bitmap memoryImage;
    memoryImage = new Bitmap(xres, yres);
    Size s = new Size(memoryImage.Width, memoryImage.Height);
    Graphics memoryGraphics = Graphics.FromImage(memoryImage);
    for (var i = 0; ; i++)
    {
        memoryGraphics.CopyFromScreen(0, 0, 0, 0, s);
        string str = "";
        string str2 = "";
        str = string.Format(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments) +
            $@"\streams{i}.png");
        str2 = string.Format(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments) +
            $@"\streams_green{i}.png");
        System.Threading.Thread.Sleep(2);
        //Send spacebar would go here
        p.Send(green);
        memoryImage.Save(str);
        System.Threading.Thread.Sleep(1);
        p.Send(regular);
        memoryImage.Save(str2);
        System.Threading.Thread.Sleep(1);
    }
Essentially, what this code does is send a dvar to a game, take a screenshot, send another dvar, then take another screenshot, and so on, as you can see from p.Send(green) and p.Send(regular). However, I have an issue. I am trying to have this save all of the green-screen screenshots as streams_green and the regular ones as streams. It is currently saving all of the files, but the _green images are identical to the regular ones, even though the frame displayed at the moment the green screenshot was taken was completely different from the one before it. Thanks.
Based on the code, your method takes only one screenshot, at the line memoryGraphics.CopyFromScreen(0, 0, 0, 0, s);, and then saves that same image to two separate files with memoryImage.Save. If the green state (after p.Send(green)) and the regular state (after p.Send(regular)) each contain information you want to save, you need to capture a separate Bitmap for each state and save those.
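In other words, the loop needs its own CopyFromScreen call for each state, roughly like this (a sketch based on the question's code; the Sleep values are arbitrary, and the green capture is saved to the _green path per the stated intent, which is the opposite of the original pairing):

// Switch the game to the green-screen dvar, give it a moment, then capture.
p.Send(green);
System.Threading.Thread.Sleep(2);
memoryGraphics.CopyFromScreen(0, 0, 0, 0, s);
memoryImage.Save(str2);   // streams_green{i}.png

// Switch back to the regular dvar and capture a second, independent frame.
p.Send(regular);
System.Threading.Thread.Sleep(2);
memoryGraphics.CopyFromScreen(0, 0, 0, 0, s);
memoryImage.Save(str);    // streams{i}.png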
I've implemented a class that reads a 24 bit-per-pixel TIFF generated by Microsoft.Reporting.WinForms.ReportViewer, converts it to a 1 bit-per-pixel TIFF and stores the result in a file.
This part is working just fine - I'm able to open the resulting TIFF in a TIFF viewer and view the contents.
For compression I'm using the following codec:
outImage.SetField(TiffTag.COMPRESSION, Compression.CCITT_T6);
Now I'm trying to read the same 1 bit-per-pixel TIFF and decompress it. I wrote the following methods:
public static byte[] DecompressTiff(byte[] inputTiffBytes)
{
    using (var tiffStream = new MemoryStream(inputTiffBytes))
    using (var inImage = Tiff.ClientOpen("in-memory", "r", tiffStream, new TiffStream()))
    {
        if (inImage == null)
            return null;

        int totalPages = inImage.NumberOfDirectories();
        for (var i = 0; i < totalPages; )
        {
            if (!inImage.SetDirectory((short)i))
                return null;

            var decompressedTiff = DecompressTiff(inImage);
            ...
        }
private static byte[] DecompressTiff(Tiff image)
{
    // Read in the possibly multiple strips
    var stripSize = image.StripSize();
    var stripMax = image.NumberOfStrips();
    var imageOffset = 0;
    int row = 0;
    var bufferSize = image.NumberOfStrips() * stripSize;
    var buffer = new byte[bufferSize];

    int height = 0;
    var result = image.GetField(TiffTag.IMAGELENGTH);
    if (result != null)
        height = result[0].ToInt();

    int rowsperstrip = 0;
    result = image.GetField(TiffTag.ROWSPERSTRIP);
    if (result != null)
        rowsperstrip = result[0].ToInt();
    if (rowsperstrip > height && rowsperstrip != -1)
        rowsperstrip = height;

    for (var stripCount = 0; stripCount < stripMax; stripCount++)
    {
        int countToRead = (row + rowsperstrip > height) ? image.VStripSize(height - row) : stripSize;
        var readBytesCount = image.ReadEncodedStrip(stripCount, buffer, imageOffset, countToRead); // Returns -1 for the last strip of the very first page
        if (readBytesCount == -1)
            return null;
        imageOffset += readBytesCount;
        row += rowsperstrip;
    }
    return buffer;
}
The problem is that when ReadEncodedStrip() is called for the last strip of the very first page, it returns -1, indicating an error. I can't figure out what's wrong even after debugging the LibTiff.Net decoder code; it's something about an EOL marker being found where it isn't expected.
For some reason, LibTiff.Net can't read a TIFF produced by itself, or, more likely, I'm missing something. Here is the problem TIFF.
Could anyone please help me find the root cause?
After more than half a day of investigation, I've finally managed to find the cause of this strange issue.
To convert from a 24 bit-per-pixel TIFF to 1 bit-per-pixel, I ported the algorithms of the two tools shipped with the original libtiff, tiff2bw and tiffdither, from C to C#.
tiffdither has a bug: it doesn't include the last image row in the output, i.e. if you feed it an image 2200 rows high, you get an image 2199 rows high as output.
I noticed this bug at the very beginning of the porting and tried to fix it, but, as it turned out, not completely: the ported algorithm still didn't write the last row to the output TIFF via the WriteScanline() method. This was the reason LibTiff.Net wasn't able to read the last strip (or row, depending on which reading method I used) of the image.
What surprised me is that LibTiff.Net allows you to write such a corrupted TIFF without any error: the WriteDirectory() method returns true even when the image height set via TiffTag.IMAGELENGTH differs from the actual count of rows written. However, it later can't read such an image and throws an error while reading.
Maybe this behavior is inherited from the original libtiff, though.
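As a takeaway, a small guard like this before writing the directory would have caught the problem early (just a sketch; rowsWritten is a hypothetical counter kept by the conversion loop, and outImage is the output Tiff from the conversion code):

// LibTiff.Net will happily write a directory whose IMAGELENGTH disagrees with
// the number of scanlines actually written, so check it ourselves.
int declaredHeight = outImage.GetField(TiffTag.IMAGELENGTH)[0].ToInt();
if (rowsWritten != declaredHeight)
    throw new InvalidOperationException(
        string.Format("Wrote {0} scanlines but IMAGELENGTH is {1}.", rowsWritten, declaredHeight));
outImage.WriteDirectory();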
I am new to processing WAV files and to C#. My goal is real-time plotting of a WAV file's waveform; that is, while recording the sound (WAV) file, I want to plot its graph simultaneously. I searched some sound libraries and decided to use NAudio (I don't know whether it's the best choice for me; I'm open to any suggestions about choosing an audio library). However, I have no idea how to do real-time data plotting with sound. Some people suggest GDI, but as I said I am new and I think it will take too much time to use GDI efficiently. If I must learn GDI, please share any article that can help a beginner like me. Honestly, I don't really know where to start and need to be guided :)) And I have a question.
In one of the NAudio tutorials, the author works with a byte array to plot the waveform in a Chart. That is fine if you know the size of the WAV file. However, it works too slowly and gives an OutOfMemoryException for WAV files bigger than 10 MB. The code below shows what I mean.
OpenFileDialog open = new OpenFileDialog();
open.Filter = "Wave File (*.wav)|*.wav;";
if (open.ShowDialog() != DialogResult.OK) return;
chart1.Series.Add("wave");
chart1.Series["wave"].ChartType = System.Windows.Forms.DataVisualization.Charting.SeriesChartType.FastLine;
chart1.Series["wave"].ChartArea = "ChartArea1";
NAudio.Wave.WaveChannel32 wave = new NAudio.Wave.WaveChannel32(new NAudio.Wave.WaveFileReader(open.FileName));
byte[] buffer = new byte[426565];
int read;
while (wave.Position < wave.Length)
{
    read = wave.Read(buffer, 0, 426565);
    for (int i = 0; i < read / 4; i++)
    {
        chart1.Series["wave"].Points.Add(BitConverter.ToSingle(buffer, i * 4));
    }
}
Is there a way to perform this operation faster?
If you plot every single sample, you will end up with a waveform that is unmanageably large since audio usually contains many thousands of samples per second. A common way waveforms are drawn is by selecting the maximum value over a period of time, and then drawing a vertical line to represent it. For example, if you had a three minute song, and wanted a waveform around 600 pixels wide, each pixel would represent about a third of a second. So you'd find the largest sample value in that third of a second and use that to plot your waveform.
Also, in your sample code you are reading an odd number of bytes. But since this is floating point audio, you should always read in multiples of four bytes.
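To illustrate the peak-picking idea with NAudio (a rough sketch, not the author's exact code; the 600-point target and the variable names are just examples, and the chart is assumed to be set up as in the question):

// Plot one point per "bucket" of samples, using the peak absolute value in each bucket.
var wave = new NAudio.Wave.WaveChannel32(new NAudio.Wave.WaveFileReader(fileName));
int targetPoints = 600;                                   // desired width of the plot in points
long totalSamples = wave.Length / 4;                      // WaveChannel32 delivers 32-bit float samples
int samplesPerPoint = (int)Math.Max(1, totalSamples / targetPoints);

var buffer = new byte[samplesPerPoint * 4];               // always a multiple of 4 bytes
while (wave.Position < wave.Length)
{
    int read = wave.Read(buffer, 0, buffer.Length);
    float peak = 0f;
    for (int i = 0; i < read / 4; i++)
    {
        float sample = Math.Abs(BitConverter.ToSingle(buffer, i * 4));
        if (sample > peak) peak = sample;
    }
    chart1.Series["wave"].Points.Add(peak);               // one chart point per bucket
}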
This worked for me
WaveChannel32 wave = new WaveChannel32(new WaveFileReader(txtWave.Text));
int sampleSize = 1024;
var bufferSize = 16384 * sampleSize;
var buffer = new byte[bufferSize];
int read = 0;
chart.Series.Add("wave");
chart.Series["wave"].ChartType = System.Windows.Forms.DataVisualization.Charting.SeriesChartType.FastLine;
chart.Series["wave"].ChartArea = "ChartArea1";
while (wave.Position < wave.Length)
{
    read = wave.Read(buffer, 0, bufferSize);
    for (int i = 0; i < read / sampleSize; i++)
    {
        var point = BitConverter.ToSingle(buffer, i * sampleSize);
        chart.Series["wave"].Points.Add(point);
    }
}
I've got a PCX decoder in C# (see below) that is meant to read in a Stream and output a Bitmap. It works when dealing with an image whose dimensions are multiples of 8, and seems to work with most images that are less than 8bpp regardless of dimensions, but images with other dimensions become skewed in an unusual way (see this link). The pixels are all there, it's just that the image seems to be shifted to the left in a weird way. The image is a valid PCX and opens in IrfanView and Paint.net.
Edit 1:
Okay, here's the result of quite a bit of testing: images with a byte-per-line value that divides by 8 (e.g. 316x256) decode fine, but images with an odd value don't. This is not true for all PCX files; it would seem that some (most?) images created in IrfanView work fine, but those I've found elsewhere do not. I was working on this some time ago, so I can't recall where they came from, but I do know that images saved with the paint.net plug-in (here) also reproduce this problem. I think it's likely a padding issue, either with them or with my decoder, but since the images decode fine elsewhere, it's likely I'm the one with the problem; I just can't see where :(
End of Edit 1.
My code for importing is here (there's a lot, but it's the whole decoding algorithm, minus the header, which is processed separately):
public IntPtr ReadPixels(Int32 BytesPerScanline, Int32 ScanLines, Stream file)
{
    // BytesPerScanline is taken from the header, ScanLines is the height and file is the file stream
    IntPtr pBits;
    Boolean bRepeat;
    Int32 RepeatCount;
    Byte ReadByte;
    Int32 Row = 0;
    Int32 Col = 0;
    Byte[] PCXData = new Byte[BytesPerScanline * ScanLines];
    BinaryReader r = new BinaryReader(file);
    r.BaseStream.Seek(128, SeekOrigin.Begin);
    while (Row < ScanLines)
    {
        ReadByte = r.ReadByte();
        bRepeat = (0xc0 == (ReadByte & 0xC0));
        RepeatCount = (ReadByte & 0x3f);
        if (!(Col >= BytesPerScanline))
        {
            if (bRepeat)
            {
                ReadByte = r.ReadByte();
                while (RepeatCount > 0)
                {
                    PCXData[(Row * BytesPerScanline) + Col] = ReadByte;
                    RepeatCount -= 1;
                    Col += 1;
                }
            }
            else
            {
                PCXData[(Row * BytesPerScanline) + Col] = ReadByte;
                Col += 1;
            }
        }
        if (Col >= BytesPerScanline)
        {
            Col = 0;
            Row += 1;
        }
    }
    pBits = System.Runtime.InteropServices.Marshal.AllocHGlobal(PCXData.Length);
    System.Runtime.InteropServices.Marshal.Copy(PCXData, 0, pBits, PCXData.Length);
    return pBits;
}
I've been advised that it might be a padding issue, but I can't see where that would be in the code, and I'm struggling to understand where the padding would come in.
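For what it's worth, one place padding often bites in this kind of code, and this is only a guess about where the skew comes from, is the copy from the decoded buffer into the GDI+ Bitmap: GDI+ pads every row to a 4-byte boundary, so if BytesPerScanline isn't a multiple of 4 a single flat copy shifts each successive row. A row-by-row copy that respects the Bitmap's Stride avoids that (a sketch only; width, height, PCXData and BytesPerScanline follow the names in the decoder above, and the 8bpp pixel format is purely for illustration):

// using System.Drawing; using System.Drawing.Imaging; using System.Runtime.InteropServices;

// Copy row by row so the Bitmap's padded stride doesn't shear the image
// when BytesPerScanline isn't a multiple of 4.
Bitmap bmp = new Bitmap(width, height, PixelFormat.Format8bppIndexed);
BitmapData data = bmp.LockBits(new Rectangle(0, 0, width, height),
                               ImageLockMode.WriteOnly, bmp.PixelFormat);
try
{
    for (int row = 0; row < height; row++)
    {
        // Source rows are BytesPerScanline apart; destination rows are data.Stride apart.
        Marshal.Copy(PCXData, row * BytesPerScanline,
                     IntPtr.Add(data.Scan0, row * data.Stride), width);
    }
}
finally
{
    bmp.UnlockBits(data);
}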