I'm trying to create a DICOM video from an mp4 file and add my own tags. I can create the file, but when I try to play the video in a DICOM viewer like MicroDicom I only see a black image.
public async void VideoToDicom(string srcFilePath, string videoName, string videoId)
{
    // var dataset = new DicomDataset(DicomTransferSyntax.MPEG4AVCH264BDCompatibleHighProfileLevel41);
    var dataset = new DicomDataset(DicomTransferSyntax.MPEG4AVCH264HighProfileLevel42For2DVideo);
    var metadata = await VideoUtils.Instance.GetVideoMetaData(srcFilePath);

    dataset.Add(DicomTag.SOPClassUID, DicomUID.CTImageStorage);
    dataset.Add(DicomTag.SOPInstanceUID, DicomUID.Generate());
    dataset.Add(DicomTag.StudyInstanceUID, DicomUID.Generate());
    dataset.Add(DicomTag.SeriesInstanceUID, DicomUID.Generate());

    var numberOfFrames = metadata.Duration.TotalSeconds * metadata.VideoData.Fps;
    var size = metadata.VideoData.FrameSize.Split("x");
    dataset.Add(DicomTag.NumberOfFrames, (int)numberOfFrames);
    dataset.Add(DicomTag.Columns, Convert.ToUInt16(size[1]));
    dataset.Add(DicomTag.Rows, Convert.ToUInt16(size[0]));
    dataset.Add(DicomTag.BitsAllocated, "8");
    dataset.Add(DicomTag.FrameAcquisitionDuration, "40");
    dataset.Add(DicomTag.PixelRepresentation, (ushort)0);
    dataset.Add(DicomTag.VideoImageFormatAcquired, "MPEG4");
    dataset.Add(DicomTag.LossyImageCompressionMethod, "ISO_14496_10");
    dataset.Add(DicomTag.LossyImageCompression, "01");
    dataset.Add(DicomTag.PhotometricInterpretation, PhotometricInterpretation.YbrPartial420.Value);

    DicomPixelData pixelData = CreateDicomVideoPixelData(metadata, dataset);
    byte[] videoBytes = GetVideoData(srcFilePath);
    MemoryByteBuffer buffer = new MemoryByteBuffer(videoBytes);
    pixelData.AddFrame(buffer);

    var dicomFile = new DicomFile(dataset);
    dicomFile.FileMetaInfo.TransferSyntax = DicomTransferSyntax.MPEG4AVCH264HighProfileLevel42For2DVideo;
    dicomFile.Save("C:\\testvideos\\test.dcm");
}
private DicomPixelData CreateDicomVideoPixelData(MetaData metadata, DicomDataset dataset)
{
    DicomPixelData pixelData = DicomPixelData.Create(dataset, true);
    var numberOfFrames = metadata.Duration.TotalSeconds * metadata.VideoData.Fps;
    var size = metadata.VideoData.FrameSize.Split("x");
    pixelData.Width = Convert.ToUInt16(size[0]);
    pixelData.Height = Convert.ToUInt16(size[1]);
    pixelData.NumberOfFrames = (int)numberOfFrames;
    pixelData.HighBit = 7;
    pixelData.BitsStored = 8;
    //pixelData.BitsAllocated = 8; Readonly?
    pixelData.SamplesPerPixel = 3;
    pixelData.PlanarConfiguration = 0;
    pixelData.PixelRepresentation = 0;
    //pixelData.PhotometricInterpretation = PhotometricInterpretation.YbrPartial420;
    return pixelData;
}
private static byte[] GetVideoData(string videoFile)
{
    byte[] buffer;
    FileStream fileStream = new FileStream(videoFile, FileMode.Open, FileAccess.Read);
    try
    {
        int length = (int)fileStream.Length; // get file length
        buffer = new byte[length];           // create buffer
        int count;                           // actual number of bytes read
        int sum = 0;                         // total number of bytes read
        // read until Read returns 0 (end of the stream has been reached)
        while ((count = fileStream.Read(buffer, sum, length - sum)) > 0)
            sum += count; // sum is the buffer offset for the next read
    }
    finally
    {
        fileStream.Close();
    }
    return buffer;
}
Even if I can't see the video, I'd at least expect to get all the frames from it.
Thanks for the help.
[Screenshot: my file in the DICOM viewer]
Related
I am trying to replace an image stream within an SDF document, using PDFNet 7.0.4 and netcoreapp3.1. As much as possible, I want to maintain the original object and its metadata; same dimensions, color system, compression, etc. Ideally object number and even generation would remain the same as well - the goal is that a before and after comparison would show only the changed pixels within the stream.
I'm getting the raw pixel data as a Stream object with this method:
private Stream GetImageData(int objectNum)
{
    var image = new PDF.Image(sdfDoc.GetObj(objectNum));
    var bits = image.GetBitsPerComponent();
    var channels = image.GetComponentNum();
    var bytesPerChannel = bits / 8;
    var height = image.GetImageHeight();
    var width = image.GetImageWidth();
    var data = image.GetImageData();
    var len = height * width * channels * bytesPerChannel;
    using (var reader = new pdftron.Filters.FilterReader(data))
    {
        var buffer = new byte[len];
        reader.Read(buffer);
        return new MemoryStream(buffer);
    }
}
After manipulating the image data, I want to update it before saving the underlying SDFDoc object. I've tried using the following method:
private void SetImageData(int objectNum, Stream stream)
{
    var image = new PDF.Image(sdfDoc.GetObj(objectNum));
    var bits = image.GetBitsPerComponent();
    var channels = image.GetComponentNum();
    var bytesPerChannel = bits / 8;
    var height = image.GetImageHeight();
    var width = image.GetImageWidth();
    var len = height * width * channels * bytesPerChannel;
    if (stream.Length != len) { throw new DataMisalignedException("Stream length does not match expected image dimensions"); }
    using (var ms = new MemoryStream())
    using (var writer = new pdftron.Filters.FilterWriter(image.GetImageData()))
    {
        stream.CopyTo(ms);
        writer.WriteBuffer(ms.ToArray());
    }
}
This runs without error, but nothing actually appears to get updated. I've tried playing around with SDFObj.SetStreamData(), but haven't been able to make that work either. What is the lowest impact, highest performance way to directly replace just the raw pixel data within an image stream?
Edit:
I have this halfway working with this method:
private void SetImageData(int objectNum, Stream stream)
{
    var sdfObj = sdfDoc.GetObj(objectNum);
    var image = new PDF.Image(sdfObj);
    var bits = image.GetBitsPerComponent();
    var channels = image.GetComponentNum();
    var bytesPerChannel = bits / 8;
    var height = image.GetImageHeight();
    var width = image.GetImageWidth();
    var len = height * width * channels * bytesPerChannel;
    if (stream.Length != len) { throw new DataMisalignedException("Stream length does not match expected image dimensions"); }
    var buffer = new byte[len];
    stream.Read(buffer, 0, len);
    sdfObj.SetStreamData(buffer);
    sdfObj.Erase("Filters");
}
This works as expected, but with the obvious caveat that it just ignores any existing compression and turns the image into a raw uncompressed stream.
I've also tried sdfObj.SetStreamData(buffer, image.GetImageData()); and sdfObj.SetStreamData(buffer, image.GetImageData().GetAttachedFilter()); these do update the object in the file, but the resulting image fails to render.
The following code shows how to retain an Image object, but change the actual stream data.
static private Stream GetImageData(Obj o)
{
    var image = new pdftron.PDF.Image(o);
    var bits = image.GetBitsPerComponent();
    var channels = image.GetComponentNum();
    var bytesPerChannel = bits / 8;
    var height = image.GetImageHeight();
    var width = image.GetImageWidth();
    var data = image.GetImageData();
    var len = height * width * channels * bytesPerChannel;
    using (var reader = new pdftron.Filters.FilterReader(data))
    {
        var buffer = new byte[len];
        reader.Read(buffer);
        return new MemoryStream(buffer);
    }
}
static private void SetImageData(PDFDoc doc, Obj o, Stream stream)
{
    var image = new pdftron.PDF.Image(o);
    var bits = image.GetBitsPerComponent();
    var channels = image.GetComponentNum();
    var bytesPerChannel = bits / 8;
    var height = image.GetImageHeight();
    var width = image.GetImageWidth();
    var len = height * width * channels * bytesPerChannel;
    if (stream.Length != len) { throw new DataMisalignedException("Stream length does not match expected image dimensions"); }
    o.Erase("DecodeParms"); // Important: this won't be accurate after SetStreamData
    // now we actually do the stream swap
    o.SetStreamData((stream as MemoryStream).ToArray(), new FlateEncode(null));
}
static private void InvertPixels(Stream stream)
{
    // This function is for DEMO purposes
    // this code assumes 3-channel, 8-bit data
    long length = stream.Length;
    long pixels = length / 3;
    for (int p = 0; p < pixels; ++p)
    {
        int c1 = stream.ReadByte();
        int c2 = stream.ReadByte();
        int c3 = stream.ReadByte();
        stream.Seek(-3, SeekOrigin.Current);
        stream.WriteByte((byte)(255 - c1));
        stream.WriteByte((byte)(255 - c2));
        stream.WriteByte((byte)(255 - c3));
    }
    stream.Seek(0, SeekOrigin.Begin);
}
And here is sample code that uses it:
static void Main(string[] args)
{
    PDFNet.Initialize();
    var x = new PDFDoc(@"2002.04610.pdf");
    x.InitSecurityHandler();
    var o = x.GetSDFDoc().GetObj(381);
    Stream source = GetImageData(o);
    InvertPixels(source);
    SetImageData(x, o, source);
    x.Save(@"2002.04610-MOD.pdf", SDFDoc.SaveOptions.e_remove_unused);
}
I'm using NAudio to open a wav file.
After using the SimpleCompressor class I also need to normalize the volume of the file to 0 dB, but I have no idea how to do that.
At the moment I have this:
string strCompressedFile = "";
byte[] WaveData = new byte[audio.Length];
SimpleCompressorStream Compressor = new SimpleCompressorStream(audio);
Compressor.Enabled = true;
if (Compressor.Read(WaveData, 0, WaveData.Length) > 0)
{
    // doing the normalizing now
}
How can I get the volume from the new byte array WaveData, and how can I change it?
WaveData contains the entire wav file, including the file header.
You can definitely change the individual sample values so that they fit a maximum level:
string strCompressedFile = "";
byte[] WaveData = new byte[audio.Length];
SimpleCompressorStream Compressor = new SimpleCompressorStream(audio);
Compressor.Enabled = true;
byte maxLevel = 20;
if (Compressor.Read(WaveData, 0, WaveData.Length) > 0)
{
    for (int i = 0; i < audio.Length; i++)
    {
        if (WaveData[i] > maxLevel)
        {
            WaveData[i] = maxLevel;
        }
    }
}
I've added a loop that iterates through all the samples; if a value is higher than maxLevel, we set it to maxLevel.
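If by normalizing to 0 dB you mean scaling the loudest sample up to full scale, a different sketch would be needed that works on 16-bit samples rather than raw bytes. The following is untested and assumes WaveData holds a canonical 44-byte WAV header followed by 16-bit PCM samples (both assumptions, not guaranteed by SimpleCompressorStream):

// Sketch only: assumes WaveData = 44-byte canonical WAV header + 16-bit PCM.
const int headerSize = 44;
int peak = 1;
for (int i = headerSize; i < WaveData.Length - 1; i += 2)
{
    int sample = BitConverter.ToInt16(WaveData, i);
    peak = Math.Max(peak, Math.Abs(sample));
}
float gain = 32767f / peak; // gain that brings the loudest sample to 0 dBFS
for (int i = headerSize; i < WaveData.Length - 1; i += 2)
{
    int sample = BitConverter.ToInt16(WaveData, i);
    short scaled = (short)Math.Max(short.MinValue, Math.Min(short.MaxValue, (int)Math.Round(sample * gain)));
    BitConverter.GetBytes(scaled).CopyTo(WaveData, i); // write the scaled sample back in place
}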
I am new to NAudio and am using it to get PCM data from MP3 files. This is my code to get PCM from a mono-channel file, but I don't know how to do it with a stereo-channel file:
Mp3FileReader file = new Mp3FileReader(op.FileName);
int _Bytes = (int)file.Length;
byte[] Buffer = new byte[_Bytes];
file.Read(Buffer, 0, (int)_Bytes);
for (int i = 0; i < Buffer.Length - 2; i += 2)
{
    byte[] Sample_Byte = new byte[2];
    Sample_Byte[0] = Buffer[i + 1];
    Sample_Byte[1] = Buffer[i + 2];
    Int16 _ConvertedSample = BitConverter.ToInt16(Sample_Byte, 0);
}
How can I get PCM from a stereo-channel MP3 file?
In a stereo file the samples are interleaved: one left-channel sample followed by one right-channel sample, and so on. So in your loop you should go through four bytes at a time to read out both samples.
Also, there are some bugs in your code. You should use the return value of Read, not the size of the buffer; you have an off-by-one error in the code that accesses the samples; and there is no need to copy into a temporary buffer.
Something like this should work for you:
var file = new Mp3FileReader(fileName);
int _Bytes = (int)file.Length;
byte[] Buffer = new byte[_Bytes];
int read = file.Read(Buffer, 0, (int)_Bytes);
for (int i = 0; i < read; i += 4)
{
    Int16 leftSample = BitConverter.ToInt16(Buffer, i);
    Int16 rightSample = BitConverter.ToInt16(Buffer, i + 2);
}
I'm working with large files, starting from 10 GB. I'm loading parts of the file into memory for processing. The following code works fine for smaller files (700 MB):
byte[] byteArr = new byte[layerPixelCount];
using (FileStream fs = File.OpenRead(recFileName))
{
    using (BinaryReader br = new BinaryReader(fs))
    {
        fs.Seek(offset, SeekOrigin.Begin);
        for (int i = 0; i < byteArr.Length; i++)
        {
            byteArr[i] = (byte)(br.ReadUInt16() / 256);
        }
    }
}
After opening a 10 GB file, the first run of this function is OK, but the second Seek() throws an IO exception:
An attempt was made to move the file pointer before the beginning of the file.
The numbers are:
fs.Length = 11998628352
offset = 4252580352
byteArr.Length = 7746048
I assumed that GC didn't collect the closed fs reference before the second call and tried
GC.Collect();
GC.WaitForPendingFinalizers();
but no luck.
Any help is appreciated.
I'm guessing it's because either your signed integer indexer or offset is rolling over to negative values. Try declaring offset and i as long.
// Offset is now long
long offset = 4252580352;
byte[] byteArr = new byte[layerPixelCount];
using (FileStream fs = File.OpenRead(recFileName))
{
    using (BinaryReader br = new BinaryReader(fs))
    {
        fs.Seek(offset, SeekOrigin.Begin);
        for (long i = 0; i < byteArr.Length; i++)
        {
            byteArr[i] = (byte)(br.ReadUInt16() / 256);
        }
    }
}
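You can reproduce the failing number from the question with a quick standalone check: 4252580352 is larger than int.MaxValue (2147483647), so casting it to int wraps to a negative offset, which is exactly what Seek complains about.

long realOffset = 4252580352;
int wrapped = unchecked((int)realOffset);
Console.WriteLine(int.MaxValue); // 2147483647
Console.WriteLine(wrapped);      // -42386944, i.e. "before the beginning of the file"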
The following code logic works with large files beyond 4 GB. The key thing to notice is the long data type used with the Seek method, since a long can point beyond the 2^32 byte boundary. The code first processes the file in whole 1 GB chunks; after those are done, the leftover (< 1 GB) bytes are processed. I use this code to calculate the CRC of files beyond 4 GB (using https://crc32c.machinezoo.com/ for the crc32c calculation in this example).
private uint Crc32CAlgorithmBigCrc(string fileName)
{
    uint hash = 0;
    byte[] buffer = null;
    FileInfo fileInfo = new FileInfo(fileName);
    long fileLength = fileInfo.Length;
    int blockSize = 1024000000;
    int blocks = (int)(fileLength / blockSize);
    // long arithmetic here: blocks * blockSize would overflow int for files > 2 GB
    int restBytes = (int)(fileLength - ((long)blocks * blockSize));
    long offsetFile = 0;
    bool firstBlock = true;
    using (FileStream fs = new FileStream(fileName, FileMode.Open, FileAccess.Read))
    using (BinaryReader br = new BinaryReader(fs))
    {
        while (blocks > 0)
        {
            blocks -= 1;
            fs.Seek(offsetFile, SeekOrigin.Begin); // long offset, so this works past 2^32
            buffer = br.ReadBytes(blockSize);
            if (firstBlock)
            {
                firstBlock = false;
                hash = Crc32CAlgorithm.Compute(buffer);
            }
            else
            {
                // append each block to the running hash, not to the first block's hash
                hash = Crc32CAlgorithm.Append(hash, buffer);
            }
            offsetFile += blockSize;
        }
        if (restBytes > 0)
        {
            fs.Seek(offsetFile, SeekOrigin.Begin);
            buffer = br.ReadBytes(restBytes);
            hash = firstBlock ? Crc32CAlgorithm.Compute(buffer)
                              : Crc32CAlgorithm.Append(hash, buffer);
        }
    }
    //MessageBox.Show(hash.ToString());
    //MessageBox.Show(hash.ToString("X"));
    return hash;
}
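Usage is then just a call with the file path. The path below is hypothetical, and this assumes the Crc32C.NET package linked above is referenced:

uint crc = Crc32CAlgorithmBigCrc(@"D:\images\disk-backup.bin"); // hypothetical path
Console.WriteLine(crc.ToString("X8")); // print the checksum as hex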
I have a class in a DLL which parses a file and returns a Stream that represents a FAT image (or some other image type).
My problem is that when it's any other image, the class creates about 3702 null bytes (on average) at the beginning of the stream.
So I have to edit the stream first and then save it to a file.
I already have code for this, but it works slowly.
[Note: fts is the returned FileStream.]
BufferedStream bfs = new BufferedStream(fts);
BinaryReader bbr = new BinaryReader(bfs);
byte[] all_bytes = bbr.ReadBytes((int)fts.Length);
List<byte> nls = new List<byte>();
int index = 0;
foreach (byte bbrs in all_bytes)
{
    if (bbrs == 0x00)
    {
        index++;
        nls.Add(bbrs);
    }
    else
    {
        break;
    }
}
byte[] nulls = new byte[nls.Count];
nulls = nls.ToArray();
//File.WriteAllBytes(outputDir + "Nulls.bin", nulls);
long siz = fts.Length - index;
byte[] file = new byte[siz];
bbr.BaseStream.Position = index;
file = bbr.ReadBytes((int)siz);
bbr.Close();
bfs.Close();
fts.Close();
bfs = null;
fts = null;
fts = new FileStream(outputDir + "Image.bin", FileMode.Create, FileAccess.Write);
bfs = new BufferedStream(fts);
bfs.Write(file, 0, (int)siz);
bfs.Close();
fts.Close();
Now, my question is :
How can I remove the nulls more efficiently and faster than the above code?
Instead of pushing bytes onto a List, you could simply loop through your stream until you find the first non-null byte and then just copy the array from there using Array.Copy.
I would think about something like this (untested code):
int index = 0;
int currByte = 0;
// read from the BufferedStream (bfs), not from the loop variable of your foreach
while ((currByte = bfs.ReadByte()) == 0x00)
{
    index++;
}
// now currByte and everything to the end of the stream are the bytes you want.
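If you'd rather not hold the whole image in memory at all, a variation on the same idea (an untested sketch; it assumes fts is positioned at the start of the stream and that outputDir is the same output folder as in your code) is to skip the nulls and let Stream.CopyTo do the rest:

using (var output = new FileStream(outputDir + "Image.bin", FileMode.Create, FileAccess.Write))
{
    int b;
    while ((b = fts.ReadByte()) == 0x00) { } // consume the leading nulls
    if (b != -1)
    {
        output.WriteByte((byte)b); // the first non-null byte was already consumed
        fts.CopyTo(output);        // stream the remainder without buffering it all
    }
}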