I'm experiencing an issue when adding a file to a DICOMDIR. Based on this example, I've successfully created an image from a series and saved it to disk. I then tried adding that file to the DICOMDIR so that the directory references the new file. The save succeeds, but when I open the DICOMDIR and its series again, I get a "Tag: (0088,0200) not found in dataset" exception.
Code is as follows:
var dataset = new DicomDataset();
this.FillDataset(dataset); //this function copies several Tag values of an already existing DICOM Series file, such as Patient information
dataset.Add(DicomTag.PhotometricInterpretation, PhotometricInterpretation.Rgb.Value);
dataset.Add(DicomTag.Rows, (ushort)rows);
dataset.Add(DicomTag.Columns, (ushort)columns);
var pixelData = DicomPixelData.Create(dataset, true);
pixelData.AddFrame(buffer);
var dicomfile = new DicomFile(dataset);
var pathImage = Path.Combine(dirImages.FullName, imageFileName);
dicomfile.Save(pathImage); //Image is saved fine and it's well formed, I've checked opening it with an online DICOM viewer
var dicomdirPath = Path.Combine(studyPath, Constants.DICOMDIRFileName);
var dicomdir = DicomDirectory.Open(dicomdirPath);
dicomdir.AddFile(dicomfile, $@"Images\{imageFileName}");
dicomdir.Save(dicomdirPath); //this executes without problems and the DICOMDIR is saved
And this is the series opening method:
var dicomDirectory = await DicomDirectory.OpenAsync(dicomdirPath);
foreach (var patientRecord in dicomDirectory.RootDirectoryRecordCollection)
{
foreach (var studyRecord in patientRecord.LowerLevelDirectoryRecordCollection)
{
foreach (var seriesRecord in studyRecord.LowerLevelDirectoryRecordCollection)
{
foreach (var imageRecord in seriesRecord.LowerLevelDirectoryRecordCollection)
{
var dicomDataset = imageRecord.GetSequence(DicomTag.IconImageSequence).Items.First(); //This line works fine before saving the image in the method above, but throws when opening the same study
//Load data and series from dataset
}
}
}
}
I don't know if I'm missing something about how a DICOMDIR file should be saved, or if this is a bug.
You are trying to access the Icon Image Sequence (0088,0200), which simply isn't present. A DICOMDIR only contains some basic data about each referenced image; when you add an image to the DICOMDIR, it is up to you to add any additional information.
One of those optional pieces of information, which fo-dicom does not add automatically, is the icon. A DICOMDIR may contain a small icon for each image so that previews can be displayed quickly.
The imageRecord itself should already contain all the information you are likely to need, such as the instance UID or the referenced file name.
I don't know why that line of code worked before you stored the file with fo-dicom. I assume the DICOMDIR was originally created by some other application that included the icon; the foreach then crashes when it reaches the newly added entry.
You could either add an icon yourself when adding the new instance to the DICOMDIR, or add a check like if (imageRecord.TryGetSequence(DicomTag.IconImageSequence, out var seq)) ... to handle the cases where there is no icon.
I recommend adding the check anyway, because some day you might read a DICOMDIR that references a structured report, and structured reports have no pixel data and therefore will not include an icon.
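For illustration, here is a minimal sketch of the inner loop with such a guard (variable names beyond those in the question are my own; TryGetSequence is the fo-dicom dataset helper the check above refers to):
foreach (var imageRecord in seriesRecord.LowerLevelDirectoryRecordCollection)
{
    // Only build a preview if the record actually carries an icon
    if (imageRecord.TryGetSequence(DicomTag.IconImageSequence, out var iconSequence)
        && iconSequence.Items.Count > 0)
    {
        var iconDataset = iconSequence.Items.First();
        // ... build the preview from iconDataset
    }
    // The record itself still references the instance, e.g. via
    // DicomTag.ReferencedSOPInstanceUIDInFile and DicomTag.ReferencedFileID
}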
Related
I am trying to do what I thought would be a simple task, but can't figure it out.
I have created a new Xamarin Forms project (using .NET Standard as the sharing strategy).
I created a text file and I want to read it into an array so I can use it in my app.
I added the text file to my project, by adding it to the top project, which doesn't have an OS associated with it.
In my app I have the following code:
InitializeComponent ();
//Define our array variable
string[] quotations = new string[10];
//Read the text from text file and populate array
StreamReader SR = new StreamReader(@"Quotations.txt");
for (int i = 0; i < 9; i++)
{
quotations[i] = SR.ReadLine();
}
Quote.Text = quotations[0];
//Close text file
SR.Close();
...
I have checked the properties of the file and set them to 'Build Action: Embedded Resource' in the top project.
(I haven't added the file into the individual OS projects...)
When I run my application on iOS, it generates an exception and the app exits almost as soon as it has started:
Unhandled Exception: System.IO.FileNotFoundException:
How can I attach a file to my project and read its contents into an array on iOS?
Thanks
new StreamReader(@"Quotations.txt") loads a file from the file system, but your data isn't in the file system - it is embedded in the assembly.
Embedded resources need to be accessed in a special way - in particular via yourAssembly.GetManifestResourceStream(resourceName). Note that the embedded name might not be quite what you expect, so the best thing to do is to (as a quick test) use yourAssembly.GetManifestResourceNames() and write out the names that are embedded. That'll tell you the one to actually include as resourceName. Once you have the Stream, you can use a new StreamReader(theResourceStream) on it.
Note: the easiest way to get yourAssembly is something like typeof(SomeTypeInYourAssembly).Assembly.
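A minimal sketch of that approach (the resource name "MyApp.Quotations.txt" and the MainPage type are assumptions; embedded names are usually prefixed with the project's default namespace, so list them once to find the exact one):
var assembly = typeof(MainPage).Assembly;  // any type defined in the shared project

// Quick test: write out the embedded resource names to find the right one
foreach (var name in assembly.GetManifestResourceNames())
    System.Diagnostics.Debug.WriteLine(name);

string[] quotations = new string[10];
using (var stream = assembly.GetManifestResourceStream("MyApp.Quotations.txt"))
using (var reader = new StreamReader(stream))
{
    for (int i = 0; i < quotations.Length; i++)
        quotations[i] = reader.ReadLine();
}
Quote.Text = quotations[0];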
I have two code refactorings: one that, amongst other things, generates a CSV file and adds it to the project in the solution, and another that is supposed to edit that file.
I am apparently generating and attaching the CSV file to the project successfully (it shows up in Solution Explorer afterwards) through this code:
var doc = editor.GetChangedDocument();
SourceText st = SourceText.From(csv.GetStringDocument(), Encoding.Default);
var additionalDoc = doc.Project.AddAdditionalDocument(filename, st, new List<string> { "_TestData" });
return additionalDoc.Project.Solution;
Before returning, if I check additionalDoc.Project.AdditionalDocuments.Count(), its value is 1.
In the other refactoring, however, when I try to access the additional documents of the project, the IEnumerable is empty.
How is one supposed to access additional documents added by code refactorings/code fixes?
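For context, the second refactoring does roughly the following (the lookup shown here is my assumption of the intent, not the exact code):
// In the second refactoring, given the Project obtained from its own context:
var testData = project.AdditionalDocuments
    .FirstOrDefault(d => d.Folders.Contains("_TestData"));
// project.AdditionalDocuments is empty at this point, even though the first
// refactoring returned a solution in which it contained one document.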
I have a program that searches through a folder at a given path, and pulls out the paths of any files inside the folder or any subfolders inside it.
From here, it uses the individual file path to create an object of a custom class called ImageData. The method that handles this is shown below:
public void saveLatestImages(string chosenPath)
{
//if there is a Pictures folder
if (Directory.Exists(chosenPath))
{
//get number of files in folder
int fileCount = Directory.GetFiles(chosenPath).Count();
//at least one file in folder
if (fileCount > 0)
{
//create data structures to store file info
//filePaths holds path of each file represented as a string
string[] filePaths = Directory.GetFiles(chosenPath);
//for each file in Pictures...
for (int index = 0; index < fileCount; ++index)
{
//get name of image at current index
imageName = filePaths[index];
//separate the part relating to the patient name (everything before (DD/MM/YYYY))
string subSpecifier = imageName.Split('\\').Last();
subSpecifier = subSpecifier.Split('_')[0];
//add to root directory to form subfolder name
subDirectory = Path.Combine(rootDirectory, subSpecifier);
//subdirectory name formulated, check for pre-existing
//subfolder does not exist
if (!Directory.Exists(subDirectory))
{
//create it
Directory.CreateDirectory(subDirectory);
}
//otherwise, file will be added to existing directory
//take everything from end and folder\file division to get unique filename
string fileName = imageName.Split('\\').Last();
//add this to the existing subDirectory
fileName = Path.Combine(subDirectory, fileName);
//copy the image into the subfolder using this unique filename
File.Copy(imageName, fileName, true); //true gives instruction to overwrite any existing file with the same path
//construct new instance with created filename
imageData.Add(new ImageData(fileName));
}
}
}
}
So far, so good.
The problem comes afterwards when the ImageData object created is being displayed on a PictureBox (using a Bitmap attribute). When this image is on the picture box, a number of options are available through buttons.
For example, one button is in place that would remove the ImageData object from the picturebox and delete the file.
This is done using the method below:
private void btnDeleteImage_Click(object sender, EventArgs e)
{
///////////////////////////////////////////////////////////////////////////////////
//imageData is List<ImageData> that contains all ImageData objects currently in use
//imageSlider is the PictureBox where the images are displayed/////////////////////
///////////////////////////////////////////////////////////////////////////////////
//identify image currently on picturebox
Image displayImage = imageData[displayImageIndex()].getThumbnailImage();
//get the file path of this image
string displayImagePath = imageData[displayImageIndex()].getImagePath();
//move to next or previous image in list
//then remove image that was just viewed
//current image not last in list
if (!(imageSlider.Image.Equals(imageData.Last().getThumbnailImage())))
{
displayImage = imageData[displayImageIndex() + 1].getThumbnailImage();
//display the next image in the list
imageSlider.Image = displayImage;
//delete the image just moved on from from list
imageData.RemoveAt(displayImageIndex() - 1);
//delete the file path at this index in the paths list
File.Delete(displayImagePath);
}
//current image is last in list
else
{
displayImage = imageData[displayImageIndex() - 1].getThumbnailImage();
//display previous image in list
imageSlider.Image = displayImage;
//delete the image just moved on from from list
imageData.RemoveAt(displayImageIndex() + 1);
//delete the file path at this index in the paths list
File.Delete(displayImagePath); // <--- ERROR OCCURS HERE
}
//check for prior and successive elements in list
checkFirstLast(displayImage);
updateImageInfo();
}
On the File.Delete() command, an exception occurs to inform me that the 'File cannot be accessed because it is being used by another process'.
Basically, the file is opened when it is brought into the program, and is never closed. This means that when I try to access the file to delete (or do something else with it), this cannot be done as the program currently stands.
I know that if I was using a FileStream object then I could call the .Close() method once the object was finished with. But seeing as all file access is done using string variables which are then used to create images, there does not seem to be an equivalent method available to me.
Does anyone know of any other way to implement this behaviour? If this is not going to be possible, is it possible to manage the image files using something like FileStream?
Any advice on where to go from here would be great.
Thanks,
Mark
As stated in my comment, you're probably creating Bitmap objects in your ImageData class, using the constructor that takes a string (the filename):
Bitmap b = new Bitmap(filename);
A Bitmap created with this constructor will create a FileStream from the file at the given path and will keep that FileStream open until the Bitmap is disposed, as mentioned in the documentation:
The file remains locked until the Bitmap is disposed.
To get around this, you can instead build the Bitmap from a MemoryStream that you populate from the file yourself:
byte[] data = File.ReadAllBytes(filename);
MemoryStream stream = new MemoryStream(data);
Bitmap b = new Bitmap(stream);
This way, the stream the Bitmap keeps open is the MemoryStream you created rather than a FileStream that keeps the file locked.
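As a rough sketch, that loading logic could be wrapped in a small helper inside ImageData (the method name is mine, not part of the original class):
// Loads a Bitmap without keeping a lock on the file on disk.
// The MemoryStream must stay alive as long as the Bitmap is used,
// which is fine because it holds no operating-system file handle.
public static Bitmap LoadBitmapUnlocked(string fileName)
{
    byte[] data = File.ReadAllBytes(fileName);  // read and release the file immediately
    var stream = new MemoryStream(data);
    return new Bitmap(stream);                  // File.Delete(fileName) can now succeed
}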
The usual approach is as follows (a rough sketch of the workflow appears after the lists):
Create a staging folder. Like a temp folder
Make a copy of your file there
Display the image from that temp file
For each operation make a copy (1.jpg, 2.jpg)
Finally when done, copy the latest final file back to the original folder
Delete staging folder
Advantages -
Original data is never lost
The risk of accidental corruption is much lower
Easy to have undo operations
If you can't delete the staging folder immediately, you can use a background job for that
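A rough sketch of that staging workflow (the paths, file names and the pictureBox control are purely illustrative):
// Assumed input: originalPath points at the image the user is working on.
string stagingDir = Path.Combine(Path.GetTempPath(), "ImageStaging");
Directory.CreateDirectory(stagingDir);

string workingCopy = Path.Combine(stagingDir, Path.GetFileName(originalPath));
File.Copy(originalPath, workingCopy, true);      // stage a copy of the file

pictureBox.Image = Image.FromFile(workingCopy);  // display the staged copy, not the original
// each edit writes a new numbered copy (1.jpg, 2.jpg, ...) inside stagingDir

// when done, release the staged image, copy the result back and clean up
pictureBox.Image?.Dispose();
pictureBox.Image = null;
File.Copy(workingCopy, originalPath, true);
Directory.Delete(stagingDir, true);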
I am using DotNetZip 1.9.6 in my application, which uses a file structure similar to e.g. *.docx: a ZIP file containing XML files.
Every module of the application can store such XML files in my custom file management, and on "save" they are serialized to streams, which are then written to the ZIP file via DotNetZip.
To update the entries I use ZipFile.UpdateEntry(path, stream).
This works fine, and the first time I save my file by calling ZipFile.Save() everything works.
But if I do this a second time (first some UpdateEntry calls, then Save) on the same instance, the ZIP file is corrupted: the file structure and metadata (e.g. the uncompressed size of each file) are still there, but all files are 0 bytes in compressed size.
If I create a new instance from the just saved file after saving everything works fine, but shouldn't it be possible to avoid that and "reuse" the same instance?
The following example (also see https://dotnetfiddle.net/mHxEIy) can be used to reproduce the problem:
using System.IO;
using System.Text;
public class Program
{
public static void Main()
{
var zipFile = new Ionic.Zip.ZipFile();
var content1 = new MemoryStream(Encoding.Default.GetBytes("Content 1"));
zipFile.UpdateEntry("test.txt", content1);
zipFile.Save("test.zip"); // here the Zip file is correct
//zipFile = new Ionic.Zip.ZipFile("test.zip"); // uncomment and it works too
var content2 = new MemoryStream(Encoding.Default.GetBytes("Content 2"));
zipFile.UpdateEntry("test.txt", content2);
zipFile.Save(); // after that it is corrupt
}
}
To run this you need to add the "DotNetZip 1.9.6" NuGet package.
After the first save the archive is correct; after the second save the entry is still listed, but its compressed size is 0 bytes.
This looks like it's a bug in the library, around removing an entry. If you just remove an entry and then save again, it correctly removes the file.
However, if you remove an entry and then add another one with the same name - which is what UpdateEntry is documented to do if the entry already exists - the old entry appears to be used instead.
The reason you're ending up with an empty file the second time is that the original MemoryStream is being read again - but by now, it's positioned at the end of the data, so there's no data to read. If you reset the position to the start of the stream (content1.Position = 0;) it will rewrite the original data. If you modify the data within content1, you end up with invalid compressed data.
The only workaround I can immediately think of is to keep your own map from filename to MemoryStream, and replace the contents of each MemoryStream when you want to update it... or just load the file each time, as per your existing workaround.
It's definitely worth filing a bug around this though, as it should work as far as I can tell.
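Until that is fixed, the workaround already hinted at in the question (re-reading the archive from disk after each save) seems the most reliable; roughly:
var zipFile = new Ionic.Zip.ZipFile();
zipFile.UpdateEntry("test.txt", new MemoryStream(Encoding.Default.GetBytes("Content 1")));
zipFile.Save("test.zip");

// Work around the stale-entry problem by re-reading the file that was just written
zipFile.Dispose();
zipFile = Ionic.Zip.ZipFile.Read("test.zip");

zipFile.UpdateEntry("test.txt", new MemoryStream(Encoding.Default.GetBytes("Content 2")));
zipFile.Save();  // the archive now contains the updated entry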
As already suspected, this was a bug in DotNetZip up to and including version 1.9.6.
I think I was able to fix it with this change, which has just been released as version 1.9.7 on NuGet. At least for me, the problem no longer occurs.
Some background on what happens, as far as I could find out:
When you call Save, the library sets an internal flag which remembers that the ZIP file was just saved. On the second Save call, instead of "recompressing" all entries in the ZIP file, it copies them from the just-saved file.
This works fine for adding and removing entries, but breaks when one of the entries was changed, because it then "mixes" the old and the new entry and produces the inconsistent ZIP file.
My fix basically disables that "copy from old file" logic if an entry was changed.
I have an XML file like this:
<CurrentProject>
  <!-- elements such as:
       the last opened project file, to reopen it when the app starts,
       and more global, project-independent settings -->
</CurrentProject>
Now I am asking myself whether I should deliver this XML file, with the above empty elements, with the installer for my app, or whether I should create the file on the fly at application start if it does not exist, and otherwise read the values from it.
Consider also that the user could delete this file, and that should not prevent my application from working.
What is better and why?
UPDATE:
What I did felt OK to me, so I'm posting my code here :) It just creates the XML and its structure on the fly, with some safety checks...
public ProjectService(IProjectDataProvider provider)
{
_provider = provider;
string applicationPath = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
_projectPath = Path.Combine(applicationPath, @"TBM\Settings.XML");
if (!File.Exists(_projectPath))
{
string dirPath = Path.Combine(applicationPath, @"TBM");
if (!Directory.Exists(dirPath))
Directory.CreateDirectory(dirPath);
using (var stream = File.Create(_projectPath))
{
XElement projectElement = new XElement("Project");
projectElement.Add(new XElement("DatabasePath"));
projectElement.Save(stream, SaveOptions.DisableFormatting);
}
}
}
In a similar scenario, I recently went for creating the initial file on the fly. The main reason I chose this was the fact that I wasn't depending on this file being there and being valid. As this was a file that's often read from/written to, there's a chance that it could get corrupted (e.g. if the power is lost while the file is being written).
In my code I attempted to open this file for reading and then read the data. If anywhere during these steps I encountered an error, I simply recreated the file with default values and displayed a corresponding message to the user.
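As a rough illustration of that pattern (the element names are placeholders based on the question, not the answerer's actual code; requires System.Xml.Linq):
XDocument LoadOrRecreateSettings(string path)
{
    try
    {
        return XDocument.Load(path);  // throws if the file is missing or corrupt
    }
    catch (Exception)
    {
        // Recreate the file with default (empty) elements and inform the user.
        var doc = new XDocument(
            new XElement("CurrentProject",
                new XElement("LastOpenedProjectFile")));
        doc.Save(path);
        // ... display a corresponding message to the user here
        return doc;
    }
}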