I have a plugin-based web app that allows an administrator to assign various small pieces of functionality (the actual functionality is unimportant here) to users.
This functionality is configurable, and it's important that the administrator understands the plugins. The administrator is technical enough to read the very simple source code for these plugins (mostly just arithmetic).
I'm aware there are a couple of questions already about accessing source code from within a DLL built from that source:
How to include source code in dll? for example.
I've played with getting the .cs files into a /Resources folder. However, doing this with a pre-build event obviously doesn't include these files in the project, so VS never copies them and I'm unable to access them on the Resources object.
Is there a way to reference the source for a particular class from the same assembly that I'm missing? Or, alternatively, a way to extract COMPLETE source from the PDB for that assembly? I'm quite happy to deploy detailed PDB files; there's no security risk for this portion of the solution.
As I have access to the source code, I don't want to resort to decompiling it just to display it. That seems wasteful and overly complicated.
The source code isn't included in the DLLs, and it isn't in the PDBs either (PDBs only contain a link between the addresses in the DLL and the corresponding lines of code in the sources, as well as other trivia like variable names).
A pre-build event is a possible solution - just make sure that it produces a single file that's included in the project. A simple zip archive should work well enough, and it's easy to decompress when you need to access the source. Text compresses very well, so it might make sense to compress it anyway. If you don't want to use zip, anything else will do fine as well - an XML file, for example. It might even give you the benefit of using something like Roslyn to provide syntax highlighting with all the necessary context.
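Reading a source file back out of an embedded zip at runtime is only a few lines. A minimal sketch, assuming the archive has been embedded as MyApp.Resources.Source.zip and contains MyPlugin.cs (both names hypothetical):
// Requires: using System.IO; using System.IO.Compression; using System.Reflection;
using (Stream zipStream = Assembly.GetExecutingAssembly()
    .GetManifestResourceStream("MyApp.Resources.Source.zip"))
using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Read))
using (var reader = new StreamReader(archive.GetEntry("MyPlugin.cs").Open()))
{
    string source = reader.ReadToEnd();
    // ... hand the source text to whatever displays it
}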
Decompilation isn't necessarily a terrible approach. You're trading memory for CPU, basically. You'll lose comments, but that shouldn't be a problem if your code is very simple. Method arguments keep their names, but locals don't - you'd need the PDBs for that, which is a massive overkill. It actually does depend a lot on the kind of calculations you're doing. For most cases, it probably isn't the best solution, though.
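That said, if you do try it, the ICSharpCode.Decompiler NuGet package (the engine behind ILSpy) keeps it to a few lines. A rough, untested sketch; the plugin type and namespace names are hypothetical:
// Decompile a single class from the current assembly back to C# text.
var decompiler = new ICSharpCode.Decompiler.CSharp.CSharpDecompiler(
    typeof(SomePlugin).Assembly.Location,
    new ICSharpCode.Decompiler.DecompilerSettings());
string source = decompiler.DecompileTypeAsString(
    new ICSharpCode.Decompiler.TypeSystem.FullTypeName("MyPlugins.SomePlugin"));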
A somewhat roundabout way of handling this would be a multi-file T4 template. Basically, you'd produce as many files as there are source code files and have them be embedded resources. I'm not sure how simple this is, and I'm certainly not going to include the code :D
Another (a bit weird) option is to use file links. Simply have the source code files in the project as usual, but also make a separate folder where the same files are added using "Add as link". The content will be shared with the actual source code, but you can specify a different build action - namely, Embedded Resource. This requires a (tiny) bit of manual work when adding or moving files, but it's relatively simple. If needed, this could also be automated, though that sounds like overkill.
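In an old-style csproj that looks something like this (file and folder names are illustrative) - the same physical file is listed once for compilation and once, via a link, as an embedded resource:
<ItemGroup>
  <Compile Include="Plugins\MyPlugin.cs" />
  <EmbeddedResource Include="Plugins\MyPlugin.cs">
    <Link>EmbeddedSource\MyPlugin.cs</Link>
  </EmbeddedResource>
</ItemGroup>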
The cleanest option I can think of is adding a new build action - Compile + Embed. This requires you to add a build targets file to your project, but that's actually quite simple. The targets file is just an XML file, and then you manually edit your CSPROJ file to import that target into the build, and you're good to go. The tricky part is that you'll also need to get Microsoft.CSharp.Core.targets to treat your Compile + Embed items as both source code and embedded resources. That's of course easily done by manually editing that targets file, but it's a terrible way of handling it. I'm not sure what the best way of doing it is, though. Maybe there's a way to redefine @(Compile) to mean @(Compile);@(MyCompileAndEmbed)? I know it's possible with the usual property groups, but I'm not sure if something like this can be done with item lists.
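One possible shape for this, as an untested sketch: declare a CompileAndEmbed item type and fold it into both lists in a target that runs before compilation, so Microsoft.CSharp.Core.targets itself never needs editing:
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="ExpandCompileAndEmbed" BeforeTargets="BeforeCompile">
    <ItemGroup>
      <!-- Every CompileAndEmbed item becomes both a source file and an embedded resource. -->
      <Compile Include="@(CompileAndEmbed)" />
      <EmbeddedResource Include="@(CompileAndEmbed)" />
    </ItemGroup>
  </Target>
</Project>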
Taking up @Luaan's suggestion of using a pre-build step to create a single zipped file, I created a basic console app to package the source files into a zip file at a specific location.
// Requires: using System; using System.Collections.Generic; using System.IO;
//           using System.IO.Compression; using System.Linq;
static void Main(string[] args)
{
    Console.WriteLine("Takes a folder of .cs files and flattens and compacts them into a .zip.\n" +
        "Arg 1 : Source folder to be recursively searched\n" +
        "Arg 2 : Destination zip file\n" +
        "Arg 3 : Semicolon-separated list of folders to ignore");
    if (args.Length < 2)
    {
        Console.Write("Args 1 or 2 missing");
        return;
    }
    string SourcePath = args[0];
    string ZipDestination = args[1];
    List<string> ignoreFolders = new List<string>();
    if (args.Length > 2)
    {
        ignoreFolders = args[2].Split(';').ToList();
    }
    var files = DirSearch(SourcePath, "*.cs", ignoreFolders);
    Console.WriteLine($"{files.Count} files found to zip");
    if (File.Exists(ZipDestination))
    {
        Console.WriteLine("Destination exists. Deleting zip file first");
        File.Delete(ZipDestination);
    }
    int zippedCount = 0;
    using (FileStream zipToOpen = new FileStream(ZipDestination, FileMode.OpenOrCreate))
    using (ZipArchive archive = new ZipArchive(zipToOpen, ZipArchiveMode.Create))
    {
        foreach (var filePath in files)
        {
            Console.WriteLine($"Writing {Path.GetFileName(filePath)} to zip {Path.GetFileName(ZipDestination)}");
            // Entries are flattened to the bare file name, so duplicate
            // file names in different folders will collide.
            archive.CreateEntryFromFile(filePath, Path.GetFileName(filePath));
            zippedCount++;
        }
    }
    Console.WriteLine($"Zipped {zippedCount} files");
}
static List<string> DirSearch(string sDir, string filePattern, List<string> excludeDirectories)
{
    List<string> filePaths = new List<string>();
    // Pick up the files in this directory itself, then recurse into subdirectories.
    filePaths.AddRange(Directory.GetFiles(sDir, filePattern));
    foreach (string d in Directory.GetDirectories(sDir))
    {
        // Skip excluded directories (case-insensitive full-path comparison).
        if (excludeDirectories.Any(ed => ed.Equals(d, StringComparison.OrdinalIgnoreCase)))
        {
            continue;
        }
        filePaths.AddRange(DirSearch(d, filePattern, excludeDirectories));
    }
    return filePaths;
}
It takes 3 parameters: the source dir, the output zip file, and a ";"-separated list of paths to exclude. I've just built this as a binary, committed it to source control for simplicity, and included it in the pre-build event for the projects I want the source for.
There's hardly any error checking beyond the argument count, so expect it to fail ungracefully on bad input. But if anyone wants it, here it is! Again, thanks to @Luaan for clarifying that PDBs aren't all that useful here!
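For reference, the pre-build event command line ends up looking something like this (the tool name and folder paths are illustrative, not the actual ones from my solution):
"$(SolutionDir)Tools\SourceZipper.exe" "$(ProjectDir)Plugins" "$(ProjectDir)Resources\PluginSource.zip" "$(ProjectDir)Plugins\bin;$(ProjectDir)Plugins\obj"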
Related
I need to write a big file in my project.
What I learned:
I should NOT write the big file directly to the destination path,
because this may leave an incomplete file if the app crashes while writing it.
Instead, I should write to a temporary file and then move (rename) it (a so-called atomic file operation).
My code snippet:
[NotNull]
public static async Task WriteAllTextAsync([NotNull] string path, [NotNull] string content)
{
    string temporaryFilePath = null;
    try {
        // Write everything to a temp file first, so only a complete file ever reaches 'path'.
        temporaryFilePath = Path.GetTempFileName();
        using (var stream = new StreamWriter(temporaryFilePath, true)) {
            await stream.WriteAsync(content).ConfigureAwait(false);
        }
        // Swap the finished file into place. File.Move can't overwrite, hence the Delete.
        File.Delete(path);
        File.Move(temporaryFilePath, path);
    }
    finally {
        // Clean up the temp file if anything failed before the move.
        // (File.Delete is a no-op for a missing file, so this is safe after a successful move.)
        if (temporaryFilePath != null) File.Delete(temporaryFilePath);
    }
}
My Question:
The file will be missing if the app crashes between File.Delete and File.Move. Can I avoid this?
Are there any other best practices for writing big files?
Any suggestions on my code?
The file will be missing if the app crashes between File.Delete and File.Move. Can I avoid this?
Not that I'm aware of, but you can detect it - and if you use a more predictable filename, you can recover from that. It helps if you tweak the process somewhat to use three file names: the target, a "new" file and an "old" file. The process becomes:
Write to "new" file (e.g. foo.txt.new)
Rename the target file to the "old" file (e.g. foo.txt.old)
Rename the "new" file to the target file
Delete the "old" file
You then have three files, each of which may be present or absent. That can help you to detect the situation when you come to read the new file:
No files: Nothing's written data yet
Just target: All is well
Target and new: App crashed while writing new file
Target and old: App failed to delete old file
New and old: App failed after the first rename, but before the second
All three, or just old, or just new: Something very odd is going on! User may have interfered
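A minimal sketch of the write side of that scheme (my own illustration, not battle-tested code):
// Requires: using System.IO;
public static void SafeWrite(string path, string content)
{
    string newPath = path + ".new";
    string oldPath = path + ".old";

    File.WriteAllText(newPath, content);   // 1. write the "new" file
    if (File.Exists(path))
        File.Move(path, oldPath);          // 2. rename the target to "old"
    File.Move(newPath, path);              // 3. rename "new" to the target
    if (File.Exists(oldPath))
        File.Delete(oldPath);              // 4. delete the "old" file
}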
Note: I wasn't aware of File.Replace before, but I suspect it's effectively just a simpler and possibly more efficient way of doing what you're already doing. (That's great - use it!) The recovery process would still be the same, though.
You can use File.Replace instead of deleting and moving files. In the case of a hard fault (a power cut or something like that) you can always lose data; you have to live with that.
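For reference, the call looks like this (file names are illustrative); note that the destination file must already exist:
// Atomically swap the freshly written temp file into place,
// keeping the previous version as foo.txt.bak.
File.Replace("foo.txt.tmp", "foo.txt", "foo.txt.bak");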
I have an ASP.NET Core web project. I am trying to bundle a bunch of static .js files using the BundlerMinifier package on the bundleconfig.json file at the root of my project. The issue I have is that the .js files I want to bundle are in another project in the solution (which I'll call MainProject), so I have to use relative paths to specify them, like so:
"outputFileName": "../MainProject/bundles/libraryscripts.js",
"inputFiles": [
"../MainProject/Scripts/Libraries/Angular/**/*.js",
// more input files
],
"minify": {
"enabled": true
}
When I build my project, the bundler does not give any errors and the file libraryscripts.js is created at the specified folder. The problem is the file is empty, which I believe is due to the globbing pattern (**/*.js). When I enumerate all the files instead of using this pattern, it works fine. What makes this more complicated is that when I don't use relative paths (no ../ at the start), it seems to work fine when using the globbing pattern.
This leads me to believe it's a problem with using relative paths in conjunction with globbing patterns. Can anyone confirm this and does anyone know a way around this? I do not want to enumerate hundreds of .js files (neither elegant nor sustainable).
Note: the following is a hacky workaround that I implemented -- it is not an official solution to this problem, though it may still be helpful.
I cloned the BundlerMinifier project so I could debug how it was bundling the files specified in bundleconfig.json and see what the problem was. The issue was in Bundle.cs, in particular in the GetAbsoluteInputFiles method. This line gets the path of the folder in which bundleconfig.json is stored:
string folder = new DirectoryInfo(Path.GetDirectoryName(FileName)).FullName;
The issue is that this same folder variable is used later when trimming the start of the paths of the files that are found:
var allFiles = Directory.EnumerateFiles(searchDir, "*" + ext, SearchOption.AllDirectories).Select(f => f.Replace(folder + FileHelpers.PathSeparatorChar, ""));
Since the files were in another directory, the .Select(f => f.Replace()); part didn't remove the start of the paths of the files, which meant the comparisons later failed when being matched. Thus, no files with both ../ and a globbing pattern were found.
I came up with a hacky solution for myself, but I don't think it's robust, so I won't be contributing it back to the project. Nonetheless, I'll put it here in case anyone else has the same issue. I created a copy of folder:
string folder = new DirectoryInfo(Path.GetDirectoryName(FileName)).FullName;
string alternateFolder = folder;
bool folderModified = false;
Then I created a copy of inputFile and checked if it starts with ../. If so, remove it and at the same time remove the last folder name from alternateFolder:
string searchDir = new FileInfo(Path.Combine(folder, relative).NormalizePath()).FullName;
string newInput = inputFile;
while (newInput.StartsWith("../"))
{
// I'm sure there's a better way to do this using some class/library.
newInput = newInput.Substring(3);
if (!folderModified)
{
int lastSlash = alternateFolder.LastIndexOf('\\');
alternateFolder = alternateFolder.Substring(0, lastSlash);
folderModified = true;
}
}
Then I use alternateFolder and newInput only in the following lines:
var allFiles = Directory.EnumerateFiles(searchDir, "*" + ext, SearchOption.AllDirectories).Select(f => f.Replace(alternateFolder + FileHelpers.PathSeparatorChar, ""));
var matches = Minimatcher.Filter(allFiles, newInput, options).Select(f => Path.Combine(alternateFolder, f));
Everywhere else still uses folder and inputFile. Also note the use of the folderModified boolean. It is important to only remove the last folder on alternateFolder once since it is in a foreach loop.
I inherited some code that uses ZipArchive to save some information from the database; it uses BinaryFormatter to do this. When you look at the zip file with 7-Zip (for example), you see a couple of folders and a .txt file. All is working well. I simply want to modify the code to also have a folder in the ZipArchive called "temp" that contains the files and folders under C:\temp. Is there an easy way to add an entry (ZipArchiveEntry?) that consists of an entire folder on disk? I saw CreateEntryFromFile among the member methods of ZipArchive, but no CreateEntryFromDirectory. Or perhaps there's some other simple way to do it? Does anyone have example code? I should say that C:\temp can have a variable number of files and directories (which have child directories and files, etc.). Must I enumerate them somehow, create my own directories, and use CreateEntryFromFile? Any help is appreciated.
Similarly, when I read the ZipArchive, I want to take the stuff related to C:\temp and just dump it into a directory (like C:\temp_old).
Thanks,
Dave
The answer by user1469065 in Zip folder in C# worked for me. user1469065 shows how to get all the files/directories in the directory (using some cool "yield" statements) and then do the serialization. For completeness, I also added the code to deserialize, as user1469065 suggested (at least I think I did it the way he suggested).
private static void ReadTempFileStuff(ZipArchive archive) // adw
{
    // Find every entry that was stored under the temp_directory_contents prefix.
    var sessionArchives = archive.Entries.Where(x => x.FullName.StartsWith(@"temp_directory_contents")).ToArray();
    if (sessionArchives.Length > 0)
    {
        foreach (ZipArchiveEntry entry in sessionArchives)
        {
            // Recreate the directory structure before extracting the file.
            FileInfo info = new FileInfo(@"C:\" + entry.FullName);
            if (!info.Directory.Exists)
            {
                Directory.CreateDirectory(info.DirectoryName);
            }
            entry.ExtractToFile(@"C:\" + entry.FullName, true);
        }
    }
}
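For anyone who doesn't want to chase the link, the write side boils down to enumerating the files yourself and building relative entry names. A rough sketch (the folder and prefix match the code above):
// Recursively add everything under C:\temp to the archive,
// prefixing each entry with "temp_directory_contents".
private static void WriteTempFileStuff(ZipArchive archive)
{
    string root = @"C:\temp";
    foreach (string file in Directory.EnumerateFiles(root, "*", SearchOption.AllDirectories))
    {
        string entryName = "temp_directory_contents/" +
            file.Substring(root.Length + 1).Replace('\\', '/');
        archive.CreateEntryFromFile(file, entryName);
    }
}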
I keep getting the error "Stream was not writable" whenever I try to execute the following code. I understand that there's still a reference to the stream in memory, but I don't know how to solve the problem. The two blocks of code are called in sequential order. I think the second one might be a function call or two deeper in the call stack, but I don't think that should matter, since I have "using" statements in the first block that should clean up the streams automatically. I'm sure this is a common task in C#; I just have no idea how to do it...
string s = "";
using (Stream manifestResourceStream =
Assembly.GetExecutingAssembly().GetManifestResourceStream("Datafile.txt"))
{
using (StreamReader sr = new StreamReader(manifestResourceStream))
{
s = sr.ReadToEnd();
}
}
...
string s2 = "some text";
using (Stream manifestResourceStream =
Assembly.GetExecutingAssembly().GetManifestResourceStream("Datafile.txt"))
{
using (StreamWriter sw = new StreamWriter(manifestResourceStream))
{
sw.Write(s2);
}
}
Any help will be very much appreciated. Thanks!
Andrew
Embedded resources are compiled into your assembly, you can't edit them.
As stated above, embedded resources are read-only. My recommendation, should this be applicable (say, for example, your embedded resource is a database file, XML, CSV, etc.), would be to extract a blank resource to the same location as the program, and read from/write to the extracted copy.
Example (a runnable sketch of the idea; the path and resource name are illustrative):
// Requires: using System.IO; using System.Reflection;
string physicalPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Datafile.txt");
if (!File.Exists(physicalPath)) // Check whether a physical copy exists yet.
{
    // Extract the embedded resource to disk once.
    using (Stream resource = Assembly.GetExecutingAssembly()
        .GetManifestResourceStream("MyApp.Datafile.txt"))
    using (FileStream target = File.Create(physicalPath))
    {
        resource.CopyTo(target);
    }
}
// Read from and write to the physical copy instead of the resource.
string contents = File.ReadAllText(physicalPath);
File.WriteAllText(physicalPath, contents + "new data");
Hope this helps.
Additional:
Your embedded resource may be entirely blank, or it may contain a data structure and/or default values.
A bit late, but for posterity =)
About the embedded .txt:
Yep, at runtime you can't edit an embedded resource, because it's embedded. You could play around with a disassembler, but only on outside assemblies that you load into the current context.
There is a hack if you want to write some current information into a resource before the program starts, without keeping the data in a separate file.
I used to work a bit with WinCE and the Compact .NET Framework, where you can't store strings at runtime with ResourceManager. I needed some dynamic information in order to catch a DllNotFoundException before it was actually thrown at startup.
So I made an embedded .txt file, which I filled in a pre-build event, like this:
cd $(ProjectDir)
dir ..\bin\Debug /a-d /b> assemblylist.txt
Here I get the list of files in the Debug folder.
And the reading:
using (var f = new StreamReader(Assembly.GetExecutingAssembly().GetManifestResourceStream("Market_invent.assemblylist.txt")))
{
str = f.ReadToEnd();
}
So you can do whatever you need in the pre-build event, even run some EXEs.
Enjoy! It's very useful for storing some important information, and it helps avoid redundant actions.
I have an application that is looking through some files for old data. In order to make sure we don't corrupt good projects, I'm copying the files to a temporary location. Some of the directories I'm checking are source-code directories, and they have .svn folders. We use Subversion to manage our code.
Once I've searched through all of the files, I want to delete the temp cache. Sounds easy, right?
For some reason, none of the .svn directories will delete from the cache; they crash the app.
For reasons (too deep to go into here), I have to use the temp folder, so just "scan the original file" is out of the question for political reasons.
I can go into explorer and delete them. No problem. No warnings. Just deletes. But the code crashes with "Access to {file} is denied." I'm at my wits end with this one, so any help would be appreciated.
While I've simplified the function a LITTLE for the sake of your sanity, the code REALLY is about this simple.
List<string> tmpCacheManifest = new List<string>();
string oldRootPath = "C:\\some\\known\\directory\\";
string tempPath = "C:\\temp\\cache\\";
foreach (string file in ListOfFilesToScan)
{
string newFile = file.Replace(oldRootPath, tempPath);
// This works just fine.
File.Copy(file, newFile);
tmpCacheManifest.Add(newFile);
}
// ... do some stuff to the cache to verify what I need.
// Okay.. I'm done.. Delete the cache.
foreach (string file in tmpCacheManifest)
{
// CRASH!
File.Delete(file);
}
Update: The exception is UnauthorizedAccessException. The text is "Access to the path 'C:\temp\cache\some-sub-dirs\.svn\entries' is denied."
It happens under XP, XP-Pro and Windows 7.
Update 2: None of my validation even ATTEMPTS to look at Subversion files. I do need them, however; that's part of the political crap. I have to show that EVERY file was copied... whether it was scanned or not.
And I realize what the usual suspects are for File.Delete. I realize what UnauthorizedAccessException means. I don't have access. That's a no-brainer. But I just copied the file. How can I NOT have access to the file?
Update 3:
The answer was in the "read-only" flag. Here's the code I used to fix it:
foreach (string file in ListOfFilesToScan)
{
string newFile = file.Replace(oldRootPath, tempPath);
// This works just fine.
File.Copy(file, newFile);
//// NEW CODE ////
// Clear any "Read-Only" flags
FileInfo fi3 = new FileInfo(newFile);
if ((fi3.Attributes & FileAttributes.ReadOnly) == FileAttributes.ReadOnly)
{
    // Strip the read-only bit that Subversion leaves on its metadata files.
    fi3.Attributes &= ~FileAttributes.ReadOnly;
}
tmpCacheManifest.Add(newFile);
}
// ... do some stuff to the cache to verify what I need.
As far as I recall, Subversion marks the files in its .svn subdirectories as read-only.
Try resetting the read-only attribute before deleting the file. I don't really know any C#, but a quick Google suggests this might do the trick:
File.SetAttributes(file, FileAttributes.Normal);
The only problem I see would be in this part:
// ... do some stuff to the cache to verify what I need.
If you open the file and forget to close it, you still hold exclusive access to it, and thus can't delete it later on.
Sounds like you don't have access to delete the file...
System.IO.File.Delete
The above link says you get UnauthorizedAccessException when:
The caller does not have the required permission.
-or-
path is a directory.
-or-
path specified a read-only file.
It's one of those.
Sounds like a permissions issue. Tricky one though as you obviously have write access if the File.Copy already works....
The only thing I can think of is that the file still has a handle open somewhere (as others have suggested, perhaps in your "do some stuff to the cache" part).
First of all: "Crash" means an exception, right? Which one? Can you catch it and show it?
Second thing: you are copying Subversion working copies, although you don't care about the Subversion metadata? That's what svn export is for (no .svn directories in the target).
The answer to the first question is what you really need to provide. Maybe something grabs the .svn and locks some files. TortoiseSVN, maybe (to give you nice overlay icons...)?
If a folder contains read-only files, Directory.Delete won't delete it and raises the exception you're getting. For future visitors of this page, I've found a simple solution that doesn't require recursing through all the files and changing their read-only attributes:
Process.Start("cmd.exe", "/c " + @"rmdir /s/q C:\Test\TestDirectoryContainingReadOnlyFiles");
(Tweak it a bit so it doesn't briefly flash a cmd window; how to do that is covered all over the internet.)
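One way to do that, as a sketch:
// Run rmdir without flashing a console window.
var psi = new System.Diagnostics.ProcessStartInfo("cmd.exe",
    @"/c rmdir /s /q C:\Test\TestDirectoryContainingReadOnlyFiles")
{
    CreateNoWindow = true,
    UseShellExecute = false
};
System.Diagnostics.Process.Start(psi).WaitForExit();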
Not quite understanding what you want to do, but what about chmodding it to 777 or 775? :-/
Edit:
Noticed you're on Windows. You'd have to change the permissions there instead; I don't know how Windows does that. :-/