I've created a FileUpload control that posts a file to a server.
myControllerName.PostedFile.SaveAs(saveLocation);
When I try to delete the file from the server immediately after the upload finishes, I get an exception:
[IOException: The process cannot access the file 'C:\RestOFMyFilePath' because it is being used by another process.]
So I've used:
GC.Collect();
GC.WaitForPendingFinalizers();
I placed those two lines before and after various functions in my code to locate the specific function that locks the resource: when they were placed before the offending call the problem still occurred, and when they were placed after it the error went away.
I narrowed it down to:
somefunctionName(FileUpload myControllerName)
I tried to explicitly dispose of the resources with
iconUploadController.PostedFile.InputStream.Dispose();
iconUploadController.Dispose();
and I still get that error.
Another solution I tried was to create a proxy class for FileUpload:
FileUpload.cs:
public class FileUploadProxy : IDisposable
{
    public FileUpload fileUploadController;

    public void Dispose()
    {
        fileUploadController.Dispose();
    }
}
That let me use a using block (which releases the resources as soon as execution leaves the scope) - it still didn't work.
OK, the problem was that inside somefunctionName(FileUpload myControllerName) I had another reference to the same file, which I had missed:
System.Drawing.Image objImage = System.Drawing.Image.FromFile(filePathOnDisk);
The solution:
using (System.Drawing.Image objImage = System.Drawing.Image.FromFile(filePathOnDisk))
{
    // Operations on the objImage.
}
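For reference, Image.FromFile keeps the underlying file locked for the lifetime of the Image object. A minimal sketch of an alternative (not part of the original solution) that avoids the lock entirely is to read the bytes first and build the image from a MemoryStream:

// Read the file into memory so no file handle stays open on disk.
byte[] imageBytes = System.IO.File.ReadAllBytes(filePathOnDisk);
using (var memoryStream = new System.IO.MemoryStream(imageBytes))
using (var objImage = System.Drawing.Image.FromStream(memoryStream))
{
    // Operations on objImage; the file on disk can be deleted at any time.
}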
OK... so here is something I don't even know is possible. Is it possible to upload an image file during the initial phase of a code-first migration? For instance, when creating an initial site user or an admin user that has a portrait image, would it be possible to upload that image during the creation of that user?
I couldn't find anything on SO or elsewhere online that even comes close to suggesting a viable solution, so this might be the very first time something like this has been attempted.
First, create a new migration (or use an existing one).
Inside the Up() method you can put the code for the file upload, and inside the Down() method the code that removes the file from the repository (in case you want to revert the migration).
Below is one of the many possible ways to do a remote upload; this is one of the simplest:
using (var webClient = new WebClient())
{
webClient.UploadFile("ftp://localhost/samplefile.jpg", "samplefile.jpg");
}
For this to work you need to add using System.Net; to the migration file. Also, you obviously need to handle upload permissions and credentials, depending on the type of remote repository you are using.
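To illustrate the credentials point, here is a hedged sketch; the FTP address, user name, and password are placeholders, not values from the question:

using (var webClient = new WebClient())
{
    // Supply the account the remote repository expects (placeholder values).
    webClient.Credentials = new NetworkCredential("ftpUser", "ftpPassword");
    webClient.UploadFile("ftp://localhost/samplefile.jpg", "samplefile.jpg");
}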
EDIT:
Using the File class is even simpler. Here is the complete code for the migration class:
using System;
using System.Data.Entity.Migrations;
using System.IO;

public partial class MigrationWithFileCopy : DbMigration
{
    public override void Up()
    {
        File.Copy("sourceFile.jpg", "destinationFile.jpg");
    }

    public override void Down()
    {
        File.Delete("destinationFile.jpg");
    }
}
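One caveat worth hedging: File.Copy resolves relative paths against the process's current working directory, which can differ depending on how the migration is run. A small sketch that anchors the example paths to the application's base directory (the file names are just the placeholders from above):

public override void Up()
{
    // Anchor the relative file names so the copy does not depend on the working directory.
    var baseDir = AppDomain.CurrentDomain.BaseDirectory;
    File.Copy(Path.Combine(baseDir, "sourceFile.jpg"),
              Path.Combine(baseDir, "destinationFile.jpg"),
              overwrite: true);
}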
After deploying a new version of a website, the browser keeps loading everything from its cache of the old version until a hard (forced) refresh is done.
In ASP.NET MVC, if the file is part of a bundle it is handled by the Optimization framework: a version token is appended to the file link, and whenever a file in the bundle changes a new token is generated.
For example, say the JS file is named datatables. When you put it in a bundle with the same name, you will see
datatables?v=anY9_bo7KitrGnXQr8ITP3ylmhQe9NDzSjgLpLQWQFE1
as the file name.
Change datatables and look at the file name in the browser again; it will certainly have changed:
datatables?v=r8yhQBxKyDgrOGyqr1ndtdG92Ije09nqTY7yogrOSTk1
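For context, a minimal sketch of the kind of bundle registration assumed above (the bundle and script paths are illustrative, not taken from the question):

// In BundleConfig.RegisterBundles, using System.Web.Optimization.
public static void RegisterBundles(BundleCollection bundles)
{
    // The bundle name matches the "datatables" example above; the script path is illustrative.
    bundles.Add(new ScriptBundle("~/bundles/datatables").Include(
        "~/Scripts/datatables.js"));
}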
But there are two questions:
What can we do if our file isn't in a bundle?
Is there a way to force the browser to refresh its cache?
There is one solution, with a few different ways to implement it; all of them build on the version query string used above:
datatables?v=1
We could manage the version by hand, meaning that every time we change the file we also change its version, but that is not a practical approach.
Another way is to use a Guid, but that isn't suitable either, because the file is then fetched on every request and the browser cache is never used:
datatables?v=Guid.NewGuid()
The last way, and the best one, is to change the version whenever the file changes. Check the following code:
<script src="~/scripts/main.js?v=@File.GetLastWriteTime(Server.MapPath("/scripts/main.js")).ToString("yyyyMMddHHmmss")"></script>
This way, when you change the file, its LastWriteTime changes too, so the version in the URL changes, and the next time the browser loads the page it detects a new file name and fetches the fresh copy.
Assuming you cannot use bundling for some reason, the solution suggested by the original poster is good enough; however, it's better to put the logic inside a helper method. That makes the code testable, lets you change the logic without touching the .cshtml, and avoids repeating the file name. You then get much cleaner code:
<script src="@Url.ContentWithVersion("~/scripts/main.js")"></script>
To do so, you can add ContentWithVersion extension method to the existing UrlHelper:
using System;
using System.IO;
using System.Web;
using System.Web.Mvc;
public static class UrlHelperExtensions
{
    public static string ContentWithVersion(this UrlHelper urlHelper, string path)
    {
        if (urlHelper == null)
            throw new ArgumentNullException(nameof(urlHelper));

        var result = urlHelper.Content(path);
        var file = HttpContext.Current.Server.MapPath(path);

        if (File.Exists(file))
            result += $"?v={File.GetLastWriteTime(file).ToString("yyyyMMddHHmmss")}";

        return result;
    }
}
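If the per-request disk hit bothers you, one hedged variation (not from the original answer) is to memoize the computed URL per path inside the same UrlHelperExtensions class, which also needs using System.Collections.Concurrent:

private static readonly ConcurrentDictionary<string, string> VersionedUrlCache =
    new ConcurrentDictionary<string, string>();

public static string ContentWithCachedVersion(this UrlHelper urlHelper, string path)
{
    // Compute the versioned URL once per path and reuse it until the application restarts.
    return VersionedUrlCache.GetOrAdd(path, p => urlHelper.ContentWithVersion(p));
}

The trade-off is that the cached version only refreshes when the application restarts, which is usually what happens on deployment anyway.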
I am very much a novice with HttpHandlers. This is actually my first attempt at one.
I have this Ajax handler that responds to a jQuery ajax call in my web app front end:
public class ajaxMeetingHandler : IHttpHandler {
    public void ProcessRequest(HttpContext context) {
        string resultsJSON = "";
        string requestType = context.Request.Params["requestType"];
        if (!String.IsNullOrEmpty(requestType)) {
            switch (requestType) {
                case "RecentMXMeetings":
                    resultsJSON = SerialiseRecentMeetings(context, "Maintenance");
                    // SerialiseRecentMeetings() is a method in the class
                    // that works fine and is not included for brevity.
                    break;
                // more cases (not included for brevity)
            }
        }
    }

    public bool IsReusable {
        get {
            return false;
        }
    }
}
And this works perfectly.
However, if I add either of these two statements anywhere in the code:
var x = context.Server.MapPath("/");
var y = HttpContext.Current.Server.MapPath("/");
...my Context.Request.Params[] collection becomes null, and IsNullOrEmpty(requestType) now sees requestType as null. In fact, ALL the Request.Params[] are null.
If I comment out those statements (and completely rebuild the solution) the thing goes back to working properly.
In the meantime, I am going to move the calls to MapPath() out to a static "RunTimeEnvironment" class so I can get the path I need from there without touching MapPath() from inside this HttpHandler. Is that a viable or recommended solution?
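To illustrate that workaround, here is a hedged sketch of what such a static class could look like; the class name mirrors the "RunTimeEnvironment" idea from the question, and everything else (member names, the use of HttpRuntime.AppDomainAppPath) is an assumption:

public static class RunTimeEnvironment
{
    // Set once at application start, outside the handler.
    public static string RootPath { get; private set; }

    public static void Initialize(string rootPath)
    {
        RootPath = rootPath;
    }
}

// In Global.asax Application_Start, for example:
// RunTimeEnvironment.Initialize(HttpRuntime.AppDomainAppPath);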
It turns out my problem was not related to the code itself, per se, but how I was running the project.
Visual Studio, when you click the "Start Debug" button, will start the solution at its root document (i.e., index.html). That is UNLESS the currently open document is of a runnable type, such as .html. If your currently open window is one of your class files (i.e., .cs), it will not attempt to run the class file, but will start the debug session at your root document.
However, it WILL attempt to run a Generic Handler (.ashx) all by itself if that is the document you currently have open. And, by doing so, it was not starting at the index.html page which issues my ajax calls and sends parameters to the Handler. So, my Params collection was null because it was literally null. Running the .ashx by itself supplies no parameters.
So, the reason it worked after changing my call type from GET to POST and Back to GET again is because in doing so, I opened the index.html file to make that change, and when I started my debug session again, my current document was the index.html file, not the Generic Handler .ashx file.
I should probably lose a hundred reputation points just for making a mistake this dumb. But in case it helps others, there it is.
I would like to delete the local repo folder that I cloned from a remote repository using LibGit2Sharp.
I read here that I have to Dispose() the Repository before I can delete it, but it still does not work:
using (var repo = new LibGit2Sharp.Repository(path))
{
repo.Dispose();
}
Directory.Delete(path, true);
And I still have an exception:
Access to the path 'c16566a7-202a-4c8a-84de-3e3caadd5af9' is denied.
The content of the 'path' variable is the following:
C:\Users\USERNAME\AppData\Local\dftmp\Resources\c16566a7-202a-4c8a-84de-3e3caadd5af9\directory\UserRepos\github.com\domonkosgabor\testrepo
This folder was created by a worker role in local storage.
What should I do to delete the whole folder (including .git)?
For the benefit of anyone else having this problem:
I had the same problem, but I was still getting an UnauthorizedAccessException even though I was running as administrator, and I was disposing the repository object correctly. It turns out that some of the files in the .git folder are marked as ReadOnly, so I had to loop through each file and remove the ReadOnly attribute before deleting. I wrote a custom method to do this:
/// <summary>
/// Recursively deletes a directory as well as any subdirectories and files. If the files are read-only, they are flagged as normal and then deleted.
/// </summary>
/// <param name="directory">The name of the directory to remove.</param>
public static void DeleteReadOnlyDirectory(string directory)
{
    foreach (var subdirectory in Directory.EnumerateDirectories(directory))
    {
        DeleteReadOnlyDirectory(subdirectory);
    }

    foreach (var fileName in Directory.EnumerateFiles(directory))
    {
        var fileInfo = new FileInfo(fileName);
        fileInfo.Attributes = FileAttributes.Normal;
        fileInfo.Delete();
    }

    Directory.Delete(directory);
}
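Putting it together with the code from the question (the path variable is the one the question defines), the usage would look roughly like this:

using (var repo = new LibGit2Sharp.Repository(path))
{
    // Work with the repository here.
}

// Dispose() has released LibGit2Sharp's handles; now clear the read-only attributes and delete.
DeleteReadOnlyDirectory(path);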
I would like to delete the local repo folder that I cloned from a remote repository using LibGit2Sharp. I read here that I have to Dispose() the Repository before I can delete it.
LibGit2Sharp keeps a hold on some files in the .git folder (mainly the packfiles, for performance reasons). Calling Dispose() will release those handles and deallocate the unmanaged memory.
As such, it is indeed strongly recommended to rely on the using statement (or, at the very least, to Dispose() the Repository instance when you're done with it).
If you don't do this, those handles will eventually be released by finalizers when your AppDomain unloads, but you will have no real control over when that happens.
Edit: Reading your code once again, I overlooked something. The recommended pattern is either
using (var repo = new LibGit2Sharp.Repository(path))
{
// Do amazing stuff
}
or
var repo = new LibGit2Sharp.Repository(path);
// Do amazing stuff
repo.Dispose();
Indeed, the using statement automatically issues a call to Dispose() once execution reaches the end of the scope, so there is no need to also call Dispose() yourself inside the block.
Access to the path 'c16566a7-202a-4c8a-84de-3e3caadd5af9' is denied.
Regarding this point, I think this has nothing to do with LibGit2Sharp.
Is the process (trying to delete the folder named after a GUID) running under an identity that has been granted enough rights to do so?
I'm developing a program to convert RTF to HTML.
I'm using the DLLs found here:
http://www.codeproject.com/KB/recipes/RtfConverter.aspx?fid=1458864&df=90&mpp=25&noise=3&sort=Position&view=Quick&select=3427424&fr=1#xx0xx
This DLL saves a JPG file from the HTML to a specific folder.
When I run the program, it converts the RTF the first time and saves the images to the folder perfectly, but when I try to convert it again I get this error:
"A generic error occurred in GDI+"
I think this DLL uses a SaveImage method, and to avoid the error you must release the Image object it creates, but I can't modify the DLL.
Is there any way to release the object I've created from this DLL?
This is my code:
RtfVisualImageAdapter imageAdapter = new RtfVisualImageAdapter(
    Application.StartupPath + "\\Program Data\\temp\\{0}{1}",
    System.Drawing.Imaging.ImageFormat.Jpeg);
RtfImageConvertSettings imageConvertSettings =
    new RtfImageConvertSettings(imageAdapter);
RtfImageConverter imageConverter = new RtfImageConverter(imageConvertSettings);
try
{
    IRtfDocument rtfDocument = RtfInterpreterTool.BuildDoc(
        ConversionText, imageConverter);
    RtfHtmlConverter htmlConverter = new RtfHtmlConverter(rtfDocument);
    htmlConverter.Settings.ConvertVisualHyperlinks = true;
    htmlConverter.Settings.UseNonBreakingSpaces = true;
    this.richTextBoxPrintCtrl2.Text = htmlConverter.Convert();
}
catch (Exception exception)
{
    MessageBox.Show(this, "Error " + exception.Message, this.Text,
        MessageBoxButtons.OK, MessageBoxIcon.Error);
}
The code is sloppy: it doesn't call the Dispose() method on the bitmap after saving it. That keeps a lock on the file, since GDI+ uses a memory-mapped file to avoid putting pressure on the paging file, which matters because bitmaps can be quite large. Trying to save to the same file again fails because of the lock. GDI+ exception messages are notoriously sloppy as well.
I think the bug is located in Interpreter\Converter\Image\RtfImageConverter.cs, SaveImage() method. The "convertedImage" bitmap doesn't get disposed. Note that the Graphics object in that same method doesn't get disposed either. Fix it by wrapping them with the using statement.
Run this code through FxCop to catch similar mistakes. And ask yourself if you really want to maintain code like this.
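A minimal sketch of the kind of fix described above: convertedImage and the Graphics object are the names mentioned in this answer, while the constructor arguments, fileName, and the drawing code are assumptions about that library's SaveImage() method, not its actual implementation.

// Inside SaveImage(): wrap the GDI+ objects in using blocks so their handles
// are released as soon as the bitmap has been saved.
using (var convertedImage = new Bitmap(width, height))
using (var graphics = Graphics.FromImage(convertedImage))
{
    // ... draw the visual onto convertedImage via graphics ...
    convertedImage.Save(fileName, System.Drawing.Imaging.ImageFormat.Jpeg);
}
// Both objects are disposed here, releasing whatever GDI+ was holding.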
If something implements IDisposable, you can call its Dispose() method. Objects are eligible for garbage collection as soon as they go out of scope so you might also try calling GC.Collect() after there are no more references to the object you want "released."
As Max says. Or better, use the using construct. NEVER call GC.Collect() unless you are dead sure that by doing so you'll free a few GB of RAM!
Since you have the source code, you could examine it, figure out where it keeps a reference, and make sure it's released.
If you cannot figure out where to do this you could load the code up in a separate AppDomain, and execute your code there. When you are finished you can unload the AppDomain, and your application will release any objects. Then recreate the AppDomain for the next run.
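A rough sketch of that AppDomain approach, assuming a hypothetical MarshalByRefObject wrapper around the conversion code; the assembly name, type name, and rtfText variable below are placeholders, not part of the library:

var domain = AppDomain.CreateDomain("RtfConversionDomain");
try
{
    // The wrapper type must derive from MarshalByRefObject to cross the domain boundary.
    var converter = (RtfConversionWrapper)domain.CreateInstanceAndUnwrap(
        "MyConverters",                       // assembly name (placeholder)
        "MyConverters.RtfConversionWrapper"); // full type name (placeholder)
    string html = converter.Convert(rtfText);
}
finally
{
    // Unloading the domain releases any objects (and handles) created inside it.
    AppDomain.Unload(domain);
}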
But I would try to spend some time figuring out the real issue before using AppDomains.
And another thing. Do you get the GDI error when you execute the same file twice, or two different files in succession? It could be that it fails to load the image of the second file and gives you the error.