How to clear browser cache programmatically? - C#

I'm trying to clear the Firefox 8 browser cache programmatically. I'm developing a site using ASP.NET, and I need to clear the browser cache for security reasons. I've tried many ways to clear the cache, but none seems to work. Any ideas?

Yes, you can do it, but with a caveat.
You can't clear a browser's history or cache via code from a web page, for security reasons.
From a desktop application, however, you can delete the files and folders under the browser's cache directory using ordinary file operations.
For example, Firefox's default cache location (hidden) is
"..AppData\Local\Mozilla\Firefox\Profiles\2nfq77n2.default\Cache"
See: How to delete all files and folders in a directory?
Try it!
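A minimal sketch of that file-operation approach (this assumes a desktop application running under the user's account, that the cache path above is already known, and that Firefox is not running so its files are not locked):
using System;
using System.IO;

public static class BrowserCacheCleaner // hypothetical helper class
{
    // Deletes everything under a Firefox profile's cache folder.
    // The profile folder name ("2nfq77n2.default" above) differs per machine
    // and must be discovered or configured.
    public static void ClearFirefoxCache(string cacheDir)
    {
        if (!Directory.Exists(cacheDir))
            return;

        foreach (string file in Directory.GetFiles(cacheDir, "*", SearchOption.AllDirectories))
        {
            try { File.Delete(file); }
            catch (IOException) { /* skip files locked by a running Firefox */ }
        }
        foreach (string dir in Directory.GetDirectories(cacheDir))
        {
            try { Directory.Delete(dir, recursive: true); }
            catch (IOException) { /* skip directories still in use */ }
        }
    }
}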

I don't think this is possible, for security reasons.
At most you can set an HTTP header to tell the browser not to cache your pages, like this:
Cache-Control: no-cache
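A minimal sketch of emitting that header from ASP.NET (System.Web); the page class name here is just a placeholder:
using System;
using System.Web;
using System.Web.UI;

public partial class SecurePage : Page // hypothetical page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        Response.Cache.SetCacheability(HttpCacheability.NoCache); // emits Cache-Control: no-cache
        Response.Cache.SetNoStore();                              // adds no-store
        Response.AppendHeader("Pragma", "no-cache");              // for older HTTP/1.0 clients and proxies
    }
}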

It is not possible to clear the browser's cache programmatically; however, you can prevent your application's pages from being cached.
The code below disables caching for your pages and forces anything already cached to be treated as expired:
public static void DisablePageCaching()
{
    // Used for disabling page caching
    HttpContext.Current.Response.Cache.SetExpires(DateTime.UtcNow.AddDays(-1));
    HttpContext.Current.Response.Cache.SetValidUntilExpires(false);
    HttpContext.Current.Response.Cache.SetRevalidation(HttpCacheRevalidation.AllCaches);
    HttpContext.Current.Response.Cache.SetCacheability(HttpCacheability.NoCache);
    HttpContext.Current.Response.Cache.SetNoStore();
}
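If the helper lives in a shared class, one typical call site (an assumption about your project layout) is Global.asax, so the no-cache policy is applied to every response:
protected void Application_BeginRequest(object sender, EventArgs e)
{
    DisablePageCaching();
}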

Use this code (C#):
public static void DeleteFirefoxCache()
{
    string profilesPath = @"Mozilla\Firefox\Profiles";
    string localProfiles = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData), profilesPath);
    string roamingProfiles = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData), profilesPath);
    if (Directory.Exists(localProfiles))
    {
        var profiles = Directory.GetDirectories(localProfiles).ToList();
        profiles.RemoveAll(prfl => prfl.ToLowerInvariant().EndsWith("geolocation")); // do not delete this profile.
        profiles.ForEach(delegate(string path)
        {
            var files = Directory.GetFiles(path, "*.*", SearchOption.AllDirectories).ToList();
            foreach (string file in files)
            {
                if (!Common.IsFileLocked(new FileInfo(file)))
                    File.Delete(file);
            }
        });
    }
    if (Directory.Exists(roamingProfiles))
    {
        var profiles = Directory.GetDirectories(roamingProfiles).ToList();
        profiles.RemoveAll(prfl => prfl.ToLowerInvariant().EndsWith("geolocation")); // do not delete this profile.
        profiles.ForEach(delegate(string path)
        {
            var dirs = Directory.GetDirectories(path, "*", SearchOption.AllDirectories).ToList();
            dirs.ForEach(delegate(string dir)
            {
                var files = Directory.GetFiles(dir, "*.*", SearchOption.AllDirectories).ToList();
                foreach (string file in files)
                {
                    if (!Common.IsFileLocked(new FileInfo(file)))
                        File.Delete(file);
                }
            });
            var files0 = Directory.GetFiles(path, "*", SearchOption.TopDirectoryOnly).ToList();
            files0.ForEach(delegate(string file)
            {
                if (!Common.IsFileLocked(new FileInfo(file)))
                    File.Delete(file);
            });
        });
    }
}
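The Common.IsFileLocked helper used above is not shown in the answer; a hypothetical implementation is the usual try-to-open-exclusively check:
using System.IO;

public static class Common
{
    public static bool IsFileLocked(FileInfo file)
    {
        try
        {
            // Opening with FileShare.None fails if another process (e.g. a running Firefox) holds the file.
            using (file.Open(FileMode.Open, FileAccess.Read, FileShare.None))
            {
                return false;
            }
        }
        catch (IOException)
        {
            return true;
        }
    }
}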

My solution:
string UserProfile = Environment.GetFolderPath(Environment.SpecialFolder.UserProfile);
try
{
    string id = string.Empty;
    var lines = File.ReadAllLines($@"{UserProfile}\AppData\Roaming\Mozilla\Firefox\profiles.ini");
    foreach (var line in lines)
    {
        // The last "Path=Profiles/..." entry in profiles.ini wins.
        if (line.Contains("Path=Profiles/"))
        {
            var text = line.Replace("Path=Profiles/", "");
            id = text.Trim();
        }
    }
    Array.ForEach(Directory.GetFiles($@"{UserProfile}\AppData\Local\Mozilla\Firefox\Profiles\{id}\cache2\entries"), File.Delete);
}
catch { }

In ASP.NET/C# you can clear items from the server-side cache like this (note that this is ASP.NET's Cache object on the server, not the visitor's browser cache):
string cacheKey = "TestCache";
// Add an item to the server-side cache
Cache.Add(cacheKey, "Cache content", null, DateTime.Now.AddMinutes(30),
    TimeSpan.Zero, CacheItemPriority.High, null);
Cache.Remove(cacheKey); // remove the cached item again


Download only files with specified extensions from Azure Git repository

I need to download files with specific extensions from an Azure Git repository programmatically (C#, .NET Framework 4.8). The files are located in different server folders and sometimes in different branches. I was able to achieve this with the following code:
var connection = new VssConnection(new Uri(collectionUri), new VssBasicCredential("", personalAccessToken));
using (var ghc = connection.GetClient<GitHttpClient>())
{
    string[] extensionsToDownload = { "xml", "xslt" }; // just for example, real cases are not limited to these extensions
    string branch = "my-branch-name";
    GitVersionDescriptor version = new GitVersionDescriptor { Version = branch };
    // Get all items information
    GitItemRequestData data = new GitItemRequestData
    {
        ItemDescriptors = new[]
        {
            new GitItemDescriptor
            {
                Path = "/Some/Path",
                RecursionLevel = VersionControlRecursionType.Full,
                VersionType = GitVersionType.Branch,
                Version = branch
            },
            new GitItemDescriptor
            {
                Path = "/Another/Path",
                RecursionLevel = VersionControlRecursionType.Full,
                VersionType = GitVersionType.Branch,
                Version = branch
            }
        }
    };
    var items = ghc.GetItemsBatchAsync(data, project: projectName, repositoryId: repoName).Result;
    // filter returned items by extension
    List<GitItem> filteredItems = items
        .SelectMany(item => item)
        .Where(item => item.GitObjectType == GitObjectType.Blob && extensionsToDownload.Contains(item.Path.Split('.').Last()))
        .ToList();
    // download zipped items and extract
    foreach (var item in filteredItems)
    {
        using (var stream = ghc.GetItemZipAsync(
            project: projectName, repositoryId: repoName, path: item.Path, includeContent: true, versionDescriptor: version).Result)
        {
            ZipArchive archive = new ZipArchive(stream);
            foreach (ZipArchiveEntry entry in archive.Entries)
            {
                entry.ExtractToFile(Path.Combine(localFolder, entry.FullName.Trim('/')), true);
            }
        }
    }
}
However, this means that each item requires a separate API call, which is definitely not good for performance. I thought "there should be a way to batch-download all items at once".
The GetBlobsZipAsync method seemed like exactly what I needed, but my attempt to use it failed miserably. All I got was VssUnauthorizedException: 'VS30063: You are not authorized to access https://dev.azure.com'. Very strange, because calling GetBlobZipAsync for each individual item id works perfectly (although that is almost the same as the initial solution, with the same far-from-ideal performance).
Dictionary<string, string> idToNameMappings = filteredItems.ToDictionary(k => k.ObjectId, v => Path.Combine(localFolder, v.Path.Trim('/')));
// Single batch call for all blob ids (this is the call that throws VssUnauthorizedException for me)
using (var stream = ghc.GetBlobsZipAsync(idToNameMappings.Select(i => i.Key), project: projectName, repositoryId: repoName).Result)
{
    ZipArchive archive = new ZipArchive(stream);
    foreach (ZipArchiveEntry entry in archive.Entries)
    {
        // Entries in the returned zip are keyed by object id, so map them back to local file paths
        entry.ExtractToFile(idToNameMappings[entry.FullName], true);
    }
}
Another option is to download all items as a zip archive and filter it on the client side:
foreach (var desc in data.ItemDescriptors)
{
    using (var stream = ghc.GetItemZipAsync(projectName, repoName, null, desc.Path, desc.RecursionLevel, versionDescriptor: version).Result)
    {
        ZipArchive archive = new ZipArchive(stream);
        foreach (ZipArchiveEntry entry in archive.Entries)
        {
            if (extensionsToDownload.Contains(entry.FullName.Split('.').Last()))
            {
                entry.ExtractToFile(Path.Combine(localFolder, entry.FullName.Trim('/')), true);
            }
        }
    }
}
But that's even worse, because the repository contains a large number of data files (including some binary content). Downloading several hundred MB of data to get less than 10 MB of XML files doesn't seem very efficient.
So for the moment I've given up and decided to stick with the initial solution. But maybe there's something I overlooked?

Error: File operation not permitted. Access to the path is denied

I am working on a project that uses Silverlight, where I want to show PDF files from a server path, but when I start debugging my code I get the exception from the title. I generate the frame content with the following code:
System.Windows.Browser.HtmlElement myFrame = System.Windows.Browser.HtmlPage.Document.GetElementById("_sl_historyFrame");
if (myFrame != null)
{
    DirectoryInfo folderPath = new DirectoryInfo(@"\\192.168.1.216\UploadFileMobilePDF\" + transfer.IdTransfer);
    foreach (var file in folderPath.EnumerateFiles("*.pdf", SearchOption.AllDirectories))
    {
        myFrame.SetStyleAttribute("width", "1024");
        myFrame.SetStyleAttribute("height", "768");
        Uri uri = new Uri(file.FullName);
        string path = uri.AbsoluteUri;
        myFrame.SetAttribute("src", path);
        myFrame.SetStyleAttribute("left", "0");
        myFrame.SetStyleAttribute("top", "50");
        myFrame.SetStyleAttribute("visibility", "visible");
    }
}
The error is thrown when instantiating the DirectoryInfo class: folderPath = new DirectoryInfo(...).
I don't know whether Silverlight can be granted permissions to server addresses.
Your application likely doesn't have permission to access the files on the server you're trying to access.
Look into WindowsImpersonationContext for the most likely way around this. https://learn.microsoft.com/en-us/dotnet/api/system.security.principal.windowsimpersonationcontext?view=netframework-4.8
You'll want a class (say, "MyImpersonator") that uses WindowsImpersonationContext to log onto the server using valid credentials. There are too many details to present an entire solution, but using the class (defined elsewhere) to get a single file might look something like this:
using (var impersonator = new MyImpersonator())
{
    string name = ConfigurationManager.AppSettings["name"];
    string password = ConfigurationManager.AppSettings["pass"];
    if (impersonator.LogOnCrossDomain(name, password))
    {
        if (File.Exists(filepath))
        {
            byte[] content = File.ReadAllBytes(filepath);
        }
    }
}
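For reference, a hypothetical sketch of such a MyImpersonator class, built on the classic LogonUser + WindowsIdentity.Impersonate pattern (.NET Framework only; the class name, the LogOnCrossDomain signature and the logon constants are assumptions, not part of the original answer):
using System;
using System.Runtime.InteropServices;
using System.Security.Principal;

public sealed class MyImpersonator : IDisposable
{
    // LOGON32_LOGON_NEW_CREDENTIALS: local code keeps running as the current user,
    // while outgoing network access uses the supplied credentials.
    private const int LOGON32_LOGON_NEW_CREDENTIALS = 9;
    private const int LOGON32_PROVIDER_WINNT50 = 3;

    private WindowsImpersonationContext _context;

    [DllImport("advapi32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
    private static extern bool LogonUser(string userName, string domain, string password,
        int logonType, int logonProvider, out IntPtr token);

    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern bool CloseHandle(IntPtr handle);

    public bool LogOnCrossDomain(string userName, string password, string domain = ".")
    {
        IntPtr token;
        if (!LogonUser(userName, domain, password,
                       LOGON32_LOGON_NEW_CREDENTIALS, LOGON32_PROVIDER_WINNT50, out token))
            return false;

        _context = WindowsIdentity.Impersonate(token); // subsequent file access runs under these credentials
        CloseHandle(token);
        return true;
    }

    public void Dispose()
    {
        if (_context != null)
        {
            _context.Undo();
            _context.Dispose();
        }
    }
}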

How to delete all empty links to files from DB and all redundant files

I am building a web app which can store images. My DB stores paths to these images, and all of them are stored in a specific directory. How can I delete all files from the download folder which do not exist in the DB, and all DB records which have empty links?
For example, I have 3 files: File1.jpg, File2.jpg, File3.jpg.
My DB stores only File1.jpg and File2.jpg. For some reason File1.jpg was deleted from the directory, but its record still remains in the DB. What is the best way to delete File3.jpg from the folder (as it is not stored in the DB) and File1.jpg from the DB (as it does not exist in the folder)?
I have written a method to delete files that are not stored in the DB:
public async Task DeleteNonExistingImagesInFolder(string imagesDirectory)
{
    var images = _unitOfWork.Images.AsQueryable();
    DirectoryInfo d = new DirectoryInfo(imagesDirectory);
    FileInfo[] Files = d.GetFiles();
    await Task.Run(() =>
    {
        foreach (var file in Files)
        {
            if (!images.Where(i => i.Path == file.FullName).Any())
                file.Delete();
        }
    });
}
I have done the same thing for DB records:
public async Task DeleteNonExistingImagesInDB(string imagesDirectory)
{
    var images = _unitOfWork.Images.AsQueryable();
    DirectoryInfo d = new DirectoryInfo(imagesDirectory);
    FileInfo[] Files = d.GetFiles();
    await Task.Run(() =>
    {
        foreach (var image in images)
        {
            if (!Files.Where(f => f.FullName == image.Path).Any())
                _unitOfWork.Images.Remove(image.Id);
        }
    });
}
But maybe there is a faster approach.
Something like this is pretty efficient and takes only a short bit of code. It simply detects the differences between the two collections. Below is a working example; see the end of the answer for hints on what you will need to change for your implementation.
IEnumerable<string> files = new List<string> { "file1.txt", "file4.txt" };
IEnumerable<string> dbFiles = new List<string> { "file1.txt", "file2.txt", "file3.txt" };

IEnumerable<string> addsToFileSystem = files.Except(dbFiles);
IEnumerable<string> addsToDb = dbFiles.Except(files);

foreach (string file in addsToFileSystem) {
    Console.WriteLine($"delete {file} from file system");
}
foreach (string file in addsToDb) {
    Console.WriteLine($"delete {file} from db");
}
Output:
delete file4.txt from file system
delete file2.txt from db
delete file3.txt from db
// get collection of files from the "my files" directory and select just the file name
IEnumerable<string> files = Directory.EnumerateFiles("my files").Select(x => Path.GetFileName(x));
// replace with selecting the file names from your database
IEnumerable<string> dbFiles = _unitOfWork.Images.Select(x => x.FileName);

IEnumerable<string> addsToFileSystem = files.Except(dbFiles);
IEnumerable<string> addsToDb = dbFiles.Except(files);

foreach (string file in addsToFileSystem) {
    // remove from file system
}
foreach (string file in addsToDb) {
    // remove from db
}

How do I check the "cut" permission of a folder in Windows using .NET

In my project I want to move a folder to a destination. Here is my thinking.
First scenario: I check whether I can move the folder. If I don't have permission, I don't check the sub-items in the folder, and the move action is done.
Second scenario: if I do have permission to move the folder, I check all the sub-items, move the items that I can move, and leave the items that I can't.
I don't know how to implement the first scenario. If I catch the UnauthorizedAccessException there, I may block the second scenario, because while moving the folder it will also throw that exception if some sub-items cannot be moved. Can someone give me some suggestions?
I did not test this but it should give you a start:
public static void MoveDirectory(string source, string target)
{
    var delSourceAtEnd = true;
    var sourcePath = source.TrimEnd('\\', ' ');
    var targetPath = target.TrimEnd('\\', ' ');
    var files = Directory.EnumerateFiles(sourcePath, "*", SearchOption.AllDirectories)
                         .GroupBy(Path.GetDirectoryName);
    foreach (var folder in files)
    {
        var failed = false;
        var targetFolder = folder.Key.Replace(sourcePath, targetPath);
        Directory.CreateDirectory(targetFolder);
        foreach (var file in folder)
        {
            var targetFile = Path.Combine(targetFolder, Path.GetFileName(file));
            try
            {
                File.Move(file, targetFile);
            }
            catch (UnauthorizedAccessException)
            {
                // leave the file behind and remember that the source cannot be fully removed
                failed = true;
                delSourceAtEnd = false;
            }
        }
        // only remove a source folder if every file in it was moved
        if (!failed) Directory.Delete(folder.Key);
    }
    if (delSourceAtEnd) Directory.Delete(source, false);
}
This is heavily based on this answer, which shows different options for how you can move a directory manually and handle individual files and folders.
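A minimal usage sketch (the paths are placeholders):
// Moves what it can and leaves behind anything that throws UnauthorizedAccessException.
MoveDirectory(@"C:\data\projects\source", @"D:\archive\target");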

Moving a SharePoint folder and contents to different location in same Document Library

I'm looking for a way to move a folder and all its contents to a different location in the same library using the Client Object Model for SharePoint 2010 (C#).
For example, we have a folder for a project (say 12345) and its URL is
http://sharepoint/site/library/2012/12345
where 2012 represents a year. I'd like to programmatically move the 12345 folder to a different year, say 2014, which probably exists already but may not.
I've searched around, but the solutions I'm finding seem extremely complicated and aimed at moving folders between site collections. I'm hoping that, because it's within the same library, there might be a simpler solution. One idea is to rely on Explorer View instead of CSOM.
Thanks a lot!
There is no built-in method in the SharePoint CSOM API for moving a folder with its files from one location to another.
The following class shows how to move files from a source folder into a destination folder:
public static class FolderExtensions
{
    public static void MoveFilesTo(this Folder folder, string folderUrl)
    {
        var ctx = (ClientContext)folder.Context;
        if (!ctx.Web.IsPropertyAvailable("ServerRelativeUrl"))
        {
            ctx.Load(ctx.Web, w => w.ServerRelativeUrl);
        }
        ctx.Load(folder, f => f.Files, f => f.ServerRelativeUrl, f => f.Folders);
        ctx.ExecuteQuery();
        // Ensure target folder exists
        EnsureFolder(ctx.Web.RootFolder, folderUrl.Replace(ctx.Web.ServerRelativeUrl, string.Empty));
        foreach (var file in folder.Files)
        {
            var targetFileUrl = file.ServerRelativeUrl.Replace(folder.ServerRelativeUrl, folderUrl);
            file.MoveTo(targetFileUrl, MoveOperations.Overwrite);
        }
        ctx.ExecuteQuery();
        foreach (var subFolder in folder.Folders)
        {
            var targetFolderUrl = subFolder.ServerRelativeUrl.Replace(folder.ServerRelativeUrl, folderUrl);
            subFolder.MoveFilesTo(targetFolderUrl);
        }
    }

    public static Folder EnsureFolder(Folder parentFolder, string folderUrl)
    {
        var ctx = parentFolder.Context;
        var folderNames = folderUrl.Split(new char[] { '/' }, StringSplitOptions.RemoveEmptyEntries);
        var folderName = folderNames[0];
        var folder = parentFolder.Folders.Add(folderName);
        ctx.Load(folder);
        ctx.ExecuteQuery();
        if (folderNames.Length > 1)
        {
            var subFolderUrl = string.Join("/", folderNames, 1, folderNames.Length - 1);
            return EnsureFolder(folder, subFolderUrl);
        }
        return folder;
    }
}
Key points:
ensures that the destination folder(s) exist before files are moved
in the case of nested folders, the folder structure is preserved while moving files
Usage
var srcFolderUrl = "/news/pages";
var destFolderUrl = "/news/archive/pages";
using (var ctx = new ClientContext(url))
{
    var sourceFolder = ctx.Web.GetFolderByServerRelativeUrl(srcFolderUrl);
    sourceFolder.MoveFilesTo(destFolderUrl);
    sourceFolder.DeleteObject(); // delete source folder if necessary
    ctx.ExecuteQuery();
}
Just in case someone needs this translated to PnP PowerShell: it's not battle-tested, but it works for me. Versions and metadata are moved as well within the same library.
$list = Get-PnPList -Identity Documents
$web = $list.ParentWeb
$folder = Ensure-PnPFolder -Web $list.ParentWeb -SiteRelativePath "Shared Documents/MoveTo"
$tofolder = Ensure-PnPFolder -Web $list.ParentWeb -SiteRelativePath "Shared Documents/MoveTwo"

function MoveFolder
{
    [cmdletbinding()]
    Param (
        $web,
        $fromFolder,
        $toFolder
    )
    $fromFolder.Context.Load($fromFolder.Files)
    $fromFolder.Context.Load($fromFolder.Folders)
    $fromFolder.Context.ExecuteQuery()
    foreach ($file in $fromFolder.Files)
    {
        $targetFileUrl = $file.ServerRelativeUrl.Replace($fromFolder.ServerRelativeUrl, $toFolder.ServerRelativeUrl);
        $file.MoveTo($targetFileUrl, [Microsoft.SharePoint.Client.MoveOperations]::Overwrite);
    }
    $fromFolder.Context.ExecuteQuery();
    foreach ($subFolder in $fromFolder.Folders)
    {
        $targetFolderUrl = $subFolder.ServerRelativeUrl.Replace($fromFolder.ServerRelativeUrl, $toFolder.ServerRelativeUrl);
        $targetFolderRelativePath = $targetFolderUrl.SubString($web.RootFolder.ServerRelativeUrl.Length)
        $tofolder = Ensure-PnPFolder -Web $list.ParentWeb -SiteRelativePath $targetFolderRelativePath
        MoveFolder -Web $web -fromFolder $subFolder -toFolder $tofolder
    }
}

$web.Context.Load($web.RootFolder)
$web.Context.ExecuteQuery()
MoveFolder -Web $web -fromFolder $folder -toFolder $tofolder
$folder.DeleteObject()
$web.Context.ExecuteQuery()
