I'm using ASP.NET (C#).
I need to access some video files that are kept on another server:
say my application runs on 111.222.33.33 and the files are kept in the 'repository' folder on the F: drive of 222.111.12.12.
Below is my code segment to pull the file names and paths into an ASP.NET GridView so that I can download these files by clicking the download link.
I'm able to download the files when they are on the same IP address.
But I'm having trouble with the file path. I need some help with the URL. Thanks.
String fileSearchPatern = "*.*";
DirectoryInfo directory = new DirectoryInfo(Server.MapPath(@"\\222.111.12.12\F:\repository\"));
if (gvSource == null)
{
gvSource = DisplayFilesInGridViewTwo();
}
DataRow gvRow;
FileInfo[] files = directory.GetFiles(fileSearchPatern, SearchOption.AllDirectories);
foreach (FileInfo fileInfo in files)
{
gvRow = gvSource.NewRow();
gvRow["Name"] = fileInfo.Name;
gvRow["FilePath"] = @"\\222.111.12.12\F:\repository\" + fileInfo.Name;
gvSource.Rows.Add(gvRow);
}
if (files.Length > 0)
{
this.GridView2.DataSource = gvSource;
this.GridView2.DataBind();
}
else
{
this.GridView2.DataSource = null;
this.GridView2.DataBind();
}
Not sure exactly, but this might help. I can see that over the network you cannot use F:; you have to use the administrative share F$ instead. Please try the following steps and tell me:
1- Check whether you can access the F drive over the network: press Windows + R and enter \\222.111.12.12\F$\repository
2- If not, first check the network connection, then check whether the folder is shared and whether you have been granted permission.
3- If you want to store files in this folder, you should grant the Network Service account permission on it; if you are only reading, the existing permissions should be enough.
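As a sketch of the path fix in code, assuming the remote machine exposes the default administrative share and your account has rights to it (the host and folder below are the placeholders from the question, and UncPathHelper is a hypothetical helper, not part of any library):

```csharp
using System;
using System.IO;

// Hypothetical helper: converts a host plus a local drive path ("F:\repository")
// into the administrative-share UNC form ("\\host\F$\repository"). The host and
// folder are the placeholders from the question, not a real server.
static class UncPathHelper
{
    public static string ToAdminShare(string host, string localPath)
    {
        if (localPath.Length < 2 || localPath[1] != ':')
            throw new ArgumentException(@"Expected a drive-letter path like F:\repository");

        char drive = localPath[0];
        string rest = localPath.Substring(2).TrimStart('\\');
        return $@"\\{host}\{drive}$\{rest}";
    }
}

class Program
{
    static void Main()
    {
        string unc = UncPathHelper.ToAdminShare("222.111.12.12", @"F:\repository");
        Console.WriteLine(unc); // \\222.111.12.12\F$\repository

        // With the share reachable and credentials in place, the question's
        // enumeration becomes:
        // FileInfo[] files = new DirectoryInfo(unc).GetFiles("*.*", SearchOption.AllDirectories);
    }
}
```

Note that accessing F$ requires administrator rights on the remote machine; a purpose-made share with explicit permissions is usually the better long-term option.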
I'm trying to read a folder from a remote machine; inside this folder there are many txt files.
The machine is writing new data into the folder all the time.
string str = "";
try
{
DirectoryInfo d = new DirectoryInfo(@"\\192.168.1.209\user\HST\");
FileInfo[] Files = d.GetFiles("*.txt");
foreach (FileInfo file in Files)
{
str = str + ", " + file.Name;
}
}
catch (Exception pp)
{
System.IO.File.WriteAllText(GlobalVariables.errorFolderLocation + "erroreLettura.1.209.txt", pp.ToString());
}
This is my code, but I don't understand how to read this data, because I get this error: "The system call level is not correct."
For example, if I try to delete the folder or a file, I get an error because it's already in use by another process.
So, is there a way to work around this error?
EDIT 1:
I need to read every row of every file; I get the error on DirectoryInfo.
If I access the folders and files directly, it works fine.
I need to read these folders/files on 3 different machines, but only on this one (192.168.1.209) is it not working, and it's the only machine where I get an error when I try to delete a file.
Unfortunately, you cannot delete a file that is in use by another process, delete a folder that contains a file in use, or perform any other operation that the other process has not allowed. To make this possible, the process that has the file open must open it with a sharing mode, using the FileShare enumeration from the System.IO namespace.
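A minimal runnable sketch of that sharing mode, using a temp file as a stand-in for the HST folder's txt files: the writer opens the file with FileShare.ReadWrite | FileShare.Delete, so a second handle can read (or even delete) it while the writer still has it open.

```csharp
using System;
using System.IO;
using System.Text;

class SharedFileDemo
{
    // Writes "hello" to path with a share-friendly handle, then reads it back
    // through a second, concurrent handle while the writer is still open.
    public static string WriteAndReadConcurrently(string path)
    {
        using (var writer = new FileStream(path, FileMode.Create, FileAccess.Write,
                                           FileShare.ReadWrite | FileShare.Delete))
        {
            byte[] data = Encoding.UTF8.GetBytes("hello");
            writer.Write(data, 0, data.Length);
            writer.Flush();

            // Second handle: allowed because the writer opened with FileShare.ReadWrite.
            using (var reader = new FileStream(path, FileMode.Open, FileAccess.Read,
                                               FileShare.ReadWrite | FileShare.Delete))
            using (var sr = new StreamReader(reader))
            {
                return sr.ReadToEnd();
            }
        }
    }

    static void Main()
    {
        string path = Path.Combine(Path.GetTempPath(), "shared-demo.txt");
        Console.WriteLine(WriteAndReadConcurrently(path)); // hello
        File.Delete(path);
    }
}
```

The key point is that the sharing mode is chosen by the process that opens the file first; if the machine at 192.168.1.209 opens its files without a share flag, nothing the reading side does can override that.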
I'm trying to query the list of all files and folders from the Drive of a G Suite user and copy all of these files to a user in another domain.
I am able to get the list of all files, but I am not able to copy those files to the other domain.
I have gone through several references on sharing Drive files, but I couldn't understand how to grant authority so that the destination domain can copy the files.
I would be happy if anyone can help me solve this problem.
public void Execute(){
string strFileId = "";
//get list of all files from the source domain (e.g. user@a.jp)
IList<Google.Apis.Drive.v3.Data.File> fileList = service.getDriveFiles();
Google.Apis.Drive.v3.Data.File title = new Google.Apis.Drive.v3.Data.File();
//loop files list
foreach (var fileItem in fileList)
{
strFileId = fileItem.Id;
title.Name = fileItem.Name;
//copy files to the destination domain (e.g. user@b.jp) with file ID
service.Files.Copy(title, strFileId).Execute();
}
}
private IList<Google.Apis.Drive.v3.Data.File> getDriveFiles()
{
// Define parameters of request.
Google.Apis.Drive.v3.FilesResource.ListRequest FileListRequest = source_drive_service.Files.List();
// get all files
FileListRequest.Fields = "nextPageToken, files(*)";
IList<Google.Apis.Drive.v3.Data.File> files = FileListRequest.Execute().Files;
return files;
}
There are many ways to go about this, but one simple way is to:
With the Drive API, get a list of all your files (which includes folders in Drive) by using Files.list().
For every file, transfer ownership to a new owner (the admin of domain B, for example).
Note: When a file is transferred, the previous owner's role is downgraded to writer.
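A hedged sketch of the ownership-transfer step, assuming the Google.Apis.Drive.v3 NuGet package and an already-authorized DriveService (the service variable from the question). The address admin@b.jp is a placeholder, and Google Workspace policy may restrict ownership transfers across domains; in that case, sharing the files and running Files.Copy from the destination account is the usual fallback.

```csharp
using Google.Apis.Drive.v3;
using Google.Apis.Drive.v3.Data;

static class DriveOwnership
{
    // Grants the "owner" role on fileId to newOwnerEmail. TransferOwnership
    // must be set on the request whenever the role being granted is "owner".
    public static void Transfer(DriveService service, string fileId, string newOwnerEmail)
    {
        var permission = new Permission
        {
            Type = "user",
            Role = "owner",
            EmailAddress = newOwnerEmail // e.g. "admin@b.jp" (placeholder)
        };

        var request = service.Permissions.Create(permission, fileId);
        request.TransferOwnership = true;
        request.Execute();
    }
}
```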
I'm trying to download multiple files from an SFTP server and save them to the install path (or actually, ANY path at the moment, just to get it working). However, I get an UnauthorizedAccessException no matter where I try to save the files.
As far as I was aware, there are no special permissions required to save files to the install dir (hence why I chose this folder).
Thread myThread = new Thread(delegate() {
string host;
string username;
string password;
// Path to folder on SFTP server
string pathRemoteDirectory = "public_html/uploads/17015/";
// Path where the file should be saved once downloaded (locally)
StorageFolder localFolder = Windows.ApplicationModel.Package.Current.InstalledLocation;
string pathLocalDirectory = localFolder.Path.ToString();
var methods = new List<AuthenticationMethod>();
methods.Add(new PasswordAuthenticationMethod(username, password));
//TODO - Add SSH Key auth
var con = new ConnectionInfo(host, 233, username, methods.ToArray());
using (SftpClient sftp = new SftpClient(con))
{
try
{
sftp.Connect();
var files = sftp.ListDirectory(pathRemoteDirectory);
// Iterate over them
foreach (SftpFile file in files)
{
Console.WriteLine("Downloading {0}", file.FullName);
using (Stream fileStream = File.OpenWrite(Path.Combine(pathLocalDirectory, file.Name)))
{
sftp.DownloadFile(file.FullName, fileStream);
Debug.WriteLine(fileStream);
}
}
sftp.Disconnect();
}
catch (Exception er)
{
Console.WriteLine("An exception has been caught " + er.ToString());
}
}
});
Connection to the server is fine; the exception occurs on this line:
using (Stream fileStream = File.OpenWrite(Path.Combine(pathLocalDirectory, file.Name)))
I must be missing something obvious here, but it's worth noting that I've also tried writing to special folders like the Desktop, the user's Documents folder, and directly to the C:\ drive, all with the same exception. I'm also running with administrator privileges and I have the correct permissions set on the folders.
It turns out that the SFTP listing was including '.' and '..' as entries and the code was trying to download them, when obviously '.' is the current SFTP folder and '..' is the parent folder. This was causing a permissions exception; I'm not 100% sure why. Simply checking each entry while iterating, so that directories and symbolic links are skipped, fixed the issue. Code below.
sftp.Connect();
var files = sftp.ListDirectory(pathRemoteDirectory);
// Iterate over them
foreach (SftpFile file in files)
{
if (file.IsDirectory)
{
// Also covers the '.' and '..' entries, which are directories.
Debug.WriteLine($"Directory ignored: {file.FullName}");
}
else if (file.IsSymbolicLink)
{
Debug.WriteLine($"Symbolic link ignored: {file.FullName}");
}
else
{
using (Stream fileStream = File.OpenWrite(Path.Combine(pathLocalDirectory, file.Name)))
{
sftp.DownloadFile(file.FullName, fileStream);
Debug.WriteLine(pathLocalDirectory);
}
}
}
sftp.Disconnect();
You have multiple problems here. The parent-folder ("..") entry you covered in your answer is one blocker, but that doesn't address the deeper problem: the InstalledLocation is read-only.
UWP apps do not have direct access to most file system locations. By default they can read and write to their ApplicationData directory and they can read from (but not write to) the InstalledLocation. The failures you saw for Desktop, Documents, and C:\ are all expected.
Other locations (including Desktop, Documents, and C:) may be granted access by the user either explicitly or via the app's declared capabilities. They can be accessed via the file broker through the StorageFile object.
See the UWP File access permissions documentation:
The app's install directory is a read-only location. You can't gain
access to the install directory through the file picker.
For the long term you'll want to download your files somewhere else: probably into one of the ApplicationData folders. These folders are the only ones with no special permission requirements for UWP apps.
So why does this work for you now?
You're running into a debugging quirk where your app is not fully installed but is staged from your VS project directory. This allows the app to write to the staged install directory, but once it is properly deployed into Program Files\WindowsApps writing to the InstalledLocation will fail.
Try Path.GetTempPath(); you should have permission there.
When it says you don't have permission, you don't. 8-)
Also, there's no such thing as "no special permissions". Everything requires some level of permission for access.
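A minimal sketch of the temp-folder suggestion. The file name and bytes are placeholders; in the SFTP code above, the payload would come from sftp.DownloadFile writing into the stream.

```csharp
using System;
using System.IO;

class TempSave
{
    public static string Save(string fileName, byte[] payload)
    {
        // Path.GetTempPath() resolves to a per-user location the process
        // can normally write to without elevated permissions.
        string target = Path.Combine(Path.GetTempPath(), fileName);
        File.WriteAllBytes(target, payload);
        return target;
    }

    static void Main()
    {
        string saved = Save("demo.bin", new byte[] { 1, 2, 3 });
        Console.WriteLine(saved);
    }
}
```

For a packaged UWP app, the ApplicationData folders mentioned above remain the durable choice; the temp path is mainly useful for scratch files.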
How do I get a Word file from a server in C#? I use the following code:
static void Main(string[] args)
{
Word._Application application = new Word.Application();
object fileformat = Word.WdSaveFormat.wdFormatXMLDocument;
//
DirectoryInfo directory = new DirectoryInfo(@"http://www.sample.com/image/");
foreach (FileInfo file in directory.GetFiles("*.doc", SearchOption.AllDirectories))
{
if (file.Extension.ToLower() == ".doc")
{
object filename = file.FullName;
object newfilename = file.FullName.ToLower().Replace(".doc", ".docx");
Word._Document document = application.Documents.Open(filename);
document.Convert();
document.SaveAs(newfilename, fileformat);
document.Close();
document = null;
}
}
application.Quit();
application = null;
}
but when I use this code to get files from the local machine or desktop, it works fine.
Please tell me what I'm doing wrong.
You can't use DirectoryInfo with a URL.
By design, this class only takes a local (or mapped network) path in its constructor.
You need to use the System.Net.HttpWebRequest class to get the file from a URL; since it's located on a server on the internet, the only way to retrieve the file is to download it via HTTP.
Edit:
Based on your comments, you are looking to process 1 million files on a server you have access to. There are many ways to handle this.
You can use a network path to the server, such as
var di = new DirectoryInfo(@"\\servername\path\filename.doc")
Or you can create your application as a C# console application and use a local path. This is what I call a utility. It would be the faster method, since it processes everything locally and avoids network traffic.
var di = new DirectoryInfo(@"c:\your-folder\your-doc-file.doc")
Since you would run the C# console app directly on the server, the above would work.
DirectoryInfo is just an object that contains information about a directory entry in your file system. It doesn't download a file, which I presume is what you want to do.
The code example at http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.connection(v=vs.110).aspx is, I think, similar to what you want.
DirectoryInfo is for accessing local files or UNC paths. You cannot use it to access an HTTP-addressed page. You first need to download the file, e.g. using HttpWebRequest.
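As a sketch of that approach, using the placeholder URL from the question: fetch the .doc over HTTP first, then hand the local copy to Word interop. LocalDocxPath is a hypothetical helper (not part of any library) that only derives where the converted .docx would be written.

```csharp
using System;
using System.IO;

class DocDownloader
{
    public static string LocalDocxPath(string url, string folder)
    {
        string name = Path.GetFileName(new Uri(url).LocalPath); // e.g. "report.doc"
        return Path.Combine(folder, Path.ChangeExtension(name, ".docx"));
    }

    static void Main()
    {
        string url = "http://www.sample.com/image/report.doc"; // placeholder
        string target = LocalDocxPath(url, Path.GetTempPath());
        Console.WriteLine(target);

        // To actually fetch the file (requires a reachable server):
        // using (var client = new System.Net.WebClient())
        //     client.DownloadFile(url, Path.ChangeExtension(target, ".doc"));
        // The downloaded copy can then be opened with application.Documents.Open(...).
    }
}
```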
I have many images on a remote server, say images.foo.com/222, and I want to read the file names of all files that reside in the folder 222 on images.foo.com.
I have tried the following code, but I am getting the error "virtual path is not valid":
string imageserver = "http://images.foo.com/222";
DirectoryInfo di = new DirectoryInfo(imageserver); // line giving exception
FileInfo[] rgFiles = di.GetFiles();
string simagename = "";
if (rgFiles.Count() > 0)
{
foreach (FileInfo fi in rgFiles)
{
//collect each filename from here
}
}
Please help, thanks in advance.
gbaxi
DirectoryInfo needs a UNC path of the form "\\fileserver\images".
An HTTP address will not work.
You can't access a directory residing on the web with the DirectoryInfo class. Instead, use the WebRequest class to get a listing from the URL and extract the files from it.
The problem is that HTTP has no standard interface for presenting a directory listing. There are roughly two choices:
Parse the HTML retrieved through a WebRequest, but you won't get details like creation/modification time and owner;
Use a different mechanism to retrieve the file details, such as FTP or a file share.
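A rough sketch of the first choice, pulling anchor hrefs out of directory-listing HTML. Real listings vary by server, so the regex is a heuristic rather than a robust HTML parser, and both the sample HTML and the ExtractFileNames helper are made up for illustration.

```csharp
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

class ListingParser
{
    // Extracts href targets that look like plain file names: skips links that
    // start with "/", "?" or a quote, and links ending in "/" (subdirectories).
    public static List<string> ExtractFileNames(string html)
    {
        var names = new List<string>();
        foreach (Match m in Regex.Matches(html, "href=\"([^\"?/][^\"]*)\"", RegexOptions.IgnoreCase))
        {
            string href = m.Groups[1].Value;
            if (!href.EndsWith("/"))
                names.Add(href);
        }
        return names;
    }

    static void Main()
    {
        // In practice the HTML would come from e.g.
        // new System.Net.WebClient().DownloadString("http://images.foo.com/222/");
        string html = "<a href=\"../\">up</a><a href=\"img1.jpg\">img1.jpg</a>"
                    + "<a href=\"sub/\">sub</a><a href=\"img2.png\">img2.png</a>";
        foreach (string name in ExtractFileNames(html))
            Console.WriteLine(name); // img1.jpg, then img2.png
    }
}
```

This only works when the server has directory browsing enabled; if it doesn't, there is nothing to parse, and FTP or a file share is the way to go.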