I have many images on a remote server, say images.foo.com/222, and I want to get the file names of all files that reside in the folder 222 on images.foo.com.
I have tried the following code but am getting the error "virtual path is not valid":
string imageserver = "http://images.foo.com/222";
DirectoryInfo di = new DirectoryInfo(imageserver); // line giving exception
FileInfo[] rgFiles = di.GetFiles();
string simagename = "";
if (rgFiles.Count() > 0)
{
foreach (FileInfo fi in rgFiles)
{
//collect each filename from here
}
}
Please help
thanks in advance
gbaxi
DirectoryInfo needs a UNC path of the form "\\fileserver\images".
An HTTP address will not work.
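For illustration, a minimal sketch of the UNC approach (the share name \\fileserver\images is hypothetical):

```csharp
using System;
using System.IO;

class ListShare
{
    static void Main()
    {
        // Hypothetical UNC share; DirectoryInfo accepts local or UNC
        // paths, never an http:// URL.
        var di = new DirectoryInfo(@"\\fileserver\images");

        // Constructing a DirectoryInfo does not touch the disk; the path
        // is only resolved when you enumerate, so check Exists first.
        if (di.Exists)
        {
            foreach (FileInfo fi in di.GetFiles())
                Console.WriteLine(fi.Name);
        }
        else
        {
            Console.WriteLine("Share not reachable: " + di.FullName);
        }
    }
}
```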
You can't access a directory residing on the web with the DirectoryInfo class. Instead, use the WebRequest class to get a list from the URL and get the files from that list.
The problem is that HTTP does not define a standard way to present a directory listing. There are roughly two choices:
Parse the HTML retrieved through a WebRequest, but you won't get details like creation/modification time and owner;
Go with a different mechanism to retrieve the file details, such as FTP or a file share.
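A rough sketch of the first option (the URL is the asker's; the regex is a naive assumption about how the server's index page marks up its links, and real listings vary widely):

```csharp
using System;
using System.Net;
using System.Text.RegularExpressions;

class HttpListingSketch
{
    static void Main()
    {
        // Sketch only: assumes the server exposes an HTML directory
        // index at this URL; many servers disable directory browsing.
        string html;
        using (var client = new WebClient())
        {
            html = client.DownloadString("http://images.foo.com/222/");
        }

        // Pull the href targets out of the returned HTML; what you get
        // depends entirely on how the server formats its listing.
        foreach (Match m in Regex.Matches(html, "href\\s*=\\s*\"([^\"]+)\""))
        {
            Console.WriteLine(m.Groups[1].Value);
        }
    }
}
```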
Related
I'm using ASP.NET (C#).
I need to access some video files that are kept on another server:
say I'm on 111.222.33.33 and the files are kept in the 'repository' folder of the F: drive on 222.111.12.12.
Below is my code segment to pull the file name and path into an ASP.NET GridView so that I can download these files by clicking on the download link.
I'm able to download the files when they are on the same IP address.
But I'm having trouble with the file path. Need some help with the URL. Thanks.
String fileSearchPatern = "*.*";
DirectoryInfo directory = new DirectoryInfo(Server.MapPath(@"\\222.111.12.12\F:\repository\"));
if (gvSource == null)
{
gvSource = DisplayFilesInGridViewTwo();
}
DataRow gvRow;
FileInfo[] files = directory.GetFiles(fileSearchPatern, SearchOption.AllDirectories);
foreach (FileInfo fileInfo in files)
{
gvRow = gvSource.NewRow();
gvRow["Name"] = fileInfo.Name;
gvRow["FilePath"] = @"\\222.111.12.12\F:\repository\" + fileInfo.Name;
gvSource.Rows.Add(gvRow);
}
if (files.Length > 0)
{
this.GridView2.DataSource = gvSource;
this.GridView2.DataBind();
}
else
{
this.GridView2.DataSource = null;
this.GridView2.DataBind();
}
Not sure exactly, but this might help. Over the network you cannot use F:, you have to use the administrative share F$ instead. Try the following steps:
1- Check whether you can access the F drive over the network: press Windows + R and enter \\222.111.12.12\F$\repository
2- If not, check the network connection first, then check whether the folder is shared and whether you have been given permission to it.
3- If you want to write to this folder, you need to grant the Network Service account permission on it; if you only read, the existing permission should be enough.
How can I get a Word file from a server in C#? I use the following code:
static void Main(string[] args)
{
Word._Application application = new Word.Application();
object fileformat = Word.WdSaveFormat.wdFormatXMLDocument;
//
DirectoryInfo directory = new DirectoryInfo(@"http://www.sample.com/image/");
foreach (FileInfo file in directory.GetFiles("*.doc", SearchOption.AllDirectories))
{
if (file.Extension.ToLower() == ".doc")
{
object filename = file.FullName;
object newfilename = file.FullName.ToLower().Replace(".doc", ".docx");
Word._Document document = application.Documents.Open(filename);
document.Convert();
document.SaveAs(newfilename, fileformat);
document.Close();
document = null;
}
}
application.Quit();
application = null;
}
But when I use this code to get files from my local machine or desktop, it works fine.
Please tell me what's wrong.
You can't use DirectoryInfo with a URL.
By design, this class only takes a local (or mapped network) path in its constructor.
You need to use the System.Net.HttpWebRequest class to get the file from a URL. Since it's located on a server on the internet, the only way to retrieve the file is to download it via HTTP.
Edit:
Based on your comments, you are looking to process 1 million files on a server you have access to. There are many ways to handle this.
You can use a network path to the server, such as
var di = new DirectoryInfo(@"\\servername\path");
Or you can create your application as a C# Console Application and use a local path. This is what I call a utility. It would be the faster method since it processes everything locally and avoids network traffic.
var di = new DirectoryInfo(@"c:\your-folder");
Since you would run the C# console app directly on the server, the above would work.
DirectoryInfo is just an object that contains information about a directory entry in your file system. It doesn't download a file, which I presume is what you want to do.
The code example at http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.connection(v=vs.110).aspx is, I think, similar to what you want.
DirectoryInfo is for accessing local files or UNC paths. You cannot use it to access an HTTP-addressed page. You first need to download the file, e.g. using HttpWebRequest.
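A minimal download sketch along those lines (the URL and file name are hypothetical; WebClient is the simplest wrapper over HttpWebRequest):

```csharp
using System;
using System.IO;
using System.Net;

class DownloadSketch
{
    static void Main()
    {
        // Hypothetical URL; the document must be downloaded over HTTP
        // before Word (or any file-system API) can open it locally.
        string url = "http://www.sample.com/image/report.doc";
        string localPath = Path.Combine(Path.GetTempPath(),
                                        Path.GetFileName(new Uri(url).LocalPath));

        using (var client = new WebClient())
        {
            client.DownloadFile(url, localPath);
        }

        Console.WriteLine("Saved to " + localPath);
    }
}
```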
For an FTP path, say ftp://ftp.something.com/, I am able to list all directories and files with this code:
FtpWebRequest req = (FtpWebRequest)WebRequest.Create(url);
req.Method = WebRequestMethods.Ftp.ListDirectory;
//code to get response from ftp site and list all files and directories path in a list name name_list.
Now, for each path in name_list: if the path is a directory, I add it to a list named sub_list; else, if it is the path of some file (.txt, .pdf, .rar, .html, .tw and many more extensions), I add it to another list named final_list.
So far what I am able to do is :
foreach (string url in name_list)
{
    if (Regex.IsMatch(url, @".*?(\.[A-Za-z]{2,4}$)"))
        final_list.Add(url);   // extension found: treat as a file
    else
        sub_list.Add(url);     // no extension: treat as a directory
}
But this is not a reliable and robust way to achieve my goal.
Is there a better way to do this?
I would use the System.IO.Path class to determine whether it is a directory or a file. Specifically:
http://msdn.microsoft.com/en-us/library/system.io.path.getfilename.aspx
You can look at the file permissions to see whether it is a directory or not. Refer to the following example:
http://www.copyandwaste.com/posts/view/parsing-webrequestmethodsftplistdirectorydetails-and-listdirectory/
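A sketch of that approach, assuming a Unix-style FTP server where ListDirectoryDetails returns "ls -l"-style lines and a leading 'd' marks a directory (Windows FTP servers format their listings differently, so this parsing is an assumption, not a guarantee):

```csharp
using System;
using System.IO;
using System.Net;

class FtpDetailSketch
{
    static void Main()
    {
        // Same hypothetical host as in the question.
        var req = (FtpWebRequest)WebRequest.Create("ftp://ftp.something.com/");
        req.Method = WebRequestMethods.Ftp.ListDirectoryDetails;

        using (var resp = (FtpWebResponse)req.GetResponse())
        using (var reader = new StreamReader(resp.GetResponseStream()))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                // On Unix-style listings the first permission character
                // is 'd' for directories, '-' for regular files.
                bool isDirectory = line.StartsWith("d");
                Console.WriteLine((isDirectory ? "DIR  " : "FILE ") + line);
            }
        }
    }
}
```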
I'm trying to write a function in C# that gets a directory path as parameter and returns a dictionary where the keys are the files directly under that directory and the values are their last modification time.
This is easy to do with Directory.GetFiles() and then File.GetLastWriteTime(). However, this means that every file must be accessed, which is too slow for my needs.
Is there a way to do this while accessing just the directory? Does the file system even support this kind of requirement?
Edit, after reading some answers:
Thank you guys, you are all saying pretty much the same - use FileInfo object. Still, it is just as slow to use Directory.GetFiles() (or Directory.EnumerateFiles()) to get those objects, and I suspect that getting them requires access to every file. If the file system keeps last modification time of its files in the files themselves only, there can't be a way to extract that info without file access. Is this the case here? Do GetFiles() and EnumerateFiles() of DirectoryInfo access every file or get their info from the directory entry? I know that if I would have wanted to get just the file names, I could do this with the Directory class without accessing every file. But getting attributes seems trickier...
Edit, following henk's response:
it seems that it really is faster to use FileInfo Object. I created the following test:
static void Main(string[] args)
{
Console.WriteLine(DateTime.Now);
foreach (string file in Directory.GetFiles(@"\\169.254.78.161\dir"))
{
DateTime x = File.GetLastWriteTime(file);
}
Console.WriteLine(DateTime.Now);
DirectoryInfo dirInfo2 = new DirectoryInfo(@"\\169.254.78.161\dir");
var files2 = from f in dirInfo2.EnumerateFiles()
select f;
foreach (FileInfo file in files2)
{
DateTime x = file.LastWriteTime;
}
Console.WriteLine(DateTime.Now);
}
For about 800 files, I usually get something like:
31/08/2011 17:14:48
31/08/2011 17:14:51
31/08/2011 17:14:52
I didn't do any timings but your best bet is:
DirectoryInfo di = new DirectoryInfo(myPath);
FileInfo[] files = di.GetFiles();
I think all the FileInfo attributes are available in the directory file records so this should (could) require the minimum I/O.
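Building on that, the dictionary the question asks for can be filled in one enumeration pass (LINQ sketch; EnumerateFiles requires .NET 4, and myPath stands in for the directory parameter):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

class LastWriteSketch
{
    static Dictionary<string, DateTime> GetLastWrites(string myPath)
    {
        // EnumerateFiles streams the directory entries, and each
        // FileInfo's LastWriteTime is populated from those entries.
        return new DirectoryInfo(myPath)
            .EnumerateFiles()
            .ToDictionary(f => f.FullName, f => f.LastWriteTime);
    }

    static void Main()
    {
        // Hypothetical share from the question's test code.
        var lastWrites = GetLastWrites(@"\\169.254.78.161\dir");
        Console.WriteLine(lastWrites.Count + " files indexed");
    }
}
```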
The only other thing I can think of is using the FileInfo class. As far as I can see this might help you, or it might end up reading each file as well (read permissions are required).
We have a network folder that is the landing place for CSV files processed by different servers. We alter the extension of the CSV files to match the server that processed the file; for instance, a file named FooBar that was processed by Server1 would land in the network share with the name FooBar.Server_1.
The information I have available to me is the file name, FooBar, and the path to the share. I am not guaranteed the extension, as there are multiple servers sending CSV files to the network share. I am guaranteed that there will only be one FooBar file in the share.
Is there a method in .NET 2.0 that can be used to get the extension of the CSV armed only with the path and file name? Or do I need to craft my own?
Directory.GetFiles(NETWORK_FOLDER_PATH, "FooBar.*")[0] will give you the full path.
Path.GetExtension will give you the extension.
If you want to put it all together:
string extension = Path.GetExtension(
Directory.GetFiles(NETWORK_FOLDER_PATH, "FooBar.*")[0]);
This should do the trick:
string[] results = System.IO.Directory.GetFiles("\\\\sharepath\\here", "FooBar.*", System.IO.SearchOption.AllDirectories);
if(results.Length > 0) {
//found it
DoSomethingWith(results[0]);
} else {
// :(
}
DirectoryInfo di = new DirectoryInfo(yourPath);
FileInfo[] files = di.GetFiles("FooBar.*");
If you know the directory where the file will reside, you can use DirectoryInfo.GetFiles("searchPattern"), e.g. di.GetFiles("FooBar.*").