C# WebClient cannot upload large files - c#

I have a large file (500 MB) and my application cannot send it to my web page.
public static void UploadFile(string url, string fileName) {
    WebClient client = new WebClient();
    byte[] responseBinary = client.UploadFile(url, fileName);
    string response = Encoding.UTF8.GetString(responseBinary);
    MessageBox.Show(response);
}
It posts to a PHP file. As a check I tried
var_dump($_FILES);
var_dump($_REQUEST);
but both of them are empty.
When I upload a small file (8 KB), it shows up in $_FILES['file'].
What am I doing wrong?

You have to modify your server's php.ini to allow large files to be uploaded; these are the two directives you need to change:
upload_max_filesize = 900M
post_max_size = 900M
Also, remember to restart Apache, nginx, or whatever server you use after editing php.ini.
EDIT
If you don't have access to this kind of configuration on your host, you could try (it may or may not be allowed) modifying these settings in an .htaccess file at the root of your site by adding the following to it:
php_value upload_max_filesize 900M
php_value post_max_size 900M
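If the upload still fails after raising the limits, it helps to see what the server actually returned instead of letting the exception bubble up. A minimal diagnostic sketch on the C# side (the helper name is hypothetical, not from the original code) that catches the WebException and reads the error body:
using System;
using System.IO;
using System.Net;
using System.Text;
using System.Windows.Forms;

// Hypothetical diagnostic helper, not part of the original code.
public static void UploadFileWithDiagnostics(string url, string fileName)
{
    using (var client = new WebClient())
    {
        try
        {
            byte[] responseBinary = client.UploadFile(url, fileName);
            MessageBox.Show(Encoding.UTF8.GetString(responseBinary));
        }
        catch (WebException ex)
        {
            // Read the body of the error response (for example a 413 from the
            // server or a PHP warning page) so the real failure reason is visible.
            if (ex.Response != null)
            {
                using (var reader = new StreamReader(ex.Response.GetResponseStream()))
                {
                    MessageBox.Show("Upload failed: " + reader.ReadToEnd());
                }
            }
            else
            {
                MessageBox.Show("Upload failed: " + ex.Message);
            }
        }
    }
}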

Related

Using webclient to download images from deployed website

I deployed a website on IIS running at localhost/xxx/xxx.aspx. On my WPF side, I download a text file using WebClient from the localhost server and save it in my WPF app folder.
This is how I do it:
protected void DownloadData(string strFileUrlToDownload)
{
    WebClient client = new WebClient();
    byte[] myDataBuffer = client.DownloadData(strFileUrlToDownload);

    MemoryStream storeStream = new MemoryStream();
    storeStream.SetLength(myDataBuffer.Length);
    storeStream.Write(myDataBuffer, 0, (int)storeStream.Length);
    storeStream.Flush();

    string currentpath = System.IO.Directory.GetCurrentDirectory() + @"\Folder";
    using (FileStream file = new FileStream(currentpath, FileMode.Create, System.IO.FileAccess.ReadWrite))
    {
        byte[] bytes = new byte[storeStream.Length];
        storeStream.Read(bytes, 0, (int)storeStream.Length);
        file.Write(myDataBuffer, 0, (int)storeStream.Length);
        storeStream.Close();
    }

    //The below GetString call gets the data in raw format so it can be manipulated as required
    string download = Encoding.ASCII.GetString(myDataBuffer);
}
This works by writing bytes and saving them. But how do I download multiple image files and save them in my WPF app folder? I have a URL like localhost/websitename/folder/designs/, and under this URL there are many images. How do I download all of them and save them in the WPF app folder?
Basically I want to download the contents of the folder, where the contents are actually images.
First, the WebClient class already has a method for this. Use something like client.DownloadFile(remoteUrl, localFilePath).
See this link:
WebClient.DownloadFile Method (String, String)
Secondly, you will need to index the files you want to download on the server somehow. You can't just get a directory listing over HTTP and then loop through it. The web server will need to be configured to enable directory listing, or you will need a page to generate a directory listing. Then you will need to download the results of that page as a string using WebClient.DownloadString and parse it. A simple solution would be an aspx page that outputs a plaintext list of files in the directory you want to download.
Finally, in the code you posted you're saving every single file you download as a file named "Folder". You need to generate a unique filename for each file you want to download. When you're looping through the files you want to download, use something like:
string localFilePath = Path.Combine("MyDownloadFolder", imageName);
where imageName is a unique filename (with file extension) for that file.
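Putting those pieces together, here is a minimal sketch of the overall loop, assuming a hypothetical filelist.aspx page that returns one image name per line (the page name and folder names are assumptions, not from the question):
using System;
using System.IO;
using System.Net;

// Sketch only: "filelist.aspx" and the folder names below are assumptions.
WebClient client = new WebClient();
string baseUrl = "http://localhost/websitename/folder/designs/";
string listing = client.DownloadString(baseUrl + "filelist.aspx");

Directory.CreateDirectory("MyDownloadFolder");
foreach (string imageName in listing.Split(new[] { "\r\n", "\n" }, StringSplitOptions.RemoveEmptyEntries))
{
    // Build a unique local path for each remote image and download it straight to disk.
    string localFilePath = Path.Combine("MyDownloadFolder", imageName);
    client.DownloadFile(baseUrl + imageName, localFilePath);
}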

Upload to PHP server from c sharp client application

Currently I have a C# application (client app) and a web application written in PHP. I want to transfer some files whenever a particular action is performed on the client side. Here is the client code to upload the file to the PHP server:
private void button1_Click(object sender, EventArgs e)
{
    System.Net.WebClient Client = new System.Net.WebClient();
    Client.Headers.Add("Content-Type", "binary/octet-stream");
    byte[] result = Client.UploadFile("http://localhost/project1/upload.php", "POST",
        @"C:\test\a.jpg");
    string s = System.Text.Encoding.UTF8.GetString(result, 0, result.Length);
}
Here is the upload.php file that moves the file:
$uploads_dir = './files/'; //Directory to save the file that comes from the client application.
foreach ($_FILES["pictures"]["error"] as $key => $error) {
    if ($error == UPLOAD_ERR_OK) {
        $tmp_name = $_FILES["pictures"]["tmp_name"][$key];
        $name = $_FILES["pictures"]["name"][$key];
        move_uploaded_file($tmp_name, "$uploads_dir/$name");
    }
}
I'm not getting any errors from the above code, but it does not seem to be working. Why is that? Am I missing something?
Your current PHP code is for handling multiple file uploads, but your C# code is only uploading one file.
You need to modify your PHP code somewhat, removing the foreach loop:
<?php
$uploads_dir = './files'; //Directory to save the file that comes from client application.
if ($_FILES["file"]["error"] == UPLOAD_ERR_OK) {
$tmp_name = $_FILES["file"]["tmp_name"];
$name = $_FILES["file"]["name"];
move_uploaded_file($tmp_name, "$uploads_dir/$name");
}
?>
You also need to ensure that the ./files directory exists.
I have tested the above PHP code with your C# code and it worked perfectly.
For more information on handling file uploads, refer to the PHP documentation.
For more information on uploading multiple files with C# and PHP, here are some helpful links:
Upload files with HTTPWebrequest (multipart/form-data)
Use Arrays in HTML Form Variables
PHP: Uploading multiple files
If you want something simple for uploading multiple files, you can just upload one file at a time to upload.php in a C# loop, as in the sketch below.
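A minimal sketch of that loop, with placeholder file paths (not from the original question):
using System.Net;
using System.Text;

// Placeholder paths; replace with the files your client needs to send.
string[] filesToSend = { @"C:\test\a.jpg", @"C:\test\b.jpg" };
using (var client = new WebClient())
{
    foreach (string path in filesToSend)
    {
        // Each iteration posts exactly one file, matching the single-file upload.php above.
        byte[] result = client.UploadFile("http://localhost/project1/upload.php", "POST", path);
        string response = Encoding.UTF8.GetString(result);
    }
}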
Your PHP code seems right; however, you try to access the file using the "pictures" key of the $_FILES global, and that key does not seem to be specified anywhere in your C# code. I don't know how to set it from WebClient, though. You could check how the file was named on the PHP side by doing a print_r or var_dump of your $_FILES global, or by using the array_keys function.
Regards
Edit: I found this link that could help you to add a "name" to your uploaded file:
http://www.bratched.com/en/home/dotnet/69-uploading-multiple-files-with-c.html
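For reference, WebClient.UploadFile always posts the file under the form field name "file". If you need a different field name (such as pictures[] to match the original PHP array code), one option is HttpClient with MultipartFormDataContent. A minimal sketch, where the field name and file path are assumptions:
using System.Net.Http;
using System.Threading.Tasks;

// Sketch only: the field name "pictures[]" and the file path are assumptions.
static async Task UploadWithFieldNameAsync()
{
    using (var client = new HttpClient())
    using (var content = new MultipartFormDataContent())
    {
        var fileBytes = System.IO.File.ReadAllBytes(@"C:\test\a.jpg");
        // The second argument is the form field name, the third is the file name sent to PHP.
        content.Add(new ByteArrayContent(fileBytes), "pictures[]", "a.jpg");
        HttpResponseMessage response = await client.PostAsync("http://localhost/project1/upload.php", content);
        string body = await response.Content.ReadAsStringAsync();
    }
}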

Can't connect to FTP: (553) File name not allowed

I need to FTP a file to a directory. In .NET I have to use a file in the destination folder to create a connection, so I manually put Blank.dat on the server using FTP. I checked the access (ls -l) and it is -rw-r--r--. But when I attempt to connect to the FTP folder I get: "The remote server returned an error: (553) File name not allowed" back from the server. The research I have done says that this may arise from a permissions issue, but as I have said, I have permission to view the file and can run ls from the folder. What other reasons could cause this issue, and is there a way to connect to the folder without having to specify a file?
byte[] buffer;
Stream reqStream;
FileStream stream;
FtpWebResponse response;

FtpWebRequest request = (FtpWebRequest)FtpWebRequest.Create(new Uri(string.Format("ftp://{0}/{1}", SRV, DIR)));
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential(UID, PASS);
request.UseBinary = true;
request.Timeout = 60000 * 2;

for (int fl = 0; fl < files.Length; fl++)
{
    request.KeepAlive = (files.Length != fl);

    stream = File.OpenRead(Path.Combine(dir, files[fl]));
    reqStream = request.GetRequestStream();

    buffer = new byte[4096 * 2];
    int nRead = 0;
    while ((nRead = stream.Read(buffer, 0, buffer.Length)) != 0)
    {
        reqStream.Write(buffer, 0, nRead);
    }

    stream.Close();
    reqStream.Close();

    response = (FtpWebResponse)request.GetResponse();
    response.Close();
}
Although this is a reply to an old post, it might help someone.
When you create your FTP URL, make sure you are not including the default directory for that login.
For example, this was the path I was specifying, and I was getting the 553 File name not allowed exception:
ftp://myftpip/gold/central_p2/inbound/article_list/jobs/abc.txt
The login I used had the default directory gold/central_p2, so the supplied URL became invalid because it was trying to locate the whole path inside the default directory. I amended my URL string accordingly and got rid of the exception.
My amended URL looked like:
ftp://myftpip/inbound/article_list/jobs/abc.txt
Thanks,
Sab
This may help for a Linux FTP server.
Linux FTP servers, unlike IIS, don't have a common FTP root directory. Instead, when you log on to the FTP server under some user's credentials, that user's home directory is used as the FTP root. So the FTP directory hierarchy starts from /root/ for the root user and from /home/username for others.
So, if you need to address a file relative to the file system root rather than the user account's home directory, add an extra / after the server name. The resulting URL will look like:
ftp://servername.net//var/lalala
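A minimal sketch of building such a request, using the example server name above (the credentials are placeholders):
using System.Net;

// The double slash after the host makes the path absolute on the server's
// file system instead of relative to the login's home directory.
FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://servername.net//var/lalala");
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential("username", "password"); // placeholders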
You must be careful with names and paths:
string FTP_Server = @"ftp://ftp.computersoft.com//JohnSmith/";
string myFile = "text1.txt";
string myDir = @"D:/Texts/Temp/";
If you are sending a file called text1.txt located at D:/Texts/Temp/ to ftp.computersoft.com/JohnSmith,
then
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(FTP_Server+myFile);
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential(FTP_User, FTP_Password);
StreamReader sourceStream = new StreamReader(myDir + myFile);
byte[] fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd());
sourceStream.Close();
Stream requestStream = request.GetRequestStream();
requestStream.Write(fileContents, 0, fileContents.Length);
requestStream.Close();
Notice that at one point you use as the destination
ftp://ftp.computersoft.com//JohnSmith/text1.txt
which contains not only the directory but also the new file name on the FTP server (which in general can be different from the name of the file on your hard drive),
and at the other point you use as the source
D:/Texts/Temp/text1.txt
Your directory has an access limit.
Delete your directory and then create it again with this code:
//create folder
FtpWebRequest request = (FtpWebRequest)FtpWebRequest.Create("ftp://mpy1.vvs.ir/Subs/sub2");
request.Method = WebRequestMethods.Ftp.MakeDirectory;
request.Credentials = new NetworkCredential(username, password);
request.UsePassive = true;
request.UseBinary = true;
request.KeepAlive = true;
using (var resp = (FtpWebResponse)request.GetResponse())
{
}
I saw something similar to this a while back; it turned out that I was trying to connect to an internal IIS FTP server that was secured using Active Directory.
In my network credentials I was using new NetworkCredential(@"domain\user", "password"); and that was failing. Switching to new NetworkCredential("user", "password", "domain"); worked for me.
I hope this will be helpful for someone.
If you are using a Linux server, replace your request path from
FtpWebRequest req = (FtpWebRequest)WebRequest.Create(@"ftp://yourdomain.com//yourpath/" + filename);
to
FtpWebRequest req = (FtpWebRequest)WebRequest.Create(@"ftp://yourdomain.com//public_html/folderpath/" + filename);
The folder path is what you see on the server (e.g. in cPanel).
Although it's an older post, I thought it would be good to share my experience. I faced the same problem and solved it myself. The main cause of this problem (assuming your permissions are correct) is an error in the path, i.e. your FTP path is wrong. Without seeing your path it is impossible to say what's wrong, but one must remember these things:
a. Unlike a browser, FTP doesn't accept some special characters like ~
b. If you have several user accounts under the same IP, do not include the username or the word "home" in the path
c. Don't forget to include "public_html" in the path (normally you need to access the contents of public_html only), otherwise you may end up in a bottomless pit
Another reason for this error could be that the FTP server is case sensitive. That took me a good while to figure out.
I had this problem when I tried to write to the FTP site's root directory programmatically. When I repeated the operation manually, the FTP server automatically rerouted me to a sub-directory and completed the write operation. I then changed my code to prefix the target filename with the sub-directory, and the operation was successful.
Mine was as simple as a file name collision. A previous file hadn't been sent to an archive folder, so when we tried to send it again we got the 553 because it wouldn't overwrite the existing file.
Check disk space on the remote server first.
I had the same issue and found out it was because the remote server I was attempting to upload files to had run out of disk space on the partition or file system.
To whom it concerns...
I'd been stuck with this problem for a long time.
I tried to upload a file onto my web server like this:
Say that my domain is www.mydomain.com.
I wanted to upload to a subdomain, order.mydomain.com, so I used:
FtpWebRequest ftpReq = (FtpWebRequest)WebRequest.Create($@"ftp://order.mydomain.com//uploads/{FileName}");
After many tries and getting this 553 error, I found out that I had to make the FTP request refer to the main domain, not the subdomain, and include the subdomain as a subfolder (which is normally created when you create a subdomain).
As I had created my subdomain's subfolder outside public_html (at the root), I changed the FTP request to:
FtpWebRequest ftpReq = (FtpWebRequest)WebRequest.Create($@"ftp://www.mydomain.com//order.mydomain.com//uploads/{FileName}");
And it finally worked.

WebClient.DownloadFile vs. WebClient.DownloadData

I am using WebClient.DownloadFile to download a small executable file from the internet. This method is working very well. However, I would now like to download this executable file into a byte array rather than onto my hard drive. I did some reading and came across the WebClient.DownloadData method. The problem that I am having with the DownloadData method is that rather than downloading my file, my code is downloading the HTML data behind my file's download page.
I have tried using dozens of sites - each brings me the same issue. Below is the code I am using.
// Create a new instance of the System.Net 'WebClient'
System.Net.WebClient client = new System.Net.WebClient();
// Download URL
Uri uri = new Uri("http://www35.multiupload.com:81/files/4D7B4D2BFC3F1A9F765A433BA32ED2C5883D0CE133154A0FDB7E7786547A3165DA62393141C4AF8FF36C75222566CF3EB64AF6FBCFC02099BB209C891529CF7B90C83D9C63D39D989CBB8ECE6DE2B83B/Project1.exe");
byte[] dbytes = client.DownloadData(uri);
MessageBox.Show(dbytes.Length.ToString()); // Not the size of my file
Keep in mind that I am attempting to download the data of an executable file into a byte array.
Thank you for any help,
Evan
You are attempting to download a file using an expired token url. See below:
URL: http://www35.multiupload.com:81/files/4D7B4D2BFC3F1A9F765A433BA32ED2C5883D0CE133154A0FDB7E7786547A3165DA62393141C4AF8FF36C75222566CF3EB64AF6FBCFC02099BB209C891529CF7B90C83D9C63D39D989CBB8ECE6DE2B83B/Project1.exe
Server: www35
Token:
4D7B4D2BFC3F1A9F765A433BA32ED2C5883D0CE133154A0FDB7E7786547A3165DA62393141C4AF8FF36C75222566CF3EB64AF6FBCFC02099BB209C891529CF7B90C83D9C63D39D989CBB8ECE6DE2B83B
You can't just wait for the timer to end and copy the direct link; it's a "token" link. It will only work for a specified period of time before redirecting you back to the download page (which is why you are getting HTML instead of binary data).
Workaround
You will have to download Multiupload's HTML and parse the direct download link from the HTML source code. This is the only sure-fire way of getting an up-to-date token URL.
As @Dark Slipstream said, you're attempting to download a file using an expired token URL.
Here's how to get the new URL:
System.Net.WebClient client = new System.Net.WebClient();
// Download URL
Uri uri = new Uri("http://www.multiupload.com/39QMACX7XS");
byte[] dbytes = client.DownloadData(uri);
string responseStr = System.Text.Encoding.ASCII.GetString(dbytes);
HtmlAgilityPack.HtmlDocument doc = new HtmlAgilityPack.HtmlDocument();
doc.LoadHtml(responseStr);
string urlToDownload = doc.DocumentNode.SelectNodes("//a[contains(@href,'files/')]")[0].Attributes["href"].Value;
byte[] data = client.DownloadData(urlToDownload);
int length = data.Length;
I'm not handling the exceptions here.

Help Needed for parsing FTP files list in c#

I am using this code for getting the list of all the files in a directory,
where webRequestUrl = something.com/directory/
FtpWebRequest fwrr = (FtpWebRequest)FtpWebRequest.Create(new Uri("ftp://" + webRequestUrl));
fwrr.Credentials = new NetworkCredential(username, password);
fwrr.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
StreamReader srr = new StreamReader(fwrr.GetResponse().GetResponseStream());
string str = srr.ReadLine();
ArrayList strList = new ArrayList();
while (str != null)
{
    strList.Add(str);
    str = srr.ReadLine();
}
but I am not getting the list of files; instead I am getting some HTML document type lines.
This FTP server is Windows based, while the same code works fine against a Unix server.
Please help.
Thanks.
It works for me when the FTP is on an internal machine and I use ftp://192.168.0.155. If I try that in IE I get the same HTML result as yours.
I doubt it's happening because of the URL. Can you try replacing the URL with the IP address (just a wild guess)? Even if you are getting HTML, you can strip the unnecessary parts and parse the files.
I even tried with ftp://sub.a.com/somefolder and it worked for me. It seems the browser wraps HTML around the FTP response, because I get different HTML when I open the FTP site in IE and Chrome.
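If parsing the detailed listing across different server types becomes a problem, one alternative (a sketch, not from the original answer) is WebRequestMethods.Ftp.ListDirectory, which returns plain file names, one per line:
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;

// Same placeholders as in the question: webRequestUrl, username, password.
string webRequestUrl = "something.com/directory/";
FtpWebRequest fwrr = (FtpWebRequest)WebRequest.Create(new Uri("ftp://" + webRequestUrl));
fwrr.Credentials = new NetworkCredential("username", "password");
fwrr.Method = WebRequestMethods.Ftp.ListDirectory; // names only, no per-server detail format

var fileNames = new List<string>();
using (var response = (FtpWebResponse)fwrr.GetResponse())
using (var srr = new StreamReader(response.GetResponseStream()))
{
    string str;
    while ((str = srr.ReadLine()) != null)
    {
        fileNames.Add(str);
    }
}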
