I have tried to implement a REST WCF service in order to explore the difference between the PUT and POST verbs. I upload a file to a location using the service.
The service implementation is as follows:
[OperationContract]
[WebInvoke(UriTemplate = "/UploadFile", Method = "POST")]
void UploadFile(Stream fileContents);
public void UploadFile(Stream fileContents)
{
byte[] buffer = new byte[32768];
MemoryStream ms = new MemoryStream();
int bytesRead, totalBytesRead = 0;
do
{
bytesRead = fileContents.Read(buffer, 0, buffer.Length);
totalBytesRead += bytesRead;
ms.Write(buffer, 0, bytesRead);
} while (bytesRead > 0);
using (FileStream fs = File.OpenWrite(@"C:\temp\test.txt"))
{
ms.WriteTo(fs);
}
ms.Close();
}
Client code is as following:
HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create("http://localhost:1922/EMPRESTService.svc/UploadFile");
request.Method = "POST";
request.ContentType = "text/plain";
byte[] fileToSend = File.ReadAllBytes(@"C:\TEMP\log.txt"); // txtFileName contains the name of the file to upload.
request.ContentLength = fileToSend.Length;
using (Stream requestStream = request.GetRequestStream())
{
// Send the file as body request.
requestStream.Write(fileToSend, 0, fileToSend.Length);
//requestStream.Close();
}
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
Console.WriteLine("HTTP/{0} {1} {2}", response.ProtocolVersion, (int)response.StatusCode, response.StatusDescription);
Console.ReadLine();
The file is being uploaded and the response status code is returned as "200 OK". The status code is the same whether or not the file already exists in the upload location.
I have changed the REST verb to PUT and the status code is the same as above.
Could anybody explain how I can identify the differences between the verbs in this context? I was not able to simulate sending continuous requests from the client code. If the behaviour differs when doing so, could anybody help me modify the client code so that it sends several requests in a row?
The POST verb is used when you are creating a new resource (a file in your case), and repeated operations would create multiple resources on the server. This verb would make sense if uploading a file with the same name multiple times created multiple files on the server.
The PUT verb is used when you are updating an existing resource or creating a new resource at a predefined id. Repeated operations recreate or update the same resource on the server, so this verb would make sense if uploading a file with the same name a second or third time overwrote the previously uploaded file.
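A minimal sketch of how a contract could expose both behaviours side by side (the /Files URI templates, the fileName parameter, and the overwrite logic are illustrative assumptions, not your actual service):
// Hypothetical contract: POST creates a fresh resource each call, PUT targets a fixed URI.
[OperationContract]
[WebInvoke(UriTemplate = "/Files", Method = "POST")]
void CreateFile(Stream fileContents);
[OperationContract]
[WebInvoke(UriTemplate = "/Files/{fileName}", Method = "PUT")]
void PutFile(string fileName, Stream fileContents);
public void CreateFile(Stream fileContents)
{
    // Each POST writes a new, server-named file: repeating the call adds more resources.
    using (FileStream fs = File.Create(Path.Combine(@"C:\temp", Guid.NewGuid() + ".txt")))
    {
        fileContents.CopyTo(fs);
    }
}
public void PutFile(string fileName, Stream fileContents)
{
    // Each PUT to the same URI rewrites the same file: repeating the call is idempotent.
    using (FileStream fs = File.Create(Path.Combine(@"C:\temp", fileName)))
    {
        fileContents.CopyTo(fs);
    }
}
Sending the same client request in a simple loop against each endpoint then makes the difference visible: repeated POSTs leave several files under C:\temp, while repeated PUTs to /Files/log.txt keep overwriting a single file.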
I have a small WAV sound file from which I want to get the text, so I used the Azure Speech to Text API to test it.
First I converted the audio file, as recommended in their documentation, to PCM, mono, 16K sample rate.
Then I used this C# code from the documentation example here to upload the file and get the result.
HttpWebRequest request = null;
request = (HttpWebRequest)HttpWebRequest.Create("https://speech.platform.bing.com/speech/recognition/interactive/cognitiveservices/v1?language=en-US&format=detailed");
request.SendChunked = true;
request.Accept = @"application/json;text/xml";
request.Method = "POST";
request.ProtocolVersion = HttpVersion.Version11;
request.ContentType = @"audio/wav; codec=audio/pcm; samplerate=16000";
request.Headers["Ocp-Apim-Subscription-Key"] = "my key";
// Send an audio file by 1024 byte chunks
using (FileStream fs = new FileStream("D:/b.wav", FileMode.Open, FileAccess.Read))
{
/*
* Open a request stream and write 1024 byte chunks in the stream one at a time.
*/
byte[] buffer = null;
int bytesRead = 0;
using (Stream requestStream = request.GetRequestStream())
{
/*
* Read 1024 raw bytes from the input audio file.
*/
buffer = new Byte[checked((uint)Math.Min(1024, (int)fs.Length))];
while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) != 0)
{
requestStream.Write(buffer, 0, bytesRead);
}
// Flush
requestStream.Flush();
}
}
string responseString;
Console.WriteLine("Response:");
using (WebResponse response = request.GetResponse())
{
Console.WriteLine(((HttpWebResponse)response).StatusCode);
using (StreamReader sr = new StreamReader(response.GetResponseStream()))
{
responseString = sr.ReadToEnd();
}
Console.WriteLine(responseString);
Console.ReadLine();
}
I also tried using the cURL tool and also wrote it in Java, as I thought that maybe the problem was with the programming language I was using and that I was not uploading the file correctly.
This is the link to the sound file I want to convert to text: here.
So now I need help figuring out whether the problem comes from the format of the sound file, from my code not uploading it correctly, or from the API itself not being accurate enough.
I tried IBM Speech to Text and it got all the text with no problem.
I am currently using the free trial of the Azure Speech to Text API and want to figure out where the problem comes from, so if anyone has experience with this API it would help me decide whether to keep working with it or not.
Update
I want to make clear that I am not getting any error; I just get an incomplete result for the sound file I upload. For example, at the end of the sound the speaker says "What is up with that", but the result I got from Azure is just the first sentence, which is "I say that like it's a bad thing." I also uploaded another sound file which contains only "What is up with that" (check it here), and it just gives me an empty result like this:
{"RecognitionStatus":"NoMatch","Offset":17300000,"Duration":0}
So all I want to know is whether this is normal for the Azure Speech to Text API, or whether the problem is with my code or with the sound file. That is what I want an answer to.
When I tested another API (IBM, for example) on those files, it worked.
Thanks in advance.
I am working on an ASP.NET Framework 2.0 application. On a particular page I provide a link to the user. Clicking this link opens a window with another aspx page. That page sends an HTTP request to a third-party URL which points to a file (like mirror URLs for downloading a file from the cloud). The HTTP response is sent back to the user, using Response.Write, on the very first page from which the user clicked the link.
Now, the problem I am facing is that if the file size is small it works fine. But if the file is large (i.e., more than 1 GB), my application waits until the whole file is downloaded from the URL. I have tried using Response.Flush() to send the data to the user chunk by chunk, but the user is still unable to use the application because the worker process is busy fetching the stream of data from the third-party URL.
Is there any way large files can be downloaded asynchronously, so that my pop-up window finishes its execution (with the download still in progress) and the user can do other activities in the application in parallel?
Thanks,
Suvodeep
Use WebClient to read the remote file. Instead of downloading it first, you can take the Stream from the WebClient, read it in a while() loop, and push the bytes from the WebClient stream into the Response stream. This way you will be downloading and uploading at the same time.
HttpWebRequest example:
private void WriteFileInDownloadDirectly()
{
//Create a stream for the file
Stream stream = null;
//This controls how many bytes to read at a time and send to the client
int bytesToRead = 10000;
// Buffer to read bytes in chunk size specified above
byte[] buffer = new byte[bytesToRead];
// The number of bytes read
try
{
//Create a WebRequest to get the file
HttpWebRequest fileReq = (HttpWebRequest)HttpWebRequest.Create("Remote File URL");
//Create a response for this request
HttpWebResponse fileResp = (HttpWebResponse)fileReq.GetResponse();
//Get the Stream returned from the response
stream = fileResp.GetResponseStream();
// prepare the response to the client. resp is the client Response
var resp = HttpContext.Current.Response;
//Indicate the type of data being sent
resp.ContentType = "application/octet-stream";
//Name the file
resp.AddHeader("Content-Disposition", $"attachment; filename=\"{ Path.GetFileName("Local File Path - can be fake") }\"");
resp.AddHeader("Content-Length", fileResp.ContentLength.ToString());
int length;
do
{
// Verify that the client is connected.
if (resp.IsClientConnected)
{
// Read data into the buffer.
length = stream.Read(buffer, 0, bytesToRead);
// and write it out to the response's output stream
resp.OutputStream.Write(buffer, 0, length);
// Flush the data
resp.Flush();
//Clear the buffer
buffer = new byte[bytesToRead];
}
else
{
// cancel the download if client has disconnected
length = -1;
}
} while (length > 0); //Repeat until no data is read
}
finally
{
if (stream != null)
{
//Close the input stream
stream.Close();
}
}
}
WebClient Stream reading:
using (WebClient client = new WebClient())
{
Stream largeFileStream = client.OpenRead("My Address");
}
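To complete that snippet, here is a minimal sketch of the while() loop described above, relaying the WebClient stream into the client response (the buffer size and the use of HttpContext.Current.Response are assumptions):
using (WebClient client = new WebClient())
using (Stream largeFileStream = client.OpenRead("My Address"))
{
    var resp = HttpContext.Current.Response;
    resp.ContentType = "application/octet-stream";
    byte[] buffer = new byte[10000];
    int read;
    // Push each chunk to the client as soon as it arrives from the remote server,
    // so reading from the source and writing to the user overlap.
    while (resp.IsClientConnected && (read = largeFileStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        resp.OutputStream.Write(buffer, 0, read);
        resp.Flush();
    }
}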
I am trying to upload a file to an FTP folder, but I am getting the following error.
The remote server returned an error: (550) File unavailable (e.g.,
file not found, no access)
I am using the following sample to test this:
// Get the object used to communicate with the server.
string path = HttpUtility.UrlEncode("ftp://host:port//01-03-2017/John, Doe S. M.D/file.wav");
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(path);
request.Method = WebRequestMethods.Ftp.UploadFile;
// This example assumes the FTP site uses anonymous logon.
request.Credentials = new NetworkCredential("user", "password");
// Copy the contents of the file to the request stream.
StreamReader sourceStream = new StreamReader(@"localpath\example.wav");
byte[] fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd());
sourceStream.Close();
request.ContentLength = fileContents.Length;
Stream requestStream = request.GetRequestStream();
requestStream.Write(fileContents, 0, fileContents.Length);
requestStream.Close();
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
Console.WriteLine("Upload File Complete, status {0}", response.StatusDescription);
response.Close();
I am able to upload files to the parent folder 01-03-2017 but not to the target folder ROLLINS, SETH S. M.D, which clearly has special characters in it.
I am able to upload files using FileZilla.
I have tried HttpUtility.UrlEncode but that didn't help.
Thanks for your time and help.
You need to encode the spaces (and maybe commas) in the URL path, like:
string path =
"ftp://host:port/01-03-2017/" +
HttpUtility.UrlEncode("John, Doe S. M.D") + "/file.wav";
Effectively, you get:
ftp://host:port/01-03-2017/John%2c+Doe+S.+M.D/file.wav
Use something like this:
string path = HttpUtility.UrlEncode("ftp://96.31.95.118:2121//01-03-2017//ROLLINS, SETH S. M.D//30542_3117.wav");
Or you can form a Uri using the following code and pass it to WebRequest:
var path = new Uri("ftp://96.31.95.118:2121//01-03-2017//ROLLINS, SETH S. M.D//30542_3117.wav");
The code works in a C# console application but did not work in a Web API action. I could not manage to find the reason.
So I have used a free library for the same.
I have used the FluentFTP library, available through NuGet.
Posting the sample code from one of the examples available here:
using System;
using System.IO;
using System.Net;
using FluentFTP;
namespace Examples {
public class OpenWriteExample {
public static void OpenWrite() {
using (FtpClient conn = new FtpClient()) {
conn.Host = "localhost";
conn.Credentials = new NetworkCredential("ftptest", "ftptest");
using (Stream ostream = conn.OpenWrite("01-03-2017/John, Doe S. M.D/file.wav")) {
try {
// ostream.Position is incremented according to the writes you perform
}
finally {
ostream.Close();
}
}
}
}
}
}
Again, if the file is a binary file, StreamReader should not be used as explained here.
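As a rough sketch of the binary-safe variant (the host, port, credentials, and paths are the same placeholders as above), the upload could read raw bytes instead of going through a StreamReader:
// Read the local file as raw bytes so binary content such as a .wav is not corrupted.
byte[] fileContents = File.ReadAllBytes(@"localpath\example.wav");
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(
    "ftp://host:port/01-03-2017/" + HttpUtility.UrlEncode("John, Doe S. M.D") + "/file.wav");
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential("user", "password");
request.ContentLength = fileContents.Length;
using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(fileContents, 0, fileContents.Length);
}
using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
{
    Console.WriteLine("Upload File Complete, status {0}", response.StatusDescription);
}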
I have a Windows service which uploads files to another website that processes them. The problem is that with small files it works fine and gets a response back, but with large files (about 6 minutes to process) it waits forever.
Here is part of the external website's POST method code:
try
{
...
LogResults();
return string.Empty;
}
catch (Exception e)
{
return e.Message;
}
The problem is that I can see the logs even for large files, which means the website is always returning a value, but for large files my Windows service never receives the response.
And here is the code from windows service
var valuesp = new NameValueCollection
{
{ "AccountId", datafeed.AccountId }
};
byte[] resultp = UploadHelper.UploadFiles(url, uploadFiles, valuesp);
response = Encoding.Default.GetString(resultp);
The UploadFiles method returns a value for small files, but waits forever for large ones.
Here is complete code of UploadFiles
public static byte[] UploadFiles(string address, IEnumerable<UploadFile> files, NameValueCollection values)
{
var request = WebRequest.Create(address);
request.Timeout = System.Threading.Timeout.Infinite; //3600000; // 60 minutes
request.Method = "POST";
var boundary = "---------------------------" +
DateTime.Now.Ticks.ToString("x", NumberFormatInfo.InvariantInfo);
request.ContentType = "multipart/form-data; boundary=" + boundary;
boundary = "--" + boundary;
using (var requestStream = request.GetRequestStream())
{
// Write the values
if (values != null)
{
foreach (string name in values.Keys)
{
var buffer = Encoding.ASCII.GetBytes(boundary + Environment.NewLine);
requestStream.Write(buffer, 0, buffer.Length);
buffer =
Encoding.ASCII.GetBytes(string.Format("Content-Disposition: form-data; name=\"{0}\"{1}{1}", name,
Environment.NewLine));
requestStream.Write(buffer, 0, buffer.Length);
buffer = Encoding.UTF8.GetBytes(values[name] + Environment.NewLine);
requestStream.Write(buffer, 0, buffer.Length);
}
}
// Write the files
if (files != null)
{
foreach (var file in files)
{
var buffer = Encoding.ASCII.GetBytes(boundary + Environment.NewLine);
requestStream.Write(buffer, 0, buffer.Length);
buffer =
Encoding.UTF8.GetBytes(
string.Format("Content-Disposition: form-data; name=\"{0}\"; filename=\"{1}\"{2}", file.Name,
file.Filename, Environment.NewLine));
requestStream.Write(buffer, 0, buffer.Length);
buffer =
Encoding.ASCII.GetBytes(string.Format("Content-Type: {0}{1}{1}", file.ContentType,
Environment.NewLine));
requestStream.Write(buffer, 0, buffer.Length);
requestStream.Write(file.Stream, 0, file.Stream.Length);
buffer = Encoding.ASCII.GetBytes(Environment.NewLine);
requestStream.Write(buffer, 0, buffer.Length);
}
}
var boundaryBuffer = Encoding.ASCII.GetBytes(boundary + "--");
requestStream.Write(boundaryBuffer, 0, boundaryBuffer.Length);
}
using (var response = request.GetResponse())
using (var responseStream = response.GetResponseStream())
using (var stream = new MemoryStream())
{
responseStream.CopyTo(stream);
return stream.ToArray();
}
}
What am I doing wrong here?
EDIT: Locally it works even for 7-8 minutes of processing, but in the live environment it doesn't. Could it be related to the main app's IIS settings? Could it be related to the Windows service server's settings?
EDIT 2: Remote server web.config httpRuntime settings
<httpRuntime enableVersionHeader="false" maxRequestLength="300000" executionTimeout="12000" targetFramework="4.5" />
The problem was with the Azure Load Balancer, which has its Idle Timeout set to 4 minutes by default. A newer post on the Windows Azure blog says that this timeout is now configurable to a value between 4 and 30 minutes:
http://azure.microsoft.com/blog/2014/08/14/new-configurable-idle-timeout-for-azure-load-balancer/
However, the problem was solved by sending additional keep-alive bytes via TCP, which told the Load Balancer not to kill the requests.
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(address);
...
request.Proxy = null;
request.ServicePoint.SetTcpKeepAlive(true, 30000, 5000); //after 30 seconds, each 5 second
Setting Proxy to null (not using a proxy) is mandatory here, because otherwise the proxy will not pass the TCP keep-alive bytes to the server.
Upload your files with a method that complies with RFC 1867.
See this
And then :
UploadFile[] files = new UploadFile[]
{
new UploadFile(fileName1),
new UploadFile(fileName2)
};
NameValueCollection form = new NameValueCollection();
form["name1"] = "value1";
form["name2"] = "xyzzy";
string response = UploadHelper.Upload(url, files, form);
That's all folks!
EDIT:
I use the method above to upload files over 100 MB in size. I don't use ASP at all, and it just works perfectly!
I had a similar issue where upload would work locally but timeout on server. There are 2 things at play here - I'm assuming you're talking about IIS7+. If not, let me know and I'll gladly delete my answer.
So first, there's ASP.NET which looks at this setting (under system.web):
<!-- maxRequestLength is in KB - 10 MB here. This is needed by ASP.NET. Must match with maxAllowedContentLength under system.webserver/security/requestLimits -->
<httpRuntime targetFramework="4.5" maxRequestLength="1048576" />
Then there's IIS:
<system.webServer>
<security>
<requestFiltering>
<!-- maxAllowedContentLength in bytes - 10MB -->
<requestLimits maxAllowedContentLength="10485760" />
</requestFiltering>
</security>
</system.webServer>
You need both of these settings to be present and the limit to match for things to work. Notice that one is in KB and the other one is in bytes.
Have you checked the IIS file size limit on the remote server? It defaults to 30 MB, so if the files you are trying to upload are larger than that, it will fail. Here's how to change the upload limit (IIS 7): http://www.web-site-scripts.com/knowledge-base/article/AA-00696/0/Increasing-maximum-allowed-size-for-uploads-on-IIS7.html
You can achieve this task with the AsyncCallback technique.
What's AsyncCallback?
When the async method finishes processing, the AsyncCallback method is automatically called, where post-processing statements can be executed. With this technique there is no need to poll or wait for the async thread to complete.
You can find more details at this link:
AsyncCallback
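For illustration, a minimal sketch of that pattern with HttpWebRequest.BeginGetResponse (the URL and the callback body are placeholders):
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/large-file");
// BeginGetResponse returns immediately; the callback fires when the response is available,
// so the calling thread is free while the download is in progress.
request.BeginGetResponse(new AsyncCallback(ar =>
{
    var req = (HttpWebRequest)ar.AsyncState;
    using (var response = (HttpWebResponse)req.EndGetResponse(ar))
    using (var stream = response.GetResponseStream())
    {
        // Post-processing statements go here (e.g. relay the stream to the user).
    }
}), request);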
Maybe it is caused by the IIS upload limit?
In IIS 7 the default value is 30 MB. See MSDN.
EDIT1:
Please be aware that if you are uploading multiple files in one request, their sizes add up.
EDIT2:
In all MS examples there is always only one Stream.Write().
In MSDN it is stated: After the Stream object has been returned, you can send data with the HttpWebRequest by using the Stream.Write method. My interpretation of this sentence would be that you should call Write() only once. Put all data you want to send into the buffer and call the Write() afterwards.
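In code, that interpretation would look roughly like this (request being the HttpWebRequest already discussed; the file path is a placeholder):
// Load the whole payload into one buffer and issue a single Write() on the request stream.
byte[] dataToSend = File.ReadAllBytes(@"C:\TEMP\upload.bin");
request.ContentLength = dataToSend.Length;
using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(dataToSend, 0, dataToSend.Length);
}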
You should also check the executionTimeout setting. Note that it is specified in seconds, not milliseconds, so executionTimeout="12000" already allows 200 minutes of execution; a 6-minute limit would be executionTimeout="360".
I am referring to this article to understand file downloads using C#.
The code uses the traditional method of reading a Stream, like:
((bytesSize = strResponse.Read(downBuffer, 0, downBuffer.Length)) > 0
How can I divide a file to be downloaded into multiple segments, so that I can download separate segments in parallel and merge them?
using (WebClient wcDownload = new WebClient())
{
try
{
// Create a request to the file we are downloading
webRequest = (HttpWebRequest)WebRequest.Create(txtUrl.Text);
// Set default authentication for retrieving the file
webRequest.Credentials = CredentialCache.DefaultCredentials;
// Retrieve the response from the server
webResponse = (HttpWebResponse)webRequest.GetResponse();
// Ask the server for the file size and store it
Int64 fileSize = webResponse.ContentLength;
// Open the URL for download
strResponse = wcDownload.OpenRead(txtUrl.Text);
// Create a new file stream where we will be saving the data (local drive)
strLocal = new FileStream(txtPath.Text, FileMode.Create, FileAccess.Write, FileShare.None);
// It will store the current number of bytes we retrieved from the server
int bytesSize = 0;
// A buffer for storing and writing the data retrieved from the server
byte[] downBuffer = new byte[2048];
// Loop through the buffer until the buffer is empty
while ((bytesSize = strResponse.Read(downBuffer, 0, downBuffer.Length)) > 0)
{
// Write the data from the buffer to the local hard drive
strLocal.Write(downBuffer, 0, bytesSize);
// Invoke the method that updates the form's label and progress bar
this.Invoke(new UpdateProgessCallback(this.UpdateProgress), new object[] { strLocal.Length, fileSize });
}
}
You need several threads to accomplish that.
First you start the first download thread, creating a WebClient and getting the file size. Then you can start several new threads, each of which adds a download range header, as in the sketch below.
You need logic which keeps track of the downloaded parts and creates new download parts when one finishes.
http://msdn.microsoft.com/de-de/library/system.net.httpwebrequest.addrange.aspx
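A rough sketch of one such range download (the URL, offsets, and part-file naming are assumptions; the server must support byte ranges):
// Downloads bytes [from, to] of the remote file into its own part file.
// One call like this would run per thread; the parts are concatenated once all have finished.
static void DownloadSegment(string url, long from, long to, string partPath)
{
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
    req.AddRange(from, to); // ask the server for this byte range only (expects a 206 Partial Content reply)
    using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
    using (Stream remote = resp.GetResponseStream())
    using (FileStream local = new FileStream(partPath, FileMode.Create, FileAccess.Write))
    {
        remote.CopyTo(local);
    }
}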
I noticed that the WebClient implementation sometimes has strange behaviour, so I still recommend implementing your own HTTP client if you really want to write a "big" download program.
PS: thanks to user svick.