I have an application that deploys game data files to different gaming consoles. If matching files on the user's machine and the console have identical sizes and dates, they must not be re-deployed.
On Xbox, this is easily accomplished because the XDK library I use to upload files to the console allows me to set the date on the uploaded files to match the dates on the user's machine.
On PS3, however, I use an FTP service running on the console. I use WebClient.UploadFileAsync to upload files to the console. However, I cannot figure out how to set the uploaded file's date timestamp, leaving me with only the file size to determine identical files, which is unsafe.
I was wondering if there was a way to set a file's date timestamp through the WebClient interface?
I don't think you can use the WebClient interface for this.
There seem to be various non-standard FTP extension commands implemented by some FTP servers to support the setting of a file's last modified time. The ones I know about are:
MDTM - This is the standard command for getting a file's last modification time (as used by GetDateTimestamp()). Some servers also support a set operation by specifying a timestamp argument as well as a filename.
MFMT - This was defined in an IETF experimental draft (MFMT) to standardise this operation and avoid the non-standard use of the MDTM command described above.
SITE UTIME - A non-standard SITE extension supported by some servers for the same purpose.
If the FTP server running on the PS3 supports any of these extensions (check the result of the FEAT command), then you could use a simple socket FTP connection to issue the appropriate command to the server, after uploading the file.
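If it does, a minimal sketch of issuing MFMT over a raw control connection could look like the following; the address, credentials and paths are placeholders, and a robust client would also need to handle multi-line replies:

using System;
using System.IO;
using System.Net.Sockets;

class FtpSetTimestamp
{
    static void Main()
    {
        using (var client = new TcpClient("192.168.0.10", 21))   // PS3 address (placeholder)
        using (var stream = client.GetStream())
        using (var reader = new StreamReader(stream))
        using (var writer = new StreamWriter(stream) { AutoFlush = true, NewLine = "\r\n" })
        {
            Console.WriteLine(reader.ReadLine());   // 220 greeting

            writer.WriteLine("USER user");          // placeholder credentials
            Console.WriteLine(reader.ReadLine());   // 331 need password
            writer.WriteLine("PASS password");
            Console.WriteLine(reader.ReadLine());   // 230 logged in

            // MFMT takes a YYYYMMDDHHMMSS timestamp (UTC) followed by the path
            DateTime lastWrite = File.GetLastWriteTimeUtc(@"C:\deploy\data.bin");
            writer.WriteLine("MFMT " + lastWrite.ToString("yyyyMMddHHmmss") + " /game/data.bin");
            Console.WriteLine(reader.ReadLine());   // expect 213 on success

            writer.WriteLine("QUIT");
        }
    }
}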
WebClient will hand off FTP connections to FtpWebRequest. If you use FtpWebRequest directly, you can send FTP commands to the server. The supported commands are defined as fields of WebRequestMethods.Ftp. One of those commands is GetDateTimestamp.
So if you construct an FtpWebRequest manually (instead of through WebClient) and send either the GetDateTimestamp or the ListDirectoryDetails command, you should be able to get the timestamp of the target file.
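For example, a hedged sketch of fetching the remote timestamp this way (host, credentials and path are placeholders):

using System;
using System.Net;

var request = (FtpWebRequest)WebRequest.Create("ftp://192.168.0.10/game/data.bin");
request.Method = WebRequestMethods.Ftp.GetDateTimestamp;
request.Credentials = new NetworkCredential("user", "password");

using (var response = (FtpWebResponse)request.GetResponse())
{
    // LastModified is populated from the server's MDTM reply
    Console.WriteLine(response.LastModified);
}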
I have successfully established a connection to a remote Linux server using the SSH.NET package with the following code (I am using a ShellStream because I have to use sudo su):
using (var client = new SshClient(server, username, password))
{
    client.Connect();

    List<string> commands = new List<string>();
    commands.Add("sudo su - user");
    commands.Add("vi test.properties");

    ShellStream shellStream = client.CreateShellStream("xterm", 80, 24, 800, 600, 1024);

    // Execute commands under root account
    foreach (string command in commands)
    {
        WriteStream(command, shellStream);
    }

    client.Disconnect();
}

private static void WriteStream(string cmd, ShellStream stream)
{
    stream.WriteLine(cmd + "; echo this-is-the-end");
    while (stream.Length == 0)
        Thread.Sleep(500);
}
I am trying to edit the test.properties file that is in the remote Linux server using a C# function that I created.
My code (using C# in Visual Studio) uses System.IO.File.ReadAllText to modify the text file, but it does not recognize the path of the remote server. For example:
The text file on the Linux server is at this location: /home/user/test.properties, so I am using this in my code:
System.IO.File.ReadAllText("/home/user/test.properties")
I am getting the following error:
Could not find a part of the path 'C:\home\user\test.properties'
For some reason it tries to look in my local file system instead of the remote server.
Is there a different approach I should be taking?
Thanks in advance!
In general, to modify remote files, use SFTP. In SSH.NET that's what SftpClient is for.
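For completeness, if you could connect directly as a user with access to the file (i.e. without su), a minimal sketch would be (the path and the edit itself are placeholders):

using Renci.SshNet;

using (var sftp = new SftpClient(server, username, password))
{
    sftp.Connect();

    // Read, modify and write the file back over SFTP
    string text = sftp.ReadAllText("/home/user/test.properties");
    text = text.Replace("key=oldValue", "key=newValue");   // whatever edit you need
    sftp.WriteAllText("/home/user/test.properties", text);

    sftp.Disconnect();
}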
Though as you seem to need to use elevated privileges (su) – an important factor that your question title fails to mention – it's way more difficult. The right solution is to avoid the need for su. See somewhat related:
Allowing automatic command execution as root on Linux using SSH.
Another option would be to try to execute the SFTP server under su. Though that would require modification of SSH.NET code. See related Java question:
Using JSch to SFTP when one must also switch user
If you want to keep your current shell approach with su, you are stuck with simulating shell commands. Note that connecting to the SSH server won't make other .NET classes (like File) magically able to work with remote files (even if SFTP were possible, let alone when it is not, due to the su requirement).
The easiest way to read a remote file using the shell is the cat command:
cat /home/user/test.properties
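A sketch of doing that over the question's existing shellStream, reusing its end-of-output marker to know when cat has finished:

// Ask the shell for the file contents, followed by a known marker
shellStream.WriteLine("cat /home/user/test.properties; echo this-is-the-end");

// Expect() blocks until the marker appears and returns everything read so far;
// strip the echoed command line and the marker to get the raw file text
string output = shellStream.Expect("this-is-the-end");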
Alright, so after finishing the task, this is what I did since asking the question here:
1. After your (#Martin Prikryl) response, I tried using a combination of SSH and WinSCP:
WinSCP to download the file.
.NET to modify the locally downloaded file.
WinSCP to upload the file (deleting it from the local folder afterwards).
SSH to move the file to its appropriate location on the server.
I discarded this solution because it worked pretty well in the lower environment, but in production I had permission issues, so I couldn't even download the file; besides, it might be a security issue (it's a sensitive file).
2. My next solution was using only SSH to simulate shell commands; as you previously mentioned, I was limited to that because I was stuck using sudo su.
I connected to the server with SSH and used the 'sed' command to only show lines that contain specific words (instead of using cat to get the whole file).
I then used my .NET code to pull the values that I needed for my GET operation.
For the POST operation I used 'sed' again to replace lines.
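For illustration, this is the kind of sed command I mean, sent through the same WriteStream helper from my original code (the property name, value and path here are just placeholders):

// GET: print only the lines that contain the property I need
WriteStream("sed -n '/my.property/p' /home/user/test.properties", shellStream);

// POST: replace the property's value in place
WriteStream("sed -i 's/^my.property=.*/my.property=newValue/' /home/user/test.properties", shellStream);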
I am using Telerik Kendo File Upload for uploading a folder.
In the production environment, a few users are reporting an issue with folder upload: during upload, a few files error out, and the console tab in the developer tools logs an "ERR_HTTP2_PROTOCOL_ERROR" error for the failed files.
When I try, I do not get this error and all folders upload properly. I asked a user to share the files for which they were seeing the error, and when I tried, they uploaded successfully. When the user uploaded the same files again today, the ones that errored out yesterday succeeded, but there are still files that give the same error.
I went through a post where it says the problem could be due to the use of HTTP/2, and when they switched to HTTP/1.1 it worked fine. We are also using HTTP/2, but we don't have the option of going back to HTTP/1.1. Link below:
https://www.telerik.com/forums/problems-with-multi-file-upload-and-http-2
Any suggestions?
This is because HTTP/2 is not enabled on your clients' machines, hence the error.
If you look at your local machine, you will see that under your server you have the HTTPS protocol enabled and a valid certificate.
Your clients either lack a valid certificate on the server or are using the site over the HTTP protocol.
You can learn more here:
Http/2 explanation
SETTINGS_MAX_CONCURRENT_STREAMS (0x3):
Indicates the maximum number of concurrent streams that the sender will allow. This limit is directional: it applies to the number of streams that the sender permits the receiver to create. Initially, there is no limit to this value. It is recommended that this value be no smaller than 100, so as to not unnecessarily limit parallelism.
A value of 0 for SETTINGS_MAX_CONCURRENT_STREAMS SHOULD NOT be treated as special by endpoints. A zero value does prevent the creation of new streams; however, this can also happen for any limit that is exhausted with active streams. Servers SHOULD only set a zero value for short durations; if a server does not wish to accept requests, closing the connection is more appropriate.
Resolution: Add "Http2MaxConcurrentClientStreams" under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\HTTP\Parameters in the registry and restart the server. Set this value to 100 or greater.
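For illustration, a sketch of setting that value from C# (it must run elevated, and a restart of the server is still required afterwards):

using Microsoft.Win32;

// Creates or updates the DWORD value under the HTTP service parameters key
Registry.SetValue(
    @"HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\HTTP\Parameters",
    "Http2MaxConcurrentClientStreams",
    100,
    RegistryValueKind.DWord);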
I am currently uploading files to a unix server using SFTP using Renci SSH.NET, and it works fine. However, I now would like to upload files to a "Symlink container", and then create a symlink in another directory pointing to these files. Is that possible? I haven't found a class to manage symlinks, so how can this be achieved?
Use the SftpClient.SymbolicLink method.
public void SymbolicLink(string path, string linkPath)
Note that the meaning of the paths is a mess. The most common SFTP server, OpenSSH, is buggy and uses the paths in the wrong order. Many other SFTP servers follow the bug for compatibility. But not all. So you have to test which order your servers use.
See
https://bugzilla.mindrot.org/show_bug.cgi?id=861
http://bugs.proftpd.org/show_bug.cgi?id=4080
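A hedged usage sketch (host, credentials and paths are placeholders):

using Renci.SshNet;

using (var sftp = new SftpClient(host, username, password))
{
    sftp.Connect();

    // Nominally: SymbolicLink(path, linkPath) - the existing file first, then
    // the link to create. Against OpenSSH-compatible servers you may need to
    // pass them the other way around, so test against your server.
    sftp.SymbolicLink("/container/data/file.bin", "/other/dir/file.bin");

    sftp.Disconnect();
}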
I have worked with SSH.NET and I haven't found anything about symlinks. However, I found that SSH.NET supports custom commands, therefore you can run the Linux command (I think it's ln) to create the symlink.
At this URL you can find an example of how to use a custom command with SSH: How to run several commands with SSH.Net?
Maybe if you need to create a lot of symlinks, you can create a class, or you can download the source code and add a new method.
I hope this can help you.
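For illustration, a sketch of that approach using SSH.NET's RunCommand (paths are placeholders; note this runs the command over SSH, so it needs shell access, not just SFTP):

using System;
using Renci.SshNet;

using (var ssh = new SshClient(host, username, password))
{
    ssh.Connect();

    // ln -s <target> <link> creates the symlink
    var cmd = ssh.RunCommand("ln -s /container/data/file.bin /other/dir/file.bin");
    if (cmd.ExitStatus != 0)
        Console.WriteLine(cmd.Error);   // e.g. permission denied or link exists

    ssh.Disconnect();
}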
I'm using the FileUpload.SaveAs() method in C# to upload files to the server, but I want to save the files on another partition. Let us say, save the files on drive D of the server instead of the current drive, which is drive C. Please share your thoughts. Thanks in advance.
I have learned that using a full path such as
FileUpload.SaveAs(@"D:\FileUpload");
will save the file outside the web server.
Check this out.
To simplify the question, how can I upload files on the other partition of the server that hosts my web app?
Based on the documentation at http://msdn.microsoft.com/en-us/library/system.web.ui.webcontrols.fileupload.saveas.aspx, the string filename is the full path name of the location to save to, meaning you should be able to do, e.g.:
FileUpload.SaveAs(@"D:\where_you_want_to_save")
By the way what have you tried and what error did you get?
Looking at the example on MSDN, it would appear that .SaveAs() accepts a fully qualified file name as a parameter. You could potentially use a Path object to cleanly build a path for the file, or just specify one directly as a string:
uploader.SaveAs("d:\\someFolder\\someFile.ext");
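Putting that together, a small sketch (the target folder is a placeholder; it must already exist and the application pool identity needs write access to it):

if (uploader.HasFile)
{
    // Build the destination path from the folder and the client's file name
    string fileName = System.IO.Path.GetFileName(uploader.FileName);
    uploader.SaveAs(System.IO.Path.Combine(@"D:\Uploads", fileName));
}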
Resolved this by using a Virtual Directory in IIS and providing admin credentials for authentication.
The problem:
My company puts out a monthly newsletter which I host on our internal website. I have a page for the author of the newsletter to upload the latest version. Once the author has uploaded the latest newsletter, he sends a broadcast email to announce the new newsletter. Employees invariably check the new newsletter and send feedback to the author with corrections that need to be made.
Once the author has made the necessary corrections (typically within an hour of sending the broadcast email), he revisits my page and replaces the latest version with the updated newsletter.
Immediately following the replacement (or update, if you will) of the newsletter, anyone attempting to access it gets a 500 - Internal Server Error.
My IT guy who maintains the server cannot delete/rename/move the file because of a permissions error and has to do a lot of convoluted things to get the file deleted (and once the file is deleted, the author of the newsletter can re-upload the corrected copy and it works fine).
My IT guy and I are pretty sure that the problem stems from the fact that I'm trying to replace the file while IIS is actively serving it to users (something I anticipated and thought I had coded against).
The code that runs the replacement is as follows:
Protected Sub ReplaceLatestNewsletter()
    Dim dr As DataRow
    Dim sFile As String
    Dim mFileLock As Mutex
    Try
        If Me.Archives.Rows.Count > 0 Then
            dr = Me.Archives.Rows(0)
            sFile = dr("File").ToString
            If dr("Path").ToString.Length > 0 Then
                mFileLock = New Mutex(True, "MyMutexToPreventReadsOnOverwrite")
                Try
                    mFileLock.WaitOne()
                    System.IO.File.Delete(dr("Path").ToString)
                Catch ex As Exception
                    lblErrs.Text = ex.ToString
                Finally
                    mFileLock.ReleaseMutex()
                End Try
            End If
            fuNewsletter.PostedFile.SaveAs(Server.MapPath("~/Newsletter/archives/" & sFile))
        End If
    Catch ex As Exception
        lblErrs.Text = ex.ToString
    End Try
    dr = Nothing
    sFile = Nothing
    mFileLock = Nothing
End Sub
I thought the Mutex would take care of this (although after re-reading documentation I'm not sure I can actually use it like I'm trying to). Other comments on the code above:
Me.Archives is a DataTable stored in ViewState
dr("File").ToString is the filename (no path)
dr("Path").ToString is the full local machine path and filename (i.e., 'C:\App_Root\Newsletters\archives\20120214.pdf')
The filenames of the newsletters are set to "YYYYMMDD.pdf" where YYYYMMDD is the date (formatted) of the upload.
In any case, I'm pretty sure that the code above is not establishing an exclusive lock on the file so that the file can be overwritten safely.
Ultimately, I would like to make sure that the following happens:
If IIS is currently serving the file, wait until IIS has finished serving it.
Before IIS can serve the file again, establish an exclusive lock on the file so that no other process, thread, user (etc.) can read from or write to the file.
Either delete the file entirely and write a new file to replace it or overwrite the existing file with the new content.
Remove the exclusive lock so that users can access the file again.
Suggestions?
Also, can I use a Mutex to get a mutually exclusive lock on a file in the Windows filesystem?
Thank you in advance for your assistance and advice.
EDIT:
The way that the links for the newsletter are generated is based on the physical filename. The method used is:
Get all PDF files in the "archives" directory. For each file:
Parse the date of publication from the filename.
Store the date, the path to the file, the filename, and a URL to each file in a DataRow in a DataTable
Sort the DataTable by date (descending).
Output the first row as the current issue.
Output all subsequent rows as "archives" organized by year and month.
UPDATE:
Since I could not discern when all existing requests for the file had completed, I took a closer look at the first part of #Justin's answer ("your mutex will only have an effect if the process that reads from the file also obtains the same mutex").
This led me to Configure IIS7 to serve static content through ASP.NET Runtime and the linked article in the accepted answer.
To that end, I have implemented a handler for all PDF files which implements New Mutex(True, "MyMutexToPreventReadsOnOverwrite") to ensure that only one thread is doing something with the PDF at any given time.
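For reference, a rough sketch in C# of what such a handler can look like (illustrative, not my exact code); it acquires the same named mutex before streaming the PDF, so a read can never interleave with the overwrite:

using System.Threading;
using System.Web;

public class PdfHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Open (or create) the same named mutex the upload code uses
        using (var fileLock = new Mutex(false, "MyMutexToPreventReadsOnOverwrite"))
        {
            fileLock.WaitOne();
            try
            {
                context.Response.ContentType = "application/pdf";
                context.Response.WriteFile(context.Request.PhysicalPath);
            }
            finally
            {
                fileLock.ReleaseMutex();
            }
        }
    }
}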
Thank you for your answer, #Justin. While I did not wind up using the implementation you suggested, your answer pointed me towards an acceptable solution.
Your mutex will only have an effect if the process that reads from the file also obtains the same mutex. What is the method used to serve up the file? Is ASP.Net used or is this just a static file?
My workflow would be a little different:
Write the new newsletter to a new file
Have IIS start serving up the new file instead of the old one for the given Newsletter url
Delete the old file once all existing requests for that file have completed
This requires no locking and also means that we don't need to wait for requests for the current file to complete (something which could potentially take an indefinite amount of time if people keep making new requests). The only interesting bit is step 2, which will depend on how the file is served - the easiest way would probably be to either set up an HTTP redirect or use URL rewriting.
HTTP Redirect
An HTTP redirect is where the server tells the client to look in a different place when it gets a request for a given resource, so that the browser URL is automatically updated to match the new location. For example, if the user requested http://server/20120221.pdf then they could be automatically redirected to another URL such as http://server/20120221_v2.pdf (the URL shown in the browser would change; however, the URL they need to type in would not).
You can do this in IIS 7 using the httpRedirect configuration element, for example:
<configuration>
<system.webServer>
<httpRedirect enabled="true" exactDestination="true" httpResponseStatus="Found">
<!-- Note that I needed to add a * in for IIS to accept the wildcard even though it isn't used in this case -->
<add wildcard="*20120221.pdf" destination="20120221_v2.pdf" />
</httpRedirect>
</system.webServer>
</configuration>
The linked page shows how to change these settings from ASP.Net
Url Rewriting
Alternatively, IIS can be set up to automatically serve up the content of a different file for a given URL without the client (the browser) ever knowing the difference. This is called URL rewriting and can be done in IIS using something like this; however, it does require that additional components be installed in IIS.
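For illustration, with the URL Rewrite module installed, a rule along these lines would transparently serve up the new file (reusing the example file names from above):

<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Newsletter rewrite">
          <match url="^20120221\.pdf$" />
          <action type="Rewrite" url="20120221_v2.pdf" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>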
Using a HTTP Redirect is probably the easiest method.