.NET WebClient works on my localhost but not on Azure - c#

I'm trying to FTP a file from Azure up to some FTP destination.
When I do it locally, it's working great. Ok ...
I publish this webjob up to Azure and it works great until it tries to do the FTP bit.
The following error is shown:
[12/21/2015 03:18:03 > 944464: INFO] 2015-12-21 03:18:03.6127| ERROR| Unable to connect to the remote server. Stack Trace: at System.Net.WebClient.UploadFile(Uri address, String method, String fileName)
[12/21/2015 03:18:03 > 944464: INFO] at System.Net.WebClient.UploadFile(String address, String method, String fileName)
And this is my simple code...
public void UploadFile(FileInfo fileInfo)
{
    try
    {
        using (var client = new WebClient())
        {
            var destination = new Uri($"ftp://{_server}/{fileInfo.Name}");
            client.Credentials = new NetworkCredential(_username, _password);
            client.UploadFile(destination, "STOR", fileInfo.FullName);
        }
    }
    catch (Exception exception)
    {
        // snipped.
        throw;
    }
}
So it's pretty damn simple.
Could this be some weird PASSIVE/ACTIVE issue? Like, our office internet is simple while Azure's setup is complex and ... as such ... it just cannot create the FTP connection?
Now - this is where things also get frustrating. Originally I was using a 3rd-party FTP library and that DID work .. but some files were only getting partially uploaded .. which is why I'm trying to use WebClient instead. So I know that my Azure webjob "website" thingy can FTP up (no blocked ports, etc.).
Any clues to what I can do, here?

Following Kristensen's simple example, I could do it, but I'm not sure about the format of your FTP URL destination.
Did you try:
var destination = new Uri(string.Format("ftp://{0}/{1}", _server, fileInfo.Name));
This assumes _server is defined somewhere (I don't see it in your code) and that your credentials have permission to write to that location.
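If it really is an active/passive issue, note that WebClient gives you no control over the FTP data-connection mode. A minimal sketch using FtpWebRequest instead, which exposes UsePassive (the server, username, and password parameters are placeholders for whatever your config provides):

```csharp
using System;
using System.IO;
using System.Net;

public static class FtpUploader
{
    // Sketch: upload a file with explicit control over passive mode.
    public static void Upload(string server, string username, string password, FileInfo fileInfo)
    {
        var request = (FtpWebRequest)WebRequest.Create($"ftp://{server}/{fileInfo.Name}");
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential(username, password);
        request.UsePassive = true;  // passive mode is usually the one that works from Azure,
        request.UseBinary = true;   // where inbound "active" data connections tend to be blocked
        request.KeepAlive = false;

        using (var fileStream = fileInfo.OpenRead())
        using (var requestStream = request.GetRequestStream())
        {
            fileStream.CopyTo(requestStream);
        }

        using (var response = (FtpWebResponse)request.GetResponse())
        {
            Console.WriteLine($"Upload finished: {response.StatusDescription}");
        }
    }
}
```

Flipping UsePassive between true and false is also a quick way to confirm whether the connection mode is the culprit.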

Related

Azure functions and persist token cache file

I have an HTTP triggered function (.NET Core 3.1 LTS, not running from a zip package) that connects to my personal OneDrive and drops files.
To minimize authentication prompts I used this approach to serialize token cache: https://learn.microsoft.com/en-us/azure/active-directory/develop/msal-net-token-cache-serialization?tabs=custom#simple-token-cache-serialization-msal-only
On localhost it works great, but not so well when the function runs on Azure - the cache file is lost quite often.
I put it on Path.GetTempPath(). I'm not sure which location is best for this purpose. The cache also seems to be bound to the machine that created the file: when I take a cache file created on localhost and upload it to Azure, using it produces errors around getting account details / enumeration.
Any ideas how to fix this at minimal cost? My app uses only one token, for my personal OneDrive.
The final solution is:
remove ProtectedData from the sample custom token serialization helper, because it encrypts the file with the current profile or machine
save args.TokenCache.SerializeMsalV3() to an Azure Storage blob and read it back later.
So I've made changes to the code from https://learn.microsoft.com/en-us/azure/active-directory/develop/msal-net-token-cache-serialization?tabs=custom#simple-token-cache-serialization-msal-only to the following:
EnableSerialization:
string cstring = System.Environment.GetEnvironmentVariable("AzureWebJobsStorage");
string containerName = System.Environment.GetEnvironmentVariable("TokenCacheStorageContainer");
bcc = new BlobContainerClient(cstring, containerName);
bcc.CreateIfNotExists(); // no try/catch needed: a swallowed constructor exception would leave bcc null
tokenCache.SetBeforeAccess(BeforeAccessNotification);
tokenCache.SetAfterAccess(AfterAccessNotification);
BeforeAccessNotification:
lock (FileLock)
{
    var blob = bcc.GetBlobClient("msalcache.bin3");
    if (blob.Exists())
    {
        using (var stream = new MemoryStream())
        {
            blob.DownloadTo(stream); // synchronous download instead of blocking on the async API
            args.TokenCache.DeserializeMsalV3(stream.ToArray());
        }
    }
}
AfterAccessNotification:
if (args.HasStateChanged)
{
    lock (FileLock)
    {
        // reflect changes in the persistent store
        bcc.GetBlobClient("msalcache.bin3")
           .Upload(new MemoryStream(args.TokenCache.SerializeMsalV3()), overwrite: true);
    }
}

C# connect to remote server using Microsoft Terminal Services Active Client (RDP)

I have a piece of code that should connect to the server. The code is as following:
var rdp = new MsRdpClient8NotSafeForScripting();
rdp.Server = "192.168.0.101"; // address
rdp.Domain = "localdomain";   // domain
rdp.UserName = "test";        // login
rdp.AdvancedSettings8.ClearTextPassword = "123456"; // password
try
{
    rdp.Connect();
}
catch (Exception e)
{
    Console.WriteLine(e);
}
Console.WriteLine(rdp.Connected);
if (rdp.Connected != 0)
{
    rdp.Disconnect();
}
Console.ReadLine();
This is supposed to "connect" to my remote server via port 3389 so that I can read a file from its desktop called "min.txt".
So far I have tried specifying my server's login data, but I always get an output of "0" in the console window, regardless of whether I specify correct or incorrect login data.
My questions here are:
Why does it behave the same with wrong login data (IP, user + password)?
How can I, once I've successfully connected to the server, access the min.txt file located on my remote server's desktop?
Can someone help me out?
You can probably try specifying the password like below:
MSTSCLib.IMsTscNonScriptable secured = (MSTSCLib.IMsTscNonScriptable)rdp.GetOcx();
secured.ClearTextPassword = "123456";
For reference: the MSDN link is here.
Once connected, you can access the file like a shared network file via a UNC path.
Example:
System.IO.FileStream stream = System.IO.File.OpenRead(@"\\servername\sharedname\path\somefile.txt");
You then need to ensure that permissions are in place to access the folder.

SftpClient using Url instead of IP as host

I am using the following code to upload a file to an SFTP server. I have done a test upload using FileZilla and the file is uploaded successfully.
try
{
    using (var client = new SftpClient(host, port, username, password))
    {
        client.Connect();
        client.ChangeDirectory(workingDir);
        var listDirectory = client.ListDirectory(workingDir);
        foreach (var file in files)
        {
            // FileMode.Open, not OpenOrCreate: we only read, and a bad path should fail loudly
            using (var fileStream = new FileStream(file, FileMode.Open))
            {
                client.BufferSize = 4 * 1024; // bypass payload error on large files
                client.UploadFile(fileStream, Path.GetFileName(file));
                Log.Info(string.Format("File [{0}] upload complete", file));
            }
        }
    }
}
catch (Exception ex)
{
    Log.Error(ex.Message);
}
But in the code above, I cannot seem to get the upload done because I get the following error:
No such host is known
And that's maybe because I am using the URL value of my host instead of the IP? (I have done a test with another FTP server using an IP and that also seems to be working.) I'm wondering if that's the case? If so, is there a way to let SftpClient (SSH.NET) handle the URL?
The host parameter may be an IP address or a host name that can be resolved to an IP address. It may not be a URL; that is something completely different from a technical point of view.
The fact that some full application accepts something other than a host name says little: it splits that string into separate tokens and connects to the IP address it gets back when resolving the host name part. But that does not mean that you can open a network connection to a URL or some arbitrary string. It is only possible to open a connection to an IP address.
So if you successfully tested some URL like sftp://ftp.example.com/ContactImport in some "program", this does not mean that you can use that string internally as the host parameter in your code. You need to use the host name that is part of such a URL, so ftp.example.com in this case, since only that can be successfully resolved to an IP address.
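If the value arrives as a URL, one way to split it without hand-rolled string parsing is to let System.Uri do the work. A small sketch (the sftp URL here is made up for illustration):

```csharp
using System;

class Program
{
    static void Main()
    {
        // Hypothetical SFTP URL; only Host (and optionally Port) should be fed to SftpClient.
        var uri = new Uri("sftp://ftp.example.com:22/ContactImport");

        Console.WriteLine(uri.Host);         // the host name to pass to SftpClient
        Console.WriteLine(uri.Port);         // the explicit port, if the URL carries one
        Console.WriteLine(uri.AbsolutePath); // the remote directory, usable as workingDir
    }
}
```

You would then call new SftpClient(uri.Host, uri.Port, username, password) and ChangeDirectory(uri.AbsolutePath), keeping the URL as a single configuration value.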

SSRS Report Portal Management

I am working on a C# utility that would help clients publish SSRS reports on their sites. Here is my code:
public void createFolder()
{
    ReportingService2010 rs = new ReportingService2010();
    rs.Credentials = System.Net.CredentialCache.DefaultCredentials;

    // Create a custom property for the folder.
    Property newProp = new Property();
    newProp.Name = "Department";
    newProp.Value = "Finance";
    Property[] props = new Property[1];
    props[0] = newProp;

    string folderName = "Budget";
    try
    {
        rs.CreateFolder(folderName, "/", props);
        Console.WriteLine("Folder created: {0}", folderName);
    }
    catch (SoapException e)
    {
        Console.WriteLine(e.Detail.InnerXml);
    }
}
I am getting the following error:
<ErrorCode xmlns="http://www.microsoft.com/sql/reportingservices">rsAccessDenied</ErrorCode>
<HttpStatus xmlns="http://www.microsoft.com/sql/reportingservices">400</HttpStatus>
<Message xmlns="http://www.microsoft.com/sql/reportingservices">The permissions granted to user 'NT AUTHORITY\NETWORK SERVICE' are insufficient for performing this operation.</Message>
<HelpLink xmlns="http://www.microsoft.com/sql/reportingservices">https://go.microsoft.com/fwlink/?LinkId=20476&EvtSrc=Microsoft.ReportingServices.Diagnostics.Utilities.ErrorStrings&EvtID=rsAccessDenied&ProdName=Microsoft%20SQL%20Server%20Reporting%20Services&ProdVer=13.0.1601.5</HelpLink>
<ProductName xmlns="http://www.microsoft.com/sql/reportingservices">Microsoft SQL Server Reporting Services</ProductName>
<ProductVersion xmlns="http://www.microsoft.com/sql/reportingservices">13.0.1601.5</ProductVersion>
<ProductLocaleId xmlns="http://www.microsoft.com/sql/reportingservices">127</ProductLocaleId>
<OperatingSystem xmlns="http://www.microsoft.com/sql/reportingservices">OsIndependent</OperatingSystem>
<CountryLocaleId xmlns="http://www.microsoft.com/sql/reportingservices">1033</CountryLocaleId>
<MoreInformation xmlns="http://www.microsoft.com/sql/reportingservices"><Source>ReportingServicesLibrary</Source><Message msrs:ErrorCode="rsAccessDenied" msrs:HelpLink="https://go.microsoft.com/fwlink/?LinkId=20476&EvtSrc=Microsoft.ReportingServices.Diagnostics.Utilities.ErrorStrings&EvtID=rsAccessDenied&ProdName=Microsoft%20SQL%20Server%20Reporting%20Services&ProdVer=13.0.1601.5" xmlns:msrs="http://www.microsoft.com/sql/reportingservices">The permissions granted to user 'NT AUTHORITY\NETWORK SERVICE' are insufficient for performing this operation.</Message></MoreInformation>
<Warnings xmlns="http://www.microsoft.com/sql/reportingservices" />
Any idea?
Please note that the point of this exercise is not just to get it working, but to know that it is going to run on the client side, ideally without any tweaking on their part.
Thanks for help.
Your question is a little unclear. It sounds like you're not sure why you're getting a permissions error. The answer is that you need to create a new role. I assume (after your comment) that your question is how you can do that using C#.
You can use the CreateRole() method, which only works with SSRS in Native Mode. If SSRS is installed in SharePoint Integrated mode, there is no supported method other than using the UI.
This method throws an OperationNotSupportedSharePointMode exception
when invoked in SharePoint mode
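A hedged sketch of what that call might look like, reusing the proxy setup from your snippet. The role name, description, and task IDs here are illustrative only; the real task IDs come from the server's task list, and granting roles still requires the caller to have sufficient permissions in the first place:

```csharp
using System;
using System.Web.Services.Protocols;

public class RoleSetup
{
    public void CreateDeploymentRole()
    {
        var rs = new ReportingService2010();
        rs.Credentials = System.Net.CredentialCache.DefaultCredentials;

        // Placeholder task IDs: replace with the actual IDs your SSRS
        // instance reports for tasks such as "Manage folders" and "Manage reports".
        string[] taskIds = { "task-id-manage-folders", "task-id-manage-reports" };

        try
        {
            // Native Mode only; throws OperationNotSupportedSharePointMode
            // when the server runs in SharePoint Integrated mode.
            rs.CreateRole("Report Deployer",
                          "Can create folders and publish reports",
                          taskIds);
        }
        catch (SoapException e)
        {
            Console.WriteLine(e.Detail.InnerXml);
        }
    }
}
```

Given the rsAccessDenied error above, the account running your utility ('NT AUTHORITY\NETWORK SERVICE' in your log) would itself need to be granted a role with sufficient rights before any of these calls succeed.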

Getting Access Denied Exception when deleting a file in Amazon S3 using the .Net AWSSDK

I am trying to do some simple file IO using Amazon S3 and C#.
So far I have been able to create files and list them. I am the bucket owner and I should have full access. In CloudBerry I can create and delete files in the bucket. In my code when I try to delete a file I get an access denied exception.
This is my test method:
[Test]
public void TestThatFilesCanBeCreatedAndDeleted()
{
    const string testFile = "test.txt";
    var awsS3Helper = new AwsS3Helper();
    awsS3Helper.AddFileToBucketRoot(testFile);
    var testList = awsS3Helper.ListItemsInBucketRoot();
    Assert.True(testList.ContainsKey(testFile)); // This test passes
    awsS3Helper.DeleteFileFromBucket(testFile);  // Access denied exception here
    testList = awsS3Helper.ListItemsInBucketRoot();
    Assert.False(testList.ContainsKey(testFile));
}
My method to add a file:
var request = new PutObjectRequest();
request.WithBucketName(bucketName);
request.WithKey(fileName);
request.WithContentBody("");
S3Response response = client.PutObject(request);
response.Dispose();
My method to delete a file:
var request = new DeleteObjectRequest()
{
    BucketName = bucketName,
    Key = fileKey
};
S3Response response = client.DeleteObject(request);
response.Dispose();
After running the code the file is visible in CloudBerry and I can delete it from there.
I have very little experience with Amazon S3 so I don't know what could be going wrong. Should I be putting some kind of permissions on to any files I create or upload? Why would I be able to delete a file while I am logged in to CloudBerry with the same credentials provided to my program?
I'm not sure what the source of the problem is. Possibly security rules, but maybe something very simple in your bucket configuration. You can check it using the S3 Organizer Firefox plugin, the AWS management console, or any other management tool. I also recommend request-response logging; that has helped me a lot in various investigations. The AWSSDK has plenty of good examples with logging, so you only need to copy-paste them and everything works. Once you have the actual requests being sent to Amazon, you can compare them with the documentation. Also, please check the AccessKeyId used for your delete request.
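One common cause of exactly this symptom is a policy that grants PutObject and ListBucket but not DeleteObject: each S3 action must be allowed separately, so create/list can succeed while delete is denied. A hypothetical IAM policy statement covering all three (the bucket name is a placeholder; CloudBerry may be using different credentials that already have full access):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::my-example-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my-example-bucket"
    }
  ]
}
```

Note that object actions apply to the `/*` resource while ListBucket applies to the bucket ARN itself; mixing those up is another frequent source of unexpected access-denied errors.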