Upload contents of CSV file from SFTP folder to cloud SQL Server database - best practice - C#

I am trying to find the best way to take a CSV file (20-30 MB) from an SFTP folder and upload the data to a cloud-based SQL Server. I'm just looking for the best approach, not looking for someone to write the code for me.
As of now I have this working (though it fails 50% of the time) the following way:
On the local PC, set up an HttpWebRequest, convert the file to a byte array, and send the byte array via a Stream.
On the web server, receive the stream, convert it, and save it to a temp folder on the web server.
Then have the local PC send another message to the web server that starts the following: grab the saved file on the server, convert the file to a DataTable... then execute a SqlBulkCopy, and finally delete the file on the server.
This process fails half the time. I have created a new Web API service using authentication and I would like to find the best way to do this using the new service I am creating. In my research I sometimes find information on POSTing a single item to the database; I'm not sure if I should be doing something like that for 20,000 rows of data.
Can someone please point me in the right direction? Obviously what I am doing is not the best way if it keeps failing.
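For illustration only, a rough sketch of the kind of Web API endpoint plus SqlBulkCopy step described above might look like the following (the controller name, route, staging table, columns and connection string are all assumptions, and Web API 2 attribute routing is assumed to be enabled):

    // Sketch: stream the posted CSV to a temp file, then bulk-insert it in one shot.
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;
    using System.Threading.Tasks;
    using System.Web.Http;

    public class CsvUploadController : ApiController
    {
        [HttpPost]
        [Route("api/csv/upload")]
        public async Task<IHttpActionResult> Upload()
        {
            // Stream the request body to a temp file instead of buffering a byte array in memory.
            string tempPath = Path.GetTempFileName();
            using (var fileStream = File.Create(tempPath))
            {
                await (await Request.Content.ReadAsStreamAsync()).CopyToAsync(fileStream);
            }

            // Build a DataTable from the CSV (no header or quoting handling in this sketch).
            var table = new DataTable();
            table.Columns.Add("Column1");
            table.Columns.Add("Column2");
            foreach (string line in File.ReadLines(tempPath))
            {
                table.Rows.Add(line.Split(','));
            }

            // One bulk insert rather than 20,000 single-row POSTs.
            using (var bulkCopy = new SqlBulkCopy("your-connection-string")) // placeholder
            {
                bulkCopy.DestinationTableName = "dbo.CsvStaging"; // assumed staging table
                bulkCopy.BatchSize = 5000;
                await bulkCopy.WriteToServerAsync(table);
            }

            File.Delete(tempPath);
            return Ok();
        }
    }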

Related

C# - How to temporarily stream a file online so it can be accessed via URL

I have been working on a project recently where eventually I need two (or more) applications on different machines to be able to access the same file on one of them without uploading it to a server first.
The best idea I could come up with was to create something like a mini temporary file server, where the desired file is streamed over the machine's IP address and the other machine can access it via a URL such as "http://xxx.xxx.xxx.xxx/path/file.ext".
I have good experience with C#, but I have never attempted this kind of approach before, so any help is appreciated, either in achieving this approach or with any other method that allows cross-internet access to a file on a machine.
Thanks in advance.
[Edit]
This has to be done without prior port forwarding. I don't know whether that is possible, but if it is not, I guess I might need to do something like streaming to a PHP server first. Again, any help is appreciated.
Set up a virtual directory in IIS pointing to a local directory.
Application A writes the file to the local directory.
Application B reads the file over HTTP.
Otherwise, you could use a network drive for both applications to read/write from.
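For what it's worth, Application B's side of this can be as small as the following sketch (the URL and destination path are placeholders):

    // Sketch: download the shared file over HTTP once the IIS virtual directory is in place.
    using System.IO;
    using System.Net.Http;
    using System.Threading.Tasks;

    static async Task DownloadSharedFileAsync()
    {
        using (var client = new HttpClient())
        using (var response = await client.GetAsync("http://xxx.xxx.xxx.xxx/path/file.ext"))
        {
            response.EnsureSuccessStatusCode();
            using (var output = File.Create(@"C:\incoming\file.ext"))
            {
                // Stream the response body straight to disk.
                await response.Content.CopyToAsync(output);
            }
        }
    }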

How to save a file using Azure/Web API to a folder on a shared network drive

I am currently trying to save a file to a folder on a network location:
network//location//folder/save-here
The Web API is connected through Azure/VPN/Entity Framework; however, I need to save the file to the protected network location, not just a record in the database.
I've started trying to use a Hybrid Connection, however I'm not sure it will help solve this issue.
What is the best way to achieve saving a file to a folder on a network location from a Web API/Azure?
Unfortunately, in an App Service plan you cannot mount a file share.
If you were using an Azure file share from Azure Storage, you could just save the file using the API. However, since you are trying to save to an on-prem file share, you might need to set up some kind of service (possibly another API) running on-prem that you would call, and it would save the file for you.
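As a rough sketch of what such an on-prem relay service could look like (the route and UNC path are assumptions for illustration; the service would run on a machine that already has access to the network location):

    // Sketch: a small on-prem Web API the Azure-hosted API calls; it writes the posted file to the share.
    using System.IO;
    using System.Threading.Tasks;
    using System.Web.Http;

    public class FileRelayController : ApiController
    {
        [HttpPost]
        [Route("api/files/{fileName}")]
        public async Task<IHttpActionResult> Save(string fileName)
        {
            // Assumed UNC path to the protected network folder.
            string targetPath = Path.Combine(@"\\network\location\folder\save-here", fileName);

            using (var body = await Request.Content.ReadAsStreamAsync())
            using (var target = File.Create(targetPath))
            {
                await body.CopyToAsync(target);
            }

            return Ok();
        }
    }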

How to verify that an SCP file upload was successful

I've been given an assignment to use SCP to upload a file to a Secure FTP Server (I'm doing this using Workflow Foundation activities). I can upload the file OK, but I need to do some sort of validation to make sure that the file is, in fact, there.
Any ideas?
Thanks,
Chew
I've solved this type of problem before by:
uploading the file to the server,
downloading it again under a different file name, and
running a local comparison of the two files.
A script to do this is relatively easy to set up.
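For illustration, a rough sketch of that upload / re-download / compare cycle, assuming the SSH.NET library (Renci.SshNet); the host, credentials and paths are placeholders:

    // Sketch: upload, download a copy, then compare hashes of the two local files.
    using System.IO;
    using System.Linq;
    using System.Security.Cryptography;
    using Renci.SshNet;

    static bool UploadAndVerify(string localPath)
    {
        using (var scp = new ScpClient("sftp.example.com", "username", "password"))
        {
            scp.Connect();

            // 1. Upload the file.
            scp.Upload(new FileInfo(localPath), "/inbound/data.csv");

            // 2. Download it again under a different local name.
            string verifyPath = localPath + ".verify";
            scp.Download("/inbound/data.csv", new FileInfo(verifyPath));

            // 3. Compare the two local copies.
            byte[] hashOriginal, hashCopy;
            using (var sha = SHA256.Create())
            {
                using (var s = File.OpenRead(localPath)) hashOriginal = sha.ComputeHash(s);
                using (var s = File.OpenRead(verifyPath)) hashCopy = sha.ComputeHash(s);
            }
            return hashOriginal.SequenceEqual(hashCopy);
        }
    }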

Winforms Document Manager Using Filesystem & SQL Database

I am trying to create a document manager for my WinForms application. It is not web-based.
I would like to be able to allow users to "attach" documents to various entities (personnel, companies, work orders, tasks, batch parts, etc.) in my application.
After a lot of research I have made the decision to use the file system to store the files instead of a blob in SQL. I will set up a folder to store all the files, but I will store the document information (file path, uploaded by, changed by, revision, etc.) in a parent-child relationship with the entity in a SQL database.
I only want users to be able to work with the documents through the application, to prevent the files and database records from getting out of sync. I somehow need to protect the document folder from normal users but at the same time allow the application to work with it. My original thought was to set the application up with the only username and password that has access to the folder, and use impersonation to log in to the folder and work with the files. From feedback in a recent thread I started, I now believe this was not a good idea, and working with impersonation has been a headache.
I also thought about using a web service, but some of our clients just run the application on their laptops with no Windows Server. Most are using Windows Server or Citrix/Windows Server.
What would be the best way to set this up so that only the application handles the documents?
I know you said you read about blobs, but are you aware of the FILESTREAM option in SQL Server 2008 and onwards? Basically, rather than saving blobs into your database (which isn't always a good idea), you can instead save them to the NTFS file system using transactional NTFS. This sounds like exactly what you are trying to achieve.
All the file access security would be handled through SQL Server (as it would be the only thing needing access to the folder), and you don't need to write your own logic for adding and removing files from the file system. To remove a file, you just delete the related record in the SQL Server table and it handles removing the file from disk.
See:
http://technet.microsoft.com/en-us/library/bb933993.aspx
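For illustration, a minimal sketch of reading a FILESTREAM blob from client code via SqlFileStream; the table, key and column names are assumptions:

    // Sketch: stream a FILESTREAM blob from SQL Server's NTFS store to a local file.
    using System.Data.SqlClient;
    using System.Data.SqlTypes;
    using System.IO;

    static void CopyDocumentToFile(string connectionString, int documentId, string destinationPath)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            // FILESTREAM access must happen inside a transaction.
            using (var transaction = connection.BeginTransaction())
            {
                var command = new SqlCommand(
                    @"SELECT FileData.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT()
                      FROM Documents WHERE DocumentId = @id", connection, transaction);
                command.Parameters.AddWithValue("@id", documentId);

                using (var reader = command.ExecuteReader())
                {
                    if (reader.Read())
                    {
                        string path = reader.GetString(0);
                        byte[] txContext = (byte[])reader[1];

                        // Read the blob straight from the NTFS store and copy it locally.
                        using (var source = new SqlFileStream(path, txContext, FileAccess.Read))
                        using (var destination = File.Create(destinationPath))
                        {
                            source.CopyTo(destination);
                        }
                    }
                }
                transaction.Commit();
            }
        }
    }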
Option 1 (Easy): Security through Obscurity
Give everyone read (and write as appropriate) access to your document directories. Save your document 'path' as the full UNC path (\\servername\dir1\dir2\dir3\file.ext) so that your users can access the files, but they're not immediately obvious to someone wandering through their mapped drives.
Option 2 (Harder): Serve the File from SQL Server
You can use either a CLR function or SQLDMO to read the file from disk, present it as a varbinary field, and reconstruct it on the client side. The upside is that your users will see a copy, not the real thing; that makes viewing safer, and editing and saving harder.
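As a rough sketch of the client side of this option, pulling a varbinary column back and reconstructing the file locally (the stored procedure name and parameter are assumptions):

    // Sketch: fetch the document as varbinary and rebuild it as a local copy.
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    static void SaveDocumentCopy(string connectionString, int documentId, string localPath)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.GetDocument", connection)) // assumed procedure
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@DocumentId", documentId);

            connection.Open();
            // The procedure is assumed to return a single varbinary(max) column.
            byte[] bytes = (byte[])command.ExecuteScalar();

            // The user works with this reconstructed copy, never the original on the server.
            File.WriteAllBytes(localPath, bytes);
        }
    }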
Enjoy! ;-)
I'd go with these options, in no particular order.
Create a folder on the server that's not accessible to users. Have a web service running on the server (either hosted in IIS or as a standalone WCF app) with methods to upload and download files. Your web service should manage the directory where the files are stored. The SQL database should have all the necessary metadata to find the documents. In this manner, only your app can get access to these files, so users can only see the docs via the app.
I can see that you chose to store the documents on the file system. I wrote a similar system (e.g. attachments to customers/orders/salespeople/etc.), except that I am storing them in SQL Server. It actually works pretty well. I initially worried that so much data would slow down the database, but that turned out not to be the case. It's working great. The only advice I can give if you take this route is to create a separate database for all your attachments. Why? Because if you want to get a copy of the RDBMS for your local testing, you do not want to be copying a 300 GB database that's made up of 1 GB of actual data and 299 GB of attachments.
You mentioned that some of your users will be carrying laptops. In that case, they might not be connected to the LAN. If that is the case, I'd consider storing the files (and maybe metadata itself) in the cloud (EC2, Azure, Rackspace, etc...).

How to correctly write dynamic files to an FTP server?

I'm using C# and I have written a locally installed application that dynamically generates files which need to end up on an FTP server.
Do I generate them to disk and then upload them to the FTP server, or is there a way to open a stream to an FTP server and write the files directly?
Check the code sample I gave in this answer; it doesn't rely on writing to files. It's not SQL-specific and was just a suggestion on how to use SQL CLR integration assemblies to upload the output of SQL queries to an FTP server. The for loop in the method is just there to demonstrate writing to the FTP stream. You should be able to rework it to your needs:
How to write stored procedure output directly to a file on an FTP without using local or temp files?
You should look at this class:
System.Net.FtpWebRequest
You will see that it works with streams, so you can feed it data from any source.
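For illustration, a minimal sketch of writing generated content straight to the FTP request stream, without touching the local disk; the server URI and credentials are placeholders:

    // Sketch: upload dynamically generated content directly via FtpWebRequest.
    using System;
    using System.IO;
    using System.Net;
    using System.Text;

    static void UploadGeneratedFile(string content)
    {
        var request = (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/outbox/report.csv");
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("username", "password");

        byte[] bytes = Encoding.UTF8.GetBytes(content);

        // Write the generated bytes directly to the FTP request stream.
        using (Stream requestStream = request.GetRequestStream())
        {
            requestStream.Write(bytes, 0, bytes.Length);
        }

        using (var response = (FtpWebResponse)request.GetResponse())
        {
            // StatusDescription carries the server's final reply, e.g. a "226" transfer-complete message.
            Console.WriteLine(response.StatusDescription);
        }
    }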
When searching for .NET capabilities you should also be aware of the Object Browser in Visual Studio, accessible via:
View > Other Windows > Object Browser
It supplies a search over all known .NET assembly objects.
The better way is to save the file locally and upload it later, since there could be problems with the upload process.
Since you are using C#, I'm thinking you are probably in a Windows environment, something I know little about :)
If you are dealing with a Unix environment, you could just pipe your output through SSH, which would also take care of the encryption overhead.
