Does anyone here have experience uploading very large files (3-5 GB) in Blazor, and know whether it works well using, for example, C#, JavaScript with manual file chunking, or an HTML5 File API multipart upload? Preferably without a third-party library.
I also have a general question about the scenario of a logged-in user: if no restrictions are set on which file types may be uploaded, what security concerns (e.g. from OWASP) can still be addressed in C# or JavaScript, on the client and on the server?
The only way I have gotten it to work properly is to make a native call to an API running on the same server.
You can use fetch or AJAX to do so.
An example of how to use fetch: https://flaviocopes.com/how-to-upload-files-fetch
After the file has been uploaded you can check its location using .NET's file system APIs.
You can always check the file extension, but I would not recommend allowing just anyone to upload files to your website.
There is an API for VirusTotal; I have not used it myself, so look it through before relying on it:
https://github.com/Genbox/VirusTotalNet
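As a complement to the extension check mentioned above, here is a minimal client-side sketch with a hypothetical allow-list. Treat this as a convenience pre-check only: per OWASP guidance, anything client-side can be bypassed, so the server must re-validate the file name (and ideally the content) on its own.

```javascript
// Minimal client-side extension allow-list check. The allow-list below is a
// hypothetical example; adjust it to whatever your application accepts.
// This is a UX pre-check only: the server must repeat the validation.
const ALLOWED_EXTENSIONS = ['.jpg', '.png', '.pdf', '.csv'];

function hasAllowedExtension(fileName) {
  const dot = fileName.lastIndexOf('.');
  if (dot < 0) return false; // no extension at all
  const ext = fileName.slice(dot).toLowerCase();
  return ALLOWED_EXTENSIONS.includes(ext);
}
```

Note that for a file like `archive.tar.gz` this checks only the final `.gz` part, which is usually what you want for an allow-list.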
You have to do this using pure JavaScript or jQuery; any attempt to do it in C# will go through SignalR, which has very slow upload speeds.
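The manual chunking the question asks about can be sketched in plain JavaScript. The `/api/upload` endpoint and its `fileName`/`chunkIndex`/`totalChunks` parameters are hypothetical; you would need a matching server-side endpoint that reassembles the parts.

```javascript
// Split a Blob/File into fixed-size chunks. Blob.slice does not copy data,
// so this is cheap even for multi-GB files.
function sliceIntoChunks(blob, chunkSize) {
  const chunks = [];
  for (let offset = 0; offset < blob.size; offset += chunkSize) {
    chunks.push(blob.slice(offset, offset + chunkSize));
  }
  return chunks;
}

// Upload the chunks sequentially with fetch. The endpoint and its query
// parameters are hypothetical; the server must write the parts back together.
async function uploadInChunks(file, chunkSize = 10 * 1024 * 1024) {
  const chunks = sliceIntoChunks(file, chunkSize);
  for (let i = 0; i < chunks.length; i++) {
    const response = await fetch(
      `/api/upload?fileName=${encodeURIComponent(file.name)}` +
        `&chunkIndex=${i}&totalChunks=${chunks.length}`,
      { method: 'POST', body: chunks[i] }
    );
    if (!response.ok) throw new Error(`Chunk ${i} failed: ${response.status}`);
  }
}
```

Uploading sequentially keeps memory use flat; you could also upload a few chunks in parallel if the server tracks which indexes have arrived.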
Related
I need to find a way to upload multiple files asynchronously.
I want to build an SMTP client: the user can choose files, compose a message, and then send the mail.
I use AjaxFileUpload, but it creates a folder that looks like c:/windows/temp/_ajaxFileUpload/[a guid]/[uploaded files].
Is it possible to determine the GUID and then find the files to attach, without saving them to another folder?
This must happen after the user clicks the Send button.
One possible option is the HTML5 File API, which allows client-side file handling. Here is a tutorial on it: http://www.html5rocks.com/en/tutorials/file/dndfiles/
I have used it in conjunction with an MVC 4 C# application and successfully got it to POST (via AJAX) file contents to an API controller, which can then persist the file. While you don't mention whether you are using MVC, the same should be possible with any sort of service.
Hope this helps.
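The AJAX post described above can be sketched like this with FormData and fetch; the `/api/files` endpoint and the `file` field name are hypothetical, and on the server an MVC/Web API controller would read the multipart body.

```javascript
// Build a multipart form body from a File/Blob. The 'file' field name is a
// hypothetical example; it must match what the server-side controller reads.
function buildUploadForm(fileOrBlob, fileName) {
  const form = new FormData();
  form.append('file', fileOrBlob, fileName);
  return form;
}

// POST the form with fetch. Do not set Content-Type yourself: the browser
// fills in the multipart boundary automatically for a FormData body.
async function uploadFile(fileOrBlob, fileName) {
  const response = await fetch('/api/files', {
    method: 'POST',
    body: buildUploadForm(fileOrBlob, fileName),
  });
  if (!response.ok) throw new Error(`Upload failed: ${response.status}`);
}
```

For multiple files you can either append several fields to one FormData, or call `uploadFile` once per file and await them together.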
I have some files in a shared location/folder. I need to provide a link/path on a webpage; if a user clicks the link, they should be able to view the file.
I am using ASP.NET with C# (VS2010).
Is the above requirement possible?
If yes, please help me enable this feature.
Thank you in advance
It should be possible using something like file://server/share/path/to/file.txt. Firefox is a lot more tolerant than IE of how characters in filenames are encoded, so you may need to use Server.UrlEncode on the file path.
What I would do is write an ASHX generic handler that takes the requested file name as a URL parameter; the handler then fetches the file for you and streams it to the browser.
This has these benefits in my opinion:
It uses the HTTP protocol, not the FILE protocol.
You are keeping internal structures internal, without exposing them to the visitors.
You can implement access rights and other things since the files are streamed through your handler, and are not directly delivered by the web server (IIS).
If NTFS security permissions are an issue, you might use impersonation to fetch the files from your shared folder location. I wrote a small impersonator class some years ago to simplify this task.
I'm trying to write a desktop application in C# with the Google Maps API embedded in it. For this purpose, I embedded a web browser control in the application and use the Maps API through it. I can use the basic functionality of the Maps API inside the application.
My goal is to create a KML file on the fly and show it on the map. The KML files are successfully created from shapefiles using GDAL; I checked their validity by importing them into maps.google.com. The problem I'm facing is that it is not possible to show KML files from the local disk using the Maps API, and there is no way I can upload those KML files to a public server that Google can reach. I searched the web and found that geoxml3 could be used for this purpose, but I haven't been able to get it working. Since geoxml3 is subject to the same cross-domain download restrictions, I get an "Access denied" error when I try to parse a KML file on my local disk. How can I make sure my KML document is served from the same domain as the containing map page? I'm pretty new to JavaScript, so any help will be appreciated.
If there is another way of achieving what I'm trying to do (like importing the local KML file to Google's servers on the fly, etc.), please say so. Thank you in advance.
Ekin Gedik
The usual answer to cross-domain issues (at least before recent browsers with CORS, Cross-Origin Resource Sharing, which I have seen but not experimented with) is to use a proxy in the same domain as the web page. I'm not sure how that would work with a local file, though.
You should be able to access a local file from an application running on your local machine, but I'm not sure if that would comply with the terms of use other than for development.
Another option would be to pass the KML to geoxml3 as a string and use the parseKmlString method. That would avoid the XMLHttpRequest.
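A rough sketch of that idea, assuming the geoxml3 script and a Google Maps `map` object are already loaded on the page; the parser usage follows geoxml3's documented API, but treat the details as assumptions:

```javascript
// Light sanity check before handing a string to the parser.
function looksLikeKml(text) {
  return typeof text === 'string' && text.includes('<kml');
}

// Read a user-selected KML file into a string with FileReader, then pass it
// to geoxml3's parseKmlString, so no XMLHttpRequest (and therefore no
// cross-domain restriction) is involved. Assumes geoXML3 and `map` exist.
function showLocalKml(file, map) {
  const reader = new FileReader();
  reader.onload = () => {
    if (!looksLikeKml(reader.result)) return;
    const parser = new geoXML3.parser({ map: map });
    parser.parseKmlString(reader.result);
  };
  reader.readAsText(file);
}
```

In the embedded-browser scenario, the host C# application could also inject the KML string into the page directly instead of going through a file picker.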
There are a lot of JavaScript utilities these days for posting a file to the server in an AJAXified way. Is there any utility that allows streaming bytes to the client to download a file? Or does that still have to be a server-side solution? I'm using .NET.
Thanks.
There's no cross-browser support for accessing the client's filesystem. You could probably do it with Flash or Java, but a much cleaner solution would be to generate the file on the server and create a download link for the user.
Brian: what you said about JavaScript posting files to a server seems an incomplete statement. The only way JavaScript can post a file to a server is by having the user manually select the file he/she wants to upload...
To answer your question...
You should be able to issue some sort of AJAX call (to a web service, for example), have the web service read the file into a byte array, and return it to the client. On the client side, you would need to assemble the byte array. I assume you'll also have to set the appropriate response type on the web service call.
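A sketch of that client-side assembly, assuming the web service returns the file's bytes base64-encoded (that encoding choice is an assumption, not part of the original answer):

```javascript
// Decode a base64 payload, as a hypothetical web service might return it,
// into a byte array on the client.
function base64ToBytes(base64) {
  const binary = atob(base64);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
  }
  return bytes;
}

// In a browser, the assembled bytes can then be offered as a download via a
// Blob URL. (Guarded so the sketch also loads outside a browser.)
function saveBytesAsFile(bytes, fileName) {
  if (typeof document === 'undefined') return;
  const url = URL.createObjectURL(new Blob([bytes]));
  const a = document.createElement('a');
  a.href = url;
  a.download = fileName;
  a.click();
  URL.revokeObjectURL(url);
}
```

For large files this buffers everything in memory, which is one reason a plain server-generated download link is usually the cleaner option.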
This seems to be something like what you are looking for.
I need to let a company push information up to my site.
The best way to explain what I am talking about is to explain how it is currently done with their previous website:
This company uploads a CSV file to an FTP server set up by the website. The website then processes the CSV file and loads it into a SQL database so that it can be used by the website.
In this case, I am the website and I am working with the company. Both sides are willing to change what they do. So my question is...
What is the best way to accept batch information like this? Is there a more automated way that doesn't involve FTP? In the future I may have many companies wanting to do this, and I'd hate to have to set up accounts for each one.
The project is C#, ASP.NET, and MS SQL.
Let me know if you need more information...
Set up a web service to accept incoming data. That way you can validate immediately and reject bad data before it ever gets into your system.
If you want to eliminate FTP, you could allow them to upload files to your site using the FileUpload control. Once the file is uploaded, you can do your server-side processing.
EDIT: From the OP's comments this seems to be an automated process. That said, since their process generates the file, you could:
Allow them to continue their current process, which would involve generating their file and placing it somewhere it can be accessed via a URI with authentication; you could then fetch the file on a schedule and process it. From what it seems, right now they generate a file and upload it to your FTP server, so there is already a manual element to begin with.
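A minimal sketch of that pull-on-a-schedule idea in JavaScript (Node 18+); the partner URL, the Basic-auth credentials, and the naive CSV split are all assumptions, and a real feed would need a robust CSV parser plus proper validation before anything reaches the database:

```javascript
// Naive CSV parsing: good enough for a sketch, but it does not handle
// quoted fields or embedded commas.
function parseCsv(text) {
  return text
    .trim()
    .split(/\r?\n/)
    .map((line) => line.split(','));
}

// Pull the partner's file from an authenticated URI and process each row.
// The endpoint and Basic-auth credentials here are hypothetical.
async function pullBatchFile() {
  const response = await fetch('https://partner.example.com/batch.csv', {
    headers: { Authorization: 'Basic ' + btoa('user:password') },
  });
  if (!response.ok) throw new Error(`Fetch failed: ${response.status}`);
  const rows = parseCsv(await response.text());
  for (const row of rows) {
    // validate the row here, then insert it into the database
  }
  return rows;
}
```

Validating each row before insert is what gives this approach its main advantage over raw FTP drops: bad data can be rejected before it ever reaches your system.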