Passing potentially huge files in chunks to Web API - C#

I have to pass potentially huge files from an ASP.NET Core middle server to an ASP.NET backend.
I can't use the ASP.NET backend Web API directly; I have to go through an MVC controller.
Currently my middle server receives the file in chunks (and does some verification), saves it to disk, and once the upload completes it rereads the file in chunks to pass it forward.
Is there an easy way to pass the chunks on without buffering the whole file?
This is what I currently have:
MVC controller:
[HttpPost]
public ActionResult UploadChunk(IFormFile fileChunk, string chunkMetadata)
{
    if (!string.IsNullOrEmpty(chunkMetadata))
    {
        var metaDataObject = JsonConvert.DeserializeObject<ChunkMetadata>(chunkMetadata);
        ...
        AppendContentToFile(tempFilePath, fileChunk); // writes the file with FileMode.Append
    }
}
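What I have in mind instead is, roughly, to forward the chunk stream without writing it to disk first. This is just a sketch to illustrate the question, not working code; the injected _httpClient and the backend URL are placeholders:
[HttpPost]
public async Task<ActionResult> UploadChunk(IFormFile fileChunk, string chunkMetadata)
{
    if (string.IsNullOrEmpty(chunkMetadata))
        return BadRequest();

    // metadata verification as before
    var metaDataObject = JsonConvert.DeserializeObject<ChunkMetadata>(chunkMetadata);

    // forward the incoming chunk stream directly instead of appending it to a temp file
    using var content = new MultipartFormDataContent();
    using var chunkStream = fileChunk.OpenReadStream();
    content.Add(new StreamContent(chunkStream), "fileChunk", fileChunk.FileName);
    content.Add(new StringContent(chunkMetadata), "chunkMetadata");

    // _httpClient and the target URI are placeholders for the real backend endpoint
    var response = await _httpClient.PostAsync("http://backend/service/filestore/v1/upload?fileUri=...", content);
    return StatusCode((int)response.StatusCode);
}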
My upload to the backend [Edit]:
public IHttpActionResult FileUpload(string fileUri)
{
    try
    {
        if (Request.Content.IsMimeMultipartContent())
        {
            var configProvider = Resolve<IApplicationConfigurationProvider>();
            var uploadRootDir = configProvider.TemporaryFileUploadPath;
            var streamProvider = new MultipartStreamProvider(uploadRootDir);
            // If the file is huge and is not split into chunks, the 'ReadAsMultipartAsync' call
            // takes as long as copying the full file
            var readResult = Request.Content.ReadAsMultipartAsync(streamProvider).Result;
            var fileSvc = Resolve<IFileService>();
            string targetFilePath = string.Empty;
            foreach (MultipartFileData fileData in streamProvider.FileData)
            {
                ContentDispositionHeaderValue contentDisposition = fileData.Headers.ContentDisposition;
                string fileName = contentDisposition.FileName;
                if (!GetFileName(fileName, out var targetFileName))
                {
                    return BadRequest($"ContentDisposition.FileName must match 'file' of URI-query! Actual: {targetFileName}");
                }
                var rawSourceFileInfo = new FileInfo(targetFileName);
                if (contentDisposition.Size.HasValue && contentDisposition.Size.Value > 0)
                {
                    if (!fileSvc.CreateNewFilePath(fileUri, new PathOptions(true), out var validFileFullPath))
                    {
                        return BadRequest($"Unable to create physical-path from fileId='{fileUri}'");
                    }
                    targetFilePath = validFileFullPath.FullName;
                    fileSvc.AddChunk(validFileFullPath.FullName, contentDisposition.Size.Value, fileData.LocalFileName);
                }
                else
                {
                    return BadRequest("File upload must set a valid file-length in ContentDisposition");
                }
            }
            return Ok(targetFilePath);
        }
        else
        {
            return BadRequest("File upload must be a 'IsMimeMultipartContent'");
        }
    }
    catch (Exception error)
    {
        LogError(error, "FileUpload");
        return InternalServerError(error);
    }
}
Thanks in advance for any help!
[Edit]
My non-working call from the client to the backend:
<script>
    $(document).ready(function (e) {
        $("#uploadImage").on('change', (function (e) {
            // append file input to form data
            var fileInput = document.getElementById('uploadImage');
            var file = fileInput.files[0];
            var formData = new FormData();
            formData.append('uploadImage', file);
            $.ajax({
                url: "http://localhost/service/filestore/v1/upload?fileUri=someText",
                type: "POST",
                data: formData,
                contentType: false,
                cache: false,
                processData: false,
                success: function (data) {
                    if (data == 'invalid') {
                        // invalid file format.
                        $("#err").html("Invalid File !").fadeIn();
                    }
                    else {
                        // view uploaded file.
                        $("#preview").html(data).fadeIn();
                        $("#form")[0].reset();
                    }
                },
                error: function (e) {
                    $("#err").html(e).fadeIn();
                }
            });
        }));
    });
</script>

Related

Uploading and accessing images in an ASP.NET API folder using Angular

I have an Angular project as my client side and a .NET 6 Web API project as my backend. I am still new to both technologies. I am creating a website with a feature I have been trying to add without success so far: I want to upload images into an images folder of the .NET Web API project from Angular, and later access those images from the Angular project. I also want to store the paths of the image files in the database. I have searched for example code on the internet without success. Your assistance will be appreciated.
First, submit your files from the frontend with FormData:
postWithFile(url: string, obj: any, files: File[]) {
    let cloneHeader: any = {};
    let options: any = {
        headers: new HttpHeaders(cloneHeader),
        observe: 'response',
        responseType: 'json'
    };
    let formData: FormData = new FormData();
    if (typeof obj == 'object') { // obj is external submit data
        formData.append('data', JSON.stringify(obj));
    } else {
        formData.append('data', obj);
    }
    if (files && files.length > 0) {
        files.forEach((ds, index) => {
            formData.append('file_' + index, ds, ds.name);
        });
    }
    return this._http
        .post(this.host + url, formData, options)
        .pipe(map((res: any) => {
            return res.body;
        }));
}
Then on the backend, handle the request with HttpContext.Current.Request.Files, save the images to the server, and store the image paths in the database:
[HttpPost]
public ResponseMessage<bool?> UploadImages()
{
    var response = new ResponseMessage<bool?>();
    try
    {
        if (!Request.Content.IsMimeMultipartContent())
            throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);
        ExternalDataModel model = MessageConvert.DeserializeObject<ExternalDataModel>(HttpContext.Current.Request["data"]); // obj in frontend
        //
        List<string> listImages = new List<string>();
        DateTime now = DateTime.Now;
        string buildPath = $"{string.Format("{0:00}", now.Year)}\\{string.Format("{0:00}", now.Month)}\\{string.Format("{0:00}", now.Day)}"; // change to your folder path
        foreach (string file in HttpContext.Current.Request.Files)
        {
            var fileContent = HttpContext.Current.Request.Files[file];
            int fileLength = fileContent.ContentLength;
            if (fileContent != null && fileLength > 0)
            {
                var stream = fileContent.InputStream;
                byte[] imgByteArray;
                using (MemoryStream memoryStream = new MemoryStream())
                {
                    stream.CopyTo(memoryStream);
                    imgByteArray = memoryStream.ToArray();
                }
                string fileName = $"format_file_name_if_need_{fileContent.FileName}";
                string RelativeFolder = $"{buildPath}";
                string AbsoluteFolder = Path.Combine("FOLDER_IN_SERVER_FULL_PATH", RelativeFolder);
                if (!Directory.Exists(AbsoluteFolder))
                {
                    Directory.CreateDirectory(AbsoluteFolder);
                }
                string pathSave = Path.Combine(RelativeFolder, fileName);
                FileHelper.SaveFileFromBinaryArray(pathSave, imgByteArray);
                listImages.Add(pathSave);
            }
        }
        // model.listImages = listImages; // assign to model to save to DB
        //
        // var data = _bus.uploadImage(model);
        // if (data)
        // {
        //     response.Data = true;
        //     response.MessageCode = MessageCodes.UpdateSuccessfully;
        // }
    }
    catch (Exception ex)
    {
        response.MessageCode = ex.Message;
    }
    return response;
}

.NET Web API Blueimp multiple file upload error "Unexpected end of MIME multipart stream. MIME multipart message is not complete."

I am building a video management app that allows for uploading multiple videos to Azure Storage which is then encoded by Azure Media Services.
My issue is that if I upload just 1 file at a time with blueimp, everything works fine. When I add more than one file to the upload, I get the error on the second file.
Unexpected end of MIME multipart stream. MIME multipart message is not complete.
I have read that it could be due to the stream missing the end-of-file terminator, so I added the suggested tweak to append the line terminator (per this article: ASP.NET Web API, unexpected end of MIME multi-part stream when uploading from Flex FileReference), with no luck.
If I post the files individually (by iterating over the files selected for upload and sending them as separate posts), it works. My issue is that I want to select several files, add additional metadata, and hit one submit button. When I do it this way, the first file uploads, the second one appears to start uploading, and then I get the error 500 "Unexpected end of MIME multipart stream. MIME multipart message is not complete".
Here is my upload code (Web API):
[HttpPost]
public async Task<HttpResponseMessage> UploadMedia()
{
    HttpResponseMessage result = null;
    var httpRequest = HttpContext.Current.Request;
    if (httpRequest.Headers["content-type"] != null)
    {
        httpRequest.Headers.Remove("content-type");
    }
    httpRequest.Headers.Add("enctype", "multipart/form-data");
    if (httpRequest.Files.Count > 0)
    {
        var docfiles = new List<string>();
        foreach (string file in httpRequest.Files)
        {
            var postedFile = httpRequest.Files[file];
            var filePath = HttpContext.Current.Server.MapPath("~/" + postedFile.FileName);
            string assignedSectionList = string.Empty;
            postedFile.SaveAs(filePath);
            docfiles.Add(filePath);
            string random = Helpers.Helper.RandomDigits(10).ToString();
            string ext = System.IO.Path.GetExtension(filePath);
            string newFileName = (random + ext).ToLower();
            MediaType mediaType = MediaType.Video;
            if (newFileName.Contains(".mp3"))
            {
                mediaType = MediaType.Audio;
            }
            if (httpRequest.Form["sectionList"] != null)
            {
                assignedSectionList = httpRequest.Form["sectionList"];
            }
            MediaUploadQueue mediaUploadQueueItem = new MediaUploadQueue();
            mediaUploadQueueItem.OriginalFileName = postedFile.FileName;
            mediaUploadQueueItem.FileName = newFileName;
            mediaUploadQueueItem.UploadedDateTime = DateTime.UtcNow;
            mediaUploadQueueItem.LastUpdatedDateTime = DateTime.UtcNow;
            mediaUploadQueueItem.Status = "pending";
            mediaUploadQueueItem.Size = postedFile.ContentLength;
            mediaUploadQueueItem.Identifier = random;
            mediaUploadQueueItem.MediaType = mediaType;
            mediaUploadQueueItem.AssignedSectionList = assignedSectionList;
            db.MediaUploadQueue.Add(mediaUploadQueueItem);
            db.SaveChanges();
            byte[] chunk = new byte[httpRequest.ContentLength];
            httpRequest.InputStream.Read(chunk, 0, Convert.ToInt32(httpRequest.ContentLength));
            var provider = new AzureStorageMultipartFormDataStreamProviderNoMod(new AzureMediaServicesHelper().container);
            provider.fileNameOverride = newFileName;
            await Request.Content.ReadAsMultipartAsync(provider); // this uploads it to the storage account
            //AzureMediaServicesHelper amsHelper = new AzureMediaServicesHelper();
            string assetId = amsHelper.CommitAsset(mediaUploadQueueItem); // begin the process of encoding the file
            mediaUploadQueueItem.AssetId = assetId;
            db.SaveChanges();
            // start the encoding
            amsHelper.EncodeAsset(assetId);
        }
        result = Request.CreateResponse(HttpStatusCode.Created, docfiles);
    }
    else
    {
        result = Request.CreateResponse(HttpStatusCode.BadRequest);
    }
    return result;
}
Here is the code for the upload handler that sends the files to Azure Blob Storage:
public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
{
    if (parent == null) throw new ArgumentNullException(nameof(parent));
    if (headers == null) throw new ArgumentNullException(nameof(headers));
    if (!_supportedMimeTypes.Contains(headers.ContentType.ToString().ToLower()))
    {
        throw new NotSupportedException("Only jpeg and png are supported");
    }
    // Generate a new filename for every new blob
    var fileName = Guid.NewGuid().ToString();
    if (!String.IsNullOrEmpty(fileNameOverride))
        fileName = fileNameOverride;
    CloudBlockBlob blob = _blobContainer.GetBlockBlobReference(fileName);
    if (headers.ContentType != null)
    {
        // Set appropriate content type for your uploaded file
        blob.Properties.ContentType = headers.ContentType.MediaType;
    }
    this.FileData.Add(new MultipartFileData(headers, blob.Name));
    return blob.OpenWrite();
}
Here is the JavaScript code. The first version sends the files individually as separate posts; it works.
$("#fileupload").fileupload({
autoUpload: false,
dataType: "json",
add: function (e, data) {
data.context = $('<p class="file">')
.append($('<a target="_blank">').text(data.files[0].name))
.appendTo(document.body);
data.submit();
},
progress: function (e, data) {
var progress = parseInt((data.loaded / data.total) * 100, 10);
data.context.css("background-position-x", 100 - progress + "%");
},
done: function (e, data) {
data.context
.addClass("done")
.find("a")
.prop("href", data.result.files[0].url);
}
});
The code below does not work. It pushes all the files into an array and sends them in one single post. It fails on the second file; if I upload just one file using this code, it works.
var filesList = new Array();
$(function () {
    $('#fileupload').fileupload({
        autoUpload: false,
        dropZone: $('#dropzone'),
        add: function (e, data) {
            filesList.push(data.files[0]);
            data.context = $('<div class="file"/>', { class: 'thumbnail pull-left' }).appendTo('#files');
            var node = $('<p />').append($('<span/>').text(data.files[0].name).data(data));
            node.appendTo(data.context);
        },
        progress: function (e, data) { // Still working on this part
            //var progress = parseInt((data.loaded / data.total) * 100, 10);
            //data.context.css("background-position-x", 100 - progress + "%");
        },
    }).on('fileuploadprocessalways', function (e, data) {
        var index = data.index,
            file = data.files[index],
            node = $(data.context.children()[index]);
        if (file.preview) {
            node.prepend('<br>').prepend(file.preview);
        }
        if (file.error) {
            node.append('<br>').append($('<span class="text-danger"/>').text(file.error));
        }
    }).prop('disabled', !$.support.fileInput)
        .parent().addClass($.support.fileInput ? undefined : 'disabled');
    $("#uploadform").submit(function (event) {
        if (filesList.length > 0) {
            console.log("multi file submit");
            event.preventDefault();
            $('#fileupload').fileupload('send', { files: filesList })
                .success(function (result, textStatus, jqXHR) { console.log('success'); })
                .error(function (jqXHR, textStatus, errorThrown) { console.log('error'); })
                .complete(function (result, textStatus, jqXHR) {
                    console.log('complete: ' + JSON.stringify(result)); // The error 500 is returned here. Fiddler also shows an error 500. If I try to trap it in Visual Studio, I can't seem to pinpoint the exception.
                    // window.location='back to view-page after submit?'
                });
        } else {
            console.log("plain default form submit");
        }
    });
});
Any thoughts on why this would be happening? I have tried every approach I can think of with no luck. Thank you in advance!
I want to point out that the architecture of your code might cause timeouts or errors.
I would first upload everything to Azure Storage and store the status in a cache or database.
Then I would fire a background job (Hangfire, Azure Functions, WebJobs) to process the upload to Media Services and do the rest of the work.
I would suggest doing this asynchronously from the user input.
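As a rough illustration of that shape, here is a minimal sketch under assumptions: Hangfire is configured, and EncodePendingAsset is a hypothetical method that wraps the CommitAsset/EncodeAsset calls from the question.
[HttpPost]
public async Task<HttpResponseMessage> UploadMedia()
{
    // 1. Read the multipart body straight into blob storage, as in the question's provider.
    var provider = new AzureStorageMultipartFormDataStreamProviderNoMod(new AzureMediaServicesHelper().container);
    await Request.Content.ReadAsMultipartAsync(provider);

    // 2. Persist a "pending" MediaUploadQueue row so the status can be tracked
    //    (same as the question's code, omitted here).

    // 3. Hand the slow Media Services work to a background job and return immediately.
    //    EncodePendingAsset is a hypothetical method that would call CommitAsset/EncodeAsset.
    BackgroundJob.Enqueue<AzureMediaServicesHelper>(h => h.EncodePendingAsset(/* queue item id */ 0));

    return Request.CreateResponse(HttpStatusCode.Accepted);
}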
As per the Dropzone documentation, make sure you add a name attribute in the HTML tag:
<form action="/file-upload" class="dropzone">
    <div class="fallback">
        <input name="file" type="file" multiple />
    </div>
</form>
If you are doing it programmatically:
function param() {
    return "files";
}
Dropzone.options.myDropzone = {
    uploadMultiple: true,
    paramName: param,
};
On the backend you need to add \r\n after each stream:
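A minimal sketch of that CRLF workaround (assuming classic .NET Framework Web API, with System.IO, System.Text, System.Net.Http and System.Threading.Tasks in scope; the helper name is made up): copy the request body into a seekable stream, append a trailing "\r\n", and parse the patched content instead of Request.Content.
private static async Task<MultipartMemoryStreamProvider> ReadWithTrailingCrlfAsync(HttpContent original)
{
    // buffer the incoming body so we can append the missing line terminator
    var buffer = new MemoryStream();
    await original.CopyToAsync(buffer);

    byte[] crlf = Encoding.UTF8.GetBytes("\r\n");
    buffer.Write(crlf, 0, crlf.Length);
    buffer.Position = 0;

    // rebuild the content with the original headers so the multipart boundary is preserved
    var patched = new StreamContent(buffer);
    foreach (var header in original.Headers)
    {
        patched.Headers.TryAddWithoutValidation(header.Key, header.Value);
    }

    return await patched.ReadAsMultipartAsync(new MultipartMemoryStreamProvider());
}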

Open file from AJAX response in MVC C#

I want to open a file from an AJAX response; here is the code. The AJAX call's response contains a PDF file.
I want to open the file in a new browser tab. I am using the MVC framework.
function ViewPDF(key) {
    $.ajax({
        url: '#Url.Action("OpenDocument", "DocumentApproveUser")',
        type: "POST",
        data: { "key": key },
        async: true,
        cache: false,
        success: function (data, status, xhr) {
            alert(data);
            window.open(data);
            if (xhr.getResponseHeader("Forcefullylogin") == "true") {
                var url = "/Login/Login";
                window.location.href = url;
            }
            else {
            }
        },
        error: function (error) {
            $("#divLoading").hide();
            if (error.getResponseHeader("Forcefullylogin") == "true") {
                var url = '#Url.Action("Login", "Login")';
                window.location.href = url;
            }
            else {
                alert('Something went wrong in the system. Please try again later, or contact the system administrator.');
            }
        }
    });
}
Server code:
Below is the code of my controller. This code returns a PDF file as the AJAX response.
I want to open that response in my browser.
[HttpPost]
public ActionResult OpenDocument(string key)
{
    try
    {
        int Id = 0;
        try
        {
            byte[] data = Convert.FromBase64String(key);
            string decodedString = System.Text.Encoding.UTF8.GetString(data);
            if (!(String.IsNullOrEmpty(decodedString)))
                Id = Convert.ToInt32(decodedString);
        }
        catch (Exception ex)
        {
            ViewBag.ErrorName = "An error occurred while opening the document.";
            base.ErrorLogger.Error("***OpenDocument***", ex);
            return null;
        }
        DocumentApproveViewModel vm = new DocumentApproveViewModel();
        vm.DocumentsApprovalModel = DocumentApproveViewModel.GetDocTransactionModelList(_repo.GetAll());
        DocumentApprovalModel lst;
        lst = (from x in vm.DocumentsApprovalModel where x.Id.Equals(Id) select x).FirstOrDefault();
        base.Logger.InfoFormat("User : '{0}' going to access pdf document at {1} ", SessionFactory.CurrentUser.Name, System.DateTime.Now);
        /////////////////////////////////////////////////////////////////////
        ICollection<PasswordManagementViewModel> passwordList = null;
        PasswordManagementViewModel password = null;
        passwordList = PasswordManagementViewModel.GetSystemEncryptionKeyList(_encryption.GetAll());
        password = passwordList.OrderByDescending(x => x.CreatedDateTime).FirstOrDefault();
        string decryptPassword = Base64EncodeDecode.Decrypt(password.EncryptionKey, true);
        /////////////////////////////////////////////////////////////////////
        // Inheriting logic from the PDFSharpUtil class.
        byte[] PdfFileByte = _docSecurity.OpenPdfFile(lst.File, decryptPassword, SessionFactory.CurrentUser.EncryptionKey, SessionFactory.CurrentUser.Name, lst.DocumentTransactionName, false, SessionFactory.PdfViewCount);
        // Added logic for adding data into document history
        DocumentHistory objDocumentHistory = new DocumentHistory();
        objDocumentHistory.SentTo = null;
        objDocumentHistory.Timestamp = DateTime.UtcNow;
        objDocumentHistory.ActionPerformedBy = SessionFactory.CurrentUser.Id;
        objDocumentHistory.Action = EDocumentAction.View;
        objDocumentHistory.DocumentId = Id;
        _docHistoryRepo.Add(objDocumentHistory);
        // Increment view count so the password is not requested again from the second attempt to open the PDF file
        SessionFactory.PdfViewCount++;
        return File(PdfFileByte, "application/pdf");
    }
    catch (Exception ex)
    {
        ViewBag.ErrorName = "An error occurred while opening the document.";
        base.ErrorLogger.Error("***OpenDocument :: DocumentView***", ex);
    }
    return null;
}
Do not try to use AJAX to download the file. Just open the URL in a new browser tab. Based on your browser settings, it will either open in the tab or ask whether you want to save it.
You can set the new URL on window.location.href:
function ViewPDF(key)
{
    var url = '#Url.Action("OpenDocument", "DocumentApproveUser")?key=' + key;
    window.location.href = url;
}
Based on the browser settings, the above two approaches will either ask the user whether they wish to download or open the file, or simply download/open it. If you prefer to show the file content directly in the browser, you can send a file stream to the browser.
Here is a quick example, which reads the PDF from disk in the Content/Downloads directory in the app root and returns the file stream:
public ActionResult View(int id)
{
    var pathToTheFile = Server.MapPath("~/Content/Downloads/sampleFile.pdf");
    var fileStream = new FileStream(pathToTheFile,
                                    FileMode.Open,
                                    FileAccess.Read);
    return new FileStreamResult(fileStream, "application/pdf");
}

How does ReadAsMultipartAsync actually work?

I have a canvas element in a web site that I'm uploading using the FormData API. Here's how I do it:
upload: function (e) {
    var file = this.imagePreview.ui.canvas.get(0).toDataURL("image/jpeg")
        .replace('data:image/jpeg;base64,', '');
    if (file) {
        var formData = new FormData();
        formData.append('file', file);
        $.ajax({
            url: app.getApiRoot + 'UserFiles/',
            type: "post",
            data: formData,
            processData: false,
            contentType: false,
            error: function () {
                $("#file_upload_result").html('there was an error while submitting');
            }
        });
    }
}
where I'm replacing the whole data:image/jpeg;base64, business as per this post.
On the backend I have the following multipart controller:
public async Task<IHttpActionResult> PostUserFile()
{
    string imageDir = ConfigurationManager.AppSettings["UploadedImageDir"];
    var PATH = HttpContext.Current.Server.MapPath("~/" + imageDir);
    var rootUrl = Request.RequestUri.AbsoluteUri.Replace(Request.RequestUri.AbsolutePath, String.Empty);
    if (Request.Content.IsMimeMultipartContent())
    {
        var streamProvider = new CustomMultipartFormDataStreamProvider(PATH);
        await Request.Content.ReadAsMultipartAsync(streamProvider).ContinueWith(t =>
        {
            if (t.IsFaulted || t.IsCanceled)
            {
                throw new HttpResponseException(HttpStatusCode.InternalServerError);
            }
            var files = streamProvider.FileData.Select(i =>
            {
                var info = new FileInfo(i.LocalFileName);
                return new UserFile(User.Identity.GetUserId(), info.Name, rootUrl + "/" + imageDir + "/" + info.Name, info.Length / 1024);
            });
            db.UserFiles.AddRange(files);
            db.SaveChangesAsync();
        });
        return Ok();
    }
    else
    {
        throw new HttpResponseException(Request.CreateResponse(HttpStatusCode.NotAcceptable, "This request is not properly formatted"));
    }
}
Where I implement a CustomMultipartFormDataStreamProvider as follows:
public class CustomMultipartFormDataStreamProvider : MultipartFormDataStreamProvider
{
    public CustomMultipartFormDataStreamProvider(string path)
        : base(path)
    { }

    public override string GetLocalFileName(Headers.HttpContentHeaders headers)
    {
        var name = !string.IsNullOrWhiteSpace(headers.ContentDisposition.FileName) ?
            headers.ContentDisposition.FileName :
            "NoName";
        // This is here because Chrome submits files in quotation
        // marks which get treated as part of the filename and get escaped
        return name.Replace("\"\"", string.Empty);
    }
}
Sadly, the FileData on the streamProvider fed into the ReadAsMultipartAsync method is empty.
Another thing I find interesting is that the uploaded binary file is escaped, as I can see in the watch window:
%2f is the escape sequence for the / character.
I simply can't pinpoint the problem. Does anyone have any suggestions?
Mixing ContinueWith and async/await leads to problems. Just do await Request.Content.ReadAsMultipartAsync(streamProvider); and everything in the .ContinueWith then goes after the await like normal code.
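As a rough sketch of that rewrite, reusing the names from the question's code:
// await the multipart read directly, then run the follow-up work inline
var streamProvider = new CustomMultipartFormDataStreamProvider(PATH);
await Request.Content.ReadAsMultipartAsync(streamProvider);

var files = streamProvider.FileData.Select(i =>
{
    var info = new FileInfo(i.LocalFileName);
    return new UserFile(User.Identity.GetUserId(), info.Name,
        rootUrl + "/" + imageDir + "/" + info.Name, info.Length / 1024);
});

db.UserFiles.AddRange(files);
// awaiting the save as well, instead of firing and forgetting it
await db.SaveChangesAsync();
return Ok();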
Or remove async/await and return the task chain directly:
return Request.Content.ReadAsMultipartAsync(streamProvider).ContinueWith(...);
You can also look at this link: Sending HTML Form Data in ASP.NET Web API: File Upload and Multipart MIME

Stop Execution of HTTPPOST ActionMethod in MVC

I have a web application in the MVC framework.
I have a situation where I have to let the user export some data to a CSV file.
For that I have written the following code:
[HttpPost]
public ActionResult ExportReportToFile(ReportCriteriaViewModels posdata, string name)
{
    string strQuery = GetReportQuery(posdata, name);
    IEnumerable<REP_MM_DEMOGRAPHIC_CC> lstDemographics = ReportDataAccess.GetReportData<REP_MM_DEMOGRAPHIC_CC>(strQuery);
    if (lstDemographics.Count() > 0)
        return new CsvActionResult<REP_MM_DEMOGRAPHIC_CC>(lstDemographics.ToList(), "LISDataExport.csv");
    else
        return View(posdata);
}
The above code works fine: if the count in lstDemographics is greater than zero it returns a file to download, and if I don't get any records in lstDemographics it returns the view.
My problem is that when I don't get any result in lstDemographics, I don't want to return the view, because it refreshes the whole page. Is there a way to stop execution of the action method so the browser doesn't refresh the view and stays as it is?
Thanks.
You will have to make an AJAX call to stop the page refresh.
To achieve the file export, we actually broke the process into two AJAX calls. The first call sends a request to the server; the server prepares a file and stores it in a temp table. The server returns the file name to the AJAX call if there is data. If there is no data, or an error, it returns a JSON result to indicate the failure.
On success, the view makes another AJAX request to download the file, passing the file name.
Something like this:
[Audit(ActionName = "ExportDriverFile")]
public ActionResult ExportDriverFile(int searchId, string exportType, string exportFormat)
{
    var user = GetUser();
    var driverSearchCriteria = driverSearchCriteriaService.GetDriverSearchCriteria(searchId);
    var fileName = exportType + "_" + driverSearchCriteria.SearchType + "_" + User.Identity.Name.Split('#')[0] + "." + exportFormat;
    //TempData["ExportBytes_" + fileName] = null;
    _searchService.DeleteTempStore(searchId);
    var exportBytes = exportService.ExportDriverFileStream(driverSearchCriteria, searchId, exportType, exportFormat, user.DownloadCode, user.OrganizationId);
    if (exportBytes != null)
    {
        var tempStore = new TempStore
        {
            SearchId = searchId,
            FileName = fileName,
            ExportFormat = exportFormat,
            ExportType = exportType,
            DataAsBlob = exportBytes
        };
        var obj = _searchService.AddTempStore(tempStore);
        return Json(fileName);
    }
    else
    {
        return Json("failed");
    }
}

[HttpGet]
public ActionResult DownloadStream(string fileName, int searchId)
{
    var tempStore = _searchService.GetTempStore(searchId);
    var bytes = tempStore.DataAsBlob;
    if (bytes != null)
    {
        var stream = new MemoryStream(bytes);
        // TempData["ExportBytes_" + fileName] = null;
        _searchService.DeleteTempStore(searchId);
        return File(stream, "application/vnd.ms-excel", fileName);
    }
    _logger.Log("Export/DownloadStream request failed", LogCategory.Error);
    return Json("Failed");
}
On the client side, we do something like this:
function ExportData(exportType, exportFormat) {
    var searchId = document.getElementById('hfSelectedDriverId').value;
    var model = { searchId: searchId, exportType: exportType, exportFormat: exportFormat };
    //$('div[class=ajax_overlay]').remove();
    //alert("The file will be downloaded in few minutes..");
    $.ajax({
        url: '#Url.Action("ExportDriverFile", "Export")',
        contentType: 'application/json; charset=utf-8',
        type: 'POST',
        dataType: 'html',
        data: JSON.stringify(model)
    })
    .success(function (result) {
        result = result.toString().replace(/"/gi, "");
        if (result == "" || result == "failed") {
            alert("File download failed. Please try again!");
        }
        else {
            window.location = '/Export/DownloadStream?fileName=' + result + "&searchId=" + searchId;
        }
    })
    .error(function (xhr, status) {
        //
        //alert(status);
    });
    //$('div[class=ajax_overlay]').remove();
}
You should create a JavaScript function with the $.getJSON method.
On the controller side you just have to check: if you get data from the database, return the file path; otherwise return a message.
Your JS code should be something like this:
$.getJSON(url)
    .done(function (data) {
        if (data.filePath) // if there was data, filePath is filled
            window.location = data.filePath;
        else
            alert(data.msg);
    });
And in the controller you can create an HTTP GET action method that returns JSON data, like:
return Json(new { msg = "No data found" }, JsonRequestBehavior.AllowGet);
Based on the condition, you can simply replace msg with filePath.
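A minimal sketch of that conditional (GetReportRows and WriteCsvToTempFolder are hypothetical placeholders for your own data access and CSV-writing logic):
[HttpGet]
public ActionResult ExportReportJson(int reportId)
{
    // GetReportRows is a hypothetical data-access call
    var rows = GetReportRows(reportId);
    if (rows == null || !rows.Any())
        return Json(new { msg = "No data found" }, JsonRequestBehavior.AllowGet);

    // WriteCsvToTempFolder is a hypothetical helper that writes the CSV
    // and returns a URL the client can navigate to
    var filePath = WriteCsvToTempFolder(rows);
    return Json(new { filePath = filePath }, JsonRequestBehavior.AllowGet);
}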
