I am trying to output a CSV file using an endpoint on a service in ServiceStack, using the HttpResult class.
The CSV string itself is being constructed via StringWriter and CsvHelper.
If the content type is set to "text/plain", the text displays fine in the browser when the endpoint URL is hit. However, if it is set to "text/csv", a CSV file is generated, but the information inside it is not correct.
For example:
Expected output:
Header 1, Header 2, Header 3
Actual output:
H,e,a,d,e,r, ,1,,, ,H,e,a,d,e,r, ,2,,, ,H,e,a,d,e,r, 3,"
Is there something I'm possibly missing?
Also, on a side note, how do I set the file name for the file itself? It appears I have to use HttpHeaders.ContentDisposition, but when I tried to set it I got an error along the lines of having multiple header elements.
EDIT: Sorry, I forgot to include the code snippet.
string response = string.Empty;
using (var writer = new StringWriter())
{
    using (var csv = new CsvWriter(writer))
    {
        csv.WriteHeader<TestClass>();
        foreach (var element in elements)
        {
            csv.WriteField(element.Header1);
            csv.WriteField(element.Header2);
            csv.WriteField(element.Header3);
            csv.NextRecord();
        }
    }
    // apparently double quotes can cause the rendered CSV to go wrong in some parts, so I added this while trying to fix it
    response = writer.ToString().Replace("\"", "");
}
return new HttpResult(response)
{
    StatusCode = HttpStatusCode.OK,
    ContentType = "text/csv"
};
And the info on TestClass:
public class TestClass
{
    public string Header1 { get; set; }
    public string Header2 { get; set; }
    public string Header3 { get; set; }
}
From your description, your HttpResult response is likely being serialized by the built-in CSV Format.
If you're not using it you can remove it with:
Plugins.RemoveAll(x => x is CsvFormat);
Otherwise, if you are using it, you can circumvent its serialization by writing the CSV file in your Service implementation, e.g.:
public class MyCsv : IReturn<string> {}

public async Task Any(MyCsv request)
{
    var file = base.VirtualFileSources.GetFile("path/to/my.csv");
    if (file == null)
        throw HttpError.NotFound("no csv here");

    Response.ContentType = MimeTypes.Csv;
    Response.AddHeader(HttpHeaders.ContentDisposition,
        $"attachment; filename=\"{file.Name}\";");

    using (var stream = file.OpenRead())
    {
        await stream.CopyToAsync(Response.OutputStream);
        await Response.OutputStream.FlushAsync();
        Response.EndRequest(skipHeaders: true);
    }
}
Edit: since you're returning a raw CSV string, you can write it to the response with:
Response.ContentType = MimeTypes.Csv;
await Response.WriteAsync(response);
Response.EndRequest(skipHeaders:true);
I have a JSON file that is read locally and used in my program. Now I want the local JSON to be checked against the JSON from the API URL, and if it is not equal to the local JSON file, the remote file should be downloaded. The biggest problem is that the code inside the two if statements in the check never executes! What is wrong with my code for this check?
This is what I tried to use
public static rootObject LoadJsonLocal()
{
    rootObject rootObject = new rootObject();
    var path = pathToJson();
    string file;
    using (StreamReader r = new StreamReader(path))
    {
        file = r.ReadToEnd();
        rootObject = JsonSerializer.Deserialize<rootObject>(file);
    }
    return rootObject;
}

public static string pathToJson()
{
    string extractPath = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);
    string path = extractPath + "/" + "getlatest.json";
    return path;
}
public static async Task UpdateOrDownloadJson()
{
    try
    {
        string str = "url";
        WebClient webClient = new WebClient();
        webClient.Headers.Add("Authorization", await Header.getAuthorizationHeader());
        string jsonString = webClient.DownloadString(str);
        if (jsonString.Equals(LoadJsonLocal()))
        {
            Console.WriteLine("Json is equal");
        }
        // json is not equal, download it
        if (!jsonString.Equals(LoadJsonLocal()))
        {
            await Downloader.DownloadJson();
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.ToString());
    }
}
You are comparing the downloaded JSON string to the local JSON that has already been deserialized into object form, so they are never going to compare equal. You should override the Equals() method on your JSON model object (you named it rootObject) so that the comparison is proper: between the local JSON deserialized into object form and the downloaded string deserialized into object form.
To use Equals() or to implement a custom equality comparer, refer to this SO answer.
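As a minimal sketch of what that looks like, here both sides are brought into object form before comparing. The Version property and the JSON string below are placeholders for illustration, not the asker's real model:

```csharp
using System;
using System.Text.Json;

// Stand-in for the question's model; override Equals so two deserialized
// instances with the same data compare equal.
public class rootObject
{
    public string Version { get; set; }

    public override bool Equals(object obj) =>
        obj is rootObject other && Version == other.Version;

    public override int GetHashCode() => Version?.GetHashCode() ?? 0;
}

class Program
{
    static void Main()
    {
        string downloaded = "{\"Version\":\"1.0\"}";
        rootObject local = new rootObject { Version = "1.0" };

        // Deserialize the downloaded string so both sides are objects.
        rootObject remote = JsonSerializer.Deserialize<rootObject>(downloaded);

        Console.WriteLine(local.Equals(remote)); // objects now compare as intended
    }
}
```

With this in place, the check in UpdateOrDownloadJson() can compare `LoadJsonLocal()` against the deserialized download instead of a raw string against an object.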
I'm having trouble using StringWriter in our application.
I make a REST call against a NoSQL DB and it returns a list of dynamics.
I use StringWriter to write a CSV file that contains a header and the records from my list.
I also tried to extend StringWriter with a sealed class whose constructor lets you pass the encoding as a parameter. But trying all the available encodings, it still generates wrong characters.
This is our extension of StringWriter:
public sealed class StringWriterWithEncoding : StringWriter
{
    private readonly Encoding encoding;

    public StringWriterWithEncoding() : this(Encoding.UTF8) { }

    public StringWriterWithEncoding(Encoding encoding)
    {
        this.encoding = encoding;
    }

    public override Encoding Encoding
    {
        get { return encoding; }
    }
}
and this is the code that generates the CSV file:
StringWriterWithEncoding sw = new StringWriterWithEncoding();

// Header
sw.WriteLine(string.Format("{0};{1};{2};{3};{4};{5};{6};{7};{8};{9};",
    "Soddisfazione", "Data Ricerca", "Categorie Cercate", "Id Utente", "Utente",
    "Categoria", "Id Documento", "Documento", "Id Sessione", "Testo Ricerca"));

foreach (var item in result.modelListDyn)
{
    sw.WriteLine(string.Format("{0};{1};{2};{3};{4};{5};{6};{7};{8};{9};",
        item.Satisfaction, item.Date, item.Cluster, item.UserId, item.Username,
        item.Category, item.DocumentId, HttpUtility.HtmlDecode(item.DocumentTitle.ToString()),
        item.SessionId, item.TextSearch));
}

var response = Request.CreateResponse(HttpStatusCode.OK, sw.ToString());
response.Content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("text/plain");
return response;
When the file is generated, a column with some text displays strange characters:
L’indennità di licenziamento del Jobs Act è incostituzionale
This is Italian, and the wrong characters seem to be à, è, ò, ', ù, etc.
Can anyone suggest a solution?
Thank you!
UPDATE
As a user suggested, I started using CsvHelper.
I created a class and a ClassMap, but it still returns corrupted characters.
StringWriter sw = new StringWriter();
using (CsvWriter csv = new CsvWriter(sw))
{
    csv.Configuration.RegisterClassMap<HistorySearchModelCsvHelperMap>();
    csv.Configuration.CultureInfo = CultureInfo.InvariantCulture;
    csv.WriteRecords(csvModelHelperList);
}
Result: still the same corrupted characters.
UPDATE 2
The problem is client-side; my action returns the correct text, without broken characters.
The action is triggered when I call it with an axios GET instance.
axios.get(url, {
    headers: {
        'Accept': 'application/vnd.ms-excel',
        'Content-Type': 'application/vnd.ms-excel'
    }
})
.then(({ data }) => {
    const blob = new Blob([data], {
        type: 'application/vnd.ms-excel',
    });
    // "fileDownload" is the 'js-file-download' module.
    fileDownload(blob, 'HistorySearches.csv', 'application/vnd.ms-excel');
    this.setState({ exportLoaded: true, exportLoading: false });
}).catch(() => {
    this.setState({ exportLoaded: false, exportLoading: false });
});
I read that I should set responseType to blob, but even passing type: 'application/vnd.ms-excel' the characters in my CSV file are still corrupted.
In my action, where I return the response:
// ... some code
StringWriterWithEncoding sw = new StringWriterWithEncoding();
using (CsvWriter csv = new CsvWriter(sw))
{
    csv.Configuration.RegisterClassMap<HistorySearchModelCsvHelperMap>();
    csv.Configuration.CultureInfo = CultureInfo.InvariantCulture;
    csv.WriteRecords(csvModelHelperList);
}
var response = Request.CreateResponse(HttpStatusCode.OK, sw.ToString());
// response.Content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/vnd.ms-excel");
return response;
I tried to set the content type server-side too, but the format is incorrect anyway.
If you want to be able to open your CSV in Excel, you need to write it with a Windows-1252 encoding, or prepend a UTF-8 BOM so Excel can identify the character set.
If you open the CSV in a generic text editor and it still displays incorrectly, I'm not sure what's wrong, as your code looks sane.
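For illustration, a small sketch of the server-side BOM approach. The CSV string below is a placeholder; the key point is that a StringWriter always produces a UTF-16 .NET string, so its Encoding property only matters once the string is converted to bytes for the response:

```csharp
using System;
using System.Linq;
using System.Text;

class Program
{
    static void Main()
    {
        // Placeholder CSV content with Italian accented characters.
        string csv = "Titolo\nL'indennità di licenziamento\n";

        // Prepending a UTF-8 BOM (EF BB BF) to the byte payload lets Excel
        // detect the character set instead of assuming a legacy code page.
        var utf8Bom = new UTF8Encoding(encoderShouldEmitUTF8Identifier: true);
        byte[] payload = utf8Bom.GetPreamble()
            .Concat(utf8Bom.GetBytes(csv))
            .ToArray();

        // Verify the first three bytes are the BOM.
        Console.WriteLine(payload[0] == 0xEF && payload[1] == 0xBB && payload[2] == 0xBF);
    }
}
```

The `payload` bytes would then be written to the HTTP response body (e.g. via ByteArrayContent) instead of letting the framework re-encode the string.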
Solved directly on the client side.
I made my own download routine and passed the UTF-8 BOM as the first value of the response string:
downloadFile2(data, fileName, type = "text/string") {
    // Create an invisible A element
    const a = document.createElement("a");
    a.style.display = "none";

    // Using "universal BOM" https://technet.microsoft.com/en-us/2yfce773(v=vs.118)
    const universalBOM = "\uFEFF";
    a.setAttribute('href', 'data:text/csv; charset=utf-8,' + encodeURIComponent(universalBOM + data));

    // Use the download attribute to set the desired file name
    a.setAttribute('download', fileName);
    document.body.appendChild(a);

    // Trigger the download by simulating a click
    a.click();

    // Cleanup
    window.URL.revokeObjectURL(a.href);
    document.body.removeChild(a);
},
I now have this scenario:
I have a table in SQL Server and a handful of webpage-user-defined queries that generate a results page showing the results. The controller functions are all ready to use.
Now I would like users accessing the website to be able to download the results to their local computers. I'm not sure yet what to put the results into. I've searched for it, and both xls and csv files seem straightforward enough, but the examples I found only create a file and save it on the server side.
So my questions are:
Must the task be accomplished by creating a temporary file, downloading the temporary file to the client, and then deleting the temporary file on the server?
If so, how do I create a button for downloading that temporary file? And what will happen if it is serving multiple users at the same time?
Not sure what to do now and any help would be appreciated.
You should create a MemoryStream from the data received from SQL Server. Create a new class with the code below.
public abstract class FileActionResult : IHttpActionResult
{
    private string MediaType { get; }
    private string FileName { get; }
    private Stream Data { get; }

    protected FileActionResult(Stream data, string fileName, string mediaType)
    {
        Data = data;
        FileName = fileName;
        MediaType = mediaType;
    }

    public Task<HttpResponseMessage> ExecuteAsync(CancellationToken cancellationToken)
    {
        Data.Position = 0;
        var response = new HttpResponseMessage
        {
            Content = new StreamContent(Data)
        };
        response.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment");
        response.Content.Headers.ContentType = new MediaTypeHeaderValue(MediaType);
        response.Content.Headers.ContentDisposition.FileName = FileName;
        response.Content.Headers.ContentLength = Data.Length;
        return Task.FromResult(response);
    }
}

public class ExcelFileActionResult : FileActionResult
{
    public ExcelFileActionResult(Stream data) : base(data, "Exported.xls", "application/vnd.ms-excel")
    {
    }
}
Calling code from the controller:
return new ExcelFileActionResult(stream);
where stream is the MemoryStream.
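To make the "no temporary file" point concrete, here is a runnable sketch of building that MemoryStream. The CSV content is a placeholder for the real query results; in the controller the stream would be handed to new ExcelFileActionResult(stream):

```csharp
using System;
using System.IO;
using System.Text;

class Program
{
    static void Main()
    {
        // Placeholder for the data serialized from the SQL Server results.
        string csv = "Header 1,Header 2\r\nvalue1,value2\r\n";

        // Wrapping the bytes in a MemoryStream means nothing touches disk,
        // so concurrent users each get their own independent stream and
        // there is no temporary file to clean up afterwards.
        var stream = new MemoryStream(Encoding.UTF8.GetBytes(csv));

        Console.WriteLine(stream.Length == Encoding.UTF8.GetByteCount(csv));
    }
}
```

Since each request constructs its own MemoryStream, serving multiple users at the same time is not a problem.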
I am working on a project that reads data from an Azure Blob and saves it into an object. I am currently running into a problem. The way my code is set up now, it will read all the .txt data within a container if there are no virtual folders present.
However, if there is a virtual folder structure present within an Azure container, my code will error out with a NullReferenceException. My idea was to add an if check to see whether virtual folders are present within an Azure container and, if so, execute some code. Is there a way to tell if a virtual folder is present?
ReturnBlobObject()
private List<Blob> ReturnBlobObject(O365 o365)
{
    List<Blob> listResult = new List<Blob>();
    string textToFindPattern = "(\\/)";
    string fileName = null;
    string content = null;

    // Loop through all blobs and split the container from the file name.
    foreach (var blobItem in o365.Container.ListBlobs(useFlatBlobListing: true))
    {
        string containerAndFileName = blobItem.Parent.Uri.MakeRelativeUri(blobItem.Uri).ToString();
        string[] subString = Regex.Split(containerAndFileName, textToFindPattern);

        // subString[2] is the name of the file.
        fileName = subString[2];
        content = ReadFromBlobStream(o365.Container.GetBlobReference(subString[2]));

        Blob blobObject = new Blob(fileName, content);
        listResult.Add(blobObject);
    }
    return listResult;
}

ReadFromBlobStream

private string ReadFromBlobStream(CloudBlob blob)
{
    Stream stream = blob.OpenRead();
    using (StreamReader reader = new StreamReader(stream))
    {
        return reader.ReadToEnd();
    }
}
I was able to solve this by refactoring my code. Instead of using Regex, which was returning some very odd behavior, I decided to take a step back and rethink the problem. Below is the solution I came up with.
ReturnBlobObject()
private List<Blob> ReturnBlobObject(O365 o365)
{
    List<Blob> listResult = new List<Blob>();

    // Loop through all blobs and strip the container name from the file name.
    foreach (var blobItem in o365.Container.ListBlobs(useFlatBlobListing: true))
    {
        string fileName = blobItem.Uri.LocalPath.Replace(string.Format("/{0}/", o365.Container.Name), "");
        string content = ReadFromBlobStream(o365.Container.GetBlobReference(fileName));

        Blob blobObject = new Blob(fileName, content);
        listResult.Add(blobObject);
    }
    return listResult;
}
I am attempting to read the raw input stream in a ServiceStack service. I have marked the DTO with IRequiresRequestStream, and the code to read it executes, but the content always shows up blank.
Using debug mode in IE9, I can see that the raw HttpRequest contains text within the POST body as delivered.
Here is my code, from a minimal test service intended only to show reading of the content and query:
[Route("/qtest")]
public class QueryTestRequest : IReturn<string>, IRequiresRequestStream
{
    public Stream RequestStream { get; set; }
}

public class QueryTestService : Service
{
    public string Any(QueryTestRequest request)
    {
        var r = new StringBuilder();
        r.Append("<p>This is the query test service:");
        r.AppendFormat("<p>Parameter value={0}", base.Request.QueryString["value"]);

        var postStream = new StreamReader(request.RequestStream);
        var postContent = postStream.ReadToEnd();
        r.AppendFormat("<p>Raw Content={0}", postContent);

        return r.ToString();
    }
}
What am I missing here?
Yes, I find that weird as well, but maybe it's me who doesn't understand the nature of the HttpRequestStream.
Anyway... I managed to get hold of the file using:
var stream = Request.Files[0].InputStream;
And then you can handle that stream.
It appears that more than one file can be uploaded, but I guess that would be difficult to wrap in a REST framework.
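Handling every uploaded file rather than just the first could look roughly like this inside a ServiceStack service. The DTO name UploadRequest and the response format are made up for illustration; Request.Files exposes each upload's FileName and InputStream:

```csharp
// Hypothetical DTO and service names, sketched for illustration only.
public class UploadRequest : IReturn<string> {}

public class UploadService : Service
{
    public string Any(UploadRequest request)
    {
        var names = new List<string>();
        foreach (var file in Request.Files)
        {
            using (var reader = new StreamReader(file.InputStream))
            {
                var contents = reader.ReadToEnd(); // handle each file's contents here
            }
            names.Add(file.FileName);
        }
        return string.Format("Received {0} file(s): {1}",
            names.Count, string.Join(", ", names));
    }
}
```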