My goal is to write values to a preexisting .csv file stored in a collection. Currently, I am reading the values stored in the .csv file and storing them in a collection using the FileHelpers library. But I want to add 4 column fields {Product, Base Price, Ship Price, Total Price}; then, if the customer name in my records collection matches the customer name in the customersandproducts collection, write the corresponding data from the customersandproducts collection into the 4 fields I added.
Here's a simplified version of my source code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using FileHelpers;
namespace ReadFile
{
class Program
{
static void Main()
{
var engine = new FileHelperEngine<Orders>();
var records = engine.ReadFile("Daniel.csv");
// var customersandproducts = addresses.Select(x => new { x.Name, x.AddressLine1, x.AddressLine2, x.City, x.State, x.S_OrderId, x.PostalCode })
// .Join(products, custs => custs.S_OrderId, prod => prod.P_OrderId,(custs, prod) => new { custs.Name, custs.AddressLine1, custs.AddressLine2, custs.City, custs.State, custs.PostalCode, prod.Title, prod.Quantity, prod.ItemPrice, prod.ShippingPrice });
foreach (var record in records)
{
Console.WriteLine(record.Name);
Console.WriteLine(record.Track);
Console.WriteLine(record.Price);
}
}
}
[DelimitedRecord(",")]
public class Orders
{
public string Name;
public string Track;
public string Price;
}
}
Here is the data in my .csv file:
ShipToCompanyorName,PackageTrackingNumber,ShipmentInformationTotalShipmentReferenceCharge
customer1,1Z620Y1V0347293915,12.6
customer2,1Z620Y1V0347106126,9.18
customer3,1Z620Y1V0347584931,13.63
customer4,1Z620Y1V0346366348,11.37
customer5,1Z620Y1V0348472309,31.
customer6,1Z620Y1V0348325290,31.34
My Problem: I want to add {Product, Base Price, Ship Price, Total Price} to the first row of the .csv file, then fill in each column with information from the customersandproducts collection, using the customer name as the key.
You can use two classes and later map them with AutoMapper
namespace ReadFile
{
class Program
{
static void Main(string[] args)
{
var engine = new FileHelperEngine<Orders>();
var records = engine.ReadFile("Daniel.csv");
var mapped = Mapper.Map<Orders[], MoreFields[]>(records);
// mapped is an array of the class with more fields
}
}
[DelimitedRecord(",")]
public class Orders
{
public string Name;
public string Track;
public string Price;
}
public class MoreFields : Orders
{
public string Product;
public string BasePrice;
public string ShipPrice;
public string TotalPrice;
}
}
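Note that AutoMapper needs its mapping configured once before `Mapper.Map` is called. A minimal sketch using the static API (available in AutoMapper 8.x and earlier; configuring it at the top of Main is an assumption about your setup):

```csharp
// One-time configuration, e.g. at the start of Main().
// Fields with matching names (Name, Track, Price) are copied
// automatically; the extra fields on MoreFields stay null until
// you fill them in from your customersandproducts collection.
Mapper.Initialize(cfg => cfg.CreateMap<Orders, MoreFields>());
```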
You should use the FieldOptional attribute for your extra fields. Then they will be ignored for import, but included for export.
[DelimitedRecord(",")]
public class Orders
{
public string Name;
public string Track;
public string Price;
[FieldOptional]
public string Product;
[FieldOptional]
public string BasePrice;
[FieldOptional]
public string ShipPrice;
[FieldOptional]
public string TotalPrice;
}
Then something like:
var engine = new FileHelperEngine<Orders>();
var records = engine.ReadFile("Daniel.csv");
// transform the collection with a foreach loop
foreach(var record in records)
{
var customer = custs.ByName(record.Name);
record.Product = etc...
record.BasePrice = etc...
}
// write out the file including the optional fields.
engine.WriteFile("Daniel_transformed.csv", records);
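The `custs.ByName(...)` call above is pseudocode; one way to realize it is a dictionary keyed by customer name, built from the joined customersandproducts collection in the question (the member names `Name`, `Title`, `ItemPrice` and `ShippingPrice` are taken from that join and are assumptions about your data):

```csharp
// Build a name-keyed lookup once, then fill in the optional fields.
// ToDictionary throws on duplicate names; use ToLookup if a customer
// can appear more than once.
var byName = customersandproducts.ToDictionary(c => c.Name);

foreach (var record in records)
{
    if (byName.TryGetValue(record.Name, out var cust))
    {
        record.Product   = cust.Title;
        record.BasePrice = cust.ItemPrice.ToString();
        record.ShipPrice = cust.ShippingPrice.ToString();
        // Assuming the two price members are numeric.
        record.TotalPrice = (cust.ItemPrice + cust.ShippingPrice).ToString();
    }
}
```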
I currently have an API that uses Dapper to connect to a SQL database and find the file paths of documents related to a certain ID. The result is returned in JSON format as follows:
[{"DOC-ID": 1, "DOCUMENT_FULL_PATH": "/PATH/FILENAME.DOC"},{"DOC-ID": 2, "DOCUMENT_FULL_PATH": "/PATH/FILENAME2.DOC"}]
I am trying to get my API to deserialize the JSON data and then bind it to a model. (I would prefer not to use models, but the only alternative I found was the JSON DOM, which is briefly discussed on the MS website without an example of how to loop through a JSON array, so I could not move forward with that approach.) When I try to deserialize the Dapper query result I get the error indicated below (shown at the line in the code). I am not sure what is triggering this, as I would have thought the QuerySingle result could be deserialized with this method. Once this error is fixed, I need to check each file's last-modified date and save that value to the model, which I then need to serialize again to send to the front end. I have spent so much time on this, so some help would be much appreciated!
[HttpPost]
public ActionResult MainReviewDocuments([FromForm] string ID)
{
//Using FormData on frontend
//checking ID exists on searching in dashboard
if (ID == null || ID.Length == 0)
{
return Ok(new { Result = "Failed" });
}
else
{
//We have parameters here just in case we want to use them
var UserID = HttpContext.User.FindFirst(ClaimTypes.Name).Value;
String query = "select dbo.A2Q_0132W_RR_IDDocumentation_JSON(@ID) as output";
using (var connection = new SqlConnection(connectionString))
{
var json = connection.QuerySingle<string>(query, new { ID = ID});
MainReviewDocuments? mainreviewdocuments = JsonSerializer.Deserialize<MainReviewDocuments>(json); // this line draws an error 'The JSON value could not be converted to Project.Models.MainReviewDocuments. Path: $ | LineNumber: 0 | BytePositionInLine: 1.'
var rootPath = Path.Combine(Directory.GetParent(env.ContentRootPath).ToString(), "Files");
foreach (var document in mainreviewdocuments)
{
filePath = Path.Combine(rootPath, document.DOCUMENT_FULL_PATH);
//Check file system for each file path and set last modified value to model object LAST_MODIFIED. Struggling with this as well
}
return Ok(mainreviewdocuments); // Can I use Ok() method to convert model back to JSON?
}
}
}
In your original call, you need to de-serialize to a List:
MainReviewDocuments? mainreviewdocuments = JsonSerializer.Deserialize<List<MainReviewDocuments>>(json);
and then access your properties as required.
Using Newtonsoft.Json library:
You can de-serialize your JSON string that you receive from your DB into the following class:
public class MainReviewDocuments
{
[JsonProperty("DOC-ID")]
public int DOCID { get; set; }
public string DOCUMENT_FULL_PATH { get; set; }
}
Or you can use dynamic to de-serialize your JSON:
var mainreviewdocuments = JsonConvert.DeserializeObject<dynamic>(json);
and then access the properties as shown in the example below.
You can refer to a working example below:
using System;
using Newtonsoft.Json;
using System.Collections.Generic;
public class Program
{
public static void Main()
{
var myJsonString = @"[{'DOC-ID': 1, 'DOCUMENT_FULL_PATH': '/PATH/FILENAME.DOC'},{'DOC-ID': 2, 'DOCUMENT_FULL_PATH': '/PATH/FILENAME2.DOC'}]";
var mainreviewdocuments = JsonConvert.DeserializeObject<List<MainReviewDocuments>>(myJsonString);
Console.WriteLine("Example using Model: \n");
foreach(var item in mainreviewdocuments)
{
Console.WriteLine(item.DOCID);
Console.WriteLine(item.DOCUMENT_FULL_PATH);
}
Console.WriteLine("\n");
Console.WriteLine("Example using Dynamic: \n");
//Example using dynamic
var mainreviewdocumentsDynamic=JsonConvert.DeserializeObject<dynamic>(myJsonString);
foreach(var item in mainreviewdocumentsDynamic)
{
Console.WriteLine(item["DOC-ID"]);
Console.WriteLine(item["DOCUMENT_FULL_PATH"]);
}
}
}
public class MainReviewDocuments
{
[JsonProperty("DOC-ID")]
public int DOCID { get; set; }
public string DOCUMENT_FULL_PATH { get; set; }
}
Output:
Example using Model:
1
/PATH/FILENAME.DOC
2
/PATH/FILENAME2.DOC
Example using Dynamic:
1
/PATH/FILENAME.DOC
2
/PATH/FILENAME2.DOC
Using System.Text.Json library:
using System;
using System.Collections.Generic;
using System.Text.Json;
using System.Text.Json.Serialization;
public class Program
{
public static void Main()
{
var myJsonString="[{\"DOC-ID\": 1, \"DOCUMENT_FULL_PATH\": \"/PATH/FILENAME.DOC\"},{\"DOC-ID\": 2, \"DOCUMENT_FULL_PATH\": \"/PATH/FILENAME2.DOC\"}]";
var mainreviewdocuments = JsonSerializer.Deserialize<List<MainReviewDocuments>>(myJsonString);
Console.WriteLine("Example using Model: \n");
foreach(var item in mainreviewdocuments)
{
Console.WriteLine(item.DOCID);
Console.WriteLine(item.DOCUMENT_FULL_PATH);
}
Console.WriteLine("\n");
Console.WriteLine("Example using Dynamic: \n");
using (JsonDocument document = JsonDocument.Parse(myJsonString))
{
foreach (JsonElement element in document.RootElement.EnumerateArray())
{
Console.WriteLine(element.GetProperty("DOC-ID"));
Console.WriteLine(element.GetProperty("DOCUMENT_FULL_PATH"));
}
}
}
}
public class MainReviewDocuments
{
[JsonPropertyName("DOC-ID")]
public int DOCID { get; set; }
public string DOCUMENT_FULL_PATH { get; set; }
}
Output:
Example using Model:
1
/PATH/FILENAME.DOC
2
/PATH/FILENAME2.DOC
Example using Dynamic:
1
/PATH/FILENAME.DOC
2
/PATH/FILENAME2.DOC
Working example: https://dotnetfiddle.net/nEjPIK
You can read more on the comparison between the two libraries in this article
I am using the ElasticClient.Search function to get the values of the fields.
The issue is:
The code below creates the mapping correctly, but the search returns null values for the fields that were mapped.
Main.cs
using Nest;
using System;
using System.Linq;
using System.Threading;
namespace DataAccessConsole
{
class Program
{
public static Uri node;
public static ConnectionSettings settings;
public static ElasticClient client;
static void Main(string[] args)
{
{
node = new Uri("http://localhost:9200");
settings = new ConnectionSettings(node).DefaultIndex("getallcommissionspermanentes");
settings.DefaultFieldNameInferrer(p => p);
client = new ElasticClient(settings);
var indexSettings = new IndexSettings();
indexSettings.NumberOfReplicas = 1;
indexSettings.NumberOfShards = 1;
client.Indices.Create("getallcommissionspermanentes", index => index
.Map<GetAllCommissionsPermanentes>(
x => x
.AutoMap<GetAllCommissionsPermanentes>()
));
client.Search<GetAllCommissionsPermanentes>(s => s
.AllIndices()
);
}
}
}
}
GetAllCommissionsPermanentes.cs
the table is located in an edmx model of Entityframework and Data came from SQL SERVER Database
public partial class GetAllCommissionsPermanentes
{
public int ID { get; set; }
public string NomAr { get; set; }
public string NomFr { get; set; }
}
If you need more information, just leave a comment below.
Thanks
The code is correct, but '.AllIndices()' searches across all indices, so results that do not match the model come back as well. This code will return more accurate results:
client.Search<GetAllCommissionsPermanentes>(s => s.Index("getallcommissionspermanentes"));
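To make the query explicit rather than relying on defaults, a minimal sketch (a match-all query against the index created above; `MatchAll` and `Documents` are standard NEST members):

```csharp
// Search only the index the documents were written to. A match-all
// query returns documents up to the default page size of 10.
var response = client.Search<GetAllCommissionsPermanentes>(s => s
    .Index("getallcommissionspermanentes")
    .Query(q => q.MatchAll()));

foreach (var doc in response.Documents)
{
    Console.WriteLine($"{doc.ID}: {doc.NomFr}");
}
```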
I'm new to C# and OOP. I have two different files: file A, where I've created the list.
//This file will contain predetermine list of responses.
using System.Linq;
using System.Collections.Generic;
public class Responses
{
//bot name
static string nomber = "Jarvis";
List<string> answer = new List<string>(){
$"Mi nomber ta {nomber}",
"Mi ta bon"
};
public void AddToList(string value){
this.answer.Add(value);
}
public string Answer(int id)
{
return answer.ElementAt(id);
}
}
And in file B I have these two lines of code to add the string "1" to the list; I've also included System.Collections.Generic and System.Linq in file B.
var response = new Responses();
response.answer.Add("1");
I've tried creating a method called AddToList to pass the value and add it to the list, but with no luck. When I try to display the list at index 2, I get an ArgumentOutOfRangeException instead of the value "1".
*Also, both files are located in the same folder.
After reading your source code, I understand your problem. First you add a new element to Responses.answer from the Interpreter's instance and it returns an id; then you get Responses.answer from the Output's instance by that id. Of course you never get the value, because those were 2 different instances.
I provide 2 options for you:
Option 1: Make Responses a single instance (singleton)
Responses.cs
using System.Linq;
using System.Collections.Generic;
public class Responses
{
private static Responses _instance = new Responses();
public static Responses GetInstance() {
return _instance;
}
//bot name
static string nomber = "Jarvis";
List<string> answer = new List<string>(){
$"Mi nomber ta {nomber}",
"Mi ta bon"
};
public void AddToList(string value){
this.answer.Add(value);
}
public string Answer(int id)
{
return answer.ElementAt(id);
}
}
Then change on other files
//from
var responses = new Responses();
//to
var responses = Responses.GetInstance();
//from
responses.answer.Add()
//to
responses.AddToList()
Option 2: Make Responses static
Response.cs
using System.Linq;
using System.Collections.Generic;
public static class Responses
{
//bot name
static string nomber = "Jarvis";
static List<string> answer = new List<string>(){
$"Mi nomber ta {nomber}",
"Mi ta bon"
};
public static void AddToList(string value){
answer.Add(value);
}
public static string Answer(int id)
{
return answer.ElementAt(id);
}
}
Output.cs
using System;
public class Output
{
public void Return(int respondType, int respond)
{
switch(respondType)
{
case 0:
Console.WriteLine(Responses.Answer(respond));
break;
default:
Console.WriteLine("Mi no ta kompronde");
break;
}
}
}
Interperter.cs
using System.Collections.Generic;
using System.Linq;
public class Interpreter
{
public int UserInputType(string value)
{
// Turns the user input into an array of words
string[] words = value.Split(' ');
int returnValue = 2;
//int match = 0;
Responses.AddToList("1");
//This stores the correct response to the given question
//var element = new List<int>();
foreach(var word in words)
{
// if(!string.IsNullOrWhiteSpace(word))
// {
foreach(var listOfQuestions in userInputedQuestions)
{
//Convert words in the listOfQuestions to array string to match them with the userInputedQuestion
string[] listOfQWords = listOfQuestions.Split(" ");
//Check how many words matches the predefined list of questions
foreach(var qWord in listOfQWords){
if(word == qWord){
returnValue = 0;
}
}
}
}
// }
return returnValue;
}
private List<string> userInputedQuestions = new List<string>(){
"Ki ta bo nomber?",
"Konta ku bo?"
};
}
Hope it helps
I have this data and I need to make an object out of it so I can easily read/write to this object.
I tried to make a dictionary and fill it with hashtables, but it is very hard to maintain, and I don't even know how to modify the data inside a hashtable inside that dictionary.
This is my bad practice:
Dictionary<string,Hashtable> DICTFormsControls = new Dictionary<string,Hashtable>();
DICTFormsControls[parentfrm] = new Hashtable();
DICTFormsControls[parentfrm][controlid] = new FormControl(controlnm,parentfrm,controlid,controlpermission);
Of course, "FormControl" is just a simple class to hold the control properties.
I need to be able to get the data easily and modify it just as easily.
A dictionary already contains a hash table for the key. Try something like this
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
namespace ConsoleApplication1
{
class Program
{
static void Main(string[] args)
{
DICTFormsControls dictFormcontrols = new DICTFormsControls();
dictFormcontrols.Add(controlnm, parentfrm, controlid, controlpermission);
}
}
public class DICTFormsControls
{
public static Dictionary<int, DICTFormsControls> dict = new Dictionary<int, DICTFormsControls>();
public string controlnm { get; set;}
public string parentfrm { get; set;}
public int controlid { get; set;}
public bool controlpermission {get;set;}
public void Add(string controlnm, string parentfrm, int controlid, bool controlpermission)
{
dict.Add(controlid, new DICTFormsControls() { controlnm = controlnm, parentfrm = parentfrm, controlid = controlid, controlpermission = controlpermission });
}
}
}
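Alternatively, a plain Dictionary<int, FormControl> keyed by control id keeps the container and the data class separate, which makes both reading and updating direct. A minimal self-contained sketch (the FormControl property names are inferred from the question):

```csharp
using System;
using System.Collections.Generic;

// Simple data class for the control properties (names inferred
// from the question's FormControl constructor arguments).
public class FormControl
{
    public string ControlName { get; set; }
    public string ParentForm { get; set; }
    public int ControlId { get; set; }
    public bool ControlPermission { get; set; }
}

public class Program
{
    public static void Main()
    {
        var controls = new Dictionary<int, FormControl>();

        // Add an entry keyed by its control id.
        controls[1] = new FormControl
        {
            ControlName = "btnSave",
            ParentForm = "frmMain",
            ControlId = 1,
            ControlPermission = true
        };

        // Read safely: TryGetValue avoids a KeyNotFoundException.
        if (controls.TryGetValue(1, out var ctl))
            Console.WriteLine(ctl.ControlName); // prints "btnSave"

        // Modify in place: the dictionary stores a reference, so the
        // held object is updated directly.
        controls[1].ControlPermission = false;
        Console.WriteLine(controls[1].ControlPermission); // prints "False"
    }
}
```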
I'm sure it's very straightforward, but I am struggling to figure out how to write an array to a file using CsvHelper.
I have a class for example
public class Test
{
public Test()
{
data = new float[]{0,1,2,3,4};
}
public float[] data{get;set;}
}
I would like the data to be written with each array value in a separate cell. I have a custom converter below, which is instead producing one cell with all the values in it.
What am I doing wrong?
public class DataArrayConverter<T> : ITypeConverter
{
public string ConvertToString(TypeConverterOptions options, object value)
{
var data = (T[])value;
var s = string.Join(",", data);
return s;
}
public object ConvertFromString(TypeConverterOptions options, string text)
{
throw new NotImplementedException();
}
public bool CanConvertFrom(Type type)
{
return type == typeof(string);
}
public bool CanConvertTo(Type type)
{
return type == typeof(string);
}
}
To further detail the answer from Josh Close, here is what you need to do to write any IEnumerable (including arrays and generic lists) in a recent version (anything above 3.0) of CsvHelper!
Here is the class under test:
public class Test
{
public int[] Data { get; set; }
public Test()
{
Data = new int[] { 0, 1, 2, 3, 4 };
}
}
And a method to show how this can be saved:
static void Main()
{
using (var writer = new StreamWriter("db.csv"))
using (var csv = new CsvWriter(writer))
{
var list = new List<Test>
{
new Test()
};
csv.Configuration.HasHeaderRecord = false;
csv.WriteRecords(list);
writer.Flush();
}
}
The important configuration here is csv.Configuration.HasHeaderRecord = false;. Only with this configuration will you be able to see the data in the csv file.
Further details can be found in the related unit test cases from CsvHelper.
In case you are looking for a solution to store properties of type IEnumerable with different amounts of elements, the following example might be of any help:
using CsvHelper;
using CsvHelper.Configuration;
using System.Collections.Generic;
using System.IO;
using System.Linq;
namespace CsvHelperSpike
{
class Program
{
static void Main(string[] args)
{
using (var writer = new StreamWriter("db.csv"))
using (var csv = new CsvWriter(writer))
{
csv.Configuration.Delimiter = ";";
var list = new List<AnotherTest>
{
new AnotherTest("Before String") { Tags = new List<string> { "One", "Two", "Three" }, After="After String" },
new AnotherTest("This is still before") {After="after again", Tags=new List<string>{ "Six", "seven","eight", "nine"} }
};
csv.Configuration.RegisterClassMap<TestIndexMap>();
csv.WriteRecords(list);
writer.Flush();
}
using(var reader = new StreamReader("db.csv"))
using(var csv = new CsvReader(reader))
{
csv.Configuration.IncludePrivateMembers = true;
csv.Configuration.RegisterClassMap<TestIndexMap>();
var result = csv.GetRecords<AnotherTest>().ToList();
}
}
private class AnotherTest
{
public string Before { get; private set; }
public string After { get; set; }
public List<string> Tags { get; set; }
public AnotherTest() { }
public AnotherTest(string before)
{
this.Before = before;
}
}
private sealed class TestIndexMap : ClassMap<AnotherTest>
{
public TestIndexMap()
{
Map(m => m.Before).Index(0);
Map(m => m.After).Index(1);
Map(m => m.Tags).Index(2);
}
}
}
}
By using the ClassMap it is possible to enable HasHeaderRecord (the default) again. It is important to note here, that this solution will only work, if the collection with different amounts of elements is the last property. Otherwise the collection needs to have a fixed amount of elements and the ClassMap needs to be adapted accordingly.
This example also shows how to handle properties with a private set. For this to work it is important to use the csv.Configuration.IncludePrivateMembers = true; configuration and have a default constructor on your class.
Unfortunately, it doesn't work like that. Since you are returning a string containing "," from the converter, CsvHelper will quote the field, as the comma is part of a single field.
Currently the only way to accomplish what you want is to write the fields manually, which isn't too horrible.
foreach( var test in list )
{
foreach( var item in test.Data )
{
csvWriter.WriteField( item );
}
csvWriter.NextRecord();
}
Update
Version 3 has support for reading and writing IEnumerable properties.