Could you please help me with reading the txt file shown in the image below?
[DelimitedRecord("│")]
public class Orders
{
    public int Belegnr { get; set; }
    public string Pos { get; set; }
    public string Belegdatum { get; set; }
    public string Auftrag { get; set; }
}

var engine = new FileHelperEngine<Orders>();
if (engine.Options.FieldCount == 19)
{
    var records = engine.ReadFile(@"\\bosch.com\dfsrb\dfstr\div\dc\BUP2_TEF_Share\02_TEF3\90_Projeler\Pems\Maliyetler\KOB1_Order.XML");
    foreach (var record in records)
    {
        Console.WriteLine(record.Belegnr);
    }
}
I don't want to read the first 20 rows in the txt file.
Those are info rows.
You can use the IgnoreFirst attribute, which indicates the number of lines to be ignored at the beginning of a file or stream when the engine reads it.
[IgnoreFirst(20)]
[DelimitedRecord("│")]
public class Orders
{
    // etc...
}
There is also an IgnoreLast attribute for ignoring the last rows of the file.
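For completeness, a minimal sketch of how both attributes could be combined with the class from the question (this assumes the FileHelpers NuGet package; the IgnoreLast count of 1 and the file name are placeholders, adjust or remove them as needed):
using System;
using FileHelpers;

[IgnoreFirst(20)]   // skip the 20 info rows at the top
[IgnoreLast(1)]     // placeholder: skip one trailing row, remove if the file has none
[DelimitedRecord("│")]
public class Orders
{
    public int Belegnr { get; set; }
    public string Pos { get; set; }
    public string Belegdatum { get; set; }
    public string Auftrag { get; set; }
}

// Reading stays the same as in the question:
var engine = new FileHelperEngine<Orders>();
Orders[] records = engine.ReadFile(@"KOB1_Order.txt"); // path is a placeholder
foreach (var record in records)
    Console.WriteLine(record.Belegnr);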
I am trying to convert a value from a CSV file using CsvHelper. The value itself is an integer, but the value inside the CSV contains whitespace, e.g. "0 " or "12 ".
How can I get it to work? On Stack Overflow I found this thread, but the trimming doesn't apply to the binding. According to a comment from the creator of this library, it should since v2.9.0.
How do you ignore Whitespace when using CsvHelper, CsvReader.Read()?
I try to read my CSV in this way:
using (TextReader reader = new StreamReader(datei))
{
    CsvConfiguration configuration = new CsvConfiguration(CultureInfo.GetCultureInfo("de-DE"));
    configuration.BadDataFound = null;
    configuration.TrimOptions = TrimOptions.Trim;
    using (var csv = new CsvReader(reader, configuration))
    {
        ArtikeldatenLieferant = csv.GetRecords<AllnetArtikel>().ToList();
    }
}
Edit:
This is one line of my CSV which results in this issue:
Nr.;ALLNET-Artikelnummer;Hersteller-Artikelnummer;Hersteller;Produktbezeichnung;EAN Nummer;Kategorieebene1;Kategorieebene2;Kategorieebene3;HEK;Artikelzustand;UVP;Produktbeschreibung;Gewicht;Lagerbestand
9234;193301;AL-MSUC-SUF-S;Audiocodes Live;Audiocodes Live - AL-MSUC-SUF-S;;Telekommunikation;Voice over IP;Voice over IP - Gateway Support;2739,13;neu;3043,48;"AudioCodes Live non-recurring setup fee, for each customer site with up to 500 users. Includes delivery, Planning and Design consulting service, Implementation service (configuration and basic verification) for AudioCodes hardware or software, and cutover support into production for a single event. Does not include Project Management.;0,001;0
Edit2: Here is the AllnetArtikel class.
public class AllnetArtikel
{
    [Name("Nr.")]
    public string Nr { get; set; } = string.Empty;
    [Name("ALLNET-Artikelnummer")]
    public string AllnetArtiNr { get; set; } = string.Empty;
    [Name("Hersteller-Artikelnummer")]
    public string HerstellerArtiNr { get; set; } = string.Empty;
    [Name("Hersteller")]
    public string Hersteller { get; set; } = string.Empty;
    [Name("Produktbezeichnung")]
    public string Produktbezeichnung { get; set; } = string.Empty;
    [Name("EAN Nummer")]
    public string EAN { get; set; } = string.Empty;
    [Name("Kategorieebene1")]
    public string Kategorie1 { get; set; } = string.Empty;
    [Name("Kategorieebene2")]
    public string Kategorie2 { get; set; } = string.Empty;
    [Name("Kategorieebene3")]
    public string Kategorie3 { get; set; } = string.Empty;
    [Name("HEK")]
    public decimal HEK { get; set; }
    [Name("Artikelzustand")]
    public string Zustand { get; set; } = string.Empty;
    [Name("UVP")]
    public decimal UVP { get; set; }
    [Name("Produktbeschreibung")]
    public string Produktbeschreibung { get; set; } = string.Empty;
    [Name("Gewicht")]
    [Default(-1)]
    public decimal Gewicht { get; set; }
    [Name("Lagerbestand")]
    public int Lagerbestand { get; set; }
}
Final Update
The issue ended up being a quote character inside a field that was neither escaped nor enclosed in quotes: ;"AudioCodes ... Management.;. Since the quote was part of the data and was needed, the solution was to change the configuration mode to NoEscape. This mode ignores quotes and escape characters.
configuration.Mode = CsvMode.NoEscape;
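Put together with the trimming from the question, the configuration might look like the sketch below (assuming a recent CsvHelper version; setting the ";" delimiter explicitly is an assumption, in case the de-DE culture does not supply it):
var configuration = new CsvConfiguration(CultureInfo.GetCultureInfo("de-DE"))
{
    Delimiter = ";",                 // assumption: set explicitly instead of relying on the culture
    TrimOptions = TrimOptions.Trim,  // strips surrounding whitespace such as "0 " or "12 "
    Mode = CsvMode.NoEscape,         // treat quotes as ordinary characters
    BadDataFound = null
};

using (var reader = new StreamReader(datei))
using (var csv = new CsvReader(reader, configuration))
{
    ArtikeldatenLieferant = csv.GetRecords<AllnetArtikel>().ToList();
}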
Update
Any string that works with int.Parse should work with CsvHelper. For instance, int.Parse("0 ") == 0. Since extra whitespace is not an issue and it works when you change your integer field to a string, I'm going to take a guess that your issue is actually with an empty value. CsvHelper is unable to convert an empty value into an int. If that is the case, you have two choices.
Either turn your integer field into a Nullable<int>
void Main()
{
    CsvConfiguration configuration = new CsvConfiguration(CultureInfo.GetCultureInfo("de-DE"));
    using (var reader = new StringReader("Id;Name\n1;Joe\n;Jenny"))
    using (var csv = new CsvReader(reader, configuration))
    {
        var records = csv.GetRecords<Foo>().ToList().Dump();
    }
}

public class Foo
{
    public int? Id { get; set; }
    public string Name { get; set; }
}
Or give your integer field a default value.
void Main()
{
    CsvConfiguration configuration = new CsvConfiguration(CultureInfo.GetCultureInfo("de-DE"));
    using (var reader = new StringReader("Id;Name\n1;Joe\n;Jenny"))
    using (var csv = new CsvReader(reader, configuration))
    {
        var records = csv.GetRecords<Foo>().ToList().Dump();
    }
}

public class Foo
{
    [Default(0)]
    public int Id { get; set; }
    public string Name { get; set; }
}
Original
Are you sure you aren't getting an exception for something else? I can put in as many spaces after the number as I want and it still reads it correctly.
void Main()
{
    CsvConfiguration configuration = new CsvConfiguration(CultureInfo.GetCultureInfo("de-DE"));
    using (var reader = new StringReader("Id;Name\n1 ;Joe\n10 ;Jenny"))
    using (var csv = new CsvReader(reader, configuration))
    {
        var records = csv.GetRecords<Foo>().ToList();
    }
}

public class Foo
{
    public int Id { get; set; }
    public string Name { get; set; }
}
Currently I am working with the Shopify GraphQL Bulk Query.
This query returns a JSON Lines file. Such a file may look like this:
{"id":"gid:\/\/shopify\/Product\/5860091625632","title":"Levis Jeans","description":"Cool Jeans","vendor":"Levis","status":"ACTIVE"}
{"id":"gid:\/\/shopify\/ProductImage\/20289865679008","__parentId":"gid:\/\/shopify\/Product\/5860091625632"}
{"id":"gid:\/\/shopify\/ProductVariant\/37178118963360","title":"32","position":1,"image":null,"selectedOptions":[{"name":"Size","value":"32"}],"inventoryItem":{},"__parentId":"gid:\/\/shopify\/Product\/5860091625632"}
{"available":10,"location":{"id":"gid:\/\/shopify\/Location\/57510625440"},"__parentId":"gid:\/\/shopify\/ProductVariant\/37178118963360"}
{"id":"gid:\/\/shopify\/ProductVariant\/37178118996128","title":"31","position":2,"image":null,"selectedOptions":[{"name":"Size","value":"31"}],"inventoryItem":{},"__parentId":"gid:\/\/shopify\/Product\/5860091625632"}
{"available":5,"location":{"id":"gid:\/\/shopify\/Location\/57510625440"},"__parentId":"gid:\/\/shopify\/ProductVariant\/37178118996128"}
{"available":3,"location":{"id":"gid:\/\/shopify\/Location\/57951518880"},"__parentId":"gid:\/\/shopify\/ProductVariant\/37178118996128"}
{"id":"gid:\/\/shopify\/ProductVariant\/37178119028896","title":"34","position":3,"image":null,"selectedOptions":[{"name":"Size","value":"34"}],"inventoryItem":{},"__parentId":"gid:\/\/shopify\/Product\/5860091625632"}
{"available":5,"location":{"id":"gid:\/\/shopify\/Location\/57510625440"},"__parentId":"gid:\/\/shopify\/ProductVariant\/37178119028896"}
{"available":15,"location":{"id":"gid:\/\/shopify\/Location\/57951518880"},"__parentId":"gid:\/\/shopify\/ProductVariant\/37178119028896"}
Each line of this file is a valid JSON object, and the lines are connected to each other via __parentId.
My goal is to deserialize this into C# classes like these:
class Product
{
    public string Id { get; set; }
    public string Title { get; set; }
    public string Description { get; set; }
    public IEnumerable<ProductImage> Images { get; set; }
    public IEnumerable<ProductVariant> Variants { get; set; }
}

class ProductImage
{
    public string Id { get; set; }
}

class ProductVariant
{
    public string Id { get; set; }
    public IEnumerable<IDictionary<string, string>> SelectedOptions { get; set; }
    public IEnumerable<InventoryLevel> Levels { get; set; }
}

class InventoryLevel
{
    public int Available { get; set; }
}
And this is how a potential function performing the deserialization would be called:
var file = new System.IO.StreamReader(@"c:\test.jsonl");
var products = DeserializeJsonL<IEnumerable<Product>>(file);
Shopify suggests reading the file in reverse. I get the idea.
But I cannot see how to deserialize this file in a type-safe way. How could I determine whether the current line is a ProductVariant, a ProductImage or something else? I cannot influence the JSONL output to include type information.
I am pretty sure that without type information I cannot deserialize it safely. But how should I handle this data then, for example to insert it into a database?
EDIT: the class name in {"id":"gid:\/\/shopify\/Product\/5860091625632"} cannot be used to determine the type!
I ended up adding some sort of type information to my GraphQL query by defining a unique field name for each type that may appear on its own line in the resulting JSON Lines file.
For that I used GraphQL field aliases:
someQuery {
    uniqueFieldAlias : fieldName
}
When I read the file, I search each line for the unique field name and then deserialize the line into the corresponding class.
// "products" is a List<Product> declared earlier; "productIndex" points at the
// product that the following child lines belong to (parents precede their children).
using (var file = new StreamReader(await res.Content.ReadAsStreamAsync()))
{
    string line;
    while ((line = await file.ReadLineAsync()) != null)
    {
        if (line.Contains("\"uniqueFieldAlias\""))
        {
            var product = JsonSerializer.Deserialize<Product>(line);
            products.Add(product);
            continue;
        }
        if (line.Contains("\"otherUniqueAlias\""))
        {
            var somethingElse = JsonSerializer.Deserialize<SomeClass>(line);
            products[productIndex].Something.Add(somethingElse);
            continue;
        }
    }
}
The idea is inspired by @Caius Jard's comments.
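If matching on the raw line text ever becomes too fragile, a slightly more robust variant (just a sketch, assuming System.Text.Json and the same alias names, product list and SomeClass/Something members as in the snippet above) is to parse each line once and branch on which property is present. PropertyNameCaseInsensitive is set because the Shopify fields are camelCase, while the C# properties are PascalCase:
using System.Text.Json;

var options = new JsonSerializerOptions { PropertyNameCaseInsensitive = true };

using var doc = JsonDocument.Parse(line);
if (doc.RootElement.TryGetProperty("uniqueFieldAlias", out _))
{
    var product = JsonSerializer.Deserialize<Product>(line, options);
    products.Add(product);
}
else if (doc.RootElement.TryGetProperty("otherUniqueAlias", out _))
{
    var somethingElse = JsonSerializer.Deserialize<SomeClass>(line, options);
    // Parents precede their children in the file, so attach to the most recent product
    products[products.Count - 1].Something.Add(somethingElse);
}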
I have a formatted log file I'm trying to parse; the file is divided into sections, each with a header, and the data inside each section is formatted as JSON, as follows. Link to an extract of the log file here
[UnityCrossThreadLogger]1/8/2019 7:49:19 PM
==> Deck.GetDeckLists(112):
{
"jsonrpc": "2.0",
"method": "Deck.GetDeckLists",
"params": {},
"id": "112"
}
My issue here is manipulating the whole string so that I get to the section I want, strip the meaningless data there, and parse the remainder with Newtonsoft JSON. For now I'm cutting everything I don't need with this function, since the log file is in chronological order and only the latest occurrence of the entry is needed:
//Cut the whole log to the last entry
private static string CutLog(string fromWhereToCut)
{
    string log = GetLog();
    //In this case fromWhereToCut would be "Deck.GetDeckLists"
    string s = log.Substring(log.LastIndexOf(fromWhereToCut));
    return s;
}
The problem is that it leaves in place the header, which I need to remove before deserializing the JSON, and it's prone to breaking because the section names aren't that unique and could be repeated further down as non-header titles (as can be seen in my example). Furthermore, I don't know how to stop at the end of the section I need before another one begins.
I thought a regex could be used, but this seems way too big even for a regex, and maybe there's a better solution.
If the log is the same as the one found on Pastebin, this deserializes fine.
I'm using a support class (JSON_Logs) to contain the extracted data.
The JSON is read from a file in this simulation.
Reading the structure of the data, the most probable candidate to identify the start of the actual data is the recurring string "Deck.GetDeckLists". In the parsing method it's assigned to a variable called excludedSection.
The data starts right after the last one of those strings. I'm using logFile.LastIndexOf(excludedSection) to find the index of the last of these entries, then using this index to identify the first data structure.
JsonConvert.DeserializeObject is then used to deserialize the data into a List of class objects.
I didn't find any problem during the deserialization process.
string searchString = "Deck.GetDeckLists";
List<JSON_Logs.Header> jsonLogs = ParseJsonLog(searchString, "JSON_Logs.txt");

private List<JSON_Logs.Header> ParseJsonLog(string excludedSection, string fileName)
{
    string logFile = File.ReadAllText(fileName);
    int refIndex = logFile.LastIndexOf(excludedSection);
    logFile = logFile.Substring(logFile.IndexOf("[", refIndex));
    return JsonConvert.DeserializeObject<List<JSON_Logs.Header>>(logFile);
}
Support class:
public class JSON_Logs
{
    public class Header
    {
        public string id { get; set; }
        public string name { get; set; }
        public string description { get; set; }
        public string format { get; set; }
        public string resourceId { get; set; }
        public int deckTileId { get; set; }
        public MainDeck[] mainDeck { get; set; }
        public object[] sideboard { get; set; }
        public DateTime lastUpdated { get; set; }
        public bool lockedForUse { get; set; }
        public bool lockedForEdit { get; set; }
        public bool isValid { get; set; }
    }

    public class MainDeck
    {
        public string id { get; set; }
        public int quantity { get; set; }
    }
}
I hope this is what you need. :) Actually, the regex finds the JSON in all sections, but I only take the last one (matches[matches.Count - 1]). Since JToken doesn't have a TryParse method, you have to use try/catch:
static void ParseLog()
{
    var s = File.ReadAllText(@"C:\log.json");

    // "header" matches the "[Logger]date time AM/PM" line plus the "==> ..." line;
    // "body" lazily matches everything up to the next header or the end of the file.
    var pattern =
        @"(?s)(?'header'\[\w+\]\d{1,2}/\d{1,2}/\d{4}\s\d{1,2}:\d{1,2}:\d{1,2}\s(A|P)M\r\n" +
        @"<?==>?.+?\r\n)" +
        @"(?'body'.+?)(?=$|\[\w+\]\d{1,2}/\d{1,2}/\d{4}\s\d{1,2}:\d{1,2}:\d{1,2}\s(A|P)M)";

    var matches = Regex.Matches(s, pattern);
    if (matches.Count > 0)
    {
        JToken last_json = null;
        try
        {
            var text = matches[matches.Count - 1].Groups["body"].Value;
            last_json = JToken.Parse(text);
            WriteLine(last_json.ToString()); // assumes "using static System.Console;"
        }
        catch (Exception ex) { WriteLine(ex.ToString()); }
    }
    else
    {
        WriteLine("No matches found");
    }
}
I have a CSV file and its structure looks like this:
Amount;P_price;Ean;Number;Name;DPH;certifikate;o1;o2;ZC
0;168,00;8806333394584;E1347;MISSHA Gel;21;106;0002;0001;290
0;156,80;8806336488488;E1357;MISSHA Lotion;21;106;0002;0001;271
0;123,20;8806584752571;E1367;MISSHA Mist;21;106;0002;0001;213
I want to load all rows except the first one, which contains the column names. I want to save these values into a SQL table. I know how to write to SQL, but I need to know how I could load the value from each row into these variables:
Amount
P_price
Ean
Number
Name
DPH
certifikate
o1
o2
ZC
The file path is stored in: string file;
Do you have any ideas?
If you are sure that's the format and there will never be any ; in the actual values, then you can use a quick and dirty method:
foreach (String line in File.ReadAllLines(path).Skip(1))
{
    String[] columns = line.Split(';');
    String amount = columns[0];
    String P_price = columns[1];
    //etc
}
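If the columns should end up as typed values rather than strings, a possible continuation of the quick-and-dirty approach could look like the sketch below (assumptions: prices such as "168,00" use a comma as the decimal separator, o1/o2 keep their leading zeros as strings, and "file" is the path variable mentioned in the question):
using System;
using System.Globalization;
using System.IO;
using System.Linq;

// Prices like "168,00" use a comma as the decimal separator, so parse with a matching format
var numberFormat = new NumberFormatInfo { NumberDecimalSeparator = "," };

foreach (string line in File.ReadAllLines(file).Skip(1))
{
    string[] columns = line.Split(';');

    int     amount      = int.Parse(columns[0]);
    decimal p_price     = decimal.Parse(columns[1], numberFormat);
    string  ean         = columns[2];
    string  number      = columns[3];
    string  name        = columns[4];
    int     dph         = int.Parse(columns[5]);
    int     certifikate = int.Parse(columns[6]);
    string  o1          = columns[7]; // kept as string to preserve leading zeros
    string  o2          = columns[8];
    int     zc          = int.Parse(columns[9]);

    // insert the typed values into the SQL table here
}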
Please try the approach in the link below and let me know if you need any help:
http://www.mssqltips.com/sqlservertutorial/203/simple-way-to-import-data-into-sql-server/
In SQL Server:
BULK INSERT [dbo].[csv]
FROM 'C:\Users\...' --file path
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = ';', --CSV field delimiter (the sample file uses ';')
    ROWTERMINATOR = '\n',  --used to move to the next row
    ERRORFILE = 'C:\Users\file\...',
    TABLOCK
)
DRY - Don't Repeat Yourself
http://www.filehelpers.com/ will help you work with CSV files.
Example:
File with CSV:
10248|VINET|04071996|32.38
10249|TOMSP|05071996|11.61
10250|HANAR|08071996|65.83
10251|VICTE|08071996|41.34
...............
Data transfer object:
[DelimitedRecord("|")]
public class Orders
{
public int OrderID;
public string CustomerID;
[FieldConverter(ConverterKind.Date, "ddMMyyyy")]
public DateTime OrderDate;
public decimal Freight;
}
Reader:
FileHelperEngine<Orders> engine = new FileHelperEngine<Orders>();
// to Read use:
Orders[] res = engine.ReadFile("TestIn.txt");
And then, for your file:
[DelimitedRecord(";")]
[IgnoreFirst(1)]
public class RootObject
{
public string Amount { get; set; }
public string P_price { get; set; }
public object Ean { get; set; }
public string Number { get; set; }
public string Name { get; set; }
public int DPH { get; set; }
public int certifikate { get; set; }
public int o1 { get; set; }
public int o2 { get; set; }
public int ZC { get; set; }
}
FileHelperEngine<RootObject> engine = new FileHelperEngine<RootObject>();
// to Read use:
RootObject[] res = engine.ReadFile("TestIn.txt");
So I am trying to read a CSV file that is essentially a list of values separated by commas, and what I have done so far is use ReadAllLines and then split each line with text.Split(',');
The only problem is that I have just read about using this sort of list/class approach rather than creating an actual array, so I have no clue how to call it or use it. Here is what I have so far:
using System;
using System.IO;

public class Earthquake
{
    public double Magnitude { get; set; }
    public string Location { get; set; }
    public double Latitude { get; set; }
    public double Longitude { get; set; }
    public double depth { get; set; }
    public string date { get; set; }
    public string EventID { get; set; }
    public string URL { get; set; }

    public Earthquake(double magna, string locate, double lat, double longi, double dept, string dat, string Event, string website)
    {
        Magnitude = magna;
        Location = locate;
        Latitude = lat;
        Longitude = longi;
        depth = dept;
        date = dat;
        EventID = Event;
        URL = website;
    }
}

public class ManageData
{
    public int count;

    public void getData()
    {
        string[] text = File.ReadAllLines(@"Earthquakes.csv");
        foreach (string word in text[count].Split(','))
        {
            //here i want to put each data in the Earthquake class
        }
    }
}
You may replace
foreach (string word in text[count].Split(','))
{
    //here i want to put each data in the Earthquake class
}
with the following lines and see if it helps:
foreach (string line in text)
{
    string[] myColumns = line.Split(',');
    // Numeric columns are parsed so they match the constructor's parameter types
    Earthquake eQ = new Earthquake(double.Parse(myColumns[0]), myColumns[1], double.Parse(myColumns[2]),
        double.Parse(myColumns[3]), double.Parse(myColumns[4]), myColumns[5], myColumns[6], myColumns[7]);
    //Add eQ into a list you want to save
}
This assumes the columns are in the same order as the constructor parameters. If the order is different, then rather than using this constructor, use a default constructor and assign values to the properties of the Earthquake class accordingly.
Hope that helps.
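For reference, here is one way the whole getData method could look once the pieces are put together (only a sketch; skipping the first line as a header and parsing with CultureInfo.InvariantCulture are assumptions about the file, so adjust them to the actual data):
using System.Collections.Generic;
using System.Globalization;
using System.IO;

public class ManageData
{
    public List<Earthquake> Earthquakes { get; } = new List<Earthquake>();

    public void getData()
    {
        string[] text = File.ReadAllLines(@"Earthquakes.csv");

        // Assumption: the first line is a header row; start at 0 if it is not
        for (int i = 1; i < text.Length; i++)
        {
            string[] c = text[i].Split(',');
            Earthquakes.Add(new Earthquake(
                double.Parse(c[0], CultureInfo.InvariantCulture),  // magnitude
                c[1],                                              // location
                double.Parse(c[2], CultureInfo.InvariantCulture),  // latitude
                double.Parse(c[3], CultureInfo.InvariantCulture),  // longitude
                double.Parse(c[4], CultureInfo.InvariantCulture),  // depth
                c[5],                                              // date
                c[6],                                              // event id
                c[7]));                                            // url
        }
    }
}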
This project helped me a lot when I needed to manage CSV files: C# CSV Reader and Writer.