I've put together a CSV importer which I assume works, but I get the error below. How do I allow this column to be null, so that when a record is added to the table the ID is set automatically? I've tried:
csv.Configuration.WillThrowOnMissingFields = false;
but it isn't recognised. This is the error I get when attempting to upload:
CsvHelper.ValidationException: 'Header matching ['ID'] names at index 0 was not found. If you are expecting some headers to be missing and want to ignore this validation, set the configuration HeaderValidated to null. You can also change the functionality to do something else, like logging the issue.'
[HttpPost]
[ActionName("CreateBulk")]
public ActionResult CreateBulkUpload()
{
    object db;
    var file = Request.Files["attachmentcsv"];
    using (var csv = new CsvReader(new StreamReader(file.InputStream), true))
    {
        var records = csv.GetRecords<Client>().ToList();
        foreach (var item in records)
        {
            var strip = item.homePage.Replace("https://www.", "").Replace("http://www.", "")
                .Replace("https://", "").Replace("http://", "").Replace("www.", "");
            string[] URLtests =
                { "https://www." + strip, "http://www." + strip, "https://" + strip, "http://" + strip };
            string[] Metric = MajesticFunctions.MajesticChecker(URLtests);
            var userId = User.Identity.GetHashCode();
            var UserTableID = 1;
            var newclient = new Client
            {
                clientN = item.clientN,
                homePage = Metric[0],
                clientEmail = item.clientEmail,
                monthlyQuota = item.monthlyQuota,
                TrustFlow = Int32.Parse(Metric[1]),
                CitationFlow = Int32.Parse(Metric[2]),
                RI = Int32.Parse(Metric[3]),
                MJTopicsID = item.MJTopicsID,
                UserTableID = UserTableID
            };
            ViewBag.newdomain = newclient;
            return RedirectToAction("Index");
        }
    }
    return RedirectToAction("Index");
}
Did you try the suggestion mentioned in the error message? Like this:
csv.Configuration.HeaderValidated = null;
The developer made some breaking changes this year, so the accepted answer will no longer work.
Instead, you have to create a configuration object in advance and pass it into the constructor:
var config = new CsvConfiguration(CultureInfo.InvariantCulture)
{
    HeaderValidated = null
};

using (var reader = new StreamReader(file))
using (var csv = new CsvReader(reader, config))
Make sure to include both these lines:
csv.Configuration.HeaderValidated = null;
csv.Configuration.MissingFieldFound = null;
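In recent CsvHelper releases csv.Configuration is read-only, so both settings belong on the configuration object instead. A minimal sketch, assuming a Client class whose properties match the CSV columns:

using System.Globalization;
using System.IO;
using System.Linq;
using CsvHelper;
using CsvHelper.Configuration;

var config = new CsvConfiguration(CultureInfo.InvariantCulture)
{
    // Skip header validation so the missing ID header no longer throws.
    HeaderValidated = null,
    // Ignore fields absent from the file; ID stays at its default value.
    MissingFieldFound = null
};

using (var reader = new StreamReader(file.InputStream))
using (var csv = new CsvReader(reader, config))
{
    var records = csv.GetRecords<Client>().ToList();
}

With ID left unmapped, the database is free to assign it on insert (e.g. an identity column).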
I have a .csv file structured like so (the first row is the header row). For each SomeID I need to add the NetCharge values together (or subtract, if the code calls for it) and put each item into its own column, keyed by the Code column.
Here's the file I receive:
SomeID,OrderNumber,Code,NetCharge,Total
23473,30388,LI,126.0000, 132.00
96021, 000111, LI, 130.00, 126.00
23473,30388,FU,6.0000, 132.00
4571A,10452,LI,4100.0000, 4325.0000
4571A,10452,FU,150.00,4325.0000
4571A,10452,DT,75.00,4325.0000
I need to insert the data into my SQL table, which is structured like this. This is what I'm aiming for:
ID     OrderNumber  LICode  LICodeValue  FUCode  FUCodeValue  DTCode  DTCodeValue  Total
23473  30388        LI      126.0000     FU      6.0000       NULL    NULL         132.0000
4571A  10452        LI      4100.0000    FU      150.0000     DT      75.00        4325.0000
My SomeID will not always be grouped together like the 4571A id is. I basically need to iterate over this file and create one record for each SomeID. I cannot seem to find a way with CsvHelper. I'm using C# and CsvHelper. I have tried this so far, but I cannot get back to the SomeID after passing on to the next one:
using (var reader = new StreamReader(@"C:\testFiles\some.csv"))
using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
{
    var badRecords = new List<string>();
    var isRecordBad = false;
    csv.Configuration.HasHeaderRecord = true;
    csv.Configuration.HeaderValidated = null;
    csv.Configuration.IgnoreBlankLines = true;
    csv.Configuration.Delimiter = ",";
    csv.Configuration.BadDataFound = context =>
    {
        isRecordBad = true;
        badRecords.Add(context.RawRecord);
    };
    csv.Configuration.MissingFieldFound = (s, i, context) =>
    {
        isRecordBad = true;
        badRecords.Add(context.RawRecord);
    };
    List<DataFile> dataFile = csv.GetRecords<DataFile>().ToList();
    // initialize variable
    string lastSomeId = "";
    if (!isRecordBad)
    {
        foreach (var item in dataFile)
        {
            // check if it's the same record
            if (lastSomeId != item.SomeID)
            {
                MyClass someClass = new MyClass();
                lastSomeId = item.SomeID;
                //decimal? LI = 0; // was going to use these as vars for calculations, not sure I need them???
                //decimal? DSC = 0;
                //decimal? FU = 0;
                someClass.Id = lastSomeId;
                someClass.OrdNum = item.OrderNumber;
                if (item.Code == "LI")
                {
                    someClass.LICode = item.Code;
                    someClass.LICodeValue = item.NetCharge;
                }
                if (item.Code == "DT")
                {
                    someClass.DTCode = item.Code;
                    someClass.DTCodeValue = item.NetCharge;
                }
                if (item.Code == "FU")
                {
                    someClass.FUCode = item.Code;
                    someClass.FUCodeValue = item.NetCharge;
                }
                someClass.Total = (someClass.LICodeValue + someClass.FUCodeValue);
                // check for other values to calculate
                // insert record to DB
            }
            else
            {
                // insert into db after manipulation of values
            }
        }
    }
    isRecordBad = false;
} // END Using
Any clues would be greatly appreciated. Thank you in advance.
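One approach (a sketch, not a drop-in answer: it assumes the DataFile and MyClass shapes from the code above, and that Total is repeated on every row of a given SomeID) is to parse everything first and then group by SomeID, so rows that are not adjacent in the file still collapse into one record:

// Group parsed rows by SomeID so non-adjacent rows still merge into one record.
var grouped = dataFile
    .GroupBy(r => r.SomeID)
    .Select(g =>
    {
        var record = new MyClass
        {
            Id = g.Key,
            OrdNum = g.First().OrderNumber,
            Total = g.First().Total
        };
        foreach (var row in g)
        {
            if (row.Code == "LI") { record.LICode = "LI"; record.LICodeValue = row.NetCharge; }
            if (row.Code == "FU") { record.FUCode = "FU"; record.FUCodeValue = row.NetCharge; }
            if (row.Code == "DT") { record.DTCode = "DT"; record.DTCodeValue = row.NetCharge; }
        }
        return record;
    })
    .ToList();
// Each entry in grouped is now one row ready to insert into the SQL table.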
I have this function that opens a .txt file with some products and inserts them line by line into the SQLite db. The process is working fine; the problem is that the file contains 2000+ lines, and because of that the process takes several hours to finish. I wonder if there is a way to make the process a little faster.
Here is the function:
private void carrega_produtos()
{
    var assembly = typeof(sincroniza_page).GetTypeInfo().Assembly;
    foreach (var res in assembly.GetManifestResourceNames())
    {
        if (res.Contains("produtos.txt"))
        {
            Stream stream = assembly.GetManifestResourceStream(res);
            var st = res.Count();
            using (var reader = new StreamReader(stream))
            {
                string linha;
                acesso_banco_produtos banco = new acesso_banco_produtos();
                while ((linha = reader.ReadLine()) != null)
                {
                    List<string> lista = linha.Split(new char[] { '§' }).ToList();
                    var cod = int.Parse(lista.ElementAt(0));
                    var nome_prod = lista.ElementAt(1);
                    var cod_grupo = lista.ElementAt(2);
                    var nm_grupo = lista.ElementAt(3);
                    var ind_ativo = lista.ElementAt(4);
                    var val_custo_unit = lista.ElementAt(5);
                    var val_custo = lista.ElementAt(6);
                    var perc_imposto = lista.ElementAt(7);
                    var unidade_med = lista.ElementAt(8);
                    var qtd_mes_1 = lista.ElementAt(9);
                    var qtd_mes_2 = lista.ElementAt(10);
                    var qtd_mes_3 = lista.ElementAt(11);
                    var qtd_mes_6 = lista.ElementAt(12);
                    var qtd_mes_12 = lista.ElementAt(13);
                    var data = lista.ElementAt(14);
                    var bd = new banco_produtos()
                    {
                        cod_produto = cod,
                        nm_produto = nome_prod,
                        cod_grupo = cod_grupo,
                        nm_grupo = nm_grupo,
                        ind_ativo = ind_ativo,
                        val_custo_unitario = Double.Parse(val_custo_unit),
                        val_lista_preco = val_custo,
                        perc_impostos = perc_imposto,
                        unidade_medida = unidade_med,
                        qtde_vendida_mes_1 = qtd_mes_1,
                        qtde_vendida_mes_2 = qtd_mes_2,
                        qtde_vendida_mes_3 = qtd_mes_3,
                        qtde_vendida_mes_6 = qtd_mes_6,
                        qtde_vendida_mes_12 = qtd_mes_12
                    };
                    // here I check whether the new product already exists in the DB
                    var procura = banco.get_produto(cod);
                    if (procura == null)
                    {
                        // here it is inserted into the db
                        banco.inserir_produto(bd);
                    }
                }
                valor += 1;
            }
        }
    }
}
I'm not sure what is inside your method that inserts data into the db, but the most common issue with SQLite and massive inserts is that SQLite, by default, wraps every insert in its own transaction, which creates significant overhead. A good practice for such cases is to use a single transaction for all the inserts, which should significantly improve performance; see the example below.
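For example, with sqlite-net (a sketch only, assuming conn is the SQLiteConnection wrapped by acesso_banco_produtos, produtos is the list parsed from the file, and cod_produto is marked as the [PrimaryKey] of banco_produtos):

// One explicit transaction around all inserts instead of one implicit transaction per row.
conn.RunInTransaction(() =>
{
    foreach (var produto in produtos)
    {
        // Keep the existing duplicate check, but inside the shared transaction.
        if (conn.Find<banco_produtos>(produto.cod_produto) == null)
            conn.Insert(produto);
    }
});

When no duplicate check is needed, sqlite-net's conn.InsertAll(produtos) batches the whole list in a single transaction in one call.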
I did what @Dmytro said; I used the "insertORIgnore" method. It improved a lot using that method. Thank you for the help.
I am not able to add or update the Milestones field for Features in Rally. If anyone has code available in C# to update it, please share it with me. I have been searching and trying for the last week with no luck.
When I try to add/update milestones on a Feature, I get the error "Could not read: Could not read referenced object null". My code is as follows:
public DynamicJsonObject UpdateFeaturesbyName(string fea, string bFun)
{
    //getting list of Feature.
    Request feat = new Request("PortfolioItem/Feature");
    feat.Query = new Query("Name", Query.Operator.Equals, fea);
    QueryResult TCSResults = restApi.Query(feat);
    foreach (var res in TCSResults.Results)
    {
        var steps = res["Milestones"];
        Request tsteps = new Request(steps);
        QueryResult tstepsResults = restApi.Query(tsteps);
        foreach (var item in tstepsResults.Results)
        {
        }
        if (res.Name == fea)
        {
            var targetFeature = TCSResults.Results.FirstOrDefault();
            DynamicJsonObject toUpdate = new DynamicJsonObject();
            //toUpdate["Milestones"] = "";
            //CreateResult createResult = restApi.Create(steps._ref, toUpdate);
            //String contentRef = steps._ref;
            //String contentRef = createResult._ref;
            string[] value = null;
            string AccCri = string.Empty;
            if (!string.IsNullOrWhiteSpace(bFun))
            {
                value = bFun.Split(new string[] { "\r\n", "\n" }, StringSplitOptions.None);
                foreach (string item in value)
                {
                    //if (string.IsNullOrWhiteSpace(AccCri))
                    //    AccCri = item;
                    //else
                    //    AccCri = AccCri + "<br/>" + item;
                    if (!string.IsNullOrWhiteSpace(item))
                    {
                        //Query for Milestone.
                        Request ms = new Request("Milestone");
                        ms.Fetch = new List<string>() { "Name", "ObjectID" };
                        ms.Query = new Query("Name", Query.Operator.Equals, item);
                        QueryResult msResults = restApi.Query(ms);
                        var targetMLResult = msResults.Results.FirstOrDefault();
                        long MLOID = targetMLResult["ObjectID"];
                        DynamicJsonObject tarML = restApi.GetByReference("Milestone", MLOID, "Name", "_ref", "DisplayColor");
                        DynamicJsonObject targetML = new DynamicJsonObject();
                        targetML["Name"] = tarML["Name"];
                        //targetML["_ref"] = tarML["_ref"];
                        targetML["_ref"] = "/milestone/" + Convert.ToString(MLOID);
                        targetML["DisplayColor"] = tarML["DisplayColor"];
                        // Grab collection of existing Milestones.
                        var existingMilestones = targetFeature["Milestones"];
                        long targetOID = targetFeature["ObjectID"];
                        // Milestones collection on object is expected to be a System.Collections.ArrayList.
                        var targetMLArray = existingMilestones;
                        var tagList2 = targetMLArray["_tagsNameArray"];
                        tagList2.Add(targetML);
                        //targetMLArray.Add(targetML);
                        targetMLArray["_tagsNameArray"] = tagList2;
                        toUpdate["Milestones"] = targetMLArray;
                        OperationResult updateResult = restApi.Update(res._ref, toUpdate);
                        bool resp = updateResult.Success;
                    }
                }
            }
            //toUpdate["c_AcceptanceCriteria"] = AccCri;
            //OperationResult updateResult = restApi.Update(res._ref, toUpdate);
        }
    }
    var features = TCSResults.Results.Where(p => p.Name == fea).FirstOrDefault();
    var featuresref = features._ref;
    return features;
}
Now that v3.1.1 of the toolkit has been released, you can use the AddToCollection method to do this.
Otherwise, you can still always just update the full collection. The value should be an ArrayList of objects with _ref properties.
Check out this example (which adds tasks to defects, but should be very similar to what you're doing): https://github.com/RallyCommunity/rally-dot-net-rest-apps/blob/master/UpdateTaskCollectionOnDefect/addTaskOnDefect.cs
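A rough sketch of that call (assuming restApi is an authenticated RallyRestApi and featureRef/milestoneRef were looked up beforehand; check your toolkit version for the exact signature):

// Each item to add only needs a _ref.
var milestone = new DynamicJsonObject();
milestone["_ref"] = milestoneRef; // e.g. "/milestone/12345"
var itemsToAdd = new List<DynamicJsonObject> { milestone };

// v3.1.1+: appends to the Milestones collection instead of replacing it.
// The last argument is a System.Collections.Specialized.NameValueCollection of extra query parameters.
OperationResult result = restApi.AddToCollection(featureRef, "Milestones", itemsToAdd, new NameValueCollection());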
The application I am building allows a user to upload a .csv file, which will ultimately fill in fields of an existing SQL table where the Ids match. First, I am using LinqToCsv and a foreach loop to import the .csv into a temporary table. Then I have another foreach loop that copies the fields from the temporary table into an existing table where the Ids match. The only way I have gotten this to work consistently and successfully is by nesting the second foreach loop within the first:
[HttpPost]
public ActionResult UploadValidationTable(HttpPostedFileBase csvFile)
{
    var inputFileDescription = new CsvFileDescription
    {
        SeparatorChar = ',',
        FirstLineHasColumnNames = true
    };
    var cc = new CsvContext();
    var filePath = uploadFile(csvFile.InputStream);
    var model = cc.Read<Credit>(filePath, inputFileDescription);
    try
    {
        var entity = new TestEntities();
        foreach (var item in model)
        {
            var tc = new TemporaryCsvUpload
            {
                Id = item.Id,
                CreditInvoiceAmount = item.CreditInvoiceAmount,
                CreditInvoiceDate = item.CreditInvoiceDate,
                CreditInvoiceNumber = item.CreditInvoiceNumber,
                CreditDeniedDate = item.CreditDeniedDate,
                CreditDeniedReasonId = item.CreditDeniedReasonId,
                CreditDeniedNotes = item.CreditDeniedNotes
            };
            entity.TemporaryCsvUploads.Add(tc);
            var idMatches = entity.Authorizations.ToList().Where(x => x.Id == tc.Id);
            foreach (var number in idMatches)
            {
                number.CreditInvoiceDate = tc.CreditInvoiceDate;
                number.CreditInvoiceNumber = tc.CreditInvoiceNumber;
                number.CreditInvoiceAmount = tc.CreditInvoiceAmount;
                number.CreditDeniedDate = tc.CreditDeniedDate;
                number.CreditDeniedReasonId = tc.CreditDeniedReasonId;
                number.CreditDeniedNotes = tc.CreditDeniedNotes;
            }
        }
        entity.SaveChanges();
        entity.Database.ExecuteSqlCommand("TRUNCATE TABLE TemporaryCsvUpload");
        TempData["Success"] = "Updated Successfully";
    }
    catch (LINQtoCSVException)
    {
        TempData["Error"] = "Upload Error: Ensure you have the correct header fields and that the file is of .csv format.";
    }
    return View("Upload");
}
The issue is speed. It takes about 1 minute and 49 seconds to search through an SQL table of 7000 entries, match the ids, and fill in the fields.
So, I looked at this and thought that the second loop really didn't need to be nested. I switched up the code like so:
[HttpPost]
public ActionResult UploadValidationTable(HttpPostedFileBase csvFile)
{
    var inputFileDescription = new CsvFileDescription
    {
        SeparatorChar = ',',
        FirstLineHasColumnNames = true
    };
    var cc = new CsvContext();
    var filePath = uploadFile(csvFile.InputStream);
    var model = cc.Read<Credit>(filePath, inputFileDescription);
    try
    {
        var entity = new TestEntities();
        var tc = new TemporaryCsvUpload();
        foreach (var item in model)
        {
            tc.Id = item.Id;
            tc.CreditInvoiceAmount = item.CreditInvoiceAmount;
            tc.CreditInvoiceDate = item.CreditInvoiceDate;
            tc.CreditInvoiceNumber = item.CreditInvoiceNumber;
            tc.CreditDeniedDate = item.CreditDeniedDate;
            tc.CreditDeniedReasonId = item.CreditDeniedReasonId;
            tc.CreditDeniedNotes = item.CreditDeniedNotes;
            entity.TemporaryCsvUploads.Add(tc);
        }
        var idMatches = entity.Authorizations.ToList().Where(x => x.Id == tc.Id);
        foreach (var number in idMatches)
        {
            number.CreditInvoiceDate = tc.CreditInvoiceDate;
            number.CreditInvoiceNumber = tc.CreditInvoiceNumber;
            number.CreditInvoiceAmount = tc.CreditInvoiceAmount;
            number.CreditDeniedDate = tc.CreditDeniedDate;
            number.CreditDeniedReasonId = tc.CreditDeniedReasonId;
            number.CreditDeniedNotes = tc.CreditDeniedNotes;
        }
        entity.SaveChanges();
        entity.Database.ExecuteSqlCommand("TRUNCATE TABLE TemporaryCsvUpload");
        TempData["Success"] = "Updated Successfully";
    }
    catch (LINQtoCSVException)
    {
        TempData["Error"] = "Upload Error: Ensure you have the correct header fields and that the file is of .csv format.";
    }
    return View("Upload");
}
This time around, it only took 19 seconds to complete. A vast improvement on the first. But when I checked the database, only one row of the 7 that should match was filled in. Can anybody spot a reason why the second code block would not be filling in all the rows it should be? Or a better way to optimize the first block? Thanks!
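One thing stands out in the second block (an observation on the posted code, plus a sketch): tc is created once outside the loop, so every iteration mutates and re-adds the same instance, and the idMatches query runs only once, after the loop, using the last row's Id. That explains the single updated row. A way to keep the single-pass speed while matching every row (assuming the Ids in the CSV are unique, and skipping the temporary table since the matching happens in memory):

// Index the CSV rows by Id once, then make one pass over Authorizations.
var csvById = model.ToDictionary(item => item.Id);
foreach (var auth in entity.Authorizations)
{
    Credit item;
    if (csvById.TryGetValue(auth.Id, out item))
    {
        auth.CreditInvoiceDate = item.CreditInvoiceDate;
        auth.CreditInvoiceNumber = item.CreditInvoiceNumber;
        auth.CreditInvoiceAmount = item.CreditInvoiceAmount;
        auth.CreditDeniedDate = item.CreditDeniedDate;
        auth.CreditDeniedReasonId = item.CreditDeniedReasonId;
        auth.CreditDeniedNotes = item.CreditDeniedNotes;
    }
}
entity.SaveChanges();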
I am developing a web application in which I have to import data into SQL Server from given Excel files, using C# and ASP.NET MVC. For this purpose I followed this article, so I used ExcelDataReader to read the Excel files. Furthermore, I have used SqlBulkCopy in my code to insert the data into the database. Following is my code:
The Create method
var bData = getBillData();
var connString = ConfigurationManager.ConnectionStrings["WASABill"].ConnectionString;
DataTable table = new DataTable();
using (var reader = ObjectReader.Create(bData))
{
    table.Load(reader);
}
using (SqlBulkCopy bcp = new SqlBulkCopy(connString))
{
    bcp.ColumnMappings.Add("AccountNo", "AccountNo");
    bcp.ColumnMappings.Add("BillNo", "BillNo");
    bcp.ColumnMappings.Add("Category", "Category");
    bcp.ColumnMappings.Add("Billing_Period", "Billing_Period");
    bcp.ColumnMappings.Add("Name", "Name");
    bcp.ColumnMappings.Add("Address", "Address");
    bcp.ColumnMappings.Add("Issue_Date", "Issue_Date");
    bcp.ColumnMappings.Add("Due_Date", "Due_Date");
    bcp.ColumnMappings.Add("Water_Bill", "Water_Bill");
    bcp.ColumnMappings.Add("Sewerage_Bill", "Sewerage_Bill");
    bcp.ColumnMappings.Add("Aquifer_Charges", "Aquifer_Charges");
    bcp.ColumnMappings.Add("Current_Amount", "Current_Amount");
    bcp.ColumnMappings.Add("Arrears", "Arrears");
    bcp.ColumnMappings.Add("Service_Charges", "Service_Charges");
    bcp.ColumnMappings.Add("Payable_within_DueDate", "Payable_within_DueDate");
    bcp.ColumnMappings.Add("Surcharge", "Surcharge");
    bcp.ColumnMappings.Add("Payable_after_DueDate", "Payable_after_DueDate");
    bcp.ColumnMappings.Add("Payment_History_1", "Payment_History_1");
    bcp.ColumnMappings.Add("Paid_1", "Paid_1");
    bcp.ColumnMappings.Add("Payment_History_2", "Payment_History_2");
    bcp.ColumnMappings.Add("Paid_2", "Paid_2");
    bcp.ColumnMappings.Add("Payment_History_3", "Payment_History_3");
    bcp.ColumnMappings.Add("Paid_3", "Paid_3");
    bcp.ColumnMappings.Add("Area", "Area");
    bcp.ColumnMappings.Add("Water_Rate", "Water_Rate");
    bcp.ColumnMappings.Add("Sewerage_Rate", "Sewerage_Rate");
    bcp.ColumnMappings.Add("Discharge_Basis", "Discharge_Basis");
    bcp.ColumnMappings.Add("Pump_Size", "Pump_Size");
    bcp.ColumnMappings.Add("Ferrule_Size", "Ferrule_Size");
    bcp.ColumnMappings.Add("Meter_Type", "Meter_Type");
    bcp.ColumnMappings.Add("Meter_Status", "Meter_Status");
    bcp.ColumnMappings.Add("Last_Readin", "Last_Readin");
    bcp.ColumnMappings.Add("Current_Reading", "Current_Reading");
    bcp.ColumnMappings.Add("Water_Aquiffer_Charges", "Water_Aquiffer_Charges");
    bcp.DestinationTableName = "WASA_Bill_Detail";
    bcp.WriteToServer(table);
}
var rowCount = table.Rows.Count; // Number of rows in data table
//if (ModelState.IsValid)
//{
//    db.WASA_Bill_Detail.Add(wASA_Bill_Detail);
//    db.SaveChanges();
//    return RedirectToAction("Index");
//}
TempData["RowCount"] = rowCount;
return RedirectToAction("Index");
The method which reads the Excel file and returns the data as a list:
public IEnumerable<WASA_Bill_Detail> getBillData()
{
    List<WASA_Bill_Detail> billDetaileList = new List<WASA_Bill_Detail>();
    //string path = @TempData["FilePath"].ToString(); //@"E:\W317.xlsx";
    string path = TempData["FilePath"].ToString();
    string excelpath = Server.MapPath(path);
    if (path != null)
    {
        var excelData = new ExcelData(excelpath);
        var billRecords = excelData.getData("Sheet1");
        foreach (var row in billRecords)
        {
            var billDetail = new WASA_Bill_Detail()
            {
                AccountNo = row["ACCOUNT#"].ToString(),
                BillNo = row["BILLNO"].ToString(),
                Category = row["CATEGORY"].ToString(),
                Billing_Period = row["BILLING_PERIOD"].ToString(),
                Name = row["NAME"].ToString(),
                Address = row["ADDRESS"].ToString(),
                Issue_Date = row["ISSUE_DATE"].ToString(),
                Due_Date = row["DUE_DATE"].ToString(),
                Water_Bill = row["WATER_BILL"].ToString(),
                Sewerage_Bill = row["SEWERAGE BILL"].ToString(),
                Aquifer_Charges = row["AQUIFER"].ToString(),
                Current_Amount = row["CURRENT AMOUNT"].ToString(),
                Arrears = row["ARREARS"].ToString(),
                Service_Charges = row["SERVICE CHARGES"].ToString(),
                Payable_within_DueDate = row["PAYABLE WITHIN DUEDATE"].ToString(),
                Surcharge = row["SURCHARGE"].ToString(),
                Payable_after_DueDate = row["AFTER DUE DATE"].ToString(),
                Payment_History_1 = row["PAY HISTORY 1"].ToString(),
                Paid_1 = row["PAID 1"].ToString(),
                Payment_History_2 = row["PAY HISOTRY 2"].ToString(),
                Paid_2 = row["PAID 2"].ToString(),
                Payment_History_3 = row["PAY HISOTRY 3"].ToString(),
                Paid_3 = row["PAID 3"].ToString(),
                Area = row["AREA"].ToString(),
                Water_Rate = row["WATER RATE"].ToString(),
                Sewerage_Rate = row["SEWER RATE"].ToString(),
                Discharge_Basis = row["DISCHAGE"].ToString(),
                Pump_Size = row["PUMP SIZE"].ToString(),
                Ferrule_Size = row["FERRULE SIZE"].ToString(),
                Meter_Type = row["METER TYPE"].ToString(),
                Meter_Status = row["METER STATUS"].ToString(),
                Last_Readin = row["LAST READING"].ToString(),
                Current_Reading = row["CURRENT READING"].ToString(),
                Water_Aquiffer_Charges = row["AQUIFER CHARGES"].ToString(),
            };
            billDetaileList.Add(billDetail);
        }
    }
    return billDetaileList;
}
Everything is working fine on my development machine. The file is uploaded properly and then inserted into the database using bcp.
But when I publish this to the hosting server, a NullReferenceException occurs at
WASAWeb.Controllers.AdminControllers.WASA_Bill_DetailController.getBillData() +128
I cannot understand this, as it works 100% fine on my development machine. I have checked that the file is properly uploaded to the server.
Any help with this?
You can use this:
private string GetStringValue(object obj)
{
    string str = null;
    if (obj != null)
        str = obj.ToString().Trim();
    return str;
}
Call it like this:
......
AccountNo = GetStringValue(row["ACCOUNT#"])
......
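If you prefer not to add a helper, a null-conditional call such as row["ACCOUNT#"]?.ToString().Trim() (C# 6 and later) gives the same guard inline.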