Getting "0" (invalid data ) from Sqlite command - c#

I am trying to fetch a column from a table, but I get a 0 value even though the column contains the value (2).
Here is my code:
obj.GetCon().Open();
obj.cmd = new SQLiteCommand("SELECT Payable FROM VendorLedgers WHERE VendorId=" + Convert.ToInt32(payBill.VendorId), obj.con);
obj.read = obj.cmd.ExecuteReader();
while (obj.read.Read())
{
    MessageBox.Show(obj.read.GetValue(0).ToString());
    dues = obj.read["Payable"].ToString();
    string[] split_due = dues.Split('(', ')');
    int dueamount1 = Convert.ToInt32(split_due[1]); // it throws "Index was outside the bounds of the array" here
    dueamount.Add(dueamount1);
}
obj.GetCon().Close();
And here is my query output on the database.
Please help if anyone knows how to fix it.
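The "index was outside the bounds" error comes from the Split call: a value stored as "(2)" splits into three parts, but a plain "0" yields only one, so split_due[1] does not exist. A minimal, self-contained sketch of that behavior (the ParseDue helper name is my own, not from the question):

```csharp
using System;

public class SplitDemo
{
    // Hypothetical helper mirroring the question's parsing logic.
    public static int ParseDue(string dues)
    {
        // "(2)".Split('(', ')') -> ["", "2", ""], so index 1 holds the number.
        // "0".Split('(', ')')   -> ["0"] only, and index 1 would throw.
        string[] parts = dues.Split('(', ')');
        return parts.Length > 1 ? Convert.ToInt32(parts[1])
                                : Convert.ToInt32(parts[0]);
    }

    public static void Main()
    {
        Console.WriteLine(ParseDue("(2)")); // 2
        Console.WriteLine(ParseDue("0"));   // 0 instead of an exception
    }
}
```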

I have not worked with SQLite, but with while (obj.read.Read()) you read one row at a time, then access the corresponding column with obj.read["Payable"].
So why do you need Split at all?
while (obj.read.Read())
{
    MessageBox.Show(obj.read.GetInt32(0).ToString());
    int dueamount1 = obj.read.GetInt32(0);
    dueamount.Add(dueamount1);
}

Related

This code will not work in calculating the total of all data within a specific column of my database and I can't see why

When I run my program I get the error "Object reference not set to an instance of an object". I'm not sure what this is referring to or how to fix it. I am trying to calculate the sum of all values within a column of my database. Any help would be great.
DataTable table = ds.Tables["Finance"];
//Error Here
int sumIncome = Convert.ToInt32(table.Compute("SUM(Income)", string.Empty));
int sumExpenditure = Convert.ToInt32(table.Compute("SUM(Expenditure)", string.Empty));
int calc;
calc = sumIncome - sumExpenditure;
string ProfitLoss = $"{calc:C2}";
lblProfitLoss.Text = ProfitLoss;
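Two things in this snippet commonly produce that exception: ds.Tables["Finance"] returns null when no table by that name exists in the DataSet, and Compute returns DBNull when the table has no rows. A guarded sketch, assuming the table and column names from the question:

```csharp
using System;
using System.Data;

public class SumDemo
{
    public static int SafeSum(DataSet ds, string tableName, string column)
    {
        DataTable table = ds.Tables[tableName];
        if (table == null) return 0; // table name not found in the DataSet

        object result = table.Compute($"SUM({column})", string.Empty);
        return result is DBNull ? 0 : Convert.ToInt32(result); // empty table -> DBNull
    }

    public static void Main()
    {
        var ds = new DataSet();
        var t = new DataTable("Finance");
        t.Columns.Add("Income", typeof(int));
        t.Rows.Add(100);
        t.Rows.Add(250);
        ds.Tables.Add(t);

        Console.WriteLine(SafeSum(ds, "Finance", "Income")); // 350
        Console.WriteLine(SafeSum(ds, "Missing", "Income")); // 0, no crash
    }
}
```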

C# - check if column in CSV exists before assigning it to datatable

I have this:
var productDetailsFromFile = (from row in dt.AsEnumerable()
select new ProductDetails
{
ItemNumber = row.Field<string>("Item Number"),
Cost = row.Field<string>("Cost").ToDecimal(),//custom method .ToDecimal
WHQtyList = new List<int>()
{
row.Field<string>("foo").ToInteger(),//custom method .ToInteger
row.Field<string>("bar").ToInteger(),
row.Field<string>("foo2").ToInteger(),
}
}).ToList();
It reads the info from a .csv file. What I am trying to achieve is an elegant way of checking whether the fields "foo", "bar" or "foo2" exist.
Right now the issue is that if I remove one of the columns from the CSV file, a "column not in datatable" error pops up. I haven't been able to get this to work for 2 hours now.
What I am essentially seeking is a way to check whether a column exists as I use it to initialize the list, or, if the column doesn't exist, to default the value to 0 for each row.
I did it through a method. I was wondering whether there is a way to do it without adding extra lines of code, or at least with fewer lines than I have now.
int ContainsColumn(string columnName, DataTable table, DataRow row)
{
    // Returns the parsed cell value when the column exists, otherwise 0.
    if (table.Columns.Contains(columnName))
    {
        return int.Parse(row.Field<string>(columnName));
    }
    return 0;
}
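A slight variant of the same idea: since DataRow.Table already points back to the owning table, the helper only needs the row, and int.TryParse also covers empty or malformed cells. The FieldOrZero name is my own:

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;

public class CsvColumnDemo
{
    // Returns the parsed cell value, or 0 when the column is missing
    // or the cell does not contain a valid integer.
    public static int FieldOrZero(DataRow row, string columnName)
    {
        if (!row.Table.Columns.Contains(columnName)) return 0;
        return int.TryParse(row.Field<string>(columnName), out int value) ? value : 0;
    }

    public static void Main()
    {
        var dt = new DataTable();
        dt.Columns.Add("foo", typeof(string)); // note: no "bar" column
        dt.Rows.Add("7");

        var lists = dt.AsEnumerable()
                      .Select(row => new List<int>
                      {
                          FieldOrZero(row, "foo"),
                          FieldOrZero(row, "bar"), // missing column -> 0
                      })
                      .ToList();

        Console.WriteLine(string.Join(",", lists[0])); // 7,0
    }
}
```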

Seeking a less costly solution for matching Ids of two tables

The application I am building allows a user to upload a .csv file containing multiple rows and columns of data. Each row contains a unique varchar Id. This will ultimately fill in fields of an existing SQL table where there is a matching Id.
Step 1: I am using LinqToCsv and a foreach loop to import the .csv fully into a temporary table.
Step 2: Then I have another foreach loop where I am trying to loop the rows from the temporary table into an existing table only where the Ids match.
Controller Action to complete this process:
[HttpPost]
public ActionResult UploadValidationTable(HttpPostedFileBase csvFile)
{
var inputFileDescription = new CsvFileDescription
{
SeparatorChar = ',',
FirstLineHasColumnNames = true
};
var cc = new CsvContext();
var filePath = uploadFile(csvFile.InputStream);
var model = cc.Read<Credit>(filePath, inputFileDescription);
try
{
var entity = new TestEntities();
var tc = new TemporaryCsvUpload();
foreach (var item in model)
{
tc.Id = item.Id;
tc.CreditInvoiceAmount = item.CreditInvoiceAmount;
tc.CreditInvoiceDate = item.CreditInvoiceDate;
tc.CreditInvoiceNumber = item.CreditInvoiceNumber;
tc.CreditDeniedDate = item.CreditDeniedDate;
tc.CreditDeniedReasonId = item.CreditDeniedReasonId;
tc.CreditDeniedNotes = item.CreditDeniedNotes;
entity.TemporaryCsvUploads.Add(tc);
}
var idMatches = entity.PreexistingTable.Where(x => x.Id == tc.Id);
foreach (var number in idMatches)
{
number.CreditInvoiceDate = tc.CreditInvoiceDate;
number.CreditInvoiceNumber = tc.CreditInvoiceNumber;
number.CreditInvoiceAmount = tc.CreditInvoiceAmount;
number.CreditDeniedDate = tc.CreditDeniedDate;
number.CreditDeniedReasonId = tc.CreditDeniedReasonId;
number.CreditDeniedNotes = tc.CreditDeniedNotes;
}
entity.SaveChanges();
entity.Database.ExecuteSqlCommand("TRUNCATE TABLE TemporaryCsvUpload");
TempData["Success"] = "Updated Successfully";
}
catch (LINQtoCSVException)
{
TempData["Error"] = "Upload Error: Ensure you have the correct header fields and that the file is of .csv format.";
}
return View("Upload");
}
The issue in the above code is that tc is inside the first loop, but the matches are defined after the loop with var idMatches = entity.PreexistingTable.Where(x => x.Id == tc.Id);, so I am only getting the last item of the first loop.
If I nest the second loop then it is way too slow (I stopped it after 10 minutes) because there are roughly 1,000 rows in the .csv and 7,000 in the preexisting table.
Finding a better way to do this is plaguing me. Pretend that the temporary table didn't even come from a .csv and just think about the most efficient way to fill in rows in table 2 from table 1 where the id of that row matches. Thanks for your help!
As your code is written now, much of the work is being done by the application that could much more efficiently be done by SQL Server. You are making hundreds of unnecessary roundtrip calls to the database. When you are mass importing data you want a solution like this:
Bulk import the data. See this answer for helpful guidance on bulk import efficiency with EF.
Join and update destination table.
Processing the import should only require a single mass update query:
update PT set
CreditInvoiceDate = CSV.CreditInvoiceDate
,CreditInvoiceNumber = CSV.CreditInvoiceNumber
,CreditInvoiceAmount = CSV.CreditInvoiceAmount
,CreditDeniedDate = CSV.CreditDeniedDate
,CreditDeniedReasonId = CSV.CreditDeniedReasonId
,CreditDeniedNotes = CSV.CreditDeniedNotes
from PreexistingTable PT
join TemporaryCsvUploads CSV on PT.Id = CSV.Id
This query would replace your entire nested loop and apply the same update in a single database call. As long as your table is indexed properly this should run very fast.
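If the matching has to stay in application code for some reason, the nested-loop cost can still be cut from O(n×m) to roughly O(n+m) by indexing one side in a dictionary, so each CSV row is matched by a single hash lookup instead of a scan of all 7,000 preexisting rows. A runnable sketch with simplified stand-in types (the Row class is an assumption, not the question's entities):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class Row
{
    public string Id;
    public decimal CreditInvoiceAmount;
}

public class MatchDemo
{
    // Copies the amount from each CSV row onto the preexisting row
    // with the same Id; returns how many rows were updated.
    public static int ApplyUpdates(List<Row> existing, List<Row> csv)
    {
        // Build the index once: O(m) build, O(1) per lookup afterwards.
        var byId = existing.ToDictionary(r => r.Id);
        int updated = 0;
        foreach (var item in csv)
        {
            if (byId.TryGetValue(item.Id, out var target))
            {
                target.CreditInvoiceAmount = item.CreditInvoiceAmount;
                updated++;
            }
        }
        return updated;
    }

    public static void Main()
    {
        var existing = new List<Row> { new Row { Id = "A" }, new Row { Id = "B" } };
        var csv = new List<Row>
        {
            new Row { Id = "A", CreditInvoiceAmount = 12.5m },
            new Row { Id = "C", CreditInvoiceAmount = 1m }, // no match
        };

        Console.WriteLine(ApplyUpdates(existing, csv)); // 1
        Console.WriteLine(existing[0].CreditInvoiceAmount); // 12.5
    }
}
```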
After saving the CSV records into a second table which has the same fields as your primary table, execute the following procedure in SQL Server:
create proc [dbo].[excel_updation]
as
set xact_abort on
begin transaction
-- First update records
update first_table
set [ExamDate] = source.[ExamDate],
[marks] = source.[marks],
[result] = source.[result],
[dob] = source.[dob],
[spdate] = source.[spdate],
[agentName] = source.[agentName],
[companycode] = source.[companycode],
[dp] = source.[dp],
[state] = source.[state],
[district] = source.[district],
[phone] = source.[phone],
[examcentre] = source.[examcentre],
[examtime] = source.[examtime],
[dateGiven] = source.[dateGiven],
[smName] = source.[smName],
[smNo] = source.[smNo],
[bmName] = source.[bmName],
[bmNo] = source.[bmNo]
from first_table
inner join second_table source
on first_table.[UserId] = source.[UserId]
-- And then insert
insert into first_table (exprdate, marks, result, dob, spdate, agentName, companycode, dp, state, district, phone, examcentre, examtime, dateGiven, smName, smNo, bmName, bmNo)
select [ExamDate], [marks], [result], [dob], [spdate], [agentName], [companycode], [dp], [state], [district], [phone], [examcentre], [examtime], [dateGiven], [smName], [smNo], [bmName], [bmNo]
from second_table source
where not exists
(
select *
from first_table
where first_table.[UserId] = source.[UserId]
)
commit transaction
delete from second_table
The only requirement of this code is that both tables have matching id data. Where an id matches in both tables, the data of that particular row will be updated in the first table.
As long as the probability of a match is high, you could simply attempt an update for every row from your CSV, with a condition that the id matches:
UPDATE table SET ... WHERE id = @id

Set column default value of DataTable when filled with MySqlDataAdapter

This is my code right now:
private static MySqlConnection conn = null;
private static MySqlDataAdapter AccountsDa = null;
private static MySqlCommandBuilder AccountsCb = null;
AccountsDa = new MySqlDataAdapter("SELECT * FROM accounts", conn);
AccountsCb = new MySqlCommandBuilder(AccountsDa);
Accounts = new DataTable();
AccountsDa.Fill(Accounts);
I'm trying to figure out how to define the column default values without having to do it by hand.
If I do this:
DataColumn col = new DataColumn();
col.ColumnName = "id";
col.AllowDBNull = false;
col.DataType = System.Type.GetType("System.Int32");
col.DefaultValue = 0;
Accounts.Columns.Add(col);
for every column, it works fine, but how do I have it automatically set the default values from the database when the table is filled? I'm hoping I don't have to define 30 columns by hand.
I tried Accountsda.FillSchema(Accounts, SchemaType.Source);,
which sets up the allow-nulls and auto-increments, but not the default values.
The problem arises when adding a row to the data table later: sometimes I only need to set the value for one column and let the rest of the columns fall back to their default values.
I could write 180 lines of code to manually define the default values for inserting rows, but there has to be a way to grab that from the database when creating/filling the data table.
I'm using in-memory data tables because there are times where data will only exist for, say, 2 minutes and then be deleted again, as this is a dedicated server for an online RTS game. To save hits on the database I'm using data tables, manipulating them, and flushing them every 10 minutes, so that I only have 1,000 hits to the database every 10 minutes instead of possibly 40,000.
Well, according to the MSDN gurus (after finally getting a response on their forums), it's not possible to get the default values. All you can load is whether the value is allowed to be null and whether it's auto-increment, but then you still have to set the seed and step on auto-increment; it doesn't get that from the database either. They did give a shorthand version that cuts it down to 30 lines of code instead of 180.
After calling FillSchema and then filling the data table, you can simply do this, which cuts it down to one line instead of six:
Cities.Columns["wood"].DefaultValue = 0;
After a few replies there is an even easier way to do this. It's not the way I wanted, but maybe it will help someone else down the same road: instead of one line per column, this does them all in three lines:
foreach (DataColumn col in Cities.Columns)
{
    if (col.ColumnName != "id") col.DefaultValue = 0;
}
(id is the primary key and can't have a default value.)
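The effect of that loop on later sparse inserts can be seen in a small self-contained sketch: only two columns are set explicitly, and the other falls back to its DefaultValue (the column names here are illustrative):

```csharp
using System;
using System.Data;

public class DefaultValueDemo
{
    public static void Main()
    {
        var cities = new DataTable();
        cities.Columns.Add("id", typeof(int));
        cities.Columns.Add("wood", typeof(int));
        cities.Columns.Add("stone", typeof(int));

        // Same shorthand as above: default every non-key column to 0.
        foreach (DataColumn col in cities.Columns)
        {
            if (col.ColumnName != "id") col.DefaultValue = 0;
        }

        // Only id and wood are set; stone falls back to its default
        // because NewRow() initializes cells from DefaultValue.
        DataRow row = cities.NewRow();
        row["id"] = 1;
        row["wood"] = 50;
        cities.Rows.Add(row);

        Console.WriteLine(cities.Rows[0]["stone"]); // 0
    }
}
```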
So I was trying to do something similar to you (except I have no idea how to get the auto-increment information). I got the idea from https://stackoverflow.com/a/12731310/222897:
private void AssignMandatoryColumns([NotNull] DataTable structure, string tableName)
{
// find schema
string[] restrictions = new string[4]; // Catalog, Owner, Table, Column
restrictions[2] = tableName;
DataTable schemaTable = _dbCon.GetSchema("Columns", restrictions);
if (schemaTable == null) return;
// set values for columns
foreach (DataRow row in schemaTable.Rows)
{
string columnName = row["COLUMN_NAME"].ToString();
if (!structure.Columns.Contains(columnName)) continue;
if (row["IS_NULLABLE"].ToString() == "NO") structure.Columns[columnName].AllowDBNull = false;
//if (structure.Columns[columnName].AutoIncrement) continue; // there can be no default value
var valueType = row["DATA_TYPE"];
var defaultValue = row["COLUMN_DEFAULT"];
try
{
structure.Columns[columnName].DefaultValue = defaultValue;
if (!structure.Columns[columnName].AllowDBNull && structure.Columns[columnName].DefaultValue is DBNull)
{
Logger.DebugLog("Database column {0} is not allowed to be null, yet there is no default value.", columnName);
}
}
catch (Exception exception)
{
if (structure.Columns[columnName].AllowDBNull) continue; // defaultvalue is irrelevant since value is allowed to be null
Logger.LogWithoutTrace(exception, string.Format("Setting DefaultValue for {0} of type {1} {4} to {2} ({3}).", columnName, valueType, defaultValue, defaultValue.GetType(), structure.Columns[columnName].AllowDBNull ? "NULL" : "NOT NULL"));
}
}
}
The function takes the DataTable you want to set the values for (I get mine by querying the DB) and the name of the table.
For some reason the timestamp and date columns don't like their default value no matter what I do.

Autonumber and datatable with dbnull exception

I was doing some work on a DataTable I filled with an OleDbDataAdapter, made from an Access database, and I stumbled upon this error.
Turns out that my table has this structure:
ID -> AutoNumber (PK)
Lazos -> Text
Asociaciones -> Text
and when I fill my DataTable, all values pass into it without any problems and with all the correct values. I insert a new row as shown in the "insert row" part below.
I do this assuming that my PK will insert the autonumber on row creation, but apparently it is not doing that, because when I loop through the rows I get an "invalid cast exception": "Object cannot be cast from DBNull to other types."
I COULD insert an id value into the column myself, but when I update my DataTable to the database, won't that create an error, since I have no way of knowing which was the last row created? Or do I?
For example, let's say in my DataTable the last ID is 50, but on the database I previously made a record with id 51 and then erased it. If I inserted 51 based on my DataTable's info, it would give an error, right?
//// INSERT ROW
DataRow newRow = Tabla_Cods_Proy.NewRow();
newRow["Lazos"] = textBox1.Text;
newRow["Asociaciones"] = textBox2.Text;
Tabla_Cods_Proy.Rows.Add(newRow);
MessageBox.Show("Enhorabuena!");

// CHECK IDs
for (int i = 0; i < Tabla_Cods_Proy.Rows.Count; i++)
{
    if (Tabla_Cods_Proy.Rows[i].RowState != DataRowState.Deleted)
    {
        if (Tabla_Cods_Proy.Rows[i]["Lazos_asociados"].ToString() == "")
        {
            listBox7.Items.Add(Tabla_Cods_Proy.Rows[i]["Cod_Cliente"]);
            listBox8.Items.Add(Tabla_Cods_Proy.Rows[i]["Cod_Inelectra"]);
            ID_Cods_Proy_Sin_Asociar.Add(Convert.ToInt32(Tabla_Cods_Proy.Rows[i]["ID"]));
        }
        else
        {
            listBox3.Items.Add(Tabla_Cods_Proy.Rows[i]["Cod_Cliente"]);
            listBox4.Items.Add(Tabla_Cods_Proy.Rows[i]["Cod_Inelectra"]);
            ID_Cods_Proy_Asociados.Add(Convert.ToInt32(Tabla_Cods_Proy.Rows[i]["ID"]));
        }
    }
}
I once had a similar problem. What you need to do is retrieve the new identity (@@IDENTITY) for this column once you insert the row into the table. You can do that by using the RowUpdated event.
Here is a quick example from the MSDN page (similar to your case; see the bottom of the page):
public static void Main()
{
    // ...connecting to the Access db and getting data into the datatable...

    // Adding a new row to the datatable.
    DataRow newRow = catDS.Tables["Categories"].NewRow();
    newRow["CategoryName"] = "New Category";
    catDS.Tables["Categories"].Rows.Add(newRow);

    // Include an event to fill in the Autonumber value.
    catDA.RowUpdated += new OleDbRowUpdatedEventHandler(OnRowUpdated);
}

protected static void OnRowUpdated(object sender, OleDbRowUpdatedEventArgs args)
{
    // Include a variable and a command to retrieve the identity value from the Access database.
    int newID = 0;
    OleDbCommand idCMD = new OleDbCommand("SELECT @@IDENTITY", nwindConn);
    if (args.StatementType == StatementType.Insert)
    {
        // Retrieve the identity value and store it in the CategoryID column.
        newID = (int)idCMD.ExecuteScalar();
        args.Row["CategoryID"] = newID;
    }
}
