What I'm trying to do is a report that brings back values from a table, but only for a specific selected client and some other specific properties. It is an Activation Report; here is the part of the code that is already working:
var sql = "SELECT * FROM ativacao WHERE id_cliente = " + id_cliente;
var itemAtivacao = db.ativacao.SqlQuery(sql).ToList();
foreach (var item in itemAtivacao)
{
    dt.Rows.Add(item.id, item.codigo, item.cliente.nome);
}
Up to here it's fine, easy. But I also need to bring in elements of the cliente table, from another table, comparing against a column, and that part is giving me trouble.
Apart from the specific client, I need to filter by:
var id_executivo = Convert.ToInt32(Request.Form["id_executivo"]);
var id_prospector = Convert.ToInt32(Request.Form["id_prospector"]);
var id_cluster = Convert.ToInt32(Request.Form["id_cluster"]);
var id_status = Convert.ToInt32(Request.Form["id_status"]);
var id_cliente = Convert.ToInt32(Request.Form["id_cliente"]);
These are columns of the cliente table. I'm thinking about an INNER JOIN; is that it?
Summing up: I need to bring back the ativacoes of a specific cliente, but only if that cliente has the values selected above.
Can someone help me with how I can do it, please? Thanks so much.
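For illustration, one way this could look is a parameterized JOIN query. This is only a sketch: the column names on cliente (id, id_executivo, id_prospector, id_cluster, id_status) are assumed from the form fields above and would need to match the real schema.
// Sketch: filter ativacao by columns of the joined cliente table.
// Requires using System.Data.SqlClient; parameters replace string concatenation.
var sql = @"SELECT a.*
            FROM ativacao a
            INNER JOIN cliente c ON c.id = a.id_cliente
            WHERE a.id_cliente    = @id_cliente
              AND c.id_executivo  = @id_executivo
              AND c.id_prospector = @id_prospector
              AND c.id_cluster    = @id_cluster
              AND c.id_status     = @id_status";

var itemAtivacao = db.ativacao.SqlQuery(sql,
    new SqlParameter("@id_cliente", id_cliente),
    new SqlParameter("@id_executivo", id_executivo),
    new SqlParameter("@id_prospector", id_prospector),
    new SqlParameter("@id_cluster", id_cluster),
    new SqlParameter("@id_status", id_status)).ToList();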
The application I am building allows a user to upload a .csv file containing multiple rows and columns of data. Each row contains a unique varchar Id. This will ultimately fill in fields of an existing SQL table where there is a matching Id.
Step 1: I am using LinqToCsv and a foreach loop to import the .csv fully into a temporary table.
Step 2: Then I have another foreach loop where I try to copy the rows from the temporary table into an existing table, but only where the Ids match.
Controller Action to complete this process:
[HttpPost]
public ActionResult UploadValidationTable(HttpPostedFileBase csvFile)
{
    var inputFileDescription = new CsvFileDescription
    {
        SeparatorChar = ',',
        FirstLineHasColumnNames = true
    };
    var cc = new CsvContext();
    var filePath = uploadFile(csvFile.InputStream);
    var model = cc.Read<Credit>(filePath, inputFileDescription);
    try
    {
        var entity = new TestEntities();
        var tc = new TemporaryCsvUpload();
        foreach (var item in model)
        {
            tc.Id = item.Id;
            tc.CreditInvoiceAmount = item.CreditInvoiceAmount;
            tc.CreditInvoiceDate = item.CreditInvoiceDate;
            tc.CreditInvoiceNumber = item.CreditInvoiceNumber;
            tc.CreditDeniedDate = item.CreditDeniedDate;
            tc.CreditDeniedReasonId = item.CreditDeniedReasonId;
            tc.CreditDeniedNotes = item.CreditDeniedNotes;
            entity.TemporaryCsvUploads.Add(tc);
        }
        var idMatches = entity.PreexistingTable.Where(x => x.Id == tc.Id);
        foreach (var number in idMatches)
        {
            number.CreditInvoiceDate = tc.CreditInvoiceDate;
            number.CreditInvoiceNumber = tc.CreditInvoiceNumber;
            number.CreditInvoiceAmount = tc.CreditInvoiceAmount;
            number.CreditDeniedDate = tc.CreditDeniedDate;
            number.CreditDeniedReasonId = tc.CreditDeniedReasonId;
            number.CreditDeniedNotes = tc.CreditDeniedNotes;
        }
        entity.SaveChanges();
        entity.Database.ExecuteSqlCommand("TRUNCATE TABLE TemporaryCsvUpload");
        TempData["Success"] = "Updated Successfully";
    }
    catch (LINQtoCSVException)
    {
        TempData["Error"] = "Upload Error: Ensure you have the correct header fields and that the file is of .csv format.";
    }
    return View("Upload");
}
The issue in the above code is that tc is assigned inside the first loop, but the matches are looked up after the loop with var idMatches = entity.PreexistingTable.Where(x => x.Id == tc.Id);, so I am only getting the last item from the first loop.
If I nest the second loop inside the first it is way too slow (I stopped it after 10 minutes), because there are roughly 1000 rows in the .csv and 7000 in the preexisting table.
Finding a better way to do this is plaguing me. Pretend that the temporary table didn't even come from a .csv and just think about the most efficient way to fill in rows in table 2 from table 1 where the id of that row matches. Thanks for your help!
As your code is written now, much of the work is being done by the application that could much more efficiently be done by SQL Server. You are making hundreds of unnecessary roundtrip calls to the database. When you are mass importing data you want a solution like this:
Bulk import the data. See this answer for helpful guidance on bulk import efficiency with EF.
Join and update destination table.
Processing the import should only require a single mass update query:
update PT set
CreditInvoiceDate = CSV.CreditInvoiceDate
,CreditInvoiceNumber = CSV.CreditInvoiceNumber
,CreditInvoiceAmount = CSV.CreditInvoiceAmount
,CreditDeniedDate = CSV.CreditDeniedDate
,CreditDeniedReasonId = CSV.CreditDeniedReasonId
,CreditDeniedNotes = CSV.CreditDeniedNotes
from PreexistingTable PT
join TemporaryCsvUploads CSV on PT.Id = CSV.Id
This query would replace your entire nested loop and apply the same update in a single database call. As long as your table is indexed properly this should run very fast.
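Put together, the controller body could look roughly like this. It is only a sketch under the names from the question (TestEntities, TemporaryCsvUpload, PreexistingTable); ToDataTable is a hypothetical helper that converts the parsed CSV rows into a DataTable for SqlBulkCopy.
// Rough sketch: bulk-load the CSV into the staging table, then run the
// set-based update once instead of looping row by row.
using (var entity = new TestEntities())
{
    var connection = (SqlConnection)entity.Database.Connection;
    connection.Open();

    using (var bulk = new SqlBulkCopy(connection))
    {
        bulk.DestinationTableName = "TemporaryCsvUpload";
        bulk.WriteToServer(ToDataTable(model)); // hypothetical helper: CSV rows -> DataTable
    }

    entity.Database.ExecuteSqlCommand(@"
        update PT set
             CreditInvoiceDate    = CSV.CreditInvoiceDate
            ,CreditInvoiceNumber  = CSV.CreditInvoiceNumber
            ,CreditInvoiceAmount  = CSV.CreditInvoiceAmount
            ,CreditDeniedDate     = CSV.CreditDeniedDate
            ,CreditDeniedReasonId = CSV.CreditDeniedReasonId
            ,CreditDeniedNotes    = CSV.CreditDeniedNotes
        from PreexistingTable PT
        join TemporaryCsvUpload CSV on PT.Id = CSV.Id");

    entity.Database.ExecuteSqlCommand("TRUNCATE TABLE TemporaryCsvUpload");
}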
After saving the CSV records into a second table that has the same fields as your primary table, execute the following procedure in SQL Server:
create proc [dbo].[excel_updation]
as
set xact_abort on
begin transaction

-- First update existing records
update first_table
set [ExamDate] = source.[ExamDate],
    [marks] = source.[marks],
    [result] = source.[result],
    [dob] = source.[dob],
    [spdate] = source.[spdate],
    [agentName] = source.[agentName],
    [companycode] = source.[companycode],
    [dp] = source.[dp],
    [state] = source.[state],
    [district] = source.[district],
    [phone] = source.[phone],
    [examcentre] = source.[examcentre],
    [examtime] = source.[examtime],
    [dateGiven] = source.[dateGiven],
    [smName] = source.[smName],
    [smNo] = source.[smNo],
    [bmName] = source.[bmName],
    [bmNo] = source.[bmNo]
from first_table
inner join second_table source
    on first_table.[UserId] = source.[UserId]

-- And then insert the rows that do not exist yet
insert into first_table ([UserId], [ExamDate], [marks], [result], [dob], [spdate], [agentName], [companycode], [dp], [state], [district], [phone], [examcentre], [examtime], [dateGiven], [smName], [smNo], [bmName], [bmNo])
select [UserId], [ExamDate], [marks], [result], [dob], [spdate], [agentName], [companycode], [dp], [state], [district], [phone], [examcentre], [examtime], [dateGiven], [smName], [smNo], [bmName], [bmNo]
from second_table source
where not exists
(
    select *
    from first_table
    where first_table.[UserId] = source.[UserId]
)

commit transaction

delete from second_table
The only requirement of this code is that both tables have a matching id column. For every id that matches in both tables, the data of that row is updated in the first table; rows whose id has no match are inserted instead.
As long as the probability of a match is high, you can simply attempt an update with every row from your CSV, with a condition that the id matches:
UPDATE table SET ... WHERE id = @id
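In C# with plain ADO.NET that could look roughly like the sketch below; conn is assumed to be an open SqlConnection, and only two of the columns are shown.
// Sketch of the per-row approach: one parameterized UPDATE per CSV row.
// Rows whose Id has no match in the table simply update nothing.
const string updateSql = @"UPDATE PreexistingTable
                           SET CreditInvoiceAmount = @amount,
                               CreditInvoiceDate   = @date
                           WHERE Id = @id";

foreach (var item in model)
{
    using (var cmd = new SqlCommand(updateSql, conn))
    {
        cmd.Parameters.AddWithValue("@id", item.Id);
        cmd.Parameters.AddWithValue("@amount", (object)item.CreditInvoiceAmount ?? DBNull.Value);
        cmd.Parameters.AddWithValue("@date", (object)item.CreditInvoiceDate ?? DBNull.Value);
        cmd.ExecuteNonQuery();
    }
}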
I know this is obviously a repeated question, but I am unable to figure out the issue, as I am new to LINQ.
Basically, I have to check for duplicate entries while adding multiple records at a time. I have a table in my database that has a few rows, and I create a DataTable dynamically that is a clone (in terms of structure) of that table. Here dtDup is the database table, returned as a DataSet/DataTable from a select query, and dupVals is the dynamic clone that is to be cross-checked for duplicates:
var CommnRows =
    from dbA in dtDup.AsEnumerable()
    join appB in dupVals.AsEnumerable() on
        new {
            MonthID = dbA.Field<int>("MonthID"),
            UserID = dbA.Field<int?>("UserID"),      // nullable int
            IsActive = dbA.Field<bool?>("IsActive"), // nullable bit
            Gender = dbA.Field<String>("Gender").ToString().ToUpper()
        }
        equals
        new {
            MonthID = appB.Field<int>("MonthID"),
            UserID = appB.Field<int?>("UserID"),
            IsActive = appB.Field<bool?>("IsActive"),
            Gender = appB.Field<String>("Gender").ToString().ToUpper()
        }
    select dbA;
So, if some rows are returned (I assume the join above is a correct inner join), that means there are duplicate rows.
But I am getting an error:
Object reference not set to an instance of an object
pointing at the new { ... } after equals.
I found the issue. I was calling ToString().ToUpper() on one of my columns that was a string, which at first I had not included in the question (now updated). When I debugged it line by line, I found that it was breaking near Gender. So I just removed the ToString().ToUpper() from that part and it worked.
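An alternative to removing the call entirely is to make the key null-safe, so a null Gender can never be dereferenced. Here is a minimal, self-contained illustration of that pattern (not the poster's exact code):
using System;
using System.Data;
using System.Linq;

class NullSafeKeyDemo
{
    static void Main()
    {
        var dt = new DataTable();
        dt.Columns.Add("Gender", typeof(string));
        dt.Rows.Add((object)null); // this row would have thrown with .ToString().ToUpper()
        dt.Rows.Add("male");

        // Coalesce to an empty string before normalizing the case.
        var keys = dt.AsEnumerable()
                     .Select(r => (r.Field<string>("Gender") ?? string.Empty).ToUpper())
                     .ToList();

        keys.ForEach(Console.WriteLine); // prints an empty line, then MALE
    }
}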
I would like to ask a question about the following function related to reports in C#.
crystalReportViewer1.SelectionFormula = "{Cars.CarCode} = " + CarCode;
Does this formula add a filter condition to the current query code that is used in the report, or does it create a whole new query?
If not, is there another method of further filtering a report?
This will add a filter condition to the query. If CarCode is a string, you need to change it to:
crystalReportViewer1.SelectionFormula = string.Format("{{Cars.CarCode}} = '{0}'", CarCode);
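Note the doubled braces: string.Format treats { and } as placeholder delimiters, so the Crystal field name has to be escaped or the call throws a FormatException. A quick check of what the formula ends up as:
string carCode = "ABC123"; // example value
string formula = string.Format("{{Cars.CarCode}} = '{0}'", carCode);
Console.WriteLine(formula); // prints: {Cars.CarCode} = 'ABC123'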
I'm new to writing LINQ queries, and I'm having trouble with string comparisons. I'm loading the data into a DataTable (I confirmed that the table in the SQL database and the DataTable have the same number of rows), but for some reason I can't find a value that I know exists in both.
The text box contains 'servername' while the data rows contain 'servername.mydomain.net'. Here's what my code looks like:
string strParameter = txtAutoComplete.ToString().ToLower();
//WUG TableAdapter and DataTable
dsCIInfoTableAdapters.DeviceTableAdapter taWUG;
taWUG = new dsCIInfoTableAdapters.DeviceTableAdapter();
dsCIInfo.DeviceDataTable dtWUG = new dsCIInfo.DeviceDataTable();
taWUG.Fill(dtWUG);
var qstWUG = (from row in dtWUG.AsEnumerable()
where row.Field<string>("sDisplayName").ToLower().Contains(strParameter)
select row.Field<string>("sDisplayName"));
I believe that in your LINQ statement dtWUG needs to be dtWUG.AsEnumerable(). LINQ only works on data sources that implement the IEnumerable interface.
You can debug it more easily if you add some let statements where you can set breakpoints:
var qstWUG = (from row in dtWUG
let display = row.Field<string>("sDisplayName")
let lower = display.ToLower()
let contains = lower.Contains(strParameter)
where contains
select display).ToArray();
Also, converting it to an array with .ToArray() at the end will make it execute immediately (LINQ is lazy by design and doesn't execute until the results are needed), and the result is easier to inspect at subsequent breakpoints.
Yeah, I feel stupid... I forgot to use the text box's .Text property when assigning it to the string:
string strParameter = txtAutoComplete.Text.ToLower();
//WUG TableAdapter and DataTable
dsCIInfoTableAdapters.DeviceTableAdapter taWUG;
taWUG = new dsCIInfoTableAdapters.DeviceTableAdapter();
dsCIInfo.DeviceDataTable dtWUG = new dsCIInfo.DeviceDataTable();
taWUG.Fill(dtWUG);
var qstWUG = (from row in dtWUG.AsEnumerable()
let display = row.Field<string>("sDisplayName")
where display.ToLower().Contains(strParameter)
select display).ToArray();
I previously asked a question and got an answer to Best approach to write query, but the problem is that if you have to save the result in a list, there is duplication of records. For example, the resultant table of the join is shown in the EXAMPLE.
You can see there are duplicate rows. How can you filter them out and yet keep the order-number data?
Of course there may be several ways, but I am looking for a good one.
How can we store the data in a list without creating duplicate rows in it?
My current code for my tables is:
int lastUserId = 0;
sql_cmd = new SqlCommand();
sql_cmd.Connection = sql_con;
sql_cmd.CommandText = "SELECT * FROM AccountsUsers LEFT JOIN Accounts ON AccountsUsers.Id = Accounts.userId ORDER BY AccountsUsers.accFirstName";
SqlDataReader reader = sql_cmd.ExecuteReader();
if (reader.HasRows == true)
{
    Users userToAdd = new Users();
    while (reader.Read())
    {
        userToAdd = new Users();
        userToAdd.userId = int.Parse(reader["Id"].ToString());
        userToAdd.firstName = reader["accFirstName"].ToString();
        userToAdd.lastName = reader["accLastName"].ToString();
        lastUserId = userToAdd.userId;
        Websites domainData = new Websites();
        domainData.domainName = reader["accDomainName"].ToString();
        domainData.userName = reader["accUserName"].ToString();
        domainData.password = reader["accPass"].ToString();
        domainData.URL = reader["accDomain"].ToString();
        userToAdd.DomainData.Add(domainData);
        allUsers.Add(userToAdd);
    }
}
For the second table I have a custom list that will hold all the entries of that table.
The table returned by the join has multiple rows for the same user.
Besides using the Dictionary idea as answered by Antonio Bakula...
If you persist the dictionary of users and call the code in your sample multiple times, you should consider that a user account can be new, modified, or deleted.
The algorithm to use when executing your SQL query is the following (a sketch follows the list):
If a row in the query result is not in the dictionary, create a new user and add it to the dictionary.
If a row in the query result is in the dictionary, update the user information.
If a dictionary item is not in the query result, delete the user from the dictionary.
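A rough sketch of that logic, assuming the SqlDataReader from the question and a persisted Dictionary<int, Users> named allUsers (the names are illustrative, and System.Linq is needed for Except):
// Sketch of the new/modified/deleted handling against a persisted dictionary.
var idsInQuery = new HashSet<int>();

while (reader.Read())
{
    int userId = int.Parse(reader["Id"].ToString());
    idsInQuery.Add(userId);

    Users user;
    if (!allUsers.TryGetValue(userId, out user))
    {
        // new: create and add to the dictionary
        user = new Users { userId = userId };
        allUsers.Add(userId, user);
    }
    // new or modified: refresh the fields from the current row
    user.firstName = reader["accFirstName"].ToString();
    user.lastName = reader["accLastName"].ToString();
}

// deleted: ids that were in the dictionary but not in this query result
foreach (var staleId in allUsers.Keys.Except(idsInQuery).ToList())
{
    allUsers.Remove(staleId);
}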
I'd also recommend not using SELECT *
Use only the table columns your code needs; this improves the performance of your code and prevents a potential security breach by returning private user information.
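For example, the query from the question could list only the columns the loop actually reads (a sketch; the names are taken from the question's code):
sql_cmd.CommandText =
    @"SELECT AccountsUsers.Id,
             AccountsUsers.accFirstName,
             AccountsUsers.accLastName,
             Accounts.accDomainName,
             Accounts.accUserName,
             Accounts.accPass,
             Accounts.accDomain
      FROM AccountsUsers
      LEFT JOIN Accounts ON AccountsUsers.Id = Accounts.userId
      ORDER BY AccountsUsers.accFirstName";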
I am not sure why you are not using a DISTINCT clause in your SQL to fetch unique results; that will also be faster. Did you look at using hash tables?
I would put the users into a Dictionary and check if each one already exists, something like this:
Dictionary<int, Users> allUsers = new Dictionary<int, Users>();
and then in the reader while loop:
int userId = int.Parse(reader["Id"].ToString());
Users currUser;
if (!allUsers.TryGetValue(userId, out currUser))
{
    currUser = new Users();
    currUser.userId = userId;
    currUser.firstName = reader["accFirstName"].ToString();
    currUser.lastName = reader["accLastName"].ToString();
    allUsers.Add(userId, currUser);
}
Websites domainData = new Websites();
domainData.domainName = reader["accDomainName"].ToString();
domainData.userName = reader["accUserName"].ToString();
domainData.password = reader["accPass"].ToString();
domainData.URL = reader["accDomain"].ToString();
currUser.DomainData.Add(domainData);
Seems like the root of your problem is in your database table.
When you say duplicate data rows, are you saying you get duplicate entries in the list, or do you have duplicate data in your table? Please give two rows that are duplicates.
Two options:
First, prevent pulling duplicate data from SQL by using a DISTINCT clause, like:
select distinct <columns> from <table> where <condition>
The second option, as Antonio mentioned, is to check whether the list already contains the item.
The first option is recommended unless there are other reasons not to use it.