Convert Function Based on TSQL to Function Based on Entity Framework - C#

I have two functions that each return the same list of objects. But, the one that uses TSQL is much faster than the one using Entity Framework and I do not understand why one would be faster than the other. Is it possible to modify my EF function to work as fast as the TSQL one?
Any help will be appreciated. My code is below:
TSQL:
public static List<ChartHist> ListHistory_PureSQL()
{
List<DataRow> listDataRow = null;
string srtQry = @"Select LoginHistoryID,
LoginDuration as LoginDuration_Pass,
0 as LoginDuration_Fail,
LoginDateTime,
LoginLocationID,
LoginUserEmailID,
LoginApplicationID,
LoginEnvironmentID,
ScriptFrequency,
LoginStatus,
Reason
From LoginHistory
Where LoginStatus = 'Pass'
UNION
Select LoginHistoryID,
0 as LoginDuration_Pass,
LoginDuration as LoginDuration_Fail,
LoginDateTime,
LoginLocationID,
LoginUserEmailID,
LoginApplicationID,
LoginEnvironmentID,
ScriptFrequency,
LoginStatus,
Reason
From LoginHistory
Where LoginStatus = 'Fail'";
using (SqlConnection conn = new SqlConnection(Settings.ConnectionString))
{
using (SqlCommand objCommand = new SqlCommand(srtQry, conn))
{
objCommand.CommandType = CommandType.Text;
DataTable dt = new DataTable();
SqlDataAdapter adp = new SqlDataAdapter(objCommand);
conn.Open();
adp.Fill(dt);
if (dt != null)
{
listDataRow = dt.AsEnumerable().ToList();
}
}
}
var listChartHist = (from p in listDataRow
select new ChartHist
{
LoginHistoryID = p.Field<Int32>("LoginHistoryID"),
LoginDuration_Pass = p.Field<Int32>("LoginDuration_Pass"),
LoginDuration_Fail = p.Field<Int32>("LoginDuration_Fail"),
LoginDateTime = p.Field<DateTime>("LoginDateTime"),
LoginLocationID = p.Field<Int32>("LoginLocationID"),
LoginUserEmailID = p.Field<Int32>("LoginUserEmailID"),
LoginApplicationID = p.Field<Int32>("LoginApplicationID"),
LoginEnvironmentID = p.Field<Int32>("LoginEnvironmentID"),
ScriptFrequency = p.Field<Int32>("ScriptFrequency"),
LoginStatus = p.Field<String>("LoginStatus"),
Reason = p.Field<String>("Reason")
}).ToList();
return listChartHist;
}
EF:
public static List<ChartHist> ListHistory()
{
using (var db = new LatencyDBContext())
{
var loginHist = (from hist in db.LoginHistories
select new { LoginHistory = hist }).ToList();
//PUT LOGIN HISTORY RECORDS INTO A LOCAL LIST
var listHistory = new List<ChartHist>();
foreach (var item in loginHist)
{
var localHistData = new ChartHist();
localHistData.LoginHistoryID = item.LoginHistory.LoginHistoryID;
//split up the duration for pass and fail values
if (item.LoginHistory.LoginStatus.ToUpper() == "PASS")
{
localHistData.LoginDuration_Pass = Convert.ToDouble(item.LoginHistory.LoginDuration);
localHistData.LoginDuration_Fail = 0;
}
else if (item.LoginHistory.LoginStatus.ToUpper() == "FAIL")
{
localHistData.LoginDuration_Pass = 0;
localHistData.LoginDuration_Fail = Convert.ToDouble(item.LoginHistory.LoginDuration);
}
localHistData.LoginDateTime = item.LoginHistory.LoginDateTime;
localHistData.LoginLocationID = item.LoginHistory.LoginLocationID;
localHistData.LoginUserEmailID = item.LoginHistory.LoginUserEmailID;
localHistData.LoginApplicationID = item.LoginHistory.LoginApplicationID;
localHistData.LoginEnvironmentID = item.LoginHistory.LoginEnvironmentID;
localHistData.LoginStatus = item.LoginHistory.LoginStatus;
localHistData.Reason = item.LoginHistory.Reason;
localHistData.ScriptFrequency = item.LoginHistory.ScriptFrequency;
listHistory.Add(localHistData);
}
return listHistory;
}
}

Of course EF will take longer to execute than a plain old SQL query, and there's very little that you can do about it (except write the most optimal LINQ queries that you can).
There's a very simple reason why this is so. Running a direct SQL command just sends back the data, with no fuss attached, and leaves you to do whatever manipulation is needed to fit it into the data structure you want. Running EF, on the other hand, means that it not only runs the SQL command, it also materializes the data into objects you can manipulate right away. That extra work of going through ADO.NET and converting the rows into objects automatically means it will take longer than the plain SQL query.
On the flip side of that coin, however, EF does provide a very nice and simple way to debug and solve whatever problems you might have with a specific query/function (such as from any exceptions thrown).
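If you want to keep the EF context but skip most of that overhead for this one read, a middle ground is to hand EF the raw SQL and let it map the rows straight onto the DTO. This is only a sketch, assuming EF6's Database.SqlQuery<T>, that ChartHist's property types line up with the column types, and that LoginHistoryID is unique (so the CASE expressions are equivalent to the original UNION):
public static List<ChartHist> ListHistory_RawSqlViaEF()
{
    using (var db = new LatencyDBContext())
    {
        // SqlQuery maps result columns to ChartHist properties by name and does no change tracking.
        const string sql = @"SELECT LoginHistoryID,
                                    CASE WHEN LoginStatus = 'Pass' THEN LoginDuration ELSE 0 END AS LoginDuration_Pass,
                                    CASE WHEN LoginStatus = 'Fail' THEN LoginDuration ELSE 0 END AS LoginDuration_Fail,
                                    LoginDateTime, LoginLocationID, LoginUserEmailID, LoginApplicationID,
                                    LoginEnvironmentID, ScriptFrequency, LoginStatus, Reason
                             FROM LoginHistory
                             WHERE LoginStatus IN ('Pass', 'Fail')";
        return db.Database.SqlQuery<ChartHist>(sql).ToList();
    }
}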

I can't performance test this, but try this solution instead before you remove EF entirely:
var loginHist = db.LoginHistories.Where(item => item.LoginStatus.ToUpper() == "PASS" || item.LoginStatus.ToUpper() == "FAIL")
.Select(item => new ChartHist()
{
LoginHistoryID = item.LoginHistoryID,
LoginDuration_Pass = item.LoginStatus.ToUpper() == "PASS" ? Convert.ToDouble(item.LoginDuration) : 0,
LoginDuration_Fail = item.LoginStatus.ToUpper() == "FAIL" ? Convert.ToDouble(item.LoginDuration) : 0,
LoginDateTime = item.LoginDateTime,
LoginLocationID = item.LoginLocationID,
LoginUserEmailID = item.LoginUserEmailID,
LoginApplicationID = item.LoginApplicationID,
LoginEnvironmentID = item.LoginEnvironmentID,
LoginStatus = item.LoginStatus,
Reason = item.Reason,
ScriptFrequency = item.ScriptFrequency,
});
return loginHist.ToList();
This is the "correct" way to populate a new object from a select. It will only retrieve the data you care about, and will put it directly into the object, rather than converting it into an object and then converting it again, from one object to another.
Note: I prefer the functional calls to the from / select form, but it'd be correct either way.

Related

C# MySQL populate List<T>

I'm trying to populate a list from MySQL with the code below. My model looks like this: https://pastebin.com/iQEVV42C
My goal is to populate the lists, then take the sub-lists and put them into the root (tickets). How can I do this?
public static async Task<IEnumerable<Models.Zelkon.Tickets.Root>> Tickets(int ticketId = 0, int clientId = 0)
{
DataSet ds = new DataSet();
List<Models.Zelkon.Tickets.Root> tickets = new List<Models.Zelkon.Tickets.Root>();
List<Models.Zelkon.Tickets.Incidents> ticketIncidents = new List<Models.Zelkon.Tickets.Incidents>();
List<Models.Zelkon.Tickets.Files> ticketFiles = new List<Models.Zelkon.Tickets.Files>();
List<Models.Zelkon.Tickets.Comments> ticketComments = new List<Models.Zelkon.Tickets.Comments>();
using (MySqlConnection con = new MySqlConnection(Variables.conString))
{
await con.OpenAsync();
string query = "SELECT * FROM tickets;" +
"SELECT * FROM ticket_incidents;" +
"SELECT * FROM ticket_files;" +
"SELECT * FROM ticket_comments;";
using (MySqlCommand cmd = new MySqlCommand(query, con))
{
using (MySqlDataReader reader = (MySqlDataReader)await cmd.ExecuteReaderAsync())
{
// tickets
while (await reader.ReadAsync())
{
tickets.Add(new Models.Zelkon.Tickets.Root()
{
header_text = reader.GetString("name"),
description = reader.GetString("description"),
status = reader.GetString("status"),
source = reader.GetString("source"),
owner = reader.GetString("owner"),
temp = reader.GetString("temp"),
priority = reader.GetString("priority"),
category = reader.GetString("category"),
creation_team = reader.GetString("creation_team"),
last_updated = reader.GetDateTime("last_updated"),
creation_date = reader.GetDateTime("creation_date"),
client_id = reader.GetInt32("client_id"),
ticket_id = reader.GetInt32("ticket_id"),
unique_id = reader.GetInt32("unique_id")
});
}
await reader.NextResultAsync();
// ticket_incidents
while (await reader.ReadAsync())
{
ticketIncidents.Add(new Models.Zelkon.Tickets.Incidents()
{
header_text = reader.GetString("header_text"),
description = reader.GetString("description"),
type = reader.GetString("type"),
creation_team = reader.GetString("creation_team"),
created_by = reader.GetString("created_by"),
notify_team = reader.GetInt32("notify_team"),
notification_seen = reader.GetInt32("notification_seen"),
is_pinned = reader.GetInt32("is_pinned"),
creation_date = reader.GetDateTime("creation_date"),
incident_id = reader.GetInt32("incident_id"),
ticket_id = reader.GetInt32("ticket_id"),
unique_id = reader.GetInt32("unique_id")
});
}
await reader.NextResultAsync();
// ticket_files
while (await reader.ReadAsync())
{
// Not filled yet
}
await reader.NextResultAsync();
// ticket_comments
while (await reader.ReadAsync())
{
// Not filled yet
}
await reader.NextResultAsync();
}
}
}
System.Diagnostics.Debug.WriteLine(tickets.First().owner);
return tickets;
}
My issue is how to fill the sub-classes (ticket_incidents, ticket_files, ticket_comments) and then put them into the root class.
Firstly, you might save a lot of time by using an ORM to query your database; it will turn a set of related tables into an object graph.
Secondly, if you're going to avoid a full ORM and read the data directly via ADO.NET, I would recommend Dapper, which will remove a lot of boilerplate code.
Here's how the main body of your method would look with Dapper:
string query = "SELECT * FROM tickets;" +
"SELECT * FROM ticket_incidents;" +
"SELECT * FROM ticket_files;" +
"SELECT * FROM ticket_comments;";
using var gridReader = await con.QueryMultipleAsync(query);
// NOTE: Assumes database columns are named exactly the same as the object properties
var tickets = (await gridReader.ReadAsync<Models.Zelkon.Tickets.Root>()).ToList();
var ticketIncidents = (await gridReader.ReadAsync<Models.Zelkon.Tickets.Incidents>()).ToList();
var ticketFiles = (await gridReader.ReadAsync<Models.Zelkon.Tickets.Files>()).ToList();
var ticketComments = (await gridReader.ReadAsync<Models.Zelkon.Tickets.Comments>()).ToList();
Thirdly, if you're using async with MySQL, uninstall Oracle's MySQL Connector/NET (i.e., MySql.Data) and use MySqlConnector instead. It's a longstanding bug in Connector/NET that async operations actually operate completely synchronously.
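The swap is mostly a package change. A minimal sketch, assuming a recent MySqlConnector version (where the namespace is MySqlConnector rather than MySql.Data.MySqlClient) and the same query string as above:
// NuGet: MySqlConnector (remove MySql.Data); the type names stay the same.
using MySqlConnector;

using (var con = new MySqlConnection(Variables.conString))
{
    await con.OpenAsync(); // truly asynchronous with MySqlConnector
    using (var cmd = new MySqlCommand(query, con))
    using (var reader = (MySqlDataReader)await cmd.ExecuteReaderAsync())
    {
        while (await reader.ReadAsync())
        {
            // ...same per-table mapping as in the question, or Dapper's QueryMultipleAsync as above...
        }
    }
}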
Finally, to link your objects together, iterate over each collection and attach the object to the right Root object. One way to do this would be to put your Root objects in a Dictionary keyed on their ID:
var ticketsById = tickets.ToDictionary(x => x.ticket_id, x => x);
foreach (var incident in ticketIncidents)
ticketsById[incident.ticket_id].incidents.Add(incident);
foreach (var file in ticketFiles)
ticketsById[file.ticket_id].files.Add(file);
foreach (var comment in ticketComments)
ticketsById[comment.ticket_id].comments.Add(comment);
This assumes that the Root object has an incidents property (files, comments) that's initialized to an empty List<Incidents> (List<Files>, List<Comments>). You will need to write that code if it doesn't exist.
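If those properties don't exist yet, here is a minimal sketch of what to add to the Root class (the property names are assumptions chosen to match the linking loops above):
// Inside Models.Zelkon.Tickets.Root (Incidents/Files/Comments live in the same namespace).
// Initialized so the linking loops can call Add() without null checks.
public List<Incidents> incidents { get; set; } = new List<Incidents>();
public List<Files> files { get; set; } = new List<Files>();
public List<Comments> comments { get; set; } = new List<Comments>();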
Again, any decent ORM (Entity Framework, NHibernate, etc.) will provide all of this code for you "out of the box", so you wouldn't even have to write this method at all.
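For comparison, here is roughly what the "out of the box" version could look like with EF Core; the TicketsContext, provider setup, and navigation property names are all assumptions, not part of the original code:
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public class TicketsContext : DbContext
{
    public DbSet<Models.Zelkon.Tickets.Root> Tickets { get; set; }
    // OnConfiguring / OnModelCreating (MySQL provider, table names, keys) omitted for brevity.
}

public static async Task<List<Models.Zelkon.Tickets.Root>> LoadTicketsAsync()
{
    using (var db = new TicketsContext())
    {
        // EF loads the child collections and attaches them to the right Root rows itself.
        return await db.Tickets
            .Include(t => t.incidents)
            .Include(t => t.files)
            .Include(t => t.comments)
            .ToListAsync();
    }
}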

How to optimize query execution time in C# / React

I am new to React and to Web API. I am loading data into a tabulator in the React front end from the values that I pass through the Web API. I pass the values through the GetReports function like this:
[HttpPost]
[Route("GetReports")]
public IHttpActionResult GetReports(string jwt, List<object> data)
{
if (!Common.VerificationToken.VerifyJWToken(jwt))
{
return null;
}
var to = data[0];
var from = data[1];
DateTime toDate = Convert.ToDateTime(to);
DateTime fromDate = Convert.ToDateTime(from);
var ReportData = db.T_CQL_COIL_DESC.Where(t => t.CCD_CREATED_ON >= toDate &&
t.CCD_CREATED_ON <= fromDate).ToList();
ReportsDTO dto = new ReportsDTO();
List<ReportsDTO> ReportDTO = new List<ReportsDTO>();
try
{
foreach (var report in ReportData)
{
List<vehicledetail> vehicle = new List<vehicledetail>();
var imgurl = "https://firebasestorage.googleapis.com/v0/b/tsl-coil-qlty-monitoring-dev.appspot.com/";
dto = new ReportsDTO();
dto.Type = report.CCD_OP_TYPE;
dto.ID = report.CCD_COIL_ID;
vehicle = GetVehicleID(dto.ID);
vehicledetail vehicledetails = vehicle[0];
dto.vehicleno = vehicledetails.vehicleno.ToString();
dto.wagonno = vehicledetails.wagonno.ToString();
dto.Active = report.CCD_ACTIVE;
dto.ImgURL = report.CCD_IMAGE_URL != null ? imgurl + report.CCD_IMAGE_URL : "NA";
dto.Desc = report.CCD_VIEW_DESC != null ? report.CCD_VIEW_DESC : "NA";
ReportDTO.Add(dto);
}
return Ok(ReportDTO);
}
catch (Exception ex)
{
return Content(HttpStatusCode.NoContent, "Something went wrong");
}
}
The vehicledetail data in vehicledetail vehicledetails = vehicle[0]; is populated by this function:
public List<vehicledetail> GetVehicleID(string coilID)
{
List<vehicledetail> vehicle = new List<vehicledetail>();
vehicledetail vehicledetails = new vehicledetail();
string oradb = Utilities.STAR_DB;
OracleConnection conn = new OracleConnection(oradb);
string query = "SELECT a.Vbeln, b.WAGON_NO FROM sapr3.lips a, sapr3.ZVTRRDA b WHERE a.MANDT='600' AND a.CHARG='" + coilID + "' AND a.LFIMG > 0 AND a.MANDT = b.MANDT AND a.VBELN = b.VBELN";
OracleDataAdapter da = new OracleDataAdapter(query, conn);
conn.Open();
DataTable dt = new DataTable();
da.Fill(dt);
foreach (DataRow row in dt.Rows)
{
vehicledetails.vehicleno = row["VBELN"].ToString();
vehicledetails.wagonno = row["WAGON_NO"].ToString();
}
conn.Close();
vehicle.Add(vehicledetails);
return (vehicle);
}
It is working fine, but it takes about 30 seconds to load the data below. How do I optimize this? Please help.
A few other things aside, the major problem seems to be that you are querying the database once for each vehicle.
In this particular scenario it might very well be better to select all the coil IDs up front and query them together.
var vehicleIds = ReportData.Select(r => r.CCD_COIL_ID).Distinct().ToList();
You can then form a single query that gets all the vehicle details together. This reduces the number of database calls, which can have a massive impact on time.
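A rough sketch of what that batched lookup could look like (the method name, the bind-variable loop, and returning a dictionary keyed by coil ID are illustrative, not from the original code); it also moves the CHARG filter into bind variables instead of string concatenation:
public Dictionary<string, vehicledetail> GetVehicleDetails(IList<string> coilIDs)
{
    var result = new Dictionary<string, vehicledetail>();
    if (coilIDs.Count == 0) return result;

    // One bind variable per coil ID (:c0, :c1, ...). Oracle caps an IN list at 1000 entries,
    // so chunk the IDs if the list can exceed that.
    var names = coilIDs.Select((_, i) => ":c" + i).ToList();
    string query = "SELECT a.CHARG, a.Vbeln, b.WAGON_NO FROM sapr3.lips a, sapr3.ZVTRRDA b " +
                   "WHERE a.MANDT='600' AND a.LFIMG > 0 AND a.MANDT = b.MANDT AND a.VBELN = b.VBELN " +
                   "AND a.CHARG IN (" + string.Join(",", names) + ")";

    using (var conn = new OracleConnection(Utilities.STAR_DB))
    using (var cmd = new OracleCommand(query, conn))
    {
        for (int i = 0; i < coilIDs.Count; i++)
        {
            var p = cmd.CreateParameter();
            p.ParameterName = names[i];
            p.Value = coilIDs[i];
            cmd.Parameters.Add(p);
        }
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // Last row per coil wins, mirroring the original per-coil loop.
                result[reader["CHARG"].ToString()] = new vehicledetail
                {
                    vehicleno = reader["VBELN"].ToString(),
                    wagonno = reader["WAGON_NO"].ToString()
                };
            }
        }
    }
    return result;
}
The controller loop can then look each report's coil ID up in the returned dictionary instead of opening a new connection per row.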
Another thing to check is if there's any index created on the vehicle id column in database as that may also help speed things up.

C# MVC Loop through list and update each record efficiently

I have a list of 'Sites' that are stored in my database. The list is VERY big and contains around 50,000+ records.
I am trying to loop through each record and update it. This takes ages, is there a better more efficient way of doing this?
using (IRISInSiteLiveEntities DB = new IRISInSiteLiveEntities())
{
var allsites = DB.Sites.ToList();
foreach( var sitedata in allsites)
{
var siterecord = DB.Sites.Find(sitedata.Id);
siterecord.CabinOOB = "Test";
siterecord.TowerOOB = "Test";
siterecord.ManagedOOB = "Test";
siterecord.IssueDescription = "Test";
siterecord.TargetResolutionDate = "Test";
DB.Entry(siterecord).State = EntityState.Modified;
}
DB.SaveChanges();
}
I have cut the stuff out of the code to get to the point. The proper function code I am using basically pulls a list out from Excel, then matches the records in the sites list and updates each record that matches accordingly. The DB.Find is slowing the loop down dramatically.
[HttpPost]
public ActionResult UploadUpdateOOBList()
{
CheckPermissions("UpdateOOBList");
string[] typesallowed = new string[] { ".xls", ".xlsx" };
HttpPostedFileBase file = Request.Files[0];
var fname = file.FileName;
if (!typesallowed.Any(fname.Contains))
{
return Json("NotAllowed");
}
file.SaveAs(Server.MapPath("~/Uploads/OOB List/") + fname);
//Create empty OOB data list
List<OOBList.OOBDetails> oob_data = new List<OOBList.OOBDetails>();
//Using ClosedXML rather than Interop Excel....
//Interop Excel: 30 seconds for 750 rows
//ClosedXML: 3 seconds for 750 rows
string fileName = Server.MapPath("~/Uploads/OOB List/") + fname;
using (var excelWorkbook = new XLWorkbook(fileName))
{
var nonEmptyDataRows = excelWorkbook.Worksheet(2).RowsUsed();
foreach (var dataRow in nonEmptyDataRows)
{
//for row number check
if (dataRow.RowNumber() >= 4 )
{
string siteno = dataRow.Cell(1).GetValue<string>();
string sitename = dataRow.Cell(2).GetValue<string>();
string description = dataRow.Cell(4).GetValue<string>();
string cabinoob = dataRow.Cell(5).GetValue<string>();
string toweroob = dataRow.Cell(6).GetValue<string>();
string manageoob = dataRow.Cell(7).GetValue<string>();
string resolutiondate = dataRow.Cell(8).GetValue<string>();
string resolutiondate_converted = resolutiondate.Substring(resolutiondate.Length - 9);
oob_data.Add(new OOBList.OOBDetails
{
SiteNo = siteno,
SiteName = sitename,
Description = description,
CabinOOB = cabinoob,
TowerOOB = toweroob,
ManageOOB = manageoob,
TargetResolutionDate = resolutiondate_converted
});
}
}
}
//Now delete file.
System.IO.File.Delete(Server.MapPath("~/Uploads/OOB List/") + fname);
Debug.Write("DOWNLOADING LIST ETC....\n");
using (IRISInSiteLiveEntities DB = new IRISInSiteLiveEntities())
{
var allsites = DB.Sites.ToList();
//Loop through sites and the OOB list and if they match then tell us
foreach( var oobdata in oob_data)
{
foreach( var sitedata in allsites)
{
var indexof = sitedata.SiteName.IndexOf(' ');
if( indexof > 0 )
{
var OOBNo = oobdata.SiteNo;
var OOBName = oobdata.SiteName;
var SiteNo = sitedata.SiteName;
var split = SiteNo.Substring(0, indexof);
if (OOBNo == split && SiteNo.Contains(OOBName) )
{
var siterecord = DB.Sites.Find(sitedata.Id);
siterecord.CabinOOB = oobdata.CabinOOB;
siterecord.TowerOOB = oobdata.TowerOOB;
siterecord.ManagedOOB = oobdata.ManageOOB;
siterecord.IssueDescription = oobdata.Description;
siterecord.TargetResolutionDate = oobdata.TargetResolutionDate;
DB.Entry(siterecord).State = EntityState.Modified;
Debug.Write("Updated Site ID/Name Record: " + sitedata.Id + "/" + sitedata.SiteName);
}
}
}
}
DB.SaveChanges();
}
var nowdate = DateTime.Now.ToString("dd/MM/yyyy");
System.IO.File.WriteAllText(Server.MapPath("~/Uploads/OOB List/lastupdated.txt"),nowdate);
return Json("Success");
}
Looks like you are using Entity Framework (6 or Core). In either case both
var siterecord = DB.Sites.Find(sitedata.Id);
and
DB.Entry(siterecord).State = EntityState.Modified;
are redundant, because the sitedata variable is coming from
var allsites = DB.Sites.ToList();
This not only loads the whole Site table into memory, but the EF change tracker also keeps a reference to every object in that list. You can easily verify that with
var siterecord = DB.Sites.Find(sitedata.Id);
Debug.Assert(siterecord == sitedata);
The Find (when the data is already in memory) and Entry methods themselves are fast. But the problem is that they by default trigger automatic DetectChanges, which leads to quadratic time complexity - in simple words, very slow.
With that being said, simply remove them:
if (OOBNo == split && SiteNo.Contains(OOBName))
{
sitedata.CabinOOB = oobdata.CabinOOB;
sitedata.TowerOOB = oobdata.TowerOOB;
sitedata.ManagedOOB = oobdata.ManageOOB;
sitedata.IssueDescription = oobdata.Description;
sitedata.TargetResolutionDate = oobdata.TargetResolutionDate;
Debug.Write("Updated Site ID/Name Record: " + sitedata.Id + "/" + sitedata.SiteName);
}
This way EF will detect changes just once (before SaveChanges) and also will update only the modified record fields.
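If the loop is still slow with very large lists, another EF6 knob worth knowing about (shown here as a sketch, not something taken from the answer above) is to disable automatic change detection while you modify the entities and run it once yourself before saving:
using (IRISInSiteLiveEntities DB = new IRISInSiteLiveEntities())
{
    // Stop EF from re-scanning every tracked entity on each tracker-touching call.
    DB.Configuration.AutoDetectChangesEnabled = false;

    var allsites = DB.Sites.ToList();
    // ... the same matching/update loop as above ...

    DB.ChangeTracker.DetectChanges(); // one explicit pass, required because auto-detection is off
    DB.SaveChanges();
}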
I have followed Ivan Stoev's suggestion and have changed the code by removing the DB.Find and the EntityState.Modified line - it now takes about a minute and a half, compared to 15 minutes beforehand. Very surprising, as I didn't know that you don't actually require that to update the records. Clever. The code is now:
using (IRISInSiteLiveEntities DB = new IRISInSiteLiveEntities())
{
var allsites = DB.Sites.ToList();
Debug.Write("Starting Site Update loop...");
//Loop through sites and the OOB list and if they match then tell us
//750 records takes around 15-20 minutes.
foreach( var oobdata in oob_data)
{
foreach( var sitedata in allsites)
{
var indexof = sitedata.SiteName.IndexOf(' ');
if( indexof > 0 )
{
var OOBNo = oobdata.SiteNo;
var OOBName = oobdata.SiteName;
var SiteNo = sitedata.SiteName;
var split = SiteNo.Substring(0, indexof);
if (OOBNo == split && SiteNo.Contains(OOBName) )
{
sitedata.CabinOOB = oobdata.CabinOOB;
sitedata.TowerOOB = oobdata.TowerOOB;
sitedata.ManagedOOB = oobdata.ManageOOB;
sitedata.IssueDescription = oobdata.Description;
sitedata.TargetResolutionDate = oobdata.TargetResolutionDate;
Debug.Write("Thank you, next: " + sitedata.Id + "\n");
}
}
}
}
DB.SaveChanges();
}
So first of all, you should turn your HttpPost action into an async function.
More info: https://learn.microsoft.com/en-us/dotnet/csharp/programming-guide/concepts/async/
What you should then do is create the tasks and add them to a list, then wait for them to complete (if you want/need to) by calling Task.WaitAll().
https://learn.microsoft.com/en-us/dotnet/api/system.threading.tasks.task.waitall?view=netframework-4.7.2
This will allow your code to run in parallel on multiple threads optimizing performance quite a bit already.
You can also use LINQ to, for example, reduce the size of allsites beforehand by doing something that will roughly look like this:
var sitedataWithCorrectNames = allsites.Where(x => /* evaluate your condition here */ x.SiteName != null);
https://learn.microsoft.com/en-us/dotnet/framework/data/adonet/ef/language-reference/supported-and-unsupported-linq-methods-linq-to-entities
and then start your foreach (var oobdata) loop with foreach (var sitedata in sitedataWithCorrectNames).
Same goes for SiteNo.Contains(OOBName)
https://learn.microsoft.com/en-us/dotnet/csharp/programming-guide/concepts/linq/getting-started-with-linq
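Building on that idea, one way to avoid re-scanning all sites for every OOB row (a sketch, reusing the names from the code above) is to index the sites once by the number prefix of SiteName and only compare against the matching group:
// Index sites by the part of SiteName before the first space ("1234 Some Site" -> "1234").
var sitesByNumber = allsites
    .Where(s => s.SiteName.IndexOf(' ') > 0)
    .ToLookup(s => s.SiteName.Substring(0, s.SiteName.IndexOf(' ')));

foreach (var oobdata in oob_data)
{
    // Empty sequence if there is no site with that prefix, so no null checks needed.
    foreach (var sitedata in sitesByNumber[oobdata.SiteNo])
    {
        if (sitedata.SiteName.Contains(oobdata.SiteName))
        {
            sitedata.CabinOOB = oobdata.CabinOOB;
            // ... remaining property assignments as in the original loop ...
        }
    }
}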
P.S. Most DB SDKs also provide asynchronous functions, so use those as well.
P.P.S. I didn't have an IDE so I eyeballed the code but the links should provide you with plenty of samples. Reply if you need more help.

Dapper query with dynamic list of filters

I have a c# mvc app using Dapper. There is a list table page which has several optional filters (as well as paging). A user can select (or not) any of several (about 8 right now but could grow) filters, each with a drop down for a from value and to value. So, for example, a user could select category "price" and filter from value "$100" to value "$200". However, I don't know how many categories the user is filtering on before hand and not all of the filter categories are the same type (some int, some decimal/double, some DateTime, though they all come in as string on FilterRange).
I'm trying to build a (relatively) simple yet sustainable Dapper query for this. So far I have this:
public List<PropertySale> GetSales(List<FilterRange> filterRanges, int skip = 0, int take = 0)
{
var skipTake = " order by 1 ASC OFFSET @skip ROWS";
if (take > 0)
skipTake += " FETCH NEXT @take";
var ranges = " WHERE 1 = 1 ";
for(var i = 0; i < filterRanges.Count; i++)
{
ranges += " AND @filterRanges[i].columnName BETWEEN @filterRanges[i].fromValue AND @filterRanges[i].toValue ";
}
using (var conn = OpenConnection())
{
string query = @"Select * from Sales "
+ ranges
+ skipTake;
return conn.Query<Sale>(query, new { filterRanges, skip, take }).AsList();
}
}
I Keep getting an error saying "... filterRanges cannot be used as a parameter value"
Is it possible to even do this in Dapper? All of the IEnumerable examples I see are for WHERE ... IN (...) clauses, which doesn't fit this situation. Any help is appreciated.
You can use DynamicParameters class for generic fields.
Dictionary<string, object> Filters = new Dictionary<string, object>();
Filters.Add("UserName", "admin");
Filters.Add("Email", "admin@admin.com");
var builder = new SqlBuilder();
var select = builder.AddTemplate("select * from SomeTable /**where**/");
var parameter = new DynamicParameters();
foreach (var filter in Filters)
{
parameter.Add(filter.Key, filter.Value);
builder.Where($"{filter.Key} = @{filter.Key}");
}
var searchResult = appCon.Query<ApplicationUser>(select.RawSql, parameter);
You can use a list of dynamic column values, but you cannot do the same for the column names other than by using string formatting, which can open you up to SQL injection.
You have to validate the column names from the list in order to be sure that they really exist before using them in a SQL query.
This is how you can use the list of filterRanges dynamically :
const string sqlTemplate = "SELECT /**select**/ FROM Sale /**where**/ /**orderby**/";
var sqlBuilder = new SqlBuilder();
var template = sqlBuilder.AddTemplate(sqlTemplate);
sqlBuilder.Select("*");
for (var i = 0; i < filterRanges.Count; i++)
{
sqlBuilder.Where($"{filterRanges[i].ColumnName} = @columnValue", new { columnValue = filterRanges[i].FromValue });
}
using (var conn = OpenConnection())
{
return conn.Query<Sale>(template.RawSql, template.Parameters).AsList();
}
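A minimal sketch of that column-name validation (the allowed set and the helper name are illustrative only; use whatever columns your FilterRange categories actually map to):
// Whitelist of filterable columns; anything else never reaches the SQL text.
private static readonly HashSet<string> AllowedColumns =
    new HashSet<string>(StringComparer.OrdinalIgnoreCase) { "Price", "SaleDate", "SquareFeet" };

private static string ValidateColumn(string columnName)
{
    if (!AllowedColumns.Contains(columnName))
        throw new ArgumentException($"'{columnName}' is not a filterable column.");
    return columnName;
}
With that in place, the Where call becomes sqlBuilder.Where($"{ValidateColumn(filterRanges[i].ColumnName)} = @columnValue", ...), so only known column names are ever concatenated into the query text.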
You can easily create that dynamic condition using DapperQueryBuilder:
using (var conn = OpenConnection())
{
var query = conn.QueryBuilder($@"
SELECT *
FROM Sales
/**where**/
order by 1 ASC
OFFSET {skip} ROWS FETCH NEXT {take}
");
foreach (var filter in filterRanges)
query.Where($@"{filter.ColumnName:raw} BETWEEN
{filter.FromValue.Value} AND {filter.ToValue.Value}");
return query.Query<Sale>().AsList();
}
Or without the magic word /**where**/:
using (var conn = OpenConnection())
{
var query = conn.QueryBuilder($@"
SELECT *
FROM Sales
WHERE 1=1
");
foreach (var filter in filterRanges)
query.Append($@"AND {filter.ColumnName:raw} BETWEEN
{filter.FromValue.Value} AND {filter.ToValue.Value}");
query.Append($"order by 1 ASC OFFSET {skip} ROWS FETCH NEXT {take}");
return query.Query<Sale>().AsList();
}
The output is fully parametrized SQL, even though it looks like we're doing plain string concatenation.
Disclaimer: I'm one of the authors of this library
I was able to find a solution for this. The key was to convert the List to a Dictionary. I created a private method:
private Dictionary<string, object> CreateParametersDictionary(List<FilterRange> filters, int skip = 0, int take = 0)
{
var dict = new Dictionary<string, object>()
{
{ "@skip", skip },
{ "@take", take },
};
for (var i = 0; i < filters.Count; i++)
{
dict.Add($"column_{i}", filters[i].Filter.Description);
// some logic here which determines how you parse
// I used a switch, not shown here for brevity
dict.Add($"@fromVal_{i}", int.Parse(filters[i].FromValue.Value));
dict.Add($"@toVal_{i}", int.Parse(filters[i].ToValue.Value));
}
return dict;
}
Then to build my query,
var ranges = " WHERE 1 = 1 ";
for(var i = 0; i < filterRanges.Count; i++)
ranges += $" AND {filter[$"column_{i}"]} BETWEEN @fromVal_{i} AND @toVal_{i} ";
Special note: Be very careful here, as the column name is not a parameter and you could open yourself up to injection attacks (as @Popa noted in his answer). In my case those values come from an enum class and not from user input, so I am safe.
The rest is pretty straightforward:
using (var conn = OpenConnection())
{
string query = @"Select * from Sales "
+ ranges
+ skipTake;
return conn.Query<Sale>(query, filter).AsList();
}

IBM db2Command SQL SELECT * FROM IN @namedParameter

When using the C# code below to construct a DB2 SQL query the result set only has one row. If I manually construct the "IN" predicate inside the cmdTxt string using string.Join(",", ids) then all of the expected rows are returned. How can I return all of the expected rows using the db2Parameter object instead of building the query as a long string to be sent to the server?
public object[] GetResults(int[] ids)
{
var cmdTxt = "SELECT DISTINCT ID,COL2,COL3 FROM TABLE WHERE ID IN ( @ids ) ";
var db2Command = _DB2Connection.CreateCommand();
db2Command.CommandText = cmdTxt;
var db2Parameter = db2Command.CreateParameter();
db2Parameter.ArrayLength = ids.Length;
db2Parameter.DB2Type = DB2Type.DynArray;
db2Parameter.ParameterName = "@ids";
db2Parameter.Value = ids;
db2Command.Parameters.Add(db2Parameter);
var results = ExecuteQuery(db2Command);
return results.ToArray();
}
private object[] ExecuteQuery(DB2Command db2Command)
{
_DB2Connection.Open();
var resultList = new ArrayList();
var results = db2Command.ExecuteReader();
while (results.Read())
{
var values = new object[results.FieldCount];
results.GetValues(values);
resultList.Add(values);
}
results.Close();
_DB2Connection.Close();
return resultList.ToArray();
}
You cannot send in an array as a parameter. You would have to do something to build out a list of parameters, one for each of your values.
e.g.: SELECT DISTINCT ID,COL2,COL3 FROM TABLE WHERE ID IN ( @id1, @id2, ... @idN )
And then add the values to your parameter collection:
cmd.Parameters.Add("@id1", DB2Type.Integer).Value = your_val;
Additionally, there are a few things I would do to improve your code:
Use using statements around your DB2 objects. This will automatically dispose of the objects correctly when they go out of scope. If you don't do this, eventually you will run into errors. This should be done on DB2Connection, DB2Command, DB2Transaction, and DB2Reader objects especially.
I would recommend that you wrap queries in a transaction object, even for selects. With DB2 (and my experience is with z/OS mainframe, here... it might be different for AS/400), it writes one "accounting" record (basically the work that DB2 did) for each transaction. If you don't have an explicit transaction, DB2 will create one for you, and automatically commit after every statement, which adds up to a lot of backend records that could be combined.
My personal opinion would also be to create a .NET class to hold the data that you are getting back from the database. That would make it easier to work with using IntelliSense, among other things (because you would be able to auto-complete the property name, and .NET would know the type of the object). Right now, with the array of objects, if your column order or data type changes, it may be difficult to find/debug those usages throughout your code.
I've included a version of your code that I re-wrote that has some of these changes in it:
public List<ReturnClass> GetResults(int[] ids)
{
using (var conn = new DB2Connection())
{
conn.Open();
using (var trans = conn.BeginTransaction(IsolationLevel.ReadCommitted))
using (var cmd = conn.CreateCommand())
{
cmd.Transaction = trans;
var parms = new List<string>();
var idCount = 0;
foreach (var id in ids)
{
var parm = "@id" + idCount++;
parms.Add(parm);
cmd.Parameters.Add(parm, DB2Type.Integer).Value = id;
}
cmd.CommandText = "SELECT DISTINCT ID,COL2,COL3 FROM TABLE WHERE ID IN ( " + string.Join(",", parms) + " ) ";
var resultList = new List<ReturnClass>();
using (var reader = cmd.ExecuteReader())
{
while (reader.Read())
{
var values = new ReturnClass();
values.Id = (int)reader["ID"];
values.Col2 = reader["COL2"].ToString();
values.Col3 = reader["COL3"].ToString();
resultList.Add(values);
}
}
return resultList;
}
}
}
public class ReturnClass
{
public int Id;
public string Col2;
public string Col3;
}
Try changing from:
db2Parameter.DB2Type = DB2Type.DynArray;
to:
db2Parameter.DB2Type = DB2Type.Integer;
This is based on the example given here
