I am using the following helper function:
public List<EventRecord> GetEvents(DateTime afterTime)
{
var formattedDateTime = $"{afterTime:yyyy-MM-dd}T{afterTime:HH:mm:ss}.000000000Z";
var query = $"*[(System/Provider/#Name='.Net Runtime') and (System/EventID=1000) and (System/TimeCreated/#SystemTime >= '{formattedDateTime}')]";
var queryResult = new EventLogQuery("Application", PathType.LogName, query);
var reader = new EventLogReader(queryResult);
var events = new List<EventRecord>();
while (true)
{
var rec = reader.ReadEvent();
if (rec == null)
{
break;
}
events.Add(rec);
}
return events;
}
This code almost works except the query seems to be ignoring the TimeCreated entirely. It's returning all events with the given ProviderName and EventId. I have tried all sorts of different things to get this to work but no matter what, TimeCreated is ignored.
Anyone see what I'm doing wrong?
Edit 1
Even replacing the query line with:
var query = $"*[System[TimeCreated[#SystemTime >= '{formattedDateTime}']]]";
It doesn't work either. It returns all events regardless of when they were created.
Edit 2
So I tried using the 'custom view' builder to generate an XML query for me and what I found was even more perplexing:
So currently the time displayed on my machine is 2:42pm.
In 24-hour time that is 14:42.
When I create a query using the custom view and select:
From: 'Events On' 03/18/2021 2:42pm, it creates the following:
<QueryList>
<Query Id="0" Path="Application">
<Select Path="Application">*[System[Provider[@Name='.NET Runtime'] and (EventID=1000) and TimeCreated[@SystemTime>='2021-03-18T20:42:13.000Z']]]</Select>
</Query>
</QueryList>
Why on God's green earth did it convert 2:42pm to 20:42?
So apparently you need to convert your time to UniversalTime for this to work.
Here is a working sample:
public List<EventRecord> GetEvents(DateTime afterTime)
{
var formattedDateTime = afterTime.ToUniversalTime().ToString("o");
var query = $"*[System[Provider[#Name='.NET Runtime'] and (EventID=1000) and TimeCreated[#SystemTime>='{formattedDateTime}']]]";
var queryResult = new EventLogQuery("Application", PathType.LogName, query);
var reader = new EventLogReader(queryResult);
var events = new List<EventRecord>();
while (true)
{
var rec = reader.ReadEvent();
if (rec == null)
{
break;
}
events.Add(rec);
}
return events;
}
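For illustration, a minimal usage sketch (the one-hour window is just an example, not part of the original post):
// Fetch matching events written in the last hour; GetEvents converts
// the local DateTime to UTC before embedding it in the XPath query.
var recentEvents = GetEvents(DateTime.Now.AddHours(-1));
Console.WriteLine($"Found {recentEvents.Count} matching events.");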
Here is some simple code for asking the user to select some LINE and/or ARC entities:
_AcDb.TypedValue[] dxfs = new _AcDb.TypedValue[]
{
new _AcDb.TypedValue((int)_AcDb.DxfCode.Operator, "<or"),
new _AcDb.TypedValue((int)_AcDb.DxfCode.Start, "LINE"),
new _AcDb.TypedValue((int)_AcDb.DxfCode.Start, "ARC"),
new _AcDb.TypedValue((int)_AcDb.DxfCode.Operator, "or>"),
};
_AcEd.SelectionFilter sFilter = new _AcEd.SelectionFilter(dxfs);
_AcEd.PromptSelectionOptions pso = new _AcEd.PromptSelectionOptions
{
MessageForAdding = "Select LINES and/or ARCS",
MessageForRemoval = "Remove LINES and/or ARCS",
AllowDuplicates = false
};
_AcEd.PromptSelectionResult res = editor.GetSelection(pso, sFilter);
if (res.Status == _AcEd.PromptStatus.OK)
Now, suppose we modify our tool so that it uses CommandFlags.UsePickSet. Then I can test for an existing selection set:
_AcEd.PromptSelectionResult res = editor.SelectImplied();
If the implied selection set result is OK, how can we easily validate that selection set against our filter? After all, the user might accidentally pick up a CIRCLE, which we would want to ignore.
I can confirm that the answer here (Filter Pickfirst Selectionset) is still correct. To quote:
With CommandFlags.UsePickSet, the selection filter passed to the Editor.GetSelection() method is automatically applied to the active selection, if any.
I repeat the code snippet in case the link breaks:
[CommandMethod("Test", CommandFlags.UsePickSet)]
public void Test()
{
Document doc = AcAp.DocumentManager.MdiActiveDocument;
Database db = doc.Database;
Editor ed = doc.Editor;
TypedValue[] filter = { new TypedValue(0, "INSERT") };
PromptSelectionResult psr = ed.GetSelection(new SelectionFilter(filter));
if (psr.Status != PromptStatus.OK) return;
using (Transaction tr = db.TransactionManager.StartTransaction())
{
foreach (SelectedObject obj in psr.Value)
{
BlockReference br = (BlockReference)tr.GetObject(obj.ObjectId, OpenMode.ForWrite);
br.Color = Color.FromColorIndex(ColorMethod.ByAci, 30);
}
tr.Commit();
}
}
All we need to do is add the CommandFlags.UsePickSet and the system will take care of the rest (using your filter). Cool.
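To tie it back to the question's LINE/ARC case, here is a rough sketch of mine (not from the linked answer), assuming the same using aliases as the snippet above:
// Because of CommandFlags.UsePickSet, a pick-first selection (if any) is
// filtered against the LINE/ARC filter automatically before prompting.
[CommandMethod("TestLinesArcs", CommandFlags.UsePickSet)]
public void TestLinesArcs()
{
    Editor ed = AcAp.DocumentManager.MdiActiveDocument.Editor;
    TypedValue[] filter =
    {
        new TypedValue((int)DxfCode.Operator, "<or"),
        new TypedValue((int)DxfCode.Start, "LINE"),
        new TypedValue((int)DxfCode.Start, "ARC"),
        new TypedValue((int)DxfCode.Operator, "or>")
    };
    PromptSelectionResult res = ed.GetSelection(new SelectionFilter(filter));
    if (res.Status == PromptStatus.OK)
        ed.WriteMessage($"\n{res.Value.Count} LINE/ARC entities selected.");
}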
I have the following code:
var search = new TransactionSearchAdvanced();
search.savedSearchId = "680";
SearchResult searchResult = Client.Service.search(search);
var resultList = searchResult.searchRowList;
var castList = resultList.Cast<TransactionSearchRow>();
Every time I call this method I get 0 search results returned. If I view the saved search in NetSuite itself I have over 1000 results.
I am running a similar search on customers that is 100% working.
public static List<Account> GetCustomerList()
{
var search = new CustomerSearchAdvanced();
search.savedSearchId = "678";
try
{
SearchResult searchResult = Client.Service.search(search);
var resultList = searchResult.searchRowList;
var castList = resultList.Cast<CustomerSearchRow>();
var accountList = new List<Account>();
foreach (var resultRow in castList)
{
var basic = resultRow.basic;
var account = new Account();
account.NsAccountId = basic.entityId?.FirstOrDefault()?.searchValue;
account.Name = basic.companyName?.FirstOrDefault()?.searchValue;
account.EmailAddress1 = basic.email?.FirstOrDefault()?.searchValue;
account.Address = basic.address?.FirstOrDefault()?.searchValue;
account.BillingAddress = basic.billAddress?.FirstOrDefault()?.searchValue;
account.Telephone1 = basic.phone?.FirstOrDefault()?.searchValue;
account.BillingPhone = basic.billPhone?.FirstOrDefault()?.searchValue;
account.Fax = basic.fax?.FirstOrDefault()?.searchValue;
account.WebAddress = basic.url?.FirstOrDefault()?.searchValue;
accountList.Add(account);
}
return accountList;
}
catch
{
// The original snippet is truncated here; log or rethrow as appropriate.
throw;
}
}
I have tried giving the role permission to view transactions. I am totally unfamiliar with NetSuite itself and have no idea what the problem could be, since all the settings on my two searches are identical.
EDIT
The SearchResult objects are actually different; I'm looking into this now.
In the saved search interface of NetSuite there is a checkbox called "Run Unrestricted" that needs to be checked; this is what solved it for me.
I have a query using Npgsql and Postgres. For building my query I am using Dapper and its SqlBuilder.
When I run the plain SQL statement against the DB it returns the correct result. When I go via the SqlBuilder it returns the wrong result.
I have tried different ways, changing the AddTemplate call or the parameters, but nothing changed.
I have also tried changing the line builder.Where("period = @period", new { model.Period }); in different ways:
builder.Where("period = #Period", new { model.Period });
builder.Where("period = period", new { model.Period });
builder.Where("period = #TestPeriod", new { TestPeriod = model.Period });
Or is this a more common way:
builder.Where("period = '" + model.Period + "'");
using (NpgsqlConnection con = Helper.GetNpgsqlConnection())
{
var builder = new SqlBuilder();
var selector = builder.AddTemplate("SELECT * FROM szzRecord.folders /**where**/");
if (model.Period != null)
builder.Where("period = #period", new { model.Period });
var result = con.Query(selector.RawSql);
return result;
}
The plain SQL query, for example SELECT * FROM szzRecord.folders WHERE period = 24, returns 251 rows, which is correct.
The Dapper query returns 1223 rows, which is all of them. So it kind of looks like the parameter doesn't exist. On inspecting the selector I do find my parameter for period: Period = 24 in selector.parameters.templates[0]. Is this correct? selector.parameters.parameters is empty.
You need to pass the SqlBuilder's parameters into your query. You have:
var result = con.Query(selector.RawSql);
Change this to:
var result = con.Query(selector.RawSql, selector.Parameters);
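Put together, the question's snippet then looks roughly like this (a sketch; naming the anonymous-type property explicitly is my addition, so the placeholder and the property name line up):
using (NpgsqlConnection con = Helper.GetNpgsqlConnection())
{
    var builder = new SqlBuilder();
    var selector = builder.AddTemplate("SELECT * FROM szzRecord.folders /**where**/");
    if (model.Period != null)
        builder.Where("period = @Period", new { Period = model.Period });
    // Pass the collected parameters along with the generated SQL.
    var result = con.Query(selector.RawSql, selector.Parameters);
    return result;
}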
I have a list of 'Sites' that are stored in my database. The list is VERY big and contains around 50,000+ records.
I am trying to loop through each record and update it. This takes ages; is there a better, more efficient way of doing this?
using (IRISInSiteLiveEntities DB = new IRISInSiteLiveEntities())
{
var allsites = DB.Sites.ToList();
foreach( var sitedata in allsites)
{
var siterecord = DB.Sites.Find(sitedata.Id);
siterecord.CabinOOB = "Test";
siterecord.TowerOOB = "Test";
siterecord.ManagedOOB = "Test";
siterecord.IssueDescription = "Test";
siterecord.TargetResolutionDate = "Test";
DB.Entry(siterecord).State = EntityState.Modified;
}
DB.SaveChanges();
}
I have cut stuff out of the code above to get to the point. The actual function pulls a list from Excel, matches those records against the sites list, and updates each matching record. The DB.Find call is slowing the loop down dramatically.
[HttpPost]
public ActionResult UploadUpdateOOBList()
{
CheckPermissions("UpdateOOBList");
string[] typesallowed = new string[] { ".xls", ".xlsx" };
HttpPostedFileBase file = Request.Files[0];
var fname = file.FileName;
if (!typesallowed.Any(fname.Contains))
{
return Json("NotAllowed");
}
file.SaveAs(Server.MapPath("~/Uploads/OOB List/") + fname);
//Create empty OOB data list
List<OOBList.OOBDetails> oob_data = new List<OOBList.OOBDetails>();
//Using ClosedXML rather than Interop Excel....
//Interop Excel: 30 seconds for 750 rows
//ClosedXML: 3 seconds for 750 rows
string fileName = Server.MapPath("~/Uploads/OOB List/") + fname;
using (var excelWorkbook = new XLWorkbook(fileName))
{
var nonEmptyDataRows = excelWorkbook.Worksheet(2).RowsUsed();
foreach (var dataRow in nonEmptyDataRows)
{
//for row number check
if (dataRow.RowNumber() >= 4 )
{
string siteno = dataRow.Cell(1).GetValue<string>();
string sitename = dataRow.Cell(2).GetValue<string>();
string description = dataRow.Cell(4).GetValue<string>();
string cabinoob = dataRow.Cell(5).GetValue<string>();
string toweroob = dataRow.Cell(6).GetValue<string>();
string manageoob = dataRow.Cell(7).GetValue<string>();
string resolutiondate = dataRow.Cell(8).GetValue<string>();
string resolutiondate_converted = resolutiondate.Substring(resolutiondate.Length - 9);
oob_data.Add(new OOBList.OOBDetails
{
SiteNo = siteno,
SiteName = sitename,
Description = description,
CabinOOB = cabinoob,
TowerOOB = toweroob,
ManageOOB = manageoob,
TargetResolutionDate = resolutiondate_converted
});
}
}
}
//Now delete file.
System.IO.File.Delete(Server.MapPath("~/Uploads/OOB List/") + fname);
Debug.Write("DOWNLOADING LIST ETC....\n");
using (IRISInSiteLiveEntities DB = new IRISInSiteLiveEntities())
{
var allsites = DB.Sites.ToList();
//Loop through sites and the OOB list and if they match then tell us
foreach( var oobdata in oob_data)
{
foreach( var sitedata in allsites)
{
var indexof = sitedata.SiteName.IndexOf(' ');
if( indexof > 0 )
{
var OOBNo = oobdata.SiteNo;
var OOBName = oobdata.SiteName;
var SiteNo = sitedata.SiteName;
var split = SiteNo.Substring(0, indexof);
if (OOBNo == split && SiteNo.Contains(OOBName) )
{
var siterecord = DB.Sites.Find(sitedata.Id);
siterecord.CabinOOB = oobdata.CabinOOB;
siterecord.TowerOOB = oobdata.TowerOOB;
siterecord.ManagedOOB = oobdata.ManageOOB;
siterecord.IssueDescription = oobdata.Description;
siterecord.TargetResolutionDate = oobdata.TargetResolutionDate;
DB.Entry(siterecord).State = EntityState.Modified;
Debug.Write("Updated Site ID/Name Record: " + sitedata.Id + "/" + sitedata.SiteName);
}
}
}
}
DB.SaveChanges();
}
var nowdate = DateTime.Now.ToString("dd/MM/yyyy");
System.IO.File.WriteAllText(Server.MapPath("~/Uploads/OOB List/lastupdated.txt"),nowdate);
return Json("Success");
}
Looks like you are using Entity Framework (6 or Core). In either case both
var siterecord = DB.Sites.Find(sitedata.Id);
and
DB.Entry(siterecord).State = EntityState.Modified;
are redundant, because the sitedata variable is coming from
var allsites = DB.Sites.ToList();
This not only loads the whole Site table in memory, but also EF change tracker keeps reference to every object from that list. You can easily verify that with
var siterecord = DB.Sites.Find(sitedata.Id);
Debug.Assert(siterecord == sitedata);
The Find (when the data is already in memory) and Entry methods themselves are fast. But the problem is that they by default trigger automatic DetectChanges, which leads to quadratic time complexity - in simple words, very slow.
With that being said, simply remove them:
if (OOBNo == split && SiteNo.Contains(OOBName))
{
sitedata.CabinOOB = oobdata.CabinOOB;
sitedata.TowerOOB = oobdata.TowerOOB;
sitedata.ManagedOOB = oobdata.ManageOOB;
sitedata.IssueDescription = oobdata.Description;
sitedata.TargetResolutionDate = oobdata.TargetResolutionDate;
Debug.Write("Updated Site ID/Name Record: " + sitedata.Id + "/" + sitedata.SiteName);
}
This way EF will detect changes just once (before SaveChanges) and also will update only the modified record fields.
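As an aside (my sketch, assuming EF6 since the exact version isn't stated), if Find/Entry had to stay for some reason, the quadratic DetectChanges cost can also be avoided by disabling automatic change detection around the loop:
DB.Configuration.AutoDetectChangesEnabled = false;
try
{
    // ... the matching/updating loop from the question ...
}
finally
{
    // Re-enable so SaveChanges performs a single DetectChanges pass.
    DB.Configuration.AutoDetectChangesEnabled = true;
}
DB.SaveChanges();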
I have followed Ivan Stoev's suggestion and have changed the code by removing the DB.Find and the EntityState.Modified assignment. It now takes about a minute and a half, compared to 15 minutes beforehand. Very surprising, as I didn't know that you don't actually require those to update the records. Clever. The code is now:
using (IRISInSiteLiveEntities DB = new IRISInSiteLiveEntities())
{
var allsites = DB.Sites.ToList();
Debug.Write("Starting Site Update loop...");
//Loop through sites and the OOB list and if they match then tell us
//750 records takes around 15-20 minutes.
foreach( var oobdata in oob_data)
{
foreach( var sitedata in allsites)
{
var indexof = sitedata.SiteName.IndexOf(' ');
if( indexof > 0 )
{
var OOBNo = oobdata.SiteNo;
var OOBName = oobdata.SiteName;
var SiteNo = sitedata.SiteName;
var split = SiteNo.Substring(0, indexof);
if (OOBNo == split && SiteNo.Contains(OOBName) )
{
sitedata.CabinOOB = oobdata.CabinOOB;
sitedata.TowerOOB = oobdata.TowerOOB;
sitedata.ManagedOOB = oobdata.ManageOOB;
sitedata.IssueDescription = oobdata.Description;
sitedata.TargetResolutionDate = oobdata.TargetResolutionDate;
Debug.Write("Thank you, next: " + sitedata.Id + "\n");
}
}
}
}
DB.SaveChanges();
}
So first of all you should turn your HttpPost action into an async method.
More info: https://learn.microsoft.com/en-us/dotnet/csharp/programming-guide/concepts/async/
What you then should do is create the tasks and add them to a list. Then wait for them to complete (if you want/need to) by calling Task.WaitAll()
https://learn.microsoft.com/en-us/dotnet/api/system.threading.tasks.task.waitall?view=netframework-4.7.2
This will allow your code to run in parallel on multiple threads, optimizing performance quite a bit already.
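As a rough sketch of that idea (ProcessOobRow is a hypothetical helper; note that a single DbContext must not be shared across threads, so each task would need its own context if it touches the database):
// Fan out independent work items and block until they all finish.
var tasks = oob_data
    .Select(oobdata => Task.Run(() => ProcessOobRow(oobdata)))
    .ToList();
Task.WaitAll(tasks.ToArray());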
You can also use LINQ to, for example, reduce the size of allsites beforehand by doing something that will roughly look like this:
var sitedataWithCorrectNames = allsites.Where(x => /* evaluate your condition here */);
https://learn.microsoft.com/en-us/dotnet/framework/data/adonet/ef/language-reference/supported-and-unsupported-linq-methods-linq-to-entities
and then, inside your foreach (var oobdata) loop, iterate with foreach (var sitedata in sitedataWithCorrectNames) instead.
Same goes for SiteNo.Contains(OOBName)
https://learn.microsoft.com/en-us/dotnet/csharp/programming-guide/concepts/linq/getting-started-with-linq
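As a rough illustration of that pre-filtering idea (a sketch only, reusing the names from the question's code and folding in both the prefix check and the Contains check):
foreach (var oobdata in oob_data)
{
    // Keep only sites whose first word matches the OOB site number
    // and whose name contains the OOB site name.
    var matchingSites = allsites.Where(s =>
    {
        var indexof = s.SiteName.IndexOf(' ');
        return indexof > 0
            && s.SiteName.Substring(0, indexof) == oobdata.SiteNo
            && s.SiteName.Contains(oobdata.SiteName);
    });

    foreach (var sitedata in matchingSites)
    {
        sitedata.CabinOOB = oobdata.CabinOOB;
        // ... copy the remaining fields as in the loops above ...
    }
}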
P.S. Most DB SDKs also provide asynchronous functions, so use those as well.
P.P.S. I didn't have an IDE so I eyeballed the code but the links should provide you with plenty of samples. Reply if you need more help.
I am trying to combine a projection and a distinct with the MongoDB driver but am not getting anywhere...
I have:
var coll = db.GetCollection<Vat>(CommonConstants.VatCodeCollection);
// I'd like to combine these in one statement:
var result = coll.Distinct<DateTime>("ValidSince", filter).ToList();
var projection = Builders<Vat>.Projection.Expression(x => new VatPeriod { ValidSince = x.ValidSince });
So in the end I'd like to get a List<VatPeriod> as the result of one statement. Of course I could do something like:
var coll = db.GetCollection<Vat>(CommonConstants.VatCodeCollection);
List<VatPeriod> vatPeriods = null;
try
{
var result = coll.Distinct<DateTime>("ValidSince", filter).ToList();
if (result.Count > 0)
{
vatPeriods = new List<VatPeriod>(result.Count);
foreach (var dateTime in result)
{
vatPeriods.Add(new VatPeriod() {ValidSince = dateTime});
}
}
return vatPeriods;
}
catch .....
in my repository class, but I would prefer to do everything on the Mongo server. Any idea if and how this is possible?