I have a list of objects which I sort multiple times throughout the code and whenever the user interacts with the program. I was wondering whether it would be better to insert new items into their sorted position rather than add them to the end of the list and re-sort the entire list.
The code below is for importing browser bookmarks. Here I add a bunch of bookmarks (Link objects) to the List (this._MyLinks) and then sort the final List, which I think is probably best in this scenario.
public void ImportBookmarks(string importFile)
{
    using (var file = File.OpenRead(importFile))
    {
        var reader = new NetscapeBookmarksReader();
        var bookmarks = reader.Read(file);
        foreach (var b in bookmarks.AllLinks)
        {
            bool duplicate = this._MyLinks.Any(link => link._URL == b.Url);
            if (duplicate)
            {
                continue;
            }
            Link bookmark = new Link();
            bookmark._URL = b.Url;
            bookmark._SiteName = b.Title;
            bookmark.BrowserPath = "";
            bookmark.BrowserName = "";
            if (bookmark.AddToConfig(true))
            {
                this._MyLinks.Add(bookmark);
            }
        }
    }
    this._MyLinks = this._MyLinks.OrderBy(o => o._SiteName).ToList();
}
Now a user also has the option to add their own links (one at a time). Whenever the user adds a link, the ENTIRE list is sorted again using
this._MyLinks = this._MyLinks.OrderBy(o => o._SiteName).ToList();
Is it better from a performance standpoint (or just generally) to insert the item directly into its sorted location? If so, would you have suggestions on how I can go about doing that?
Thanks!
Since you want sorted data, you should use a sorted data structure, rather than an unsorted structure that you re-sort every time or that forces you to inefficiently insert items into the middle of a list.
SortedSet is specifically designed to maintain a sorted set of data efficiently.
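As a minimal sketch (assuming .NET 4.5+ for Comparer<T>.Create, and the Link class with _SiteName and _URL from your code), you could keep the links in a SortedSet<Link> with a comparer that orders by site name. Note that a SortedSet drops items its comparer considers equal, so the comparer below falls back to comparing URLs so that two distinct links with the same site name both survive:

using System;
using System.Collections.Generic;

var links = new SortedSet<Link>(
    Comparer<Link>.Create((a, b) =>
    {
        // Order by site name first; fall back to URL so two different
        // links with the same name are not discarded as duplicates.
        int byName = string.Compare(a._SiteName, b._SiteName, StringComparison.OrdinalIgnoreCase);
        return byName != 0 ? byName : string.Compare(a._URL, b._URL, StringComparison.Ordinal);
    }));

// Each Add places the item in sorted order in O(log n); no re-sort needed.
links.Add(bookmark);

Add also returns false when an equal item (same name and URL under this comparer) is already present, which gives you a duplicate check similar to your Any(...) test.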
I have some problems with collection lists. I use a list to save customer data which I collect from XML and text files.
// First I create an instance of the list
List<Customer> cusList = new List<Customer>();

// Save files
String[] somefiles = Directory.GetFiles("//FromPath");

// Then I loop through some files and collect data for the list
for (int i = 0; i < somefiles.Length; i++)
{
    if (some statements match)
    {
        // call a method and save the file's data to cusList
        cusList = callmethode(somefiles[i]);
    }
    else
    {
        System.Console.WriteLine("Do nothing");
    }
}
I want the list to grow across all the files, but at the moment, after the loop, I only get data from the last file.
How can I make it save the data from all the files?
Kind regards
When you write cusList = callmethode(somefiles[i]); you re-assign the list on every iteration. What you need instead is something like this:
cusList.AddRange(callmethode(somefiles[i]));
This will just add the elements returned from the method to your list instead of replacing the entire list.
If the method just returns a single element, use cusList.Add instead.
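Putting the fix back into the loop from the question (keeping its placeholder condition), it would look roughly like this:

List<Customer> cusList = new List<Customer>();
String[] somefiles = Directory.GetFiles("//FromPath");

for (int i = 0; i < somefiles.Length; i++)
{
    if (some statements match)
    {
        // Append this file's customers instead of replacing the whole list.
        cusList.AddRange(callmethode(somefiles[i]));
    }
}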
HimBromBeere explained the issue and provided the correct answer. Just as a side note:
You could use LINQ to simplify this task:
List<Customer> cusList = somefiles
    .Where(f => some statements match)
    .SelectMany(f => callmethode(f))
    .ToList();
I know how I can output my graph database to a file (like a GraphML file), but I'd rather iterate through the nodes and have them as C# objects because I need to use them elsewhere.
Something like this:
var it = graph.GetNodeIterator();
while (it.HasNext()) {
    var node = new MyNodeObject();
    //get attributes or whatever
    nodeList.Add(node);
}
//nodeList now contains all nodes in a List
I can't find a convenient way to do this and the Dex documentation isn't very helpful. Clearly Dex has some way of doing this because I can easily export to GraphML, but I don't want to export to GraphML and then parse the GraphML into C# objects.
Here is how I do it, not sure if it's the best way though:
//find the type(s) you're looking for based on how you assigned
//them in the first place. You may already know.
var typeIt = graph.FindTypes().Iterator();
while (typeIt.HasNext()) {
    //find your types if you don't know
}

//select the types you want. Let's say all of your nodes have one of two types,
//which map to attributes 4 and 3.
var select = graph.Select(4);

//union them
select.Union(graph.Select(3));

//start to iterate
var it = select.Iterator();
while (it.HasNext()) {
    var next = it.Next();
    //map to your own objects using next
}
I am fairly new to C#.
I am trying to retrieve some information from an external data source and store it in an array; once it is in the array I wish to sort it by time.
I know how to do this for just one column in a row, however the information I require has multiple columns.
For example:
foreach (Appointment Appoint in fapts)
{
    // Store Appoint.Subject, Appoint.Start, Appoint.Organiser.Name.ToString(), Appoint.Location in an array
}

// Sort my array by Appoint.Start
foreach (var item in myNewArray)
{
    //print out Appoint.Subject - Appoint.Start, Appoint.Organiser.Name.ToString() and Appoint.Location
}
Many thanks for your help.
EDIT:
I have multiple data sources which pull in this:
foreach (Appointment Appoint in fapts)
{
    // Store Appoint.Subject, Appoint.Start, Appoint.Organiser.Name.ToString(), Appoint.Location in an array
}
Hence the need to sort the items in a new array. I know this isn't very efficient, but there is no other way of getting the information I need.
You can sort a list using the LINQ sorting operators OrderBy and ThenBy, as shown below.
using System.Linq;
and then...
var appointments = new List<Appointment>();
var sortedAppointments = appointments.OrderBy(l => l.Subject).ThenBy(l => l.Organiser.Name).ToList();
This will create a new list of appointments, sorted by subject and then by organiser name.
It's unclear what your final aim is but:
Use a generic List instead of an array:
See this SO question for more information as to why using a List is preferred.
List<Appointment> appointments = new List<Appointment>();

foreach (Appointment Appoint in fapts)
{
    appointments.Add(Appoint);
}

foreach (var item in appointments)
{
    Console.WriteLine(item.Subject);
    Console.WriteLine(item.Foo);
    // Here you could override ToString() on Appointment to print everything in one Console.WriteLine
}
If the aim of your code is to order by time, try the following:
var sortedAppointments = fapts.OrderBy(a => a.Start); // assuming Start is a DateTime property of `Appointment`.
Consider a Dictionary instead of an array if the data is conceptually one row with multiple columns.
foreach (KeyValuePair<string, string> entry in MyDic)
{
    // do something with entry.Value or entry.Key
}
You already have a list of objects in fapts, so sort that and capture the result:
var sorted = fapts.OrderBy(x => x.Subject).ThenBy(x => x.Location).ToList();
LINQ is your friend here.
fapts appears to already be a collection so you could just operate on it.
var myNewArray = fapts.OrderBy(Appoint => Appoint.Start).ToArray();
I've used the ToArray() call to force immediate evaluation; it means that myNewArray is already sorted, so if you use it more than once you don't have to re-evaluate the sort.
Alternatively, if you are only using this once, you can just as easily leave the ToArray() call out, and execution of the sort will then be deferred until you enumerate myNewArray.
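To make the deferred execution concrete, here's a small sketch (assuming fapts is an IEnumerable<Appointment>); without ToArray(), each enumeration re-runs the sort:

var sorted = fapts.OrderBy(a => a.Start); // deferred: nothing is sorted yet

foreach (var a in sorted) { /* the sort actually runs here */ }
foreach (var a in sorted) { /* ...and runs again here */ }

var snapshot = sorted.ToArray(); // sorts once and stores the result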
This solution puts the source objects into the array, but if you only want to store the specific fields you mention, then you will need a Select. You have two choices for the array item type: an anonymous class (which causes difficulties if you are returning this array from a function) or a named class.
For an anonymous class:
var myNewArray = fapts.OrderBy(Appoint => Appoint.Start)
                      .Select(Appoint => new {
                          Start = Appoint.Start,
                          Organiser = Appoint.Organiser.Name.ToString(),
                          Location = Appoint.Location
                      }).ToArray();
For a named class (assuming the class is MyClass):
var myNewArray = fapts.OrderBy(Appoint => Appoint.Start)
                      .Select(Appoint => new MyClass {
                          Start = Appoint.Start,
                          Organiser = Appoint.Organiser.Name.ToString(),
                          Location = Appoint.Location
                      }).ToArray();
You have a wide range of options. The 2 most common are:
1) Create a class, then define an array or list of that class, and populate that
2) Create a structure that matches the data format and create an array or list of that
Of course, you could put the data into an XML format or dataset, but that's probably more work than you need.
public List<foo> appointments = new List<foo>();

public struct foo
{
    public string subject;
    public DateTime start;
    public string name;
    public string location;
}

public void foo1()
{
    // parse the file (pseudocode; substitute your actual file reading)
    while (!File.eof())
    {
        // Read the next line...
        var myRecord = new foo();
        myRecord.subject = data.subject;
        myRecord.start = data.Start;
        myRecord.name = data.Name;
        //...
        appointments.Add(myRecord);
    }
}
Enjoy
(Since I can't comment and reply to the comment: it wasn't clear if he had a class, etc., or was just showing us what he wanted to do. I assumed it was just for demonstration purposes, since there wasn't any info as to how the data was being read. If he could already put it into a class, then the first answer applied anyway. I just tossed the last 2 in there because they were options for getting the data first.)
I've been having a problem for some time, and I've exhausted all means of figuring this out for myself.
I have 2 lists in a MS SharePoint 2010 environment that are holding personal physician data for a medical group... nothing special, just mainly text fields and a few lookup and choice fields.
I am trying to write a program that will migrate the data over from List A to List B. I am using LINQ to Sharepoint to accomplish this. Everything compiles just fine, but when it runs and hits the SubmitChanges() method, I get a runtime error that states:
"All new entities within an object graph must be added/attached before changes are submitted."
This issue must be outside of my realm of C# knowledge, because I simply cannot find the solution for it. The problem is DEFINITELY stemming from the fact that some of the columns are of type "Lookup": when I create a new "Physician" entity in my LINQ query, if I comment out the fields that deal with the lookup columns, everything runs perfectly.
With the lookup columns included, if I debug and hit breakpoints before the SubmitChanges() method, I can look at the new "Physician" entities created from the old list, and the fields, including the data from the lookup columns, look good; the data is in there the way I want it to be. It just flakes out whenever it tries to actually update the new list with the new entities.
I have tried several methods of working around this error, all to no avail. In particular, I have tried creating a brand new EntityList and calling the Attach() method after each new "Physician" entity is created, but that just sends me around in circles, chasing other errors such as "ID cannot be null" and "Cannot insert entities that have been deleted".
I am no farther now than when I first got this error and any help that anyone can offer would certainly be appreciated.
Here is my code:
using (ProviderDataContext ctx = new ProviderDataContext("http://dev"))
{
    SPSite sitecollection = new SPSite("http://dev");
    SPWeb web = sitecollection.OpenWeb();
    SPList theOldList = web.Lists.TryGetList("OldList_Physicians");

    //Create new Physician entities.
    foreach (SPListItem l in theOldList.Items)
    {
        PhysiciansItem p = new PhysiciansItem()
        {
            FirstName = (String)l["First Name"],
            Title = (String)l["Last Name"],
            MiddleInitial = (String)l["Middle Init"],
            ProviderNumber = Convert.ToInt32(l["Provider No"]),
            Gender = ConvertGender(l),
            UndergraduateSchool = (String)l["UG_School"],
            MedicalSchool = (String)l["Med_School"],
            Residency = (String)l["Residency"],
            Fellowship = (String)l["Fellowship"],
            Internship = (String)l["Internship"],
            PhysicianType = ConvertToPhysiciantype(l),
            Specialty = ConvertSpecialties(l),
            InsurancesAccepted = ConvertInsurance(l),
        };
        ctx.Physicians.InsertOnSubmit(p);
    }
    ctx.SubmitChanges(); //this is where it flakes out
}
}
//These are conversion functions that I wrote to convert the data from the old list to the new lookup columns.
private Gender ConvertGender(SPListItem l)
{
    Gender g = new Gender();
    if ((String)l["Sex"] == "M")
    {
        g = Gender.M;
    }
    else
    {
        g = Gender.F;
    }
    return g;
}
//Process and convert the 'Physician Type', namely the distinction between MD (Medical Doctor) and
//DO (Doctor of Osteopathic Medicine). State regulations require this information to be attached
//to a physician's profile.
private ProviderTypesItem ConvertToPhysiciantype(SPListItem l)
{
    ProviderTypesItem p = new ProviderTypesItem();
    p.Title = (String)l["Provider_Title:Title"];
    p.Intials = (String)l["Provider_Title"];
    return p;
}
//Process and convert current Specialty and SubSpecialty data into the single multi-choice lookup column
private EntitySet<Item> ConvertSpecialties(SPListItem l)
{
    EntitySet<Item> theEntityList = new EntitySet<Item>();
    Item i = new Item();
    i.Title = (String)l["Provider Specialty"];
    theEntityList.Add(i);

    if ((String)l["Provider SubSpecialty"] != null)
    {
        Item theSubSpecialty = new Item();
        theSubSpecialty.Title = (String)l["Provider SubSpecialty"];
        theEntityList.Add(theSubSpecialty);
    }
    return theEntityList;
}
//Process and add insurance accepted.
//Note this is a conversion from 3 boolean columns in the SP environment to a multi-select enabled checkbox list.
private EntitySet<Item> ConvertInsurance(SPListItem l)
{
    EntitySet<Item> theEntityList = new EntitySet<Item>();

    if ((bool)l["TennCare"] == true)
    {
        Item TenncareItem = new Item();
        TenncareItem.Title = "TennCare";
        theEntityList.Add(TenncareItem);
    }
    if ((bool)l["Medicare"] == true)
    {
        Item MedicareItem = new Item();
        MedicareItem.Title = "Medicare";
        theEntityList.Add(MedicareItem);
    }
    if ((bool)l["Commercial"] == true)
    {
        Item CommercialItem = new Item();
        CommercialItem.Title = "Commercial";
        theEntityList.Add(CommercialItem);
    }
    return theEntityList;
}
}
So this may not be the answer you're looking for, but it's what's worked for me in the past. I've found updating lookup fields using LINQ to SharePoint to be quite frustrating. It frequently doesn't work, or doesn't work efficiently (forcing me to query an item by ID just to set the lookup value).
You can set up the entity so that it has an int property for the lookup id (for each lookup field) and a string property for the lookup value. If, when you generate the entities using SPMetal, you don't generate the list that is being looked up, then it will do this on its own. What I like to do is (using your entity as an example):
1) Generate the entity for just that one list (Physicians) in some temporary folder
2) Pull out the properties for lookup id & value (there will also be private backing fields that need to come along for the ride too) for each of the lookups (or the ones that I'm interested in)
3) Create a partial class file for Physicians in my actual project, so that regenerating the entire SPMetal file normally (without restricting it to just that list) doesn't overwrite changes
4) Paste the lookup id & value properties into this partial Physicians class
Now you will have 3 properties for each lookup field. For example, for PhysicianType there will be:
PhysicianType, which is the one that is currently there. This is great when querying data, as you can perform joins and such very easily.
PhysicianTypeId, which can occasionally be useful for queries if you only need the ID, as it makes them a bit simpler, but mostly I use it whenever setting the value. To set a lookup field you only need to set the ID. This is easy, and has a good track record of actually working (correctly) in my experience.
PhysicianTypeValue, which could be useful when performing queries if you just need the lookup value as a string (meaning it will be the raw value, rather than something which is already parsed if it's a multivalued field, or a user field, etc.). Sometimes I'd rather parse it myself, or maybe just see what the underlying value is during development. Even if you don't use it and use the first property instead, I often bring it along for the ride since I'm already doing most of the work to bring the PhysicianTypeId field over.
It seems a bit hacky, and contrary to the general design of linq-to-SharePoint. I agree, but it also has the advantage of actually working, and not actually being all that hard (once you get the rhythm of it down and learn what exactly needs to be copied over to move the properties from one file to another).
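For illustration, here is roughly what the pasted-in properties can look like (a sketch modeled on SPMetal's typical output; the attribute arguments, backing-field names, and the PhysicianType field name are placeholders to adapt to your own list):

public partial class PhysiciansItem
{
    private System.Nullable<int> _physicianTypeId;
    private string _physicianTypeValue;

    // Lookup ID: set this to the target item's ID when inserting or
    // updating, instead of assigning the PhysicianType entity itself.
    [Microsoft.SharePoint.Linq.ColumnAttribute(Name = "PhysicianType",
        Storage = "_physicianTypeId", FieldType = "Lookup", IsLookupId = true)]
    public System.Nullable<int> PhysicianTypeId
    {
        get { return this._physicianTypeId; }
        set
        {
            if (value != this._physicianTypeId)
            {
                // These change-tracking calls come along with the
                // property when you copy it out of the generated file.
                this.OnPropertyChanging("PhysicianTypeId", this._physicianTypeId);
                this._physicianTypeId = value;
                this.OnPropertyChanged("PhysicianTypeId");
            }
        }
    }

    // Raw lookup value as a string; handy for read-only queries.
    [Microsoft.SharePoint.Linq.ColumnAttribute(Name = "PhysicianType",
        Storage = "_physicianTypeValue", ReadOnly = true, FieldType = "Lookup",
        IsLookupValue = true)]
    public string PhysicianTypeValue
    {
        get { return this._physicianTypeValue; }
    }
}

With something like that in place, the migration loop can set p.PhysicianTypeId to the looked-up item's ID and skip the entity assignment that was triggering the object-graph error.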
I'm trying to achieve a super-fast search, and decided to rely heavily on caching to achieve this. The order of events is as follows:
1) Cache what can be cached (from entire database, around 3000 items)
2) When a search is performed, pull the entire result set out of the cache
3) Filter that result set based on the search criteria. Give each search result a "relevance" score.
4) Send the filtered results down to the database via XML to get the bits that can't be cached (e.g. prices)
5) Display the final results
This is all working and going at lightning speed, but in order to achieve (3) I've given each result a "relevance" score. This is just an integer member on each search result object. I iterate through the entire result set, update this score accordingly, and then order by it at the end.
The problem I am having is that the "relevance" member is retaining its value from search to search. I assume this is because what I am updating is a reference to the search results in the cache, rather than a new object, so updating it also updates the cached version. What I'm looking for is a tidy solution to get around this. What I've come up with so far is either:
a) Clone the cache when I get it.
b) Create a separate dictionary to store relevances in and match them up at the end.
Am I missing a really obvious and clean solution, or should I go down one of these routes? I'm using C# and .NET.
Hopefully it should be obvious from the description what I'm getting at; here's some code anyway. This first one is the iteration through the cached results in order to do the filtering:
private List<QuickSearchResult> performFiltering(string keywords, string regions, List<QuickSearchResult> cachedSearchResults)
{
    List<QuickSearchResult> filteredItems = new List<QuickSearchResult>();

    string upperedKeywords = keywords.ToUpper();
    string[] keywordsArray = upperedKeywords.Split(' ');
    string[] regionsArray = regions.Split(',');

    foreach (var item in cachedSearchResults)
    {
        //Check for keywords
        if (keywordsArray != null)
        {
            if (!item.ContainsKeyword(upperedKeywords, keywordsArray))
                continue;
        }

        //Check for regions
        if (regionsArray != null)
        {
            if (!item.IsInRegion(regionsArray))
                continue;
        }

        filteredItems.Add(item);
    }

    return filteredItems.OrderBy(t => t.Relevance).Take(_maxSearchResults).ToList<QuickSearchResult>();
}
and here is an example of the IsInRegion method of the QuickSearchResult object:
public bool IsInRegion(string[] regions)
{
    int relevanceScore = 0;
    foreach (var region in regions)
    {
        int parsedRegion = 0;
        if (int.TryParse(region, out parsedRegion))
        {
            foreach (var thisItemsRegion in this.Regions)
            {
                if (thisItemsRegion.ID == parsedRegion)
                    relevanceScore += 10;
            }
        }
    }
    Relevance += relevanceScore;
    return relevanceScore > 0;
}
And basically, if I search for "london" I get a score of 10 the first time, 20 the second time...
If you use the NetDataContractSerializer to serialize your objects in the cache, you can use the [DataMember] attribute to control what gets serialized and what doesn't. For instance, you could store your temporary calculated relevance value in a field that is not serialized.
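A sketch of that idea (assuming you cache the serialized form and deserialize on each search, so every read hands back a fresh clone; the Title property is just illustrative):

using System.IO;
using System.Runtime.Serialization;

[DataContract]
public class QuickSearchResult
{
    [DataMember]
    public string Title { get; set; } // serialized into the cache

    // No [DataMember] here, so the serializer skips this field:
    // every deserialized copy starts with Relevance == 0.
    public int Relevance { get; set; }
}

// Round-tripping through the serializer produces an independent clone,
// so scoring the clone never mutates the cached originals.
public static T Clone<T>(T source)
{
    var serializer = new NetDataContractSerializer();
    using (var stream = new MemoryStream())
    {
        serializer.Serialize(stream, source);
        stream.Position = 0;
        return (T)serializer.Deserialize(stream);
    }
}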