I'm trying to import some legacy poll data into our Sitecore solution. As part of the import, I'm creating new Sitecore items to hold the data, based off a master page we already have set up for polls. This master has a couple of default items set up underneath it, and I want to delete a couple of specific default items before I add the legacy poll data. However, when I attempt to delete one of the default items using item.DeleteChildren(), I get a NullReferenceException thrown by Sitecore.Tasks.ItemEventHandler.OnItemDeleted(Object sender, EventArgs args) in the Sitecore kernel. If anyone has any idea what could be causing this, I'd appreciate it. We're on Sitecore version 5.3.2.
Here is the code I'm using to create/edit the item based off the master. The creation all works perfectly; it's the DeleteChildren() call that fails.
Guid LegacyPollFolderGuid = new Guid("8AE89A44-9DCD-4AC2-B0F3-DD438188A575");
Guid QuizOMaticMasterGuid = new Guid("74B95ABF-1898-4870-8B4F-50AF0078AE22");

var master = Sitecore.Configuration.Factory.GetDatabase("master");
var root = master.GetItem(new Sitecore.Data.ID(LegacyPollFolderGuid));
var quizMasterTemplate = master.Masters[new Sitecore.Data.ID(QuizOMaticMasterGuid)];

// Create the new poll page from the master; this part works fine.
var quizPage = root.Add("Test Quiz", quizMasterTemplate);
if (quizPage != null)
{
    var quiz = quizPage.Children["Column One"].Children["QuizOMatic"];
    if (quiz != null)
    {
        var questionFolder = quiz.Children["Questions"];
        var questionTemplate = questionFolder.Children[0].Template;
        var resultsFolder = quiz.Children["Results"];
        var linksFolder = quiz.Children["Links"];
        using (new Sitecore.SecurityModel.SecurityDisabler())
        {
            // This is the call that throws the NullReferenceException.
            questionFolder.DeleteChildren();
        }
    }
}
I've built a tool with the same functionality: removing all items under a specific folder. I got the same error, but I saw items being deleted, so I reran the tool multiple times and eventually all items were deleted.
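Building on that, one workaround is to delete the children one at a time and tolerate the handler error per item, so a single failure doesn't abort the whole batch. A minimal sketch (the retry structure is mine, not from the original post):

using (new Sitecore.SecurityModel.SecurityDisabler())
{
    // Copy the child list first so the deletes don't disturb the iteration.
    var children = new System.Collections.Generic.List<Sitecore.Data.Items.Item>();
    foreach (Sitecore.Data.Items.Item child in questionFolder.Children)
    {
        children.Add(child);
    }

    foreach (var child in children)
    {
        try
        {
            child.Delete();
        }
        catch (NullReferenceException)
        {
            // The delete often succeeds despite the handler throwing;
            // any items that remain can be retried on a later pass.
        }
    }
}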
I created a MongoDB watcher that triggers actions based on created documents.
Of course, the watcher does not detect documents that were created while the service itself was not running.
The current code only detects newly created documents.
How can I fetch and add older documents to the pipeline based on a field's state, e.g. actionDone: true/false?
var pipeline =
new EmptyPipelineDefinition<ChangeStreamDocument<BsonDocument>>()
.Match(x => x.OperationType == ChangeStreamOperationType.Insert);
using (var cursor = collection.Watch(pipeline))
{
foreach (var change in cursor.ToEnumerable())
{
string mongoID = change.FullDocument.GetValue("_id").ToString();
}
}
Is StartAtOperationTime an option? I didn't find any good documentation on it.
Update:
StartAtOperationTime was the solution I was looking for. If anybody is having the same problem, here is my solution.
It starts the lookup from the last 10 days:
var options = new ChangeStreamOptions
{
    // Note: BsonTimestamp takes seconds since the Unix epoch (plus an
    // increment), not .NET ticks, so convert the cutoff accordingly.
    StartAtOperationTime = new BsonTimestamp(
        (int)DateTimeOffset.UtcNow.AddDays(-10).ToUnixTimeSeconds(), 0)
};
var cursor = collection.Watch(pipeline, options);
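For the other half of the original question (picking up documents created while the service was down, based on a field's state), a backfill query before starting the watch would work too. A rough sketch, assuming a boolean actionDone field that is set to true once a document has been processed:

// Fetch unprocessed documents that were inserted while the watcher was down.
var backlogFilter = Builders<BsonDocument>.Filter.Eq("actionDone", false);
foreach (var doc in collection.Find(backlogFilter).ToEnumerable())
{
    string mongoID = doc.GetValue("_id").ToString();
    // ... run the same action as for change-stream inserts,
    //     then mark the document with actionDone = true ...
}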
I'm currently consuming information from SharePoint 2010 using WCF Data Services from C# (an ASP.NET MVC 4 application).
With regular SharePoint list items, when saving the changes to the context, the id of the new item is automatically populated on the original object. Meaning:
SomeContext context = /* creation of the context*/;
var someEntity = new SomeEntity {...};
context.AddToSomeEntityItems(someEntity);
context.SaveChanges();
var newId = someEntity.Id; //this will have the new id
However, if what you are creating is an item for a document library, the id doesn't seem to be updated on the saved entity. For example:
SomeContext context = /* creation of the context*/;
var someOtherEntity = new SomeOtherEntity {...};
Stream data = /* some stream*/;
context.AddToSomeOtherEntityItems(someOtherEntity);
context.SetSaveStream(someOtherEntity, data, true, "SomeMIMEType", "SomeSlugPath");
context.SaveChanges();
var newId = someOtherEntity.Id; //this will instead always have 0
I initially thought this was a bug in the first version of WCF Data Services so I updated to the latest, 5.4.0. But the behavior seems to be the same.
I would prefer to avoid fragile lookups that could fail under heavy load, such as using .OrderByDescending(x => x.Id).First() to get the most recently created item.
When looking at the actual network traffic using Fiddler, I can see that the initial upload of the binary data actually does return the information for the item along with its ID. And if I configure other relationships to that item using SetLink prior to saving changes they are linked correctly, so the context does handle the value accordingly.
Is there any way to get WCF Data Services to update the ID on the entity when the item is for a document library?
Typically, I set the Id manually before inserting, with:
someOtherEntity.Id = Guid.NewGuid();
That way you can TDD it more easily as well. I think it is better to leave that responsibility to your code than to rely on an external storage system to tell you what the Id is.
What I am doing is parsing out the id from the response headers:
var responses = context.SaveChanges();
var created = responses.FirstOrDefault(r => r.StatusCode == (int)HttpStatusCode.Created);
if (created != null)
{
    var location = created.Headers.First(h => h.Key == "Location").Value;
    //location will be http://sharepoint.url/site/_vti_bin/ListData.svc/{listname}(id)
    //you can parse out the id from the end of that string
}
It's not exactly elegant, but it works.
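For completeness, the parsing step could look like this (the helper is hypothetical, not part of the answer above):

// Hypothetical helper: pull the trailing "(id)" off a Location header such as
// http://sharepoint.url/site/_vti_bin/ListData.svc/SomeList(42)
private static int ParseIdFromLocation(string location)
{
    int open = location.LastIndexOf('(');
    int close = location.LastIndexOf(')');
    return int.Parse(location.Substring(open + 1, close - open - 1));
}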
In Revit 2013 I have a tool that I'm making that copies dimensions from one drafting view to another. I've gotten it to properly create a new version of a dimension, including the Curve, DimensionType, and References, but I'm having trouble with the Above, Below, Prefix, and Suffix properties. They copy just fine if at least one of them has a value. However, if none of them has a value, an AccessViolationException is thrown when I try to access them. I have tried to catch that exception, but it bubbles up and crashes Revit (I'm assuming it's caused by native code that fails).
How can I check to see if these properties have any value when I do my copying without triggering this AccessViolationException?
Autodesk Discussion Group Question
The DimensionData class is my own used for storing the dimension information so that it can be used to create the dimension in a separate document.
private IEnumerable<DimensionData> GetDimensionDataSet(Document document,
View view)
{
if (document == null)
throw new ArgumentNullException("document");
if (view == null)
throw new ArgumentNullException("view");
List<DimensionData> dimensionDataSet = new List<DimensionData>();
FilteredElementCollector dimensionCollector =
new FilteredElementCollector(document, view.Id);
dimensionCollector.OfClass(typeof(Dimension));
foreach (Dimension oldDimension in dimensionCollector)
{
Line oldDimensionLine = (Line)oldDimension.Curve;
string dimensionTypeName = oldDimension.DimensionType.Name;
List<ElementId> oldReferences = new List<ElementId>();
foreach (Reference oldReference in oldDimension.References)
oldReferences.Add(oldReference.ElementId);
DimensionData dimensionData;
try
{
string prefix = oldDimension.Prefix;
dimensionData = new DimensionData(oldDimensionLine,
oldReferences,
dimensionTypeName,
prefix,
oldDimension.Suffix,
oldDimension.Above,
oldDimension.Below);
}
catch (AccessViolationException)
{
dimensionData = new DimensionData(oldDimensionLine,
oldReferences, dimensionTypeName);
}
dimensionDataSet.Add(dimensionData);
}
return dimensionDataSet;
}
Regarding transactions: As far as I'm aware, you are only required to be inside a transaction when you are making any sort of CHANGE (modifications, deletions, additions). If all you are doing is collecting dimension information, you would not need a transaction, but when you use that information to create new dimensions in another document, that code would have to be inside a transaction. I have had a number of programs under development which did not yet modify the document but simply collected parameter settings and posted them to a TaskDialog.Show(). These programs worked fine, and I don't see anything in your code that actually modifies your model, so that doesn't seem to be your issue.
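For reference, the modifying half would typically be wrapped like this (targetDocument is a placeholder for the document you're copying into):

// Minimal Revit transaction sketch for the code that does modify a document:
// creating the new dimensions from the collected DimensionData.
using (Transaction transaction = new Transaction(targetDocument, "Copy dimensions"))
{
    transaction.Start();
    // ... create the dimensions here ...
    transaction.Commit();
}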
It seems like a bug.
Can you post the issue to ADN Support?
The solution I can suggest is to use the Parameters of the Dimension element instead of the Dimension class properties.
For example, you can get Suffix and Prefix with the following code:
var suffixParameter =
oldDimension.get_Parameter(BuiltInParameter.SPOT_SLOPE_SUFFIX);
string suffix = null;
if (suffixParameter != null)
{
suffix = suffixParameter.AsString();
}
var prefixParameter =
oldDimension.get_Parameter(BuiltInParameter.SPOT_SLOPE_PREFIX);
string prefix = null;
if (prefixParameter != null)
{
prefix = prefixParameter.AsString();
}
Unfortunately, I can't tell you how to get the Above and Below properties via parameters, because I don't have a project to test with. But you can easily determine the parameters using the BuiltInParameter Checker.
Hope it helps.
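As a side note on why the try/catch in the question never fires: on .NET 4, which Revit 2013 runs on, AccessViolationException is a corrupted-state exception and is not delivered to catch blocks unless the method opts in. A minimal sketch of the opt-in (whether it behaves well inside Revit's add-in host is an assumption I can't verify):

using System.Runtime.ExceptionServices;
using System.Security;

// Opt this method in to receive corrupted-state exceptions such as
// AccessViolationException; without the attribute, the catch is skipped.
[HandleProcessCorruptedStateExceptions]
[SecurityCritical]
private IEnumerable<DimensionData> GetDimensionDataSet(Document document, View view)
{
    // ... same body as in the question ...
}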
I'm trying to save a TFS Work Item programmatically but always get the exception:
TF237124: Work Item is not ready to save
Now, I understand what this is telling me - that the Work Item is missing a required field or similar - and my code anticipates this by calling:
ArrayList ValidationResult = wi.Validate();
before the save. However, my ArrayList contains no elements following this call.
I've tried logging in to the TFS web interface using the same credentials and creating a Work Item that way which works fine.
How can I discover why my Work Item won't save? Here's my code:
// Get a reference to the team project collection (authenticating as a generic service account).
using (var tfs = new TfsTeamProjectCollection(tfsuri, new System.Net.NetworkCredential("My_User", "password")))
{
    tfs.EnsureAuthenticated();
    var workItemStore = GetWorkItemStore(tfs);

    // Create a new work item of the requested type.
    WorkItem wi = new WorkItem(GetWorkItemType(type, workItemStore));

    // Values are supplied as KVPs: field name/value.
    foreach (KeyValuePair<string, string> kvp in values)
    {
        if (wi.Fields.Contains(kvp.Key))
        {
            wi.Fields[kvp.Key].Value = kvp.Value;
        }
    }

    ArrayList ValidationResult = wi.Validate();
    if (ValidationResult.Count == 0)
    {
        wi.State = wi.GetNextState("Microsoft.VSTS.Actions.Checkin");
        wi.Save();
        return wi.Id;
    }
    else
    {
        return 0;
    }
}
You are validating the work item before you change its state. Transitioning to a new state can cause work item template actions/rules to be processed. These could change the values of some of your fields and/or add new rules to the fields, which would cause the previously valid data to become invalid.
For example, moving from an Open state to a Closed state might require someone to complete a "Review" field; if it's empty, the work item cannot transition.
Try validating after the State change and see if there are any failures.
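A sketch of that reordering applied to the code in the question (Validate() returns the Field objects that failed, so logging their Status also answers the "how can I discover why" part):

// Transition first, then validate, so rules triggered by the state change
// are included in the check.
wi.State = wi.GetNextState("Microsoft.VSTS.Actions.Checkin");
ArrayList validationResult = wi.Validate();
if (validationResult.Count == 0)
{
    wi.Save();
    return wi.Id;
}

// Each entry is a Field; its Status says why the field is invalid.
foreach (Field invalidField in validationResult)
{
    Console.WriteLine("{0}: {1}", invalidField.Name, invalidField.Status);
}
return 0;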
I am using C# and Tridion (CMS) classes to fetch data from Tridion. Below is the code to get the full list of publications from Tridion.
protected void btnPublishPublicationList_Click(object sender, EventArgs e)
{
try
{
PublicationBL pubBL = new PublicationBL();
TridionCollection<Publication> pubAllList = pubBL.getAllPublicationList();
List<PublicationsBO> pubBos = new List<PublicationsBO>();
foreach (Publication pub in pubAllList)
{
if ((pub.Title.StartsWith("07"))||(pub.Title.StartsWith("08")))
{
PublicationsBO pubBO = new PublicationsBO();
pubBO.publicationID = pub.ID;
pubBO.publicationName = pub.Title;
pubBos.Add(pubBO);
}
}
pubBL.createPublicationListXML(pubBos);
}
catch (Exception ex)
{
log.Error(ex.Message);
}
}
In the above code, on the button click, I am using .NET code and a Tridion class to get the full publication list:
TridionCollection<Publication> pubAllList = pubBL.getAllPublicationList();
I get the full publication list from Tridion very quickly; however, when I enter the foreach loop below, the process gets stuck and takes a long time:
foreach (Publication pub in pubAllList)
{
if ((pub.Title.StartsWith("07"))||(pub.Title.StartsWith("08")))
{
PublicationsBO pubBO = new PublicationsBO();
pubBO.publicationID = pub.ID;
pubBO.publicationName = pub.Title;
pubBos.Add(pubBO);
}
}
After debugging I found that it is the foreach (Publication pub in pubAllList) line that takes all the time. I think constructing each Publication object is what's expensive, and that is a Tridion class.
Please suggest another way to do this, or point out what is wrong in the above code.
Thanks.
It is indeed because of Tridion's lazy-loading. If all you need is a list of Publication IDs and Titles, I'd recommend using:
TDSE tdse = new TDSEClass();
XmlDocument publicationList = new XmlDocument();
publicationList.LoadXml(tdse.GetListPublications(ListColumnFilter.XMLListIDAndTitle));
This will give you an XML document containing a list of all publications (/tcm:ListPublications/tcm:Item), and each Item will have Title and ID attributes.
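Reading them back out is straightforward; a short sketch (the tcm namespace URI is an assumption here, so check the document's own declaration):

// Walk the tcm:Item nodes and pull out the ID and Title attributes.
var nsmgr = new XmlNamespaceManager(publicationList.NameTable);
nsmgr.AddNamespace("tcm", "http://www.tridion.com/ContentManager/5.0");
foreach (XmlElement item in publicationList.SelectNodes("/tcm:ListPublications/tcm:Item", nsmgr))
{
    string id = item.GetAttribute("ID");
    string title = item.GetAttribute("Title");
}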
If you need more detail than just ID and Title, then you will have to load each publication individually, which you can do by passing the ID attribute to tdse.GetObject().
Hope this helps.
N
According to this, which I think is the developer's site, it looks like getAllPublicationList uses some sort of lazy loading, so even though you have the collection, you don't really have the items in it.
It appears that you can set filters on the collection up front, rather than after the fact, so that it only loads the records you are interested in.
To clarify, lazy loading means that when the collection is returned, the data needed to populate its items has not yet been loaded. It isn't until you access an item in the collection that its data is actually loaded (e.g. by creating a Publication object). The purpose of a lazy collection is to allow filtering on it so that unnecessary, expensive loads can be avoided.
HTH