.NET Graph SDK Updating Choice Column values in Sharepoint Online - c#

I would like to update the value of a "choice column", but when I call the UpdateAsync method it throws an exception with the message "Code: invalidRequest - Message: Invalid request".
In previous versions of SharePoint, the values of choice columns were separated by ";#" characters, but with Microsoft Graph and SharePoint Online this seems to have changed to an array of values. At least I think so...
Any ideas on how to solve this problem?
I am using the following code:
var fieldValueSet = new FieldValueSet
{
    AdditionalData = new Dictionary<string, object>()
    {
        { "Field1", "Test1" },
        { "Field2", new List<string> { "Test2-A", "Test2-B", "Test3-C" } }
    }
};
await graphClient
    .Sites["{site-id}"]
    .Lists["{list-id}"]
    .Items["{listItem-id}"]
    .Fields
    .Request()
    .UpdateAsync(fieldValueSet);

You need to add an annotation specifying that the value for Field2 is an array of strings: Collection(Edm.String).
Also, send the array of values as a string: "[\"Test2-A\",\"Test2-B\",\"Test3-C\"]"
var fieldValueSet = new FieldValueSet
{
    AdditionalData = new Dictionary<string, object>()
    {
        { "Field1", "Test1" },
        { "Field2#odata.type", "Collection(Edm.String)" },
        { "Field2", "[\"Test2-A\",\"Test2-B\",\"Test3-C\"]" }
    }
};

Related

DynamoDB - Update item but retain database's current version of a field in that item

I want to update one or more fields of an item in DynamoDB while keeping the database's current values for the others. Is this possible using context.SaveAsync(item) or another method in the AWS SDK for .NET, or do I have to read the entry from the database first, apply my changes, and then perform the save (which is still prone to race conditions)?
The reason is that I have a counter that is incremented by other users modifying the item, while the owner of the item wishes to change other fields. The owner may not have the most recent version of the entry (other users may have modified that counter since the owner retrieved it), so I want to save the owner's edits but retain the database's value of the counter.
You can perform updates to individual DynamoDB attributes using the UpdateItemAsync method.
Use UpdateExpression, ConditionExpression, ExpressionAttributeNames, and ExpressionAttributeValues in the UpdateItemRequest to configure your updates.
var updateItemRequest = new UpdateItemRequest
{
    TableName = "DDBTableName",
    Key = new Dictionary<string, AttributeValue>()
    {
        { "key_name", new AttributeValue { S = "Value" } }
    },
    ExpressionAttributeNames = new Dictionary<string, string>() { { "#C", "ColumnName" } },
    ExpressionAttributeValues = new Dictionary<string, AttributeValue>()
    {
        { ":unlock", new AttributeValue { BOOL = false } },
        { ":lock", new AttributeValue { BOOL = true } }
    },
    // Only update when the attribute exists and is currently set to :lock
    ConditionExpression = "attribute_exists(#C) AND #C = :lock",
    UpdateExpression = "SET #C = :unlock",
    ReturnValues = ReturnValue.ALL_NEW
};
var updatedItemResponse = await dynamoDBClient.UpdateItemAsync(updateItemRequest);
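Because ReturnValues is set to ALL_NEW, the response carries the item's attributes as they are after the update, so no second read is needed. A minimal sketch of reading them back (the attribute name "ColumnName" is the one mapped to the #C placeholder above):

```csharp
// ReturnValue.ALL_NEW makes DynamoDB return the full item after the update,
// so the new attribute values can be inspected directly from the response.
if (updatedItemResponse.Attributes.TryGetValue("ColumnName", out var updated))
{
    Console.WriteLine($"ColumnName is now: {updated.BOOL}");
}
```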

C# to Sort by Last Created (oldest record) & Limit results to 20 records from DynamoDB Table

How can I sort by last created (oldest record first) and limit the results to 20 records from a DynamoDB table using the BatchGetItemAsync method? Thanks in advance.
var table = Table.LoadTable(client, TableName);
var request = new BatchGetItemRequest
{
    RequestItems = new Dictionary<string, KeysAndAttributes>()
    {
        {
            TableName,
            new KeysAndAttributes
            {
                AttributesToGet = new List<string> { "ID", "Status", "Date" },
                Keys = new List<Dictionary<string, AttributeValue>>()
                {
                    new Dictionary<string, AttributeValue>()
                    {
                        { "Status", new AttributeValue { S = "Accepted" } }
                    }
                }
            }
        }
    }
};
var response = await client.BatchGetItemAsync(request);
var results = response.Responses;
var result = results[TableName];
There isn't a way to do what you're asking for with BatchGetItemAsync. That call retrieves specific records when you already know their keys. You'll need to use a Query to do this, and you'll want to store your data in a structure that supports this access pattern. There was a really great session on DynamoDB access patterns at re:Invent 2018; I suggest watching it: https://www.youtube.com/watch?v=HaEPXoXVf2k
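For illustration, a Query against a hypothetical index whose partition key is Status and whose sort key is the creation date could look like the sketch below; the index name and attribute names are assumptions, not part of the original question:

```csharp
// Hypothetical GSI "Status-Date-index": partition key = Status, sort key = Date.
// ScanIndexForward = true returns items in ascending sort-key order
// (oldest first); Limit caps the page at 20 items.
var queryRequest = new QueryRequest
{
    TableName = "DDBTableName",
    IndexName = "Status-Date-index",
    KeyConditionExpression = "#s = :status",
    ExpressionAttributeNames = new Dictionary<string, string> { { "#s", "Status" } },
    ExpressionAttributeValues = new Dictionary<string, AttributeValue>
    {
        { ":status", new AttributeValue { S = "Accepted" } }
    },
    ScanIndexForward = true,
    Limit = 20
};
var queryResponse = await client.QueryAsync(queryRequest);
```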

How to create a suggester in azure search index

When I try to create a suggester on an index using the .NET SDK I get an error.
I can create the index successfully using the .NET SDK, but when I try to add a suggester it fails.
My index code:
var index = new Index()
{
    Name = "customeridex",
    Fields = FieldBuilder.BuildForType<AutocompleteResponseDetail>(),
    Suggesters = new List<Suggester>()
    {
        new Suggester()
        {
            Name = "cg",
            SourceFields = new string[] { "Title", "Description" }
        }
    }
};
The error message I get:
'The request is invalid. Details: definition: One or more fields in suggester 'cg' are not defined as a field in the index. Fields: Title, Description.'
Although I do have the fields Title and Description in my index.
Try this:
var definition = new Index()
{
    Name = "customeridex",
    Fields = FieldBuilder.BuildForType<AutocompleteResponseDetail>(),
    Suggesters = new List<Suggester> { new Suggester("cg", "Title", "Description") }
};
I have tested on my side and it works for me.
My bad, it was a casing error. The source field names above need to be all lowercase to match the index schema.
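A sketch of the corrected definition under that assumption (field names lowercased to match the schema that FieldBuilder generated):

```csharp
// Assuming the generated index schema uses lowercase field names,
// the suggester's source fields must match that casing exactly:
var definition = new Index()
{
    Name = "customeridex",
    Fields = FieldBuilder.BuildForType<AutocompleteResponseDetail>(),
    Suggesters = new List<Suggester> { new Suggester("cg", "title", "description") }
};
```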

.NET Graph SDK Updating Sharepoint Online List Item Values

I'm trying to add values to a custom column on a list item after uploading the list item to the list. I can get the item into the list, and I can query the list and get back the item's data, but when I try to add the data for the extra field I get the following Microsoft.SharePoint.Client.InvalidClientQueryException error:
A value without a type name was found and no expected type is available.
When the model is specified, each value in the payload must have a type which can be either
specified in the payload, explicitly by the caller or implicitly inferred from the parent value.
I'm not sure what value or model the error message is referring to. This is my code:
var item = await graphClient
    .Drives[driveId]
    .Root.ItemWithPath(fileName)
    .ListItem.Request()
    .Select("WebURL,Fields,SharepointIds")
    .Expand("Fields")
    .GetAsync();
var fieldVals = await graphClient
    .Sites[SPUrl + ":"]
    .Sites[SpPath + ":"]
    .Lists[libId]
    .Items[item.SharepointIds.ListItemId]
    .Fields
    .Request()
    .GetAsync();
fieldVals.AdditionalData.Add("Phase", JsonConvert.SerializeObject(tags));
await graphClient
    .Drives[driveId]
    .Root
    .ItemWithPath(fileName)
    .ListItem
    .Fields
    .Request()
    .UpdateAsync(fieldVals);
Originally, when I called fieldVals.AdditionalData.Add(), I passed "Phase" and a List<string>, which caused an error about the OData field type not being set, but I haven't found anywhere in the documentation that says what the expected OData field values are. I tried setting it to microsoft.graph.fieldValueSet but that didn't work.
I'm trying to update a Choice column that allows multiple choices as checkboxes.
For a multi-choice field type, the odata.type annotation is indeed mandatory in the request payload. Here is an example of how to specify it:
PATCH https://graph.microsoft.com/v1.0/sites/{site-id}/lists/{list-id}/items/{item-id}/
{
    "fields": {
        "<ChoiceFieldName>#odata.type": "Collection(Edm.String)",
        "<ChoiceFieldName>": ["<val1>", "<val2>"]
    }
}
where
ChoiceFieldName - the name of the choice field
val1, val2 - the field values
Example
Assuming a List contains a choice field named Categories, then the following example demonstrates how to update list item via msgraph-sdk-dotnet:
var choiceVals = new[] { "Cat1", "Cat2" };
await graphClient.Sites[siteId].Lists[listId].Items[itemId].Request().UpdateAsync(new ListItem()
{
    Fields = new FieldValueSet
    {
        AdditionalData = new Dictionary<string, object>
        {
            { "Categories#odata.type", "Collection(Edm.String)" },
            { "Categories", choiceVals }
        }
    }
});
References
Update an item in a list
Entity Data Model: Primitive Data Types

Insert List<object> into Elasticsearch with NEST

I have a lot of different collections of values that I generate at runtime and want to send to Elasticsearch. I can represent them as List<object>, or, if it really doesn't work any other way, as List<string>. But I can't find any example of how to do that. Here is an example of code that doesn't work. There is probably a lot wrong with it, so any additional pointers are highly appreciated.
var client = new ElasticClient(new Uri("http://localhost:9200"));
client.CreateIndex("testentry");
var values = new List<object> {"StringValue", 123, DateTime.Now};
var indexResponse = client.Index(values, descriptor => descriptor.Index("testentry"));
Console.WriteLine(indexResponse.DebugInformation);
Which results in:
Invalid NEST response built from a unsuccessful low level call on POST: /testentry/list%601
# Audit trail of this API call:
- [1] BadResponse: Node: http://localhost:9200/ Took: 00:00:00.0600035
# ServerError: ServerError: 400Type: mapper_parsing_exception Reason: "failed to parse" CausedBy: "Type: not_x_content_exception Reason: "Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes""
and
[2016-09-17 14:16:20,955][DEBUG][action.index ] [Gin Genie] failed to execute [index {[t
estentry][list`1][AVc4E3HaPglqpoLcosDo], source[_na_]}] on [[testentry][1]]
MapperParsingException[failed to parse]; nested: NotXContentException[Compressor detection can only
be called on some xcontent bytes or compressed xcontent bytes];
at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:156)
I'm using Elasticsearch.Net 2.4.3 and NEST 2.4.3.
In addition to Henrik's answer, you could also index the values in a Dictionary<string, object>:
public class MyType
{
    public MyType()
    {
        Values = new Dictionary<string, object>();
    }

    public Dictionary<string, object> Values { get; private set; }
}

void Main()
{
    var pool = new SingleNodeConnectionPool(new Uri("http://localhost:9200"));
    var connectionSettings = new ConnectionSettings(pool);
    var client = new ElasticClient(connectionSettings);
    var myType = new MyType
    {
        Values =
        {
            { "value1", "StringValue" },
            { "value2", 123 },
            { "value3", DateTime.Now },
        }
    };
    client.Index(myType, i => i.Index("index-name"));
}
The Dictionary<string, object> will be serialized to a JSON object with property names matching the dictionary keys:
{
    "values": {
        "value1": "StringValue",
        "value2": 123,
        "value3": "2016-09-18T18:41:48.7344837+10:00"
    }
}
Within Elasticsearch, the mapping will be inferred as an object type.
Arrays with a mixture of datatypes are not supported.
You could convert all of the values to strings:
client.CreateIndex("testentry");
var values = new List<string> { "StringValue", "123", DateTime.Now.ToString() };
var indexResponse = client.Index(new { Values = values}, descriptor => descriptor.Index("testentry").Type("test"));
Or specify the fields that the values should be indexed to:
client.CreateIndex("testentry");
var values = new { Field1 = "StringValue", Field2 = 123, Field3 = DateTime.Now };
var indexResponse = client.Index(values, descriptor => descriptor.Index("testentry").Type("test"));
Consider specifying the type of the document with the IndexDescriptor, or create a class for the document.
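As a sketch of the class-based approach (the class name and properties are illustrative, not from the original question), defining a POCO lets NEST infer a consistent mapping from the property types:

```csharp
// Hypothetical POCO for the document; NEST infers the field mapping
// from the property types (string, int, DateTime).
public class TestEntry
{
    public string Field1 { get; set; }
    public int Field2 { get; set; }
    public DateTime Field3 { get; set; }
}

var entry = new TestEntry
{
    Field1 = "StringValue",
    Field2 = 123,
    Field3 = DateTime.Now
};
var response = client.Index(entry, i => i.Index("testentry"));
```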
