I have an application that receives Shopify orders via Shopify webhooks. When I had only one client, I mapped the Shopify order to my object properties the traditional way and saved it into the database.
Now I have two more clients, and each client wants the class fields mapped differently.
Let's suppose I got a Shopify order object with the following details:
{
    "id": 4655316533326,
    "checkout_id": 29212941516878,
    "user_id": 71894499406,
    "note": "hello world",
    "subtotal_price_set": {
        "shop_money": {
            "amount": "109.00",
            "currency_code": "USD"
        }
    },
    "total_tax": "0.00",
    "total_tax_set": {
        "shop_money": {
            "amount": "0.00",
            "currency_code": "USD"
        },
        "presentment_money": {
            "amount": "0.00",
            "currency_code": "USD"
        }
    }
}
I also have a class named Order in my application, which contains the following properties:
public class Order {
    public string OrderNo { get; set; }
    public string DebtorID { get; set; }
    public virtual string DeliveryAddress1 { get; set; }
    public virtual string DeliveryAddress2 { get; set; }
    public virtual string RIRTNo { get; set; }
    public List<OrderLine> Lines { get; set; }
    public string OrderNote { get; set; }
    public Customer Customer { get; set; }
}
Now some customers want the Shopify order's note property mapped to my Order.Customer.Note, and one wants it mapped to the order itself (the OrderNote property).
The same goes for the order number: one wants it mapped directly to Order.OrderNo, and others want it mapped to RIRTNo.
How to handle this kind of situation?
I created a table in the database to keep this mapping information.
I can get the customer details from the Shopify order and then query the database for the mapping information, but the mapping is stored in a dictionary-like structure.
Consider the mapping info below for customer X (the key is a Shopify object property):
"Order": {
"note": "Customer.Note"
"Id": "OrderId"
}
The key is the Shopify property name and the value is the property path on my application object, but I can't figure out how to perform the mapping based on this information.
First, you can build an admin web form for this task.
We need a table; let's call it MappingInfo.
The idea is that for each field you have two or three options for mapping it to the other class,
so you insert all of your customers' mapping options into this table.
After that, you use AutoMapper like this:
// customerId identifies the customer whose mapping should be applied
var config = new MapperConfiguration(GetMapperConfigurationExpression(customerId));
var _mapper = config.CreateMapper();

private MapperConfigurationExpression GetMapperConfigurationExpression(int customerId)
{
    var mapperConfigurationExpression = new MapperConfigurationExpression();

    // Load the row from the MappingInfo table where MappingInfo.CustomerId == customerId
    // (from the parameters) and MappingInfo.PropertyName == "AppId"; call it mappingInfo.
    if (mappingInfo.OptionId == 1)
        mapperConfigurationExpression.CreateMap<srcMyClass, destMyClass>()
            .ForPath(dest => dest.AppId.Code, opt => opt.MapFrom(src => src.InvoiceId));

    if (mappingInfo.OptionId == 2)
        mapperConfigurationExpression.CreateMap<srcMyClass, destMyClass>()
            .ForPath(dest => dest.AppId.Code, opt => opt.MapFrom(src => src.LogicalId));

    return mapperConfigurationExpression;
}
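Once the configuration for the incoming order's customer has been built as above, the deserialized Shopify payload is mapped with it; a hypothetical call (shopifyOrder standing in for whatever srcMyClass actually is):

// Hypothetical usage for one incoming webhook
var mappedOrder = _mapper.Map<destMyClass>(shopifyOrder);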
Basically, this depends on what each customer wants.
If you don't have any mapping mechanism yet,
you can divide your customers into three or four groups (or more).
Then you can use a factory design pattern (a rough sketch follows the example below),
or simply an "if" statement based on a new field, "CustomerGroupId", which should come to your code with the customer entity.
Then:
if (Customer.CustomerGroupId == 1)
{
// do mapping like Order.Note = shopify.Note
}
else if (Customer.CustomerGroupId == 2)
{
// do mapping like Order.Customer.Note = shopify.Note
}
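If you go the factory route instead, a minimal sketch might look like this (ShopifyOrder stands for whatever type you deserialize the webhook payload into; the group mappers and their field choices are only examples):

public interface IOrderMapper
{
    Order Map(ShopifyOrder source);
}

public class Group1OrderMapper : IOrderMapper
{
    public Order Map(ShopifyOrder source)
    {
        // Group 1: note goes to Order.OrderNote, id goes to Order.OrderNo.
        return new Order { OrderNote = source.Note, OrderNo = source.Id.ToString() };
    }
}

public class Group2OrderMapper : IOrderMapper
{
    public Order Map(ShopifyOrder source)
    {
        // Group 2: note goes to Order.Customer.Note, id goes to Order.RIRTNo.
        return new Order { RIRTNo = source.Id.ToString(), Customer = new Customer { Note = source.Note } };
    }
}

public static class OrderMapperFactory
{
    public static IOrderMapper Create(int customerGroupId)
    {
        if (customerGroupId == 1) return new Group1OrderMapper();
        if (customerGroupId == 2) return new Group2OrderMapper();
        throw new ArgumentOutOfRangeException(nameof(customerGroupId));
    }
}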
A very simple option is to use the "mapping info" within the Order class, and then modify the OrderNote property to use the mapping for getting/setting the note.
For example, create a method that provides the mapping:
private string _map = "note"; // default value when SetMap is not used
public void SetMap(string map)
=> _map = map;
And then OrderNote becomes:
private string _note;
public string OrderNote {
get => GetNote();
set => SetNote(value);
}
private string GetNote()
{
if(_map == "note") return _note;
if(_map == "Customer.Note") return Customer.Note;
throw new ArgumentOutOfRangeException("Invalid _map");
}
private void SetNote(string value)
{
if(_map == "note") { _note = value; return; }
if(_map == "Customer.Note") { Customer.Note = value; return; }
throw new ArgumentOutOfRangeException("Invalid _map");
}
Use the same pattern for each type of mapping.
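For example, assuming the mapping value for the current customer has been loaded from the database into a string (the variable names here are hypothetical):

// mappingForNote is whatever was stored for this customer, e.g. "note" or "Customer.Note".
var order = new Order { Customer = new Customer() };
order.SetMap(mappingForNote);
order.OrderNote = shopifyOrder.Note; // routed to the backing field or to Customer.Note by SetNote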
Related
I have a MinionLeaders table with a one-to-many relationship to a Minions table. A minion can lead n minions, or it can lead itself if it doesn't have any members.
The problem is that when I try to GET a list of minion leaders with their members as JSON, the response comes back truncated and malformed:
[
{
"LeaderID":"ABCD000000000001",
"Name":"Foo",
"Members":
[
{
"ID":"EEEE000000000001",
"Name":"Fubar",
"LeaderID":"ABCD000000000001"
Also, the server throws an error:
Newtonsoft.Json.JsonSerializationException: Self referencing loop detected for property 'minionLeader' with type 'rt5_app.Models.MinionLeaders'. Path '[0].Members[0]'.
How to properly return the data?
And here's how I fetch the data from the database, in MinionLeadersRepository.cs:
public async Task<IEnumerable<MinionLeaders>> ListAsync()
{
return await context.MinionLeaders.Include(p => p.Members)
.ToListAsync();
}
ApiController.cs:
public async Task<IEnumerable<MinionLeaders>> ListAsync()
{
return await _minionLeaders.ListAsync();
}
MinionLeadersModel.cs
public class MinionLeaders
{
[Key]
[MinLength(16)]
[MaxLength(16)]
public string LeaderID { get; set; }
[MinLength(16)]
[MaxLength(16)]
public string Name { get; set; }
public virtual List<Minion> Members { get; set; }
}
I found out that this was a serializer problem.
I resolved it by setting the serializer settings to ignore reference loops in Startup.cs:
services.AddMvc()
.AddJsonOptions(x => x.SerializerSettings.ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore);
But I'm not sure if this is the real solution or just a workaround.
I currently have a Web API that implements a RESTful API. The model for my API looks like this:
public class Member
{
public string FirstName { get; set; }
public string LastName { get; set; }
public DateTime Created { get; set; }
public DateTime BirthDate { get; set; }
public bool IsDeleted { get; set; }
}
I've implemented a PUT method for updating a row similar to this (for brevity, I've omitted some non-relevant stuff):
[Route("{id}")]
[HttpPut]
public async System.Threading.Tasks.Task<HttpResponseMessage> UpdateRow(int id,
[FromBody]Models.Member model)
{
// Do some error checking
// ...
// ...
var myDatabaseEntity = new BusinessLayer.Member(id);
myDatabaseEntity.FirstName = model.FirstName;
myDatabaseEntity.LastName = model.LastName;
myDatabaseEntity.Created = model.Created;
myDatabaseEntity.BirthDate = model.BirthDate;
myDatabaseEntity.IsDeleted = model.IsDeleted;
await myDatabaseEntity.SaveAsync();
return Request.CreateResponse(HttpStatusCode.OK);
}
Using PostMan, I can send the following JSON and everything works fine:
{
    "firstName": "Sara",
    "lastName": "Smith",
    "created": "2018/05/10",
    "birthDate": "1977/09/12",
    "isDeleted": false
}
If I send this as my body to http://localhost:8311/api/v1/Member/12 as a PUT request, the record in my database with an ID of 12 gets updated to what you see in the JSON.
What I would like to do though is implement a PATCH verb where I can do partial updates. If Sara gets married, I would like to be able to send this JSON:
{
    "lastName": "Jones"
}
I would like to be able to send just that JSON and update JUST the LastName field and leave all the other fields alone.
I tried this:
[Route("{id}")]
[HttpPatch]
public async System.Threading.Tasks.Task<HttpResponseMessage> UpdateRow(int id,
[FromBody]Models.Member model)
{
}
My problem is that this returns all the fields in the model object (all of them are nulls except the LastName field), which makes sense since I am saying I want a Models.Member object. What I would like to know is if there is a way to detect which properties have actually been sent in the JSON request so I can update just those fields?
I hope this helps; it uses Microsoft's JsonPatchDocument.
.NET Core 2.1 PATCH action in a controller:
[HttpPatch("{id}")]
public IActionResult Patch(int id, [FromBody]JsonPatchDocument<Node> value)
{
try
{
//nodes collection is an in memory list of nodes for this example
var result = nodes.FirstOrDefault(n => n.Id == id);
if (result == null)
{
return BadRequest();
}
value.ApplyTo(result, ModelState);//result gets the values from the patch request
return NoContent();
}
catch (Exception ex)
{
return StatusCode(StatusCodes.Status500InternalServerError, ex);
}
}
Node Model class:
[DataContract(Name ="Node")]
public class Node
{
[DataMember(Name = "id")]
public int Id { get; set; }
[DataMember(Name = "node_id")]
public int Node_id { get; set; }
[DataMember(Name = "name")]
public string Name { get; set; }
[DataMember(Name = "full_name")]
public string Full_name { get; set; }
}
A valid PATCH JSON body to update just the "full_name" and "node_id" properties will be an array of operations like:
[
{ "op": "replace", "path": "full_name", "value": "NewNameWithPatch"},
{ "op": "replace", "path": "node_id", "value": 10}
]
As you can see, "op" is the operation you would like to perform. The most common one is "replace", which simply replaces the existing value of that property with the new one, but there are others:
[
{ "op": "test", "path": "property_name", "value": "value" },
{ "op": "remove", "path": "property_name" },
{ "op": "add", "path": "property_name", "value": [ "value1", "value2" ] },
{ "op": "replace", "path": "property_name", "value": 12 },
{ "op": "move", "from": "property_name", "path": "other_property_name" },
{ "op": "copy", "from": "property_name", "path": "other_property_name" }
]
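For reference, the same kind of patch can also be built in C# (for example in a client or a test) with JsonPatchDocument<Node>; a small sketch mirroring the first example above:

// Builds the equivalent of the two "replace" operations shown earlier.
var patchDocument = new JsonPatchDocument<Node>();
patchDocument.Replace(n => n.Full_name, "NewNameWithPatch");
patchDocument.Replace(n => n.Node_id, 10);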
Here is an extension method I built, based on the PATCH ("replace") specification, that uses reflection to serialize any object into a set of "replace" operations. You can also pass the desired Encoding, and it returns the HttpContent (StringContent) ready to be sent to httpClient.PatchAsync(endPoint, httpContent):
// The enclosing static class (the name here is arbitrary) is required for extension methods.
public static class PatchExtensions
{
    public static StringContent ToPatchJsonContent(this object node, Encoding enc = null)
    {
        var patchObjectsCollection = new List<PatchObject>();

        foreach (var prop in node.GetType().GetProperties())
        {
            var patch = new PatchObject { Op = "replace", Path = prop.Name, Value = prop.GetValue(node) };
            patchObjectsCollection.Add(patch);
        }

        var payloadStream = new MemoryStream();
        var serializer = new DataContractJsonSerializer(patchObjectsCollection.GetType());
        serializer.WriteObject(payloadStream, patchObjectsCollection);

        Encoding encoding = enc ?? Encoding.UTF8;
        var content = new StringContent(Encoding.UTF8.GetString(payloadStream.ToArray()), encoding, "application/json");

        return content;
    }
}
Notice that it also uses this class I created to serialize the PatchObject using DataContractJsonSerializer:
[DataContract(Name = "PatchObject")]
class PatchObject
{
[DataMember(Name = "op")]
public string Op { get; set; }
[DataMember(Name = "path")]
public string Path { get; set; }
[DataMember(Name = "value")]
public object Value { get; set; }
}
A C# example of how to use the extension method and invoking the Patch request using HttpClient:
var nodeToPatch = new { Name = "TestPatch", Private = true };//You can use anonymous type
HttpContent content = nodeToPatch.ToPatchJsonContent();//Invoke the extension method to serialize the object
HttpClient httpClient = new HttpClient();
string endPoint = "https://localhost:44320/api/nodes/1";
var response = httpClient.PatchAsync(endPoint, content).Result;
Thanks
PATCH operations aren't usually defined using the same model as the POST or PUT operations, exactly for that reason: how do you differentiate between a null and a "don't change"? From the IETF:
With PATCH, however, the enclosed entity contains a set of
instructions describing how a resource currently residing on the
origin server should be modified to produce a new version.
You can look here for their PATCH suggestion, but in summary it is:
[
{ "op": "test", "path": "/a/b/c", "value": "foo" },
{ "op": "remove", "path": "/a/b/c" },
{ "op": "add", "path": "/a/b/c", "value": [ "foo", "bar" ] },
{ "op": "replace", "path": "/a/b/c", "value": 42 },
{ "op": "move", "from": "/a/b/c", "path": "/a/b/d" },
{ "op": "copy", "from": "/a/b/d", "path": "/a/b/e" }
]
Tipx's answer re using PATCH is spot on, but as you've probably already found, actually achieving that in a statically typed language like C# is a non-trivial exercise.
In the case where you're using a PATCH to represent a set of partial updates for a single domain entity (e.g. to update the first name and last name only for a contact with many more properties), you need to do something along the lines of looping over each instruction in the PATCH request and then applying that instruction to an instance of your class.
Applying an individual instruction will then consist of:
Finding the property of the instance that matches the name in the instruction, or handling property names you weren't expecting
For an update: trying to parse the value submitted in the patch into the instance property, and handling the error if e.g. the instance property is a bool but the patch instruction contains a date
Deciding what to do with Add instructions, as you can't add new properties to a statically typed C# class. One approach is to say that Add means "set the value of the instance's property only if the property's existing value is null"
A rough sketch of the "replace" case is shown after this list.
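As a rough, untested sketch of the "replace" case only (PatchInstruction is a made-up DTO for one operation, and the value conversion is deliberately naive):

public class PatchInstruction
{
    public string Op { get; set; }
    public string Path { get; set; }
    public object Value { get; set; }
}

// Requires System, System.Collections.Generic and System.Reflection.
public static void ApplyReplaceInstructions(object target, IEnumerable<PatchInstruction> instructions)
{
    foreach (var instruction in instructions)
    {
        // Only "replace" is handled here; add/remove/move/copy/test are left as an exercise.
        if (!string.Equals(instruction.Op, "replace", StringComparison.OrdinalIgnoreCase))
            continue;

        // Find the property matching the instruction path (e.g. "/lastName" or "lastName").
        var propertyName = instruction.Path.TrimStart('/');
        var property = target.GetType().GetProperty(propertyName,
            BindingFlags.Public | BindingFlags.Instance | BindingFlags.IgnoreCase);
        if (property == null)
            throw new InvalidOperationException("Unknown property: " + propertyName);

        // Convert the submitted value to the property type; this throws if, say, a date
        // string is sent for a bool property, which is the error case described above.
        var converted = Convert.ChangeType(instruction.Value, property.PropertyType);
        property.SetValue(target, converted);
    }
}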
For Web API 2 on the full .NET Framework the JSONPatch github project looks to make a stab at providing this code, although it doesn't look like there's been a lot of development on that repo recently and the readme does state:
This is still very much an early project, don't use it in production
yet unless you understand the source and don't mind fixing a few bugs
;)
Things are simpler on .NET Core as that has a set of functionality to support this in the Microsoft.AspNetCore.JsonPatch namespace.
The rather useful jsonpatch.com site also lists out a few more options for Patch in .NET:
Asp.Net Core JsonPatch (Microsoft official implementation)
Ramone (a framework for consuming REST services, includes a JSON Patch implementation)
JsonPatch (Adds JSON Patch support to ASP.NET Web API)
Starcounter (In-memory Application Engine, uses JSON Patch with OT for client-server sync)
Nancy.JsonPatch (Adds JSON Patch support to NancyFX)
Manatee.Json (JSON-everything, including JSON Patch)
I need to add this functionality to an existing Web API 2 project of ours, so I'll update this answer if I find anything else that's useful while doing that.
I wanted to achieve exactly the same thing, but used a different method from the others described here. I've created a working repo using this approach if you want to check it out:
https://github.com/emab/patch-example
If you have the following two models:
Database model
public class WeatherDBModel
{
[Key]
public int Id { get; set; }
public string City { get; set; }
public string Country { get; set; }
public double Temperature { get; set; }
public double WindSpeed { get; set; }
public double Rain { get; set; }
public WeatherDBModel(int id, string city, string country, double temperature, double windSpeed, double rain)
{
Id = id;
City = city;
Country = country;
Temperature = temperature;
WindSpeed = windSpeed;
Rain = rain;
}
}
Update model
It contains the exact names of the database model's properties and includes only the properties that can be updated.
public class WeatherUpdateModel
{
public string? City { get; set; }
public string? Country { get; set; }
public double? Temperature { get; set; }
public double? WindSpeed { get; set; }
public double? Rain { get; set; }
}
This update model is sent to the service layer along with the id of the object you'd like to update.
You can then implement the following method in your repository layer which maps any non-null values from the updateModel into an existing entity if it has been found:
public WeatherDBModel Update(int id, WeatherUpdateModel updateObject)
{
// find existing entity
var existingEntity = _context.Weather.Find(id);
// handle not found
if (existingEntity == null)
{
throw new EntityNotFoundException(id);
}
// iterate through all of the properties of the update object
// in this example it includes all properties apart from `id`
foreach (PropertyInfo prop in updateObject.GetType().GetProperties())
{
// check if the property has been set in the updateObject
// if it is null we ignore it. If you want to allow null values to be set, you could add a flag to the update object to allow specific nulls
if (prop.GetValue(updateObject) != null)
{
// if it has been set update the existing entity value
existingEntity.GetType().GetProperty(prop.Name)?.SetValue(existingEntity, prop.GetValue(updateObject));
}
}
_context.SaveChanges();
return existingEntity;
}
Using this method you can change your models without worrying about the update logic, as long as you ensure that the UpdateModel is kept up-to-date with the database model.
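A hypothetical controller action wiring this up (the repository field name here is assumed):

[HttpPatch("{id}")]
public IActionResult Patch(int id, [FromBody] WeatherUpdateModel update)
{
    // _weatherRepository is assumed to expose the Update method shown above.
    var updated = _weatherRepository.Update(id, update);
    return Ok(updated);
}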
If a property of your object was omitted in your JSON, ASP.NET won't "set" that property on the object; the property will keep its default value. In order to know which properties were sent with the JSON object, you need a way to detect which properties of the object were set.
In order to detect which properties have actually been sent with the JSON object, you can modify your Member class to contain a collection of the property names that were set. Then, for every property that you want to track, have its setter add the property's name to that collection.
public class Member
{
private string _firstName;
private string _lastName;
...
private bool _isDeleted;
public string FirstName
{
get => _firstName;
set
{
_firstName = value;
_setProperties.Add(nameof(FirstName));
}
}
public string LastName
{
get => _lastName;
set
{
_lastName = value;
_setProperties.Add(nameof(LastName));
}
}
...
public bool IsDeleted
{
get => _isDeleted;
set
{
_isDeleted= value;
_setProperties.Add(nameof(IsDeleted));
}
}
private readonly HashSet<string> _setProperties = new HashSet<string>();
public HashSet<string> GetTheSetProperties()
{
return new HashSet<string>(_setProperties);
}
}
In the UpdateRow method you can now check whether a property was sent in the JSON by checking if it is in the _setProperties collection. So if you want to see if the LastName was sent in the JSON just do
bool lastNameWasInJson = model.GetTheSetProperties().Contains(nameof(model.LastName));
Following up on Avid Learner's approach: I found this easy to add to an existing PUT method.
Alternatively, to avoid loading twice, you could apply the update operations and then apply the patch before saving, but I'd rather load twice and have simple code.
public ResultModel Patch(UpdateModel model)
{
var record = LoadAsUpdateModel(model.Id);
if (record == null) return null;
foreach(var propertyName in model.SetProperties())
{
var property = model.GetType().GetProperty(propertyName);
property.SetValue(record, property.GetValue(model));
}
return Update(record);
}
I just can't seem to get the syntax correct for multi field mapping in NEST 2.0--if that's the correct terminology. Every example I've found for mapping seems to be <= the 1.x version of NEST. I'm new to Elasticsearch and NEST, and I've been reading their documentation, but the NEST documentation hasn't been completely updated for 2.x.
Basically, I don't need to index or store the entire type. Some fields I need for indexing only, some fields I'll need to index and retrieve, and some I don't need for indexing, just for retrieval.
MyType
{
// Index this & allow for retrieval.
int Id { get; set; }
// Index this & allow for retrieval.
// **Also**, in my searching & sorting, I need to sort on this **entire** field, not just individual tokens.
string CompanyName { get; set; }
// Don't index this for searching, but do store for display.
DateTime CreatedDate { get; set; }
// Index this for searching BUT NOT for retrieval/displaying.
string CompanyDescription { get; set; }
// Nest this.
List<MyChildType> Locations { get; set; }
}
MyChildType
{
// Index this & allow for retrieval.
string LocationName { get; set; }
// etc. other properties.
}
I've been able to index the entire object and child as-is using the following as an example:
client.Index(item, i => i.Index(indexName));
However, the actual object is a lot larger than this, and I really don't need most of it. I've found this, which looks like what I think I want to do, but in an older version: multi field mapping elasticsearch
I think "mapping" is what I'm going for, but like I said, I'm new to Elasticsearch and NEST and I'm trying to learn the terminology.
Be gentle! :) It's my first time to ask a question on SO. Thanks!
In addition to Colin's and Selçuk's answers, you can also fully control the mapping through the fluent (and object initializer syntax) mapping API. Here's an example based on your requirements
void Main()
{
var pool = new SingleNodeConnectionPool(new Uri("http://localhost:9200"));
var connectionSettings = new ConnectionSettings(pool);
var client = new ElasticClient(connectionSettings);
client.Map<MyType>(m => m
.Index("index-name")
.AutoMap()
.Properties(p => p
.String(s => s
.Name(n => n.CompanyName)
.Fields(f => f
.String(ss => ss
.Name("raw")
.NotAnalyzed()
)
)
)
.Date(d => d
.Name(n => n.CreatedDate)
.Index(NonStringIndexOption.No)
)
.String(s => s
.Name(n => n.CompanyDescription)
.Store(false)
)
.Nested<MyChildType>(n => n
.Name(nn => nn.Locations.First())
.AutoMap()
.Properties(pp => pp
/* properties of MyChildType */
)
)
)
);
}
public class MyType
{
// Index this & allow for retrieval.
public int Id { get; set; }
// Index this & allow for retrieval.
// **Also**, in my searching & sorting, I need to sort on this **entire** field, not just individual tokens.
public string CompanyName { get; set; }
// Don't index this for searching, but do store for display.
public DateTime CreatedDate { get; set; }
// Index this for searching BUT NOT for retrieval/displaying.
public string CompanyDescription { get; set; }
// Nest this.
public List<MyChildType> Locations { get; set; }
}
public class MyChildType
{
// Index this & allow for retrieval.
public string LocationName { get; set; }
// etc. other properties.
}
This produces the mapping
{
"properties": {
"id": {
"type": "integer"
},
"companyName": {
"type": "string",
"fields": {
"raw": {
"type": "string",
"index": "not_analyzed"
}
}
},
"createdDate": {
"type": "date",
"index": "no"
},
"companyDescription": {
"type": "string",
"store": false
},
"locations": {
"type": "nested",
"properties": {
"locationName": {
"type": "string"
}
}
}
}
}
Calling .AutoMap() causes NEST to infer the mapping based on the property types and any attributes applied to them. Then .Properties() overrides any of the inferred mappings. For example
CompanyName is mapped as a multi_field with the field companyName analyzed using the standard analyzer and companyName.raw not analyzed. You can reference the latter in your queries using .Field(f => f.CompanyName.Suffix("raw"))
Locations is mapped as a nested type (automapping by default would infer this as an object type mapping). You can then define any specific mappings for MyChildType using .Properties() inside of the Nested<MyChildType>() call.
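For example, to search the analyzed companyDescription but sort on the whole company name, something along these lines should work (a sketch that reuses the index name assumed above):

var searchResponse = client.Search<MyType>(s => s
    .Index("index-name")
    .Query(q => q
        .Match(m => m
            .Field(f => f.CompanyDescription)
            .Query("search terms")
        )
    )
    // Sort on the not_analyzed sub-field so the whole value is compared, not individual tokens.
    .Sort(so => so
        .Ascending(f => f.CompanyName.Suffix("raw"))
    )
);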
As far as I can see, you don't have any complex types that you are trying to map. So you can easily use NEST attributes to map your objects.
Check this out:
[Nest.ElasticsearchType]
public class MyType
{
    // Index this & allow for retrieval.
    [Nest.Number(Store = true)]
    public int Id { get; set; }
    // Index this & allow for retrieval.
    // **Also**, in my searching & sorting, I need to sort on this **entire** field, not just individual tokens.
    [Nest.String(Store = true, Index = Nest.FieldIndexOption.Analyzed, TermVector = Nest.TermVectorOption.WithPositionsOffsets)]
    public string CompanyName { get; set; }
    // Don't index this for searching, but do store for display.
    [Nest.Date(Store = true, Index = Nest.NonStringIndexOption.No)]
    public DateTime CreatedDate { get; set; }
    // Index this for searching BUT NOT for retrieval/displaying.
    [Nest.String(Store = false, Index = Nest.FieldIndexOption.Analyzed)]
    public string CompanyDescription { get; set; }
    // Nest this.
    [Nest.Nested(Store = true, IncludeInAll = true)]
    public List<MyChildType> Locations { get; set; }
}
[Nest.ElasticsearchType]
public class MyChildType
{
    // Index this & allow for retrieval.
    [Nest.String(Store = true, Index = Nest.FieldIndexOption.Analyzed)]
    public string LocationName { get; set; }
    // etc. other properties.
}
After this declaration, to create this mapping in elasticsearch you need to make a call similar to:
var mappingResponse = elasticClient.Map<MyType>(m => m.AutoMap());
With AutoMap() call NEST will read your attributes from your POCO and create a mapping request accordingly.
Also see "Attribute Based Mapping" section from here.
Cheers!
At the time of writing, NEST does not offer a way to map a property in your class to multiple fields in your document mapping using built-in attributes. However, it does provide the facilities needed to do anything with your mappings that you could do if you wrote the JSON yourself.
Here's a solution I've put together for my own needs. It shouldn't be hard to use it as the starting point for whatever you need to do.
First, here's an example of the mapping I want to generate
{
"product":{
"properties":{
"name":{
"type":"string",
"index":"not_analyzed",
"fields":{
"standard":{
"type":"string",
"analyzer":"standard"
}
}
}
}
}
}
The product document would then have the name field, which is indexed but not analyzed, and the name.standard field, which uses the standard analyzer.
The C# class that I generate the mapping from looks like this
[ElasticsearchType]
public class Product
{
[WantsStandardAnalysisField]
public string Name { get; set; }
}
Note the WantsStandardAnalysisField attribute. That's a custom attribute with no special properties added. Literally just:
public class WantsStandardAnalysisField : Attribute {}
If I were to use AutoMap as-is, my custom attribute would be ignored and I would get a mapping that has the name field, but not name.standard. Luckily, AutoMap accepts an instance of IPropertyVisitor. A base class called NoopPropertyVisitor implements the interface and does nothing at all, so you can subclass it and override only the methods you care about. When you use a property visitor with AutoMap, it will generate a document mapping for you but give you a chance to modify it before it gets sent to Elastic Search. All we need to do is look for properties marked with our custom attribute and add a field to them.
Here's an example that does that:
public class ProductPropertyVisitor : NoopPropertyVisitor
{
public override void Visit(IStringProperty type, PropertyInfo propertyInfo, ElasticsearchPropertyAttributeBase attribute)
{
base.Visit(type, propertyInfo, attribute);
var wsaf = propertyInfo.GetCustomAttribute<WantsStandardAnalysisField>();
if (wsaf != null)
{
type.Index = FieldIndexOption.NotAnalyzed;
type.Fields = new Properties
{
{
"standard",
new StringProperty
{
Index = FieldIndexOption.Analyzed,
Analyzer = "standard"
}
}
};
}
}
}
As you can see, we can do pretty much anything we want with the generated property, including turning off analysis for the main property and adding a new field with its own settings. For fun, you could add a couple properties to the custom attribute allowing you to specify the name of the field you want and the analyzer to use. You could even modify the code to see if the attribute has been added multiple times, letting you add as many fields as you want.
If you were to run this through any method that generates a mapping using AutoMap, such as:
new TypeMappingDescriptor<Product>().AutoMap(new ProductPropertyVisitor())
You'll get the desired multi-field mapping. Now you can customize mappings to your heart's content. Enjoy!
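To push that mapping to the cluster rather than just build it in memory, the visitor can also be passed to AutoMap inside a Map call; something along these lines (the index name is assumed, and you should check the AutoMap overloads available on your NEST version):

client.Map<Product>(m => m
    .Index("index-name")
    .AutoMap(new ProductPropertyVisitor())
);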
I think you have at least 2 possibilities to solve your problem:
On indexing: Create something like a metadata model, which is stored just for retrieving. See the _source field to limit the return to this field.
On searching: Specify the fields you want to query. If you don't want to query the CreatedDate, just don't include it in your search.
In my case I am using both of these approaches to get very fast results :-)
I am new to .NET and Azure, and am trying to create a simple Web API, to help me learn. I have two collections of DocumentDB documents. The documents in each collection are defined as follows:
public class Log
{
[JsonProperty(PropertyName = "id")]
public string Id { get; set; }
[JsonProperty(PropertyName = "studentName")]
public string StudentName { get; set; }
[JsonProperty(PropertyName = "assignment")]
public string Assignment { get; set; }
[JsonProperty(PropertyName = "dueDate")]
public DateTime DueDate { get; set; }
[JsonProperty(PropertyName = "goal")]
public string Goal { get; set; }
[JsonProperty(PropertyName = "reflection")]
public string Reflection { get; set; }
[JsonProperty(PropertyName = "sessions")]
public List<Session> Sessions { get; set; }
}
public class Session
{
[JsonProperty(PropertyName = "id")]
public string Id { get; set; }
[JsonProperty(PropertyName = "date")]
public DateTime Date { get; set; }
[JsonProperty(PropertyName = "duration")]
public TimeSpan Duration { get; set; }
[JsonProperty(PropertyName = "comment")]
public string Comment { get; set; }
}
Here is an example of what I have for the Log documents:
{
"id": "2",
"studentName": "Joe student",
"assignment": "Web APIs",
"dueDate": "0001-01-01T00:00:00",
"goal": "Keep mistakes to a minimum.",
"reflection": "I'm getting there.",
"sessions": [ ]
}
Here is an example of what I have for the Session documents:
{
"id": "7",
"date": "2015-04-26T00:00:00",
"duration": "00:30:00",
"comment": "Practiced for 30 minutes"
}
I would like to relate Sessions with Logs such that each Log may have several sessions, but each session will be related to a single Log, and also a single Student. I think I want the sessions as separate documents, because I may want to get a list of Sessions by either Student, or Log. There would be Student documents as well.
I am struggling to add Sessions to a Log without simply duplicating the session within the log. I would like to get something like this, which lists the IDs of related sessions:
{
"id": "2",
"studentName": "Joe student",
"assignment": "Web APIs",
"dueDate": "0001-01-01T00:00:00",
"goal": "Keep mistakes to a minimum.",
"reflection": "I'm getting there.",
"sessions": [1, 2, 7, 19, 22]
}
I would then like to display the log, by replacing the ids with the content of the actual documents.
Here are some related working code snippets (not necessarily correct) to show you my design structure:
public class PracticeLogRepository : DocumentDB
{
// ********** Logs section ********** //
private PracticeSessionRepository _sessionsRepository;
// specifies the database and document collection used by the repository
public PracticeLogRepository() : base("Project3", "Logs") { }
// Gets a list of practice Logs
public Task<List<Practice.Log>> GetLogsAsync()
{
return Task<List<Practice.Log>>.Run(() => Client
.CreateDocumentQuery<Practice.Log>(Collection.DocumentsLink)
.ToList());
}
// Gets the practice Log with the matching id
public Task<Practice.Log> GetLogAsync(string id)
{
return Task<Practice.Log>.Run(() => Client
.CreateDocumentQuery<Practice.Log>(Collection.DocumentsLink)
.Where(pl => pl.Id == id)
.AsEnumerable()
.FirstOrDefault());
}
...
public class PracticeSessionRepository : DocumentDB
{
// ********** Session section ********** //
// specifies the database and document collection used by the repository
public PracticeSessionRepository() : base("Project3", "Sessions") { }
// Gets a list of practice Sessions
public Task<List<Practice.Session>> GetSessionsAsync()
{
return Task<List<Practice.Session>>.Run(() => Client
.CreateDocumentQuery<Practice.Session>(Collection.DocumentsLink)
.ToList());
}
// Gets the practice Log with the matching id
public Task<Practice.Session> GetSessionAsync(string id)
{
return Task<Practice.Session>.Run(() => Client
.CreateDocumentQuery<Practice.Session>(Collection.DocumentsLink)
.Where(pl => pl.Id == id)
.AsEnumerable()
.FirstOrDefault());
}
...
public class LogController : ApiController
{
private PracticeLogRepository _logsRepository;
public LogController()
{
_logsRepository = new PracticeLogRepository();
}
// GET: api/Log
public async Task<IHttpActionResult> Get()
{
var logs = await _logsRepository.GetLogsAsync();
if (logs != null)
return Ok(logs);
return NotFound();
}
// Get: api/Log/{id}
public async Task<IHttpActionResult> Get(string id)
{
var log = await _logsRepository.GetLogAsync(id);
if (log != null)
return Ok(log);
return NotFound();
}
...
public class SessionController : ApiController
{
private PracticeSessionRepository _sessionsRepository;
public SessionController()
{
_sessionsRepository = new PracticeSessionRepository();
//_logsRepository = new PracticeLogRepository();
}
// GET: api/Session
public async Task<IHttpActionResult> Get()
{
var sessions = await _sessionsRepository.GetSessionsAsync();
if (sessions != null)
return Ok(sessions);
return NotFound();
}
// Get: api/Session/{id}
public async Task<IHttpActionResult> Get(string id)
{
var session = await _sessionsRepository.GetSessionAsync(id);
if (session != null)
return Ok(session);
return NotFound();
}
...
Since I am teaching myself, and am extremely new to .NET, C#, and of course DocumentDB, I am struggling to code this. I would really appreciate a simple example of how I should create the Log document, and also how to update it by adding sessions.
If you need to see more of my code, just ask. I just don't want to overcomplicate this.
I'd like to make sure I'm understanding the question; here's a recap of my understanding:
You have a 1-to-many relationship between Log and Session.
You have a 1-to-1 relationship between Log and Student.
You'd like to query Sessions by either Student or Log.
You'd like to populate data from these relationships (e.g. get the session data)
Please keep in mind DocumentDB is a NoSQL database and does not support inter-document JOINs.
I'd recommend revisiting how you approach modeling your data - in particular how you represent relationships (e.g. whether to keep Sessions and Logs as separate documents).
The following article talks about modeling data in detail:
http://azure.microsoft.com/en-us/documentation/articles/documentdb-modeling-data/
To summarize the key points of the article - I would pick the appropriate trade-off between normalizing (e.g. keeping Sessions and Logs as separate documents) and de-normalizing (e.g. keeping Sessions and Logs in the same document) given your application's use case. In general, I prefer de-normalizing when you have a read-heavy application and normalizing when you have a write-heavy application.
If you choose to normalize - you can simply make a follow-up request to get the session data for a log. Extra credit: You can even take this a step further and remove the need of having to make multiple network requests by writing a stored procedure.
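For example, with the normalized option the follow-up request can mirror your existing PracticeSessionRepository; a sketch, assuming each Session document gets a hypothetical logId back-reference to its Log:

// Sketch only: LogId is a hypothetical back-reference property on Practice.Session.
public Task<List<Practice.Session>> GetSessionsForLogAsync(string logId)
{
    return Task<List<Practice.Session>>.Run(() => Client
        .CreateDocumentQuery<Practice.Session>(Collection.DocumentsLink)
        .Where(s => s.LogId == logId)
        .ToList());
}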
If you choose to de-normalize, you can simply query for the entire document and everything will be populated automatically. The drawback is you may have to fan-out writes to multiple documents if you update session data.
My goal is to put a "deleted at" timestamp on a specific object in an array within a document.
If the document looks like this:
{
"subdoc": [
{
"key": 1,
"value": "abc",
"isActive": true
},
{
"key":5,
"value": "ade",
"isActive":true
}
]
}
I would like to be able to say: look for the document that has subdoc.key == 5 and subdoc.value == "ade"; set subdoc.isActive to false and set subdoc.deletedAt to the current DB timestamp. The resulting document would look like this:
{
"subdoc": [
{
"key": 1,
"value": "abc",
"isActive": true
},
{
"key":5,
"value": "ade",
"isActive": false,
"deletedAt": Timestamp(1425911075,1)
}
]
}
Is this doable?
Update: After further review of the mongo docs, this does seem doable with the "$ (update)" operator. That gets me what I need, but I'm hoping for a less magical-strings way of doing this using the C# driver?
My working find/update looks like this:
// find
{
"subdoc.key":"2",
"subdoc.value":"ade"
}
// update
{
"$currentDate": {
"subdoc.$.deleteAt": {
"$type": "timestamp"
}
}
}
Update: I should clarify that this updated time stamp field is used for synchronization by many sometimes-connected mobile clients in a load-balanced environment (multiple web servers, multiple worker processes, and a mongo cluster) with a high transaction volume, which makes it crucial that this time stamp has a single point truth, is logically sequential in the context of the app, and is as high precision as possible (fractions of a second). Otherwise, records could be missed in a sync.
For the moment, I'm using the above approach to ensure time stamped values are generated by the mongo database instance. And I'm pretty satisfied with this approach.
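For anyone who wants a slightly less raw-JSON version, the rough equivalent with the 2.x C# driver's typed builders would be something like the following (an untested sketch; the positional array path is still a string, and the collection/database names are placeholders):

// Requires MongoDB.Bson and MongoDB.Driver; run inside an async method.
var collection = new MongoClient().GetDatabase("test").GetCollection<BsonDocument>("docs");

var filter = Builders<BsonDocument>.Filter.ElemMatch("subdoc",
    Builders<BsonDocument>.Filter.Eq("key", 5) &
    Builders<BsonDocument>.Filter.Eq("value", "ade"));

var update = Builders<BsonDocument>.Update
    .Set("subdoc.$.isActive", false)
    .CurrentDate("subdoc.$.deletedAt", UpdateDefinitionCurrentDateType.Timestamp);

await collection.UpdateOneAsync(filter, update);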
You can use the C# driver to wrap the Mongo entities in C# objects. Then in your code you can use LINQ to query the DB and update your objects as required, and just save them back to the DB to persist your changes.
Below is a small piece of code that queries a Parent collection in the test DB.
The C# driver provides an AsQueryable extension that allows us to write our queries directly in LINQ; the driver then automatically builds the required query and executes it against the collection.
The sample below looks for any subdocuments in the subdoc list that have a value of 5 in the key field.
If it finds any, it updates their deletedAt date and then saves the parent back to the DB.
var client = new MongoClient();
var database = client.GetServer().GetDatabase("test");
var parentCollection = database.GetCollection<Parent>("Parent");
var parent = parentCollection.AsQueryable().FirstOrDefault(p => p.subdoc.Any(f => f.key == 5));
if (parent != null)
{
    var fooList = parent.subdoc.Where(f => f.key == 5);
    foreach (var foo in fooList)
    {
        foo.deletedAt = DateTime.UtcNow;
    }
    parentCollection.Save(parent);
}
Below are the two C# entities used to map to the Mongo documents. We can use the [BsonIgnoreIfNull] attribute of the Mongo C# driver to only serialize the deletedAt field if it contains a value. We also use a nullable DateTime in our code to allow nulls to be stored if required.
public class Foo
{
[BsonId]
public ObjectId Id { get; set; }
public int key { get; set; }
public string value { get; set; }
public bool isActive { get; set; }
[BsonIgnoreIfNull]
public DateTime? deletedAt { get; set; }
}
public class Parent
{
[BsonId]
public ObjectId Id { get; set; }
public List<Foo> subdoc { get; set; }
}
See most recent update. A combination of the positional and $currentDate operators is serving my purpose.