I want to map an XML fragment to a record and its nested record. I assume I have to use ConstructUsing() in order to be able to supply all required constructor parameters.
In the CreateMap for the parent, I don't know how to supply the second constructor parameter without descending into some kind of re-entrant hell. I'm guessing I'd have to reference the mapper created in my Startup.cs; then I could write: return new ParentRecord(long.Parse(ele.Attribute("id").Value), mapper.Map<ChildRecord>(ele));.
XML Fragment
XElement xe = new XElement("root",
    new XElement("parent",
        new XAttribute("id", "123"),
        new XElement("child", "some text")));
Records
public record ParentRecord(long Id, ChildRecord Child);
public record ChildRecord(string Text);
Mapping Configuration
var configuration = new MapperConfiguration(cfg =>
{
    cfg.CreateMap<XElement, ChildRecord>()
        .ConstructUsing((ele, ctx) =>
        {
            return new ChildRecord(ele.Element("child").Value);
        });
    cfg.CreateMap<XElement, ParentRecord>()
        .ConstructUsing((ele, ctx) =>
        {
            return new ParentRecord(long.Parse(ele.Attribute("id").Value), ?WHAT_TO_PUT_HERE?);
        });
});
var mapper = configuration.CreateMapper();
Map Request
var o = mapper.Map<ParentRecord>(xe.Element("parent"));
You need the context mapper, something like ctx.Mapper.Map<ChildRecord>(source).
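A minimal sketch of the completed parent map, assuming (as in the question's code) that the child record is built from the same parent element:
cfg.CreateMap<XElement, ParentRecord>()
    .ConstructUsing((ele, ctx) =>
        new ParentRecord(
            long.Parse(ele.Attribute("id").Value),
            // Recurse via the context's runtime mapper instead of an injected
            // outer mapper, which avoids the re-entrancy problem.
            ctx.Mapper.Map<ChildRecord>(ele)));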
Is there a more efficient way of converting DynamoDB data into concrete types? For example, when I query the data, everything comes back as:
List<Dictionary<string, AttributeValue>>
Is it possible to convert the type easily, without having to loop through each item and map everything manually?
For example, I am doing:
return items.Select(item => new Connection
{
    ConnectionId = Guid.Parse(item["connectionId"].S),
    ClientId = item["clientId"].S,
    ProviderId = item["providerId"].S,
    Scopes = item["scopes"].SS.ToArray(),
    CredentialsId = item["credentialsId"].S,
    Evidences = ToEvidences(item["consentEvidences"].L)
})
.ToList();
This returns a list of my type Connection; however, I am explicitly mapping each field. Is there an easier way, or a helper library that can do the mapping?
I think you'll have luck with the higher-level .NET Document model. It presents more natural data types.
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DotNetSDKMidLevel.html
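For illustration, a minimal sketch of that document model; the attribute names come from the question, while the table name, key value, and client variable are assumptions:
// Sketch: query through Amazon.DynamoDBv2.DocumentModel instead of raw
// AttributeValue maps.
var table = Table.LoadTable(dynamoDbClient, "Connections");
var search = table.Query(new QueryOperationConfig
{
    Filter = new QueryFilter("connectionId", QueryOperator.Equal, "some-id")
});
foreach (var doc in await search.GetRemainingAsync())
{
    string clientId = doc["clientId"];           // DynamoDBEntry converts naturally
    var scopes = doc["scopes"].AsListOfString();
}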
The easiest way I have found is to use the Document.FromAttributeMap function to convert the attribute map to a Document object, and then convert that to the .NET type using the DynamoDBContext.FromDocument method, as shown below.
public async Task<IEnumerable<WeatherForecast>> GetAll(string cityName)
{
    var queryRequest = new QueryRequest()
    {
        TableName = nameof(WeatherForecast),
        KeyConditionExpression = "CityName = :cityName",
        ExpressionAttributeValues = new Dictionary<string, AttributeValue>()
        {
            {":cityName", new AttributeValue(cityName)},
        }
    };
    var response = await _dynamoDbClient.QueryAsync(queryRequest);
    return response.Items.Select(a =>
    {
        var doc = Document.FromAttributeMap(a);
        return _dynamoDbContext.FromDocument<WeatherForecast>(doc);
    });
}
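For completeness, the _dynamoDbClient and _dynamoDbContext fields used above could be wired up like this (a sketch; the enclosing repository class is hypothetical):
// Assumed wiring for the fields the answer relies on.
private readonly IAmazonDynamoDB _dynamoDbClient = new AmazonDynamoDBClient();
private readonly IDynamoDBContext _dynamoDbContext;

public WeatherForecastRepository()
{
    _dynamoDbContext = new DynamoDBContext(_dynamoDbClient);
}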
I'm using the MongoDB .NET driver in my project. I want to update all of the properties of my object stored in MongoDB. The documentation shows an update like this:
var filter = Builders<BsonDocument>.Filter.Eq("i", 10);
var update = Builders<BsonDocument>.Update.Set("i", 110);
await collection.UpdateOneAsync(filter, update);
But I don't want to call the Set method for every property, since there are many properties and there could be many more in the future.
How can I update the whole object using the MongoDB .Net driver?
You can do that with ReplaceOneAsync instead of UpdateOneAsync.
You need a filter to match the existing document (a filter with the document id is the simplest) and the new object.
Hamster hamster = ...
var replaceOneResult = await collection.ReplaceOneAsync(
    doc => doc.Id == hamster.Id,
    hamster);
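If the document might not exist yet, an upsert can be requested through the options argument. A sketch, assuming a recent driver version where ReplaceOptions is the options type (older versions used UpdateOptions):
// Sketch: replace the document, inserting it when no match exists (upsert).
await collection.ReplaceOneAsync(
    doc => doc.Id == hamster.Id,
    hamster,
    new ReplaceOptions { IsUpsert = true });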
// Build a $set document from every public property except Id via reflection,
// so newly added properties are picked up automatically.
var update = new BsonDocument("$set", new BsonDocument(
    task.GetType()
        .GetProperties()
        .Where(p => p.Name != "Id")
        .Select(p => new BsonElement(p.Name, BsonValue.Create(p.GetValue(task, null))))));
var options = new UpdateOptions();
collection.UpdateOne<MyTask>(item => item.Name == "cheque", update, options);
This code uses reflection to include every property of the given object in the update statement, so there is no need to add each property manually. As you can see, Id is explicitly excluded from the update to avoid an exception, since the _id field cannot be modified.
If you want to update your whole BsonDocument, there is an implicit conversion from BsonDocument to UpdateDefinition.
https://github.com/mongodb/mongo-csharp-driver/blob/master/src/MongoDB.Driver/UpdateDefinition.cs
var doc = new BsonDocument() { .... }
UpdateDefinition<BsonDocument> update = doc;
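A short usage sketch; note that UpdateOneAsync still expects the document to contain update operators such as $set:
// Sketch: the implicit conversion accepts a raw update document.
var doc = new BsonDocument("$set", new BsonDocument("i", 110));
UpdateDefinition<BsonDocument> update = doc;
await collection.UpdateOneAsync(filter, update);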
I use the NEST client for Elasticsearch. I want to search like this:
SearchRequest countRequest = new SearchRequest
{
    //Something
};
client.Search<Post>(countRequest);
On the other hand:
client.Search<Post>(s=>s.Index("IndexName").Query(...))
How can I set the index name when searching with the SearchRequest class?
This is for those using newer versions of NEST. In 2.0.1, I am unable to find the Indices property on SearchRequest. However, you can pass the index and type in through the constructor:
var request = new SearchRequest<Post>("IndexName", "TypeName");
I map the index and type on the ConnectionSettings like so.
ConnectionSettings settings = new ConnectionSettings(new Uri("url"));
settings.MapDefaultTypeIndices(t => t.Add(typeof(Post), "IndexName"));
settings.MapDefaultTypeNames(t => t.Add(typeof(Post), "TypeName"));
Other ways to tell NEST the index and type:
client.Search<Post>(s => s.Index("IndexName").Type("TypeName").From(0));
or apply the ElasticsearchTypeAttribute on the type.
[ElasticsearchType(Name = "TypeName")]
public class Post{ }
SearchRequest contains an Indices property, so that you can specify multiple indices to search across. In your case, you could just pass the single index like so:
var request = new SearchRequest
{
    Indices = new IndexNameMarker[] { "IndexName" }
};
Another option would be to map your Post type to the index it belongs to, and use the typed SearchRequest<T> to let NEST infer the index name.
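A sketch of that approach, reusing the MapDefaultTypeIndices call from the answer above; the client construction is assumed:
// Sketch: with a default index mapped for Post on the settings, the typed
// request needs no explicit index name.
settings.MapDefaultTypeIndices(t => t.Add(typeof(Post), "IndexName"));
var client = new ElasticClient(settings);
var response = client.Search<Post>(new SearchRequest<Post>());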
I was trying to solve a slightly different task with ES v5 (the JSON request was loaded from a file), but I had the same problem with setting the index name. My solution was to add an index query-string parameter. I use this in integration tests:
public static class ElasticSearchClientHelper
{
    public static ISearchResponse<T> SearchByJson<T>(this IElasticClient client, string json, string indexName, Dictionary<string, object> queryStringParams = null) where T : class
    {
        var qs = new Dictionary<string, object>()
        {
            {"index", indexName}
        };
        if (queryStringParams != null)
        {
            foreach (var pair in queryStringParams)
                qs.Add(pair.Key, pair.Value);
        }
        using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(json)))
        {
            var searchRequest = client.Serializer.Deserialize<SearchRequest>(stream);
            ((IRequestParameters)((IRequest<SearchRequestParameters>)searchRequest).RequestParameters).QueryString = qs;
            return client.Search<T>(searchRequest);
        }
    }
}
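Hypothetical usage of the helper, with the file name and index name made up for illustration:
var json = File.ReadAllText("query.json");
var response = client.SearchByJson<Post>(json, "IndexName");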
The ModelInspector doesn't seem to provide a means to define filter definitions. Any ideas or workarounds?
I need to generate the following with mapping-by-code:
<filter-def name="filterName" use-many-to-one="false">
    <filter-param name="filterParamName" type="Int32"/>
</filter-def>
I was able to achieve that using NHibernate.Cfg.Configuration:
var cfg = new Configuration();
var filterDef = new FilterDefinition(
    "filterName",
    null, // or your default condition
    new Dictionary<string, IType> { { "filterParamName", NHibernateUtil.Int32 } },
    false);
cfg.AddFilterDefinition(filterDef);
// cfg.AddMapping(...)
// cfg.DataBaseIntegration(...)
var sessionFactory = cfg.BuildSessionFactory();
then define the filter in entity mapping:
public class EntityMap : ClassMapping<Entity>
{
    public EntityMap()
    {
        Table("Entity");
        Filter("filterName", m => m.Condition("FilteredField = :filterParamName"));
        // remaining mapping
    }
}
and then use it as follows:
using (var session = sessionFactory.OpenSession())
{
    var filterValue = 123;
    session
        .EnableFilter("filterName")
        .SetParameter("filterParamName", filterValue);
}
I hope you'll find this useful.
FYI, it is important that the call to AddFilterDefinition comes before AddMapping; otherwise you will get an ArgumentException("An item with the same key has already been added").
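In terms of the configuration code above, the required order is (a sketch; mapping stands for whatever HbmMapping you compile):
cfg.AddFilterDefinition(filterDef); // must come first
cfg.AddMapping(mapping);            // then register the mappings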
I create some items from a class X.
I add them to the database, call SaveChanges, and all that...
The class Y has a many-to-many relationship with X.
Using another context, I create a Y instance, putting the X elements I've created into its collection.
I add Y to the Y entity set; that is fine.
But when I call Context.SaveChanges(), I get:
A value shared across entities or associations is generated in more than one location. Check that mapping does not split an EntityKey to multiple server-generated columns.
Have you ever seen this error?
EDIT: at the beginning I wrote 1-to-many; I've since noticed it is in fact many-to-many.
EDIT 2: here is how it is being done. Unlike many .NET developers, we use layers (business, data, ...). This is the test case:
[TestMethod]
public void WorksWithAreaCategories()
{
    using (new TransactionScope())
    {
        //arrange
        var context = ContextFactory.Create();
        var categoryBusiness = new CategoryBusiness(context);
        var category = new Category
        {
            Name = "TestCategory###"
        };
        categoryBusiness.Add(category);
        var areaBusiness = new AreaBusiness(context);
        var area = new Area
        {
            Name = "TestArea###",
            Description = "TestAreaDescription###",
            Categories = new List<Category> { category }
        };
        //act
        areaBusiness.Add(area);
        //assert
        var areaFromDb = areaBusiness.FindById(area.AreaID);
        Assert.IsNotNull(areaFromDb.Categories);
        Assert.IsTrue(areaFromDb.Categories.Count > 0);
        Assert.IsTrue(areaFromDb.Categories.Any(c => c.CategoryID == category.CategoryID));
    }
}
They share the same context. The business layers call SaveChanges at the end of each Add.
How I resolved it:
After adding both of them, I established the relationship with Update.
[TestMethod]
public void WorksWithAreaCategories()
{
    using (new TransactionScope())
    {
        //arrange
        var context = ContextFactory.Create();
        var categoryBusiness = new CategoryBusiness(context);
        var category = new Category
        {
            Name = "TestCategory###"
        };
        categoryBusiness.Add(category);
        var areaBusiness = new AreaBusiness(context);
        var area = new Area
        {
            Name = "TestArea###",
            Description = "TestAreaDescription###",
        };
        areaBusiness.Add(area);
        //act
        area.Categories = new List<Category> { category };
        areaBusiness.Update(area);
        //assert
        var areaFromDb = areaBusiness.FindById(area.AreaID);
        Assert.IsNotNull(areaFromDb.Categories);
        Assert.IsTrue(areaFromDb.Categories.Count > 0);
        Assert.IsTrue(areaFromDb.Categories.Any(c => c.CategoryID == category.CategoryID));
    }
}
When the error says "more than one location", it really means more than one context. Data "belongs" to a specific context, and you get problems if you try to move it between contexts.
There are two ways to fix it:
Do everything in the same context.
When you create the Y instance, read the X elements back from the database first, so that they are tracked by the same context before you add them to Y (see the sketch below).
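A sketch of that second option, reusing names from the question's test; ContextFactory and the Categories/Areas set names are assumptions:
// Sketch: load the existing categories in the same context, so every entity
// involved is tracked in one place before SaveChanges.
var context = ContextFactory.Create();
var existing = context.Categories.Where(c => c.Name == "TestCategory###").ToList();
var area = new Area { Name = "TestArea###", Categories = existing };
context.Areas.Add(area);
context.SaveChanges();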
EDIT based on comment:
Try using a single SaveChanges call for the whole operation.
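A sketch of that, under the same assumptions as above: both objects are added new and persisted in one go, letting EF order the inserts and the join-table row itself.
// Sketch: one context, one SaveChanges for the whole graph.
var context = ContextFactory.Create();
var category = new Category { Name = "TestCategory###" };
var area = new Area
{
    Name = "TestArea###",
    Categories = new List<Category> { category }
};
context.Areas.Add(area); // category is reachable from area, so it is inserted too
context.SaveChanges();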