How to avoid adding a reference field in very small entries (thus avoiding doubling collection size)? - c#

I have a User class that accumulates lots of DateTime entries in a List<DateTime> Entries field.
Occasionally, I need to get the last 12 entries (or fewer, if there aren't 12 yet). The list can grow very large.
I could add each new Entry object to a dedicated collection, but then I would have to add an ObjectId User field to refer to the related user.
That seems like a big overhead: for each entry that holds only a DateTime, adding another ObjectId field may double the collection size.
Since I occasionally need to quickly get only the last 12 entries out of, say, 100,000, I cannot place these entries in a per-user document like:
class PerUserEntries {
    public ObjectId TheUser;
    public List<DateTime> Entries;
}
Because, AFAIK, it's not possible to fetch only N entries from an embedded array in a Mongo query (if I'm wrong, I'd be very glad to hear it!).
So am I doomed to double my collection size, or is there a way around it?
Update, according to @profesor79's answer:
If your answer worked, that would be perfect! But unfortunately it fails...
Since I needed to filter on the user entity as well, here is what I did:
With this data:
class EndUserRecordEx {
    public ObjectId Id { get; set; }
    public string UserName;
    public List<EncounterData> Encounters;
}
I am trying this:
var query = EuBatch.Find(u => u.UserName == endUser.UserName)
    .Project<BsonDocument>(
        Builders<EndUserRecordEx>.Projection.Slice(
            u => u.Encounters, 0, 12));
var queryString = query.ToString();
var requests = await query.ToListAsync(); // MongoCommandException
This is the query I get in queryString:
find({ "UserName" : "qXyF2uxkcESCTk0zD93Sc+U5fdvUMPow" }, { "Encounters" : { "$slice" : [0, 15] } })
Here is the error (the MongoCommandException.Result):
{
{
"_t" : "OKMongoResponse",
"ok" : 0,
"code" : 9,
"errmsg" : "Syntax error, incorrect syntax near '17'.",
"$err" : "Syntax error, incorrect syntax near '17'."
}
}
Update: problem identified...
Recently, Microsoft announced their DocumentDB protocol support for MongoDB. Apparently, it doesn't yet support all projection operators. I tried the same code against mLab.com, and it works.

You can use PerUserEntries, as this is a valid document structure.
To get part of that array, we need to add a projection to the query, so we get only x elements, and this is done server-side.
Please see the snippet below:
static void Main(string[] args)
{
    // To directly connect to a single MongoDB server
    // or use a connection string
    var client = new MongoClient("mongodb://localhost:27017");
    var database = client.GetDatabase("test");
    var collection = database.GetCollection<PerUserEntries>("tar");

    var newData = new PerUserEntries();
    newData.Entries = new List<DateTime>();
    for (var i = 0; i < 1000; i++)
    {
        newData.Entries.Add(DateTime.Now.AddSeconds(i));
    }
    collection.InsertOne(newData);

    // Project only the first 3 elements of Entries; the $slice is applied server-side
    var list =
        collection.Find(new BsonDocument())
            .Project<BsonDocument>(
                Builders<PerUserEntries>.Projection.Slice(x => x.Entries, 0, 3))
            .ToList();
    Console.ReadLine();
}
public class PerUserEntries
{
    public List<DateTime> Entries;
    public ObjectId TheUser;
    public ObjectId Id { get; set; }
}

Related

C# reading an Array from IQueryCollection

Working in a C# ASP.NET Core project, I'm trying to read an array from an IQueryCollection in a GET request. The IQueryCollection is in Request.Query.
I need to read the query without a model. When the front-end JSON hits the back end, it's no longer JSON, which is fine; I just need to read whatever the front end passed. In this example it's an int array, but it could be anything.
Models don't scale very well for queries. For returning a model on a GET they work brilliantly, but for a query that can be as complex or as simple as it needs to be, I can't commit to models. It would be really useful if I could extract the query into an anonymous object.
Json passed from the front end:
var params = {
    items: [1, 4, 5],
    startDate: new Date(),
    endDate: new Date()
}
C# Request.QueryString
{?items%5B%5D=1&items%5B%5D=4&items%5B%5D=5&startDate=Fri%20Oct%2015%202021%2022%3A30%3A57%20GMT%2B1000%20(Australian%20Eastern%20Standard%20Time)&endDate=Fri%20Oct%2015%202021%2022%3A30%3A57%20GMT%2B1000%20(Australian%20Eastern%20Standard%20Time)}
I've tried:
// Gives me a null
var value = HttpUtility.ParseQueryString(Request.QueryString.Value).Get("items");
// Gives me an empty object {}
var value = Request.Query["items"];
Hope this is enough information.
The query string format is undefined for arrays. Some web frameworks use ?foo[]=1&foo[]=2, others use ?foo=1&foo=2, again others ?foo[0]=1&foo[1]=2.
You'll have to use the same parsing server-side as you use for serialization client-side. This works for your [] syntax:
var queryString = "?items%5B%5D=1&items%5B%5D=4&items%5B%5D=5&date=2021-10-15";
var parsed = HttpUtility.ParseQueryString(queryString);

foreach (var key in parsed.AllKeys)
{
    if (key.EndsWith("[]"))
    {
        var values = string.Join(", ", parsed.GetValues(key));
        Console.WriteLine($"{key}: array: {values}.");
    }
    else
    {
        Console.WriteLine($"{key}: scalar: {parsed[key]}.");
    }
}
Output:
items[]: array: 1, 4, 5.
date: scalar: 2021-10-15.
But instead of parsing the query string yourself, let the framework do that. You say you don't find models scalable; I find that hand-crafted parsing code doesn't scale well. A model like this would just work, provided you fix your date serializer on the JS side:
public class RequestModel
{
    public int[] Items { get; set; }
    public DateTime StartDate { get; set; }
    public DateTime EndDate { get; set; }
}

Multiple Nested JSON information - C# Process

Apologies if I'm doing something wrong, this is my first post.
I'm currently working with C# and want to save a bunch of data out to a JSON file and load it back, but I'm having trouble figuring out how to get it in the following format.
// Primary ID
001
{
    // Secondary ID
    01
    {
        // Tertiary ID
        01
        {
            string: "this is some information.",
            int: 9371
        }
    }
    // Secondary ID
    02
    {
        // Tertiary ID
        01
        {
            string: "blah blah blah.",
            int: 2241
        }
    }
}
I'd essentially like to be able to call up information with a particular set of IDs, for example 001-02-01, which would return a string ("blah blah blah.") and an int (2241).
The reason I want to go about it like this, instead of just having one longer ID, is that when the JSON file becomes very large, I'm hoping to speed up the search for information by passing each ID in turn.
If that makes no sense, and it would be equally fast to just pass in one longer ID without bothering with this nested-ID-segments concept, then please let me know!
If, however, what I'm thinking is correct and structuring it like this would speed up finding particular data, how would I go about doing it? With nested C# classes in arrays?
The simplest and most efficient way would be to have all data as the same type. Currently, you seem to be making each object a property keyed by the given id:
{
    "01": {},
    "02": {}
}
This will not work well if you try to use a serializable class.
I would recommend the following:
{
    "items": [
        { "id": "01" }, { "id": "02" }, ...
    ]
}
Then you can serialize/deserialize easily with
[Serializable]
public class Item
{
    public string id = null;
}

[Serializable]
public class RootObject
{
    public List<Item> items = null;
}
and then in Unity:
void Start(){
    string str = GetJson(); // However you get it
    RootObject ro = JsonUtility.FromJson<RootObject>(str);
}
If you want to speed up the fetching and your collection is large, convert it to a dictionary.
Dictionary<string, Item> dict = null;

void Start(){
    string str = GetJson(); // However you get it
    RootObject ro = JsonUtility.FromJson<RootObject>(str);
    this.dict = new Dictionary<string, Item>();
    foreach(Item item in ro.items){
        this.dict.Add(item.id, item);
    }
    ro = null;
}
Now you can access it really fast:
Item GetItem(string id)
{
    if(string.IsNullOrEmpty(id) == true){ return null; }
    Item item = null;
    this.dict.TryGetValue(id, out item);
    return item;
}
If you end up storing millions of records in your file and want to do something more performant, it would be easier to switch to a decent document database like MongoDB than to try to reinvent the wheel.
Worry about writing good standard code before worrying about performance problems that don't yet exist.
The following example is not in your language of choice, but it does show that JSON and arrays of 1,000,000 objects can be searched very quickly:
const getIncidentId = () => {
    let id = Math.random().toString(36).substr(2, 6).toUpperCase().replace("O", "0")
    return `${id.slice(0, 3)}-${id.slice(3)}`
}
console.log("Building array of 1,000,000 objects")
const littleData = Array.from({ length: 1000000 }, (v, k) => k + 1).map(x => ({ cells: { Number: x, Id: getIncidentId() } }))
console.log("Getting list of random Ids for array members [49, 60, 70000, 700000, 999999]")
const randomIds = ([49, 60, 70000, 700000, 999999]).map(i => littleData[i].cells.Id)
console.log(randomIds)
console.log("Finding each array item that contains a nested Id property in the randomIds list.")
const foundItems = littleData.filter(i => randomIds.includes(i.cells.Id))
console.log(foundItems)
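The same idea translated to C#, as a rough sketch (the Record shape and the hex ids are made up for the illustration): build a large in-memory list and filter it against a HashSet of ids, which stays fast even with a million records.
using System;
using System.Collections.Generic;
using System.Linq;

class Record
{
    public int Number;
    public string Id;
}

class Program
{
    static void Main()
    {
        // Build 1,000,000 records with simple hex ids
        var data = Enumerable.Range(1, 1000000)
            .Select(n => new Record { Number = n, Id = n.ToString("X6") })
            .ToList();

        // Pick a few ids to look for
        var wanted = new HashSet<string> { data[49].Id, data[70000].Id, data[999999].Id };

        // Single pass over the list; HashSet.Contains is O(1)
        var found = data.Where(r => wanted.Contains(r.Id)).ToList();
        Console.WriteLine(found.Count); // 3
    }
}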

EF core list of string datatype to postgresql

I am using EF Core 1.1.1 and PostgreSQL with a code-first implementation.
I have a model class with a member variable for a List<string>, as follows.
public class user : IdentityUser
{
    public int id { get; set; }

    [Column("devices", TypeName = "text[]")]
    public List<string> devices { get; set; }
}
I declared the data type as text[] and it shows as a text array in the database. Saving to the database as follows works.
var device = "phone";
user.devices.Add(device);
It stores the data in the database, but it returns an error when I run a query like this:
if (user.Any(x => x.devices.Count > 0))
It shows an error like this:
Can't cast database type _text to List`1'
How can I convert the stored text data to a List<string>?
Thanks.
Late but still - use string[] instead of List<string>.
public string[] Devices { get; set; }
The bad news is that you cannot use the x => x.Devices.Count > 0 lambda anyway when querying directly from the DB, since EF+Npgsql still cannot convert these expressions into SQL (at least in versions <= 1.1.0...).
You'll have to write this condition in raw SQL.
var result = db.YourTable
    .FromSql("select * from your_table where array_length(devices, 1) > 0")
    .Where(...)
    .OrderBy(...)
    ...
    .ToList();
Filtering fetched rows in memory works normally, of course.
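For example, a minimal sketch of the in-memory approach (db.YourTable is a placeholder; note this materializes every row on the client first, so it only makes sense for small tables):
var usersWithDevices = db.YourTable
    .ToList() // run the query first, without the array condition
    .Where(u => u.Devices != null && u.Devices.Length > 0)
    .ToList();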

LINQ get items in List<AttributeValuePair>

I have a table in my database where, aside from other columns (one of which is a UniqueIdentifier), I have one column that holds a JSON array string with values like this (formatted):
[
    {
        "AttributeId": "fe153d69-8ac1-6e0c-8793-ff0000804eb3",
        "AttributeValueId": "64163d69-8ac1-6e0c-8793-ff0000804eb3"
    },
    {
        "AttributeId": "00163d69-8ac1-6e0c-8793-ff0000804eb3",
        "AttributeValueId": "67163d69-8ac1-6e0c-8793-ff0000804eb3"
    }
]
I then have this AttributeValuePair class, which allows me to read this data in code:
public class AttributeValuePair
{
    public AttributeValuePair() { }
    public Guid AttributeId { get; set; }
    public Guid AttributeValueId { get; set; }
}
Whenever I get a list of items from this table, I want to be able to filter the resulting array based on a single AttributeValueId and get only the items where it matches, independently of the values of any other attributes.
Since, in code, I must have a List<AttributeValuePair> to read these attribute collections, how can I use LINQ to get the items where a particular AttributeValueId is present?
List<AttributeValuePair> attributeValuePairs = serializer.Deserialize<List<AttributeValuePair>>(item.Variant);
I've been lost at it for two hours already and can't seem to find an escape from this one.
EDIT
To be clearer about the problem: what I'm trying to do is, from a List<ProductVariation>, get the possible values for the attribute "Portions" when the attribute "Days" has the specified value. I'm having a lot of trouble using the serializer to build the LINQ statement.
//This code is wrong, I know, but I'm trying to show what I want
result = model.ProductVariations.Find(x, new {serializer.Deserialize<List<AttributeValuePair>>(item.Variant).Where(valuePair => valuePair.AttributeId == attributeId)});
Can you try
attributeValuePairs.Where(valuePair => valuePair.AttributeId == new Guid("SomeValue"));
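Building on that, a sketch of how the same Where could be applied to the List<ProductVariation> from the EDIT, deserializing each Variant and keeping only the variations that contain the wanted AttributeValueId (portionsValueId is a placeholder Guid):
Guid portionsValueId = new Guid("64163d69-8ac1-6e0c-8793-ff0000804eb3"); // placeholder value

var matching = model.ProductVariations
    .Where(v => serializer
        .Deserialize<List<AttributeValuePair>>(v.Variant)
        .Any(pair => pair.AttributeValueId == portionsValueId))
    .ToList();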
The answer to this question was actually a lot simpler than previously expected:
public string SelectedVariation(string mealsAttribute, string portionsAttribute, string product)
{
    Guid productId = new Guid(product);
    CatalogManager catalogManager = CatalogManager.GetManager();
    EcommerceManager ecommerceManager = EcommerceManager.GetManager();
    RegisterOrderAccountFormModel model = new RegisterOrderAccountFormModel();
    model.Product = catalogManager.GetProduct(productId);
    List<ProductVariation> productVariationsCollection = catalogManager.GetProductVariations(productId).ToList();

    //This is the really interesting part for the answer:
    return productVariationsCollection
        .Where(x => x.Variant.ToLower().Contains(mealsAttribute.ToLower())
                 && x.Variant.ToLower().Contains(portionsAttribute.ToLower()))
        .FirstOrDefault().Id.ToString();
}

Updating entire node with mutating cypher in Neo4jclient

I need to update all the properties of a given node, using mutating Cypher. I want to move away from Node and NodeReference because I understand they are deprecated, so I can't use IGraphClient.Update. I'm very new to mutating Cypher. I'm writing in C#, using Neo4jClient as the interface to Neo4j.
I wrote the following code, which updates the "Name" property of a "resunit" where the property "UniqueId" equals 2. This works fine. However:
* my resunit object has many properties
* I don't know which properties have changed
* I'm trying to write code that will work with different types of objects (with different properties)
With IGraphClient.Update it was possible to pass in an entire object, and it would take care of creating Cypher that sets all properties.
Can I somehow pass in my object with mutating Cypher as well?
The only alternative I can see is to reflect over the object to find all properties and generate a .Set for each, which I'd like to avoid. Please tell me if I'm on the wrong track here.
string newName = "A welcoming home";
var query2 = agencyDataAccessor
    .GetAgencyByKey(requestingUser.AgencyKey)
    .Match("(agency)-[:HAS_RESUNIT_NODE]->(categoryResUnitNode)-[:THE_UNIT_NODE]->(resunit)")
    .Where("resunit.UniqueId = {uniqueId}")
    .WithParams(new { uniqueId = 2 })
    .With("resunit")
    .Set("resunit.Name = {residentialUnitName}")
    .WithParams(new { residentialUnitName = newName });
query2.ExecuteWithoutResults();
It is indeed possible to pass an entire object! Below I have an object called Thing defined as such:
public class Thing
{
    public int Id { get; set; }
    public string Value { get; set; }
    public DateTimeOffset Date { get; set; }
    public int AnInt { get; set; }
}
Then the following code creates a new Thing and inserts it into the DB, then gets it back and updates it using just one Set command:
Thing thing = new Thing { AnInt = 12, Date = new DateTimeOffset(DateTime.Now), Value = "Foo", Id = 1 };

gc.Cypher
    .Create("(n:Test {thingParam})")
    .WithParam("thingParam", thing)
    .ExecuteWithoutResults();

var thingRes = gc.Cypher.Match("(n:Test)").Where((Thing n) => n.Id == 1).Return(n => n.As<Thing>()).Results.Single();
Console.WriteLine("Found: {0},{1},{2},{3}", thingRes.Id, thingRes.Value, thingRes.AnInt, thingRes.Date);

thingRes.AnInt += 100;
thingRes.Value = "Bar";
thingRes.Date = thingRes.Date.AddMonths(1);

gc.Cypher
    .Match("(n:Test)")
    .Where((Thing n) => n.Id == 1)
    .Set("n = {thingParam}")
    .WithParam("thingParam", thingRes)
    .ExecuteWithoutResults();

var thingRes2 = gc.Cypher.Match("(n:Test)").Where((Thing n) => n.Id == 1).Return(n => n.As<Thing>()).Results.Single();
Console.WriteLine("Found: {0},{1},{2},{3}", thingRes2.Id, thingRes2.Value, thingRes2.AnInt, thingRes2.Date);
Which gives:
Found: 1,Foo,12,2014-03-27 15:37:49 +00:00
Found: 1,Bar,112,2014-04-27 15:37:49 +00:00
All properties nicely updated!
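Since the question asks about making this work for different object types, the same "n = {param}" pattern can be wrapped in a small generic helper. This is only a sketch: the method name and the assumption that every type exposes an int Id property are inventions for the illustration.
public static void ReplaceNodeProperties<T>(IGraphClient gc, string label, int id, T newValues)
    where T : class
{
    gc.Cypher
        .Match("(n:" + label + ")")
        .Where("n.Id = {id}")
        .WithParam("id", id)
        // "n = {newValues}" replaces all properties of the node with those of newValues
        .Set("n = {newValues}")
        .WithParam("newValues", newValues)
        .ExecuteWithoutResults();
}

// Usage: ReplaceNodeProperties(gc, "Test", 1, thingRes);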
