DynamoDB C# expression to query/scan

I have a generic repository structure in place using interfaces and I am trying to develop a DynamoDB implementation for this. This is my first experience with DynamoDB and NoSQL (previously all T-SQL).
The problem I am having is that I am unable to find any way of converting a lambda expression in C# to a format that I can use to query/scan DynamoDB.
My Get method looks like this:
public async Task<TEntity> GetAsync(Expression<Func<TEntity, bool>> where)
{
    return await this.DataContext.ScanAsync(...);
}
Is there an existing way to do this? There doesn't seem to be anything in the documentation that addresses this and I am struggling to find an example of where someone else has had a similar problem.
Maybe my lack of experience with NoSQL and/or DynamoDB is the problem here. If so, please do point out a better approach; note, though, that as previously mentioned I am implementing an interface which is already defined, and changing it isn't really an option.

As far as I know, you can use ServiceStack.Aws (PocoDynamo), which offers a typed, LINQ-like API.
For example:
using System;
using Amazon;
using Amazon.DynamoDBv2;
using ServiceStack;
using ServiceStack.Text;
using ServiceStack.Aws.DynamoDb;
using ServiceStack.DataAnnotations;

public class Todo
{
    [AutoIncrement]
    public long Id { get; set; }
    public string Content { get; set; }
    public int Order { get; set; }
    public bool Done { get; set; }
}

var awsDb = new AmazonDynamoDBClient("keyId", "key",
    new AmazonDynamoDBConfig { ServiceURL = "http://localhost:8000" });
var db = new PocoDynamo(awsDb);

db.RegisterTable<Todo>();
db.DeleteTable<Todo>(); // Delete existing Todo Table (if any)
db.InitSchema();        // Creates Todo DynamoDB Table

var newTodo = new Todo {
    Content = "Learn PocoDynamo",
    Order = 1
};
db.PutItem(newTodo);

var savedTodo = db.GetItem<Todo>(newTodo.Id);
"Saved Todo: {0}".Print(savedTodo.Dump());

savedTodo.Done = true;
db.PutItem(savedTodo);

var updatedTodo = db.GetItem<Todo>(newTodo.Id);
"Updated Todo: {0}".Print(updatedTodo.Dump());

db.DeleteItem<Todo>(newTodo.Id);
var remainingTodos = db.GetAll<Todo>();
"No more Todos: {0}".Print(remainingTodos.Dump());

Related

Using a Database Context in multiple projects within the same solution

I'm currently working on a program that is used to generate PDFs and documents. There are two different use cases: an automated process, and a manual process where data can be edited via a front-end app.
The solution has 2 projects in it, the first for the automated part and the second for the manual part.
Since the two processes make use of the same data and templates, splitting the solution this way allows me to maintain the models/templates only once.
My database context looks like this:
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace RefundTracker.Models
{
    public class DatabaseContext : DbContext
    {
        public DatabaseContext()
            : base("Prod")
        {
        }

        public DbSet<Referral> Referrals { set; get; }
        public DbSet<ReferralAppointment> ReferralAppointments { set; get; }
        public DbSet<ReferralPayment> ReferralPayments { set; get; }
        public DbSet<BankDetails> BankDetails { set; get; }
        public DbSet<ReferralAppointment_History> ReferralAppointment_History { set; get; }
        public DbSet<ReferralPayment_History> ReferralPayment_History { set; get; }
        public DbSet<IsInUse> IsInUse { set; get; }
    }
}
In terms of projects, I have a project called "RefundTracker" and another called "MailMergeTPA".
The context provided above, together with all of the models, is located in the "RefundTracker" project.
I would like to make use of these models and the context in the "MailMergeTPA" project as well.
I referenced "RefundTracker" from the "MailMergeTPA" project; however, I get no results when using the context there. (When I call a function that gets a list of names, for instance, I get the full list in "RefundTracker", but I get no results when I call the same function from "MailMergeTPA".)
Code Example:
public BankDetails GetBankDetails(Referral record)
{
    string bName = record.bankName.Trim();
    try
    {
        BankDetails bankDetails = null;
        using (DatabaseContext db = new DatabaseContext())
        {
            bankDetails = db.BankDetails.SingleOrDefault(a => a.BankName == bName);
        }
        return bankDetails;
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.ToString());
        return null;
    }
}
I would like to make use of this exact function in both projects.
Could you kindly help me with some advice? (Please go easy on me in the comments, I'm still fairly new to EF)
I've tried referencing the project, no result.
I've read up on interfaces; however, I'm unsure how I would incorporate them.
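A likely cause, offered here as an assumption rather than something stated in the thread: EF resolves the "Prod" connection string from the configuration file of the project that is actually running, so "MailMergeTPA" needs its own connectionStrings entry named "Prod". Without it, EF falls back to its default convention and connects to a new, empty local database, which would explain the empty results. A quick way to check which database the context really uses:

using (var db = new DatabaseContext())
{
    // If this prints a LocalDB/SQL Express connection named after the context,
    // the "Prod" connection string was not found in this project's config file.
    Console.WriteLine(db.Database.Connection.ConnectionString);
}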

C# getting MongoDB BsonDocument to return as JSON

I asked a question a couple of days ago to collect data from MongoDB as a tree.
MongoDB create an array within an array
I am a newbie to MongoDB, but have used JSON quite substantially. I thought using MongoDB to store my JSON would be a great benefit, but I am just experiencing immense frustration.
I am using .NET 4.5.2
I have tried a number of ways to return the output from my aggregate query to my page.
public JsonResult GetFolders()
{
    IMongoCollection<BsonDocument> collection = database.GetCollection<BsonDocument>("DataStore");
    PipelineDefinition<BsonDocument, BsonDocument> treeDocs = new BsonDocument[]
    {
        // my query, which is a series of new BsonDocument
    };
    var documentGroup = collection.Aggregate(treeDocs).ToList();
    // Here, I have tried to add it to a JsonResult Data,
    // as both documentGroup alone and documentGroup.ToJson()
    // Also, loop through and add it to a List and return as a JsonResult
    // Also, attempted to serialise, and even change the JsonWriterSettings.
}
When I look in the Immediate Window at documentGroup, it looks exactly like Json, but when I send to browser, it is an escaped string, with \" surrounding all my keys and values.
I have attempted to create a model...
public class FolderTree
{
    public string id { get; set; }
    public string text { get; set; }
    public List<FolderTree> children { get; set; }
}
then loop through the documentGroup
foreach (var docItem in documentGroup)
{
    myDocs.Add(BsonSerializer.Deserialize<FolderTree>(docItem));
}
but Bson complains that it cannot convert int to string. (I have to have text and id as a string, as some of the items are strings)
How do I get my MongoDB data output as Json, and delivered to my browser as Json?
Thanks for your assistance.
========= EDIT ===========
I have attempted to follow this answer as suggested by Yong Shun below, https://stackoverflow.com/a/43220477/4541217 but this failed.
I had issues because the "id" was not present all the way through the tree, so I changed the folder tree to be...
public class FolderTree
{
    //[BsonSerializer(typeof(FolderTreeObjectTypeSerializer))]
    //public string id { get; set; }
    [BsonSerializer(typeof(FolderTreeObjectTypeSerializer))]
    public string text { get; set; }
    public List<FolderTreeChildren> children { get; set; }
}

public class FolderTreeChildren
{
    [BsonSerializer(typeof(FolderTreeObjectTypeSerializer))]
    public string text { get; set; }
    public List<FolderTreeChildren> children { get; set; }
}
Now, when I look at documentGroup, I see...
[0]: {Plugins.Models.FolderTree}
[1]: {Plugins.Models.FolderTree}
To be fair to sbc in the comments, I have made so many changes to get this to work, that I can't remember the code I had that generated it.
Because I could not send it directly, my JSON result was handled as...
JsonResult json = new JsonResult();
json.Data = documentGroup;
//json.Data = JsonConvert.SerializeObject(documentGroup);
json.JsonRequestBehavior = JsonRequestBehavior.AllowGet;
return json;
Note that I also tried to send it as...
json.Data = documentGroup.ToJson();
json.Data = documentGroup.ToList();
json.Data = documentGroup.ToString();
all with varying failures.
If I leave as documentGroup, I get {Current: null, WasFirstBatchEmpty: false, PostBatchResumeToken: null}
If I do .ToJson(), I get "{ \"_t\" : \"AsyncCursor`1\" }"
If I do .ToList(), I get what looks like Json in json.Data, but get an error of Unable to cast object of type 'MongoDB.Bson.BsonInt32' to type 'MongoDB.Bson.BsonBoolean'.
If I do .ToString(), I get "MongoDB.Driver.Core.Operations.AsyncCursor`1[MongoDB.Bson.BsonDocument]"
=========== EDIT 2 =================
As this way of extracting the data from MongoDB doesn't want to work, how else can I make it work?
I am using C# MVC4. (.NET 4.5.2)
I need to deliver json to the browser, hence why I am using a JsonResult return type.
I need to use an aggregate to collect from MongoDB in the format I need it.
My Newtonsoft.Json version is 11.0.2
My MongoDB.Driver is version 2.11.1
My method is the simplest it can be.
What am I missing?
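One approach worth trying, offered as a sketch rather than a confirmed fix: serialize the aggregation result to a JSON string with the driver's ToJson() and return it as raw content, so MVC does not re-serialize (and escape) a string that is already JSON. Note the action's return type then becomes ActionResult/ContentResult instead of JsonResult:

var documentGroup = collection.Aggregate(treeDocs).ToList();
// ToJson() on the list of BsonDocuments produces a JSON array string;
// Content() returns it verbatim instead of serializing it a second time.
string json = documentGroup.ToJson();
return Content(json, "application/json");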

How to serialize a .NET class to Avro.Generic.GenericRecord for publishing to a Kafka topic?

I am trying to find a way/helper to convert a .NET class to Avro.Generic.GenericRecord. Currently, I am manually adding field names and values to the generic record. Is there a serializer/converter which I can use to convert the object to a generic record and publish it onto a Kafka topic?
class Plant
{
    public long Id { get; set; }
    public string Name { get; set; }
    public List<PlantProperties> PlantProperties { get; set; }
}

class PlantProperties
{
    public long Leaves { get; set; }
    public string Color { get; set; }
}
Please suggest.
Assuming you are using the Confluent Schema Registry, you can use their .NET client [1]
https://github.com/confluentinc/confluent-kafka-dotnet
Copied from the examples folder
// Note: avroConfig, producerConfig, topicName, cts (a CancellationTokenSource)
// and s (the Avro RecordSchema for the value) are defined earlier in the full example.
using (var serdeProvider = new AvroSerdeProvider(avroConfig))
using (var producer = new Producer<string, GenericRecord>(producerConfig,
    serdeProvider.GetSerializerGenerator<string>(),
    serdeProvider.GetSerializerGenerator<GenericRecord>()))
{
    Console.WriteLine($"{producer.Name} producing on {topicName}. Enter user names, q to exit.");
    int i = 0;
    string text;
    while ((text = Console.ReadLine()) != "q")
    {
        var record = new GenericRecord(s);
        record.Add("name", text);
        record.Add("favorite_number", i++);
        record.Add("favorite_color", "blue");
        producer
            .ProduceAsync(topicName, new Message<string, GenericRecord> { Key = text, Value = record })
            .ContinueWith(task => task.IsFaulted
                ? $"error producing message: {task.Exception.Message}"
                : $"produced to: {task.Result.TopicPartitionOffset}");
    }
}
cts.Cancel();
In your case, update the record.Add calls accordingly.
However, since you already have a class, you should try to use SpecificRecord rather than converting back and forth between your .NET class and a GenericRecord. See the README section on the AvroGen tool for examples of this.
[1] I'm not aware of an alternative .NET library.
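For the SpecificRecord route, the Avro code generator takes an .avsc schema file and emits C# classes that implement ISpecificRecord, which the Avro serializer then handles directly. Roughly (tool name and flags from memory; check the README for the exact invocation):

avrogen -s Plant.avsc .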
Below are the steps I took to solve the problem using the suggestion from @cricket_007.
To avoid the complexity of writing the Avro schema by hand, create the C# classes first, then use AvroSerializer to generate the schema:
AvroSerializer.Create<MYClass>().WriterSchema.ToString()
This will generate the schema JSON for the class.
Move it to a schema file and
make all the types allow nulls where required.
Then use the avro_gen.exe tool to regenerate the class files, which implement ISpecificRecord.
And use the code below to publish to the topic:
using (var serdeProvider = new AvroSerdeProvider(avroConfig))
using (var producer = new Producer<string, MYClass>(producerConfig,
    serdeProvider.GetSerializerGenerator<string>(),
    serdeProvider.GetSerializerGenerator<MYClass>()))
{
    Console.WriteLine($"{producer.Name} producing on {_appSettings.PullListKafka.Topic}.");
    producer.ProduceAsync(_appSettings.PullListKafka.Topic,
        new Message<string, MYClass> { Key = Guid.NewGuid().ToString(), Value = MYClassObject })
        .ContinueWith(task => task.IsFaulted
            ? $"error producing message: {task.Exception.Message}"
            : $"produced to: {task.Result.TopicPartitionOffset}");
}
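As an illustration of the "allow nulls" step above, a nullable field in the .avsc schema typically ends up as a union with null, for example (the field name here is just a placeholder):

{ "name": "Name", "type": ["null", "string"], "default": null }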
Some links to help with this:
https://shanidgafur.github.io/blog/apache-avro-on-dotnet
https://github.com/SidShetye/HelloAvro/tree/master/Avro

How to use Dapper with Linq

I'm trying to convert from Entity Framework to Dapper to hopefully improve data access performance.
The queries I use are in the form of predicates, like so: Expression<Func<TModel, bool>>.
To give an example:
I have the following code which I need to convert to using Dapper.
What I currently do:
public async Task<List<TModel>> Get(Expression<Func<TModel, bool>> query)
{
    // this.Context is of type DbContext
    return await this.Context.Set<TModel>().Where(query).ToListAsync();
}
What I'd like to do:
public async Task<List<TModel>> Get(Expression<Func<TModel, bool>> query)
{
    using (IDbConnection cn = this.GetConnection)
    {
        return await cn.QueryAsync<TModel>(query);
    }
}
My google-fu is failing me; can someone please assist?
Edit:
Note that I did find:
https://github.com/ryanwatson/Dapper.Extensions.Linq
but I can't seem to figure out how to use it.
Firstly, one of the authors of Dapper, when someone asked
"Is there a plan to make Dapper.net compatible with IQueryable interfaces?"
replied that
"there are no plans to do this. It is far far outside what dapper tries to do. So far that I would say it is antithetical. Dapper core tries to be the friend to those who love their SQL."
(see https://stackoverflow.com/a/27588877/3813189).
In a way, that does suggest that the various extension packages on NuGet may help, as you have suggested.
I have tried DapperExtensions, which makes writing the query filters in a programmatic way a little easier - e.g.:
using System.Data.SqlClient;
using DapperExtensions;

namespace StackOverflowAnswer
{
    class Program
    {
        static void Main(string[] args)
        {
            using (var cn = new SqlConnection("Server=.;Database=NORTHWND;Trusted_Connection=True;"))
            {
                var list = cn.GetList<Products>(
                    Predicates.Field<Products>(f => f.Discontinued, Operator.Eq, false)
                );
            }
        }

        class Products
        {
            public int ProductId { get; set; }
            public string ProductName { get; set; }
            public bool Discontinued { get; set; }
        }
    }
}
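As a side note, not from the original answer: multiple conditions can be combined with a predicate group, roughly like this (API names from memory, so verify against the DapperExtensions source/tests):

// Sketch: AND two field predicates together before passing them to GetList
var filter = Predicates.Group(GroupOperator.And,
    Predicates.Field<Products>(f => f.Discontinued, Operator.Eq, false),
    Predicates.Field<Products>(f => f.ProductName, Operator.Like, "Ch%"));
var list = cn.GetList<Products>(filter);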
I also tried Dapper.Extensions.Linq (the package you suggested), which promises that it
"builds on this providing advanced DB access through Linq queries. The fluid configuration makes setup simplistic and quick."
Unfortunately, I also couldn't get very far with it. There isn't much documentation, and the tests don't seem to cover the QueryBuilder, which appears to be the class to use to translate Linq expressions into the Dapper Extensions predicates (as suggested by the issue "Parsing boolean expressions with the QueryBuilder"). I tried the following, which required adding the IEntity interface to my DTO -
using System;
using System.Data.SqlClient;
using System.Linq.Expressions;
using Dapper.Extensions.Linq.Builder;
using Dapper.Extensions.Linq.Core;
using DapperExtensions;

namespace StackOverflowAnswer
{
    class Program
    {
        static void Main(string[] args)
        {
            using (var cn = new SqlConnection("Server=.;Database=NORTHWND;Trusted_Connection=True;"))
            {
                Expression<Func<Products, bool>> filter = p => !p.Discontinued;
                var queryFilter = QueryBuilder<Products>.FromExpression(filter);
                var list = cn.GetList<Products>(
                    queryFilter
                );
            }
        }

        class Products : IEntity
        {
            public int ProductId { get; set; }
            public string ProductName { get; set; }
            public bool Discontinued { get; set; }
        }
    }
}
... but it failed at runtime with the error
Operator was not found for StackOverflowAnswer.Program+Products
I'm not sure why generating the Predicate manually (the first example) works but the QueryBuilder doesn't.
I would say that it's increasingly looking like the comments left on your question are correct, that you will need to re-work your code away from the expressions that you used with Entity Framework. Since it's been so difficult to find any information about this QueryBuilder class, I would be concerned that (even if you did get it working) any issues that you encountered would be difficult to get help for (and bugs may go unfixed).
I wrote a utility to use EF-style attributes with Dapper. It parses the predicate and translates it to SQL.
"Users" POCO:
[Table("Users")]
public class User
{
[Key]
[Identity]
public int Id { get; set; }
public string Login { get; set;}
[Column("FName")]
public string FirstName { get; set; }
[Column("LName")]
public string LastName { get; set; }
public string Email { get; set; }
[NotMapped]
public string FullName
{
get
{
return string.Format("{0} {1}", FirstName, LastName);
}
}
}
And a simple query:
using (var cn = new SqlConnection("..."))
{
    var usersRepository = new DapperRepository<User>(cn);
    var allUsers = await usersRepository.FindAllAsync(x => x.AccountId == 3 && x.Status != UserStatus.Deleted);
}
Maybe it will be useful to you?
MicroOrm.Dapper.Repositories

Design considerations for an RSS client

I need to develop an RSS client using C#, and I wonder how an RSS client stores what the user has or hasn't read.
The simple answer is to store all the feed items of each URL and mark whether the user has read each one.
So I need to know how other RSS clients manage the read/unread state of feed items: do they store all the items from all the URLs or not?
Also, I need to know if there is any .NET library for a client using the PubSubHubbub protocol.
For example:
If I subscribe to the CNN feed, the application will load the current CNN items and I mark them as read. A while later, when I open the client, I should find all the items that I read marked as read.
So this means that the client will store - for example in its database - all the links of the CNN feed and save for each link its status, whether it is read or not.
My question is: is there another way to track whether items have been read, instead of saving all the items of all the sites in the DB, which will lead to a huge database?
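One way to keep the store from growing huge, offered here as a sketch rather than an established pattern: persist only an identifier per read item (RSS items carry a guid or link) keyed by feed URL, instead of the full article content. For example:

using System.Collections.Generic;

// Sketch: track read state by item id only, so storage grows with the number of
// read items rather than with full article bodies.
public class ReadState
{
    private readonly Dictionary<string, HashSet<string>> readIdsByFeed =
        new Dictionary<string, HashSet<string>>();

    public bool IsRead(string feedUrl, string itemId)
    {
        HashSet<string> ids;
        return readIdsByFeed.TryGetValue(feedUrl, out ids) && ids.Contains(itemId);
    }

    public void MarkRead(string feedUrl, string itemId)
    {
        HashSet<string> ids;
        if (!readIdsByFeed.TryGetValue(feedUrl, out ids))
        {
            ids = new HashSet<string>();
            readIdsByFeed[feedUrl] = ids;
        }
        ids.Add(itemId);
    }
}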
Welcome to StackOverflow :D
Representing RSS feeds
You could use the following types to represent feeds and articles:
using System;
using System.Collections.Generic;
using System.Linq;

public abstract class RssItem
{
    public virtual bool IsRead { get; set; }
    public string Name { get; set; }
    public string Url { get; set; }
}

public class RssFeed : RssItem
{
    public List<RssFeedArticle> Articles { get; set; }

    public override bool IsRead
    {
        get { return Articles.All(s => s.IsRead); }
        set { Articles.ForEach(s => s.IsRead = true); }
    }
}

public class RssFeedArticle : RssItem
{
    public string Content { get; set; }
}
This is really a simple representation, feel free to enhance it.
Basically, when you set feed.IsRead = true; all articles will be marked as read; if you query the value, it will return true only if all the articles have been read.
Example:
var article1 = new RssFeedArticle { Name = "article1", Content = "content1" };
var article2 = new RssFeedArticle { Name = "article2", Content = "content2" };
var feed = new RssFeed
{
    Name = "cool feed",
    Articles = new List<RssFeedArticle>(new[]
    {
        article1,
        article2
    })
};
article1.IsRead = true;
feed.IsRead = true;
Storing your data
A common approach is to store your application data in the ApplicationData folder or in My Documents.
The advantage of using My Documents is that the user will generally back up this folder, which is not necessarily the case for ApplicationData, which novice users probably don't even know exists.
Example of retrieving your application folder:
using System;
using System.IO;

private void Test()
{
    string applicationFolder = GetApplicationFolder();
}

private static string GetApplicationFolder()
{
    var applicationName = "MyCoolRssReader";
    string folderPath = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);
    string applicationFolder = Path.Combine(folderPath, applicationName);
    bool exists = Directory.Exists(applicationFolder);
    if (!exists)
    {
        Directory.CreateDirectory(applicationFolder);
    }
    return applicationFolder;
}
If you prefer My Documents instead:
string folderPath = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
Here's some explanation/advice from a Microsoft developer:
http://blogs.msdn.com/b/patricka/archive/2010/03/18/where-should-i-store-my-data-and-configuration-files-if-i-target-multiple-os-versions.aspx
PubSubHubbub
There's a library for C#: https://code.google.com/p/pubsubhubbub-publisherclient-csharp/
(from https://code.google.com/p/pubsubhubbub/wiki/PublisherClients)
If you are satisfied with my answer then do not forget to accept it; if you still have questions, update your question and I'll try to address them.
