I'm writing an application where the user can write JSON code and store that JSON with an Id and a Collection. In other words, they specify an Id, a Collection (a string matching [a-zA-Z0-9]) and Data (JSON; it can be anything that is valid JSON).
Up until now I've been using RavenDB for this, because I thought a document database would be perfect, but I've had some problems with querying.
One of the objects that needs to be stored and queried is the following:
{
  "network": "some_network",
  "names": ["name1", "name2"],
  "data": { "values": [], "keys": [] }
}
This object should be stored with an Id that is either specified or auto-generated (if null is given) and a Collection (which must always be specified), and then I need to be able to query it based on Collection, network and a single name.
For instance, given the call query('users', '{"network":"some_network","names":"name1"}'), I need that code to return this object (and any other object that matches it).
Also, I'm OK with changing databases, but the database needs to be able to run in-process (self-hosted) and without admin rights or installation (in other words, it can't bind to a hostname/IP the way WCF does).
How can I achieve something like this?
I found the answer to this:
public string Query(string dynIndexName, string query)
{
    using (var session = store.OpenSession())
    {
        // Run a raw Lucene query against a dynamic index for the given collection
        var q = new IndexQuery();
        q.Query = query;
        var result = session.Advanced.DatabaseCommands.Query("dynamic/" + dynIndexName, q, new string[0]);

        // Concatenate the matching documents into a JSON array string
        return "[" + String.Join(",", result.Results.Select(r => r.ToString())) + "]";
    }
}
Before calling the Query method I convert the JSON query object into a Lucene query that looks like this: (network:"some_network" AND names:"name1"), and use that as the query parameter to the database. The whole class for storing and retrieving can be found here: https://github.com/Alxandr/RunJS/blob/master/src/AddIns/Storage/StorageMbro.cs
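For reference, here is a minimal sketch of that JSON-to-Lucene conversion, assuming Json.NET and single-valued terms; the ToLuceneQuery name is mine, not from the linked class, and it skips escaping of special Lucene characters:

using System.Linq;
using Newtonsoft.Json.Linq;

static string ToLuceneQuery(string jsonQuery)
{
    // {"network":"some_network","names":"name1"} -> (network:"some_network" AND names:"name1")
    var obj = JObject.Parse(jsonQuery);
    var terms = obj.Properties()
        .Select(p => p.Name + ":\"" + p.Value.ToString() + "\"");
    return "(" + string.Join(" AND ", terms) + ")";
}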
My task is to create an ASP.NET API in which:
User posts XML to my APi endpoint
API Passes the whole of the XML directly to a stored procedure
Stored procedure returns some XML back to the API for the user
Example call to api:
https://localhost:44308/api/OrderProcessing?<?xml version="1.0" encoding="utf-8"?><Param1>abc123</Param1><Param2>5</Param2><Param3>123456</Param3>etc.. etc
I've found many examples of calling a procedure from asp.net and all of them show the XML nodes being parsed into parameters to pass to the stored procedure:
usp_myproc @param1 = abc123, @param2 = 5, @param3 = 123456 etc.
I can't find out how to pass the XML as a single string, as shown below, e.g.
usp_myproc @param1 = '<?xml version="1.0" encoding="utf-8"?><Param1>abc123</Param1><Param2>5</Param2><Param3>123456</Param3>etc.. etc'
So far I have the following code, which of course just passes the first parameter, e.g. usp_myproc @param1, not the whole string:
namespace OrderProcessingApp.Controllers
{
    public class OrderProcessingController : ApiController
    {
        [HttpGet]
        public IHttpActionResult StockOrder(string param1)
        {
            StockFinderEntities sd = new StockFinderEntities();
            var results = sd.fn_ProcessOrder(param1).ToList();
            return Ok(results);
        }
    }
}
Any help very gratefully received
There's a lot to unpack here and quite a bit of information is missing; see also my comments on your question. There are answerable parts in your question, though, but things are far from optimal.
If I get this straight, you want this:
An API endpoint that accepts GET requests with the query string consisting entirely of an XML string.
In this action method, pass the XML string to a SQL user defined function.
This function parses the XML, does its thing, and returns zero or more records.
These records are then returned as a list to the caller.
So first things first: you've got to read that XML. Your query string appears to have no keys (it starts directly with ?<xml...>). Is this intentional, or an oversight in creating this question? Are you really sure you want to read XML from the query string, or can you change this design?
If you can change it, the idiomatic way would be to use query string parameters, which are then bound to your action method parameters:
// GET .../StockOrder?param1=42&param2=Foo&param3=Bar
public IHttpActionResult StockOrder(string param1, string param2, string param3) { ... }
If instead you want to pass the XML as a single named query string parameter:
// GET .../StockOrder?xmlString=<xml...>
public IHttpActionResult StockOrder(string xmlString) { ... }
But if you really do want to read the raw query string as XML (with no key at all), then you need no parameters at all:
// GET .../StockOrder?<xml...>
public IHttpActionResult StockOrder()
{
    string xmlString = this.Request.RequestUri.Query;
    ...
}
So now you have the XML string, and you'll pass it to a user-defined function in your database. Since Entity Framework will pack it up in a SQL parameter, at least you'll be safe from SQL injection here:
// GET .../StockOrder?<xml...>
public IHttpActionResult StockOrder()
{
    string xmlString = this.Request.RequestUri.Query;
    if (xmlString?.Length > 0)
    {
        // Remove the question mark from the start
        xmlString = xmlString.Substring(1);
    }
    var results = sd.fn_ProcessOrder(xmlString).ToList();
    ...
}
But this is still far from optimal! The caller could pass invalid XML, which would only be detected by SQL Server, which is way too late in the process. Or worse, it could be a specially crafted XML that SQL Server would choke on (like an XML bomb), or worse (executing arbitrary code contained within the XML). So I would not be a fan of sending user input in the form of XML directly to a SQL Server, no.
The optimal way would look like this:
public class GetStockOrderModel
{
    [Required]
    public string Param1 { get; set; }

    [Required]
    public string Param2 { get; set; }

    [Required]
    public string Param3 { get; set; }
}
// GET .../StockOrder?param1=42&param2=Foo&param3=Bar
public IHttpActionResult StockOrder([FromUri] GetStockOrderModel model)
{
    if (!ModelState.IsValid)
    {
        // ...
    }

    var records = sd.fn_ProcessOrder(model.Param1, model.Param2, model.Param3).ToList();

    var models = records.Select(o => new
    {
        FieldToReturn = o.Field1,
        OtherField = o.Field2,
        // ...
    }).ToList();

    return Ok(models);
}
Here you have a model to bind to and perform validation on, the database function is altered to accept actual parameters instead of XML, and a model decoupled from the database is returned.
Now if you say you cannot alter the function because other applications call it, you might still be able to. The function you call consists of two parts: one that reads the parameters from the XML, and one that executes the query using those parameters.
So you keep the original function and let it call the new one once it has extracted the parameters. You can then also call that new function directly, using the parameters you have in code.
And really, you don't want to pass user XML to your database server. Sure, it'll only be called by trusted parties, you'll say. But is everyone at those parties to be trusted? And don't they ever make mistakes?
So if you really don't want to change anything about the endpoint or the function, and you're going to use that fourth (or second) code block, at least read the XML string in your controller, parse it, extract the parameters, validate them, and rebuild the XML from a safe template string, or something like that; see the sketch below.
Because to ASP.NET that XML is just a plain old string, so you get none of its validation or security measures.
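A minimal sketch of that last approach, assuming the Param1..Param3 element names from the example URL; the wrapping root element, the BadRequest messages and everything else here are assumptions, not your actual schema:

// requires: using System; using System.Xml.Linq;
// GET .../StockOrder?<xml...>
public IHttpActionResult StockOrder()
{
    // Raw query string, minus the leading '?' and URL encoding
    string raw = Uri.UnescapeDataString(this.Request.RequestUri.Query.TrimStart('?'));

    // The example URL starts with an XML declaration and has multiple top-level
    // elements, so drop the declaration and wrap the rest in a single root
    int declarationEnd = raw.IndexOf("?>");
    if (declarationEnd >= 0)
    {
        raw = raw.Substring(declarationEnd + 2);
    }

    XElement parsed;
    try
    {
        // Parse here, so malformed XML is rejected before it ever reaches SQL Server
        parsed = XElement.Parse("<Root>" + raw + "</Root>");
    }
    catch (System.Xml.XmlException)
    {
        return BadRequest("The query string is not valid XML.");
    }

    string param1 = (string)parsed.Element("Param1");
    string param2 = (string)parsed.Element("Param2");
    string param3 = (string)parsed.Element("Param3");
    if (string.IsNullOrEmpty(param1) || string.IsNullOrEmpty(param2) || string.IsNullOrEmpty(param3))
    {
        return BadRequest("One or more required parameters are missing.");
    }

    // Rebuild the XML from a safe template instead of forwarding the raw input
    string safeXml = new XElement("Root",
        new XElement("Param1", param1),
        new XElement("Param2", param2),
        new XElement("Param3", param3)).ToString();

    var sd = new StockFinderEntities();
    var results = sd.fn_ProcessOrder(safeXml).ToList();
    return Ok(results);
}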
Question
We have a DapperRow as the result of a Dapper query, which I want to store as a JSON string in our database. Unfortunately, I can't seem to get that to work.
So, let's start with some background information.
Background info
We're doing a project where we extract Table Names from a table, to know which tables we have to address. We also want this to be as flexible as possible, so we decided to not use a particular POCO for our data.
We're using SQL Server 2014, so unfortunately we don't have the option 'FOR JSON' yet.
Code
Our code looks something like this, where GetData is our actual query:
var data = _queryHandler.Handle(new GetData(tableName.ToString(), 0), database);
The Handle technically just connects to the Database, calling
conn.QueryAsync(query, parameters)
GetData looks like this (simplified):
EXEC ('SELECT * FROM ' + @table)
Reasoning
Because the table name differs each time, we don't want to force a POCO on the output. Sometimes it's a user, other times a role, so to speak, so there's no predicting what output it returns.
Results
This works fine. We can extract the data into our variable, and it looks like an IEnumerable, which should be fine. I gather we can just read it in a loop and extract the rows. So far, no problem.
The issue at hand
The next thing we want to do is convert the data from said DapperRow to a JSON string, but I cannot seem to get the data to behave like a JSON string, as JsonConvert.SerializeObject fails miserably. The DapperRow looks like this (again, simplified):
{{DapperRow, Id = '07501399-b385-4d8e-bacc-gad9d04c35f7', UserName = 'test8', ApplicationId = '4721fafb-12e6-4e3c-9298-etd82d18a0cb', IsApproved = 'True', IsLockedOut = 'False', CreateDate = '26-3-2019 07:52:55' }}
I've already looked into things like SqlMapper.ITypeHandler, but I'm still not getting there. For instance, with the TypeHandler I get stuck on the destinationType, as I don't want a particular type - well, except for a JSON object, but that's not an accepted type.
public class JsonObjectTypeHandler : SqlMapper.ITypeHandler
{
    public void SetValue(IDbDataParameter parameter, object value)
    {
        parameter.Value = (value == null)
            ? (object)DBNull.Value
            : JsonConvert.SerializeObject(value);
        parameter.DbType = DbType.String;
    }

    public object Parse(Type destinationType, object value)
    {
        return JsonConvert.DeserializeObject(value.ToString(), destinationType);
    }
}
The only other thing that crosses my mind is to extract each and every column and build a type object out of it, but as I said, we don't want to use a model/type for the data, as we want to keep it flexible.
Could anyone point me in the right direction? I have the feeling I seem to overlook something simple.
If you are using the non-typed Query API, each returned row is also an IDictionary<string,object> (in addition to the dynamic API), which usually works fine with JsonConvert; for example, the following works OK for me:
var tables = (from row in await conn.QueryAsync("select top 5 * from sys.tables")
select (IDictionary<string, object>)row).AsList();
var json = JsonConvert.SerializeObject(tables, Formatting.Indented);
System.Console.WriteLine(json);
outputting:
[
  {
    "name": "spt_fallback_db",
    "object_id": 117575457,
    "principal_id": null,
    "schema_id": 1,
    "parent_object_id": 0,
    "type": "U ",
    "type_desc": "USER_TABLE",
    "create_date": "2003-04-08T09:18:01.557",
    "modify_date": "2017-08-22T19:40:40.763",
    "is_ms_shipped": true,
    "is_published": false,
    "is_schema_published": false,
    "lob_data_space_id": 0,
    "filestream_data_space_id": null,
    "max_column_id_used": 8,
    ... etc
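The same cast works if you only have a single row to serialize; a minimal sketch, assuming the same non-typed QueryAsync usage as above:

// Serialize one DapperRow by casting it to IDictionary<string, object> first
var row = (await conn.QueryAsync("select top 1 * from sys.tables")).First();
var json = JsonConvert.SerializeObject((IDictionary<string, object>)row, Formatting.Indented);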
When you deal with JSON, it's really easy to make C# models: you either use Paste Special in Visual Studio, or you use one of the many tools available online.
ElasticSearch responses are obviously JSON, meaning that if you can get the responding JSON, you're good to go. However, if you just have a connection string and simply want to "map" all the ElasticSearch objects into your C# code - how do you do it?
My question:
Is there a way to see all fields/data in an ElasticSearch instance, and then easily get the JSON, so you can get strongly typed models?
You can query Elasticsearch for its mappings. The mapping will contain all the information you need to build the model in C# (but you will still have to build it by hand, I think). Example using the database from your previous question:
var settings = new ConnectionSettings(new Uri("http://distribution.virk.dk/cvr-permanent"));
var client = new ElasticClient(settings);

// get mappings for all indexes and types
var mappings = client.GetMapping<JObject>(c => c.AllIndices().AllTypes());

foreach (var indexMapping in mappings.Indices) {
    Console.WriteLine($"Index {indexMapping.Key.Name}"); // index name
    foreach (var typeMapping in indexMapping.Value.Mappings) {
        Console.WriteLine($"Type {typeMapping.Key.Name}"); // type name
        foreach (var property in typeMapping.Value.Properties) {
            // property name and type. There might be more useful info, check other properties of `typeMapping`
            Console.WriteLine(property.Key.Name + ": " + property.Value.Type);

            // some properties are themselves objects, so you need to go deeper
            var subProperties = (property.Value as ObjectProperty)?.Properties;
            if (subProperties != null) {
                // here you can build a recursive function to also get sub-properties
            }
        }
    }
}
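As a rough sketch of that recursive part, assuming the same NEST version as the snippet above (PrintProperties is a name I made up):

// Recursively print a property tree, including sub-properties of object fields
static void PrintProperties(IProperties properties, string indent = "")
{
    if (properties == null) return;
    foreach (var property in properties) {
        Console.WriteLine(indent + property.Key.Name + ": " + property.Value.Type);
        var subProperties = (property.Value as ObjectProperty)?.Properties;
        if (subProperties != null) {
            PrintProperties(subProperties, indent + "  ");
        }
    }
}

It could then be called from the inner loop above as PrintProperties(typeMapping.Value.Properties).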
I've looked around and cannot figure out how to get a different user by their user name; something tells me this is not possible - am I correct? I want the user to be able to use Game Center or Google Play and not just Facebook as an id. That means I can register anyone I want as a user and then find their friends (without having to use FB). Should I just make a separate table to store this info?
I have attached code below for what should work; can anyone give some insight into either my code or why I cannot get "bob"? I have tested this code against another table and I can get the rows.
The code falls through properly and I do get to the foreach, but there are no results to iterate through.
I have also tried "_User" but get
ArgumentException: Use the class-specific query properties for class _User
Parameter name: className
Any help here would be great. I would like to avoid making another table just for searching and doing relationships to other tables.
private void getParseObjectTest()
{
    var query = ParseObject.GetQuery("User").WhereEqualTo("username", "bob");
    query.FindAsync().ContinueWith(user =>
    {
        if (user.IsCanceled || user.IsFaulted)
        {
            Debug.Log("Can't get the user.... " + user.Exception.Message);
        }
        else
        {
            Debug.Log("Return was success time to iterate");
            IEnumerable<ParseObject> results = user.Result;
            foreach (var result in results)
            {
                string str = result.Get<string>("email");
                Debug.Log("email: " + str);
            }
        }
    });
}
In Unity you must use ParseUser.Query instead of ParseObject.Query when querying users.
https://parse.com/docs/unity_guide#users-querying
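Wired into the question's code, that would look roughly like this (a sketch, not tested against your app):

// Query users via ParseUser.Query instead of ParseObject.GetQuery("User")
var query = ParseUser.Query.WhereEqualTo("username", "bob");
query.FindAsync().ContinueWith(t =>
{
    if (t.IsCanceled || t.IsFaulted)
    {
        Debug.Log("Can't get the user.... " + t.Exception.Message);
    }
    else
    {
        foreach (ParseUser user in t.Result)
        {
            Debug.Log("email: " + user.Email);
        }
    }
});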
And if you can't do what wbdev said (because there is another bug triggered by using generic ParseQuery objects), you can create the query against the _User class directly with new ParseQuery<ParseObject>("_User").
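For example, substituting that into the method from the question (again just a sketch):

// Query the built-in _User class without going through ParseUser
var query = new ParseQuery<ParseObject>("_User").WhereEqualTo("username", "bob");
// FindAsync and the foreach over result.Get<string>("email") stay the same as in the question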
UPDATE 18 Sep 2013
It looks like there isn't an easy way to do this. I'm holding out for a solution that involves some extension to Entity Framework.
If you'd like to see these features in Entity Framework, vote for them on the user voice site, perhaps here and here
There are several similar questions on SO but I can't find a question new and similar enough to have the answer I'm looking for.
If this looks like information overload, jump down to In Summary.
Background
I'm writing a WebApi REST service to expose some pre-existing data through an OData endpoint. I'm using EntitySetController<TEntity, TKey> to do all the grunt work for me. As well as the standard OData parameters, which are routed and translated by the base class, I've added some custom parameters to allow specific functionality for my controller.
My database server is MS SQL Server, with a full text index on the [BigText] NVarChar(4000) column of the [SomeEntity] table.
I have one limitation, I must use a Code First model.
// Model POCO
public class SomeEntity
{
    public int Id { get; set; }
    public string BigText { get; set; }
}

// Simple Controller
public class SomeEntityController : EntitySetController<SomeEntity, int>
{
    private readonly SomeDbContext context = new SomeDbContext();

    public override IQueryable<SomeEntity> Get()
    {
        var parameters = Request.GetQueryNameValuePairs()
            .ToDictionary(p => p.Key, p => p.Value);

        if (parameters.ContainsKey("BigTextContains"))
        {
            var searchTerms = parameters["BigTextContains"];
            // return something special ...
        }

        return this.context.SomeEntities;
    }

    // ... The rest is omitted for brevity.
}
The Problem
How to implement the // return something special ... part of my example?
Obviously, the naive
return this.context.SomeEntities.Where(e =>
e.BigText.Contains(searchTerm));
is completely wrong; it composes to a WHERE clause like
[BigText] LIKE '%' + @searchTerm + '%'
This doesn't use Full Text Searching, so it doesn't support complex search terms and otherwise performs terribly.
This approach,
return this.context.SomeEntities.SqlQuery(
    "SELECT E.* FROM [dbo].[SomeEntity] E " +
    "JOIN CONTAINSTABLE([SomeEntity], [BigText], @searchTerm) FTS " +
    " ON FTS.[Key] = E.[Id]",
    new object[] { new SqlParameter("@searchTerm", searchTerm) })
    .AsQueryable();
looks promising; it actually uses Full Text Searching and is quite functional. However, you'll note that DbSqlQuery, the type returned from the SqlQuery function, does not implement IQueryable. Here it is coerced to the right return type with the AsQueryable() extension, but this breaks the "chain of composition". The only statement that will be performed on the server is the one specified in the code above. Any additional clauses specified on the OData URL will be serviced on the API hosting web server, without benefiting from the indices and specialised set-based functionality of the database engine.
In Summary
What is the most expedient way of accessing MS SQL Server's Full Text Search CONTAINSTABLE function with an Entity Framework 5 Code First model and acquiring a "composable" result?
Do I need to write my own IQueryProvider? Can I extend EF in some way?
I don't want to use Lucene.Net, I don't want to use a Database Generated Model. Perhaps I could add extra packages or wait for EF6, would that help?
It is not perfect, but you can accomplish what you are after with two calls to the database.
The first call retrieves a list of matching keys from CONTAINSTABLE, and the second call is your composable query, using the IDs returned from the first call as an initial filter.
//Get the Keys from the FTS
var ids = context.Database.SqlQuery<int>(
    "Select [KEY] from CONTAINSTABLE([SomeEntity], [BigText], @searchTerm)",
    new object[] { new SqlParameter("@searchTerm", searchTerm) });

//Use the IDs as an initial filter on the query
var composablequery = context.SomeEntities.Where(d => ids.Contains(d.Id));

//add on whatever other parameters were captured to the 'composablequery' variable
composablequery = composablequery.Where(.....)
I had this same issue recently:
EF 5 Code First FTS Queriable
Let me extend that post.
Your first option was my first as well - using SqlQuery.
I also needed to do more filtering, so instead of always writing full SQL I used a QueryBuilder, to which I made some changes and added more functions to fit my needs (I could upload it somewhere if needed):
QueryBuilder
After that I found another idea, which I implemented.
Someone already mentioned it here: use SqlQuery to return a HashSet of Ids, which you can then use in EF queries with Contains.
This is better but still not optimal, since you need two queries and the Id list in memory.
Example:
public IQueryable<Company> FullTextSearchCompaniesByName(int limit, int offset, string input, Guid accountingBureauId, string orderByColumn)
{
    FtsQueryBuilder ftsQueryBuilder = new FtsQueryBuilder();

    ftsQueryBuilder.Input = FtsQueryBuilder.FormatQuery(input);
    ftsQueryBuilder.TableName = FtsQueryBuilder.GetTableName<Company>();
    ftsQueryBuilder.OrderByTable = ftsQueryBuilder.TableName;
    ftsQueryBuilder.OrderByColumn = orderByColumn;
    ftsQueryBuilder.Columns.Add("CompanyId");

    if (accountingBureauId != null && accountingBureauId != Guid.Empty)
        ftsQueryBuilder.AddConditionQuery<Guid>(Condition.And, "", @"dbo.""Company"".""AccountingBureauId""", Operator.Equals, accountingBureauId, "AccountingBureauId", "");

    ftsQueryBuilder.AddConditionQuery<bool>(Condition.And, "", @"dbo.""Company"".""Deleted""", Operator.Equals, false, "Deleted", "");

    var companiesQuery = ftsQueryBuilder.BuildAndExecuteFtsQuery<Guid>(Context, limit, offset, "Name");
    TotalCountQuery = ftsQueryBuilder.Total;

    HashSet<Guid> companiesIdSet = new HashSet<Guid>(companiesQuery);
    var q = Query().Where(a => companiesIdSet.Contains(a.CompanyId));

    return q;
}
However, EF 6 now has something called Interceptors that can be used to implement queryable FTS, and it is pretty simple and generic (last post):
EF 6 Interceptors for FTS.
I have tested this and it works fine.
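For reference, a stripped-down sketch of what such an interceptor can look like; the marker string, class name and regex below are my own simplifications under the assumption of EF6's default LIKE translation, not the code from the linked post:

using System.Data.Common;
using System.Data.Entity.Infrastructure.Interception;
using System.Text.RegularExpressions;

public class FtsInterceptor : IDbCommandInterceptor
{
    private const string Marker = "-FTS-";

    // Wrap the search term so the interceptor can recognise it later, e.g.
    // context.SomeEntities.Where(e => e.BigText.Contains(FtsInterceptor.Fts(term)))
    public static string Fts(string searchTerm) { return Marker + searchTerm; }

    public void ReaderExecuting(DbCommand command, DbCommandInterceptionContext<DbDataReader> interceptionContext)
    {
        RewriteFullTextQuery(command);
    }

    private static void RewriteFullTextQuery(DbCommand cmd)
    {
        foreach (DbParameter parameter in cmd.Parameters)
        {
            var value = parameter.Value as string;
            if (value == null || !value.Contains(Marker)) continue;

            // Strip the marker and the LIKE wildcards EF wrapped around the term
            parameter.Value = value.Replace(Marker, string.Empty).Trim('%');

            // Turn "[Alias].[Column] LIKE @p ESCAPE N'~'" into "CONTAINS([Alias].[Column], @p)"
            cmd.CommandText = Regex.Replace(
                cmd.CommandText,
                @"\[(\w+)\]\.\[(\w+)\]\s+LIKE\s+@" + parameter.ParameterName + @"(\s+ESCAPE\s+N?'~')?",
                "CONTAINS([$1].[$2], @" + parameter.ParameterName + ")");
        }
    }

    // The remaining members of IDbCommandInterceptor don't need to do anything here
    public void ReaderExecuted(DbCommand command, DbCommandInterceptionContext<DbDataReader> interceptionContext) { }
    public void NonQueryExecuting(DbCommand command, DbCommandInterceptionContext<int> interceptionContext) { }
    public void NonQueryExecuted(DbCommand command, DbCommandInterceptionContext<int> interceptionContext) { }
    public void ScalarExecuting(DbCommand command, DbCommandInterceptionContext<object> interceptionContext) { }
    public void ScalarExecuted(DbCommand command, DbCommandInterceptionContext<object> interceptionContext) { }
}

// Registered once at application startup:
// DbInterception.Add(new FtsInterceptor());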
!! REMARK: EF Code First, even with version 6, does not support custom stored procedures.
There is only mapping for the predefined CUD operations, if I understood it well:
Code First Insert/Update/Delete Stored Procedure Mapping, so it can't be done that way.
Conclusion: if you can use EF 6, go for the third option; it gives you all you need.
If you are stuck with EF 5 or less, the second option is better than the first but still not optimal.