Goal: I have a list of IDs in an MSSQL DB that I want to send to an API (the NPI Registry API), then parse the returned JSON to ingest certain elements back into the DB.
Ask: I am comfortable with T-SQL inside SSMS but very much a beginner anywhere outside of it, and I'm hoping this exercise will be a good foray into using other web interfaces. I don't want anyone to explain exactly how to do this, but I was hoping someone could outline the most efficient route so I can tackle the rest myself. I am hoping to do this through VS 2017, but I am willing to go outside of VS if that's a better solution.
These are the steps you should generally follow:
Read the Rows
Here you can use ADO.NET simple queries or any framework like Entity Framework or Dapper.NET
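As a sketch of this first step with plain ADO.NET (the dbo.Providers table and Npi column are assumptions; substitute your own table and ID column):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

public static class NpiReader
{
    // Reads the list of IDs to later send to the NPI Registry API.
    public static List<string> ReadNpiIds(string connectionString)
    {
        var ids = new List<string>();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT Npi FROM dbo.Providers", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    ids.Add(reader.GetString(0));
            }
        }
        return ids;
    }
}
```

Dapper or Entity Framework would shrink this to a one-liner, but plain ADO.NET keeps the dependencies minimal while you are learning.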
Make your API calls
Here you can use something like the following. It may vary a little depending on how you will use the IDs retrieved from the DB:
var companies = db.Companies.ToList(); // supposing db is your DbContext on EF
var myTypesFromApi = new List<MyTypeT>();
// Reuse one HttpClient and await each call; List<T>.ForEach with an async
// lambda creates fire-and-forget async-void calls that nothing awaits.
using (var client = new HttpClient())
{
    foreach (var company in companies)
    {
        try
        {
            var res = await client.GetStringAsync("http://myapi.com/v1/");
            myTypesFromApi.Add(JsonConvert.DeserializeObject<MyTypeT>(res));
        }
        catch (Exception e)
        {
            // handle your exception properly here
        }
    }
}
Send it back to the DB
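For this last step, a hedged sketch of writing one parsed record back with a parameterized command (the table, column names, and the NpiRecord type are assumptions standing in for whatever you deserialize from the API):

```csharp
using System.Data.SqlClient;

// Hypothetical shape for the parsed API data; adjust to the fields you ingest.
public class NpiRecord
{
    public string Npi { get; set; }
    public string OrganizationName { get; set; }
}

public static class NpiWriter
{
    public static void SaveRecord(string connectionString, NpiRecord rec)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "UPDATE dbo.Providers SET OrgName = @name WHERE Npi = @npi", conn))
        {
            // Parameterized to avoid SQL injection and plan-cache churn.
            cmd.Parameters.AddWithValue("@name", rec.OrganizationName);
            cmd.Parameters.AddWithValue("@npi", rec.Npi);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

Since you are comfortable in T-SQL, you could also batch the updates through a table-valued parameter and a stored procedure instead of row-by-row commands.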
I've read the Microsoft article about resource-based authorization with IAuthorizationService, but it allows authorizing only one resource at a time. For example, I have a User class and a File class. A File has an owner and can be public or not, so a file can be viewed only if it is public or the user is its owner. I need to display a list of all files for different users, so each user will see all public files plus all files they own.
In my authorization handler I have this:
protected override Task HandleRequirementAsync(AuthorizationHandlerContext context,
OwnerOrPublicRequirement requirement,
File resource)
{
if (resource.IsPublic || context.User.Identity?.Name == resource.Owner)
{
context.Succeed(requirement);
}
return Task.CompletedTask;
}
Then in the controller I had to do something like this:
List<File> authorizedFiles = new List<File>();
foreach (var file in _dbContext.Files)
{
var result = await _authorizationService
.AuthorizeAsync(User, file, new OwnerOrPublicRequirement());
if (result.Succeeded)
{
authorizedFiles.Add(file);
}
}
But it looks ugly because I have to load all the files from the DB and then filter them one by one. What if I have millions of files and most of them are neither public nor owned by the user? I won't be able to load and filter them like this without running out of memory. I could rewrite it as a LINQ query and let the DB do all the work:
var userName = User.Identity?.Name;
var authorizedFiles = _dbContext.Files
    .Where(f => f.IsPublic || f.Owner == userName)
    .ToList();
But then I will have two places with code that does the same thing, so whenever I need to change the authorization logic I have to fix two different parts of the code. What would be the proper way of doing this?
Don't use the custom authorization provider; it adds too much extra cost and complexity.
Have one place to get the list of files and let the database do the heavy work of filtering and sorting by filename.
Having to know dozens or hundreds of special features of the ASP.NET framework is death by a thousand cuts. Each piece of specialized knowledge costs you and future developers minutes per year to support and adds risk to the project.
Combined, hundreds of small extra features and pieces of specialized knowledge add man-days (months?) to the cost of keeping your production system alive and enhancing it. Microsoft seems to have forgotten "keep it simple" and keeps adding dozens of specialized-knowledge features with each new version of ASP.NET.
A developer should be able to read the application's main program and then trace how each piece of code in the entire application code base is called, without needing to know the internals/extensibility-hell micro-trivia of the ASP.NET framework.
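One way to keep the rule in a single place, as suggested, is to express the predicate once and reuse it both in the EF query and (compiled) in the authorization handler. A minimal sketch, assuming the poster's File shape; FileAuthorization is a hypothetical helper name:

```csharp
using System;
using System.Linq.Expressions;

public class File
{
    public bool IsPublic { get; set; }
    public string Owner { get; set; }
}

public static class FileAuthorization
{
    // Single source of truth for the "public or owned" rule.
    // EF can translate this expression into SQL; the handler can Compile() it
    // and apply it to a single resource.
    public static Expression<Func<File, bool>> IsViewableBy(string userName)
        => f => f.IsPublic || f.Owner == userName;
}
```

In the controller: `_dbContext.Files.Where(FileAuthorization.IsViewableBy(User.Identity?.Name)).ToList();`, and in the handler: `FileAuthorization.IsViewableBy(context.User.Identity?.Name).Compile()(resource)`. Changing the rule then means editing one method.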
I'm using the new Neo4j client from https://www.nuget.org/packages/Neo4jClient/4.0.0.1-prerelease, and I would like to use parameters as in https://github.com/Readify/Neo4jClient/wiki/cypher-examples
in my C# app. I try to save a new Person node with:
private async Task CreatePerson(IGraphClient client, params Person[] persons)
{
client.Cypher
.Unwind(persons, "person")
.Merge("(p:Person { Id: person.Id })")
.OnCreate()
.Set("p = person")
.Return(person => person.As<Person>());
}
I can run the query, but I don't receive any data and I'm not getting any error. What am I missing here?
Thank you and best regards
UPDATE: Added Return Statement
You are executing your query with the ExecuteWithoutResultsAsync method, which does not return results (and your code does not attempt to handle a result from the method anyway).
To get results, your generated Cypher must RETURN a result, and you need to use the Results method to get the results of the query. In addition, your code should actually use the result in some way.
Refer to the documentation for more details.
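As a sketch, the question's method reworked to actually execute the query and surface the results via ResultsAsync (type names as in the question; untested against a live server):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Neo4jClient;
using Neo4jClient.Cypher;

public class Person
{
    public string Id { get; set; }
    public string Name { get; set; }
}

public static class PersonRepository
{
    public static async Task<IEnumerable<Person>> CreatePersons(
        IGraphClient client, params Person[] persons)
    {
        // Awaiting ResultsAsync both sends the query to the server and
        // materializes whatever the RETURN clause yields.
        return await client.Cypher
            .Unwind(persons, "person")
            .Merge("(p:Person { Id: person.Id })")
            .OnCreate()
            .Set("p = person")
            .Return(p => p.As<Person>())
            .ResultsAsync;
    }
}
```

If you don't need anything back, keep ExecuteWithoutResultsAsync and drop the Return clause; mixing the two is what made the original code silently return nothing.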
Background
I have a central database that my MVC EF web app interacts with, following best practices. Here is the offending code:
// GET: HomePage
public ActionResult Index()
{
using (var db = new MyDbContext())
{
return View(new CustomViewModel()
{
ListOfStuff = db.TableOfStuff
.Where(x => x.Approved)
.OrderBy(x => x.Title)
.ToList()
});
}
}
I also modify the data in this database's table manually, completely outside the web app.
I am not keeping an instance of the DbContext around any longer than is necessary to get the data I need. A new one is constructed per-request.
Problem
The problem I am having is that if I delete a row or modify any data in this table manually outside the web app, the data served by the above code does not reflect those changes.
The only way to get these manual edits picked up by the above code is to either restart the web app or use the web app to make a modification to the database that calls SaveChanges.
Log Results
After logging the query being executed and doing some manual tests, there is nothing wrong with the generated query that would make it return bad data.
However, in the logs I saw a confusing line in the query completion times. The first query on app start-up:
-- Completed in 86 ms with result: CachingReader
Then any subsequent queries had the following completion time:
-- Completed in 0 ms with result: CachingReader
What is this CachingReader and how do I disable this?
Culprit
I discovered the error was introduced elsewhere in my web app, by something that replaced the underlying DbProviderServices to provide caching: more specifically, I am using MVCForum, which uses EF Cache.
The forum's CachingConfiguration uses the default CachingPolicy, which caches everything unless told otherwise through EF; that is exactly the behavior I was observing. More Info
Solution
I provided my own custom CachingPolicy that does not allow caching on entities where this behavior is undesirable.
public class CustomCachingPolicy : CachingPolicy
{
protected override bool CanBeCached(ReadOnlyCollection<EntitySetBase> affectedEntitySets, string sql, IEnumerable<KeyValuePair<string, object>> parameters)
{
foreach (var entitySet in affectedEntitySets)
{
var table = entitySet.Name.ToLower();
if (table.StartsWith("si_") ||
table.StartsWith("dft_") ||
table.StartsWith("tt_"))
return false;
}
return base.CanBeCached(affectedEntitySets, sql, parameters);
}
}
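For reference, a custom policy is typically wired in through a DbConfiguration like the one below (a sketch based on the EFCache sample; type names are from the EntityFramework.Cache package, so verify against the version you ship):

```csharp
using System.Data.Entity;
using System.Data.Entity.Core.Common;
using EFCache;

public class CachingDbConfiguration : DbConfiguration
{
    public CachingDbConfiguration()
    {
        var transactionHandler = new CacheTransactionHandler(new InMemoryCache());
        AddInterceptor(transactionHandler);

        // Replace the provider services with the caching wrapper, passing our
        // selective policy instead of the cache-everything default.
        Loaded += (sender, args) => args.ReplaceService<DbProviderServices>(
            (services, _) => new CachingProviderServices(
                services, transactionHandler, new CustomCachingPolicy()));
    }
}
```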
With this in place, the database logging now always shows:
-- Completed in 86 ms with result: SqlDataReader
Thanks everyone!
I have an idea for a different data layer. I want to load-balance between different SQL servers. To do so I have the following setup in mind:
When the web application makes a SQL request, the "Application Proxy" checks whether it is a SELECT statement. If it is a SELECT statement, the "Application Proxy" sends it to one server. If it is not a SELECT statement, it sends the request to all servers.
Now I know the idea in this state will not work and I have to solve a lot of different problems to get it working (and yes, there are existing solutions for this). But for now, the biggest start-up problem is the integration with Entity Framework.
What I want is to wrap a DbContext in my own class so I can intercept the messages and send them myself. So something like this:
public class MyDbContext : DbContext
{
public override string DoCallToServer(string sqlrequest)
{
if (sqlrequest.ToLower().StartsWith("select"))
{
return MyEngine.CallAll(sqlrequest);
}
else
{
return MyEngine.CallOne(sqlrequest);
}
}
}
Is this possible?
I've searched the internet, but I could not find anything.
Here is a good example of interceptors:
http://www.entityframeworktutorial.net/entityframework6/database-command-interception.aspx
Another excellent example, which is very good but unfortunately only in German:
https://entwickler.de/online/entity-framework-6-1-neue-interceptors-und-mehr-161541.html
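Following the first link's approach, the routing idea could hook in through EF6's IDbCommandInterceptor instead of overriding DbContext (which has no overridable "send SQL" method). A sketch only; RoutingInterceptor is a made-up name and the comments mark where your fan-out logic would go:

```csharp
using System.Data.Common;
using System.Data.Entity.Infrastructure.Interception;

// EF6 calls these hooks for every command before/after it hits the server.
public class RoutingInterceptor : IDbCommandInterceptor
{
    public void ReaderExecuting(DbCommand command,
        DbCommandInterceptionContext<DbDataReader> interceptionContext)
    {
        // SELECT statements surface here: route to a single read server.
    }

    public void NonQueryExecuting(DbCommand command,
        DbCommandInterceptionContext<int> interceptionContext)
    {
        // INSERT/UPDATE/DELETE surface here: fan out to all servers.
    }

    public void ScalarExecuting(DbCommand command,
        DbCommandInterceptionContext<object> interceptionContext) { }

    public void ReaderExecuted(DbCommand command,
        DbCommandInterceptionContext<DbDataReader> interceptionContext) { }
    public void NonQueryExecuted(DbCommand command,
        DbCommandInterceptionContext<int> interceptionContext) { }
    public void ScalarExecuted(DbCommand command,
        DbCommandInterceptionContext<object> interceptionContext) { }
}

// Registered once at app start-up:
// DbInterception.Add(new RoutingInterceptor());
```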
Please excuse me if this is a duplicate question - I have searched but have not found anything that explains my problem.
I created an ASP.NET website. I have 3 layers: Data Access, Business, and Presentation. I use shared Entity Framework entities throughout.
Presentation: UI
Business: WCF Service
Data Access: code to connect to DB (contains the EF context, connection string)
Entities: .edmx - EF Entities. These are shared throughout all 3 layers.
I have come upon some odd behavior in the WCF service which I cannot understand. My original code had a function in the DAL which queried the DB for all customers and returned a List<Customer>. The service then further filtered this list based on what the UI requested and returned a List<Customer> to the Presentation Layer.
DAL:
public List<Customer> GetCustomers()
{
List<Customer> custList= new List<Customer>();
try
{
using (NorthWindsEntities context = new NorthWindsEntities(connectionString))
{
custList= context.Customers
.Include("Orders")
.Include("OrderDetails")
.ToList();
}
return custList;
}
catch (Exception ex)
{
//handle exception
return custList;
}
}
WCF:
public List<Customer> GetCustomersByState(string state)
{
contextManager contextDAL = new contextManager();
List<Customer> custList = contextDAL.GetCustomers()
.Where(c => c.State == state)
.ToList();
return custList;
}
When I debugged the code, everything worked fine, but when the service tried to return custList to the client, I got this error: An error occurred while receiving the HTTP response to localhost/WCF/MyService. This could be due to the service endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being aborted by the server (possibly due to the service shutting down). See server logs for more details.
I changed my DAL to include the entire query and my WCF service to this:
public List<Customer> GetCustomersByState(string state)
{
    contextManager contextDAL = new contextManager();
    List<Customer> custList = contextDAL.GetCustomers(state);
    return custList;
}
This worked fine. I also tried to put the entire query in the service (the service connected directly to the context and got the list) and that worked fine as well.
So basically, my question is: why can't the WCF service return a list of objects that were queried using LINQ to Objects (the DAL returns a List of Customer, so all further querying is LINQ to Objects)? It works fine using LINQ to Entities (with all the code in the service), and it works fine with no querying at all, just getting a list and returning it.
I'm sorry for the lengthy post, I tried to include all necessary details...Thanks!
After much trial and error, I have found that it is not a querying issue; it is a size issue. Though the WCF service was filtering my data to return only a few records, I guess it retained some sort of reference to the rest of the data, because it errored out as if too much data were being sent. If I do a partial filter in the DAL and then continue to filter in the WCF layer, ending up with the same number of records as I originally tried to return, it returns the list without a problem.
I cannot explain why this happens; I don't know much about WCF. I'm just explaining what I did to get around it in case anyone else has this problem.