How to use Azure Mobile Services practically (cross-platform client)? - c#

Azure Mobile Services seems very useful, with so many common functions built in that I don't have to implement them myself. But I still can't understand how to use it when I need anything more than the very simple ToDoItems example.
First of all, TableController on the one hand seems very useful because it provides persistent server storage and client notification features. On the other hand, I can't see how the example can be used on real mobile devices when the ToDoItem class lives in the back-end assembly (of course I would like to include it in the mobile application). But if I use a shared assembly, which has to be portable, how can I implement ITableData when it is not in the portable subset? And what is the way to use MobileServiceCollection with the CollectionChanged event in a real project?
Then there is the question of how to implement the logic layer: the persistence model with DbContext is good, but sometimes I need more logic on the server than just storage. Scheduled jobs don't seem to fit, because I need to trigger data processing when a client updates data, not on a schedule.
If somebody knows how to use Azure Mobile Services in a real project, please give me some examples or ideas of how a portable service layer for mobile applications can interact with an Azure Mobile Services back end. For the client applications I use Xamarin tools.

Utilising Xamarin means you can work entirely in C#, so you can reuse many concepts you already know from developing C# solutions on non-mobile platforms. For instance, you can share "DTOs", which means the same code runs across all your platforms. See: http://blog.siliconvalve.com/2013/08/16/portable-azure-mobile-services-dtos-when-using-xamarin-and-c/
I presented at TechEd Australia last year on this and the sample code is available on Github also. A video of the talk and the sample link can be found here: http://blog.siliconvalve.com/2013/09/08/teched-demo-video-available-online/.
At launch, Mobile Services used Node.js for server-side functionality (it is still supported), and it is now possible to develop the server side in C# as well. This isn't limited to database interactions (though those tend to be the examples used). If you look at my sample project you'll see I do some parsing of inbound data to fire off push notifications.
Ultimately there's no easy answer other than to start working with the code (you can run an Azure trial for free for a short period - more than enough to get familiar with the environment).

A TableController is only supposed to provide a REST API for one type of entity. At its root, REST is simple: you've got ToDoList, ToDoListItem, and maybe for each ToDoListItem you have multiple ToDoListItemDetail entries. That is a chain of one-to-many relationships between three entities.
Generated table controllers only deal with one entity, so scaffolding will create a ToDoListController for the ToDoList entity, a ToDoListItemController for ToDoListItem and a ToDoListItemDetailController for ToDoListItemDetail. But all of the entities you have defined in your web app share the same Context, so all of them can be queried from any of the controllers. By default you can only do GET /tables/ToDoListItem, which gives you all ToDoListItems, or GET /tables/ToDoListItem/{key}, which gives you the specific ToDoListItem matching the key; you cannot get the ToDoListItems that belong to a specific ToDoList.
As per REST best practices, such a retrieval would be accomplished with GET /tables/ToDoList/{key}/ToDoListItem, which would return all ToDoListItems associated with the ToDoList identified by the key. This logically belongs to the ToDoListController, and to extend the controller with that endpoint you need to use attribute-based routing.
[RoutePrefix("tables/todolist")]
public class ToDoListController : TableController<DataObjects.ToDoList>
{
...
// extended endpoint
// GET tables/todolist/{key}/todolistitem
[Route("{id:guid}/todolistitem")]
public IQueryable<DataObjects.ToDoListItem> GetAllToDoListItemsForToDoList(string id)
{
return from l in Context.ToDoLists
join li in Context.ToDoListItems on l.Id equals li.ToDoListId
where l.Id.Equals(id)
select li;
}
Using this technique, you can query for whatever you want beyond the basic entities exposed by the scaffolded table controllers.
Now, since your Web API back end has to have its DataObject classes inherit from EntityData, you may not be able to reuse them as-is in your Xamarin app. Your Xamarin app layer does not have to implement all of the data elements of EntityData either - probably only Id and Version. But even if you have to duplicate the definitions of your DataObjects (back end) as DTOs/models (client), it is a very small amount of duplication.
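For illustration, here is a minimal sketch of what that client-side model could look like, and how it could be consumed through MobileServiceCollection (which implements INotifyCollectionChanged, so the CollectionChanged event works for data binding). The property names and the view model are assumptions for the example, not code from the sample above.

using System.Threading.Tasks;
using Microsoft.WindowsAzure.MobileServices;

// Client-side model: mirrors the back-end DataObject but does not inherit
// from EntityData, which is not part of the portable client profile.
public class ToDoListItem
{
    public string Id { get; set; }
    public string Text { get; set; }
    public string ToDoListId { get; set; }
}

public class ToDoListItemsViewModel
{
    private readonly IMobileServiceTable<ToDoListItem> table;

    // MobileServiceCollection raises CollectionChanged, so UI lists can bind to it.
    public MobileServiceCollection<ToDoListItem, ToDoListItem> Items { get; private set; }

    public ToDoListItemsViewModel(MobileServiceClient client)
    {
        table = client.GetTable<ToDoListItem>();
    }

    public async Task LoadAsync(string listId)
    {
        Items = await table.Where(i => i.ToDoListId == listId).ToCollectionAsync();
    }
}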
If you need to call endpoints unrelated to table storage, you can invoke a custom API as described in the Work with a custom API section of this article (https://learn.microsoft.com/en-us/azure/app-service-mobile/app-service-mobile-dotnet-how-to-use-client-library#work-with-tables).
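As a rough sketch (the API name "completeall" and the result type here are made up for illustration), calling such a custom endpoint from the client looks something like this:

using System.Threading.Tasks;
using Microsoft.WindowsAzure.MobileServices;

public class MarkAllResult
{
    public int Count { get; set; }
}

public static class CustomApiExample
{
    // Sends POST /api/completeall and deserializes the JSON response.
    public static async Task<int> CompleteAllAsync(MobileServiceClient client)
    {
        var result = await client.InvokeApiAsync<MarkAllResult>("completeall");
        return result.Count;
    }
}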
I am not sure if I answered some of your questions or not so please provide more detail.

Related

Build dynamic user interfaces with Blazor and XML

I've been researching ways to build a Web Client using C# that is Single Page and is generated from XML files.
Essentially, I want to have a service that generates XML files that describe the UI of e.g. forms (that part is not the problem). Those XML files are sent to the client, which in turn reads the XML and dynamically creates the layout with all the controls. I had hoped to accomplish this in Blazor WebAssembly (I have also looked at ASP.NET WebForms, MVC and Core (using DevExpress), but none of those are actually meant for SPA clients).
By comparison: We have an Android app that basically does this, similar to what is described right here: https://developer.ibm.com/tutorials/x-andddyntut/
But this time I am not developing an Android app in Java; this is supposed to be a web client. And as most coders in the company have a VB.NET background, my head of department would like it to use C#. I have tried finding ways to do something like this and have met lots of dead ends, as Blazor usually appears to be used with pages that are static at design time. I haven't managed to get it to work with RenderFragments, for example.
Any pointers with this would be very much appreciated!
Sincerely,
MR
You can generate the UI dynamically using RenderTreeBuilder, but much of that API is intended for internal framework use (take RenderTreeFrame, for example), so I don't think it is a good idea.
In short, I don't believe what you want to achieve is possible.
If these XML files don't change often, I would consider writing a transpiler that converts them to Blazor code and then recompiling the app.
Not a direct answer to creating Forms dynamically, but a suggested alternative method.
For my application I have a number of services which have different properties but are based on an underlying common base class. The services are defined in several .NET Standard libraries, one per type. The services are things like VoIP, Broadband, FTTC, Ethernet, Router Orders etc. - not much in common, and very different types of data and behaviour.
The base service class has an abstract method called GetView, which returns a C# Type that is a Razor component type. Remember, in Blazor all components are just C# classes. The type returned is a Razor component in the same library (so we get UI encapsulation as well as business and entity logic encapsulation).
The parent site loads a specific type of service, calls GetView and binds the service to the resulting Component.
That's pretty complicated to describe, but I did a proof-of-concept application in the early days of Blazor, as I realised it would be capable of this approach: https://github.com/conficient/BlazorDynamicList
There is also a demo site at https://blazordynamiclist.azurewebsites.net/
I won't explain it all in detail here but it follows a similar approach. There is an abstract base class ProductBase that has an abstract method GetViewComponent. Each product can return its preferred Razor Component to display itself.
The 'magic' is DynamicComponent.cs, a Razor component whose BuildRenderTree method creates a bound instance of the product's component view.
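As a rough sketch of the idea (written from memory rather than copied from the repository, so treat the names and the bound "Product" parameter as illustrative):

using System;
using Microsoft.AspNetCore.Components;
using Microsoft.AspNetCore.Components.Rendering;

public abstract class ProductBase
{
    public string Name { get; set; }

    // Each concrete product returns the Razor component type that knows how to render it.
    public abstract Type GetViewComponent();
}

// Renders whatever component type the product asks for and binds the product
// to it through an assumed "Product" parameter on that component.
public class DynamicProductView : ComponentBase
{
    [Parameter] public ProductBase Product { get; set; }

    protected override void BuildRenderTree(RenderTreeBuilder builder)
    {
        builder.OpenComponent(0, Product.GetViewComponent());
        builder.AddAttribute(1, "Product", Product);
        builder.CloseComponent();
    }
}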

Is OData a good way to expose non-CRUD APIs?

I'm working on a project that is our company's first foray into Domain-Driven Design (DDD).
Our Web API originally simply provided CRUD operations and the project exposed OData controllers, but I'm not sure if that is still a good idea.
Is OData a good way to expose non-CRUD APIs?
More info:
Initially our Web API basically exposed CRUD functions. To create a new user you would simply create one and post it to the service. To change, for example, an address, you would get a copy of the user entity, make changes, then perform an update operation. Basic OData stuff.
Beyond providing query support, OData also exposed the service in a readily consumable way, so it could be added to other projects as a service reference and accessed with a proxy.
Since we have moved over to a DDD approach, things have changed significantly. Our Web API is now simply a gateway to a number of independent sub-domain services. We no longer provide CRUD operations or direct access to entities; instead we make service calls to manipulate them. Instead of creating a User entity and sending it to the User service via a PUT request, a consumer must build a CreateUserBindingModel, send it to the User/Create service, and let the service generate the entity. Changing an address is done through the ChangeAddress(ChangeAddressBindingModel model) method, rather than by updating the whole object. Queries are much more targeted and rarely, if ever, return entire domain objects.
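To make that concrete, here is a rough sketch of what one of these command-style endpoints looks like (the binding model's properties and the service interface are simplified for illustration, not our exact code):

using System;
using System.Web.Http;

public class ChangeAddressBindingModel
{
    public Guid UserId { get; set; }
    public string Street { get; set; }
    public string City { get; set; }
}

public interface IUserService
{
    void ChangeAddress(Guid userId, string street, string city);
}

public class UserController : ApiController
{
    private readonly IUserService _userService;

    public UserController(IUserService userService)
    {
        _userService = userService;
    }

    // POST /User/ChangeAddress - the consumer never sees the User entity itself.
    [HttpPost]
    public IHttpActionResult ChangeAddress(ChangeAddressBindingModel model)
    {
        if (!ModelState.IsValid) return BadRequest(ModelState);
        _userService.ChangeAddress(model.UserId, model.Street, model.City);
        return Ok();
    }
}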
Is it a bad idea to keep using OData as a basis for our Web API, when we no longer provide CRUD operations? Is there another way to expose the details of our service the way you can with OData? I know WCF services provide similar functionality, but I was under the impression it was even more tied to CRUD than OData.
OData is a data-oriented API spec, and in that sense it's anti-DDD. Although it can satisfy all your requirements for implementing REST APIs, it is best suited to data-processing APIs. I guess you already know that using OData feels like operating on the database via HTTP. If you are using DDD, you should forget OData entirely.
In OData, actions and functions are a way to add server-side behaviors that are not easily defined as CRUD operations on entities; see the links below and the sketch that follows them.
https://learn.microsoft.com/en-us/aspnet/web-api/overview/odata-support-in-aspnet-web-api/odata-v4/odata-actions-and-functions
https://blogs.msdn.microsoft.com/alexj/2012/02/03/cqrs-with-odata-and-actions/
https://github.com/OData/ODataSamples/blob/master/WebApiCore/ODataActionSample/ODataActionSample/
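For example, here is a hedged sketch of declaring and handling an OData action with ASP.NET Web API 2 and OData v4, in the spirit of the first link (the User entity and the ChangeAddress action are illustrative names, not code from the project in the question):

using System.Web.Http;
using System.Web.OData;
using System.Web.OData.Builder;
using System.Web.OData.Extensions;

public class User
{
    public int Id { get; set; }
    public string Street { get; set; }
    public string City { get; set; }
}

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        var builder = new ODataConventionModelBuilder();
        builder.EntitySet<User>("Users");

        // Bound action, invoked as POST /odata/Users(1)/Default.ChangeAddress
        var action = builder.EntityType<User>().Action("ChangeAddress");
        action.Parameter<string>("Street");
        action.Parameter<string>("City");

        config.MapODataServiceRoute("odata", "odata", builder.GetEdmModel());
    }
}

public class UsersController : ODataController
{
    [HttpPost]
    public IHttpActionResult ChangeAddress([FromODataUri] int key, ODataActionParameters parameters)
    {
        if (!ModelState.IsValid) return BadRequest();

        var street = (string)parameters["Street"];
        var city = (string)parameters["City"];
        // ...hand the command off to the domain service rather than editing the entity here...
        return Ok();
    }
}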

LINQ to SQL in Silverlight client

Okay, so I have this Silverlight client program. I'm not allowed to use the web project, but I do need to be able to read from an SQL database for my data.
Some internet searching brought me to LINQ to SQL and the System.Data.Linq.DataContext object, as well as SQLMetal.exe. I have created my data context object from the metadata in a remote SQL database and the code looks okay (from what I can tell - all the right names and types seem to be there).
What I wanted to do was add this into Silverlight, but on importing the code I realised that you can't use System.Data in a Silverlight application, which sort of rules out having this code in the Silverlight client itself. This is annoying, because a DataContext conveniently comes with properties that represent the tables and such like (I find those really useful in Silverlight).
So I can't do it the 'normal' way. I can do it with a WCF service, but... well, here's where I could use some advice. I can create a WCF service with async calls, but I'm not really sure how to wire up the DataContext object. I need access to the classes in there (for my entities in the database) within my Silverlight application, and I'm not quite sure how to do that; help there would be appreciated. Then how do I synchronise it all? For example, before, I had an exposed ObservableCollection whose getter repopulated it with the contents of, say, Context.Customers. That made things nice and easy, but I can't see a way of doing something like that now. If I made a call to an async service for every 'get', surely it would be unacceptably slow.
If you could help me pick my way through this, I would be grateful. Thanks.
You definitely need to read about Entity Framework and a couple of articles about using WCF RIA Services + EF in Silverlight applications.
Hope that will help you.
As mentioned above, you can use RIA services.
But more importantly... you should never use the classes generated by LINQ to SQL or EF in your client application. Your client application should only know about and use domain-layer objects. Your DataContext and the types it uses should stay buried behind a repository pattern that gives you back domain types.
Your client, whatever that may be (Silverlight app, WPF app, etc.), only needs to (and should only) use these domain types. This is part of separation of concerns and the SOLID principles. I can guarantee that at some point in the lifecycle of the app you will swap out your data source (you may end up using Entity Framework or NHibernate to talk to the database instead), and if your client is hard-wired to your LINQ to SQL types, you won't be able to swap out the ORM layer without breaking the client.
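A minimal sketch of what that looks like (the MyDataContext name and the column names are assumptions; the point is that the generated LINQ to SQL types never leave the repository):

using System.Collections.Generic;
using System.Linq;

// Hand-written domain type, safe to share with (or mirror in) the client.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface ICustomerRepository
{
    IList<Customer> GetAll();
}

public class LinqToSqlCustomerRepository : ICustomerRepository
{
    public IList<Customer> GetAll()
    {
        // MyDataContext is the SQLMetal-generated context (assumed name).
        using (var context = new MyDataContext())
        {
            return context.Customers
                .Select(c => new Customer { Id = c.Id, Name = c.Name })
                .ToList();
        }
    }
}

Your WCF service then returns these domain types, and the Silverlight client never needs System.Data at all.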

Designing an API: Use the Data Layer objects or copy/duplicate?

Struggling with this one today.
Rewriting a web-based application; I would like to do this in such a way that:
All transactions go through a web services API (something like http://api.myapplication.com) so that customers can work with their data the same way that we do: everything they can do through our provided web interface, they can also do programmatically
A class library serves as a data layer (SQL + Entity Framework), for a couple of design reasons not related to this question
Problem is, if I choose not to expose the Entity Framework objects through the web service, it's a lot of work to re-create "API" versions of the Entity Framework objects and then write all the "proxy" code to copy properties back and forth.
What's the best practice here? Suck it up and create an API model class for each object, or just use the Entity Framework versions?
Any shortcuts here from those of you who have been down this road and dealt with versioning / backwards compatibility, other headaches?
Edit: After feedback, what makes more sense may be:
Data/Service Layer - DLL used by public web interface directly as well as the Web Services API
Web Services API - almost an exact replica of the Service Layer methods / objects, with API-specific objects and proxy code
I would NOT have the website post data through the web services API. That way lies potential performance problems for your main website, never mind that as soon as you deploy a breaking API change you have to redeploy the main website at the same time. There are reasons why you wouldn't want to be forced to do that.
Instead, your website AND web services should both communicate directly to the underlying business/data layer(s).
Next, don't expose the EF objects themselves. The web service interface should be cleaner than that; in other words, it should simplify the act of working with your back end as much as possible. Will this require a fair amount of effort on your part? Yes. However, it will pay dividends when you have to change the model slightly without impacting currently connected clients.
It depends on the project's complexity and how long you expect it to live. For small, short-lived projects you can share domain objects across all layers. But if it's a big project, and you expect it to exist, work well and keep being updated for the next five years...
In my current project (which is big), I started with entities shared across all layers; then I discovered that I needed separate entities for presentation, and now (six months later) I'm using separate classes for each layer (persistence, service, domain, presentation). That's not because I'm paranoid or following some rule; I simply couldn't make everything work with a single set of classes across the layers... Draw your own conclusions.
P.S. There are tools that can help you map between your objects, like AutoMapper and ValueInjecter.
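For instance, a minimal AutoMapper sketch (the entity and API model types are invented for illustration, and the configuration API differs slightly between AutoMapper versions):

using AutoMapper;

public class CustomerEntity { public int Id { get; set; } public string Name { get; set; } }
public class CustomerApiModel { public int Id { get; set; } public string Name { get; set; } }

public static class MappingExample
{
    public static CustomerApiModel ToApiModel(CustomerEntity entity)
    {
        // Matching property names are mapped by convention.
        // (In a real app you would build the MapperConfiguration once, not per call.)
        var config = new MapperConfiguration(cfg => cfg.CreateMap<CustomerEntity, CustomerApiModel>());
        var mapper = config.CreateMapper();
        return mapper.Map<CustomerApiModel>(entity);
    }
}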
I would just suck it up and create an API specifically aimed at the needs of the application. It doesn't make much sense to do what amounts to exposing the whole DB layer. Just expose what needs to be exposed in order to make the app work, and nothing else.

What special considerations should I make when creating an object model that will be consumed by a desktop application and web site?

I'm writing a tool in C#.NET that will be used to generate catalogs of content which users can browse. Initially I am creating a WinForms-based interface, but in the future I'd like to be able to create a web-based interface as well. So I've been careful to generalize the interface to a Catalog so that it does not depend on a specific UI.
My only experience with web development has been creating my own HTML website back in the early 90s, and I've done a little ASP (not ASP.NET). Now with ASP.NET it seems that I should be able to leverage my existing C#.NET object model to create a web-based interface. But I really haven't done anything with ASP.NET beyond a simple hello-world example.
Are there any special considerations I should make in designing my object model so that later I can create a web interface to it?
Here are a few things to follow:
You should package your object model in a separate project (which you need to do anyway to share it among different projects) and make sure that you do not add platform-specific references to it (for example, don't reference System.Web, WinForms, WPF etc.) - this automatically avoids any unwanted dependencies.
Try to keep your classes as lean as possible. Avoid classes that track change state etc.; in a web scenario, tracking state over multiple requests is expensive. So it's best to have your objects carry data only.
Consider the possibility that your objects may need to be serialized and/or passed over the wire: for example, a middleware service serving both Windows and web clients, or a web page storing the object in view state.
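For example, a lean, serialization-friendly model class along those lines might look like this (the Catalog/CatalogEntry names are just assumptions based on the question):

using System;
using System.Collections.Generic;
using System.Runtime.Serialization;

[DataContract]
public class CatalogEntry
{
    [DataMember] public Guid Id { get; set; }
    [DataMember] public string Title { get; set; }
    [DataMember] public string Description { get; set; }
}

[DataContract]
public class Catalog
{
    [DataMember] public string Name { get; set; }
    [DataMember] public List<CatalogEntry> Entries { get; set; }
}

No UI references and no change tracking: just data that can be used from WinForms today and serialized to an ASP.NET front end later.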
There really shouldn't be that big a difference.
Be careful about placing too much “intelligence” in your entity classes. That’s a pattern I’ve seen often in Windows apps. Don't make references to controls that are specific to Windows Forms development in the parts of your project that you want to reuse for the web application.
Repository patterns work well with both Windows and Web applications, because you often want to optimize the web apps differently for performance with multiple users.
Your requirement can be handled with a multi-tier architecture:
http://en.wikipedia.org/wiki/Multitier_architecture
