CRM 2011: which fields are being used? - C#

I'm developing against MS CRM 2011, and I've found a lot of pain points, including borderline broken, half-arsed LINQ support.
That aside, the latest one has to do with the many built-in fields that have been hidden away, deprecated and/or left unused as far as the user is concerned. From a developer's perspective, however, we have no idea that these fields are deprecated. The context generated by crmsvcutil still includes these properties, so developers end up coding against properties/relationships/entities that have been deprecated, and this has happened on a number of occasions.
So the question is: is there a way to interrogate the CRM services for a list of fields/properties that don't appear on any form, for every entity? Furthermore, is there a way to interrogate the CRM services for all unused entities (for lack of a better word)? These could be entities that are orphaned, haven't been updated in a while, and/or are empty, etc.
I'm hoping that with such a list the developers will know what to look out for, as opposed to coding blindly against the CRM context, which has been a source of frustration.
Thanks in advance.

Well I don't think you'd want to just generate code for fields that are on forms - I use "hidden" fields for flags all the time.
There is a way to make crmsvcutil generate only the entities you want:
http://erikpool.blogspot.com/2011/03/filtering-generated-entities-with.html
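If I remember right, the approach in that post is to plug a custom ICodeWriterFilterService into crmsvcutil (registered via its codewriterfilter command-line argument); a rough sketch of the shape, with a made-up entity whitelist:

// Sketch only: a crmsvcutil code-writer filter that limits generation to a
// whitelist of entities. The interface lives in Microsoft.Crm.Services.Utility
// (shipped with CrmSvcUtil.exe); the entity names below are just examples.
using System;
using System.Collections.Generic;
using Microsoft.Crm.Services.Utility;
using Microsoft.Xrm.Sdk.Metadata;

public sealed class WhitelistFilterService : ICodeWriterFilterService
{
    private static readonly HashSet<string> _entities =
        new HashSet<string> { "account", "contact", "opportunity" }; // example list

    private readonly ICodeWriterFilterService _default;

    // crmsvcutil passes in the default filter so you can delegate to it.
    public WhitelistFilterService(ICodeWriterFilterService defaultService)
    {
        _default = defaultService;
    }

    public bool GenerateEntity(EntityMetadata entityMetadata, IServiceProvider services)
    {
        return _entities.Contains(entityMetadata.LogicalName)
               && _default.GenerateEntity(entityMetadata, services);
    }

    // Everything else just defers to the default behaviour.
    public bool GenerateAttribute(AttributeMetadata attributeMetadata, IServiceProvider services)
    {
        return _default.GenerateAttribute(attributeMetadata, services);
    }

    public bool GenerateOption(OptionMetadata optionMetadata, IServiceProvider services)
    {
        return _default.GenerateOption(optionMetadata, services);
    }

    public bool GenerateOptionSet(OptionSetMetadataBase optionSetMetadata, IServiceProvider services)
    {
        return _default.GenerateOptionSet(optionSetMetadata, services);
    }

    public bool GenerateRelationship(RelationshipMetadataBase relationshipMetadata,
        EntityMetadata otherEntityMetadata, IServiceProvider services)
    {
        return _default.GenerateRelationship(relationshipMetadata, otherEntityMetadata, services);
    }

    public bool GenerateServiceContext(IServiceProvider services)
    {
        return _default.GenerateServiceContext(services);
    }
}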
Sounds like you are a bit frustrated, but don't give up quite yet. There are definitely architectural decisions that I question, but all of the plumbing I don't have to write makes up for it. Dynamics CRM is like any other technology in that respect, but I love it more than I hate it. :)

I've not been able to find much which would allow you to do this, at least not in CRM 4 (which is the version I'm using). None of that stuff seems to be documented anywhere. I did write a query based off of this guy's post which let me filter out most of the unused fields, but there was a lot of picking through the fields in the LocalizedLabelView to get what I wanted, and even then it wasn't easy. It's probably even harder now in CRM 2011 since you can have multiple forms per entity.
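If you do want to attempt it against CRM 2011, a rough sketch of the idea is to pull the entity's attribute list from the metadata and compare it with the datafieldname values in the entity's forms' FormXml. The entity name is just an example and error handling is omitted:

// Sketch only: list attributes of an entity that are not referenced by any
// control on its forms.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Xml.Linq;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;
using Microsoft.Xrm.Sdk.Query;

public static class UnusedFieldFinder
{
    public static IEnumerable<string> FindUnusedFields(IOrganizationService service, string entityLogicalName)
    {
        // 1. All attributes defined on the entity.
        var metadataRequest = new RetrieveEntityRequest
        {
            LogicalName = entityLogicalName,
            EntityFilters = EntityFilters.Attributes
        };
        var metadataResponse = (RetrieveEntityResponse)service.Execute(metadataRequest);
        var allAttributes = metadataResponse.EntityMetadata.Attributes
            .Select(a => a.LogicalName)
            .ToList();

        // 2. All attributes referenced by controls on the entity's forms.
        var query = new QueryExpression("systemform")
        {
            ColumnSet = new ColumnSet("formxml")
        };
        query.Criteria.AddCondition("objecttypecode", ConditionOperator.Equal, entityLogicalName);

        var usedAttributes = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        foreach (var form in service.RetrieveMultiple(query).Entities)
        {
            var formXml = XDocument.Parse(form.GetAttributeValue<string>("formxml"));
            foreach (var control in formXml.Descendants("control"))
            {
                var field = (string)control.Attribute("datafieldname");
                if (!string.IsNullOrEmpty(field))
                    usedAttributes.Add(field);
            }
        }

        // 3. Anything defined but never placed on a form.
        return allAttributes.Where(a => !usedAttributes.Contains(a));
    }
}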


How to allow for a general connection to Dynamics server

Finally, I've finished creating the add-on that does exactly what I want in exactly the way I want. Only one issue remains: at the moment the software connects to an organization using credentials that are statically hard-coded into the program.
I have a hunch that some of my customers might name their organization something other than "Hazaa Inc. #1-5", that their users might be called something other than "CRMK.local\Konrad", and there's even a chance they chose a different password than me (although "abc123" is apparently fairly common).
So, here's the issue - how do I make my solution general?
I believe that as long as I get the right input to the code below, I'll be done.
// The four arguments below are exactly what I need to obtain dynamically.
using (var proxy = new OrganizationServiceProxy(
    organizationUri,
    homeRealmUri,
    credentials,
    deviceCredentials))
{
    // work with the organization service here
}
The current log-in string I copied by hand from "Settings" in Dynamics CRM, but that won't suffice here. I need to obtain it dynamically and programmatically. Moreover, even if I learn the name of the organization, I need to determine whether it's CRM 4.0 or CRM 2011, whether we're talking Online or On-Premise, what the user name and password are, etc.
Is it better to simply ask the user to provide that information, or would it be recommended to do it auto-magically (as far as the user experiences it)? This is my first time, so most of this stuff feels scary and confusing. I've seen some code examples, but they led me back to scary and confused very quickly.
EDIT:
I've followed this code, but I simply don't get it. For example, I don't even know where to find ServerConnection. I tried adding references to the various Xrm and Crm assemblies in the SDK, but it's more trial-and-horror than actual development. It feels like I'm missing something (fairly) obvious.
When in doubt, ask the user instead of trying to guess: you might unwittingly break something if your assumption is wrong.
Some more thoughts:
Online has limitations compared to On-Premise which might break functionality. If you need things to work on both, focus on Online first. In other words, if it works Online, it works On-Premise; but if it works On-Premise, it might not work Online.
CRM 2011 provides many, many innovations compared to 4.0; code should really be branched so you stick to the old style in one project and leverage all the new features in another. Solution management alone is worth dropping 4.0 support altogether.
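If you do end up wanting to discover the organizations programmatically rather than asking the user for everything, the CRM 2011 SDK's discovery service can enumerate them. A minimal sketch, assuming the discovery URL and credentials still come from user input or configuration (OrganizationFinder is just a name made up for the example):

// Sketch only: enumerate organizations via the CRM 2011 discovery service.
using System;
using System.ServiceModel.Description;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Sdk.Discovery;

public static class OrganizationFinder
{
    public static void ListOrganizations(Uri discoveryUri, ClientCredentials credentials)
    {
        // e.g. https://crmserver/XRMServices/2011/Discovery.svc for on-premise
        using (var discovery = new DiscoveryServiceProxy(discoveryUri, null, credentials, null))
        {
            var response = (RetrieveOrganizationsResponse)discovery.Execute(
                new RetrieveOrganizationsRequest());

            foreach (OrganizationDetail org in response.Details)
            {
                // The organization service URL is what OrganizationServiceProxy needs.
                Console.WriteLine("{0} -> {1}",
                    org.FriendlyName,
                    org.Endpoints[EndpointType.OrganizationService]);
            }
        }
    }
}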

Which ORM fits best here?

I need to choose an ORM for a project and I only have some experience with NHibernate. I have been reading Q&As on Stack Overflow, and the most similar to my needs is "What ORM for .NET should I use?", but I would like an answer more up to date with the current products (that question is from 2009) and one that also takes into account some points of my project.
The easiest solution for me would be to use NHibernate, because it is mature, feature-rich and I have already used it, but I prefer to choose the best option for the project even if I have to "study" again.
The project is going to start as a core that communicates with SAP. The core has to support standalone and/or co-dependent modules, and each one of them may need to work with its own data from the database. The final step will be to implement the part of SAP that we use. The characteristics I need are the ones from the linked question, and here are some more things to keep in mind:
I would like to be able to split the data access layer so that a user with one or two modules won't need the whole thing.
A designer would be appreciated.
It will start with about 20-30 tables and, within a couple of years, that number will grow to several hundred.
The number of records per table will vary from two or three to 150,000+ (only a very few tables will be that large).
IT DOES NOT NEED TO BE A ONE-PRODUCT SOLUTION. Combinations like NHibernate and Devart Entity Developer are also welcome.
The team for this project will also include students who will have to learn C#, and some of them probably don't know exactly what an ORM is, so it would be great if it is simple or, at least, if the basic stuff is not very complex (mixing tons of lambdas, reflection, extension methods, etc.).
The last one is not very important. I hope this is specific enough to avoid being closed (the question I linked is still open).
EDIT:
-It is a desktop application.
-Documentation and community are also very, very important.
The most popular ORM for .NET these days is Entity Framework. It comes from Microsoft, so it is well documented in MSDN style, and it fits your criteria.
I worked with NHibernate and found that the documentation is patchy, inconsistent and sometimes missing. Most of the time I had to use the docs for Hibernate, which were not for NHibernate, just similar.
EF can do the same things as NHibernate and more, and the latest release has Migrations, which NHibernate was missing (at least when I worked with it).
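For a quick feel of it, a minimal EF code-first model and query looks roughly like this (Customer and ShopContext are just example names, and it assumes the EntityFramework package is referenced):

// Sketch only: a minimal Entity Framework code-first model and LINQ query.
using System.Data.Entity;
using System.Linq;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
}

public static class Example
{
    public static string FirstCustomerName()
    {
        using (var db = new ShopContext())
        {
            // Translated to SQL by EF at runtime.
            return db.Customers.OrderBy(c => c.Id)
                     .Select(c => c.Name)
                     .FirstOrDefault();
        }
    }
}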
Consider Dapper: dapper-dot-net.
Dapper is a pretty simple ORM, developed and used by StackOverflow.
There is a lack of documentation, but that is because of its simplicity. You can find some usage examples on the project page or on other websites like this.
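To give an idea of how lightweight it is, typical Dapper usage looks roughly like this (the connection string, Products table and Product class are made up for the example):

// Sketch only: basic Dapper usage. Dapper adds Query/Execute extension
// methods to any open IDbConnection; Product here is a plain class.
using System.Collections.Generic;
using System.Data.SqlClient;
using Dapper;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class ProductQueries
{
    public static IEnumerable<Product> GetByMinPrice(string connectionString, decimal minPrice)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            // Columns are mapped to Product properties by name.
            return connection.Query<Product>(
                "select Id, Name from Products where Price >= @minPrice",
                new { minPrice });
        }
    }
}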
I know this is a pretty old question, but I thought I would post an answer for anyone who lands here. Check out SQL Data. It is extremely simple to use, very powerful and fits all the OP's requirements.

Generate A Simple Read-Only DAL?

I've been looking around for a simple solution to this, trying my best to lean towards something like NHibernate, but so far everything I've found seems to be trying to solve a slightly different problem. Here's what I'm looking at in my current project:
We have an IBM iSeries database as a primary repository for a third party software suite used for our core business (a financial institution). Part of what my team does is write applications that report on or key off of a lot of this data in some way. In the past, we've been manually creating ADO .NET connections (we're using .NET 3.5 and Visual Studio 2008, by the way) and manually writing queries, etc.
Moving forward, I'd like to simplify the process of getting data from there for the development team. Rather than creating connections and queries and all that each time, I'd much rather a developer be able to simply do something like this:
var something = (from t in TableName select t);
And, ideally, they'd just get some IQueryable or IEnumerable of generated entities. This would be done inside a new domain core that I'm building where these entities would live and the applications would interface with it through a request/response service layer.
A few things to note are:
The entities that correspond to the database tables should be generated once and we'd prefer to manually keep them updated over time. That is, if columns/tables are added to the database then we shouldn't have to do anything. (If some are deleted, of course, it will break, but that's fine.) But if we need to use a new column, we should be able to just add it to the necessary class(es) without having to re-gen the whole thing.
The whole thing should be SELECT-only. We're not doing a full DAL here because we don't want to be able to break anything in the database (even accidentally).
We don't need any kind of mapping between our domain objects and the generated entity types. The domain barely covers a fraction of the data that's in there, most of it we'll never need, and we would rather just create re-usable maps manually over time. I already have a logical separation for the DAL where my "repository" classes return domain objects, I'm just looking for a better alternative to manual ADO to be used inside the repository classes.
Any suggestions? It seems like what I'm doing is just enough outside the normal demand for DAL/ORM tools/tutorials online that I haven't been able to find anything. Or maybe I'm just overlooking something obvious?
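For what it's worth, the "better alternative to manual ADO inside the repository classes" part can be covered by a very small generic helper even without a full ORM. A minimal sketch, where the provider name, connection string and the Account entity are just placeholders:

// Sketch only: a thin, SELECT-only helper over plain ADO.NET so repository
// classes don't each re-implement connection/command/reader plumbing.
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.Common;

public class ReadOnlyDb
{
    private readonly DbProviderFactory _factory;
    private readonly string _connectionString;

    public ReadOnlyDb(string providerName, string connectionString)
    {
        _factory = DbProviderFactories.GetFactory(providerName); // e.g. the iSeries ADO.NET provider
        _connectionString = connectionString;
    }

    // Runs a SELECT and maps each row to T with the supplied delegate.
    public IList<T> Query<T>(string sql, Func<IDataRecord, T> map)
    {
        using (var connection = _factory.CreateConnection())
        using (var command = connection.CreateCommand())
        {
            connection.ConnectionString = _connectionString;
            command.CommandText = sql;
            connection.Open();

            var results = new List<T>();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    results.Add(map(reader));
            }
            return results;
        }
    }
}

// Usage inside a repository class (Account is a hand-written entity):
// var accounts = db.Query("SELECT ACCTNO, NAME FROM ACCOUNTS",
//     r => new Account { Number = r.GetString(0), Name = r.GetString(1) });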
You might want to investigate this:
http://www.codesmithtools.com/
I've been on a few projects where this tool was used, and the developers were fond of it.
Warning: it's not freeware.
Adding to code4life's answer.
I recently used CodeSmith 2.6 on a pet project at home.
I tailored the templates to generate Partial classes with partial CRUD methods.
I then extended the partial classes and partial methods to provide basic mapping e.g. when "Order" is read, read the associated "OrderLines" etc.
A similar approach might work for you, and be simpler because really, you only want the R of CRUD :)
Hope this helps.
P.S. I need to switch from CS 2.6 to something else (possibly MyGeneration), as 2.6 is dependent on .NET 1.1, which is a problem as I'm moving to a 64-bit development machine.
UPDATE
I hear ya.
Have a good poke at MyGeneration, mate. We use CodeSmith here at work, which is why I went for 2.6 at home (I had a quick glance at MyGeneration, didn't get it immediately - which was more time than I had to give it - so I switched back to CS).
MyGeneration is basically an open-source version of CS.
Also, why not have it regenerate the code ALWAYS? You can tweak to your heart's content with partial classes and partial methods. I know you think regenerating is overkill, but experience tells me it's one of those things where, if you don't have it, you'll need it, whereas if you have it and don't need it, you'll never notice.

Is it foolish of me not to use NHibernate for my project?

I am working on a .NET web application that uses a SQL Server database with approximately 20 to 30 tables.
Most tables will be included in the .NET solution as classes.
I have written my own data access layer to read the objects from, and write them to the database.
The whole thing consists of just a few classes and very few lines of code, and it uses generics and reflection to figure out what SQL and parameters to use.
Now, such a thing could be done using NHibernate (or a similar framework), and some co-workers claim it is foolish of me not to use it.
My main argument for not using it is that I want maximum control over my application and to know exactly what everything does and how everything works, even if that costs me more development time.
I also don't like the fact that I have to map my database in XML files (my own solution lets me define the mapping in the entity class files).
So, what I would like to hear from you is: is it really stupid not to use NHibernate in this situation?
Am I really being ignorant, or is it not such a strange idea to use my own solution?
I think these days there really isn't any reason to roll your own persistence framework, since there are so many good choices out there. You don't have to use NHibernate (though it is a good choice), but I would seriously consider using something that is well tested and established in the industry, as it will tend to perform better and have fewer bugs than something you write yourself.
It probably is foolish to write your own classes instead of using NHibernate, but it's less foolish to continue using your own classes, given that you've already written them. Maybe.
I won't call you foolish because I've done exactly the same thing in the past. Then I started using NHibernate and wondered why the hell I rolled my own. It's good, give it a go.
You have several possibilities that are probably better than reinventing the wheel. Let me name the two most likely choices:
Use Entity Framework for your DAL+DAO. This will make the classes you've already written obsolete, since EF will generate its own, and you'll be up to date with the latest language capabilities and technologies.
Use Fluent NHibernate so you don't have to work with XML mappings. This way you'll keep the business-layer object classes you've written and avoid tedious NHibernate XML files. It's all C#.
Your way of thinking is good. You want control. That's fine. But using your own DAL is a bit foolish these days, because you are basically reinventing the wheel, plus you'll have untested, potentially buggy code that will take considerable time to develop, test and debug.
If I were you, I'd go with option #2, since I've done option #1 and I know I had to customize lots of things to make EF work as it should. EF will be ready with V2.
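For illustration, a Fluent NHibernate mapping is plain C# instead of an .hbm.xml file; a minimal sketch, where Customer and its columns are hypothetical:

// Sketch only: a Fluent NHibernate class map replacing an .hbm.xml file.
using FluentNHibernate.Mapping;

public class Customer
{
    // Virtual members so NHibernate can proxy the class.
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

public class CustomerMap : ClassMap<Customer>
{
    public CustomerMap()
    {
        Table("Customers");
        Id(x => x.Id);
        Map(x => x.Name);
    }
}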
People tend to use frameworks that are already written because, well, they're already written (and tested).
But there IS merit to rolling your own. Only you and your colleagues can make assumptions about your domain. A generic framework like NHibernate cannot make many assumptions, because that wouldn't make it very universal.
When you roll your own, you can bake these assumptions into your framework, to make a more streamlined, natural API. That said, if you were starting over I would have suggested taking an existing framework and wrapping it to better suit your needs. But since you already have something and it works for you, I'm not sure that I would suggest swapping it out for something else.
It depends on what they mean by "foolish."
If by "foolish" they mean you shouldn't have written your persistence layer in the first place, they're probably right, but that's crying over spilled milk.
If by "foolish" they mean you should rewrite all your existing code to use another framework (like NHibernate) when it's already working with yours, they're probably wrong (although there's something to be said for # of bugs in NHibernate vs likely # of bugs in yours).
If by "foolish" they mean the entire team knows NHibernate cold, and it's already used in the rest of your code, so by using your framework you're making it harder on the team, they're absolutely right, and you should probably refactor the code in NHibernate as soon as possible, before any more code gets locked in to your framework.
If by "foolish" they mean no one there really knows NHibernate, they just like it, then... nobody wins. They're being fussy, you implemented a framework you didn't have to... let's call it a tie.
All of that said, everyone should write a persistence framework or three. Those probably shouldn't end up in anything that ships, but it's a good exercise. The only mistake you made was tying code the team had to maintain into your good exercise.
There are many good persistence tools out there that are well tested and have proven performance (NHibernate, LINQ to Entities, LLBLGen Pro). If your needs are very different from what the existing persistence frameworks cover, then I would roll my own. I would want to take advantage of the testing and optimizations of an existing tool if at all possible, however.
That being said, I might also roll my own if I wanted to have the experience of building my own ORM tool and was willing to live with the downsides (not as well tested or optimized as tools that have been around for years, speed to market).
Making your own solution, especially when it seems to work fine and be as simple as you say, is neither ignorant nor strange. There are lots of situations where it's better to do that than to add a dependency on a separate project like nHibernate.
That said, there are of course also a lot of situations where the complete opposite is true. :)
It really depends on your project and team. If you are developing an enterprise application that will eventually be supported by someone else, sticking to industry standards might be a good idea even if it means a bit more work up front.
All of the answers here are great, but I am really surprised that nobody has mentioned Castle ActiveRecord; it sounds very similar to what your framework does and it really simplifies the interface to NHibernate. It's one of the patterns that made Ruby on Rails so popular, after all!
Ayende Rahien (one of the principal NH developers) gave a GREAT presentation on ActiveRecord at Oredev a few years ago which I highly recommend: http://www.viddler.com/explore/oredev/videos/89
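To show the flavour, a Castle ActiveRecord entity maps itself with attributes and inherits its query methods; a minimal sketch, where Blog and its columns are hypothetical:

// Sketch only: a Castle ActiveRecord entity. Mapping lives in attributes on
// the class itself, and the base class supplies Find-style methods.
using Castle.ActiveRecord;

[ActiveRecord("Blogs")]
public class Blog : ActiveRecordBase<Blog>
{
    [PrimaryKey]
    public virtual int Id { get; set; }

    [Property]
    public virtual string Name { get; set; }
}

// Usage (after the ActiveRecord subsystem has been initialized):
// Blog[] all = Blog.FindAll();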
I think it is a matter of balancing control. You say that you want control and you don't want mappings. If that control comes at the cost of increased development and maintenance effort, and it takes longer to produce working code, then it is a problem.
I personally don't see a problem in rolling a framework as long as it simplifies a repetitive task and makes development more productive and code more stable, due to less room for interpretation. We have rolled our own framework that includes a persistence/data access implementation. Our reasons for doing it, though, were specific. In this case, it was to work within a DDD environment that was much closer to what Evans describes than what most off-the-shelf products were providing.
I think the difference is, though, that we understood there was an upfront cost and that it would eventually balance itself out through savings in development time in the future. Of course, if you are writing code where you manually have to manage connections, map data, etc., you are probably going down the wrong path. At the very least, you could be using something like Enterprise Library to help you manage the tedium of connectivity and command construction. But I also think that if you have no reuse - nothing "framework"-like that you can abstract and apply to other projects - then you are creating a maintenance nightmare and time sink that you will be the sole owner of.
We were also using our own data access layer and entity classes. We also had a code generator that used to generate all these classes for us. But now we are using Entity Framework and we are more than happy.
Simple advice: start learning NHibernate (or whatever you prefer) and start using it in your next project.
EntitySpaces - http://www.entityspaces.net/Portal/Default.aspx - is also a good tool.
I ended up using Fluent NHibernate for the job.
All my entity classes were generated with ActiveRecordGenerator (http://code.google.com/p/active-record-gen/)

Compare and Contrast NHibernate and OpenAccess from Telerik

Have you used the OpenAccess ORM from Telerik? How does it compare to NHibernate? When should I consider using it over NHibernate?
I'm wondering the same thing myself. On one hand, there's NH: free and open source, but with limited support options. On the other, OA: a fairly new addition to a well-known tool provider's box.
OA costs money, but you get support. NH is free, but support, at least in my brief experience, has been limited and slow in coming.
I think both are likely fine products. I've decided to give OA a try since I am already a user of Telerik's tools. OA and its support are being paid for anyway.
NH uses plain classes and objects with no decorations on the class properties whatsoever. OA requires attribute decorations (nicely generated by the OA Visual Studio GUI).
NH requires a "session" in which to do a unit of work with the database; OA calls it "scope". Both use "transaction".
OA has integration with Visual Studio and can both forward- and reverse-map to and from a database. Forward mapping is so you can design your classes and then "push" those into the database for persistence. The "reverse" is for you "domain model" developers which is what I prefer.
OA is definitely undergoing some major updates as Telerik plays "catch-up" following its recent acquisition and release of OpenAccess, formerly owned by Vanatec (out of Germany).
As far as "ease of use" and "performance/scalability" go, I wish I knew where each stood. I'm sure someone out there could put together an honest test between the two and make those determinations.
One thing I like about NH is the available templates for generating the needed code, not just for the "dumb" business objects (which is all OA generates now), but for a BLL and DAL. After much conversation with Telerik, I have the impression they plan more code-generation options so OA is more useful out of the box.
Hope this helps! Someone please try to get some stats on the performance issues.
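To make the NH side of the session/transaction comparison above concrete, the unit-of-work pattern looks roughly like this (building the ISessionFactory from configuration and mappings is omitted, and Customer is a hypothetical mapped class):

// Sketch only: NHibernate's unit-of-work pattern, the "session" referred to above.
using NHibernate;

public static class CustomerRenamer
{
    public static void Rename(ISessionFactory sessionFactory, int customerId, string newName)
    {
        using (ISession session = sessionFactory.OpenSession())
        using (ITransaction tx = session.BeginTransaction())
        {
            var customer = session.Get<Customer>(customerId);
            customer.Name = newName;
            tx.Commit(); // flushes the change back to the database
        }
    }
}

public class Customer
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}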
I've not used it, but one benefit obvious to me is that OpenAccess is supported by Telerik, whereas NHibernate is supported by the community. Depending on your company, this can be a deciding factor in whether you're ready to embrace open-source solutions with no guarantee of support.
Edit
For the record, I am a big supporter of NHibernate, and of open source in general. I have been using NHibernate for the last six months, for all new work in our web application. For my current company it is a good fit (startups love free).
However, my previous employer would have had a very difficult time accepting a community-supported component as a core piece of their infrastructure. That is perfectly reasonable, as these companies' web sites are their sole source of revenue. Would you want to stake your entire business on software that has no accountability associated with it? Some people wouldn't want to take on that risk.
Personally, I have found the support for NHibernate to be on par with, and sometimes even better than, that of some commercial vendors.
My point is not to bash OSS, but to highlight one benefit of using software that has corporate backing, with a fully staffed and dedicated support channel.
One more reason: currently OpenAccess has better performance characteristics; if you need a fast ORM for your project, it will be the better choice. See ORM benchmarks for details.
I would say NHibernate is free and OpenAccess is $399, although CodeSmith with NHibernate templates is $99-$399 if you want NHibernate easily automated. It looks like OpenAccess has more transparency in the data layer and is probably easier to maintain. But if you used something like Spring.NET, you would not only have NHibernate automating the data layer but the service layer automated as well. Take that with a grain of salt, though, because NHibernate and Spring.NET are another batch of configuration files that need to be maintained. I bet OpenAccess is GUI-friendly. Either one works, but there is a lot more info out there on NHibernate.
