I'm a bit new to Dynamics CRM and I'm trying to find the best way to update external systems in real time when specific entities are created or updated in Dynamics CRM 4.0 or 2011. What I've gathered thus far is that my best option is to:
Write a web service for the downstream LOB system.
Write a custom workflow activity that calls the web service, and register the activity assembly inside CRM.
Attach the custom workflow activity to a record created / updated trigger on the underlying entity that I want to receive updates for.
Am I on the right track or is there a better alternative?
What you have seems like a pretty sound design and is probably the way I would approach it.
If you need synchronous behaviour (not sure how real time your real time needs to be here), you might want to consider plugins (different from custom workflow activities), as these can be registered to run synchronously.
Other than that you could have your downstream system read the records from CRM on a frequent, scheduled basis - you can use the modifiedon field to see which records have changed since the last read.
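The scheduled-read approach can be sketched at the pattern level. This is a hedged illustration in Python, not the CRM API: `fetch_modified_since` stands in for whatever query you would actually run against CRM (e.g. a FetchXML query filtered on `modifiedon`), and the record shape is made up.

```python
from datetime import datetime, timezone

def poll_changes(fetch_modified_since, last_poll):
    """Generic delta poll: fetch records whose modifiedon is after the
    previous poll time, then advance the watermark for the next run."""
    now = datetime.now(timezone.utc)  # capture before fetching to avoid gaps
    changed = fetch_modified_since(last_poll)
    return changed, now  # `now` becomes the next last_poll

# In-memory stand-in for the CRM entity store (illustration only)
records = [
    {"id": 1, "modifiedon": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "modifiedon": datetime(2024, 6, 1, tzinfo=timezone.utc)},
]

def fake_fetch(since):
    return [r for r in records if r["modifiedon"] > since]

changed, watermark = poll_changes(fake_fetch, datetime(2024, 3, 1, tzinfo=timezone.utc))
# Only the record modified after the last poll comes back
```

Capturing the new watermark before the fetch (rather than after) means a record modified mid-poll is picked up again next time rather than silently skipped.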
I'm working with SAP Business Objects RESTful Web Services and I need to update an existing report. I can see a few pieces of information on what I want but not the whole picture.
In the user guide there is a section on updating properties and it lists those properties. It appears to be an exhaustive list. But what I need is not in there.
My end goal is to pause the report via web services, as well as the ability to update items like the recurrence, report format, or name of the report, to name a few.
What I'm looking at in the user guide seems to cover the name, but not the others.
What am I missing?
How do I update and/or pause a schedule after it has been created?
Alas, you cannot. The only actions available in the REST SDK regarding schedules for Web Intelligence (up until BI 4.2 SP1) are:
Getting the list of schedules
Getting the details of a schedule
Adding a schedule
Deleting a schedule
If you want to update/pause schedules, then you'll need to resort to a different SDK (Java might be an option, although SAP is moving everything towards REST).
Another possibility is to implement these actions in your own application. E.g., updating a schedule could consist of:
Getting the details of a schedule
Deleting the schedule
Adding the schedule again (with the desired modifications to the original settings, of course).
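That delete-and-re-add workaround can be sketched generically. This is a hypothetical illustration in Python: `client` and all of its method names are made up stand-ins for the actual BO REST calls, not a real SDK.

```python
def update_schedule(client, report_id, schedule_id, changes):
    """Emulate 'update' where the API only offers get/delete/add."""
    details = client.get_schedule(report_id, schedule_id)  # 1. read current settings
    client.delete_schedule(report_id, schedule_id)         # 2. remove the old schedule
    details.update(changes)                                # 3. apply the edits
    return client.add_schedule(report_id, details)         # 4. re-create it

class FakeClient:
    """In-memory stand-in so the pattern can be demonstrated offline."""
    def __init__(self):
        self.schedules = {("r1", "s1"): {"name": "daily", "format": "pdf"}}
    def get_schedule(self, rid, sid):
        return dict(self.schedules[(rid, sid)])
    def delete_schedule(self, rid, sid):
        del self.schedules[(rid, sid)]
    def add_schedule(self, rid, details):
        self.schedules[(rid, "s2")] = details  # server assigns a new id
        return details

client = FakeClient()
new = update_schedule(client, "r1", "s1", {"format": "xlsx"})
```

One caveat worth noting: re-adding typically yields a new schedule ID, so anything that stored the old ID (and any run history tied to it) is affected.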
I am writing a .NET Windows Service whose job is to monitor the status of documents stored in a document database (MongoDB). These documents will be modified from time to time by users via a web site. The Windows Service needs to run every, say, 5 minutes, poll all the documents (hundreds of them), examine them, and see if any need attention from a user (a real person). Users will be notified of required action via email.
The service will run 24/7. There is no current SQL database in the mix, and I don't really want the overhead and expense of maintaining a SQL database just to support this requirement. I do have MSMQ in the mix, alongside MongoDB. I would consider using WF (Windows Workflow Foundation), but is there a lightweight workflow persistence store that does not rely upon SQL?
Can anyone advise as to the best strategy to support this requirement?
Thanks.
Since you can easily write your own implementation of persistence for workflows, I would suggest storing the data in an XML file.
You can find the example of implementation of XML persistence here: XML Instance Store
I'm not sure how many workflows will be started in your case. If the number is big enough, it might make sense to use SQL Server, because the XML-file approach can lead to concurrency and performance problems.
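The general shape of a file-per-instance XML store (independent of WF's actual persistence interfaces, which the linked sample implements) is just serialize-on-save, parse-on-load, keyed by instance id. A minimal language-agnostic sketch, here in Python:

```python
import os
import tempfile
import xml.etree.ElementTree as ET

def save_instance(store_dir, instance_id, state):
    """Persist one workflow instance's state as a small XML file."""
    root = ET.Element("instance", id=instance_id)
    for key, value in state.items():
        ET.SubElement(root, "item", name=key).text = str(value)
    ET.ElementTree(root).write(os.path.join(store_dir, instance_id + ".xml"))

def load_instance(store_dir, instance_id):
    """Rehydrate the state dictionary for one instance."""
    tree = ET.parse(os.path.join(store_dir, instance_id + ".xml"))
    return {item.get("name"): item.text for item in tree.getroot()}

store = tempfile.mkdtemp()
save_instance(store, "wf-001", {"step": "awaiting-user", "retries": "2"})
state = load_instance(store, "wf-001")
```

The concurrency problem mentioned above shows up exactly here: two hosts saving the same instance file at once will clobber each other, which is what a SQL-backed store's locking would otherwise prevent.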
We have a process that needs to fire when a change occurs to a specific database table in Oracle. At the moment a similar process has been developed using triggers and a bunch of subsequent database actions that occur when that trigger is fired.
However, in this solution we want to call a .NET component (most likely a service) when the change occurs to a row or bunch of rows in a database table. Now, you could implement a polling mechanism that checks the table at regular intervals for those modifications and then instantiates the service when it finds any. However, I would prefer a more event-driven approach.
I assume this is something that has been done elsewhere so I was wondering what approaches other people have used for dealing with such requirements?
Thanks in advance
Edit: The process that fires when a change occurs to the underlying data is essentially a call to an external web service using some of the related data. I am beginning to think whether this call should occur as part of the same process that is submitting the data into the database, rather than being triggered by the data change itself.
You should look at Oracle Database Extensions for .NET.
From the linked article:
Oracle Database Extensions for .NET provides the following:
A Common Language Runtime (CLR) host for Oracle Database
Data access through Oracle Data Provider for .NET classes
Oracle Deployment Wizard for Visual Studio .NET
You would still use triggers to detect the db changes but instead of firing all the db-side logic you describe you would now be able to execute that logic from a .NET module.
If you are using Oracle's .NET driver, you can use Oracle Continuous Query Notification (CQN) to do that.
You just give it a normal SELECT query, and it will fire a callback in your app whenever the resultset for that query changes.
The one caveat I know of is that when it initially runs the query to subscribe for continuous notification, it momentarily requires an exclusive lock. Usually it's not a big deal, since you just execute it once at startup, so any other DB queries on the same table will only be blocked for a fraction of a second.
It sounds possible but will likely take some leg work. I think you want to look into the Oracle Access Manager
http://download.oracle.com/docs/cd/E12530_01/oam.1014/e10355/toc.htm
This is similar to Paul's; but does not assume that you have Oracle installed on a Windows machine.
You may use dbms_scheduler to create a job that will call your external process.
You may directly call a remote external job from Oracle (this requires the Oracle Scheduler Agent to be installed, but nothing else).
It requires a bit of leg work to get the authentication set up and such, but this works.
Then you use an event (raised from your trigger) to start your job.
This way, you may actually be able to reuse a lot of what you already have coded and just have Oracle Scheduler handle the rest.
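A rough sketch of the Scheduler side of that wiring. All object names here are made up for illustration, and the actual enqueue call in the trigger is elided (it goes through DBMS_AQ with the Scheduler's event payload type; see the DBMS_SCHEDULER event-based-job documentation for the full setup):

```sql
-- 1. An event-based job that runs an external process when an event arrives
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'notify_job',                    -- made-up name
    job_type        => 'EXECUTABLE',
    job_action      => '/opt/app/notify_downstream.sh', -- made-up path
    event_condition => 'tab.user_data.event_name = ''ROW_CHANGED''',
    queue_spec      => 'notify_event_q',                -- made-up queue
    enabled         => TRUE);
END;
/

-- 2. The table trigger raises the event instead of doing the work itself
CREATE OR REPLACE TRIGGER orders_changed
AFTER INSERT OR UPDATE ON orders
BEGIN
  NULL; -- enqueue a ROW_CHANGED message onto notify_event_q here (via DBMS_AQ)
END;
/
```

Keeping the trigger down to a single enqueue is the point: the transaction that changed the row is not held up by (or rolled back because of) the external call.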
Oracle provides two mechanisms to deal with what you describe in a rather nice way... on the DB side you implement triggers for detecting the changes or whatever should result in an action on the .NET side...
For the communication you use a publish/subscribe mechanism based on Oracle's built-in queuing technology (called Advanced Queuing, AQ) - for a nice example see http://www.devart.com/dotconnect/oracle/docs/AQ.html (not affiliated, just a happy customer).
Another option is to use the built-in DBMS_ALERT package for communication, which is transactional and asynchronous - for an example see http://www.devart.com/dotconnect/oracle/docs/Devart.Data.Oracle~Devart.Data.Oracle.OracleAlerter.html
Just to be clear:
the above technologies (DBMS_ALERT and AQ) are Oracle built-in ones, not specific to any 3rd-party libraries... you just need an ADO.NET provider that supports them...
EDIT - after the EDIT from the OP:
If you have control over the code (or over the call to the code) that triggers the data change (a web service?), then the best way to deal with it is indeed purely on the .NET side of things... this also helps with situations where such a change runs into an error, etc.
I am in the process of putting together a custom workflow activity to be used in Microsoft Dynamics CRM 4.0.
What I would like to ultimately achieve is to configure a workflow that runs on a scheduled basis, i.e. runs every 2 hours Monday to Friday, rather than on a particular "CRM event" like create, delete, status change, etc.
Does anyone have any ideas?
Maybe schedule it outside of CRM?
Edit 1:
What we are doing is processing rows in a staging table that gets generated from a front-end site. We are creating contact/account and opportunity records in CRM based on the data captured from the front-end.
The more I think about it the more I'm thinking that using workflow is possibly not the best solution?
What about a using a windows service?
Workflow was not best option for this situation due to the following:
Can't schedule it to run
The process can only be triggered by a CRM create, update or similar message
I went with a combination of the following:
A SQL CLR sproc that gets called on an UPDATE trigger on the staging table. The CLR sproc calls a webservice that generates CRM contacts/accounts. That way the front-end site can create records and set a "ready to process" flag once all data has been entered.
The requirement changed from a scheduled solution to real-time processing (well, not ACTUALLY real time). The process needs to run as records are entered from the front-end site.
Hope all that makes sense!
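The ready-flag handoff described above boils down to a simple guard: only touch rows the front end has flagged, and mark them processed so the trigger never picks up the same row twice. A sketch in Python with made-up field names standing in for the staging-table columns:

```python
def process_ready_rows(staging_rows, create_in_crm):
    """Process only rows flagged ready and not yet processed; mark each
    one processed so a later trigger firing cannot re-create it."""
    created = []
    for row in staging_rows:
        if row["ready_to_process"] and not row["processed"]:
            created.append(create_in_crm(row))
            row["processed"] = True
    return created

# Illustration: one row fully entered, one still being filled in
staging = [
    {"name": "Acme", "ready_to_process": True,  "processed": False},
    {"name": "Beta", "ready_to_process": False, "processed": False},
]
made = process_ready_rows(staging, lambda r: r["name"])
```

In the real SQL CLR version, the "mark processed" step and the web service call need care around ordering: if the service call fails, the flag should stay unset so the row is retried.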
A Windows workflow using the CRM web services is one option; a better option would be to change your web form to access the CRM web service and enter the data directly.
If you really want to use workflows, you can download a tool from http://www.patrickverbeeten.com/pages/TechnicalNet/MicrosoftCrm/PeriodicWorkflows.aspx?id=23 that you install on your CRM server; it allows you to use Windows scheduled tasks to trigger them.
What would you recommend as an approach to persist data for the following situation:
WPF application (desktop)
Will be capturing information every second (approx.) and will effectively need to store about 5 values per second.
Will need to save data for up to, say, 1 month.
Usage will be both (a) real-time viewing of the last few hours of data, and (b) the ability to view historical data, kind of like an ad hoc query. There would need to be some limited filtering or querying on the data store (whatever it is) prior to presentation.
What approach would be recommended here, ideally one that is easiest and keeps the WPF installation simple?
You could do it using Microsoft's new SQL CE (which allows multithreaded access). It's easy to deploy (I think it's just a matter of including a DLL)...
http://en.wikipedia.org/wiki/SQL_Server_Compact
It should handle the load fine, assuming not a gazillion people will be using it on the same machine. Even then, it would be fairly easy to upgrade.
So you would run a timer or something to push your captured data into it every 5 seconds, then the client would poll as the user loads screens in the client.
Later, you might want to separate it into two apps, one for the data mining, maybe running as a service and the other one as 'the client'. In that case having a server dedicated to the data mining would help.