Fire event on database datetime field - c#

I have a table in the database like this:
What is the best way to trigger an event somewhere (in the SQL Server database or in the C# application) when the time stored in the event field of a row arrives?
Edit:
Traditionally I would have done something like this:
while(true)
{
DataTable tbl = getRows("select * from table where event = '" + DateTime.Now + "'");
if(tbl.rows.Count()>0)
{
//do some thing
}
Thread.Sleep(1000);
}
Is there a more efficient way to achieve this?
(I don't want periodically check Database)

I would go like this:
public class Notify
{
public int ID { get; set; }
public string Name { get; set; }
public DateTime Time { get; set; }
}
public Timer tData = new Timer();
public List<Notify> Notifications = new List<Notify>();
public void Main()
{
GetAllEvents();
tData.Interval = 1000; // check every second
tData.Tick += new EventHandler(CheckEvents);
tData.Start();
}
public void GetAllEvents()
{
DataTable results = new DataTable();
/* results = YourDatabase(Select id, name, event FROM...); */
Notifications.Clear();
foreach(DataRow row in results.Rows)
{
Notifications.Add
(
new Notify
{
ID = int.Parse(row[0].ToString()),
Name = row[1].ToString(),
Time = DateTime.Parse(row[2].ToString())
}
);
}
}
public void CheckEvents(object sender, EventArgs e)
{
IEnumerable<Notify> eventsElapsed = Notifications.Where(notify => notify.Time <= DateTime.Now).ToList();
foreach(Notify notify in eventsElapsed)
{
// Send your sms
var id = notify.ID;
var name = notify.Name;
var time = notify.Time;
Notifications.Remove(notify); // don't fire it again on the next tick
}
}
To-Do's:
Care for the correct format when getting the DateTime out of your database. It might look different.
You would have to think about how to get new events. I would do it either by a button-click or by setting up another timer with an interval of something around 5-10 minutes. You can just call GetAllEvents().
Note that this assumes you aren't inserting new events whose time falls before the next refresh of the list.
Another more advanced way:
You could also setup 2 apps for this.
First app:
Get all Events from Database
Optionally create an UI that allows to filter which events should be get
Link every event to Scheduled Tasks with start arguments like secondApp.exe "2016-06-26 10:13:56".
Second app:
On startup, fetch the passed arguments and send the SMS.
If your table has lots of data, it's not a bad idea to split the time-consuming process of hooking up events (1st app) from the simple process of sending out an SMS (2nd app).
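Registering those Scheduled Tasks from the first app could be sketched like this (the task name and exe path are hypothetical, and schtasks date/time formats are locale-dependent, so verify them on your system):

```csharp
using System;
using System.Diagnostics;

class SmsTaskScheduler
{
    // Creates a one-shot Windows Scheduled Task that launches the second
    // app with the event time as its argument, as described above.
    public static void Schedule(int eventId, DateTime sendOn)
    {
        string args = string.Format(
            "/create /f /tn \"SmsEvent_{0}\" /sc once /st {1:HH:mm} /sd {1:MM/dd/yyyy} " +
            "/tr \"\\\"C:\\apps\\secondApp.exe\\\" \\\"{1:yyyy-MM-dd HH:mm:ss}\\\"\"",
            eventId, sendOn);
        Process.Start("schtasks", args); // requires sufficient privileges
    }
}
```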

Don't mess with triggers at the database for this purpose.
In C# you write the code for your logic - the thing you need to be done. Next designate an object which stores the input parameters + the time to start this logic. E.g. in SMS system we need the subscriber number, the message and the time of sending.
At your system startup (for example) you read the data to obtain the times. Start some threads and make them wait (Sleep) as long as needed, then execute the target method. Something like this:
class SmsDetails
{
public string Subscriber { get; set; }
public string Message { get; set; }
public DateTime SendOn { get; set; }
}
class Program
{
static void Main(string[] args)
{
SqlDataReader schedule = null; // Initialize the reader as appropriate.
while (schedule.Read())
{
var det = new SmsDetails();
//.
//.
//.
det.SendOn = schedule.GetDateTime(2);
ThreadPool.QueueUserWorkItem(ScheduleSending, det);
}
}
static void ScheduleSending(object Details)
{
var smsd = Details as SmsDetails;
if (smsd.SendOn > DateTime.Now)
{
var waitInterval = smsd.SendOn - DateTime.Now;
Thread.Sleep(waitInterval);
SendSms(smsd.Subscriber, smsd.Message);
}
}
static void SendSms(string PhoneNumber, string Message)
{
// Send it out
}
}
Of course this is not the complete code for such a solution but I hope you get the idea. You need to take care to signal (in the DB?) if the message was sent. You can also employ wait handles to interrupt the threads from the calling code.
Depending on the volume of records in the reader, you may want to poll the DB only for the messages due in the next 1, 6, or 12 hours, and so on. E.g. if you need to dispatch 10000 messages over the next 3 days, don't schedule them all now. Having too many threads can degrade the performance.
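If blocking one pool thread per pending message becomes a concern, the same scheduling can be done without blocking via Task.Delay (a sketch reusing SmsDetails and SendSms from the code above; requires .NET 4.5+):

```csharp
using System;
using System.Threading.Tasks;

static async Task ScheduleSendingAsync(SmsDetails smsd)
{
    TimeSpan wait = smsd.SendOn - DateTime.Now;
    if (wait > TimeSpan.Zero)
        await Task.Delay(wait); // no thread is held while waiting
    SendSms(smsd.Subscriber, smsd.Message);
}
```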


Sorted and indexed WinForms ListBox items

There is a client app with a ListBox containing Records sorted by the Record's Time attribute. On application start the client loads Records from the server and starts listening for server updates. When the server informs the client about a new Record, the client adds it to the ListBox (theoretically its Time may fall before the last displayed Record's Time). When the server informs about an update or delete, the client finds the Record by ID and updates or deletes it (theoretically, when the Time of a Record changes, the order of Records in the ListBox must change too).
I guess something like a sorted dictionary with a comparator on the value's Time is required.
public partial class RecordsForm : Form
{
private System.Windows.Forms.ListBox recordsListBox;
private SortedDictionary<long, Record> recordsDictionary;
public RecordsForm()
{
InitializeComponent();
// this is not working because comparer must compare keys
recordsDictionary = new SortedDictionary<long, Record>(new RecordComparer());
var recordsBbinding = new BindingSource();
recordsBbinding.DataSource = recordsDictionary;
recordsListBox.DataSource = recordsBbinding;
}
public void HandleCreateUpdate(Record record)
{
recordsDictionary[record.Id] = record;
}
public void HandleDelete(Record record)
{
if (recordsDictionary.ContainsKey(record.Id))
{
recordsDictionary.Remove(record.Id);
}
}
}
class Record {
public long Id { get; set; }
public DateTime Time { get; set; }
public String Title { get; set; }
}
class RecordComparer : Comparer<Record>
{
public override int Compare(Record left, Record right)
{
return left.Time.CompareTo(right.Time);
}
}
Or is there another pattern used for something like this?
List of Records is always sorted by Time desc. I want to synchronize this list across clients. When one edit/add/delete record other clients may reflects changes without reloading entire list or iterating over all items.
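One pattern that fits this is pairing a Dictionary for O(1) lookup by Id with a SortedSet ordered by (Time desc, Id). A sketch, assuming the Record class from the question and .NET 4.5+ for Comparer<T>.Create:

```csharp
using System;
using System.Collections.Generic;

class RecordStore
{
    private readonly Dictionary<long, Record> byId = new Dictionary<long, Record>();
    // Ties on Time are broken by Id so distinct records never compare as equal.
    private readonly SortedSet<Record> byTime = new SortedSet<Record>(
        Comparer<Record>.Create((a, b) =>
        {
            int c = b.Time.CompareTo(a.Time); // descending by Time
            return c != 0 ? c : a.Id.CompareTo(b.Id);
        }));

    public void AddOrUpdate(Record record)
    {
        Record old;
        if (byId.TryGetValue(record.Id, out old))
            byTime.Remove(old); // drop the stale copy before re-inserting
        byId[record.Id] = record;
        byTime.Add(record);
    }

    public void Delete(long id)
    {
        Record old;
        if (byId.TryGetValue(id, out old))
        {
            byId.Remove(id);
            byTime.Remove(old);
        }
    }

    public IEnumerable<Record> Ordered { get { return byTime; } }
}
```

On each create/update/delete you call AddOrUpdate or Delete and rebind the ListBox to Ordered (or raise a change notification via a BindingList), so no full reload or scan over all items is needed.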

Is it possible to have a Business Event call Custom Code in Acumatica?

When my user adds an appointment, he wants to have 2 things happen:
Send an acknowledgement of the appointment to the customer's cell phone, with information about the appointment
24 hours before the appointment, to have a reminder sent out
The main issue is that the event screen displays dates as UTC. This, of course, confuses the customer. So, I need to change the date and format the text message via code.
When I first did this, there was no way to call a custom method from a business event -- so I actually send the message up to Twilio, catch that with a Webhook that sends it into my custom DLL. That massages the message, making everything look right, and then sends it back through Twilio to the customer.
But this is costly (2 messages sent and 1 received for every event) and needlessly complicated. I want to simplify it now, because I have been told there is added functionality in Business events now that allows a call out into custom code. Is this true?
I was told that this would be available starting in 2020 R2. I am looking for it in the docs and training classes, but I can't see anywhere that this is possible.
How do I call custom code from a business event? Can I set up a subscriber that is in a custom DLL?
Is there something that describes this process somewhere? Or did this never make it into the product?
If you're looking to implement a custom Business Event subscriber (coded in your own dll), I know that is possible in 2021 R1.
There are basic instructions in the Release Notes for Developers starting on page 15.
The upshot is you need to reference the PX.BusinessProcess.dll and implement the PX.BusinessProcess.Subscribers.ActionHandlers.IEventAction interface and either of the PX.BusinessProcess.Subscribers.Factories.IBPSubscriberActionHandlerFactory or PX.BusinessProcess.Subscribers.Factories.IBPSubscriberActionHandlerFactoryWithCreateAction interfaces.
Here is an example blog from crestwood to use business events to create an import scenario: https://www.crestwood.com/2020/05/19/using-business-events-to-create-transactions-employee-birthday-checks/
The gist of it would be to create the generic inquiry to monitor. Next, you would create a business event that ties to the generic inquiry. Go under subscribers, select Create New Subscriber, and then name it. It will load the import scenario, and attach the provider to the event.
For provider object in the business event, you can fill from Results or Previous Results.
This is based on the aborted shipments in sales demo, but it shows my custom action after matching shipment number.
Once saved, you can see your business event show up as a subscriber.
Just for the record, the code I was looking for was in the Acumatica Help files, as TTook suggested. The reason I am publishing more on this is that answers in the help files aren't always reliable down the road, and I wanted a complete version of the code here -- including all of the using directives. So, here it is:
This creates an event and a subscriber to respond to a Business Event. The code shows writing the screen data to a text file.
using System;
using System.Collections.Generic;
using System.Linq;
using PX.BusinessProcess.Subscribers.ActionHandlers;
using PX.BusinessProcess.Subscribers.Factories;
using PX.BusinessProcess.Event;
using PX.BusinessProcess.DAC;
using PX.BusinessProcess.UI;
using System.Threading;
using PX.Data;
using PX.Common;
using PX.SM;
using System.IO;
using PX.Data.Wiki.Parser;
using PX.PushNotifications;
namespace CustomSubscriber
{
//The custom subscriber that the system executes once the business event
//has occurred
public class CustomSMSEventAction : IEventAction
{
//The GUID that identifies a subscriber
public Guid Id { get; set; }
//The name of the subscriber of the custom type
public string Name { get; protected set; }
//The notification template
private readonly Notification _notificationTemplate;
//The method that writes the body of the notification to a text file
//once the business event has occurred
public void Process(MatchedRow[] eventRows, CancellationToken cancellation)
{
using (StreamWriter file =
new StreamWriter(@"C:\tmp\EventRows.txt"))
{
var graph = PXGenericInqGrph.CreateInstance(
_notificationTemplate.ScreenID);
var parameters = eventRows.Select(
r => Tuple.Create<IDictionary<string, object>,
IDictionary<string, object>>(
r.NewRow?.ToDictionary(c => c.Key.FieldName, c => c.Value),
r.OldRow?.ToDictionary(c => c.Key.FieldName,
c => (c.Value as ValueWithInternal)?.ExternalValue ??
c.Value))).ToArray();
var body = PXTemplateContentParser.ScriptInstance.Process(
_notificationTemplate.Body, parameters, graph, null);
file.WriteLine(body);
}
}
//The CustomEventAction constructor
public CustomSMSEventAction(Guid id, Notification notification)
{
Id = id;
Name = notification.Name;
_notificationTemplate = notification;
}
}
//The class that creates and executes the custom subscriber
class CustomSubscriberHandlerFactory :
IBPSubscriberActionHandlerFactoryWithCreateAction
{
//The method that creates a subscriber with the specified ID
public IEventAction CreateActionHandler(Guid handlerId,
bool stopOnError, IEventDefinitionsProvider eventDefinitionsProvider)
{
var graph = PXGraph.CreateInstance<PXGraph>();
Notification notification = PXSelect<Notification,
Where<Notification.noteID,
Equal<Required<Notification.noteID>>>>
.Select(graph, handlerId).AsEnumerable().SingleOrDefault();
return new CustomSMSEventAction(handlerId, notification);
}
//The method that retrieves the list of subscribers of the custom type
public IEnumerable<BPHandler> GetHandlers(PXGraph graph)
{
return PXSelect<Notification, Where<Notification.screenID,
Equal<Current<BPEvent.screenID>>,
Or<Current<BPEvent.screenID>, IsNull>>>
.Select(graph).FirstTableItems.Where(c => c != null)
.Select(c => new BPHandler
{
Id = c.NoteID,
Name = c.Name,
Type = LocalizableMessages.CustomNotification
});
}
//The method that performs redirection to the subscriber
public void RedirectToHandler(Guid? handlerId)
{
var notificationMaint =
PXGraph.CreateInstance<SMNotificationMaint>();
notificationMaint.Message.Current =
notificationMaint.Notifications.
Search<Notification.noteID>(handlerId);
PXRedirectHelper.TryRedirect(notificationMaint,
PXRedirectHelper.WindowMode.New);
}
//A string identifier of the subscriber type that is
//exactly four characters long
public string Type
{
get { return "CTTP"; }
}
//A string label of the subscriber type
public string TypeName
{
get { return LocalizableMessages.CustomNotification; }
}
//A string identifier of the action that creates
//a subscriber of the custom type
public string CreateActionName
{
get { return "NewCustomNotification"; }
}
//A string label of the button that creates
//a subscriber of the custom type
public string CreateActionLabel
{
get { return LocalizableMessages.CreateCustomNotification; }
}
//The delegate for the action that creates
//a subscriber of the custom type
public Tuple<PXButtonDelegate, PXEventSubscriberAttribute[]>
getCreateActionDelegate(BusinessProcessEventMaint maintGraph)
{
PXButtonDelegate handler = (PXAdapter adapter) =>
{
if (maintGraph.Events?.Current?.ScreenID == null)
return adapter.Get();
var graph = PXGraph.CreateInstance<SMNotificationMaint>();
var cache = graph.Caches<Notification>();
var notification = (Notification)cache.CreateInstance();
var row = cache.InitNewRow(notification);
row.ScreenID = maintGraph.Events.Current.ScreenID;
cache.Insert(row);
var subscriber = new BPEventSubscriber();
var subscriberRow =
maintGraph.Subscribers.Cache.InitNewRow(subscriber);
subscriberRow.Type = Type;
subscriberRow.HandlerID = row.NoteID;
graph.Caches[typeof(BPEventSubscriber)].Insert(subscriberRow);
PXRedirectHelper.TryRedirect(graph,
PXRedirectHelper.WindowMode.NewWindow);
return adapter.Get();
};
return Tuple.Create(handler,
new PXEventSubscriberAttribute[]
{new PXButtonAttribute {
OnClosingPopup = PXSpecialButtonType.Refresh}});
}
}
//Localizable messages
[PXLocalizable]
public static class LocalizableMessages
{
public const string CustomNotification = "Custom SMS Notification";
public const string CreateCustomNotification = "Custom SMS Notification";
}
}

How to insert 1M models into SQL Server

I have a file with 1 million lines. I read these lines and insert them into MSSQL. The reading operation takes about a second, but the insertion is slow (time: 00:03:36.1424842).
public async Task<int> InsertAsync(List<Model> models)
{
var _connectionString =
"Data Source=(localdb)\\MSSQLLocalDB;Initial Catalog=test;Integrated Security=True;Connect Timeout=30;Encrypt=False;TrustServerCertificate=False;ApplicationIntent=ReadWrite;MultiSubnetFailover=False";
var result = 0;
try
{
using (var sqlBulk = new SqlBulkCopy(_connectionString))
{
sqlBulk.BatchSize = 10000;
sqlBulk.DestinationTableName = "Counterparty";
var dt = DataTableHelpers.ListToDataTable(models);
await sqlBulk.WriteToServerAsync(dt); // await, since the method is declared async
result = dt.Rows.Count;
}
}
catch (Exception e)
{
_logger.Debug($"{e.Message} >>> {e.StackTrace}");
}
return result;
}
My models:
public class Model
{
public int Id { get; set; }
public string Name { get; set; }
public string Comment { get; set; }
public string Address { get; set; }
public string Phone { get; set; }
public bool IsActive { get; set; }
}
and file lines:
TestIsert703,Comment694,Adress694,816,1
TestIsert704,Comment695,Adress695,817,1
I tried changing sqlBulk.BatchSize, but it didn't help. How can I insert with good performance? Can I somehow use Parallel.For? The load on the laptop is minimal: about 1 GB of RAM in use, and the CPU is mostly idle.
Are you complaining about inserting a million records in three and a half minutes? You're getting more than 4,500 records per second!
If you really need to speed this up: I see the CPU and RAM use are low, and I bet at least some of the time is spent in the ListToDataTable() method. You might reduce the time by splitting up that part of the work to take more advantage of the hardware.
On the SQL Server side, you can improve this job by switching the recovery model from FULL to SIMPLE (or even BULK_LOGGED), but that's not something I'd want to do all the time. I also see this is a LocalDB instance. Does SQL Server have access to enough RAM on the system? That can make a huge difference.
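On the client side, a couple of SqlBulkCopy options are also worth trying (a sketch reusing the question's table name; actual gains depend on your data, indexes, and recovery model):

```csharp
using System.Data.SqlClient;
using System.Threading.Tasks;

public static async Task BulkInsertAsync(string connectionString, System.Data.DataTable dt)
{
    // TableLock takes a single bulk-update lock and enables minimally
    // logged inserts when the recovery model allows it.
    using (var sqlBulk = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock))
    {
        sqlBulk.DestinationTableName = "Counterparty";
        sqlBulk.BatchSize = 50000;   // fewer, larger batches than the original 10000
        sqlBulk.BulkCopyTimeout = 0; // no timeout for large loads
        await sqlBulk.WriteToServerAsync(dt);
    }
}
```

For even less overhead, WriteToServer also accepts an IDataReader, which lets you stream rows from the file instead of materializing a million-row DataTable first.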

Best way to send multiple email types in ASP.NET MVC

Hi there to the good friends of SO!
This is more of a design question so I'll get into a detailed example.
Let me explain the way we're sending emails.
In various parts of the application, we create entries in our Notification table for different kinds of email we might have to send.
For eg: The NotificationQueue table looks like this:
NotificationQueueID  OrderID  EmailType          Notes      SentDatetime
1                    461196   OrderUpdate        SomeNote1  2020-09-01 14:45:13.153
2                    461194   OrderCancellation  SomeNote2  2020-09-01 14:45:13.153
It's accessed using the property in the DbContext as:
public DbSet<NotificationQueue> NotificationQueues { get; set; }
The different types of email are modeled in an enum:
public enum TypeOfEmail
{
OrderCancellation,
OrderUpdate
}
We have an EmailModel class with a TicketsInNotificationQueue property that holds lists for each of the email types we have. For example, at any given time it can hold a list of either UpdatedTickets or CancelledTickets. The EmailType says what type of tickets are in the TicketsInNotificationQueue property.
public class EmailModel
{
public EmailModel(TypeOfEmail emailType, TicketsInNotificationQueue ticketsInNotificationQueue)
{
EmailType = emailType;
TicketsInNotificationQueue = ticketsInNotificationQueue;
}
public TypeOfEmail EmailType { get; set; }
public TicketsInNotificationQueue TicketsInNotificationQueue { get; set; }
}
public class TicketsInNotificationQueue
{
public List<OrderCancellation> CancelledTickets { get; set; }
public List<OrderUpdate> UpdatedTickets { get; set; }
}
public class OrderCancellation : CommonOrderInformation
{
public string SomeOrderId { get; set; }
}
public class OrderUpdate: CommonOrderInformation
{
public string SomeUpdateRelatedProperty { get; set; }
}
public class CommonOrderInformation
{
public int NotificationQueueId { get; set; }
public string ReferenceNumber { get; set; }
}
There's a method that retrieves tickets from Notification table:
public async Task<TicketsInNotificationQueue> GetTicketsfromNotificationQueueAsync(TypeOfEmail emailType)
{
var ticketsInNotificationQueue = new TicketsInNotificationQueue();
using (var dbCon = GetSomeDbContext())
{
var notifications = dbCon.NotificationQueues.Where(x => x.EmailType == emailType.ToString()).ToList();
foreach (var ntf in notifications)
{
if (ntf.EmailType == TypeOfEmail.OrderCancellation.ToString())
{
if (ticketsInNotificationQueue.CancelledTickets == null)
{
ticketsInNotificationQueue.CancelledTickets = new List<OrderCancellation>();
}
ticketsInNotificationQueue.CancelledTickets.Add(new OrderCancellation()
{
NotificationQueueId = ntf.NotificationQueueID,
ReferenceNumber = ntf.OrderID,
SomeOrderId = "Something from a table."
});
}
else if (ntf.EmailType == TypeOfEmail.OrderUpdate.ToString())
{
if (ticketsInNotificationQueue.UpdatedTickets == null)
{
ticketsInNotificationQueue.UpdatedTickets = new List<OrderUpdate>();
}
ticketsInNotificationQueue.UpdatedTickets.Add(new OrderUpdate()
{
NotificationQueueId = ntf.NotificationQueueID,
ReferenceNumber = ntf.OrderID,
SomeUpdateRelatedProperty = "Something from a table."
});
}
}
}
return ticketsInNotificationQueue;
}
Now I just take this list, and filter out the notificationIds for the type of tickets that I just received, and work on them down the line. (I need those notificationIds to set the SentDatetime after the notification has been sent).
var ticketsReceived = false;
notificationIds = new List<int>();
if (ticketsInNotificationQueue.CancelledTickets != null && ticketsInNotificationQueue.CancelledTickets.Any())
{
ticketsReceived = true;
notificationIds = ticketsInNotificationQueue.CancelledTickets.Select(x => x.NotificationQueueId).ToList();
}
else if (ticketsInNotificationQueue.UpdatedTickets != null && ticketsInNotificationQueue.UpdatedTickets.Any())
{
ticketsReceived = true;
notificationIds = ticketsInNotificationQueue.UpdatedTickets.Select(x => x.NotificationQueueId).ToList();
}
if (ticketsReceived)
{
// Proceed with the process of sending the email, and setting the `SentDateTime`
}
The problem I see here is that as the number of email types grows, say to 10-20, the method that retrieves tickets and the code that filters them later grow so big that readability and manageability spin out of control, which I'm not liking at all. The worst parts are where I check which emailType was requested in the fetch and which emailType was received (to get the corresponding notificationIds for the SentDateTime update).
So is there some other way to design this workflow (I'm even open to using reflection and such) to make it more manageable and concise?
Any help would be greatly appreciated!
There are significant improvements that you can make to the existing system and the existing code. In the interest of a more complete answer, I'm going to recommend a not-too-expensive system overhaul and then proceed to your exact answer.
A different and industry standard approach
You already have the data structure correct, this is a perfect job for distributed persistent queues, where you don't need to worry about querying the database as much; instead you just enqueue the messages and have a processor that deals with them. Since you're using C# and .net, I strongly encourage you to check out Azure Service Bus. This is effectively a large queue where you can send messages (in your case send email requests) and you can enqueue your messages to different channels in the service bus depending on their type.
You could also look into creating a queue processor; Azure Functions have a trigger for queues out of the box. Once your email is sent, you can write to your DB that it was sent.
So, the good design looks like
Set up distributed persistent queues / channels and enqueue the email requests to them directly.
If you want to process them at a cadence, run your processor using cron - which most industry solutions support.
If you want to process them as they are ending up in the queue, use a trigger.
You can enrich your processor based on your scenario; it looks like it has something to do with orders, so you may need to handle cases like not sending an already-queued email after an order is cancelled, etc.
Improving what you have
Due to some circumstances, the solution above might not be available to you - so let's get to it.
See how to refactor switch statements (since you have one with if / else ifs)
https://sourcemaking.com/refactoring/smells/switch-statements
Ways to eliminate switch in code
You could achieve this through polymorphism: create a base mail type and override the behaviors in subclasses. This way you can associate the correct queue with the correct email type.
Example:
var results = await getSomeEmails(OrderMail);
// returns a separate processor inherited from the base one, implemented in different ways.
var processor = ProcessorFactory.Create(OrderMail);
await processor.Send(results);
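A minimal sketch of that factory shape might look like this (all names are illustrative, not from the original code; only the TypeOfEmail enum is taken from the question):

```csharp
using System;
using System.Threading.Tasks;

public enum TypeOfEmail { OrderCancellation, OrderUpdate }

// Base processor: each email type knows how to fetch and send its own tickets.
public abstract class EmailProcessor
{
    public abstract Task SendAsync();
}

public class CancellationProcessor : EmailProcessor
{
    public override Task SendAsync()
    { /* fetch + send cancellation emails */ return Task.CompletedTask; }
}

public class UpdateProcessor : EmailProcessor
{
    public override Task SendAsync()
    { /* fetch + send update emails */ return Task.CompletedTask; }
}

public static class ProcessorFactory
{
    public static EmailProcessor Create(TypeOfEmail type)
    {
        switch (type)
        {
            case TypeOfEmail.OrderCancellation: return new CancellationProcessor();
            case TypeOfEmail.OrderUpdate:       return new UpdateProcessor();
            default: throw new ArgumentOutOfRangeException("type");
        }
    }
}
```

Adding a new email type then means adding one subclass and one case, instead of growing the fetch-and-filter method.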
Some more improvements
foreach (var ntf in notifications)
{
if (ntf.EmailType == TypeOfEmail.OrderCancellation.ToString())
You are checking the email type over and over unnecessarily in this loop; move those checks above the loop and test the passed-in parameter instead, since you already know the type you're querying for.
Thank you for the answer @Mavi Domates.
But this is what I ended up doing:
I modified the EmailModel's TicketsInNotificationQueue property so that instead of having different classes for different email types, we have one common class. This avoids the checks for what kind of email was requested, both in the fetch logic and when retrieving notification Ids later (to update SentDateTime after the email is sent), as described in the original question.
public class EmailModel
{
public EmailModel(TypeOfEmail emailType, IEnumerable<CommonEmailModel> ticketsInNotificationQueue)
{
EmailType = emailType;
TicketsInNotificationQueue = ticketsInNotificationQueue;
}
public TypeOfEmail EmailType { get; set; }
public IEnumerable<CommonEmailModel> TicketsInNotificationQueue { get; set; }
}
public enum TypeOfEmail
{
OrderCancellation,
OrderUpdate
}
I added a new class called: CommonEmailModel and removed all those different email type classes (classes for OrderCancellation, OrderUpdate etc.).
public class CommonEmailModel
{
// Common to all email types. A lot of email types only need these first 4 properties
public string EmailType { get; set; }
public int NotificationQueueId { get; set; }
public string OrderId { get; set; }
public string Notes { get; set; }
// Cancellation related
public string SomeOrderId { get; set; }
// Update related
public string SomeUpdateRelatedProperty { get; set; }
public static async Task<IEnumerable<CommonEmailModel>> GetEmailBodyRecordsAsync(TypeOfEmail emailType)
{
var emailModels = new List<CommonEmailModel>();
var emailEntries = await EmailNotificationQueue.GetEmailEntriesAsync(emailType);
var relevantOrdIds = emailEntries.Select(x => x.OrderID).Distinct().ToList();
List<Order> orders; // declared outside the using block so it stays in scope
using (var dbCon = GetSomeDbContext())
{
orders = dbCon.Orders.Where(x => relevantOrdIds.Contains(x.OrdNumber)).ToList();
}
foreach (var record in emailEntries)
{
var emailModel = new CommonEmailModel
{
EmailType = emailType.ToString(),
NotificationQueueId = record.NotificationQueueID,
OrderId = record.OrderID,
Notes = record.Notes,
SomeOrderId = orders?.FirstOrDefault(o => o.OrdNumber == record.OrderID)?.SomeOrderIdINeed,
SomeUpdateRelatedProperty = orders?.FirstOrDefault(o => o.OrdNumber == record.OrderID)?.UpdateRelatedPropertyINeed
};
emailModels.Add(emailModel);
}
return emailModels;
}
}
I just get the records the following way:
var emailRecords = await CommonEmailModel.GetEmailBodyRecordsAsync(emailType);
And simply pass this to the EmailModel constructor as the ticketsInNotificationQueue parameter. No need for all those extra checks to figure out whether records of a certain emailType were requested. The views for OrderCancellation and OrderUpdate will use the common properties plus their respective relevant properties from the CommonEmailModel class.
if (emailRecords.Any())
{
var emailModel = new EmailModel(emailType, emailRecords);
}
Now all I have to do is pass the notification Ids to a method that marks the SentDateTime column with the current timestamp by simply calling:
if (emailWasSent)
{
await UpdateNotificationSentTimeAsync(emailRecords.Select(t => t.NotificationQueueId));
}
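The UpdateNotificationSentTimeAsync call could be a straightforward EF update along these lines (a sketch; GetSomeDbContext, NotificationQueues, and the SentDatetime column are as in the question):

```csharp
public async Task UpdateNotificationSentTimeAsync(IEnumerable<int> notificationIds)
{
    var ids = notificationIds.ToList();
    using (var dbCon = GetSomeDbContext())
    {
        var rows = dbCon.NotificationQueues
                        .Where(n => ids.Contains(n.NotificationQueueID))
                        .ToList();
        foreach (var row in rows)
            row.SentDatetime = DateTime.UtcNow; // mark as sent
        await dbCon.SaveChangesAsync();
    }
}
```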
In the future, as we add new email types (most will probably only need the first four common properties in CommonEmailModel), we can simply add new properties to CommonEmailModel to accommodate them and create a new view. This way I avoid code repetition and complexity in the fetch, and also when updating SentDateTime at the end.

Silverlight 4 + WCF RIA - Data Service Design Best Practices

Hey all. I realize this is a rather long question, but I'd really appreciate any help from anyone experienced with RIA services. Thanks!
I'm working on a Silverlight 4 app that views data from the server. I'm relatively inexperienced with RIA Services, so have been working through the tasks of getting the data I need down to the client, but every new piece I add to the puzzle seems to be more and more problematic. I feel like I'm missing some basic concepts here, and it seems like I'm just 'hacking' pieces on, in time-consuming ways, each one breaking the previous ones as I try to add them. I'd love to get the feedback of developers experienced with RIA services, to figure out the intended way to do what I'm trying to do. Let me lay out what I'm trying to do:
First, the data. The source of this data is a variety of sources, primarily created by a shared library which reads data from our database, and exposes it as POCOs (Plain Old CLR Objects). I'm creating my own POCOs to represent the different types of data I need to pass between server and client.
DataA - This app is for viewing a certain type of data, lets call DataA, in near-realtime. Every 3 minutes, the client should pull data down from the server, of all the new DataA since the last time it requested data.
DataB - Users can view the DataA objects in the app, and may select one of them from the list, which displays additional details about that DataA. I'm bringing these extra details down from the server as DataB.
DataC - One of the things that DataB contains is a history of a couple important values over time. I'm calling each data point of this history a DataC object, and each DataB object contains many DataCs.
The Data Model - On the server side, I have a single DomainService:
[EnableClientAccess]
public class MyDomainService : DomainService
{
public IEnumerable<DataA> GetDataA(DateTime? startDate)
{
/*Pieces together the DataAs that have been created
since startDate, and returns them*/
}
public DataB GetDataB(int dataAID)
{
/*Looks up the extended info for that dataAID,
constructs a new DataB with that DataA's data,
plus the extended info (with multiple DataCs in a
List<DataC> property on the DataB), and returns it*/
}
//Not exactly sure why these are here, but I think it
//wouldn't compile without them for some reason? The data
//is entirely read-only, so I don't need to update.
public void UpdateDataA(DataA dataA)
{
throw new NotSupportedException();
}
public void UpdateDataB(DataB dataB)
{
throw new NotSupportedException();
}
}
The classes for DataA/B/C look like this:
[KnownType(typeof(DataB))]
public partial class DataA
{
[Key]
[DataMember]
public int DataAID { get; set; }
[DataMember]
public decimal MyDecimalA { get; set; }
[DataMember]
public string MyStringA { get; set; }
[DataMember]
public DateTime MyDateTimeA { get; set; }
}
public partial class DataB : DataA
{
[Key]
[DataMember]
public int DataAID { get; set; }
[DataMember]
public decimal MyDecimalB { get; set; }
[DataMember]
public string MyStringB { get; set; }
[Include] //I don't know which of these, if any, I need?
[Composition]
[Association("DataAToC","DataAID","DataAID")]
public List<DataC> DataCs { get; set; }
}
public partial class DataC
{
[Key]
[DataMember]
public int DataAID { get; set; }
[Key]
[DataMember]
public DateTime Timestamp { get; set; }
[DataMember]
public decimal MyHistoricDecimal { get; set; }
}
I guess a big question I have here is... Should I be using Entities instead of POCOs? Are my classes constructed correctly to be able to pass the data down correctly? Should I be using Invoke methods instead of Query (Get) methods on the DomainService?
On the client side, I'm having a number of issues. Surprisingly, one of my biggest ones has been threading. I didn't expect there to be so many threading issues with MyDomainContext. What I've learned is that you only seem to be able to create MyDomainContext objects on the UI thread, all of the queries are asynchronous only, and if you try to fake doing it synchronously by blocking the calling thread until the LoadOperation finishes, you have to do so on a background thread, since it uses the UI thread to make the query. So here's what I've got so far.
The app should display a stream of the DataA objects, spreading each 3min chunk of them over the next 3min (so they end up displayed 3min after the occurred, looking like a continuous stream, but only have to be downloaded in 3min bursts). To do this, the main form initializes, creates a private MyDomainContext, and starts up a background worker, which continuously loops in a while(true). On each loop, it checks if it has any DataAs left over to display. If so, it displays that Data, and Thread.Sleep()s until the next DataA is scheduled to be displayed. If it's out of data, it queries for more, using the following methods:
public DataA[] GetDataAs(DateTime? startDate)
{
_loadOperationGetDataACompletion = new AutoResetEvent(false);
LoadOperation<DataA> loadOperationGetDataA = null;
loadOperationGetDataA =
_context.Load(_context.GetDataAQuery(startDate),
System.ServiceModel.DomainServices.Client.LoadBehavior.RefreshCurrent, false);
loadOperationGetDataA.Completed += new
EventHandler(loadOperationGetDataA_Completed);
_loadOperationGetDataACompletion.WaitOne();
List<DataA> dataAs = new List<DataA>();
foreach (var dataA in loadOperationGetDataA.Entities)
dataAs.Add(dataA);
return dataAs.ToArray();
}
private static AutoResetEvent _loadOperationGetDataACompletion;
private static void loadOperationGetDataA_Completed(object sender, EventArgs e)
{
_loadOperationGetDataACompletion.Set();
}
Seems kind of clunky trying to force it into being synchronous, but since this already is on a background thread, I think this is OK? So far, everything actually works, as much of a hack as it seems like it may be. It's important to note that if I try to run that code on the UI thread, it locks, because it waits on the WaitOne() forever, locking the thread, so it can't make the Load request to the server.
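As an alternative to blocking on an AutoResetEvent, the Completed event can be wrapped in a TaskCompletionSource so callers simply await the load and no thread is blocked (a sketch using the same context and query; assumes the TPL is available to the Silverlight project, e.g. via the async targeting pack):

```csharp
public Task<DataA[]> GetDataAsAsync(DateTime? startDate)
{
    var tcs = new TaskCompletionSource<DataA[]>();
    var op = _context.Load(_context.GetDataAQuery(startDate),
        System.ServiceModel.DomainServices.Client.LoadBehavior.RefreshCurrent, false);
    op.Completed += (s, e) =>
    {
        if (op.HasError)
        {
            op.MarkErrorAsHandled();      // prevent the unhandled-error exception
            tcs.SetException(op.Error);
        }
        else
        {
            tcs.SetResult(op.Entities.ToArray()); // requires System.Linq
        }
    };
    return tcs.Task;
}
```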
So once the data is displayed, users can click on one as it goes by to fill a details pane with the full DataB data about that object. To do that, I have the details pane user control subscribing to a selection event I have set up, which gets fired when the selection changes (on the UI thread). I use a similar technique there to get the DataB object:
void SelectionService_SelectedDataAChanged(object sender, EventArgs e)
{
DataA dataA = /*Get the selected DataA*/;
MyDomainContext context = new MyDomainContext();
var loadOperationGetDataB =
context.Load(context.GetDataBQuery(dataA.DataAID),
System.ServiceModel.DomainServices.Client.LoadBehavior.RefreshCurrent, false);
loadOperationGetDataB.Completed += new
EventHandler(loadOperationGetDataB_Completed);
}
private void loadOperationGetDataB_Completed(object sender, EventArgs e)
{
this.DataContext =
((LoadOperation<DataB>)sender).Entities.SingleOrDefault();
}
Again, it seems kinda hacky, but it works... except on the DataB that it loads, the DataCs list is empty. I've tried all kinds of things there, and I don't see what I'm doing wrong to allow the DataCs to come down with the DataB. I'm about ready to make a 3rd query for the DataCs, but that's screaming even more hackiness to me.
It really feels like I'm fighting against the grain here, like I'm doing this in an entirely unintended way. If anyone could offer any assistance, and point out what I'm doing wrong here, I'd very much appreciate it!
Thanks!
I have to say it does seem a bit overly complex.
If you use the Entity Framework (which, as of version 4, can generate POCOs if you need) and RIA/LINQ, you can do all of that implicitly using lazy loading, 'expand' statements, and table-per-type inheritance.
