I have a project, part of which monitors changes made to a SQL database. I am using the SqlTableDependency NuGet package to monitor changes so I can update the UI when changes are made.
The issue I have is that a function in my program can add 50-99k rows to a table in the database. The event gets triggered as many times as there are rows added, which is a problem because I do not want to update the UI 99k times; I want to update it once or twice at most. How I handle it right now: when I detect that 5 events have been triggered within a certain timespan, I DeInit the TableDependency, then a delayed task re-enables it after a few seconds and triggers a UI update at the end so nothing is missed while it was temporarily disabled.
I also tried using a static bool for rate limiting instead of DeIniting and ReIniting the TableDependency, but it sometimes takes 30-90s because the event handler cannot reasonably keep up with them all. I think the DeInit works better because removing the callbacks from the event handler appears to clear the pending events. I could not find any other way to clear the event handler's massive queue of events.
I tried delving into Reactive Extensions and using the Throttle function. This worked OK except that the first event received would not trigger anything; it would wait until the events died off to trigger (I realize this is by design). This makes the program feel unresponsive for a while, because by the time the events fire SQL has already added all the rows, so really all I need are the first event and the last event.
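For reference, the behaviour I actually want looks roughly like this in Rx -- a leading-plus-trailing debounce (this is only a sketch; changes is a Subject<Unit> I would feed from the OnChanged handler, and the timespan is arbitrary):

var changes = new Subject<Unit>();  // the OnChanged handler would call changes.OnNext(Unit.Default)

IObservable<Unit> updates = changes
    .Publish(shared => shared
        .Take(1)                                                   // leading edge: react to the first event immediately
        .Concat(shared.Throttle(TimeSpan.FromSeconds(2)).Take(1))  // trailing edge: once more after the burst dies down
        .Repeat());                                                // re-arm for the next burst

updates
    .ObserveOn(SynchronizationContext.Current)  // marshal back to the UI thread
    .Subscribe(_ => RefreshUi());               // RefreshUi is my UI update method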
The reason I am trying to find an alternative is that TableDependency sometimes (no idea how to replicate this yet) orphans trigger scripts in SQL on the table with invalid ids, which causes fatal exceptions when the DB instance (I am using EF6 Core) runs SaveChanges(). I theorize that running the DeInit and Init functions frequently is at best not helping the issue and at worst the direct cause of it. So I am looking for a way to avoid frequently DeIniting and ReIniting the TableDependency while still keeping my UI updates responsive and performance acceptable.
DeInit function:
private static void DeInitDependency(TableType tableType)
{
    if (tableType == TableType.Event)
    {
        eventTableDependency.Stop();
        eventTableDependency.Dispose();
        eventTableDependency.OnChanged -= SqlDependencyEventTable_OnChanged;
        eventTableDependency.OnError -= DepEvent_OnError;
        eventTableDependency.OnStatusChanged -= DepEvent_OnStatusChanged;
        eventChangeTrackingStarted = false;
    }
    else if (tableType == TableType.Location)
    {
        locationTableDependency.Stop();
        locationTableDependency.Dispose();
        locationTableDependency.OnChanged -= SqlDependencyLocationTable_OnChanged;
        locationTableDependency.OnError -= DepLocation_OnError;
        locationTableDependency.OnStatusChanged -= DepLocation_OnStatusChanged;
        locationChangeTrackingStarted = false;
    }
}
Init/Reinit Function:
public static void InitDependency(TableType tableType)
{
    try
    {
        //Set Connection String to SQL
        string dbConnectionString = sqlCore.generateConnectionString();

        if (tableType == TableType.Event)
        {
            //Create Dependency and Connect
            eventTableDependency = new SqlTableDependency<NextGenGui.Models.Event>(dbConnectionString, executeUserPermissionCheck: false);
            eventTableDependency.OnChanged += SqlDependencyEventTable_OnChanged;
            eventTableDependency.OnError += DepEvent_OnError;
            eventTableDependency.OnStatusChanged += DepEvent_OnStatusChanged;
            eventTableDependency.Start();
            eventChangeTrackingStarted = true;
            Debug.WriteLine("Event SQL TRACKING STARTED!");
        }
        else if (tableType == TableType.Location)
        {
            locationTableDependency = new SqlTableDependency<Models.Location>(dbConnectionString, executeUserPermissionCheck: false);
            locationTableDependency.OnChanged += SqlDependencyLocationTable_OnChanged;
            locationTableDependency.OnError += DepLocation_OnError;
            locationTableDependency.OnStatusChanged += DepLocation_OnStatusChanged;
            locationTableDependency.Start();
            locationChangeTrackingStarted = true;
            Debug.WriteLine("Location SQL TRACKING STARTED!");
        }
    }
    catch (Exception ex)
    {
        Debug.WriteLine(ex);
        if (ex.Message.Contains("Service broker"))
        {
            InitSQLBrokerSetting();
        }
    }
}
It sounds like you need one event per business-level operation, instead of one event per table update. If that's the case, then you're going to have to look at implementing your own solution. Topics like SignalR and ServiceBus are good starting points to look into. This stream of business operations is a useful thing to implement anyway, for auditing and caching.
It's worth pointing out that you don't have to completely replace the SQL Table Dependency in one go. You can start with just the tables that are causing problems from the bulk operations.
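As a sketch of the general shape (names here are illustrative, not from your code), the bulk operation itself raises one business-level signal when it completes; SignalR or a service bus would carry the same message across process boundaries:

// one notification per bulk operation instead of one per row (illustrative sketch)
public event EventHandler<int> BulkImportCompleted;

public async Task ImportRowsAsync(IReadOnlyCollection<Event> rows)
{
    await bulkInserter.InsertAsync(rows);           // the 50-99k row operation
    BulkImportCompleted?.Invoke(this, rows.Count);  // one signal, not 99k
}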
You effectively need to debounce the event signaling. Rather than removing the event handler (which means you won't know which rows changed during that period), could your handler simply set markers in in-memory state based on what the current UI cares about? For instance, if the UI is displaying one or more key records, the handler knows which IDs are relevant; if any of those rows are touched, the markers are set, and a periodic check looks at the markers and refreshes the view. This might be a single row the user is viewing, or a set of row IDs based on something like search results, etc.
If instead the UI reflects a summary of all data state and any row change would impact it, then consider an in-memory representation that the event handler can update and that is periodically checked, refreshing the view if necessary.
Ultimately it depends on what the view is currently displaying and its relationship to the data being monitored. Libraries like SignalR are typically employed for server-to-client signaling, where actions invoked by one client are relayed to other clients so they can update data or refresh their view. For changes coming from the database, you would probably want to implement some manner of filtering and processing that monitors for relevant changes and raises a signal for a stale-view check to pick up on and refresh.
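As a rough sketch of the marker idea (names are illustrative; RecordChangedEventArgs<T> is SqlTableDependency's event args type, and the timer interval is arbitrary):

private static int eventTableDirty;  // 0 = clean, 1 = dirty

private static void SqlDependencyEventTable_OnChanged(object sender,
    RecordChangedEventArgs<NextGenGui.Models.Event> e)
{
    // cheap per-row work only: a 99k-row burst costs 99k flag writes, not 99k UI updates
    Interlocked.Exchange(ref eventTableDirty, 1);
}

// a DispatcherTimer ticking every ~500 ms on the UI thread
private void RefreshTimer_Tick(object sender, EventArgs e)
{
    if (Interlocked.Exchange(ref eventTableDirty, 0) == 1)
        RefreshEventView();  // at most ~2 refreshes per second, no matter how many rows changed
}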
How do I setup an event loop (main loop) in a UWP app?
My goal is to have an app that has a main page with a continuously updating calculation, that then continuously updates an image on the page. While that is constantly happening, the user can click a button to change the calculation behavior, or to go to a second page and edit related data.
The user can then click back to the main page and see the image and calculation restart and continuously update. (The calculation is complex, so it should go as fast as possible and use up as much app time as possible).
If there is a way to accomplish this without an event loop I would like to know that also, but so far I have not found a way.
With an event loop, I can simply update the calculation and UI every time through the loop. But I have not found any way to do so. This question asked nearly the same thing, but was never directly answered and had a different use case anyway.
The closest I have found to a solution is to grab the CoreDispatcher from the CoreWindow and use the RunIdleAsync() method to create a loop:
public MainPage()
{
    this.InitializeComponent();
    Windows.UI.Core.CoreWindow appwindow = Windows.UI.Core.CoreWindow.GetForCurrentThread();
    Windows.UI.Core.CoreDispatcher appdispatcher = appwindow.Dispatcher;

    //create a continuously running idle task (the app loop)
    appdispatcher.RunIdleAsync((dummyt) =>
    {
        //do the event loop here
        // ...

        if (appdispatcher.ShouldYield()) //necessary to prevent blocking UI
        {
            appdispatcher.ProcessEvents(Windows.UI.Core.CoreProcessEventsOption.ProcessAllIfPresent);
        }
    });
}
The main problem with this is that you can't switch between pages (you get a system exception from dispatching events within an already dispatched event).
Second, this is very messy and requires maintaining extra state in the event loop. Besides, why should I have to go through these contortions just to have some calculations happening while the app is waiting for user input?
Is there a way to do this (besides switching to a C++ DirectX app)?
I don't know about setting up your own event loop, but there is no reason to do so.
What you are talking about sounds like a great case for Tasks. You would start a calculation Task whenever your user did something, having it report its progress via standard C# events if you need mid-operation updates. Those updates would modify properties in your view model which the binding system would then pick up.
You could also make your calculation code cancellable so changes can abort a previous calculation.
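A rough sketch of that shape (member names are illustrative): each new request cancels the previous calculation, and Progress<T> marshals intermediate values back to the UI thread for the view model:

private CancellationTokenSource _cts;

private async void OnCalculationInputChanged()
{
    _cts?.Cancel();  // abort any previous calculation
    _cts = new CancellationTokenSource();
    var token = _cts.Token;
    var progress = new Progress<double>(v => ViewModel.Result = v);  // callback runs on the UI thread

    try
    {
        await Task.Run(() => Calculate(progress, token), token);
    }
    catch (OperationCanceledException)
    {
        // superseded by a newer request; nothing to do
    }
}

private void Calculate(IProgress<double> progress, CancellationToken token)
{
    double value = 0;
    for (int i = 0; i < 10000000; i++)
    {
        token.ThrowIfCancellationRequested();
        value = DoOneStep(value);    // the expensive work (illustrative)
        if (i % 10000 == 0)
            progress.Report(value);  // occasional UI update, not one per iteration
    }
}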
All of this involves pretty standard UWP concepts; no need for a special event loop. That you are even considering that makes me think you need to study MVVM and multi-threading/tasks; you are still thinking in a very "Win-Forms" kind of way.
If we're talking about an event loop, or a stream, .NET has a great library for this: Rx (Reactive Extensions), which may be helpful for you. You can set up a simple flow, something like this:
var source = Observable
    // fire event every second
    .Interval(TimeSpan.FromSeconds(1), DispatcherScheduler.Current)
    // add timestamp for event
    .Timestamp()
    // gather system information to calculate
    .Select(GetSystemInfo);
Note that the events are currently on the UI thread, as you need to access the controls. Now you have two options: use Rx for the background processing too, or use TPL Dataflow's TransformBlock to process your system information into a new image (a block can act as both an observer and an observable). After that you need to get back to the UI thread.
First option:
var source = Observable
    // fire event every second
    .Interval(TimeSpan.FromSeconds(1), DispatcherScheduler.Current)
    // add timestamp for event
    .Timestamp()
    // gather system information to calculate
    .Select(GetSystemInfo)
    // expensive calculations are done in background
    .ObserveOn(DefaultScheduler.Instance)
    .Select(x => Expensive(x))
    // marshal results back to the UI thread
    .ObserveOn(DispatcherScheduler.Current)
    .Select(x => UpdateUI(x));
You should probably split this chain into several observers and observables, but the idea is the same; more information here: Rx Design Guidelines.
Second option:
var action = new TransformBlock<SystemInfo, ImageDelta>(CalculateDelta,
    new ExecutionDataflowBlockOptions
    {
        // we can process as many items in parallel as there are processors
        MaxDegreeOfParallelism = Environment.ProcessorCount,
    });

IDisposable subscription = source.Subscribe(action.AsObserver());

var uiObserver = action.AsObservable()
    // marshal results back to the UI thread
    .ObserveOn(DispatcherScheduler.Current)
    .Select(x => UpdateUI(x));
I want to note that UWP and the MVVM pattern do provide binding between the UI and an ObservableCollection, which will help you notify the user in the most natural way.
So I have a function like this in a singleton service that is injected into a controller.
public async Task<ResponseModel> Put(BoardModel request)
{
    var board = await dbService.GetBoardAsync(request.UserId, request.TargetId, request.Ticker);

    // Update the model
    // ...

    var response = await dbService.SetBoardAsync(request.UserId, request.TargetId, request.Ticker, request);

    return new ResponseModel
    {
        ResponseStatus = response.Successful(replacements: 1) ? ResponseStatus.Success : ResponseStatus.Failure
    };
}
What I'm worried about is race conditions, say if two instances of the function are running at the same time, and one overwrites the entry in the db.
Is this possible? There's a very small chance of it happening, but I'm still a bit worried.
Thanks!
Yes, assuming your server has more than one thread (which any production-capable web server will), two or more threads can simultaneously run the same block of code. The typical way to handle this type of situation is with optimistic concurrency. What that means is that EF will attempt to save the record (optimistically assuming it will be able to without issue), and if the record was modified before it got to it, it will throw an exception (specifically OptimisticConcurrencyException). You can see this ASP.NET getting-started article for a walkthrough on how to set it up. Essentially, it just involves adding a rowversion column to your database table(s). Each time the row is updated, the value of that column mutates, so EF can compare the value on the record it's trying to update with what's currently in the table. If they're the same, it continues with the update. If not, then something else modified the record and it aborts the update. By catching the exception, you can respond appropriately by reloading the data and trying the update again.
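With EF code-first, for example, the rowversion column can be declared with the [Timestamp] attribute (a minimal sketch; the entity name is illustrative):

public class Board
{
    public int Id { get; set; }
    // ... your other properties ...

    [Timestamp]                             // maps to a SQL Server rowversion column
    public byte[] RowVersion { get; set; }  // EF includes this in the WHERE clause of its UPDATEs
}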
It's highly unlikely you would hit a concurrency issue multiple times, but just in case, I would recommend using something like Polly (NuGet) to handle the exception. Among other things, it allows you to retry a set number of times, or even forever until no exception is raised. That ensures the record eventually gets updated, even across multiple concurrency conflicts.
Policy
    .Handle<OptimisticConcurrencyException>()
    .RetryForever((exception, context) =>
    {
        // resolve concurrency issue
        // See: https://msdn.microsoft.com/en-us/data/jj592904.aspx
    })
    .Execute(() =>
    {
        db.SaveChanges();
    });
OK, so I am not very familiar with databases, so there may be a simple solution I am not aware of.
I have a SQL database that is to be managed by a class in my C# application. I want the class to constantly check the database for new data. If there is new data, it should raise an event that another class will be listening to. I'm guessing I need a thread that checks the database every few milliseconds or so. However, what would I need to look for in order to fire my event? Can the database notify the class when there is a new entry?
If you are using MS SQL Server, you can use the SqlDependency class from the .NET Framework to get notifications about database changes.
Maybe other database systems have similar mechanisms in their database driver packages.
If you cannot use that for whatever reason, you will need a Thread to poll the database periodically.
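For the SqlDependency route, a minimal sketch looks roughly like this (it assumes Service Broker is enabled on the database, and the query must follow the notification rules -- two-part table names, an explicit column list, etc.):

SqlDependency.Start(connectionString);  // once per app domain

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT Id, Name FROM dbo.MyTable", conn))
{
    var dependency = new SqlDependency(cmd);
    dependency.OnChange += (sender, e) =>
    {
        // a SqlDependency fires only once: inspect e.Info, then
        // register a new dependency and re-run the query here
    };
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        // executing the command is what registers the notification
    }
}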
1. If you want the database to inform your application about a change, you can use Service Broker (first enable Broker support on your database, then write some code to "attach" to it). Your application will need the SqlDependency class.
Helpful links:
Enable Broker
Query Notifications in SQL Server
If you want to watch multiple queries, be aware that Broker is a little heavy.
2. If you want your application to do all the work, you have to create a function that checks the CHECKSUM for the selected table. Each time, keep the last checksum, and if you find a difference, "hit" the database to get the new data.
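A sketch of that polling check (the table name is illustrative; note that checksums can occasionally collide, so this is a heuristic rather than a guarantee):

private long? lastChecksum;

private bool TableChanged(SqlConnection conn)
{
    using (var cmd = new SqlCommand(
        "SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*)) FROM dbo.MyTable", conn))
    {
        object result = cmd.ExecuteScalar();
        long current = result == DBNull.Value ? 0L : Convert.ToInt64(result);
        bool changed = lastChecksum.HasValue && current != lastChecksum.Value;
        lastChecksum = current;  // remember for the next poll
        return changed;
    }
}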
You have to decide which side is going to do the work!
Hope it helps.
Other than using SqlDependency, you can use a Timer, or SqlCacheDependency if you are using ASP.NET or MVC with the Cache object. 1ms intervals are not recommended though, as you probably won't complete your check before the next one starts, and your database load will be very high as a result. Also make sure you use the Timer.AutoReset property so you don't have calls tripping over each other.
Edit 2: This MSDN example shows how you can use SqlDependency, including having to Enable Query Notifications (MSDN). There are many considerations for using SqlDependency, for example it was really designed for web servers where limited watchers would be created, not so much for desktop applications, so keep that in mind. There is a good article on BOL on this called Planning for Notifications which emphasises that Query notifications are useful
if the data in the query changes relatively infrequently, if the application does not require an instantaneous update when the data changes, and if the query meets the requirements and restrictions outlined in Creating a Query for Notification
In your sample you suggest the need for 1ms latency, so maybe the Dependency classes are not the best way for you (also see my later comment on your latency requirement).
EDIT: For example (using the timer):
using System;
using System.Timers;

class Program
{
    static void Main(string[] args)
    {
        Timer timer = new Timer(1);
        timer.Elapsed += timer_Elapsed;
        timer.AutoReset = false;
        timer.Enabled = true;
        Console.ReadLine();  // keep the process alive while the timer runs
    }

    static void timer_Elapsed(object sender, ElapsedEventArgs e)
    {
        Timer timer = (Timer)sender;
        try
        {
            // do the checks here
        }
        finally
        {
            // re-enable the timer to check again very soon
            timer.Enabled = true;
        }
    }
}
As for what to check, it depends on what changes you are actually looking to detect. Here are some ideas:
table row count (but dangerous if a row is added and deleted since the last check)
max value of the table id column (only works if you have a numeric identity field that is increasing, and only works to check for new rows)
check individual columns for changes in specific rows you want to watch
use a row CHECKSUM in a column to check for changes on individual rows
ask writers to update a separate table with a change reference id that you can check
use audit tables to record changes, and check for new audit records
You need to better define the scope of your change monitoring before you can get a good answer to this.
Latency
Also ask yourself if you really need 1ms latency on change updates. If you do, a different approach entirely might be better. For example you may need to use a notification mechanism by the data writers to the parts of your application that need to know an update has occurred right now.
I have a very quick/lightweight MVC action that is requested very often, and I need to maintain minimal response time under heavy load.
What I need to do, from time to time and depending on conditions, is insert a small amount of data into SQL Server (logging a unique id for statistics, for ~1-5% of requests).
I don't need the inserted data for the response, and if I lose some of it because of an application restart or the like, I'll survive.
I imagine I could somehow queue the inserts and do them in the background, maybe even with some kind of buffering: wait until the queue collects 100 inserts, then make them in one pass.
I'm pretty sure somebody must have done/seen such an implementation before; there's no need to reinvent the wheel, so if somebody could point me in the right direction, I would be thankful.
You could trigger a background task from your controller action that will do the insertion (fire and forget):
public ActionResult Insert(SomeViewModel model)
{
    Task.Factory.StartNew(() =>
    {
        // do the inserts
    });

    return View();
}
Be aware though that IIS could recycle the application at any time which would kill any running tasks.
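If you are on ASP.NET 4.5.2 or later, HostingEnvironment.QueueBackgroundWorkItem is a slightly safer variant of the same idea, since the runtime knows about the work and will try to delay shutdown until it completes (a sketch; the helper method is illustrative):

public ActionResult Insert(SomeViewModel model)
{
    HostingEnvironment.QueueBackgroundWorkItem(async cancellationToken =>
    {
        // do the inserts, honouring cancellationToken if a shutdown begins
        await InsertStatisticsAsync(model, cancellationToken);  // illustrative helper
    });

    return View();
}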
Create a class that stores the data that needs to be pushed to the server, and a queue to hold the objects:

Queue<LogData> loggingQueue = new Queue<LogData>();

public class LogData
{
    public string DataToLog { get; set; }  // placeholder type: whatever needs logging
}
Then create a timer or some other mechanism in the app that is triggered every now and then to post the queued data to the database.
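A sketch of that timer-driven flush, using a thread-safe queue so the producers and the timer callback need no extra locking (names and intervals are illustrative):

private static readonly ConcurrentQueue<LogData> loggingQueue = new ConcurrentQueue<LogData>();
private static readonly Timer flushTimer =
    new Timer(FlushQueue, null, TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(5));

private static void FlushQueue(object state)
{
    var batch = new List<LogData>();
    LogData item;
    while (batch.Count < 100 && loggingQueue.TryDequeue(out item))
        batch.Add(item);

    if (batch.Count > 0)
    {
        // insert the whole batch in one round trip,
        // e.g. with SqlBulkCopy or a table-valued parameter
    }
}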
I agree with @Darin Dimitrov's approach, although I would add that you could simply use this task to write to MSMQ on the machine. From there you could write a service that reads the queue and inserts the data into the database. That way you could throttle the service that reads the data, or even move the queue onto a different machine.
If you wanted to take this one step further you could use something like nServiceBus and a pub/sub model to write the events into the database.
History of the problem
This is continuation of my previous question
How to start a thread to keep GUI refreshed?
but since Jon shed new light on the problem, I would have to completely rewrite the original question, which would make that topic unreadable. So: a new, very specific question.
The problem
Two pieces:
CPU hungry heavy-weight processing as a library (back-end)
WPF GUI with databinding which serves as monitor for the processing (front-end)
Current situation -- the library sends so many notifications about data changes that, despite working within its own thread, it completely jams the WPF data binding mechanism. As a result, not only does monitoring the data not work (it is not refreshed), the entire GUI freezes while the data is being processed.
The aim -- a well-designed, polished way to keep the GUI up to date. I am not saying it should display the data immediately (it can even skip some changes), but it cannot freeze while the computation is running.
Example
This is simplified example, but it shows the problem.
XAML part:
<StackPanel Orientation="Vertical">
    <Button Click="Button_Click">Start</Button>
    <TextBlock Text="{Binding Path=Counter}"/>
</StackPanel>
C# part (please NOTE: this is one piece of code, but there are two sections to it):
public partial class MainWindow : Window, INotifyPropertyChanged
{
    // GUI part
    public MainWindow()
    {
        InitializeComponent();
        DataContext = this;
    }

    private void Button_Click(object sender, RoutedEventArgs e)
    {
        var thread = new Thread(doProcessing);
        thread.IsBackground = true;
        thread.Start();
    }

    // this is the non-GUI part -- do not mess with the GUI here
    public event PropertyChangedEventHandler PropertyChanged;

    public void OnPropertyChanged(string property_name)
    {
        if (PropertyChanged != null)
            PropertyChanged(this, new PropertyChangedEventArgs(property_name));
    }

    long counter;
    public long Counter
    {
        get { return counter; }
        set
        {
            if (counter != value)
            {
                counter = value;
                OnPropertyChanged("Counter");
            }
        }
    }

    void doProcessing()
    {
        var tmp = 10000.0;
        for (Counter = 0; Counter < 10000000; ++Counter)
        {
            if (Counter % 2 == 0)
                tmp = Math.Sqrt(tmp);
            else
                tmp = Math.Pow(tmp, 2.0);
        }
    }
}
Known workarounds
(Please do not repost them as answers)
I sorted the list according to how much I like each workaround, i.e. how much work it requires, its limitations, etc.
this is mine; it is ugly, but its simplicity kills -- before sending a notification, freeze the thread (Thread.Sleep(1)) to let the potential receiver "breathe" -- it works and it is minimalistic, but it is ugly, and it ALWAYS slows down the computation, even when no GUI is there
based on Jon's idea -- give up on data binding COMPLETELY (one widget with databinding is enough to cause jamming) and instead check the data from time to time and update the GUI manually -- well, I didn't learn WPF just to give up on it now ;-)
Thomas's idea -- insert a proxy between the library and the frontend which would receive all notifications from the library and pass only some of them on to WPF, for example once every second -- the downside is that you have to duplicate all the objects that send notifications
based on Jon's idea -- pass the GUI dispatcher to the library and use it for sending notifications -- why is it ugly? because there could be no GUI at all
My current "solution" is adding Sleep in the main loop. The slowdown is negligible, but it is enough for WPF to be refreshed (so it is even better than sleeping before each notification).
I am all ears for real solutions, not some tricks.
Remarks
Remark on giving up on databinding -- for me its design is broken; in WPF you have a single channel of communication, and you cannot bind directly to the source of the change. The databinding filters the source based on a name (a string!). This requires some computation even if you use some clever structure to keep all the strings.
Edit: Remark on abstractions -- call me an old timer, but I started learning computers convinced that computers should help humans. Repetitive tasks are the domain of computers, not humans. No matter what you call it -- MVVM, abstractions, interfaces, single inheritance -- if you write the same code over and over and have no way to automate what you do, you are using a broken tool. So, for example, lambdas are great (less work for me) but single inheritance is not (more work for me); data binding (as an idea) is great (less work), but needing a proxy layer for EVERY library I bind to is a broken idea, because it requires a lot of work.
In my WPF applications I don't send the property change directly from the model to the GUI. It always goes via a proxy (ViewModel).
The property change events are put in a queue which is read from the GUI thread on a timer.
I don't understand how that can be so much more work. You just need another listener for your model's PropertyChanged event.
Create a ViewModel class with a "Model" property which is your current datacontext. Change the databindings to "Model.Property" and add some code to hook up the events.
It looks something like this:
public MyModel Model { get; private set; }

private DispatcherTimer _timer;

public MyViewModel()
{
    Model = new MyModel();
    Model.PropertyChanged += (s, e) => SomethingChangedInModel(e.PropertyName);
    _timer = new DispatcherTimer();
    _timer.Tick += TimerCallback;
}

private HashSet<string> _propertyChanges = new HashSet<string>();

public void SomethingChangedInModel(string propertyName)
{
    lock (_propertyChanges)
    {
        if (_propertyChanges.Count == 0)
            _timer.Start();
        _propertyChanges.Add(propertyName ?? "");
    }
}

// this is connected to the DispatcherTimer
private void TimerCallback(object sender, EventArgs e)
{
    List<string> changes = null;
    lock (_propertyChanges)
    {
        _timer.Stop();  // doing this in the callback is safe and disables the timer
        if (!_propertyChanges.Contains(""))
            changes = new List<string>(_propertyChanges);
        _propertyChanges.Clear();
    }

    if (changes == null)
        OnPropertyChanged(null);  // "" was recorded: refresh every property
    else
        foreach (string property in changes)
            OnPropertyChanged(property);
}
This isn't really a WPF issue per se. When you have a long-running operation that updates a set of data rapidly, keeping the UI updated - any UI, whether it's WPF or WinForms or just VT100 emulation - is going to present the same problem. UI updates are comparatively slow and complex, and integrating them with a fast-changing complex process without hurting that process requires a clean separation between the two.
That clean separation is even more important in WPF because the UI and the long-running operation need to run on separate threads so that the UI doesn't freeze while the operation is running.
How do you achieve that clean separation? By implementing them independently, providing a mechanism for periodically updating the UI from within the long-running process, and then testing everything to figure out how frequently that mechanism should be invoked.
In WPF, you'll have three components: 1) a view, which is the physical model of your UI, 2) a view model, which is the logical model of the data that is displayed in the UI, and that pushes changes in the data out to the UI through change notification, and 3) your long-running process.
The long-running process can be almost completely unaware of the UI, so long as it does two things. It needs to expose public properties and/or methods so that the view model can examine its state, and it needs to raise an event whenever the UI should be updated.
The view model listens to that event. When the event is raised, it copies state information from the process to its data model, and its built-in change notification pushes those out to the UI.
Multithreading complicates this, but only a bit. The process needs to run on a different thread than the UI, and when its progress-reporting event is handled, its data will be copied across threads.
Once you've built these three pieces, the multithreading is very straightforward to accomplish using a BackgroundWorker. You create the object that's going to run the process, wire its progress-reporting event up to the BackgroundWorker's ReportProgress method, and marshal data from the object's properties to the view model in the ProgressChanged event handler. Then fire off the object's long-running method in the BackgroundWorker's DoWork event handler and you're good to go.
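A sketch of that wiring (type and member names are illustrative):

var worker = new BackgroundWorker { WorkerReportsProgress = true };
var engine = new ProcessingEngine();  // the long-running, UI-unaware object

// the engine's progress event just asks the worker to report a snapshot of its state
engine.ProgressAvailable += (s, e) => worker.ReportProgress(0, engine.GetSnapshot());

worker.DoWork += (s, e) => engine.Run();  // runs on a thread-pool thread
worker.ProgressChanged += (s, e) =>
{
    // raised on the UI thread: copy the snapshot into the view model
    viewModel.Apply((EngineSnapshot)e.UserState);
};

worker.RunWorkerAsync();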
A user interface that changes faster than the human eye can observe (~25 updates/sec) is not a usable user interface. A typical user will observe the spectacle for at most a minute before giving up completely. You are well past this if you made the UI thread freeze.
You have to design for a human, not a machine.
Since there are too many notifications for the UI to handle, why not just throttle the notifications a bit? This seems to work fine:
if (value % 500 == 0)
    OnPropertyChanged("Counter");
You could also limit the frequency of the notifications, using a timer:
public SO4522583()
{
    InitializeComponent();
    _timer = new DispatcherTimer();
    _timer.Interval = TimeSpan.FromMilliseconds(50);
    _timer.Tick += new EventHandler(_timer_Tick);
    _timer.Start();
    DataContext = this;
}

private bool _notified = false;
private DispatcherTimer _timer;

void _timer_Tick(object sender, EventArgs e)
{
    _notified = false;
}

...

long counter;
public long Counter
{
    get { return counter; }
    set
    {
        if (counter != value)
        {
            counter = value;
            if (!_notified)
            {
                _notified = true;
                OnPropertyChanged("Counter");
            }
        }
    }
}
EDIT: if you cannot afford to skip notifications because they're used by other parts of your code, here's a solution that doesn't require big changes in your code:
create a new property UICounter, which throttles the notifications as shown above
in the Counter setter, update UICounter
in your UI, bind to UICounter rather than Counter
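Roughly, that variant looks like this (a sketch reusing the _notified flag from the timer example above):

public long UICounter
{
    get { return counter; }  // the UI binds to this property instead of Counter
}

public long Counter
{
    get { return counter; }
    set
    {
        if (counter != value)
        {
            counter = value;
            OnPropertyChanged("Counter");        // always raised, for the rest of your code
            if (!_notified)
            {
                _notified = true;
                OnPropertyChanged("UICounter");  // throttled, for the binding
            }
        }
    }
}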
A layer between the UI and the library is necessary. This will ensure you are able to do interaction testing, and it also lets you swap out the library for another implementation in the future without much change. This isn't duplication, but a way of providing an interface for the UI layer to communicate through. This layer will accept objects from the library, convert them to specific data transfer objects, and pass them on to another layer whose responsibility is to throttle the updates and convert them to your specific VM objects.
My opinion is that VMs should be as dumb as possible and their only responsibility should be to provide data to views.
Your question sounds similar to slow-down-refresh-rate-of-bound-datagrid.
At least the answers are similar:
Keep a shadow copy of your data bound to the GUI element instead of binding the original data.
Add an event handler that updates the shadow copy from the original data after a certain delay.
You need to disconnect the source of the notifications from the target of the notifications. The way you have it set up now, every time the value changes you go through an entire refresh cycle (which I believe also blocks your processing function from continuing). This is not what you want.
Provide an output stream to your processing function, which it uses to write its notifications.
On the monitoring side, attach an input stream to that output stream and use it as the data source for your UI component. This way there isn't any notification event handling going on at all: the processing runs flat out as fast as it can, writing monitor data to the output stream you provide, and your monitor UI simply renders whatever it receives on the input stream.
You will need a thread to continuously read from the input stream. If no data is available, then it should block. If it reads some data, it should dump it into the UI.
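A sketch of that arrangement, with a BlockingCollection standing in for the output/input stream pair (names are illustrative). Note the reader drains whatever has accumulated and pushes only the newest value, so a burst of writes becomes a single UI update:

var monitorStream = new BlockingCollection<long>();

// processing side: instead of raising PropertyChanged, just write to the stream
monitorStream.Add(Counter);

// monitoring side: a dedicated reader thread that blocks while idle
var reader = new Thread(() =>
{
    foreach (long value in monitorStream.GetConsumingEnumerable())
    {
        long latest = value;
        long next;
        while (monitorStream.TryTake(out next))  // drain the burst, keep only the newest
            latest = next;
        Dispatcher.Invoke(() => CounterText.Text = latest.ToString());  // synchronous: paces the reader to UI speed
    }
});
reader.IsBackground = true;
reader.Start();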