So I am only a few days into learning about WCF services, specifically duplex services, and I am starting with a test app. The goal is to have a Service that has an internal (static?) class which stores variables, and a Client that fetches those variables.
Currently I have two variables in the Storage class, one which is a list of Subscribers (ObservableCollection<IMyContractCallBack>) and one which is an ObservableCollection<string>, where each string gets sent in the callback method to the client.
I would like to be able to have the client Fetch (which first Subscribes if not already, by adding its context to the collection on the server side) the strings in the collection on the server side. That part works as expected. However, I would also like to Push a string from the server to every client in the subscription list, as well as Add strings to the collection of strings. That's where my issues crop up.
Anytime I Fetch, it adds "test1..." and "test2..." to the string list and sends them, then the client updates a TextBlock in the UI (WPF), so if I Fetch twice I'll have "test1...", "test2...", "test1...", "test2..." because right now there's no checking for duplicates. That proves that the collection can get updated and remembered on the server side from Fetch to Fetch. However, when I try to Add or Send a given text, all variables are forgotten, so the subscriber list is null and the list to add to is empty. Yet when I then Fetch again, the old list is back (now with six things: test1..., test2..., etc.).
I have this attribute before the class:
[ServiceBehavior(InstanceContextMode= InstanceContextMode.PerSession, ConcurrencyMode = ConcurrencyMode.Single)]
and I also tried a Singleton context mode to no avail. Changing the ConcurrencyMode to Multiple doesn't do anything different either. Any ideas as to why my static data is being reset only when internal commands come from the server itself?
Here is the code for my Service:
namespace WcfService3
{
[ServiceBehavior(InstanceContextMode= InstanceContextMode.Single, ConcurrencyMode = ConcurrencyMode.Single)]
public class Service1 : IService1
{
public static event Action NullContext;
public static ObservableCollection<IMyContractCallBack> Subscriptions;
public void NormalFunction()
{
//Only sends to Subs that are STILL Open
foreach (IMyContractCallBack user in Subscriptions)
{
//Removes the Closed users, because they are hanging around from last session
if (((ICommunicationObject)user).State != CommunicationState.Opened)
{
Subscriptions.Remove(user);
}
else
{
ObservableCollection<string> holder = Storage.GetList();
foreach (string str in holder)
{
user.CallBackFunction(str);
}
}
}
}
public static void Send(string str)
{
try
{
foreach (IMyContractCallBack user in Subscriptions)
{
user.CallBackFunction(str);
}
}
catch
{
//For some reason 'Subscriptions' is always null
NullContext.Invoke();
}
}
public static void Add(string str)
{
//For some reason 'SendList' is always null here, too
Storage.AddToList(str);
if (Subscriptions != null)
{
//For the same reason, 'Subscriptions' is always null
foreach (IMyContractCallBack user in Subscriptions)
{
user.CallBackFunction(str);
}
}
}
public void Subscribe()
{
//Adds the callback client to a list of Subscribers
IMyContractCallBack callback = OperationContext.Current.GetCallbackChannel<IMyContractCallBack>();
if (Subscriptions == null)
{
Subscriptions = new ObservableCollection<IMyContractCallBack>();
}
if(!Subscriptions.Contains(callback))
{
Subscriptions.Add(callback);
}
}
}
}
and here is my code for the Storage class:
namespace WcfService3
{
public static class Storage
{
public static readonly ObservableCollection<string> SendList = new ObservableCollection<string>();
public static IMyContractCallBack callback;
public static ObservableCollection<string> GetList()
{
if (SendList.Count == 0)
{
AddToList("Test1...");
AddToList("Test2...");
}
return SendList;
}
public static void AddToList(string str)
{
SendList.Add(str);
}
}
}
I can provide more code if needed.
Are you using the ThreadStatic attribute anywhere? (Just do a quick search.) That's a real long shot and probably not your issue.
You probably have a threading issue. Do all your clients connect at the same time (I really mean in close succession)? If yes, you are going to have threading issues with this code in your Subscribe method:
if (Subscriptions == null)
{
Subscriptions = new ObservableCollection<IMyContractCallBack>();
}
You should constrain access to your Subscriptions collection more tightly so you can see who modifies it and when, and use Console statements to figure out where you're going wrong.
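For example, a minimal sketch of what tighter access could look like (the lock object and the Console trace are my additions, the rest mirrors the question's code; only Subscribe is shown and usings are omitted as in the original):

public class Service1 : IService1
{
    // Eagerly initialized, so there is no null-check race between concurrent Subscribe calls.
    private static readonly object SubscriptionsLock = new object();
    public static readonly ObservableCollection<IMyContractCallBack> Subscriptions =
        new ObservableCollection<IMyContractCallBack>();

    public void Subscribe()
    {
        IMyContractCallBack callback = OperationContext.Current.GetCallbackChannel<IMyContractCallBack>();
        lock (SubscriptionsLock)
        {
            Console.WriteLine("Subscribe on thread {0}", Thread.CurrentThread.ManagedThreadId);
            if (!Subscriptions.Contains(callback))
            {
                Subscriptions.Add(callback);
            }
        }
    }
}

Every other place that touches Subscriptions (Send, Add, NormalFunction) would take the same lock.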
I have an abstract class called HttpHelper; it has basic methods like GET, POST, PATCH, and PUT.
What I need to achieve is this:
Store the URL, time and date in the database each time one of the functions GET, POST, PATCH, or PUT is called.
I don't want to write directly to the database each time the functions are called (that would be slow), but to put the data somewhere (like a static in-memory queue/cache), which should be faster and non-blocking, and have a long-running background process that reads from this cache-like storage and then stores the values in the database.
I have no clear idea how to do this, but the main purpose is to count the calls per hour or day, by domain, resource, and URL query.
I'm wondering if I could do the following:
Create a static class which uses ConcurrentQueue<T> to store data and call that class in each function inside HttpHelper class
Create a background task similar to this: Asp.Net core long running/background task
Or use Hangfire, but that might be too much for a simple task
Or is there a built-in way to do this in .NET Core?
Both Hangfire and background tasks would do the trick as consumers of the queue items.
Hangfire existed before long-running background tasks (pre-.NET Core), so go with the long-running tasks for .NET Core implementations.
There is a but here though.
How important is it to you that you never miss a call? If it is important, then neither approach can help you.
The queue, or whatever static construct you use, will be lost the moment your application crashes, the machine restarts, or the application pool is recycled.
You need to consider some kind of external queuing mechanism like RabbitMQ with persistence turned on.
You can also append to a file, but that might also introduce read/write delays.
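If the in-memory trade-off is acceptable, a rough sketch of the queue-plus-background-consumer idea from the question might look like this (all type names here are hypothetical, and the durability caveat above still applies):

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

public class RequestLogEntry
{
    public string Url { get; set; }
    public DateTime CalledAt { get; set; }
}

public static class RequestLogQueue
{
    private static readonly ConcurrentQueue<RequestLogEntry> Queue = new ConcurrentQueue<RequestLogEntry>();

    // Called from GET/POST/PATCH/PUT in HttpHelper; enqueueing is lock-free and non-blocking.
    public static void Enqueue(RequestLogEntry entry) => Queue.Enqueue(entry);

    public static bool TryDequeue(out RequestLogEntry entry) => Queue.TryDequeue(out entry);
}

// Long-running consumer, registered with services.AddHostedService<RequestLogFlushService>();
public class RequestLogFlushService : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            while (RequestLogQueue.TryDequeue(out RequestLogEntry entry))
            {
                // Insert 'entry' into the database here (ideally batched per flush).
            }
            await Task.Delay(TimeSpan.FromSeconds(30), stoppingToken);
        }
    }
}

Task.Delay honours the stopping token, so the host can shut the consumer down cleanly; anything still sitting in the queue at that point is lost, which is exactly the durability trade-off described above.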
I do not know how complex your problem is but I would consider two solutions.
The first is calling an async insert method which will not block your main thread but will start a task. You can return the response without waiting for your log to be appended to the database. Since you want it implemented in only some methods, I would do it using attributes and middleware.
Simplified example:
public IActionResult SomePostMethod()
{
LogActionAsync("This Is Post Method");
return StatusCode(201);
}
public static Task LogActionAsync(string someParameter)
{
return Task.Run(() => {
// Communicate with database (X ms)
});
}
A better solution is creating a buffer which does not communicate with the database each time, but only when it is filled or at an interval. It would look like this:
public IActionResult SomePostMethod()
{
APILog.Log(new APILog.Item() { Date = DateTime.Now, Item1 = "Something" });
return StatusCode(201);
}
public partial class APILog
{
private static List<APILog.Item> _buffer = null;
private const int _msTimeout = 60000; // Timeout between updates
private static object _updateLock = new object();
static APILog()
{
StartDBUpdateLoopAsync();
}
private static void StartDBUpdateLoopAsync()
{
// check if it has already been started and other stuff
Task.Run(() => {
while(true) // Do not use true but some other expression that tells you whether your application is still running.
{
Thread.Sleep(_msTimeout);
lock(_updateLock)
{
if (_buffer == null || _buffer.Count == 0)
continue;
foreach(APILog.Item item in _buffer)
{
//Import into database here
}
_buffer.Clear(); // Batch flushed; start collecting the next one
}
}
});
}
public static void Log(APILog.Item item)
{
lock(_updateLock)
{
if(_buffer == null)
_buffer = new List<APILog.Item>();
_buffer.Add(item);
}
}
}
public partial class APILog
{
public class Item
{
public string Item1 { get; set; }
public DateTime Date { get; set; }
}
}
Also, in this second example I would not call APILog.Log() by hand each time, but would use middleware in combination with an attribute.
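For illustration, a rough sketch of such middleware, assuming the APILog buffer from the example above (the middleware class name is hypothetical):

using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

// Hypothetical ASP.NET Core middleware that buffers one APILog entry per request.
public class ApiLogMiddleware
{
    private readonly RequestDelegate _next;

    public ApiLogMiddleware(RequestDelegate next)
    {
        _next = next;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        APILog.Log(new APILog.Item
        {
            Date = DateTime.Now,
            Item1 = context.Request.Method + " " + context.Request.Path + context.Request.QueryString
        });
        await _next(context);
    }
}

// Registered in Startup.Configure with: app.UseMiddleware<ApiLogMiddleware>();

An attribute could then be used to opt individual actions in or out before the middleware calls APILog.Log.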
I have a hub that manages many worker processes. I want to build a UI that lets me connect to this hub and retrieve the processing log from any of these worker processes. Essentially this will be a client wanting to obtain a string from another client. I have been able to get the request from client A sent to client B, but I don't know how to return anything in that response.
I have a simple method in the hub
public void GetRunLog(int runid)
{
JobRunLog log = null;
JobRunClient client = this.GetClientByRunID(runid);
if(client != null)
{
var rawlog = Clients.Client(client.ConnectionID).GetRunLog();
log = JsonConvert.DeserializeObject<JobRunLog>(rawlog);
Clients.Client(Context.ConnectionId).GetRunLog(log);
}
}
This request gets picked up by the client, but I don't know how to make it return a value so that var rawlog actually contains something. For the moment, this is the best workaround I could come up with.
myHubProxy.On("GetRunLog", (uiconnectionid) =>
{
string connectionid = uiconnectionid;
myHubProxy.Invoke("ReturnRunLog", run.ID, run.Log, connectionid).ContinueWith(task => {});
});
This will then make the worker client send the log back in a separate request with a reference to the client that had requested the log, but it isn't actually returning a response to the initial request. I can't see a way to make this happen. Rather than use Invoke, how would I just return the object directly to the method on the hub that initiated the request?
Unfortunately the Hub doesn't keep its state:
Because instances of the Hub class are transient, you can't use them
to maintain state from one method call to the next. Each time the
server receives a method call from a client, a new instance of your
Hub class processes the message. To maintain state through multiple
connections and method calls, use some other method such as a
database, or a static variable on the Hub class, or a different class
that does not derive from Hub.
Try to move the logic into a separate class and store the instance object in a static dictionary keyed by the connection id (don't forget to clean it up). Whenever a call comes to the Hub, it redirects it to the appropriate instance.
Here is a simplified sample:
public class TestingLogHub : Hub
{
public static readonly Dictionary<string, TestInstance> Instances =
new Dictionary<string, TestInstance>();
public void SetParameter(string value)
{
Instances[Context.ConnectionId].ContinueWith(value);
}
...
}
public class TestInstance : IDisposable
{
public TestInstance(string basePath, IHubContext host, string connectionId)
{...
}
public void ContinueWith(string value)
{
if (_nextAction == null)
{
FinishExecution();
}
else
{
try
{
_nextAction(value);
}
catch (Exception exception)
{
Error(exception.Message);
FinishExecution();
}
}
}
public void RequestParameterFor(Action<string> action, string parameter, string defaultValue = null)
{
_nextAction = action;
_host.Clients.Client(_connectionId).requestParameter(parameter, defaultValue??GetRandomText());
}
}
So when the instance is started it does some work, but the moment it requires some input it executes RequestParameterFor, which sets the next function to be executed into the instance state and waits for the next call of ContinueWith.
It is a somewhat generic example; in your case you can send back an object and provide it to the instance, and maybe dispose of the instance at the end of that request, if that was the only required call.
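To illustrate the "don't forget to clean it up" part, here is a minimal sketch of the cleanup, assuming classic ASP.NET SignalR 2 (OnDisconnected is the standard disconnect hook; the Instances dictionary is the one from the sample above):

public class TestingLogHub : Hub
{
    // ... the Instances dictionary and hub methods from the sample above ...

    public override Task OnDisconnected(bool stopCalled)
    {
        TestInstance instance;
        if (Instances.TryGetValue(Context.ConnectionId, out instance))
        {
            Instances.Remove(Context.ConnectionId);
            instance.Dispose();
        }
        return base.OnDisconnected(stopCalled);
    }
}

Note that a plain static Dictionary is not thread-safe; a ConcurrentDictionary, or a lock around these lookups, is advisable once clients connect and disconnect concurrently.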
Sorry if I don't explain myself clearly, but I can't seem to wrap my head around how to present the problem at hand.
We have some utility classes used in web and Windows Forms development. We use the following pattern in order to dynamically call the corresponding methods depending on whether the code runs inside a web site or in Windows Forms.
The following code is a stripped-down version so you can observe the logic more clearly:
public static class Constants
{
private static ConstantsWinFormsWebCommonProvider _Provider;
public static ConstantsWinFormsWebCommonProvider Provider
{
get
{ /*in websites the static variables reset to null in case of Appdomain recycle
*so we check beforehand, this also serves for first time use in both: web and winforms */
//lazy loading, initializes when provider is required for the first time or if gets null because appdomain recycle...
if (_Provider != null) return _Provider;
_Provider = Fun.WebSite()? (ConstantsWinFormsWebCommonProvider)new ConstantsWebProvider() : new ConstantsWinFormsProvider();
Initialize();
return _Provider;
}
}
public static void Initialize()
{
Provider.Initialize();
}
public static string DataBaseName
{
get { return Provider.DataBaseName ; }
set { Provider.DataBaseName = value; }
}
}
public class ConstantsWinFormsWebCommonProvider
{
internal bool isInitialized { get; set; } //variable only used in the WinForms part for now, shouldn't affect the web part issue
internal string DataBaseName { get; set; }
public virtual void Initialize() { } //base initialization; overridden by the web provider below
}
public class ConstantsWebProvider: ConstantsWinFormsWebCommonProvider
{
public override void Initialize()
{
base.Initialize();
string errordetails= "";
if (!Fun.InitializeWeb(ref errordetails)) //fills all the Provider properties and connects, otherwise returns false
{
throw new Exception("Problem initializating web " + errordetails));
}
}
}
public class ConstantsWinFormsProvider: ConstantsWinFormsWebCommonProvider
{
public new string DataBaseName
{
get
{//lazy loading so it connects only when required
if (isInitialized) //(!string.IsNullOrEmpty(base.DataBaseName))
{
return base.DataBaseName ;
}
bool resul = Fun.InicializeWinForms();
if (resul) return base.DataBaseName ;
MessageBox.Show("Problem initializing");
return null;
}
set
{
Fun.WinFormsValueSavingSistemx(value); //example call, not related with the issue at hand
}
}
}
The problem is that occasionally in the case of the web we get errors because the properties inside the Provider variable are null, but the Provider itself is not null.
In theory it shouldn't be possible (in the web case we initialize all the Provider properties in the Initialize() function, and if the appdomain recycles, causing the properties to go null, the Provider variable should be null too, which would cause Initialize() to be called again when you try to access one of its properties).
What's the cause and recommended solution to this problem?
Edit:
In case this provides more insight, the problem so far has happened on the first page load of one aspx (IsPostBack = false, where it would access Provider.DataBaseName for the first time on the first visit to that aspx, or when coming to it again from another aspx). Also, in the documented cases the users were on IE8 and IE9...
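For reference, a hedged sketch of a variant that would close any timing window between assigning _Provider and Initialize() completing (whether such a window is the actual cause is exactly what is being asked; this is illustrative only, not our current code):

public static class Constants
{
    // Illustrative only: Lazy<T> runs the factory exactly once, and no caller can observe
    // the instance until the factory (construction + Initialize) has completed.
    private static readonly Lazy<ConstantsWinFormsWebCommonProvider> _provider =
        new Lazy<ConstantsWinFormsWebCommonProvider>(() =>
        {
            ConstantsWinFormsWebCommonProvider p = Fun.WebSite()
                ? (ConstantsWinFormsWebCommonProvider)new ConstantsWebProvider()
                : new ConstantsWinFormsProvider();
            p.Initialize();
            return p;
        }, LazyThreadSafetyMode.ExecutionAndPublication);

    public static ConstantsWinFormsWebCommonProvider Provider
    {
        get { return _provider.Value; }
    }
}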
I have an application (say App1) which is connected to another application (App2) via .NET Remoting. App2 acts as a server. If App2 goes down, App1 will not be able to pull data from App2. We are planning to run an instance of App2 (say App2a) on another machine so that if App2 goes down, App1 automatically takes the data from App2a. When App2 runs again, App1 will need to take the data from App2. The failover mechanism is not implemented yet... Please suggest a design pattern so that in the future any number of server instances can be added for App1 to pull data from.
Thanks
The closest design pattern that I can think of is the Chain of Responsibility pattern.
The idea is that:
You build a chain of objects (servers)
Let the object (server) handle the request
If it is unable to do so, pass the request down the chain
Code:
// Server interface
public interface IServer
{
object FetchData(object param);
}
public class ServerProxyBase: IServer
{
// Successor.
// Alternate server to contact if the current instance fails.
public ServerProxyBase AlternateServerProxy { get; set; }
// Interface
public virtual object FetchData(object param)
{
if (AlternateServerProxy != null)
{
return AlternateServerProxy.FetchData(param);
}
throw new NotImplementedException("Unable to recover");
}
}
// Server implementation
public class ServerProxy : ServerProxyBase
{
// Interface implementation
public override object FetchData(object param)
{
try
{
// Contact actual server and return data
// Remoting/WCF code in here...
}
catch
{
// If fail to contact server,
// run base method (attempt to recover)
return base.FetchData(param);
}
}
}
public class Client
{
private IServer _serverProxy;
public Client()
{
// Wire up main server, and its failover/retry servers
_serverProxy = new ServerProxy("mainserver:2712")
{
AlternateServerProxy = new ServerProxy("failover1:2712")
{
AlternateServerProxy = new ServerProxy("failover2:2712")
}
};
}
}
This example wires up a chain of 3 servers (mainserver, failover1, failover2).
A call to FetchData() will always attempt to go to mainserver first.
When it fails, it'll then attempt failover1, followed by failover2, before finally throwing an exception.
If it were up to me, I wouldn't mind using something quick and dirty such as:
public class FailoverServerProxy: IServer
{
private readonly List<IServer> _servers = new List<IServer>();
public FailoverServerProxy RegisterServer(IServer server)
{
_servers.Add(server);
return this;
}
// Implement interface
public object FetchData(object param)
{
foreach(var server in _servers)
{
try
{
return server.FetchData(param);
}
catch
{
// Failed. Continue to next server in list
continue;
}
}
// No more servers to try. No longer able to recover
throw new Exception("Unable to fetch data");
}
}
public class Client
{
private IServer _serverProxy;
public Client()
{
// Wire up main server, and its failover/retry servers
_serverProxy = new FailoverServerProxy()
.RegisterServer(new ServerProxy("mainserver:2712"))
.RegisterServer(new ServerProxy("failover1:2712"))
.RegisterServer(new ServerProxy("failover2:2712"));
}
}
I think it borrows ideas from other patterns such as Facade, Strategy and Proxy.
But my motivations are simply to:
Make the least impact on existing classes (i.e., no extra property in the Server class)
Separation of concerns:
Central class for the server's failover/recovery logic.
Keep the failover/recovery's implementation hidden from the Client/Server.
We have an old Silverlight UserControl + WCF component in our framework and we would like to increase the reusability of this feature. The component should work with basic functionality by default, but we would like to extend it based on the current project (without modifying the original, so more of this control can appear in the full system with different functionality).
So we made a plan, where everything looks great, except one thing. Here is a short summary:
The Silverlight UserControl can be extended and manipulated via a ContentPresenter in the UI, and via ViewModel inheritance, events and messaging in the client logic.
Back-end business logic can be manipulated with module loading.
This should be okay, I think. For example, you can disable/remove fields from the UI with overridden ViewModel properties, and at the back end you can skip some actions with custom modules.
The interesting part is when you add new fields via the ContentPresenter. OK, you add new properties to the inherited ViewModel, then you can bind to them. You have the additional data. When you save the base data, you know it has succeeded, then you can start saving your additional data (additional data can be anything, in a different table at the back end for example). Fine, we have extended our UserControl and the back-end logic, and the original UserControl still doesn't know anything about our extension.
But we have lost the transaction. For example, if we save the base data but saving the additional data throws an exception, we have the updated base data but nothing in the additional table. We really don't want this possibility, so I came up with this idea:
One WCF call should wait for the other at the back end, and when both have arrived, we can begin cross-thread communication between them; of course, we can then handle the base and the additional data in the same transaction, and the base component still doesn't know anything about the other (it just provides a hook to do something with it, but it doesn't know who is going to do it).
I made a very simplified proof-of-concept solution; this is the output:
1 send begins
Press return to send the second piece
2 send begins
2 send completed, returned: 1
1 send completed, returned: 2
Service
namespace MyService
{
[ServiceContract]
[ServiceBehavior(ConcurrencyMode = ConcurrencyMode.Multiple)]
public class Service1
{
protected bool _sameArrived;
protected Piece _same;
[OperationContract]
public Piece SendPiece(Piece piece)
{
_sameArrived = false;
Mediator.Instance.WaitFor(piece, sameArrived);
while (!_sameArrived)
{
Thread.Sleep(100);
}
return _same;
}
protected void sameArrived(Piece piece)
{
_same = piece;
_sameArrived = true;
}
}
}
Piece (entity)
namespace MyService
{
[DataContract]
public class Piece
{
[DataMember]
public long ID { get; set; }
[DataMember]
public string SameIdentifier { get; set; }
}
}
Mediator
namespace MyService
{
public sealed class Mediator
{
private static Mediator _instance;
private static object syncRoot = new Object();
private List<Tuple<Piece, Action<Piece>>> _waitsFor;
private Mediator()
{
_waitsFor = new List<Tuple<Piece, Action<Piece>>>();
}
public static Mediator Instance
{
get
{
if (_instance == null)
{
lock (syncRoot)
{
if (_instance == null)
{
_instance = new Mediator();
}
}
}
return _instance;
}
}
public void WaitFor(Piece piece, Action<Piece> callback)
{
lock (_waitsFor)
{
var waiter = _waitsFor.Where(i => i.Item1.SameIdentifier == piece.SameIdentifier).FirstOrDefault();
if (waiter != null)
{
_waitsFor.Remove(waiter);
waiter.Item2(piece);
callback(waiter.Item1);
}
else
{
_waitsFor.Add(new Tuple<Piece, Action<Piece>>(piece, callback));
}
}
}
}
}
And the client side code
namespace MyClient
{
class Program
{
static void Main(string[] args)
{
Client c1 = new Client(new Piece()
{
ID = 1,
SameIdentifier = "customIdentifier"
});
Client c2 = new Client(new Piece()
{
ID = 2,
SameIdentifier = "customIdentifier"
});
c1.SendPiece();
Console.WriteLine("Press return to send the second piece");
Console.ReadLine();
c2.SendPiece();
Console.ReadLine();
}
}
class Client
{
protected Piece _piece;
protected Service1Client _service;
public Client(Piece piece)
{
_piece = piece;
_service = new Service1Client();
}
public void SendPiece()
{
Console.WriteLine("{0} send begins", _piece.ID);
_service.BeginSendPiece(_piece, new AsyncCallback(sendPieceCallback), null);
}
protected void sendPieceCallback(IAsyncResult result)
{
Piece returnedPiece = _service.EndSendPiece(result);
Console.WriteLine("{0} send completed, returned: {1}", _piece.ID, returnedPiece.ID);
}
}
}
So is it a good idea to wait for another WCF call (which may or may not be invoked, so in a real example it would be more complex) and process them together with cross-thread communication? Or should I look for another solution instead?
Thanks in advance,
negra
If you want to extend your application without changing any existing code, you can use MEF, the Managed Extensibility Framework.
For using MEF with Silverlight see: http://development-guides.silverbaylabs.org/Video/Silverlight-MEF
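Very roughly, the MEF idea is that the base component imports a contract and each project exports its own implementation, without touching the original code. A minimal sketch (the contract and type names are made up, and this shows plain desktop-style bootstrapping; Silverlight has its own helpers such as CompositionInitializer):

using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;
using System.Reflection;

// Contract the reusable component knows about.
public interface ISaveExtension
{
    void SaveAdditionalData(object viewModel);
}

// Lives in the project-specific assembly; the base component never references it directly.
[Export(typeof(ISaveExtension))]
public class ProjectSpecificSaveExtension : ISaveExtension
{
    public void SaveAdditionalData(object viewModel) { /* save to the extra table */ }
}

public class SaveCoordinator
{
    [ImportMany]
    public IEnumerable<ISaveExtension> Extensions { get; set; }

    public void Compose()
    {
        var catalog = new AssemblyCatalog(Assembly.GetExecutingAssembly());
        var container = new CompositionContainer(catalog);
        container.ComposeParts(this); // fills Extensions with every exported ISaveExtension
    }
}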
I would not wait for 2 WCF calls from Silverlight, for the following reasons:
You are making your code more complex and less maintainable
You are storing business knowledge, that two services should be called together, in the client
I would call a single service that aggregates the two services.
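To make that concrete, a rough sketch of what a single aggregating operation could look like on the server (contract and type names are made up, and the TransactionScope assumes both writes go to resources that can enlist in the same transaction):

using System.Runtime.Serialization;
using System.ServiceModel;
using System.Transactions;

[DataContract]
public class BaseData { /* existing fields */ }

[DataContract]
public class AdditionalData { /* extension-specific fields */ }

[ServiceContract]
public interface ICompositeSaveService
{
    // One call carries both pieces, so there is nothing to synchronize on the server.
    [OperationContract]
    void SaveAll(BaseData baseData, AdditionalData additionalData);
}

public class CompositeSaveService : ICompositeSaveService
{
    public void SaveAll(BaseData baseData, AdditionalData additionalData)
    {
        using (var scope = new TransactionScope())
        {
            SaveBaseData(baseData);                  // the original component's logic
            if (additionalData != null)
            {
                SaveAdditionalData(additionalData);  // the extension's logic, e.g. a different table
            }
            scope.Complete();                        // both writes commit together or not at all
        }
    }

    private void SaveBaseData(BaseData data) { /* existing save logic */ }
    private void SaveAdditionalData(AdditionalData data) { /* extension save logic */ }
}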
It doesn't feel like a great idea to me, to be honest. I think it would be neater if you could package up both "partial" requests in a single "full" request, and wait for that. Unfortunately I don't know the best way of doing that within WCF. It's possible that there's a generalized mechanism for this, but I don't know about it. Basically you'd need some loosely typed service layer where you could represent a generalized request and a generalized response, routing the requests appropriately in the server. You could then represent a collection of requests and responses easily.
That's the approach I'd look at, personally - but I don't know how neatly it will turn out in WCF.