I would like to build a WinForms business solution using SignalR, but I am not able to install .NET 4.0 on the client machines. It looks like SignalR has a minimum requirement of .NET 4.0. What is the best way to use SignalR from a WinForms app on .NET 3.5? I would like to include the send/receive message functions in the client application.
I will be hosting SignalR on IIS on my intranet using .net 4.0 on the server side.
Would it be possible to create an API (in .NET 3.5) similar to the PubNub C# client? Can anyone point me in the right direction?
If you're not too scared of compiling your own stuff, I have created a .NET 2.0-compatible client library for SignalR. It's available right off my fork (https://github.com/robink-teleopti/SignalR) of SignalR.
At the moment I have no intent of making a pull request as I don't want to add that extra burden to the original project.
I have one more modification on my fork that is good to know about: when a client belongs to more than 20 groups, I automatically switch from GET to POST, and I have made minor modifications on the server side to handle that part.
I backported the v1.x and v2.x SignalR clients to .NET 3.5. They are available as NuGet packages here: http://www.nuget.org/packages/Nivot.SignalR.Client.Net35/
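For reference, here is a minimal sketch of what using such a backported client looks like, assuming the backport keeps the same surface as the official v2.x client. The server URL, hub name (`ChatHub`) and method names are placeholders you would replace with your own:

```csharp
using System;
using Microsoft.AspNet.SignalR.Client; // e.g. from the Nivot.SignalR.Client.Net35 package

class Program
{
    static void Main()
    {
        // Server URL and hub/method names below are placeholders.
        var connection = new HubConnection("http://myserver/signalr");
        IHubProxy hub = connection.CreateHubProxy("ChatHub");

        // Subscribe to messages pushed from the server.
        hub.On<string, string>("broadcastMessage",
            (name, message) => Console.WriteLine("{0}: {1}", name, message));

        connection.Start().Wait();                        // blocks until connected
        hub.Invoke("Send", "winforms-client", "hello").Wait();
        Console.ReadLine();
    }
}
```

In a WinForms app you would marshal the `On` callback back to the UI thread (e.g. via `Control.Invoke`) before touching any controls.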
One of the authors of the SignalR project had a goal of using the Task Parallel Library (TPL) that shipped in .NET 4, so I doubt any of the C# code from that project will run on .NET 3.5. Why not host a TCP or named-pipes WCF endpoint on the server and use normal .NET client callbacks? This is relatively easy if the server is Windows Server 2008 with WAS or AppFabric.
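To sketch that WCF-callback alternative: a duplex contract lets the server push to connected clients, much as SignalR does. This is a minimal outline under invented names, not a drop-in replacement:

```csharp
using System.Collections.Generic;
using System.ServiceModel;

// Duplex contract: the server can call back into subscribed clients,
// which gives SignalR-style push on plain WCF (.NET 3.0+).
[ServiceContract(CallbackContract = typeof(IMessageCallback))]
public interface IMessageService
{
    [OperationContract(IsOneWay = true)]
    void Subscribe();

    [OperationContract(IsOneWay = true)]
    void Broadcast(string message);
}

public interface IMessageCallback
{
    [OperationContract(IsOneWay = true)]
    void OnMessage(string message);
}

[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)]
public class MessageService : IMessageService
{
    private readonly List<IMessageCallback> subscribers = new List<IMessageCallback>();

    public void Subscribe()
    {
        // Capture the caller's callback channel so we can push to it later.
        subscribers.Add(OperationContext.Current.GetCallbackChannel<IMessageCallback>());
    }

    public void Broadcast(string message)
    {
        foreach (var s in subscribers) s.OnMessage(message);
    }
}
```

You would host this over a `NetTcpBinding` (or `NetNamedPipeBinding` for same-machine use), and the .NET 3.5 client would connect with `DuplexChannelFactory<IMessageService>`, passing an object that implements `IMessageCallback`.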
On an older server you could host a WebBrowser control and use the SignalR JavaScript client library to handle the signaling.
In another six months I'd bet on a native .NET 3.5 client library, but as the maintainers are full-time MS employees I doubt they will have time to support legacy versions any time soon.
I'm creating an app that downloads some content from an HTTPS-based website (and POSTs something back). The web server forces modern encryption (TLS 1.2 with modern ciphers), so browsers before Firefox 30 can't open its pages at all.
The app is written in C# using the .NET 4.6 HttpWebRequest, and everything works okay. But I want to add support for older Windows XP machines that are still alive, where only .NET 4.0 is available. It would also be perfect if the app worked even on Win2000 with .NET 2.0, but that's more of an unreal dream. :)
Is it possible to add support for HTTPS requests to modern servers in a .NET 2.0/4.0 app? Maybe via OpenSSL, Mozilla NSS or something similar...
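For context, the usual .NET-only workaround looks like the sketch below. It assumes .NET 4.5+ is installed on the machine even though the app targets 4.0 (the in-place CLR 4 update then accepts the raw enum value). Note that it will not help on stock Windows XP, whose SChannel tops out at TLS 1.0; there a managed TLS stack such as BouncyCastle would be needed:

```csharp
using System;
using System.Net;

class TlsSetup
{
    static void Main()
    {
        // On .NET 4.5+ the named member exists:
        // ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

        // On .NET 4.0 there is no Tls12 member, but the runtime accepts the
        // raw value when .NET 4.5+ is installed on the same machine.
        ServicePointManager.SecurityProtocol = (SecurityProtocolType)3072; // TLS 1.2

        // Placeholder URL; any TLS-1.2-only server would do for a test.
        var req = (HttpWebRequest)WebRequest.Create("https://example.com/");
        using (var resp = (HttpWebResponse)req.GetResponse())
        {
            Console.WriteLine(resp.StatusCode);
        }
    }
}
```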
I have a backend system developed with C# .NET 4.0, and I wish to embed the lightweight web server Kestrel in it. Has anyone done so, or can anyone point me in a direction on how to do this?
Cheers!
KestrelHttpServer: "A cross-platform web server for ASP.NET Core." (emphasis added by me).
So that's no, and no.
I did it, but only by launching Kestrel, which serves a .NET Core application, as a background process. The .NET Framework app communicates with the .NET Core app via inter-process communication over a WebSocket on localhost.
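Roughly, the Framework side of that arrangement looks like this sketch: launch the published .NET Core app as a child process, then talk to it over a localhost WebSocket. The executable name, port and route are placeholders, and `ClientWebSocket` requires .NET 4.5 on Windows 8 or later (older setups would need a third-party WebSocket client):

```csharp
using System;
using System.Diagnostics;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

class KestrelSidecar
{
    static void Main()
    {
        // 1. Start the .NET Core app that hosts Kestrel (placeholder path/port).
        var sidecar = Process.Start("CoreSidecar.exe", "--urls http://localhost:5000");

        // 2. Talk to it over a localhost WebSocket.
        RunClient().Wait();
        sidecar.Kill();
    }

    static async Task RunClient()
    {
        using (var ws = new ClientWebSocket())
        {
            await ws.ConnectAsync(new Uri("ws://localhost:5000/ipc"), CancellationToken.None);

            byte[] msg = Encoding.UTF8.GetBytes("ping");
            await ws.SendAsync(new ArraySegment<byte>(msg),
                WebSocketMessageType.Text, true, CancellationToken.None);

            var buffer = new byte[4096];
            var result = await ws.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
            Console.WriteLine(Encoding.UTF8.GetString(buffer, 0, result.Count));
        }
    }
}
```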
I have a simple SOAP client developed in .NET Core 2.0 to consume calculator methods from the following web service:
http://www.dneonline.com/calculator.asmx?WSDL
The service is imported into Visual Studio 2017 as a connected service and the code that calls the service methods looks as below:
var binding = new BasicHttpBinding(BasicHttpSecurityMode.None);
var ep = new EndpointAddress("http://www.dneonline.com/calculator.asmx?WSDL");
Calculator.CalculatorSoapClient client =
    new Calculator.CalculatorSoapClient(binding, ep);
client.AddAsync(1, 3).Wait();
Console.WriteLine("Finished successfully");
Console.ReadLine();
Now the thing is that this code fails after 20 seconds when calling the AddAsync method, with a "WinHttpException: The operation timed out". Other relevant observations:
The exact same code (including the generated connected service) works well in VS 2015 with the .NET Framework.
Our dev machines are behind the company firewall, and the Wireshark traces look different between the two executions. The successful run on the .NET Framework establishes the connection, while the other one cannot connect to the proxy.
I have seen other articles about increasing the timeout value on long-running or heavy calls, but this method is neither heavy nor long-running.
Has anyone faced a similar issue, or does anyone know why .NET Core behaves differently?
Update 1
We tested the solution on a cloud-based computer and it worked fine. So it looks like the company firewall is blocking the outgoing communication.
On the other hand, on the same local computer we tested a simple .NET Core application that downloads an image from a website, and that worked well too.
This is all while VS 2015 with the .NET Framework has no problem running the same code, calling the web service and fetching the results.
So, the new question is: why does .NET Core behave differently on network communication? Is there any new rule, policy, permission, restriction, capability, etc. that needs to be turned on/off to enable communication with the internet?
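One thing worth trying in this situation is pointing the binding at the proxy explicitly rather than relying on the system defaults, which early .NET Core WCF did not honor. This is only a sketch: the proxy address is made up, and whether `ProxyAddress` is actually respected depends on the version of System.ServiceModel.Http in use:

```csharp
using System;
using System.ServiceModel;

var binding = new BasicHttpBinding(BasicHttpSecurityMode.None);

// Hand WCF the corporate proxy explicitly instead of relying on the
// system defaults that .NET Core 2.0 did not pick up.
binding.UseDefaultWebProxy = false;
binding.ProxyAddress = new Uri("http://proxy.corp.example:8080"); // placeholder address
binding.BypassProxyOnLocal = true;

var ep = new EndpointAddress("http://www.dneonline.com/calculator.asmx?WSDL");
var client = new Calculator.CalculatorSoapClient(binding, ep);
```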
There was a bug in .NET Core 2.0 with proxy configuration, as you can see here: https://github.com/dotnet/wcf/issues/1592.
The solution for my project (VS 2017 / .NET Core 2.0) was to upgrade it to .NET Core 2.1 and update System.ServiceModel.Http and the other dependencies to the latest stable version (4.5.3 in my case). Then the IE/Edge proxy settings are used.
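For reference, the relevant project-file changes amount to something like this (versions are the ones from my fix; newer stable versions should also work, and the other elements of your csproj stay as they are):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp2.1</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="System.ServiceModel.Http" Version="4.5.3" />
    <PackageReference Include="System.ServiceModel.Primitives" Version="4.5.3" />
  </ItemGroup>
</Project>
```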
This question is possibly a duplicate of: Connecting to a SOAP service with .Net Core 2.0 behind a proxy
I am developing a .NET application (say A) which will talk to other .NET applications (say B). Application A is going to be consumed by a Java application (say X). I am currently using Apache Thrift. Thrift is great except for its basic support for OOP features (such as overloading and inheritance). Of course, we can customize the Thrift compiler the way we want, as it is an open-source technology.
One of my friends suggested using WCF for application A with WSHttpBinding. Does a WCF service using BasicHttpBinding/WSHttpBinding have any limitations beyond the usual ones? I guess the limitations of web services apply to the above way of hosting. Please guide me on this.
I have worked on a few projects where we have had to integrate WCF and Java. I have always ended up going for the BasicHttpBinding as that has allowed the two technologies to communicate with the least amount of friction. You lose a lot by using BasicHttpBinding over WsHttpBinding but that has not been an issue with the projects I have worked on. You are going to have to make the call between ease of use (BasicHttpBinding) and support for more/newer standards (WsHttpBinding).
Take a look at http://www.codeproject.com/Articles/36396/Difference-between-BasicHttpBinding-and-WsHttpBind
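As a concrete starting point, here is a minimal self-hosted BasicHttpBinding service of the kind I would expose to Java; the contract name and namespace are invented for the example. Keeping an explicit contract namespace and simple types makes life easier for wsimport on the Java side:

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Description;

// An explicit namespace keeps the generated WSDL stable for Java tooling.
[ServiceContract(Namespace = "http://example.org/orders")]
public interface IOrderService
{
    [OperationContract]
    string GetStatus(int orderId);
}

public class OrderService : IOrderService
{
    public string GetStatus(int orderId)
    {
        return "Order " + orderId + ": shipped";
    }
}

class Host
{
    static void Main()
    {
        var host = new ServiceHost(typeof(OrderService),
            new Uri("http://localhost:8080/orders"));

        // BasicHttpBinding == plain SOAP 1.1, the least-friction option for Java.
        host.AddServiceEndpoint(typeof(IOrderService), new BasicHttpBinding(), "");

        // Publish WSDL at http://localhost:8080/orders?wsdl for wsimport.
        host.Description.Behaviors.Add(new ServiceMetadataBehavior { HttpGetEnabled = true });

        host.Open();
        Console.WriteLine("Listening; press Enter to stop.");
        Console.ReadLine();
        host.Close();
    }
}
```

On the Java side, `wsimport http://localhost:8080/orders?wsdl` then generates a ready-made client proxy.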
The requirements for our WCF setup were:
ASP.NET 4.0
Visual Studio 2010
IIS 7.5
MS SQL Server 2008 R2
I have a .NET 4.0 app computing data that I'd like to visualise with www.unity3d.com, which has C# Mono scripting. To do this I'd like to send XML strings between the two apps, one of which is running under .NET and the other under Mono.
I'm new to interprocess communication but I think it's the way to go. Does anyone know of a sample showing .NET 4.0 to Mono IPC?
Any advice appreciated.
David
Don't think of it as "IPC". Use web services between the two, with WCF on the .NET side. I don't know whether Mono supports WCF, but if not, WCF produces standard web services, so Mono should have no trouble consuming the WCF side.
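If full WCF feels heavy on the Mono side, a lighter variant of the same idea works on both runtimes: serve the XML from an `HttpListener` in the .NET 4.0 app and fetch it with `WebClient` from the Unity/Mono script. A self-contained sketch, with an arbitrary port, path and payload (both halves are shown in one process here for illustration):

```csharp
using System;
using System.Net;
using System.Text;
using System.Threading;

class XmlOverHttpDemo
{
    static void Main()
    {
        // Publisher (.NET side): expose the computed XML over localhost HTTP.
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:8765/data/");
        listener.Start();

        ThreadPool.QueueUserWorkItem(_ =>
        {
            var ctx = listener.GetContext();   // block until one request arrives
            byte[] body = Encoding.UTF8.GetBytes("<result><value>42</value></result>");
            ctx.Response.ContentType = "text/xml";
            ctx.Response.OutputStream.Write(body, 0, body.Length);
            ctx.Response.Close();
        });

        // Consumer (this part would live in the Unity/Mono script):
        using (var client = new WebClient())
        {
            string xml = client.DownloadString("http://localhost:8765/data/");
            Console.WriteLine(xml);
        }
        listener.Stop();
    }
}
```

Both `HttpListener` and `WebClient` exist in .NET 2.0+ and in Mono, so neither side needs WCF at all for this simple pull model.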