Can a C# application communicate with Node.js code?

I have a C# application and a Node.js application. I would like to press a button in my C# application to send three arguments to a Node.js application/function as input. Is this possible?
Edit: Both applications run on the same machine. The C# application would provide three arguments to the Node.js application. The Node.js application would query a web service (POST), receive some XML data, and manipulate that data. I know that I could do that task in C# too, but in this case it has to be Node.js.
Edit #2 and solution: I went with option 4 from the answer below: the Node.js process runs a socket server and the C# app sends requests over TCP.
I will also provide the solution that works for me:
Node.js part
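Here is a minimal sketch of the Node.js side: a TCP server that treats each incoming message as JSON containing the three arguments. The port (8585) and the message format are arbitrary choices for illustration; adapt them to your setup.

const net = require('net');

// TCP server that receives the three arguments from the C# application.
const server = net.createServer(socket => {
    socket.on('data', data => {
        const args = JSON.parse(data.toString()); // e.g. { "a": "...", "b": "...", "c": "..." }
        // ... query the web service (POST), receive the XML and manipulate it here ...
        socket.write('done\n'); // reply to the C# application
    });
});

server.listen(8585, '127.0.0.1');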
C# part
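And a matching sketch for the C# side, e.g. inside the button's click handler: open a TcpClient to the same port and send the three arguments as one JSON message (again, the port and message format are illustrative assumptions).

using System;
using System.Net.Sockets;
using System.Text;

// Connect to the Node.js socket server and send the three arguments as JSON.
using (var client = new TcpClient("127.0.0.1", 8585))
using (NetworkStream stream = client.GetStream())
{
    string json = "{\"a\":\"first\",\"b\":\"second\",\"c\":\"third\"}";
    byte[] payload = Encoding.UTF8.GetBytes(json);
    stream.Write(payload, 0, payload.Length);

    // Read the reply from the Node.js server.
    byte[] buffer = new byte[1024];
    int read = stream.Read(buffer, 0, buffer.Length);
    Console.WriteLine(Encoding.UTF8.GetString(buffer, 0, read));
}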
Now you are ready to send any data from your C# application to the Node.js server.

Yes, communication is possible, as several people have pointed out in the comments on your question.
These are (some of) the options:
1. Your Node.js process runs an HTTP server and your C# app makes JSON REST requests over HTTP.
2. Your Node.js process runs a SOAP web service using the node-soap/strong-soap module.
3. Your C# app starts your Node.js app and you do IPC by writing to the Node.js process's input stream and reading its output stream.
4. Your Node.js process runs a socket server and your C# app makes requests over TCP.
5. You use a third process/server such as Redis or a message queue.
6. Anything else that allows you to share data, such as files.
I would recommend going for the first option, as it doesn't require you to define your own protocol to send over the "wire". Another reason is that there is a lot of documentation available on doing REST with both C# and Node.js.
On the C# side, have a look at RestSharp as the client library if you can't use the latest version of .NET (4.5). If you can use the latest version, use HttpClient to call your Node.js REST service.
For the Node.js side, just use Express.
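To illustrate option 1, here is a minimal sketch of the C# side using HttpClient; the URL and JSON payload are placeholders and would match whatever route your Express app defines.

using System.Net.Http;
using System.Text;

// POST the three arguments as JSON to the Express endpoint (URL and payload are placeholders).
using (var client = new HttpClient())
{
    var content = new StringContent("{\"a\":\"first\",\"b\":\"second\",\"c\":\"third\"}",
                                    Encoding.UTF8, "application/json");
    HttpResponseMessage response = await client.PostAsync("http://localhost:3000/run", content);
    string body = await response.Content.ReadAsStringAsync();
}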
Option 2 might be quick to set up, as there is good support in Visual Studio for SOAP web services; however, I have only used node-soap as a client, so I can't comment on how well node-soap services work with C# clients.

Manually handling inter-process communication is time-consuming, and the old alternative, Edge.js, has not been updated since mid-2017.
My organization maintains a library, Jering.Javascript.NodeJS, that allows you to call into Node.js from C#.
Example usage
string javascriptModule = #"
module.exports = (callback, x, y) => { // Module must export a function that takes a callback as its first parameter
var result = x + y; // Your javascript logic
callback(null /* If an error occurred, provide an error object or message */, result); // Call the callback when you're done.
}";
// Invoke javascript in Node.js
int result = await StaticNodeJSService.InvokeFromStringAsync<int>(javascriptModule, args: new object[] { 3, 5 });
// result == 8
Assert.Equal(8, result);
You can invoke any valid Node.js module, including one that performs tasks like those listed in the question: querying a web service (POST), receiving XML data, and manipulating that data.
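For example, here is a hedged sketch of a module that does an HTTPS POST and hands the response body back to C#; the URL and payload are placeholders for illustration.

string postModule = @"
const https = require('https');

module.exports = (callback, url, payload) => {
    const body = JSON.stringify(payload);
    const request = https.request(url, { method: 'POST', headers: { 'Content-Type': 'application/json' } }, response => {
        let xml = '';
        response.on('data', chunk => xml += chunk);
        response.on('end', () => callback(null, xml)); // manipulate the XML here, then hand the result back to C#
    });
    request.on('error', err => callback(err));
    request.write(body);
    request.end();
}";

string xml = await StaticNodeJSService.InvokeFromStringAsync<string>(
    postModule,
    args: new object[] { "https://example.com/service", new { a = 1, b = 2, c = 3 } });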
Highlights
Cross-platform support
Targets .NET Standard 2.0 and .NET Framework 4.6.1.
Tested on Windows, macOS and Linux.
Performance features
Does not start a new Node.js process for each invocation. Instead, sends invocations to long-lived processes via inter-process communication.
Optionally, runs invocations concurrently in a cluster of Node.js processes. Handles load balancing for the cluster.
Caches compiled JavaScript where possible.
Long-running application support
Restarts Node.js processes if they terminate unexpectedly.
Optionally, restarts Node.js processes on file change.
Kills Node.js processes when their parent .NET process dies.
Flexible API
Exposes both a static API and a dependency-injection-based API.
Supports invoking JavaScript in string form, Stream form, or from a file on disk.

Related

Interaction between web and a C# .dll

I have to call a method of a complex C# .dll library when I get a "request".
Since I don't want to program the network part (socket, protocol, etc.), I'm thinking of using an existing Web server (IIS or Apache) to handle this part (and use HTTP as protocol).
Is it possible to load the .dll in memory and call a method of it from a web server? How to do it?
If yes, is it better to use IIS on Windows, Apache on Windows, or Apache on Linux?
Is it mandatory to call this method of the C# .dll using C#/ASP, or can it be done with PHP?
It has to be very scalable, and the .dll library uses the .NET Framework.
Of course I'm thinking of PHP's exec function, but that would start a C# program again and again for every single request, which is not very good.
I would like to load the C# .dll once in memory, and have it running on the web server directly if possible, so that only "calls" to a method happen when I get a specific HTTP request.
My question has similarities to this one, however I don't believe our problems are the same.
Is it possible to load the .dll in memory and call a method of it from a web server? How to do it?
Yes, for example by programming a service layer around it using WCF or the ASP.NET Web API. Given the quirky (to say the least) interoperation of SOAP services and clients, I'd go with the latter.
You can then call your SOAP/REST/giveitaname service through PHP's cURL or any other applicable method, and your C# service will then call the code in the DLL (this part you'll have to build).
You can host both types of project easily in IIS, making it accessible through HTTP.
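As a rough sketch of the Web API approach (the controller, the ComplexEngine class, and its DoWork method are invented names standing in for the real .dll):

using System.Web.Http;
using YourLibrary; // placeholder namespace for the complex C# .dll

// The controller is loaded once with the web application, so the .dll stays in memory;
// each HTTP request just calls into it.
public class CalculationController : ApiController
{
    private static readonly ComplexEngine Engine = new ComplexEngine(); // hypothetical class from the .dll

    [HttpPost]
    public IHttpActionResult Run([FromBody] string input)
    {
        string result = Engine.DoWork(input); // hypothetical method on the .dll
        return Ok(result);
    }
}

PHP would then call this endpoint with cURL and receive the result over HTTP.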

Designing a cross platform communication interface

I have a C# program running on a local system that needs to be able to do two things.
Asynchronously spin off remote jobs on remote systems running Windows, Linux, or Android.
Provide a way for those systems to send the output (stdout/stderr) of those jobs back to the local system.
Previously I have used WCF when communicating with a remote Windows system: I created a WCF server on the remote Windows system, and my local machine could then send commands and messages via that WCF channel. Things get more complicated when I try to do the same thing with Linux and Android.
I figure I could set up a local WCF service using REST; that way, all three platforms could send messages to it using whatever language is convenient (most likely C++) via JSON REST. But what then is the best way to accomplish requirement #1?
Should I bother creating a REST server in C++ that runs on Linux and Android?
Can WCF even consume a C++ REST server that isn't written in .NET?
Would I be better off doing something simple with just TCP sockets?
Security is not an issue since this is used on a secure private network. I'm just looking for the easiest way to run remote commands/processes and receive response messages from those remote systems.
I use ZMQ and JSON for exactly this purpose: creating a custom private network topology that communicates using JSON messages over TCP (via ZMQ). Of course you could use any serialization format (I list some alternatives below).
I can't give you a definitive "this is what you should do" answer, because the question is fairly open-ended.
ZMQ: http://www.zeromq.org/
Really nice cross-platform abstracted socket library
Can use various transport protocols, including TCP
Sends messages as just plain byte arrays, leaving choice of serialization format up to you
Some serialization formats (in order of my personal preference):
JSON: http://www.json.org/
MessagePack: http://msgpack.org/
Exactly like JSON, but much more compact
No schemas
Cross-platform
Google Protocol Buffers: https://code.google.com/p/protobuf/
Uses schemas
Has bindings for C++, C#, Java, Python (at least)
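To make the ZMQ-over-TCP idea concrete, here is a minimal C# sketch using NetMQ (a native C# port of ZeroMQ); the port, the JSON shape, and the job handling are placeholders.

using NetMQ;
using NetMQ.Sockets;

// Remote system: reply socket that receives a JSON job request and returns the job output.
using (var server = new ResponseSocket())
{
    server.Bind("tcp://*:5555"); // arbitrary port for the sketch

    while (true)
    {
        string requestJson = server.ReceiveFrameString(); // e.g. {"command":"ls","args":["-la"]}
        // ... run the job and capture stdout/stderr here ...
        server.SendFrame("{\"stdout\":\"...\",\"stderr\":\"\"}");
    }
}

// Local system: request socket that sends the job and waits for the output.
// using (var client = new RequestSocket())
// {
//     client.Connect("tcp://remote-host:5555");
//     client.SendFrame("{\"command\":\"ls\",\"args\":[\"-la\"]}");
//     string reply = client.ReceiveFrameString();
// }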
This can really depend on your infrastructure and services you are using. If you are in the Amazon Web Services world you could use a Simple Queue Service (SQS) to receive messages. The remote systems could then poll the queue and run the jobs based on the messages pulled off the SQS queue.
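A hedged sketch of what the SQS polling side could look like in C# with the AWS SDK; the queue URL is a placeholder and credentials/region come from the default SDK configuration.

using Amazon.SQS;
using Amazon.SQS.Model;

var sqs = new AmazonSQSClient(); // uses the default credentials and region
var queueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/jobs"; // placeholder

var response = await sqs.ReceiveMessageAsync(new ReceiveMessageRequest
{
    QueueUrl = queueUrl,
    WaitTimeSeconds = 20 // long polling
});

foreach (Message message in response.Messages)
{
    // ... run the job described in message.Body and report the output ...
    await sqs.DeleteMessageAsync(queueUrl, message.ReceiptHandle);
}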

Querying Quickbooks from an ASP.NET Web Application

I'm trying to enable a web application I've written in ASP.NET MVC2 to be able to send and receive data to and from Quickbooks on-demand, without a delay/intermediate database/sync operation. (Please take this as an assumption; I understand that it would be a better idea to use the Web Connector and sync every X units of time, but that is outside of the framework of discussion for this question)
It seems that the only way to do this is to write a regular application (it apparently will not work at all from a service), and then have the web app communicate with that application, sending and receiving data through it.
So my question is what is the ideal way of setting up an intermediate application that will communicate with the web app?
Should I write a regular console application and get data from standard output? Or are there better ways of accomplishing the same goal?
QuickBooks exposes its API through a COM interface, and there are some limitations on how that can be used. An intermediate application that acts as a relay is an appropriate solution.
I would recommend building or using an application that listens over HTTP and proxies requests and responses to QuickBooks. In fact, we use exactly the same approach in our product, the RSSBus QuickBooks ADO.NET provider. Here are more details on how we do it: http://rssbus.com/kb/help/RQR1-A/pg_qbconnector.rst
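As an illustration, here is a minimal sketch of such a relay built on HttpListener in a plain desktop/console application; the URL prefix and the qbXML handling are placeholders, and the actual QuickBooks calls would go through its COM/qbXML interface.

using System.IO;
using System.Net;

// Keep the QuickBooks session open in this process and accept requests from the web app over HTTP.
var listener = new HttpListener();
listener.Prefixes.Add("http://localhost:8080/quickbooks/"); // placeholder prefix
listener.Start();

while (true)
{
    HttpListenerContext context = listener.GetContext(); // blocks until the web app calls in
    string request;
    using (var reader = new StreamReader(context.Request.InputStream))
    {
        request = reader.ReadToEnd(); // e.g. a qbXML request built by the web app
    }

    // ... forward the request to QuickBooks via COM/qbXML and capture the response ...
    string response = "<QBXML>...</QBXML>";

    using (var writer = new StreamWriter(context.Response.OutputStream))
    {
        writer.Write(response);
    }
    context.Response.Close();
}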

C# + PHP in the same application?

What I'm trying to do is a little different. I'm wondering if it's possible to create some sort of interface so that if a particular function is called in PHP (standalone), the arguments will be forwarded to a method in C#, and vice versa.
It depends, of course, on what kind of data you want to exchange, whether the applications are on the same server or on two different ones, and so on.
I did bridging between PHP and C# in some of my web-based projects. What I did: create an ASP.NET MVC project which exposes a RESTful API. The methods exposed by this API are then called from PHP via HTTP (using cURL). I used JSON as the data exchange format to pass data from PHP to C# and back again. This worked well, as the two applications were on different servers.
I could also imagine some kind of socket server, e.g. a background process written in C# listening on some port, which you connect to via PHP's socket functions. I did something like this to connect PHP and Java: the Java app ran as a daemon process on port XXXX and was a wrapper around Apache FOP. I used PHP's socket functions to pass XML and XSLT to the Java daemon, which then used Apache FOP to transform the data into a PDF and returned it via the socket connection back to PHP, which in turn sent the PDF file to the client requesting the page.
There are probably some other approaches, but these were my experiences so far in connecting two different technologies together.
EDIT: as an aside, PHP on a Windows web server using IIS was really not that nice to work with. Sometimes strange errors occurred, or certain permission-related errors that were not easy to resolve. YMMV though.

How are server-side applications created, and how is client-server communication done?

I would like to have a client-server application written in .NET which would do the following:
the server is running Linux
on the server there is an SQL database (MySQL) containing document URLs
What we want:
- the server side would regularly crawl all URLs and create a full-text index of them
- the client side would be able to query this index using a GUI
The client application is written in .NET using C#. Besides searching documents, it will be able to do a lot of other things which are not described here and which are done very well client-side.
We would like to use C# for the server side as well, but we have no experience in this area. How are things like this usually done?
Clarifying question now based on some answers:
The thing which is most unclear to me is how client-server communication is usually handled. Do the client and server usually use sockets directly, caring about details like IP addresses, ports, or NAT traversal? Or are there common frameworks and patterns which make this transparent and make client-server messaging or procedure calling easy? Any examples or good starting points for this? Are there common techniques for handling the fact that a single server is required to serve multiple clients at the same time?
To use C# on Linux you will need to use Mono, an open-source implementation of the CLR specification.
Next you need to decide how to communicate between server and client, from the lowest level of just opening a TCP/IP socket and sending bits up and down, to .NET Remoting, to WCF, to exposing web services on the server. I do not know how complete the WCF implementation is on Mono; also, I think you may have issues with binary remoting between Mono and MS .NET.
I would suggest that RPC-style web services offer a very good solution. Web services also have the advantage of allowing clients from other platforms to connect easily.
EDIT
In response to the clarification of the question.
I would suggest using Mono/ASP.NET/web services on the server if you wish to use C# on both server and client.
One assumption I have made is that you can use a client-pull model, where every message is initiated by the client. A different approach could allow the server to push events to the client. Given that the client can poll the server regularly, I don't consider this much of a drawback, but it may be, depending on the type of application you are developing.
Mono allows execution of C# (compiled to IL) on a Linux box. Mono's ASP.NET support allows you to use standard ASP.NET and integrate into Apache (see http://www.mono-project.com/ASP.NET), and finally web services allow you to communicate robustly, in a strongly typed manner, between your client and your server.
Using this approach negates most of the issues raised in your clarification and makes them someone else's problem.
Sockets/SSL - taken care of by the standard .NET runtime on the client and Apache on the server.
IP addresses/ports/NAT traversal - all taken care of. A DNS lookup will get the server's IP, and the open socket will allow the server to respond through any firewall and NAT setup.
Multiple clients - Apache is built to handle multiple clients processing at the same time, as is ASP.NET, so you should not encounter any problems there.
As many have already mentioned, there are a number of things you have mentioned that are going to cause you pain. I'm not going to go into those; instead I will answer your original question about communication.
The current popular choice for this kind of communication is web services. These allow you to make remote calls using the HTTP protocol, encoding the requests and responses in XML. While this method has its critics, I have found it incredibly simple to get up and running, and it works fine for nearly all applications.
The .NET Framework has built-in support for web services, which can definitely be called by your client. A brief look at the Mono website indicates that it has support for web services too, so writing your server in C# and running it under Mono should be fine. Googling for "C# Web Service Tutorial" shows many sites with information about how to get started; here is a random pick from those results:
http://www.codeguru.com/Csharp/Csharp/cs_webservices/tutorials/article.php/c5477
Have a look at Grasshopper:
"With Grasshopper, you can use your favorite development environment from Microsoft® to deploy applications on Java-enabled platforms such as Linux"
Or see here
The idea is to convert your app to Java and then run it on Tomcat or JBoss.
Another approach: use the Mod_AspDotNet module for Apache, as described here.
This Basic Client/Server Chat Application in C# looks like the kind of example that might be a starting point for me. The relevant .NET classes are TcpClient and TcpListener.
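To sketch what the TcpListener side might look like (and how one server process can serve multiple clients at the same time), here is a minimal accept loop that handles each connection on its own task; the port and the search logic are placeholders.

using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading.Tasks;

static async Task RunServerAsync()
{
    var listener = new TcpListener(IPAddress.Any, 9000); // arbitrary port for the sketch
    listener.Start();

    while (true)
    {
        TcpClient client = await listener.AcceptTcpClientAsync();
        _ = Task.Run(async () => // handle each client concurrently
        {
            using (client)
            using (NetworkStream stream = client.GetStream())
            {
                byte[] buffer = new byte[4096];
                int read = await stream.ReadAsync(buffer, 0, buffer.Length);
                string query = Encoding.UTF8.GetString(buffer, 0, read);
                // ... run the full-text query against the index here ...
                byte[] reply = Encoding.UTF8.GetBytes("results for: " + query);
                await stream.WriteAsync(reply, 0, reply.Length);
            }
        });
    }
}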
