Is it possible to call a function on an external class?
The external class runs on another machine (say its location is mymachine.com).
I have used CreateInstance some time ago, but I don't think that will do here (correct me if I'm wrong).
I have been searching for a long time but haven't found a solution yet, so I hope one of you can help.
One of the sources I searched is the following, but it had no info :(
http://www.dreamincode.net/forums/topic/102523-call-an-external-function-on-button-click/
Hope you can help.
Use a self-hosted WCF service, remoting, or any other networking technology.
There is unfortunately no magic attribute to achieve that.
[Edit] I would also like to add that you should be careful. When you use a remoting mechanism (.NET Remoting, WCF, RPC, etc.) you work with "proxies". A proxy is an object that simulates the actual object but encapsulates the communication. It allows the developer to hide the complexity behind an object with properties, methods, etc., but the technology underneath (XML messaging for a WCF service, for example) must still be understood by the developer. It can have an impact on the network, on responsiveness, and also on the programming model.
This sort of operation is what Web Services / WCF services / Remoting are built for. WCF is a really nice solution for handling communication across boundaries. Have a look into WCF; Google is your friend.
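To give a rough idea of what the WCF route looks like, here is a minimal self-hosted sketch. The contract, port, and address below are just placeholders, not a recommended production setup:

    using System;
    using System.ServiceModel;

    // Contract shared by both machines (e.g. compiled into a common class library).
    [ServiceContract]
    public interface IRemoteCalculator
    {
        [OperationContract]
        int Add(int a, int b);
    }

    // Implementation that runs on mymachine.com.
    public class RemoteCalculator : IRemoteCalculator
    {
        public int Add(int a, int b) { return a + b; }
    }

    class ServerProgram
    {
        static void Main()
        {
            // Self-host the service; address and port are examples only.
            using (var host = new ServiceHost(typeof(RemoteCalculator),
                       new Uri("http://mymachine.com:8080/calc")))
            {
                host.AddServiceEndpoint(typeof(IRemoteCalculator),
                    new BasicHttpBinding(), "");
                host.Open();
                Console.WriteLine("Service running. Press Enter to stop.");
                Console.ReadLine();
            }
        }
    }

    class ClientProgram
    {
        // This would be Main in a separate client executable.
        static void Run()
        {
            var factory = new ChannelFactory<IRemoteCalculator>(
                new BasicHttpBinding(),
                new EndpointAddress("http://mymachine.com:8080/calc"));
            IRemoteCalculator proxy = factory.CreateChannel();
            Console.WriteLine(proxy.Add(2, 3));   // executes on the remote machine
            factory.Close();
        }
    }

Note that the client only ever sees a proxy; the actual Add call runs on the server, which is exactly the proxy caveat from the previous answer.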
See these:
http://www.aspfree.com/c/a/.NET/Introduction-to-RPC-on-Windows-Part-I/
http://www.csharphelp.com/2007/01/interprocess-communication/
If the external assembly is on a shared folder on the remote machine, then you can do one of the following:
1) Implement an AppDomain.CurrentDomain.AssemblyResolve event handler to load the external assembly, then make the calls you need to make. Also add a reference to the assembly to your project (with Copy Local = false). The first call that instantiates a class or hits a static method in the assembly will fire the event handler and load the remote external assembly.
2) Load the assembly via reflection using System.Reflection.Assembly.LoadFrom() and provide the remote path, then use reflection to invoke the desired method, etc.
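A rough sketch of option 2; the UNC path, type name, and method name are assumptions for illustration:

    using System;
    using System.Reflection;

    class RemoteAssemblyCaller
    {
        static void Main()
        {
            // Load the assembly from the shared folder on the remote machine
            // (the path below is a placeholder).
            Assembly assembly = Assembly.LoadFrom(@"\\mymachine.com\share\External.dll");

            // The type and method names are assumed for this example.
            Type workerType = assembly.GetType("External.Worker");
            object instance = Activator.CreateInstance(workerType);

            object result = workerType.GetMethod("DoWork")
                                      .Invoke(instance, new object[] { 42 });
            Console.WriteLine(result);
        }
    }

Keep in mind that the code still executes in your local process; only the assembly file lives on the remote machine.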
We are developing multiple web services in C# using WCF, but we're new to it.
So, for what we have read and learnt, this is our approach:
We have a class library that we called CommonLibrary that has a few classes that are going to be used on all our services (language stuff, type of user connected and a common object that all the services are meant to return).
We have another class library called SecurityLibrary which validates the user that is consuming the method.
At the moment we have 2 services that are almost at 90% finished, both of them use CommonLibrary and SecurityLibrary.
Now the questions:
Is this a bad approach?
Are we violating the SOA principles of encapsulation and autonomy by using common/shared library with each of our services?
A third person told us to copy all the code of those libraries into each of our services so that each service is 100% autonomous. Is this the right way? I think it is hard to maintain and creates a lot of duplication: any update made in one place has to be replicated or merged into the other services...
No, it is not a bad approach.
If using libraries in your service were a problem, then you would also have to keep away from the .NET libraries themselves. I am wondering why you think a service process is only allowed to consist of a single assembly.
Furthermore, copy-pasting code is a very, very bad habit. It is a known anti-pattern: it duplicates the maintenance effort and also all the bugs inside the code.
Sharing libraries does not make your services less "autonomous". I think it can even make them more compatible if they share types.
A good service is just a process, consisting of one or more (shared) assemblies, with a well-defined service contract. This service contract is never allowed to be broken.
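To make the "sharing types" point concrete, here is a small sketch of how a common return type from something like your CommonLibrary could be reused by every service contract (all names below are made up):

    using System.Runtime.Serialization;
    using System.ServiceModel;

    // In the shared CommonLibrary: the common object all services return
    // (member names are just placeholders).
    [DataContract]
    public class CommonResult
    {
        [DataMember] public bool Success { get; set; }
        [DataMember] public string Message { get; set; }
    }

    // Each service contract reuses the shared type, so clients that talk to
    // both services see one and the same CommonResult on the wire.
    [ServiceContract]
    public interface IOrdersService
    {
        [OperationContract]
        CommonResult PlaceOrder(int orderId);
    }

    [ServiceContract]
    public interface IBillingService
    {
        [OperationContract]
        CommonResult ChargeCustomer(int customerId, decimal amount);
    }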
BTW: in my answer I did not include the problems that come with shared assemblies in the GAC. That is a feature (or problem) shared by all processes, not only services.
I need to invoke WCF service 1 or WCF service 2, based on certain condition evaluated at runtime. Both the services are similar but hosted on different servers.
I have added two service references, NS1 and NS2, pointing to different URLs. The current code already uses NS1, and that NS1 usage appears in many places. What would be the best way to refactor the code so that the service to invoke is selected dynamically?
In general, it is considered a bad practice to program directly against the proxy generated by the svcutil.exe.
The best way is to wrap it in a class of your own and reference this class each time you require the service. This will also allow you to implement more advanced business logic such as routing (in your case) and other cross-cutting concerns.
For example: you can now abstract from the application the strategy you are using to connect to the service, i.e. Service reference or ChannelFactory. You can easily share the service between different assemblies without ambiguity.
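As a rough sketch of such a wrapper (the generated client class names and the DoWork operation below are assumptions based on your NS1/NS2 references):

    using System;

    // NS1.Service1Client / NS2.Service1Client stand in for the two generated
    // proxy classes, and DoWork for an operation both contracts expose; the
    // real names depend on your service references.
    public class ServiceGateway
    {
        public string DoWork(string input)
        {
            if (UsePrimary())
            {
                var client = new NS1.Service1Client();
                try { return client.DoWork(input); }
                finally { client.Close(); }
            }
            else
            {
                var client = new NS2.Service1Client();
                try { return client.DoWork(input); }
                finally { client.Close(); }
            }
        }

        private static bool UsePrimary()
        {
            // Whatever runtime condition you evaluate to pick a server.
            return DateTime.Now.Hour < 12;
        }
    }

The rest of the code base then calls ServiceGateway and never needs to know which endpoint was actually used.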
You are saying that you have much code written directly against NS1. Grind your teeth and wrap it. It is a lot of dirty work but the risk is very low.
Having said the above, I wonder about the requirement itself, where a service calls another instance of itself on another server (if I got you right). This smells funny, what is the problem you are trying to solve?
I am working in VS 2008 C# and need to share an instance of an object created in one project with another project. I tried creating a static class in project1 and adding it as a link to project2, but the information wasn't saved. The static class was written in project1.
    // In project1:
    object o = new object();
    Project1.StaticClass.StaticObject = o;
    // In project2 (the same class file added as a link):
    object o2 = Project1.StaticClass.StaticObject;   // turns out to be null at runtime
When I tried something like above, project2.object would be null. By adding a class as a link, is it creating a new instance of the static class in project2 or is it referencing the same class? If it is referencing the same class, shouldn't any information saved into the static class from project1 be accessible by project2? I know this isn't the most elegant manner of sharing data, but if anyone would help with this problem or provide a better manner of doing it, I would greatly appreciate it.
Thanks in advance.
Projects run in separate processes, so they can't share data in this manner. You'll need to persist the data in another type of store. I recommend using a database (hey, 20 gazillion websites, stock trading apps, airlines, etc can't be wrong).
If you don't want to use a database, you could open an IP connection between instances of the app and have a thread send packets of data to sync back and forth between the applications. Or, in your "server" app, add a web service that each process would call to update and retrieve information.
If you need really high-speed communication between the processes, sockets with a peer or star topology is a good way to go. If you're okay with some latency, having a web service (which works fine even if these aren't web apps) or a database could be a good solution. The right approach depends on your application.
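If you do go the raw socket route, a minimal sketch of pushing a small state update from one process to another could look like this (the port number and payload are placeholders):

    using System;
    using System.Net;
    using System.Net.Sockets;
    using System.Text;

    class StateReceiver
    {
        static void Main()
        {
            // "Server" process: listen on a local port (the port is arbitrary).
            var listener = new TcpListener(IPAddress.Loopback, 9500);
            listener.Start();
            using (TcpClient client = listener.AcceptTcpClient())
            using (NetworkStream stream = client.GetStream())
            {
                var buffer = new byte[1024];
                int read = stream.Read(buffer, 0, buffer.Length);
                Console.WriteLine("Received: " + Encoding.UTF8.GetString(buffer, 0, read));
            }
            listener.Stop();
        }
    }

    class StateSender
    {
        // Would be Main in the other process.
        static void Run()
        {
            using (var client = new TcpClient("localhost", 9500))
            using (NetworkStream stream = client.GetStream())
            {
                byte[] payload = Encoding.UTF8.GetBytes("counter=42");
                stream.Write(payload, 0, payload.Length);
            }
        }
    }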
WCF could also solve this problem. It effectively wraps the IP/socket layer and provides some nice object persistence/remote control capabilities, but I find it overly complex for most applications.
To share a single instance of an object among different processes (which is what I think you are intending to do), you need something that will maintain that object's state. You can look at WCF and how to set up its behavior to act as a singleton, so that essentially every requester gets the same instance across the board.
http://msdn.microsoft.com/en-us/magazine/cc163590.aspx
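A minimal sketch of the singleton setup described in that article; the contract and the counter are placeholders:

    using System.ServiceModel;

    [ServiceContract]
    public interface ICounterService
    {
        [OperationContract]
        int Increment();
    }

    // InstanceContextMode.Single means every caller talks to the same service
    // instance, so the _count field below is shared across all requests.
    [ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                     ConcurrencyMode = ConcurrencyMode.Single)]
    public class CounterService : ICounterService
    {
        private int _count;

        public int Increment()
        {
            return ++_count;
        }
    }

The other processes still only hold proxies; the state itself lives in whatever process hosts the service.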
Creating the link only applies to the source code. When you compile, each project ends up with its own copy of that single class definition. The process you took does nothing to share instances at runtime.
You can look at WCF or .NET Remoting, although .NET Remoting is now officially replaced by WCF.
If you are talking about sharing the same object between two processes, you can do that; the concept is called memory-mapped files. Here are some starter docs from MSDN.
Though the docs and API use the term "FileMapping" quite a bit, you can use it just for sharing memory between two processes.
In .NET 4.0, you can use the System.IO.MemoryMappedFiles namespace. For your case, which looks like .NET 3.5, you'll have to use some sort of interop to call the Win32 API.
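For the .NET 4.0 case, a minimal sketch of two processes sharing an int through a named map might look like this (the map name, size, and offset are arbitrary):

    using System;
    using System.IO.MemoryMappedFiles;

    class MapWriter
    {
        static void Main()
        {
            // Process A: create a named shared-memory region.
            using (MemoryMappedFile mmf = MemoryMappedFile.CreateNew("MySharedMap", 1024))
            using (MemoryMappedViewAccessor accessor = mmf.CreateViewAccessor())
            {
                accessor.Write(0, 12345);          // write an int at offset 0
                Console.WriteLine("Value written. Press Enter to exit.");
                Console.ReadLine();                // keep the map alive for the reader
            }
        }
    }

    class MapReader
    {
        // Would be Main in the second process.
        static void Run()
        {
            // Process B: open the same named map and read the value back.
            using (MemoryMappedFile mmf = MemoryMappedFile.OpenExisting("MySharedMap"))
            using (MemoryMappedViewAccessor accessor = mmf.CreateViewAccessor())
            {
                Console.WriteLine(accessor.ReadInt32(0));
            }
        }
    }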
I need to create a project for multiple web services using WCF in C#. The web services will call other assemblies to perform the core processing. The assemblies will access data from SQL Server. One of the parameters that will be part of every web service method is the database to use. My problem is how to pass the database parameter to the assemblies. I can't change all the signatures in all the satellite assemblies; I want some kind of variable that the satellite assemblies can reference. These same satellite assemblies are used by a Windows Forms app and an ASP.NET app, so I need something that all types of applications could use. Static fields are no good, since for one web service call the database could be "X" and for another it would be "Y". Any ideas?
This is the sort of thing that might play nicely with an IoC or DI framework - having some interface that includes the database information, and have it pushed into all the callers for you. Even without IoC, hiding the implementation in an interface sounds like a solid plan.
With your static concept, a [ThreadStatic] field might work, but it is a little hacky (and you need to be religious about cleaning up the data between callers). Another option is to squirrel some information away on the Principal, as this is relatively easily configured from both WCF (per-call) and WinForms (typically per-process). In either case, be careful about any thread switching (async, etc.). In particular, note that ASP.NET can change threads in the middle of a single page pipeline.
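A bare-bones sketch of the [ThreadStatic] idea mentioned above; CurrentDatabase and the service class are made-up names, and all the caveats about thread switching still apply:

    using System;

    // Made-up ambient holder for the current database name.
    public static class CurrentDatabase
    {
        [ThreadStatic]
        private static string _name;

        public static string Name
        {
            get { return _name; }
            set { _name = value; }
        }
    }

    // Inside a (hypothetical) service operation:
    public class OrderService
    {
        public void Process(string database, int orderId)
        {
            CurrentDatabase.Name = database;   // set per call
            try
            {
                // ...call into the satellite assemblies, which read
                // CurrentDatabase.Name instead of taking a new parameter...
            }
            finally
            {
                CurrentDatabase.Name = null;   // clean up between callers
            }
        }
    }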
I'm trying to implement a WCF service in my program; however, I don't understand something.
According to the book "Programming WCF Services", Juval Löwy, 2007, O'Reilly Media,
Appendix C (WCF Coding Standard), C2 - Essential:
1. Place service code in a class library and not in any hosting EXE.
I don't understand this. Where should I put my code? All my classes are defined in my WinForms application. How should I call my WinForms classes from the service's class library?
Am I missing something here?
Thanks,
Eyal
I like to structure my WCF solutions like this:
YourProject.Contracts (class library)
Contains all the service, operation, fault, and data contracts. Can be shared between server and client in a pure .NET-to-.NET scenario.
YourProject.Service (class library)
Contains the code to implement the services, and any support/helper methods needed to achieve this. Nothing else.
YourProject.ServiceHost (optional - can be Winforms, Console App, NT Service)
Contains service host(s) for debugging/testing, or possibly also for production.
This basically gives me the server-side of things.
On the client side:
YourClient.ClientProxies (class library)
I like to package my client proxies into a separate class library so that they can be reused by multiple actual client apps. This can be done using svcutil or "Add Service Reference" and then manually tweaking the resulting horrible app.config files, or by implementing the client proxies by hand (when sharing the contracts assembly) using ClientBase<T> or ChannelFactory<T>; see the sketch after this list.
1-n actual clients (any type of app)
Will typically only reference the client proxies assembly, or maybe the contracts assembly, too, if it's being shared. This can be ASP.NET, WPF, Winforms, console app, other services - you name it.
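Here is the kind of hand-rolled proxy I mean for the ClientProxies assembly, assuming the contracts assembly is shared; IOrderService, the operation, and the binding are illustrative only:

    using System;
    using System.ServiceModel;

    // Shared contract, normally defined once in YourProject.Contracts.
    [ServiceContract]
    public interface IOrderService
    {
        [OperationContract]
        string GetOrderStatus(int orderId);
    }

    // Hand-rolled proxy living in YourClient.ClientProxies.
    public class OrderServiceProxy : IDisposable
    {
        private readonly ChannelFactory<IOrderService> _factory;
        private readonly IOrderService _channel;

        public OrderServiceProxy(string address)
        {
            _factory = new ChannelFactory<IOrderService>(
                new BasicHttpBinding(), new EndpointAddress(address));
            _channel = _factory.CreateChannel();
        }

        public string GetOrderStatus(int orderId)
        {
            return _channel.GetOrderStatus(orderId);
        }

        public void Dispose()
        {
            _factory.Close();
        }
    }

Client apps then just new up OrderServiceProxy with the right address and never touch generated code or app.config plumbing.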
That way I have a nice and clean layout, I use it consistently over and over again, and I really think this has made my code cleaner and easier to maintain.
This was inspired by Miguel Castro's Extreme WCF screencast on DotNet Rocks TV with Carl Franklin, which is highly recommended!
Yes, it's a bit confusing.
We're talking about the service implementation here. What Löwy means here is that the code to implement the service should be in a separate project. The code that hosts the WCF service (i.e. the class that implements your service contract) should do nothing but call that service implementation code.
So your Windows Forms client application uses a proxy, which in turn calls the WCF service application hosting layer, which in turn calls your service logic.
It's a very good idea to go further and have three layers on the UI side and four on the service side. The namespaces might be
Company.Project.UI.WinForms
Company.Project.UI
Company.Project.ServiceClient
Company.Project.ServiceHost
Company.Project.Service
Company.Project.BusinessLogic
Company.Project.Persistence
For simpler projects this would be overkill, but for anything more than (say) one form or two service methods it will make life much easier. Not least, testing each layer in isolation should be fairly straightforward.
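As a rough illustration of that layering (every type name below is made up), the service class in Company.Project.Service does nothing but delegate to Company.Project.BusinessLogic:

    using System.ServiceModel;

    namespace Company.Project.BusinessLogic
    {
        // Business layer: no WCF dependencies at all.
        public class CustomerLogic
        {
            public string GetCustomerName(int id)
            {
                // Real code would call into Company.Project.Persistence here.
                return "Customer #" + id;
            }
        }
    }

    namespace Company.Project.Service
    {
        using Company.Project.BusinessLogic;

        [ServiceContract]
        public interface ICustomerService
        {
            [OperationContract]
            string GetCustomerName(int id);
        }

        // The service layer only translates WCF calls into business calls.
        public class CustomerService : ICustomerService
        {
            private readonly CustomerLogic _logic = new CustomerLogic();

            public string GetCustomerName(int id)
            {
                return _logic.GetCustomerName(id);
            }
        }
    }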
Here's a typical project structure following Löwy's recommendation:
MyProject.Data
MyProject.Logic
MyProject.Services
MyProject.ServiceHosts
MyProject.Presentation
Then MyProject.ServiceHosts references MyProject.Services and exposes the services defined there. So in Löwy's language, MyProject.Services is the class library, and MyProject.ServiceHosts contains the hosting executable.