Firebase functions build error when referencing multiple C# projects - c#

I was able to deploy my Firebase C# function with no issues; however, when I referenced another C# project so that I could utilize another object, I got an error saying the project doesn't exist.
So I was able to deploy the following with no problem:
using System.Threading.Tasks;
using Google.Cloud.Functions.Framework;
using Microsoft.AspNetCore.Http;

namespace CloudFunctions
{
    public class Login : IHttpFunction
    {
        public async Task HandleAsync(HttpContext context)
        {
            await context.Response.WriteAsync("Hello World!");
        }
    }
}
This class lives in a project called CloudFunctions. I added a project reference to a project called Services so that I could call the login service, and I get the following error:
The referenced project '../Services/Services.csproj' does not exist
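For reference, the project reference in CloudFunctions.csproj is presumably along these lines (path taken from the error message):

<ItemGroup>
  <ProjectReference Include="../Services/Services.csproj" />
</ItemGroup>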
This is how I am deploying:
gcloud functions deploy login --entry-point CloudFunctions.Login --runtime dotnet3 --trigger-http --allow-unauthenticated
I can't imagine we would be required to have everything in one project in order to deploy?

You need to make all of the projects available to the buildpack (i.e. deploy from the parent directory) but specify the project that contains the entry point as well, using the GOOGLE_BUILDABLE build-time environment variable.
From the deployment documentation in the Functions Framework GitHub repo:
Real world functions are often part of a larger application which will usually contain common code for data types and common business logic. If your function depends on another project via a local project reference (a <ProjectReference> element in your .csproj file), the source code for that project must also be included when deploying to Google Cloud Functions. Additionally, you need to specify which project contains the function you wish to deploy.
In a typical directory structure where all projects sit side-by-side within one top-level directory, this will mean you need to deploy from that top-level directory instead of from the function's project directory. You also need to use the --set-build-env-vars command line flag to specify the GOOGLE_BUILDABLE build-time environment variable. This tells the Google Cloud Functions deployment process which project to build and deploy. Note that the GOOGLE_BUILDABLE environment variable value is case-sensitive, and should match the directory and file names used.
When deploying a function with multiple projects, it's important to make sure you have a suitable .gcloudignore file, so that you only upload the code that you want to. In particular, you should almost always include bin/ and obj/ in the .gcloudignore file so that you don't upload your locally-built binaries.
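For example, a minimal .gcloudignore in the top-level directory might look like this (the entries beyond bin/ and obj/ are just common suggestions, not requirements):

# Locally-built binaries and other files that shouldn't be uploaded
.gcloudignore
.git
.gitignore
bin/
obj/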
Sample deployment command line from the examples directory:
gcloud functions deploy multi-project \
--runtime dotnet3 \
--trigger-http \
--entry-point=Google.Cloud.Functions.Examples.MultiProjectFunction.Function \
--set-build-env-vars=GOOGLE_BUILDABLE=Google.Cloud.Functions.Examples.MultiProjectFunction
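Applied to the layout in the question, and assuming the CloudFunctions and Services directories sit side by side under one parent directory that you deploy from, the command might look like this (GOOGLE_BUILDABLE must match the actual directory name of the function's project):

gcloud functions deploy login \
  --runtime dotnet3 \
  --trigger-http \
  --allow-unauthenticated \
  --entry-point=CloudFunctions.Login \
  --set-build-env-vars=GOOGLE_BUILDABLE=CloudFunctions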

Not getting path to class library project from within class library project

I am calling a helper method in my Services class library project from a controller in my UI web application. I cannot get the proper path to the templates in the Services project. I have tried dozens of ways, but every time the base of the resulting path points to the UI project.
C:\Users\TFD\OneDrive\TestEmal.UI\TestEmal.UI\bin\Debug\netcoreapp2.0\EmailService\EmailTemplates\EmailMaster_Body.html
I am building the path in the Services class library project
private static readonly string ThisDir = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);
This came from the accepted answer, by Mark Amery, on this post:
Class library path SO
I have tried every permutation of Path and of Assembly but every one returns the path to the Web UI application.
How do I get the base path of the Services class library project without hardcoding or using Replace?
You cannot. A class library is compiled and deployed alongside the application that references it. After build/publish, the DLL for the library resides in the same directory as all the other DLLs for your application, meaning any paths you compute at runtime will always be relative to your web app directory, not your class library directory.
It's not entirely clear what you're trying to achieve here, but if there's simply some file or files in your class library project directory that your class library needs to reference, you need to add them to your project and set "Copy to Output Directory" in the Properties pane for each file in Visual Studio. This will result in the file(s) coming along for the ride and ending up in your web app's build/publish directory as well. Your paths will still be relative to the web app, not the class library.
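For instance, assuming EmailMaster_Body.html is added to the Services project under EmailService\EmailTemplates and marked to copy to the output directory, a sketch of loading it from the library might look like this (the class and method names are made up):

using System;
using System.IO;

public static class EmailTemplateLoader   // hypothetical helper inside the Services library
{
    public static string LoadMasterBody()
    {
        // BaseDirectory is the running application's output folder, which is
        // where "Copy to Output Directory" files from referenced projects end up.
        var path = Path.Combine(
            AppDomain.CurrentDomain.BaseDirectory,
            "EmailService", "EmailTemplates", "EmailMaster_Body.html");
        return File.ReadAllText(path);
    }
}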
Alternatively, you can have a build task that does the copy instead, but that's a little more complicated to set up.

Using MEF in an Azure function App

I am trying to use MEF in my function app. My requirement is to access 5-10 external APIs, then fetch, aggregate and return the data through an HTTP-triggered function. I need to resolve the external dependencies dynamically based on some logic. These external components are already built and exported. I need to import them along with metadata.
I observed that the System.ComponentModel.Composition assembly is already referenced in a default function app created in VS 2017. Not sure how to proceed. Sample setup code would be helpful, if this is at all possible in Azure Functions.
Based on your scenario, I created an HTTP-triggered function via VS2017 to test this issue, following the Simple Calculator MEF Application. Here is the structure of my project:
Without adding the extension library that supports the Mod operation to the Extensions folder, you retrieve the following result:
After adding ExtendedOperations.dll, the Mod operation works as expected:
On my local side, I hard-coded the path used to initialize the DirectoryCatalog. When deploying to the Azure side, your precompiled function lib would be deployed under D:\home\site\wwwroot\bin, so you could add your Extensions folder within it and use the following code for retrieving your extension folder:
Path.Combine(System.Environment.GetEnvironmentVariable("HOME"), @"site\wwwroot\bin\<your-extensions-folder>")
Also, you could leverage Kudu and navigate to D:\home\site\wwwroot\<your-function-name>, then add your Extensions folder under it, and initialize your DirectoryCatalog with the path Path.Combine(System.Environment.GetEnvironmentVariable("HOME"), @"site\wwwroot\<your-function-name>\<your-extensions-folder>").
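A minimal composition sketch along those lines (the Extensions folder name and the IOperation/IOperationData contracts come from the calculator sample and are assumptions here):

using System;
using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;
using System.IO;

public class ExtensionHost
{
    // Filled in from any extension assemblies found in the catalogs.
    [ImportMany]
    public IEnumerable<Lazy<IOperation, IOperationData>> Operations { get; set; }

    public void Compose()
    {
        // Resolves to D:\home\site\wwwroot\bin\Extensions on the Azure side.
        var extensionsPath = Path.Combine(
            Environment.GetEnvironmentVariable("HOME"),
            @"site\wwwroot\bin\Extensions");

        var catalog = new AggregateCatalog(
            new AssemblyCatalog(typeof(ExtensionHost).Assembly),
            new DirectoryCatalog(extensionsPath));

        var container = new CompositionContainer(catalog);
        container.ComposeParts(this);
    }
}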

Is it possible to make separate dlls with MVC project?

We have a big project developed in ASP.NET MVC5. Our models and business logic are defined in separate class libraries. Now we need to add another module to an existing project, but we want a separate dll.
This module also shares most of the JavaScript, CSS and other files. That is the reason we don't want a separate MVC project.
Is there any way we can create a separate dll per module, so we don't have to deploy or touch the other dlls?
From your description, you say that the projects share CSS and JS files. This leads me to believe you are talking about a separate MVC website (possibly part of the larger corporate website). This is easiest with the use of Areas. If you are not familiar with Areas, please read the following: https://msdn.microsoft.com/en-us/library/ee671793(VS.100).aspx
Of course, using Areas will require you to deploy the whole site every time one of the areas changes, and you have mentioned that you want to avoid doing so.
If you don't want to use Areas, and instead want to create another MVC project in the same solution, you can do that easily too. You can right-click the solution, then Add new project > ASP.NET web application > MVC to add the project. To share JS and CSS files between these two MVC projects, you will have to create a new solution folder (right-click the solution > Add new solution folder) and move your resource files to that folder. Inside each MVC project in your solution, you will then add those js/css resource files as existing items (using Add As Link). This way, if you change the css file, the change will be reflected in both projects.
For more information, read the following:
How do you share scripts among multiple projects in one solution?
Yes you can: just add the logic classes to other class library projects (you can have as many as you want), then add references to those class libraries from the MVC project.
Don't forget to import the namespaces afterwards in your code.
Edit: I'm assuming you are using Visual Studio. If so, you can right-click the solution and choose Add -> New Project; this will create another project in the same solution.
I don't know whether you have tried the Managed Extensibility Framework (MEF) or not; this framework works the way you require. I think the links below will help you:
ASP.NET MVC Extensibility with MEF
How to integrate MEF with ASP.NET MVC 4 and ASP.NET Web API
http://www.codeproject.com/Articles/167321/MEF-with-ASP-NET-Hello-World
Other people have posted answers regarding the use of Areas. Areas are great and good and helpful. They really benefit the project structure.
This module also shares most of the JavaScript, CSS and other files
The title of your question is about .dlls, but I suspect the client-side resources are the main concern.
If you consider your webapp as having two distinct parts, server-side and client-side, you can use appropriate strategies to modularize each. Areas are great for organizing server-side code, but don't help the front-end.
Front-end package management options have expanded for ASP.NET 5. In addition to the traditional NuGet package manager, Bower and NPM are now supported. For example, consider how this article demonstrates installing jQuery via NPM. Here's another good article about setting up NPM, Bower, and Gulp in Visual Studio.
What to do: take your existing client-side code and make a custom NPM or Bower package, and then use that package from one or more ASP.NET projects.
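A minimal package.json for such a shared front-end package might look something like this (the package name and entry point are placeholders):

{
  "name": "mycompany-shared-assets",
  "version": "1.0.0",
  "description": "Shared JS/CSS used by the MVC modules",
  "main": "scripts/site.js"
}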
I can suggest you two ways to organize your multi-module project.
Option 1 - Create Area per module, within same web project
One way to do it is to create a separate Area within the same MVC project. So each module will have a separate Area, with separate controllers, views, scripts etc. But,
(1) This will still create a single dll for the whole MVC project
(2) Sharing files across areas might not be very easy in some scenarios (you can keep all the scripts for all the modules in one shared directory though)
Option 2 - Create class library per module, merge after build
Another way is to create a single class library project per module. Add references to System.Web.Mvc and other libraries so that it can have controllers etc. Create your own views, scripts and other folders and populate them with files as you need them.
Now all your modules will build as separate projects, each with its dll file and the JavaScript, HTML, CSS, images etc. To make them all work as a single web application you can create one (and only one) MVC web project, which will go into the IIS virtual directory and be published as the web application.
To use all your separate modules from the same web, you can write post-build events in all those libraries to copy the artifacts (dll, scripts etc.) into the main web project, into the corresponding folders (dll to \bin, javascript to \scripts etc.). So, after a successful build, all the artifacts are available in the same web project, and that can be deployed as a single web with all the modules. Your post-build scripts should look something like this:
XCOPY "$(ProjectDir)$(OutDir)*.*" "$(ProjectDir)..\YourMainWebDirectory\Bin\" /Y
XCOPY "$(ProjectDir)Content" "$(ProjectDir)..\YourMainWebDirectory\Content\" /S /Y
XCOPY "$(ProjectDir)Scripts" "$(ProjectDir)..\YourMainWebDirectory\Scripts\" /S /Y
XCOPY "$(ProjectDir)Views" "$(ProjectDir)..\YourMainWebDirectory\Views\" /S /Y
XCOPY "$(ProjectDir)Images" "$(ProjectDir)..\YourMainWebDirectory\Images\" /S /Y
Now,
(1) You have separate dlls for separate modules
(2) Can directly share scripts and other files, as they will be in the same location (after build)
(3) If you decide to remove a specific module from the web, just remove the post build event from that module (project) without affecting anything else. You can add that back at any time you please.
Your overall solution will look like
Module01.csproj => post build copy to main
\Controllers
\Scripts
\Views
\Contents
\Images
Module02.csproj => post build copy to main
\Controllers
\Scripts
\Views
\Contents
\Images
Models.csproj
\...
Application.csproj
\...
Main.Web.csproj => main web application hosted in IIS
\Controllers
\Scripts
\Views
\Contents
\Images

Solution Output Directory

The project that I'm currently working on is being developed by multiple teams, where each team is responsible for a different part of the project. They have all set up their own C# projects and solutions with configuration settings specific to their own needs. However, now we need to create another, global solution, which will combine and build all projects into the same output directory.
The problem that I have encountered, though, is that I have found only one way to make all projects build into the same output directory - I need to modify configurations for all of them. That is what we would like to avoid. We would prefer that all these projects had no knowledge of this "global" solution. Each team must retain the possibility to work just with their own sub-solution.
One possible workaround is to create a special configuration for all projects just for this "global" solution, but that could create extra problems, since now you have to constantly sync these configuration settings with the regular ones used by that specific team. The last thing we want to do is spend hours trying to figure out why something doesn't work when building under the global solution just because of some check box that developers have checked in their configuration but forgot to check in the global configuration.
So, to simplify, we need some sort of output directory setting or post-build event that would only be present when building from that global, all-inclusive solution. Is there any way to achieve this without changing anything in the projects' configurations?
Update 1
Some extra details I guess I need to mention:
We need this global solution to be as close as possible to what the end user gets when he installs our application, since we intend to use it for debugging of the entire application when we need to figure out which part of the application isn't working before sending this bug to the team working on that part.
This means that when building under global solution, the output directory hierarchy should be the same as it would be in Program Files after installation, so that if, for example, we have Program Files/MyApplication/Addins folder which contains all the addins developed by different teams, we need the global solution to copy the binaries from addins projects and place them in the output directory accordingly.
The thing is, the team developing an addin doesn't necessarily know that it is an addin and that it should be placed in that folder, so they cannot change their relative output directory to be build/bin/Debug/Addins.
The key here is that a team is responsible for a deliverable. That deliverable is a collection of binaries. So the "global" solution ... or "product that uses the deliverables from teams" ... is interested in ensuring that all of the 'current deliverables' work together. That is, that you have a deliverable from the collaborative effort.
So this raises a few questions. Does the team deliver what it considers to be a 'release'? This may be automatic in the build system: if it builds and all tests pass, then publish it.
What you are looking for is a team publishing or promoting a release. The source code is how you got there, the binaries are the result. Each team controls what binaries it considers to be a release (this may be automated by the build system).
Not exactly what you asked, but I hope it is the answer that leads to the right questions to give good results.
One very simple way would be to create the solution, include all the projects, and add a project (or more) to handle the global solution build tasks. The projects in the global solution should then have references to the projects they need, and Visual Studio will handle how to get the binaries from each project. They will (under normal circumstances) be copied to the output folder of the build project. So the project added specifically for the global build tasks would end up with a copy of all the referenced projects' outputs.
Another way would be to create a global MSBuild script that references the rest of the build scripts; each project is itself an MSBuild script.
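A sketch of such a global script (the project paths are made up), passing a shared OutDir so the individual projects' configurations stay untouched:

<!-- GlobalBuild.proj: builds each team's projects into one shared output folder -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <SubProject Include="..\TeamA\TeamA.sln" />
    <SubProject Include="..\TeamB\TeamB.sln" />
  </ItemGroup>
  <Target Name="Build">
    <MSBuild Projects="@(SubProject)"
             Properties="Configuration=Release;OutDir=$(MSBuildProjectDirectory)\Output\" />
  </Target>
</Project>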
EDIT
From the comments it would seem that there are two categories of projects: those that need building and those that do not.
For those that need building, reference them as projects in the aggregating project. For those that do not require building, add them either as references or add the dlls as resources.
Using the latter, change the Build Action property to None and Copy to Output Directory to Copy if newer.
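In the aggregating project file, those settings correspond to an item like this (the dll path is a placeholder):

<ItemGroup>
  <None Include="libs\SomeDependency.dll">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>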
In both cases you now have all the dlls in the output directory. You can then have a post-build action on the aggregating project that moves the dlls that should be in a specific folder (i.e. not in the output folder).
Have a look at the practice of Continuous Integration and the usage of a Build Server with scripted builds. This is an indispensable instrument when developing different parts of an application as a team, and your problems are a great illustration of the reason why.
You've not mentioned whether you use a version control system. I've found in practice that each developer maintains their team's configuration and builds locally on their machine; since you don't check in *.suo or *.user files, most of the personal configuration only affects the individual team member.
On a completely separate machine, check out the same code from all repositories and compile the project on the build machine (this can be completely automated). This maintains your build server's independence.
Don't worry about it being a "Solution". You can easily build multiple solutions one after the other.
Since the output path is relative (and probably "bin\Debug") it'll get built wherever you check it out to. If you want all the binaries in the same output folder you could tweak the output path on every configuration to match, something like "..\..\bin\Debug" (obviously this affects where the projects get built to on the local machines, but it might not matter). That way multiple projects would get built to the same target output.
You could also include a separate setup build on the build server, which isn't on each developer's local machine, to package up the final product.

Ensure required install actions for a dll are executed without duplicating code

I have a C# solution with two regular projects and a setup project. One of the regular projects is an executable, while the other is a dll that I also use in other solutions. The dll project relies on there being a certain event log source that it can log to, and since the program is intended to be run by users who are not allowed to create log sources, this source must be created at installation.
I have done this by creating an installer class for my executable project, creating the log source in the installer, and including that installer in my custom actions in the setup project. This works, but now I have to create a similar installer for every other project that also uses that dll.
The best solution would be if I could write an installer for the dll and then choose the dll for the custom actions in the setup project. This way I would only have to state the log creation requirement once. However, I am not able to select the dll project output for the custom actions in the setup project.
Another good solution would be if I could somehow specify that the installer for the executable should be transitive, such that it would also perform install actions for any projects that the executable project depends on, but I don't know how to specify that requirement.
So what can I do to avoid duplicating installation code between different projects?
You should be able to add an installer class to your DLL then register the DLL for execution of custom actions in a setup project. If you have tried this and encountered problems, could you please be more specific about which version of Visual Studio and which type of setup project you are using?
I just have a MyApplication.Installation assembly where I put a custom action that creates the event source. All my setup projects reference this assembly and invoke its custom action.
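A sketch of such an installer class (the source name is a placeholder); the setup projects' custom actions then point at the assembly containing it:

using System.ComponentModel;
using System.Configuration.Install;
using System.Diagnostics;

[RunInstaller(true)]
public class EventSourceInstaller : Installer
{
    public EventSourceInstaller()
    {
        // Creates the event log source at install time, when the installer
        // runs with elevated rights.
        Installers.Add(new EventLogInstaller
        {
            Source = "MyApplication",   // placeholder source name
            Log = "Application"
        });
    }
}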
How about this? You create a simple batch file or a PowerShell script that creates the event log source you need. You could make an installer for the dll file (or even the entire solution, it doesn't matter), and then invoke the script you just wrote from the installer [refer here]. This way you are not duplicating the creation logic for dependent files/resources, and you can use the same script for multiple setup projects (provided they use the same resources).
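For example, a one-line PowerShell script along those lines (the source name is a placeholder, and it must run elevated):

New-EventLog -LogName Application -Source "MyDllEventSource"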
I hope this answers your question.
One step further: what environment are your clients on? Are they still on Win XP (SP2 or before)? If that is the case, you have to do something similar to what you already have in mind right now. However, if your clients are on Win 7, you could use NuGet to publish your bins (refer here). I admit that this is still looked at as a source-code-sharing solution, but I believe the approach can be extended to publishing binaries too.
