I'm struggling to write the right configuration for gRPC imports.
The .NET solution structure looks like this:
\Protos\Common\common.proto
\Protos\Vehicle\car.proto
\CarMicroservice
Inside car.proto I have: import "common.proto"
What I want is the generated grpc code to be inside the project CarMicroservice.
Inside CarMicroservice.csproj I have written the path to the protos:
<Protobuf Include="common.proto" ProtoRoot="..\Protos\Common\" />
<Protobuf Include="car.proto" ProtoRoot="..\Protos\Vehicle\" />
But I'm getting an error: 'common.proto' no such file or directory
The question is:
How do I correctly import common.proto inside the car.proto?
Note: I already looked at a similar question, but couldn't make it work for my case:
Importing .proto files from another project
OK, I finally solved the issue; @DazWilkin also pointed this out.
You can't use a path relative to the importing file in the import statement; the import path has to be relative to the proto root. In my case that was: import "Common/common.proto"
Use the common root of the proto files as the ProtoRoot. So instead of ProtoRoot="..\Protos\Common\", use ProtoRoot="../Protos/"
Now comes the interesting part.
For some reason, when I used backslashes in the ProtoRoot path, as in "..\Protos\",
I kept getting 'file not found' errors. So don't use backslashes in these paths.
The final entries in CarMicroservice.csproj look like the following:
<Protobuf Include="Common/common.proto" ProtoRoot="../Protos/" />
<Protobuf Include="Vehicle/car.proto" ProtoRoot="../Protos/" />
Alternatively, there are two other ways to specify include directories for protoc.
First, there is a global property named Protobuf_AdditionalImportsPath. If you specify this property in your .csproj/.fsproj, it will be passed on to protoc for every .proto in that project. You can use it like this:
<PropertyGroup>
<Protobuf_AdditionalImportsPath>../../my/other/directory</Protobuf_AdditionalImportsPath>
</PropertyGroup>
There is also an AdditionalImportDirs attribute you can specify directly on <Protobuf> elements. This will likely only apply to the .proto files you specify it on:
<Protobuf Include="MyProto.proto" Link="MyProto.proto" AdditionalImportDirs="../../my/other/directory" />
Keep in mind that in both cases, you can specify any directory you want, regardless of whether it is a parent of the .proto you're compiling.
Also keep in mind that the import path you specify in your .proto needs to be relative to one of the import directories you specify.
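For example (a minimal sketch; the directory layout and the Shared folder name are illustrative assumptions, not taken from the question), suppose the shared file lives at ../../my/other/directory/Shared/common.proto and you declare:
<Protobuf Include="MyProto.proto" AdditionalImportDirs="../../my/other/directory" />
Then MyProto.proto would refer to the shared file by a path relative to that import directory, i.e. import "Shared/common.proto";, not by a path relative to MyProto.proto itself.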
I am using Grpc.Tools (2.38.1) to generate C# types and gRPC stubs from a Test.proto file containing some service definitions.
To do this I have the following in my project's .csproj file:
<ItemGroup>
  <Protobuf Include="**/*.proto" />
</ItemGroup>
This is all working fine: my Test.proto gets compiled to Test.cs and TestGrpc.cs in the obj/Debug folder of my project. The types within them can be referenced from within other types in the project.
But I need to create a WCF interface for the service too, so I thought I could generate this using a custom Protoc plug-in. So I wrote a simple Protoc plug-in that writes out a TestWcf.cs file containing an interface. I then placed this plug-in executable, named protoc-gen-blah.exe, on my path and updated the entry in the .csproj file to this:
<ItemGroup>
  <Protobuf Include="**/*.proto" AdditionalProtocArguments="--blah_out=obj\Debug" />
</ItemGroup>
This correctly creates the C# file, TestWcf.cs, containing my interface: fantastic.
The problem is that my interface within TestWcf.cs cannot be referenced from other types in the project unless I manually include the generated file in the project: something I do not have to do with the other generated files.
Whilst none of the files are included in the project by default―I have to enable 'Show All Files' to see them―Test.cs and TestGrpc.cs have arrows beside them in the Solution Explorer that allow them to be expanded to reveal the types inside. TestWcf.cs does not have this arrow. So Visual Studio is somehow aware that Test.cs and TestGrpc.cs are source code files.
Does anyone know what I need to do for my generated file to be automatically recognised by Visual Studio like the other two files are?
I suspect it has something to do with this part of the Grpc.Tools build target, as I noticed my TestWcf.cs file is not included in the files deleted by the Grpc.Tools clean either, but I can't see why it does not consider my generated file to be C#.
When I build, this is the Protoc call:
D:\...\Src\packages\Grpc.Tools.2.38.1\tools\windows_x86\protoc.exe --csharp_out=obj\Debug
    --plugin=protoc-gen-grpc=D:\...\Src\packages\Grpc.Tools.2.38.1\tools\windows_x86\grpc_csharp_plugin.exe
    --grpc_out=obj\Debug --proto_path=D:\...\Src\packages\Grpc.Tools.2.38.1\build\native\include
    --proto_path=. --dependency_out=obj\Debug\xxxx_Test.protodep --error_format=msvs --blah_out=obj\Debug
    Test.proto
The dependency file looks like this:
obj\Debug/Test.cs \
obj\Debug/TestGrpc.cs \
obj\Debug/TestWcf.cs: Test.proto
Thanks.
I believe the problem is caused by some logic in Grpc.Tools that informs MSBuild of the files that have been generated:
public override string[] GetPossibleOutputs(ITaskItem protoItem)
{
    ...
    var outputs = new string[doGrpc ? 2 : 1];
    ...
    outputs[0] = Path.Combine(outdir, filename) + ".cs";
    if (doGrpc)
    {
        ...
        outputs[1] = Path.Combine(grpcdir, filename) + "Grpc.cs";
    }
    return outputs;
}
This code only caters for two files being generated from a Protocol Buffer source (name.proto): the Protocol Buffers code generation (name.cs) and the gRPC code generation (nameGrpc.cs). It is not picking up the additional file and informing MSBuild that it exists, hence Visual Studio does not consider it to be code.
There is no way around this short of changing the Grpc.Tools code.
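If modifying Grpc.Tools is not an option, a rough stopgap is the manual inclusion already mentioned in the question. A sketch (assuming the plug-in keeps writing into the intermediate output directory, i.e. obj\Debug as in the protoc call above) might look like this in the .csproj:
<ItemGroup>
  <Compile Include="$(IntermediateOutputPath)TestWcf.cs" Condition="Exists('$(IntermediateOutputPath)TestWcf.cs')" />
</ItemGroup>
Because the Condition only matches once the file exists, the interface is only picked up from the second build after a clean, which is exactly the kind of manual step the question was hoping to avoid.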
I want to access an MSBuild variable inside a unit test, which is in a .NET 4.5 class library project (classic csproj), but I have failed to find any articles discussing a way to pass values from MSBuild into the execution context.
I thought about setting an environment variable during compilation and then reading that environment variable during execution, but that seems to require a custom task to set the environment variable value and I was a bit worried about the scope of the variable (ideally, I only wanted it to be available to the currently executing project, not globally).
Is there a known solution to reading an MSBuild property from inside a DLL project in runtime? Can MSBuild properties be "passed as parameters" during execution somehow?
I finally made it work by using the same code-generation task that is used by default in .NET Core projects. The only difference is that I had to manually add the Target to the csproj file for it to work, as this code generation is not standard for .NET Framework projects:
<Target Name="BeforeBuild">
  <ItemGroup>
    <AssemblyAttributes Include="MyProject.SolutionFileAttribute">
      <_Parameter1>$(SolutionPath)</_Parameter1>
    </AssemblyAttributes>
  </ItemGroup>
  <WriteCodeFragment AssemblyAttributes="@(AssemblyAttributes)" Language="C#" OutputDirectory="$(IntermediateOutputPath)" OutputFile="SolutionInfo.cs">
    <Output TaskParameter="OutputFile" ItemName="Compile" />
    <Output TaskParameter="OutputFile" ItemName="FileWrites" />
  </WriteCodeFragment>
</Target>
The lines with Compile and FileWrites are there for it to play nicely with clean and such (see linked answers in my comments above). Everything else should be intuitive enough.
When the project compiles, a custom attribute is added to the assembly, which I can then retrieve using normal reflection:
Assembly
.GetExecutingAssembly()
.GetCustomAttribute<SolutionFileAttribute>()
.SolutionFile
This works really well and allows me to avoid any hardcoded searches for the solution file.
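Note that WriteCodeFragment only emits the attribute usage; the attribute type itself has to exist in the project. A minimal sketch of what it might look like (the class and the SolutionFile property name are assumptions chosen to match the reflection call above):
using System;

namespace MyProject
{
    // Hypothetical attribute filled in by the WriteCodeFragment target above;
    // _Parameter1 becomes the single constructor argument.
    [AttributeUsage(AttributeTargets.Assembly)]
    public sealed class SolutionFileAttribute : Attribute
    {
        public string SolutionFile { get; private set; }

        public SolutionFileAttribute(string solutionFile)
        {
            SolutionFile = solutionFile;
        }
    }
}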
I think you have a couple of options:
Use environment variables, as you already suggested. A custom task may be required to do that, but it is easy to do, without any extra assemblies on your part. The required global visibility might be an issue though; consider parallel builds on a CI machine, for example.
Write a code fragment during the build and include that in your resulting assembly (something akin to what you have already found under the link you suggested in your comments).
Write a file (even app.config) during build that contains settings reflecting the MSBuild properties you need to have; read those during test runs.
(BTW, what makes little sense is to attempt to read the MSBuild project file again at runtime (using the Microsoft.Build framework). For one, that is a whole lot of work to begin with, for little gain IMHO.
And even more importantly, you would most likely - depending on the complexity and dependencies of your properties - need to make sure you invoke the MSBuild libraries with the same properties that were present during the actual build. Arguably, that might put you back where you started.)
The last two options are better suited because they share a useful trait: they are scoped to the build/test run you are currently executing (i.e. you could have parallel builds running without interference).
I might go for the third, because that seems to be the easiest to realize.
In fact, I have done so on a larger project I've been working on. Basically, we had different environments (database connection strings, etc.) and would select one
as a post-build step by copying the specific myenv.config to default.config.
The tests would only ever look for a file named default.config and pick up whatever settings are set in there.
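As a rough sketch of how the test side of that might look (the file name default.config comes from above; the appSettings key name is an illustrative assumption, and this presumes the copied file is an ordinary .config file with an <appSettings> section):
using System.Configuration; // requires a reference to System.Configuration

// Open the default.config that the post-build step copied next to the test
// assembly, and read one of the environment-specific settings from it.
var map = new ExeConfigurationFileMap { ExeConfigFilename = "default.config" };
var config = ConfigurationManager.OpenMappedExeConfiguration(map, ConfigurationUserLevel.None);
string connectionString = config.AppSettings.Settings["DatabaseConnection"].Value;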
Another version, compiled from several internet sources: read an environment variable at build time, then use its value in code.
file AssemblyAttribute.cs
using System;

namespace MyApp
{
    [AttributeUsage(AttributeTargets.Assembly)]
    public class MyCustomAttribute : Attribute
    {
        public string Value { get; set; }

        public MyCustomAttribute(string value)
        {
            Value = value;
        }
    }
}
file MainForm.cs
var myvalue = Assembly.GetExecutingAssembly().GetCustomAttribute<MyCustomAttribute>().Value;
file MyApp.csproj, at the end (reads the %USERNAME% environment variable during the build, generates the SolutionInfo.cs file, and automatically includes it in the build):
<Target Name="BeforeBuild">
  <ItemGroup>
    <AssemblyAttributes Include="MyApp.MyCustomAttribute">
      <_Parameter1>$(USERNAME)</_Parameter1>
    </AssemblyAttributes>
  </ItemGroup>
  <WriteCodeFragment AssemblyAttributes="@(AssemblyAttributes)" Language="C#" OutputFile="SolutionInfo.cs">
    <Output TaskParameter="OutputFile" ItemName="Compile" />
    <Output TaskParameter="OutputFile" ItemName="FileWrites" />
  </WriteCodeFragment>
</Target>
I have an application that needs to parse the ProjectReference elements from *.csproj files. It could do this well with the old (.NET Framework) format, where I used the Name element to get the name of a project:
<ProjectReference Include="..\MyProject\MyProject.csproj">
  <Project>{guid..}</Project>
  <Name>MyProject</Name>
</ProjectReference>
The new (.NET Core) format, however, makes it crash because there is no Name element anymore.
I found a few differences between the two formats, but I'm not sure which one I should use to tell that I'm working with the new format. The differences are:
the new format does not contain an XML declaration
the root element starts with the <Project Sdk= attribute and does not declare a default namespace, whereas the old one has xmlns=" declared
the ProjectReference element does not contain any children
Which property would be the most reliable way to recognize the core-file like Visual Studio does? Am I on the right track or is there any other criteria I should use to tell the file formats apart?
If you are using MSBuild to evaluate the project file, the SDK sets the UsingMicrosoftNETSdk property to true.
If you only use XML-based tooling to read the file, you can check whether a <TargetFramework> or <TargetFrameworks> (plural) property is present (inside a <PropertyGroup>).
This is the same mechanism that Visual Studio uses to determine whether the new or classic project system is used for the project (see the Opening with CPS document).
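If you go the XML-reading route, a small sketch of such a check (using System.Xml.Linq; the helper name and the exact heuristic, an Sdk attribute or a TargetFramework(s) property, are just one reasonable choice) could look like this:
using System.Linq;
using System.Xml.Linq;

static class ProjectFormatDetector
{
    // Heuristic: treat the project as the new SDK-style format if the root
    // <Project> element carries an Sdk attribute, or if it declares a
    // <TargetFramework>/<TargetFrameworks> property in a <PropertyGroup>.
    public static bool IsSdkStyleProject(string csprojPath)
    {
        var root = XDocument.Load(csprojPath).Root;
        if (root == null)
            return false;

        bool hasSdkAttribute = root.Attribute("Sdk") != null;
        bool hasTargetFramework = root
            .Elements().Where(e => e.Name.LocalName == "PropertyGroup")
            .Elements()
            .Any(e => e.Name.LocalName == "TargetFramework" ||
                      e.Name.LocalName == "TargetFrameworks");

        return hasSdkAttribute || hasTargetFramework;
    }
}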
I am new to MSBuild. Just started trying it two days ago, and now I am just testing it. I have run into a problem where I get this error:
"c:\Users\martinslot\Documents\Visual Studio 2010\Projects\MultifileAssembly\SpecializedBuild.xml" (BuildNumberUtil target) (1) ->
c:\Users\martinslot\Documents\Visual Studio 2010\Projects\MultifileAssembly\SpecializedBuild.xml(4,34): error MSB4006: There is a circular dependency in t
he target dependency graph involving target "BuildNumberUtil".
My MSBuild script look like this:
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="BuildNumberUtil" DependsOnTargets="BuildStringUtil">
    <Message Text="=============Building modules for NumberUtil============="/>
    <Csc TargetType="Module" Sources="NumberUtil/DoubleUtil.cs; NumberUtil/IntegerUtil.cs" AddModules="/StringUtil/StringUtil"/>
    <Copy SourceFiles="@(NetModules)" DestinationFolder="../Output/Specialized"/>
  </Target>
  <Target Name="BuildStringUtil" DependsOnTargets="BuildNumberUtil">
    <Message Text="=============Building modules for StringUtil============="/>
    <Csc TargetType="Module" Sources="StringUtil/StringUtil.cs;" AddModules="/NumberUtil/IntegerUtil;/NumberUtil/DoubleUtil"/>
    <Copy SourceFiles="@(NetModules)" DestinationFolder="/Output/Specialized"/>
  </Target>
</Project>
I understand the problem; actually, I created this small example to see whether MSBuild understood it and could somehow correct the problem. How do I solve this?
My problem is that the two targets compile modules that rely on each other. Does anyone here have a suggestion for how to handle this kind of problem with MSBuild? Maybe I am constructing this in the wrong way?
You simply cannot build projects with circular dependencies. How could you? Which do you build first? There may be some esoteric, convoluted, incorrect way of doing so, but why do it? Circular dependencies usually indicate a design flaw. Fix the design, and you no longer have a circular dependency issue.
It is possible to construct circular modules within the scope of MSBuild and Visual Studio; however, there is only a very limited set of situations where doing so is valid.
One key way to do this, if you're planning on using Xaml within your code, is to remove the Sources aspect of the Csc tag and generate your own .response file which actually points to the code you wish to inject. Within the Csc tag attributes you'd specify this file yourself in the ResponseFiles attribute.
Within your .response file, you would then break your application down into its assembly and netmodule components, making sure to always include the core assembly's files first. Typically the Csc tag's attributes are directly translated into Csc.exe command-line parameters, though the parameter names do not always match up. For the sake of resolution it's best to use full, non-relative paths when referring to files (partial example .response file below):
"X:\Projects\Code\C#\Solution Name\InternalName\ProjectName - InternalName\SearchContexts\StringSearchType.cs"
"X:\Projects\Code\C#\Solution Name\InternalName\ProjectName - InternalName\UI\Themes\Themes.cs"
/target:module /out:bin\x86\Debug\InternalName.UI.dll
"X:\Projects\Code\C#\Solution Name\InternalName\ProjectName - InternalName\UI\EditDatabaseImageControl.xaml.cs"
"X:\Projects\Code\C#\Solution Name\InternalName\ProjectName - InternalName\obj\x86\Debug\UI\EditDatabaseImageControl.g.cs"
You'll notice that this will end up merging your multiple sets of Targets into one, and that I've included the XAML-generated code myself. This is partly why you remove the Sources aspect, as the XAML page generator part of the MSBuild task automatically injects information into the @(Compile) set. Since there's a Debug/Release configuration, in the area where you define the response file to use, I create two versions of the response file (since I'm using a T4 template):
ResponseFiles="$(CompilerResponseFile);InternalName.$(Configuration).response"
If you intend to include more than one platform in your code, you'd likely need C*P response files, where C is the number of configurations (Debug|Release) and P is the number of platforms (x86, x64, AnyCPU). This kind of solution is likely only sane when the response files are produced by a generator.
The short version of this: it is possible to create circular modules so long as you can guarantee that you compile them all in one step. To keep the build functionality that the XAML build step affords you, your best bet is to start with a normal C# project and create your own .Targets file from the $(MSBuildToolsPath)\Microsoft.CSharp.targets referenced in the <Import ... tag near the bottom. You'll also likely need a secondary csproj for design purposes, since a large portion of IntelliSense is lost by using this workaround (or use a csproj Condition attribute where the target is selected by some flag you set). You'll also notice certain XAML editors don't seem to like binding to netmodule namespaces, so if you bind to types in a netmodule you'll likely have to do it in code-behind (I haven't tested workarounds for this, since there are usually ways around static namespace binding).
For some reason within all this, the .baml-compiled .xaml files are implicitly understood by the Csc compiler; I haven't been able to figure out whether it derives this from a command argument or whether it's just implicit by design. If I had to guess, they're inferred from the g.cs files associated with what you include in your list of included files.
Note that this occurred for a web application (either a standard ASP.NET web application or an ASP.NET MVC application), and the fix for this problem is to remove the lines below from the .csproj file:
<PropertyGroup>
  <BuildDependsOn>
    $(BuildDependsOn);
    Package
  </BuildDependsOn>
</PropertyGroup>
I have a solution that contains several C# projects, and I would like to be able to set the output path and other properties on all the projects together in a single place. Property sheets (vsprops) do not seem to be available for C# projects, and the $(SolutionDir) variable is ignored. Are there any other methods to set properties across several C# projects?
Update
By following the information in the answer by Bas Bossink, I was able to set the output path of several projects by creating a common project file and importing it into the individual projects. A few other points:
When building in Visual Studio if changes are made to the common project it is necessary to touch/reload any projects that reference it for the changes to be picked up.
Any properties which are also set in a individual project will override the common properties.
Setting $(SolutionDir) as the output path via the Visual Studio UI does not work as expected because the value is treated as a string literal rather than being expanded. However, setting $(SolutionDir) directly in the csproj file with a text editor works as expected.
A csproj file is already an MSBuild file, which means that csproj files can also use an Import element, as described here. The Import element is exactly what you require. You could create a Common.proj that contains something like:
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="3.5" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <OutputPath>$(SolutionDir)output</OutputPath>
    <WarningLevel>4</WarningLevel>
    <UseVSHostingProcess>false</UseVSHostingProcess>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  </PropertyGroup>
</Project>
You can import this Common.proj in each of your csprojs, for instance like so:
<Import Project="..\Common.proj" />
The Import statement should precede any tasks that depend on the properties defined in Common.proj.
I hope this helps. I can't confirm your problems with the $(SolutionDir) variable; I've used it many times. I do know, however, that this variable does not get set when you run an MSBuild command from the command line on a specific project that is contained in a solution. It will be set when you build your solution in Visual Studio.
Unfortunately, these bits of information, such as the output path, are all stored inside the individual *.csproj files. If you want to batch-update a whole bunch of those, you'll have to resort to some kind of text-updating tool or create a script to touch each of those files.
For things like this (apply changes to a bunch of text files at once) I personally use WildEdit by Helios Software - works like a charm and it's reasonably priced.
But I'm sure there are tons of free alternatives out there, too.
I would suggest you use a build tool such as MSBuild or NAnt, which would give you more flexibility over your builds. Basically, the idea is to kick off a build using (in most cases) a single configurable build file.
I would personally recommend NAnt.
You can find an awesome tutorial on NAnt on JP Boodhoo's blog here.
Set the $(OutputPath) property in a common property sheet. Then delete that entry in all the project files you want it to affect. Then import that property sheet into all your projects.
For hundreds of projects that can be very tedious, which is why I wrote a tool to help with this:
https://github.com/chris1248/MsbuildRefactor