T4 template processing finding another copy of EntityFramework.dll - C#

I have an issue where, when I run a T4 template, it is picking up another copy of the EntityFramework.dll in the VS Common7 folder. That causes it to think that my DbContext class does not inherit from System.Data.Entity.DbContext.
When I debug the T4 template I get the following in the Immediate window.
typeof(System.Data.Entity.DbContext).Module.FullyQualifiedName
"C:\\Program Files (x86)\\Microsoft Visual Studio 14.0\\Common7\\IDE\\EntityFramework.dll"
However, that path appears nowhere in my Modules window. The correct EntityFramework.dll shows up there, loaded from the build output of the project I'm referencing.
I understand I'm leaving out some detail here, and I'll fill in more to answer questions, but I am just at a total loss as to what would cause Visual Studio to pull in another EntityFramework.dll instead of using the one that I'm providing it. This all runs fine, by the way, if I directly run the code in a command line app or if I set the templating tool to TextTemplatingFilePreprocessor and run the generated class.
The line of code this is all erroring on is:
Database.SetInitializer<FakeDbContext>(null);
FakeDbContext being nothing more than:
using System.Data.Entity;

public class FakeDbContext : DbContext
{
}
The error I get is, given the dll issue, predictable:
Method System.Data.Entity.Database.SetInitializer: type argument
'FakeT4Dependencies.FakeDbContext' violates the constraint of type
parameter 'TContext'.
It thinks FakeDbContext is not a DbContext, which follows from the dll issue. Interestingly, the issue is a runtime error, not an error at compilation of the text transform class; I can tell by the stack trace and the type of exception (VerificationException).
If I run the code straight from my command-line app, by the way, I get what you would expect for the type, and everything works fine.
typeof(System.Data.Entity.DbContext).Module.FullyQualifiedName
"C:\\Users\\jeffreyvest\\Code\\TestT4Dependencies\\TestT4Dependencies\\bin\\Debug\\EntityFramework.dll"
The issue only comes up when running it through the custom tool by saving the .tt file. Somehow VS loads things up just a bit differently and wants to go off and use a different copy of the dll that I do not want it using at all. Incidentally, that copy is the same version of EntityFramework.dll; just the fact that it lives in a different file than the one referenced by my code is throwing it off.
The deepest mystery for me is that the Common7 dll does not show up anywhere in my module list. It shows the other copy in the bin/Debug of the project whose output I'm using. So it appears that's all getting bound up correctly, then suddenly at the last minute the type itself gets associated with a different dll. It would even be OK if it used nothing but that other dll in the Common7 folder; it's the mix that's making it all go wonky.
EDIT
As an added fun twist, changing the assembly referenced in the T4 to use that Common7 copy of EntityFramework fixes the issue only when debugging the T4. When running it straight (just saving the .tt file) it still has the issue described.
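For reference, the directives at the top of the .tt file are shaped roughly like this (paths are illustrative, not copied from my actual template; the assembly directives are what I mean by "the assembly referenced in the T4"):
<#@ template debug="true" hostspecific="true" language="C#" #>
<#@ assembly name="$(SolutionDir)FakeT4Dependencies\bin\Debug\FakeT4Dependencies.dll" #>
<#@ assembly name="$(SolutionDir)FakeT4Dependencies\bin\Debug\EntityFramework.dll" #>
<#@ import namespace="System.Data.Entity" #>
<#@ import namespace="FakeT4Dependencies" #>
<#@ output extension=".txt" #>
<#
    // This is the call that blows up with the VerificationException in the T4 host:
    Database.SetInitializer<FakeDbContext>(null);
#>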
I also put together the following, which I think really illustrates the issue.
typeof(FakeT4Dependencies.FakeDbContext).BaseType.Module.FullyQualifiedName
"C:\\Users\\jeffreyvest\\Code\\TestT4Dependencies\\FakeT4Dependencies\\bin\\Debug\\EntityFramework.dll"
typeof(System.Data.Entity.DbContext).Module.FullyQualifiedName
"C:\\Program Files (x86)\\Microsoft Visual Studio 14.0\\Common7\\IDE\\EntityFramework.dll"

Related

Error CS7038 (failed to emit module) only in Edit and Continue

I'm debugging a .NET 4.0 application in Visual Studio 2015. My application builds and runs fine, but when I try to edit and continue while running under the debugger, regardless of what changes I make or where I make them in my main project, I get a dialog that says:
Edits were made which cannot be compiled. Execution cannot continue
until the compiler errors are fixed.
As an example of the sort of change I'm talking about, I've tried adding this line in various methods:
Console.WriteLine("foo");
When I look in Visual Studio's Error List pane, I see only one error, CS7038, with the description "Failed to emit module '<my app name>'." No filename, line number, or character is given. There are no squiggly red underlines in my code. If I stop the running application, build with the changes, and run again, everything builds and runs just fine. So there seems to be some discrepancy between what the build-time compiler and the edit-and-continue compiler consider acceptable.
Does anyone know of a way to get more information about why the compile fails in Edit and Continue mode? I read something about attaching to and debugging the VBCSCompiler process, so I tried that, but even with all exception types set to break when thrown, the attached VS never broke.
I'm not sharing any code because this isn't a question about my code but rather about strategies for finding out what the Edit and Continue compiler thinks is wrong, and for all I know the source of the compiler error could be anywhere in my entire project.
Edit:
As mentioned in the comments, I was able to attach a debugger to Visual Studio and break when an exception was thrown upon clicking "Continue" after editing code. The exception was a System.NotSupportedException with the following message: "Changing the version of an assembly reference is not allowed during debugging". It listed the name of the assembly in question, which was a small VB.Net project used by my application, which is mostly in C#. I'm trying to build up an MCVE to submit to Microsoft, but currently I'm unable to reproduce the problem in a smaller solution with just one VB and one C# project.
Edit 2:
I've found a workaround and self-answered the question in case anyone else ever encounters this weird problem, but I'm reserving the "Answered" check mark for anyone who can explain what's going on (why the compiler thinks the version number of the referenced project has changed during the edit).
I found a workaround for the problem, but I don't fully understand what was going on. In the VB.NET project whose assembly version the Edit and Continue compiler said was changing, there was a file called "AssemblyInfo.vb". That file contained the following line:
<Assembly: AssemblyVersion("3.0.*")>
The assembly version can also be set in the Project Properties, via the "Assembly Information" button in the Application tab.
When I removed the AssemblyVersion line from AssemblyInfo.vb, my Edit and Continue problem went away. At first I thought this was because the fields in the Assembly Information window were saved to a different file from AssemblyInfo.vb and there was some conflict between the two, but now I see that the Assembly Information window is just a handy way to edit AssemblyInfo.vb: if I delete the line in AssemblyInfo.vb, it gets cleared in the Assembly Information window.
After some more experimentation, it appears that the asterisk in the version number is the culprit. If I fully specify the assembly version, my Edit and Continue problem goes away. And the referenced project has to be a VB.NET project. I tried the same setup with a C# project, and I could Edit and Continue just fine.
This appears to be very much an edge case, and I'll submit a bug report to Microsoft, but in the meantime I'd love to know what is actually going on with the compiler: why it's getting two different assembly versions of an assembly that really shouldn't need to be recompiled during debugging. If you have a good explanation for what's happening, please add it as an answer.
Edit: here's the bug report I filed.
This happened to me in a .NET 4.8 app with Visual Studio 2019.
I have a mix of VB and C# projects; here the problem appears when a vbproj references a csproj that uses the wildcard operator '*' to specify the assembly version.
As @Wai-Ha-Hee commented above, the wildcard uses the current time. I believe that when VS rebuilds the application to apply the edits you have made, the assembly version changes, causing the error.
In the AssemblyInfo file of the project named in the error, change:
[assembly: AssemblyVersion("1.0.*")]
To:
[assembly: AssemblyVersion("1.0.0.0")]
That solved it for me.
An important thing to note is that using the wildcard '*' makes the assembly non-deterministic, meaning each build produces a different assembly. This is considered bad practice because building the same source code under the same conditions generates different assemblies.
In Visual Studio 2019:
New non-SDK-style csproj/vbproj files are generated with:
<Deterministic>true</Deterministic>
New SDK-style csproj/vbproj files omit this line but also default to deterministic builds.
I recommend considering other ways to version the assembly.
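For example (the values are illustrative), you can keep AssemblyVersion fixed and put the numbers that change on every build into the file/informational versions, which are not part of the assembly's binding identity:
using System.Reflection;

[assembly: AssemblyVersion("1.0.0.0")]                  // binding identity - keep this fixed
[assembly: AssemblyFileVersion("1.0.123.0")]            // not part of the identity; safe to bump every build
[assembly: AssemblyInformationalVersion("1.0.0-local")] // free-form display string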
More about Deterministic:
http://blog.paranoidcoding.com/2016/04/05/deterministic-builds-in-roslyn.html
https://reproducible-builds.org/
One of my C# projects in a mixed solution was .NET Framework 2.0 (while others, both C# and VB.NET, were .NET Framework 4). After I changed it to .NET Framework 4, Edit and Continue began to work.

Fody looks for intermediate files in the wrong directory

To be able to publish a single .exe I've added Costura/Fody package to my C# project. I've used this package before but now I get the following error message:
MSBUILD : error : Fody: AssemblyPath
"C:\Projects\X\MSBuild\obj\x86\Debug\X.exe" does not exists. If you
have not done a build you can ignore this error.
Finished Fody 4ms.
The strange thing is that the intermediate X.exe is correctly built here:
C:\Projects\X\src\X\obj\x86\Debug\X.exe
The project I'm working on is fairly large, so we use a couple of MSBuild props files to put everything in the correct output directories. Both building from the command line with MSBuild and building from within Visual Studio work correctly, so I assume our props files are correct. Why is Fody looking in such a weird location for the intermediates?
Which MSBuild variable that Fody might be using controls this intermediate path?
Looking at the code that throws the exception, I see a very simple File.Exists check. It all stems from ProjectDirectory (in a WeavingTask) and you can check the places where the value is used here.
Since I have not used Fody, I can't tell you more than this. I would pay extra attention to the configuration files, since I don't see the ProjectDirectory being constructed anywhere, just injected from somewhere.
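One thing that might be worth checking (just a guess, not verified against Fody's source) is whether the props files set a relative intermediate path that ends up being resolved against the wrong base directory. Making it explicitly project-relative would look something like this in a props file:
<PropertyGroup>
  <!-- Illustrative: pin obj under the project folder so a relative path cannot
       be resolved against the folder of an imported props file. -->
  <IntermediateOutputPath>$(MSBuildProjectDirectory)\obj\$(Platform)\$(Configuration)\</IntermediateOutputPath>
</PropertyGroup>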

Why can't MSBuild resolve assembly version changes

My problem seems to be relatively simple. I have a C# solution created in VS2010 with several projects and project references configured appropriately. I use MSBuild on our build machine, which works fine, building in the correct order and incrementally to be efficient. However, if I extend the interface (adding a public property etc.) and increment the AssemblyVersion of one of the projects that others depend on, it seems that MSBuild is unable to refresh the caching on downstream dependents and throws an error that it cannot find the previous version of the changed dll. Interestingly, if I run the build again immediately afterwards, it tells me that the output hasn't changed and completes the build without error, but I no longer have any confidence that it has done the right thing.
There is a file, 'ResolveAssemblyReference.cache', that seems to hold the old reference. If I delete it from the obj/x86/Release folder before each build I never receive any errors, but I don't know whether the output has been (or should have been) rebuilt.
I would like to understand why MSBuild struggles with this and why on a second build it seems to report that the first build did actually work and that the targets are up to date.
Until I understand what is going on I am going to have to always force a rebuild of the entire solution to be certain I have compatible files.
Incidentally, if I build in VS2010 I never seem to suffer from this, as I suspect it updates that cache appropriately on each build.
Update:
I have found that my use of 'OutDir' on the command line for MSBuild seems to be to blame. If I remove it, the references are resolved appropriately. However, now my output is not copied to where I need it for deployment...
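For reference, the kind of invocation I mean looks like this (solution name and output path are illustrative); note that OutDir generally needs a trailing backslash:
msbuild MySolution.sln /p:Configuration=Release /p:OutDir=C:\Deploy\MyApp\
I have also read that newer versions of the common targets support a GenerateProjectSpecificOutputFolder property that gives each project its own subfolder under OutDir, which avoids every project sharing one folder, though I have not confirmed whether it is available for the VS2010 toolset.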

Script# and compiler problems

I've just come across a pretty strange problem with VS2010 and Script#, which most of the time I am able to re-create.
In my simple scenario I have 2 projects in my solution; a standard Asp.Net MVC2 Web Application, and a Script# jQuery Class Library. I created a static class (attributed with [Imported]) with a static method on it, the intention being that I can map this class in code to an external Javascript library, as described in the documentation.
However, it seems that whenever I decorate such a class with [IgnoreNamespace] to achieve this goal, the project stops compiling successfully but doesn't give me any feedback as to why it's failing (no errors in the error window, for example). It's not easy to get rid of either, as Visual Studio seems to get into a permanent state of build failure; removing the classes and project files doesn't solve it, nor does restarting Visual Studio. The only way I can get VS to build the project successfully is to delete the project entirely, create a new one, then add the files back in, which is annoying to say the least.
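For concreteness, the kind of class I mean is roughly this (the names are made up; [Imported] and [IgnoreNamespace] are the Script# attributes mentioned above):
[Imported]
[IgnoreNamespace]
public static class Analytics
{
    // Intended to map to a global object supplied by an external JavaScript library.
    public static void Track(string eventName) { }
}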
With a verbose build output setting, I get the following:
Target "AfterCompile" in file "C:\Program Files (x86)\ScriptSharp\v1.0\ScriptSharp.targets" from project "e:\project\local\ScriptSharpDemo\Scripts\Scripts.csproj" (target "Compile" depends on it):
Task "ScriptCompilerTask"
Done executing task "ScriptCompilerTask" -- FAILED.
Done building target "AfterCompile" in project "Scripts.csproj" -- FAILED.
... which doesn't tell me a whole lot.
There have been a couple of times where I have managed to create this type of class and then successfully build, but mostly I can reproduce this problem pretty reliably.
At this point I'm inclined to think that the bug lies with Script#, but would just like to have that confirmed, and to find a possible work around if there is one.
Just in case anyone is having a similar issue, I've found the cause of the problem.
When adding a class using this method, or copying in a file from another project for use within Script#, a reference to System.dll gets added to the project. This (understandably) causes the project to stop compiling, without reporting any error.
It would be nice to have a warning about this or for Script# to somehow detect when this situation occurs and/or create a new template for when I use 'Add class' or import a file, but it is just a convenience issue and at least now I can painlessly get my project compiling again just by removing this reference.
When trying to make my project build again, I came across the following, possible solutions:
The "Home\HomePage.cs" and "Shared\Utility.cs" must not be deleted and remain where they wre initially created
The "Home\HomePage.cs" and "Shared\Utility.cs" must be the last entries in the "*.csproj"-file. After them, no "Compile" tag should follow
Problematic calls to "Script.Literal" might cause silent failures; be especially careful when passing parameters (like Script.Literal("{0}.doFoo()", variable))
The same seems to be true for "String.Format" when the format parameters are invalid
Namespaces and folders seem to cause many problems, putting all classes into the same namespace and all classes into the same folder might help
I tried all of the suggestions that have been given here, but continued to see the issue. Eventually, I determined that the cause in my situation was that I had added an [IntrinsicProperty] attribute to one of my properties. Removing it solved the issue. Don't ask me why this was causing a problem, but I thought I would share this solution in case others run into it.

Successful Visual Studio C# build does not create assembly

I am using Visual Studio 2005, .NET 2.0
I am not really sure yet under what circumstances it happens, but here is the scenario:
I have a solution with a project structure like this: A library project Foo, a library project Bar which references Foo, and a library project Quux which references Foo and Bar.
Compiling fails with the Error message "Metadata file 'Foo.dll' could not be found" from Bar, and "Metadata file 'Foo.dll' could not be found" and "Metadata file 'Bar.dll' could not be found" from Quux.
Looking in my target directory (I have a combined target directory for all 3 projects), it is empty, so no project is compiled at all. Now, I can understand that Bar and Quux fail if there is no output from Foo. The problem is: why does Foo silently fail? There is no error from it, and just building Foo instead of the entire solution works fine.
The "funny" thing is, after just pushing the build button again, the Foo.dll file appears, Bar no longer complains but does not produce any output file either, and Quux complains about missing Bar.dll. Pushing the button again, the Bar.dll appears, there are no more errors but no Quux.dll. Only after pushing the button yet again, the Quux.dll appears, once again with no errors.
The project dependencies are all set correctly, the solution build order says exactly the right thing.
I have even tried creating a new solution and new project files, then adding the sources again to those. No joy, either. Same thing happens.
I am completely stumped. Does anyone know a way out of this mess?
You should have a separate output directory for each project. Each time a project builds, it clears the output directory, so the next project won't find its dependencies there.
Don't worry about losing any DLLs; they'll be copied into each bin directory where they are needed.
I think a workaround for your problem could be post-build events that delete the previous versions of your DLLs and copy the new ones to your combined target directory, as sketched below.
When you set up the three projects to work this way, you will find that each project compiles to its own bin folder and is then copied to the combined target directory. There is a second thing you should do if you decide to work like this: set up, for each project in your solution, a reference path pointing to the combined target directory. The compile order must still be correct.
This way, each project's DLL will be found in the combined target directory each time you compile.
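A post-build event along these lines is what I have in mind (the combined folder name is illustrative; $(TargetPath), $(TargetFileName) and $(SolutionDir) are standard Visual Studio macros):
del /Q "$(SolutionDir)CombinedOutput\$(TargetFileName)"
copy /Y "$(TargetPath)" "$(SolutionDir)CombinedOutput\"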
This solution has its own problems, such as when a post-build event fails to run properly, but that's rare.
Hope this helps
The problem is build order. If one project depends on another, the project it depends on must be built first. Use the build dependencies in the solution properties to fix this.
Check your build order so that everything looks right there.
Try running Clean Solution and then building again; is that when it happens?
Open your project file in Notepad and find the "Import" tag, then replace it with this:
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
It should work.
For reasons unknown, this happened to me with Visual Studio 2013 in the middle of a morning's work. One build, it was updating the dll, the next, it just wasn't, even though the build seemed to go smoothly. I finally addressed it by deleting the existing dll. With no pre-existing dll, the build had to provide a new one.
I agree that each project should have its own target directory. I have tried to get cute with this and it always causes more trouble than whatever I was trying to get around.
I have a VS2019 solution with 5 projects. I just added a new console app. This app suddenly started compiling without complaint but did not produce any output files, and the unit test project that depends on it complained.
I did the usual:
clean / rebuild
clean / rebuild each project in order
check the project dependencies and the build order
restart Visual Studio (I know it's 2020 and I still have to restart VS sometimes)
Faced with the prospect of just creating a new project, I decided to try one other thing:
removed all references to other projects
commented out all the code that depended on these
I was left with pretty much a Main() that returned 0
this compiled and produced output files
one by one I added the references back until everything was there
uncommented the code
At the end of this exercise, things worked.
I cannot tell you what changed.
Thought I would offer this as a troubleshooting method.
