In my program, I use SevenZipSharp to generate zip files. SevenZipSharp is a managed DLL which loads another DLL, 7z.dll. I am manually setting SevenZipSharp's path to 7z.dll using SevenZipCompressor.SetLibraryPath.
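For illustration, the relevant setup is roughly of this shape (the paths are placeholders, not the real ones):

// Point SevenZipSharp at the native 7z.dll before using it.
SevenZipCompressor.SetLibraryPath(@"C:\libs\7z.dll");

var compressor = new SevenZipCompressor { ArchiveFormat = OutArchiveFormat.Zip };
compressor.CompressFiles(@"C:\output\archive.zip", @"C:\input\file1.txt");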
When I execute my program in Debug mode, this all works fine, and it generates the zip file as nice as you please. However, when I execute my unit tests with mstest, SevenZipSharp always gives me the following error:
Test method threw exception:
SevenZip.SevenZipLibraryException: Can not load 7-zip library or
internal COM error! Message: failed to load library..
I suspect that MSTest might be doing something that prevents SevenZipSharp from loading 7z.dll, such as running in a security-tight sandbox (or something along those lines; I'm new to C# and MSTest...)
Does anyone have an idea about what might be happening?
Thank you!
Though the question posits a questionable scenario, the general problem of MSTest not loading required DLLs seems to be a common one, deserving of a less dismissive answer.
By default MSTest will copy the assemblies it believes are required by the test container to the Out folder of the default results folder, which changes for each run.
MSTest does not always automatically infer the necessary assemblies correctly; if there is no explicit direct reference to an assembly it won't be copied. Also, native DLLs are typically not detected.
I am not aware of a direct option to set the MSTest search path. You can determine the search path using procmon.exe, as suggested in another answer (it is basically the standard Windows DLL search).
Unintuitively, the default search path does not include the launch directory and I think this is a cause of confusion. When the tests are running the current directory is the test results "Out" directory, not the MSTest launch directory.
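If in doubt, you can dump the relevant directories from inside a test to see where it is actually running; a throwaway diagnostic such as the following (dropped into any test method) is enough:

// Prints the current directory (the deployment "Out" folder during an MSTest run)
// and the directory the test assembly was loaded from.
Console.WriteLine("CurrentDirectory: " + Environment.CurrentDirectory);
Console.WriteLine("BaseDirectory:    " + AppDomain.CurrentDomain.BaseDirectory);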
However, it is possible to control MSTest search behaviour (and the copying behaviour) with a test settings file. You can easily create and edit these through Visual Studio (see the Test menu) and then specify the created settings file on the MSTest command line. You can use different settings files for Visual Studio and MSTest.
By this means you can control exactly what DLLs are copied to your test directory.
See Create Test Settings to Run Automated Tests from Visual Studio to get more information on this.
Of course, DLL load failures may be due to missing dependencies, and the DLL mentioned in the error message may itself be present. You can use the dependency viewer or procmon to pick up unexpected dependencies in DLLs.
Consider using Process Monitor (aka procmon.exe) from the excellent SysInternals tools to monitor your test harness (MSTest). It will show you where the executable is looking for 7z.dll.
Visual Studio 2017 (and possibly 2015) provides two new ways to indicate that a native dll or other file is required by your tests without the need for a test settings file:
1: Add a link to the dll to your test project and tell VS to copy it to the output directory. Right-click the project in Solution Explorer and choose Add > Existing Item. Browse to 7z.dll, click the down arrow next to the Add button, and choose Add as Link. Then select the new 7z.dll item in Solution Explorer, alt-Enter to bring up Properties, and set "Copy to Output Directory" to "Copy if newer" (or "Copy always").
2: Attach a DeploymentItemAttribute to your test class. This attribute's constructor takes a single string argument, which is the path to the file you want to make available to tests in the class. It is relative to the test project's output directory.
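For example, option 2 might look like this for the 7z.dll case (a sketch; it assumes 7z.dll already ends up in the test project's build output):

[TestClass]
[DeploymentItem("7z.dll")]   // copied from the build output to the test deployment directory
public class CompressionTests
{
    [TestMethod]
    public void Creates_zip_file()
    {
        // ... test code that exercises SevenZipSharp ...
    }
}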
Are you sure you want your unit tests to involve external libraries? Ideally you should have mechanisms to replace external dependencies with e.g. mock objects, because testing external libraries like that actually turns your test into an integration test.
For popular libraries such as SevenZipSharp you can assume that it is properly tested, and you can have manual integration tests to verify that it performs correctly in your program.
I would consider getting rid of that dependency through dependency injection, mocking frameworks, etc., and let your unit tests solely test your own code.
Investigate the factory method or abstract factory design patterns for tips on how to easily replace such dependencies.
A good start would be to create your own ICustomZipInterface and use the wrapper pattern to encapsulate your zip logic for production code. In your unit tests, replace that wrapper class with a dummy implementation. The dummy implementation might, for example, record how you access your zip component, and you could use that information to validate your code rather than checking whether a zip file is actually created.
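A minimal sketch of that idea (the wrapper and dummy class names are invented for illustration):

using System.Collections.Generic;

// Thin abstraction over the zip functionality the production code needs.
public interface ICustomZipInterface
{
    void CreateZip(string archivePath, params string[] files);
}

// Production implementation wrapping SevenZipSharp.
public class SevenZipWrapper : ICustomZipInterface
{
    public void CreateZip(string archivePath, params string[] files)
    {
        var compressor = new SevenZip.SevenZipCompressor { ArchiveFormat = SevenZip.OutArchiveFormat.Zip };
        compressor.CompressFiles(archivePath, files);
    }
}

// Dummy implementation for unit tests: records calls instead of touching 7z.dll.
public class RecordingZip : ICustomZipInterface
{
    public List<string> RequestedArchives { get; } = new List<string>();

    public void CreateZip(string archivePath, params string[] files)
    {
        RequestedArchives.Add(archivePath);
    }
}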
I'm writing a C# application that accepts plugins. The way I accomplished this is as follows:
In my solution, create a project that contains a single interface that defines the expected methods of a plugin class.
In the main application, add a reference to this project containing one interface.
Add a third project to the solution which represents a plugin. This plugin also has a reference to the interface project.
In my main application, I scan a plugins folder for files matching a given filename pattern (plugin_*.dll). If such files are found, I load the assembly and then use reflection to look for any class that implements the interface. For any such class, I add an instance of it to a List<IPlugin>. The app then has access to all the plugins via this list.
This works great and I have successfully written a couple of very simple plugins.
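For context, the discovery code is roughly of this shape (IPlugin comes from the shared interface project; the folder path and pattern are simplified):

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Reflection;

public static class PluginLoader
{
    // Scans a folder for plugin assemblies and instantiates every concrete
    // class that implements the shared IPlugin interface.
    public static List<IPlugin> LoadPlugins(string pluginFolder)
    {
        var plugins = new List<IPlugin>();

        foreach (string path in Directory.GetFiles(pluginFolder, "plugin_*.dll"))
        {
            Assembly assembly = Assembly.LoadFrom(path);

            var pluginTypes = assembly.GetTypes()
                .Where(t => typeof(IPlugin).IsAssignableFrom(t) && t.IsClass && !t.IsAbstract);

            foreach (Type type in pluginTypes)
            {
                plugins.Add((IPlugin)Activator.CreateInstance(type));
            }
        }

        return plugins;
    }
}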
Here's where I'm struggling:
When I want to test the plugin, I have to first build the solution and then manually copy the built plugin into the correct location which the app scans. I know I can probably automate this by adding a post-build command though.
More importantly, is there a good way to actually debug the code in the plugin (single-step, exception breaks, etc.)? Right now I simply run the app and see what happens, and I use extensive Console.WriteLines if I need to trace something. It's far less productive than using VS's debugger, though.
Less important at this particular point but could be a thing down the road: how would someone else debug the plugin? More specifically: if I start a new VS solution and make a reference to the interface assembly, is there any reasonable way to debug the code in my new plugin?
What I have
I have a Visual Studio C# unit test solution (which runs WebDriver tests, not that that's necessarily relevant). It runs via TeamCity. Currently the environment is hard-coded to "Dev" in one of the .cs files, and I manually change the code locally to run elsewhere when required.
What I want
A way to set up two projects in TeamCity - one to run on the "Dev" environment and the other on the "Test" environment. Obviously I can't use hard-coded values, so I need some set of configuration files that can be chosen at runtime, or possibly some sort of build parameters - but I have no clue how to do this or what will work.
(I didn't mention TeamCity in the question as it is not 100% relevant and just provides context; as long as I can run the unit tests, e.g. from the command prompt with parameters that can be passed in, that would do the trick.)
What I've tried
From asking around, I don't believe I can use web.config, as it's not a web solution but a unit test solution. I believe there is a mechanism to tell Configuration Manager which web.config file to use, so I'm hoping there's a similar mechanism that can be used for unit test projects. I've tried hunting down information on "build configurations" for "unit test projects" and a range of other searches, but it's a nightmare finding anything relevant.
Can someone point me in the right direction? I'm good with my basic programming, but if it requires messing around with configurations or build parameters, then I might need a more explicit 'how-to' from you.
Thanks in advance.
Check this extension for Visual Studio
https://visualstudiogallery.msdn.microsoft.com/579d3a78-3bdd-497c-bc21-aa6e6abbc859
This allows you to create different config files for different build types. You need to create different build types, each with a config file specific to that build type.
Ok as per my comments above, I tried a solution here: https://visualstudiogallery.msdn.microsoft.com/69023d00-a4f9-4a34-a6cd-7e854ba318b5
IT WORKED.
I'm now able to create App.Config transforms that automatically link to a Build Configuration. I can then specify the build configuration inside TeamCity.
The only trick was that the "Rebuild" target didn't work if there were no code changes but there was a difference in Configuration (the builds use the same directory on the same agent, and a rebuild is necessary). The workaround for this is to tick the 'Clean all files before build' option in TeamCity Version Control settings.
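In the test code itself, the previously hard-coded environment then just becomes a value read from the (transformed) App.config; something along these lines, where the "Environment" key name is only an example:

using System.Configuration;

public static class TestEnvironment
{
    // Reads the target environment from whichever App.config transform the
    // build configuration selected; falls back to "Dev" if the key is missing.
    public static string Name =>
        ConfigurationManager.AppSettings["Environment"] ?? "Dev";
}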
I wonder if anyone could assist me..
I am working in a Visual Studio project, and we have recently begun working on Unit Tests for our project.
The Unit Tests rely on referencing an Excel file in the solution.
We have added code that appears to work well on each of our own environments with regards to referencing the file, and all seems good.
We have now also set up TFS to run all the unit tests on each commit and produce a report, and this has revealed a problem: almost all of the tests fail on the build server, despite all of them running successfully on our own environments.
TFS doesn't seem to provide any logging of why the tests are failing, but we assume it's to do with the path referencing.
So our solution structure is like so..
..\head\Solution\Project\project.csproj
..\head\Solution\Tests\TestFiles\spreadsheet.xlsx
We are currently using the following code to reference the spreadsheet..
string filename = Path.Combine(Directory.GetParent(Directory.GetCurrentDirectory()).Parent.Parent.Parent.FullName, "Tests\\TestFiles\\spreadsheet.xlsx");
ExcelImporter importer = new ExcelImporter(filename);
It seems like we are explicitly walking up a fixed number of parent folders, which probably isn't the same on the build server environment.
How can we better reference the spreadsheet file, assuming we'll never know how many parent folders to the solution there will be to it?
I really would not go this way; the physical path your unit tests run from might be very different on the build server compared to your local dev machines.
In general you should simply include a reference to the solution item (the Excel file) in your unit test project (even adding it as a link), and set its property to copy to the output folder, so that when the unit test assembly is built the Excel file is copied to the same location as the test assembly. Then at the top of your test method you can declare the dependency with the standard attribute:
[DeploymentItem("spreadsheet.xlsx")]
This should work well for MSTest at least; we use it and it works with no issues.
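Putting it together, a test using the spreadsheet might look roughly like this (the DeploymentItem path must match wherever the copied file lands in the build output):

[TestMethod]
[DeploymentItem(@"TestFiles\spreadsheet.xlsx")]
public void Imports_spreadsheet()
{
    // After deployment the file sits next to the test assembly,
    // so the bare file name is enough.
    var importer = new ExcelImporter("spreadsheet.xlsx");
    // ... assertions against the imported data ...
}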
In the unit test project, create a folder (let's call it "Resources") and copy your file there. In the file's properties, make sure the file is set to always be copied to the output.
Then in your unit test just get the file like this:
string filename = Path.Combine(Environment.CurrentDirectory, "Resources\\spreadsheet.xlsx");
Relevant link: Environment.CurrentDirectory
I have a c# solution with two regular projects and a setup project. One of the regular projects is an executable, while the other is a dll, that I also use in other solutions. The dll project relies on there being a certain event log source, that it can log to, and since the program is intended to be run by users that are not allowed to create log sources, this source must be created at installation.
I have done this by creating an installer class for my executable project, creating the log source in the installer, and including that installer in my custom actions in the setup project. This works, but now I have to create a similar installer for every other project that also uses that DLL.
The best solution would be, if I could write an installer for the dll, and then choose the dll for the custom actions in the setup project. This way I would only have to state the log creation requirement once. However, I am not able to select the dll project output for the custom actions in the setup project.
Another good solution would be, if I could somehow specify that the installer for the executable should be transitive, such that it would also perform install actions for any projects that the executable project depended on, but I don't know how to specify that requirement.
So what can I do to avoid duplicating installation code between different projects?
You should be able to add an installer class to your DLL then register the DLL for execution of custom actions in a setup project. If you have tried this and encountered problems, could you please be more specific about which version of Visual Studio and which type of setup project you are using?
I just have a MyApplication.Installation assembly where I put a custom action that creates the event source. All my setup projects reference this assembly and invoke its custom action.
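For reference, such an installer custom action is only a small amount of code; a sketch (the source and log names are placeholders):

using System.ComponentModel;
using System.Configuration.Install;
using System.Diagnostics;

[RunInstaller(true)]
public class EventLogSourceInstaller : Installer
{
    // Placeholder names; use whatever source/log the DLL actually logs to.
    private const string Source = "MyCompany.MyLibrary";
    private const string LogName = "Application";

    public override void Install(System.Collections.IDictionary stateSaver)
    {
        base.Install(stateSaver);

        if (!EventLog.SourceExists(Source))
        {
            EventLog.CreateEventSource(Source, LogName);
        }
    }

    public override void Uninstall(System.Collections.IDictionary savedState)
    {
        base.Uninstall(savedState);

        if (EventLog.SourceExists(Source))
        {
            EventLog.DeleteEventSource(Source);
        }
    }
}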
How about this? You create a simple batch file or a PowerShell script to create the log file that you want. You could make an installer for the DLL file (or even the entire solution; it doesn't matter), and then invoke the batch file that you just wrote from the installer [Refer here]. This way, you are not duplicating the creation logic for dependent files/resources, and you can use the same batch file for multiple setup projects (provided they use the same resources).
I hope this answers your question.
One step further: what environment are your clients on? Are they still on Win XP (SP2 or earlier)? If that is the case, you have to do something similar to what you already have in mind right now. However, if that is not the case and your clients are on Win 7, you could use NuGet to publish your bins (Refer here). I admit that this is still looked at as a source-code-sharing solution, but I believe the approach can be extended to publishing binaries too.
I have a component that reads some configuration from the standard .NET configuration (app.config) file.
When I run unit tests (NUnit) for this component (using TD.NET), I noticed that the configuration file is not read.
Upon inspection of AppDomain.CurrentDomain.SetupInformation.ConfigurationFile,
I have noticed that its value is set to C:\Users\ltal\AppData\Local\Temp\tmp6D2F.tmp (some random temp location).
Is there a reason for why this is happening? (Is it NUnit or TD.NET's fault?)
I suppose I could set this SetupInformation object myself for the sake of the test (I haven't tried yet), but I'm still wondering why it is being created like that and not as the default.
To work around this, you can create an app.config in your unit test project. This will then be used in place of the main app.config by your unit tests. You can then change values in that app.config from your unit tests, making it easier to test different values and configurations; i.e. you can set up your test app.config with certain values before running your test.
ConfigurationManager.AppSettings[""] = "";
Another option might be to place settings in the Settings.settings file of your main project. You do not have to change anything in your unit test project then. Some links about the difference between settings and app.config - MSDN forums, StackOverflow, User Settings - MSDN
And of course a third option would be to remove the dependency on the app.config from your component by introducing an interface and injecting the dependency into the component, making it easy to mock it out and unit test.
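A sketch of that third option (the interface and class names are invented for illustration):

using System.Collections.Generic;
using System.Configuration;

// Abstraction the component depends on instead of reading app.config directly.
public interface ISettingsProvider
{
    string Get(string key);
}

// Production implementation backed by ConfigurationManager.
public class AppConfigSettingsProvider : ISettingsProvider
{
    public string Get(string key) => ConfigurationManager.AppSettings[key];
}

// Trivial fake for unit tests; no config file is needed at all.
public class FakeSettingsProvider : ISettingsProvider
{
    private readonly Dictionary<string, string> _values = new Dictionary<string, string>();

    public void Set(string key, string value) => _values[key] = value;

    public string Get(string key) => _values.TryGetValue(key, out var value) ? value : null;
}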
By default the .NET runtime looks in the working directory of the AppDomain, which is being managed by NUnit in the temp location.
This link offers two solutions about how to get config files picked up:
http://blogs.msdn.com/b/josealmeida/archive/2004/05/31/loading-config-files-in-nunit.aspx
Basically, they need to live in the testing directory.