PowerShell: importing custom C# cmdlets, no "ExportedCommands" available - c#

First question here :)
I have to create a custom cmdlet for PowerShell 2.0, using Visual Studio 2010 Express.
I followed this seemingly simple tutorial: http://blogs.msdn.com/b/saveenr/archive/2010/03/08/how-to-create-a-powershell-2-0-module-and-cmdlet-with-visual-studio-2010-screencast-included.aspx
My code is almost the same (I even tried copy-pasting their code), but after I call Import-Module "path_to_dll" and then Get-Module, I see my imported module, but no ExportedCommands are available.
ModuleType Name ExportedCommands
---------- ---- ----------------
Binary PowerShellCMDLetsLibrary {}
C# Code:
namespace PowerShellCMDLetsLibrary
{
    [System.Management.Automation.Cmdlet(System.Management.Automation.VerbsCommon.Get, "RemedyXml")]
    public class Get_RemedyXml : System.Management.Automation.PSCmdlet
    {
        [System.Management.Automation.Parameter(Position = 0, Mandatory = true)]
        public string TicketID;

        protected override void ProcessRecord()
        {
            ...
            this.WriteObject(Result.InnerXml, true);
        }
    }
}
It might be a blunder that I just can't see.

Two things jump out at me:
TicketID is a field, not a property.
The fully-qualified attribute and base-class names make the code hard to read.
I suspect it's #1, but I can't see enough past #2 to be sure.
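For reference, here is a minimal sketch of the property form that point #1 refers to (short attribute names also address point #2). As a later answer shows, a public field can work too, so treat this as the conventional pattern rather than a guaranteed fix:
using System.Management.Automation;

namespace PowerShellCMDLetsLibrary
{
    [Cmdlet(VerbsCommon.Get, "RemedyXml")]
    public class Get_RemedyXml : PSCmdlet
    {
        // Declared as a property with get/set, the documented pattern for cmdlet parameters.
        [Parameter(Position = 0, Mandatory = true)]
        public string TicketID { get; set; }

        protected override void ProcessRecord()
        {
            // Placeholder body for illustration; the real cmdlet would emit Result.InnerXml.
            WriteObject(TicketID);
        }
    }
}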
Hope this helps.

Don't know if this is a repost, but:
I found the solution. I copied my DLL from the UNC network path to the local C:\ drive, and now the command shows up.

If you are running this off a network or UNC path, you must add the path to your .NET trusts, e.g.:
caspol -machine -addgroup 1. -url "file:\\\network\dir\\*" FullTrust -n Uniquename
HTH,
Bob

This is due to Code Access Security in .NET. By default, an assembly loaded from a network share executes with reduced privileges, whereas one loaded from local storage has no restrictions at all. Unfortunately, the Import-Module cmdlet gives no indication that it failed to import the cmdlets within a module, even when called with the -Verbose parameter.
To change the set of permissions granted to a particular network location, use the caspol.exe utility to create a new code group for that location:
caspol.exe -machine -addgroup 1.2 -url "file://server/share/directory/*" FullTrust
The 1.2 in the above command refers to the LocalIntranet code group, which will be the new code group's parent. The following command will show which code groups are defined, and can be used to display the group that you created:
caspol.exe -machine -listgroups
Note that on 32-bit Windows caspol.exe is located in %WinDir%\Microsoft.NET\Framework\CLR_VERSION\ (where, for PowerShell 2.0, CLR_VERSION is v2.0.50727), and on 64-bit Windows another copy is located in %WinDir%\Microsoft.NET\Framework64\CLR_VERSION\. The 32- and 64-bit versions each have their own security configuration file (CONFIG\security.config), so you will need to make sure you apply each security change to both versions using their respective caspol.exe.
The following command can be used to display the permissions that will be granted to a particular assembly:
caspol.exe -resolveperm "//server/share/directory/assembly.dll"

I basically copied and pasted your code, did a Get-Module ClassLibrary2, and the TicketID field does work.
ModuleType Name ExportedCommands
---------- ---- ----------------
Binary ClassLibrary2 Get-RemedyXml
using System.Management.Automation;

namespace ClassLibrary1
{
    [Cmdlet(VerbsCommon.Get, "RemedyXml")]
    public class Class1 : PSCmdlet
    {
        [Parameter(Position = 0, Mandatory = true)]
        public string TicketID;

        protected override void ProcessRecord()
        {
            WriteObject(TicketID);
        }
    }
}

Related

Embedding build information in local and CI builds like in Gradle

My question does not target a problem; it is more a "Do you know something that...?" kind of question. All my applications are built and deployed using CI/CD with Azure DevOps. I like to have all build information handy in the created binary and to read it at runtime. The applications are mainly .NET Core 2 applications written in C#. I am using the default build system, MSBuild, supplied with the .NET Core SDK. The project should be buildable on Windows AND Linux.
Information I need:
GitCommitHash: string
GitCommitMessage: string
GitBranch: string
CiBuildNumber: string (only when built via CI, not locally)
IsCiBuild: bool (detection can work by checking for environment variables that are only available in CI builds)
Current approach:
In each project in the solution there is a class BuildConfig à la
public static class BuildConfig
{
    public const string BuildNumber = "#{Build.BuildNumber}#"; // These are the names of the variables within the CI system
    // and the remaining information...
}
Here tokens are used, which get replaced with the corresponding values during the CI build; an add-on task handles the replacement. Sadly, this only fills in the values for CI builds and not for local ones. When running locally and requesting the build information, it contains only the tokens, as they are not replaced during a local build.
It would be cool either to have BuildConfig.cs generated during the build or to have the values of the variables set during the local build (IntelliSense support would be very nice and would prevent some "BuildConfig class could not be found" errors). The values could be set by an MSBuild task (?). That would be one (or two) possible ways to solve this. Do you have ideas or experience regarding this? I did not find much during my research; I only stumbled over this question, which did not really help me, as I have zero experience with MSBuild tasks/customization.
Then I decided to have a look at build systems in general, namely FAKE and Cake. Cake has a Git add-in, but I did not find anything regarding code generation/manipulation. Do you know of some resources on that?
So here's the thing...
A short time ago I had to work with Android apps, namely Java and the Gradle build system, and I wanted to inject the build information there too during the CI build. After a short time I found a (in my opinion) better and more elegant solution, which was to modify the build script in the following way (the scripting language is Groovy, which is based on Java):
def getGitHash = { ->
    def stdout = new ByteArrayOutputStream()
    exec {
        commandLine 'git', 'rev-parse', '--short', 'HEAD'
        standardOutput = stdout
    }
    return stdout.toString().trim().replace("\"", "\\\"")
}

def getGitBranch = { ->
    def fromEnv = System.getenv("BUILD_SOURCEBRANCH")
    if (fromEnv) {
        return fromEnv.substring("refs/heads/".length()).replace("\"", "\\\"");
    } else {
        def stdout = new ByteArrayOutputStream()
        exec {
            commandLine 'git', 'rev-parse', '--abbrev-ref', 'HEAD'
            standardOutput = stdout
        }
        return stdout.toString().trim().replace("\"", "\\\"")
    }
}

def getIsCI = { ->
    return System.getenv("BUILD_BUILDNUMBER") != null;
}
// ...and the other functions work very similarly.

android {
    // ...
    buildConfigField "String", "GitHash", "\"${getGitHash()}\""
    buildConfigField "String", "GitBranch", "\"${getGitBranch()}\""
    buildConfigField "String", "BuildNumber", "\"${getBuildNumber()}\""
    buildConfigField "String", "GitMessage", "\"${getGitCommitMessage()}\""
    buildConfigField "boolean", "IsCIBuild", "${getIsCI()}"
    // ...
}
The result after the first build is the following Java code:
public final class BuildConfig {
    // Some other fields generated by default
    // Fields from default config.
    public static final String BuildNumber = "Local Build";
    public static final String GitBranch = "develop";
    public static final String GitHash = "6c87e82";
    public static final String GitMessage = "Merge branch 'hotfix/login-failed' into 'develop'";
    public static final boolean IsCIBuild = false;
}
Getting the required information is done by the build script itself, without depending on the CI engine. The class can be used after the first build; it is generated and stored in a "hidden" directory which is included in code analysis but excluded from your code in the IDE, and it is not pushed to Git, yet IntelliSense support is there. In a C# project this would be the obj/ folder, I guess. Accessing the information is very easy, as the values are constant and static (so no reflection or similar is required).
So here is the summarized question: "Do you know something to achieve this behaviour/mechanism in a .NET environment?"
Happy to discuss some ideas/approaches... :)
It becomes much easier if at runtime you are willing to use reflection to read assembly attribute values. For example:
using System.Linq;
using System.Reflection;

var assembly = Assembly.GetExecutingAssembly();
var descriptionAttribute = (AssemblyDescriptionAttribute)assembly
    .GetCustomAttributes(typeof(AssemblyDescriptionAttribute), false).FirstOrDefault();
var description = descriptionAttribute?.Description;
For most purposes, the performance impact of this approach can be satisfactorily addressed by caching the values so they only need to be read once.
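A minimal caching sketch for that, assuming the same attribute read as above (the class and field names here are illustrative):
using System;
using System.Reflection;

static class BuildInfo
{
    // Lazy<T> ensures the attribute is read from the assembly only once, on first access.
    private static readonly Lazy<string> _description = new Lazy<string>(() =>
        Assembly.GetExecutingAssembly()
            .GetCustomAttribute<AssemblyDescriptionAttribute>()?.Description);

    public static string Description => _description.Value;
}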
One way to embed the desired values in assembly attributes is to use the MSBuild WriteCodeFragment task to create a class file that sets assembly attributes to the values of project and/or environment variables. You would need to ensure that you do this in a Target that executes before compilation occurs (e.g. <Target BeforeTargets="CoreCompile" ...). You would also need to set the property <GenerateAssemblyInfo>false</GenerateAssemblyInfo> to avoid conflicting with the functionality referenced in the next option.
Alternatively, you may be able to leverage the plumbing in the dotnet SDK for including metadata in assemblies. It embeds the values of many of the same project variables documented for the NuGet Pack target. As implied above, this would require the GenerateAssemblyInfo property to be set to true.
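Whichever route you take, the embedded values can be read back at runtime through the corresponding attributes. Below is a hedged sketch that assumes the build emits AssemblyMetadataAttribute entries with keys such as "GitBranch"; the key names and helper class are illustrative, not part of any SDK default:
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

static class EmbeddedBuildInfo
{
    // Collects all [AssemblyMetadata(key, value)] attributes into a dictionary
    // (assumes each key is emitted only once).
    public static IReadOnlyDictionary<string, string> Read(Assembly assembly) =>
        assembly.GetCustomAttributes<AssemblyMetadataAttribute>()
                .ToDictionary(a => a.Key, a => a.Value);
}

// Usage (key names depend on what your build writes):
// var info = EmbeddedBuildInfo.Read(Assembly.GetExecutingAssembly());
// var branch = info.TryGetValue("GitBranch", out var b) ? b : "unknown";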
Finally, consider whether GitVersion would meet your needs.
Good luck!

PackageManager.FindPackageForUser(String, String) always returning null

Why is my following code example using this method returning null?
using Windows.Management.Deployment;
…
...
Windows.ApplicationModel.Package oPkg = oPkgManager.FindPackageForUser(string.Empty, "HoloCamera_1.0.0.5_neutral__cw5n1h2txyewy");
Remark: To test the FindPackageForUser(…) method, you first need to add the following references to your VS2017 project of any type (WinForms, WPF, etc.), as explained here:
C:\Program Files (x86)\Windows Kits\10\UnionMetadata\10.0.17763.0\Windows.winmd
C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework.NETCore\v4.5\System.Runtime.WindowsRuntime.dll
NOTE: First, using VS2017, I ran this sample code example for the FindPackages() method to find all the packages installed on my Windows 10 machine, and I found several packages that are installed on Windows by default. I tried the following two, but both of them return null on the above code line.
Following are two of the packages that the FindPackages() method returns; I tried both of them in my above code example:
1.
Name: HoloCamera
FullName: HoloCamera_1.0.0.5_neutral__cw5n1h2txyewy
Version: 1.0.0.5
Publisher: CN=Microsoft Windows, O=Microsoft Corporation, L=Redmond, S=Washington, C=US
PublisherId: cw5n1h2txyewy
IsFramework: False
And
2.
Name: DesktopLearning
FullName: DesktopLearning_1000.15063.0.0_neutral__cw5n1h2txyewy
Version: 1000.15063.0.0
Publisher: CN=Microsoft Windows, O=Microsoft Corporation, L=Redmond, S=Washington, C=US
PublisherId: cw5n1h2txyewy
IsFramework: False
There's no problem with the FindPackageForUser method. To make this method work successfully, you first need to run Visual Studio in Administrator mode (Start -> right-click your Visual Studio -> More -> Run as Administrator).
Then, when you call the FindPackageForUser method and pass string.Empty as the first parameter, a NULL return means that the package is not installed for the current user.
To validate this point, you can check the messages in the Output window when you call the FindPackages() method. Different packages can belong to different users with different user security IDs. You can use the DisplayPackageUsers method to see the user security ID, like the following:
private static void DisplayPackageUsers(Windows.Management.Deployment.PackageManager packageManager, Windows.ApplicationModel.Package package)
{
    IEnumerable<Windows.Management.Deployment.PackageUserInformation> packageUsers = packageManager.FindUsers(package.Id.FullName);
    Debug.Write("Users: ");
    foreach (var packageUser in packageUsers)
    {
        Debug.Write(string.Format("{0}, UserSecurityId: {1} ", SidToAccountName(packageUser.UserSecurityId), packageUser.UserSecurityId));
    }
    Debug.WriteLine("");
}

private static string SidToAccountName(string sidString)
{
    SecurityIdentifier sid = new SecurityIdentifier(sidString);
    try
    {
        NTAccount account = (NTAccount)sid.Translate(typeof(NTAccount));
        return account.ToString();
    }
    catch (IdentityNotMappedException)
    {
        return sidString;
    }
}
So, if you want to use the FindPackageForUser method to find a particular package, you need to pass a specific user security ID as the first parameter. You can get the appropriate user security ID from the methods above; then calling FindPackageForUser will return the package information successfully.
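A minimal sketch of that call, assuming you have already taken the SID from PackageUserInformation.UserSecurityId (the SID value below is a placeholder):
using Windows.ApplicationModel;
using Windows.Management.Deployment;

var packageManager = new PackageManager();

// Placeholder values: take the SID from PackageUserInformation.UserSecurityId
// and the full name from the FindPackages() output.
string userSecurityId = "S-1-5-21-...";
string packageFullName = "HoloCamera_1.0.0.5_neutral__cw5n1h2txyewy";

Package package = packageManager.FindPackageForUser(userSecurityId, packageFullName);
// A null return means that package is not installed for that user.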

c# Register custom dll as processor to Printer++ Config file

I am trying to create a custom printer driver to generate images. For this, I have installed Printer++, which converts print jobs to PostScript files. To convert the PostScript files to images, I am using Ghostscript. Independently, both processes run fine and I am able to achieve what is required.
But I need a custom processor to generate the images in one go. I followed the Printer++ tutorial, but it didn't work.
This is what I have done:
I installed Printer++ and gave the printer driver the name Septane.
In Visual Studio, I created a project named Test.
And put the following code in the Processor.cs class:
using System;
using System.Net.Mail;
using PrinterPlusPlusSDK;

namespace Test
{
    public class Processor : PrinterPlusPlusSDK.IProcessor
    {
        public PrinterPlusPlusSDK.ProcessResult Process(string key, string psFilename)
        {
            // Convert PS to PNG
            psFilename = "b.ps";
            MessageBox.Show("Rahul");
            ConvertPsToTxt(psFilename);
        }

        public static string ConvertPsToTxt(string psFilename, string txtFilename)
        {
            var retVal = string.Empty;
            var errorMessage = string.Empty;
            var command = "C:\\PrinterPlusPlus\\gs\\gswin64.exe";
            var args = string.Format("-dNOPAUSE -dBATCH -dFirstPage=1 -q -r300 -sDEVICE=png256 -sOutputFile=", psFilename, txtFilename);
            retVal = Shell.ExecuteShellCommand(command, args, ref errorMessage);
            return retVal;
        }
    }
}
This class inherits from PrinterPlusPlusSDK.IProcessor and implements the PrinterPlusPlusSDK.ProcessResult Process function. I have tested the standalone console project (without the PrinterPlusPlusSDK processor) and it converts PS to PNG successfully.
Now, as per the tutorial, the DLL needs to be deployed to Printer++ and registered as a processor. I copied Test.dll to the Printer++ installation folder and added an entry to the PrinterPlusPlus.exe.config file.
The config entry looks like:
<processors>
  <add key="Test"
       value="Test.Processor, Septane, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
</processors>
That's it. Now, when I print a file, it gives an error:
Proccessor not found for: Septane
What am I doing wrong?
If anyone has a better idea for achieving the same, please let me know. Mine is a commercial product, so I can't use CutePDF/VerPDf-style options.
Edit: I now know why I was getting the "Processor not found" error. I renamed my printer to Test and the error disappeared. I have edited my code accordingly.
For testing, I added a message box and expected a popup once I issue the print command, but that is not the case. The PS file is generated without error, and that's it: I never see the pop-up message and there is no converted PNG file. Can someone please help me resolve at least this issue? It doesn't seem to be picking up the DLL at all.
Thanks.
Remember: the printer name must be the same as your DLL name.
If your printer driver name is Septane, then you must create a project named "Septane"; in that case a project named "Test" will not work.

How to call Workstation.GetLocalWorkspaceInfo in a way that works for those who use VS2012 and those who use VS2013 without reflection?

Workstation.Current.GetLocalWorkspaceInfo(string) returns the WorkspaceInfo object associated with the given local directory.
So, I wrote a simple program that displays the workspace name given the local directory name.
It works great on my machine, but it does not work on another.
The difference between the two is that mine runs VS2012 and the other VS2013. For the life of me, I cannot understand it. Both workspaces are connected to the same TFS 2010 server.
After reading TFS API: GetLocalWorkspaceInfo always returns null, I replaced the Microsoft.TeamFoundation.XXX references with the ones found on the second machine and it worked again, but, of course, it then stopped working on my machine.
It cannot be the right way. I must be doing something wrong here.
I want a single executable to work for both machines without resorting to reflection. My question is simple - how?
The complete source code can be found here: https://bitbucket.org/markkharitonov/tfsinsanity/src. Basically, it is two projects sharing exactly the same source code but using different sets of dependencies.
The main source code is:
private static Workspace GetWorkspace(string wsRoot, string wsName, string wsOwner)
{
    var coll = new TfsTeamProjectCollection(new Uri("http://torsvtfs01:8080/tfs/DefaultCollection"));
    var vcs = (VersionControlServer)coll.GetService(typeof(VersionControlServer));
    WorkspaceInfo wsInfo;
    if (wsRoot == null)
    {
        Console.WriteLine(Workstation.Current.Name);
        wsInfo = Workstation.Current.GetLocalWorkspaceInfo(vcs, wsName, wsOwner);
        if (wsInfo == null)
        {
            throw new Exception(string.Format("Failed to identify the workspace {0};{1}", wsName, wsOwner));
        }
    }
    else
    {
        wsInfo = Workstation.Current.GetLocalWorkspaceInfo(wsRoot);
        if (wsInfo == null)
        {
            throw new Exception(string.Format("Failed to identify the workspace corresponding to \"{0}\"", wsRoot));
        }
    }
    return wsInfo.GetWorkspace(coll);
}
Here is how it works on my machine (VS2012):
PS C:\work\GetShelvedChangeset> tf workspaces
Collection: http://torsvtfs01:8080/tfs/defaultcollection
Workspace Owner Computer Comment
--------- -------------------- -------- ------------------------------------------------------------------------------------
CANWS212 DAYFORCE\mkharitonov CANWS212
PS C:\work\GetShelvedChangeset> .\bin\Debug2012\GetShelvedChangeset.exe --wsRoot C:\dayforce\SharpTop
Microsoft.TeamFoundation.VersionControl.Client, Version=11.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a
Workspace instance -784339741
Comment:
Computer: CANWS212
EffectivePermissions: 0
Folders: [0]
IsLocalWorkspace: False
LastAccessDate: 1/1/0001 12:00:00 AM
Name: CANWS212
Options: 0
OwnerAliases: [0]
OwnerDisplayName: DAYFORCE\mkharitonov
OwnerIdentifier:
OwnerIdentityType:
OwnerName: DAYFORCE\mkharitonov
OwnerUniqueName:
SecurityToken: /CANWS212;34be4ed8-c4fd-4e9f-bdae-d1843df36b0f
PS C:\work\GetShelvedChangeset> .\bin\Debug2013\GetShelvedChangeset.exe --wsRoot C:\dayforce\SharpTop
Failed to identify the workspace corresponding to "C:\dayforce\SharpTop"
PS C:\work\GetShelvedChangeset>
And on the other machine:
PS C:\tfs\DFGatedCheckInTest2\Build\2010\scripts> &"C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\tf.exe" workspaces
Collection: http://torsvtfs01:8080/tfs/defaultcollection
Workspace Owner Computer Comment
-------------------- ----------------- ------------ ----------------------------------------------------------------------------------------
1733_TORSVBUILD10 DAYFORCE\tfsbuild TORSVBUILD10 Workspace Created by Team Build
1846_91_TORSVBUILD10 DAYFORCE\tfsbuild TORSVBUILD10 Workspace Created by Team Build
1846_92_TORSVBUILD10 DAYFORCE\tfsbuild TORSVBUILD10 Workspace Created by Team Build
PS C:\tfs\DFGatedCheckInTest2\Build\2010\scripts> .\debug2012\GetShelvedChangeset.exe --wsRoot C:\tfs\DFGatedCheckInTest2
Failed to identify the workspace corresponding to "C:\tfs\DFGatedCheckInTest2"
PS C:\tfs\DFGatedCheckInTest2\Build\2010\scripts> .\debug2013\GetShelvedChangeset.exe --wsRoot C:\tfs\DFGatedCheckInTest2
Microsoft.TeamFoundation.VersionControl.Client, Version=12.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a
Workspace instance 215494889
Comment: Workspace Created by Team Build
Computer: TORSVBUILD10
EffectivePermissions: 0
Folders: [0]
IsLocalWorkspace: False
LastAccessDate: 1/1/0001 12:00:00 AM
Name: 1733_TORSVBUILD10
Options: 0
OwnerAliases: [0]
OwnerDisplayName: DAYFORCE\tfsbuild
OwnerIdentifier:
OwnerIdentityType:
OwnerName: DAYFORCE\tfsbuild
OwnerUniqueName:
SecurityToken: /1733_TORSVBUILD10;f2899138-af14-4449-9f6d-78a0fbccebb8
PS C:\tfs\DFGatedCheckInTest2\Build\2010\scripts>
In this case, the method signatures should be identical; you simply need to worry about getting the proper DLL loaded in the first place. You should be able to link against the newer DLL and use binding redirection to load the appropriate DLL for users who have VS 2012 installed.
We used this method successfully in a previous product to provide TFS 2005 and 2008 compatibility.
In short, you create a custom assembly resolver. Here we attempt to load all Microsoft.TeamFoundation.* DLLs from VS 2012 when we fail to load the VS 2013 versions that we were linked against:
AppDomain.CurrentDomain.AssemblyResolve += new ResolveEventHandler(ResolveAssembly);

public static Assembly ResolveAssembly(object sender, ResolveEventArgs args)
{
    String[] arguments = args.Name.Split(new string[] { ", ", "," },
                                         StringSplitOptions.RemoveEmptyEntries);
    String assemblyName = arguments[0];
    if (assemblyName.StartsWith("Microsoft.TeamFoundation.", StringComparison.CurrentCultureIgnoreCase))
    {
        return Assembly.Load(assemblyName + ", Version=11.0.0.0");
    }
    return null;
}
(Note that you should probably check the requested version number and pass the culture information to the loader, as outlined in the blog post, but this snippet should be sufficient to get you started.)
Add your project's referenced TFS assemblies to the executable, by setting the CopyLocal property to Always.

Current directory from a DLL invoked from PowerShell is wrong

I have a DLL with a static method that needs to know the current directory. I load the library
c:\temp> add-type -path "..."
...and call the method
c:\temp> [MyNamespace.MyClass]::MyMethod()
but both Directory.GetCurrentDirectory() and Environment.CurrentDirectory get the current directory wrong.
What is the correct way to do this?
There are two possible "directories" in PowerShell. One is the current directory of the process, available via Environment.CurrentDirectory or Directory.GetCurrentDirectory(). The other "directory" is the current location within the current PowerShell provider. This is what you see at the command line, and it is available via the Get-Location cmdlet. When you use Set-Location (alias cd), you are changing this internal path, not the process's current directory.
If you want a .NET library that uses the process's current directory to pick up the current PowerShell location, you need to set it explicitly:
[Environment]::CurrentDirectory = get-location
PowerShell has an extensible model that allows varying data sources to be mounted like drives in a file system; the file system is just one of many providers. You can see the other providers via Get-PSProvider. For example, the Registry provider allows the Windows registry to be navigated like a file system, and the Function provider lets you see all functions via dir function:.
If the command in your DLL inherits from System.Management.Automation.PSCmdlet, the current PowerShell location is available via SessionState.Path.
public class SomeCommand : PSCmdlet
{
    protected override void BeginProcessing()
    {
        string currentDir = this.SessionState.Path.CurrentLocation.Path;
    }
}
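As a related sketch (not part of the original answer), the same PathIntrinsics object can also turn a user-supplied relative path into a full provider path, so the cmdlet never has to rely on the process working directory; the cmdlet name here is hypothetical:
using System.Management.Automation;

[Cmdlet(VerbsCommon.Get, "ResolvedPath")] // hypothetical cmdlet, for illustration only
public class GetResolvedPathCommand : PSCmdlet
{
    [Parameter(Position = 0, Mandatory = true)]
    public string Path { get; set; }

    protected override void ProcessRecord()
    {
        // Resolves the path relative to the current PowerShell location,
        // even when the target does not exist yet.
        string fullPath = SessionState.Path.GetUnresolvedProviderPathFromPSPath(Path);
        WriteObject(fullPath);
    }
}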
To get the path without a session reference, the following code seems to work. I found this solution after going through the code that provides Git autocompletion in PowerShell (PS GIT Completions), specifically this function here.
using System.Collections.Generic;
using System.Linq;
using System.Management.Automation;

public class Test
{
    public static IEnumerable<string> GetPath()
    {
        using (var ps = PowerShell.Create(RunspaceMode.CurrentRunspace))
        {
            ps.AddScript("pwd");
            var path = ps.Invoke<PathInfo>();
            return path.Select(p => p.Path);
        }
    }
}
Output:
PS C:\some\folder> [Test]::GetPath()
C:\some\folder
