I'm trying to call the Get-VM cmdlet from C# on a Hyper-V host.
Obviously, the corresponding PowerShell module Hyper-V has to be imported first. The import fails, however, apparently because the module is supported only on PowerShell 3.0 (at least that's what I figure from this article). The PowerShell used by System.Management.Automation seems to be version 2.0, though.
using System;
using System.Collections;
using System.Management.Automation.Runspaces;

// Create a session state that imports the Hyper-V module, then open a runspace with it.
InitialSessionState iss = InitialSessionState.CreateDefault();
iss.ImportPSModule(new string[] { "Hyper-V" });
Runspace runSpace = RunspaceFactory.CreateRunspace(iss);
runSpace.Open();

// Dump any errors the import produced ($Error) to the console.
foreach (var err in (ArrayList)runSpace
    .SessionStateProxy.PSVariable.GetValue("Error"))
    Console.WriteLine(err.ToString());

runSpace.Close();
returns
The
'C:\Windows\system32\WindowsPowerShell\v1.0\Modules\Hyper-V\Hyper-V.psd1'
module cannot be imported because its manifest contains one or more members
that are not valid. The valid manifest members are ('ModuleToProcess', ...).
Remove the members that are not valid ('HelpInfoUri'),
then try to import the module again.
Is there a way to use a specific version of PowerShell in C#?
A colleague figured it out:
Apparently, .NET 4+ comes with an all-new common language runtime: CLR4.
This runtime uses its own assemblies loaded from a new assembly cache located at C:\Windows\Microsoft.NET\assembly.
The System.Management.Automation version 3.0.0.0, which will automatically use PowerShell 3.0, exists for the CLR4 only. Because I configured my application to run under .NET 3.5, it would use the old CLR2 and could not even see the newer assembly.
To let the application use CLR4 when available while still running on plain .NET 3.5, add this to the App.config file in the project folder:
<startup>
  <supportedRuntime version="v4.0"/>
  <supportedRuntime version="v2.0.50727"/>
</startup>
If CLR4 is available, it will load the corresponding GAC, find a policy file that redirects all references from System.Management.Automation version 1.0.0.0 to version 3.0.0.0, and the PowerShell modules work as expected.
If you only have .NET 3.5, the older version will be loaded; PowerShell still works, but only up to version 2.0.
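To double-check which engine actually gets loaded, something along these lines should work (a minimal sketch; under CLR2 it reports 2.0, under CLR4 with PowerShell 3.0 installed it reports 3.0):

using System;
using System.Collections;
using System.Management.Automation.Runspaces;

// Print the engine version of a freshly created runspace.
using (Runspace rs = RunspaceFactory.CreateRunspace())
{
    rs.Open();
    var versionTable = (Hashtable)rs.SessionStateProxy
        .PSVariable.GetValue("PSVersionTable");
    Console.WriteLine(versionTable["PSVersion"]);
}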
Have you looked at this yet?
http://code.msdn.microsoft.com/windowsdesktop/Windows-PowerShell-30-SDK-9a34641d
You might just need the new SDK to call PowerShell 3, even if PSv3 is already installed on your system, but I'm usually just a straight PowerShell guy.
The package we used before (Microsoft.TeamFoundationServer.ExtendedClient) doesn't support netstandard2.0, and it seems that the new REST APIs don't support the tasks we want to accomplish either.
So what's the recommended way to add new files as pending changes or check out existing files in TFVC when targeting netstandard2.0?
TF.exe should be capable of doing the aforementioned tasks, but to get the path to it we would somehow need vswhere.exe, so that whole approach seems a little cumbersome.
There is no .NET Core compatible version of the TFVC Client Object Model, and there won't be one. The API used to interact with TFVC is SOAP based and has never been ported to REST.
You can use vswhere to find tf.exe, or build a custom .NET 4.x executable using the Client Object Model and call it from your .NET Core app.
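If you go the custom .NET 4.x executable route, a rough sketch with the Client Object Model might look like this (the collection URL and local paths are placeholders, and it assumes the Microsoft.TeamFoundationServer.ExtendedClient package on full .NET Framework):

using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

// Pend an add and an edit (checkout) in a locally mapped TFVC workspace.
class PendChanges
{
    static void Main()
    {
        var tpc = new TfsTeamProjectCollection(new Uri("https://dev.azure.com/yourorg"));
        var vcs = tpc.GetService<VersionControlServer>();

        // Resolve the workspace that maps the local folder.
        var wsInfo = Workstation.Current.GetLocalWorkspaceInfo(@"C:\src\MyWorkspace");
        var workspace = vcs.GetWorkspace(wsInfo);

        workspace.PendAdd(@"C:\src\MyWorkspace\NewFile.cs");     // new file as a pending change
        workspace.PendEdit(@"C:\src\MyWorkspace\Existing.cs");   // check out an existing file
    }
}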
You're allowed to redistribute a copy of vswhere with your application. You can grab the latest version from here:
- powershell: |
    $vswhereLatest = "https://github.com/Microsoft/vswhere/releases/latest/download/vswhere.exe"
    $vswherePath = "$(Build.SourcesDirectory)\tools\vswhere.exe"
    # Remove any stale copy, then download the latest release and verify it landed.
    remove-item $vswherePath -ErrorAction SilentlyContinue
    invoke-webrequest $vswhereLatest -OutFile $vswherePath
    test-path $vswherePath -PathType Leaf
  displayName: 'Grab the latest version of vswhere.exe'
As you can see, I run this in my Azure Pipeline to inject the latest version into my build directory prior to packaging my app.
And this is the code I use to find tf.exe afterwards:
$path = & $PSScriptRoot/vswhere.exe -latest -products * `
    -requires Microsoft.VisualStudio.TeamExplorer `
    -find "Common7\IDE\CommonExtensions\Microsoft\TeamFoundation\Team Explorer\TF.exe"
Are there any restrictions on the packages you can use with Azure Functions? I have researched as much as I can and it doesn't seem so; however, when I try creating an Azure Function that references the package "Microsoft.ServiceFabric" I get the following error:
System.Private.CoreLib: Exception while executing function:
ScaleDownServiceFabrics. FunctionApp2: Could not load file or assembly
'System.Fabric, Version=6.0.0.0, Culture=neutral,
PublicKeyToken=31bf3856ad364e35'. Could not find or load a specific
file. (Exception from HRESULT: 0x80131621). System.Private.CoreLib:
Could not load file or assembly 'System.Fabric, Version=6.0.0.0,
Culture=neutral, PublicKeyToken=31bf3856ad364e35'.
I have tried both Azure Functions v1 and v2, and .NET Framework and .NET Core, with no luck.
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
using System.Fabric;

namespace FunctionApp5
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static void Run([TimerTrigger("*/5 * * * * *")]TimerInfo myTimer, ILogger log)
        {
            log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
            FabricClient client = new FabricClient();
        }
    }
}
Is this possible, or is it a limitation of Azure Functions in Visual Studio? If so, what packages are acceptable?
Service Fabric packages are x64 only; if your Function targets 32-bit, it will fail. Try the solution proposed by Jerry Liu.
- Service Fabric packages have to be added as NuGet packages instead of referencing the DLLs directly in the project, because of the dependencies on other libraries. You should add the NuGet package Microsoft.ServiceFabric.
- The latest Microsoft.ServiceFabric version, 6.3.x, targets .NET Standard 2.0 and .NET Framework 4.5 through 4.7.1; make sure your project uses one of these.
- Make sure the Microsoft.ServiceFabric DLLs are being copied to the bin folder when built/deployed.
- When you use FabricClient outside the cluster, you have to specify the settings and credentials, otherwise you won't be able to connect to the cluster. See this example and these docs.
- FabricClient uses the Service Fabric API to interact with the cluster. If you are having issues with the packages, another option is to use HttpClient, make the requests to the REST API directly, and avoid the package conflicts; see the sketch below.
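For example, a rough sketch of the HttpClient route (the cluster endpoint, the default gateway port 19080 and the api-version are assumptions; a secured cluster additionally needs certificate or AAD authentication):

using System;
using System.Net.Http;
using System.Threading.Tasks;

// Query the cluster through the Service Fabric HTTP gateway instead of FabricClient.
public static class ClusterQuery
{
    private static readonly HttpClient http = new HttpClient
    {
        BaseAddress = new Uri("http://mycluster.westeurope.cloudapp.azure.com:19080/")
    };

    public static Task<string> GetNodesAsync()
    {
        // List the cluster's nodes via the REST API; no System.Fabric dependency needed.
        return http.GetStringAsync("Nodes?api-version=6.0");
    }
}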
Diego and Suraj have pointed out the cause: a conflict between 64-bit and 32-bit.
Two points to fix:
- Set the build platform to x64, as you have done.
- Get the x64 Functions runtime. Functions run on the Functions runtime (contained in Azure Functions Core Tools), but the version downloaded by VS defaults to x86.
An easy way to get the x64 version is to install Azure Functions Core Tools from npm using Node.js.
After installing Node.js, run npm i -g azure-functions-core-tools --unsafe-perm true in cmd to get the Core Tools.
Then set the project debug properties (right-click on your project > Properties > Debug blade):
- Set Launch type to Executable.
- Set Executable path to %appdata%\npm\node_modules\azure-functions-core-tools\bin\func.exe.
- Set Application arguments to start.
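To confirm which bitness your code actually ends up running under, a quick check you can drop into the Run method from the question is:

// Logs true when the Functions host runs the code as a 64-bit process.
log.LogInformation($"64-bit process: {Environment.Is64BitProcess}");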
I ran into exactly the same issue as #tank140 commented on the original post:
Unable to load DLL 'FabricClient.dll' or one of its dependencies: The specified module could not be found. (Exception from HRESULT: 0x8007007E)
More details are in another question that I filed on the issue. In the answer it was confirmed that the SF client API for .NET requires the SF runtime to be installed on the platform, which is not supported in Azure Functions.
In my case I just updated the Azure platform configuration to 64-bit; I was using a .NET Core 3.1 function app on the Windows platform.
I'm trying to use Gst# (which is the C# bindings for GStreamer) on Mac, but autotools keeps failing with these last few lines:
checking whether to build shared libraries... yes
checking whether to build static libraries... yes
checking for pkg-config... /opt/local/bin/pkg-config
checking pkg-config is at least version 0.9.0... dyld: Library not loaded: /opt/local/lib/libiconv.2.dylib
Referenced from: /opt/local/bin/pkg-config
Reason: Incompatible library version: pkg-config requires version 8.0.0 or later, but libiconv.2.dylib provides version 7.0.0
./configure: line 11558: 15968 Trace/BPT trap: 5 $PKG_CONFIG --atleast-pkgconfig-version $_pkg_min_version
no
checking for MONO_DEPENDENCY... no
checking for csc.exe... no
configure: error: You need to install either mono or .Net
Saying that Mono isn't installed is nonsense; how can I fix this and get it to succeed? If it helps, I installed Mono through the Mac installer, not MacPorts, though MacPorts is installed.
Your mono installation is not in $PKG_CONFIG_PATH, and that's why pkg-config cannot find the .pc files for mono.
You need to add the directory to your $PKG_CONFIG_PATH environment variable. The directory should be at /Library/Frameworks/Mono.framework/Versions/VERSIONNUMBER/lib/pkgconfig/
The error is explained quite clearly:
Reason: Incompatible library version: pkg-config requires version 8.0.0 or later, but libiconv.2.dylib provides version 7.0.0
In order to update to a newer version I would suggest trying this in Terminal:
sudo port -n upgrade --force libiconv
More information here.
We've created a C# class library assembly and made it COM visible to be able to call its methods from PHP. This used to work fine, but now we wanted to install it on a Windows Server 2008 server and we keep running into the error "Class not registered".
To rule out any dependency problems I made a tiny test class library in C#. The class library is built for Any CPU and it is COM visible (I also set ComVisible to true in AssemblyInfo.cs). The test class library contains only one class with one method. The class is called TestLib and the namespace is also called TestLib. The method is called Test and only returns a string.
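The whole library boils down to something like this (a minimal sketch; only the names are fixed, the ProgId and ClassInterface attributes are illustrative choices):

using System.Runtime.InteropServices;

namespace TestLib
{
    // COM-visible test class; the ProgId matches what the PHP script instantiates.
    [ComVisible(true)]
    [ClassInterface(ClassInterfaceType.AutoDual)]
    [ProgId("TestLib.TestLib")]
    public class TestLib
    {
        public string Test()
        {
            return "Hello from TestLib";
        }
    }
}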
What we have done is the following:
- built the TestLib.dll
- copied it to the Windows Server 2008 machine
- registered the dll with: regasm /codebase TestLib.dll
- the regasm tool returns a success message
- in PHP we simply try to create a new COM instance:
try
{
    $test = new COM("TestLib.TestLib");
}
catch (Exception $e)
{
    die($e->getMessage());
}
When we call this test script from either the browser or the command line (php -f test.php), we get the error "Class not registered" in both cases.
I also tried adding TestLib to the GAC by using gacutil -i, but to no avail; still the "Class not registered" error.
Then I tried compiling the test library with .NET 2.0 instead of 4.0 as the target framework, same result. The .NET Framework 4.0 is installed on the server, by the way.
Any ideas?
Okay, so after some more research I figured it out. The php.exe process is 32 bit. The COM visible assembly is compiled for Any CPU so it should be accessible to both 32 and 64 bit applications.
The problem is that on a 64 bit OS php.exe, and any 32 bit process for that matter, searches in HKEY_CLASSES_ROOT\Wow6432Node\CLSID instead of HKEY_CLASSES_ROOT\CLSID and in HKEY_LOCAL_MACHINE\Software\Classes\Wow6432Node\CLSID instead of HKEY_LOCAL_MACHINE\Software\Classes\CLSID. The registry entries in the Wow6432 keys aren't created by regasm that is shipped with .NET framework v4 on Windows Server 2008. On Windows 7 they are created, don't ask me why.
It also turned out that if I create a little test assembly for .NET v2.0 and register it with the regasm that ships with .NET Framework v2.0, it does create the Wow6432Node entries on Windows Server 2008. Strange.
So my solution is to create a basic registry file on the server using:
regasm /regfile MyClassLib.dll
This creates a file MyClassLib.reg with only the 'normal' 64-bit entries. Then I exported the Wow6432Node keys from a Windows 7 machine and added them to that .reg file. Now when I import that .reg file into the registry on Windows 2008, everything works fine.
For more info on the Wow6432Node entries check out: http://msdn.microsoft.com/en-us/library/windows/desktop/ms724072%28v=vs.85%29.aspx
Hope this saves someone else some time and headaches.
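An alternative sketch, assuming you can run a small helper compiled as x86 on the server: registering from a 32-bit process writes to the 32-bit registry view (Wow6432Node), using the same RegistrationServices API that regasm uses internally. The path below is a placeholder.

using System;
using System.Reflection;
using System.Runtime.InteropServices;

// Hypothetical x86 helper that does what "regasm /codebase" does, but from a
// 32-bit process so the entries land under Wow6432Node.
class Register32
{
    static void Main()
    {
        Assembly asm = Assembly.LoadFrom(@"C:\path\to\TestLib.dll");  // placeholder path
        var reg = new RegistrationServices();
        bool ok = reg.RegisterAssembly(asm, AssemblyRegistrationFlags.SetCodeBase);
        Console.WriteLine(ok ? "Registered." : "Nothing to register.");
    }
}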
If you are trying to call a 32-bit COM DLL on 64-bit Windows, you will need to register it.
Copy your 32-bit DLL to C:\Windows\sysWOW64
Run C:\Windows\sysWOW64\regsvr32.exe your_com_32.dll
A bit more info with screenshots.
Is there a way to make a simple installer that includes the necessary runtimes and dependency packages, and creates an icon in the openSUSE menu, so the application will "just work"?
The actual application is just an executable (.EXE) and a handful of support files (mostly XML and CSV).
I already have the application successfully building and executing in MonoDevelop 2.0.
I originally tried to install the Mono Runtime via zypper from the following repository http://ftp.novell.com/pub/mono/download-stable/openSUSE_11.0, but quickly got bogged down in missing package dependencies.
This is what happens when I try to use zypper to install the Mono runtime:
linux-lkfu:~ # zypper addrepo http://ftp.novell.com/pub/mono/download-stable/SLE_11 mono-stable
Adding repository 'mono-stable' [done]
Repository 'mono-stable' successfully added
Enabled: Yes
Autorefresh: No
URI: http://ftp.novell.com/pub/mono/download-stable/SLE_11
linux-lkfu:~ # zypper refresh --repo mono-stable
Retrieving repository 'mono-stable' metadata [done]
Building repository 'mono-stable' cache [done]
Specified repositories have been refreshed.
linux-lkfu:~ # zypper dist-upgrade --repo mono-stable
Loading repository data...
Reading installed packages...
Computing distribution upgrade...
Nothing to do.
linux-lkfu:~ #
Notice the last line before the prompt that says "Nothing to do." I don't think it's doing anything.
One option is to buy mono support for SLES 11 from Novell. That will grant you a tested, supported and working mono repository.
If not you have to use the SLES repositories. For mono it's http://ftp.novell.com/pub/mono/download-stable/SLE_11
This should do it:
zypper addrepo http://ftp.novell.com/pub/mono/download-stable/SLE_11 mono-stable
zypper refresh --repo mono-stable
zypper dist-upgrade --repo mono-stable
If you have Visual Studio, you can try Mono Tools for Visual Studio, which provides tooling to make this easy. It is fully functional for 30 days.
http://go-mono.com/monotools/
That depends on how you plan to package them up. You can manually create a native package (a deb for Debian and Ubuntu, an rpm for Fedora and openSUSE, or anything else).
Personally, I think at first it is acceptable to simply add a .sh script to your ZIP package and ask end users to execute it for dependency checking, installation, and so on.
Later you can learn about the proper packaging methods.