I have a console application that takes several arguments and options, so I would like to use a free third-party library to parse them.
I have found two libraries for this purpose: NDesk.Options and Command Line Parser Library.
I have decided to use Command Line Parser Library because its property-based approach is clearer, so I downloaded it and added a reference to it.
The problem is that when I add the reference to my .NET Framework 3.5 project I get a warning icon. The page where I downloaded it says compatibility is .NET Framework 3.5+, so I understand 3.5 is compatible; am I right? If not, which earlier version of the library is compatible with .NET Framework 3.5?
You can also use the new Microsoft CommandLineUtils library.
The NuGet package is here, but it only targets .NET Core or .NET Framework 4.5.2.
However, you can download the source code (only 7 files) and include it in your project. For .NET Framework 3.5, there are only two compilation errors to fix: remove one extra method (which uses Tasks) and remove one line (in HandleUnexpectedArg).
Nuget: https://www.nuget.org/packages/Microsoft.Extensions.CommandLineUtils
Source: https://github.com/aspnet/Common/tree/dev/shared/Microsoft.Extensions.CommandLineUtils.Sources
Here is a first sample showing how to use this library:
static void Main(string[] args)
{
    var cmd = new CommandLineApplication();
    var argAdd = cmd.Option("-a | --add <value>", "Add a new item", CommandOptionType.SingleValue);
    cmd.OnExecute(() =>
    {
        Console.WriteLine(argAdd.Value());
        return 0;
    });
    cmd.HelpOption("-? | -h | --help");
    cmd.Execute(args);
}
I recommend FluentArgs (see: https://github.com/kutoga/FluentArgs). I think it is very easy to use:
namespace Example
{
    using System;
    using System.Threading.Tasks;
    using FluentArgs;

    public static class Program
    {
        public static Task Main(string[] args)
        {
            return FluentArgsBuilder.New()
                .DefaultConfigsWithAppDescription("An app to convert png files to jpg files.")
                .Parameter("-i", "--input")
                    .WithDescription("Input png file")
                    .WithExamples("input.png")
                    .IsRequired()
                .Parameter("-o", "--output")
                    .WithDescription("Output jpg file")
                    .WithExamples("output.jpg")
                    .IsRequired()
                .Parameter<ushort>("-q", "--quality")
                    .WithDescription("Quality of the conversion")
                    .WithValidation(n => n >= 0 && n <= 100)
                    .IsOptionalWithDefault(50)
                .Call(quality => outputFile => inputFile =>
                {
                    /* ... */
                    Console.WriteLine($"Convert {inputFile} to {outputFile} with quality {quality}...");
                    /* ... */
                    return Task.CompletedTask;
                })
                .ParseAsync(args);
        }
    }
}
There are many other examples on the github page.
McMaster.Extensions.CommandLineUtils is the best command-line parser for C# that I've used. I especially like that it supports subcommands well.
Source code is here: https://github.com/natemcmaster/CommandLineUtils
dotnet add package McMaster.Extensions.CommandLineUtils
This is a simple example of how to use it using attributes:
using System;
using McMaster.Extensions.CommandLineUtils;

public class Program
{
    public static int Main(string[] args)
        => CommandLineApplication.Execute<Program>(args);

    [Option(Description = "The subject")]
    public string Subject { get; } = "world";

    [Option(ShortName = "n")]
    public int Count { get; } = 1;

    private void OnExecute()
    {
        for (var i = 0; i < Count; i++)
        {
            Console.WriteLine($"Hello {Subject}!");
        }
    }
}
Or you could use a builder:
using System;
using McMaster.Extensions.CommandLineUtils;

var app = new CommandLineApplication();
app.HelpOption();

var subject = app.Option("-s|--subject <SUBJECT>", "The subject", CommandOptionType.SingleValue);
subject.DefaultValue = "world";

var repeat = app.Option<int>("-n|--count <N>", "Repeat", CommandOptionType.SingleValue);
repeat.DefaultValue = 1;

app.OnExecute(() =>
{
    for (var i = 0; i < repeat.ParsedValue; i++)
    {
        Console.WriteLine($"Hello {subject.Value()}!");
    }
});

return app.Execute(args);
Microsoft has also been working on a command-line parser: https://github.com/dotnet/command-line-api but it has been in preview for ages.
System.CommandLine might do the trick. Though as of November 2022 it is still in beta.
I suppose the .NET team is going to include it in some upcoming .NET framework release.
https://github.com/dotnet/runtime/issues/68578
https://www.nuget.org/packages/System.CommandLine
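For reference, a minimal sketch of what the current 2.0 beta of System.CommandLine looks like (the --count option is just an example, and the API, e.g. SetHandler, may still change before a stable release):
using System;
using System.CommandLine;
using System.Threading.Tasks;

class Program
{
    static async Task<int> Main(string[] args)
    {
        // "--count" is only an illustrative option name.
        var countOption = new Option<int>("--count", "Number of greetings");
        countOption.SetDefaultValue(1);

        var rootCommand = new RootCommand("Sample command-line app");
        rootCommand.AddOption(countOption);

        rootCommand.SetHandler((int count) =>
        {
            for (var i = 0; i < count; i++)
                Console.WriteLine("Hello!");
        }, countOption);

        return await rootCommand.InvokeAsync(args);
    }
}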
If you're looking for a third-party library to help you parse command-line arguments and options in C#, you might want to check out the TreeBasedCli library. It is a C# library designed to simplify the process of creating command-line interfaces (CLIs) with nested subcommands, and offers a number of benefits for both developers and users.
One of the key features of TreeBasedCli is its modular structure, which allows you to easily organize and structure your CLI's functionality using leaf and branch commands. Leaf commands represent specific actions that can be performed, and are implemented as individual classes with their own command definition, input parser, and asynchronous handler. Branch commands, on the other hand, represent a group of subcommands and do not have an associated action. This allows you to easily create complex CLIs with multiple levels of nesting.
Another benefit of TreeBasedCli is its support for asynchronous command execution. It also includes a lightweight Dependency Injection (DI) interface, allowing you to use your preferred method of DI type resolution.
public class CreateCatCommand :
    LeafCommand<
        CreateCatCommand.Arguments,
        CreateCatCommand.Parser,
        CreateCatCommand.Handler>
{
    private const string NameLabel = "--name";

    public CreateCatCommand() : base(
        label: "create-cat",
        description: new[]
        {
            "Prints out a cat."
        },
        options: new[]
        {
            new CommandOption(
                label: NameLabel,
                description: new[]
                {
                    "Required. The name of the cat to print."
                }
            ),
        })
    { }

    public record Arguments(string CatName) : IParsedCommandArguments;

    public class Parser : ICommandArgumentParser<Arguments>
    {
        public IParseResult<Arguments> Parse(CommandArguments arguments)
        {
            string name = arguments.GetArgument(NameLabel).ExpectedAsSingleValue();

            var result = new Arguments(
                CatName: name
            );

            return new SuccessfulParseResult<Arguments>(result);
        }
    }

    public class Handler : ILeafCommandHandler<Arguments>
    {
        private readonly IUserInterface userInterface;

        public Handler(IUserInterface userInterface)
        {
            this.userInterface = userInterface;
        }

        public Task HandleAsync(Arguments arguments, LeafCommand _)
        {
            this.userInterface.WriteLine($"I am a cat 😸 with the name {arguments.CatName}!");
            return Task.CompletedTask;
        }
    }
}
I want to load a class from a .cs file and use it in other code.
Assume I have a .cs file which contains code like that:
//some imports
public class Commands
{
//a lot of commands
}
What I am trying to do is load this class from a file, using CSharpCodeProvider or whatever works, and create a list of Commands.
A piece of code from a console app:
List<Commands> lst;
The question is: how can I load the Commands class dynamically (at runtime, without restarting the console app or opening Visual Studio) and create the list of Commands?
Try this example, which I have put together and tested:
Build program.cs as a .NET Framework console app, e.g. in Visual Studio.
// program.cs
using System;
using System.IO;
using System.CodeDom.Compiler;
using System.Reflection;
namespace RuntimeCompile
{
class Program
{
static void Main(string[] args)
{
// Get a path to the file(s) to compile.
FileInfo sourceFile = new FileInfo("mySource.cs");
Console.WriteLine("Loading file: " + sourceFile.Exists);
// Prepare a file path for the compiled library.
string outputName = string.Format(@"{0}\{1}.dll",
    Environment.CurrentDirectory,
    Path.GetFileNameWithoutExtension(sourceFile.Name));
// Compile the code as a dynamic-link library.
bool success = Compile(sourceFile, new CompilerParameters()
{
GenerateExecutable = false, // compile as library (dll)
OutputAssembly = outputName,
GenerateInMemory = false, // as a physical file
});
if (success)
{
// Load the compiled library.
Assembly assembly = Assembly.LoadFrom(outputName);
// Now, since we didn't have reference to the library when building
// the RuntimeCompile program, we can use reflection to create
// and use the dynamically created objects.
Type commandType = assembly.GetType("Command");
// Create an instance of the loaded class from its type information.
object commandInstance = Activator.CreateInstance(commandType);
// Invoke the method by name.
MethodInfo sayHelloMethod = commandType.GetMethod("SayHello", BindingFlags.Public | BindingFlags.Instance);
sayHelloMethod.Invoke(commandInstance, null); // no arguments, no return type
}
Console.WriteLine("Press any key to exit...");
Console.Read();
}
private static bool Compile(FileInfo sourceFile, CompilerParameters options)
{
CodeDomProvider provider = CodeDomProvider.CreateProvider("CSharp");
CompilerResults results = provider.CompileAssemblyFromFile(options, sourceFile.FullName);
if (results.Errors.Count > 0)
{
Console.WriteLine("Errors building {0} into {1}", sourceFile.Name, results.PathToAssembly);
foreach (CompilerError error in results.Errors)
{
Console.WriteLine(" {0}", error.ToString());
Console.WriteLine();
}
return false;
}
else
{
Console.WriteLine("Source {0} built into {1} successfully.", sourceFile.Name, results.PathToAssembly);
return true;
}
}
}
}
In the output directory (bin), next to the console app executable place a text file named mySource.cs with this content:
// mySource.cs
using System;
internal class Program
{
static void Main()
{
Console.WriteLine("Hello from mySource!");
Console.ReadLine();
}
}
public class Command
{
public void SayHello()
{
Console.WriteLine("Hello (Command)");
}
}
Then run the first console app and observe its output. It should log "Hello (Command)", showing that the code was correctly compiled, loaded and executed.
The example shows how to use the System.CodeDom.Compiler API to compile a .cs file at runtime and then load it as a DLL to run code within it. Be aware that almost no error handling was implemented.
This should answer the question, but there may still be better approaches for your use case. For plugin loading it makes sense to define interfaces in a shared assembly referenced by both sides, to avoid the use of reflection, as sketched below.
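For instance, a minimal sketch of that interface-based approach (the ICommand interface, the Shared.dll name and the PluginLoader class are hypothetical, not part of the code above):
using System;
using System.Linq;
using System.Reflection;

// In a shared assembly (e.g. Shared.dll) referenced by both the host
// and the dynamically compiled code:
public interface ICommand
{
    void Execute();
}

// In the host application:
static class PluginLoader
{
    public static void RunAll(string assemblyPath)
    {
        Assembly assembly = Assembly.LoadFrom(assemblyPath);

        // Find every concrete type implementing the shared interface.
        var commandTypes = assembly.GetTypes()
            .Where(t => typeof(ICommand).IsAssignableFrom(t) && !t.IsAbstract);

        foreach (Type type in commandTypes)
        {
            var command = (ICommand)Activator.CreateInstance(type);
            command.Execute(); // no per-method reflection needed
        }
    }
}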
There is probably a better way to achieve your overall goal, like dependency injection.
However, you can do it with the ICodeCompiler.
See this article https://support.microsoft.com/en-ca/help/304655/how-to-programmatically-compile-code-using-c-compiler
To load the C# class from another C# class you need to use a using directive:
using Commands;
public class class1
{
    private List<Commands> lst;
    //...
}
I tried a simple test, but it didn't like out variables.
As a simple test I wrote this (perhaps there is something simple wrong with it, but I also had trouble with patterns and with tuples):
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace ConsoleApplication2
{
public class Program
{
static void Main(string[] args)
{
Runner runner = new ConsoleApplication2.Runner();
Point p = new ConsoleApplication2.Point();
runner.PrintCoordinates(p);
}
}
public class Point
{
int x = 20;
int y = 50;
public void GetCoordinates(out int a, out int b)
{
a = x;
b = y;
}
}
public class Runner
{
public void PrintCoordinates(Point p)
{
p.GetCoordinates(out int x, out int y);
Console.WriteLine($"({x}, {y})"); // x does not exist in current context
}
}
}
According to this post, where the PrintCoordinates example method comes from:
Note: In Preview 4, the scope rules are more restrictive: Out variables are scoped to the statement they are declared in. Thus, the above example will not work until a later release.
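In the meantime, a simple workaround (plain pre-C# 7 code, shown only for illustration) is to declare the variables before the call:
public void PrintCoordinates(Point p)
{
    int x, y;                       // declared up front instead of inline
    p.GetCoordinates(out x, out y);
    Console.WriteLine($"({x}, {y})");
}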
The new tuples suffer from a similar problem, though it seems you can partially work around that with a NuGet download:
Note: Tuples rely on a set of underlying types that aren't included in Preview 4. To make the feature work, you can easily get them via NuGet:
Right-click the project in the Solution Explorer and select "Manage NuGet Packages…"
Select the "Browse" tab, check "Include prerelease" and select "nuget.org" as the "Package source"
Search for "System.ValueTuple" and install it.
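Once System.ValueTuple is installed, a tuple-based version of the same example should compile; this is only a hedged sketch of what that could look like:
public class Point
{
    int x = 20;
    int y = 50;

    // Returns both values as a named tuple instead of using out parameters.
    public (int X, int Y) GetCoordinates() => (x, y);
}

public class Runner
{
    public void PrintCoordinates(Point p)
    {
        var coords = p.GetCoordinates();
        Console.WriteLine($"({coords.X}, {coords.Y})");
    }
}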
I am currently reading the "500 Lines or Less" book, specifically the chapter on creating a template engine by Ned Batchelder.
Their example uses Python. In their template engine they build code as a string and then call exec (docs) to evaluate the string as Python code.
def get_globals(self):
    """Execute the code, and return a dict of globals it defines."""
    # A check that the caller really finished all the blocks they started.
    assert self.indent_level == 0
    # Get the Python source as a single string.
    python_source = str(self)
    # Execute the source, defining globals, and return them.
    global_namespace = {}
    exec(python_source, global_namespace)
    return global_namespace
This is very convenient, because they can easily evaluate expressions in the template such as {{object.property.property}}.
With C# as my main programming language, I am wondering how this can be achieved (in the context of building a template engine as in the book)?
Research and thoughts
First I don't believe there is an exec equivalent in C#.
One way I can think of is to recursively use reflection to get the list of properties of an object (with null-reference checks), but I don't like this from a performance point of view.
Another way is to use Roslyn's ScriptEngine class (which I haven't used, so correct me if I am wrong). But I am afraid this won't be a good fit, because this is supposed to be a library and it wouldn't be usable with older versions of C# and .NET. Example
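For what it's worth, the Roslyn scripting approach mentioned above looks roughly like this (it needs the Microsoft.CodeAnalysis.CSharp.Scripting package, and the Person/Globals types are just placeholders for this sketch):
using System;
using System.Threading.Tasks;
using Microsoft.CodeAnalysis.CSharp.Scripting;

public class Person
{
    public string Name { get; set; }
}

public class Globals
{
    public Person Person { get; set; }
}

public static class ScriptDemo
{
    public static async Task Main()
    {
        // Evaluates the expression against the members of the Globals instance,
        // much like exec/eval with a supplied namespace in Python.
        object result = await CSharpScript.EvaluateAsync<object>(
            "Person.Name",
            globals: new Globals { Person = new Person { Name = "Ada" } });

        Console.WriteLine(result); // prints "Ada"
    }
}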
Q: First I don't believe there is an exec equivalent in C#.
A: As for compiling C# code, the CS-Script library can be used to achieve this in various ways.
For example:
dynamic script = CSScript.Evaluator
    .LoadCode(@"using System;
                using Your.Custom.Relevant.Namespace;
                public class Executer
                {
                    public object Execute()
                    {
                        return SomeStaticClass.array[123];
                    }
                }");
int result = script.Execute();

// shorter way
int a = (int)CSScript.Evaluator.Evaluate("some.namespace.SomeStaticClass.array[123]");
Read more here: http://www.csscript.net/
CS-Script isn't made for templating, unless you create that part yourself by manipulating the strings before you compile them, for example:
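(A rough sketch of that idea; the {{expression}} placeholder syntax and the substituted expression are made up for illustration.)
// Template with a placeholder that is replaced before compilation.
string template = @"using System;
public class Executer
{
    public object Execute()
    {
        return {{expression}};
    }
}";

string source = template.Replace("{{expression}}", "DateTime.Now.Year");

dynamic script = CSScript.Evaluator.LoadCode(source);
object result = script.Execute();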
But how can I pass some Context for the template engine
You can pass a context into a function like this:
dynamic script = CSScript.Evaluator
    .LoadCode(@"
        using System;
        using Namespace.Of.The.Context;
        public class Executer
        {
            public string Execute(Context ctx)
            {
                return ctx.Person.Firstname + ctx.Person.Lastname;
            }
        }");
string result = script.Execute(new Context(new Person("Rick", "Roll")));
Q: Can I call CSScript from a normal C# application, let's say a web app?
A: Yes.
CS-Script currently targets the Microsoft implementation of the CLR (.NET 2.0/3.0/3.5/4.0/4.5) with full support on Mono.
Basically, if it runs C#, the code is compiled against the .NET Framework version the library is executed on, so if your project runs on .NET 4.5, any feature of that version is available, including any external references in your project.
You can use Microsoft.CSharp.CSharpCodeProvider in order to compile code on the fly.
https://msdn.microsoft.com/en-us/library/microsoft.csharp.csharpcodeprovider.aspx
Like this:
// Requires: using System.Collections.Generic; using System.CodeDom.Compiler; using Microsoft.CSharp;
static void Main(string[] args)
{
    string source =
        @"
        namespace Test
        {
            public class Test
            {
                public void HelloWorld()
                {
                    System.Console.WriteLine(""Hello World"");
                }
            }
        }
        ";
    var options = new Dictionary<string, string> { { "CompilerVersion", "v3.5" } };
    var provider = new CSharpCodeProvider(options);
    var compilerParams = new CompilerParameters { GenerateInMemory = true, GenerateExecutable = false };
    var results = provider.CompileAssemblyFromSource(compilerParams, source);
    var instance = results.CompiledAssembly.CreateInstance("Test.Test");
    var methodInfo = instance.GetType().GetMethod("HelloWorld");
    methodInfo.Invoke(instance, null);
}
I am new here and I hope that I will find a solution to my problem. The background of the problem is as follows:
I am trying to build an expert system consisting of a C# front end that interacts with SWI-Prolog.
I have downloaded SwiPlCs.dll (a C# class library to connect .NET languages with SWI-Prolog)
and added a reference to it in a Visual Studio project (a WinForms app) that I created to test whether I can query Prolog from C# (I followed the example used in the documentation found here).
It worked fine.
Then, in a more complicated scenario, I built a WCF service that acts as an intermediary layer between SWI-Prolog and the C# client application (which consumes the service).
The service is hosted in IIS 7.0.
For the sake of simplicity, let's say my service contains three methods.
The first method initializes the prolog engine, consults prolog source file then queries the file.
The second method performs another query.
The third method calls PlCleanup().
Method#1:
public void LaunchAssessment()
{
Dictionary<string, string> questions = new Dictionary<string, string>();
#region : Querying prolog using SwiPlCs
try
{
if (!PlEngine.IsInitialized)
{
String[] param = { "-q" };
PlEngine.Initialize(param);
PlQuery.PlCall("consult('D:/My FYP Work/initialAssessment')");
using (var q = new PlQuery("go(X, Y)"))
{
foreach (PlQueryVariables v in q.SolutionVariables)
{
questions.Add("name", v["X"].ToString());
questions.Add("age", v["Y"].ToString());
}
}
}
}
catch (SbsSW.SwiPlCs.Exceptions.PlException exp)
{
throw new FaultException<PrologFault>(new PrologFault(exp.Source), exp.MessagePl);
}
#endregion
Callback.PoseQuestion(questions, ResponseType.None);
}
Method#2:
public void DetermineAgeGroup(int age)
{
//Determine age group
string age_group = string.Empty;
try
{
using (var query = new PlQuery("age_group(" + age + ", G)"))
{
foreach (PlQueryVariables v in query.SolutionVariables)
age_group += v["G"].ToString();
}
}
catch (SbsSW.SwiPlCs.Exceptions.PlException exp)
{
throw new FaultException<PrologFault>(new PrologFault(exp.Source), exp.MessagePl);
}
//Check whether age_group is found or not
if (string.IsNullOrEmpty(age_group))
{
throw new FaultException<NoSolutionFoundFault>(new NoSolutionFoundFault("No solution found"), "Age specified exceeds the diagnosis range!");
}
else
{
Callback.RespondToUser(age_group, ResponseType.Age);
}
}
Method#3:
public void QuitProlog()
{
if (PlEngine.IsInitialized)
{
PlEngine.PlCleanup();
}
}
The client invokes the first method just fine and the result of the first query is successfully returned. When the client tries to call the second method, an exception is thrown with the message "attempted to read or write protected memory", which causes the application to freeze. I checked the Event Viewer and this is what I get:
Application: w3wp.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.AccessViolationException
Stack:
at SbsSW.SwiPlCs.SafeNativeMethods.PL_new_term_ref()
at SbsSW.SwiPlCs.PlQuery..ctor(System.String, System.String)
at SbsSW.SwiPlCs.PlQuery..ctor(System.String)
at PrologQueryService.PrologQueryService.DetermineAgeGroup(Int32)
I also tried to use the interface from a .NET project.
Looking at the official repository of the C# interface to SWI-Prolog, I noticed that the project is very old and the latest updates do not seem to be included in the binaries available on the download page of the official website.
Then I did the following steps:
The contrib repository dedicated to .NET indicates that the compatible SWI-Prolog version (at the time of writing) is "8.0.3-1" (look in the README file).
-> So I uninstalled the latest stable version from my computer and installed the indicated one. I got it from the full list of old-version downloads at this link.
I cloned the SWI-Prolog/contrib-swiplcs repository and unloaded the incompatible projects from the solution (in my case, since I don't use Visual Studio).
-> I set the target framework to .NET Framework 4.8 and recompiled it (you can also do this with .NET Standard). Beware of some pragma directives defined in the old project file (for example, I re-defined the _PL_X64 variable in code).
I brought the main unit test methods into a new project with xUnit, with the appropriate changes.
I set the target to x64, then recompiled and rebuilt the tests and the "hello world" example.
It worked!
I was able to use SWI-Prolog both from .NET Framework 4.8 and in other .NET Core applications (if you make the changes needed to target .NET Standard; you should not have any problems in either case).
This is my fork as a preliminary example.
Finally, I can load a *.pl Prolog file with a program in my C# application and use it to evaluate some business logic rules (example with boolean answer [Permitted/Not-Permitted]):
[Fact]
public void ShouldLoadAProgramAndUseIt()
{
var pathValues = Environment.GetEnvironmentVariable("PATH");
pathValues += @";C:\Program Files\swipl\bin";
Environment.SetEnvironmentVariable("PATH", pathValues);
// Positioning to project folder
var currentDirectory = Directory.GetCurrentDirectory().Split('\\').ToList();
currentDirectory.RemoveAll(r => currentDirectory.ToArray().Reverse().Take(3).Contains(r));
var basePath = currentDirectory.Aggregate((c1, c2) => $"{c1}\\{c2}");
var filePath = $"{basePath}\\prolog_examples\\exec_checker.pl";
String[] param = { "-q", "-f", filePath };
PlEngine.Initialize(param);
try
{
var query = "exutable('2020-08-15',[('monthly', ['2019-12-30', '2020-03-10'])])";
_testOutputHelper.WriteLine($"Query: {query}");
using (var q = new PlQuery(query))
{
var booleanAnswer = q.NextSolution();
_testOutputHelper.WriteLine($"Answer: {booleanAnswer}");
Assert.True(booleanAnswer);
}
query = "exutable('2020-08-15',[('daily', ['2019-12-30', '2020-08-15'])])";
_testOutputHelper.WriteLine($"Query: {query}");
using (var q = new PlQuery(query))
{
var booleanAnswer = q.NextSolution();
_testOutputHelper.WriteLine($"Answer: {booleanAnswer}");
Assert.False(booleanAnswer);
}
}
finally
{
PlEngine.PlCleanup();
}
}
Try closing the engine at the end of the first method and initializing it again in the second, for example:
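(A minimal sketch of that suggestion, reusing only the SwiPlCs calls already shown in the question; whether re-initializing the engine inside the same w3wp process actually works is something to verify.)
public void LaunchAssessment()
{
    try
    {
        if (!PlEngine.IsInitialized)
        {
            PlEngine.Initialize(new[] { "-q" });
        }
        PlQuery.PlCall("consult('D:/My FYP Work/initialAssessment')");
        // ... run the go(X, Y) query as before ...
    }
    finally
    {
        // Shut the engine down before the WCF call returns.
        if (PlEngine.IsInitialized)
        {
            PlEngine.PlCleanup();
        }
    }
}

public void DetermineAgeGroup(int age)
{
    // Re-initialize the engine for this call.
    if (!PlEngine.IsInitialized)
    {
        PlEngine.Initialize(new[] { "-q" });
        PlQuery.PlCall("consult('D:/My FYP Work/initialAssessment')");
    }
    // ... run the age_group query as before ...
}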
You can check this as the answer to the question unless you object.
Consider the following situation:
Now, I have a C# application that parses a file in order to get details (tables, columns, etc.) and starts a new SQL connection in order to execute a SQL command that creates those tables in the database.
What I want is to create a SQL project in which I will manually create those tables, and from the C# application I want to programmatically publish the SQL project to a certain server and database.
Is this possible?
If you are using a .sqlproj-based project in .NET 4 and above, you can build and publish it programmatically fairly easily using classes in the Microsoft.Build namespace. Taken from my answer here:
using Microsoft.Build.Framework;
using Microsoft.Build.Execution;

public void UpdateSchema() {
    var props = new Dictionary<string, string> {
        { "UpdateDatabase", "True" },
        { "PublishScriptFileName", "schema-update.sql" },
        { "SqlPublishProfilePath", "path/to/publish.xml" }
    };
    var projPath = "path/to/database.sqlproj";
    var result = BuildManager.DefaultBuildManager.Build(
        new BuildParameters { Loggers = new[] { new ConsoleLogger() } },
        new BuildRequestData(new ProjectInstance(projPath, props, null), new[] { "Publish" }));
    if (result.OverallResult == BuildResultCode.Success) {
        Console.WriteLine("Schema update succeeded!");
    }
    else {
        Console.ForegroundColor = ConsoleColor.Red;
        Console.WriteLine("Schema update failed!");
        Console.ResetColor();
    }
}

private class ConsoleLogger : ILogger
{
    public void Initialize(IEventSource eventSource) {
        eventSource.ErrorRaised += (sender, e) => {
            Console.ForegroundColor = ConsoleColor.Red;
            Console.WriteLine(e.Message);
            Console.ResetColor();
        };
        eventSource.MessageRaised += (sender, e) => {
            if (e.Importance != MessageImportance.Low)
                Console.WriteLine(e.Message);
        };
    }
    public void Shutdown() { }
    public LoggerVerbosity Verbosity { get; set; }
    public string Parameters { get; set; }
}
This is for .NET 4 and above. Be sure to include assembly references to Microsoft.Build and Microsoft.Build.Framework.
There are a number of ways you could accomplish this. In our app (~7 large databases) we manage them all with database projects from SQL Server Data Tools. This has allowed us to version-control easily, use some excellent comparison tools at deploy time, and take advantage of a plethora of other options. We did expand ours to deal with some nuances in our environment, but for most people that shouldn't be an issue.
Part of that toolset is DAC (Data-Tier Applications), which lets you move a database defined in your project to various environments pretty easily. This would support the great majority of projects in existence today; a sketch of a programmatic deployment follows.
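If you go the DAC route, the DacFx API (Microsoft.SqlServer.Dac) can deploy the .dacpac produced by the project build; this is only a minimal sketch, and the package path, connection string and database name are placeholders:
using Microsoft.SqlServer.Dac;

public static class DacDeployer
{
    public static void Deploy(string dacpacPath, string connectionString, string targetDatabase)
    {
        var services = new DacServices(connectionString);

        using (DacPackage package = DacPackage.Load(dacpacPath))
        {
            // Creates the database if it is missing, otherwise upgrades it in place.
            services.Deploy(package, targetDatabase, upgradeExisting: true);
        }
    }
}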
If you want to go purely programmatic you could use Code First (and Code First Migrations, which is pretty slick), which is kind of build-as-you-go: MS figures out the rest for you (mainly by convention, with the flexibility to go beyond that). It's really friendly when it comes to upgrading versions. Again, IMHO.
Database Projects exist as well but tend to require a little more insight/work to get them tweaked the way you want (but they also offer a familiar SQL Explorer-type layout).