Nancy Self Host and Static Content Files - C#

Am I right in thinking that if you create a self-host Nancy console app and want to serve up HTML, JavaScript and CSS files, you have to go through all of these files (could be quite a few) and mark each one for copy to the output directory?
public class HomeModule : NancyModule
{
    public HomeModule()
    {
        Get["/"] = v => View["index.html"];
    }
}
This will not be found if the index.html file is in the project folder but is not marked Copy to Output Directory in its properties.
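For context, the self-host console app starts Nancy with something like the following (a minimal sketch; the URL and port here are arbitrary):
using System;
using Nancy.Hosting.Self;

class Program
{
    static void Main()
    {
        // Host Nancy inside the console process; views and static files are
        // resolved relative to the exe's directory, hence the copy-to-output issue.
        using (var host = new NancyHost(new Uri("http://localhost:8888")))
        {
            host.Start();
            Console.WriteLine("Nancy running on http://localhost:8888");
            Console.ReadLine();
        }
    }
}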

Edit: I stand corrected, I misunderstood the question.
Yes, you need to set all static content to copy; however, when I set up my projects (I can't copy-paste an example for you at the moment), I just add a build event in the project file, or I set up a build task for the CI/deployment.
Nope, you don't need to mark every file individually; you can register an entire directory as static content:
https://github.com/NancyFx/Nancy/wiki/Managing-static-content
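For example, a custom bootstrapper along these lines serves a whole folder (the "Content" folder name here is just an assumption):
using Nancy;
using Nancy.Conventions;

public class CustomBootstrapper : DefaultNancyBootstrapper
{
    protected override void ConfigureConventions(NancyConventions conventions)
    {
        base.ConfigureConventions(conventions);

        // Requests to /Content/* are served from the Content folder on disk
        conventions.StaticContentsConventions.Add(
            StaticContentConventionBuilder.AddDirectory("Content", @"Content"));
    }
}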
Alternatively, if you're using OWIN, you can use the Static Content middleware.
Something like:
public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var fileServerOptions = new FileServerOptions
        {
            EnableDirectoryBrowsing = false,
            FileSystem = new PhysicalFileSystem("....")
        };
        app.UseFileServer(fileServerOptions);
        app.UseNancy();
    }
}

I had the same issue and found a workaround that others might find useful:
Instead of copying the files to the output directory on each build, I created directory junctions in it, targeting the original static-file directories.
This allows real-time editing of the static content in Visual Studio (without needing to rebuild just to copy the edited files to the output directory).
e.g. (Post-build command line):
if not exist "$(TargetDir)Web" md "$(TargetDir)Web"
if not exist "$(TargetDir)Web\Content" mklink /j "$(TargetDir)Web\Content" "$(ProjectDir)Content"
if not exist "$(TargetDir)Web\Scripts" mklink /j "$(TargetDir)Web\Scripts" "$(ProjectDir)Scripts"
if not exist "$(TargetDir)Web\Fonts" mklink /j "$(TargetDir)Web\Fonts" "$(ProjectDir)Fonts"
if not exist "$(TargetDir)Web\Static" mklink /j "$(TargetDir)Web\Static" "$(ProjectDir)Web\Static"

You can use Visual Studio build events and add xcopy commands like this:
xcopy /E /Y "$(ProjectDir)\Views" "$(ProjectDir)\bin\$(ConfigurationName)\Views\*"
xcopy /E /Y "$(ProjectDir)\Content" "$(ProjectDir)\bin\$(ConfigurationName)\Content\*"
When the project is built, xcopy is executed and the files are copied to the output directory, so your self-host exe can see those files.

Related

Issue getting executable directory with .NET core? [duplicate]

ASP.NET Core 1.0 adds lots of functionality, but there seems to be no direct way to get the bin folder path.
Does anyone know how we can get the bin folder path for an ASP.NET Core 1.0 application?
Alternative way (corresponds to the AppDomain.BaseDirectory):
AppContext.BaseDirectory
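For example, a one-line sketch that prints it (when run from a normal build this is the output folder):
// e.g. C:\MyApplication\bin\Debug\netcoreapp1.0\
System.Console.WriteLine(System.AppContext.BaseDirectory);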
This works to retrieve the assembly's directory, from which we can determine the bin location.
var location = System.Reflection.Assembly.GetEntryAssembly().Location;
var directory = System.IO.Path.GetDirectoryName(location);
System.Console.WriteLine(directory);
Output
C:\MyApplication\bin\Debug\netcoreapp1.0
Well, the bin folder does exist, but it is moved to the artifacts folder next to the solution file. Since ASP.NET Core RC1 compiles everything in memory, you will find an empty bin folder. But if you set the "Produce output on build" option to true (right-click the project file -> Properties -> Build tab), you will find the generated files in the bin folder.
I don't think there is any direct property available to get this path, but you can use the same solution pointed out by @Nikolay Kostov to get the application path, and then use the System.IO classes to jump to the bin folder.
Code updated for ASP.NET Core as mentioned here:
http://www.talkingdotnet.com/get-application-wwwroot-path-aspnet-core-rc2/
public Startup(IHostingEnvironment env, IApplicationEnvironment appenv)
{
    string sAppPath = env.ContentRootPath;
    string sRootPath = Path.GetFullPath(Path.Combine(sAppPath, @"..\..\"));
    string sBinFolderPath = @"artifacts\bin\" + appenv.ApplicationName;
    string sBinPath = Path.Combine(sRootPath, sBinFolderPath);
}
You can't really get the /bin/ folder since it is not relevant to your project and the ASP.NET environment doesn't know what a /bin/ folder is.
And also there isn't exactly a /bin/ folder. You may want to read this article: http://docs.asp.net/en/latest/conceptual-overview/understanding-aspnet5-apps.html
But you can get the so-called ApplicationBasePath, which is the directory in which your application runs:
public Startup(IHostingEnvironment env, IApplicationEnvironment appEnv)
{
    string baseDir = appEnv.ApplicationBasePath;
    // Other startup code
}
AppDomain.CurrentDomain.BaseDirectory;

DbMigration.SqlFile difference in base directory

We are using the new DbMigration.SqlFile method in EF Migrations 6.1.2 to run a migration script in our migration. According to the documentation, the file has to be relative to the current AppDomain BaseDirectory. We have included these files in the project, and set them to copy to output directory.
Locally this all works fine: the files are output to the bin directory and the migrations run.
When deploying the software to a server running IIS, however, the migration fails because it suddenly expects the files to be relative to the application root. When I copy them there, the migration works.
How can I use DbMigration.SqlFile so it runs correctly both locally and on the server?
The SqlFile method uses the CurrentDomain.BaseDirectory if a relative path is given. A workaround is to map the path yourself and give an absolute path to the method. A solution would look like this:
var sqlFile = "MigrationScripts/script1.sql";
var filePath = Path.Combine(GetBasePath(), sqlFile);
SqlFile(filePath);

public static string GetBasePath()
{
    if (System.Web.HttpContext.Current == null)
        return AppDomain.CurrentDomain.BaseDirectory;
    else
        return Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "bin");
}
BasePath solution taken from: Why AppDomain.CurrentDomain.BaseDirectory not contains "bin" in asp.net app?
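Putting it together, a migration using that helper might look like this sketch (the migration class name and script path are hypothetical):
using System;
using System.Data.Entity.Migrations;
using System.IO;

public partial class AddSomeView : DbMigration
{
    public override void Up()
    {
        // Build an absolute path so the script resolves both locally and under IIS
        SqlFile(Path.Combine(GetBasePath(), "MigrationScripts/script1.sql"));
    }

    public override void Down()
    {
    }

    private static string GetBasePath()
    {
        // Under IIS the base directory is the site root, so append "bin"
        return System.Web.HttpContext.Current == null
            ? AppDomain.CurrentDomain.BaseDirectory
            : Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "bin");
    }
}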
We're using it like this from within the migration: SqlFile(@"..\..\Sql\views\SomeView.sql");

Change connection string in Visual Studio when publishing

I have an ASP.NET application. I have two databases set up, a test one and a production one. Right now, before I push any updates, I manually change my connection string to point to production. Is there a way to do this when I publish my application, so it automatically switches to production? Thanks for any help.
You can use web.config transformations for this.
Yeah, web.config transformations work, but I think a better approach is to separate your configs from your web/app.config, for example having a folder for each build configuration:
Config/Debug/connectionStrings.config
Config/Stage/connectionStrings.config
Config/Production/connectionStrings.config
Then your app or web.config would look like
<connectionStrings configSource="bin\connectionStrings.config" />
To get the environment-specific config into your bin directory, create a post-build event which copies the config based on the current build configuration, or copy them manually to your bin. This way you don't have to rely on a build to get your desired configuration.
If you really want this automated, your post-build event could look something like:
"$(SolutionDir)CopyConfigs.bat" "$(ProjectDir)" "$(ConfigurationName)" "$(OutDir).."
and the batch file so the copying can be reused between projects:
@echo CopyConfigs.bat :
@echo Copying config files...
set projectDir=%1
set configurationName=%2
set outDir=%3
REM Trim quotes
for /f "useback tokens=*" %%a in ('%1') do set projectDir=%%~a
for /f "useback tokens=*" %%a in ('%2') do set configurationName=%%~a
for /f "useback tokens=*" %%a in ('%3') do set outDir=%%~a
@echo Project Directory: %1
@echo ConfigurationName: %2
@echo OutDir: %3
if not exist %1 goto ProjectDirectoryNotFound
REM Copy the configuration files to the project's output directory
xcopy /Y "%projectDir%Configuration\%configurationName%\*.config" "%outDir%"
xcopy /Y "%projectDir%Configuration\*.config" "%outDir%"
@goto END
:ProjectDirectoryNotFound
@echo Project Directory %projectDir% was not found.
@goto END
:END
@echo Copying config done

Merge DLL into EXE?

I have two DLL files which I'd like to include in my EXE file to make it easier to distribute. I've read a bit here and there about how to do this, and even found a good thread here, and here, but it's far too complicated for me and I need really basic instructions on how to do this.
I'm using Microsoft Visual C# Express 2010, and please excuse my "low standard" question, but I feel like I'm one or two levels below everyone else's expertise :-/ If someone could point out how to merge these DLL files into my EXE in a step-by-step guide, that would be really awesome!
For .NET Framework 4.5
ILMerge.exe /target:winexe /targetplatform:"v4,C:\Program Files\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.0" /out:finish.exe insert1.exe insert2.dll
ILMerge
Open CMD and cd to your directory. Let's say: cd C:\test
Enter the command above.
In /out:finish.exe, replace finish.exe with any file name you want.
After /out:finish.exe you list the files you want to be combined.
Use Costura.Fody.
You just have to install the NuGet package and then do a build. The final executable will be standalone.
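For example, from the Package Manager Console:
Install-Package Costura.Fody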
Download ILMerge and ILMerge GUI. They make joining the files really easy; I've used them and they work great.
Add the DLLs to your project resources and use the AssemblyResolve event to return the resource DLL.
public partial class App : Application
{
    public App()
    {
        AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
        {
            Assembly thisAssembly = Assembly.GetExecutingAssembly();

            // Get the name of the assembly file
            var name = args.Name.Substring(0, args.Name.IndexOf(',')) + ".dll";

            // Load from embedded resources - this handler is not called if the assembly is in the application folder
            var resources = thisAssembly.GetManifestResourceNames().Where(s => s.EndsWith(name));
            if (resources.Count() > 0)
            {
                var resourceName = resources.First();
                using (Stream stream = thisAssembly.GetManifestResourceStream(resourceName))
                {
                    if (stream == null) return null;
                    var block = new byte[stream.Length];
                    stream.Read(block, 0, block.Length);
                    return Assembly.Load(block);
                }
            }
            return null;
        };
    }
}
Download ILMerge, then call:
ilmerge /target:winexe /out:c:\output.exe c:\input.exe C:\input.dll
Install ILMerge as the other threads tell you to.
Then go to the installation folder, by default:
C:\Program Files (x86)\Microsoft\ILMerge
Drag your DLLs and EXEs to that folder.
Shift-right-click in that folder and choose "Open command prompt".
Write:
ilmerge myExe.exe Dll1.dll /out:merged.exe
Note that you should write your exe first.
There you have your merged exe. This might not be the best way if you're going to do this multiple times, but it is the simplest for one-time use; if you do this often, I would recommend putting ILMerge on your path.
static class Program
{
    /// <summary>
    /// The main entry point for the application.
    /// </summary>
    [STAThread]
    static void Main()
    {
        /* PUT THIS LINE IN YOUR CLASS PROGRAM MAIN() */
        AppDomain.CurrentDomain.AssemblyResolve += (sender, arg) =>
        {
            if (arg.Name.StartsWith("YOURDLL"))
                return Assembly.Load(Properties.Resources.YOURDLL);
            return null;
        };

        Application.EnableVisualStyles();
        Application.SetCompatibleTextRenderingDefault(false);
        Application.Run(new Form1());
    }
}
First, add the DLLs to your project's resources (add a folder named "Resources").
2019 Update (just for reference):
Starting with .NET Core 3.0, this feature is supported out of the box. To take advantage of the single-file executable publishing, just add the following line to the project configuration file:
<PropertyGroup>
    <PublishSingleFile>true</PublishSingleFile>
</PropertyGroup>
Now, dotnet publish should produce a single .exe file without using any external tool.
More documentation for this feature is available at https://github.com/dotnet/designs/blob/master/accepted/single-file/design.md.
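For example (single-file publishing also requires a runtime identifier; win-x64 below is just an example):
dotnet publish -c Release -r win-x64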
You can also use the ILMerge GUI tool on CodePlex.
Here is the official documentation; it is also downloaded automatically at step 2.
Below is a really simple way to do it, and I've successfully built my app this way using .NET Framework 4.6.1.
Install the ILMerge NuGet package, either via the GUI or the command line:
Install-Package ilmerge
Verify you have downloaded it, then install it (I'm not sure of the command for this, but just go to your NuGet packages):
Note: you probably only need to install it for one of your solutions if you have multiple.
Navigate to your solution folder, and in the packages folder you should see 'ILMerge' with an executable:
\FindMyiPhone-master\FindMyiPhone-master\packages\ILMerge.2.14.1208\tools
Now copy that executable over to your \bin\Debug (or wherever your app is built), and then from the command line/PowerShell run something like:
ILMerge.exe /out:myNEWExecutable.exe myExecutable.exe myDll1.dll myDll2.dll myDlln.dll
You will now have a new executable with all your libraries in one!
I answered a similar question for VB.NET; it shouldn't be too hard to convert, though. You embed the DLLs into your Resources folder, and on first use the AppDomain.CurrentDomain.AssemblyResolve event gets fired.
If you want to reference it during development, just add a normal DLL reference to your project.
Embed a DLL into a project
NOTE: if you're trying to load a non-ILOnly assembly, then
Assembly.Load(block)
won't work, and an exception will be thrown:
more details
I overcame this by creating a temporary file, and using
Assembly.LoadFile(dllFile)
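A sketch of that temp-file workaround, assuming the DLL is an embedded resource (requires System.IO, System.Linq and System.Reflection):
AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
{
    // Find the embedded resource whose name ends with the requested assembly's file name
    var name = new AssemblyName(args.Name).Name + ".dll";
    var asm = Assembly.GetExecutingAssembly();
    var resourceName = asm.GetManifestResourceNames().FirstOrDefault(r => r.EndsWith(name));
    if (resourceName == null) return null;

    // Write the resource to a temp file, because Assembly.Load(byte[]) fails for mixed-mode assemblies
    var dllFile = Path.Combine(Path.GetTempPath(), name);
    using (var stream = asm.GetManifestResourceStream(resourceName))
    using (var file = File.Create(dllFile))
    {
        stream.CopyTo(file);
    }
    return Assembly.LoadFile(dllFile);
};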
I found the solution. Below are the steps:
Download ILMerge.msi and install it on your machine.
Open a command prompt.
Type cd C:\Program Files (x86)\Microsoft\ILMerge and press Enter.
C:\Program Files (x86)\Microsoft\ILMerge>ILMerge.exe /target:winexe /targetplatform:"v4,C:\Windows\Microsoft.NET\Framework\v4.0.30319" /out:NewExeName.exe SourceExeName.exe DllName.dll
For multiple DLLs:
C:\Program Files (x86)\Microsoft\ILMerge>ILMerge.exe /target:winexe /targetplatform:"v4,C:\Windows\Microsoft.NET\Framework\v4.0.30319" /out:NewExeName.exe SourceExeName.exe DllName1.dll DllName2.dll DllName3.dll
The command should be the following:
ilmerge myExe.exe Dll1.dll /target:winexe /targetplatform:"v4,c:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.0" /out:merged.exe

Programatically checking files into TFS getting more than expected

So I have a .NET app which goes through and generates a series of files, outputs them to a local directory, and then determines whether it needs to update an existing file or add a new file to a TFS (Team Foundation Server) project.
I have a single workspace on my local machine, and there are 10 different working folders for other coding projects I have worked on from this particular machine. My problem happens when I check whether the file already exists in the TFS project and needs an update, or whether it needs to be added to the project as a new file.
Snippet:
static string TFSProject = @"$/SQLScripts/";
static WorkspaceInfo wsInfo;
static VersionControlServer versionControl;
static string argPath = "E:\\SQLScripts\\";

wsInfo = Workstation.Current.GetLocalWorkspaceInfo(argPath);
TeamFoundationServer tfs = new TeamFoundationServer(wsInfo.ServerUri.AbsoluteUri);
versionControl = (VersionControlServer)tfs.GetService(typeof(VersionControlServer));
Workspace workspace = versionControl.GetWorkspace(wsInfo);
workspace.GetLocalItemForServerItem(TFSProject);
At this point I check whether the file exists and do one of two things: if the file exists, I mark the file for EDIT and then write out the file to the local directory; otherwise I script the file first and then ADD the file to the workspace. I don't care if the physical file is identical to the one I am generating, as I am doing this as a SAS 70 requirement to 'track changes'.
If it exists I do:
workspace.PendEdit(filename, RecursionType.Full);
scriptoutthefile(filename);
or if it doesn't exist
scriptoutthefilename(filename);
workspace.PendAdd(filename,true);
OK, all of that to get to the problem. When I go to check on pending changes against the PROJECT, I get all the pending changes for all of the projects I have on my local machine in the workspace.
// Show our pending changes.
PendingChange[] pendingChanges = workspace.GetPendingChanges();
foreach (PendingChange pendingChange in pendingChanges)
{
    // do something...
}
I thought that by calling workspace.GetLocalItemForServerItem(TFSProject), it would give me ONLY the objects for that particular working folder.
Is there any way to force the workspace object to only deal with a particular working folder?
Did that make any sense? Thanks in advance...
Using LINQ you can get the pending changes for a particular folder:
PendingChange[] pendingChanges = workspace
    .GetPendingChanges()
    .Where(x => x.LocalOrServerFolder.Contains(argPath))
    .ToArray();
Here is another way to refresh your workspace cache:
tf workspaces /s:http://SomeTFSServer:8080/
Also, Workspace has a method called UpdateWorkspaceInfoCache(..) that seems to do the same thing. I haven't tested this, though.
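If you'd rather refresh the cache from code, something along these lines should work (an untested sketch; in the client object model the method appears to live on Workstation rather than Workspace):
// Refresh the local workspace info cache for the current authorized user
Workstation.Current.UpdateWorkspaceInfoCache(versionControl, versionControl.AuthorizedUser);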
Ok, so it turned out that I had a corrupted TFS workspace cache on my machine.
Here was the solution (from a command prompt):
SET AppDataTF=%USERPROFILE%\Local Settings\Application Data\Microsoft\Team Foundation
SET AppDataVS=%APPDATA%\Microsoft\VisualStudio
IF EXIST "%AppDataTF%\1.0\Cache" rd /s /q "%AppDataTF%\1.0\Cache" > NUL
IF EXIST "%AppDataTF%\2.0\Cache" rd /s /q "%AppDataTF%\2.0\Cache" > NUL
IF EXIST "%AppDataVS%\8.0\Team Explorer" rd /s /q "%AppDataVS%\8.0\Team Explorer" > NUL
IF EXIST "%AppDataVS%\9.0\Team Explorer" rd /s /q "%AppDataVS%\9.0\Team Explorer" > NUL
The other part of the issue is that by having the project folder in the same workspace, it WILL pull in all pending changes from other projects. I created a secondary workspace with only the SQLScripts project and local folder and everything works like a charm.
Maybe someone else will find this useful...
