I have a VS solution containing 2 projects. One is the main project, called "main", which is an executable. The second project, called "main-ext", is also an executable, but it is an extension to the first one. It references "main" because it uses some of main's types.
I'd now like to make a ClickOnce publish of "main" that also includes "main-ext". I have already successfully set up ClickOnce for "main". But I cannot include the output generated from "main-ext" in that publish.
I then found out that you somehow need to include the files from "main-ext" into the project of "main" in order for ClickOnce to deploy them.
Now, with that in mind I have tried the following:
Add "main-ext" as reference to "main". This obviously does not work because of circular references.
Add the output EXE of "main-ext" manually as "Content" with "Copy Always" to the project "main". While this does work, it has a huge problem. If I ever delete the output files (or run Build > Clean), I can't get the solution to build anymore, because "main-ext" needs "main" to build, and "main" needs the output of "main-ext" to build. The same circular reference, just in a slightly different form.
Any ideas?
The next step I'd try would be to refactor "main" and move all the common types and things that "main" and "main-ext" share into a dedicated third project, a class library. But because the project is rather large and complicated, I'd like to leave that as an option of last resort.
Thanks.
Look into modifying the ClickOnce's manifest. The idea is:
Main doesn't reference the ext.
Build ClickOnce for main.
Publish ClickOnce.
Open the .application in ManifestManagerUtility (alternatively, you can do that using the ClickOnce API, I think).
Add a reference to your main-ext.
Re-sign the manifest (a rough command-line sketch with mage.exe follows below).
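If you prefer the command line over ManifestManagerUtility, here is a very rough sketch of the manifest steps using mage.exe from the Windows SDK. The file names, folder names, certificate and password are all placeholders, and a real publish usually appends a .deploy suffix to the files, which you would have to account for:

rem Refresh the application manifest's file list from the version folder that now also contains main-ext.exe
mage -Update main.exe.manifest -FromDirectory "Application Files\main_1_0_0_0"
mage -Sign main.exe.manifest -CertFile mycert.pfx -Password <password>

rem Point the deployment manifest at the updated application manifest and re-sign it as well
mage -Update main.application -AppManifest "Application Files\main_1_0_0_0\main.exe.manifest"
mage -Sign main.application -CertFile mycert.pfx -Password <password>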
Then you can allow your users to download a new version. Both files should be installed on client's machines.
Have a look at this chapter in Prism documentation, it contains a step by step description of a very similar scenario.
I have this simple line of code in DotSpatial
var raster = Raster.OpenFile("X://Data//4mr_project.tif");
Why is raster just getting a null value?
I also have .aux, .ovr, .tfw files in the same directory.
EDITED:
I found that the line below works fine:
var featureSet = FeatureSet.Open("X:\\Test Data\\shap\\edited.shp");
because DotSpatial can load .shp files by default. But to load raster data in .tif format, DotSpatial needs the GDAL extensions. Now the question is how to load the GDAL extensions manually in DotSpatial using C#.
GDAL extensions can be supported in your own application through the use of the AppManager component. You can drag and drop this onto your form. This enables support for the GDAL data extension and will also give you support for other plug-ins. Here is a basic walkthrough for adding the AppManager to a new project that just has a Map on the form.
1) From the Visual Studio Toolbox, right click and click on "Choose Items"
2) From the dialog, choose "Browse" and browse to the DotSpatial.Controls.dll library.
3) Click Ok as needed to close the dialogs and get back to the Toolbox.
4) Find the AppManager component you just added in the Toolbox.
5) Drag the AppManager component onto your form (not onto the map, but onto the form). A new instance should appear below your form in the non-visual components list.
6) Select this component to view its properties in the Properties dialog.
7) Set the map for the appManager (or other components if you are using them).
8) The GDAL component does not even require the Map to be defined in order to work; it should just work. But you will need the GDAL extension itself. You can find DotSpatial.Data.Rasters.GdalExtension in the "Windows Extensions" folder. Ensure that you have a similar folder in your output directory with the necessary GdalExtension; one way is simply to put it in your final distribution folder manually.
9) (Optional) One trick you can use to ensure you have the GDAL plugin in your release folders is to add the libraries as content. This way, regardless of whether you are working on a debug version or a release version, the GDAL data extension makes it to the output folders.
10) Ensure that the directory you are using (like "Windows Extensions") is listed in the Directories property of the AppManager. The default folders are "Application Extensions" and "Plugins". I think the extension folder was originally "Application Extensions" but got renamed to "Windows Extensions" later; unfortunately, I don't think the default folder list was updated to match.
11) In the code somewhere (probably in the form constructor) you need to call appManager1.LoadExtensions(). If you don't call this, it will not actually load the GDAL extension, even if you have the GDAL library as part of your project.
12) Add a SpatialDockManager, a SpatialHeaderManager, and a SpatialStatusStrip to the project. Then assign these to the corresponding properties on the AppManager, the same way you did the map. For reasons that are beyond me and were implemented after I left, the previously open-ended design has changed, and it will now throw message box errors if the program tries to use extensions without these components. The "ProgressHandler" property takes the SpatialStatusStrip.
After following all 12 of these steps (and running the project in x86 mode), the raster code you posted in the initial question works, and you can open GeoTIFFs. I also pushed the GDAL extension into the root "Application Extensions" directory while trying to get it to work, but I don't think you have to do that; it should work if it is in a subfolder.
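To tie the steps together, here is a minimal sketch of what the form code ends up looking like. The field names (appManager1, map1, spatialDockManager1, spatialHeaderManager1, spatialStatusStrip1) are just the designer-generated names for the components you dragged onto the form, so adjust them to whatever your designer produced:

using System.Windows.Forms;
using DotSpatial.Controls;
using DotSpatial.Data;

public partial class MainForm : Form
{
    public MainForm()
    {
        InitializeComponent();

        // Wire the designer-created components to the AppManager (steps 5-7 and 12).
        appManager1.Map = map1;
        appManager1.DockManager = spatialDockManager1;
        appManager1.HeaderControl = spatialHeaderManager1;
        appManager1.ProgressHandler = spatialStatusStrip1;

        // Step 10: make sure the folder holding the GdalExtension is scanned.
        if (!appManager1.Directories.Contains("Windows Extensions"))
            appManager1.Directories.Add("Windows Extensions");

        // Step 11: without this call the GDAL extension is never loaded.
        appManager1.LoadExtensions();

        // With the extension loaded, the line from the question now returns a raster.
        IRaster raster = Raster.OpenFile(@"X:\Data\4mr_project.tif");
    }
}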
Sorry to be this late (hopefully it's never too late), but if you wish to use the plugin without the AppManager, because you may be composing something custom and do not want to depend on the main DotSpatial application framework (note that the AppManager uses some slightly advanced "magic" to make it all work together), you can do the following few simple tasks yourself:
1) Add a reference to the file (DotSpatial Release Folder)\Windows Extensions\DotSpatial.Data.Rasters.GdalExtension.dll to your project (this is the main GdalExtension plugin output file).
NOTE: To verify this step was done correctly, make sure that building your library (the one that references GdalExtension.dll) also copies the additional files from the same folder (e.g. gdal_csharp.dll etc.) to that project's output directory.
2) This same folder also contains a gdal subfolder. Copy the folder itself, as-is, to your output path (usually ...\bin\Release\ or ...\bin\Debug\, depending on your configuration). Of course, in your final project you would probably want to use a post-build copy event to automate the process (a rough sketch follows after the note below), or just include the folder as content in your application build output, as Ted also mentions in step 9 of his answer.
NOTE: By output folder, I am referring to the Application Output Path, not the library output path. If your application uses a library that undertakes the task of loading rasters (through GdalExtension), the gdal folder does not need to be in that library's output folder. It needs to end up in your final application's output folder. The reason is that the various dll files are loaded dynamically, so they have to be found in the executing application's folder.
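For example, a post-build event along these lines (Project Properties -> Build Events) would automate the copy; the source path is only a placeholder for wherever the gdal folder lives in your tree:

xcopy /E /I /Y "$(SolutionDir)Windows Extensions\gdal" "$(TargetDir)gdal"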
3) As early as possible in your codebase, create a new GdalRasterProvider, which is now available through the dll you referenced in step 1. This means adding something like the following line to your project:
var grp = new DotSpatial.Data.Rasters.GdalExtension.GdalRasterProvider();
Thereafter, the first line of code in your post should work as expected. So, technically, the answer to the original question is that the DefaultDataManager class did not find any suitable provider to perform the task of actually loading the Raster file. Therefore, you are left with a null variable.
Interestingly, you don't need to hold the reference anywhere (i.e. do anything with the variable grp). If you check the source code, the constructor itself undertakes the task of adding itself to the DefaultDataManager's PreferredProviders dictionary, which is eventually consulted behind the scenes in the call to the Raster.OpenFile(string) method. The only "tough-to-figure-out" part is simply copying the gdal folder into your application output path, because the GDAL extension loads a number of references located there upon instantiation of any provider, and the loading looks for a "gdal" subfolder in whichever folder your application resides and is executed from.
(Note that the plugin also contains two more providers, GdalImageProvider and OgrDataProvider. To make these two work, you need to instantiate them and also manually add them to the DefaultDataManager's PreferredProviders dictionary, typically also early in your application code; a sketch follows below.)
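Putting steps 1-3 together, a small bootstrap sketch could look like this. The extension keys used for the two extra providers are only illustrative; pick the formats you actually need:

using DotSpatial.Data;
using DotSpatial.Data.Rasters.GdalExtension;

static class GdalBootstrap
{
    public static void Register()
    {
        // The raster provider registers itself with the default data manager
        // from its own constructor, so the variable is never used afterwards.
        var grp = new GdalRasterProvider();

        // The other two providers have to be added by hand, keyed by file extension
        // (the extensions below are placeholders).
        DataManager.DefaultDataManager.PreferredProviders[".ecw"] = new GdalImageProvider();
        DataManager.DefaultDataManager.PreferredProviders[".gml"] = new OgrDataProvider();
    }
}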
I have multiple projects in a solution and I'd like them all to share one pool of graphics resources. I've tried this:
1 - Created project1, made its resource file public, added some graphics to it.
2 - Created project2, Alt+dragged Resources.resx from project1\Properties to project2 (not in the Properties folder)
3 - Added a reference in project2 to project1
So, now all the images from project1 are available in project2. So far, so good. I can use them at design time just fine.
But, when I want to access them at runtime, I try this (in project2)...
Image img = project1.Properties.Resources.image14;
And that crashes with a MissingManifestResourceException.
What am I doing wrong here? Is there a better way I could approach this? All I'm trying to do is maintain all my graphics in one place, so if I add a resource, it becomes available to all projects in the solution.
Just built an example following these steps:
Create a class library to hold the resources (Project 1)
Create the consumer project (Project 2)
Add a resource file (GlobalResources.resx) to Project 1 and add a resource item, Information
Change the BuildAction of the resource file to Embedded Resource
Change the Do not copy of the resource file to False
Check if the Custom Tool of the resource file is set to PublicResXFileCodeGenerator
Add a reference to the class library (Project 1) in the consumer project (Project 2).
Add the resource namespace reference wherever you want to use it.
Finally it is working: GlobalResources.Information
It should be simple.
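For reference, the runtime usage then boils down to something like this; "ResourceLibrary" stands in for whatever default namespace Project 1 generates the resource class under, and Information is assumed to be an image resource:

using System.Drawing;
using ResourceLibrary;   // placeholder for Project 1's generated resource namespace

class ResourceDemo
{
    static Image GetInformationImage()
    {
        // Strongly typed property generated by PublicResXFileCodeGenerator.
        return GlobalResources.Information;
    }
}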
Edit:
You are concerned about using an external resource file inside the design-time property editor. Sorry to inform you that there is no standard support for this :(
However, if you think that the benefits are greater than the effort:
Issue with shared WinForms resources across projects in Visual Studio
How do I get the Windows Forms Designer to use resources from external assembly?
Hope it helps.
Choose the referenced file in your Solution Explorer, open its properties, and see what the "Copy to Output Directory" property is set to. I suspect it's not set to "Copy Always" or "Copy if Newer", either of which should be fine.
Once it's being copied, let's also check where it's being copied. Is the output path for that particular item the same as where the program ultimately expects it? Is it being copied to the bin\Debug of the correct project?
Make sure it's being copied to the path where the MissingManifestResourceException says it's failed to find the resource.
Finally, given additional information in our comments, I would also suggest you verify the following:
culture the resources are targeting. Check spelling and capitalization.
any culture settings of your build xml or publish xml.
the culture setting(s) of the host system that's running this code (a quick runtime check is sketched below).
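A quick way to see which cultures resource lookup will actually use on the machine running the code is a small check like this (sketch):

using System;
using System.Globalization;
using System.Threading;

class CultureCheck
{
    static void Main()
    {
        // These are the cultures ResourceManager probes satellite assemblies with.
        Console.WriteLine("CurrentUICulture:   " + Thread.CurrentThread.CurrentUICulture);
        Console.WriteLine("CurrentCulture:     " + Thread.CurrentThread.CurrentCulture);
        Console.WriteLine("InstalledUICulture: " + CultureInfo.InstalledUICulture);
    }
}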
I'm new to using Visual Studio, and I'm trying to figure out how to 'publish' my program so I can move it to other people's computers and run it from there. I'm not sure if it makes a difference, but there are three projects in my solution. Also, if I publish it, will I still be able to continue developing from the original files, etc.?
Thanks a lot!
Ok, so you've written your code, debugged it and now you want to distribute it...
When you build a solution using Visual Studio, the compiled output of each project is produced in a folder which is either specified manually or, by default, is a bin folder relative to the project's root folder. Within this folder are subfolders which hold the output for the corresponding build configuration (for instance, the Debug folder contains the Debug compilation output).
If you have three projects then, for example, let's assume one is an executable application and the other two are dynamic link libraries on which the application project depends. The compiled output from the latter two projects will automatically be copied to the application's compiled output folder, meaning you only need to ship what is in this folder (along with anything else you actually know is required).
For a (rough) folder graph to try and visualise what I'm saying:
SolutionFolder\
    ApplicationProjectFolder\
        Bin\  <- contains overall output
            Debug\  <- the compilation you develop with
            Release\  <- the compilation you distribute (after testing)
    DynamicLinkLibrary0Folder\
        Bin\
            Debug\  <- automatically copied to 'ApplicationProjectFolder\Bin\Debug'
            Release\  <- automatically copied to 'ApplicationProjectFolder\Bin\Release'
    DynamicLinkLibrary1Folder\
        Bin\
            Debug\  <- as above
            Release\  <- as above
You can continue to work on your code after distributing, yes, of course, but you can hardly expect the users of the application to have your latest changes without redistributing the whole thing, or updating/patching et cetera.
Of course, this is distribution in its simplest form - ideally you'd want an installer project as part of the solution, whose output is the final distributable end-product.
As I said above, it seems you may need to know a heck of a lot more than this to proceed competently and confidently, and I could explain further details on each aspect mentioned here, no doubt, but it has to stop somewhere. Hope this gets you started, though.
Adding to the answer given by @Mr. Disappointment, you could also add a Setup project to the solution, which will take the compiled output and build an installer (a .msi file) for your program. You can then give the .msi file to your users and they can run the setup program to install the application on their computers. You can also put the .msi file on a network share or make it available for download from a website, depending on your requirements.
Another option is to investigate Visual Studio's ClickOnce deployment, which also allows you to distribute your application to users in a simplified way, via a web site or network file share.
I have created a ClickOnce Solution with VS2008.
My main project references another project which references COM DLLs as "links".
When I build my solution in VS, the DLLs from the other projects are copied to my bin folder, but when I publish and launch the project these files are not present in my Local Settings\Apps\2.0... folder.
I know that I can add each DLL of the other project as a reference in my main project, but I'd like a cleaner solution...
Is it possible ?
First, add those files to your project directly.
Then go to the project properties -> Publish -> Application Files.
Select "Show all files" if you do not see the files you need, and then set their publish status to "Include", NOT "Include (Auto)". This is important or they will not be added.
Please note that if you update the files, you will have to remove them, add them again, and set their publish status again. This is a small bug.
See a previous question of mine for more info:
ClickOnce - Overwriting content files
You need to open the "Application Files" dialog in the Publish tab of your project. From there you can set the publish type (Include, Prerequisite, etc.) of each of your files.
If it's an unmanaged DLL, you'll need to add the actual .dll as a file to your project and mark its build action as "Data". You can then set the Publish Type of that file to Include.
I had the same issue... and the only way to fix it, after going through many options, was by adding those DLLs to References.
It works, but I hope there will be a cleaner solution in the future.
For some reason, we have a script that creates batch files to XCOPY our compiled assemblies, config files, and various other files to a network share for our beta testers. We do have an installer, but some don't have the permissions required to run the installer, or they're running over Citrix.
If you vomited all over your desk at the mentions of XCOPY and Citrix, use it as an excuse to go home early. You're welcome.
The code currently has hundreds of lines like:
CreateScripts(basePath, "Client", outputDir, FileType.EXE | FileType.DLL | FileType.XML | FileType.CONFIG);
It used to be worse, with 20 int parameters (one per file type) representing whether or not to copy that file type to the output directory.
These hundreds of lines create upload/download batch files with thousands of XCOPY lines. In our setup projects, we can reference things like "Primary output from Client" and "Content Files from Client". I'd love to be able to do that programmatically from a non-setup project, but I'm at a loss.
Obviously MS does it, either using an API or by parsing the .csproj files. How would I go about doing this? I'm just looking for a way to get a list of files for any of the setup categories, i.e.:
Primary Output
Localized Resources
Content Files
Documentation Files
EDIT:
I have a setup project like Hath suggested, and it's halfway to what I'm looking for. The only problem keeping that from being a perfect solution is that multiple projects depend on the same assemblies being in their own folder, and the setup will only copy the file once.
Example:
Projects Admin, Client, and Server all rely on ExceptionHandler.dll, and Admin and Client both rely on Util.dll, while Server does not. This is what I'm looking for:
Admin\
    Admin.exe
    Admin.exe.config
    ExceptionHandler.dll
    Util.dll
Client\
    Client.exe
    Client.exe.config
    ExceptionHandler.dll
    Util.dll
Server\
    Server.exe
    Server.exe.config
    ExceptionHandler.dll
Since the referenced assemblies are all the same, what I get is this:
Admin\
    Admin.exe
    Admin.exe.config
    ExceptionHandler.dll
    Util.dll
Client\
    Client.exe
    Client.exe.config
Server\
    Server.exe
    Server.exe.config
This causes a FileNotFoundException when either Client or Server can't find one of the two DLLs it's expecting.
Is there a setup property I'm missing to make it always copy the output, even if it's duplicated elsewhere in another project's output?
EDIT AGAIN: All referenced DLLs are set to "Copy Local", and always have been. I found a decent article on using NAnt and XSLT to grab the list of files, so that may be a possible solution as well, as neouser99 suggested.
ACCEPTED SOLUTION: I'm pretty much back where I started. All .exe and .dll outputs are put into a "bin" directory in the setup project, loosely packed. The other per-application folders contain shortcuts to the executable in that directory.
The difference now is, I'm going to add a custom action to the installer to use reflection, enumerate the dependencies for each executable output, and copy the .exe and .dll files to the separate directories. Bit of a pain, as I just assumed there was a way to programmatically detect what files would be included via some setup library.
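For what it's worth, the custom-action idea boils down to something like the sketch below. It only walks direct references (recurse if you need transitive ones), and the folder names are placeholders for this particular layout:

using System.IO;
using System.Reflection;

static class DependencyCopier
{
    // Copy an executable plus every referenced assembly that exists in the loose
    // "bin" folder into the application's own folder. Framework assemblies are
    // simply not found in binDir and are therefore skipped.
    public static void CopyWithDependencies(string exePath, string binDir, string targetDir)
    {
        Directory.CreateDirectory(targetDir);
        File.Copy(exePath, Path.Combine(targetDir, Path.GetFileName(exePath)), true);

        Assembly exe = Assembly.ReflectionOnlyLoadFrom(exePath);
        foreach (AssemblyName dep in exe.GetReferencedAssemblies())
        {
            string candidate = Path.Combine(binDir, dep.Name + ".dll");
            if (File.Exists(candidate))
                File.Copy(candidate, Path.Combine(targetDir, dep.Name + ".dll"), true);
        }
    }
}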
Why not use another setup project and just set the 'Package files' setting to 'As loose uncompressed files' (setup project -> Properties)? Then share the folder... or something.
edit:
I see, you have 3 folders for your outputs, but the setup project only detects ExceptionHandler.dll and Util.dll once, so it will just pick the first folder and put them in there.
You could do a setup project for each project - a bit annoying, maybe.
You could manually add the DLLs to the projects that are missing the assemblies, either via 'Add File', 'Add Assembly', or 'Add Project Output' if you have those projects in the same solution (I doubt that's the case though).
or just dump all of them into one output directory...
Although it's designed as a build tool, you might find NAnt to be extremely useful for what you are talking about. The tasks (build, copy, move, delete, etc.) that you can define allow for very fine-grained file lookups, up to general, full folders. If you also incorporate NAnt into your build process, I think you'll find that it helps out in more ways than one.
Another approach that has worked for me in the past is to add the shared resource (Assembly, DLL or project) as a reference to each of the Admin, Server and Client projects. Then open the properties panel for the referenced item in each project and set "Copy Local" to true.
Now when you build the projects, each will have its own instance of the Assembly copied into its output folder.
This should also cause the shared components added in this manner to be replicated in each of the output folders in the setup package.
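In the .csproj itself, "Copy Local" corresponds to the <Private> element on the reference; a sketch with illustrative assembly name and path:

<!-- Inside an ItemGroup of Admin.csproj, Client.csproj and Server.csproj -->
<Reference Include="ExceptionHandler">
  <HintPath>..\Libraries\ExceptionHandler.dll</HintPath>
  <Private>True</Private> <!-- this is what setting "Copy Local" to true writes -->
</Reference>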
A completely different approach could be to set them up as symbolic links on the network share. A symbolic link is basically a shortcut where the file system hides the fact that it is a shortcut, so all other applications actually believe that the file has been copied (http://en.wikipedia.org/wiki/NTFS_symbolic_link).
One advantage of this approach is that the file is updated immediately as it changes, not only when you build your projects. So when you, for instance, save one of the config files with a text editor, the update is applied immediately.
The following MSBuild script fragment can build your SLN file (you can replace it with a .csproj) and will report a list of everything that was built (DLLs, EXEs).
<MSBuild Projects="MySolution.sln" Targets="Clean;Rebuild" Properties="Configuration=$(BuildMode)">
  <Output TaskParameter="TargetOutputs" ItemName="AssembliesBuilt" />
</MSBuild>
Now, this doesn't really solve your problem, but it gets you a list of everything that was built. You also have CopyLocal, so you could probably just take AssembliesBuilt and copy all DLL and .config files from there.
Example:
AssembliesBuilt = c:\myproj\something1\build.dll
you'd go to c:\myproj\something1\ and simply search for all *.dll and *.config files and include them. You can do this pretty easily with MSBuild or PowerShell, if you have it installed. To output an XCOPY script from MSBuild, I think you'll need the MSBuild contrib project installed (a rough sketch that just runs the copies directly from MSBuild follows below).
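As an untested sketch of that idea (placed inside the same target that runs the MSBuild task above), with $(DeployDir) as a placeholder for your network share:

<ItemGroup>
  <OutputDirs Include="@(AssembliesBuilt->'%(RootDir)%(Directory)')" />
</ItemGroup>
<RemoveDuplicates Inputs="@(OutputDirs)">
  <Output TaskParameter="Filtered" ItemName="UniqueOutputDirs" />
</RemoveDuplicates>
<!-- One xcopy per unique output directory; IgnoreExitCode keeps a folder with no matching files from failing the build -->
<Exec Command="xcopy /Y /I &quot;%(UniqueOutputDirs.Identity)*.dll&quot; &quot;$(DeployDir)&quot;" IgnoreExitCode="true" />
<Exec Command="xcopy /Y /I &quot;%(UniqueOutputDirs.Identity)*.config&quot; &quot;$(DeployDir)&quot;" IgnoreExitCode="true" />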