I have an MS Access database with a button that is supposed to run a tool I wrote. The tool works. The Access DB works. The VBA to run the tool works.
However, there is an issue when running the tool from VBA. It always crashes when trying to load XML configuration data. I tested this by checking my Error Reporting tool (Control Panel -> Administrative Tools -> Event Viewer). I also wrote a second version of my tool which has all of the XML data hard-coded into it.
I am wondering if there is a known reason for this to occur, and if there is a workable solution for getting around this error. Hard-coding all of my configuration data is not really an ideal solution, as the configuration data needs to be modifiable without having to recompile the entire project and push out an update.
Information:
MS Access 2010 with Visual Basic for Applications (using Shell for application calling)
C# application with XML configuration data, using references (IO, Linq, Xml.Linq, Data.OleDb, Globalization)
Thanks for any assistance you may be able to offer.
Per request, here is the code I use to run the application:
Dim hProcess As Long
Dim myPath As String
Dim myFile As String

' Build the full path to the tool and launch it
myPath = Environ("ProgramFiles(x86)") & "\mytool\"
myFile = "mytool.exe"
hProcess = Shell(myPath & myFile, vbNormalFocus)
The application is a WinForms application that allows the user to map column names in a CSV to fields in Access. The only actual issue is the part where it attempts to load the XML configuration data:
XDocument xDoc = XDocument.Load(Environment.CurrentDirectory + "\\config.xml");
Again, if I hard-code the configuration data into the application itself, there is no issue at all when running the application. However, if I attempt to load the XML configuration data, it gives an Error Event in the Event Viewer stating that there was an unhandled exception when loading the XML file. If I run the application outside of the VBA Shell call, it runs fine and can load the XML file. It only ever crashes when trying to load the XML from VBA Shell.
In my experience, executing .NET code from MS Access seems to "mess up" some of .NET's "get the current directory" methods, which all work fine if you run the exact same .NET application directly without Access.
If you want to see what I'm talking about, create a new console application in Visual Studio and paste the following code:
using System;
using System.IO;

namespace CurrentDirTest
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine(Environment.CurrentDirectory);
            Console.WriteLine(Directory.GetCurrentDirectory());
            Console.WriteLine(System.Threading.Thread.GetDomain().BaseDirectory);
            Console.WriteLine(AppDomain.CurrentDomain.BaseDirectory);
            Console.ReadLine();
        }
    }
}
When I run that directly from Visual Studio, it outputs this (as expected):
C:\Dev\Code\CurrentDirTest\CurrentDirTest\bin\Debug
C:\Dev\Code\CurrentDirTest\CurrentDirTest\bin\Debug
C:\Dev\Code\CurrentDirTest\CurrentDirTest\bin\Debug\
C:\Dev\Code\CurrentDirTest\CurrentDirTest\bin\Debug\
Now put the compiled .exe in your Program Files (x86)\mytool\ folder and try to call it from Access, with the VBA code from your question.
When I do that on my machine, I get this:
C:\Users\MyUserName\Documents
C:\Users\MyUserName\Documents
C:\Program Files (x86)\mytool\
C:\Program Files (x86)\mytool\
Weird, isn't it?
Environment.CurrentDirectory and Directory.GetCurrentDirectory() both return my "Documents" folder as soon as the application is executed from MS Access.
I've got no idea why this happens, but it does.
Solution:
If you get the same results on your machine as I do on mine, the solution for your problem is simple: just use System.Threading.Thread.GetDomain().BaseDirectory or AppDomain.CurrentDomain.BaseDirectory to get the directory your executable actually lives in, instead of relying on the current directory.
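For example, the config load from your question would then look something like this (a minimal sketch; config.xml comes from your question, the rest is just illustration):

using System.IO;
using System.Xml.Linq;

// BaseDirectory points at the folder containing mytool.exe,
// regardless of what working directory the VBA Shell call handed us.
string configPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "config.xml");
XDocument xDoc = XDocument.Load(configPath);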
Just in case someone has a similar problem when using COM-Interop:
The problem is even worse when you execute a .NET assembly via COM-Interop from MS Access.
If I recall correctly, neither System.Threading.Thread.GetDomain().BaseDirectory nor AppDomain.CurrentDomain.BaseDirectory worked for me in that case, because both returned the directory of msaccess.exe.
I had to use this.GetType().Assembly.Location to get the actual location of the .NET assembly.
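If it helps, a rough sketch of what that looks like (the class and method names here are hypothetical):

using System.IO;

public class MyComClass   // hypothetical COM-visible class exposed to Access
{
    public string GetAssemblyDirectory()
    {
        // Assembly.Location is the full path of this .NET DLL itself,
        // not of msaccess.exe, so its folder is a safe place to look for config files.
        return Path.GetDirectoryName(this.GetType().Assembly.Location);
    }
}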
Related
I'm working on a project that involves the rewriting of legacy Windows applications for a modern .NET environment. The structure of these applications requires that there is one "Master" application running locally on the user's machine that fetches the others from a network share, along with required DLLs, and then runs them locally. This is not a problem per se, but I'm running into issues when I try to use the current user's AppData folder structure for this, whether it's local or roaming.
The thing is, I can create a subfolder without a problem, I can copy files over (via a call to System.IO.File.Copy) and run them (via creating and starting an instance of System.Diagnostics.Process), but the applications still complain that they can't find the needed DLLs. This is probably related to the fact that the subfolder doesn't show up in the file system at all, which leads me to believe the files are not being copied there in a fully conventional sense, but rather exist in some sort of limbo that prevents them from being "aware" of each other. When I copy and run to a standard filesystem folder, it all works fine. I've been trying to Google this issue, as I was sure it was a known problem, but I'm having a hard time finding anything on the subject. Hence this question.
Any ideas as to what might be going on? Or is my approach of using AppData for this particular purpose perhaps misguided to begin with?
EDIT: Comments asked for a viable code sample that reproduces the problem. I had some interesting results trying to reproduce the issue as simply as I could. I created a fresh console app and could not reproduce it there. I then tried a fresh WinUI 3 app (which is what I'm working with for my project) and reproduced it there: the code seems to create the directory and then the file without complaints (I ran it without the subfolder existing beforehand), but they don't show up in Explorer. For comparison, I then tried a fresh WPF app that does the exact same thing, but it behaves the same as the console app, with no issue with the subfolder and file showing up after running the app. So my issue seems to be confined to WinUI only.
You should be able to reproduce by placing this code inside the main window of a fresh application (and creating a button called myButton if VS doesn't do that automatically) and comparing WinUI 3 with WPF.
public MainWindow()
{
    InitializeComponent();
}

private void myButton_Click(object sender, RoutedEventArgs e)
{
    // See https://aka.ms/new-console-template for more information
    var destinationPath = System.IO.Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData), "Subfolder");
    string sourceFilename = "C:\\Data\\Textfile.txt";
    CopyFileToLocalExecutionPath(sourceFilename, destinationPath);
}

private string CopyFileToLocalExecutionPath(string sourceFilename, string destinationPath)
{
    // Make sure that destination path exists before copying file
    if (!Directory.Exists(destinationPath))
        Directory.CreateDirectory(destinationPath);

    // Assign full destination filename - local path + same as source filename
    var destinationFilename = System.IO.Path.Combine(destinationPath, System.IO.Path.GetFileName(sourceFilename));

    // Do not overwrite file if it's already there
    if (!File.Exists(destinationFilename))
        File.Copy(sourceFilename, destinationFilename);

    return destinationFilename;
}
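For completeness, the "run them" step I mentioned looks roughly like this (simplified; destinationFilename is what CopyFileToLocalExecutionPath returns):

string destinationFilename = CopyFileToLocalExecutionPath(sourceFilename, destinationPath);

// Launch the copied executable from the AppData subfolder; the working
// directory is set to that subfolder so relative lookups resolve there.
var startInfo = new System.Diagnostics.ProcessStartInfo
{
    FileName = destinationFilename,
    WorkingDirectory = System.IO.Path.GetDirectoryName(destinationFilename),
    UseShellExecute = false
};
System.Diagnostics.Process.Start(startInfo);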
I have many .exe files stored on an IIS server (in MSSQL) that contain reports and provide access to the file(s) on the servers. (These files will be changed on Sundays.)
After connecting to the SQL Server and choosing an .exe file, I download it (a SELECT in SQL), so I now have an array of bytes assigned to a variable.
I can't create a temporary file like "temp.exe" in an unknown directory, because I know there are many ways to discover the directory of a newly created file and ...
It is not secure, because my users are professionals, and if one of them knows these ways ...
So, I want to know: is it possible to run an .exe file from an array of bytes (just as if it were run from Windows Explorer) without creating a temporary file?
Thanks.
Update: The .exe files are .NET, and a manager will upload new files or change existing ones.
Be warned that your belief of any extra security is illusory. If the user has access to the machine to read files, they will also be able to read the memory of your process.
However, to answer your question, what you are asking to do is simple enough and described here: Load an EXE File and Run It from Memory.
In essence you do the following:
Pass your byte array to Assembly.Load to create a new Assembly.
Read the entry point of that assembly using the EntryPoint property.
Create an instance using Assembly.CreateInstance, and invoke the method on that instance.
The code looks like this:
Assembly a = Assembly.Load(bytes);
MethodInfo method = a.EntryPoint;
if (method != null)
{
    // For a static Main the created instance is ignored; pass an empty
    // string[] instead of null if the entry point takes arguments.
    method.Invoke(a.CreateInstance(method.Name), null);
}
Doesn't sound safe either way; why are you storing executables in a DB to begin with? Who uploads them? Whether they're on the filesystem or not, they're just as dangerous if malicious.
Are those .NET exes? If so, you could load the assembly into a child AppDomain with security restrictions, and I'm pretty sure you can do that without copying to disk.
For a regular native exe, I don't think it's possible to launch it without a physical file backing it (even in Task Manager you can see the path from which a program was launched).
There are two different concerns for security here:
That someone can see the file that you've downloaded from the database.
That executing the file might be a security threat.
For the first concern: Create a directory on the server and restrict access to that directory so that no one but the user account that runs your server program can see/use it. Save the byte array into a temporary file in that directory, execute it, and once the process has completed, delete the temporary file.
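A minimal sketch of that first approach (the directory path and variable names are assumptions):

using System.Diagnostics;
using System.IO;

// 'bytes' is the byte array selected from SQL Server.
string tempExe = Path.Combine(@"C:\MyAppPrivate", Path.GetRandomFileName() + ".exe");
File.WriteAllBytes(tempExe, bytes);
try
{
    using (Process p = Process.Start(tempExe))
    {
        p.WaitForExit();
    }
}
finally
{
    // Remove the temporary copy once the process has completed.
    File.Delete(tempExe);
}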
For the second concern: You'll need to run that executable in a sandboxed environment. In .NET you can run code in a sandboxed environment by loading the code into a separate AppDomain that you've setup to only have partial trust. How to do that deserves another question on SO though.
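That said, the rough shape of it on the .NET Framework looks like this (a sketch; the assembly name is an assumption, and this mechanism is not available on .NET Core/.NET 5+):

using System;
using System.Security;
using System.Security.Permissions;

// Grant only execution permission to code loaded into the sandbox.
var permissions = new PermissionSet(PermissionState.None);
permissions.AddPermission(new SecurityPermission(SecurityPermissionFlag.Execution));

var setup = new AppDomainSetup { ApplicationBase = AppDomain.CurrentDomain.BaseDirectory };
AppDomain sandbox = AppDomain.CreateDomain("Sandbox", null, setup, permissions);
try
{
    // Runs the entry point of the named assembly inside the restricted domain.
    sandbox.ExecuteAssemblyByName("MyReportTool"); // hypothetical assembly name
}
finally
{
    AppDomain.Unload(sandbox);
}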
I'm building a program in visual studio 2010 using C# .Net.
I am wondering if there is a way to build an executable that allows multiple users to run it at the same time.
Basically, the executable is built and available online. When a user accesses the site, he can use the executable. However, when multiple users try to access the site and hit the same exe, it breaks.
I couldn't really identify the issue. When I run the exe in two separate command prompts, they both run as expected, but when it's run from the website, it breaks.
Any idea what could be the cause and how to fix it?
-- Edit 3:40pm --
The way the site works is that the site calls a Java program, the Java program calls my exe, and then some data is returned to the site.
The Java program calls the exe from the command line: program.exe arg1
The executable pulls some information from the server side, makes some modifications to it, and then puts the modified information on the site to display to the user.
When I say it breaks, I mean the site does not get any modified information and displays nothing back to the user.
I'm building a digital signage application and I want to deploy it using ClickOnce. (I feel this is the best approach.) When I start the application from Visual Studio (VS) it works great. The application downloads a lot of images from my web service and saves them to disk:
string saveDir = new FileInfo(Assembly.GetExecutingAssembly().Location).Directory.FullName;
When I start my deployed application, it shows the splash screen and then disappears. The process keeps running, but the UI doesn't display. I'm wondering if my saveDir as shown above is giving me trouble?
How do I locate my installed application? (I need to make license files, etc.)
I'm not sure if this is the root of your problem, but I highly recommend you change the structure of how you store your application information.
When an application is installed through ClickOnce, it is installed within the user's folder, and the location is considerably obfuscated. Furthermore, locations may change with subsequent application updates, so you cannot guarantee that any cached, downloaded file will exist from update to update.
To solve this problem, ClickOnce does provide a Data directory, that is not obfuscated and can be used for caching local data. The only caveat is this directory is not available for non-ClickOnce instances of your application (like the version that is running within the VS debugger.)
To get around this, you should write a function that you can use to get your data directory, regardless of your method of distribution or execution. The following code is a sample of what the function should look like:
//This reference is necessary if you want to discover information regarding
// the current instance of a deployed application.
using System.Deployment.Application;

//Method to obtain your application's data directory
public static string GetAppDataDirectory()
{
    //The static IsNetworkDeployed property lets you know if
    // an application has been deployed via ClickOnce.
    if (ApplicationDeployment.IsNetworkDeployed)
        //In case of a ClickOnce install, return the deployed app's data directory.
        // (This is located within the User's folder, but differs between
        // versions of Windows.)
        return ApplicationDeployment.CurrentDeployment.DataDirectory;
    //Otherwise, return another location. (Application.StartupPath works well when
    // debugging; it requires a reference to System.Windows.Forms.)
    else return Application.StartupPath;
}
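With a helper like that in place, the saveDir assignment from your question could be replaced with something like this (sketch):

// Cache downloaded images under the ClickOnce data directory
// (or the startup path when running outside of ClickOnce).
string saveDir = GetAppDataDirectory();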
So, I am using Matthew Ephraim's GhostscriptSharp, which is a simple C# wrapper for the unmanaged Win32 Ghostscript DLL in my ASP.Net MVC project. Some background:
What I am attempting to do is have a user upload a PDF, and then convert that document into an image that I can then save off into whatever directory I choose (as well as do some other OOP to tie that new image to my site).
I decided to use Mr. Ephraim's wrapper class (GhostscriptSharp) because it was simple enough to use, and it gives me relatively clean access to the DLL's API.
To test it, I created a dummy C# console application to make sure I could load the DLL, access it, hand it a PDF file on the local disk and then have it write a JPG to the same local disk. After a few learning experiences, I had success. I would hand it C:\INPUT.pdf, it would hand me C:\OUTPUT.jpg.
However, after integrating the GhostscriptSharp code from the console application into my ASP.NET MVC project, to the point where I am calling the DLL with P/Invoke, Ghostscript returns the int/error code -100, which is a fatal error (called E_Fatal in the Ghostscript source code). I get the same result both with the file uploaded through the HTML form and when I hand it the exact same hard-coded paths that I used in my working console application.
For reference, the exception is thrown at lines 93-97 of GhostScriptSharp.cs (in the CallApi function):
int result = InitAPI(gsInstancePtr, args.Length, args);

if (result < 0)
{
    throw new ExternalException("Ghostscript conversion error", result);
}
Obviously the exception is thrown because result is -100.
When InitAPI is called, the instance ptr is a valid int (though I don't know if the instance of GS is correct or not), args has a length of 20 (is a string[]) of valid GhostScript options (including the correctly-escaped paths to my input & output files).
Long story short, what am I doing wrong? The error code -100 seems like a catch-all because there is no documentation that states what could possibly be going wrong here.
Any help is much appreciated, thank you in advance.
The -100 error is a generic "fatal error" in Ghostscript.
A few things to check:
1) Permissions (all operations require file access)
2) Scope: you want to add the GS bin folder to the PATH variable (see the sketch below)
3) Consider not calling Ghostscript directly from ASP.NET; GS can be very CPU-intensive, so it's better to process files in a separate service
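A minimal sketch of point 2, adjusting the PATH for the current process before the P/Invoke call (the Ghostscript install path below is an assumption):

// Prepend the Ghostscript bin folder to this process's PATH so the
// native gsdll32.dll can be located when GhostscriptSharp P/Invokes it.
string gsBin = @"C:\Program Files (x86)\gs\gs9.05\bin"; // hypothetical install path
Environment.SetEnvironmentVariable(
    "PATH", gsBin + ";" + Environment.GetEnvironmentVariable("PATH"));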
I have also created a wrapper, send me an email (address on profile) and I will send it you. It allows one to pass in the GS bin folder which helps.
So, it ended up being an ID10T error that was derailing me in this specific instance.
In Matthew Ephraim's GhostscriptSharp code, a couple of enums define the options passed to Ghostscript, two of which are the GhostscriptDevices and GhostscriptPageSizes enums. The problem is that ReSharper (the JetBrains Visual Studio plugin) has default naming rules for enum members, and the way these enums are written violates them. Not thinking, I "fixed" all of these definitions to please ReSharper, not realizing that the member names are passed directly to Ghostscript; so instead of getting a7 for -sPAPERSIZE, GS was getting A7, and for -sDEVICE it was getting Jpeg instead of jpeg.
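To illustrate what bit me (a sketch of the pattern, not the actual GhostscriptSharp source): the wrapper builds the argument strings from the enum member names via ToString(), so the member's casing is exactly what Ghostscript receives.

// Sketch: hypothetical enum members mirroring the wrapper's approach.
enum GhostscriptDevices { jpeg, png16m }

string deviceArg = "-sDEVICE=" + GhostscriptDevices.jpeg.ToString(); // "-sDEVICE=jpeg"
// Renaming the member to "Jpeg" would silently produce "-sDEVICE=Jpeg",
// which Ghostscript does not recognize and fails on with a fatal error.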
For the time being permissions weren't a problem on my end, but only because I run the Cassini web dev test server in Visual Studio.
Thanks to #MarkRedman and #tvanfosson for their helpful suggestions!
Most likely the process running the web application does not have permission to write to the directories that you are using. I'd suggest creating some specific directory for the app to use and a local id to use to run the app pool, then give that id enough privileges to read/write the directory you've created.
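If you go that route, here is a rough sketch of granting the app pool identity access to a dedicated working directory from .NET Framework code (the directory and identity names are assumptions):

using System.IO;
using System.Security.AccessControl;

// Create the working directory and give the app pool account modify rights on it.
var dir = new DirectoryInfo(@"C:\GhostscriptWork");          // hypothetical directory
if (!dir.Exists) dir.Create();

DirectorySecurity acl = dir.GetAccessControl();
acl.AddAccessRule(new FileSystemAccessRule(
    @"IIS APPPOOL\MyAppPool",                                // hypothetical app pool identity
    FileSystemRights.Modify,
    InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
    PropagationFlags.None,
    AccessControlType.Allow));
dir.SetAccessControl(acl);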