I have a .NET C# WebAPI application which is meant to run on IIS. The app loads an external 64-bit DLL. When I debug in VS2015 everything is fine, but I had to set this checkbox in order to make it work:
Now I have deployed the application to a Windows 7 VM running IIS 7.5, and I get the same error as if I hadn't set that checkbox in VS2015. Is there an equivalent setting in IIS to force 64-bit mode?
Related
I have an MVC application that I need to run as x64. My local IIS is on a 64-bit OS. When I publish the project in Visual Studio I choose the x64 configuration. When I set "Enable 32-bit Applications" to False in the IIS app pool settings everything works well, but when I set "Enable 32-bit Applications" to True I get an error.
How can I publish an x64 project to an IIS server with "Enable 32-bit Applications" = True?
I still have a problem: I cannot even run a standard starter application. Here's a screencast (sorry that the IIS interface is in Russian): http://www.screencast.com/t/Kf0mpM9uFa
What am I doing wrong?
And sorry for my English. Bears, vodka, balalaika.
Thanks in Advance.
It's because you are trying to run 32-bit DLLs in a 64-bit worker process. Fix it in IIS Manager by changing the Advanced Settings on your application pool to enable 32-bit applications, or create a new app pool specifically for them.
original article
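For scripted deployments, the same app pool switch can be flipped from the command line with appcmd; a sketch, assuming the default inetsrv path and a hypothetical pool name "MyAppPool":

```shell
REM Run the pool's worker processes as 32-bit (WOW64)
%windir%\system32\inetsrv\appcmd.exe set apppool "MyAppPool" /enable32BitAppOnWin64:true

REM Or keep the pool 64-bit (the default)
%windir%\system32\inetsrv\appcmd.exe set apppool "MyAppPool" /enable32BitAppOnWin64:false
```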
I ran a quick test:
1. Project platform target x64 => works in IIS.
2. IIS > Enable 32-bit Applications => also works.
My suggestion: kindly check the .NET Framework version of your application pool.
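A quick way to check this (a sketch, assuming the default inetsrv path) is to list the pools with their managed runtime versions:

```shell
REM Lists each app pool with its CLR version and pipeline mode
%windir%\system32\inetsrv\appcmd.exe list apppool
```

The output lines look roughly like `APPPOOL "DefaultAppPool" (MgdVersion:v4.0,MgdMode:Integrated,state:Started)`, so a pool stuck on v2.0 stands out immediately.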
I have a 64-bit web application written in C# deployed to IIS on both Windows Server 2008 (IIS 7) and Windows Server 2012 (IIS 8).
Part of the application involves calling an unmanaged C++ DLL from C# code. This call fails when I deploy to IIS. I get the classic:
An attempt was made to load a program with an incorrect format. (Exception from HRESULT: 0x8007000B)
I have read many threads on the topic, including:
An attempt was made to load a program with an incorrect format ERROR
“An attempt was made to load a program with an incorrect format” even when the platforms are the same
But I don't think they apply, as:
I have a 64-bit application.
It's compiled as x64.
It's deployed as x64.
I have verified it is running under an x64 IIS app pool.
Enable32BitApplications is set to False.
Both the C# DLL doing the import and the unmanaged DLL are compiled as 64-bit.
I have deployed to IIS on the server machines in both Release and Debug mode - no difference.
There are 32-bit DLLs in the solution, but they are in completely separate projects which are not referenced by, and do not reference, the project that is throwing the error.
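On the server itself, the claimed bitness of each binary can be double-checked with the Visual Studio command-line tools (run from a Developer Command Prompt); the file names here are hypothetical:

```shell
REM Managed assembly: "PE32+" means 64-bit; "PE32" with "32BITREQ : 1" means x86-only
corflags MyManaged.dll

REM Native DLL: look for "8664 machine (x64)" vs "14C machine (x86)"
dumpbin /headers MyNative.dll | findstr /i machine
```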
For additional information, it runs fine on both my local machine in IIS express AND my local machine when deployed through IIS (windows 7), so there is a disconnect somewhere and I can't track it down.
Additional info:
The unmanaged DLL does have dependencies on:
Kernel32.dll
Advapi32.dll
Crypt32.dll
User32.dll
Version.dll
I thought these were 32-bit, but on a 64-bit machine the copies in System32 are actually 64-bit (the 32-bit copies live in SysWOW64), so I'm not sure... but if they really were 32-bit, how would it even build, given that the unmanaged DLL itself is compiled as 64-bit?
Also, why would it work locally for me?
Also, I have (just for kicks and skittles) set Enable32BitApplications to True and I get the same error, as my project is a 64 bit project and references a 64bit unmanaged DLL, which can't run in a 32 bit process.
There was an additional dependency of the unmanaged DLL: a C++ redistributable DLL. The copy of that DLL on the machine was 32-bit; once I swapped in the 64-bit version, it loaded just fine.
Lesson learned: all dependencies of the DLL must be present and of the correct bitness.
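One way to hunt for this kind of mismatched dependency is to enumerate what the native binary imports and then check the bitness of each import in turn; a sketch with a hypothetical file name:

```shell
REM Lists the DLLs the native binary imports at load time
dumpbin /dependents MyNative.dll
```

Each DLL it prints can then be located on the PATH and inspected with `dumpbin /headers` to confirm it is the 64-bit copy.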
You need to set Enable32bitApplications to true in IIS.
I am having a problem getting the MS Visual Studio Remote Debugger to connect to my local IE instance, as it is running as a 64-bit rather than a 32-bit process.
Every time I try to run it in Visual Studio I get the error:
"The 32-bit version of the Visual Studio Remote Debugging Monitor (MSVSMON.EXE) cannot be used to debug 64-bit processes or 64-bit dumps."
Investigating a bit, I think I have narrowed it down to the ASP.NET web service being run as an x64 process rather than x86 (which both Visual Studio and the Silverlight application run as). I confirmed it was running as a 64-bit process by trying to attach Visual Studio to the process while the application was running in the ASP.NET Development Server.
In short: is there a setting I am missing somewhere to force Visual Studio to run the ASP.NET service as a 32-bit process? I have read about the enable32BitAppOnWin64 option on the application pool, but that seems to apply only to IIS, not the ASP.NET Development Server.
Any thoughts?
Edit for clarity:
I am running Windows 7 64-bit with Visual Studio 2010 (which runs as a 32-bit process). Currently it launches the ASP.NET Development Server (not IIS) to host the back-end web service. I am hoping I can simply "fix" this via a setting, but if not, my backup plan is to run IIS Express.
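If you do fall back to IIS Express, its app pools live in a user-local applicationhost.config rather than the machine-wide one; a minimal sketch of the relevant fragment (the pool name shown is the IIS Express default, and only the highlighted attribute is the point here):

```
<!-- %userprofile%\Documents\IISExpress\config\applicationhost.config -->
<applicationPools>
    <!-- enable32BitAppOnWin64="true" makes the worker process run 32-bit -->
    <add name="Clr4IntegratedAppPool" managedRuntimeVersion="v4.0"
         managedPipelineMode="Integrated" enable32BitAppOnWin64="true" />
</applicationPools>
```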
If I'm understanding you correctly, you should do this:
IIS Manager > Application Pools > choose the correct pool for your application > Advanced Settings > Enable 32-Bit Applications > set it to True!
I had a problem like this in the past which cost me 1-2 days; hope this helps!
Also check Project > Properties > Build > Platform target - this should be Any CPU.
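The same Platform target setting lives in the .csproj; a minimal sketch of the relevant fragment (standard MSBuild property names, reduced to the two that matter here):

```
<PropertyGroup>
  <!-- AnyCPU lets the hosting process decide the bitness at load time -->
  <PlatformTarget>AnyCPU</PlatformTarget>
  <!-- On .NET 4.5+ projects, "Prefer 32-bit" can silently force x86; leave it off for 64-bit hosting -->
  <Prefer32Bit>false</Prefer32Bit>
</PropertyGroup>
```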
I have an ASP.NET web application that uses an unmanaged 32-bit DLL. It runs successfully on my development machine, but when I use Web Deploy to move the code to our test server, I start seeing BadImageFormatException errors.
I set the target in Visual Web Developer to x86, and both machines are running 64-bit OSes (Windows 7 and Windows Server 2008 R2). I'm not sure what other differences could be causing the problem. Thanks for any help you can provide.
It turns out there is a flag, "Enable 32-Bit Applications", in the app pool configuration for the web server, which defaults to False under IIS 7.5.
The error message was a total red herring.
What would be the best (easiest and fastest) way to provide access between a 64-bit application (ASP.NET) and a 32-bit assembly (a 32-bit .NET database driver)?
1) I've got complete control over these two pieces of code (64-bit and 32-bit),
2) They both run on the same machine,
3) Security is not an issue,
4) Performance is important.
Run the ASP.NET application processes in 32-bit mode. This is the only way to get it to work.
For example, Crystal Reports XI does not have a 64-bit driver. In order to run the report, you must run the ASP.NET app in 32-bit mode on a 64-bit server.
On IIS 6.0:
1. Click Start, click Run, type cmd, and then click OK.
2. Type the following command to enable 32-bit mode:
cscript %SYSTEMDRIVE%\inetpub\adminscripts\adsutil.vbs SET W3SVC/AppPools/Enable32bitAppOnWin64 1
3. Now that the IIS worker process is running in 32-bit mode, we need to ensure the 32-bit version of the ASP.NET ISAPI filter is used. Type the following command to install the 32-bit version of ASP.NET 2.0 and to register the script maps at the IIS root and below:
%SYSTEMROOT%\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe -i
In IIS 7.0 you can set 32- or 64-bit per application pool, whereas in IIS 6.0 the setting applies to every worker process on the server.
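On IIS 7.0+ the per-pool equivalent of the adsutil command above can be scripted with appcmd; a sketch with hypothetical pool names, showing that two pools on the same box can run at different bitness:

```shell
REM 32-bit pool for the legacy driver
%windir%\system32\inetsrv\appcmd.exe set apppool "LegacyPool32" /enable32BitAppOnWin64:true

REM 64-bit pool for everything else
%windir%\system32\inetsrv\appcmd.exe set apppool "ModernPool64" /enable32BitAppOnWin64:false
```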