I use NancyFX.
I have published a folder with files.
I start the server and observe the following situation:
If there is a file '1.html' in the published folder, I can get it through the browser. If I delete this file from the disk, I get a 404 error (which is correct). If I add the file to disk again, or change its contents, I get it in the browser again as normal.
If, after starting the server, I try to access a nonexistent file '2.html' from the browser, I get a 404 error (which is correct). However, if I then create the file '2.html' on disk, I still get a 404 error. Only restarting the server helps.
I get the impression that on the first request for a file NancyFX builds some kind of cache, which subsequently prevents me from getting files that were added after they had been requested unsuccessfully.
Please help me solve this problem. Thank you in advance.
Nancy uses a convention-based approach for figuring out what static content it is able to serve at runtime. You can check the documentation here.
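For reference, static content conventions are registered from a custom bootstrapper; a minimal sketch of that (the "/files" request path and the on-disk "published" folder are placeholder names, not taken from the question):

using Nancy;
using Nancy.Conventions;

public class CustomBootstrapper : DefaultNancyBootstrapper
{
    protected override void ConfigureConventions(NancyConventions conventions)
    {
        base.ConfigureConventions(conventions);

        // Serve everything under the on-disk "published" folder when the
        // request path starts with "/files" (both names are placeholders).
        conventions.StaticContentsConventions.Add(
            StaticContentConventionBuilder.AddDirectory("/files", "published"));
    }
}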
I am working on a .NET Core 2.2 application (it will be upgraded soon). There is functionality to upload files to the server, which other users can then access via a link. Everything works fine. There are checks to block file names containing certain characters, including #. The only issue I am having is that the client insists on allowing # in file names. There are no issues when uploading such files, but they don't load via the link; I get a Status Code: 404; Not Found error. This was also an issue in the legacy site (ASP.NET WebForms), where it showed 404 - File or directory not found..
The URL I get looks like this: /_ClientData/NTTF/Announcements/61/Docs/invalid%20#%20test.pdf
As a last resort, I can allow these files and replace # with something else on the server, but I am wondering if there is any way to make this work without manipulating the file name.
You probably have a Razor page (or equivalent) that dynamically generates the "clickable link".
SUGGESTION: Use HttpUtility.UrlEncode() to explicitly generate the link, as you serve the page.
It should generate something like this:
File name: invalid # test.pdf
Encoded: invalid%20%23%20test.pdf
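A small sketch of the encoding step (the folder part of the link is copied from the URL in the question; note that HttpUtility.UrlEncode turns spaces into '+', while Uri.EscapeDataString produces the %20 form shown above):

using System;
using System.Web;   // HttpUtility lives here; requires a reference to System.Web

class LinkEncodingDemo
{
    static void Main()
    {
        string fileName = "invalid # test.pdf";

        // '#' becomes %23 either way, which stops the browser from treating
        // the rest of the file name as a URL fragment.
        Console.WriteLine(HttpUtility.UrlEncode(fileName));   // invalid+%23+test.pdf
        Console.WriteLine(Uri.EscapeDataString(fileName));    // invalid%20%23%20test.pdf

        // Building the clickable link with the encoded file-name segment.
        string link = "/_ClientData/NTTF/Announcements/61/Docs/" + Uri.EscapeDataString(fileName);
        Console.WriteLine(link);
    }
}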
When I try to delete the failed jobs on the production server I get a 404 error, but I am able to delete them on my local PC.
Production URL:
https://mywebsite.com/hangfire/jobs/failed/delete
HTTP ERROR 404

Local URL:
https://localhost:59141/hangfire/jobs/failed/delete
HTTP 200
Can anyone let me know why this is happening on the production server only? There is only one server behind this URL.
Thanks in advance…
The actual issue I was facing: I had deployed the Hangfire application inside a SharePoint website in IIS.
The path for deleting a job on button click was something like this:
http://SharePointWebsite/MyAapplicationWebSite/hangfire/jobs/failed/delete
This is the path that is generated internally by the Hangfire code (I added the Hangfire assembly reference to my project). Due to some configuration issues, my SharePoint application was not accepting this path, so I changed it to
http://SharePointWebsite/MyAapplicationWebSite/hangfire/jobs/faileddelete
(I removed one "/" from the path), which worked for me.
How to change the automatically generated path:
Take the Hangfire code from GitHub, which is open source now.
In the Dashboard pages you will find FailedJobsPage.cshtml. Update this page's contents with the short URL that you want.
This .cshtml will not be rendered until you run the custom tool for it, the "Razor Generator" Visual Studio extension, which generates the corresponding .cs file.
You can see these .cs files already present in the GitHub code (expand the .cshtml page and you will find the .cs file).
After finishing the above steps, make sure the same path is configured in the DashboardRoutes.cs file as well.
=========================
"ExtensionlessUrlHandler-Integrated-4.0" in "handlers" section
of SharePoint Application web Config is causing the actual issue
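For reference, that handler entry normally looks like the standard ASP.NET 4.x registration below; the snippet is shown only to make it easier to locate, and whether it needs removing or re-adding depends on the SharePoint site's configuration:

<system.webServer>
  <handlers>
    <!-- Standard ASP.NET 4.x extensionless-URL handler registration -->
    <remove name="ExtensionlessUrlHandler-Integrated-4.0" />
    <add name="ExtensionlessUrlHandler-Integrated-4.0"
         path="*."
         verb="*"
         type="System.Web.Handlers.TransferRequestHandler"
         preCondition="integratedMode,runtimeVersionv4.0" />
  </handlers>
</system.webServer>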
I am trying to test code that reads the contents of a Google Sheet from C#.
For now, I have copied the code from: http://mscodingblog.blogspot.com.mt/2016/11/how-to-read-google-spreadsheet-using.html
Everything is verbatim except my key file.
When running, I get a 403.
The account is an admin, the spreadsheet is even published publicly on the web (id: 1wAkLe8GRpsL_FrBjEi4aMTeR1wGWWllwqBdY2Z8zVms), and I request read-only access, yet I can't access it.
If I run it with the wrong spreadsheet ID, I get a 404, so at least I know it finds the spreadsheet in the first call.
I am using a service account.
I found this: Getting a 403 - Forbidden for Google Service Account, but it doesn't really help me, as there doesn't seem to be a valid solution there.
What could I be missing?
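For context, the code I'm testing is shaped roughly like this (a sketch rather than the blog's exact code; the keyfile.json name, the Sheet1!A1:D10 range and the application name are placeholders):

using System;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
using Google.Apis.Sheets.v4;

class SheetsReadTest
{
    static void Main()
    {
        // Service-account credential from a JSON key file (file name is a placeholder).
        var credential = GoogleCredential.FromFile("keyfile.json")
            .CreateScoped(SheetsService.Scope.SpreadsheetsReadonly);

        var service = new SheetsService(new BaseClientService.Initializer
        {
            HttpClientInitializer = credential,
            ApplicationName = "SheetsReadTest" // placeholder
        });

        // Spreadsheet id of the publicly posted sheet; the range is a placeholder.
        var request = service.Spreadsheets.Values.Get(
            "1wAkLe8GRpsL_FrBjEi4aMTeR1wGWWllwqBdY2Z8zVms", "Sheet1!A1:D10");
        var response = request.Execute();   // the 403 comes back from this call

        if (response.Values != null)
        {
            foreach (var row in response.Values)
            {
                Console.WriteLine(string.Join(", ", row));
            }
        }
    }
}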
Try deleting your .credentials directory. On Windows, open a file explorer and navigate to the .credentials folder.
I've got a Windows service coded in C#, with a config file:
ProcessingService.exe
ProcessingService.exe.config
It's got a web service endpoint address in it. This initially went in with the wrong address, so I stopped the service, changed the config file, and restarted it, but the service is still hitting the original URL.
I then restarted the entire server and still the wrong URL is being accessed.
We have a load of corporate rules about new install versions, meaning my turnaround time for compiling a new service and getting it installed will be measured in weeks, leaving the URL broken for that entire time. Is there a way to force the config to update?
(Yes, I've triple checked that the config file is now correct!)
In response to the request for the service setup code, I simply do (class names changed):
WebserviceNamespace.ServiceClass client = new WebserviceNamespace.ServiceClass();
The service config shows the original URL, and I'm using an app.config transform process to write the new URL into the new config file (again, I've triple checked this). I am generating the service classes as internal, though; could that have something to do with it?
OK, so I've now tried installing the service on another computer, stopping it, changing the URL in the config to "nevergonnahappen", and restarting it. Requests now fail on an invalid URL, so it must be something to do with our live server specifically...
Make sure that your service is pulling the URL from the config file (a sketch of doing this explicitly follows below).
Make sure the service has not cached the URL somewhere else after reading it (in memory, a file, or a database).
Make sure you are pointing at the correct folder for the config file; perhaps do a wide search with Agent Ransack for any other place that still contains the broken URL and that your service might in fact be reading.
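A minimal sketch of the first point, assuming an old-style "Web Reference" (ASMX) proxy, which exposes a Url property, and an appSettings key named ServiceUrl; the key name and class names are purely illustrative:

using System.Configuration;   // add a reference to System.Configuration

static class ServiceClientFactory
{
    // Hypothetical appSettings entry in ProcessingService.exe.config:
    //   <add key="ServiceUrl" value="https://correct-address.example.com/Service.asmx" />
    public static WebserviceNamespace.ServiceClass Create()
    {
        var client = new WebserviceNamespace.ServiceClass();

        // Override whatever URL was baked into the generated proxy / Settings
        // class with the value read from the config file at runtime.
        client.Url = ConfigurationManager.AppSettings["ServiceUrl"];
        return client;
    }
}

If the Web Reference's "URL Behavior" is set to Static, the generated proxy uses the URL hard-coded at generation time, which could explain the behaviour; assigning Url at runtime sidesteps that either way.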
Say I have a virtual folder /topFolder/ in IIS7, and in that folder there can be any file that can be displayed in a browser (xml, html, swf, doc etc - typically "unmanaged" resources from the IIS perspective).
Before giving the request permission to open any file below the folder, I need to check some session variables in order to see if the user has a "license" for the subfolder and file in question.
I've tried implementing a module with the IHttpModule and IReadOnlySessionState interfaces, but the Session is always null in the AcquireRequestState event when the file is "static" and not handled by the managed pipeline (unlike aspx, ashx, etc.).
If I use a custom HttpHandler, I get the session, but then I also need to implement how the content is sent in the response. Edit: since the user isn't downloading the file, I just want IIS to serve the file like it does with its StaticFileModule. The Handler/Module should really be a StaticFileModuleWithAuthorizationHook...
So I really want to do the following:
1. For requests to /topFolder/*: check the session, licenses, etc.
a) If OK, continue serving the file.
b) If not OK, interrupt the request, or just send FORBIDDEN in the response.
Hope someone can help.
You should be able to handle this via an HttpHandler; the simple way is to use the built-in methods to send the file down to the user if they have access.
This article (at the bottom) shows an example of how to do this.
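A minimal sketch of that approach, assuming a session key named "LicensedFolders" holding the licensed subfolders; the key name and the licence check are placeholders, and the handler still has to be mapped to the relevant paths/extensions in IIS:

using System;
using System.IO;
using System.Linq;
using System.Web;
using System.Web.SessionState;

// Serves a static file only if the session says the user is licensed for its folder.
// Implementing IReadOnlySessionState is what gives the handler access to the session.
public class LicensedStaticFileHandler : IHttpHandler, IReadOnlySessionState
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string physicalPath = context.Request.PhysicalPath;
        string requestedFolder = Path.GetDirectoryName(physicalPath);

        // Placeholder licence check: an array of licensed subfolder names kept in session.
        var licensedFolders = context.Session["LicensedFolders"] as string[];
        bool allowed = licensedFolders != null &&
                       licensedFolders.Any(f => requestedFolder.EndsWith(f, StringComparison.OrdinalIgnoreCase));

        if (!allowed)
        {
            context.Response.StatusCode = 403; // FORBIDDEN
            return;
        }

        if (!File.Exists(physicalPath))
        {
            context.Response.StatusCode = 404;
            return;
        }

        // Stream the file much like the StaticFileModule would: set the MIME type
        // and let IIS transmit the file without buffering it in memory.
        // (MimeMapping.GetMimeMapping is public from .NET 4.5.)
        context.Response.ContentType = MimeMapping.GetMimeMapping(physicalPath);
        context.Response.TransmitFile(physicalPath);
    }
}

Response.TransmitFile hands the file to IIS to send without buffering it in managed memory, which is about as close as a handler gets to what the StaticFileModule does.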