I want to say up front that I'm new to Azure Functions, so what I'm trying to do may not make sense.
Here we go.
In the function I am creating I have to import content into a database. Whenever I do this I check whether the content is new or just an update to existing data.
After that I take an email template that is in the root, "root/EmailsTemplate/MyTemplate.html", fill it in with the data I collected, and send it.
However, I cannot access this folder directly. I've seen that I can use Environment.CurrentDirectory, Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location),
or the ExecutionContext, but they all point me to the \bin\Debug\net472 output folder. Do I have to use one of these and walk back up the folder tree? Or is there another way? And how will it behave later in the Azure environment?
I use Azure Functions V1 (.NET Framework 4.7.2).
Thanks
As auburg says, set the file's Copy to Output Directory property (e.g. to "Copy if newer") so the file is copied to \bin\Debug\net472 when you build the project.
And if your function app is deployed to Azure, the content of the root directory is the same set of files that ends up in \bin\Debug\net472.
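Rather than walking back up from the output folder, you can also take an ExecutionContext parameter in the function signature and build the path from ExecutionContext.FunctionAppDirectory, which points at the app root both locally and on Azure. A minimal sketch (the timer trigger and function name are placeholders, and it assumes the EmailsTemplate folder is copied to the output directory as described above):

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class ImportFunction
{
    [FunctionName("ImportFunction")]
    public static void Run(
        [TimerTrigger("0 */5 * * * *")] TimerInfo timer,
        TraceWriter log,
        ExecutionContext context)
    {
        // FunctionAppDirectory is the function app root;
        // FunctionDirectory would be the individual function's folder.
        string templatePath = Path.Combine(
            context.FunctionAppDirectory, "EmailsTemplate", "MyTemplate.html");

        string template = File.ReadAllText(templatePath);
        log.Info($"Loaded template ({template.Length} characters)");
    }
}
```

If FunctionAppDirectory is not available in your SDK version, combining context.FunctionDirectory with ".." reaches the same root.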
I'm currently converting a project frontend from a .NET Core web application (Razor Pages) to React, and one of the roadblocks we've encountered is the dynamic URL path we used in the Razor Pages.
The application is deployed on IIS under the Default Web Site, so the URL is constructed as https://localhost/Project. One project requirement is that it can be replicated and deployed many times, so there will be https://localhost/Project{n}. That means the URL path name needs to be configurable so the frontend JavaScript will request the correct API URL. What I did was configure the path name in the IIS application's web.config and inject it into the Razor page, so JavaScript could refer to the injected path name variable.
But with React, the frontend is a separate project from the backend even though they share the same URL, and React has no way to get the configurable path name. It can only stick to whatever is set in PUBLIC_URL/homepage, which can only be set before the project is built. Since the path is not set dynamically, the frontend points to the right default page but the wrong JavaScript URL, so it can't even run the main React chunk JS.
Any leads for me to continue on? Or do I need to change the way the path name is made configurable?
In React you can use standard JS. So to get the current path name from your React application, you can do something like:
const pathname = location.pathname
If you need the pathname in many different places in your React project, you can set it as a context. (See this documentation)
Alternatively, if you only need it on the initial load, you can use a useEffect in your app.js file like:
const [pathname, setPathname] = React.useState();

React.useEffect(() => {
  setPathname(location.pathname);
}, []);
I'm coding a C# function that is triggered by the upload of a blob. I would like to read another file in the same container. How would an input binding bring in the second blob?
public static async Task Run([BlobTrigger("csv/{name}.csv", Connection = "StorageConnectionAppSetting")]Stream myBlob, string name, ILogger log)
In addition to this question, how can I reference values from local.settings.json in my code? I'm able to reference "StorageConnectionAppSetting" in the input binding, but I'm not sure how to do the same in the portions of my code where I'm creating clients using API keys.
Thanks!
A blob trigger binds a single blob, but you can add an input binding to your function. In this case you can add an input binding to CloudBlobContainer by adding a reference to the storage SDK, and then read any blob in that container.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob-input?tabs=csharp#usage
Another option is to skip the input binding and read the container and its contents the way you normally would with the storage SDK. You will need to add a reference to Microsoft.Azure.Storage.Blob in both cases.
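Putting the first option together, a sketch of the trigger plus a CloudBlobContainer input binding might look like this (assumes the Microsoft.Azure.Storage.Blob package; the second blob's name, other.csv, is a placeholder for illustration):

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class CsvFunction
{
    [FunctionName("CsvFunction")]
    public static async Task Run(
        [BlobTrigger("csv/{name}.csv", Connection = "StorageConnectionAppSetting")] Stream myBlob,
        string name,
        [Blob("csv", Connection = "StorageConnectionAppSetting")] CloudBlobContainer container,
        ILogger log)
    {
        // Read a second, known blob from the same container.
        CloudBlockBlob other = container.GetBlockBlobReference("other.csv");
        string content = await other.DownloadTextAsync();

        log.LogInformation($"Second blob has {content.Length} characters");
    }
}
```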
For app settings you can use System.Environment.GetEnvironmentVariable("APIKEY"), assuming APIKEY is your custom setting. Remember, local.settings.json is only used locally; you will need to set these values in Azure, either via the Azure portal or via your CI/CD pipeline with an ARM template.
You can also use Azure Functions dependency injection and inject configuration. Check the section "Working with options and settings" at https://learn.microsoft.com/en-us/azure/azure-functions/functions-dotnet-dependency-injection#working-with-options-and-settings
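As a sketch of the options pattern from that section (the ApiOptions class and the APIKEY setting name are assumptions, not part of the question):

```csharp
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(MyApp.Startup))]

namespace MyApp
{
    // Hypothetical strongly-typed settings class.
    public class ApiOptions
    {
        public string ApiKey { get; set; }
    }

    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            builder.Services.AddOptions<ApiOptions>()
                .Configure<IConfiguration>((settings, configuration) =>
                {
                    // Reads the APIKEY app setting (local.settings.json
                    // locally, application settings on Azure).
                    settings.ApiKey = configuration["APIKEY"];
                });
        }
    }
}
```

A function class can then take IOptions&lt;ApiOptions&gt; via constructor injection instead of reading environment variables directly.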
We are a team trying to upgrade our Windows application to use Azure DevOps Services' new REST-based .NET client libraries instead of the Client OM that uses SOAP.
The part of the application that we are upgrading does the following:
Checks out all AssemblyInfoVersion.cs files.
Updates the version on those files.
Checks in all the files.
Creates a label with information that the version was updated.
We managed to do the first three steps with the new REST based .NET Client Libraries using the CreateChangesetAsync method.
But we cannot find any information about how to create a label, so we have not been able to do the last step. Is this really not supported?
Currently you can't create a new label with the new Azure DevOps REST API; you can only get labels.
As a workaround, you can use tf.exe with the label command to label the files.
In your code, add something like this (using System.Diagnostics):
string tfExePath = @"path/to/exe";
string tfArgs = "label test /version:45 $test/src";
Process.Start(tfExePath, tfArgs);
I have a method called ReadFileData(string blobStorageFilePath) in my .NET Web API project. This method reads the text content of an Azure Blob file whose path is passed via the parameter. Until now a client (web) application called this method to read file data, but now I have to automate this process.
So, is it possible to call this Web API method automatically whenever a new file is added to Azure Blob storage? That way there would be no need for a client application.
Which approach should I use to implement this process?
Any working example will be appreciated.
You can add a WebJob to your Azure App Service and install the Azure WebJobs SDK. Then you can easily trigger your read with a simple declarative blob trigger:
https://learn.microsoft.com/en-us/azure/app-service-web/websites-dotnet-webjobs-sdk-storage-blobs-how-to
public static void CopyBlob([BlobTrigger("input/{name}")] TextReader input,
                            [Blob("output/{name}")] out string output)
{
    // Copy the triggering blob's text into the output blob.
    output = input.ReadToEnd();
}
You can also create an Azure Function triggered by Azure Blob storage. See this link for the complete example:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-storage-blob-triggered-function
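If you want to keep ReadFileData inside the Web API, a blob-triggered function can simply call it over HTTP whenever a new file lands. A rough sketch (the container name and API URL are placeholders, not real endpoints):

```csharp
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class OnBlobAdded
{
    private static readonly HttpClient client = new HttpClient();

    [FunctionName("OnBlobAdded")]
    public static async Task Run(
        [BlobTrigger("mycontainer/{name}")] Stream blob,
        string name,
        ILogger log)
    {
        // Forward the new blob's path to the existing Web API endpoint.
        var response = await client.GetAsync(
            $"https://myapi.example.com/api/readfiledata?path=mycontainer/{name}");

        log.LogInformation($"API returned {response.StatusCode} for blob {name}");
    }
}
```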
I have an ASP.NET application. Inside it I have a folder called WebServices where I keep all the .asmx files.
I am referencing these .asmx files inside the ASP.NET .cs files. Instead of giving the full URL to the WebService.Url property, how can I set the path like this:
ds.Url = this.ResolveUrl("~/WebServices/xxx.asmx");
Is HttpServerUtility.MapPath what you're looking for?
ds.Url = Server.MapPath("~/WebServices/xxx.asmx");
You can get hold of it either via the Server property of the Page class, or via HttpContext.Current.Server.
Even better, I'd store this URL in an application configuration file.
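For example, with a hypothetical appSettings key (the key name XxxServiceUrl is an assumption):

```csharp
// web.config:
//   <appSettings>
//     <add key="XxxServiceUrl" value="~/WebServices/xxx.asmx" />
//   </appSettings>
string serviceUrl = System.Configuration.ConfigurationManager.AppSettings["XxxServiceUrl"];
ds.Url = this.ResolveUrl(serviceUrl);
```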
Your question suggests that you have your web services in the same project as the consuming application. This will not work. Move all your web services into a separate project.
If your services and .cs files are in the same project, then you do not need to set the URL at all. These services can be called just as you would call other classes in your application.