I am trying to create a scheduled mail with Azure Functions in Visual Studio, but this time I got a "No job functions found" error. How can I fix this and send these mails?
Also, I made this function on the Azure Portal and it works; I'm trying to convert it to Visual Studio.
using System;
using SendGrid.Helpers.Mail;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Azure.WebJobs;

namespace FifteenMinMail
{
    public class Program
    {
        public static void Main()
        {
            JobHostConfiguration config = new JobHostConfiguration();
            config.UseTimers();
            JobHost host = new JobHost(config);
            host.RunAndBlock();
        }
    }

    public class ScheduledMail
    {
        public static Mail Run(TimerInfo myTimer, TraceWriter log)
        {
            JobHostConfiguration config = new JobHostConfiguration();
            config.UseTimers();
            JobHost host = new JobHost(config);
            var today = DateTime.Today.ToShortDateString();
            log.Info($"Generating daily report for {today} at {DateTime.Now}");
            Mail message = new Mail()
            {
                Subject = "15 DK'LIK TEST MAILI"
            };
            Content content = new Content
            {
                Type = "text/plain",
                Value = "Bu mail 15 dk da bir yinelenecektir."
            };
            message.AddContent(content);
            return message;
        }
    }
}
Error: https://imgur.com/pyyctnC
You are mixing up WebJobs and Azure Functions. Those are different but related technologies.
Have a look at this if you want to create an Azure Function in Visual Studio:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-your-first-function-visual-studio
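In a Visual Studio Functions project there is no Program/JobHost at all: the runtime discovers methods marked with [FunctionName], and the schedule comes from a TimerTrigger attribute. Here is a minimal sketch of your portal function as a timer-triggered function with a SendGrid output binding; it assumes the Microsoft.Azure.WebJobs.Extensions.SendGrid package, an app setting named "SendGridApiKey", and placeholder sender/recipient addresses.

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using SendGrid.Helpers.Mail;

public static class ScheduledMail
{
    [FunctionName("ScheduledMail")]
    [return: SendGrid(ApiKey = "SendGridApiKey")] // app setting holding your SendGrid key
    public static SendGridMessage Run(
        [TimerTrigger("0 */15 * * * *")] TimerInfo myTimer, // every 15 minutes
        ILogger log)
    {
        log.LogInformation($"Generating mail at {DateTime.Now}");

        var message = new SendGridMessage();
        message.SetFrom(new EmailAddress("sender@example.com"));  // placeholder
        message.AddTo(new EmailAddress("recipient@example.com")); // placeholder
        message.SetSubject("15 DK'LIK TEST MAILI");
        message.AddContent("text/plain", "Bu mail 15 dk da bir yinelenecektir.");
        return message;
    }
}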
I have a C# sample app that can send the trace to the console. How can I send the trace to the Jaeger instance running locally?
I am running Jaeger locally with jaeger-all-in-one --collector.zipkin.host-port=:9411
using System.Diagnostics;
using OpenTelemetry;
using OpenTelemetry.Trace;
using OpenTelemetry.Resources;
using OpenTelemetry.Exporter;
// Define some important constants and the activity source
var serviceName = "MyCompany.MyProduct.MyService";
var serviceVersion = "1.0.0";
// Configure important OpenTelemetry settings and the console exporter
using var tracerProvider = Sdk.CreateTracerProviderBuilder()
    .AddSource(serviceName)
    .SetResourceBuilder(
        ResourceBuilder.CreateDefault()
            .AddService(serviceName: serviceName, serviceVersion: serviceVersion))
    .AddConsoleExporter()
    .Build();
var MyActivitySource = new ActivitySource(serviceName);
using var activity = MyActivitySource.StartActivity("SayHello");
activity?.SetTag("foo", 1);
activity?.SetTag("bar", "Hello, World!");
activity?.SetTag("baz", new int[] { 1, 2, 3 });
I am trying to generate PDFs in an ASP.NET Core 3.0 application. I have added DinkToPdf via NuGet and added the three native library files (shown in the image above) into a DinkToPdf folder. I am trying to load those DLLs using a CustomAssemblyLoadContext.
I am able to generate the PDF locally. When I published to a folder, I didn't see libwkhtmltox.dll in the publish output, so I added those three files manually and hosted the site in IIS.
I am facing an issue generating PDFs when I host it in IIS.
I am using the code below:
using System;
using System.IO;
using System.Reflection;
using System.Runtime.Loader;
using DinkToPdf;
using DinkToPdf.Contracts;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Load the native wkhtmltox library before registering the converter.
        var data = Path.Combine(Directory.GetCurrentDirectory(), "DinkToPdf", "libwkhtmltox.dll");
        CustomAssemblyLoadContext context = new CustomAssemblyLoadContext();
        context.LoadUnmanagedLibrary(data);
        services.AddSingleton(typeof(IConverter), new SynchronizedConverter(new PdfTools()));
    }
}

public class CustomAssemblyLoadContext : AssemblyLoadContext
{
    public IntPtr LoadUnmanagedLibrary(string absolutePath)
    {
        return LoadUnmanagedDll(absolutePath);
    }

    protected override IntPtr LoadUnmanagedDll(string unmanagedDllName)
    {
        return LoadUnmanagedDllFromPath(unmanagedDllName);
    }

    protected override Assembly Load(AssemblyName assemblyName)
    {
        throw new NotImplementedException();
    }
}
I am using the code below to generate the PDF:
public void GenPdf()
{
    var globalSettings = new GlobalSettings
    {
        ColorMode = ColorMode.Color,
        Orientation = Orientation.Portrait,
        PaperSize = PaperKind.A4,
    };
    var objectSettings = new ObjectSettings
    {
        HtmlContent = //html content
    };
    var pdf = new HtmlToPdfDocument()
    {
        GlobalSettings = globalSettings,
        Objects = { objectSettings }
    };
    var file = _converter.Convert(pdf);
}
Please help me with this issue.
services.AddControllersWithViews();

// Pick the native library that matches the process bitness (32 vs 64 bit).
var architectureFolder = (IntPtr.Size == 8) ? "64 bit" : "32 bit";
var wkHtmlToPdfPath = Path.Combine(_hostingEnvironment.ContentRootPath, $"wkhtmltox\\v0.12.4\\{architectureFolder}\\libwkhtmltox");
CustomAssemblyLoadContext context = new CustomAssemblyLoadContext();
context.LoadUnmanagedLibrary(wkHtmlToPdfPath);

We have to add the above code to load the assembly; it works for me in IIS and Kubernetes.
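For context, here is a sketch of where that snippet sits in Startup, assuming ASP.NET Core 3.x where IWebHostEnvironment is constructor-injected (the wkhtmltox\v0.12.4 folder layout is taken from the snippet above):

using System;
using System.IO;
using DinkToPdf;
using DinkToPdf.Contracts;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    private readonly IWebHostEnvironment _hostingEnvironment;

    public Startup(IWebHostEnvironment hostingEnvironment)
    {
        _hostingEnvironment = hostingEnvironment;
    }

    public void ConfigureServices(IServiceCollection services)
    {
        services.AddControllersWithViews();

        // Pick the native wkhtmltox build that matches the process bitness.
        var architectureFolder = (IntPtr.Size == 8) ? "64 bit" : "32 bit";
        var wkHtmlToPdfPath = Path.Combine(_hostingEnvironment.ContentRootPath,
            $"wkhtmltox\\v0.12.4\\{architectureFolder}\\libwkhtmltox");

        // CustomAssemblyLoadContext is the class from the question.
        var context = new CustomAssemblyLoadContext();
        context.LoadUnmanagedLibrary(wkHtmlToPdfPath);

        services.AddSingleton(typeof(IConverter), new SynchronizedConverter(new PdfTools()));
    }
}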
https://www.nuget.org/packages/Haukcode.WkHtmlToPdfDotNet/ worked for me when upgrading from netcoreapp2.1 to netcoreapp3.1.
Replace your references to DinkToPdf with:
using WkHtmlToPdfDotNet;
using WkHtmlToPdfDotNet.Contracts;
I used Haukcode.WkHtmlToPdfDotNet version 1.4.0 and my libwkhtmltopdf was 0.12.6.
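With that package the registration stays the same as DinkToPdf's. A minimal sketch, assuming Haukcode.WkHtmlToPdfDotNet bundles the native wkhtmltox binaries (so the manual CustomAssemblyLoadContext loading should no longer be needed):

using Microsoft.Extensions.DependencyInjection;
using WkHtmlToPdfDotNet;
using WkHtmlToPdfDotNet.Contracts;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Same registration as with DinkToPdf; the fork keeps the same API surface.
        services.AddSingleton(typeof(IConverter), new SynchronizedConverter(new PdfTools()));
        services.AddControllersWithViews();
    }
}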
I recently downloaded NLog.dll from the internet and added it to the references of my C# project. But even ready-made code doesn't work in my simple console application. To start, I wrote this into my simple console application.
As you can see, some methods are undefined, even though the NLog classes appear in the references panel. How can I configure NLog from code?
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using NLog;
using NLog.Config;

namespace ConsoleApplication1
{
    class Program
    {
        private static NLog.Logger logger = NLog.LogManager.GetCurrentClassLogger();

        static void Main(string[] args)
        {
            var config = new NLog.Config.LoggingConfiguration();
            var logfile = new NLog.Targets.FileTarget("logfile") { FileName = "file.txt" };
            var logconsole = new NLog.Targets.ConsoleTarget("logconsole");
            config.AddRule(LogLevel.Info, LogLevel.Fatal, logconsole);
            config.AddRule(LogLevel.Debug, LogLevel.Fatal, logfile);
            NLog.LogManager.Configuration = config;
            var logger = NLog.LogManager.GetCurrentClassLogger();
            logger.Info("Hello World");
        }
    }
}
I get this error (in Russian):
The code is correct for the latest version of NLog on NuGet. The .dll you downloaded seems to be an older version of NLog.
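If you want to confirm which NLog build the process actually loads, here is a quick sketch (installing the NLog package from NuGet in place of the downloaded .dll is the usual fix):

using System;

class NLogVersionCheck
{
    static void Main()
    {
        // Print the version and on-disk location of the NLog assembly in use;
        // an unexpected path or an old version points at the stale downloaded .dll.
        var nlogAssembly = typeof(NLog.LogManager).Assembly;
        Console.WriteLine(nlogAssembly.GetName().Version);
        Console.WriteLine(nlogAssembly.Location);
    }
}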
I'm trying to submit a MapReduce job to an HDInsight cluster. In my job I didn't write a reduce portion because I don't want to reduce anything. All I want to do is parse each filename and append the values to every line in the file, so that I will have all the data needed inside the file.
My code is:
using Microsoft.Hadoop.MapReduce;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace GetMetaDataFromFileName
{
    class Program
    {
        static void Main(string[] args)
        {
            var hadoop = connectAzure();

            // Temp workaround for the environment variables
            Environment.SetEnvironmentVariable("HADOOP_HOME", @"c:\hadoop");
            Environment.SetEnvironmentVariable("Java_HOME", @"c:\hadoop\jvm");

            var result = hadoop.MapReduceJob.ExecuteJob<MetaDataGetterJob>();
        }

        static IHadoop connectAzure()
        {
            // TODO: Update credentials and other information
            return Hadoop.Connect(
                new Uri("https://sampleclustername.azurehdinsight.net//"),
                "admin",
                "Hadoop",
                "password",
                "blobstoragename.blob.core.windows.net", // storage account where the log files live
                "AccessKeySample",                       // storage account access key
                "logs",                                  // container name
                true
            );
        }

        // Hadoop mapper
        public class MetaDataGetter : MapperBase
        {
            public override void Map(string inputLine, MapperContext context)
            {
                try
                {
                    // Get the metadata from the name of the file
                    string[] _fileMetaData = context.InputFilename.Split('_');
                    string _PublicIP = _fileMetaData[0].Trim();
                    string _PhysicalAdapterMAC = _fileMetaData[1].Trim();
                    string _BootID = _fileMetaData[2].Trim();
                    string _ServerUploadTime = _fileMetaData[3].Trim();
                    string _LogType = _fileMetaData[4].Trim();
                    string _MachineUpTime = _fileMetaData[5].Trim();

                    // Generate the CSV portion and append it to every row in the file
                    string _RowHeader = string.Format("{0},{1},{2},{3},{4},{5},", _PublicIP, _PhysicalAdapterMAC, _BootID, _ServerUploadTime, _LogType, _MachineUpTime);
                    context.EmitLine(_RowHeader + inputLine);
                }
                catch (ArgumentException)
                {
                    // Skip files whose names don't match the expected pattern
                    return;
                }
            }
        }

        // Hadoop job definition
        public class MetaDataGetterJob : HadoopJob<MetaDataGetter>
        {
            public override HadoopJobConfiguration Configure(ExecutorContext context)
            {
                // Initiate the job config
                HadoopJobConfiguration config = new HadoopJobConfiguration();
                config.InputPath = "asv://logs@sample.blob.core.windows.net/Input";
                config.OutputFolder = "asv://logs@sample.blob.core.windows.net/Output";
                config.DeleteOutputFolder = true;
                return config;
            }
        }
    }
}
What do you think is usually the reason for a 500 (Server Error)? Am I supplying the wrong credentials? Actually, I didn't really understand the difference between the Username and HadoopUser parameters in the Hadoop.Connect method.
Thank you.
I had approximately the same issue in the past (I was unable to submit a Hive job to the cluster; it failed with a BadGateway response). I contacted the support team, and in my case the problem was a memory leak at the head node, which means the problem was not on the client's side and seems to be an inherent Hadoop problem.
I solved it by redeploying the cluster.
Have you tried to submit other (simple) jobs? If so, then I suggest contacting the Azure support team, or just redeploying the cluster if that's not too painful for you.
I want to programmatically connect to TFS and be able to check out and check in files. For that purpose, I am using the following code (some private information omitted); however, I get a "not having sufficient permissions" error. I have checked with the administrator, and he has given me both read and write permissions. Can anyone please help me? Here's the code:
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

namespace CodeGeneration
{
    public class CheckInTFS
    {
        public static void ProcessFile()
        {
            var tfs = new TfsTeamProjectCollection(new Uri("http://tfs"));
            var versionControlServer = tfs.GetService<VersionControlServer>();
            var workspace = versionControlServer.GetWorkspace(@"D:\Test");

            #region Checkout File
            var file = @"D:\EnumGeneration.cs";
            workspace.PendEdit(file);
            var pendingChange = workspace.GetPendingChanges();
            #endregion

            #region Checkin File
            workspace.CheckIn(pendingChange, "Test Comment!");
            #endregion
        }
    }
}
The error which I receive is this:
Also, I have looked at the permissions on this MS page, and I have the GENERIC_READ and GENERIC_WRITE permissions.
I found this sample; try it and let me know if you still have permission problems when adapting and running it:
using System;
using System.Collections.ObjectModel;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.Framework.Common;
using Microsoft.TeamFoundation.Framework.Client;

namespace TfsApplication
{
    class Program
    {
        static void Main(String[] args)
        {
            // Connect to Team Foundation Server.
            //   Server is the name of the server that is running the application tier for Team Foundation.
            //   Port is the port that Team Foundation uses. The default port is 8080.
            //   VDir is the virtual path to the Team Foundation application. The default path is tfs.
            Uri tfsUri = (args.Length < 1) ?
                new Uri("http://Server:Port/VDir") : new Uri(args[0]);
            TfsConfigurationServer configurationServer =
                TfsConfigurationServerFactory.GetConfigurationServer(tfsUri);

            // Get the catalog of team project collections
            ReadOnlyCollection<CatalogNode> collectionNodes = configurationServer.CatalogNode.QueryChildren(
                new[] { CatalogResourceTypes.ProjectCollection },
                false, CatalogQueryOptions.None);

            // List the team project collections
            foreach (CatalogNode collectionNode in collectionNodes)
            {
                // Use the InstanceId property to get the team project collection
                Guid collectionId = new Guid(collectionNode.Resource.Properties["InstanceId"]);
                TfsTeamProjectCollection teamProjectCollection = configurationServer.GetTeamProjectCollection(collectionId);

                // Print the name of the team project collection
                Console.WriteLine("Collection: " + teamProjectCollection.Name);

                // Get a catalog of team projects for the collection
                ReadOnlyCollection<CatalogNode> projectNodes = collectionNode.QueryChildren(
                    new[] { CatalogResourceTypes.TeamProject },
                    false, CatalogQueryOptions.None);

                // List the team projects in the collection
                foreach (CatalogNode projectNode in projectNodes)
                {
                    Console.WriteLine(" Team Project: " + projectNode.Resource.DisplayName);
                }
            }
        }
    }
}
Taken from here: http://msdn.microsoft.com/en-us/library/bb286958.aspx
The following code snippet did the trick:
var tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri(TFS_SERVER));
var workspaceInfo = Workstation.Current.GetLocalWorkspaceInfo(FULL_FILE_PATH);
var workspace = workspaceInfo.GetWorkspace(tfs);
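For completeness, here is a sketch of how the check-out/check-in from the question can hang off the workspace resolved this way; TFS_SERVER and FULL_FILE_PATH are placeholders, as in the snippet above.

using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

public class CheckInTFS
{
    // Placeholders: substitute your collection URL and a file under a mapped workspace.
    const string TFS_SERVER = "http://tfs:8080/tfs/DefaultCollection";
    const string FULL_FILE_PATH = @"D:\Test\EnumGeneration.cs";

    public static void ProcessFile()
    {
        // Resolve the workspace from the local path instead of asking the server for it.
        var tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri(TFS_SERVER));
        var workspaceInfo = Workstation.Current.GetLocalWorkspaceInfo(FULL_FILE_PATH);
        var workspace = workspaceInfo.GetWorkspace(tfs);

        // Check out, then check in, as in the original code.
        workspace.PendEdit(FULL_FILE_PATH);
        var pendingChanges = workspace.GetPendingChanges();
        workspace.CheckIn(pendingChanges, "Test Comment!");
    }
}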