I'm trying to determine, using the Windows Update Agent API, whether the current machine is managed by WSUS. Currently, my code looks at all the services registered with the Windows Update Agent.
private static bool IsManagedByWSUS(WUApiNativeMethods.IUpdateSession3 session)
{
    WUApiNativeMethods.IUpdateServiceManager serviceManager = session.CreateUpdateServiceManager();
    WUApiNativeMethods.IUpdateServiceCollection services = serviceManager.Services;

    foreach (WUApiNativeMethods.IUpdateService service in services)
    {
        // Indicates whether the service is registered with automatic updates
        var registeredWithAu = service.IsRegisteredWithAU();

        // Indicates whether the service is a managed service
        var isManaged = service.IsManaged();

        var name = service.Name().ToLower();

        if (registeredWithAu &&
            isManaged &&
            name.Contains("windows server update service"))
        {
            return true;
        }
    }

    return false;
}
The problem is that I don't know whether checking the name is reliable. I see that there is a service ID field on the IUpdateService object, which is a GUID. I tested a couple of boxes, and it seems to always be 3da21691-e39d-4da6-8a4b-b43877bcb1b7.
How can I reliably check for WSUS?
You can check the registry key HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate.
Shell command:
reg query "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"
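If you want the same check from C# rather than the shell, a minimal sketch reading that key could look like this. It assumes the standard WSUS group-policy values, WUServer (the assigned WSUS server URL) and AU\UseWUServer (whether automatic updates actually use it):

```csharp
using Microsoft.Win32;

static bool IsManagedByWsusPolicy()
{
    // WSUS settings pushed by group policy live under this key.
    using (RegistryKey policyKey = Registry.LocalMachine.OpenSubKey(
        @"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"))
    {
        if (policyKey == null)
            return false; // No WSUS policy configured at all.

        // WUServer holds the WSUS server URL when one is assigned.
        var wsusServer = policyKey.GetValue("WUServer") as string;

        // AU\UseWUServer == 1 means automatic updates use that server.
        using (RegistryKey auKey = policyKey.OpenSubKey("AU"))
        {
            var useWuServer = auKey?.GetValue("UseWUServer") as int?;
            return !string.IsNullOrEmpty(wsusServer) && useWuServer == 1;
        }
    }
}
```

This mirrors what the reg query shows: if the policy key or the values are absent, the machine is not under WSUS policy.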
Related
In short:
I have a C# console app which I use to test and write data from SQL to SharePoint lists.
Everything works fine while it is run as a console app: I get the connection to SQL, the context is created, then I connect to the SharePoint site and proceed with updating certain fields.
Now, when I deploy the working solution as a timer job (a .wsp) for SharePoint, update it to the server farm, deploy it as a feature to the site, and run it as a timer job, it does work, but only, so to speak, "once".
When I run that timer job, it receives the SQL context, connects, and updates the SharePoint lists. But when I change data in a SQL table (e.g. a field called "price" from 10.99 to 11.99) and run the timer job again, it still updates only the "old" data, that is, the 10.99 value.
When doing this with the console app .exe on the server, no matter how many DB changes I perform, it always writes the newest data; but as a timer job it seems to "hang" onto the previous context connection and updates the previous data only.
Do I need to specify, in the code, that the context should be dropped after the timer job has ended its run, so that a fresh context is created on the next run?
Here is the initial code in the .wsp:
class TimerJobPromoToolsDefinition : SPJobDefinition
{
    public TimerJobPromoToolsDefinition() : base()
    {
    }

    public TimerJobPromoToolsDefinition(string jobName, SPService service) : base(jobName, service, null, SPJobLockType.None)
    {
        this.Title = "PromoTools_Timer_Job";
    }

    public TimerJobPromoToolsDefinition(string jobName, SPWebApplication webapp) : base(jobName, webapp, null, SPJobLockType.ContentDatabase)
    {
        this.Title = "PromoTools_Timer_Job";
    }

    public override void Execute(Guid targetInstanceId)
    {
        System.Diagnostics.Trace.Assert(false);
        Main();
    }

    private static TimerJobConnection _context;

    public static void Main()
    {
        PrintInitializing();
        _context = new TimerJobConnection();
        string siteURL = "http://somesite.com/";
        try
        {
            using (SPSite site = new SPSite(siteURL))
            {
                using (SPWeb web = site.OpenWeb())
                {
                    var catalogs = GetArtikliFromPromoTools(web);
                    var articlesFiltered = GetArtikliFromDB(catalogs);
                    PrintFinishedLoadingArticles();
                    GetSharePointCatalogHeaders(web);
                    UpdateArticles(catalogs, articlesFiltered, web);
                    PrintEndOfOperation();
                    Console.WriteLine("Press any key to continue...");
                }
            }
        }
        catch (Exception)
        {
            PrintErrorSharepointConnection();
            PrintPressKeyToExit();
        }
        finally
        {
        }
    }
I think the context should not be the issue, based on my experience.
Since you swallow the exception in your catch block, check whether any exception is actually being thrown. You could also try debugging the code by attaching to the OWSTIMER.EXE process.
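If stale state does turn out to be the problem, the usual fix is to stop holding the connection in a static field and create (and dispose) a fresh one on every run. A minimal sketch, assuming the poster's TimerJobConnection implements IDisposable and that Main is refactored into a hypothetical RunJob helper that takes the context as a parameter:

```csharp
public override void Execute(Guid targetInstanceId)
{
    // Create a fresh context for every run instead of reusing a static
    // field, and dispose it when the run finishes.
    using (var context = new TimerJobConnection())
    {
        RunJob(context); // hypothetical refactoring of Main() taking the context
    }
}
```

With this shape, each timer job run starts from a clean connection, so it cannot keep reading through state left over from the previous run.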
I have a requirement where we need a plugin to retrieve a session id from an external system and cache it for a certain time. I use a field on the entity to test whether the session is actually being cached. When I refresh the CRM form a couple of times, the output shows that there are (consistently, at any time) four versions of the same key. I have tried clearing the cache and testing again, but I get the same results.
Any help appreciated, thanks in advance.
Output on each refresh of the page:
20170511_125342:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125358:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125410:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125342:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125437:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125358:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125358:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125437:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
To accomplish this, I have implemented the following code:
public class SessionPlugin : IPlugin
{
    public static readonly ObjectCache Cache = MemoryCache.Default;
    private static readonly string _sessionField = "new_sessionid";

    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        try
        {
            if (context.MessageName.ToLower() != "retrieve" && context.Stage != 40)
                return;

            var userId = context.InitiatingUserId.ToString();

            // Use the userid as key for the cache
            var sessionId = CacheSessionId(userId, GetSessionId(userId));
            sessionId = $"{sessionId}:{Cache.Count(kvp => kvp.Key == userId)}:{userId}";

            // Assign session id to entity
            var entity = (Entity)context.OutputParameters["BusinessEntity"];
            if (entity.Contains(_sessionField))
                entity[_sessionField] = sessionId;
            else
                entity.Attributes.Add(new KeyValuePair<string, object>(_sessionField, sessionId));
        }
        catch (Exception e)
        {
            throw new InvalidPluginExecutionException(e.Message);
        }
    }

    private string CacheSessionId(string key, string sessionId)
    {
        // If the value is already in the cache, return it
        if (Cache.Contains(key))
            return Cache.Get(key).ToString();

        var cacheItemPolicy = new CacheItemPolicy()
        {
            AbsoluteExpiration = ObjectCache.InfiniteAbsoluteExpiration,
            Priority = CacheItemPriority.Default
        };
        Cache.Add(key, sessionId, cacheItemPolicy);
        return sessionId;
    }

    private string GetSessionId(string user)
    {
        // This will be replaced with the actual call to the external service for the session id
        return DateTime.Now.ToString("yyyyMMdd_hhmmss");
    }
}
This has been greatly explained by Daryl here: https://stackoverflow.com/a/35643860/7708157
Basically, you do not have one MemoryCache instance for the whole CRM system. Your output simply proves that there are multiple app domains for every plugin, so even static variables stored in such a plugin can have multiple values, which you cannot rely on. There is no documentation on MSDN that explains how the sandboxing works (especially app domains in this case), but using static variables is certainly not a good idea. And of course, if you are dealing with CRM Online, you cannot be sure whether there is a single front-end server or many of them (which will also result in such behaviour).
Class-level variables should be limited to configuration information. Using a class-level variable as you are doing is not supported. In CRM Online, because of multiple web front ends, a specific request may be executed on a different server, by a different instance of the plugin class, than another request. Overall, assume CRM is stateless: unless data is persisted and retrieved, nothing should be assumed to be continuous between plugin executions.
Per the SDK:
The plug-in's Execute method should be written to be stateless because
the constructor is not called for every invocation of the plug-in.
Also, multiple system threads could execute the plug-in at the same
time. All per invocation state information is stored in the context,
so you should not use global variables or attempt to store any data in
member variables for use during the next plug-in invocation unless
that data was obtained from the configuration parameter provided to
the constructor.
Reference: https://msdn.microsoft.com/en-us/library/gg328263.aspx
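Put differently, a plugin that follows this guidance fetches whatever it needs inside Execute instead of caching it in a static field. A minimal sketch (GetSessionId stands in for the poster's external call, and new_sessionid is the field from the question):

```csharp
public class StatelessSessionPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));

        // All per-invocation state comes from the context; nothing is kept
        // in static or member fields between invocations.
        var sessionId = GetSessionId(context.InitiatingUserId.ToString());

        // The Entity indexer adds the attribute if missing, or updates it.
        var entity = (Entity)context.OutputParameters["BusinessEntity"];
        entity["new_sessionid"] = sessionId;
    }

    private string GetSessionId(string userId)
    {
        // Placeholder for the external service call.
        return DateTime.Now.ToString("yyyyMMdd_hhmmss");
    }
}
```

If the external call is expensive enough to need caching across requests, the cache has to live outside the plugin (e.g. in a CRM record or an external store), not in process memory.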
I am developing a Windows Service that submits some data to a web API. As part of this I need to submit a GUID that I generate with Guid.NewGuid().
This GUID should be unique per machine, never change, and be the same for all users who log in. I'm struggling with where to actually store it, though. I came across Properties.Settings, which seemed perfect, but if I scope the setting to Application instead of User, it won't let me set the property, because it is read-only.
How and where do I store the GUID? It will only be generated once (when the service starts on a PC for the first time).
In your case, you can use the ConfigurationManager class to read and write the GUID in your application settings. From the example in the linked documentation:
static void AddUpdateAppSettings(string key, string value)
{
    try
    {
        var configFile = ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.None);
        var settings = configFile.AppSettings.Settings;

        if (settings[key] == null)
        {
            settings.Add(key, value);
        }
        else
        {
            settings[key].Value = value;
        }

        configFile.Save(ConfigurationSaveMode.Modified);
        ConfigurationManager.RefreshSection(configFile.AppSettings.SectionInformation.Name);
    }
    catch (ConfigurationErrorsException)
    {
        Console.WriteLine("Error writing app settings");
    }
}
You can use it like this:
AddUpdateAppSettings("MachineGuid", Guid.NewGuid().ToString());
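Since the GUID should only be generated once, the caller would check for an existing value before writing. A sketch along those lines (assumes `using System.Configuration;` and reuses the AddUpdateAppSettings helper above; the "MachineGuid" key name is just an example):

```csharp
static string GetOrCreateMachineGuid()
{
    // Reuse the stored value if one has already been generated.
    var existing = ConfigurationManager.AppSettings["MachineGuid"];
    if (!string.IsNullOrEmpty(existing))
        return existing;

    // First run on this machine: generate a new GUID and persist it.
    var newGuid = Guid.NewGuid().ToString();
    AddUpdateAppSettings("MachineGuid", newGuid);
    return newGuid;
}
```

Call this at service start-up; every later start (and every user) will then read back the same value instead of generating a new one.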
I want to create a timer job or workflow that runs once a month, exports SharePoint list data to Excel, and stores the file in a document library.
I have downloaded the code to create a timer job from the link below, but I don't know how to incorporate the above requirement:
http://code.msdn.microsoft.com/SharePoint-2010-Custom-416cd3a1
//Create a class derived from the SPJobDefinition class
class ListTimerJob : SPJobDefinition
{
    public ListTimerJob()
        : base()
    {
    }

    public ListTimerJob(string jobName, SPService service, SPServer server, SPJobLockType targetType)
        : base(jobName, service, server, targetType)
    {
    }

    public ListTimerJob(string jobName, SPWebApplication webApplication)
        : base(jobName, webApplication, null, SPJobLockType.ContentDatabase)
    {
        this.Title = "List Timer Job";
    }

    public override void Execute(Guid contentDbId)
    {
        // get a reference to the current site collection's content database
        SPWebApplication webApplication = this.Parent as SPWebApplication;
        SPContentDatabase contentDb = webApplication.ContentDatabases[contentDbId];

        // get a reference to the "ListTimerJob" list in the RootWeb of the first site collection in the content database
        SPList Listjob = contentDb.Sites[0].RootWeb.Lists["ListTimerJob"];

        // create a new list item, set the Title to the current day/time, and update the item
        SPListItem newList = Listjob.Items.Add();
        newList["Title"] = DateTime.Now.ToString();
        newList.Update();
    }
}

//Add an event receiver at the Feature level
[Guid("9a724fdb-e423-4232-9626-0cffc53fb74b")]
public class Feature1EventReceiver : SPFeatureReceiver
{
    const string List_JOB_NAME = "ListLogger";

    // handles the event raised after the feature has been activated
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        SPSite site = properties.Feature.Parent as SPSite;

        // make sure the job isn't already registered
        foreach (SPJobDefinition job in site.WebApplication.JobDefinitions)
        {
            if (job.Name == List_JOB_NAME)
                job.Delete();
        }

        // install the job
        ListTimerJob listLoggerJob = new ListTimerJob(List_JOB_NAME, site.WebApplication);
        SPMinuteSchedule schedule = new SPMinuteSchedule();
        schedule.BeginSecond = 0;
        schedule.EndSecond = 59;
        schedule.Interval = 5;
        listLoggerJob.Schedule = schedule;
        listLoggerJob.Update();
    }

    // handles the event raised before the feature is deactivated
    public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
    {
        SPSite site = properties.Feature.Parent as SPSite;

        // delete the job
        foreach (SPJobDefinition job in site.WebApplication.JobDefinitions)
        {
            if (job.Name == List_JOB_NAME)
                job.Delete();
        }
    }
}
I would also advise you not to use the SharePoint timer job engine.
It's definitely not stable.
Sometimes jobs simply don't trigger, and they are difficult and slow to instantiate.
Of course you could always spend time tweaking SharePoint to achieve stability, but there is no guarantee. I know it sounds categorical, but trust me: I can't remember all the problems we had with this engine, but we lost a lot of time on it.
I recommend Quartz.NET or the Windows Scheduler, as mentioned earlier.
These are well-proven solutions, used by many people, including with SharePoint.
We implemented Quartz.Net for SharePoint at my company, all our Timer Jobs run on this engine.
We had no glitch for two years.
Best Regards.
You should change the SPMinuteSchedule to SPMonthlyByDaySchedule, see http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.spschedule.aspx.
But my recommendation is to use the Windows Server scheduler with a console application. Easy to change, easy to maintain (no iisreset!), and easy to log everything. We use console applications for various scheduled jobs, running at intervals from one hour to one day.
I have code that I want to run in Global.asax in ASP.NET, but only on localhost. When the application runs on an EC2 instance, I want it to run different code instead (the second code path should run when I deploy my project to an Amazon Web Services EC2 server). How can I do that without using the DEBUG conditional?
To check whether the request comes from the local machine, do this:
bool isLocal = HttpContext.Current.Request.IsLocal;
if (isLocal)
{
    // Do things that should only be done when the request is local here
}
Note: Read the HttpRequest.IsLocal documentation for more information.
I guess you can pick a few environment variables from these:
Environment Variables (type ENV)
EC2_AMI_ID
EC2_AKI_ID
EC2_ARI_ID
EC2_AMI_MANIFEST_PATH
EC2_PLACEMENT_AVAILABILITY_ZONE
EC2_HOSTNAME
EC2_INSTANCE_ID
EC2_INSTANCE_TYPE
EC2_LOCAL_HOSTNAME
EC2_PUBLIC_HOSTNAME
EC2_PUBLIC_IPV4
EC2_RESERVATION_ID
EC2_SECURITY_GROUPS
RS_EIP
RS_SERVER
RS_SKETCHY
RS_SYSLOG
RS_TOKEN
RS_SERVER_NAME
List taken from here: https://support.rightscale.com/09-Clouds/AWS/FAQs/FAQ_0013_-_What_EC2_environment_variables_are_available_in_RightScripts%3F
I think that checking for the availability of EC2_AMI_ID and EC2_INSTANCE_ID would be enough to answer your question.
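Assuming those variables are actually present in your environment (they come from the RightScale list above, so a stock EC2 instance may not set them — verify this first), the check itself is short:

```csharp
using System;

static bool IsRunningOnEc2()
{
    // These variables come from the RightScale-provided list above;
    // a plain EC2 instance without RightScale may not set them.
    return Environment.GetEnvironmentVariable("EC2_INSTANCE_ID") != null
        && Environment.GetEnvironmentVariable("EC2_AMI_ID") != null;
}
```

In Global.asax you would then branch on this together with Request.IsLocal to pick the localhost or EC2 code path.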
If you are using the ASP.NET dev server (VS 2012 and earlier), use this method:

public static bool IsDebugWebServer()
{
    if (HttpContext.Current != null && HttpContext.Current.Request != null)
    {
        // The built-in dev server leaves SERVER_SOFTWARE unset or empty
        return string.IsNullOrEmpty(HttpContext.Current.Request.ServerVariables["SERVER_SOFTWARE"]);
    }
    else
    {
        return false;
    }
}
And if you are using local IIS Express, use this:

public static bool IsDebugWebServer()
{
    // IIS Express hosts the site inside the iisexpress process
    return string.Compare(Process.GetCurrentProcess().ProcessName, "iisexpress") == 0;
}