DynamoDB: disable auto-scaling programmatically - C#

I want to run a daily update of a set of DynamoDB tables. I have written a console app to do this; however, I want to be able to programmatically disable capacity auto-scaling at the start of the update process and then re-enable it at the end.
I have managed to increase the provisioned throughput for both the table and its Global Secondary Indexes using the UpdateTableAsync method, but this has no options for handling auto-scaling, and I can't find any other functionality that would let me do this.
Does it even exist?
EDIT: I have found the CLI command required for this here: https://docs.aws.amazon.com/cli/latest/reference/application-autoscaling/delete-scaling-policy.html. My question is now: does this exist anywhere in the .NET SDK?

After a lot of digging through the AWS documentation (there don't seem to be any tutorials or examples, especially for .NET) I've discovered that this functionality does exist, but not at the DynamoDB level. It lives in an AWS-wide service that handles auto-scaling for all AWS resources.
There is a NuGet package called AWSSDK.ApplicationAutoScaling. You'll need to create yourself an instance of AmazonApplicationAutoScalingClient (in the code below, this is represented by autoScaling).
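As a minimal sketch (the region is a placeholder; credentials are assumed to come from the default chain, e.g. a profile, environment variables, or an instance role), creating that client looks like this:
using Amazon;
using Amazon.ApplicationAutoScaling;

// Credentials are resolved from the standard AWS configuration;
// the region below is only a placeholder.
var autoScaling = new AmazonApplicationAutoScalingClient(RegionEndpoint.EUWest1);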
When setting up auto-scaling in the AWS DynamoDB console, two things are created: a description of the scaling (min capacity, max capacity, etc.) and a policy, which I believe links the auto-scaling with CloudWatch so that alarms can be raised. Both of these objects need to be managed.
To solve my problem of disabling auto-scaling and then re-enabling it after updating my tables, I had to follow this process:
First, save the policies and the scaling descriptions (called ScalableTargets) before running the update.
this.preUpdatePolicies = (await autoScaling.DescribeScalingPoliciesAsync(new DescribeScalingPoliciesRequest
{
    ResourceId = $"table/{this.tableName}",
    ServiceNamespace = ServiceNamespace.Dynamodb,
    ScalableDimension = ScalableDimension.DynamodbTableWriteCapacityUnits
})).ScalingPolicies;

this.preUpdateScaling = (await autoScaling.DescribeScalableTargetsAsync(new DescribeScalableTargetsRequest
{
    ResourceIds = new List<string>() { $"table/{this.tableName}" },
    ServiceNamespace = ServiceNamespace.Dynamodb,
    ScalableDimension = ScalableDimension.DynamodbTableWriteCapacityUnits
})).ScalableTargets;
Next, deregister the scalable targets (the scaling descriptions), which also deletes any associated policies.
foreach (var scaling in this.preUpdateScaling)
{
    await autoScaling.DeregisterScalableTargetAsync(new DeregisterScalableTargetRequest
    {
        ResourceId = scaling.ResourceId,
        ServiceNamespace = ServiceNamespace.Dynamodb,
        ScalableDimension = ScalableDimension.DynamodbTableWriteCapacityUnits
    });
}
After I have run my update, I re-register the scalable targets and put the policies back, based on the values I saved before running the update.
foreach (var scaling in this.preUpdateScaling)
{
    await autoScaling.RegisterScalableTargetAsync(new RegisterScalableTargetRequest
    {
        ResourceId = scaling.ResourceId,
        ServiceNamespace = scaling.ServiceNamespace,
        ScalableDimension = scaling.ScalableDimension,
        RoleARN = scaling.RoleARN,
        MinCapacity = scaling.MinCapacity,
        MaxCapacity = scaling.MaxCapacity
    });
}

foreach (var policy in this.preUpdatePolicies)
{
    await autoScaling.PutScalingPolicyAsync(new PutScalingPolicyRequest
    {
        ServiceNamespace = policy.ServiceNamespace,
        ResourceId = policy.ResourceId,
        ScalableDimension = policy.ScalableDimension,
        PolicyName = policy.PolicyName,
        PolicyType = policy.PolicyType,
        TargetTrackingScalingPolicyConfiguration = policy.TargetTrackingScalingPolicyConfiguration
    });
}
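Note that the snippets above only touch the table's write dimension. If read capacity or your Global Secondary Indexes auto-scale too, the same calls have to be repeated per dimension. As a sketch, the deregister step extended to GSI write capacity (indexNames is a hypothetical list of your index names; the read dimensions work the same way with the corresponding *ReadCapacityUnits values):
// indexNames is a hypothetical list of the table's GSI names
var resourceIds = new List<string> { $"table/{this.tableName}" };
resourceIds.AddRange(indexNames.Select(i => $"table/{this.tableName}/index/{i}"));

foreach (var resourceId in resourceIds)
{
    await autoScaling.DeregisterScalableTargetAsync(new DeregisterScalableTargetRequest
    {
        ResourceId = resourceId,
        ServiceNamespace = ServiceNamespace.Dynamodb,
        // indexes use the index-level dimensions instead of the table-level ones
        ScalableDimension = resourceId.Contains("/index/")
            ? ScalableDimension.DynamodbIndexWriteCapacityUnits
            : ScalableDimension.DynamodbTableWriteCapacityUnits
    });
}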
Hopefully this is helpful for anyone else who would like to use .NET to manage auto-scaling.

Related

Add older documents to stream - based on filter

I created a MongoDB watcher to trigger actions based on newly created documents.
Of course, the watcher does not detect documents that were created while the service itself was not running.
The current code only detects newly created documents.
How can I fetch and add older documents to the pipeline, based on a field state, e.g. actionDone: true/false?
var pipeline =
    new EmptyPipelineDefinition<ChangeStreamDocument<BsonDocument>>()
        .Match(x => x.OperationType == ChangeStreamOperationType.Insert);

using (var cursor = collection.Watch(pipeline))
{
    foreach (var change in cursor.ToEnumerable())
    {
        string mongoID = change.FullDocument.GetValue("_id").ToString();
    }
}
Is StartAtOperationTime an option? I didn't find any good documentation on it.
Update:
StartAtOperationTime was the solution I was looking for. If anybody is having the same problem, here is my solution.
It starts the lookup 10 days back.
var options = new ChangeStreamOptions
{
    // BsonTimestamp takes seconds since the Unix epoch (plus an increment),
    // not .NET ticks
    StartAtOperationTime = new BsonTimestamp((int)DateTimeOffset.UtcNow.AddDays(-10).ToUnixTimeSeconds(), 0)
};
var cursor = collection.Watch(pipeline, options);
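For completeness, here is how the two snippets fit together (collection and pipeline as defined above); note this only replays history as far back as the oplog still retains it:
using (var cursor = collection.Watch(pipeline, options))
{
    foreach (var change in cursor.ToEnumerable())
    {
        // inserts from the last 10 days are replayed first (as far as the
        // oplog retains them), then the stream continues with live changes
        string mongoID = change.FullDocument.GetValue("_id").ToString();
    }
}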

Fetching more than 1000 rows from Domino LDAP server using .NET Core 5 and Novell.Directory.Ldap.NETStandard

I want to fetch all the users from a large location of our Domino LDAP, around ~2000 users altogether. Since .NET Core sadly doesn't have a platform-independent LDAP library, I'm using Novell.Directory.Ldap.NETStandard with this POC:
var cn = new Novell.Directory.Ldap.LdapConnection();
cn.Connect("dc.internal", 389);
cn.Bind("user", "pw");
string filter = "location=MyLoc";
var result = cn.Search("", Novell.Directory.Ldap.LdapConnection.ScopeOne, filter,
    new string[] { Novell.Directory.Ldap.LdapConnection.AllUserAttrs }, typesOnly: false);
int count = 0;
while (result.HasMore())
{
    var entry = result.Next();
    count++;
    Console.WriteLine(entry.Dn);
}
It prints a lot of entries, but not all of them. When count = 1000, I get a Size Limit Exceeded exception. I guess this is because I need to use some kind of pagination, so that not all entries are returned in a single request. There are different questions like this or this one. Both are in Java; the .NET Core API seems somewhat different.
Approach 1: Try to find out how LdapSearchRequest works in .NET Core
byte[] resumeCookie = null;
LdapMessageQueue queue = null;
var searchReq = new LdapSearchRequest("", LdapConnection.ScopeOne, filter,
    new string[] { LdapConnection.AllUserAttrs }, LdapSearchConstraints.DerefNever,
    maxResults: 3000, serverTimeLimit: 0, typesOnly: false,
    new LdapControl[] { new SimplePagedResultsControl(size: 100, resumeCookie) });
var searchRequest = cn.SendRequest(searchReq, queue);
I'm trying to figure out how the Java examples can be applied in .NET Core. This looks good; however, I can't figure out how to fetch the LDAP entries. I only get a message id. Looking into the source, it seems that I'm on the right track, but they're using MessageAgent, which cannot be used from outside since it's internal sealed. This is probably the reason why searching for LdapSearchRequest in the source code doesn't give many results.
Approach 2: Using SimplePagedResultsControlHandler
var opts = new SearchOptions("", LdapConnection.ScopeOne, filter,
    new string[] { LdapConnection.AllUserAttrs });
// For testing purposes: https://github.com/dsbenghe/Novell.Directory.Ldap.NETStandard/issues/163
cn.SearchConstraints.ReferralFollowing = false;
var pageControlHandler = new SimplePagedResultsControlHandler(cn);
var rows = pageControlHandler.SearchWithSimplePaging(opts, pageSize: 100);
This throws an Unavailable Critical Extension exception. At first I thought this was an issue with the .NET port, which might not support all the features of the original Java library yet. But the port seems complete, and according to further research this looks like an LDAP error code. So it must be something that has to be supported by the server, and Domino does not support it.
I couldn't make either of those approaches work, but I found another way: cross-platform support for the System.DirectoryServices.Protocols namespace was added in .NET 5. This was missing for a long time in .NET Core, and I guess this is the main reason why libraries like Novell.Directory.Ldap.NETStandard were ported to .NET Core: in the days of .NET Core 1.x, this was the only way I found to authenticate against LDAP that also works on Linux.
After having a deeper look into System.DirectoryServices.Protocols, it works well out of the box, even for ~2k users. My basic POC class looks like this:
using System.Collections.Generic;
using System.DirectoryServices.Protocols;
using System.Net;

public class DominoLdapManager {
    LdapConnection cn = null;

    public DominoLdapManager(string ldapHost, int ldapPort, string ldapBindUser, string ldapBindPassword) {
        var server = new LdapDirectoryIdentifier(ldapHost, ldapPort);
        var credentials = new NetworkCredential(ldapBindUser, ldapBindPassword);
        cn = new LdapConnection(server);
        cn.AuthType = AuthType.Basic;
        cn.Bind(credentials);
    }

    public IEnumerable<DominoUser> Search(string filter, string searchBase = "") {
        string[] attributes = { "cn", "mail", "companyname", "location" };
        var req = new SearchRequest(searchBase, filter, SearchScope.Subtree, attributes);
        var resp = (SearchResponse)cn.SendRequest(req);
        foreach (SearchResultEntry entry in resp.Entries) {
            var user = new DominoUser() {
                Name = GetStringAttribute(entry, "cn"),
                Mail = GetStringAttribute(entry, "mail"),
                Company = GetStringAttribute(entry, "companyname"),
                Location = GetStringAttribute(entry, "location")
            };
            yield return user;
        }
    }

    string GetStringAttribute(SearchResultEntry entry, string key) {
        if (!entry.Attributes.Contains(key)) {
            return string.Empty;
        }
        string[] rawVal = (string[])entry.Attributes[key].GetValues(typeof(string));
        return rawVal[0];
    }
}
Example usage:
var ldapManager = new DominoLdapManager("ldap.host", 389, "binduser", "pw");
var users = ldapManager.Search("(objectClass=person)");
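In case the Domino server also enforces a size limit on this path, System.DirectoryServices.Protocols supports simple paging via PageResultRequestControl. A minimal sketch of a method added to the same class (so it reuses the cn field; the page size is an arbitrary choice):
// Requires: using System.Linq;
public IEnumerable<SearchResultEntry> PagedSearch(string filter, string searchBase = "", int pageSize = 500) {
    var req = new SearchRequest(searchBase, filter, SearchScope.Subtree);
    var pageControl = new PageResultRequestControl(pageSize);
    req.Controls.Add(pageControl);
    while (true) {
        var resp = (SearchResponse)cn.SendRequest(req);
        foreach (SearchResultEntry entry in resp.Entries) {
            yield return entry;
        }
        // the server hands back a cookie pointing at the next page;
        // an empty cookie means the last page has been read
        var pageResponse = resp.Controls.OfType<PageResultResponseControl>().FirstOrDefault();
        if (pageResponse == null || pageResponse.Cookie.Length == 0) {
            yield break;
        }
        pageControl.Cookie = pageResponse.Cookie;
    }
}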
But it's not solved with Novell.Directory.Ldap.NETStandard as the title says
True, this doesn't solve my problem with the Novell.Directory.Ldap.NETStandard library as the title suggests. But since System.DirectoryServices.Protocols is an official .NET package maintained by Microsoft and the .NET Foundation, it seems the better approach to me. The Foundation will take care to keep it maintained and compatible with future .NET releases. When I wrote the question, I was not aware that Linux support had been added.
Don't get me wrong, I don't want to say that third-party packages are bad by design; that would be completely wrong. However, when I have the choice between an official package and a third-party one, I think it makes sense to prefer the official one, unless there is a good reason against that. That is not the case here: the official package (which didn't exist in the past) solves this issue better than the third-party one.

How to use Google Cloud Speech (V1 API) for speech-to-text - need to be able to process audio files of over 3 hours properly and efficiently

I have been looking for documentation but could not find a solution yet.
I have installed the NuGet package.
I have also generated an API key.
However, I can't find proper documentation on how to use the API key.
Moreover, I want to be able to upload very long audio files.
So what would be the proper way to upload audio files of up to 3 hours and get their results?
I have a $300 budget, so that should be enough.
Here is my code so far.
This code currently fails because I have not set the credentials correctly, and I don't know how to do that.
I also have a service account file ready to use.
public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();
    }

    private void Button_Click(object sender, RoutedEventArgs e)
    {
        var speech = SpeechClient.Create();
        var config = new RecognitionConfig
        {
            Encoding = RecognitionConfig.Types.AudioEncoding.Flac,
            SampleRateHertz = 48000,
            LanguageCode = LanguageCodes.English.UnitedStates
        };
        // Note: FromStorageUri expects a full Cloud Storage URI,
        // e.g. "gs://my-bucket/1m.flac"
        var audio = RecognitionAudio.FromStorageUri("1m.flac");
        var response = speech.Recognize(config, audio);
        foreach (var result in response.Results)
        {
            foreach (var alternative in result.Alternatives)
            {
                Debug.WriteLine(alternative.Transcript);
            }
        }
    }
}
I don't want to set an environment variable. I have both an API key and a service account JSON file. How can I set them manually?
If you don't want to use the environment variable, you need to use SpeechClientBuilder to create a SpeechClient with custom credentials. Assuming you've got a service account file somewhere, change this:
var speech = SpeechClient.Create();
to this:
var speech = new SpeechClientBuilder
{
    CredentialsPath = "/path/to/your/file"
}.Build();
Note that to perform a long-running recognition operation, you should also use the LongRunningRecognize method. I strongly suspect your current RPC will fail, either explicitly because it's trying to run on a file that's too large, or it'll simply time out.
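A minimal sketch of that long-running variant (the bucket and file name are placeholders; config is the same RecognitionConfig as in your code):
var audio = RecognitionAudio.FromStorageUri("gs://your-bucket/your-audio.flac");
var operation = speech.LongRunningRecognize(config, audio);
// PollUntilCompleted blocks until the server has finished; for a 3-hour
// file consider polling asynchronously instead
var completed = operation.PollUntilCompleted();
foreach (var result in completed.Result.Results)
{
    foreach (var alternative in result.Alternatives)
    {
        Debug.WriteLine(alternative.Transcript);
    }
}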
You need to set the environment variable before creating the Speech client instance:
Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", "text-tospeech.json");
where the second parameter (text-tospeech.json) is your credentials file generated by the Google API.

How to set up Project Backlog in TFS programmatically?

I have looked around at the other posts about the Project Backlog, but I want those missing fields shown in this image here.
I need those missing fields like work item, Title, Assigned To, State, Effort, Business.
I have this code with me right now.
// Set up default team sprint date and time
var teamConfig = _tfs.GetService<TeamSettingsConfigurationService>();
var css = _tfs.GetService<ICommonStructureService4>();
string rootNodePath = string.Format("\\{0}\\Iteration\\Release 1\\Sprint 1", _selectedTeamProject.Name);
var pathRoot = css.GetNodeFromPath(rootNodePath);
css.SetIterationDates(pathRoot.Uri, DateTime.Now.AddDays(-5), DateTime.Now.AddDays(7));

var configs = teamConfig.GetTeamConfigurationsForUser(new[] { _selectedTeamProject.Uri });
var team = configs.Where(c => c.TeamName == "Demo").FirstOrDefault();
var ts = team.TeamSettings;
ts.BacklogIterationPath = string.Format(@"{0}\Release 1", _selectedTeamProject.Name);
ts.IterationPaths = new string[]
{
    string.Format(@"{0}\Release 1\Sprint 1", _selectedTeamProject.Name),
    string.Format(@"{0}\Release 1\Sprint 2", _selectedTeamProject.Name)
};
var tfv = new TeamFieldValue();
tfv.IncludeChildren = true;
tfv.Value = _selectedTeamProject.Name;
ts.TeamFieldValues = new[] { tfv };
teamConfig.SetTeamSettings(team.TeamId, ts);
According to your screenshot, it seems you are using the Work Item Summary web part. After the upgrade to TFS 2018, your TFS SharePoint sites will still display, but all integration functionality is disabled.
The officially recommended way is to use TFS Dashboards instead, which make it much easier to track and display the fields of a work item.
You could also use a third-party work item widget such as this one, which provides a summary for a selected work item.
To get or update work items (such as product backlog fields) programmatically, you can use the REST API (Get a list of work items). It returns all related field names and values and also includes C# sample code (the GetWorkItemsByIDs method). For how to customize a dashboard in SharePoint, please take a look at this thread.
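For illustration, a minimal sketch of calling that REST API from C# (the server URL, work item ids, and the personal access token are placeholders; the api-version depends on your TFS release):
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class WorkItemFetcher
{
    public static async Task<string> GetWorkItemsJsonAsync()
    {
        using (var client = new HttpClient())
        {
            // PAT authentication: the user name is empty, the token acts as the password
            var pat = Convert.ToBase64String(Encoding.ASCII.GetBytes(":" + "yourPersonalAccessToken"));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", pat);
            var url = "http://your-tfs:8080/tfs/DefaultCollection/_apis/wit/workitems"
                    + "?ids=1,2,3"
                    + "&fields=System.Title,System.AssignedTo,System.State,Microsoft.VSTS.Scheduling.Effort"
                    + "&api-version=4.1";
            return await client.GetStringAsync(url); // raw JSON with the requested fields
        }
    }
}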

C# Directory Services synchronization does not return changed relationships

I'm using C# to work with AD (Windows Server 2012 R2).
We are syncing AD users, groups, and their relationships to a SQL database.
A full sync works well.
But when using a synchronization cookie, relationship changes are not detected.
What may be the reason?
Thanks.
Here is my code:
public void DirSyncChanges(DirectoryEntry de, byte[] cookie)
{
    DirectorySynchronization syncData = new DirectorySynchronization(cookie);
    var srch = new DirectorySearcher(de)
    {
        Filter = "(&(objectClass=user)(objectCategory=person))",
        SizeLimit = Int32.MaxValue,
        Tombstone = true
    };
    srch.DirectorySynchronization = syncData;
    syncData.Option = DirectorySynchronizationOptions.None;

    using (SearchResultCollection results = srch.FindAll())
    {
        foreach (SearchResult res in results)
        {
            // results is empty, no loop
        }
    }
}
Please specify DirectorySearcher.PropertiesToLoad. You will only get entries in the delta sync if one of the attributes in PropertiesToLoad was updated.
As I remember, the search root for DirSync must be the naming context root object.
Better to use a paged search: no matter how large a value you set for SizeLimit, it will only return at most 1000 or 1500 results (I forget the exact number).
My answer is based on .NET 3.5.
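A minimal sketch applying the PropertiesToLoad advice to the question's code; note that extending the filter to groups is my own assumption, based on the fact that group membership is stored on the group object's member attribute, which a user-only filter never reports:
public void DirSyncChanges(DirectoryEntry de, byte[] cookie)
{
    var srch = new DirectorySearcher(de)
    {
        // include groups: membership changes are written to the group's
        // "member" attribute, not to the user object (assumption, see above)
        Filter = "(|(&(objectClass=user)(objectCategory=person))(objectClass=group))",
        Tombstone = true,
        DirectorySynchronization = new DirectorySynchronization(cookie)
    };
    // only attributes listed here are reported by the delta sync
    srch.PropertiesToLoad.AddRange(new[] { "member", "distinguishedName", "objectClass" });

    using (SearchResultCollection results = srch.FindAll())
    {
        foreach (SearchResult res in results)
        {
            // res.Properties["member"] carries the membership changes
        }
    }
}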
