With NEST 6.x and NEST.JsonNetSerializer 6.x, I am getting a JsonSerializationException when running unit tests against an in-memory Elasticsearch connection.
I tried NEST and the serializer at version 7.x and it worked fine, but my production server runs 6.x, so the tests need to run against the 6.x NEST client.
var response = new
{
took = 1,
timed_out = false,
_shards = new
{
total = 2,
successful = 2,
failed = 0
},
hits = new
{
total = new { value = 25 },
max_score = 1.0,
hits = Enumerable.Range(1, 25).Select(i => (object)new
{
_index = "project",
_type = "project",
_id = $"Project {i}",
_score = 1.0,
_source = new { name = $"Project {i}" }
}).ToArray()
}
};
var responseBytes = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(response));
var connection = new InMemoryConnection(responseBytes, 200);
var connectionPool = new SingleNodeConnectionPool(new Uri("http://localhost:9200"));
var settings = new ConnectionSettings(connectionPool, connection).DefaultIndex("project");
settings.DisableDirectStreaming();
var client = new ElasticClient(settings);
var searchResponse = client.Search<Project>(s => s.MatchAll());
Expected output: the search response from the in-memory Elasticsearch connection.
Actual error:
JsonSerializationException: Cannot deserialize the current JSON object (e.g. {"name":"value"}) into type 'System.Int64' because the type requires a JSON primitive value (e.g. string, number, boolean, null) to deserialize correctly.
The schema of the hits.total part of the response body changed between Elasticsearch 6.x and 7.x. To make your response body work with NEST 6.x, you need to change the total part from
..
total = new { value = 25 },
..
to
..
total = 25,
..
Full example:
var response = new
{
took = 1,
timed_out = false,
_shards = new
{
total = 2,
successful = 2,
failed = 0
},
hits = new
{
total = 25,
max_score = 1.0,
hits = Enumerable.Range(1, 25).Select(i => (object)new
{
_index = "project",
_type = "project",
_id = $"Project {i}",
_score = 1.0,
_source = new { name = $"Project {i}" }
}).ToArray()
}
};
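With the total value flattened to a plain number, NEST 6.x deserializes the body without errors. As a quick sanity check (a minimal sketch reusing the client and the Project document type from the question):
// The in-memory response now deserializes cleanly in NEST 6.x.
var searchResponse = client.Search<Project>(s => s.MatchAll());

Console.WriteLine(searchResponse.IsValid);          // true
Console.WriteLine(searchResponse.Documents.Count);  // 25
Console.WriteLine(searchResponse.Total);            // 25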
Hope that helps.
I am getting a query exception: SQL Error [1] [50000]: General error: "class org.apache.ignite.IgniteException: Fetched result set was too large. IGNITE_SQL_MERGE_TABLE_MAX_SIZE(10000) should be increased."
I am running Ignite embedded within my .NET application.
I am unable to work out how to change this setting in the Ignite configuration when starting Ignite from a .NET application.
igniteConfiguration = new IgniteConfiguration
{
DataStorageConfiguration = new DataStorageConfiguration
{
StoragePath = storagePath,
DefaultDataRegionConfiguration = new DataRegionConfiguration
{
Name = "Default_Region",
PersistenceEnabled = true,
InitialSize = 100L * 1024L * 1024L,
MaxSize = 500L * 1024L * 1024L,
MetricsRateTimeInterval = TimeSpan.FromMinutes(5)
}
},
AuthenticationEnabled = true,
Logger = new IgniteLog4NetLogger(log),
ConsistentId = hostname,
IgniteInstanceName = hostname,
ClientMode = false,
JvmOptions = new[] { "-Xms512m", "-Xmx512m" },
BinaryConfiguration = new Apache.Ignite.Core.Binary.BinaryConfiguration(new Type[]
{
typeof(XYZ),
}),
MetricsLogFrequency = TimeSpan.FromMinutes(5),
};
You can set the option through the JvmOptions property, as shown in the example below. Note that the value has to be larger than the current default of 10000 reported in the error message (100000 here is just an example; pick a limit that fits your result sets):
igniteConfiguration = new IgniteConfiguration
{
...
JvmOptions = new[] { "-Xms512m", "-Xmx512m", "-DIGNITE_SQL_MERGE_TABLE_MAX_SIZE=100000" },
...
};
Also, you can find additional details regarding the .NET configuration options in the Apache Ignite .NET documentation.
I'm using .NET 5 and the NuGet package Google.Analytics.Data.V1Beta to query Google Analytics. I need to query events by event_category and event_label, but I get this error: "Field customEvent:event_category is not a valid dimension". This is my code:
var click = new FilterExpression
{
Filter = new Filter
{
FieldName = "ga:event_category",
StringFilter = new Filter.Types.StringFilter
{
CaseSensitive = false,
MatchType = Filter.Types.StringFilter.Types.MatchType.Exact,
Value = "click"
}
},
};
var request = new RunReportRequest
{
Property = $"properties/{_settings.GoogleAnalytics.PropertyId}",
Dimensions = {new Dimension {Name = "eventName"}, new Dimension {Name = "customEvent:event_category" } },
DimensionFilter = click,
Metrics = {new Metric {Name = "eventCount"}},
DateRanges =
{
new DateRange
{StartDate = startDate.ToString("yyyy-MM-dd"), EndDate = endDate.ToString("yyyy-MM-dd")}
}
};
var response = await _analyticsClient.GetClient().RunReportAsync(request);
How do I pass event_category and event_label to the query?
I'm trying to create a bulk delete job in Dynamics 365 using the Web API. As a reference I've used the following web pages:
https://learn.microsoft.com/en-us/dynamics365/customer-engagement/web-api/bulkdelete?view=dynamics-ce-odata-9
Unable to call the BulkDelete action from Microsoft Dynamics CRM WebAPI
I'm using api-version 9.1.
I've gotten most of it to work and have removed quite a few validation errors, so I know I'm on the right track. However, now I get the following error message: "The Entity bookableresourcebooking does not support Synchronous Bulk Delete".
When I try to create the same bulk delete job manually in Dynamics, no errors occur.
Can anyone help me resolve this error?
The relevant code I'm using:
var relativeUrl = "BulkDelete()";
var bulkDelete = new BulkDeleteRequest("Delete all future bookings");
var querySet = new QuerySet();
querySet.EntityName = "bookableresourcebooking";
querySet.Distinct = false;
var conditionStarttimeGreaterEqualToday = new Condition();
conditionStarttimeGreaterEqualToday.AttributeName = "starttime";
conditionStarttimeGreaterEqualToday.Operator = "OnOrAfter";
conditionStarttimeGreaterEqualToday.Values = new List<ValueClass>();
conditionStarttimeGreaterEqualToday.Values.Add(new ValueClass(new DateTime(DateTime.Now.Year, DateTime.Now.Month, DateTime.Now.Day).ToUniversalTime().ToString("o"), "System.DateTime"));
var conditionVoltooidOpEmpty = new Condition();
conditionVoltooidOpEmpty.AttributeName = "new_voltooidop";
conditionVoltooidOpEmpty.Operator = "Null";
conditionVoltooidOpEmpty.Values = new List<ValueClass>();
querySet.Criteria = new Criteria();
querySet.Criteria.FilterOperator = "And";
querySet.Criteria.Conditions.Add(conditionStarttimeGreaterEqualToday);
querySet.Criteria.Conditions.Add(conditionVoltooidOpEmpty);
bulkDelete.QuerySet.Add(querySet);
await _crmClient.PostCRMData(relativeUrl, JsonConvert.SerializeObject(bulkDelete)); //Dependency injected httpclient.
Extra info:
bookableresourcebooking is a standard entity that comes with Field Service.
new_voltooidop is a custom datetime field I've added to this entity.
You have to make this job asynchronous, and for that you have to pass RunNow: false; the error will then disappear.
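Translated back to the C# approach in the question, a minimal sketch (the extra properties on BulkDeleteRequest are hypothetical additions to the asker's own class; the parameter names are the BulkDelete action parameters used in the JavaScript example below):
// Hypothetical properties added to the asker's own BulkDeleteRequest class.
// Besides QuerySet and JobName, the BulkDelete action also accepts
// SendEmailNotification, ToRecipients, CCRecipients, RecurrencePattern,
// StartDateTime and RunNow.
bulkDelete.SendEmailNotification = false;
bulkDelete.ToRecipients = new List<object>();
bulkDelete.CCRecipients = new List<object>();
bulkDelete.RecurrencePattern = "";                          // empty = run once, no recurrence
bulkDelete.StartDateTime = DateTime.UtcNow.ToString("o");
bulkDelete.RunNow = false;                                  // asynchronous => the error goes away

await _crmClient.PostCRMData(relativeUrl, JsonConvert.SerializeObject(bulkDelete));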
I have tested the working code below in CRM REST Builder; it is the JavaScript equivalent.
var parameters = {};
var queryset1 = {
EntityName: "account",
ColumnSet: {
AllColumns: true
},
Distinct: false,
};
queryset1["#odata.type"] = "Microsoft.Dynamics.CRM.QueryExpression";
parameters.QuerySet = [queryset1];
parameters.JobName = "arun test";
parameters.SendEmailNotification = false;
var torecipients1 = {};
torecipients1.activitypartyid = "00000000-0000-0000-0000-000000000000"; //Delete if creating new record
torecipients1["#odata.type"] = "Microsoft.Dynamics.CRM.activityparty";
parameters.ToRecipients = [torecipients1];
var ccrecipients1 = {};
ccrecipients1.activitypartyid = "00000000-0000-0000-0000-000000000000"; //Delete if creating new record
ccrecipients1["#odata.type"] = "Microsoft.Dynamics.CRM.activityparty";
parameters.CCRecipients = [ccrecipients1];
parameters.RecurrencePattern = "FREQ=DAILY;";
parameters.StartDateTime = JSON.stringify(new Date("05/07/2021 13:30:00").toISOString());
parameters.RunNow = false;
var bulkDeleteRequest = {
QuerySet: parameters.QuerySet,
JobName: parameters.JobName,
SendEmailNotification: parameters.SendEmailNotification,
ToRecipients: parameters.ToRecipients,
CCRecipients: parameters.CCRecipients,
RecurrencePattern: parameters.RecurrencePattern,
StartDateTime: parameters.StartDateTime,
RunNow: parameters.RunNow,
getMetadata: function() {
return {
boundParameter: null,
parameterTypes: {
"QuerySet": {
"typeName": "Collection(mscrm.QueryExpression)",
"structuralProperty": 4
},
"JobName": {
"typeName": "Edm.String",
"structuralProperty": 1
},
"SendEmailNotification": {
"typeName": "Edm.Boolean",
"structuralProperty": 1
},
"ToRecipients": {
"typeName": "Collection(mscrm.activityparty)",
"structuralProperty": 4
},
"CCRecipients": {
"typeName": "Collection(mscrm.activityparty)",
"structuralProperty": 4
},
"RecurrencePattern": {
"typeName": "Edm.String",
"structuralProperty": 1
},
"StartDateTime": {
"typeName": "Edm.DateTimeOffset",
"structuralProperty": 1
},
"RunNow": {
"typeName": "Edm.Boolean",
"structuralProperty": 1
}
},
operationType: 0,
operationName: "BulkDelete"
};
}
};
Xrm.WebApi.online.execute(bulkDeleteRequest).then(
function success(result) {
if (result.ok) {
var results = JSON.parse(result.responseText);
}
},
function(error) {
Xrm.Utility.alertDialog(error.message);
}
);
I have a simple C# Windows Forms app. What I am trying to accomplish is modifying the Default Root Object of one of my CloudFront distributions. I can't seem to find any articles that describe how this is done. Please help.
After a lot of trial and error, here's the C# code to update an AWS CloudFront distribution. You will need AWS's new modularized assemblies: AWSSDK.Core v3 and AWSSDK.CloudFront.
First you need to get the current distribution config, mostly for validation: you'll need to grab the ETag and the CallerReference. Store the ETag in a variable and the CallerReference in a string.
var client2 = new AmazonCloudFrontClient();
// Fetch the current config once; both the ETag and the CallerReference come from it.
var currentConfig = client2.GetDistributionConfig(new GetDistributionConfigRequest
{
Id = "YOURDISTID"
});
var tag = currentConfig.ETag;
string cf = currentConfig.DistributionConfig.CallerReference;
client2.Dispose();
Next you will need to update the distribution. What I have below is the minimum required to update a distribution (pretty much everything you see in the AWS console when editing a distribution).
Take notice of where the tag variable and the cf string are used: if the ETags do not match, you will get a 400 Bad Request back.
// list and httpmeth were not shown in the original snippet; these are example values
// that match the Quantity fields used below (one alias, two allowed/cached methods).
var list = new List<string> { "example.example.com" };
var httpmeth = new List<string> { "GET", "HEAD" };
var client = new AmazonCloudFrontClient();
client.UpdateDistribution(new UpdateDistributionRequest
{
Id = "YOURDISTID",
DistributionConfig = new DistributionConfig
{
WebACLId = "",
HttpVersion = "http2",
IsIPV6Enabled = true,
DefaultRootObject = "maintenance.html",
CacheBehaviors = new CacheBehaviors {
Quantity = 0,
},
Restrictions = new Restrictions {
GeoRestriction = new GeoRestriction
{
Quantity = 0,
RestrictionType = "none"
}
},
CustomErrorResponses = new CustomErrorResponses {
Quantity = 0
},
ViewerCertificate = new ViewerCertificate {
SSLSupportMethod = "sni-only",
ACMCertificateArn = "YOUR_IMPORTED_CERT_ARN",
MinimumProtocolVersion = "TLSv1.1_2016"
},
Enabled = true,
Comment = "Maintenance",
Origins = new Origins
{
Items = new List<Origin>() {
new Origin(){Id = "S3-example.example.com", DomainName = "example.example.com.s3.amazonaws.com", S3OriginConfig = new S3OriginConfig(){ OriginAccessIdentity = "" }, OriginPath = "", CustomHeaders = new CustomHeaders{ Quantity = 0 } }
},
Quantity = 1
},
Logging = new Amazon.CloudFront.Model.LoggingConfig
{
Bucket = "example.example.com.s3.amazonaws.com",
IncludeCookies = false,
Enabled = false,
Prefix = ""
},
PriceClass = "PriceClass_All",
Aliases = new Aliases
{
Quantity = 1,
Items = list
},
CallerReference = cf,
DefaultCacheBehavior = new DefaultCacheBehavior
{
ForwardedValues = new ForwardedValues
{
QueryString = false,
QueryStringCacheKeys = new QueryStringCacheKeys {
Quantity = 0
},
Headers = new Headers {
Quantity = 0
},
Cookies = new CookiePreference
{
Forward = "none"
}
},
AllowedMethods = new AllowedMethods {
Quantity = 2,
Items = httpmeth,
CachedMethods = new CachedMethods
{
Quantity = 2,
Items = httpmeth
}
},
DefaultTTL = 86400,
Compress = false,
MaxTTL = 31536000,
TargetOriginId = "S3-example.example.com",
LambdaFunctionAssociations = new LambdaFunctionAssociations {
Quantity = 0
},
ViewerProtocolPolicy = "allow-all",
MinTTL = 0,
SmoothStreaming = false,
TrustedSigners = new TrustedSigners
{
Enabled = false,
Quantity = 0,
},
}
},
IfMatch = tag
});
client.Dispose();
If you have issues with ACMCertificateArn not being found in the referenced assembly, chances are you are using the old v2 AWSSDK. Remove/uninstall it and get the latest NuGet packages for AWSSDK.Core and AWSSDK.CloudFront.
Nuget Package Manager Console Install:
Install-Package AWSSDK.Core -Version 3.3.21.17
Install-Package AWSSDK.CloudFront -Version 3.3.6.3
I am trying to use a filter with a search query. The search request works correctly without the filter, but with the filter I get a 400 error as the response.
This is the type mapping:
var mapp = new
{
mappings = new
{
posts = new
{
properties = new
{
FullText = new
{
type = "string",
analyzer = "russian"
},
Title = new
{
type = "string",
analyzer = "russian"
},
PostPubDate = new
{
type = "date"
},
Link = new
{
type = "string",
index = "not_analyzed"
},
RubricsIds = new
{
type = "integer"
},
ObjectsIds = new
{
type = "integer"
},
SourceId = new
{
type = "integer"
}
}
}
}
};
This is the request to the index with a filtered query:
string url = "http://localhost:9200/neg_collector/posts/_search";
var request = (HttpWebRequest)HttpWebRequest.Create(url);
var o = new
{
size = 20,
query = new
{
filtered = new
{
query = new
{
query_string = new
{
fields = new[] { "Title" },
query = search_query
}
},
filter = new
{
@bool = new
{
should = new
{
term = new
{
SourceId = sIds
}
}
}
}
}
}
};
request.Method = "POST";
var jsonObj = JsonConvert.SerializeObject(o);
var data = Encoding.UTF8.GetBytes(jsonObj);
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = data.Length;
I want to use an array of integers to filter the results to certain SourceId values, but I get a 400 error.
What am I doing wrong? Thank you.
It turned out that the problem was that this syntax is for Elasticsearch version 2 (and it worked fine on another computer). Here I have Elasticsearch 5 and have to use a different way of filtering:
var o = new
{
size = 20,
query = new
{
@bool = new
{
must = new
{
query_string = new
{
fields = new[] { "Title" },
query = search_query
}
},
filter = new
{
terms = new
{
SourceId = new[] {10,11,12}
}
}
}
}
};
It is described HERE.
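For completeness, a minimal sketch of posting that body with the same HttpWebRequest approach as in the question (the URL and the o query object are taken from the question; application/json is used as the content type):
// Requires System.Net, System.IO, System.Text and Newtonsoft.Json.
// Build and send the search request with the Elasticsearch 5.x query object above.
var request = (HttpWebRequest)WebRequest.Create("http://localhost:9200/neg_collector/posts/_search");
request.Method = "POST";
request.ContentType = "application/json";

var data = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(o));
request.ContentLength = data.Length;
using (var stream = request.GetRequestStream())
{
    stream.Write(data, 0, data.Length);
}

using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    var body = reader.ReadToEnd(); // raw JSON hits from Elasticsearch
}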