I need to get the results of Google searches so that I can loop through and parse them. With that aim in view, I followed (as best I could) the tutorial on how to do that here
This is my code, based on the sample/example code in the article referenced above:
private void btnRentFlick_Click(object sender, EventArgs e)
{
OpenBestPageForSearchString("rent amazon movie Will Penny");
}
private void OpenBestPageForSearchString(string searchStr)
{
try
{
const string apiKey = "blaBlaBla"; // "blaBlaBla" stands for my API key
const string searchEngineId = "bla"; // "bla" stands for various things I tried: my client_id (also called UniqueId), private_key_id (also called KeyId), and project_id. Not having the correct value may be the problem. If so, how do I get it?
const string query = "rent amazon movie Will Penny";
var customSearchService = new CustomsearchService(new BaseClientService.Initializer { ApiKey = apiKey });
//CseResource.ListRequest listRequest = customSearchService.Cse.List(query); // This is the code in the article, but it won't compile - "no overload for 'List' takes one argument"
// So how is the value in "query" assigned, then?
CseResource.ListRequest listRequest = customSearchService.Cse.List();
listRequest.Cx = searchEngineId;
List<string> linksReturned = new List<string>();
IList<Result> paging = new List<Result>();
var count = 0; // I don't know what the purpose of the counting is, but I'll leave it as-is until I get it working at least
while (paging != null)
{
listRequest.Start = count * 10 + 1;
paging = listRequest.Execute().Items; // this takes several seconds, then it throws an exception
if (paging != null)
{
foreach (var item in paging)
{
linksReturned.Add("Title : " + item.Title + Environment.NewLine + "Link : " +
item.Link +
Environment.NewLine + Environment.NewLine);
}
}
count++;
}
MessageBox.Show("Done with google amazon query");
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
}
}
As the comment at the end of that line says, this line of code:
paging = listRequest.Execute().Items;
...works for several seconds, then throws an exception, namely this:
So what is causing this exception? Is it because the searchEngineId value I assigned is bad? Or is it because the search string (assigned to the query variable) has not been provided to the call?
The info about my Ids is contained in a .json file provided by google, and there is no "searchEngineId" value in it. This is what it does contain:
"type": "service_account", "project_id": "flix4famsasinlocator",
"private_key_id": "[my private key id]", "private_key": "-----BEGIN
PRIVATE KEY-----. . . PRIVATE KEY-----\n", "client_email":
"[bla].gserviceaccount.com", "client_id": "[my client Id]",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url":
"https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url":
"https://www.googleapis.com/robot/v1/metadata/x509/[bla]gserviceaccount.com"
So though the article mentioned previously purported to be, and at first appeared to be, just what the doctor ordered, I have run into a wall of considerable dimensions. Does anybody know how to scale this wall - perhaps primarily by providing the search string to the CseResource.ListRequest object?
UPDATE
Trying DalmTo's code first, I used this (not showing his GetService() method, which I copied verbatim):
var query = "rent amazon movie Will Penny";
var service = GetService("theRainInSpainFallsMainlyOnTheDirt");
var request = service.Cse.List();
// add option values to the request here.
request.ExactTerms = query;
request.Q = query;
var response = request.ExecuteAsync();
// my contribution:
List<string> linksReturned = new List<string>();
foreach (var item in response.Result.Items)
{
//Console.WriteLine(item.Title);
// next two lines also mine
MessageBox.Show(string.Format("Title: {0}; Link: {1}; ETag: {2}", item.Title, item.Link, item.ETag));
linksReturned.Add(item.Link);
}
...but this exception was thrown while in the foreach loop:
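Presumably response.Result.Items was null - Items comes back null when the query returns no results, and ExactTerms can easily over-constrain the search. A defensive sketch (same hypothetical key, awaiting instead of blocking on .Result, inside an async method):

// A sketch: drop ExactTerms, await the call, and null-check Items,
// which is null when the query returns no results.
var query = "rent amazon movie Will Penny";
var service = GetService("theRainInSpainFallsMainlyOnTheDirt"); // hypothetical key
var request = service.Cse.List();
request.Q = query;
var response = await request.ExecuteAsync();
if (response.Items == null)
{
    MessageBox.Show("No results returned");
}
else
{
    foreach (var item in response.Items)
    {
        MessageBox.Show(string.Format("Title: {0}; Link: {1}", item.Title, item.Link));
    }
}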
UPDATE 2
Yes, this works (adapted from Trekco's answer):
const string apiKey = "gr8GooglyMoogly";
const string searchEngineId = "theRainInSpainFallsMainOnTheDirt";
const string query = "rent amazon movie Will Penny";
var customSearchService = new CustomsearchService(new BaseClientService.Initializer { ApiKey = apiKey });
CseResource.ListRequest listRequest = customSearchService.Cse.List();
listRequest.Cx = searchEngineId;
listRequest.Q = query;
List<string> linksReturned = new List<string>();
IList<Result> paging = new List<Result>();
var count = 0;
while (paging != null)
{
listRequest.Start = count * 10 + 1;
paging = listRequest.Execute().Items;
if (paging != null)
{
foreach (var item in paging)
{
linksReturned.Add(item.Link);
}
}
count++;
}
The query is not being sent to Google. To fix your code you need to tell the API what query to use. After listRequest.Cx = searchEngineId; add listRequest.Q = query;
var count = 0;
string apiKey = "THE API KEY";
string searchEngineId = "THE SEARCH ENGIN ID";
string query = "rent amazon movie Will Penny";
var customSearchService = new CustomsearchService(new BaseClientService.Initializer
{
ApiKey = apiKey
});
CseResource.ListRequest listRequest = customSearchService.Cse.List();
listRequest.Cx = searchEngineId;
listRequest.Q = query; // <---- Add this line
List<string> linksReturned = new List<string>();
while (count < 10) // Google limit you to 100 records
{
listRequest.Start = count * 10;
var paging = listRequest.Execute().Items;
foreach (var item in paging)
{
linksReturned.Add("Title : " + item.Title + Environment.NewLine + "Link : " +
item.Link +
Environment.NewLine + Environment.NewLine);
}
count++;
}
In your code you have a comment saying that you don't know what var count = 0; is for. It is to keep track of how many pages of results you have requested.
If you look at Google's documentation you will see that they will only return 100 results max. After that they will give you an error, and that error will be the same generic message: "INVALID_ARGUMENT".
You can review the custom search api requirements here: https://developers.google.com/custom-search/v1/reference/rest/v1/cse/list
The searchEngineId variable is the search engine id that you generate on the site https://www.google.com/cse/all. The documentation you followed is a bit out of date; you will find the id on that site's control panel.
If you check the documentation for cse.list, I think you will find that the list method has no required fields, which means that you need to add the option values in the following manner.
I think ExactTerms may be the one you are looking for, but it could also be Q; I think you should read through the option values and decide which one is best for your purpose.
var query = "rent amazon movie Will Penny";
var service = GetService("MYKEY");
var request = service.Cse.List();
// add option values to the request here.
request.ExactTerms = query;
request.Q = query;
var response = request.ExecuteAsync();
foreach (var item in response.Result.Items)
{
Console.WriteLine(item.Title);
}
My GetService method:
public static CustomsearchService GetService(string apiKey)
{
try
{
if (string.IsNullOrEmpty(apiKey))
throw new ArgumentNullException("api Key");
return new CustomsearchService(new BaseClientService.Initializer()
{
ApiKey = apiKey,
ApplicationName = string.Format("{0} API key example", System.Diagnostics.Process.GetCurrentProcess().ProcessName),
});
}
catch (Exception ex)
{
throw new Exception("Failed to create new Customsearch Service", ex);
}
}
I am using "Kentor.AuthServices.dll" and "Kentor.AuthServices.Mvc.dll" in my code to allowing Single sign on with ADFS server and it is working fine but the problem is that it is taking around more than 1 min show the adfs login screen.
I have debugged the code and record the timing and found the all the code working fine but identity provider creating code is taking more than 1 min.
I am not able to understand why it is taking too much time.
I am putting my code below can anyone please help?
thanks in advance.
try
{
CommonUtility.LogMessage("Start at:" + DateTime.Now);
string adfsUrl = System.Configuration.ConfigurationManager.AppSettings["ADServer"] ?? "";
if(string.IsNullOrEmpty(adfsUrl))
{
CommonUtility.LogMessage("no adfs server found in config");
return RedirectToAction("Login", "Account", string.Empty);
}
string requestUrlScheme = System.Configuration.ConfigurationManager.AppSettings["ADInstance"] ?? "https";
string federationUrl = System.Configuration.ConfigurationManager.AppSettings["ADFSMetaData"] ?? "";
CommonUtility.LogMessage("metdaDataUrl=" + federationUrl);
string trustUrl = string.Format("{0}/adfs/services/trust", adfsUrl);
CommonUtility.LogMessage("trustURL=" + trustUrl);
var idps = Kentor.AuthServices.Mvc.AuthServicesController.Options.IdentityProviders.KnownIdentityProviders;
foreach (var idpItem in idps)
{
CommonUtility.LogMessage("existing ENtity ID=" + idpItem.EntityId.Id);
if (idpItem.EntityId.Id.Equals(trustUrl))
{
Kentor.AuthServices.Mvc.AuthServicesController.Options.IdentityProviders.Remove(idpItem.EntityId);
CommonUtility.LogMessage("removed existing entity at:" + DateTime.Now);
}
}
var spOptions = CreateSPOptions(requestUrlScheme);
CommonUtility.LogMessage("SP option created at:" + DateTime.Now);
Kentor.AuthServices.IdentityProvider idp = null;
idp = new Kentor.AuthServices.IdentityProvider(new EntityId(trustUrl), spOptions)
{
AllowUnsolicitedAuthnResponse = true,
LoadMetadata = true,
MetadataLocation = federationUrl,
};
CommonUtility.LogMessage("idp added at:" + DateTime.Now);
if (Kentor.AuthServices.Mvc.AuthServicesController.Options.SPOptions.EntityId == null)
Kentor.AuthServices.Mvc.AuthServicesController.Options.SPOptions.EntityId = new EntityId(string.Concat(string.Format("{0}://{1}{2}", requestUrlScheme, Request.Url.Authority, Url.Content("~")), "AuthServices"));
else
Kentor.AuthServices.Mvc.AuthServicesController.Options.SPOptions.EntityId.Id =
string.Concat(string.Format("{0}://{1}{2}", requestUrlScheme, Request.Url.Authority, Url.Content("~")), "AuthServices");
CommonUtility.LogMessage("AuthServicesURL=" + string.Concat(string.Format("{0}://{1}{2}", requestUrlScheme, Request.Url.Authority, Url.Content("~")), "AuthServices"));
Kentor.AuthServices.Mvc.AuthServicesController.Options.SPOptions.ReturnUrl =
new Uri(string.Concat(string.Format("{0}://{1}{2}", requestUrlScheme, Request.Url.Authority, Url.Content("~")), "SAMLAuthentication/SAMLResponse"));
CommonUtility.LogMessage("SAMLResponseURL=" + string.Concat(string.Format("{0}://{1}{2}", requestUrlScheme, Request.Url.Authority, Url.Content("~")), "SAMLAuthentication/SAMLResponse"));
Kentor.AuthServices.Mvc.AuthServicesController.Options.IdentityProviders.Add(idp);
CommonUtility.LogMessage("redirect times:" + DateTime.Now);
return RedirectToAction("SignIn", "AuthServices", new { idp = trustUrl });
}
catch (Exception ex)
{
CommonUtility.LogException(ex);
throw; // rethrow without resetting the stack trace
}
When you use "LoadMetadata", the IdentityProvider object will load the metadata from the remote address at construction time. If I remember correctly, that's done synchronously to be able to report errors back as an exception. Does it take time (or give a timeout) to download the metadata?
As the title suggests, I need to insert 100,000+ records into a DocumentDb collection programmatically. The data will be used for creating reports later on. I am using the Azure Documents SDK and a stored procedure for bulk inserting documents (see the question Azure documentdb bulk insert using stored procedure).
The following console application shows how I'm inserting documents.
InsertDocuments generates 500 test documents to pass to the stored procedure. The main function calls InsertDocuments 10 times, inserting 5,000 documents overall. Running this application results in 500 documents getting inserted every few seconds. If I increase the number of documents per call I start to get errors and lost documents.
Can anyone recommend a faster way to insert documents?
static void Main(string[] args)
{
Console.WriteLine("Starting...");
MainAsync().Wait();
}
static async Task MainAsync()
{
int campaignId = 1001,
count = 500;
for (int i = 0; i < 10; i++)
{
await InsertDocuments(campaignId, (count * i) + 1, (count * i) + count);
}
}
static async Task InsertDocuments(int campaignId, int startId, int endId)
{
using (DocumentClient client = new DocumentClient(new Uri(documentDbUrl), documentDbKey))
{
List<dynamic> items = new List<dynamic>();
// Create x number of documents to insert
for (int i = startId; i <= endId; i++)
{
var item = new
{
id = Guid.NewGuid(),
campaignId = campaignId,
userId = i,
status = "Pending"
};
items.Add(item);
}
var task = client.ExecuteStoredProcedureAsync<dynamic>("/dbs/default/colls/campaignusers/sprocs/bulkImport", new RequestOptions()
{
PartitionKey = new PartitionKey(campaignId)
},
new
{
items = items
});
try
{
await task;
int insertCount = (int)task.Result.Response;
Console.WriteLine("{0} documents inserted...", insertCount);
}
catch (Exception e)
{
Console.WriteLine("Error: {0}", e.Message);
}
}
}
The fastest way to insert documents into Azure DocumentDB is available as a sample on GitHub: https://github.com/Azure/azure-documentdb-dotnet/tree/master/samples/documentdb-benchmark
The following tips will help you achieve the best throughput using the .NET SDK (a minimal client setup following them is sketched after the list):
Initialize a singleton DocumentClient
Use Direct connectivity and TCP protocol (ConnectionMode.Direct and ConnectionProtocol.Tcp)
Use 100s of Tasks in parallel (depends on your hardware)
Increase the MaxConnectionLimit in the DocumentClient constructor to a high value, say 1000 connections
Turn gcServer on
Make sure your collection has the appropriate provisioned throughput (and a good partition key)
Running in the same Azure region will also help
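For example, a minimal client setup along those lines (endpoint, key, and the exact connection limit are placeholders):

// Singleton DocumentClient with Direct/TCP and a raised connection limit.
var connectionPolicy = new ConnectionPolicy
{
    ConnectionMode = ConnectionMode.Direct,
    ConnectionProtocol = Protocol.Tcp,
    MaxConnectionLimit = 1000
};
var client = new DocumentClient(new Uri("https://myaccount.documents.azure.com:443/"),
    "myAuthKey", connectionPolicy);
await client.OpenAsync(); // warm up connections and routing caches before the insert loop
// gcServer is enabled in app.config: <runtime><gcServer enabled="true"/></runtime>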
With 10,000 RU/s, you can insert 100,000 documents in about 50 seconds: at approximately 5 request units per write, that is 10,000 / 5 = 2,000 writes per second, and 100,000 / 2,000 = 50 seconds.
With 100,000 RU/s, you can insert in about 5 seconds. You can make this as fast as you want, by configuring throughput (and for a very high number of inserts, spread inserts across multiple VMs/workers).
EDIT (7/12/19): You can now use the bulk executor library at https://learn.microsoft.com/en-us/azure/cosmos-db/bulk-executor-overview
The Cosmos DB team has just released a bulk import and update SDK. Unfortunately it is only available in .NET Framework 4.5.1, but it apparently does a lot of the heavy lifting for you and maximizes use of throughput. See:
https://learn.microsoft.com/en-us/azure/cosmos-db/bulk-executor-overview
https://learn.microsoft.com/en-us/azure/cosmos-db/sql-api-sdk-bulk-executor-dot-net
The Cosmos DB SDK has been updated to allow bulk insert via the AllowBulkExecution option: https://learn.microsoft.com/en-us/azure/cosmos-db/tutorial-sql-api-dotnet-bulk-import
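A minimal sketch of that option (account endpoint, key, container names, and the CampaignId property are placeholders):

// Bulk mode in the newer SDK: the client batches concurrent point writes for you.
var client = new CosmosClient("https://myaccount.documents.azure.com:443/", "myAuthKey",
    new CosmosClientOptions { AllowBulkExecution = true });
var container = client.GetContainer("mydb", "campaignusers");

var tasks = new List<Task>();
foreach (var item in items) // items: your documents; CampaignId is the partition key property
{
    tasks.Add(container.CreateItemAsync(item, new PartitionKey(item.CampaignId)));
}
await Task.WhenAll(tasks); // the SDK groups these into fewer, larger requests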
Another approach is a stored procedure, as mentioned by other people. A stored procedure requires a partition key, and per the documentation a stored procedure must finish within about 4 seconds, otherwise all records are rolled back. Below is code using the Python Azure DocumentDB SDK and a JavaScript-based stored procedure. I have modified the script and resolved a lot of errors; the code below is working fine:
function bulkimport2(docObject) {
var collection = getContext().getCollection();
var collectionLink = collection.getSelfLink();
// The count of imported docs, also used as current doc index.
var count = 0;
getContext().getResponse().setBody(docObject.items);
//return
// Validate input.
//if (!docObject.items || !docObject.items.length) getContext().getResponse().setBody(docObject);
docObject.items=JSON.stringify(docObject.items)
docObject.items = docObject.items.replace("\\\\r", "");
docObject.items = docObject.items.replace("\\\\n", "");
var docs = JSON.parse(docObject.items);
var docsLength = docs.length; // count documents, not characters of the JSON string
if (docsLength == 0) {
getContext().getResponse().setBody(0);
return;
}
// Call the CRUD API to create a document.
tryCreate(docs[count], callback, collectionLink,count);
// Note that there are 2 exit conditions:
// 1) The createDocument request was not accepted.
// In this case the callback will not be called, we just call setBody and we are done.
// 2) The callback was called docs.length times.
// In this case all documents were created and we don't need to call tryCreate anymore. Just call setBody and we are done.
function tryCreate(doc, callback, collectionLink,count ) {
doc=JSON.stringify(doc);
if (typeof doc == "undefined") {
getContext().getResponse().setBody(count);
return ;
} else {
doc = doc.replace("\\r", "");
doc = doc.replace("\\n", "");
doc=JSON.parse(doc);
}
getContext().getResponse().setBody(doc);
var isAccepted = collection.upsertDocument(collectionLink, doc, callback);
// If the request was accepted, callback will be called.
// Otherwise report current count back to the client,
// which will call the script again with remaining set of docs.
// This condition will happen when this stored procedure has been running too long
// and is about to get cancelled by the server. This will allow the calling client
// to resume this batch from the point we got to before isAccepted was set to false
if (!isAccepted) {
getContext().getResponse().setBody(count);
}
}
// This is called when collection.createDocument is done and the document has been persisted.
function callback(err, doc, options) {
if (err) throw getContext().getResponse().setBody(err + doc);
// One more document has been inserted, increment the count.
count++;
if (count >= docsLength) {
// If we have created all documents, we are done. Just set the response.
getContext().getResponse().setBody(count);
return ;
} else {
// Create next document.
tryCreate(docs[count], callback, collectionLink,count);
}
}
}
EDIT: call getContext().getResponse().setBody(count); return; when all records have been processed completely.
Python script to load the stored procedure and do the batch import:
# Initialize the Python DocumentDB client
client = document_client.DocumentClient(config['ENDPOINT'], {'masterKey': config['MASTERKEY'] ,'DisableSSLVerification' : 'true' })
# Create a database
#db = client.CreateDatabase({ 'id': config['DOCUMENTDB_DATABASE'] })
db=client.ReadDatabases({ 'id': 'db2' })
print(db)
# Create collection options
options = {
'offerEnableRUPerMinuteThroughput': True,
'offerVersion': "V2",
'offerThroughput': 400
}
# Create a collection
#collection = client.CreateCollection('dbs/db2' , { 'id': 'coll2'}, options)
#collection = client.CreateCollection({ 'id':'db2'},{ 'id': 'coll2'}, options)
database_link = 'dbs/db2'
collection_link = database_link + '/colls/coll2'
"""
#List collections
collection = client.ReadCollection(collection_link)
print(collection)
print('Databases:')
databases = list(client.ReadDatabases())
if not databases:
print('No Databases:')
for database in databases:
print(database['id'])
"""
# Create some documents
"""
document1 = client.CreateDocument(collection['_self'],
{
'Web Site': 0,
'Cloud Service': 0,
'Virtual Machine': 0,
'name': 'some'
})
document2 = client.CreateDocument(collection['_self'],
{
'Web Site': 1,
'Cloud Service': 0,
'Virtual Machine': 0,
'name': 'some'
})
"""
# Query them in SQL
"""
query = { 'query': 'SELECT * FROM server s' }
options = {}
options['enableCrossPartitionQuery'] = True
options['maxItemCount'] = 20
#result_iterable = client.QueryDocuments(collection['_self'], query, options)
result_iterable = client.QueryDocuments(collection_link, query, options)
results = list(result_iterable);
print(results)
"""
##How to store procedure and use it
"""
sproc3 = {
'id': 'storedProcedure2',
'body': (
'function (input) {' +
' getContext().getResponse().setBody(' +
' \'a\' + input.temp);' +
'}')
}
retrieved_sproc3 = client.CreateStoredProcedure(collection_link,sproc3)
result = client.ExecuteStoredProcedure('dbs/db2/colls/coll2/sprocs/storedProcedure3',{'temp': 'so'})
"""
## delete all records in collection
"""
result = client.ExecuteStoredProcedure('dbs/db2/colls/coll2/sprocs/bulkDeleteSproc',"SELECT * FROM c ORDER BY c._ts DESC ")
print(result)
"""
multiplerecords="""[{
"Virtual Machine": 0,
"name": "some",
"Web Site": 0,
"Cloud Service": 0
},
{
"Virtual Machine": 0,
"name": "some",
"Web Site": 1,
"Cloud Service": 0
}]"""
multiplerecords=json.loads(multiplerecords)
print(multiplerecords)
print(str(json.dumps(json.dumps(multiplerecords).encode('utf8'))))
#bulkloadresult = client.ExecuteStoredProcedure('dbs/db2/colls/coll2/sprocs/bulkImport',json.dumps(multiplerecords).encode('utf8'))
#bulkloadresult = client.ExecuteStoredProcedure('dbs/db2/colls/coll2/sprocs/bulkImport',json.dumps(json.loads(r'{"items": [{"name":"John","age":30,"city":"New York"},{"name":"John","age":30,"city":"New York"}]}')).encode('utf8'))
str1='{name":"John","age":30,"city":"New York","PartitionKey" : "Morisplane"}'
str2='{name":"John","age":30,"city":"New York","partitionKey" : "Morisplane"}'
key1=base64.b64encode(str1.encode("utf-8"))
key2=base64.b64encode(str2.encode("utf-8"))
data= {"items":[{"id": key1 ,"name":"John","age":30,"city":"Morisplane","PartitionKey" : "Morisplane" },{"id": key2,"name":"John","age":30,"city":"Morisplane","partitionKey" : "Morisplane"}] , "city": "Morisplane", "partitionKey" : "Morisplane"}
print(repr(data))
#retrieved_sproc3 =client.DeleteStoredProcedure('dbs/db2/colls/coll2/sprocs/bulkimport2')
sproc3 = {
'id': 'bulkimport2',
'body': (
"""function bulkimport2(docObject) {
var collection = getContext().getCollection();
var collectionLink = collection.getSelfLink();
// The count of imported docs, also used as current doc index.
var count = 0;
getContext().getResponse().setBody(docObject.items);
//return
// Validate input.
//if (!docObject.items || !docObject.items.length) getContext().getResponse().setBody(docObject);
docObject.items=JSON.stringify(docObject.items)
docObject.items = docObject.items.replace("\\\\r", "");
docObject.items = docObject.items.replace("\\\\n", "");
var docs = JSON.parse(docObject.items);
var docsLength = docs.length; // count documents, not characters of the JSON string
if (docsLength == 0) {
getContext().getResponse().setBody(0);
return;
}
// Call the CRUD API to create a document.
tryCreate(docs[count], callback, collectionLink,count);
// Note that there are 2 exit conditions:
// 1) The createDocument request was not accepted.
// In this case the callback will not be called, we just call setBody and we are done.
// 2) The callback was called docs.length times.
// In this case all documents were created and we don't need to call tryCreate anymore. Just call setBody and we are done.
function tryCreate(doc, callback, collectionLink,count ) {
doc=JSON.stringify(doc);
if (typeof doc == "undefined") {
getContext().getResponse().setBody(count);
return ;
} else {
doc = doc.replace("\\r", "");
doc = doc.replace("\\n", "");
doc=JSON.parse(doc);
}
getContext().getResponse().setBody(doc);
var isAccepted = collection.upsertDocument(collectionLink, doc, callback);
// If the request was accepted, callback will be called.
// Otherwise report current count back to the client,
// which will call the script again with remaining set of docs.
// This condition will happen when this stored procedure has been running too long
// and is about to get cancelled by the server. This will allow the calling client
// to resume this batch from the point we got to before isAccepted was set to false
if (!isAccepted) {
getContext().getResponse().setBody(count);
}
}
// This is called when collection.createDocument is done and the document has been persisted.
function callback(err, doc, options) {
if (err) throw getContext().getResponse().setBody(err + doc);
// One more document has been inserted, increment the count.
count++;
if (count >= docsLength) {
// If we have created all documents, we are done. Just set the response.
getContext().getResponse().setBody(count);
return ;
} else {
// Create next document.
tryCreate(docs[count], callback, collectionLink,count);
}
}
}"""
)
}
#retrieved_sproc3 = client.CreateStoredProcedure(collection_link,sproc3)
bulkloadresult = client.ExecuteStoredProcedure('dbs/db2/colls/coll2/sprocs/bulkimport2', data , {"partitionKey" : "Morisplane"} )
print(repr(bulkloadresult))
private async Task<T> ExecuteDataUpload<T>(IEnumerable<object> data,PartitionKey partitionKey)
{
using (var client = new DocumentClient(m_endPointUrl, m_authKey, connPol))
{
while (true)
{
try
{
var result = await client.ExecuteStoredProcedureAsync<T>(m_spSelfLink, new RequestOptions { PartitionKey = partitionKey }, data);
return result;
}
catch (DocumentClientException ex)
{
if (429 == (int)ex.StatusCode)
{
Thread.Sleep(ex.RetryAfter);
continue;
}
if (HttpStatusCode.RequestTimeout == ex.StatusCode)
{
Thread.Sleep(ex.RetryAfter);
continue;
}
throw; // rethrow without resetting the stack trace
}
catch (Exception)
{
Thread.Sleep(TimeSpan.FromSeconds(1));
continue;
}
}
}
}
public async Task uploadData(IEnumerable<object> data, string partitionKey)
{
int groupSize = 600;
int dataSize = data.Count();
int chunkSize = dataSize > groupSize ? groupSize : dataSize;
List<Task> uploadTasks = new List<Task>();
while (dataSize > 0)
{
IEnumerable<object> chunkData = data.Take(chunkSize);
object[] taskData = new object[3];
taskData[0] = chunkData;
taskData[1] = chunkSize;
taskData[2] = partitionKey;
uploadTasks.Add(Task.Factory.StartNew(async (arg) =>
{
object[] reqdData = (object[])arg;
int chunkSizes = (int)reqdData[1];
IEnumerable<object> chunkDatas = (IEnumerable<object>)reqdData[0];
var partKey = new PartitionKey((string)reqdData[2]);
int chunkDatasCount = chunkDatas.Count();
while (chunkDatasCount > 0)
{
int insertedCount = await ExecuteDataUpload<int>(chunkDatas, partKey);
chunkDatas = chunkDatas.Skip(insertedCount);
chunkDatasCount = chunkDatasCount - insertedCount;
}
}, taskData).Unwrap()); // Unwrap so WhenAll waits for the inner async work
data = data.Skip(chunkSize);
dataSize = dataSize - chunkSize;
chunkSize = dataSize > groupSize ? groupSize : dataSize;
}
await Task.WhenAll(uploadTasks);
}
Now call uploadData in parallel with the list of objects you want to upload. Just keep one thing in mind: each call should send data belonging to one partition key only.
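For example, a sketch of the calling side (the record type, its CampaignId property, and the uploader instance are hypothetical):

// Group records by partition key and fan out one uploadData call per group,
// since each call must target a single partition key.
var groups = records.GroupBy(r => r.CampaignId.ToString());
var uploads = groups.Select(g => uploader.uploadData(g.Cast<object>(), g.Key));
await Task.WhenAll(uploads);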
I'm using Roslyn to try to compile and run code at runtime. I've used some code I found online and have it somewhat working.
public Type EvalTableScript(string Script, CRMMobileFramework.EnbuUtils EnbuUtils, CRMMobileFramework.Includes.DBAdapter dbConn)
{
var syntaxTree = SyntaxTree.ParseText(Script);
var compilation = Compilation.Create("EnbuScript.dll",
options: new CompilationOptions(outputKind: OutputKind.DynamicallyLinkedLibrary),
references: new[]
{
new MetadataFileReference(typeof(object).Assembly.Location),
new MetadataFileReference(typeof(EnbuUtils).Assembly.Location),
new MetadataFileReference(typeof(DBAdapter).Assembly.Location),
MetadataFileReference.CreateAssemblyReference("System.Data"),
MetadataFileReference.CreateAssemblyReference("System.Linq"),
MetadataFileReference.CreateAssemblyReference("System"),
MetadataFileReference.CreateAssemblyReference("System.XML")
},
syntaxTrees: new[] { syntaxTree });
var diagnostics = compilation.GetDiagnostics();
foreach (var diagnostic in diagnostics)
{
Console.WriteLine("Error: {0}", diagnostic.Info.GetMessage());
}
Assembly assembly;
using (var stream = new MemoryStream())
{
EmitResult emitResult = compilation.Emit(stream);
assembly = Assembly.Load(stream.GetBuffer());
}
Type ScriptClass = assembly.GetType("EnbuScript");
// Pass back the entire class so we can call it at the appropriate time.
return ScriptClass;
}
Then I'm trying to call this:
string Script = #"
using System;
using System.Data;
using System.IO;
using System.Linq;
public class EnbuScript
{
public string PostInsertRecord(CRMMobileFramework.EnbuUtils EnbuUtils,CRMMobileFramework.Includes.DBAdapter dbConn)
{
string ScriptTable = ""QuoteItems"";
DataSet EntityRecord = dbConn.FindRecord(""*"", ScriptTable, ""QuIt_LineItemID='"" + EnbuUtils.GetContextInfo(ScriptTable) + ""'"", """", 1, 1, false);
string OrderId = EntityRecord.Tables[""item""].Rows[0][""QuIt_orderquoteid""].ToString();
string UpdateOrderTotalCommand = ""UPDATE Quotes SET Quot_nettamt = (select SUM(QuIt_listprice * quit_quantity) from QuoteItems where quit_orderquoteid = "" + OrderId + "" ) where Quot_OrderQuoteID = "" + OrderId;
dbConn.ExecSql(UpdateOrderTotalCommand);
return ""Complete"";
}
}";
Type EnbuScript = EnbuUtils.EvalTableScript(Script, EnbuUtils, dbConn);
MethodInfo methodInfo = EnbuScript.GetMethod("PostInsertRecord");
object[] parameters = { EnbuUtils, dbConn };
string InsertRecordResult = methodInfo.Invoke(null, parameters).ToString();
As you can see I've been messing around with trying to pass parameters to the compilation.
Basically I've got four functions I need to support, which will come in as strings. What I'm trying to do is create a class for these four functions and compile and run them. This part works.
What I now need to be able to do is pass class instances to this. In the code you'll see a dbConn, which is basically my database connection. I need to pass the instance of this to the method I'm calling at runtime so it has its correct context.
I have another implementation of this where I'm using the Roslyn session. I originally tried to use that and override my function at runtime, but it didn't work either. See below for what I tried:
public static void EvalTableScript(ref EnbuUtils EnbuUtils, DBAdapter dbConn, string EvaluateString)
{
ScriptEngine roslynEngine = new ScriptEngine();
Roslyn.Scripting.Session Session = roslynEngine.CreateSession(EnbuUtils);
Session.AddReference(EnbuUtils.GetType().Assembly);
Session.AddReference(dbConn.GetType().Assembly);
Session.AddReference("System.Web");
Session.AddReference("System.Data");
Session.AddReference("System");
Session.AddReference("System.XML");
Session.ImportNamespace("System");
Session.ImportNamespace("System.Web");
Session.ImportNamespace("System.Data");
Session.ImportNamespace("CRMMobileFramework");
Session.ImportNamespace("CRMMobileFramework.Includes");
try
{
var result = (string)Session.Execute(EvaluateString);
}
catch (Exception ex)
{
}
}
I tried to call this using:
string PostInsertRecord = "" +
" public override void PostInsertRecord() " +
"{ " +
" string ScriptTable = \"QuoteItems\"; " +
"DataSet EntityRecord = dbConn.FindRecord(\"*\", ScriptTable, \"QuIt_LineItemID='\" + EnbuUtils.GetContextInfo(ScriptTable) + \"'\", \"\", 1, 1, false); " +
"string OrderId = EntityRecord.Tables[\"item\"].Rows[0][\"QuIt_orderquoteid\"].ToString(); " +
"string UpdateOrderTotalCommand = \"UPDATE Quotes SET Quot_nettamt = (select SUM(QuIt_listprice * quit_quantity) from QuoteItems where quit_orderquoteid = \" + OrderId + \" ) where Quot_OrderQuoteID = \" + OrderId; " +
"dbConn.ExecSql(UpdateOrderTotalCommand); " +
"} ";
The function is declared as a public virtual void in the EnbuUtils class but it says it doesn't have a suitable method to override.
Safe to say, I'm stumped!
Any help appreciated!
Thanks
I got this in the end - the first method was very close to what I actually needed. I changed the method to static and had to add a few references, including the full namespace.
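For anyone landing here, a sketch of the shape that ended up working (names are from the question; this is a reconstruction, not the poster's verbatim code):

// The script method is made static so it can be invoked without an instance,
// and the parameter types use their full namespaces.
string Script = @"
using System;
using System.Data;
public class EnbuScript
{
    public static string PostInsertRecord(CRMMobileFramework.EnbuUtils EnbuUtils,
        CRMMobileFramework.Includes.DBAdapter dbConn)
    {
        // ... body as in the question ...
        return ""Complete"";
    }
}";

Type scriptClass = EnbuUtils.EvalTableScript(Script, EnbuUtils, dbConn);
MethodInfo methodInfo = scriptClass.GetMethod("PostInsertRecord");
object[] parameters = { EnbuUtils, dbConn };
// Static method: the first argument to Invoke is null (no instance needed).
string result = (string)methodInfo.Invoke(null, parameters);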
I am trying to write a web service that can access my Exchange server and search for names, companies, and cities. At the moment I get the names like this:
ExchangeServiceBinding esb = new ExchangeServiceBinding();
esb.UseDefaultCredentials = true;
// Create the ResolveNamesType and set
// the unresolved entry.
ResolveNamesType rnType = new ResolveNamesType();
rnType.ReturnFullContactData = true;
rnType.UnresolvedEntry = "searchname";
// Resolve names.
ResolveNamesResponseType resolveNamesResponse
= esb.ResolveNames(rnType);
ArrayOfResponseMessagesType responses
= resolveNamesResponse.ResponseMessages;
// Check the result.
if (responses.Items.Length > 0 && responses.Items[0].ResponseClass != ResponseClassType.Error)
{
ResolveNamesResponseMessageType responseMessage = responses.Items[0] as
ResolveNamesResponseMessageType;
// Display the resolution information.
ResolutionType[] resolutions = responseMessage.ResolutionSet.Resolution;
foreach (ResolutionType resolution in resolutions)
{
Console.WriteLine(
"Name: " +
resolution.Contact.DisplayName
);
Console.WriteLine(
"EmailAddress: " +
resolution.Mailbox.EmailAddress
);
if (resolution.Contact.PhoneNumbers != null)
{
foreach (
PhoneNumberDictionaryEntryType phone
in resolution.Contact.PhoneNumbers)
{
Console.WriteLine(
phone.Key.ToString() +
" : " +
phone.Value
);
}
}
Console.WriteLine(
"Office location:" +
resolution.Contact.OfficeLocation
);
Console.WriteLine("\n");
}
}
But does anybody know how I can search for properties like Company and Street?
EWS only has limited directory operations. If you're using on-premises Exchange, the easiest way to do this is to use LDAP and look up Active Directory directly. The ResolveNames operation is meant to resolve a partial name and doesn't work with any other properties. If you have Exchange 2013, there is the FindPeople operation http://msdn.microsoft.com/en-us/library/office/jj191039(v=exchg.150).aspx which supports a QueryString, which should work if those properties are indexed. e.g.
EWSProxy.FindPeopleType fpType = new EWSProxy.FindPeopleType();
EWSProxy.IndexedPageViewType indexPageView = new EWSProxy.IndexedPageViewType();
indexPageView.BasePoint = EWSProxy.IndexBasePointType.Beginning;
indexPageView.Offset = 0;
indexPageView.MaxEntriesReturned = 100;
indexPageView.MaxEntriesReturnedSpecified = true;
fpType.IndexedPageItemView = indexPageView;
fpType.ParentFolderId = new EWSProxy.TargetFolderIdType();
EWSProxy.DistinguishedFolderIdType Gal = new EWSProxy.DistinguishedFolderIdType();
Gal.Id = EWSProxy.DistinguishedFolderIdNameType.directory;
fpType.QueryString = "Office";
fpType.ParentFolderId.Item = Gal;
EWSProxy.FindPeopleResponseMessageType fpm = null;
do
{
fpm = esb.FindPeople(fpType);
if (fpm.ResponseClass == EWSProxy.ResponseClassType.Success)
{
foreach (EWSProxy.PersonaType PsCnt in fpm.People)
{
Console.WriteLine(PsCnt.EmailAddress.EmailAddress);
}
indexPageView.Offset += fpm.People.Length;
}
else
{
throw new Exception("Error");
}
} while (fpm.TotalNumberOfPeopleInView > indexPageView.Offset);
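And for the LDAP route on on-premises Exchange, a minimal sketch using System.DirectoryServices (the domain path and filter values are placeholders; company and l are the standard AD attributes for company and city):

using System.DirectoryServices;

// Search Active Directory directly, filtering on company and city.
using (var root = new DirectoryEntry("LDAP://DC=contoso,DC=com"))
using (var searcher = new DirectorySearcher(root))
{
    searcher.Filter = "(&(objectCategory=person)(company=Contoso*)(l=Berlin))";
    searcher.PropertiesToLoad.Add("displayName");
    searcher.PropertiesToLoad.Add("mail");
    searcher.PropertiesToLoad.Add("streetAddress");
    foreach (SearchResult result in searcher.FindAll())
    {
        // Assumes displayName is populated on the matched entries.
        Console.WriteLine(result.Properties["displayName"][0]);
    }
}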
Cheers
Glen