What is the most efficient way to check whether a table in Azure Table Storage contains any data?
Example:
cloudTable.ExecuteQuery("what do i do");
Thanks for the suggestions! I finally did something like this:
var query = new TableQuery<T>()
{
    TakeCount = 1,
    SelectColumns = new List<string>()
    {
        "PartitionKey"
    }
};
var table = await this.GetTableAsync();
var segment = await table.ExecuteQuerySegmentedAsync(query, null);
return segment.Results.Any();
You could do this, which limits the query to a single record if there is any data:
// select only a small column to avoid returning the whole record
var query = new TableQuery().Select(new List<string>() { "smallcolumn" }).Take(1);
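To actually run the check, the query above still has to be executed; a minimal sketch, assuming `table` is a `CloudTable` you've already obtained (as in the accepted answer):

```csharp
// Fetch one segment of the Take(1) query; any result means the table has data.
var segment = await table.ExecuteQuerySegmentedAsync(query, null);
bool hasData = segment.Results.Count > 0;
```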
How can I sort by last created (oldest record first) and limit the results to 20 records from a DynamoDB table using the BatchGetItemAsync method? Thanks in advance.
var table = Table.LoadTable(client, TableName);
var request = new BatchGetItemRequest
{
    RequestItems = new Dictionary<string, KeysAndAttributes>()
    {
        {
            TableName,
            new KeysAndAttributes
            {
                AttributesToGet = new List<string> { "ID", "Status", "Date" },
                Keys = new List<Dictionary<string, AttributeValue>>()
                {
                    new Dictionary<string, AttributeValue>()
                    {
                        { "Status", new AttributeValue { S = "Accepted" } }
                    }
                }
            }
        }
    }
};
var response = await client.BatchGetItemAsync(request);
var results = response.Responses;
var result = results[TableName];
There isn't a way to do what you're asking with BatchGetItemAsync. That call is for fetching specific records when you already know the exact keys you're looking for. You'll need to use a Query instead, and you'll want to store your data in a structure that supports this access pattern. There was a really great session on DynamoDB access patterns at re:Invent 2018; I suggest watching it: https://www.youtube.com/watch?v=HaEPXoXVf2k
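As a rough sketch of such a query, assuming a hypothetical global secondary index named `StatusDateIndex` with `Status` as its partition key and `Date` as its sort key (these index and attribute names are illustrative, not confirmed by the question):

```csharp
// Query the 20 oldest "Accepted" items: ascending sort on the Date sort key.
// "StatusDateIndex" is an assumed GSI name; #s aliases Status, a DynamoDB reserved word.
var queryRequest = new QueryRequest
{
    TableName = TableName,
    IndexName = "StatusDateIndex",
    KeyConditionExpression = "#s = :status",
    ExpressionAttributeNames = new Dictionary<string, string> { { "#s", "Status" } },
    ExpressionAttributeValues = new Dictionary<string, AttributeValue>
    {
        { ":status", new AttributeValue { S = "Accepted" } }
    },
    ScanIndexForward = true, // ascending => oldest first
    Limit = 20
};
var queryResponse = await client.QueryAsync(queryRequest);
```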
I'm developing a project using Cosmos DB and the Microsoft Azure DocumentDB library with C#.
I want to execute a stored procedure that retrieves all the records of a container, but the code retrieves only 100 records.
The code is the following :
string collectionToUse;
string partition;
if (typeof(T).ToString().IndexOf("Telemetry") != -1)
{
    DocumentDBRepository<EBB.Web.Telemerty.Models.Telemetry>.Initialize();
    collectionToUse = AppSettings.collection;
    partition = "id";
}
else
{
    DocumentDBRepository<EBB.Web.Telemerty.Models.Events>.Initialize();
    collectionToUse = AppSettings.collection2;
    partition = "uid";
}
Uri uri = UriFactory.CreateStoredProcedureUri(AppSettings.database, collectionToUse, AppSettings.spName);
RequestOptions options = new RequestOptions { PartitionKey = new PartitionKey(partition) };
var result = await client.ExecuteStoredProcedureAsync<string>(uri, options, null);
List<T> list = JsonConvert.DeserializeObject<List<T>>(result.Response);
return list;
What is the problem?
Thanks in advance for your help.
Simone
In Cosmos DB, a stored procedure cannot return all the documents in a container, because the scope of a stored procedure execution isn't the full container but ONLY the logical partition you specify in the RequestOptions. This means your SP will only return documents whose partition key has that value. (Note also that `new PartitionKey("id")` targets documents whose partition-key *value* is the literal string "id", not the property named `id`.)
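To read every document regardless of partition, a client-side cross-partition query is the usual approach. A minimal sketch with the same DocumentDB SDK, reusing `client`, `AppSettings`, and `collectionToUse` from the question:

```csharp
// Page through all documents in the container across all partitions.
var query = client.CreateDocumentQuery<T>(
        UriFactory.CreateDocumentCollectionUri(AppSettings.database, collectionToUse),
        new FeedOptions { EnableCrossPartitionQuery = true, MaxItemCount = -1 })
    .AsDocumentQuery();

var all = new List<T>();
while (query.HasMoreResults)
{
    // ExecuteNextAsync follows the continuation token for each page.
    all.AddRange(await query.ExecuteNextAsync<T>());
}
return all;
```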
I have 2 collections in my database. Let's say collection_1 and collection_2. I want to copy or move all of my documents in collection_1 to collection_2 using C#.
Any idea please?
Here's a solution for copying between databases. If both collections are in the same database it's even simpler: just use one MongoClient.
var fromConnectionString = "mongodb://localhost:27017"; // if copy between same database then obviously you only need one connectionstring and one MongoClient
var toConnectionString = "mongodb://localhost:27017";
var sourceClient = new MongoClient(fromConnectionString);
var copyFromDb = sourceClient.GetDatabase("CopyFromDatabaseName");
var copyCollection = copyFromDb.GetCollection<BsonDocument>("FromCollectionName").AsQueryable(); // or use the c# class in the collection
var targetClient = new MongoClient(toConnectionString);
var targetMongoDb = targetClient.GetDatabase("CopyToDatabase");
var targetCollection = targetMongoDb.GetCollection<BsonDocument>("ToCollectionName");
targetCollection.InsertMany(copyCollection);
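For a large collection, pulling everything through `AsQueryable()` in one go can use a lot of memory. A hedged alternative, using the same client and database variables as above, is to stream the source with a cursor and insert batch by batch:

```csharp
// Stream the source collection in driver-sized batches instead of
// materializing the whole collection before inserting.
var sourceCollection = copyFromDb.GetCollection<BsonDocument>("FromCollectionName");
using (var cursor = sourceCollection.Find(Builders<BsonDocument>.Filter.Empty).ToCursor())
{
    while (cursor.MoveNext())
    {
        var batch = cursor.Current.ToList();
        if (batch.Count > 0)
            targetCollection.InsertMany(batch);
    }
}
```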
With a database query (note that `db.cloneCollection()` was deprecated in MongoDB 4.0 and removed in 4.2):
Source: https://docs.mongodb.com/manual/reference/method/db.cloneCollection/
db.cloneCollection('mongodb.example.net:27017', 'profiles', { 'active' : true } )
With C#
Source: Duplicate a mongodb collection
var source = db.GetCollection("test");
var dest = db.GetCollection("testcopy");
dest.InsertBatch(source.FindAll());
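`FindAll`/`InsertBatch` belong to the legacy 1.x C# driver. With the current 2.x driver, a same-database copy can also be pushed server-side with an aggregation `$out` stage (collection names as above; a sketch, assuming `db` is an `IMongoDatabase`):

```csharp
// $out writes the pipeline's output into "testcopy", replacing it if it exists.
// Out() executes the pipeline when called.
var sourceColl = db.GetCollection<BsonDocument>("test");
sourceColl.Aggregate().Out("testcopy");
```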
I need to filter vertices in an Azure Cosmos DB graph (Gremlin API) by a property containing a given value. I tried the code below, but I get an error saying "Unable to find any method 'filter'".
var g = client.CreateTraversalSource();
var p = new P("containing", text);
var query = g.V().Filter(p).Range<Vertex>(page, pageSize);
var result = await client.ExecuteAsync<IEnumerable<Vertex>>(query);
Any idea how to achieve this?
This might help someone else. I managed to figure it out with some help from a friend:
var p = new P("containing", text);
var query = g.V().Has("propertyName", p).Range<Vertex>(page, pageSize);
var result = await client.ExecuteAsync<IEnumerable<Vertex>>(query);
In case anyone is still looking into this, there are predefined predicate values in the TextP class that can be used as string filters.
The above can be accomplished with the following:
var query = g.V().Has("propertyName", TextP.Containing(text)).Range<Vertex>(page, pageSize);
var result = await client.ExecuteAsync<IEnumerable<Vertex>>(query);
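Gremlin.Net's `TextP` class exposes the other string predicates as well, which drop in the same way (property name assumed as above; whether the server accepts each predicate depends on the Cosmos DB Gremlin API version):

```csharp
// Other TextP predicates available in Gremlin.Net:
g.V().Has("propertyName", TextP.StartingWith(text));   // prefix match
g.V().Has("propertyName", TextP.EndingWith(text));     // suffix match
g.V().Has("propertyName", TextP.NotContaining(text));  // negated substring match
```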
So I have a list that comes from a database, on which I'm using LINQ to find items. Here's how I do it:
public static List<myEntity> GetEntities()
{
    List<myEntity> result = new List<myEntity>();
    var entities = _data.AsEnumerable()
        .Where(ent => ent.Field<string>(Constants.MODEL_DESC).Split(';')[0] == Constants.MODEL_ENTITY);
    foreach (var entity in entities)
    {
        myEntity temp = new myEntity();
        string[] zzzDESC = entity.Field<string>(Constants.MODEL_DESC).Split(';');
        temp.ID = Convert.ToInt16(zzzDESC[1]);
        temp.icon = zzzDESC[2].Replace(".bmp", "");
        temp.Translations = GetTranslation(Convert.ToInt16(zzzDESC[1]));
        temp.tableName = GetTableName(temp.ID);
        temp.attributes = GetAttributes(temp.ID);
        result.Add(temp);
    }
    return result;
}
So basically there are 3 columns in my table, and one of them stores the useful data separated by ";". Inside this function I call GetTableName (returns a string), GetAttributes (returns a list of class Attribute) and GetTranslation(), which uses the same kind of logic as GetEntities().
private static List<Translation> GetTranslation(int id)
{
    List<Translation> result = new List<Translation>();
    var translations = _data.AsEnumerable()
        .Where(trl => trl.Field<string>(Constants.MODEL_DESC).Split(';')[0] == Constants.MODEL_TRANSLATION)
        .Where(trl => trl.Field<string>(Constants.MODEL_DESC).Split(';')[2] == id.ToString());
    foreach (var trl in translations)
    {
        Translation temp = new Translation();
        string[] ZZZDesc = trl.Field<string>(Constants.MODEL_DESC).Split(';');
        temp.ID = Convert.ToInt16(ZZZDesc[1]);
        temp.ParentID = Convert.ToInt16(ZZZDesc[2]);
        temp.Language = (Language)Convert.ToInt16(ZZZDesc[3]);
        temp.ShortName = ZZZDesc[4];
        temp.LongName = ZZZDesc[5];
        temp.HelpID = ZZZDesc[6];
        temp.Description = ZZZDesc[7];
        result.Add(temp);
    }
    return result;
}
They all query the same data, which in this case is _data, a DataTable. I can already tell that the process is very slow, and I'm fairly sure it's because I have to query the same table over and over to find what I'm searching for. The table itself has around 8000 rows, so it's not that much.
I thought I might delete the rows once I've extracted their data, but I don't know if that's a good idea, since I'd have to find the rows I was working with again (one more pass over _data per function) before deleting them. Or maybe I could do it with LINQ, if that's possible, at the same time as I build the entities, translations, and so on.
Or is it just my way of working with LINQ that's the problem?
IEnumerable uses lazy evaluation, so each of those queries re-scans and re-splits the table every time it's enumerated. Materialize the data into a list on the first call instead and it may quicken the process.
If you are reading the table a lot, you might want to keep it in a cache.
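As a sketch of that idea, assuming the same `_data`, `Constants`, and `Translation` types from the question (the helper name `RowsByType` is mine):

```csharp
// Split every row once and group by the record-type prefix, so each
// Get* call scans a pre-parsed group instead of re-splitting 8000 rows.
private static ILookup<string, string[]> _rowsByType;

private static ILookup<string, string[]> RowsByType()
{
    return _rowsByType ?? (_rowsByType = _data.AsEnumerable()
        .Select(r => r.Field<string>(Constants.MODEL_DESC).Split(';'))
        .ToLookup(parts => parts[0]));
}

private static List<Translation> GetTranslation(int id)
{
    // Only translation rows are enumerated here, and they are already split.
    return RowsByType()[Constants.MODEL_TRANSLATION]
        .Where(parts => parts[2] == id.ToString())
        .Select(parts => new Translation
        {
            ID = Convert.ToInt16(parts[1]),
            ParentID = Convert.ToInt16(parts[2]),
            Language = (Language)Convert.ToInt16(parts[3]),
            ShortName = parts[4],
            LongName = parts[5],
            HelpID = parts[6],
            Description = parts[7]
        })
        .ToList();
}
```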