MongoCollection.Update() using C# to update a List&lt;T&gt;

I am using the MongoVue application to preview the data stored in MongoDB.
In the attached image, the database named "Energy" has a collection named "DataLog". "DataLog" contains several rows, which I add to the collection by reading them from a .CSV file.
Sometimes the Pings column holds huge data (say an array of 2000 items) for a single row, which causes an exception because the document exceeds MongoDB's 16 MB maximum document size.
Since the huge Pings array threw the exception, I removed the Pings collection from the row (i.e. inserted an empty collection), tried the insert again, and it succeeded.
Now I want to update the Pings for the same entry, but when the array has around 2000 elements or more, I wish to update it in groups of 500 items (500 x 4 = 2000) in a loop.
Can anyone help me out?
** SAMPLE CODE **
private void InsertData(Datalog xiDatalog)
{
    List<Ping> tempPings = new List<Ping>();
    tempPings.AddRange(xiDatalog.Pings);
    xiDatalog.Pings.RemoveAll(x => x.RowId != 0);
    WriteConcernResult wc = mongoCollection.Insert(xiDatalog);
    counter++;
    var query = new QueryDocument("_id", xiDatalog.Id);
    MongoCursor<Datalog> cursor = mongoCollection.FindAs<Datalog>(query);
    foreach (Datalog data in cursor)
    {
        AddPings(data, tempPings, mongoCollection);
        break;
    }
}
private void AddPings(Datalog xiDatalog, List<Ping> xiPings, MongoCollection<Datalog> mongoCollection)
{
    int groupCnt = 0;
    int insertCnt = 0;
    foreach (Ping px in xiPings)
    {
        xiDatalog.Pings.Add(px);
        groupCnt++;
        if ((groupCnt / 500) > insertCnt)
        {
            UpdateDataLog(xiDatalog.Id, xiDatalog.Pings, mongoCollection);
            insertCnt++;
        }
    }
    // Flush any remaining pings that did not complete a full group of 500.
    if (groupCnt > insertCnt * 500)
    {
        UpdateDataLog(xiDatalog.Id, xiDatalog.Pings, mongoCollection);
    }
}
private bool UpdateDataLog(BsonValue Id, List<Ping> tempPings, MongoCollection<Datalog> mongoCollection)
{
    bool success = false;
    try
    {
        var query = new QueryDocument("_id", Id);
        var update = Update<Datalog>.Set(e => e.Pings, tempPings);
        mongoCollection.Update(query, update);
        success = true;
    }
    catch (Exception ex)
    {
        string error = ex.Message;
    }
    return success;
}

Answer: I just modified the code to use Update.PushAll() instead of Update.Set(). Please refer to the code below:
private bool UpdateDataLog(BsonValue Id, List<Ping> tempPings, MongoCollection<Datalog> mongoCollection)
{
    bool success = false;
    try
    {
        var query = new QueryDocument("_id", Id);
        var update = Update<Datalog>.PushAll(e => e.Pings, tempPings);
        mongoCollection.Update(query, update);
        success = true;
    }
    catch (Exception ex)
    {
        string error = ex.Message;
    }
    return success;
}
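One caveat worth noting (an editorial sketch, not part of the original answer): Set replaces the whole array, so re-sending the growing xiDatalog.Pings list works but rewrites everything each time, while PushAll appends, so sending the accumulated list each time would duplicate items. With PushAll you want to push only the new batch, clear the buffer between updates, and flush any final partial batch. Also note that $pushAll still grows the same document, so the 16 MB cap applies to the final document either way; batching only keeps each individual update message small. A minimal sketch against the same legacy-driver types:
private void AddPingsInBatches(BsonValue id, List<Ping> allPings, MongoCollection<Datalog> collection)
{
    const int batchSize = 500; // illustrative batch size
    var batch = new List<Ping>(batchSize);
    foreach (Ping p in allPings)
    {
        batch.Add(p);
        if (batch.Count == batchSize)
        {
            var query = new QueryDocument("_id", id);
            collection.Update(query, Update<Datalog>.PushAll(e => e.Pings, batch));
            batch.Clear(); // push only new items next time; otherwise PushAll duplicates them
        }
    }
    if (batch.Count > 0) // flush the final partial batch
    {
        var query = new QueryDocument("_id", id);
        collection.Update(query, Update<Datalog>.PushAll(e => e.Pings, batch));
    }
}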

Related

Send row by row without duplicates

I have the following table.
As you can see, there is a column called Integration of type bool. The whole table is shown in a DataGridView through this stored procedure:
CREATE PROCEDURE [dbo].[SP_Payments]
AS
SELECT 'CE-'+CardCode AS CardCode, DocType, Series, DocDate, dbo.udf_GetNumeric(DocNum) AS DocNum,
       DocEntry, TrsfrAcct, TrsfrDate, TrsfrSum, Integration, Comments, SumApplied
FROM PaymentsReceived WHERE Integration = 0
This SP only shows me the rows where Integration is 0, i.e. false. What I do with the false ones is send them through a web service: I have a method that goes through each row and sends it, and every time it sends one it sets the flag to true, so the row disappears from the DataGridView. This method runs inside a timer that fires every 5 seconds, and it contains a condition saying that if Integration == false, the row should be sent. This is the method:
private async void Envio_Timer_Tick(object sender, EventArgs e)
{
    try
    {
        ProxyBL proxy = new ProxyBL();
        foreach (DataGridViewRow Datos in dataGridView1.Rows)
        {
            PagosRecibidos pagos = new PagosRecibidos
            {
                CardCode = Convert.ToString(Datos.Cells[0].Value),
                DocType = Convert.ToString(Datos.Cells[1].Value),
                Series = Convert.ToInt32(Datos.Cells[2].Value),
                DocDate = Convert.ToDateTime(Datos.Cells[3].Value),
                DocEntry = Convert.ToInt32(Datos.Cells[5].Value),
                TrsfrAcct = Convert.ToString(Datos.Cells[6].Value),
                TrsfrDate = Convert.ToDateTime(Datos.Cells[7].Value),
                TrsfrSum = Convert.ToDecimal(Datos.Cells[8].Value),
                Integration = Convert.ToBoolean(Datos.Cells[9].Value),
                Comments = Convert.ToString(Datos.Cells[10].Value),
                SumApplied = Convert.ToDecimal(Datos.Cells[11].Value)
            };
            Inte = pagos.Integration;
            if (Inte == false)
            {
                var EnvioDatos = await proxy.EnviarPago(pagos);
            }
            ListarEmple();
            ListarLog();
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
This is where the EnviarPago (SendPayment) method comes in. In this method I get the response from the service, saying whether the operation succeeded or failed, and insert that into a log:
Consultas c = new Consultas();

public async Task<string> EnviarPago(PagosRecibidos detalle)
{
    try
    {
        ProxyXML xmlProxy = new ProxyXML();
        string respuesta = await xmlProxy.EnviarSAP(detalle);
        c.InsertarLog(1, DateTime.Now, respuesta, xmlProxy.XmlSerializado);
        return respuesta;
    }
    catch (Exception)
    {
        throw; // rethrow without resetting the stack trace
    }
}
After this method comes the actual send, EnviarSAP, where I capture the response; if the operation was successful, I set the Integration column to 1 (true):
readonly Consultas c = new Consultas();
public string XmlSerializado = null;

public async Task<string> EnviarSAP(PagosRecibidos detalle)
{
    try
    {
        using (WSSincronizacionClient clienteSAP = new WSSincronizacionClient())
        {
            XmlSerializado = this.SerializarXml(detalle);
            var respuesta = await clienteSAP.EnviarDatosSAPAsync(XmlSerializado);
            if (respuesta.Contains("true|Operación Exitosa|"))
            {
                c.EditarIntegration(true, Convert.ToInt32(detalle.DocEntry));
            }
            return respuesta;
        }
    }
    catch (Exception)
    {
        throw; // rethrow without resetting the stack trace
    }
}
Everything works correctly, but sometimes a payment is sent twice, that is, it arrives in SAP twice. How can I add a validation so that a row that is false is only sent once, and is never sent twice for any reason? And in which part should I do this validation?
I also don't know why, even with the existing validation that only sends the false ones, it still sends them twice.
The only thing I can think of is that the time to process an individual row exceeds the timer interval, so you might be iterating over the same item in two handler calls in parallel, because there isn't enough time for the first handler to complete its processing before the next tick fires.
Maybe you could flag the row when you fetch it, so that the flag acts as a discriminator telling other timer tick handler calls to ignore the row:
foreach (DataGridViewRow Datos in dataGridView1.Rows)
{
    var alreadyProcessingRow = Convert.ToBoolean(Datos.Cells[{ProcessingFlagColumnIndex}].Value);
    if (alreadyProcessingRow)
        continue; // skip the row, don't reprocess it
    Datos.Cells[{ProcessingFlagColumnIndex}].Value = true; // mark the row as processing
    PagosRecibidos pagos = new PagosRecibidos
    {
        CardCode = Convert.ToString(Datos.Cells[0].Value),
        DocType = Convert.ToString(Datos.Cells[1].Value),
        Series = Convert.ToInt32(Datos.Cells[2].Value),
        DocDate = Convert.ToDateTime(Datos.Cells[3].Value),
        DocEntry = Convert.ToInt32(Datos.Cells[5].Value),
        TrsfrAcct = Convert.ToString(Datos.Cells[6].Value),
        TrsfrDate = Convert.ToDateTime(Datos.Cells[7].Value),
        TrsfrSum = Convert.ToDecimal(Datos.Cells[8].Value),
        Integration = Convert.ToBoolean(Datos.Cells[9].Value),
        Comments = Convert.ToString(Datos.Cells[10].Value),
        SumApplied = Convert.ToDecimal(Datos.Cells[11].Value)
    };
    Inte = pagos.Integration;
    if (Inte == false)
    {
        var EnvioDatos = await proxy.EnviarPago(pagos);
    }
    Datos.Cells[{ProcessingFlagColumnIndex}].Value = false; // reset the flag (not that important if you don't care after processing)
    ListarEmple();
    ListarLog();
}
I'm not really sure whether you can do that with the DataGridView object, but if you can't, you could use something like a ConcurrentDictionary and store the rows (or row IDs) you are currently processing there, in order to check for and avoid duplicate processing; a sketch of that idea follows.
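An editorial sketch of the ConcurrentDictionary idea, assuming DocEntry uniquely identifies a payment (the helper name TrySendAsync and the keying choice are illustrative, not from the original answer):
using System.Collections.Concurrent;

// Rows currently being processed; shared across timer ticks.
private readonly ConcurrentDictionary<int, bool> _inFlight = new ConcurrentDictionary<int, bool>();

private async Task TrySendAsync(PagosRecibidos pagos, ProxyBL proxy)
{
    // TryAdd returns false if another tick is already processing this row.
    if (!_inFlight.TryAdd(pagos.DocEntry, true))
        return;
    try
    {
        if (!pagos.Integration)
            await proxy.EnviarPago(pagos);
    }
    finally
    {
        bool removed;
        _inFlight.TryRemove(pagos.DocEntry, out removed);
    }
}
A simpler alternative is to call Stop() on the timer at the top of the tick handler and Start() again in a finally block, so that ticks can never overlap at all.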

How to convert a large CSV file into JSON without using Split (Out Of Memory issue) in C#

I am trying to parse a 300 MB CSV file and save it to MongoDB. To do that, I need to convert the CSV file into a list of BsonDocument consisting of key-value pairs; each row in the CSV file becomes a new BsonDocument.
Every couple of minutes of parallel testing, I get an OOM exception on the split operation.
I've read this article, which is very interesting, but I couldn't find any practical solution that I could apply to these huge files.
I looked into different CSV helpers, but couldn't find anything that solves this issue.
Any help will be much appreciated.
You should be able to read it line by line like this:
public static void Main()
{
    using (StreamReader sr = new StreamReader(path))
    {
        string[] headers = null;
        string[] curLine;
        while ((curLine = sr.ReadLine().Split(',')) != null)
        {
            if (headers == null)
            {
                headers = curLine;
            }
            else
            {
                processLine(headers, curLine);
            }
        }
    }
}

public static void processLine(string[] headers, string[] line)
{
    for (int i = 0; i < headers.Length; i++)
    {
        string header = headers[i];
        string value = line[i];
        // Now you have individual header/value pairs that you can put into mongodb
    }
}
I've never used MongoDB and I don't know the structure of your CSV or your Mongo schema, so I won't be able to help much there. Hopefully you can get it from here, though. If not, edit your post with some more details about how you need to structure your MongoDB documents and hopefully somebody will post a more helpful answer.
Thank you @dbc, that worked!
@ashbygeek, I needed to add this to your code, because ReadLine() returns null at the end of the stream, so calling Split on its result throws before the null check can run:
while (!sr.EndOfStream && (curLine = sr.ReadLine().Split('\t')) != null)
{
    // do processing
}
So I am uploading my code, which reads my big CSV file from an Azure blob and inserts into MongoDB in batches rather than one document at a time.
I also created my own primary-key hash and a unique index in order to detect duplicate documents; if I find one, I start inserting the batch one by one in order to identify the duplicate.
I hope it will help someone in the future.
// TextFieldParser lives in the Microsoft.VisualBasic.FileIO namespace.
using (TextFieldParser parser = new TextFieldParser(blockBlob2.OpenRead()))
{
    parser.TextFieldType = FieldType.Delimited;
    parser.SetDelimiters("\t");
    bool headerWritten = false;
    List<BsonDocument> listToInsert = new List<BsonDocument>();
    int chunkSize = 50;
    int counter = 0;
    var headers = new string[0];
    while (!parser.EndOfData)
    {
        // Processing row
        var fields = parser.ReadFields();
        if (!headerWritten)
        {
            headers = fields;
            headerWritten = true;
            continue;
        }
        listToInsert.Add(new BsonDocument(headers.Zip(fields, (k, v) => new { k, v }).ToDictionary(x => x.k, x => x.v)));
        counter++;
        if (counter != chunkSize) continue;
        AdditionalInformation(listToInsert, dataCollectionQueueMessage);
        CalculateHashForPrimaryKey(listToInsert);
        await InsertDataIntoDB(listToInsert, dataCollectionQueueMessage);
        counter = 0;
        listToInsert.Clear();
    }
    // Insert whatever is left over after the last full chunk.
    if (listToInsert.Count > 0)
    {
        AdditionalInformation(listToInsert, dataCollectionQueueMessage);
        CalculateHashForPrimaryKey(listToInsert);
        await InsertDataIntoDB(listToInsert, dataCollectionQueueMessage);
    }
}
private async Task InsertDataIntoDB(List<BsonDocument> listToInsert, DataCollectionQueueMessage dataCollectionQueueMessage)
{
    const string connectionString = "mongodb://127.0.0.1/localdb";
    var client = new MongoClient(connectionString);
    _database = client.GetDatabase("localdb");
    var collection = _database.GetCollection<BsonDocument>(dataCollectionQueueMessage.CollectionTypeEnum.ToString());
    await collection.Indexes.CreateOneAsync(new BsonDocument("HashMultipleKey", 1), new CreateIndexOptions() { Unique = true, Sparse = true, });
    try
    {
        await collection.InsertManyAsync(listToInsert);
    }
    catch (Exception ex)
    {
        ApplicationInsights.Instance.TrackException(ex);
        // A duplicate key aborts the bulk insert, so fall back to one-by-one inserts.
        await InsertSingleDocuments(listToInsert, collection, dataCollectionQueueMessage);
    }
}
private async Task InsertSingleDocuments(List<BsonDocument> dataCollectionDict, IMongoCollection<BsonDocument> collection,
    DataCollectionQueueMessage dataCollectionQueueMessage)
{
    ApplicationInsights.Instance.TrackEvent("About to start inserting individual documents and to find the duplicate one");
    foreach (var data in dataCollectionDict)
    {
        try
        {
            await collection.InsertOneAsync(data);
        }
        catch (Exception ex)
        {
            ApplicationInsights.Instance.TrackException(ex, new Dictionary<string, string>() {
                {
                    "Error Message", "Duplicate document was detected, therefore ignoring this document and continuing to insert the next document"
                },
                {
                    "FilePath", dataCollectionQueueMessage.FilePath
                }
            });
        }
    }
}
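CalculateHashForPrimaryKey isn't shown above. A purely hypothetical sketch of what such a method might look like, assuming the key is a SHA-256 hash over a few identifying fields (the field names here are invented; only the "HashMultipleKey" element name matches the unique index created above):
// Hypothetical: hash selected field values into the element the unique index covers.
private static void CalculateHashForPrimaryKey(List<BsonDocument> docs)
{
    using (var sha = System.Security.Cryptography.SHA256.Create())
    {
        foreach (var doc in docs)
        {
            // Concatenate the values that together identify a record (invented names).
            string key = doc.GetValue("Field1", "").ToString() + "|" +
                         doc.GetValue("Field2", "").ToString();
            byte[] hash = sha.ComputeHash(System.Text.Encoding.UTF8.GetBytes(key));
            doc["HashMultipleKey"] = BitConverter.ToString(hash).Replace("-", "");
        }
    }
}
Separately, MongoClient is designed to be created once and reused for the lifetime of the application; hoisting it out of InsertDataIntoDB avoids paying connection setup on every chunk.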

Application crashes on reading a number in string

So I have a Google spreadsheet from which I'm reading some data. For example, my first column looks something like this:
When I change any of the rows to some random string like "abc", my application crashes.
I have also recorded a short video where I demonstrate this in action, with my method wrapped in a try-catch so the app doesn't crash. This is very strange... Is it maybe because of var: if half of the data array consists of integer values, does it become int? What else could it be?
http://screencast.com/t/7OlDOzDZX70R
If I, for example, put strings like "a" and "b" in all rows and insert a number in between, the app crashes again...
I just don't know what the problem might be. It also happens in other rows...
Crash report:
Here is my code that deserializes the JSON when I get it from the web:
private async Task GetDataAsync()
{
    //if (this._table.Count != 0) return;
    this.Table.Clear();
    var jsonObject = await DownloadSpreadsheet.GetJson();
    for (int row = 0; row < jsonObject["rows"].Count(); row++)
    {
        Table table = new Table();
        table.Day = jsonObject["rows"][row]["c"][0]["v"].ToString();
        table.Month = jsonObject["rows"][row]["c"][5]["v"].ToString();
        table.Year = jsonObject["rows"][row]["c"][6]["v"].ToString();
        table.People = jsonObject["rows"][row]["c"][7]["v"].ToString();
        this.Table.Add(table);
    }
}
And here is the model, where all fields are clearly declared as string...
public class Table
{
    [DataMember(Name = "id")]
    public string Id { get; set; }

    [DataMember(Name = "day")]
    public string Day { get; set; }

    [DataMember(Name = "month")]
    public string Month { get; set; }

    [DataMember(Name = "year")]
    public string Year { get; set; }
}
Here is also the method for getting the JSON:
public class DownloadSpreadsheet
{
    // 1tJ64Y8hje0ui4ap9U33h3KWwpxT_-JuVMSZzxD2Er8k
    private static readonly string spreadsheetKey = "1Ka-8bTSo4E7sNmsP41prSQqpjawooAvajnFnLi-jtCI";
    private static string jsonUrlTemplate = "http://spreadsheets.google.com/a/google.com/tq?key={0}";

    async public static Task<JObject> GetJson()
    {
        var url = string.Format(jsonUrlTemplate, spreadsheetKey);
        var client = new HttpClient();
        var response = await client.GetAsync(new Uri(url));
        var rawResult = await response.Content.ReadAsStringAsync();
        // Strip the gviz wrapper: keep everything from the second '{' to the last '}'.
        int start = rawResult.IndexOf("{", rawResult.IndexOf("{") + 1);
        int end = rawResult.LastIndexOf("}");
        String jsonResponse = rawResult.Substring(start, end - start);
        return JObject.Parse(jsonResponse);
    }
}
There is no error catching or checking for null values anywhere in the code, and an unknown error is occurring.
I moved your code into LINQPad and I get an exception on this line:
var result = jsonObject.Result;
for (int row = 0; row < result["rows"].Count(); row++)
{
    table.Day = result["rows"][row]["c"][0]["v"].ToString(); // Exception here
The exception is this:
InvalidOperationException: Cannot access child value on Newtonsoft.Json.Linq.JValue.
I recommend that you put error checking into your code and a way to see the message, such as:
private async Task GetDataAsync()
{
    //if (this._table.Count != 0) return;
    try
    {
        this.Table.Clear();
        var jsonObject = await DownloadSpreadsheet.GetJson();
        for (int row = 0; row < jsonObject["rows"].Count(); row++)
        {
            Table table = new Table();
            table.Day = jsonObject["rows"][row]["c"][0]["v"].ToString();
            table.Month = jsonObject["rows"][row]["c"][5]["v"].ToString();
            table.Year = jsonObject["rows"][row]["c"][6]["v"].ToString();
            table.People = jsonObject["rows"][row]["c"][7]["v"].ToString();
            this.Table.Add(table);
        }
    }
    catch (Exception ex)
    {
        this.Error = ex;
    }
}
One other thing, in my version of the code I had to work off of the result of the task jsonObject.Result before extracting data.
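That exception means one of the cells came back as a bare JValue (or null) rather than an object with a "v" property, so indexing it with ["v"] fails; that matches the crash appearing when a cell's content changes type. An editorial sketch of defensive cell access (the helper name GetCellValue is invented; it assumes the same Json.NET types used above):
// Tolerates null cells and cells that are bare values instead of {"v": ...} objects.
private static string GetCellValue(JToken row, int cellIndex)
{
    JToken cell = row["c"]?[cellIndex];
    if (cell == null || cell.Type == JTokenType.Null)
        return string.Empty;
    // Only index into "v" when the cell is actually an object.
    JToken v = cell.Type == JTokenType.Object ? cell["v"] : cell;
    return v == null || v.Type == JTokenType.Null ? string.Empty : v.ToString();
}
GetDataAsync could then use table.Day = GetCellValue(jsonObject["rows"][row], 0); and so on for the other columns.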

Threading issues with SQLite3 and C# async

I am trying to save some data to a SQLite3 database. If I do not use async, I can save the data without any problems. As soon as I try to use the following code, however, I receive this error:
{Unable to evaluate expression because the code is optimized or a native frame is on top of the call stack.}
From my UI, I invoke the following SyncDomainTablesAsync method:
private readonly IDataCoordinator _coordinator;

public Configuration(IDataCoordinator coordinator)
{
    _coordinator = coordinator;
}
public async Task<int> SyncDomainTablesAsync(IProgress<string> progress, CancellationToken ct, DateTime? lastDateSynced = null, string tableName = null)
{
    // Determine the different types of sync calls:
    // 1) Force Resync (Drop/Create Tables and Insert)
    // 2) Auto Update
    var domainTable = await GetDomainTablesAsync(progress, ct, lastDateSynced, tableName);
    var items = domainTable.Items;
    int processCount = await Task.Run<int>(async () =>
    {
        int p = 0;
        progress.Report(String.Format("Syncing Configurations..."));
        foreach (var item in items)
        {
            progress.Report(String.Format("Syncing {0} Information", item.Key));
            var task = await SyncTableAsync(item.Value); // INVOKED BELOW
            if (task) progress.Report(String.Format("Sync'd {0} {1} records", item.Value.Count, item.Key));
            if (ct.IsCancellationRequested) goto Cancelled;
            p += item.Value.Count;
        }
    Cancelled:
        if (ct.IsCancellationRequested)
        {
            // Update last sync'd records
            progress.Report(String.Format("Canceling Configuration Sync..."));
            ct.ThrowIfCancellationRequested();
        }
        else
            progress.Report(String.Format("Syncing Configurations Completed"));
        return p;
    }, ct);
    return processCount;
}
private async Task<bool> SyncTableAsync(IEnumerable<object> items, bool includeRelationships = false)
{
    try
    {
        //TODO: Replace with SaveObjects method
        var i = await Task.Run(() => _coordinator.SaveObjects(items, includeRelationships));
        if (i == 0)
            return false;
    }
    catch (Exception ex)
    {
        return false;
    }
    return true;
}
The UI invokes the SyncDomainTablesAsync method. I then create a new task and loop through the items returned from the GetDomainTablesAsync method. During each iteration I await the completion of the SyncTableAsync method. Within SyncTableAsync I call the SaveObjects method of a class that implements my IDataCoordinator interface.
public override int SaveObjects(IEnumerable<object> items, Type underlyingType, bool saveRelationships = true)
{
    int result = 0;
    if (items == null)
        throw new ArgumentNullException("Can not save collection of objects. The collection is null.");
    else if (items.Count() == 0)
        return 0;

    // Check if table exists.
    foreach (var item in items)
        this.CreateTable(item.GetType(), saveRelationships);

    using (SQLiteConnection connection = new SQLiteConnection(this.StorageContainerPath))
    {
        connection.BeginTransaction();
        foreach (var item in items)
        {
            result += ProcessSave(item, saveRelationships, connection);
        }
        try
        {
            connection.Commit();
        }
        catch (SQLiteException ex)
        {
            connection.Rollback();
            throw; // rethrow without resetting the stack trace
        }
    }
    return result;
}
public override int CreateTable(Type type, bool createRelationalTables = false)
{
    if (this.TableExists(type) == 1)
        return 1;

    using (SQLiteConnection cn = new SQLiteConnection(this.StorageContainerPath))
    {
        try
        {
            // Check if the Table attribute is used to specify a table name not matching that of the Type.Name property.
            // If so, we generate a Sql statement and create the table based on the attribute name.
            //if (Attribute.IsDefined(type, typeof(TableAttribute)))
            //{
            //    TableAttribute attribute = type.GetAttribute<TableAttribute>();

            // Strongly typed to SQLiteCoordinator just to get a SqlQuery instance. The CreateCommand method will create a table based on 'type'.
            var query = new SqlQuery<SQLiteCoordinator>().CreateCommand(DataProviderTypes.Sqlite3, type);
            query = query.TrimEnd(';') + ";";
            cn.Execute(query);
            //}
            // Otherwise create the table using the Type.
            //else
            //{
            //    cn.CreateTable(type);
            //}

            // If we are to create relationship tables, we cascade through all relationship properties
            // and create tables for them as well.
            if (createRelationalTables)
            {
                this.CreateCascadingTables(type, cn);
            }
        }
        catch (Exception ex)
        {
            return 0;
        }
    }
    return 1;
}
The flow of the code goes:
UI -> SyncDomainTablesAsync -> SyncTableAsync -> SaveObjects -> CreateTable(type)
The issue that I have is within CreateTable. If I just use CreateTable synchronously I have no issues; using it in my async method above causes a thread abort exception. The exception is thrown within the SQLite.cs file included with SQLite.net, in the ExecuteNonQuery method shown below. The weird thing is that the table is created in the database even though the exception is thrown. The error is thrown sometimes when the Prepare() function is called, and the rest of the time when the SQLite3.Step() function is called.
public int ExecuteNonQuery ()
{
    if (_conn.Trace) {
        Debug.WriteLine ("Executing: " + this);
    }
    var r = SQLite3.Result.OK;
    var stmt = Prepare ();    // THROWS THE ERROR
    r = SQLite3.Step(stmt);   // THROWS THE ERROR
    Finalize(stmt);
    if (r == SQLite3.Result.Done) {
        int rowsAffected = SQLite3.Changes (_conn.Handle);
        return rowsAffected;
    } else if (r == SQLite3.Result.Error) {
        string msg = SQLite3.GetErrmsg (_conn.Handle);
        throw SQLiteException.New (r, msg);
    } else {
        throw SQLiteException.New (r, r.ToString ());
    }
}
I assume that because my foreach statement awaits the return of SyncTableAsync, none of the threads are closed. I am also getting a critical System.Transactions exception that says "attempting to access an unloaded AppDomain".
Am I using await/async incorrectly with SQLite3, or is this an issue with SQLite3 that I am not aware of?
Attached is a photo of the Parallel Stacks window and the exception.
EDIT
When I run the code above in unit tests, the unit test process never dies. I have to exit Visual Studio to get the process to die. I assume something in SQLite.dll grabs hold of the process when the exception is thrown and doesn't let go, but I am not sure.
EDIT 2
I can modify the initial SyncDomainTablesAsync method to the following, and the code runs without error. I believe the issue is my use of async and await.
public async Task<int> SyncDomainTablesAsync(IProgress<string> progress, CancellationToken ct, DateTime? lastDateSynced = null, string tableName = null)
{
    var domainTable = await GetDomainTablesAsync(progress, ct, lastDateSynced, tableName);
    var items = domainTable.Items;
    foreach (var item in items)
    {
        _coordinator.SaveObjects(item.Value, typeof(object), true);
    }
    return 1;
}
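A possible mitigation (an editorial sketch, not from the original thread): sqlite-net connections are not safe for concurrent use, so one common pattern is to serialize all database work behind a SemaphoreSlim, which keeps overlapping awaits from touching SQLite at the same time while leaving the rest of the code async:
using System.Threading;

// One permit: only one database operation runs at a time.
private static readonly SemaphoreSlim _dbLock = new SemaphoreSlim(1, 1);

private async Task<bool> SyncTableAsync(IEnumerable<object> items, bool includeRelationships = false)
{
    await _dbLock.WaitAsync();
    try
    {
        var i = await Task.Run(() => _coordinator.SaveObjects(items, includeRelationships));
        return i != 0;
    }
    finally
    {
        _dbLock.Release();
    }
}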

Update in Couchbase

I need some help with an update in Couchbase. I have a task in my page, and if the user clicks Like, the likes count should be updated in my Couchbase bucket. I have tried my own update handler code, but it has some latency. I have included my update code below.
This is my code for liking a task...
public ResponseVO LikeTask(LikeVO likeVO)
{
    ResponseVO response = new ResponseVO();
    try
    {
        if (!isLiked(likeVO.TaskID, likeVO.UserID))
        {
            UpdateTaskDB likeUpdate = new UpdateTaskDB();
            UpdateTaskVO updatetaskvo = new UpdateTaskVO();
            updatetaskvo.FieldName = "Likes";
            LikeVO tempvo = new LikeVO();
            tempvo.LikedOn = DateTime.Now.ToString();
            tempvo.UserID = likeVO.UserID;
            tempvo.UserName = likeVO.UserName;
            tempvo.TaskID = likeVO.TaskID;
            updatetaskvo.ObjectValue = tempvo;
            updatetaskvo.TaskID = likeVO.TaskID;
            likeUpdate.UpdateDocument(updatetaskvo);
        }
        response.StatusMessage = "Liked Successfully";
    }
    catch (Exception ex)
    {
        response.StatusCode = "0";
        response.StatusMessage = ex.Message;
    }
    return response;
}
My own update handler code:
public class UpdateTaskDB
{
    CouchbaseClient oCouchbase;

    public UpdateTaskDB()
    {
        oCouchbase = new CouchbaseClient("vwspace", "");
    }

    public TaskVO GetTaskByID(string task_id)
    {
        TaskVO results = null;
        try
        {
            String str1;
            str1 = (String)oCouchbase.Get(task_id);
            results = JsonConvert.DeserializeObject<TaskVO>(str1);
        }
        catch (Exception ex)
        {
        }
        return results;
    }

    public void UpdateDocument(UpdateTaskVO inputParams)
    {
        try
        {
            var client = new CouchbaseClient("vwspace", "");
            TaskVO taskDoc = GetTaskByID(inputParams.TaskID);
            switch (inputParams.FieldName)
            {
                case "Likes":
                    List<LikeVO> docLikes = taskDoc.likes;
                    docLikes.Add((LikeVO)inputParams.ObjectValue);
                    taskDoc.likes = docLikes;
                    break;
                case "UnLike":
                    LikeVO unlikevo = (LikeVO)inputParams.ObjectValue;
                    for (int count = 0; count < taskDoc.likes.Count; count++)
                    {
                        if (taskDoc.likes[count].UserID.Equals(unlikevo.UserID))
                        {
                            unlikevo = taskDoc.likes[count];
                            break;
                        }
                    }
                    taskDoc.likes.Remove(unlikevo);
                    break;
                default:
                    break;
            }
            String json = JsonConvert.SerializeObject(taskDoc);
            client.Store(StoreMode.Set, inputParams.TaskID, json);
        }
        catch (Exception ex)
        {
            Console.Write("Exception :" + ex.Message);
        }
    }
}
Is there any other way to handle this update in Couchbase? Kindly help me out.
The latency you're seeing is likely due to the fact that you're creating two instances of the CouchbaseClient for each click. Creating an instance of a CouchbaseClient is an expensive operation, because of the bootstrapping and configuration setup that takes place.
There are a couple of different approaches you can take to minimize how frequently you create CouchbaseClient instances. One would be to create a static client that is reused by your data access classes. Another approach for web apps is to associate instances with HttpApplication instances. For an example of the Web approach, see my (incomplete) sample project on GitHub below.
https://github.com/jzablocki/couchbase-beer.net/blob/master/src/CouchbaseBeersWeb/Models/WebRepositoryBase%271.cs
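A minimal sketch of the static-client idea (editorial; it assumes the same "vwspace" bucket, and relies on the 1.x CouchbaseClient being safe to share across threads, which is what it is designed for):
// Create the expensive CouchbaseClient once and reuse it everywhere.
public static class CouchbaseManager
{
    private static readonly CouchbaseClient _instance = new CouchbaseClient("vwspace", "");

    public static CouchbaseClient Instance
    {
        get { return _instance; }
    }
}
UpdateTaskDB would then use CouchbaseManager.Instance instead of constructing its own clients in the constructor and again in UpdateDocument.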
Also, I would suggest using CAS operations when updating a document's like count. You want to make sure that a "like" vote doesn't cause the entire document to be updated from a stale read.
For example:
public TaskVO GetTaskByID(string task_id)
{
    var getResult = oCouchbase.ExecuteGet<string>(task_id);
    var results = JsonConvert.DeserializeObject<TaskVO>(getResult.Value);
    results.Cas = getResult.Cas; // Here I'm suggesting adding a Cas property to your TaskVO
    return results;
}
Then on your update:
public void UpdateDocument(UpdateTaskVO inputParams)
{
    try
    {
        TaskVO taskDoc = GetTaskByID(inputParams.TaskID);
        switch (inputParams.FieldName)
        {
            ...
        }
        String json = JsonConvert.SerializeObject(taskDoc);
        client.ExecuteStore(StoreMode.Set, inputParams.TaskID, json, taskDoc.Cas);
        // This will fail if the document has been updated by another user.
        // You could use a retry strategy, as sketched below.
    }
    catch (Exception ex)
    {
        Console.Write("Exception :" + ex.Message);
    }
}
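The retry strategy could be a small bounded loop that re-reads the document (picking up a fresh CAS value) and re-applies the change whenever the CAS store fails. An editorial sketch, mirroring the ExecuteStore call above and checking the result's Success flag (UpdateWithRetry and applyChange are invented names):
// Re-read and re-apply the mutation until the CAS store succeeds (bounded retries).
private bool UpdateWithRetry(string taskId, Action<TaskVO> applyChange, int maxAttempts = 5)
{
    for (int attempt = 0; attempt < maxAttempts; attempt++)
    {
        TaskVO taskDoc = GetTaskByID(taskId); // fresh read, captures the current Cas
        applyChange(taskDoc);                 // e.g. add or remove a LikeVO
        String json = JsonConvert.SerializeObject(taskDoc);
        var result = oCouchbase.ExecuteStore(StoreMode.Set, taskId, json, taskDoc.Cas);
        if (result.Success)
            return true; // stored against an unchanged document
        // Another writer got in first; loop to re-read and try again.
    }
    return false;
}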
