Send row by row without duplicates - C#

I have the following table.
As you can see, there is a column called Integration of type bool. The table is shown in a DataGridView and is populated through this stored procedure:
CREATE PROCEDURE [dbo].[SP_Payments]
AS
SELECT 'CE-' + CardCode AS CardCode, DocType, Series, DocDate,
       dbo.udf_GetNumeric(DocNum) AS DocNum,
       DocEntry, TrsfrAcct, TrsfrDate, TrsfrSum, Integration, Comments, SumApplied
FROM PaymentsReceived
WHERE Integration = 0
This procedure only returns the rows where Integration is 0 (false). The false rows are the ones I send through a web service. I have a method that walks through each row and sends it; every time a row is sent, its Integration flag is set to true, so it disappears from the DataGridView. This method runs inside a timer that fires every 5 seconds, and it contains a condition that only sends a row when Integration == false. This is the method:
private async void Envio_Timer_Tick(object sender, EventArgs e)
{
    try
    {
        ProxyBL proxy = new ProxyBL();
        foreach (DataGridViewRow Datos in dataGridView1.Rows)
        {
            PagosRecibidos pagos = new PagosRecibidos
            {
                CardCode = Convert.ToString(Datos.Cells[0].Value),
                DocType = Convert.ToString(Datos.Cells[1].Value),
                Series = Convert.ToInt32(Datos.Cells[2].Value),
                DocDate = Convert.ToDateTime(Datos.Cells[3].Value),
                DocEntry = Convert.ToInt32(Datos.Cells[5].Value),
                TrsfrAcct = Convert.ToString(Datos.Cells[6].Value),
                TrsfrDate = Convert.ToDateTime(Datos.Cells[7].Value),
                TrsfrSum = Convert.ToDecimal(Datos.Cells[8].Value),
                Integration = Convert.ToBoolean(Datos.Cells[9].Value),
                Comments = Convert.ToString(Datos.Cells[10].Value),
                SumApplied = Convert.ToDecimal(Datos.Cells[11].Value)
            };
            Inte = pagos.Integration;
            if (Inte == false)
            {
                var EnvioDatos = await proxy.EnviarPago(pagos);
            }
            ListarEmple();
            ListarLog();
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
This is the EnviarPago (SendPayment) method it calls. In this method I get the response from the service, whether the operation succeeded or failed, and insert it into a log:
Consultas c = new Consultas();

public async Task<string> EnviarPago(PagosRecibidos detalle)
{
    try
    {
        ProxyXML xmlProxy = new ProxyXML();
        string respuesta = await xmlProxy.EnviarSAP(detalle);
        c.InsertarLog(1, DateTime.Now, respuesta, xmlProxy.XmlSerializado);
        return respuesta;
    }
    catch (Exception)
    {
        throw; // rethrow without resetting the stack trace ("throw ex;" would lose it)
    }
}
After that, the actual send happens in EnviarSAP. This is where I capture the response; if the operation was successful, the Integration column is set to true (1):
readonly Consultas c = new Consultas();
public string XmlSerializado = null;

public async Task<string> EnviarSAP(PagosRecibidos detalle)
{
    try
    {
        using (WSSincronizacionClient clienteSAP = new WSSincronizacionClient())
        {
            XmlSerializado = this.SerializarXml(detalle);
            var respuesta = await clienteSAP.EnviarDatosSAPAsync(XmlSerializado);
            if (respuesta.Contains("true|Operación Exitosa|"))
            {
                c.EditarIntegration(true, Convert.ToInt32(detalle.DocEntry));
            }
            return respuesta;
        }
    }
    catch (Exception)
    {
        throw; // rethrow without resetting the stack trace
    }
}
Everything works correctly, but sometimes a payment is sent twice, that is, it arrives in SAP duplicated. How can I validate that a row with Integration = false is sent only once, so it is never sent twice for any reason, and in which part should I do this validation?
I also don't understand why, when I already validate that only the false ones are sent, it still sends some twice.

The only thing I can think of is that the time to process an individual row exceeds the timer interval, so you might be iterating over the same item in two handler calls in parallel, because there isn't enough time for one handler to finish before the next tick starts.
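If that is the cause, the most direct guard is to stop the timer when the handler starts and only restart it once the whole batch has finished. A minimal sketch, assuming the timer behind Envio_Timer_Tick is a System.Windows.Forms.Timer field named Envio_Timer, and that the existing row-by-row loop is extracted into a hypothetical EnviarPendientesAsync method:

private async void Envio_Timer_Tick(object sender, EventArgs e)
{
    Envio_Timer.Stop(); // no new ticks can fire while this batch is still being sent
    try
    {
        await EnviarPendientesAsync(); // the existing row-by-row send loop
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
    finally
    {
        Envio_Timer.Start(); // re-arm the timer only after the batch completed
    }
}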
Alternatively, you could flag the row when you're fetching it and save that change, so the flag acts as a discriminator telling other timer tick handler calls to ignore the row:
// ProcessingFlagColumnIndex = index of the extra flag column you add to the grid
foreach (DataGridViewRow Datos in dataGridView1.Rows)
{
    var alreadyProcessingRow = Convert.ToBoolean(Datos.Cells[ProcessingFlagColumnIndex].Value);
    if (alreadyProcessingRow)
        continue; // skip the row, don't reprocess it

    Datos.Cells[ProcessingFlagColumnIndex].Value = true; // mark the row as processing
    PagosRecibidos pagos = new PagosRecibidos
    {
        CardCode = Convert.ToString(Datos.Cells[0].Value),
        DocType = Convert.ToString(Datos.Cells[1].Value),
        Series = Convert.ToInt32(Datos.Cells[2].Value),
        DocDate = Convert.ToDateTime(Datos.Cells[3].Value),
        DocEntry = Convert.ToInt32(Datos.Cells[5].Value),
        TrsfrAcct = Convert.ToString(Datos.Cells[6].Value),
        TrsfrDate = Convert.ToDateTime(Datos.Cells[7].Value),
        TrsfrSum = Convert.ToDecimal(Datos.Cells[8].Value),
        Integration = Convert.ToBoolean(Datos.Cells[9].Value),
        Comments = Convert.ToString(Datos.Cells[10].Value),
        SumApplied = Convert.ToDecimal(Datos.Cells[11].Value)
    };
    Inte = pagos.Integration;
    if (Inte == false)
    {
        var EnvioDatos = await proxy.EnviarPago(pagos);
    }
    Datos.Cells[ProcessingFlagColumnIndex].Value = false; // reset the flag (not that important if you just don't care after processing)
    ListarEmple();
    ListarLog();
}
I'm not really sure whether you can do that with the DataGridView object, but if you can't, you could use something like a ConcurrentDictionary and store the rows (or row IDs) you are currently processing, in order to check and avoid duplicate processing.
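A rough sketch of that dictionary approach, assuming DocEntry uniquely identifies a payment (the field and variable names here are illustrative):

// using System.Collections.Concurrent;
private readonly ConcurrentDictionary<int, bool> _enProceso = new ConcurrentDictionary<int, bool>();

// inside the foreach, after building 'pagos':
if (!_enProceso.TryAdd(pagos.DocEntry, true))
    continue; // another tick is already sending this row
try
{
    if (pagos.Integration == false)
    {
        var EnvioDatos = await proxy.EnviarPago(pagos);
    }
}
finally
{
    bool removed;
    _enProceso.TryRemove(pagos.DocEntry, out removed); // release the row for later ticks
}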

Related

C# use field 1 value if field 2 empty

I am running through a set of records using a foreach loop, doing simple checks to ensure that good data is inserted into a database table.
Sometimes the dataset can be missing the LegistarID value. The change I need to make in my code is to add a check for LegistarID:
if the value of LegistarID is missing, but the AgendaItem value is not, then assign the value of AgendaItem to LegistarID;
if LegistarID is missing and there is also no AgendaItem value, then return a message to the user, letting them know that these values need to be present in the dataset they are trying to import.
I know it does not sound complex, but I am having a hard time making this change successfully. I need a bit of help if possible, please.
Here is my code as I currently have it:
if (ModelState.IsValid)
{
    using (Entities db = new Entities())
    {
        foreach (var i in meeting)
        {
            if (i.MeetingID == 0)
            {
                message = "This file is missing the Meeting ID value of at least 1 record. \n Verify that the data you are trying to upload meets the criteria, and then try to upload your file again.";
                return new JsonResult { Data = new { status = status, message = message } };
            }
            else
            {
                // development
                var compositeKey = db.MeetingAgenda.Find(i.MeetingID, i.AgendaItem);
                if (compositeKey == null)
                {
                    // Add new
                    // development
                    db.MeetingAgenda.Add(i);
                    //
                }
                else
                {
                    // Serves as an update, or addition of a previously imported dataset
                    db.Entry(compositeKey).CurrentValues.SetValues(i.MeetingID);
                    db.Entry(compositeKey).State = EntityState.Modified;
                }
            }
        }
        db.SaveChanges();
        status = true;
    }
}
else
{
    message = "Please, verify that the file you are trying to upload is correctly formatted, and that the data it contains meets the expected criteria, then click the upload button again. \n Thank you!";
    return new JsonResult { Data = new { status = status, message = message } };
}
I think the code I need is something like this:
else if (i.LegistarID == 0 && i.AgendaItem != 0)
{
    i.LegistarID = i.AgendaItem;
}
I am just unsure where to place it in the current code.
I would check all rows before returning a result.
if (ModelState.IsValid) {
var errors = new List<string> ();
var rowCounter = 1;
using (Entities db = new Entities ()) {
foreach (var i in meeting) {
if (i.MeetingID == 0) {
// Let the user know this row is bad
errors.Add ($"Row {rowCounter}: This file is missing the Meeting ID. Verify that the data you are trying to upload meets the criteria, and then try to upload your file again.");
}
// Check if LegistarID is missing
if (i.LegistarID == 0) {
// Check if Agenda Item is present
if (i.AgendaItem == 0) {
errors.Add ($"Row {rowCounter}: Meeting has no LegistarID and no Agenda Item. Please check data");
} else {
i.LegistarID = i.AgendaItem;
}
}
// development
var compositeKey = db.MeetingAgenda.Find (i.MeetingID, i.AgendaItem);
if (compositeKey == null) {
// Add new
// development
db.MeetingAgenda.Add (i);
//
} else {
// Serves as an update, or addition of a previously imported dataset
db.Entry (compositeKey).CurrentValues.SetValues (i.MeetingID);
db.Entry (compositeKey).State = EntityState.Modified;
}
rowCounter++;
}
// If there are errors do not save and return error message
if (errors.Count > 0) {
return new JsonResult { Data = new { status = false, message = string.Join ("\n", errors) } };
}
db.SaveChanges ();
status = true;
}
} else {
message = string.Format ("Please, verify that the file you are trying to upload is correctly formatted, and that the data it contains, meets the expected criteria, then click the upload button again. \n Thank you!");
return new JsonResult { Data = new { status = status, message = message } };
}
The "if(i.MeetingID == 0)" else is redundant, because you are returning if the condition is met. So to avoid unneeded/confusing nesting I would rewrite the actual code (of the loop only) as:
foreach (var i in meeting)
{
    if (i.MeetingID == 0)
    {
        message = "This file is missing the Meeting ID value of at least 1 record. \n Verify that the data you are trying to upload meets the criteria, and then try to upload your file again.";
        return new JsonResult { Data = new { status = status, message = message } };
    }
    // development
    var compositeKey = db.MeetingAgenda.Find(i.MeetingID, i.AgendaItem);
    if (compositeKey == null)
    {
        // Add new
        // development
        db.MeetingAgenda.Add(i);
        //
    }
    else
    {
        // Serves as an update, or addition of a previously imported dataset
        db.Entry(compositeKey).CurrentValues.SetValues(i.MeetingID);
        db.Entry(compositeKey).State = EntityState.Modified;
    }
}
Then, I would add the new condition in between the MeetingID = 0 check and the rest of the code, like this:
foreach (var i in meeting)
{
    if (i.MeetingID == 0)
    {
        message = "This file is missing the Meeting ID value of at least 1 record. \n Verify that the data you are trying to upload meets the criteria, and then try to upload your file again.";
        return new JsonResult { Data = new { status = status, message = message } };
    }
    // *** New check on LegistarID and AgendaItem ***
    if (i.LegistarID == 0)
    {
        // Is there a chance to fill LegistarID with AgendaItem?
        if (i.AgendaItem != 0)
        {
            // Yes, fill it and then let the rest of the code flow peacefully.
            i.LegistarID = i.AgendaItem;
        }
        else
        {
            // No way: I must stop the procedure here and warn the user about this.
            // return "these values need to be present in the dataset they are trying to import."
        }
    }
    // development
    var compositeKey = db.MeetingAgenda.Find(i.MeetingID, i.AgendaItem);
    if (compositeKey == null)
    {
        // Add new
        // development
        db.MeetingAgenda.Add(i);
        //
    }
    else
    {
        // Serves as an update, or addition of a previously imported dataset
        db.Entry(compositeKey).CurrentValues.SetValues(i.MeetingID);
        db.Entry(compositeKey).State = EntityState.Modified;
    }
}

Threading issues with SQLite3 and C# async

I am trying to save some data to a SQLite3 database. If I do not use async, I can save the data without any problems. As soon as I try to use the following code, however, I receive this error:
{Unable to evaluate expression because the code is optimized or a native frame is on top of the call stack.}
From my UI, I invoke the following SyncDomainTablesAsync method:
private readonly IDataCoordinator _coordinator;

public Configuration(IDataCoordinator coordinator)
{
    _coordinator = coordinator;
}

public async Task<int> SyncDomainTablesAsync(IProgress<string> progress, CancellationToken ct, DateTime? lastDateSynced = null, string tableName = null)
{
    // Determine the different type of sync calls
    // 1) Force Resync (Drop/Create Tables and Insert)
    // 2) Auto Update
    var domainTable = await GetDomainTablesAsync(progress, ct, lastDateSynced, tableName);
    var items = domainTable.Items;
    int processCount = await Task.Run<int>(async () =>
    {
        int p = 0;
        progress.Report(String.Format("Syncing Configurations..."));
        foreach (var item in items)
        {
            progress.Report(String.Format("Syncing {0} Information", item.Key));
            var task = await SyncTableAsync(item.Value); // INVOKED BELOW
            if (task) progress.Report(String.Format("Sync'd {0} {1} records", item.Value.Count, item.Key));
            if (ct.IsCancellationRequested) goto Cancelled;
            p += item.Value.Count;
        }
    Cancelled:
        if (ct.IsCancellationRequested)
        {
            // Update Last Sync'd Records
            progress.Report(String.Format("Canceling Configuration Sync..."));
            ct.ThrowIfCancellationRequested();
        }
        else
            progress.Report(String.Format("Syncing Configurations Completed"));
        return p;
    }, ct);
    return processCount;
}
private async Task<bool> SyncTableAsync(IEnumerable<object> items, bool includeRelationships = false)
{
    try
    {
        // TODO: Replace with SaveObjects method
        var i = await Task.Run(() => _coordinator.SaveObjects(items, includeRelationships));
        if (i == 0)
            return false;
    }
    catch (Exception ex)
    {
        return false;
    }
    return true;
}
The UI invokes the SyncDomainTablesAsync method. I then create a new task and loop through the items returned from the GetDomainTablesAsync method. During each iteration I await until the SyncTableAsync method completes. Within SyncTableAsync I call the SaveObjects method of a class that implements my IDataCoordinator interface.
public override int SaveObjects(IEnumerable<object> items, Type underlyingType, bool saveRelationships = true)
{
    int result = 0;
    if (items == null)
        throw new ArgumentNullException("Can not save collection of objects. The collection is null.");
    else if (items.Count() == 0)
        return 0;

    // Check if table exists.
    foreach (var item in items)
        this.CreateTable(item.GetType(), saveRelationships);

    using (SQLiteConnection connection = new SQLiteConnection(this.StorageContainerPath))
    {
        connection.BeginTransaction();
        foreach (var item in items)
        {
            result += ProcessSave(item, saveRelationships, connection);
        }
        try
        {
            connection.Commit();
        }
        catch (SQLiteException)
        {
            connection.Rollback();
            throw; // rethrow, keeping the original stack trace
        }
    }
    return result;
}
public override int CreateTable(Type type, bool createRelationalTables = false)
{
    if (this.TableExists(type) == 1)
        return 1;

    using (SQLiteConnection cn = new SQLiteConnection(this.StorageContainerPath))
    {
        try
        {
            // Check if the Table attribute is used to specify a table name not matching that of the Type.Name property.
            // If so, we generate a Sql Statement and create the table based on the attribute name.
            //if (Attribute.IsDefined(type, typeof(TableAttribute)))
            //{
            //    TableAttribute attribute = type.GetAttribute<TableAttribute>();

            // Strongly typed to SQLiteCoordinator just to get a SqlQuery instance. The CreateCommand method will create a table based on 'type'
            var query = new SqlQuery<SQLiteCoordinator>().CreateCommand(DataProviderTypes.Sqlite3, type);
            query = query.TrimEnd(';') + ";";
            cn.Execute(query);
            //}
            // Otherwise create the table using the Type.
            //else
            //{
            //    cn.CreateTable(type);
            //}

            // If we are to create relationship tables, we cascade through all relationship properties
            // and create tables for them as well.
            if (createRelationalTables)
            {
                this.CreateCascadingTables(type, cn);
            }
        }
        catch (Exception ex)
        {
            return 0;
        }
    }
    return 1;
}
The flow of the code goes:
UI -> SyncDomainTablesAsync -> SyncTableAsync -> SaveObjects -> CreateTable(type)
The issue that I have is within CreateTable. If I just use it synchronously I have no issues; using it in my async method above causes a thread abort exception. The exception is thrown within the SQLite.cs file included with SQLite.net (within the ExecuteNonQuery method shown below). The weird thing is that the table is created in the database even though the exception is thrown. The error is thrown sometimes when the Prepare() function is called, and the rest of the time when the SQLite3.Step() function is called.
public int ExecuteNonQuery ()
{
    if (_conn.Trace) {
        Debug.WriteLine ("Executing: " + this);
    }
    var r = SQLite3.Result.OK;
    var stmt = Prepare (); // THROWS THE ERROR
    r = SQLite3.Step (stmt); // THROWS THE ERROR
    Finalize (stmt);
    if (r == SQLite3.Result.Done) {
        int rowsAffected = SQLite3.Changes (_conn.Handle);
        return rowsAffected;
    } else if (r == SQLite3.Result.Error) {
        string msg = SQLite3.GetErrmsg (_conn.Handle);
        throw SQLiteException.New (r, msg);
    } else {
        throw SQLiteException.New (r, r.ToString ());
    }
}
I assume that because my foreach statement awaits the return of SyncTableAsync, none of the threads are closed. I am also getting a system transaction critical exception that says "attempting to access an unloaded app domain".
Am I using await/async incorrectly with SQLite3, or is this an issue with SQLite3 that I am not aware of?
Attached is a photo of the Parallel Stacks window and the exception.
EDIT
When I try to run the code above in unit tests as well, the unit test process never dies. I have to exit Visual Studio to get the process to die. I am assuming something in SQLite.dll grabs hold of the process when the exception is thrown and doesn't let go, but I am not sure.
EDIT 2
I can modify the initial SyncDomainTablesAsync method to the following and the code runs without error. I believe the issue is my use of async and await.
public async Task<int> SyncDomainTablesAsync(IProgress<string> progress, CancellationToken ct, DateTime? lastDateSynced = null, string tableName = null)
{
    var domainTable = await GetDomainTablesAsync(progress, ct, lastDateSynced, tableName);
    var items = domainTable.Items;
    foreach (var item in items)
    {
        _coordinator.SaveObjects(item.Value, typeof(object), true);
    }
    return 1;
}
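One pattern that often helps in this situation (my suggestion, not something from the post) is to guarantee that only one thread at a time runs the synchronous SQLite work, instead of nesting Task.Run calls inside another Task.Run. A minimal sketch using a SemaphoreSlim gate around the existing SyncTableAsync body:

// using System.Threading;
private static readonly SemaphoreSlim _dbGate = new SemaphoreSlim(1, 1);

private async Task<bool> SyncTableAsync(IEnumerable<object> items, bool includeRelationships = false)
{
    await _dbGate.WaitAsync(); // only one SQLite writer at a time
    try
    {
        var i = await Task.Run(() => _coordinator.SaveObjects(items, includeRelationships));
        return i != 0;
    }
    catch (Exception)
    {
        return false;
    }
    finally
    {
        _dbGate.Release();
    }
}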

MongoCollection.Update() using C# to update list<T>

I am using the MongoVue application to preview the data stored in MongoDB.
In the attached image, the database named "Energy" has a collection named "DataLog". In "DataLog" there are several rows, which I add to the collection by reading them from a .CSV file.
Now, sometimes the column named Pings holds huge data for a single row (say an array of 2000 items), which causes an exception when the maximum document size of 16 MB is exceeded.
Since the huge Pings array threw that exception, to avoid it I removed the Pings collection from the row (i.e. inserted an empty collection) and tried the insert again, which succeeded.
Now I want to update the Pings for the same entry, but when the array has around 2000 elements or more, I wish to update it in groups of 500 items (500 x 4 = 2000) in a loop.
Can anyone help me out?
** SAMPLE CODE **
private void InsertData(Datalog xiDatalog)
{
    List<Ping> tempPings = new List<Ping>();
    tempPings.AddRange(xiDatalog.Pings);
    xiDatalog.Pings.RemoveAll(x => x.RowId != 0);
    WriteConcernResult wc = mongoCollection.Insert(xiDatalog);
    counter++;
    var query = new QueryDocument("_id", xiDatalog.Id);
    MongoCursor<Datalog> cursor = mongoCollection.FindAs<Datalog>(query);
    foreach (Datalog data in cursor)
    {
        AddPings(data, tempPings, mongoCollection);
        break;
    }
}

private void AddPings(Datalog xiDatalog, List<Ping> xiPings, MongoCollection<Datalog> mongoCollection)
{
    int groupCnt = 0;
    int insertCnt = 0;
    foreach (Ping px in xiPings)
    {
        xiDatalog.Pings.Add(px);
        groupCnt++;
        if (((int)(groupCnt / 500)) > insertCnt)
        {
            UpdateDataLog(xiDatalog.Id, xiDatalog.Pings, mongoCollection);
            insertCnt++;
        }
    }
}

private bool UpdateDataLog(BsonValue Id, List<Ping> tempPings, MongoCollection<Datalog> mongoCollection)
{
    bool success = false;
    try
    {
        var query = new QueryDocument("_id", Id);
        var update = Update<Datalog>.Set(e => e.Pings, tempPings);
        mongoCollection.Update(query, update);
        success = true;
    }
    catch (Exception ex)
    {
        string error = ex.Message;
    }
    return success;
}
Answer: Just modified the code to use Update.PushAll() instead of Update.Set(). Please refer to the code below:
private bool UpdateDataLog(BsonValue Id, List<Ping> tempPings, MongoCollection<Datalog> mongoCollection)
{
    bool success = false;
    try
    {
        var query = new QueryDocument("_id", Id);
        var update = Update<Datalog>.PushAll(e => e.Pings, tempPings);
        mongoCollection.Update(query, update);
        success = true;
    }
    catch (Exception ex)
    {
        string error = ex.Message;
    }
    return success;
}
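Since the question asks for groups of 500, the PushAll-based update can be driven in batches. A small sketch (the batching helper is mine; UpdateDataLog is the method above, and it assumes using System.Linq):

private void AddPingsInBatches(BsonValue id, List<Ping> pings, MongoCollection<Datalog> mongoCollection)
{
    const int batchSize = 500;
    for (int offset = 0; offset < pings.Count; offset += batchSize)
    {
        // append up to 500 pings per $pushAll, so no single update payload gets huge
        List<Ping> batch = pings.Skip(offset).Take(batchSize).ToList();
        UpdateDataLog(id, batch, mongoCollection);
    }
}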

Update in Couchbase

I need some help with an update in Couchbase. I have a task in my page: if the user clicks Like, the like count should be updated in my Couchbase bucket. I have tried my own update handler code, but it has some latency. I have included my update code below.
This is my code for liking a task:
public ResponseVO LikeTask(LikeVO likeVO)
{
    ResponseVO response = new ResponseVO();
    try
    {
        if (!isLiked(likeVO.TaskID, likeVO.UserID))
        {
            UpdateTaskDB likeUpdate = new UpdateTaskDB();
            UpdateTaskVO updatetaskvo = new UpdateTaskVO();
            updatetaskvo.FieldName = "Likes";
            LikeVO tempvo = new LikeVO();
            tempvo.LikedOn = DateTime.Now.ToString();
            tempvo.UserID = likeVO.UserID;
            tempvo.UserName = likeVO.UserName;
            tempvo.TaskID = likeVO.TaskID;
            updatetaskvo.ObjectValue = tempvo;
            updatetaskvo.TaskID = likeVO.TaskID;
            likeUpdate.UpdateDocument(updatetaskvo);
        }
        response.StatusMessage = "Liked Successfully";
    }
    catch (Exception ex)
    {
        response.StatusCode = "0";
        response.StatusMessage = ex.Message;
    }
    return response;
}
My own update handler code:
public class UpdateTaskDB
{
    CouchbaseClient oCouchbase;

    public UpdateTaskDB()
    {
        oCouchbase = new CouchbaseClient("vwspace", "");
    }

    public TaskVO GetTaskByID(string task_id)
    {
        TaskVO results = null;
        try
        {
            String str1;
            str1 = (String)oCouchbase.Get(task_id);
            results = JsonConvert.DeserializeObject<TaskVO>(str1);
        }
        catch (Exception ex)
        {
        }
        return results;
    }

    public void UpdateDocument(UpdateTaskVO inputParams)
    {
        try
        {
            var client = new CouchbaseClient("vwspace", "");
            TaskVO taskDoc = GetTaskByID(inputParams.TaskID);
            switch (inputParams.FieldName)
            {
                case "Likes":
                    List<LikeVO> docLikes = taskDoc.likes;
                    docLikes.Add((LikeVO)inputParams.ObjectValue);
                    taskDoc.likes = docLikes;
                    break;
                case "UnLike":
                    LikeVO unlikevo = (LikeVO)inputParams.ObjectValue;
                    for (int count = 0; count < taskDoc.likes.Count; count++)
                    {
                        if (taskDoc.likes[count].UserID.Equals(unlikevo.UserID))
                        {
                            unlikevo = taskDoc.likes[count];
                            break;
                        }
                    }
                    taskDoc.likes.Remove(unlikevo);
                    break;
                default:
                    break;
            }
            String json = JsonConvert.SerializeObject(taskDoc);
            client.Store(StoreMode.Set, inputParams.TaskID, json);
        }
        catch (Exception ex)
        {
            Console.Write("Exception :" + ex.Message);
        }
    }
}
Is there any other way to handle this update in Couchbase? Kindly help me out.
The latency you're seeing is likely due to the fact that you're creating two instances of CouchbaseClient for each click. Creating a CouchbaseClient instance is an expensive operation, because of the bootstrapping and configuration setup that takes place.
There are a couple of different approaches you can take to minimize how frequently you create CouchbaseClient instances. One would be to create a static client that is reused by your data access classes. Another approach for web apps is to associate instances with HttpApplication instances. For an example of the Web approach, see my (incomplete) sample project on GitHub below.
https://github.com/jzablocki/couchbase-beer.net/blob/master/src/CouchbaseBeersWeb/Models/WebRepositoryBase%271.cs
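A minimal sketch of the static-client idea, reusing the bucket name from your code (the wrapper class name is just illustrative):

public static class CouchbaseManager
{
    // created once per process; the client is designed to be built once and shared
    private static readonly CouchbaseClient _instance = new CouchbaseClient("vwspace", "");

    public static CouchbaseClient Instance
    {
        get { return _instance; }
    }
}

UpdateTaskDB would then use CouchbaseManager.Instance instead of calling new CouchbaseClient(...) in its constructor and again in UpdateDocument.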
Also, I would suggest using CAS operations when updating a document's like count. You want to make sure that a "like" vote doesn't cause the entire document to be updated from a stale read.
For example:
public TaskVO GetTaskByID(string task_id)
{
    var getResult = oCouchbase.ExecuteGet<string>(task_id);
    var results = JsonConvert.DeserializeObject<TaskVO>(getResult.Value);
    results.Cas = getResult.Cas; // Here I'm suggesting adding a Cas property to your TaskVO
    return results;
}
Then on your update:
public void UpdateDocument(UpdateTaskVO inputParams)
{
    try
    {
        TaskVO taskDoc = GetTaskByID(inputParams.TaskID);
        switch (inputParams.FieldName)
        {
            ...
        }
        String json = JsonConvert.SerializeObject(taskDoc);
        client.ExecuteStore(StoreMode.Set, inputParams.TaskID, json, taskDoc.Cas);
        // this will fail if the document has been updated by another user. You could use a retry strategy
    }
    catch (Exception ex)
    {
        Console.Write("Exception :" + ex.Message);
    }
}
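A rough sketch of such a retry strategy inside UpdateDocument (my addition, assuming the 1.x client's ExecuteStore returns a result object with a Success flag): re-read the document to get a fresh CAS value and re-apply the change until the store wins the race:

bool stored = false;
for (int attempt = 0; attempt < 3 && !stored; attempt++)
{
    TaskVO taskDoc = GetTaskByID(inputParams.TaskID); // fresh read, fresh CAS
    // ... re-apply the Likes/UnLike change to taskDoc here ...
    String json = JsonConvert.SerializeObject(taskDoc);
    var storeResult = client.ExecuteStore(StoreMode.Set, inputParams.TaskID, json, taskDoc.Cas);
    stored = storeResult.Success; // false when another writer updated the document first
}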

C# webservice losing data on return

I am writing a client program that calls a web method, but when I get the return data, some of the fields and objects are missing values.
The web method in turn calls a WCF method, and in the WCF method the return data is fine. But when it is passed back through the web service, data goes missing.
Is there any way to fix this problem?
This is my client code calling the webservice:
ReLocationDoc query = new ReLocationDoc();
query.PerformerSiteId = 1;
query.PerformerUserId = 1;
query.FromStatus = 10;
query.ToStatus = 200;
ReLocationDoc doc = new ReLocationDoc();

ServiceReference1.QPSoapClient service = new QPSoapClient();
try {
    service.GetRelocationAssignment(query, out doc);
    string test = doc.Assignment.Id.ToString();
} catch(Exception ex) {
    MessageBox.Show(ex.Message);
}
The web method code is here:
[WebMethod]
public ReLocationResult GetRelocationAssignment(ReLocationDoc query, out ReLocationDoc reLocationDoc) {
    return m_reLocationClient.GetRelocationAssignment(query, out reLocationDoc);
}
And finally the WCF code:
public ReLocationResult GetRelocationAssignment(ReLocationDoc query, out ReLocationDoc reLocationDoc) {
    try {
        LOGGER.Trace("Enter GetRelocationAssignment().");
        ReLocationResult result = reLocationCompactServiceClient.GetRelocationAssignment(out reLocationDoc, query);
        if (reLocationDoc.Assignment == null || reLocationDoc.Assignment.CurrentStatus == STATUS_FINISHED) {
            ReLocationDoc newQuery = new ReLocationDoc();
            newQuery.Assignment = new AssignmentDoc();
            newQuery.Assignment.EAN = DateTime.Today.ToString();
            newQuery.PerformerSiteId = QPSITE;
            newQuery.PerformerUserId = QPUSER;
            reLocationDoc.AssignmentStatus = m_settings.ReadyStatus;
            result = reLocationCompactServiceClient.CreateReLocationAssignment(out reLocationDoc, newQuery);
        }
        return result;
    } finally {
        LOGGER.Trace("Exit GetRelocationAssignment().");
    }
}
The inner GetRelocationAssignment:
public ReLocationResult GetRelocationAssignment(ReLocationDoc query, out ReLocationDoc reLocationDoc) {
    try {
        LOGGER.Trace("Enter GetRelocationAssignment().");
        ReLocationDoc doc = new ReLocationDoc();
        ReLocationResult result = new ReLocationResult();
        new Database(Connection).Execute(delegate(DBDataContext db) {
            User user = GetVerifiedUser(db, query, MODULE_ID);
            SiteModule siteModule = SiteModule.Get(db, query.PerformerSiteId, MODULE_ID);
            Status status = Status.Get(db, query.FromStatus, query.ToStatus, 0);
            Status startStatus = Status.Get(db, query.FromStatus, 0);
            Status endStatus = Status.Get(db, query.ToStatus, 0);
            IQueryable<Assignment> assignments = Assignment.GetAssignmentsWithEndStatus(db, siteModule, endStatus);
            assignments = Assignment.FilterAssignmentStartStatus(assignments, startStatus);
            foreach (Assignment assignment in assignments) {
                LOGGER.Debug("Handling assignment: " + assignment.Id);
                result.Status = true;
                AssignmentDoc assignmentDoc = FillAssignmentDoc(assignment);
                //ReLocationDoc doc = new ReLocationDoc();
                AssignmentStatus sts = assignment.AssignmentStatus.OrderByDescending(ass => ass.Id).First();
                assignmentDoc.CurrentStatus = sts.Status.Zone;
                Status currentStatus = sts.Status;
                IList<Item> items = assignment.Items.ToList();
                IList<ItemDoc> itemDocs = new List<ItemDoc>();
                foreach (Item item in items) {
                    ItemDoc itemDoc = FillItemDoc(item);
                    ItemDetail itemDetail;
                    if (ItemDetail.TryGet(db, item.Id, out itemDetail)) {
                        ItemDetailDoc itemDetailDoc = FillItemDetailDoc(itemDetail);
                        itemDoc.Details = new ItemDetailDoc[1];
                        Event eEvent = null;
                        if (Event.GetEvent(db, itemDetail, currentStatus, out eEvent)) {
                            EventDoc eventDoc = FillEventDoc(eEvent);
                            itemDetailDoc.Events = new EventDoc[1];
                            if (eEvent.LocationId.HasValue) {
                                Location location = null;
                                if (Location.TryGet(db, eEvent.LocationId.Value, out location)) {
                                    eventDoc.Location = new LocationDoc();
                                    eventDoc.Location = FillLocationDoc(location, db);
                                }
                            }
                            itemDetailDoc.Events[0] = eventDoc;
                        }
                        itemDoc.Details[0] = itemDetailDoc;
                    }
                    itemDocs.Add(itemDoc);
                }
                assignmentDoc.Items = itemDocs.ToArray();
                doc.Assignment = assignmentDoc;
            }
        }, delegate(Exception e) {
            result.Message = e.Message;
        });
        reLocationDoc = doc;
        return result;
    } finally {
        LOGGER.Trace("Exit GetRelocationAssignment().");
    }
}
In all this code the return data is fine. It is losing data only when passing through the web method.
Also, the ordering of the XML tags in the message makes a difference. I had a similar problem maybe two years ago, and in that case parameter values were disappearing during transmission because the sending side ordered the tags differently from what was defined in the schema.
Make sure the XML tags are being accessed with the same casing at both ends. If the casing is not the same, the value won't be read.
You should check whether all data is being sent back from your web service. Call your web service manually and check its response.
If all the data is there, your web service reference is probably outdated; update it by right-clicking the web service reference and choosing "Update".
If the data doesn't come back, the problem is probably in the web service code. Check your serialization code (if any) again, and make sure all returned types are [Serializable]. Also check that all return types and their members are public, as that is mandatory for serialization.
As noted by John Saunders, [Serializable] isn't used by XmlSerializer.
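To make XmlSerializer's actual requirements concrete: it only round-trips public types that have a parameterless constructor and public read/write properties; get-only properties and non-public members are silently skipped, which matches the "losing data" symptom here. A hedged sketch (the real ReLocationDoc members beyond those used in the post are unknown; the Order values pin down the tag ordering mentioned above):

public class ReLocationDoc
{
    [XmlElement(Order = 1)]
    public int PerformerSiteId { get; set; }

    [XmlElement(Order = 2)]
    public int PerformerUserId { get; set; }

    [XmlElement(Order = 3)]
    public AssignmentDoc Assignment { get; set; } // must itself be public with public settable members
}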
