What does the operationTimeout in ISessionClient.AcceptMessageSessionAsync actually do? - c#

Context: I have some code that accepts a message session for a particular session ID, using
Task<IMessageSession> ISessionClient.AcceptMessageSessionAsync(string sessionId, TimeSpan operationTimeout);
Question: What does the operationTimeout in AcceptMessageSessionAsync do? I tried setting it to one minute but, after a minute, nothing happened. Does this timeout just set a property that I need to check myself? Shouldn't a SessionLockLostException fire?
Code Sample:
var session = await sessionClient.AcceptMessageSessionAsync("0", TimeSpan.FromMinutes(1));
var gotSession = true;
if (gotSession)
{
    while (!session.IsClosedOrClosing)
    {
        try
        {
            Message message = await session.ReceiveAsync(TimeSpan.FromMinutes(2));
            if (message != null)
            {
                await session.CompleteAsync(message.SystemProperties.LockToken);
            }
            else
            {
                await session.CloseAsync();
            }
        }
        catch (SessionLockLostException)
        {
            // The session lock expired while processing; stop receiving on this session.
            await session.CloseAsync();
        }
    }
}

The operationTimeout in AcceptMessageSessionAsync is the amount of time the call will wait to fetch the next session.
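In other words, the timeout only bounds the accept call itself; if it expires, the await throws rather than setting a flag on the returned session, and it is unrelated to SessionLockLostException (which is about losing the lock on a session you have already accepted). A rough sketch of how this surfaces on the caller's side (the exception handling here is illustrative; check which ServiceBusException subtype your version of the Microsoft.Azure.ServiceBus client actually throws):
try
{
    // Waits at most one minute for the session to be accepted.
    var session = await sessionClient.AcceptMessageSessionAsync("0", TimeSpan.FromMinutes(1));
    // From here on, the one-minute value plays no further role.
}
catch (ServiceBusException ex)
{
    // e.g. a ServiceBusTimeoutException: the session could not be accepted within operationTimeout.
    Console.WriteLine($"Could not accept the session: {ex.Message}");
}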
Here is the complete implementation of the AcceptMessageSessionAsync method:
/// <summary>
/// Gets a particular session object identified by <paramref name="sessionId"/> that can be used to receive messages for that sessionId.
/// </summary>
/// <param name="sessionId">The sessionId present in all its messages.</param>
/// <param name="operationTimeout">Amount of time for which the call should wait to fetch the next session.</param>
/// <remarks>All plugins registered on <see cref="SessionClient"/> will be applied to each <see cref="MessageSession"/> that is accepted.
/// Individual sessions can further register additional plugins.</remarks>
public async Task<IMessageSession> AcceptMessageSessionAsync(string sessionId, TimeSpan operationTimeout)
{
    this.ThrowIfClosed();

    MessagingEventSource.Log.AmqpSessionClientAcceptMessageSessionStart(
        this.ClientId,
        this.EntityPath,
        this.ReceiveMode,
        this.PrefetchCount,
        sessionId);

    bool isDiagnosticSourceEnabled = ServiceBusDiagnosticSource.IsEnabled();
    Activity activity = isDiagnosticSourceEnabled ? this.diagnosticSource.AcceptMessageSessionStart(sessionId) : null;
    Task acceptMessageSessionTask = null;

    var session = new MessageSession(
        this.EntityPath,
        this.EntityType,
        this.ReceiveMode,
        this.ServiceBusConnection,
        this.CbsTokenProvider,
        this.RetryPolicy,
        this.PrefetchCount,
        sessionId,
        true);

    try
    {
        acceptMessageSessionTask = this.RetryPolicy.RunOperation(
            () => session.GetSessionReceiverLinkAsync(operationTimeout),
            operationTimeout);
        await acceptMessageSessionTask.ConfigureAwait(false);
    }
    catch (Exception exception)
    {
        if (isDiagnosticSourceEnabled)
        {
            this.diagnosticSource.ReportException(exception);
        }

        MessagingEventSource.Log.AmqpSessionClientAcceptMessageSessionException(
            this.ClientId,
            this.EntityPath,
            exception);

        await session.CloseAsync().ConfigureAwait(false);
        throw AmqpExceptionHelper.GetClientException(exception);
    }
    finally
    {
        this.diagnosticSource.AcceptMessageSessionStop(activity, session.SessionId, acceptMessageSessionTask?.Status);
    }

    MessagingEventSource.Log.AmqpSessionClientAcceptMessageSessionStop(
        this.ClientId,
        this.EntityPath,
        session.SessionIdInternal);

    session.UpdateClientId(ClientEntity.GenerateClientId(nameof(MessageSession), $"{this.EntityPath}_{session.SessionId}"));

    // Register plugins on the message session.
    foreach (var serviceBusPlugin in this.RegisteredPlugins)
    {
        session.RegisterPlugin(serviceBusPlugin);
    }

    return session;
}
You can find the complete source at the link below:
https://github.com/Azure/azure-service-bus-dotnet/blob/dev/src/Microsoft.Azure.ServiceBus/SessionClient.cs
Hope it helps.

Related

How to pass through pages using selenium and c#

This is most likely an error of logic in my code.
What I'm trying to do first is go to the respective page of my website (in this case link) and collect the data with my public void GDataPicker.
Where I need help is this: I use the following code to check whether the Next button exists on the webpage and to collect its respective data, but it always gives me the same error:
OpenQA.Selenium.StaleElementReferenceException: 'stale element reference: element is not attached to the page document
(Session info: chrome=58.0.3029.110)
(Driver info: chromedriver=2.30.477700 (0057494ad8732195794a7b32078424f92a5fce41), platform=Windows NT 10.0.15063 x86_64)'
I think it's probably because I don't update my NextButtonElement.
Code:
Boolean ElementDisplayed;
try
{
    Gdriver.Navigate().GoToUrl("http://www.codigo-postal.pt/");
    IWebElement searchInput1 = Gdriver.FindElement(By.Id("cp4"));
    searchInput1.SendKeys("4710"); //4730
    IWebElement searchInput2 = Gdriver.FindElement(By.ClassName("cp3"));
    searchInput2.SendKeys(""); //324
    searchInput2.SendKeys(OpenQA.Selenium.Keys.Enter);
    IWebElement NextButtonElement = Gdriver.FindElement(By.XPath("/html/body/div[4]/div/div/div[2]/ul/li[13]/a"));
    GDataPicker();
    while (ElementDisplayed = NextButtonElement.Displayed)
    {
        GDataPicker();
        Gdriver.Manage().Timeouts().ImplicitlyWait(TimeSpan.FromSeconds(2000));
        NextButtonElement.SendKeys(OpenQA.Selenium.Keys.Enter);
    }
}
catch (NoSuchElementException i)
{
    ElementDisplayed = false;
    GDataPicker();
}
I can't help you with C#; however, StaleElementReferenceException occurs when the element you act upon is still in the DOM but has been replaced with an identical one. What I would do is catch that exception and find the element again:
catch (StaleElementReferenceException i)
{
    IWebElement NextButtonElement = Gdriver.FindElement(By.XPath("/html/body/div[4]/div/div/div[2]/ul/li[13]/a"));
}
http://www.seleniumhq.org/exceptions/stale_element_reference.jsp
I would use ExpectedConditions.ElementToBeClickable with Selenium's dynamic wait feature (WebDriverWait).
var wait = new WebDriverWait(GDriver, TimeSpan.FromSeconds(5));
IWebElement NextButtonElement = wait.Until(ExpectedConditions.ElementToBeClickable(By.XPath("/html/body/div[4]/div/div/div[2]/ul/li[13]/a")));
ExpectedConditions.ElementToBeClickable does exactly what you want it to do: wait a little bit until the element is displayed and not stale.
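For instance, re-resolving the button through the wait on every pass of your loop could look something like this (GDataPicker and Gdriver are taken from your snippet; the structure is only a sketch):
var wait = new WebDriverWait(Gdriver, TimeSpan.FromSeconds(5));
var nextButtonLocator = By.XPath("/html/body/div[4]/div/div/div[2]/ul/li[13]/a");

while (true)
{
    GDataPicker();
    IWebElement nextButton;
    try
    {
        // Re-find the element every iteration so a stale reference is never reused.
        nextButton = wait.Until(ExpectedConditions.ElementToBeClickable(nextButtonLocator));
    }
    catch (WebDriverTimeoutException)
    {
        break; // no clickable "next" button any more - last page reached
    }
    nextButton.SendKeys(OpenQA.Selenium.Keys.Enter);
}
For reference, the implementation of ElementToBeClickable looks like this: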
/// <summary>
/// An expectation for checking an element is visible and enabled such that you
/// can click it.
/// </summary>
/// <param name="locator">The locator used to find the element.</param>
/// <returns>The <see cref="IWebElement"/> once it is located and clickable (visible and enabled).</returns>
public static Func<IWebDriver, IWebElement> ElementToBeClickable(By locator)
{
    return (driver) =>
    {
        var element = ElementIfVisible(driver.FindElement(locator));
        try
        {
            if (element != null && element.Enabled)
            {
                return element;
            }
            else
            {
                return null;
            }
        }
        catch (StaleElementReferenceException)
        {
            return null;
        }
    };
}
From https://github.com/SeleniumHQ/selenium/blob/master/dotnet/src/support/UI/ExpectedConditions.cs

Microsoft Bot Framework works locally, but fails remotely

I have a bot that is fairly complicated, and it works locally. As soon as I publish it, though, it fails with the error:
Sorry, my bot code is having an issue.
I have tried using Application Insights, but it isn't showing the error details.
The error always happens at the same point:
/// <summary>
/// Start our response
/// </summary>
/// <param name="context">The current context</param>
/// <returns></returns>
public async Task StartAsync(IDialogContext context)
{
    // Get our current step
    _groups = await _groupProvider.ListAsync();
    _group = _groups.First();

    // Post the question header
    await context.PostAsync(_group.Text);

    // Wait for the user's response
    context.Wait(AskQuestion);
}

/// <summary>
/// When our message is received we execute this delegate
/// </summary>
/// <param name="context">The current context</param>
/// <param name="result">The result object</param>
/// <returns></returns>
private async Task AskQuestion(IDialogContext context, IAwaitable<IMessageActivity> result)
{
    // Get our question and answers
    var question = this._group.Questions[_currentQuestion];
    var questionText = question.Text;
    var answers = question.Answers.Select(m => m.Text).ToList();
    var answerCount = question.Answers.Count;

    // Create our options
    var options = new PromptOptions<string>(questionText, options: answers);

    // Ask our question
    Choice<string>(context, GetAnswer, options);
}

/// <summary>
/// Get our answer and decide what to do next
/// </summary>
/// <param name="context">The current context</param>
/// <param name="result">The answer text</param>
/// <returns></returns>
private async Task GetAnswer(IDialogContext context, IAwaitable<string> result)
{
    // Get our questions
    var questions = _group.Questions;
    var length = questions.Count;
    var question = _group.Questions[_currentQuestion];
    var selectedAnswer = await result;

    // Assign our answer to our question
    foreach (var answer in question.Answers)
        if (answer.Text == selectedAnswer)
            question.Answer = answer;

    // If we have an answer, filter the products
    if (question.Answer != null)
        _productProvider.Score(await GetCurrentProducts(), _groups);

    // Increase our index
    _currentQuestion++;

    // If our current index is greater than or equal to the number of questions
    if (_currentQuestion == length)
    {
        // Create our dialog
        var dialog = _dialogFactory.CreateSecondStepDialog(_dialogFactory, _groupProvider, _questionProvider, _productProvider, await GetCurrentProducts());

        // Otherwise, go to the next step
        await context.Forward(dialog, ResumeAfter, new Activity { }, CancellationToken.None);
        return;
    }

    // Ask our next question
    await AskQuestion(context, null);
}
So when this dialog starts, it posts the question introduction to the client.
I then invoke context.Wait and ask the question. The question is a choice. I believe that this is where the issue is, because it always throws that message as soon as the question appears.
Can anyone spot anything glaringly obvious with the code?
The error message you have cited is given when your bot throws a 500. Looking at your code, my guess is that you have a null reference exception. In the section under "// Get our question and answers", try checking that _group.Questions isn't null and that _currentQuestion is a valid index into that collection.
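As an illustration only (the field names are taken from your snippet, and context.Done is just one way to bail out), the guard could look something like this at the top of AskQuestion:
// Sketch of the kind of guard clauses to add; adjust to your own types and flow.
if (_group == null || _group.Questions == null || _group.Questions.Count == 0)
{
    await context.PostAsync("No questions are configured for this group.");
    context.Done(false);
    return;
}

if (_currentQuestion < 0 || _currentQuestion >= _group.Questions.Count)
{
    await context.PostAsync("Question index is out of range.");
    context.Done(false);
    return;
}

var question = _group.Questions[_currentQuestion];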

Using BrowserSession and HtmlAgilityPack to login to Facebook through .NET

I'm trying to use Rohit Agarwal's BrowserSession class together with HtmlAgilityPack to log in to and subsequently navigate around Facebook.
I've previously managed to do the same by writing my own HttpWebRequests. However, that only works when I manually fetch the cookie from my browser and insert a fresh cookie string into the request each time I start a new "session". Now I'm trying to use BrowserSession to get smarter navigation.
Here's the current code:
BrowserSession b = new BrowserSession();
b.Get(@"http://www.facebook.com/login.php");
b.FormElements["email"] = "some@email.com";
b.FormElements["pass"] = "xxxxxxxx";
b.FormElements["lsd"] = "qDhIH";
b.FormElements["trynum"] = "1";
b.FormElements["persistent_inputcheckbox"] = "1";
var response = b.Post(@"https://login.facebook.com/login.php?login_attempt=1");
The above works fine. Trouble comes when I try to use this BrowserSession again to fetch another page. I'm doing it this way since BrowserSession saves the cookies from the last response and inserts them into the next request, so I should not have to manually insert cookie data fetched from my browser anymore.
However, when I try to do something like this:
var profilePage = b.Get(@"https://m.facebook.com/profile.php?id=1111111111");
the doc I get back is empty. I would appreciate any input on what I'm doing wrong.
I fixed the root cause of this if anyone cares. It turns out the cookies were being saved in the CookieContainer of the REQUEST object and not the response object. I also added the ability to download a file (provided that file is string based). Code definitely is NOT thread-safe, but the object wasn't thread-safe to begin with:
public class BrowserSession
{
private bool _isPost;
private bool _isDownload;
private HtmlDocument _htmlDoc;
private string _download;
/// <summary>
/// System.Net.CookieCollection. Provides a collection container for instances of Cookie class
/// </summary>
public CookieCollection Cookies { get; set; }
/// <summary>
/// Provide a key-value-pair collection of form elements
/// </summary>
public FormElementCollection FormElements { get; set; }
/// <summary>
/// Makes a HTTP GET request to the given URL
/// </summary>
public string Get(string url)
{
_isPost = false;
CreateWebRequestObject().Load(url);
return _htmlDoc.DocumentNode.InnerHtml;
}
/// <summary>
/// Makes a HTTP POST request to the given URL
/// </summary>
public string Post(string url)
{
_isPost = true;
CreateWebRequestObject().Load(url, "POST");
return _htmlDoc.DocumentNode.InnerHtml;
}
public string GetDownload(string url)
{
_isPost = false;
_isDownload = true;
CreateWebRequestObject().Load(url);
return _download;
}
/// <summary>
/// Creates the HtmlWeb object and initializes all event handlers.
/// </summary>
private HtmlWeb CreateWebRequestObject()
{
HtmlWeb web = new HtmlWeb();
web.UseCookies = true;
web.PreRequest = new HtmlWeb.PreRequestHandler(OnPreRequest);
web.PostResponse = new HtmlWeb.PostResponseHandler(OnAfterResponse);
web.PreHandleDocument = new HtmlWeb.PreHandleDocumentHandler(OnPreHandleDocument);
return web;
}
/// <summary>
/// Event handler for HtmlWeb.PreRequestHandler. Occurs before an HTTP request is executed.
/// </summary>
protected bool OnPreRequest(HttpWebRequest request)
{
AddCookiesTo(request); // Add cookies that were saved from previous requests
if (_isPost) AddPostDataTo(request); // We only need to add post data on a POST request
return true;
}
/// <summary>
/// Event handler for HtmlWeb.PostResponseHandler. Occurs after a HTTP response is received
/// </summary>
protected void OnAfterResponse(HttpWebRequest request, HttpWebResponse response)
{
SaveCookiesFrom(request, response); // Save cookies for subsequent requests
if (response != null && _isDownload)
{
Stream remoteStream = response.GetResponseStream();
var sr = new StreamReader(remoteStream);
_download = sr.ReadToEnd();
}
}
/// <summary>
/// Event handler for HtmlWeb.PreHandleDocumentHandler. Occurs before a HTML document is handled
/// </summary>
protected void OnPreHandleDocument(HtmlDocument document)
{
SaveHtmlDocument(document);
}
/// <summary>
/// Assembles the Post data and attaches to the request object
/// </summary>
private void AddPostDataTo(HttpWebRequest request)
{
string payload = FormElements.AssemblePostPayload();
byte[] buff = Encoding.UTF8.GetBytes(payload.ToCharArray());
request.ContentLength = buff.Length;
request.ContentType = "application/x-www-form-urlencoded";
System.IO.Stream reqStream = request.GetRequestStream();
reqStream.Write(buff, 0, buff.Length);
}
/// <summary>
/// Add cookies to the request object
/// </summary>
private void AddCookiesTo(HttpWebRequest request)
{
if (Cookies != null && Cookies.Count > 0)
{
request.CookieContainer.Add(Cookies);
}
}
/// <summary>
/// Saves cookies from the response object to the local CookieCollection object
/// </summary>
private void SaveCookiesFrom(HttpWebRequest request, HttpWebResponse response)
{
//save the cookies ;)
if (request.CookieContainer.Count > 0 || response.Cookies.Count > 0)
{
if (Cookies == null)
{
Cookies = new CookieCollection();
}
Cookies.Add(request.CookieContainer.GetCookies(request.RequestUri));
Cookies.Add(response.Cookies);
}
}
/// <summary>
/// Saves the form elements collection by parsing the HTML document
/// </summary>
private void SaveHtmlDocument(HtmlDocument document)
{
_htmlDoc = document;
FormElements = new FormElementCollection(_htmlDoc);
}
}
/// <summary>
/// Represents a combined list and collection of Form Elements.
/// </summary>
public class FormElementCollection : Dictionary<string, string>
{
/// <summary>
/// Constructor. Parses the HtmlDocument to get all form input elements.
/// </summary>
public FormElementCollection(HtmlDocument htmlDoc)
{
var inputs = htmlDoc.DocumentNode.Descendants("input");
foreach (var element in inputs)
{
string name = element.GetAttributeValue("name", "undefined");
string value = element.GetAttributeValue("value", "");
if (!this.ContainsKey(name))
{
if (!name.Equals("undefined"))
{
Add(name, value);
}
}
}
}
/// <summary>
/// Assembles all form elements and values to POST. Also html encodes the values.
/// </summary>
public string AssemblePostPayload()
{
StringBuilder sb = new StringBuilder();
foreach (var element in this)
{
string value = System.Web.HttpUtility.UrlEncode(element.Value);
sb.Append("&" + element.Key + "=" + value);
}
return sb.ToString().Substring(1);
}
}
Sorry, I don't know much about the HTML Agility Pack or the BrowserSession class you've mentioned. But I did try the same scenario with HtmlUnit and it worked just fine. I'm using a .NET wrapper (the source code of which can be found here and is explained a bit more here), and here's the code I've used (some details removed to protect the innocent):
var driver = new HtmlUnitDriver(true);
driver.Url = @"http://www.facebook.com/login.php";
var email = driver.FindElement(By.Name("email"));
email.SendKeys("some@email.com");
var pass = driver.FindElement(By.Name("pass"));
pass.SendKeys("xxxxxxxx");
var inputs = driver.FindElements(By.TagName("input"));
var loginButton = (from input in inputs
                   where input.GetAttribute("value").ToLower() == "login"
                      && input.GetAttribute("type").ToLower() == "submit"
                   select input).First();
loginButton.Click();
driver.Url = @"https://m.facebook.com/profile.php?id=1111111111";
Assert.That(driver.Title, Is.StringContaining("Title of page goes here"));
Hope this helps.
You might want to use WatiN (Web Application Testing In .NET) or Selenium to drive your browser. This will help make sure you don't have to fiddle with cookies or do any custom work to make subsequent requests work, since you're simulating an actual user.
I had similar symptoms - login worked but authentication cookie was not present in the cookie container and so it was not sent on subsequent requests. I found out this was because the web request was handling the Location: header internally, redirecting behind the scenes to a new page, losing the cookies in the process. I fixed this by adding:
request.AllowAutoRedirect = false; // Location header messing up cookie handling!
...to the OnPreRequest() function. It now looks like this:
protected bool OnPreRequest(HttpWebRequest request)
{
    request.AllowAutoRedirect = false; // Location header messing up cookie handling!
    AddCookiesTo(request);               // Add cookies that were saved from previous requests
    if (_isPost) AddPostDataTo(request); // We only need to add post data on a POST request
    return true;
}
I hope this can help someone experiencing the same issue.
Today I was facing the same problem. I also worked with Rohit Agarwal's BrowserSession class together with HtmlAgilityPack.
After a whole day of trial-and-error programming, I figured out that the problem is caused by not setting the correct cookies in the subsequent requests.
I couldn't change the initial BrowserSession code to work correctly, but I added the following functions and slightly modified the SaveCookiesFrom function. In the end it worked nicely for me.
The added/modified functions are the following:
class BrowserSession
{
    private bool _isPost;
    private HtmlDocument _htmlDoc;
    public CookieContainer cookiePot; //<- This is the new CookieContainer
    ...

    public string Get2(string url)
    {
        HtmlWeb web = new HtmlWeb();
        web.UseCookies = true;
        web.PreRequest = new HtmlWeb.PreRequestHandler(OnPreRequest2);
        web.PostResponse = new HtmlWeb.PostResponseHandler(OnAfterResponse2);
        HtmlDocument doc = web.Load(url);
        return doc.DocumentNode.InnerHtml;
    }

    public bool OnPreRequest2(HttpWebRequest request)
    {
        request.CookieContainer = cookiePot;
        return true;
    }

    protected void OnAfterResponse2(HttpWebRequest request, HttpWebResponse response)
    {
        //do nothing
    }

    private void SaveCookiesFrom(HttpWebResponse response)
    {
        if ((response.Cookies.Count > 0))
        {
            if (Cookies == null)
            {
                Cookies = new CookieCollection();
            }
            Cookies.Add(response.Cookies);
            cookiePot.Add(Cookies); //-> add the Cookies to the cookiePot
        }
    }
}
What it does: it basically saves the cookies from the initial "Post" response and adds the same CookieContainer to the requests made later. I do not fully understand why it was not working in the initial version, because the AddCookiesTo function somehow does the same thing (if (Cookies != null && Cookies.Count > 0) request.CookieContainer.Add(Cookies);).
Anyhow, with these added functions it should work fine now.
It can be used like this:
//initial "Login-procedure"
BrowserSession b = new BrowserSession();
b.Get("http://www.blablubb/login.php");
b.FormElements["username"] = "yourusername";
b.FormElements["password"] = "yourpass";
string response = b.Post("http://www.blablubb/login.php");
all subsequent calls should use:
response = b.Get2("http://www.blablubb/secondpageyouwannabrowseto");
response = b.Get2("http://www.blablubb/thirdpageyouwannabrowseto");
...
I hope it helps many people facing the same problem!
Have you checked out their new API?
http://developers.facebook.com/docs/authentication/
You can call a straightforward URL to get an OAuth 2.0 access token and attach it to the rest of your requests...
https://graph.facebook.com/oauth/authorize?
client_id=...&
redirect_uri=http://www.example.com/oauth_redirect
Change redirect_uri to whatever URL you want, and it will be called back with a parameter called "access_token" on it. Get that and make whatever automated SDK calls you want.
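A rough sketch of what that could look like in code once you have the token (the token value and the /me endpoint are placeholders; see the documentation linked above for the calls you actually need):
// Hypothetical follow-up call: attach the access_token you received to a Graph API request.
using (var client = new System.Net.WebClient())
{
    string accessToken = "ACCESS_TOKEN_FROM_REDIRECT"; // placeholder
    string json = client.DownloadString(
        "https://graph.facebook.com/me?access_token=" + Uri.EscapeDataString(accessToken));
    // json now holds the profile data of the authenticated user
}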

DB won't reinitialize to local subscribers after Schema changes

First, I will outline my issue in case someone has an alternate fix.
The Problem:
I have a WinForms app that uses merge replication. This is working great, except that I needed to make changes to the columns and primary key on 5 tables. I dropped them from the articles and then made my changes. I then re-added them to the articles and set the publication to Reinitialize All.
Unfortunately, this does not work. When I go to run the subscription program, it tells me that the subscription is invalid.
EDIT 1
I have a correction/addition here. The actual errors I am getting in the Replication Monitor are as follows:
Error messages:
The schema script 'tblCaseNotes_4.sch' could not be propagated to the subscriber. (Source: MSSQL_REPL, Error number: MSSQL_REPL-2147201001)
Get help: http://help/MSSQL_REPL-2147201001
Could not drop object 'dbo.tblCaseNotes' because it is referenced by a FOREIGN KEY constraint. (Source: MSSQLServer, Error number: 3726)
Get help: http://help/3726
This seems important because it means that my MergeRepl sync process is trying to reinitialize but cannot, because of the issue below.
The way I was able to fix it on my machine was to use SSMS to delete the DB and then run my program, which creates a DB and syncs it. Unfortunately, I do not have SSMS access to all the remote users' SQL Express installs, as remote connections are turned off for security reasons.
My Idea:
Create a small program that runs a .sql script to drop the DB on the local machine, à la DROP DATABASE MyDB. This is only the test stage, so no data preservation is needed.
Unfortunately I haven't the faintest idea how to have a program do that.
The Code:
This is the code that runs as my program loads. It takes care of creating the local DBs and subscription if they aren't already there. It then checks to see if they need to be synchronized and kicks off a pull sync if needed. I include it because of the possibility that my solution is a change to this code.
I call this code like this:
MergeRepl matrixMergeRepl = new MergeRepl(SystemInformation.ComputerName + "\\SQLEXPRESS","WWCSTAGE","MATRIX","MATRIX","MATRIX");
matrixMergeRepl.RunDataSync();
MergeRepl is below:
public class MergeRepl
{
// Declare nessesary variables
private string subscriberName;
private string publisherName;
private string publicationName;
private string subscriptionDbName;
private string publicationDbName;
private MergePullSubscription mergeSubscription;
private MergePublication mergePublication;
private ServerConnection subscriberConn;
private ServerConnection publisherConn;
private Server theLocalSQLServer;
private ReplicationDatabase localRepDB;
public MergeRepl(string subscriber, string publisher, string publication, string subscriptionDB, string publicationDB)
{
subscriberName = subscriber;
publisherName = publisher;
publicationName = publication;
subscriptionDbName = subscriptionDB;
publicationDbName = publicationDB;
//Create connections to the Publisher and Subscriber.
subscriberConn = new ServerConnection(subscriberName);
publisherConn = new ServerConnection(publisherName);
// Define the pull mergeSubscription
mergeSubscription = new MergePullSubscription
{
ConnectionContext = subscriberConn,
DatabaseName = subscriptionDbName,
PublisherName = publisherName,
PublicationDBName = publicationDbName,
PublicationName = publicationName
};
// Ensure that the publication exists and that it supports pull subscriptions.
mergePublication = new MergePublication
{
Name = publicationName,
DatabaseName = publicationDbName,
ConnectionContext = publisherConn
};
// Create the local SQL Server instance
theLocalSQLServer = new Server(subscriberConn);
// Create a Replication DB Object to initiate Replication settings on local DB
localRepDB = new ReplicationDatabase(subscriptionDbName, subscriberConn);
// Check that the database exists locally
CreateDatabase(subscriptionDbName);
}
/// <exception cref="ApplicationException">There is insufficient metadata to synchronize the subscription.Recreate the subscription with the agent job or supply the required agent properties at run time.</exception>
public void RunDataSync()
{
// Keep program from appearing 'Not Responding'
///// Application.DoEvents();
// Does the needed Databases exist on local SQLExpress Install
/////CreateDatabase("ContactDB");
try
{
// Connect to the Subscriber
subscriberConn.Connect();
// if the Subscription exists, then start the sync
if (mergeSubscription.LoadProperties())
{
// Check that we have enough metadata to start the agent
if (mergeSubscription.PublisherSecurity != null || mergeSubscription.DistributorSecurity != null)
{
// Synchronously start the merge Agent for the mergeSubscription
// lblStatus.Text = "Data Sync Started - Please Be Patient!";
mergeSubscription.SynchronizationAgent.Synchronize();
}
else
{
throw new ApplicationException("There is insufficient metadata to synchronize the subscription." +
"Recreate the subscription with the agent job or supply the required agent properties at run time.");
}
}
else
{
// do something here if the pull mergeSubscription does not exist
// throw new ApplicationException(String.Format("A mergeSubscription to '{0}' does not exist on {1}", publicationName, subscriberName));
CreateMergeSubscription();
}
}
catch (Exception ex)
{
// Implement appropriaate error handling here
throw new ApplicationException("The subscription could not be synchronized. Verify that the subscription has been defined correctly.", ex);
//CreateMergeSubscription();
}
finally
{
subscriberConn.Disconnect();
}
}
/// <exception cref="ApplicationException"><c>ApplicationException</c>.</exception>
public void CreateMergeSubscription()
{
// Keep program from appearing 'Not Responding'
// Application.DoEvents();
try
{
if (mergePublication.LoadProperties())
{
if ((mergePublication.Attributes & PublicationAttributes.AllowPull) == 0)
{
mergePublication.Attributes |= PublicationAttributes.AllowPull;
}
// Make sure that the agent job for the mergeSubscription is created.
mergeSubscription.CreateSyncAgentByDefault = true;
// Create the pull mergeSubscription at the Subscriber.
mergeSubscription.Create();
Boolean registered = false;
// Verify that the mergeSubscription is not already registered.
foreach (MergeSubscription existing in mergePublication.EnumSubscriptions())
{
if (existing.SubscriberName == subscriberName
&& existing.SubscriptionDBName == subscriptionDbName
&& existing.SubscriptionType == SubscriptionOption.Pull)
{
registered = true;
}
}
if (!registered)
{
// Register the local mergeSubscription with the Publisher.
mergePublication.MakePullSubscriptionWellKnown(
subscriberName, subscriptionDbName,
SubscriptionSyncType.Automatic,
MergeSubscriberType.Local, 0);
}
}
else
{
// Do something here if the publication does not exist.
throw new ApplicationException(String.Format(
"The publication '{0}' does not exist on {1}.",
publicationName, publisherName));
}
}
catch (Exception ex)
{
// Implement the appropriate error handling here.
throw new ApplicationException(String.Format("The subscription to {0} could not be created.", publicationName), ex);
}
finally
{
publisherConn.Disconnect();
}
}
/// <summary>
/// This will make sure the needed DataBase exists locally before allowing any interaction with it.
/// </summary>
/// <param name="whichDataBase">The name of the DataBase to check for.</param>
/// <returns>True if the specified DataBase exists, False if it doesn't.</returns>
public void CreateDatabase(string whichDataBase)
{
Database db = LocalDBConn(whichDataBase, theLocalSQLServer, localRepDB);
if (!theLocalSQLServer.Databases.Contains(whichDataBase))
{
//Application.DoEvents();
// Create the database on the instance of SQL Server.
db = new Database(theLocalSQLServer, whichDataBase);
db.Create();
}
localRepDB.Load();
localRepDB.EnabledMergePublishing = false;
localRepDB.CommitPropertyChanges();
if (!mergeSubscription.LoadProperties())
{
CreateMergeSubscription();
}
}
private Database LocalDBConn(string databaseName, Server server, ReplicationDatabase replicationDatabase)
{
return server.Databases[replicationDatabase.Name];
}
/// <summary>
/// Checks for the existence of the Publication. If there is one it verifies Allow Pull is set
/// </summary>
/// <returns>True if Publication is present. False if not.</returns>
public bool CheckForPublication()
{
// If LoadProperties() returns TRUE then the Publication exists and is reachable
if (mergePublication.LoadProperties())
return true;
if ((mergePublication.Attributes & PublicationAttributes.AllowPull) == 0)
{
mergePublication.Attributes |= PublicationAttributes.AllowPull;
}
return false;
} // end CheckForPublication()
/// <summary>
/// Checks for the existence of a Subscription.
/// </summary>
/// <returns>True if a Subscription is present. False if not</returns>
public bool CheckForSubscription()
{
// Check for the existence of the Subscription
return mergeSubscription.IsExistingObject;
} // end CheckForSubscription()
}
The Guerdon (Reward):
This is extremely important to me so even if I am a flaming idiot and there is a super simple solution I will be adding a bounty to the correct answer.
EDIT 2
I created this to try to remove the subscription first, which it does, but it still errors out on the DROP DATABASE portion, saying the database is in use:
class Program
{
    static void Main(string[] args)
    {
        DropSubscription();
        DropDB();
    }

    private static void DropSubscription()
    {
        ServerConnection subscriberConn = new ServerConnection(".\\SQLEXPRESS");
        MergePullSubscription mergePullSubscription = new MergePullSubscription("MATRIX", "WWCSTAGE", "MATRIX", "MATRIX", subscriberConn);
        mergePullSubscription.Remove();
    }

    private static void DropDB()
    {
        SqlCommand cmd;
        string sql;
        string dbName = "MATRIX";
        SqlConnection sqlConnection = new SqlConnection("Server=.\\SQLEXPRESS;Initial Catalog=" + dbName + ";Integrated Security=True;User Instance=False");
        sqlConnection.Open();
        sql = "DROP DATABASE " + dbName;
        cmd = new SqlCommand(sql, sqlConnection);
        cmd.ExecuteNonQuery();
        sqlConnection.Close();
    }
}
If you're in testing phase (and I certainly don't recommend significant schema changes on a production system), then just drop the subscription and database on the subscriber machines and start over. If you can connect to them through SSMS then you can do it from there; or if you have physical access to them you can do it with SQLCMD.
I have code for dropping subscriptions and databases using SMO but it has to be run on the subscriber. Let me know if you think it would be helpful and I'll post it.
Edited to add: OK, the code is below. I don't have time right now to clean it up so it's raw. RaiseSyncManagerStatus is a method to display the status back to the UI because these methods are invoked asynchronously. Hope this helps -- bring on the guerdon. :-)
public void DropSubscription()
{
try
{
RaiseSyncManagerStatus(string.Format("Dropping subscription '{0}'.", _publicationName));
Server srv = new Server(_subscriberName);
MergePullSubscription sub = GetSubscription(srv.ConnectionContext);
// Remove if it exists
// Cannot remove from publisher because sysadmin or dbo roles are required
if (sub.LoadProperties() == true)
{
sub.Remove();
RaiseSyncManagerStatus("Subscription dropped.");
RaiseSyncManagerStatus("Removing subscription registration from the publisher.");
Server srvPub = new Server(_publisherName);
MergePublication pub = GetPublication(srvPub.ConnectionContext);
// Remove the subscription registration
pub.RemovePullSubscription(srv.Name, _subscriberDbName);
}
else
{
RaiseSyncManagerStatus("Failed to drop subscription; LoadProperties failed.");
}
}
catch (Exception ex)
{
RaiseSyncManagerStatus(ex);
throw;
}
}
public void DropSubscriberDb()
{
try
{
RaiseSyncManagerStatus(string.Format("Dropping subscriber database '{0}'.", _subscriberDbName));
if (SubscriptionValid())
{
throw new Exception("Subscription exists; cannot drop local database.");
}
Server srv = new Server(_subscriberName);
Database db = srv.Databases[_subscriberDbName];
if (db == null)
{
RaiseSyncManagerStatus("Subscriber database not found.");
}
else
{
RaiseSyncManagerStatus(string.Format("Subscriber database state: '{0}'.", db.State));
srv.KillDatabase(_subscriberDbName);
RaiseSyncManagerStatus("Subscriber database dropped.");
}
}
catch (Exception ex)
{
RaiseSyncManagerStatus(ex);
throw;
}
}
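If SMO isn't available on a machine, the same effect can be approximated with plain ADO.NET: connect to master (not to the database you are about to drop, which is part of why the EDIT 2 code reports the database as "in use"), force it to single-user mode to close other connections, then drop it. This is only a sketch and assumes a local SQL Express instance:
private static void ForceDropDatabase(string dbName)
{
    // Connect to master so this connection itself does not hold the target database open.
    using (var conn = new SqlConnection("Server=.\\SQLEXPRESS;Initial Catalog=master;Integrated Security=True"))
    {
        conn.Open();
        var statements = new[]
        {
            "ALTER DATABASE [" + dbName + "] SET SINGLE_USER WITH ROLLBACK IMMEDIATE",
            "DROP DATABASE [" + dbName + "]"
        };
        foreach (var text in statements)
        {
            using (var cmd = new SqlCommand(text, conn))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }
}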
If I have understood your 'original' issue correctly, then you need to create a new Snapshot of the publication before it can be reinitialized. This is so that any structural changes you have made are applied to the subscribers.
See Adding Articles to and Dropping Articles from Existing Publications for more information and follow the specific steps for Merge replication.
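If you would rather trigger that from code than from Management Studio, one option is to call the sp_startpublication_snapshot system procedure in the publication database at the publisher. This is only a sketch; it assumes the Snapshot Agent job exists and reuses the server, database, and publication names from the question:
// Sketch: start the Snapshot Agent for the publication so that subscribers can
// reinitialize against a snapshot that contains the new schema.
using (var conn = new SqlConnection("Server=WWCSTAGE;Initial Catalog=MATRIX;Integrated Security=True"))
using (var cmd = new SqlCommand("sp_startpublication_snapshot", conn))
{
    cmd.CommandType = System.Data.CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@publication", "MATRIX");
    conn.Open();
    cmd.ExecuteNonQuery();
}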

Retry the request if the first one failed

I am using an XML web service in my web app, and sometimes the remote server fails to respond in time. I came up with the idea of re-requesting if the first attempt fails. To prevent a loop, I want to limit the number of attempts to 2. I would like an opinion on whether what I have done below is OK and would work as I expect.
public class ScEngine
{
    private int _attemptcount = 0;

    public int attemptcount
    {
        get
        {
            return _attemptcount;
        }
        set
        {
            _attemptcount = value;
        }
    }

    public DataSet GetStat(string q, string job)
    {
        try
        {
            //snip....
            attemptcount += attemptcount;
            return ds;
        }
        catch
        {
            if (attemptcount >= 2)
            {
                return null;
            }
            else
            {
                return GetStat(q, job);
            }
        }
    }
}
public class ScEngine
{
    public DataSet GetStat(string q, string job)
    {
        int attemptCount = 0;
        while (attemptCount < 2)
        {
            try
            {
                attemptCount++;
                var ds = ... //web service call
                return ds;
            }
            catch { }
        }
        //log the error
        return null;
    }
}
You forgot to increment the attemptcount. Plus, if there's any error on the second run, it will not be caught (and thus becomes an unhandled exception).
I wouldn't recurse in order to retry. Also, I wouldn't catch and ignore all exceptions. I'd learn which exceptions indicate an error that should be retried, and would catch those. You will be ignoring serious errors, as your code stands.
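For example, something like this: non-recursive, and only retrying failures that look transient. Which exception types count as transient depends on the proxy you use; WebException with a timeout status is shown purely as an illustration, and CallService stands in for your real web service call:
public DataSet GetStat(string q, string job)
{
    const int maxAttempts = 2;
    for (int attempt = 1; attempt <= maxAttempts; attempt++)
    {
        try
        {
            return CallService(q, job); // placeholder for the actual proxy call
        }
        catch (System.Net.WebException ex)
        {
            // Only retry failures that look transient; rethrow everything else immediately.
            bool transient = ex.Status == System.Net.WebExceptionStatus.Timeout;
            if (!transient || attempt == maxAttempts)
            {
                throw;
            }
        }
    }
    return null; // not reached; present only so every code path returns a value
}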
You don't want to solve it this way. You will just put more load on the servers and cause more timeouts.
You can increase the web service timeout via httpRuntime. Web services typically return a lot of data in one call, so I find myself doing this pretty frequently. Don't forget to increase how long the client is willing to wait on the client side.
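On the server side that means raising executionTimeout on the httpRuntime element in web.config; on the client side, old-style ASMX proxies expose a Timeout property in milliseconds (inherited from WebClientProtocol). A sketch of the client part, where the proxy class name is made up:
// "StatServiceProxy" is a hypothetical generated ASMX proxy class; Timeout is in milliseconds.
var service = new StatServiceProxy();
service.Timeout = (int)TimeSpan.FromMinutes(5).TotalMilliseconds;
DataSet ds = service.GetStat("query", "job");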
Here's a version that doesn't use recursion but achieves the same result. It also includes a delay so you can give the server time to recover if it hiccups.
/// <summary>
/// The maximum amount of attempts to use before giving up on an update, delete or create
/// </summary>
private const int MAX_ATTEMPTS = 2;

/// <summary>
/// Attempts to execute the specified delegate with the specified arguments.
/// </summary>
/// <param name="operation">The operation to attempt.</param>
/// <param name="arguments">The arguments to provide to the operation.</param>
/// <returns>The result of the operation if there are any.</returns>
public static object attemptOperation(Delegate operation, params object[] arguments)
{
    //attempt the operation using the default max attempts
    return attemptOperation(MAX_ATTEMPTS, operation, arguments);
}

/// <summary>
/// Use for creating a random delay between retry attempts.
/// </summary>
private static Random random = new Random();

/// <summary>
/// Attempts to execute the specified delegate with the specified arguments.
/// </summary>
/// <param name="operation">The operation to attempt.</param>
/// <param name="arguments">The arguments to provide to the operation.</param>
/// <param name="maxAttempts">The number of times to attempt the operation before giving up.</param>
/// <returns>The result of the operation if there are any.</returns>
public static object attemptOperation(int maxAttempts, Delegate operation, params object[] arguments)
{
    //set our initial attempt count
    int attemptCount = 1;
    //set the default result
    object result = null;
    //we've not succeeded yet
    bool success = false;
    //keep trying until we get a result
    while (success == false)
    {
        try
        {
            //attempt the operation and get the result
            result = operation.DynamicInvoke(arguments);
            //we succeeded if there wasn't an exception
            success = true;
        }
        catch
        {
            //if we've got to the max attempts and still have an error, give up and rethrow it
            if (attemptCount++ == maxAttempts)
            {
                //propagate the exception
                throw;
            }
            else
            {
                //create a random delay in milliseconds
                int randomDelayMilliseconds = random.Next(1000, 5000);
                //sleep for the specified amount of milliseconds
                System.Threading.Thread.Sleep(randomDelayMilliseconds);
            }
        }
    }
    //return the result
    return result;
}
