SubSonic not closing connections - C#

I'm using LINQ from SubSonic 3 like this:
for (int x = 0; x < 100; x++) {
var v = (from c in db.categories
where c.parent == 10
select c);
if (v.Count() > 0) return null;
category[] c = v.ToArray();
}
For some reason SubSonic is not closing the connections, so after a few runs of the above loop I run out of SQL connections in the pool, or MySQL simply refuses to allow more connections. I've tried this both with SS 3.0.3 and with the SVN build, and I keep getting these errors.
What should I be doing to close out the connections after I get a set of results?
Thanks

The problem - believe it or not - isn't SubSonic. It's the $*$&$ MySQL driver. We explicitly close off the connection when you do queries like this, but I've seen the MySQL driver completely ignore the closure in favor of some really, really lame attempts at optimization.
I don't know what to tell you here - I'm very sorry to say.
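If the driver really is holding connections open, one blunt stop-gap (a sketch, not a SubSonic feature; MySqlConnection.ClearAllPools() is a real static method in Connector/NET) is to flush the connection pools between batches of queries:
// Blunt stop-gap, assuming the MySQL Connector/NET provider (namespace MySql.Data.MySqlClient):
// throw away every pooled connection so the server-side limit isn't exhausted.
// This does nothing to fix the underlying leak.
MySql.Data.MySqlClient.MySqlConnection.ClearAllPools();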

There seem to be some problems with the MySQL .NET library. A few days ago they fixed some of those issues in 6.2.2, related to releasing connections.
But there is also a problem with SubSonic. I used the LINQ templates with MySQL to generate my classes. Whenever FirstOrDefault() or First() is used (the other similar methods probably have the same issue), the data reader is never disposed.
For a query such as:
var db = new MyDb("CONNECTIONSTRING_NAME");
var userExt = (from pe in db.PhoneExtensions
where
pe.FirstName.ToLower() == firstName.ToLower() &&
pe.LastName.ToLower() == lastName.ToLower()
select pe.Extension).FirstOrDefault();
This will cause the query to be executed and the reader will not be disposed.
The problem is in Linq.Structure.DbQueryProvider, in the Project<T> method:
while (reader.Read())
{
yield return fnProjector(reader);
}
reader.Dispose();
The Dispose() never gets called when using FirstOrDefault() and the other similar methods, because enumeration stops after the first row: execution never resumes past the yield return, so the line after the loop is skipped.
A simple fix:
try
{
while (reader.Read())
{
yield return fnProjector(reader);
}
}
finally
{
reader.Dispose();
}
A simple, quick test showing the issue (disposing a compiler-generated iterator runs its pending finally blocks, which is why the second version still cleans up when enumeration stops after the first element):
private class DbDataReader : System.IDisposable
{
#region IDisposable Members
public void Dispose() { }
#endregion
}
private class DbQueryProvider
{
private DbDataReader _reader;
public bool IsReaderDisposed { get { return _reader == null; } }
public DbQueryProvider()
{
_reader = new DbDataReader();
}
public IEnumerable<int> Project(int numResults)
{
int i = 0;
while (i < numResults)
{
yield return i++;
}
_reader.Dispose();
_reader = null;
}
public IEnumerable<int> ProjectWithFinally(int numResults)
{
int i = 0;
try
{
while (i < numResults)
{
yield return i++;
}
}
finally
{
_reader.Dispose();
_reader = null;
}
}
}
[Test]
public void YieldReturn_Returns_TrueForIsReaderDisposed()
{
const int numResults = 1;
var qp1 = new DbQueryProvider();
var q1 = qp1.Project(numResults);
Assert.IsInstanceOf(typeof(int), q1.First());
var qp2 = new DbQueryProvider();
var q2 = qp2.Project(numResults);
Assert.IsInstanceOf(typeof(int), q2.FirstOrDefault());
var qp3 = new DbQueryProvider();
var q3 = qp3.Project(numResults);
Assert.IsInstanceOf(typeof(int), q3.Single());
var qp4 = new DbQueryProvider();
var q4 = qp4.Project(numResults);
Assert.IsInstanceOf(typeof(int), q4.SingleOrDefault());
Assert.IsTrue(qp1.IsReaderDisposed);
Assert.IsTrue(qp2.IsReaderDisposed);
Assert.IsTrue(qp3.IsReaderDisposed);
Assert.IsTrue(qp4.IsReaderDisposed);
}
[Test]
public void YieldReturnFinally_Returns_TrueForIsReaderDisposed()
{
const int numResults = 1;
var qp1 = new DbQueryProvider();
var q1 = qp1.ProjectWithFinally(numResults);
Assert.IsInstanceOf(typeof(int), q1.First());
var qp2 = new DbQueryProvider();
var q2 = qp2.ProjectWithFinally(numResults);
Assert.IsInstanceOf(typeof(int), q2.FirstOrDefault());
var qp3 = new DbQueryProvider();
var q3 = qp3.ProjectWithFinally(numResults);
Assert.IsInstanceOf(typeof(int), q3.Single());
var qp4 = new DbQueryProvider();
var q4 = qp4.ProjectWithFinally(numResults);
Assert.IsInstanceOf(typeof(int), q4.SingleOrDefault());
Assert.IsTrue(qp1.IsReaderDisposed);
Assert.IsTrue(qp2.IsReaderDisposed);
Assert.IsTrue(qp3.IsReaderDisposed);
Assert.IsTrue(qp4.IsReaderDisposed);
}
YieldReturnFinally_Returns_TrueForIsReaderDisposed passes, but YieldReturn_Returns_TrueForIsReaderDisposed fails: First() and the other short-circuiting operators stop enumerating after a single element, so only the version with the finally block disposes the reader.
I have tested this on the project I am working on, which will soon be in production, and it seems to work without any problems. Tested with a connection pool max size of 5 and had no connection pool issues (never ran out of connections on my dev machine when doing one query at a time).
I also found some issues in Extensions.Database related to type changing and assignments.
I forked the project on GitHub, committed my changes and opened a pull request; hopefully that gets to the right people.


Page LDAP query against AD in .NET Core using Novell LDAP

I am using the Novell LDAP library to query an Active Directory from a .NET Core application. Most of the queries succeed, but some return more than 1000 results, which the AD server refuses. I therefore tried to find out how to page LDAP queries using Novell's library. The solution I put together looks like this:
public IEnumerable<LdapUser> GetUsers() {
this.Connect();
try {
var cntRead = 0; // Total users read.
int? cntTotal = null; // Users available.
var curPage = 0; // Current page.
var pageSize = this._config.LdapPageSize; // Users per page.
this.Bind();
this._logger.LogInformation("Searching LDAP users.");
do {
var constraints = new LdapSearchConstraints();
// The following has no effect:
//constraints.MaxResults = 10000;
// Commenting out the following succeeds until the 1000th entry.
constraints.setControls(GetListControl(curPage, pageSize));
var results = this._connection.Search(
this._config.LdapSearchBase,
this.LdapSearchScope,
this._config.LdapUsersFilter,
this.LdapUserProperties,
false,
constraints);
while (results.hasMore() && ((cntTotal == null) || (cntRead < cntTotal))) {
++cntRead;
LdapUser user = null;
try {
var result = results.next();
Debug.WriteLine($"Found user {result.DN}.");
user = new LdapUser() {
AccountName = result.getAttribute(this._config.LdapAccountAttribute)?.StringValue,
DisplayName = result.getAttribute(this._config.LdapDisplayNameAttribute)?.StringValue
};
} catch (LdapReferralException) {
continue;
}
yield return user;
}
++curPage;
cntTotal = GetTotalCount(results);
} while ((cntTotal != null) && (cntRead < cntTotal));
} finally {
this._connection.Disconnect();
}
}
and uses the following two helper methods:
private static LdapControl GetListControl(int page, int pageSize) {
Debug.Assert(page >= 0);
Debug.Assert(pageSize >= 0);
var index = page * pageSize + 1;
var before = 0;
var after = pageSize - 1;
var count = 0;
Debug.WriteLine($"LdapVirtualListControl({index}, {before}, {after}, {count}) = {before}:{after}:{index}:{count}");
return new LdapVirtualListControl(index, before, after, count);
}
private static int? GetTotalCount(LdapSearchResults results) {
Debug.Assert(results != null);
if (results.ResponseControls != null) {
var r = (from c in results.ResponseControls
let d = c as LdapVirtualListResponse
where (d != null)
select (LdapVirtualListResponse) c).SingleOrDefault();
if (r != null) {
return r.ContentCount;
}
}
return null;
}
Setting constraints.MaxResults does not seem to have any effect on the AD server. If I do not set the LdapVirtualListControl, the retrieval succeeds until the 1000th entry is retrieved.
If I use the LdapVirtualListControl, the operation fails at the first call to results.next() with the following exception:
System.Collections.Generic.KeyNotFoundException: The given key '76' was not present in the dictionary.
at System.Collections.Generic.Dictionary`2.get_Item(TKey key)
at Novell.Directory.Ldap.Utilclass.ResourcesHandler.getResultString(Int32 code, CultureInfo locale)
at Novell.Directory.Ldap.LdapResponse.get_ResultException()
at Novell.Directory.Ldap.LdapResponse.chkResultCode()
at Novell.Directory.Ldap.LdapSearchResults.next()
The code at https://github.com/dsbenghe/Novell.Directory.Ldap.NETStandard/blob/master/src/Novell.Directory.Ldap.NETStandard/Utilclass/ResultCodeMessages.cs suggests that this is just a follow-up error, and that the real problem is the call failing with error code 76, whose meaning I don't know. I therefore think that I am missing something in my query. What is wrong here?
I fixed it - in case someone else runs into this:
After some Internet research, I found on https://ldap.com/ldap-result-code-reference-other-server-side-result-codes/#rc-virtualListViewError what error code 76 means and that the LdapVirtualListResponse contains more information. In my case, the error was https://ldap.com/ldap-result-code-reference-other-server-side-result-codes/#rc-sortControlMissing - so it seems that a sort control is required for paging.
In order to fix it, I added
constraints.setControls(new[] {
new LdapSortControl(new LdapSortKey("cn"), true),
GetListControl(curPage, pageSize)
});
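For context, this is how the constraints setup inside the do loop above looks with that change applied (a sketch assembled from the code in the question; "cn" is just the sort attribute used here and should be an attribute the server can actually sort on):
var constraints = new LdapSearchConstraints();
// The virtual list view control is only honoured together with a
// server-side sort control, hence the LdapSortControl added here.
constraints.setControls(new LdapControl[] {
    new LdapSortControl(new LdapSortKey("cn"), true),
    GetListControl(curPage, pageSize)
});
var results = this._connection.Search(
    this._config.LdapSearchBase,
    this.LdapSearchScope,
    this._config.LdapUsersFilter,
    this.LdapUserProperties,
    false,
    constraints);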

Waiting for a parse.com async task to finish in Unity3D

As part of my school project I'm trying to link two tables to decrease the amount of data stored in one table, so I want to link my "Scores" class with my "CorrectAnswers" class via the ObjectId. However, since the tasks are asynchronous, by the time one task has finished saving, the other task has already begun (or also finished) saving, so the ObjectId comes back as null.
Here's the code I'm using:
public void SaveScore()
{
ParseObject SendScore = new ParseObject("Scores");
SendScore["Score"] = CheckAnswer.score;
SendScore["user"] = ParseObject.CreateWithoutData("_User", ParseUser.CurrentUser.ObjectId);
SendScore["TestMode"] = MainMenu.testmode;
SendScore["TotalQuestions"] = QuestionCreation.TotalQuestions;
SendScore["CorrectQuestions"] = CheckAnswer.CorrectQuestions;
SendScore.SaveAsync().ContinueWith(t =>
{
ScoreObjectId = SendScore.ObjectId;
});
ParseObject SendCorrectTopics = new ParseObject("CorrectAnswers");
SendCorrectTopics["Score"] = SendScore.ObjectId;
for (int i = 0; i <= 9; i++)
{
string Topic = "Topic" + (i + 1).ToString();
SendCorrectTopics[Topic] = CheckAnswer.CorrectTopics[i];
}
SendCorrectTopics.SaveAsync();
SceneManager.LoadScene(0);
}
How would I be able to make the second save wait until the first save has finished? I'm somewhat new to C#, so I don't quite know all its features yet. I've looked into "await", but Unity doesn't seem to like that. Any help would be greatly appreciated!
Thanks in advance,
EDIT: Okay, after a bit more reading on Unity's coroutines, I found a much better approach that only checks when needed:
IEnumerator CheckSave()
{
while (ScoreObjectId == null && !DoneSave)
{
print("Running");
yield return new WaitForSeconds(0.5f);
}
DoneSave = false;
SaveTotalTopics();
}
This seems like a much better way of doing it.
Well, it seems the answer was something I've already done before, even if it is a little ugly.
Using Unity's Update function, I created a check to make sure the ObjectId is not null and the previous save has completed, like so:
void Update () {
if (ScoreObjectId != null && DoneSave)
{
DoneSave = false;
SaveTotalTopics();
}
}
Thus I split it into two saves:
public void SaveScore()
{
ParseObject SendScore = new ParseObject("Scores");
SendScore["Score"] = CheckAnswer.score;
SendScore["user"] = ParseObject.CreateWithoutData("_User", ParseUser.CurrentUser.ObjectId);
SendScore["TestMode"] = MainMenu.testmode;
SendScore["TotalQuestions"] = QuestionCreation.TotalQuestions;
SendScore["CorrectQuestions"] = CheckAnswer.CorrectQuestions;
Task SendingScores = SendScore.SaveAsync().ContinueWith(t =>
{
if (t.IsFaulted || t.IsCanceled)
{
DoneSave = false;
print(t.Exception);
}
else
{
DoneSave = true;
print("Setting object ID!");
ScoreObjectId = SendScore.ObjectId;
print(ScoreObjectId);
}
});
}
void SaveTotalTopics()
{
// SendCorrectTopics must exist before it is used below; in this split version it is created here (it could equally be a class-level field).
ParseObject SendCorrectTopics = new ParseObject("CorrectAnswers");
for (int i = 0; i <= 9; i++)
{
string Topic = "Topic" + (i + 1).ToString();
SendCorrectTopics[Topic] = CheckAnswer.CorrectTopics[i];
}
SendCorrectTopics["UserScore"] = ParseObject.CreateWithoutData("Scores", ScoreObjectId);
SendCorrectTopics.SaveAsync().ContinueWith(t =>
{
if(t.IsFaulted || t.IsCanceled)
{
print(t.Exception);
}
else
{
print("Saved!");
}
});
}
I'd also forgotten to use ParseObject.CreateWithoutData() so my first code snippet wouldn't have worked even if I'd found a better method...
So, although I'm not happy with the final result, at least it works, and I don't think running an if statement every frame will significantly impact my game's performance.
Why not use a bool and a while loop?
public IEnumerator SaveScore()
{
bool canContinue = false;
ParseObject SendScore = new ParseObject("Scores");
SendScore["Score"] = CheckAnswer.score;
SendScore["user"] = ParseObject.CreateWithoutData("_User", ParseUser.CurrentUser.ObjectId);
SendScore["TestMode"] = MainMenu.testmode;
SendScore["TotalQuestions"] = QuestionCreation.TotalQuestions;
SendScore["CorrectQuestions"] = CheckAnswer.CorrectQuestions;
SendScore.SaveAsync().ContinueWith(t =>
{
ScoreObjectId = SendScore.ObjectId;
//set the bool canContinue to true because the first portion of code has finished running
canContinue = true;
});
//wait while the canContinue bool is false
while(!canContinue){
yield return null;
}
//continue your parse code
ParseObject SendCorrectTopics = new ParseObject("CorrectAnswers");
SendCorrectTopics["Score"] = SendScore.ObjectId;
for (int i = 0; i <= 9; i++)
{
string Topic = "Topic" + (i + 1).ToString();
SendCorrectTopics[Topic] = CheckAnswer.CorrectTopics[i];
}
SendCorrectTopics.SaveAsync();
SceneManager.LoadScene(0);
}
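A closely related variant (a sketch, untested, reusing the field and method names from the snippets above) is to poll the Task returned by SaveAsync() directly from the coroutine, which also makes the error case easy to check:
public IEnumerator SaveScoreAndTopics()
{
    ParseObject SendScore = new ParseObject("Scores");
    SendScore["Score"] = CheckAnswer.score;
    // ... fill in the remaining fields exactly as in SaveScore() above ...

    var saveScore = SendScore.SaveAsync();
    while (!saveScore.IsCompleted)      // wait without blocking the main thread
        yield return null;

    if (saveScore.IsFaulted || saveScore.IsCanceled)
    {
        print(saveScore.Exception);
        yield break;
    }

    ParseObject SendCorrectTopics = new ParseObject("CorrectAnswers");
    SendCorrectTopics["Score"] = SendScore.ObjectId;
    // ... fill in the topic fields as above ...

    var saveTopics = SendCorrectTopics.SaveAsync();
    while (!saveTopics.IsCompleted)
        yield return null;

    SceneManager.LoadScene(0);
}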

Mongo C# driver 2.0 - DeleteOneAsync inside a cursor gives unexpected results

I have done the paging implementation using the following:
.Find(_ => true).Skip(PageSize * (int)(PageNumber - 1)).Limit(PageSize).ToListAsync().Result;
and inside the paging code I have called DeleteOneAsync (with an _id filter). I have over 500,000 records and the paging works fine; it's just that the delete API doesn't delete all the records as expected. My pseudo-code is as follows:
while(true)
{
var page = GetPage(pageIdx++); //starts with 1
if(page.Count == 0)
break;
foreach(var p in page)
{
Delete(p);
}
}
No error is raised anywhere and all the processing runs fine, but at the end I expect all records to be deleted, yet only a chunk of them is deleted. Any clue why this happens, or whether there is an issue in my paging code?
Sample code:
public static class ConsoleProgram
{
const int PAGE_SIZE = 100;
public static void Main(string[] args)
{
MongoClientSettings clientSettings = new MongoClientSettings();
clientSettings.Server = new MongoServerAddress("localhost", 27017);
MongoDB.Driver.MongoClient client = new MongoDB.Driver.MongoClient(clientSettings);
IMongoDatabase db = client.GetDatabase("petrel");
IMongoCollection<BsonDocument> mt = db.GetCollection<BsonDocument>("PatientDocuments");
int pageNo = 1;
int count = 0;
while (true)
{
IEnumerable<MongoDB.Bson.BsonDocument> page = null;
if(pageNo == 1)
page = mt.Find(_ => true).Limit(PAGE_SIZE).ToListAsync().Result;
else
page = mt.Find(_ => true).Skip(PAGE_SIZE * (pageNo -1)).Limit(PAGE_SIZE).ToListAsync().Result;
if(page.Count() == 0)
break;
foreach (var p in page)
{
ObjectId id = (ObjectId)p["_id"];
DeleteResult dr = mt.DeleteOneAsync(Builders<BsonDocument>.Filter.Eq("_id", id)).Result;
if (dr.IsAcknowledged)
{
if (dr.DeletedCount != 1)
{
throw new Exception(string.Format("Count [value:{0}] after delete is invalid", dr.DeletedCount));
}
}
else
throw new Exception(string.Format("Delete for [_id:{0}] was not acknowledged", id.ToString()));
}
count += page.Count();
pageNo++;
}
Console.WriteLine("Done, count:{0}", count);
Console.ReadLine();
}
}
The cursor is not isolated, so it sees the data you've already deleted: when you pull the next page with Skip, you skip over records you intended to delete. If you pulled page 1 each time, it would work the way you seem to want.
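A minimal sketch of that suggestion (untested; names follow the sample above): because each pass deletes the documents it just read, keep taking the first PAGE_SIZE documents instead of skipping ahead. If the goal is simply to empty the collection, a single DeleteMany call avoids the cursor entirely.
while (true)
{
    var page = mt.Find(_ => true).Limit(PAGE_SIZE).ToListAsync().Result;
    if (page.Count == 0)
        break;

    foreach (var p in page)
    {
        ObjectId id = (ObjectId)p["_id"];
        mt.DeleteOneAsync(Builders<BsonDocument>.Filter.Eq("_id", id)).Wait();
    }
}

// Or, to delete everything in one round trip:
// mt.DeleteManyAsync(Builders<BsonDocument>.Filter.Empty).Wait();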

Add multiple records using Linq-to-SQL

I want to add multiple rows into a table using LINQ to SQL:
public static FeedbackDatabaseDataContext context = new FeedbackDatabaseDataContext();
public static bool Insert_Question_Answer(List<QuestionClass.Tabelfields> AllList)
{
Feedback f = new Feedback();
List<Feedback> fadd = new List<Feedback>();
for (int i = 0; i < AllList.Count; i++)
{
f.Email = AllList[i].Email;
f.QuestionID = AllList[i].QuestionID;
f.Answer = AllList[i].SelectedOption;
fadd.Add(f);
}
context.Feedbacks.InsertAllOnSubmit(fadd);
context.SubmitChanges();
return true;
}
When I add records to the list object (fadd), every record gets overwritten with the last value in AllList.
I'm late to the party, but I thought you might want to know that the for-loop is unnecessary. Better use foreach (you don't need the index).
It gets even more interesting when you use LINQ (I have renamed the method for clarity):
public static void InsertFeedbacks(IEnumerable<QuestionClass.Tabelfields> allList)
{
var fadd = from field in allList
select new Feedback
{
Email = field.Email,
QuestionID = field.QuestionID,
Answer = field.SelectedOption
};
context.Feedbacks.InsertAllOnSubmit(fadd);
context.SubmitChanges();
}
By the way, you shouldn't keep one data context that you access all the time; it's better to create one locally, inside a using statement, that will properly handle the database disconnection.
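A sketch of that suggestion (type names taken from the question; otherwise the same projection as above), creating the DataContext locally so it is disposed as soon as the insert completes:
public static void InsertFeedbacks(IEnumerable<QuestionClass.Tabelfields> allList)
{
    // A local context inside a using block is disposed (and its connection
    // released) as soon as SubmitChanges() has run.
    using (var context = new FeedbackDatabaseDataContext())
    {
        var fadd = from field in allList
                   select new Feedback
                   {
                       Email = field.Email,
                       QuestionID = field.QuestionID,
                       Answer = field.SelectedOption
                   };

        context.Feedbacks.InsertAllOnSubmit(fadd);
        context.SubmitChanges();
    }
}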
You should create the Feedback object inside the scope of the for loop, so change your method to:
public static bool Insert_Question_Answer(List<QuestionClass.Tabelfields> AllList)
{
List<Feedback> fadd = new List<Feedback>();
for (int i = 0; i < AllList.Count; i++)
{
Feedback f = new Feedback();
f.Email = AllList[i].Email;
f.QuestionID = AllList[i].QuestionID;
f.Answer = AllList[i].SelectedOption;
fadd.Add(f);
}
context.Feedbacks.InsertAllOnSubmit(fadd);
context.SubmitChanges();
return true;
}

NHibernate random data from database

IList<Companies> companies = NHibernateSession.CreateCriteria(typeof(Companies))
.AddOrder(new RandomOrder())
.SetMaxResults(3)
.List<Companies>();
public class RandomOrder : Order
{
public RandomOrder() : base("", true) { }
public override NHibernate.SqlCommand.SqlString ToSqlString(ICriteria criteria, ICriteriaQuery criteriaQuery)
{
return new NHibernate.SqlCommand.SqlString("newid()");
}
}
How can I select 3 random rows from the database? The code I pasted is not working very well.
Something like this might work... though it'll require 2 db calls:
public IEnumerable<Company> GetRandomCompanies(int maxSelections)
{
try
{
IList<int> companyIds = _session.CreateCriteria<Company>() // get all available company ids
.SetProjection(LambdaProjection.Property<Company>(c => c.Id)).List<int>();
return _session.CreateCriteria<Company>()
.Add(Restrictions.In(LambdaProjection.Property<Company>(c => c.Id), GetRandomCompanyIds(companyIds.ToList(), maxSelections))) // get 3 random Ids
.List<Company>();
}
catch (Exception xpt)
{
ErrorSignal.FromCurrentContext().Raise(xpt);
}
return new List<Company>();
}
private List<int> GetRandomCompanyIds(List<int> companyIds, int maxSelections)
{
List<int> randomIds = new List<int>();
// this seed will get you the same result all day, new next day;
// it might not be what you need, so you could just use a new seed.
// (Create the Random once, outside the loop, or each pick returns the same id.)
Random rng = new Random(DateTime.Now.DayOfYear);
for (int i = 0; i < maxSelections; i++)
{
randomIds.Add(companyIds[rng.Next(companyIds.Count)]);
}
return randomIds;
}
edit: also, I haven't tested this at all so who knows what it'll do! It should be at least on the right track. Maybe there's a way that doesn't require 2 db calls
In NHibernate you can simply select random rows like this, using SQL:
var query = "SELECT top 3 * from [Companies] ORDER BY NEWID()";
ISQLQuery qry = session.CreateSQLQuery(query).AddEntity(typeof(Companies));
IList<Companies> randomCompanies = qry.List<Companies>();
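Note that NEWID() is SQL Server syntax; if the database happens to be MySQL, the equivalent ordering function is RAND() (a hedged variant of the query above):
var query = "SELECT * FROM Companies ORDER BY RAND() LIMIT 3";
ISQLQuery qry = session.CreateSQLQuery(query).AddEntity(typeof(Companies));
IList<Companies> randomCompanies = qry.List<Companies>();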
