C# Windows Service memory usage steadily increasing

I don't know if it is a problem but...
This is the main call inside a Windows service.
The lower level logs into a WebService, retrieves information and passes it up.
A timer is hitting the main call 4 times a minute.
Gen 2 heap size gradually increases. The GC does its best, so you end up with a sort of saw-tooth memory consumption, but the gradient of the saw tooth trends slowly upwards. Working set and private bytes also increase.
The LOH size, however, stays constant.
The culprit appears to be System.Byte[].
ANTS Memory Profiler shows no linkage to any of my program's objects, i.e. the retention graphs only mention System objects.
The profiler shows 60% being held by System.Collections.Concurrent.ConcurrentStack<T>+Node; the other 40% is divided amongst other system classes.
I have refactored until I am blue in the face. Behavior seems consistent, making me think that perhaps it is not a problem but expected behavior.
If I return before the marked assignment, the memory consumption is 'correct'; likewise, if I return immediately after the assignment, the consumption is 'faulty'.
So...
Is it a real problem, and if so, any suggestions on what to do would be welcome.
public void GetItsValues()
{
    using (Dt dt = new Dt())
    {
        List<ItsSite> siteList = dt.GetCurrentItsSites().ToList();
        IEnumerator<ItsSite> enSite = siteList.GetEnumerator();
        while (enSite.MoveNext())
        {
            using (ClsItsParking its = new ClsItsParking(enSite.Current))
            {
                List<ClsPbpRegistration> resultList;
                //ClsPbpRegistration[] reglist = its.PbpRegistrationsBySite(enSite.Current).ToArray();
                resultList = its.PbpRegistrationsBySite(enSite.Current); //***THIS assignment causes the problem
                if (resultList.Any())
                {
                    using (Dt dtSite = new Dt(enSite.Current.ConnectionString))
                    {
                        // var result = dtSite.WriteItsPurchases(reglist);
                        var result = dtSite.WriteItsPurchases(resultList);
                        if (!result)
                            ClsLogger.WriteErrorLog("dt.WritePurchases returned false for site " + enSite.Current.CompanyName);
                    }
                }
            }
        }
    }
}
Sorry, I'm not much good at posting code.
The dt.GetItsPurchases and GetCurrentItsSites signatures, and the PbpRegistrationsBySite body:
public IEnumerable<ItsSite> GetCurrentItsSites()
public ClsPbpRegistration[] GetItsPurchases(DateTime purchaseDate)

public List<ClsPbpRegistration> PbpRegistrationsBySite(ItsSite site)
{
    string regUrl = null;
    try
    {
        if (!Logon(site))
            ClsLogger.WriteErrorLog("ClsItsParking. Logon returned false");
        /**** Working Code *****/
        TimeSpan tWindow = new TimeSpan(0, TimeWindowMinutes, 0);
        DateTime startDate = DateTime.Now - tWindow;
        string startHour = startDate.Hour.ToString("00");
        string startMinute = startDate.Minute.ToString("00");
        DateTime endDate = DateTime.Now;
        string endHour = endDate.Hour.ToString("00");
        string endMinute = endDate.Minute.ToString("00");
        string toDaysPurchases = "?searchFromDate=" + startDate.ToString(DateFormat) +
            "&searchFromTimeHour=" + startHour + "&searchFromTimeMinute=" + startMinute +
            "&searchToDate=" + endDate.ToString(DateFormat) +
            "&searchToTimeHour=" + endHour + "&searchToTimeMinute=" + endMinute + "&searchBy=modified";
        regUrl = site.UrlBase + site.RegistrationListExtension + toDaysPurchases;
        string registrationResponseString;
        registrationResponseString = HttpGetRequest(regUrl, Cookies, ItsUserAgent);
        {
            Logoff();
            if (!string.IsNullOrEmpty(registrationResponseString))
            {
                XmlDocument motherShipRegistrations = new XmlDocument();
                motherShipRegistrations.LoadXml(registrationResponseString);
                //listRegistrations = GetNodeList(motherShipRegistrations);
                //ClsPbpRegistration[] listRegistrations = GetRegistrationList(motherShipRegistrations);
                //return null;
                //return listRegistrations;
                return GetRegistrationList(motherShipRegistrations);
            }
            return null;
        }
    }
    catch (Exception ex)
    {
        ClsLogger.WriteErrorLog("PbpRegistrationsBySite " + ex.Message + " Sent url was " + regUrl);
        return null;
    }
}
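One way to check whether this is a genuine leak rather than just deferred collection is to drop the result reference right after the flagged assignment and force a full blocking collection; if the Gen 2 / byte[] footprint falls back, the GC could have reclaimed the memory and the saw tooth is mostly timing. A rough diagnostic sketch (temporary instrumentation inside the loop of GetItsValues above, not production code):

// Diagnostic sketch only: call the suspect method, release the result, then
// force a full blocking GC and log the surviving managed heap size.
resultList = its.PbpRegistrationsBySite(enSite.Current);
resultList = null;

GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();

// GC.GetTotalMemory(true) forces another collection and waits for it to finish.
ClsLogger.WriteErrorLog("Managed heap after full collect: " + GC.GetTotalMemory(true) + " bytes");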

Related

Where do I have memory leaks and how do I fix them? Why does memory consumption increase?

I have been struggling for a few days with growing memory consumption in a .NET Core 2.2 console application, and I have run out of ideas for what else I could improve.
In my application I have a method that triggers the StartUpdatingAsync method:
public MenuViewModel()
{
    if (File.Exists(_logFile))
        File.Delete(_logFile);
    try
    {
        StartUpdatingAsync("basic").GetAwaiter().GetResult();
    }
    catch (ArgumentException aex)
    {
        Console.WriteLine($"Caught ArgumentException: {aex.Message}");
    }
    Console.ReadKey();
}
StartUpdatingAsync creates a 'repo', and that instance gets a list of objects to be updated (around 200k) from the DB:
private async Task StartUpdatingAsync(string dataType)
{
    _repo = new DataRepository();
    List<SomeModel> some_list = new List<SomeModel>();
    some_list = _repo.GetAllToBeUpdated();
    await IterateStepsAsync(some_list, _step, dataType);
}
Within IterateStepsAsync we get updates, parse them against existing data, and update the DB. Inside each while iteration I was creating new instances of all the classes and lists, to be sure that the old ones release their memory, but it didn't help. I was also calling GC.Collect() at the end of the method, which is not helping either. Please note that the method below triggers lots of parallel tasks, but they are supposed to be disposed within it, am I right?:
private async Task IterateStepsAsync(List<SomeModel> some_list, int step, string dataType)
{
    List<Area> areas = _repo.GetAreas();
    int counter = 0;
    while (counter < some_list.Count)
    {
        _repo = new DataRepository();
        _updates = new HttpUpdates();
        List<Task> tasks = new List<Task>();
        List<VesselModel> vessels = new List<VesselModel>();
        SemaphoreSlim throttler = new SemaphoreSlim(_degreeOfParallelism);
        for (int i = counter; i < step; i++)
        {
            int iteration = i;
            bool skip = false;
            if (dataType == "basic" && (some_list[iteration].Mmsi == 0 || !some_list[iteration].Speed.HasValue)) //if could not be parsed with "full"
                skip = true;
            tasks.Add(Task.Run(async () =>
            {
                string updated = "";
                await throttler.WaitAsync();
                try
                {
                    if (!skip)
                    {
                        Model model = await _updates.ScrapeSingleModelAsync(some_list[iteration].Mmsi);
                        while (Updating)
                        {
                            await Task.Delay(1000);
                        }
                        if (model != null)
                        {
                            lock (((ICollection)vessels).SyncRoot)
                            {
                                vessels.Add(model);
                                scraped = BuildData(model);
                            }
                        }
                    }
                    else
                    {
                        //do nothing
                    }
                }
                catch (Exception ex)
                {
                    Log("Scrape error: " + ex.Message);
                }
                finally
                {
                    while (Updating)
                    {
                        await Task.Delay(1000);
                    }
                    Console.WriteLine("Updates for " + counter++ + " of " + some_list.Count + scraped);
                    throttler.Release();
                }
            }));
        }
        try
        {
            await Task.WhenAll(tasks);
        }
        catch (Exception ex)
        {
            Log("Critical error: " + ex.Message);
        }
        finally
        {
            _repo.UpdateModels(vessels, dataType, counter, some_list.Count, _step);
            step = step + _step;
            GC.Collect();
        }
    }
}
Inside the method above we call _repo.UpdateModels, where the DB is updated. I tried two approaches, using EF Core and SqlConnection, both with similar results. You can find both below.
EF Core
internal List<VesselModel> UpdateModels(List<Model> vessels, string dataType, int counter, int total, int _step)
{
    for (int i = 0; i < vessels.Count; i++)
    {
        Console.WriteLine("Parsing " + i + " of " + vessels.Count);
        Model existing = _context.Vessels.Where(v => v.id == vessels[i].Id).FirstOrDefault();
        if (vessels[i].LatestActivity.HasValue)
        {
            existing.LatestActivity = vessels[i].LatestActivity;
        }
        //and similar parsing several times, as above
    }
    Console.WriteLine("Saving ...");
    _context.SaveChanges();
    return new List<Model>(_step);
}
SqlConnection
internal List<VesselModel> UpdateModels(List<Model> vessels, string dataType, int counter, int total, int _step)
{
    if (vessels.Count > 0)
    {
        using (SqlConnection connection = GetConnection(_connectionString))
        using (SqlCommand command = connection.CreateCommand())
        {
            connection.Open();
            StringBuilder querySb = new StringBuilder();
            for (int i = 0; i < vessels.Count; i++)
            {
                Console.WriteLine("Updating " + i + " of " + vessels.Count);
                //PARSE
                VesselAisUpdateModel existing = new VesselAisUpdateModel();
                if (vessels[i].Id > 0)
                {
                    //find existing
                }
                if (existing != null)
                {
                    //update for basic data
                    querySb.Append("UPDATE dbo." + _vesselsTableName + " SET Id = '" + vessels[i].Id + "'");
                    if (existing.Mmsi == 0)
                    {
                        if (vessels[i].MMSI.HasValue)
                        {
                            querySb.Append(" , MMSI = '" + vessels[i].MMSI + "'");
                        }
                    }
                    //and similar parsing several times, as above
                    querySb.Append(" WHERE Id= " + existing.Id + "; ");
                    querySb.AppendLine();
                }
            }
            try
            {
                Console.WriteLine("Sending SQL query to " + counter);
                command.CommandTimeout = 3000;
                command.CommandType = CommandType.Text;
                command.CommandText = querySb.ToString();
                command.ExecuteNonQuery();
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
            finally
            {
                connection.Close();
            }
        }
    }
    return new List<Model>(_step);
}
The main problem is that after tens/hundreds of thousands of updated models, my console application's memory consumption increases continuously, and I have no idea why.
SOLUTION: my problem was inside the ScrapeSingleModelAsync method, where I was using HtmlAgilityPack incorrectly, which I was able to debug thanks to cassandrad.
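The exact misuse isn't shown, but one HtmlAgilityPack pattern that can produce this kind of growth is holding on to HtmlNode references after parsing, since every node keeps its whole OwnerDocument reachable. A purely illustrative sketch of extracting plain values instead (the URL, XPath, _httpClient field, and Model properties are assumptions, not the original code):

// Illustrative only: parse the page, copy primitive values out, and let the
// HtmlDocument (and every HtmlNode it owns) become unreachable after the call.
public async Task<Model> ScrapeSingleModelAsync(long mmsi)
{
    // _httpClient: assumed shared HttpClient field; the real endpoint is not shown in the question
    string html = await _httpClient.GetStringAsync("https://example.com/vessels/" + mmsi);

    var doc = new HtmlAgilityPack.HtmlDocument();
    doc.LoadHtml(html);

    // hypothetical XPath; the point is to keep strings/numbers, not HtmlNode objects
    var speedNode = doc.DocumentNode.SelectSingleNode("//span[@class='speed']");
    if (speedNode == null)
        return null;

    return new Model
    {
        Mmsi = mmsi,                                      // assumed property
        Speed = double.Parse(speedNode.InnerText.Trim())  // assumed property
    };
}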
Your code is messy, with a huge number of different objects with unknown lifetimes. It's hardly possible to figure out the problem just by looking at it.
Consider using profiling tools, for example Visual Studio's Diagnostic Tools; they will help you find which objects are living too long in the heap. The overview of its functions related to memory profiling is highly recommended reading.
In short, you need to take two snapshots and look at what objects are taking the most memory. Let's look at a simple example.
int[] first = new int[10000];
Console.WriteLine(first.Length);
int[] second = new int[9999];
Console.WriteLine(second.Length);
Console.ReadKey();
Take the first snapshot once your function has run at least once. In my case, I took the snapshot when the first huge block had been allocated.
After that, let your app run for some time so the difference in memory usage becomes noticeable, then take the second memory snapshot.
You'll notice that another snapshot is added, with info about the size of the difference. To get more specific info, click one of the blue labels of the latest snapshot to open the snapshot comparison.
Following my example, we can see that there is a change in the count of int arrays. By default int[] wasn't visible in the table, so I had to uncheck Just My Code in the filtering options.
So, this is what needs to be done. After you figure out which objects increase in count or size over time, you can locate where these objects are created and optimize that operation.

SAML Idp Creation taking too much time

I am using "Kentor.AuthServices.dll" and "Kentor.AuthServices.Mvc.dll" in my code to allow single sign-on with an ADFS server. It is working fine, but the problem is that it takes more than a minute to show the ADFS login screen.
I have debugged the code and recorded the timing, and found that all the code works fine but the identity-provider creation code takes more than a minute.
I am not able to understand why it is taking so much time.
I am putting my code below; can anyone please help?
Thanks in advance.
try
{
    CommonUtility.LogMessage("Start at:" + DateTime.Now);
    string adfsUrl = System.Configuration.ConfigurationManager.AppSettings["ADServer"] ?? "";
    if (string.IsNullOrEmpty(adfsUrl))
    {
        CommonUtility.LogMessage("no adfs server found in config");
        return RedirectToAction("Login", "Account", string.Empty);
    }
    string requestUrlScheme = System.Configuration.ConfigurationManager.AppSettings["ADInstance"] ?? "https";
    string federationUrl = System.Configuration.ConfigurationManager.AppSettings["ADFSMetaData"] ?? "";
    CommonUtility.LogMessage("metdaDataUrl=" + federationUrl);
    string trustUrl = string.Format("{0}/adfs/services/trust", adfsUrl);
    CommonUtility.LogMessage("trustURL=" + trustUrl);
    var idps = Kentor.AuthServices.Mvc.AuthServicesController.Options.IdentityProviders.KnownIdentityProviders;
    foreach (var idpItem in idps)
    {
        CommonUtility.LogMessage("existing ENtity ID=" + idpItem.EntityId.Id);
        if (idpItem.EntityId.Id.Equals(trustUrl))
        {
            Kentor.AuthServices.Mvc.AuthServicesController.Options.IdentityProviders.Remove(idpItem.EntityId);
            CommonUtility.LogMessage("removed existing entity at:" + DateTime.Now);
        }
    }
    var spOptions = CreateSPOptions(requestUrlScheme);
    CommonUtility.LogMessage("SP option created at:" + DateTime.Now);
    Kentor.AuthServices.IdentityProvider idp = null;
    idp = new Kentor.AuthServices.IdentityProvider(new EntityId(trustUrl), spOptions)
    {
        AllowUnsolicitedAuthnResponse = true,
        LoadMetadata = true,
        MetadataLocation = federationUrl,
    };
    CommonUtility.LogMessage("idp added at:" + DateTime.Now);
    if (Kentor.AuthServices.Mvc.AuthServicesController.Options.SPOptions.EntityId == null)
        Kentor.AuthServices.Mvc.AuthServicesController.Options.SPOptions.EntityId = new EntityId(string.Concat(string.Format("{0}://{1}{2}", requestUrlScheme, Request.Url.Authority, Url.Content("~")), "AuthServices"));
    else
        Kentor.AuthServices.Mvc.AuthServicesController.Options.SPOptions.EntityId.Id =
            string.Concat(string.Format("{0}://{1}{2}", requestUrlScheme, Request.Url.Authority, Url.Content("~")), "AuthServices");
    CommonUtility.LogMessage("AuthServicesURL=" + string.Concat(string.Format("{0}://{1}{2}", requestUrlScheme, Request.Url.Authority, Url.Content("~")), "AuthServices"));
    Kentor.AuthServices.Mvc.AuthServicesController.Options.SPOptions.ReturnUrl =
        new Uri(string.Concat(string.Format("{0}://{1}{2}", requestUrlScheme, Request.Url.Authority, Url.Content("~")), "SAMLAuthentication/SAMLResponse"));
    CommonUtility.LogMessage("SAMLResponseURL=" + string.Concat(string.Format("{0}://{1}{2}", requestUrlScheme, Request.Url.Authority, Url.Content("~")), "SAMLAuthentication/SAMLResponse"));
    Kentor.AuthServices.Mvc.AuthServicesController.Options.IdentityProviders.Add(idp);
    CommonUtility.LogMessage("redirect times:" + DateTime.Now);
    return RedirectToAction("SignIn", "AuthServices", new { idp = trustUrl });
}
catch (Exception ex)
{
    CommonUtility.LogException(ex);
    throw ex;
}
When you use "LoadMetadata", the IdentityProvider object will load the metadata from the remote address at construction time. If I remember correctly, that's done synchronously to be able to report errors back as an exception. Does it take time (or give a timeout) to download the metadata?
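If the metadata download is the slow part, one common mitigation is to construct and register the IdentityProvider once at application start instead of on every request, so the synchronous download happens a single time. A rough sketch, assuming an MVC Global.asax and that the SP options can be built outside the controller (BuildSpOptions is a hypothetical stand-in for the question's CreateSPOptions):

// Global.asax.cs -- hypothetical placement; the configuration keys are the ones
// already read in the question's action method.
protected void Application_Start()
{
    string adfsUrl = System.Configuration.ConfigurationManager.AppSettings["ADServer"] ?? "";
    string federationUrl = System.Configuration.ConfigurationManager.AppSettings["ADFSMetaData"] ?? "";
    string trustUrl = string.Format("{0}/adfs/services/trust", adfsUrl);

    var idp = new Kentor.AuthServices.IdentityProvider(new EntityId(trustUrl), BuildSpOptions())
    {
        AllowUnsolicitedAuthnResponse = true,
        LoadMetadata = true,          // metadata is downloaded here, once, at startup
        MetadataLocation = federationUrl
    };

    Kentor.AuthServices.Mvc.AuthServicesController.Options.IdentityProviders.Add(idp);
}

The per-request action then only needs the RedirectToAction("SignIn", ...) call, and the metadata download no longer sits on the login path.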

Entity Framework - How To Handle Batch SaveChanges Failure

In my C# program I am using Entity Framework to synchronize a local SQL Server database with QuickBooks data. Getting the data from QuickBooks does not seem to have any issues. However I am running into a stumbling block when doing batch commits of entities.
Currently I am building up the DataContext with a configurable number of entities and then committing the entities in batch. So far the batch has not failed, but what if it does? My idea to combat this would be to iterate over the batch and submit each entity one at a time and then log the one(s) that is/are causing the commit failure.
However I do not see a way to do this with the data context since it appears to be an all or nothing matter when using SaveChanges(). Is there a way to handle what I am trying to accomplish, or should I be going about dealing with the failures in a completely different way?
Here is the code that I currently have, in case you want to take a look at it:
int itemsCount = 0;
int itemsSynced = 0;
int itemsFailed = 0;
ArrayList exceptions = new ArrayList();
int batchSliceCount = Properties.Settings.Default.SyncBatchSize; // Getting the max batch size from the settings
int index = 1; // Index used for keeping track of current batch size on data context
List<Customer> currentBatch = new List<Customer>(); // List to hold current batch
db = new DataContext(DatabaseHelper.GetLocalDatabaseConnectionString());
foreach (var customer in QBResponse.customers)
{
    itemsCount++;
    try
    {
        string debugMsg = "Saving Customer with the Following Details....." + Environment.NewLine;
        debugMsg += "ListId: " + customer.CustomerListId + Environment.NewLine;
        debugMsg += "FullName: " + customer.FullName + Environment.NewLine;
        int progressPercentage = (itemsCount * 100) / opResponse.retCount;
        UpdateStatus(Enums.LogLevel.Debug, debugMsg, progressPercentage);
        var dbCustomer = db.Customers.FirstOrDefault(x => x.CustomerListId == customer.CustomerListId);
        if (dbCustomer == null)
        {
            // customer.CopyPropertiesFrom(customer, db);
            Customer newCustomer = new Customer();
            newCustomer.CopyCustomer(customer, db);
            newCustomer.AddBy = Enums.OperationUser.SyncOps;
            newCustomer.AddDateTime = DateTime.Now;
            newCustomer.EditedBy = Enums.OperationUser.SyncOps;
            newCustomer.EditedDateTime = DateTime.Now;
            newCustomer.SyncStatus = true;
            db.Customers.Add(newCustomer);
            currentBatch.Add(newCustomer);
        }
        else
        {
            // dbCustomer.CopyPropertiesFrom(customer, db);
            dbCustomer.CopyCustomer(customer, db);
            dbCustomer.EditedBy = Enums.OperationUser.SyncOps;
            dbCustomer.EditedDateTime = DateTime.Now;
            dbCustomer.SyncStatus = true;
            currentBatch.Add(dbCustomer);
        }
        try
        {
            if (index % batchSliceCount == 0 || index == opResponse.customers.Count()) // Time to submit the batch
            {
                UpdateStatus(Enums.LogLevel.Information, "Saving Batch of " + batchSliceCount + " Customers to Local Database");
                db.SaveChanges();
                itemsSynced += currentBatch.Count();
                currentBatch = new List<Customer>();
                db.Dispose();
                db = new DataContext(DatabaseHelper.GetLocalDatabaseConnectionString());
            }
        }
        catch (Exception ex)
        {
            string errorMsg = "Error occurred submitting batch. Iterating and submitting one at a time. " + Environment.NewLine;
            errorMsg += "Error Was: " + ex.GetBaseException().Message + Environment.NewLine + "Stack Trace: " + ex.GetBaseException().StackTrace;
            UpdateStatus(Enums.LogLevel.Debug, errorMsg, progressPercentage);
            // What to do here? Is there a way to properly iterate over the context and submit a change one at a time?
        }
    }
    catch (Exception ex)
    {
        // Log exception and restart the data context
        db.Dispose();
        db = new DataContext(DatabaseHelper.GetLocalDatabaseConnectionString());
    }
    Thread.Sleep(Properties.Settings.Default.SynchronizationSleepTimer);
    index++;
}
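For the "what to do here?" comment above, a hedged sketch of the fallback the question describes: when the batch save fails, re-save each customer from the failed batch in its own fresh context and log the ones that throw (Customer, DataContext, DatabaseHelper, UpdateStatus, and Enums.LogLevel are the question's names; the attach/add choice assumes DataContext is an EF DbContext):

// Fallback sketch: retry each entity of the failed batch individually so the
// offending record(s) can be identified and logged, then skipped.
private void SaveOneAtATime(List<Customer> failedBatch)
{
    foreach (var customer in failedBatch)
    {
        using (var singleDb = new DataContext(DatabaseHelper.GetLocalDatabaseConnectionString()))
        {
            try
            {
                // Assumption: re-attach by key; the question's CopyCustomer helper
                // could be used here instead of attaching the tracked instance.
                if (singleDb.Customers.Any(x => x.CustomerListId == customer.CustomerListId))
                    singleDb.Entry(customer).State = System.Data.Entity.EntityState.Modified;
                else
                    singleDb.Customers.Add(customer);

                singleDb.SaveChanges();
            }
            catch (Exception ex)
            {
                // Log and move on; only the failing customer is lost, not the whole batch.
                UpdateStatus(Enums.LogLevel.Debug,
                    "Customer " + customer.CustomerListId + " failed to save: " + ex.GetBaseException().Message);
            }
        }
    }
}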
It depends on the exceptions you want to recover from...
If you are just looking for a way to retry when the connection was interrupted, you could use a custom execution strategy based on DbExecutionStrategy that retries when specific errors occur, as demonstrated in this CodeProject article.
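A minimal sketch of such a strategy (EF6's DbExecutionStrategy from System.Data.Entity.Infrastructure; the specific SqlException numbers treated as transient are illustrative, not a definitive list):

using System;
using System.Data.Entity;
using System.Data.Entity.Infrastructure;
using System.Data.SqlClient;

public class RetryOnTransientErrorStrategy : DbExecutionStrategy
{
    // Retry up to 3 times, waiting at most 5 seconds between attempts.
    public RetryOnTransientErrorStrategy()
        : base(3, TimeSpan.FromSeconds(5)) { }

    protected override bool ShouldRetryOn(Exception exception)
    {
        var sqlEx = exception as SqlException;
        // -2 = timeout, 1205 = deadlock victim (illustrative choices)
        return sqlEx != null && (sqlEx.Number == -2 || sqlEx.Number == 1205);
    }
}

// Registered through a code-based configuration class that EF discovers automatically.
public class LocalDbConfiguration : DbConfiguration
{
    public LocalDbConfiguration()
    {
        SetExecutionStrategy("System.Data.SqlClient", () => new RetryOnTransientErrorStrategy());
    }
}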

Azure Cloud Service for loop much slower in production than on localhost

This particular piece of code for some reason runs much slower on the production server (an XL Azure Cloud Service) than on localhost. The worst thing is that the slowness is not consistent, i.e. it is slow most of the time, but sometimes it runs fast. By slow I mean this piece of code takes about 4,000 ms on production (I know this because I'm using Azure Application Insights) and only around 60 ms on localhost. I cannot figure out why this is happening. Here's the code (please note that this is only part of a bigger method, but I'm sure this part is the slowest; everything else is much faster):
for (var i = 0; i < feedDeserialized.Length; i++)
{
    RedisWorkoutItem workout;
    bool hasRespected;
    string username, fullName, profilePic;
    // usersFromRedis is an array of Dictionary<string, string>
    var userRedis = usersFromRedis[i];
    // workoutsFromRedis is just an array of objects - RedisValue object
    var stringWorkout = workoutsFromRedis[i];
    // just an array of objects - RedisValue object
    var workoutComment = commentsFromRedis[i].HasValue ? commentsFromRedis[i].ToString() : "";
    if (userRedis != null)
    {
        profilePic = userRedis["ProfilePhotoUrl"].HasValue
            ? userRedis["ProfilePhotoUrl"].ToString()
            : "";
        fullName = userRedis["FirstName"] + " " + userRedis["LastName"];
        username = userRedis["UserName"].HasValue ? userRedis["UserName"].ToString() : "";
    }
    //code inside this else statement never happens
    else
    {
        var stopWatch2 = new Stopwatch();
        stopWatch2.Start();
        var user = databaseContext.Users.Find(feedDeserialized[i].UserId);
        profilePic = user.ProfilePhotoUrl;
        username = user.UserName;
        fullName = user.FirstName + " " + user.LastName;
        stopWatch2.Stop();
        telemetryHelper.TrackEvent(_telemetryClient,
            "CreateRedisFeedViewModelAsync: Went to DB for user", stopWatch2.Elapsed);
    }
    if (stringWorkout.HasValue)
    {
        workout = JsonConvert.DeserializeObject<RedisWorkoutItem>(stringWorkout);
        hasRespected = workout.UsersWhoRespected.Contains(userId);
    }
    //code inside this else statement never happens
    else
    {
        var stopWatch2 = new Stopwatch();
        stopWatch2.Start();
        var workoutGuid = Guid.Parse(feedDeserialized[i].WorkoutId);
        var workoutFromDb = await databaseContext.Trenings.FindAsync(workoutGuid);
        var routine = await databaseContext.AllRoutineses.FindAsync(workoutFromDb.AllRoutinesId);
        workout = new RedisWorkoutItem
        {
            Name = routine.Name,
            Id = workoutFromDb.TreningId.ToString(),
            Comment = workoutFromDb.UsersCommentOnWorkout,
            DateWhenFinished = workoutFromDb.DateTimeWhenTreningCreated,
            NumberOfRespects = workoutFromDb.NumberOfLikes,
            NumberOfComments = workoutFromDb.NumberOfComments,
            UserId = workoutFromDb.UserId,
            Length = workoutFromDb.LengthInSeconds,
            Points = workoutFromDb.Score
        };
        workoutComment = workoutFromDb.UsersCommentOnWorkout;
        hasRespected = databaseContext.TreningRespects
            .FirstOrDefault(r => r.TreningId == workoutGuid && r.UserId == userId) != null;
        stopWatch2.Stop();
        telemetryHelper.TrackEvent(_telemetryClient,
            "CreateRedisFeedViewModelAsync: Went to DB for workout", stopWatch2.Elapsed);
    }
    string workoutLength;
    if (workout.Length >= 3600)
    {
        var t = TimeSpan.FromSeconds(workout.Length);
        workoutLength = $"{t.Hours:D2}:{t.Minutes:D2}:{t.Seconds:D2}";
    }
    else
    {
        var t = TimeSpan.FromSeconds(workout.Length);
        workoutLength = $"{t.Minutes:D2}:{t.Seconds:D2}";
    }
    listToReturn.Add(new FeedMobileHelper
    {
        Id = feedDeserialized[i].Id.ToString(),
        UserId = workout.UserId,
        WorkoutId = feedDeserialized[i].WorkoutId,
        Points = workout.Points.ToString("N0", new NumberFormatInfo
        {
            NumberGroupSizes = new[] { 3 },
            NumberGroupSeparator = "."
        }),
        WorkoutName = workout.Name,
        WorkoutLength = workoutLength,
        NumberOfRespects = workout.NumberOfRespects,
        NumberOfComments = workout.NumberOfComments,
        WorkoutComment = workoutComment,
        HasRespected = hasRespected,
        UserImageUrl = profilePic,
        UserName = username,
        DisplayName = string.IsNullOrWhiteSpace(fullName) ? username : fullName,
        TimeStamp = workout.DateWhenFinished,
        DateFormatted = workout.DateWhenFinished.FormatDateToHrsDaysWeeksString()
    });
}
I cannot understand why this takes 4 seconds to complete on a cloud service with 8 cores and 15 GB of RAM, especially because there is nothing fancy here: no trips to the database (as I noted in the comments, the "else" branches are never executed) or disk; everything is done in memory. Most of the things in the loop are done in constant time, O(1). Can anyone please help me figure out what the problem is here?
One more thing: the API call that executes this part of the code isn't being hit by thousands of users at the same time; it is only called by me (I'm sure of this).
P.S.
feedDeserialized.Length is around 60
First thought: are you deploying a Debug or a Release build to the Azure Cloud Service? The Release build is optimized and much faster. If you right-click on your cloud service and choose Publish... you can select which build to deploy. Also, you might want to enable profiling (under the advanced settings of the publish wizard) to see the slow parts of your application.
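If the publish-wizard profiler isn't an option, a rough way to narrow it down is to time the individual statements in the loop with the telemetry helper the question already uses; a sketch (telemetryHelper, _telemetryClient, and the TrackEvent signature are taken from the question's own code):

// Rough timing sketch: wrap the deserialization separately from the Redis value
// reads so Application Insights shows which part eats the 4 seconds.
var sw = Stopwatch.StartNew();
workout = JsonConvert.DeserializeObject<RedisWorkoutItem>(stringWorkout);
sw.Stop();
telemetryHelper.TrackEvent(_telemetryClient, "Feed loop: JSON deserialize", sw.Elapsed);

sw.Restart();
hasRespected = workout.UsersWhoRespected.Contains(userId);
sw.Stop();
telemetryHelper.TrackEvent(_telemetryClient, "Feed loop: respect lookup", sw.Elapsed);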

High memory usage problem for WPF application

I made a WPF application that opens a CSV file and does some operations that include web scraping, and gets some values of type long (0-10000000).
Now the issue is that when a large list of about 2000 rows is opened, the software's memory usage rises above 700 MB, in some cases 1 GB.
I am shocked to see this.
Some things I think might be the cause:
If each entry of the CSV file has long values associated with it, it will take a lot of memory; a single entry has approximately 10-12 columns, each of type long, so when the row count is huge the memory shoots up.
There are certain places in the code with a loop (over all CSV rows) that creates an instance of a custom class. I thought of adding a destructor, then came to know that .NET manages memory automatically.
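(As a rough check of the first point: 2,000 rows × roughly 22 long columns × 8 bytes is only about 0.35 MB, so the long values by themselves cannot account for hundreds of megabytes.)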
Here is the code for loading the CSV:
try
{
    StreamReader sr = new StreamReader(path, Encoding.Default);
    labelRankCheckStatus.Dispatcher.Invoke(DispatcherPriority.Normal, new Action(delegate()
    {
        labelRankCheckStatus.Content = "Loading Data";
    }));
    string strline = "";
    string[] _values = null;
    int x = 0;
    while (!sr.EndOfStream)
    {
        x++;
        strline = sr.ReadLine();
        _values = strline.Split(',');
        if (x == 1)
        {
            textBoxKw1.Text = _values[12];
            textBoxKw2.Text = _values[14];
            textBoxKw3.Text = _values[16];
            textBoxKw4.Text = _values[18];
        }
        else if (x != 1)
        {
            if (_values[0] != "")
            {
                Url info = new Url();
                srNo++;
                info.URL = idn.GetAscii(_values[0].ToString().Trim());
                info.IsChecked = true;
                info.TestResults = int.Parse(_values[1].Replace("%", "").TrimEnd().TrimStart());
                info.PageRank = int.Parse(_values[2]);
                info.RelPageRank = int.Parse(_values[3].Replace("%", "").TrimEnd().TrimStart());
                info.Alexa = long.Parse(_values[4]);
                info.RelAlexa = long.Parse(_values[5].Replace("%", "").TrimEnd().TrimStart());
                info.Links = long.Parse(_values[6]);
                info.RelLinks = long.Parse(_values[7].Replace("%", "").TrimEnd().TrimStart());
                info.GIW = long.Parse(_values[8]);
                info.RelGIW = long.Parse(_values[9].Replace("%", "").TrimEnd().TrimStart());
                info.GIN = long.Parse(_values[10]);
                info.RelGIN = long.Parse(_values[11].Replace("%", "").TrimEnd().TrimStart());
                info.Kw1Indexed = long.Parse(_values[12]);
                info.RelKw1Indexed = long.Parse(_values[13].Replace("%", "").TrimEnd().TrimStart());
                info.Kw2Indexed = long.Parse(_values[14]);
                info.RelKw2Indexed = long.Parse(_values[15].Replace("%", "").TrimEnd().TrimStart());
                info.Kw3Indexed = long.Parse(_values[16]);
                info.RelKw3Indexed = long.Parse(_values[17].Replace("%", "").TrimEnd().TrimStart());
                info.Kw4Indexed = long.Parse(_values[18]);
                info.RelKw4Indexed = long.Parse(_values[19].Replace("%", "").TrimEnd().TrimStart());
                info.DKwIndexed = long.Parse(_values[20]);
                info.RelDKwIndexed = long.Parse(_values[21].Replace("%", "").TrimEnd().TrimStart());
                info.Info = _values[22];
                info.srNo = srNo;
                url.Add(info);
            }
        }
        dataGrid1.Dispatcher.Invoke(DispatcherPriority.Normal, new Action(delegate()
        {
            dataGrid1.Columns[2].Header = "URL ( " + url.Count + " )";
            try
            {
                if (dataGrid1.ItemsSource == null)
                    dataGrid1.ItemsSource = url;
                else
                    dataGrid1.Items.Refresh();
            }
            catch (Exception)
            {
            }
            labelRankCheckStatus.Dispatcher.Invoke(DispatcherPriority.Normal, new Action(delegate()
            {
                labelRankCheckStatus.Content = "Done";
            }));
        }));
    }
    sr.Close();
    labelRankCheckStatus.Dispatcher.Invoke(DispatcherPriority.Normal, new Action(delegate()
    {
        labelRankCheckStatus.Content = "Complete ";
    }));
}
catch (Exception c)
{
    MessageBox.Show(c.Message);
}
Instead of building in-memory copies of your large objects, consider a more functional approach where you stream data in, process it, and output it to your database of choice. If you need to do operations on the old data, you can use a SQL database like SQLite.
Creating managed objects for every single entity in your system is beyond wasteful; you won't need most of them.
Of course, if you have a lot of RAM, it might simply be that the GC isn't yet bothering to collect all your garbage because the memory isn't actively needed by anything. It's more likely that you're holding references to it though.
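A minimal sketch of the streaming approach suggested above, assuming Microsoft.Data.Sqlite as the local store (System.Data.SQLite would look similar); only the URL and one numeric column from the question's CSV layout are shown:

// Sketch: stream the CSV a line at a time and insert into SQLite instead of
// building a full in-memory list of row objects.
using Microsoft.Data.Sqlite;
using System.IO;

void ImportCsv(string path, string dbPath)
{
    using (var connection = new SqliteConnection("Data Source=" + dbPath))
    {
        connection.Open();
        using (var create = connection.CreateCommand())
        {
            create.CommandText = "CREATE TABLE IF NOT EXISTS Urls (Url TEXT, Alexa INTEGER)";
            create.ExecuteNonQuery();
        }

        using (var transaction = connection.BeginTransaction())
        using (var insert = connection.CreateCommand())
        using (var sr = new StreamReader(path))
        {
            insert.Transaction = transaction;
            insert.CommandText = "INSERT INTO Urls (Url, Alexa) VALUES ($url, $alexa)";
            var urlParam = insert.Parameters.Add("$url", SqliteType.Text);
            var alexaParam = insert.Parameters.Add("$alexa", SqliteType.Integer);

            string line;
            while ((line = sr.ReadLine()) != null)
            {
                var values = line.Split(',');
                if (values.Length < 5 || values[0] == "") continue;

                urlParam.Value = values[0].Trim();
                alexaParam.Value = long.Parse(values[4]);
                insert.ExecuteNonQuery();   // row is persisted; nothing is kept in memory
            }
            transaction.Commit();
        }
    }
}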
