SQL Query pauses - C#

My query seems to stall every so many passes through the loop.
status_text.Text = "Check existing records...";
status_text.Refresh();
using (StreamReader reader = new StreamReader(df_text_filename))
{
    using (StreamWriter writer = new StreamWriter(df_text_filename + "_temp"))
    {
        while ((product = reader.ReadLine()) != null)
        {
            if (product != _aff_svc.DFHeaderProd)
            {
                df_product = _product_factory.GetProductData(_vsi, product);
            }
            status_text.Text = "Checking for existing record of vendor record ID " + df_product.SKU;
            status_text.Refresh();
            if (_pctlr.GetBySKU(df_product.SKU) != null)
            {
                continue;
            }
            writer.WriteLine(product);
            Application.DoEvents();
        }
        writer.Close();
    }
    reader.Close();
}
System.IO.File.Delete(df_text_filename);
System.IO.File.Move(df_text_filename + "_temp", df_text_filename);
The code quickly runs through GetBySKU about ten times, pauses for about a second, then quickly does another ten records. This occurs throughout my processes, not just with this particular query.
It also occurs whether or not I have the Application.DoEvents() fire.
The other problem is that it is not consistent. I can be working like this for a few hours, then all of a sudden, it will zip through the loop as intended (expected).
My SQL server is running on the same machine as the program.
I looked into dedicating resources to the server to mitigate this behavior, but have found nothing.

It looks as if your program is parsing a text file for product info and, as it parses, executing a couple of SQL queries inside a while loop. It is almost always a bad idea to make SQL round trips inside a loop.
Instead, I would look into parsing the file, gathering all the product IDs, closing the file, and then making one call to SQL, passing one or more TVPs (table-valued parameters) to a stored procedure and returning all the data you need from that sproc - possibly as multiple tables.
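For illustration, a rough sketch of what the TVP round trip could look like, assuming a user-defined table type dbo.SkuList (one nvarchar column named Sku) and a stored procedure dbo.GetExistingSkus - both hypothetical names, as are skusFromFile and connectionString (uses System.Data and System.Data.SqlClient):
var skuTable = new DataTable();
skuTable.Columns.Add("Sku", typeof(string));
foreach (string sku in skusFromFile) // gathered while parsing the file
    skuTable.Rows.Add(sku);

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.GetExistingSkus", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    SqlParameter tvp = cmd.Parameters.AddWithValue("@Skus", skuTable);
    tvp.SqlDbType = SqlDbType.Structured;
    tvp.TypeName = "dbo.SkuList";
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        var existing = new HashSet<string>();
        while (reader.Read())
            existing.Add(reader.GetString(0)); // SKUs already in the db
        // afterwards, filter the file against 'existing' in memory
    }
}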
EDIT:
You mentioned in the comments that the file is very large with lots of processing. You could consider batching the SQL work in chunks of, say, 100 rows; a rough sketch follows.
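The loop shape could buffer lines and flush every 100 (ProcessBatch here is a hypothetical method that checks one batch against the database in a single query; the batch size of 100 is arbitrary):
const int BatchSize = 100;
var batch = new List<string>(BatchSize);
string line;
while ((line = reader.ReadLine()) != null)
{
    batch.Add(line);
    if (batch.Count == BatchSize)
    {
        ProcessBatch(batch); // one SQL round trip per 100 rows
        batch.Clear();
    }
}
if (batch.Count > 0)
    ProcessBatch(batch); // remainder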
Also, if your SQL isn't tuned, it will continually slow down as more data is written. There is not enough info in the question to analyze the indexes, query plans, etc., but have a look at that as the data set grows.

I will work on a batch solution later; however, this works much faster than the previous code. No pauses at all.
List<Product> _prod_list = ProductDataFactory.GetProductListByVendor(vendor_name);
if (_prod_list.Count() > 0)
{
    using (StreamReader reader = new StreamReader(df_text_filename))
    {
        using (StreamWriter writer = new StreamWriter(df_text_filename + "_temp"))
        {
            while ((product = reader.ReadLine()) != null)
            {
                if (product != _aff_svc.DFHeaderProd)
                {
                    df_product = _product_factory.GetProductData(_vsi, product);
                }
                if (_prod_list.Find(o => o.SKU == df_product.SKU) != null)
                {
                    continue;
                }
                writer.WriteLine(product);
            }
            writer.Close();
        }
        reader.Close();
    }
    System.IO.File.Delete(df_text_filename);
    System.IO.File.Move(df_text_filename + "_temp", df_text_filename);
}
Just pulling a list of product objects and querying it for existing records if there are any; if not, it skips the whole process of course. No need to hit the db in the loop either.
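One further improvement worth noting: List<T>.Find is a linear scan, so the check is O(n) per line. If the product list gets large, a HashSet<string> of SKUs makes each lookup O(1). A minimal sketch (assumes System.Linq; skuSet is new here):
// Build the lookup once, before the read loop
var skuSet = new HashSet<string>(_prod_list.Select(p => p.SKU));
// ... then inside the loop:
if (skuSet.Contains(df_product.SKU))
{
    continue; // record already exists
}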
Thanks.


Streamreader adds Column with opened File Dialog

In the following code I want my OpenFileDialog to stay open in a method until a valid file is selected. This only works conditionally: for some reason a column is added after the message is displayed. This causes correct data tables to be read incorrectly as well, if I previously selected an incorrect file.
public static InputData GetCSVData()
{
    InputData InputData = new InputData();
    OpenFileDialog OFDReader = new OpenFileDialog();
    // filter OpenFileDialog; show only CSV files
    OFDReader.Filter = "CSV files|*.csv;";
    // check if data contains "Date/Time"
    OFDReader.FileOk += delegate (object s, CancelEventArgs ev)
    {
        // search for line to start reader
        int LineCounter = 0;
        var readertmp = new StreamReader(OFDReader.FileName);
        while (true)
        {
            string LineTmp = readertmp.ReadLine();
            string record = "Date/Time";
            if (LineTmp.Contains(record))
            {
                break;
            }
            else if (readertmp.EndOfStream)
            {
                MessageBox.Show("Data has no DataPoints !", "Wrong Data", MessageBoxButtons.OK, MessageBoxIcon.Warning);
                ev.Cancel = true;
                break;
            }
            LineCounter++;
        }
        // read InputData
        var reader = new StreamReader(OFDReader.FileName);
        for (int i = 0; i < LineCounter; i++)
        {
            reader.ReadLine();
        }
        // settings for CsvHelper
        var config = new CsvConfiguration(CultureInfo.InvariantCulture)
        {
            Delimiter = ";", // set delimiter
        };
        var csv = new CsvReader(reader, config);
        var DataRead = new CsvDataReader(csv);
        InputData.DataTable.Load(DataRead);
        // check for enough columns
        int ColumnCounter = InputData.DataTable.Columns.Count;
        if (ColumnCounter <= 2)
        {
            MessageBox.Show("Data has not enough columns!", "Wrong Data", MessageBoxButtons.OK, MessageBoxIcon.Warning);
            ev.Cancel = true;
        }
    };
    if (OFDReader.ShowDialog() == DialogResult.OK)
    {
        InputData.FilePath = OFDReader.FileName;
    }
    return InputData;
}
It appears you are making this more complicated than it has to be. For starters, it seems odd (at least to me) that you would bother with the FileOk delegate. I do not see what difference it makes if the user is presented with an OpenFileDialog once, twice or many times. Using a single OpenFileDialog for this just seems like a waste of effort.
If the user selects a file and it fails to meet the necessary requirements, then simply open another OpenFileDialog and let the user try again. Doing this in a single dialog is certainly possible; however, where else would you use it? This dialog is specific to a certain type of file, so there is little point in building extra machinery around this one requirement. I would think a simple method that loops until the user selects a valid file or cancels the OpenFileDialog would be an easier approach.
With that said, following your code is a little odd. The reason for your issue is that the code reads the file into the InputData.DataTable regardless of whether the file FAILS the data-points check OR the enough-columns check. Put a breakpoint on the line…
InputData.DataTable.Load(DataRead);
You will see that the DataTable is filled with the data even if the data has no “DataPoints.” After the above line of code executes the next few lines check to see if the DataTable has 2 or more columns of data. If there are not enough columns, then the code simply pops up a message box indicating this.
This appears straightforward; however, the InputData.DataTable STILL HAS THE DATA even if it was bad. The next time you call the above Load method, it will simply ADD the new table to the existing table: it will add columns if needed and append the rows to the bottom of the existing DataTable. Try opening several BAD files and then the good file, and you will see many added columns and rows.
I will assume that you may be under the impression that when you call…
ev.Cancel = true;
That the code stops right there and goes back to the first line in the delegate…
int LineCounter = 0;
… this would not be true. The code continues executing after ev.Cancel = true;.
This can be seen from the fact that you get extra columns and rows every time a BAD file is opened. A simple solution is to create a "new" InputData object just before you call the Load method. Something like…
InputData = new InputData();
InputData.DataTable.Load(DataRead);
This will fix the extra-columns issue; however, IF the user selects a BAD file, the error message pops up and the user clicks OK to go back to the open file dialog… THEN, if the user clicks the dialog's "Cancel" button, the BAD file will still be displayed in the grid. I am confident you may not want this behavior.
Without going into detail about some of the other strange aspects of the posted code, I proffer one other, possibly simpler, solution as described at the beginning. Granted, the code below uses multiple OpenFileDialogs; however, the user still cannot escape until they pick a valid file or cancel the dialog.
Much of the code below is taken from the existing posted code; however, it is structured differently. Initially some variables are created before we start an endless loop. Specifically, the CsvConfiguration variable config has some added properties set that avoid some code-crashing problems when reading the file. I am confident you will want to set up the CsvReader to handle these problems the way you want them handled.
Once inside the endless while loop, the code creates a new InputData object, initializes a new OpenFileDialog and sets its properties. Then the code displays the OpenFileDialog and when the dialog returns, the DialogResult result variable is set to the dialogs returned DialogResult.
If the dialog returns OK, the code checks whether the file is "empty". If the file is empty, a message box informs the user, and then we branch back up to the beginning of the loop. If the dialog result is Cancel, the code returns a "new" InputData object. The reason for the empty check is that an exception (No header record was found) will be thrown on the line…
DataRead = new CsvDataReader(csv);
if the file is empty.
I am confident that there may be some CsvHelper property that I missed that would prevent this “empty” file exception. If there is some better way to check for this “empty” file or prevent the exception, I am open to suggestions.
If the file is NOT empty, we continue by opening the file and reading its data as intended using the CsvDataReader. The idea is that IF the file reads correctly without errors and fits the requirements, then we will already have the InputData.DataTable set, and all that is left to do is set its FilePath property and return the InputData object.
Once we have the InputData.DataTable we can check the number of columns in it. If the number of columns is less than two (2), then we pop up the error message box to the user and loop back to the beginning of the while loop.
If the InputData.DataTable meets the two (2) or more columns requirement, then another check is made by looping through all the columns in the data table. If at least ONE (1) column name is "Date/Time" then we are done checking the requirements and simply set the InputData.FilePath property and return the InputData object.
If none of the column names in the InputData.DataTable is "Date/Time," then again we pop up the error message box and loop back to the beginning of the while loop.
It should be noted that if the file fails the number-of-columns test or the column-named-Date/Time test, then, as with your problem, the InputData.DataTable STILL HAS THE DATA. This is OK here since we will re-initialize a "new" InputData object when we loop back up to the beginning of the while loop.
Lastly, you do not show the InputData class; however, it appears to have at least two (2) properties: 1) a string FilePath and 2) a DataTable named DataTable, which looks odd and is ambiguous. I have renamed my InputData object's DataTable property to DT. The same ambiguity applies to the InputData variable, which I have changed to TempInputData.
Since the code may “potentially” create numerous InputData objects each time the user selects a BAD file, I have implemented the IDisposable interface in the InputData Class. This way we can use this Class in a using statement and properly dispose of the unused InputData objects the code creates. I hope I have implemented this correctly.
public class InputData : IDisposable {
    public DataTable DT;
    public string FilePath;
    private bool isDisposed;

    public InputData() {
        DT = new DataTable();
        FilePath = "";
    }

    public void Dispose() {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing) {
        if (isDisposed) {
            return;
        }
        if (disposing) {
            DT?.Dispose();
            FilePath = null;
        }
        isDisposed = true;
    }
}

private InputData GetInputDataFromSCV() {
    InputData TempInputData;
    OpenFileDialog OFDReader;
    string initialDirectory = @"D:\Test\CSV";
    DialogResult result;
    CsvConfiguration config = new CsvConfiguration(CultureInfo.InvariantCulture) {
        Delimiter = ";",
        IgnoreBlankLines = true,
        MissingFieldFound = null,
        BadDataFound = null
    };
    CsvReader csv;
    CsvDataReader DataRead;
    StreamReader readertmp;
    FileInfo fi;
    while (true) {
        using (TempInputData = new InputData()) {
            using (OFDReader = new OpenFileDialog()) {
                OFDReader.Filter = "CSV files|*.csv;";
                OFDReader.InitialDirectory = initialDirectory;
                result = OFDReader.ShowDialog();
                if (result == DialogResult.OK) {
                    fi = new FileInfo(OFDReader.FileName);
                    if (fi.Length != 0) {
                        using (readertmp = new StreamReader(OFDReader.FileName)) {
                            csv = new CsvReader(readertmp, config);
                            DataRead = new CsvDataReader(csv);
                            TempInputData.DT.Load(DataRead);
                            if (TempInputData.DT.Columns.Count > 2) {
                                foreach (DataColumn column in TempInputData.DT.Columns) {
                                    if (column.ColumnName == "Date/Time") {
                                        TempInputData.FilePath = OFDReader.FileName;
                                        return TempInputData;
                                    }
                                }
                                // if we get here we know a column named "Date/Time" was NOT found
                                MessageBox.Show("Data has no DataPoints !", "Wrong Data", MessageBoxButtons.OK, MessageBoxIcon.Warning);
                            }
                            else {
                                MessageBox.Show("Data has less than 2 columns?", "Wrong Data", MessageBoxButtons.OK, MessageBoxIcon.Warning);
                            }
                        }
                    }
                    else {
                        MessageBox.Show("File is empty!", "Wrong Data", MessageBoxButtons.OK, MessageBoxIcon.Warning);
                    }
                }
                else {
                    if (result == DialogResult.Cancel) {
                        return new InputData();
                    }
                }
            }
        }
    }
}
I hope this makes sense and helps.
I'm sorry for the inconvenience. Sometimes I really make things too complicated for myself. I have now solved it as follows:
if (result == DialogResult.Cancel)
{
    if (inputDataHistory.Loadcount != 0)
    {
        TempInputData.FilePath = inputDataHistory.FilePathCache;
        TempInputData.LineCounter = inputDataHistory.LinecounterCache;
        var reader = new StreamReader(TempInputData.FilePath);
        for (int i = 0; i < TempInputData.LineCounter; i++)
        {
            reader.ReadLine();
        }
        csv = new CsvReader(reader, config);
        DataRead = new CsvDataReader(csv);
        TempInputData.DT.Load(DataRead);
        TempInputData.IsDisposed = true;
        return TempInputData;
    }
    else
    {
        return new InputData();
    }
}
I don't know if it is the most efficient solution, but I now store the key variables in another class beforehand. When the user cancels, these are used to re-read the previously selected file.

Why is C# List<T> Add method very slow?

I have an ASP.NET MVC project using Dapper to read data from a database and I need to export to Excel.
Dapper is fast! ExecuteReader takes only 35 seconds.
But list.Add(InStock) takes too much time - over 1020 seconds!
Do you have any idea why this is?
public List<InStock> GetList(string stSeId, string edSeId, string stSeDay, string edSeDay, string qDate)
{
    List<InStock> list = new List<InStock>();
    InStock InStock = null;
    IDataReader reader;
    using (var conn = _connection.GetConnection())
    {
        try
        {
            conn.Open();
            //****************** Only 35 seconds *****
            reader = conn.ExecuteReader(fileHelper.GetScriptFromFile("GetInStock"),
                new { STSeId = stSeId, EDSeId = edSeId, STSeDay = stSeDay, EDSeDay = edSeDay, qDate = qDate });
            //*************************************
            //****************** Over 1020 seconds **********
            while (reader.Read())
            {
                InStock = new InStock();
                InStock.ColA = reader.GetString(reader.GetOrdinal("ColA"));
                InStock.ColB = reader.GetString(reader.GetOrdinal("ColB"));
                InStock.ColC = reader.GetString(reader.GetOrdinal("ColC"));
                list.Add(InStock);
            }
            //*********************************************
            return list;
        }
        catch (Exception err)
        {
            throw err;
        }
    }
}
It's the database.
From Retrieve data using a DataReader,
The DataReader is a good choice when you're retrieving large amounts of data because the data is not cached in memory.
The key clue for your performance concern is "because the data is not cached in memory". While this is strictly an implementation detail, each call to Read() fetches new data from the database, while the List<InStock>.Add() call just adds the new InStock to the list.
There are orders of magnitude of difference in processing time between disk access (even SSDs) and RAM, and orders of magnitude of difference again between network requests and disk access. There is no conceivable way that anything other than the database access is causing most of your run time.
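That said, one cheap tweak inside the loop: cache the column ordinals once instead of calling GetOrdinal for every row. It will not fix the database-bound cost, but it removes three string lookups per row (sketch):
int ordA = reader.GetOrdinal("ColA");
int ordB = reader.GetOrdinal("ColB");
int ordC = reader.GetOrdinal("ColC");
while (reader.Read())
{
    var item = new InStock
    {
        ColA = reader.GetString(ordA),
        ColB = reader.GetString(ordB),
        ColC = reader.GetString(ordC)
    };
    list.Add(item);
}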
--
As a side note, you're going to exceed the maximum number of rows in an Excel worksheet (1,048,576).

How to get the file size of all datasets in Data Factory - especially Data Lake Store and Blob?

We have many different pipelines in Azure Data Factory with many datasets, mainly Azure Data Lake Store and Azure Blob datasets. I want to know the file size of all files (from all datasets of all pipelines). I am able to iterate over all the datasets from all the pipelines using DataFactoryManagementClient in C#, but when I try to read the fileName or folderName of a dataset, I get null. You can see my code below:
private static void GetDataSetSize(DataFactoryManagementClient dataFactoryManagementClient)
{
    string resourceGroupName = "resourceGroupName";
    foreach (var dataFactory in dataFactoryManagementClient.DataFactories.List(resourceGroupName).DataFactories)
    {
        var linkedServices = new List<LinkedService>(dataFactoryManagementClient.LinkedServices.List(resourceGroupName, dataFactory.Name).LinkedServices);
        var datasets = dataFactoryManagementClient.Datasets.List(resourceGroupName, dataFactory.Name).Datasets;
        foreach (var dataset in datasets)
        {
            var lsTypeProperties = linkedServices.First(ls => ls.Name == dataset.Properties.LinkedServiceName).Properties.TypeProperties;
            if (lsTypeProperties.GetType() == typeof(AzureDataLakeStoreLinkedService))
            {
                AzureDataLakeStoreLinkedService outputLinkedService = lsTypeProperties as AzureDataLakeStoreLinkedService;
                var folder = GetBlobFolderPathDL(dataset);
                var file = GetBlobFileNameDL(dataset);
            }
        }
    }
}

public static string GetBlobFolderPathDL(Dataset dataset)
{
    if (dataset == null || dataset.Properties == null)
    {
        return string.Empty;
    }
    AzureDataLakeStoreDataset dlDataset = dataset.Properties.TypeProperties as AzureDataLakeStoreDataset;
    if (dlDataset == null)
    {
        return string.Empty;
    }
    return dlDataset.FolderPath;
}

public static string GetBlobFileNameDL(Dataset dataset)
{
    if (dataset == null || dataset.Properties == null)
    {
        return string.Empty;
    }
    AzureDataLakeStoreDataset dlDataset = dataset.Properties.TypeProperties as AzureDataLakeStoreDataset;
    if (dlDataset == null)
    {
        return string.Empty;
    }
    return dlDataset.FileName;
}
With this, I want to build a monitoring tool that tells me how the data is growing for each file/dataset.
FYI - I am also going to monitor the retries and failures of each slice. I can get that information without any issue, but the problem now is getting the file name and folder path, because the API returns null for them (it seems to be a bug in the API). Once I have the folder and file path, I will get the file size of those files using DataLakeStoreFileSystemManagementClient. I am planning to ingest all this data (size, fileName, retries, failures etc.) into a SQL database, and on top of it I will generate reports that tell me how my data is growing daily or hourly.
I want to make it generic, in such a way that if I add a new dataset or pipeline in the future, I get the size of all newly added datasets as well without changing any code.
Please help me figure out how I can achieve this, and suggest an alternate way if possible.
Just place this code in your main method and execute it. You should be able to see your datasets' folder paths and file names. Use this and adapt it to your requirements.
Hope this helps!
foreach (var dataFactory in dataFactoryManagementClient.DataFactories.List(resourceGroupName).DataFactories)
{
    var datasets = dataFactoryManagementClient.Datasets.List(resourceGroupName, dataFactory.Name).Datasets;
    foreach (var dataset in datasets)
    {
        var lsTypeProperties = dataFactoryManagementClient.Datasets.Get(resourceGroupName, dataFactory.Name, dataset.Name);
        if (lsTypeProperties.Dataset.Properties.TypeProperties.GetType() == typeof(AzureDataLakeStoreDataset))
        {
            AzureDataLakeStoreDataset OutputDataSet = lsTypeProperties.Dataset.Properties.TypeProperties as AzureDataLakeStoreDataset;
            Console.WriteLine(OutputDataSet.FolderPath);
            Console.WriteLine(OutputDataSet.FileName);
            Console.ReadKey();
        }
    }
}
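For the follow-up step of reading the actual file sizes, here is a hedged sketch of the DataLakeStoreFileSystemManagementClient call the question mentions. The exact method and property names are assumptions based on the Microsoft.Azure.Management.DataLake.Store SDK and may differ between SDK versions; adlsAccountName and credentials are placeholders:
var fsClient = new DataLakeStoreFileSystemManagementClient(credentials);
// FolderPath/FileName come from the dataset, as shown above
string fullPath = OutputDataSet.FolderPath + OutputDataSet.FileName;
var status = fsClient.FileSystem.GetFileStatus(adlsAccountName, fullPath);
long sizeInBytes = status.FileStatus.Length ?? 0; // file size to ingest into the monitoring db
Console.WriteLine(fullPath + ": " + sizeInBytes + " bytes");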

EventLogQuery does not return new event log records

I have a task to poll the application event log periodically to check for new entries and process them, later exporting them to an .evtx file.
This task is not a problem.
I am using code like this:
using (var els = new EventLogSession())
{
    string timeString = timestamp.ToString("yyyy-MM-ddTHH:mm:ss.fffZ");
    string queryString = String.Format(
        "<QueryList> <Query Id='0' Path='{0}'> <Select Path='{0}'> " +
        "*[System[TimeCreated[@SystemTime>'{1}']]]</Select> </Query> </QueryList>",
        logName, timeString);
    var query = new EventLogQuery(logName, PathType.LogName, queryString) { Session = els };
    var records = new List<EventRecord>();
    using (var logReader = new EventLogReader(query))
    {
        EventRecord record;
        while ((record = logReader.ReadEvent()) != null)
        {
            // log entries processed here
            ...
            // finally export log messages to a file
            els.ExportLogAndMessages(logName, PathType.LogName, queryString, fileName, true, CultureInfo.GetCultureInfo("en"));
        }
    }
}
Unfortunately, I found out that after restarting my PC and starting the application, EventLogReader always returns the same set of messages, even if I restart my application. That means new messages do not appear in the results yielded by the logReader.ReadEvent() method.
However, changing the query string to a simple asterisk and passing it to EventLogQuery resolves the situation: logReader then returns all messages, including new ones.
For now I am sticking with the "*" query string and filtering out old entries in code, but this does not seem like the best solution to me.
Is it a mistake in my query? Is it a mistake with my handling of EventLogSession or EventLogReader objects? Is it a known MS bug?
I know this is an old question, but it doesn't have an answer and I've been working in this area lately.
Possible issues with this code:
Use UTC time for your date, and .ToString("o") instead of .ToString("yyyy-MM-ddTHH:mm:ss.fffZ").
Don't include any of the extraneous XML in the query string. You only need the query bit. So it should be:
string queryString = String.Format(
    "*[System[TimeCreated[@SystemTime>'{0}']]]",
    timeString);
Consider using an EventBookmark instead of constructing a complex query expression for date filtering. EventBookmarks were designed for that purpose. See How to create an EventBookmark when querying the event log for an example of using EventBookmark for a very similar kind of purpose.
Once you have an EventBookmark, you simply use:
using (var logReader = new EventLogReader(query, bookmark))
to ensure that your reader starts at the entry after the previously saved bookmark (bookmark here is the EventBookmark instance you saved).
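A minimal sketch of the bookmark round trip (the bookmark comes from the last EventRecord you processed; persist it between runs if you need it to survive restarts):
EventBookmark bookmark = null;
var query = new EventLogQuery(logName, PathType.LogName, "*");

// first pass: read everything currently in the log
using (var logReader = new EventLogReader(query))
{
    EventRecord record;
    while ((record = logReader.ReadEvent()) != null)
    {
        // ... process record ...
        bookmark = record.Bookmark; // remember the last position
    }
}

// later passes: resume at the entry after the bookmark
// (bookmark must not be null here - fall back to the first-pass reader if it is)
using (var logReader = new EventLogReader(query, bookmark))
{
    EventRecord record;
    while ((record = logReader.ReadEvent()) != null)
    {
        bookmark = record.Bookmark; // only events since the bookmark appear here
    }
}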

What is wrong with my program logic?

Please help me find the defect in my logic. I have two variables named "prev" and "next". What I am basically doing is reading data from my database every 5 seconds and printing it out using the WebSync server if next and prev are NOT equal. I have two rows in my database. It looks like:
ID
8
10
Here is the link to the code http://pastebin.com/Hb3eH2Qv
When I run my program, I get the result:
8 10 8 10
8 10
8 10 8 10
8 10
..... (so on)
But the result should be just:
8 10
I don't know how 8 10 8 10 appears. The data gets concatenated twice.
NOTE: You only need to look at the PublishLoop() function.
private void PublishLoop()
{
    String prev = String.Copy("");
    String next = String.Copy("");
    String ConnectionString = ConfigurationManager.ConnectionStrings["MyDbConn"].ToString();
    SqlConnection connection = new SqlConnection(ConnectionString);
    SqlCommand command = connection.CreateCommand();
    command.CommandText = "select ID from Tab1";
    command.Notification = null;
    while (Running)
    {
        connection.Open();
        using (SqlDataReader reader = command.ExecuteReader(CommandBehavior.CloseConnection))
        {
            StreamWriter sw1 = new StreamWriter("C:\\Users\\Thothathri\\Desktop\\next.txt");
            while ((reader.Read()))
            {
                //Response.Write(reader[0].ToString());
                next = String.Concat(next, reader[0].ToString());
                sw1.WriteLine(next);
            }
            sw1.Close();
            if (!prev.Equals(next))
            {
                Publisher publisher = new Publisher(new PublisherArgs
                {
                    DomainKey = "c80cb405-eb77-4574-9405-5ba51832f5e6",
                    DomainName = "localhost"
                });
                Publication publication = publisher.Publish("/test", JSON.Serialize(next));
                if (publication.Successful == true)
                {
                    StreamWriter sw = new StreamWriter("C:\\Users\\Thothathri\\Desktop\\error123.txt");
                    sw.WriteLine("success");
                    sw.WriteLine(next);
                    sw.Close();
                }
                else
                {
                    StreamWriter sw = new StreamWriter("C:\\Users\\Thothathri\\Desktop\\error123.txt");
                    sw.Write("failed");
                    sw.Close();
                }
                prev = String.Copy(next);
                next = String.Copy("");
            }
        }
        Thread.Sleep(5000);
    }
}
Renuiz answered it in a comment, but it is because you're not clearing next.
So you build the string "8 10" in next and store it in prev. The next time around you concatenate "8 10" onto next again, making "8 10 8 10", which is different from prev, so you print it.
if (!prev.Equals(next))
{
    ....
    prev = String.Copy(next);
    next = String.Copy("");
}
This is the end of that loop. You should really be clearing next at the beginning of that loop.
Also, you can simply set the string to empty:
next = String.Empty;
I would declare next inside your while loop, as you don't need it in the greater scope, and I would call it current rather than next.
What is really wrong with your program logic is that the logic is not obvious. It is so obscure that you can't see where the error is. So my advice is this: if you can't find the error, try to simplify your code.
Currently your method has many responsibilities: it queries the database, it dumps data to a file, it publishes data somewhere and it logs the results. And you are stuck with all that stuff. If someone needs to change the database query or the publishing logic, they will have to review all the other stuff too.
So, separate the logic first:
private void PublishLoop()
{
    string previousIDs = String.Empty;
    int timeout = Int32.Parse(ConfigurationManager.AppSettings["publishTimeout"]);
    while (Running)
    {
        string currentIDs = ConcatenateList(LoadIDs());
        Dump(currentIDs);
        if (!previousIDs.Equals(currentIDs))
        {
            try
            {
                Publish(currentIDs);
                _log.Info("Published successfully");
            }
            catch (PublicationException exception)
            {
                _log.Error("Publication failed");
            }
            previousIDs = currentIDs;
        }
        Thread.Sleep(timeout);
    }
}
Well, I don't know much about your domain, so you can probably think of better names for the variables and methods.
Here the data access logic is extracted into a separate method (that is fine as a first refactoring step and for small applications). Keep in mind that wrapping the connection object in a using block guarantees that the connection is closed even in case of an exception:
private IList<int> LoadIDs()
{
    List<int> ids = new List<int>();
    String connectionString = ConfigurationManager.ConnectionStrings["MyDbConn"].ConnectionString;
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        SqlCommand command = connection.CreateCommand();
        command.CommandText = "select ID from Tab1";
        command.Notification = null;
        connection.Open();
        using (SqlDataReader reader = command.ExecuteReader(CommandBehavior.CloseConnection))
        {
            while ((reader.Read()))
                ids.Add((int)reader["ID"]);
        }
    }
    return ids;
}
Next, a simple method for concatenating the ids into one string:
private string ConcatenateList(IList<int> values)
{
    return String.Join(" ", values.Select(value => value.ToString()).ToArray());
}
Dumping (note that the file name has moved to the configuration file):
private void Dump(string ids)
{
    using (StreamWriter writer = new StreamWriter(ConfigurationManager.AppSettings["dumpFilePath"]))
        writer.WriteLine(ids);
}
And publishing logic:
private void Publish(string ids)
{
    PublisherArgs args = new PublisherArgs
    {
        DomainKey = "c80cb405-eb77-4574-9405-5ba51832f5e6",
        DomainName = "localhost"
    };
    Publisher publisher = new Publisher(args);
    Publication publication = publisher.Publish("/test", JSON.Serialize(ids));
    if (!publication.Successful)
        throw new PublicationException();
}
I think that failures are exceptional and do not occur very often (so I decided to use exceptions for that case). But if failure is something ordinary, you can simply use a boolean method like TryPublish, sketched below.
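In that case, a TryPublish variant of the Publish method above could look like this (sketch):
private bool TryPublish(string ids)
{
    PublisherArgs args = new PublisherArgs
    {
        DomainKey = "c80cb405-eb77-4574-9405-5ba51832f5e6",
        DomainName = "localhost"
    };
    Publisher publisher = new Publisher(args);
    Publication publication = publisher.Publish("/test", JSON.Serialize(ids));
    return publication.Successful;
}

// usage in PublishLoop:
// if (TryPublish(currentIDs)) _log.Info("Published successfully");
// else _log.Error("Publication failed");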
BTW, you can use a logging library like log4net for logging successful and failed publishing. Or you can extract the logging logic into a separate method - this will make the primary logic cleaner and easier to understand.
PS: try to avoid comparing boolean variables with true/false (publication.Successful == true) - you can accidentally assign a value to your variable by typing = instead of ==.
