How to create a fresh database (every time) before tests run from a schema file?
You can use the SchemaExport class in NHibernate to do this in code:
var schema = new SchemaExport(config);
schema.Drop(true, true);
schema.Execute(true, true, false);
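If it helps, here is a minimal sketch of wiring this into a test fixture so the schema is rebuilt before every test (NUnit shown; BuildConfiguration() is a hypothetical helper that returns your mapped NHibernate Configuration):
using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;
using NUnit.Framework;

[TestFixture]
public class PersistenceTests
{
    private Configuration _config;

    [SetUp]
    public void RecreateSchema()
    {
        _config = BuildConfiguration();    // hypothetical helper that loads mappings and connection settings
        var schema = new SchemaExport(_config);
        schema.Drop(true, true);           // script to console and execute the drop
        schema.Execute(true, true, false); // script, execute, justDrop = false -> recreate the schema
    }
}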
Drop the entire database - don't drop table by table; that adds too much maintenance overhead.
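For example, a rough sketch of dropping and recreating the whole test database from a master connection (uses System.Data.SqlClient; the connection string and database name are placeholders):
using (var conn = new SqlConnection(@"Server=.\SQLEXPRESS;Database=master;Integrated Security=true")) // placeholder connection string
{
    conn.Open();
    using (var cmd = new SqlCommand("IF DB_ID('MyTestDb') IS NOT NULL DROP DATABASE MyTestDb", conn))
    {
        cmd.ExecuteNonQuery();
    }
    using (var cmd = new SqlCommand("CREATE DATABASE MyTestDb", conn))
    {
        cmd.ExecuteNonQuery();
    }
    // then run your schema script against the freshly created database
}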
I have used the following utility methods for running SQL scripts to set up databases and test data in a project I work on every now and then. They have worked rather well:
internal static void RunScriptFile(SqlConnection conn, string fileName)
{
    using (FileStream stream = File.OpenRead(fileName))
    {
        using (StreamReader reader = new StreamReader(stream))
        {
            StringBuilder sb = new StringBuilder();
            string line = string.Empty;
            while (!reader.EndOfStream)
            {
                line = reader.ReadLine();
                // A line consisting of "GO" terminates the current batch:
                // run what has been collected so far and start a new batch.
                if (string.Compare(line.Trim(), "GO", StringComparison.InvariantCultureIgnoreCase) == 0)
                {
                    RunCommand(conn, sb.ToString());
                    sb.Length = 0;
                }
                else
                {
                    sb.AppendLine(line);
                }
            }
        }
    }
}
private static void RunCommand(SqlConnection connection, string commandString)
{
    using (SqlCommand command = new SqlCommand(commandString, connection))
    {
        try
        {
            command.ExecuteNonQuery();
        }
        catch (Exception ex)
        {
            Console.WriteLine(string.Format("Exception while executing statement: {0}", commandString));
            Console.WriteLine(ex.ToString());
        }
    }
}
I have used the Database Publishing Wizard to generate SQL scripts (and in some cases edited them to include only the data I want to use in the test), and just pass the script file paths into the RunScriptFile method before the tests. The method parses the script file and executes each part that is separated by a GO line separately (I found that this greatly helped in troubleshooting errors that happened while running the SQL scripts).
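For example, a typical call before the tests might look roughly like this (the connection string and script paths are placeholders):
using (var conn = new SqlConnection(testConnectionString)) // placeholder connection string
{
    conn.Open();
    RunScriptFile(conn, @"Scripts\CreateSchema.sql"); // placeholder script paths
    RunScriptFile(conn, @"Scripts\TestData.sql");
}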
It has been a while since I wrote the code, but I think it requires that the script file ends with a GO line in order for the last part of it to be executed.
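If you want to drop that requirement, a small addition after the read loop in RunScriptFile should flush whatever is left in the buffer (just a sketch):
// After the while loop: run any trailing statements that were not
// followed by a final GO line.
if (sb.Length > 0)
{
    RunCommand(conn, sb.ToString());
    sb.Length = 0;
}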
Have a look at these posts.
Ayende Rahien - nhibernate-unit-testing
Scott Muc - unit-testing-domain-persistence-with-ndbunit-nhibernate-and-sqlite
I have found them to be very useful; basically they extend the example by Mike Glenn.
I use Proteus (Unit Test Utility), available on Google code here :
http://code.google.com/p/proteusproject/
You create a set of data. Each time you run a unit test, the current data is saved, the reference data set is loaded, and you always test against the same set of data. At the end, the original data is restored.
Very powerful.
HTH
I have a .NET Core application which is multithreaded. One aspect of the application is a health check which parses a log file for errors. This is the code used to access it:
using StreamReader reader = new StreamReader(GetLogFile);
I noticed that I occasionally get this error:
2021-01-12 11:15:14.890Z ERROR APP=2227 COMP=3789 [16] Health check Check logs for application issues threw an unhandled exception after 96.2407ms - Logger=Microsoft.Extensions.Diagnostics.HealthChecks.DefaultHealthCheckService,Level=ERROR,ThreadId=16,,Exception="System.IO.IOException: The process cannot access the file 'c:\apps\Cb.Publisher\Logs\Cb.Publisher.log' because it is being used by another process.
I changed my code to this:
using StreamReader reader = new StreamReader(File.OpenRead(GetLogFile));
In testing it I haven't encountered the issue but it occurred so rarely that I am not 100% sure it's resolved. Is my change likely to resolve this issue or is there a better way to do it?
Additional Info
This is the entire function:
private int LogLine(Regex reg)
{
    GetLogFile = DefaultLogFile.GetLogFileName();
    using StreamReader reader = new StreamReader(File.OpenRead(GetLogFile));
    string line;
    int lineNo = 0;
    int errorLine = 0;
    while ((line = reader.ReadLine()) != null)
    {
        Match match = reg.Match(line);
        if (match.Success)
        {
            errorLine = lineNo;
        }
        lineNo++;
    }
    return errorLine;
}
If I set a breakpoint on the while line in Visual Studio, run the function, and then try to edit the file in Notepad, it fails with the error The process cannot access the file because it is being used by another process.
After some investigation I'm wondering if this line could actually be the cause of my problems:
var fileTarget = (FileTarget)LogManager.Configuration.FindTargetByName("file-target");
It's in DefaultLogFile.GetLogFileName:
public string GetLogFileName()
{
    var fileTarget = (FileTarget)LogManager.Configuration.FindTargetByName("file-target");
    var logEventInfo = new LogEventInfo();
    string fileName = fileTarget.FileName.Render(logEventInfo);
    if (!File.Exists(fileName))
    {
        throw new Exception("Log file does not exist.");
    }
    return fileName;
}
Your currently suggested solution will likely be enough, yes:
using StreamReader reader = new StreamReader(File.OpenRead(GetLogFile));
However, the proper solution is for you to check that the file is not being locked from wherever else you are using the file. This also means your logging framework (be it Log4Net, NLog, Serilog etc.) should be properly configured to not take an exclusive lock on the log file. I believe logging frameworks usually do not lock it from read access by default, so unless you have customized the configuration the logging framework should not be a problem.
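If you want to be explicit about the sharing mode when reading the log, you can open the stream yourself instead of relying on File.OpenRead (a sketch using the same GetLogFile path from the question):
// FileShare.ReadWrite lets the reader open the file even while the
// logger still holds a write handle on it.
using FileStream stream = new FileStream(GetLogFile, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
using StreamReader reader = new StreamReader(stream);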
I have a C# web application and I would like to be able to see errors being thrown in the Service.asmx.cs section, i.e. when the SQL statement fails or I spelled a variable name wrong, etc. I'm fairly new to .NET, but the template for the app that I have has log4net installed.
protected static log4net.ILog Log = EarthSoft.Common.log4net.GetLogger(typeof (Service));
Then my get method is :
[WebMethod(Description = "Get a list of facilities")]
public List<Facility> GetFacilities()
{
    try
    {
        var context = EarthSoft.Server.Helpers.RequestContext.GetRequestContext(true);
        if (context.Connection == null || context.User == null) return null;
        var lst = new List<Facility>();
        string sql = string.Format("select analyte_full,conc from dt_biological");
        var cmd = context.Connection.CreateCommand(sql);
        context.Connection.PrepareTextCommand(cmd);
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                lst.Add(new Facility()
                {
                    name = (string)reader.GetValue(0),
                    distance = (decimal)reader.GetValue(1)
                });
            }
        }
        context.Connection.Close();
        return lst;
    }
    catch (Exception ex)
    {
        Log.Error(ex.Message, ex);
        return null;
    }
}
So the error message is appended to the log here, but I have no idea how to access it or where it writes the log. Is it possible the log output is saved to a table in the SQL database somewhere?
Log4net is highly configurable, and to specify "where" to log it uses the concept of "appenders". There are many appenders already available that write to files/databases/queues, and you can write your own if needed (normally you won't have to).
Anyway, start by looking at the examples here: https://logging.apache.org/log4net/release/config-examples.html
Somewhere in your code, typically in the application initialization, you have to call:
XmlConfigurator.Configure()
This will read the configuration from the application configuration file.
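An equivalent standard way to trigger that is the log4net assembly-level attribute (usually placed in AssemblyInfo.cs):
// Alternative to calling XmlConfigurator.Configure() explicitly at startup.
[assembly: log4net.Config.XmlConfigurator(Watch = true)]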
Alternatively, you can configure log4net programmatically, but that can be somewhat advanced if you have never used it before, so I suggest starting with the examples above and using configuration from a file.
If no configuration is supplied, log4net works perfectly but... no logs are produced at all, and I think this is the reason you can't find any log at the moment.
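For completeness, if you do go the programmatic route, a minimal sketch might look roughly like this (the file name and layout pattern are just placeholders; the XML configuration file is still the simpler option):
using log4net.Appender;
using log4net.Config;
using log4net.Layout;

var layout = new PatternLayout("%date [%thread] %-5level %logger - %message%newline");
layout.ActivateOptions();

var appender = new FileAppender
{
    File = "app.log",       // placeholder path
    AppendToFile = true,
    Layout = layout
};
appender.ActivateOptions();

// From this point on, Log.Error(...) calls end up in app.log.
BasicConfigurator.Configure(appender);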
Ok, I have a really weird situation happening here. First I need to give some background. I'm creating AI agents for a game that was made on the XNA engine. The way things are set up, people are supposed to use the agent's framework to generate a .dll that the game then uses to load the agents when it runs.
I have access to the code of the game (so I can see what's happening) and at this point I'm using someone else's agents as a starting point for my own. Recently there were a few changes to the game (and consequently the framework), mostly in names of classes and interfaces, which means I have to bring the agents up to speed. So, after I made all the updates necessary to be able to compile the agents with the new version of the framework, I ran into a problem. This is the code for the game loading the .dll:
// dynamically load assembly from file GeometryFriendsAgents.dll
Assembly agentsDLL = Assembly.LoadFile(path);
// get type of classes BallAgent and SquareAgent from the just-loaded assembly
Type circleType = agentsDLL.GetType("GeometryFriendsAgents.CircleAgent");
Type rectangleType = agentsDLL.GetType("GeometryFriendsAgents.RectangleAgent");
try
{
    // create instances of classes BallAgent and SquareAgent
    npcCircle = (ICircleAgent)Activator.CreateInstance(circleType);
    npcRectangle = (IRectangleAgent)Activator.CreateInstance(rectangleType);
}
catch (TargetInvocationException e)
{
    throw e.InnerException;
}
I can confirm that the path is correct. The lines inside the try/catch will throw a TargetInvocationException when I try to run the game (which will automatically load the agents). I added the try/catch to see the inner exception, which is a FormatException, and Visual Studio gives the additional information that the input string was not in the correct format.
I don't know what part of the agents' code would be relevant for this, but I have yet to get to the weird part. In the implementation I'm using, the agents make use of a LearningCenter class. This class essentially reads and writes the learning files of the agents. At the start of the class it stores the path for the learning files:
protected const string path = @"..\..\..\..\Agents\";
So here's where things get weird. This is the correct path for the learning files. Earlier, when I made a mistake, I had this path (which before was repeated many times throughout the code) as:
protected const string path = @"..\..\..\..\Agents";
When I build the .dll with the incorrect path, I can successfully load the agents and it will run the game. The problem then is that the path is incorrect, and when the LearningCenter tries to write the learning file, it will evidently fail with a DirectoryNotFoundException. The method in question is:
public void EndGame(float knownStatesRatio) {
    if (_toSave) {
        FileStream fileStream = new FileStream(path + _learningFolder + "\\Ratios.csv", FileMode.Append);
        StreamWriter sw = new StreamWriter(fileStream);
        sw.WriteLine(knownStatesRatio);
        sw.Close();
        fileStream.Close();
        fileStream = new FileStream(path + _learningFolder + "\\IntraPlatformLearning.csv", FileMode.Create);
        DumpLearning(fileStream, _intraplatformPlayedStates);
        fileStream.Close();
        if (interPlatform) {
            fileStream = new FileStream(path + _learningFolder + "\\InterPlatformLearning.csv", FileMode.Create);
            DumpLearning(fileStream, _interplatformPlayedStates);
            fileStream.Close();
        }
    }
}
The exception occurs immediately when creating the new FileStream. I've tried shifting the missing \ to the _learningFolder variable, but when I do, it goes back to the first problem. So long as the path is incorrect, I can run the game...
I should also mention that before this I initially encountered another TargetInvocationException at the same location. At the time the problem was fixed by changing the visibility of the agent classes to public.
I realize that the thing with the path is probably hiding the actual problem, but I just don't know where to look next.
Edit: Here's the stack trace for the first problem:
GeometryFriends.exe!GeometryFriends.AI.AgentsManager.LoadAgents() Line 396
GeometryFriends.exe!GeometryFriends.Levels.SinglePlayerLevel.LoadLevelContent() Line 78
GeometryFriends.exe!GeometryFriends.Levels.Level.LoadContent() Line 262
GeometryFriends.exe!GeometryFriends.ScreenSystem.ScreenManager.LoadContent() Line 253
Microsoft.Xna.Framework.Game.dll!Microsoft.Xna.Framework.DrawableGameComponent.Initialize()
GeometryFriends.exe!GeometryFriends.ScreenSystem.ScreenManager.Initialize() Line 221
Microsoft.Xna.Framework.Game.dll!Microsoft.Xna.Framework.Game.Initialize()
GeometryFriends.exe!GeometryFriends.Engine.Initialize() Line 203
Microsoft.Xna.Framework.Game.dll!Microsoft.Xna.Framework.Game.RunGame(bool useBlockingRun)
Microsoft.Xna.Framework.Game.dll!Microsoft.Xna.Framework.Game.Run()
GeometryFriends.exe!GeometryFriends.Program.Main(string[] args) Line 16
The agent that's failing first is the CircleAgent; here's the constructor:
public CircleAgent() {
    //Change flag if agent is not to be used
    SetImplementedAgent(true);
    lastMoveTime = DateTime.Now;
    lastRefreshTime = DateTime.Now;
    currentAction = 0;
    rnd = new Random(DateTime.Now.Millisecond);
    model = new CircleWorldModel(this);
    learningCenter = new CircleLearningCenter(model);
    learningCenter.InitializeLearning();
    startTime = DateTime.Now;
}
Edit 2: OK, I managed to zero in on the source of the FormatException.
The error occurs in this method of the CircleLearningCenter (the statement in the first if):
public override void addStateMovementValue(string[] lineSplit, string stateId, ref Dictionary<string, Dictionary<int, double>> lessons) {
    if (!lineSplit[1].Equals("0")) {
        lessons[stateId].Add(Moves.ROLL_LEFT, double.Parse(lineSplit[1]));
    }
    if (!lineSplit[2].Equals("0")) {
        lessons[stateId].Add(Moves.ROLL_RIGHT, double.Parse(lineSplit[2]));
    }
    if (!lineSplit[3].Equals("0")) {
        lessons[stateId].Add(Moves.JUMP, double.Parse(lineSplit[3]));
    }
}
Which is called by this method in the LearningCenter:
private void createLearningFromFile(FileStream fileStream, ref Dictionary<string, Dictionary<int, double>> lessons) {
    lessons = new Dictionary<string, Dictionary<int, double>>();
    StreamReader sr = new StreamReader(fileStream);
    string line;
    while ((line = sr.ReadLine()) != null) {
        string[] lineSplit = line.Split(',');
        string stateId = lineSplit[0];
        lessons.Add(stateId, new Dictionary<int, double>());
        addStateMovementValue(lineSplit, stateId, ref lessons);
    }
}
which in turn is called by this method (which is called in the constructor of the circle):
public void InitializeLearning() {
    if (File.Exists(Path.Combine(Path.Combine(path, _learningFolder), "IntraPlatformLearning.csv"))) {
        FileStream fileStream = new FileStream(Path.Combine(Path.Combine(path, _learningFolder), "IntraPlatformLearning.csv"), FileMode.Open);
        createLearningFromFile(fileStream, ref _intraplatformLessonsLearnt);
        fileStream.Close();
    } else {
        createEmptyLearning(ref _intraplatformLessonsLearnt);
    }
    if (File.Exists(Path.Combine(Path.Combine(path, _learningFolder), "InterPlatformLearning.csv"))) {
        FileStream fileStream = new FileStream(Path.Combine(Path.Combine(path, _learningFolder), "InterPlatformLearning.csv"), FileMode.Open);
        createLearningFromFile(fileStream, ref _interplatformLessonsLearnt);
        fileStream.Close();
    } else {
        createEmptyLearning(ref _interplatformLessonsLearnt);
    }
}
In case it's not apparent, CircleLearningCenter is a subclass of LearningCenter. Also, sorry for the wall of text, but I'm at my wits' end.
Use System.IO.Path.Combine() to concatenate path parts. For example, instead of:
FileStream(path + _learningFolder + "\\Ratios.csv")
use:
FileStream(Path.Combine(Path.Combine(path , _learningFolder) , "Ratios.csv"))
Just don't forget to remove \\ from each part.
And do the same for other FileStream paths.
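If the project targets .NET 4 or later, Path.Combine also has an overload that takes several parts at once, so the nesting isn't needed (a sketch reusing the fields from the EndGame method above):
string ratiosPath = Path.Combine(path, _learningFolder, "Ratios.csv");
using (var fileStream = new FileStream(ratiosPath, FileMode.Append))
using (var sw = new StreamWriter(fileStream))
{
    sw.WriteLine(knownStatesRatio);
}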
This is a C#/VSTO program. I've been working on a data capture project. The scope is basically 'process Excel files sent by a variety of third party companies.' Practically, this means:
Locate columns that contain the data I want through a search method.
Grab data out of the workbooks
Clean the data, run some calculations, etc
Output cleaned data into new workbook
The program I have written works great for small-to-medium data sets, ~25 workbooks with a combined total of ~1000 rows of relevant data. I'm grabbing 7 columns of data out of these workbooks. One edge case I have, though, is occasionally I will need to run a much larger data set, ~50 workbooks with a combined total of ~8,000 rows of relevant data (and possibly another ~2000 rows of duplicate data that I also have to remove).
I am currently putting a list of the files through a Parallel.ForEach loop inside of which I open a new Excel.Application() to process each file with multiple ActiveSheets. The parallel process runs much faster on the smaller data set than going through each one sequentially. But on the larger data set, I seem to hit a wall.
I start getting the message: Microsoft Excel is waiting for another application to complete an OLE action and eventually it just fails. Switching back to sequential foreach does allow the program to finish, but it just grinds along - going from 1-3 minutes for a Parallel medium sized data set to 20+ minutes for a sequential large data set. If I mess with ParallelOptions.MaxDegreeOfParallelism set to 10 it will complete the cycle, but still take 15 minutes. If I set it to 15, it fails. I also really don't like messing with TPL settings if I don't have to. I've also tried inserting a Thread.Sleep to just manually slow things down, but that only made the failure happen further out.
I close the workbook, quit the application, then ReleaseComObject to the Excel object and GC.Collect and GC.WaitForPendingFinalizers at the end of each loop.
My ideas at the moment are:
Split the list in half and run them separately
Open some number of new Excel.Application() in parallel, but run a list of files sequentially inside of that Excel instance (so kinda like #1, but using a different path)
Separate the list by file size, and run a small set of very large files independently/sequentially; run the rest as I have been
Things I am hoping to get some help with:
Suggestions on making real sure my memory is getting cleared (maybe Process.Id is getting twisted up in all the opening and closing?)
Suggestions on ordering a parallel process - I'm wondering if I can throw the 'big' guys in first, that will make the longer-running process more stable.
I have been looking at: http://reedcopsey.com/2010/01/26/parallelism-in-net-part-5-partitioning-of-work/ and he says "With prior knowledge about your work, it may be possible to partition data more meaningfully than the default Partitioner." But I'm having a hard time really knowing what/if partitioning makes sense.
Really appreciate any insights!
UPDATE
So as a general rule I test against Excel 2010, as we have both 2010 and 2013 in use here. I ran it against 2013 and it works fine - run time about 4 minutes, which is about what I would expect. Before I just abandon 2010 compatibility, any other ideas? The 2010 machine is a 64-bit machine with 64-bit Office, and the 2013 machine is a 64-bit machine with 32-bit Office. Would that matter at all?
A few years ago I worked with Excel files and automation. I had problems then with zombie processes in Task Manager. Although our program ended and I thought I had quit Excel properly, the processes were not quitting.
The solution was not something I liked, but it was effective. I can summarize it like this:
1) Never use two dots consecutively, like:
workBook.ActiveSheet.PageSetup
Instead, use variables; when you are done, release and null them.
For example, instead of doing this:
m_currentWorkBook.ActiveSheet.PageSetup.LeftFooter = str.ToString();
follow the practices in this function (it adds a barcode to the Excel footer):
private bool SetBarcode(string text)
{
    Excel._Worksheet sheet;
    sheet = (Excel._Worksheet)m_currentWorkbook.ActiveSheet;
    try
    {
        StringBuilder str = new StringBuilder();
        str.Append(@"&""IDAutomationHC39M,Regular""&22(");
        str.Append(text);
        str.Append(")");
        Excel.PageSetup setup;
        setup = sheet.PageSetup;
        try
        {
            setup.LeftFooter = str.ToString();
        }
        finally
        {
            RemoveReference(setup);
            setup = null;
        }
    }
    finally
    {
        RemoveReference(sheet);
        sheet = null;
    }
    return true;
}
Here is the RemoveReference function (putting null in this function did not work)
private void RemoveReference(object o)
{
    try
    {
        System.Runtime.InteropServices.Marshal.ReleaseComObject(o);
    }
    catch
    { }
    finally
    {
        o = null;
    }
}
If you follow this pattern EVERYWHERE, it guarantees no leaks, no zombie processes, etc.
2) In order to create Excel files you can use the Excel application; however, to get data from Excel I suggest using OleDB. You can approach Excel like a database and get data from it with SQL queries, DataTables, etc.
Sample code (instead of filling a DataSet, you can use a DataReader for better memory performance):
private List<DataTable> getMovieTables()
{
    List<DataTable> movieTables = new List<DataTable>();
    var connectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + excelFilePath + ";Extended Properties=\"Excel 12.0;IMEX=1;HDR=NO;TypeGuessRows=0;ImportMixedTypes=Text\"";
    using (var conn = new OleDbConnection(connectionString))
    {
        conn.Open();
        DataRowCollection sheets = conn.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, new object[] { null, null, null, "TABLE" }).Rows;
        foreach (DataRow sheet in sheets)
        {
            using (var cmd = conn.CreateCommand())
            {
                cmd.CommandText = "SELECT * FROM [" + sheet["TABLE_NAME"].ToString() + "] ";
                var adapter = new OleDbDataAdapter(cmd);
                var ds = new DataSet();
                try
                {
                    adapter.Fill(ds);
                    movieTables.Add(ds.Tables[0]);
                }
                catch (Exception ex)
                {
                    //Debug.WriteLine(ex.ToString());
                    continue;
                }
            }
        }
    }
    return movieTables;
}
As an alternative to the solution proposed by @Mustafa Düman, I recommend using version 4 beta of EPPlus. I have used it without problems in several projects.
Pros:
Fast
No memory leaks (I can't say the same for versions < 4)
Does not require Office to be installed on the machine where you use it
Cons:
Can be used only for .xlsx files (Excel 2007/2010)
I tested it with the following code on 20 Excel files of around 12.5 MB each (over 50k records in each file), and I think it's enough to mention that it didn't crash :)
Console.Write("Path: ");
var path = Console.ReadLine();
var dirInfo = new DirectoryInfo(path);
while (string.IsNullOrWhiteSpace(path) || !dirInfo.Exists)
{
Console.WriteLine("Invalid path");
Console.Write("Path: ");
path = Console.ReadLine();
dirInfo = new DirectoryInfo(path);
}
string[] files = null;
try
{
files = Directory.GetFiles(path, "*.xlsx", SearchOption.AllDirectories);
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
Console.ReadLine();
return;
}
Console.WriteLine("{0} files found.", files.Length);
if (files.Length == 0)
{
Console.ReadLine();
return;
}
int succeded = 0;
int failed = 0;
Action<string> LoadToDataSet = (filePath) =>
{
try
{
FileInfo fileInfo = new FileInfo(filePath);
using (ExcelPackage excel = new ExcelPackage(fileInfo))
using (DataSet dataSet = new DataSet())
{
int workSheetCount = excel.Workbook.Worksheets.Count;
for (int i = 1; i <= workSheetCount; i++)
{
var worksheet = excel.Workbook.Worksheets[i];
var dimension = worksheet.Dimension;
if (dimension == null)
continue;
bool hasData = dimension.End.Row >= 1;
if (!hasData)
continue;
DataTable dataTable = new DataTable();
//add columns
foreach (var firstRowCell in worksheet.Cells[1, 1, 1, dimension.End.Column])
dataTable.Columns.Add(firstRowCell.Start.Address);
for (int j = 0; j < dimension.End.Row; j++)
dataTable.Rows.Add(worksheet.Cells[j + 1, 1, j + 1, dimension.End.Column].Select(erb => erb.Value).ToArray());
dataSet.Tables.Add(dataTable);
}
dataSet.Clear();
dataSet.Tables.Clear();
}
Interlocked.Increment(ref succeded);
}
catch (Exception)
{
Interlocked.Increment(ref failed);
}
};
Stopwatch sw = new Stopwatch();
sw.Start();
files.AsParallel().ForAll(LoadToDataSet);
sw.Stop();
Console.WriteLine("{0} succeded, {1} failed in {2} seconds", succeded, failed, sw.Elapsed.TotalSeconds);
Console.ReadLine();
I want to do multiple downloads/uploads in parallel over FTP using C#, without using FtpWebRequest.
I have written my own code, and when I try to download two files simultaneously, the first one downloads properly while the second one shows a size of 0 KB (it also downloads).
public void sendCommand(String command, params string[] strfilename)
{
    if (command == "RETR ") //Downloading file from Server
    {
        FileStream output = null;
        if (!File.Exists(strfilename[0]))
            output = File.Create(strfilename[0]);
        else
            output = new FileStream(strfilename[0], FileMode.Open);
        command = "RETR " + strfilename[0];
        Byte[] cmdBytes = Encoding.ASCII.GetBytes((command + "\r\n").ToCharArray());
        clientSocket.Send(cmdBytes, cmdBytes.Length, 0);
        Socket csocket = createDataSocket();
        DateTime timeout = DateTime.Now.AddSeconds(this.timeoutSeconds);
        while (timeout > DateTime.Now)
        {
            this.bytes = csocket.Receive(buffer, buffer.Length, 0);
            output.Write(this.buffer, 0, this.bytes);
            if (this.bytes <= 0)
            {
                break;
            }
        }
        // this.BinaryMode = true;
        output.Close();
        if (csocket.Connected) csocket.Close();
        this.readResponse();
        MessageBox.Show("File Downloaded successfully");
    }
    // else if ... and so on for the other commands
}
In my main method I do this:
ftpcommand.sendCommand("RETR ","RMSViewer.xml"); //Downloading from Server
ftpcommand.sendCommand("RETR ","cms.xml");//Downloading from Server
Any code snippet....
As Dave said, you'd need separate instances of your ftpCommand class. Look into using a BackgroundWorker to run the commands in the background (asynchronously).
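A rough sketch of what that might look like (FtpCommand stands in for your own command class; each worker gets its own instance):
var worker = new System.ComponentModel.BackgroundWorker();
worker.DoWork += (sender, e) =>
{
    // Separate instance per worker so the two transfers do not share a socket.
    var ftp = new FtpCommand(); // hypothetical constructor for your command class
    ftp.sendCommand("RETR ", (string)e.Argument);
};
worker.RunWorkerCompleted += (sender, e) =>
{
    if (e.Error != null)
        Console.WriteLine(e.Error);
};
worker.RunWorkerAsync("RMSViewer.xml");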
How are you issuing your requests simultaneously?
Threaded?
If so - you may want to ensure you're creating separate instances of your "ftpcommand" class.
I think we'll need more information to be able to help you :)