Having an issue creating a recursive constructor - C#

I'm trying to build a quad-tree which receives an array of points and splits them into 4 cells. Those 4 cells (cell1...cell4) are supposed to be created recursively, but it doesn't work.
What am I doing wrong? I've wasted all day trying to fix it.
I erased some basic things so as not to overload the script here.
class QuadTreeNode
{
private QuadTreeNode cell1;
private QuadTreeNode cell2;
private QuadTreeNode cell3;
private QuadTreeNode cell4;
private int maxppc;
private double leftminX, leftminY, rightmaxX, rightmaxY;
// fields referenced below (types inferred from how they're used)
private List<Point2D>[] cellList = new List<Point2D>[4];
private List<Point2D> allPointList;
private int pntIndex;
public QuadTreeNode(Point2D leftMin, Point2D rightMax, int maxPPC)
{
this.leftminX = leftMin.East;
this.leftminY = leftMin.North;
this.rightmaxX = rightMax.East;
this.rightmaxY = rightMax.North;
this.maxppc = maxPPC;
}
public QuadTreeNode(Point2D[] pointArray, Point2D leftMin, Point2D rightMax, int maxPPC)
{
for (int i=0; i <4; i++)
{
cellList[i] = new List<Point2D>();
}
allPointList = pointArray.ToList();
this.leftminX = leftMin.East;
this.leftminY = leftMin.North;
this.rightmaxX = rightMax.East;
this.rightmaxY = rightMax.North;
this.maxppc = maxPPC;
if (allPointList.Count > maxPPC)
{
Split(allPointList);
}
}
public void Split(List<Point2D> Array)
{
if (Array.Count > maxppc)
{
double centerE = (this.leftminX + this.rightmaxX) / 2;
double centerN = (this.leftminY + this.rightmaxY) / 2;
double deltaE = (this.rightmaxX - this.leftminX) / 2;
double deltaN = (this.rightmaxY - this.leftminY) / 2;
Point2D Center = new Point2D(centerE, centerN);
this.cell1 = new QuadTreeNode(cellList[0].ToArray(),new Point2D((Center.East - deltaE), (Center.North - deltaN)), Center, maxppc);
this.cell2 = new QuadTreeNode(cellList[1].ToArray(), new Point2D(Center.East, Center.North - deltaN), new Point2D((Center.East + deltaE), Center.North), maxppc);
this.cell3 = new QuadTreeNode(cellList[2].ToArray(), new Point2D((Center.East - deltaE), (Center.North)), new Point2D(Center.East, (Center.North + deltaN)), maxppc);
this.cell4 = new QuadTreeNode(cellList[3].ToArray(), Center, new Point2D((Center.East + deltaE), (Center.North + deltaN)), maxppc);
for (pntIndex = 0; pntIndex < Array.Count; pntIndex++)
{
CellIndex(Array[pntIndex]);
}
pntIndex = 0;
Array.Clear();
for (int c=0; c < 4; c++)
{
Array = cellList[c].ToList();
cellList[0].Clear();
cellList[1].Clear();
cellList[2].Clear();
cellList[3].Clear();
Split(Array);
}
return;
}
else
{
return;
}
}
public void CellIndex(Point2D point)
{
// sorts the point into the matching cellList entry by its East/North coordinates
}
}

The problem may be with this part:
Array = cellList[c].ToList();
cellList[0].Clear();
cellList[1].Clear();
cellList[2].Clear();
cellList[3].Clear();
Split(Array);
Here, you're copying the list at index c, but you're then clearing all four lists on every iteration, so by the time c is 1, 2 and 3 the lists you copy from are already empty and you're losing those points.
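A minimal sketch of a reworked Split, assuming CellIndex(point) appends the point to the matching cellList[0..3]: distribute the points first, then build each child from its own sub-list and let the four-argument constructor decide whether that child splits again.

public void Split(List<Point2D> points)
{
    if (points.Count <= maxppc)
        return;

    double centerE = (leftminX + rightmaxX) / 2;
    double centerN = (leftminY + rightmaxY) / 2;
    Point2D center = new Point2D(centerE, centerN);

    // 1. Distribute the points into the four temporary lists.
    foreach (Point2D p in points)
        CellIndex(p);

    // 2. Build each child from its own points; the child constructor
    //    splits again when it receives more than maxppc points.
    cell1 = new QuadTreeNode(cellList[0].ToArray(), new Point2D(leftminX, leftminY), center, maxppc);
    cell2 = new QuadTreeNode(cellList[1].ToArray(), new Point2D(centerE, leftminY), new Point2D(rightmaxX, centerN), maxppc);
    cell3 = new QuadTreeNode(cellList[2].ToArray(), new Point2D(leftminX, centerN), new Point2D(centerE, rightmaxY), maxppc);
    cell4 = new QuadTreeNode(cellList[3].ToArray(), center, new Point2D(rightmaxX, rightmaxY), maxppc);

    // 3. Only now is it safe to clear the temporary lists.
    for (int c = 0; c < 4; c++)
        cellList[c].Clear();
}

This way the recursion happens through the child constructors instead of re-splitting the same node with lists that have just been emptied.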

Related

Performing linear search on randomized array

I am trying to perform a linear search on a randomized array, but I just can't see where the code gets stuck. It's a simple Big-O notation algorithm test that I am currently learning, and I messed up somewhere. I am using a text editor for my coding. Can I get some help with this code?
using System;
using System.Collections.Generic;
using System.Data;
namespace TheBigONotations
{
public class BigONotations
{
///<properties>Class Properties</properties>
private int[] theArray;
private int arraySize;
private int itemsInArray = 0;
DateTime startTime;
DateTime endTime;
public BigONotations(int size)
{
arraySize = size;
theArray = new int[size];
}
///<order>O(1)</order>
public void addItemToArray(int newItem)
{
theArray[itemsInArray++] = newItem;
}
///<order>O(n)</order>
public void linearSearch(int value)
{
bool valueInArray = false;
string indexsWithValue = "";
startTime = DateTime.Now;
for (int i = 0; i < arraySize; i++)
{
if (theArray[i] == value)
{
valueInArray = true;
indexsWithValue += i + " ";
}
}
Console.WriteLine("Value found: " + valueInArray);
endTime = DateTime.Now;
Console.WriteLine("Linear Search took "+(endTime - startTime) );
}
///<order>O(n^2)</order>
public void BinarySearch()
{
}
///<order>O(n)</order>
public void generateRandomArray()
{
Random rnd = new Random();
for (int i = 0; i < arraySize; i++)
{
theArray[i] = (int)(rnd.Next() * 1000) + 10;
}
itemInArray = arraySize - 1;
}
public static void Main()
{
BigONotations algoTest1 = new BigONotations(100000);
algoTest1.generateRandomArray();
BigONotations algoTest2 = new BigONotations(200000);
algoTest2.generateRandomArray();
BigONotations algoTest3 = new BigONotations(300000);
algoTest3.generateRandomArray();
BigONotations algoTest4 = new BigONotations(400000);
algoTest4.generateRandomArray();
BigONotations algoTest5 = new BigONotations(500000);
algoTest5.generateRandomArray();
algoTest2.linearSearch(20);
algoTest3.linearSearch(20);
algoTest4.linearSearch(20);
}
}
}
I put all the information in the code, but there's some error and I can't see exactly where it is.
I think the error is that itemInArray inside the generateRandomArray() function is not declared. Perhaps you meant itemsInArray?
String concatenation (with the + and += operators) can take a long time in linearSearch() when there are many matching results. Consider using StringBuilder, which is significantly faster and more memory efficient:
StringBuilder sb = new StringBuilder();
// ...
// Let's say "i" is an integer.
sb.Append(i.ToString());
sb.Append(" ");
// ...
Console.WriteLine("Indexes: " + sb.ToString());

Copying static list to local temp list

Hello, I have some static lists and a lot of websocket connections updating them. In the paribuMiktarHesaplama() function I want to use local temp lists so that I don't change my static lists. But when anything changes on a local list, it also affects my static lists. I didn't do any reassignment, so why is it changing and how can I fix it?
static List<decimal> p_btcTlAskPrice = new List<decimal>(new decimal[2]);
static List<decimal> p_btcTlAskQuantity = new List<decimal>(new decimal[2]);
static List<decimal> b_btcUsdtBidPrice = new List<decimal>(new decimal[20]);
static List<decimal> b_btcUsdtBidQuantity = new List<decimal>(new decimal[20]);
/// and more list here...
Static lists
public static (decimal,int,decimal,decimal,decimal) FindMin(decimal paribuPrice, decimal paribuAmount, decimal binancePrice,decimal binanceAmount,decimal usdtTryPrice, decimal usdtTryAmount)
{
decimal paribuTotal,binanceTotal,usdtTotal;
decimal enkucuk;
int enkucukList;
paribuTotal = paribuPrice* paribuAmount/usdtTryPrice;
binanceTotal = binancePrice * binanceAmount;
enkucuk = usdtTryAmount;
usdtTotal = usdtTryAmount;
enkucukList = 3;
if (paribuTotal < enkucuk) {
enkucuk = paribuTotal;
enkucukList = 1;
}
if (binanceTotal < enkucuk) {
enkucuk = binanceTotal;
enkucukList = 2;
}
return (enkucuk, enkucukList,paribuTotal,binanceTotal,usdtTotal);
}
public static void paribuMiktarHesaplama(List<decimal> paribuPrice,List<decimal> paribuAmount,List<decimal> binancePrice,List<decimal> binanceAmount)
{
var i = 0; var j = 0; var k = 0; decimal enKucuk = 0 ; int enKucukList; decimal totalAmount=0; decimal enKucukParibu = 0; decimal enKucukBinance = 0; decimal enKucukUsdt = 0;
var tempParibuAmount = paribuAmount; var tempBinanceAmount = binanceAmount; var tempUsdtTryAmount = b_usdtTryBidQuantity;
var tempParibuPrice = paribuPrice; var tempBinancePrice = binancePrice; var tempUsdtTryPrice = b_usdtTryBidPrice;
while (true)
{
if (tempParibuPrice[i] * paribuFee < tempBinancePrice[j] / binanceFee * tempUsdtTryPrice[k] / binanceFee)
{ // the smallest amount needs to be deducted
var returnTuple = FindMin(tempParibuPrice[i], tempParibuAmount[i], tempBinancePrice[j], tempBinanceAmount[j], tempUsdtTryPrice[k], tempUsdtTryAmount[k]);
enKucuk = returnTuple.Item1;
enKucukList = returnTuple.Item2;
enKucukParibu = returnTuple.Item3;
enKucukBinance = returnTuple.Item4;
enKucukUsdt = returnTuple.Item5;
totalAmount += enKucuk; // in dollars
if (enKucukList == 1)
{
//paribu
tempBinanceAmount[j] = (enKucukBinance - enKucuk)/binancePrice[j];
tempUsdtTryAmount[k] = (enKucukUsdt - enKucuk);
tempParibuAmount.RemoveAt(0);
tempParibuPrice.RemoveAt(0);
//i++;
}
else if (enKucukList == 2)
{
//binance
tempParibuAmount[i] = (enKucukParibu - enKucuk ) * b_usdtTryBidPrice[k];
tempUsdtTryAmount[k] = (enKucukUsdt - enKucuk);
tempBinanceAmount.RemoveAt(0);
tempBinancePrice.RemoveAt(0);
//j++;
}
else
{
//tether
tempBinanceAmount[j] = (enKucukBinance - enKucuk) / binancePrice[j];
tempParibuAmount[i] = (enKucukParibu - enKucuk) * b_usdtTryBidPrice[k]/paribuPrice[i];
tempUsdtTryAmount.RemoveAt(0);
tempUsdtTryPrice.RemoveAt(0);
//k++;
}
}
else
{
break;
}
}
Console.WriteLine($"alınacak miktar {totalAmount} fiyatlar paribu {paribuPrice[i]} binance {binancePrice[j]} usdt {b_usdtTryBidPrice[k]}");
}
functions
p_btcTlAskPrice[0] = 600;
p_btcTlAskQuantity[0] = 60;
b_btcUsdtBidPrice[0] = 100;
b_btcUsdtBidQuantity[0] = 80;
b_usdtTryBidPrice[0] = 7;
b_usdtTryBidQuantity[0] = 500;
p_btcTlAskPrice[1] = 650;
p_btcTlAskQuantity[1] = 100;
b_btcUsdtBidPrice[1] = 95;
b_btcUsdtBidQuantity[1] = 50;
b_usdtTryBidPrice[1] = 6.8m;
b_usdtTryBidQuantity[1] = 10000;
paribuMiktarHesaplama(p_btcTlAskPrice, p_btcTlAskQuantity, b_btcUsdtBidPrice, b_btcUsdtBidQuantity);
var tempParibuAmount = paribuAmount
is an assignment that copies the reference of the List you passed in, not its contents. What you need to do is create a copy of paribuAmount and store that copy instead:
var tempParibuAmount = new List<decimal>(paribuAmount)
Do the same for every temp list you create.
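Applied to the function in the question, that means creating every temp list as a copy, for example:

// Copy each incoming/static list so the RemoveAt calls and index assignments
// below only touch the local copies, not the shared static lists.
var tempParibuAmount = new List<decimal>(paribuAmount);
var tempBinanceAmount = new List<decimal>(binanceAmount);
var tempUsdtTryAmount = new List<decimal>(b_usdtTryBidQuantity);
var tempParibuPrice = new List<decimal>(paribuPrice);
var tempBinancePrice = new List<decimal>(binancePrice);
var tempUsdtTryPrice = new List<decimal>(b_usdtTryBidPrice);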

How to optimize recursive function for a graph for clearing and settlement

I have to make a module in an insurance application that deals with clearing and settlement (I think this is the correct financial terminology) between insurance companies enrolled in the system. Practically, the system must pair all the amounts that companies have to pay to one another, and only the unpaired (remaining) sums are to be paid through the bank. For now there are about 30 companies in the system.
All the reading I did about clearing and settlement pointed me towards graphs and graph theory (which I studied in high school quite a long time ago).
For a system with 4 companies, the graph would look like this: each company represents a node (N1 ... N4) and each weighted edge represents the amount that one company has to pay to the other. In my code, the nodes are ints, representing the IDs of the companies.
What I did so far: I created the graph (for testing I used the Random generator for the amounts) and wrote a recursive function to calculate all possible cycles in the graph. Then I made another recursive function that takes all non-zero cycles, starting with the longest path with the maximum common sum to pair.
The algorithm seems valid in terms of final results, but for graphs bigger than 7-8 nodes it takes too long to complete. The problem is in the recursive function that creates the possible cycles in the graph. Here is my code:
static void Main(string[] args)
{
int nodes = 4;
try
{
nodes = Convert.ToInt32(args[0]);
}
catch { }
DateTime start = DateTime.Now;
Graph g = new Graph(nodes);
int step = 0;
double CompensatedAmount = 0;
double TotalCompensatedAmount = 0;
DateTime endGeneration = DateTime.Now;
Console.WriteLine("Graph generated in: " + (endGeneration - start).TotalSeconds + " seconds.");
Compensare.RunCompensation(false, g, step, CompensatedAmount, TotalCompensatedAmount, out CompensatedAmount, out TotalCompensatedAmount);
DateTime endCompensation = DateTime.Now;
Console.WriteLine("Graph compensated in: " + (endCompensation - endGeneration).TotalSeconds + " seconds.");
}
... and the main class:
public static class Compensare
{
public static void RunCompensation(bool exit, Graph g, int step, double prevCompensatedAmount, double prevTotalCompensatedAmount, out double CompensatedAmount, out double TotalCompensatedAmount)
{
step++;
CompensatedAmount = prevCompensatedAmount;
TotalCompensatedAmount = prevTotalCompensatedAmount;
if (!exit)
{
List<Cycle> orderedList = g.Cycles.OrderByDescending(x => x.CycleCompensatedAmount).ToList();
g.ListCycles(orderedList, "OrderedCycles" + step.ToString() + ".txt");
using (Graph clona = g.Clone())
{
int maxCycleIndex = clona.GetMaxCycleByCompensatedAmount();
double tmpCompensatedAmount = clona.Cycles[maxCycleIndex].CycleMin;
exit = tmpCompensatedAmount <= 0 ? true : false;
CompensatedAmount += tmpCompensatedAmount;
TotalCompensatedAmount += (tmpCompensatedAmount * clona.Cycles[maxCycleIndex].EdgesCount);
clona.CompensateCycle(maxCycleIndex);
clona.UpdateCycles();
Console.WriteLine(String.Format("{0} - edges: {4} - min: {3} - {1} - {2}\r\n", step, CompensatedAmount, TotalCompensatedAmount, tmpCompensatedAmount, clona.Cycles[maxCycleIndex].EdgesCount));
RunCompensation(exit, clona, step, CompensatedAmount, TotalCompensatedAmount, out CompensatedAmount, out TotalCompensatedAmount);
}
}
}
}
public class Edge
{
public int Start { get; set; }
public int End { get; set; }
public double Weight { get; set; }
public double InitialWeight {get;set;}
public Edge() { }
public Edge(int _start, int _end, double _weight)
{
this.Start = _start;
this.End = _end;
this.Weight = _weight;
this.InitialWeight = _weight;
}
}
public class Cycle
{
public List<Edge> Edges = new List<Edge>();
public double CycleWeight = 0;
public double CycleMin = 0;
public double CycleMax = 0;
public double CycleAverage = 0;
public double CycleCompensatedAmount = 0;
public int EdgesCount = 0;
public Cycle() { }
public Cycle(List<Edge> _edges)
{
this.Edges = new List<Edge>(_edges);
UpdateCycle();
}
public void UpdateCycle()
{
UpdateCycle(this);
}
public void UpdateCycle(Cycle c)
{
double sum = 0;
double min = c.Edges[0].Weight;
double max = c.Edges[0].Weight;
for(int i=0;i<c.Edges.Count;i++)
{
sum += c.Edges[i].Weight;
min = c.Edges[i].Weight < min ? c.Edges[i].Weight : min;
max = c.Edges[i].Weight > max ? c.Edges[i].Weight : max;
}
c.EdgesCount = c.Edges.Count;
c.CycleWeight = sum;
c.CycleMin = min;
c.CycleMax = max;
c.CycleAverage = sum / c.EdgesCount;
c.CycleCompensatedAmount = min * c.EdgesCount;
}
}
public class Graph : IDisposable
{
public List<int> Nodes = new List<int>();
public List<Edge> Edges = new List<Edge>();
public List<Cycle> Cycles = new List<Cycle>();
public int NodesCount { get; set; }
public Graph() { }
public Graph(int _nodes)
{
this.NodesCount = _nodes;
GenerateNodes();
GenerateEdges();
GenerateCycles();
}
private int FindNode(string _node)
{
for(int i = 0; i < this.Nodes.Count; i++)
{
if (this.Nodes[i].ToString() == _node)
return i;
}
return 0;
}
private int FindEdge(string[] _edge)
{
for(int i = 0; i < this.Edges.Count; i++)
{
if (this.Edges[i].Start.ToString() == _edge[0] && this.Edges[i].End.ToString() == _edge[1] && Convert.ToDouble(this.Edges[i].Weight) == Convert.ToDouble(_edge[2]))
return i;
}
return 0;
}
public Graph Clone()
{
Graph clona = new Graph();
clona.Nodes = new List<int>(this.Nodes);
clona.Edges = new List<Edge>(this.Edges);
clona.Cycles = new List<Cycle>(this.Cycles);
clona.NodesCount = this.NodesCount;
return clona;
}
public void CompensateCycle(int cycleIndex)
{
for(int i = 0; i < this.Cycles[cycleIndex].Edges.Count; i++)
{
this.Cycles[cycleIndex].Edges[i].Weight -= this.Cycles[cycleIndex].CycleMin;
}
}
public int GetMaxCycleByCompensatedAmount()
{
int toReturn = 0;
for (int i = 0; i < this.Cycles.Count; i++)
{
if (this.Cycles[i].CycleCompensatedAmount > this.Cycles[toReturn].CycleCompensatedAmount)
{
toReturn = i;
}
}
return toReturn;
}
public void GenerateNodes()
{
for (int i = 0; i < this.NodesCount; i++)
{
this.Nodes.Add(i + 1);
}
}
public void GenerateEdges()
{
Random r = new Random();
for(int i = 0; i < this.Nodes.Count; i++)
{
for(int j = 0; j < this.Nodes.Count; j++)
{
if(this.Nodes[i] != this.Nodes[j])
{
int _weight = r.Next(0, 500);
Edge e = new Edge(this.Nodes[i], this.Nodes[j], _weight);
this.Edges.Add(e);
}
}
}
}
public void GenerateCycles()
{
for(int i = 0; i < this.Edges.Count; i++)
{
FindCycles(new Cycle(new List<Edge>() { this.Edges[i] }));
}
this.UpdateCycles();
}
public void UpdateCycles()
{
for (int i = 0; i < this.Cycles.Count; i++)
{
this.Cycles[i].UpdateCycle();
}
}
private void FindCycles(Cycle path)
{
List<Edge> nextPossibleEdges = GetNextEdges(path.Edges[path.Edges.Count - 1].End);
for (int i = 0; i < nextPossibleEdges.Count; i++)
{
if (path.Edges.IndexOf(nextPossibleEdges[i]) < 0) // the edge shouldn't be already in the path
{
Cycle temporaryPath = new Cycle(path.Edges);
temporaryPath.Edges.Add(nextPossibleEdges[i]);
if (nextPossibleEdges[i].End == temporaryPath.Edges[0].Start) // end of path - valid cycle
{
if (!CycleExists(temporaryPath))
{
this.Cycles.Add(temporaryPath);
break;
}
}
else
{
FindCycles(temporaryPath);
}
}
}
}
private bool CycleExists(Cycle cycle)
{
bool toReturn = false;
if (this.Cycles.IndexOf(cycle) > -1) { toReturn = true; }
else
{
for (int i = 0; i < this.Cycles.Count; i++)
{
if (this.Cycles[i].Edges.Count == cycle.Edges.Count && !CompareEdges(this.Cycles[i].Edges[0], cycle.Edges[0]))
{
bool cycleExists = true;
for (int j = 0; j < cycle.Edges.Count; j++)
{
bool edgeExists = false; // if there is an edge not in the path, then the searched cycle is different from the current cycle and we can pass to the next iteration
for (int k = 0; k < this.Cycles[i].Edges.Count; k++)
{
if (CompareEdges(cycle.Edges[j], this.Cycles[i].Edges[k]))
{
edgeExists = true;
break;
}
}
if (!edgeExists)
{
cycleExists = false;
break;
}
}
if (cycleExists) // if we found a cycle with all edges equal to the searched cycle, then the cycle is a duplicate and not valid
{
toReturn = true;
break;
}
}
}
}
return toReturn;
}
private bool CompareEdges(Edge e1, Edge e2)
{
return (e1.Start == e2.Start && e1.End == e2.End && e1.Weight == e2.Weight);
}
private List<Edge> GetNextEdges(int endNode)
{
List<Edge> tmp = new List<Edge>();
for(int i = 0; i < this.Edges.Count; i++)
{
if(endNode == this.Edges[i].Start)
{
tmp.Add(this.Edges[i]);
}
}
return tmp;
}
#region IDisposable Support
private bool disposedValue = false; // To detect redundant calls
protected virtual void Dispose(bool disposing)
{
if (!disposedValue)
{
if (disposing)
{
// TODO: dispose managed state (managed objects).
this.Nodes = null;
this.Edges = null;
this.Cycles = null;
this.NodesCount = 0;
}
// TODO: free unmanaged resources (unmanaged objects) and override a finalizer below.
// TODO: set large fields to null.
disposedValue = true;
}
}
// TODO: override a finalizer only if Dispose(bool disposing) above has code to free unmanaged resources.
// ~Graph() {
// // Do not change this code. Put cleanup code in Dispose(bool disposing) above.
// Dispose(false);
// }
// This code added to correctly implement the disposable pattern.
public void Dispose()
{
// Do not change this code. Put cleanup code in Dispose(bool disposing) above.
Dispose(true);
// TODO: uncomment the following line if the finalizer is overridden above.
// GC.SuppressFinalize(this);
}
#endregion
}
I've found several articles/answers about graphs, both in Java and C# (including quickgraph), but they mainly focus on directed graphs (without cycles).
I have also read about tail call optimization for recursion, but I don't know if/how to implement it in my case.
I know there is a lot to grasp about this subject, but maybe someone has had to deal with something similar and can either help me optimize the code (which, as I said, seems to do the job in the end) or point me in another direction to rethink the whole process.
I think you can massively simplify this.
All money is the same, so (using your example) N1 doesn't care whether it gets 350 from N2 and pays 150 to N2 and so on - N1 merely cares that overall it ends up 145 down (if I've done the arithmetic correctly). Similarly, each other N only cares about its overall position. So, summing the inflows and outflows at each node, we get:
Company Net position
N1 -145
N2 -65
N3 +195
N4 +15
So with someone to act as a central clearing house - the bank - simply arrange for N1 and N2 to pay the clearing house 145 and 65 respectively, and for N3 and N4 to receive 195 and 15 respectively from the clearing house. And everyone's happy.
I may have missed some aspect, of course, in which case I'm sure someone will point it out...
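A rough sketch of that netting idea, assuming the Edge class from the question where Weight is the amount that Start owes End:

// Compute each company's net position in a single pass over the edges.
// Negative = the company pays the clearing house; positive = it receives.
static Dictionary<int, double> NetPositions(List<Edge> edges)
{
    var net = new Dictionary<int, double>();
    foreach (Edge e in edges)
    {
        if (!net.ContainsKey(e.Start)) net[e.Start] = 0;
        if (!net.ContainsKey(e.End)) net[e.End] = 0;
        net[e.Start] -= e.Weight; // Start owes this amount
        net[e.End] += e.Weight;   // End is owed this amount
    }
    return net;
}

Every company with a negative net position pays that amount to the clearing house, and every company with a positive one receives it; no cycle enumeration is needed, so a single pass over the edges handles 30 companies and well beyond.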

Suppressing Frequencies From FFT

What I am trying to do is retrieve the frequencies from a song and suppress all the frequencies that do not fall within the human vocal range (or, in general, any given range). Here is my suppression function.
public void SupressAndWrite(Func<FrequencyUnit, bool> func)
{
this.WaveManipulated = true;
while (this.mainWave.WAVFile.NumSamplesRemaining > 0)
{
FrequencyUnit[] freqUnits = this.mainWave.NextFrequencyUnits();
Complex[] compUnits = (from item
in freqUnits
select (func(item)
? new Complex(item.Frequency, 0) :Complex.Zero))
.ToArray();
FourierTransform.FFT(compUnits, FourierTransform.Direction.Backward);
short[] shorts = (from item
in compUnits
select (short)item.Real).ToArray();
foreach (short item in shorts)
{
this.ManipulatedFile.AddSample16bit(item);
}
}
this.ManipulatedFile.Close();
}
Here is my class for my wave.
public sealed class ComplexWave
{
public readonly WAVFile WAVFile;
public readonly Int32 SampleSize;
private FourierTransform.Direction fourierDirection { get; set; }
private long position;
/// <param name="file"></param>
/// <param name="sampleSize in BLOCKS"></param>
public ComplexWave(WAVFile file, int sampleSize)
{
file.NullReferenceExceptionCheck();
this.WAVFile = file;
this.SampleSize = sampleSize;
if (this.SampleSize % 8 != 0)
{
if (this.SampleSize % 16 != 0)
{
throw new ArgumentException("Sample Size");
}
}
if (!MathTools.IsPowerOf2(sampleSize))
{
throw new ArgumentException("Sample Size");
}
this.fourierDirection = FourierTransform.Direction.Forward;
}
public Complex[] NextSampleFourierTransform()
{
short[] newInput = this.GetNextSample();
Complex[] data = newInput.CopyToComplex();
if (newInput.Any((x) => x != 0))
{
Debug.Write("done");
}
FourierTransform.FFT(data, this.fourierDirection);
return data;
}
public FrequencyUnit[] NextFrequencyUnits()
{
Complex[] cm = this.NextSampleFourierTransform();
FrequencyUnit[] freqUn = new FrequencyUnit[(cm.Length / 2)];
int max = (cm.Length / 2);
for (int i = 0; i < max; i++)
{
freqUn[i] = new FrequencyUnit(cm[i], this.WAVFile.SampleRateHz, i, cm.Length);
}
Array.Sort(freqUn);
return freqUn;
}
private short[] GetNextSample()
{
short[] retval = new short[this.SampleSize];
for (int i = 0; i < this.SampleSize; i++)
{
if (this.WAVFile.NumSamplesRemaining > 0)
{
retval[i] = this.WAVFile.GetNextSampleAs16Bit();
this.position++;
}
}
return retval;
}
}
Both FFT forward and FFT backwards work correctly. Could you please tell me what my error is.
Unfortunately, the human voice, even when singing, isn't confined to a simple frequency range. It usually has one fundamental frequency and a multitude of harmonics that follow it, depending on the phoneme.
Use this https://play.google.com/store/apps/details?id=radonsoft.net.spectralview&hl=en or some similar app to see what I mean - and then re-define your strategy. Also google the 'karaoke' effect.
NEXT:
It's not obvious from your example, but you should scan the whole file in windows (google 'FFT windowing') to process all of it.
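As a sketch of the windowing suggestion (not your code, just the standard Hann formula), you would multiply each block of samples by a window before the forward FFT:

// Apply a Hann window to one block of samples to reduce spectral leakage
// when the file is processed block by block.
static double[] ApplyHannWindow(short[] samples)
{
    int n = samples.Length;
    double[] windowed = new double[n];
    for (int i = 0; i < n; i++)
    {
        double w = 0.5 * (1.0 - Math.Cos(2.0 * Math.PI * i / (n - 1)));
        windowed[i] = samples[i] * w;
    }
    return windowed;
}

In practice the windowed blocks are usually overlapped (e.g. by 50%) and recombined with overlap-add when the audio is resynthesized.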

C# - A faster alternative to Convert.ToSingle()

I'm working on a program which reads millions of floating point numbers from a text file. This program runs inside of a game that I'm designing, so I need it to be fast (I'm loading an obj file). So far, loading a relatively small file takes about a minute (without precompilation) because of the slow speed of Convert.ToSingle(). Is there a faster way to do this?
EDIT: Here's the code I use to parse the Obj file
http://pastebin.com/TfgEge9J
using System;
using System.IO;
using System.Collections.Generic;
using OpenTK.Math;
using System.Drawing;
using PlatformLib;
public class ObjMeshLoader
{
public static StreamReader[] LoadMeshes(string fileName)
{
StreamReader mreader = new StreamReader(PlatformLib.Platform.openFile(fileName));
MemoryStream current = null;
List<MemoryStream> mstreams = new List<MemoryStream>();
StreamWriter mwriter = null;
if (!mreader.ReadLine().Contains("#"))
{
mreader.BaseStream.Close();
throw new Exception("Invalid header");
}
while (!mreader.EndOfStream)
{
string cmd = mreader.ReadLine();
string line = cmd;
line = line.Trim(splitCharacters);
line = line.Replace("  ", " ");
string[] parameters = line.Split(splitCharacters);
if (parameters[0] == "mtllib")
{
loadMaterials(parameters[1]);
}
if (parameters[0] == "o")
{
if (mwriter != null)
{
mwriter.Flush();
current.Position = 0;
}
current = new MemoryStream();
mwriter = new StreamWriter(current);
mwriter.WriteLine(parameters[1]);
mstreams.Add(current);
}
else
{
if (mwriter != null)
{
mwriter.WriteLine(cmd);
mwriter.Flush();
}
}
}
mwriter.Flush();
current.Position = 0;
List<StreamReader> readers = new List<StreamReader>();
foreach (MemoryStream e in mstreams)
{
e.Position = 0;
StreamReader sreader = new StreamReader(e);
readers.Add(sreader);
}
return readers.ToArray();
}
public static bool Load(ObjMesh mesh, string fileName)
{
try
{
using (StreamReader streamReader = new StreamReader(Platform.openFile(fileName)))
{
Load(mesh, streamReader);
streamReader.Close();
return true;
}
}
catch { return false; }
}
public static bool Load2(ObjMesh mesh, StreamReader streamReader, ObjMesh prevmesh)
{
if (prevmesh != null)
{
//mesh.Vertices = prevmesh.Vertices;
}
try
{
//streamReader.BaseStream.Position = 0;
Load(mesh, streamReader);
streamReader.Close();
#if DEBUG
Console.WriteLine("Loaded "+mesh.Triangles.Length.ToString()+" triangles and"+mesh.Quads.Length.ToString()+" quadrilaterals parsed, with a grand total of "+mesh.Vertices.Length.ToString()+" vertices.");
#endif
return true;
}
catch (Exception er) { Console.WriteLine(er); return false; }
}
static char[] splitCharacters = new char[] { ' ' };
static List<Vector3> vertices;
static List<Vector3> normals;
static List<Vector2> texCoords;
static Dictionary<ObjMesh.ObjVertex, int> objVerticesIndexDictionary;
static List<ObjMesh.ObjVertex> objVertices;
static List<ObjMesh.ObjTriangle> objTriangles;
static List<ObjMesh.ObjQuad> objQuads;
static Dictionary<string, Bitmap> materials = new Dictionary<string, Bitmap>();
static void loadMaterials(string path)
{
StreamReader mreader = new StreamReader(Platform.openFile(path));
string current = "";
bool isfound = false;
while (!mreader.EndOfStream)
{
string line = mreader.ReadLine();
line = line.Trim(splitCharacters);
line = line.Replace("  ", " ");
string[] parameters = line.Split(splitCharacters);
if (parameters[0] == "newmtl")
{
if (materials.ContainsKey(parameters[1]))
{
isfound = true;
}
else
{
current = parameters[1];
}
}
if (parameters[0] == "map_Kd")
{
if (!isfound)
{
string filename = "";
for (int i = 1; i < parameters.Length; i++)
{
filename += parameters[i];
}
string searcher = "\\" + "\\";
filename.Replace(searcher, "\\");
Bitmap mymap = new Bitmap(filename);
materials.Add(current, mymap);
isfound = false;
}
}
}
}
static float parsefloat(string val)
{
return Convert.ToSingle(val);
}
int remaining = 0;
static string GetLine(string text, ref int pos)
{
string retval = text.Substring(pos, text.IndexOf(Environment.NewLine, pos));
pos = text.IndexOf(Environment.NewLine, pos);
return retval;
}
static void Load(ObjMesh mesh, StreamReader textReader)
{
//try {
//vertices = null;
//objVertices = null;
if (vertices == null)
{
vertices = new List<Vector3>();
}
if (normals == null)
{
normals = new List<Vector3>();
}
if (texCoords == null)
{
texCoords = new List<Vector2>();
}
if (objVerticesIndexDictionary == null)
{
objVerticesIndexDictionary = new Dictionary<ObjMesh.ObjVertex, int>();
}
if (objVertices == null)
{
objVertices = new List<ObjMesh.ObjVertex>();
}
objTriangles = new List<ObjMesh.ObjTriangle>();
objQuads = new List<ObjMesh.ObjQuad>();
mesh.vertexPositionOffset = vertices.Count;
string line;
string alltext = textReader.ReadToEnd();
int pos = 0;
while ((line = GetLine(alltext, pos)) != null)
{
if (line.Length < 2)
{
break;
}
//line = line.Trim(splitCharacters);
//line = line.Replace(" ", " ");
string[] parameters = line.Split(splitCharacters);
switch (parameters[0])
{
case "usemtl":
//Material specification
try
{
mesh.Material = materials[parameters[1]];
}
catch (KeyNotFoundException)
{
Console.WriteLine("WARNING: Texture parse failure: " + parameters[1]);
}
break;
case "p": // Point
break;
case "v": // Vertex
float x = parsefloat(parameters[1]);
float y = parsefloat(parameters[2]);
float z = parsefloat(parameters[3]);
vertices.Add(new Vector3(x, y, z));
break;
case "vt": // TexCoord
float u = parsefloat(parameters[1]);
float v = parsefloat(parameters[2]);
texCoords.Add(new Vector2(u, v));
break;
case "vn": // Normal
float nx = parsefloat(parameters[1]);
float ny = parsefloat(parameters[2]);
float nz = parsefloat(parameters[3]);
normals.Add(new Vector3(nx, ny, nz));
break;
case "f":
switch (parameters.Length)
{
case 4:
ObjMesh.ObjTriangle objTriangle = new ObjMesh.ObjTriangle();
objTriangle.Index0 = ParseFaceParameter(parameters[1]);
objTriangle.Index1 = ParseFaceParameter(parameters[2]);
objTriangle.Index2 = ParseFaceParameter(parameters[3]);
objTriangles.Add(objTriangle);
break;
case 5:
ObjMesh.ObjQuad objQuad = new ObjMesh.ObjQuad();
objQuad.Index0 = ParseFaceParameter(parameters[1]);
objQuad.Index1 = ParseFaceParameter(parameters[2]);
objQuad.Index2 = ParseFaceParameter(parameters[3]);
objQuad.Index3 = ParseFaceParameter(parameters[4]);
objQuads.Add(objQuad);
break;
}
break;
}
}
//}catch(Exception er) {
// Console.WriteLine(er);
// Console.WriteLine("Successfully recovered. Bounds/Collision checking may fail though");
//}
mesh.Vertices = objVertices.ToArray();
mesh.Triangles = objTriangles.ToArray();
mesh.Quads = objQuads.ToArray();
textReader.BaseStream.Close();
}
public static void Clear()
{
objVerticesIndexDictionary = null;
vertices = null;
normals = null;
texCoords = null;
objVertices = null;
objTriangles = null;
objQuads = null;
}
static char[] faceParamaterSplitter = new char[] { '/' };
static int ParseFaceParameter(string faceParameter)
{
Vector3 vertex = new Vector3();
Vector2 texCoord = new Vector2();
Vector3 normal = new Vector3();
string[] parameters = faceParameter.Split(faceParamaterSplitter);
int vertexIndex = Convert.ToInt32(parameters[0]);
if (vertexIndex < 0) vertexIndex = vertices.Count + vertexIndex;
else vertexIndex = vertexIndex - 1;
//Hmm. This seems to be broken.
try
{
vertex = vertices[vertexIndex];
}
catch (Exception)
{
throw new Exception("Vertex recognition failure at " + vertexIndex.ToString());
}
if (parameters.Length > 1)
{
int texCoordIndex = Convert.ToInt32(parameters[1]);
if (texCoordIndex < 0) texCoordIndex = texCoords.Count + texCoordIndex;
else texCoordIndex = texCoordIndex - 1;
try
{
texCoord = texCoords[texCoordIndex];
}
catch (Exception)
{
Console.WriteLine("ERR: Vertex " + vertexIndex + " not found. ");
throw new DllNotFoundException(vertexIndex.ToString());
}
}
if (parameters.Length > 2)
{
int normalIndex = Convert.ToInt32(parameters[2]);
if (normalIndex < 0) normalIndex = normals.Count + normalIndex;
else normalIndex = normalIndex - 1;
normal = normals[normalIndex];
}
return FindOrAddObjVertex(ref vertex, ref texCoord, ref normal);
}
static int FindOrAddObjVertex(ref Vector3 vertex, ref Vector2 texCoord, ref Vector3 normal)
{
ObjMesh.ObjVertex newObjVertex = new ObjMesh.ObjVertex();
newObjVertex.Vertex = vertex;
newObjVertex.TexCoord = texCoord;
newObjVertex.Normal = normal;
int index;
if (objVerticesIndexDictionary.TryGetValue(newObjVertex, out index))
{
return index;
}
else
{
objVertices.Add(newObjVertex);
objVerticesIndexDictionary[newObjVertex] = objVertices.Count - 1;
return objVertices.Count - 1;
}
}
}
Based on your description and the code you've posted, I'm going to bet that your problem isn't with the reading, the parsing, or the way you're adding things to your collections. The most likely problem is that your ObjMesh.ObjVertex structure doesn't override GetHashCode. (I'm assuming that you're using code similar to http://www.opentk.com/files/ObjMesh.cs.)
If you're not overriding GetHashCode, then your objVerticesIndexDictionary is going to perform very much like a linear list. That would account for the performance problem that you're experiencing.
I suggest that you look into providing a good GetHashCode method for your ObjMesh.ObjVertex struct.
See Why is ValueType.GetHashCode() implemented like it is? for information about the default GetHashCode implementation for value types and why it's not suitable for use in a hash table or dictionary.
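A rough sketch of what that could look like, with the field names assumed from the linked ObjMesh.cs rather than your exact code:

// Equals/GetHashCode for the vertex struct so dictionary lookups behave like O(1).
public struct ObjVertex : IEquatable<ObjVertex>
{
    public Vector2 TexCoord;
    public Vector3 Normal;
    public Vector3 Vertex;

    public bool Equals(ObjVertex other)
    {
        return Vertex.Equals(other.Vertex)
            && TexCoord.Equals(other.TexCoord)
            && Normal.Equals(other.Normal);
    }

    public override bool Equals(object obj)
    {
        return obj is ObjVertex && Equals((ObjVertex)obj);
    }

    public override int GetHashCode()
    {
        int hash = Vertex.GetHashCode();
        hash = hash * 31 + TexCoord.GetHashCode();
        hash = hash * 31 + Normal.GetHashCode();
        return hash;
    }
}

With that in place, objVerticesIndexDictionary.TryGetValue does proper hashing instead of behaving like a linear scan.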
Edit 3: The problem is NOT with the parsing.
It's with how you read the file. If you read it properly, it would be faster; however, it seems like your reading is unusually slow. My original suspicion was that it was because of excess allocations, but it seems like there might be other problems with your code too, since that doesn't explain the entire slowdown.
Nevertheless, here's a piece of code I made that completely avoids all object allocations:
static void Main(string[] args)
{
long counter = 0;
var sw = Stopwatch.StartNew();
var sb = new StringBuilder();
var text = File.ReadAllText("spacestation.obj");
for (int i = 0; i < text.Length; i++)
{
int start = i;
while (i < text.Length &&
(char.IsDigit(text[i]) || text[i] == '-' || text[i] == '.'))
{ i++; }
if (i > start)
{
sb.Append(text, start, i - start); //Copy data to the buffer
float value = Parse(sb); //Parse the data
sb.Remove(0, sb.Length); //Clear the buffer
counter++;
}
}
sw.Stop();
Console.WriteLine("{0:N0}", sw.Elapsed.TotalSeconds); //Only a few ms
}
with this parser:
const int MIN_POW_10 = -16, MAX_POW_10 = 16,
NUM_POWS_10 = MAX_POW_10 - MIN_POW_10 + 1;
static readonly float[] pow10 = GenerateLookupTable();
static float[] GenerateLookupTable()
{
var result = new float[(-MIN_POW_10 + MAX_POW_10) * 10];
for (int i = 0; i < result.Length; i++)
result[i] = (float)((i / NUM_POWS_10) *
Math.Pow(10, i % NUM_POWS_10 + MIN_POW_10));
return result;
}
static float Parse(StringBuilder str)
{
float result = 0;
bool negate = false;
int len = str.Length;
int decimalIndex = str.Length;
for (int i = len - 1; i >= 0; i--)
if (str[i] == '.')
{ decimalIndex = i; break; }
int offset = -MIN_POW_10 + decimalIndex;
for (int i = 0; i < decimalIndex; i++)
if (i != decimalIndex && str[i] != '-')
result += pow10[(str[i] - '0') * NUM_POWS_10 + offset - i - 1];
else if (str[i] == '-')
negate = true;
for (int i = decimalIndex + 1; i < len; i++)
if (i != decimalIndex)
result += pow10[(str[i] - '0') * NUM_POWS_10 + offset - i];
if (negate)
result = -result;
return result;
}
it happens in a small fraction of a second.
Of course, this parser is poorly tested and has these current restrictions (and more):
Don't try parsing more digits (decimal and whole) than provided for in the array.
No error handling whatsoever.
Only parses decimals, not exponents! i.e. it can parse 1234.56 but not 1.23456E3.
Doesn't care about globalization/localization. Your file is only in a single format, so there's no point caring about that kind of stuff because you're probably using English to store it anyway.
It seems like you won't necessarily need this much overkill, but take a look at your code and try to figure out the bottleneck. It seems to be neither the reading nor the parsing.
Have you measured that the speed problem is really caused by Convert.ToSingle?
In the code you included, I see you create lists and dictionaries like this:
normals = new List<Vector3>();
texCoords = new List<Vector2>();
objVerticesIndexDictionary = new Dictionary<ObjMesh.ObjVertex, int>();
And then when you read the file, you add in the collection one item at a time.
One of the possible optimizations would be to save the total number of normals, texCoords, indexes and everything at the start of the file, and then initialize these collections with those numbers. This will pre-allocate the buffers used by the collections, so adding items to them will be pretty fast.
So the collection creation should look like this:
// These values should be stored at the beginning of the file
int totalNormals = Convert.ToInt32(textReader.ReadLine());
int totalTexCoords = Convert.ToInt32(textReader.ReadLine());
int totalIndexes = Convert.ToInt32(textReader.ReadLine());
normals = new List<Vector3>(totalNormals);
texCoords = new List<Vector2>(totalTexCoords);
objVerticesIndexDictionary = new Dictionary<ObjMesh.ObjVertex, int>(totalIndexes);
See List<T> Constructor (Int32) and Dictionary<TKey, TValue> Constructor (Int32).
This related question is for C++, but is definitely worth a read.
For reading as fast as possible, you're probably going to want to map the file into memory and then parse using some custom floating point parser, especially if you know the numbers are always in a specific format (i.e. you're the one generating the input files in the first place).
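If you go that route, here is a sketch of the memory-mapping half (System.IO.MemoryMappedFiles, .NET 4 and later; the file path and the per-line handling are placeholders):

using System.IO;
using System.IO.MemoryMappedFiles;

static void ParseMapped(string path)
{
    // Map the file and stream it line by line; the parsing itself still needs
    // a fast custom float parser to see a real speedup.
    using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Open))
    using (var stream = mmf.CreateViewStream())
    using (var reader = new StreamReader(stream))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            if (line.Length == 0 || line[0] == '\0')
                continue; // the mapped view can be padded with trailing '\0' bytes
            // hand each line to your parser here
        }
    }
}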
I tested .NET string parsing once and the fastest function to parse text was the old VB Val() function. You could pull the relevant parts out of Microsoft.VisualBasic.Conversion.Val(string).
Converting String to numbers
Comparison of relative test times (ms / 100000 conversions)
Double  Single  Integer  Int (w/ decimal point)
  14      13       6      16     Val(Str)
  14      14       6      16     Cxx(Val(Str))    e.g. CSng(Val(str))
  22      21      17      e!     Convert.To(str)
  23      21      16      e!     XX.Parse(str)    e.g. Single.Parse()
  30      31      31      32     Cxx(str)
Val: fastest, part of VisualBasic dll, skips non-numeric,
ConvertTo and Parse: slower, part of core, exception on bad format (including decimal point)
Cxx: slowest (for strings), part of core, consistent times across formats
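For reference, calling it from C# only needs a reference to Microsoft.VisualBasic.dll:

// Val ignores trailing non-numeric characters and returns a double,
// so narrow the result to float yourself.
float f = (float)Microsoft.VisualBasic.Conversion.Val("3.14159");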
