My goal is to calculate how much space a file occupies in the memory cache. To do this, I wrote a method that loads thousands of objects into memory and computes the difference between some values measured before and after loading the objects.
This is before adding the files:
Process p = Process.GetCurrentProcess();
long memoriaNonPaginata = p.NonpagedSystemMemorySize64;
long memoriaPaginata = p.PagedMemorySize64;
long memoriaDiSistema = p.PagedSystemMemorySize64;
long ramIniziale = memoriaNonPaginata + memoriaPaginata + memoriaDiSistema;
I add the files:
while (counter < numeroOggettiInCache)
{
    string k = counter.ToString();
    data = data + k.Substring(0, 1);
    int position = r.Next(0, 28000);
    data = data.Remove(position, 1);
    // Note: Set alone creates and commits the entry, so a separate
    // CreateEntry call is redundant here (an entry from CreateEntry is
    // only committed when it is disposed).
    _cache.Set("cucu" + counter.ToString(), data, opt);
    counter++;
}
Later, I calculate the values again:
Process p1 = Process.GetCurrentProcess();
long memoriaNonPaginataPost = p1.NonpagedSystemMemorySize64;
long memoriaPaginataPost = p1.PagedMemorySize64;
long memoriaDiSistemaPost = p1.PagedSystemMemorySize64;
long ramFinale = memoriaNonPaginataPost + memoriaPaginataPost + memoriaDiSistemaPost;
My question: are the properties I used (NonpagedSystemMemorySize64, PagedMemorySize64, PagedSystemMemorySize64) correct and sufficient to calculate how much space the cached files use? Thank you, regards.
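Note that those Process counters measure the whole process at the OS level, so they are fairly noisy for this purpose. For the managed objects being cached, a before/after reading of the managed heap is often more direct; a minimal sketch, where FillCache() is a hypothetical helper wrapping the while-loop above:
// GC.GetTotalMemory(true) forces a full collection and returns the bytes
// currently allocated on the managed heap, which is where the cached
// strings actually live.
long before = GC.GetTotalMemory(forceFullCollection: true);
FillCache(); // hypothetical helper containing the cache-filling loop
long after = GC.GetTotalMemory(forceFullCollection: true);
Console.WriteLine("Managed heap grew by " + (after - before) + " bytes");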
I have created a form to arrange a ship hold.
Problem: on each pass through the code, the result comes back with the decimal point shifted two more places. How do I keep it from shifting further?
Code Snippet:
private void btnGetInv_Click(object sender, EventArgs e)
{
    // If this is a ship hold, roll percentile to get how much of the
    // ship's total weight capacity is being carried.
    if (ShipHold == true)
    {
        decimal Percentage = 0;
        decimal HoldWeight = 0;
        LoNum = 1;
        HiNum = 100;
        DieResult = 0;
        Percentage = DieResult + 1 + RollDice.Next(LoNum - 1, HiNum);
        WeightAvail = Percentage / 100 * WeightAvail;
        HoldWeight = WeightAvail;
        RTBItems.Text = "percentage of weight = " + HoldWeight + "\r\n" + RTBItems.Text;
    }
}
WeightAvail is also a decimal. However, LoNum, HiNum, and DieResult are all of type int to work with the Random.Next function. It might be best if I could round to the nearest whole number. I'm using Visual Studio 2019.
This was my fix:
HoldWeight = Percentage / 100 * WeightAvail;
//HoldWeight = WeightAvail;
RTBItems.Text = "percentage of weight = " + HoldWeight + "\r\n" + RTBItems.Text;
HoldWeight = 0;
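For the whole-number rounding mentioned above, Math.Round on a decimal is the standard tool; a minimal sketch reusing the Percentage and WeightAvail fields from the question:
// Math.Round on a decimal uses banker's rounding by default;
// MidpointRounding.AwayFromZero gives the schoolbook behavior.
decimal holdWeight = Percentage / 100 * WeightAvail;
holdWeight = Math.Round(holdWeight, 0, MidpointRounding.AwayFromZero);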
Sorry for the length of the post. I want to illustrate what I have tried and what I am trying to accomplish.
Essentially, I am trying to write a VOIP network tester in C#. I've written all the VOIP code using the Ozeki VOIP SIP C# SDK. The client makes a VOIP call, which the server side picks up; the client plays a WAV file and the server side records it. I've generated a tone file from audiocheck.net: a 5-second, 3000Hz sine wave at a sample rate of 8000Hz and 16 bits. This is what the client plays. I chose the frequency arbitrarily, so that can always change. I then want the server side to do a simple analysis on the recorded file to determine the amount of noise, which can be introduced through packet loss, latency, etc.
AudioProcessor.cs is a C# class that opens a WAV file and reads the header information. Since the file is 16-bit, I read each 2-byte frame into an array as a two's-complement value (thanks to http://www.codeproject.com/Articles/19590/WAVE-File-Processor-in-C). For example, I have:
0:0
1:-14321
2:17173
3:-9875
4:0
5:9875
6:-17175
7:14319
8:0
9:-14321
10:17173
11:-9875
The code is:
Console.WriteLine("Audio: Filename: " + fileName);
FileStream stream = File.Open(fileName, FileMode.Open, FileAccess.Read);
BinaryReader reader = new BinaryReader(stream);
int chunkID = reader.ReadInt32();
int fileSize = reader.ReadInt32();
int riffType = reader.ReadInt32();
int fmtID = reader.ReadInt32();
int fmtSize = reader.ReadInt32();
int fmtCode = reader.ReadInt16();
int channels = reader.ReadInt16();
int sampleRate = reader.ReadInt32();
int fmtAvgBPS = reader.ReadInt32();
int fmtBlockAlign = reader.ReadInt16();
int bitDepth = reader.ReadInt16();
if (fmtSize == 18)
{
// Read any extra values
int fmtExtraSize = reader.ReadInt16();
reader.ReadBytes(fmtExtraSize);
}
int dataID = reader.ReadInt32();
int dataSize = reader.ReadInt32();
Console.WriteLine("Audio: file size: " + fileSize.ToString());
Console.WriteLine("Audio: sample rate: " + sampleRate.ToString());
Console.WriteLine("Audio: channels: " + channels.ToString());
Console.WriteLine("Audio: bit depth: " + bitDepth.ToString());
Console.WriteLine("Audio: fmtAvgBPS: " + fmtAvgBPS.ToString());
Console.WriteLine("Audio: data id: " + dataID.ToString());
Console.WriteLine("Audio: data size: " + dataSize.ToString());
int frames = 8 * (dataSize / bitDepth) / channels;
int frameSize = dataSize / frames;
double timeLength = ((double)frames / (double)sampleRate);
Console.WriteLine("Audio: frames: " + frames.ToString());
Console.WriteLine("Audio: frame size: " + frameSize.ToString());
Console.WriteLine("Audio: Time length: " + timeLength.ToString());
// byte[] soundData = reader.ReadBytes(dataSize);
// Read the samples. Note: ReadInt16 already returns a signed
// (two's-complement) value; the extra step below negates each
// non-zero sample (~snd | 1 is approximately -snd).
short[] frameData = new short[frames];
for (int i = 0; i < frames; i++)
{
    short snd = reader.ReadInt16();
    if (snd != 0)
        snd = Convert.ToInt16(~snd | 1);
    frameData[i] = snd;
}
The next step is to calculate the amount of noise, or rather how much non-3000Hz signal is present. Based on my research I initially tried a Goertzel filter, which detects the power at a particular frequency; it appears to be used a lot to detect telephone DTMF tones. This method is an implementation I tried:
public static double Calculate(short[] samples, double freq)
{
    double s_prev = 0.0;
    double s_prev2 = 0.0;
    double coeff, normalizedfreq, power, s;
    int i;
    normalizedfreq = freq / (double)SAMPLING_RATE;
    coeff = 2.0 * Math.Cos(2.0 * Math.PI * normalizedfreq);
    for (i = 0; i < samples.Length; i++)
    {
        s = samples[i] + coeff * s_prev - s_prev2;
        s_prev2 = s_prev;
        s_prev = s;
    }
    power = s_prev2 * s_prev2 + s_prev * s_prev - coeff * s_prev * s_prev2;
    return power;
}
I call the function, passing in a 1-second sample:
short[] sampleData = new short[4000];
Array.Copy(frameData,sampleData,4000);
for (int i = 1; i < 11; i++)
{
Console.WriteLine(i * 1000 + ": " + Goertzel2.Calculate(sampleData, i * 1000));
}
The output is:
1000: 4297489869.04579
2000: 19758026000000
3000: 1.17528628051013E+15
4000: 0
5000: 1.17528628051013E+15
6000: 19758026000000
7000: 4297489869.04671
8000: 4000000
9000: 4297489869.04529
10000: 19758026000000
3000Hz seems to have the biggest number, but so does 5000Hz (frequencies above the 4000Hz Nyquist limit mirror those below it, so 5000Hz aliases onto 3000Hz, 6000Hz onto 2000Hz, and so on). I have no idea whether these numbers are accurate. If this were to work, I would run it against smaller samples, say 1/10s, in an attempt to detect variations, which I would interpret as noise.
I've also looked at a notch filter or an FFT. I'm not really sure what the best next step is. I don't need anything complex; I just want to be able to roughly calculate how much of the output WAV file is noise. As mentioned, I'm writing this in C#, but I can port code from C, C++, Python, and Java.
Edit: here is my updated code. It calculates the total power at each frequency:
// Number of frequency bins to scan, covering the range up to half the sample rate
int _frequencyGranularity = 2000;
// Number of frames to use to create a sample for the filter
int _sampleSize = 4000;
int frameCount = 0;
while (frameCount + _sampleSize < frameData.Length)
{
    // Dictionary to store the power level at a particular frequency
    Dictionary<int, double> vals = new Dictionary<int, double>(_frequencyGranularity);
    double totalPower = 0;
    for (int i = 1; i <= _frequencyGranularity; i++)
    {
        // Only process up to half of the sample rate, as this is the Nyquist limit
        // http://stackoverflow.com/questions/20864651/calculating-the-amount-of-noise-in-a-wav-file-compared-to-a-source-file
        int freq = i * wave.SampleRate / 2 / _frequencyGranularity;
        vals[freq] = Goertzel.Calculate(frameData, frameCount, _sampleSize, wave.SampleRate, freq);
        totalPower += vals[freq];
    }
    // Calculate the percentage of noise by subtracting the percentage of
    // power at the desired frequency of 3000Hz from 100.
    double frameNoisePercentage = (100 - (vals[3000] / totalPower * 100));
    logger.Debug("Frame: " + frameCount + " Noise: " + frameNoisePercentage);
    noisePercentage += frameNoisePercentage;
    frameCount += _sampleSize;
}
double averageNoise = (noisePercentage / (int)(frameCount / _sampleSize));
Updated Goertzel method:
public static double Calculate(short[] sampleData, int offset, int length, int sampleRate, double searchFreq)
{
    double s_prev = 0.0;
    double s_prev2 = 0.0;
    double coeff, normalizedfreq, power, s;
    int i;
    normalizedfreq = searchFreq / (double)sampleRate;
    coeff = 2.0 * Math.Cos(2.0 * Math.PI * normalizedfreq);
    for (i = 0; i < length; i++)
    {
        s = sampleData[i + offset] + coeff * s_prev - s_prev2;
        s_prev2 = s_prev;
        s_prev = s;
    }
    power = s_prev2 * s_prev2 + s_prev * s_prev - coeff * s_prev * s_prev2;
    return power;
}
One way to build a crude estimate of the noise is to compute the standard deviation of the peak values of the signal.
Given that you know the expected frequency, you can divide the signal into chunks of one wavelength: if your signal is 3kHz and your sample rate is 16kHz, your chunk size is 5.333 samples. For each chunk, find the highest value, then compute the standard deviation of that sequence of peaks.
Alternatively, for each chunk track the min and max values; then, over the whole sample, find the mean of the mins and of the maxes, and the range of the mins (i.e. the highest and lowest values the min takes). The SNR is then roughly (mean_max - mean_min) / min_range.
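For illustration, a minimal C# sketch of the peak-deviation idea, assuming the frameData samples and rates from the question (PeakStdDev is a hypothetical name, not part of any library):
using System;
using System.Collections.Generic;
using System.Linq;

// Split the signal into chunks of one wavelength, take the peak of each
// chunk, and return the standard deviation of those peaks. For a clean
// tone the peaks are nearly constant, so a large deviation suggests noise.
static double PeakStdDev(short[] samples, int sampleRate, double toneFreq)
{
    double samplesPerCycle = sampleRate / toneFreq; // e.g. 8000 / 3000 = 2.67
    var peaks = new List<double>();
    for (double start = 0; start + samplesPerCycle <= samples.Length; start += samplesPerCycle)
    {
        double peak = 0;
        for (int i = (int)start; i < (int)(start + samplesPerCycle); i++)
            peak = Math.Max(peak, Math.Abs((double)samples[i]));
        peaks.Add(peak);
    }
    double mean = peaks.Average();
    double variance = peaks.Sum(p => (p - mean) * (p - mean)) / peaks.Count;
    return Math.Sqrt(variance);
}
With only 2-3 samples per cycle at 8kHz the peaks jitter even for a clean tone, so chunks of several wavelengths may give a more stable estimate.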
Originally, I was using a short C# program I wrote to average some numbers. Now I want to do more extensive analysis, so I converted my C# code to R. However, I really don't think I am doing it the proper way in R or taking advantage of the language; I wrote the R exactly the same way I wrote the C#.
I have a CSV with two columns. The first column identifies the row's type (one of three values: C, E, or P) and the second column has a number. I want to average the numbers grouped on the type (C, E, or P).
My question is, what is the idiomatic way of doing this in R?
C# code:
string path = "data.csv";
string[] lines = File.ReadAllLines(path);
int cntC = 0; int cntE = 0; int cntP = 0; //counts
double totC = 0; double totE = 0; double totP = 0; //totals
foreach (string line in lines)
{
String[] cells = line.Split(',');
if (cells[1] == "NA") continue; //skip missing data
if (cells[0] == "C")
{
totC += Convert.ToDouble(cells[1]);
cntC++;
}
else if (cells[0] == "E")
{
totE += Convert.ToDouble(cells[1]);
cntE++;
}
else if (cells[0] == "P")
{
totP += Convert.ToDouble(cells[1]);
cntP++;
}
}
Console.WriteLine("C found " + cntC + " times with a total of " + totC + " and an average of " + totC / cntC);
Console.WriteLine("E found " + cntE + " times with a total of " + totE + " and an average of " + totE / cntE);
Console.WriteLine("P found " + cntP + " times with a total of " + totP + " and an average of " + totP / cntP);
R code:
dat = read.csv("data.csv", header = TRUE)
cntC = 0; cntE = 0; cntP = 0 # counts
totC = 0; totE = 0; totP = 0 # totals
for(i in 1:nrow(dat))
{
if(is.na(dat[i,2])) # missing data
next
if(dat[i,1] == "C"){
totC = totC + dat[i,2]
cntC = cntC + 1
}
if(dat[i,1] == "E"){
totE = totE + dat[i,2]
cntE = cntE + 1
}
if(dat[i,1] == "P"){
totP = totP + dat[i,2]
cntP = cntP + 1
}
}
sprintf("C found %d times with a total of %f and an average of %f", cntC, totC, (totC / cntC))
sprintf("E found %d times with a total of %f and an average of %f", cntE, totE, (totE / cntE))
sprintf("P found %d times with a total of %f and an average of %f", cntP, totP, (totP / cntP))
I would use the data.table package since it has group-by functionality built in.
library(data.table)
dat <- data.table(dat)
dat[, mean(COL_NAME_TO_TAKE_MEAN_OF), by=COL_NAME_TO_GROUP_BY]
# no quotes for the column names
If you would like to take the mean (or perform other function) on multiple columns, still by group, use:
dat[, lapply(.SD, mean), by=COL_NAME_TO_GROUP_BY]
Alternatively, if you want to use Base R, you could use something like
by(dat, dat[, 1], lapply, mean)
# to convert the results to a data.frame, use
do.call(rbind, by(dat, dat[, 1], lapply, mean) )
I would do something like this:
dat = dat[complete.cases(dat),] ## The R way to remove missing data
dat[,2] <- as.numeric(dat[,2]) ## convert to numeric as you do in c#
by(dat[,2],dat[,1],mean) ## compute the mean by group
Of course, to aggregate your results into a data.frame you can use the classic do.call(rbind, ...), but I don't think it is necessary here since the result is a list of only 3 values:
do.call(rbind,result)
EDIT1
Another option here is to use the elegant ave:
ave(dat[,2],dat[,1])
But the result is different here, in the sense that you will get a vector of the same length as your original data.
EDIT2: To include more results, you can elaborate your anonymous function:
by(dat[,2],dat[,1],function(x) c(min(x),max(x),mean(x),sd(x)))
Or return a data.frame, which is more suitable for the rbind call and has column names:
by(dat[,2],dat[,1],function(x)
data.frame(min=min(x),max=max(x),mean=mean(x),sd=sd(x)))
Or use the elegant built-in summary function (you can also define your own):
by(dat[,2],dat[,1],summary)
One way:
library(plyr)
ddply(dat, .(columnOneName), summarize, Average = mean(columnTwoName))
I'm currently using a loop in C# to generate files, but the program takes a few seconds to do this, and I feel the user would benefit from a progress bar that shows how far through the loop is, so they can estimate when it will finish and all of their files will be generated.
I was wondering whether there is a way to calculate the time a loop will take to complete, or to update a progress bar alongside the loop to show how much progress remains.
Here's my loop.
String GeneratedLevelName;
int levelx = 0;
int levely = 0;
for (int i = 0; i < GmapLevelArea; i++) {
    if (levelx >= (GmapWidth - 1)) {
        levelx = 0;
        levely++;
        GeneratedLevelName = (GmapPrefix + "_" + levelx + "_" + levely + ".nw");
        File.Copy(ApplicationDirectory + TemplateFileName, GmapDirectory + GeneratedLevelName);
        GmapFile.AppendLine('"' + GeneratedLevelName + '"' + ",");
    } else {
        levelx++;
        GeneratedLevelName = (GmapPrefix + "_" + levelx + "_" + levely + ".nw");
        File.Copy(ApplicationDirectory + TemplateFileName, GmapDirectory + GeneratedLevelName);
        GmapFile.Append('"' + GeneratedLevelName + '"' + ",");
    }
}
Any help is greatly appreciated.
Since it would be difficult to time File.Copy, I think you should just base your progress bar on the total number of files worked on.
So if you have a progress bar named pb:
pb.Minimum = 0;
pb.Maximum = GMapLevelArea;
then set the value of the progress bar to your i value:
pb.Value = i;
Now, when I use progress bars like this, I usually don't update every iteration; I update after a few iterations. To do that, I check the iteration count:
if(i % 10 == 0) //only true when i is a multiple of 10
pb.Value = i;
Then at the end of the loop, if the final count doesn't happen to be a multiple of your update interval, you can either set the progress bar to its maximum value or reset it to zero to let the user know it's done:
pb.Value = GMapLevelArea; or pb.Value = 0;
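Putting that together with the loop from the question, a minimal sketch (it assumes a WinForms ProgressBar named pb on the same form, updated from the UI thread):
pb.Minimum = 0;
pb.Maximum = GmapLevelArea;
for (int i = 0; i < GmapLevelArea; i++)
{
    // ... copy the template file as in the question ...

    if (i % 10 == 0)        // update every 10th iteration to limit UI work
        pb.Value = i;
}
pb.Value = GmapLevelArea;   // make sure the bar ends at 100%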
It depends on how consistently your files are generated (do they all take a second, or do some take longer than others?) and whether you know how many you are going to generate. In the easiest case, you can estimate progress from the number of files to generate. For example, if you create 50 files, you can increment the progress bar by two percent after each file is generated.
I can convert the PDF pages into images. If the PDF has fewer than 50 pages it works fast, but if a PDF is larger than 1000 pages it takes a lot of time to complete.
Can anyone review this code and make it work for large files?
I used the PdfLibNet DLL (it will not work in .NET 4.0) in .NET 3.5.
Here is my sample code:
public void ConverIMG(string filename)
{
    PDFWrapper wrapper = new PDFWrapper();
    wrapper.RenderDPI = Dpi;
    wrapper.LoadPDF(filename);
    int count = wrapper.PageCount;
    for (int i = 1; i <= count; i++)
    {
        string fileName = AppDomain.CurrentDomain.BaseDirectory + @"IMG\" + i.ToString() + ".png";
        wrapper.ExportJpg(fileName, i, i, (double)100, 100);
        while (wrapper.IsJpgBusy)
        {
            Thread.Sleep(50);
        }
    }
    wrapper.Dispose();
}
PS: we need to split the pages and convert them to images in parallel, and we need to get the completion status.
If PDFWrapper performance degrades for documents bigger than 50 pages, it suggests the library is not very well written. To overcome this, you could do the conversion in 50-page batches and recreate the PDFWrapper after each batch. The assumption is that ExportJpg() gets slower with the number of calls and that its initial speed does not depend on the size of the PDF.
This is only a workaround for the apparent problems in PDFWrapper; a proper solution would be to use a fixed library. I would also suggest Thread.Sleep(1) if you really need to wait while yielding.
public void ConverIMG(string filename)
{
    PDFWrapper wrapper = new PDFWrapper();
    wrapper.RenderDPI = Dpi;
    wrapper.LoadPDF(filename);
    int count = wrapper.PageCount;
    for (int i = 1; i <= count; i++)
    {
        string fileName = AppDomain.CurrentDomain.BaseDirectory + @"IMG\" + i.ToString() + ".png";
        wrapper.ExportJpg(fileName, i, i, (double)100, 100);
        while (wrapper.IsJpgBusy)
        {
            Thread.Sleep(1);
        }
        // Recreate the wrapper every 50 pages to work around the slowdown.
        if (i % 50 == 0)
        {
            wrapper.Dispose();
            wrapper = new PDFWrapper();
            wrapper.RenderDPI = Dpi;
            wrapper.LoadPDF(filename);
        }
    }
    wrapper.Dispose();
}