Generating 2 Random Numbers that are Different in C#

I would like to generate two random numbers that are different from each other. For example, if the first random number generates 5, I want the next random number generated to not equal 5. This is my code thus far:
Random ran = new Random();
int getRanNum1 = ran.Next(10);
int getRanNum2 = ran.Next(10);
How do I tell getRanNum2 to not equal the value generated for getRanNum1?

With a loop:
int getRanNum2 = ran.Next(10);
while (getRanNum2 == getRanNum1)
{
    getRanNum2 = ran.Next(10);
}
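If you'd rather avoid the loop entirely, a common alternative (a sketch of mine, not part of the answer above) is to draw the second number from a range that is one smaller and shift it past the first value:

Random ran = new Random();
int getRanNum1 = ran.Next(10);

// Draw from the 9 remaining values [0..8], then bump anything at or above
// the first number up by one so it can never collide with it.
int getRanNum2 = ran.Next(9);
if (getRanNum2 >= getRanNum1)
{
    getRanNum2++;
}

This stays uniform over the nine allowed values and never has to retry.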

While the loop-and-check approach is likely suitable here, a variant of this question is "How to get N distinct random numbers in a range (or set)?"
For that, one possible alternative (and one that I like, which is why I wrote this answer) is to build said range (or set), shuffle the items, and then take the first N items.
An implementation of the Fisher-Yates shuffle can be found in this answer. Slightly cleaned up for the situation:
void Shuffle(int[] arr)
{
    Random rnd = new Random();
    for (int i = arr.Length; i > 1; i--)
    {
        int pos = rnd.Next(i);
        var x = arr[i - 1];
        arr[i - 1] = arr[pos];
        arr[pos] = x;
    }
}
// usage
var numbers = Enumerable.Range(0,10).ToArray();
Shuffle(numbers);
int getRanNum1 = numbers[0];
int getRanNum2 = numbers[1];
Since we know that only N (2) elements are being selected, the Shuffle method above can actually be modified so that only N (2) swaps are done. This is because arr[i - 1], for each i, is only swapped once (although it can be swapped with itself). In any case, this is left as an exercise for the reader.
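As a rough sketch of that exercise (my own variant, which shuffles from the front and assumes you only ever need the first n positions):

// Partial Fisher-Yates: only the first n positions end up shuffled,
// which is all we need when we only take the first n items.
static void PartialShuffle(int[] arr, int n, Random rnd)
{
    for (int i = 0; i < n; i++)
    {
        int pos = i + rnd.Next(arr.Length - i); // pick from the untouched tail
        int tmp = arr[i];
        arr[i] = arr[pos];
        arr[pos] = tmp;
    }
}

// usage
var numbers = Enumerable.Range(0, 10).ToArray();
PartialShuffle(numbers, 2, new Random());
int getRanNum1 = numbers[0];
int getRanNum2 = numbers[1];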


Why multiply Random.Next() by a Constant?

I recently read an article explaining how to generate a weighted random number, and there's a piece of the code that I don't understand:
int r = ((int)(rand.Next() * (323567)) % prefix[n - 1]) + 1;
Why is rand.Next being multiplied by a constant 323567? Would the code work without this constant?
Below is the full code for reference, and you can find the full article here: https://www.geeksforgeeks.org/random-number-generator-in-arbitrary-probability-distribution-fashion/
Any help is appreciated, thank you!!
// C# program to generate random numbers
// according to given frequency distribution
using System;

class GFG
{
    // Utility function to find ceiling
    // of r in arr[l..h]
    static int findCeil(int[] arr, int r, int l, int h)
    {
        int mid;
        while (l < h)
        {
            // Same as mid = (l+h)/2
            mid = l + ((h - l) >> 1);
            if (r > arr[mid])
                l = mid + 1;
            else
                h = mid;
        }
        return (arr[l] >= r) ? l : -1;
    }

    // The main function that returns a random number
    // from arr[] according to distribution array
    // defined by freq[]. n is size of arrays.
    static int myRand(int[] arr, int[] freq, int n)
    {
        // Create and fill prefix array
        int[] prefix = new int[n];
        int i;
        prefix[0] = freq[0];
        for (i = 1; i < n; ++i)
            prefix[i] = prefix[i - 1] + freq[i];

        // prefix[n-1] is sum of all frequencies.
        // Generate a random number with
        // value from 1 to this sum
        Random rand = new Random();
        int r = ((int)(rand.Next() * (323567)) % prefix[n - 1]) + 1; // <--- RNG * Constant

        // Find index of ceiling of r in prefix array
        int indexc = findCeil(prefix, r, 0, n - 1);
        return arr[indexc];
    }

    // Driver Code
    static void Main()
    {
        int[] arr = { 1, 2, 3, 4 };
        int[] freq = { 10, 5, 20, 100 };
        int i, n = arr.Length;

        // Let us generate 10 random numbers
        // according to given distribution
        for (i = 0; i < 5; i++)
            Console.WriteLine(myRand(arr, freq, n));
    }
}
UPDATE:
I ran this code to check it:
int[] intArray = new int[] { 1, 2, 3, 4, 5 };
int[] weights = new int[] { 5, 20, 20, 40, 15 };
List<int> results = new List<int>();
for (int i = 0; i < 100000; i++)
{
    results.Add(WeightedRNG.GetRand(intArray, weights, intArray.Length));
}
for (int i = 0; i < intArray.Length; i++)
{
    int itemsFound = results.Where(x => x == intArray[i]).Count();
    Console.WriteLine($"{intArray[i]} was returned {itemsFound} times.");
}
And here are the results with the constant:
1 was returned 5096 times.
2 was returned 19902 times.
3 was returned 20086 times.
4 was returned 40012 times.
5 was returned 14904 times.
And without the constant...
1 was returned 100000 times.
2 was returned 0 times.
3 was returned 0 times.
4 was returned 0 times.
5 was returned 0 times.
It completely breaks without it.
The constant does serve a purpose in some environments, but I don't believe this code is correct for C#.
To explain, let's look at the arguments to the function. The first sign something is off is passing n as an argument instead of inferring it from the arrays. The second sign is it's poor practice in C# to deal with paired arrays rather than something like a 2D array or sequence of single objects (such as a Tuple). But those are just indicators something is odd, and not evidence of any bugs.
So let's put that on hold for a moment and explain why a constant might matter by looking at a small example.
Say you have three numbers (1, 2, and 3) with weights 3, 2, and 2. This function first builds up a prefix array, where each item includes the chances of finding the number for that index and all previous numbers.
We end up with a result like this: (3, 5, 7). Now we can use the last value and take a random number from 1 to 7. Values 1-3 result in 1, values 4 and 5 result in 2, and values 6 and 7 result in 3.
To find this random number the code now calls rand.Next(), and this is where I think the error comes in. On many platforms, the Next() function returns a double between 0 and 1. That's too small to use to look up your weighted value, so you then multiply by a prime constant related to the platform's epsilon value to ensure you have a reasonably large result that will cover the entire possible range of desired weights (in this case, 1-7) and then some. Now you take the remainder (%) against your max weight (7), and map it via the prefix array to get the final result.
So the first error is that, in C#, .Next() does not return a double. It is documented to return a non-negative random integer between 0 and int.MaxValue. Multiply that by 323567 and you're asking for integer overflow. Another sign of this mistake is the cast to int: the result of this function is already an int. And let's not even talk about the meaningless extra parentheses around (323567).
There is also another, more subtle, error.
Let's say the result of the (int)(rand.Next() * 323567) expression is 10. Now we take this value and get the remainder when dividing by our max value (% 7). The problem here is that we have two chances to roll a 1, 2, or 3 (the extra chance is if the original was 8, 9, or 10), and only one chance for the remaining weights (4-7). So we have introduced unintended bias into the system, completely throwing off the expected weights. (This is a simplification. The actual number space is not 1-10; it's 0 * 323567 to 0.999999 * 323567. But the potential for bias still exists as long as that max value is not evenly divisible by the max weight value.)
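To see that bias concretely, here is a tiny self-contained sketch (mine, not from the article) that counts remainders when a uniform 1-10 roll is reduced modulo 7:

// Uniform values 1..10 reduced modulo 7: remainders 1, 2 and 3 each have two
// source values (1/8, 2/9, 3/10), the rest only one, so they come up roughly
// twice as often.
var rand = new Random();
var counts = new int[7];
for (int i = 0; i < 700000; i++)
{
    int roll = rand.Next(1, 11); // 1..10
    counts[roll % 7]++;
}
for (int rem = 0; rem < 7; rem++)
{
    Console.WriteLine($"remainder {rem}: {counts[rem]}");
}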
It's possible the constant was chosen because it has properties to minimize this effect, but again: it was based on a completely different kind of .Next() function.
Therefore the rand.Next() line should probably be changed to look like this:
int r = rand.Next(prefix[n - 1]) + 1;
or this:
int r = ((int)(rand.NextDouble() * (323567 * prefix[n - 1])) % prefix[n - 1]) + 1;
But, given the other errors, I'd be wary of using this code at all.
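For reference, here is a minimal sketch of the same prefix-sum idea written the way I would do it in C#, letting rand.Next pick directly in the weight range (my own code, not the article's):

// Weighted pick: build cumulative weights once, then map a uniform draw in
// [0, total) onto the first prefix that exceeds it.
static int WeightedPick(int[] values, int[] weights, Random rand)
{
    int[] prefix = new int[weights.Length];
    int total = 0;
    for (int i = 0; i < weights.Length; i++)
    {
        total += weights[i];
        prefix[i] = total;
    }

    int r = rand.Next(total); // 0 .. total-1, no bias, no magic constant
    for (int i = 0; i < prefix.Length; i++)
    {
        if (r < prefix[i])
        {
            return values[i];
        }
    }
    return values[values.Length - 1]; // unreachable if all weights are positive
}

Passing the Random instance in, rather than newing one up inside the method as the article does, also avoids repeated seeds when the method is called in a tight loop on older .NET runtimes.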
For fun, here's an example running several different options:
https://dotnetfiddle.net/Y5qhRm
The original random method (using NextDouble() and a bare constant) doesn't fare as badly as I'd expect.

how to create a random int[] [duplicate]

This question already has answers here: Most efficient way to randomly "sort" (Shuffle) a list of integers in C# (13 answers). Closed 5 years ago.
I need to create a random array of int with certain parameters.
int[] test = new int[80];
Random random = new Random();
I need to assign 1's and 0's, randomly to the array. I know how to do that.
test[position] = random.Next(0, 2); // obviously with a for loop
But I need to have exactly 20 1's, and they need to be randomly positioned in the array. I don't know how to make sure that the 1's are randomly positioned AND that there are exactly 20 1's. The rest of the positions in the array would be assigned 0.
I think you need to turn your thinking around.
Consider:
var cnt = 20;
while (cnt > 0) {
    var r = random.Next(0, 80);
    if (test[r] != 1) {
        test[r] = 1;
        cnt--;
    }
}
Expanding the explanation based on comments from CodeCaster. Rather than generate a random value to place in the array, this code generates an index to set. Since C# automatically initializes the test array to 0, those values are already set; all you need to do is add your 1 values. The code generates a random index and tests whether that element isn't already 1; if so, it sets the array element and decrements a count (cnt). Once the count reaches zero the loop terminates.
This won't function properly if you need more values than 0 and 1, that is true. Of course, the question explicitly stated that these were the needed values.
"This causes horrible runtime performance". What!? Can you produce any prove of that? There is a chance that the index generated will collide with an existing entry. This chance increases as more 1's are added. Worst case is there is a 19/80 (~23%) chance of collision.
If you know you need exactly 20 of one value, a better way to go about this is to pre-populate the array with the required values and then shuffle it.
Something like this should work:
int[] array = new int[80];
for (int i = 0; i < 80; i++)
{
    int val = 0;
    if (i < 20)
    {
        val = 1;
    }
    array[i] = val;
}
Random rnd = new Random();
int[] shuffledArray = array.OrderBy(x => rnd.Next()).ToArray();
You can do
for (int i = 0; i < 20; i++)
{
    var rand = random.Next(0, 80);
    if (test[rand] == 1)
    {
        i--;
        continue;
    }
    test[rand] = 1;
}
Remaining are automatically zero.

Random selection for a variable set with a guarantee of every item at least once

I am working on a game in C#, but that detail is not really necessary to solve my problem.
At a high level, here is what I want:
I have a set that could have any number of items in it.
I want to randomly select 10 items from that set.
If the set has fewer than 10 items in it, then I expect to select the same item more than once.
I want to ensure every item is selected at least once.
What would be the algorithm for this?
Sorry I'm not sure if this is an appropriate place to ask, but I've got no pen and paper to hand and I can't quite get my head round what's needed so appreciate the help.
In addition I might also want to add weights to the items to
increase/decrease chance of selection, so if you are able to
incorporate that into your answer that would be fab.
Finally, I thought I should mention that my set is actually a List<string>, which might be relevant if you prefer to give a full answer rather than pseudocode.
This is what I use to randomize an array. It takes an integer array and randomly swaps its elements a certain number of times, determined by a random number (r).
private int[] randomizeArray(int[] i)
{
    int L = i.Length - 1;
    int c = 0;
    int r = random.Next(L); // assumes a class-level Random field named 'random'
    int prev = 0;
    int curr = 0;
    int temp;
    while (c < r)
    {
        curr = random.Next(0, L);
        if (curr != prev)
        {
            temp = i[prev];
            i[prev] = i[curr];
            i[curr] = temp;
            c++;
        }
    }
    return i;
}
If you're looking for efficient code, my answer isn't it. In theory, create a collection that mirrors your set and that you can remove from. Then select a random member from it... and remove it; this guarantees items won't repeat (while that is still possible).
Random rnd = new Random();
List<int> indexes = new List<int>(yourItems.Count);
for (int i = 0; i < yourItems.Count; i++)
    indexes.Add(i);

List<string> selectedItems = new List<string>(10);
int tmp;
for (int i = 0; i < 10; i++)
{
    tmp = rnd.Next(1, 10000); // something big
    if (indexes.Count > 0)
    {
        selectedItems.Add(yourItems[indexes[tmp % indexes.Count]]);
        indexes.RemoveAt(tmp % indexes.Count);
    }
    else
    {
        selectedItems.Add(yourItems[rnd.Next(0, yourItems.Count)]); // you ran out of unique items
    }
}
where yourItems is your source list and selectedItems is the list of selected items; you don't need to store them if you want to process them right away
Perhaps shuffle the collection and pick elements from the front until you have the required amount.
Once you've gone through all the elements, you should perhaps shuffle it again, so you don't just repeat the same sequence.
The basic algorithm for shuffling: (in pseudo-code)
for i from n − 1 downto 1 do
j ← random integer with 0 ≤ j ≤ i
exchange a[j] and a[i]
With the above algorithm (or a minor variation), it's possible to just shuffle until you reach the required number of elements, no need to shuffle the whole thing.
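A rough C# sketch of that idea (mine; it performs one Fisher-Yates step per pick and starts a fresh pass whenever the list is exhausted, so every item is seen once per pass):

static IEnumerable<string> PickWithCoverage(List<string> items, int picks, Random rnd)
{
    if (items.Count == 0)
    {
        yield break; // nothing to pick from
    }
    var order = new List<string>(items);
    int next = 0;
    for (int taken = 0; taken < picks; taken++)
    {
        if (next >= order.Count)
        {
            next = 0; // list exhausted: start another pass
        }
        // one Fisher-Yates step: choose the next element from the unvisited tail
        int j = next + rnd.Next(order.Count - next);
        string tmp = order[next];
        order[next] = order[j];
        order[j] = tmp;
        yield return order[next];
        next++;
    }
}

This guarantees every item appears at least once whenever the set has 10 or fewer items; weighting would have to be layered on separately.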

Unique number in random?

I know there are many questions on the net about this, but I would like to know why my method fails.
What am I doing wrong?
public class Generator
{
    private static readonly Random random = new Random();
    private static readonly object SyncLock = new object();

    public static int GetRandomNumber(int min, int max)
    {
        lock (SyncLock)
        {
            return random.Next(min, max);
        }
    }
}

[TestFixture]
public class Class1
{
    [Test]
    public void SimpleTest()
    {
        var numbers = new List<int>();
        for (int i = 1; i < 10000; i++)
        {
            var random = Generator.GetRandomNumber(1, 10000);
            numbers.Add(random);
        }
        CollectionAssert.AllItemsAreUnique(numbers);
    }
}
EDIT
The test method is failing! Sorry for not mentioning that.
Thanks for your time and suggestions
How can you possibly expect a sequence of 10,000 random numbers from a set of 10,000 possible values to be all unique unless you are extremely lucky? What you are expecting is wrong.
Flip a coin twice. Do you truly expect TH and HT to be the only possible sequences?
What makes you think random numbers should work any differently?
This output from a random number generator is possible:
1, 1, 1, 1, 1, 1, ..., 1
So is this:
1, 2, 3, 4, 5, 6, ..., 10000
In fact, both of those sequences are equally likely!
You seem to be under the misapprehension that the Random class generates a sequence of unique, though apparently random numbers. This is simply not the case; randomness implies that the next number could be any of the possible choices, not just any except one I've seen before.
That being the case, it is entirely unsurprising that your test fails: the probability that 10000 randomly generated integers (between 1 and 10000 no less) are unique is minuscule.
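To make "minuscule" concrete (my own back-of-the-envelope, not part of the original answer): the chance that n independent draws from n equally likely values are all distinct is n!/n^n, which for n = 10,000 is around 10^-4340.

// Probability that n draws from n equally likely values are all distinct:
// n!/n^n, computed in log10 space because the value itself underflows.
int n = 10000;
double log10P = 0;
for (int i = 0; i < n; i++)
{
    log10P += Math.Log10((double)(n - i) / n);
}
Console.WriteLine($"log10(probability) = {log10P:F1}"); // about -4340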
Random != Unique
The point here is that your code should model your problem and yours really doesn't. Random is not equal to unique. If you want unique, you need to get your set of values and shuffle them.
If you truly want random numbers, you can't expect them to be unique. If your (P)RNG offers an even distribution, then over many trials you should see similar counts of every value (see the Law of Large Numbers). Cases may show up that seem "wrong" but you can't discount that you hit that case by chance.
public static void FisherYatesShuffle<T>(T[] array)
{
    Random r = new Random();
    for (int i = array.Length - 1; i > 0; i--)
    {
        int j = r.Next(0, i + 1);
        T temp = array[j];
        array[j] = array[i];
        array[i] = temp;
    }
}

int[] array = new int[10000];
for (int i = 0; i < array.Length; i++) array[i] = i;
FisherYatesShuffle(array);
I think you failed to mention that your test method is failing.
It is failing because your random generator is not producing unique numbers. I'm not sure how it would under its current implementation.

Random playlist algorithm

I need to create a list of numbers from a range (for example from x to y) in a random order so that every order has an equal chance.
I need this for a music player I write in C#, to create play lists in a random order.
Any ideas?
Thanks.
EDIT: I'm not interested in changing the original list, just pick up random indexes from a range in a random order so that every order has an equal chance.
Here's what I've written so far:
public static IEnumerable<int> RandomIndexes(int count)
{
    if (count > 0)
    {
        int[] indexes = new int[count];
        int indexesCountMinus1 = count - 1;
        for (int i = 0; i < count; i++)
        {
            indexes[i] = i;
        }

        Random random = new Random();
        while (indexesCountMinus1 > 0)
        {
            int currIndex = random.Next(0, indexesCountMinus1 + 1);
            yield return indexes[currIndex];
            indexes[currIndex] = indexes[indexesCountMinus1];
            indexesCountMinus1--;
        }
        yield return indexes[0];
    }
}
It's working, but the only problem with this is that I need to allocate an array in memory of size count. I'm looking for something that does not require that memory allocation.
Thanks.
This can actually be tricky if you're not careful (i.e., using a naïve shuffling algorithm). Take a look at the Fisher-Yates/Knuth shuffle algorithm for proper distribution of values.
Once you have the shuffling algorithm, the rest should be easy.
Here's more detail from Jeff Atwood.
Lastly, here's Jon Skeet's implementation and description.
EDIT
I don't believe that there's a solution that satisfies your two conflicting requirements (first, to be random with no repeats and second to not allocate any additional memory). I believe you may be prematurely optimizing your solution as the memory implications should be negligible, unless you're embedded. Or, perhaps I'm just not smart enough to come up with an answer.
With that, here's code that will create an array of evenly distributed random indexes using the Knuth-Fisher-Yates algorithm (with a slight modification). You can cache the resulting array, or perform any number of optimizations depending on the rest of your implementation.
private static int[] BuildShuffledIndexArray( int size ) {
    int[] array = new int[size];
    Random rand = new Random();
    for ( int currentIndex = array.Length - 1; currentIndex > 0; currentIndex-- ) {
        int nextIndex = rand.Next( currentIndex + 1 );
        Swap( array, currentIndex, nextIndex );
    }
    return array;
}

private static void Swap( IList<int> array, int firstIndex, int secondIndex ) {
    if ( array[firstIndex] == 0 ) {
        array[firstIndex] = firstIndex;
    }
    if ( array[secondIndex] == 0 ) {
        array[secondIndex] = secondIndex;
    }
    int temp = array[secondIndex];
    array[secondIndex] = array[firstIndex];
    array[firstIndex] = temp;
}
NOTE: You can use ushort instead of int to halve the size in memory, as long as you don't have more than 65,535 items in your playlist. You could always programmatically switch to int if the size exceeds ushort.MaxValue. If I, personally, added more than 65K items to a playlist, I wouldn't be shocked by increased memory utilization.
Remember, too, that this is a managed language. The VM will always reserve more memory than you are using to limit the number of times it needs to ask the OS for more RAM and to limit fragmentation.
EDIT
Okay, last try: we can look to tweak the performance/memory trade off: You could create your list of integers, then write it to disk. Then just keep a pointer to the offset in the file. Then every time you need a new number, you just have disk I/O to deal with. Perhaps you can find some balance here, and just read N-sized blocks of data into memory where N is some number you're comfortable with.
Seems like a lot of work for a shuffle algorithm, but if you're dead-set on conserving memory, then at least it's an option.
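A very rough sketch of that disk-backed idea (mine; the file name and block size are made up, and it assumes System.IO is in scope) would be to dump the shuffled indexes once and stream them back in fixed-size blocks:

// Hypothetical sketch: persist a shuffled index list once, then read it back
// in fixed-size blocks so only one block lives in memory at a time.
const string IndexFile = "shuffle.bin"; // hypothetical path
const int BlockSize = 1024;             // indexes per block, tune to taste

static void WriteShuffledIndexes(int[] shuffled)
{
    using (var writer = new BinaryWriter(File.Create(IndexFile)))
    {
        foreach (int index in shuffled)
        {
            writer.Write(index);
        }
    }
}

static IEnumerable<int[]> ReadIndexBlocks()
{
    using (var reader = new BinaryReader(File.OpenRead(IndexFile)))
    {
        long remaining = reader.BaseStream.Length / sizeof(int);
        while (remaining > 0)
        {
            int size = (int)Math.Min(BlockSize, remaining);
            var block = new int[size];
            for (int i = 0; i < size; i++)
            {
                block[i] = reader.ReadInt32();
            }
            remaining -= size;
            yield return block;
        }
    }
}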
If you use a maximal linear feedback shift register, you will use O(1) of memory and roughly O(1) time. See here for a handy C implementation (two lines! woo-hoo!) and tables of feedback terms to use.
And here is a solution:
public class MaximalLFSR
{
    private int GetFeedbackSize(uint v)
    {
        uint r = 0;
        while ((v >>= 1) != 0)
        {
            r++;
        }
        if (r < 4)
            r = 4;
        return (int)r;
    }

    static uint[] _feedback = new uint[] {
        0x9, 0x17, 0x30, 0x44, 0x8e,
        0x108, 0x20d, 0x402, 0x829, 0x1013, 0x203d, 0x4001, 0x801f,
        0x1002a, 0x2018b, 0x400e3, 0x801e1, 0x10011e, 0x2002cc, 0x400079, 0x80035e,
        0x1000160, 0x20001e4, 0x4000203, 0x8000100, 0x10000235, 0x2000027d, 0x4000016f, 0x80000478
    };

    private uint GetFeedbackTerm(int bits)
    {
        if (bits < 4 || bits >= 28)
            throw new ArgumentOutOfRangeException("bits");
        return _feedback[bits];
    }

    public IEnumerable<int> RandomIndexes(int count)
    {
        if (count < 0)
            throw new ArgumentOutOfRangeException("count");

        int bitsForFeedback = GetFeedbackSize((uint)count);
        Random r = new Random();
        uint i = (uint)(r.Next(1, count - 1));
        uint feedback = GetFeedbackTerm(bitsForFeedback);
        int valuesReturned = 0;
        while (valuesReturned < count)
        {
            if ((i & 1) != 0)
            {
                i = (i >> 1) ^ feedback;
            }
            else
            {
                i = (i >> 1);
            }
            if (i <= count)
            {
                valuesReturned++;
                yield return (int)(i - 1);
            }
        }
    }
}
Now, I selected the feedback terms (badly) at random from the link above. You could also implement a version that had multiple maximal terms and you select one of those at random, but you know what? This is pretty dang good for what you want.
Here is test code:
static void Main(string[] args)
{
    while (true)
    {
        Console.Write("Enter a count: ");
        string s = Console.ReadLine();
        int count;
        if (Int32.TryParse(s, out count))
        {
            MaximalLFSR lfsr = new MaximalLFSR();
            foreach (int i in lfsr.RandomIndexes(count))
            {
                Console.Write(i + ", ");
            }
        }
        Console.WriteLine("Done.");
    }
}
Be aware that maximal LFSRs never generate 0. I've hacked around this by returning the i term - 1. This works well enough. Also, since you want to guarantee uniqueness, I ignore anything out of range - the LFSR only generates sequences up to powers of two, so in high ranges, it will generate worst case 2x - 1 too many values. These will get skipped - that will still be faster than Fisher-Yates/Knuth.
Personally, for a music player, I wouldn't generate a shuffled list, and then play that, then generate another shuffled list when that runs out, but do something more like:
IEnumerable<Song> GetSongOrder(List<Song> allSongs)
{
    var random = new Random();
    var playOrder = new List<Song>();
    while (true)
    {
        // this step assigns an integer weight to each song, corresponding to
        // how likely it is to be played next: recently played songs get a low
        // weight, everything else a high one.
        // in a better implementation, this would look at the total number of
        // songs as well, and provide a smoother ramp up/down.
        var weights = allSongs
            .Select(x => playOrder.LastIndexOf(x) > playOrder.Count - 10 ? 1 : 50)
            .ToList();
        int position = random.Next(weights.Sum());
        foreach (int i in Enumerable.Range(0, allSongs.Count))
        {
            position -= weights[i];
            if (position < 0)
            {
                var song = allSongs[i];
                playOrder.Add(song);
                yield return song;
                break;
            }
        }
        // trim playOrder to prevent unbounded memory growth here as well.
        if (playOrder.Count > allSongs.Count * 10)
            playOrder = playOrder.Skip(allSongs.Count * 8).ToList();
    }
}
This would make songs get picked at random, as long as they haven't been played recently. This provides "smoother" transitions from the end of one shuffle to the next, because the first song of the next shuffle could be the same song as the last shuffle with 1/(total songs) probability, whereas this algorithm has a lower (and configurable) chance of hearing one of the last x songs again.
Unless you shuffle the original song list (which you said you don't want to do), you are going to have to allocate some additional memory to accomplish what you're after.
If you generate the random permutation of song indices beforehand (as you are doing), you obviously have to allocate some non-trivial amount of memory to store it, either encoded or as a list.
If the user doesn't need to be able to see the list, you could generate the random song order on the fly: After each song, pick another random song from the pool of unplayed songs. You still have to keep track of which songs have already been played, but you can use a bitfield for that. If you have 10000 songs, you just need 10000 bits (1250 bytes), each one representing whether the song has been played yet.
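A minimal sketch of that bitfield idea (mine, using System.Collections.BitArray rather than a hand-rolled bitfield):

// Tracks which songs have been played using one bit per song (~1250 bytes for
// 10,000 songs) and picks a uniformly random unplayed index on demand.
class UnplayedPicker
{
    private readonly BitArray played;
    private readonly Random rnd = new Random();
    private int remaining;

    public UnplayedPicker(int songCount)
    {
        played = new BitArray(songCount);
        remaining = songCount;
    }

    public int NextIndex()
    {
        if (remaining == 0)
        {
            played.SetAll(false); // every song heard: start a new cycle
            remaining = played.Count;
        }
        // Skip a random number of unplayed songs, then take the next one.
        int skip = rnd.Next(remaining);
        for (int i = 0; i < played.Count; i++)
        {
            if (!played[i] && skip-- == 0)
            {
                played[i] = true;
                remaining--;
                return i;
            }
        }
        throw new InvalidOperationException("unreachable");
    }
}

The pick itself is O(song count) because it walks the bits, which is the trade-off for keeping the state down to one bit per song.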
I don't know your exact limitations, but I have to wonder if the memory required to store a playlist is significant compared to the amount required for playing audio.
There are a number of methods of generating permutations without needing to store the state. See this question.
I think you should stick to your current solution (the one in your edit).
To do a re-order with no repetitions without making your code behave unreliably, you have to track what you have already used, either by keeping unused indexes or indirectly by swapping from the original list.
I suggest checking it in the context of the working application, i.e. whether it is of any significance vs. the memory used by other pieces of the system.
From a logical standpoint, it is possible. Given a list of n songs, there are n! permutations; if you assign each permutation a number from 1 to n! (or 0 to n!-1 :-D) and pick one of those numbers at random, you can then store the number of the permutation that you are currently using, along with the original list and the index of the current song within the permutation.
For example, if you have a list of songs {1, 2, 3}, your permutations are:
0: {1, 2, 3}
1: {1, 3, 2}
2: {2, 1, 3}
3: {2, 3, 1}
4: {3, 1, 2}
5: {3, 2, 1}
So the only data I need to track is the original list ({1, 2, 3}), the current song index (e.g. 1) and the index of the permutation (e.g. 3). Then, if I want to find the next song to play, I know it's the third (index 2, zero-based) song of permutation 3, i.e. Song 1.
However, this method relies on you having an efficient means of determining the ith song of the jth permutation, which, until I've had a chance to think (or someone with a stronger mathematical background than I can interject), is equivalent to "then a miracle happens". But the principle is there.
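For what it's worth, the "miracle" has a standard name: unranking a permutation via the factorial number system (a Lehmer code). Here is a sketch of mine, only practical for small lists since 21! already overflows a long:

// Returns the element at position 'pos' of the 'rank'-th permutation (in
// lexicographic order) of 'items', without building the whole permutation.
static T PermutationElement<T>(IList<T> items, long rank, int pos)
{
    var pool = new List<T>(items);
    int n = pool.Count;

    // factorials of the remaining pool sizes
    var fact = new long[n];
    fact[0] = 1;
    for (int i = 1; i < n; i++)
    {
        fact[i] = fact[i - 1] * i;
    }

    for (int i = 0; i < n; i++)
    {
        long f = fact[n - 1 - i];
        int index = (int)(rank / f); // which remaining element comes next
        rank %= f;
        T chosen = pool[index];
        pool.RemoveAt(index);
        if (i == pos)
        {
            return chosen;
        }
    }
    throw new ArgumentOutOfRangeException("pos");
}

// e.g. PermutationElement(new[] { 1, 2, 3 }, 3, 2) == 1, matching permutation
// 3 = {2, 3, 1} in the list above.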
If memory only became a concern after a certain number of records, and it's safe to say that once that boundary is reached there are enough items in the list that a few repeats don't matter (as long as the same song isn't repeated twice in a row), I would use a combination method.
Case 1: If count < max memory constraint, generate the playlist ahead of time and use Knuth shuffle (see Jon Skeet's implementation, mentioned in other answers).
Case 2: If count >= max memory constraint, the song to be played will be determined at run time (I'd do it as soon as the song starts playing so the next song is already generated by the time the current song ends). Save the last [max memory constraint, or some token value] number of songs played, generate a random number (R) between 1 and song count, and if R = one of X last songs played, generate a new R until it is not in the list. Play that song.
Your max memory constraints will always be upheld, although performance can suffer in case 2 if you've played a lot of songs/get repeat random numbers frequently by chance.
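A rough sketch of case 2 (mine; the history size and the re-roll rule are assumptions, and the history must be smaller than the song count or the loop never ends):

// Case 2: bounded memory regardless of library size. Keep only the last few
// played indexes and re-roll if the candidate is among them.
static int NextSongIndex(int songCount, Queue<int> recentlyPlayed, int historySize, Random rnd)
{
    int candidate;
    do
    {
        candidate = rnd.Next(songCount);
    }
    while (recentlyPlayed.Contains(candidate));

    recentlyPlayed.Enqueue(candidate);
    if (recentlyPlayed.Count > historySize)
    {
        recentlyPlayed.Dequeue();
    }
    return candidate;
}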
You could use a trick we use in SQL Server to order sets randomly, with the help of a GUID. The values are always distributed fairly evenly.
private IEnumerable<int> RandomIndexes(int startIndexInclusive, int endIndexInclusive)
{
    if (endIndexInclusive < startIndexInclusive)
        throw new Exception("endIndex must be equal or higher than startIndex");

    List<int> originalList = new List<int>(endIndexInclusive - startIndexInclusive);
    for (int i = startIndexInclusive; i <= endIndexInclusive; i++)
        originalList.Add(i);

    return from i in originalList
           orderby Guid.NewGuid()
           select i;
}
You're going to have to allocate some memory, but it doesn't have to be a lot. You can reduce the memory footprint (by how much I'm not sure, as I don't know that much about the guts of C#) by using a bool array instead of int. Best case scenario this will only use (count / 8) bytes of memory, which isn't too bad (but I doubt C# actually represents bools as single bits).
public static IEnumerable<int> RandomIndexes(int count) {
    Random rand = new Random();
    bool[] used = new bool[count];
    int i;
    for (int counter = 0; counter < count; counter++) {
        while (used[i = rand.Next(count)]) ; // i = some random unused value
        used[i] = true;
        yield return i;
    }
}
Hope that helps!
As many others have said you should implement THEN optimize, and only optimize the parts that need it (which you check on with a profiler). I offer a (hopefully) elegant method of getting the list you need, which doesn't really care so much about performance:
using System;
using System.Collections.Generic;
using System.Linq;

namespace Test
{
    class Program
    {
        static void Main(string[] a)
        {
            Random random = new Random();
            List<int> list1 = new List<int>(); // source list (assumed to be populated elsewhere)
            List<int> list2 = new List<int>();
            list2 = random.SequenceWhile(
                i =>
                {
                    if (list2.Contains(i))
                    {
                        return true; // already picked: skip it
                    }
                    list2.Add(i);
                    return false; // new value: keep it
                },
                () => list2.Count < list1.Count,
                list1.Count).ToList();
        }
    }

    public static class RandomExtensions
    {
        public static IEnumerable<int> SequenceWhile(
            this Random random,
            Func<int, bool> shouldSkip,
            Func<bool> continuationCondition,
            int maxValue)
        {
            int current = random.Next(maxValue);
            while (continuationCondition())
            {
                if (!shouldSkip(current))
                {
                    yield return current;
                }
                current = random.Next(maxValue);
            }
        }
    }
}
It is pretty much impossible to do it without allocating extra memory. If you're worried about the amount of extra memory allocated, you could always pick a random subset and shuffle between those. You'll get repeats before every song is played, but with a sufficiently large subset I'll warrant few people will notice.
const int MaxItemsToShuffle = 20;

public static IEnumerable<int> RandomIndexes(int count)
{
    Random random = new Random();
    int indexCount = Math.Min(count, MaxItemsToShuffle);
    int[] indexes = new int[indexCount];
    if (count > MaxItemsToShuffle)
    {
        // selection sampling: walk the full range once and keep each index
        // with probability (slots still needed) / (items still remaining)
        int cur = 0, subsetCount = MaxItemsToShuffle;
        for (int i = 0; i < count; i += 1)
        {
            if (random.NextDouble() < ((float)subsetCount / (float)(count - i)))
            {
                indexes[cur] = i;
                cur += 1;
                subsetCount -= 1;
            }
        }
    }
    else
    {
        for (int i = 0; i < count; i += 1)
        {
            indexes[i] = i;
        }
    }

    for (int i = indexCount; i > 0; i -= 1)
    {
        int curIndex = random.Next(0, i);
        yield return indexes[curIndex];
        indexes[curIndex] = indexes[i - 1];
    }
}
