I spent my entire Friday night mucking around in C++ because I think I saw people in an anime playing Concentration, and I wondered: how long would it take you if you went full YOLO and picked cards completely at random? (Also, I can't do the math to figure it out.)
I think the anime was One Week Friends, but it’s 2 in the morning right now.
The conditions were as follows:
- A grid with an even number of total spaces is laid out; maximum 8×8 on this toaster of a ThinkPad
- Each symbol appears exactly twice, so the number of distinct symbols is half the total spaces
- The cards are distributed randomly across the grid
- Two cards are drawn at random; if they match, those cards are discarded, counters are incremented, and the player picks again; if not, the cards stay on the grid
- A single card deal is played 1000 times with no knowledge of prior games (read: RNG gods)
- It is assumed a human playing this way would take approximately 4 seconds per turn
I played grids in even increments from 4×4 to 8×8. I tried to do 1 million trials per game, but Windows just gave up and crashed the program every time. The results are shown below:
Looks exponential to me. Mission accomplished.
I really think this would’ve been loads easier in Python, but I don’t have that set up on this Windows machine right now. My next steps are to try various algorithms to see how much faster they are. Or really, whatever I figure out how to code past midnight.
UPDATE: My friend Mike, who actually knows statistics, showed me that the expected number of moves to match n symbols at random is n^2. This lines up perfectly with the data for large numbers of moves. Thanks Mike!
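Mike's n^2 figure also has a neat back-of-envelope justification (my own reasoning, not Mike's): with 2k cards still on the grid, the first card you flip is arbitrary and the second matches it with probability 1/(2k−1), so clearing that pair takes 2k−1 turns on average. Summing over all n pairs:

```latex
\mathbb{E}[\text{turns}] = \sum_{k=1}^{n} (2k-1) = n^2
```

For the 8×8 grid that's n = 32 symbols, or 1024 expected turns — at the assumed 4 seconds per turn, a human playing at random would grind away for roughly 68 minutes.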