Improve solver performance #60

Open
opened 2020-08-04 11:49:28 +02:00 by FWDekker · 9 comments
Owner

The solver is currently rather slow, which is a real issue when using the "ensure solvability" feature. Given a field with w=30,h=16,m=99,s=b, clicking in the top-right corner causes a 2+ second delay before a map is shown. Instead, this should take no more than, say, 250ms. Meanwhile, increasing the number of mines to 170 freezes the field entirely; I've never seen it find a map before I gave up. Increasing the field to 99x99 with 2021 mines is unthinkable right now, but shouldn't be. (For comparison, Tatham's game generates a solvable 99x99;2021 game well within a second.)

Related to #58, since that issue should be able to increase the performance of the field drastically. Also, I should consider looking at Tatham's solver to see how it works.

Additionally, solving #14 may improve the performance of finding a solvable map by increasing the number of maps that can be solved.

Author
Owner

It is important to note that the most expensive operation when finding a good field is the `solveBinary` function of the solver. In particular, the `solve` function takes approximately 75% of the time, and of that the `rref` function takes 85%. Improving that performance (or avoiding invocation of that function altogether) is what matters most; making other things, such as the event structures, work better is far less important.
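For reference when profiling: `rref` presumably performs standard Gauss–Jordan reduction on the constraint matrix (one row per constraint, with the right-hand side as the last column). Below is a minimal dense sketch of that operation; the names are hypothetical and this is not the project's actual code.

```typescript
// Reduced row echelon form (rref) of an augmented matrix.
// Each row is [coefficients..., rhs]. Dense, no pivoting strategy.
function rref(rows: number[][]): number[][] {
  const m = rows.map(r => r.slice()); // work on a copy
  const nRows = m.length;
  const nCols = m[0].length;
  let pivotRow = 0;
  for (let col = 0; col < nCols - 1 && pivotRow < nRows; col++) {
    // Find a row with a non-zero entry in this column.
    let r = pivotRow;
    while (r < nRows && m[r][col] === 0) r++;
    if (r === nRows) continue;
    [m[pivotRow], m[r]] = [m[r], m[pivotRow]];
    // Normalise the pivot row.
    const pivot = m[pivotRow][col];
    for (let c = col; c < nCols; c++) m[pivotRow][c] /= pivot;
    // Eliminate this column from all other rows.
    for (let other = 0; other < nRows; other++) {
      if (other === pivotRow || m[other][col] === 0) continue;
      const factor = m[other][col];
      for (let c = col; c < nCols; c++) m[other][c] -= factor * m[pivotRow][c];
    }
    pivotRow++;
  }
  return m;
}
```

Each pivot step touches up to rows × cols entries, so the whole reduction is cubic in the worst case, which is consistent with it dominating the profile.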
FWDekker added this to the v1.0.0 milestone 2020-08-13 13:19:50 +02:00
Author
Owner

This issue will be closed once a 30x16 170-mine field can be opened consistently within 1000ms.

FWDekker self-assigned this 2020-08-13 13:23:44 +02:00
Author
Owner

[This paper](https://dash.harvard.edu/bitstream/handle/1/14398552/BECERRA-SENIORTHESIS-2015.pdf) is interesting in its discussion of different approaches for solvers. However, I'm not sure that improving solver performance is as necessary as reducing the number of attempts; this could be done by prematurely filtering out unusable fields before trying to solve them, without requiring matrix solving.

Alternatively/additionally, I could look into using WebAssembly to speed up the solver.
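As a sketch of what such premature filtering could look like (the representation, names, and threshold are all hypothetical, not part of the current code): reject a candidate field whose first click opens only a tiny region, before ever building a constraint matrix.

```typescript
// Hypothetical pre-filter: a first click that opens almost nothing
// rarely leads to a guess-free game, so skip the expensive solver.
type Field = { width: number; height: number; mines: boolean[] };

function neighbours(f: Field, i: number): number[] {
  const x = i % f.width, y = Math.floor(i / f.width);
  const out: number[] = [];
  for (let dy = -1; dy <= 1; dy++)
    for (let dx = -1; dx <= 1; dx++) {
      if (dx === 0 && dy === 0) continue;
      const nx = x + dx, ny = y + dy;
      if (nx >= 0 && nx < f.width && ny >= 0 && ny < f.height)
        out.push(ny * f.width + nx);
    }
  return out;
}

// Flood fill as Minesweeper would: expand through cells with zero
// adjacent mines, and include their numbered border cells.
function openedRegionSize(f: Field, start: number): number {
  if (f.mines[start]) return 0;
  const seen = new Set<number>([start]);
  const queue = [start];
  while (queue.length > 0) {
    const cell = queue.pop()!;
    const adj = neighbours(f, cell);
    if (adj.some(n => f.mines[n])) continue; // numbered cell: stop here
    for (const n of adj) if (!seen.has(n)) { seen.add(n); queue.push(n); }
  }
  return seen.size;
}

function worthSolving(f: Field, firstClick: number, minOpen = 9): boolean {
  return openedRegionSize(f, firstClick) >= minOpen;
}
```

The flood fill is linear in the field size, so even a weak filter like this would be far cheaper than one matrix solve.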
Author
Owner

See also [this paper](http://www.cs.toronto.edu/~cvs/minesweeper/minesweeper.pdf).
Author
Owner

Since it's the matrix operations that are slow, maybe I could find a way of rewriting the matrix solver so that it doesn't actually mutate the matrix, but instead computes entries on demand by keeping track of the row operations that have previously been applied.
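One way to sketch that idea: keep the original matrix immutable and represent each working row as a linear combination of original rows, so row operations only update small coefficient vectors while entries are computed on demand. Purely illustrative with made-up names; whether this is actually faster would need benchmarking, since reading an entry becomes linear in the number of rows.

```typescript
// The base matrix is never mutated; each current row is stored as a
// vector of weights over the original rows.
class LazyMatrix {
  // coeffs[i][k] = weight of original row k in current row i
  private coeffs: number[][];

  constructor(private base: number[][]) {
    this.coeffs = base.map((_, i) =>
      base.map((_, k) => (i === k ? 1 : 0)));
  }

  // Entry (i, j) of the current matrix, computed on demand.
  entry(i: number, j: number): number {
    return this.coeffs[i].reduce(
      (sum, w, k) => sum + w * this.base[k][j], 0);
  }

  // row[i] += factor * row[src], expressed on the coefficient vectors only.
  addRow(i: number, src: number, factor: number): void {
    this.coeffs[i] = this.coeffs[i].map(
      (w, k) => w + factor * this.coeffs[src][k]);
  }
}
```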
Author
Owner

As a comment on the above, I should check whether my matrices are sparse (probably yes).

I could test what happens if I use an `rref` algorithm specialised for sparse matrices, and/or use a library that implements sparse matrices. I could use something such as [vectorious](https://github.com/mateogianolio/vectorious) or [vectorious-plus](https://www.npmjs.com/package/vectorious-plus), and rewrite my `rref` code to work with that. (Or keep the old code to compare against later in tests.)

If that doesn't work, I should see if there are special algorithms for sparse matrices. [This lecture seems interesting.](https://www-users.cse.umn.edu/~saad/csci8314/FILES/LecN6.pdf)
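A sketch of what a sparse row representation could look like, for comparison against the dense code (hypothetical names, not any library's API): store each row as a map from column index to non-zero value, so that elimination only visits entries that actually exist.

```typescript
// A sparse row: column index -> non-zero value. Zeros are never stored.
type SparseRow = Map<number, number>;

function fromDense(row: number[]): SparseRow {
  const m: SparseRow = new Map();
  row.forEach((v, c) => { if (v !== 0) m.set(c, v); });
  return m;
}

// target -= factor * source, visiting only source's non-zero entries.
function eliminate(target: SparseRow, source: SparseRow, factor: number): void {
  for (const [col, v] of source) {
    const next = (target.get(col) ?? 0) - factor * v;
    if (next === 0) target.delete(col); // keep the row sparse
    else target.set(col, next);
  }
}
```

In a minesweeper constraint matrix each row has at most 8 non-zero coefficients (one per neighbour), so the per-elimination cost drops from the field width to a small constant; the open question is whether fill-in during reduction eats that advantage.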
Author
Owner

I tried out Vectorious. Unfortunately, it only slows down the code; now it can't even solve an expert-size field. I should seriously consider contacting Tatham to ask him how he did it.
Author
Owner

I should look into filtering out unsolvable fields, as I noted in a comment above, and as suggested by Jorrit. This means the solver doesn't get faster, but the `canSolve` function can say "no" quicker.
Author
Owner

Filtering out is definitely a good idea, but I'm not sure I'll be able to properly define and identify these cases efficiently. It's quite a tough job.

That said, an alternative method may be, when an unsolvable case arises, to move one or more mines into the currently unplayed part and see if the field then becomes solvable.
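The mine-moving fallback could be sketched roughly as follows; all names, parameters, and the solvability callback are hypothetical, and a real version would have to pick the mine and the destination more carefully.

```typescript
// When the solver gets stuck, relocate a mine from the stuck frontier
// into the unplayed area and retry, instead of regenerating the field.
function tryRepair(
  mines: Set<number>,
  frontier: number[],    // covered cells the solver is stuck on
  unplayed: number[],    // cells far from anything revealed
  isSolvable: (mines: Set<number>) => boolean,
  maxMoves = 10,
): boolean {
  for (let attempt = 0; attempt < maxMoves; attempt++) {
    const from = frontier.find(c => mines.has(c));
    const to = unplayed.find(c => !mines.has(c));
    if (from === undefined || to === undefined) return false;
    mines.delete(from); // move one mine out of the stuck region...
    mines.add(to);      // ...into the unplayed part
    if (isSolvable(mines)) return true;
  }
  return false;
}
```

Each retry reuses everything solved so far, so even a few iterations should be much cheaper than generating and re-solving a fresh field from scratch.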
Reference: tools/minesweeper#60