I really just wanted to write this down somewhere:
1000 instances at n=64 take up 7.9 MiB.
So 1 million would take up about 7.71 GiB (1000 × 7.9 MiB = 7900 MiB ≈ 7.71 GiB).
And a million is just a tiny ass drop in the bucket.
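Quick sanity check on that scaling, in Python (nothing measured here, just unit conversion on the 7.9 MiB figure above):

import math  # not strictly needed, just keeping this a runnable snippet

# Back-of-envelope scaling from 7.9 MiB for 1000 instances at n=64.
per_instance_kib = 7.9 * 1024 / 1000    # ~8.1 KiB per instance
million_gib = 7.9 * 1000 / 1024         # 1,000,000 instances, in GiB
print(f"{per_instance_kib:.1f} KiB/instance, {million_gib:.2f} GiB per million")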
Notes (or: how do I generate difficult instances of 3-SAT?):
According to The State of SAT by Henry Kautz and Bart Selman, a 700-variable problem took a couple of months of compute time back in... the early 2000s? Who doesn't date a paper? This must be a draft.
Two existing (and boring) algorithms for SAT are:
http://en.wikipedia.org/wiki/WalkSAT
http://en.wikipedia.org/wiki/DPLL_algorithm
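For my own reference, here's roughly how WalkSAT works as I read the description linked above -- a minimal sketch, not anyone's reference implementation. Clauses are lists of nonzero ints, DIMACS-style, with a negative literal meaning a negated variable.

import random

def walksat(clauses, n_vars, max_flips=100_000, p=0.5):
    # Random initial assignment: assignment[v] = truth value of variable v.
    assignment = {v: random.choice([True, False]) for v in range(1, n_vars + 1)}
    sat = lambda lit: assignment[abs(lit)] == (lit > 0)
    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return assignment                 # satisfying assignment found
        clause = random.choice(unsat)         # pick some broken clause
        if random.random() < p:
            var = abs(random.choice(clause))  # random-walk move
        else:
            # Greedy move: flip whichever variable leaves the fewest broken clauses.
            def broken_if_flipped(v):
                assignment[v] = not assignment[v]
                n_broken = sum(1 for c in clauses if not any(sat(l) for l in c))
                assignment[v] = not assignment[v]
                return n_broken
            var = min((abs(l) for l in clause), key=broken_if_flipped)
        assignment[var] = not assignment[var]
    return None                               # gave up; proves nothing about UNSAT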
The difficulties around creating proof trees mentioned in The State of SAT give me hope--modifying my algorithm to produce proof trees as it runs would be simple if I cared to do it, although the proof trees would not be minimal.
Ok...most of the rest of the paper is boring. Looks like everyone thinks P != NP.
Oooh, finally, something about hard instance generation:
http://www.is.titech.ac.jp/~watanabe/gensat/
Shit. Dead links. Well, at least I know "something about a factorization problem."
Here is Hard Instances for 3-SAT by idk-who, in an atrocious .ps file intended for slides. Still more people thinking "hard" is simply m = n*f.
Perhaps what I should really do is measure how running time is affected by the constant factor X, where m = X*n. Pretty graphs! Yay!
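Something like this is what I have in mind: plain uniform random 3-SAT with m = X*n clauses, timed across a sweep of X. The walksat() call is the sketch from above (swap in whatever solver), and n=64 / 20 instances per point are arbitrary choices; the hard region is reportedly somewhere around X ≈ 4.26, which is the phase-transition number everyone quotes.

import random, time

def random_3sat(n, X, rng=random):
    # m = X*n clauses over n variables; each clause is 3 distinct variables,
    # each negated with probability 1/2 (plain uniform random 3-SAT).
    m = round(X * n)
    clauses = []
    for _ in range(m):
        chosen = rng.sample(range(1, n + 1), 3)
        clauses.append([v if rng.random() < 0.5 else -v for v in chosen])
    return clauses

n = 64
for X in (3.0, 3.5, 4.0, 4.26, 4.5, 5.0):     # hard region reportedly near 4.26
    start = time.perf_counter()
    for _ in range(20):                       # average over a handful of instances
        walksat(random_3sat(n, X), n)         # walksat() from the sketch above
    print(f"X={X}: {(time.perf_counter() - start) / 20:.3f} s/instance")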
Also checked out "Algorithms for Random 3-SAT" by Abraham D. Flaxman, Microsoft Research. It pointed me to Where the Really Hard Problems Are.