
SCIENTIA SINICA Informationis, Volume 49, Issue 6: 685 (2019) https://doi.org/10.1360/N112017-00248

A partial maximum satisfiability method combining structural characteristics for diagnosis problems

  • Received: Jul 23, 2018
  • Accepted: Nov 1, 2018
  • Published: Jun 6, 2019

Abstract

Partial MaxSAT (PMS) is a generalization of the maximum satisfiability problem (MaxSAT) with important applications in many fields. Existing PMS algorithms still need improvement when solving industrial instances. Building on a detailed study of PMS algorithms based on random search, this paper presents a structure-characteristics partial MaxSAT (SCPMS) method that combines random search with the exploration of structural features. First, according to the unit propagation rule and the structural features of the problem, the PMS clauses are gradually divided into two parts, constructing a satisfiable subproblem that contains no hard unit clauses. On this basis, a strategy for guiding the random search is proposed: unit propagation on the new subproblem identifies the hard blocking variables associated with the hard unit clauses of the original problem, and the corresponding soft blocking variables are then flipped according to the clause structure, which improves the efficiency of the random search. Experimental results show that, compared with the two most recent algorithms, DeciDist and DistUP, the proposed SCPMS greatly reduces the number of unsatisfied soft clauses when solving model-based diagnosis instances.
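
As a concrete illustration of the PMS setting described above, the following Python sketch evaluates a toy instance by brute force: every hard clause must be satisfied, and the number of falsified soft clauses is minimized. The encoding and the exhaustive search are purely illustrative and are not part of SCPMS.

from itertools import product

# Toy PMS instance over variables 1..3. A clause is a list of literals;
# literal i means x_i and -i means NOT x_i.
hard = [[1, 2], [-1, 3]]        # must hold in every feasible assignment
soft = [[-2], [3], [-1, -3]]    # each falsified soft clause costs 1

def satisfied(clause, assignment):
    # True if at least one literal of the clause is satisfied.
    return any(assignment[abs(lit)] == (lit > 0) for lit in clause)

best = None
for bits in product([False, True], repeat=3):
    assignment = {i + 1: b for i, b in enumerate(bits)}
    if not all(satisfied(c, assignment) for c in hard):
        continue                # infeasible: some hard clause is falsified
    cost = sum(not satisfied(c, assignment) for c in soft)
    if best is None or cost < best[0]:
        best = (cost, assignment)

print(best)                     # optimal cost and one optimal assignment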


Funded by

National Natural Science Foundation of China (Grant Nos. 61872159, 61672261, 61502199)




    Algorithm 1 Decimation($F$)

    Require: A CNF formula $F$;

    Output: An assignment of variables in $F$;

    while $\exists$ unassigned variables do

    if $\exists$ unit clause then

    if $\exists$ hard unit clause then

    Hard_unit_clause_Propagation();

    else

    if $\exists$ soft unit clause then

    Soft_unit_clause_Propagation();

    end if

    end if

    else

    $v$ $\leftarrow$ a random unassigned variable;

    Assign($v$);

    end if

    end while
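
The following Python sketch mirrors the decimation loop of Algorithm 1 under a simple literal-list encoding. Helper names such as unit_literal and find_unit are illustrative, and the handling of conflicts between contradictory unit clauses is omitted.

import random

def unit_literal(clause, assignment):
    # Return the single unassigned literal of a unit, not-yet-satisfied
    # clause under the partial assignment, or None otherwise.
    unassigned = []
    for lit in clause:
        val = assignment.get(abs(lit))
        if val is None:
            unassigned.append(lit)
        elif val == (lit > 0):
            return None                      # clause already satisfied
    return unassigned[0] if len(unassigned) == 1 else None

def find_unit(clauses, assignment):
    for c in clauses:
        lit = unit_literal(c, assignment)
        if lit is not None:
            return lit
    return None

def decimation(hard, soft, num_vars):
    # Algorithm 1: propagate hard unit clauses first, then soft unit
    # clauses, and otherwise assign a random unassigned variable.
    assignment = {}
    while len(assignment) < num_vars:
        lit = find_unit(hard, assignment) or find_unit(soft, assignment)
        if lit is not None:
            assignment[abs(lit)] = (lit > 0)     # satisfy the unit clause
        else:
            v = random.choice([x for x in range(1, num_vars + 1)
                               if x not in assignment])
            assignment[v] = random.choice([False, True])
    return assignment

print(decimation(hard=[[1, 2], [-1, 3]], soft=[[-2], [3]], num_vars=3))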

    Table 1   Comparison of solvers (#Win. = number of instances won by each solver)

    Benchmark   #Inst.   DistUP #Win.   DeciDist #Win.   SCPMS #Win.   WPM3 #Win.
    b14         26       0              0                18            12
    b15         6        0              0                1             6
    b17         26       0              0                8             26
    b20         64       0              0                18            58
    b21         120      0              0                21            115
    b22         55       0              0                8             55

    Algorithm 2 DeciDist: a local search algorithm

    Require: PMS instance $F$, cutoff, wp, $t$, and sp;

    Output: A feasible assignment $\alpha$ of $F$, or “no feasible assignment found”;

    $\alpha$ $\leftarrow~$ Decimation($F$);

    cost$\ast$ $\leftarrow~$ $+\infty$;

    while elapsed time $<~$ cutoff do

    if $\nexists$ falsified hard clauses $\&~$ cost($\alpha$) $<~$ cost$\ast$ then

    $\alpha\ast$ $\leftarrow~$ $\alpha$;

    cost$\ast$ $\leftarrow~$ cost($\alpha$);

    end if

    if $H$ $\leftarrow~$ $\{$ $x$ $\mid$ hscore($x$) $>~$ 0 $\}$ $\ne~$ $\emptyset$ then

    $v$ $\leftarrow~$ a variable in $H$ with the greatest hscore, breaking ties randomly;

    else

    if $S$ $\leftarrow~$ $\{$ $x$ $\mid$ hscore($x$) = 0 $\&~$ sscore($x$) $>~$ 0 $\}$ $\ne~$ $\emptyset$ then

    $v$ $\leftarrow~$ a variable in $S$ with the greatest sscore, breaking ties randomly;

    else

    Update weights of hard clauses according to HPAWS;

    if $\exists$ falsified hard clauses then

    $c$ $\leftarrow~$ a random falsified hard clause;

    else

    $c$ $\leftarrow~$ a random falsified soft clause;

    end if

    if with probability wp then

    $v$ $\leftarrow~$ a random variable in $c$;

    else

    $v$ $\leftarrow~$ a variable in $c$ with the greatest sscore;

    end if

    end if

    end if

    $\alpha$ $\leftarrow~$ $\alpha$ with $v$ flipped;

    end while

    if cost$\ast$ $\le$ SUMWeight(soft clauses) then

    return (cost$\ast$, $\alpha\ast$);

    else

    return “no feasible assignment found”;

    end if
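
The heart of one search step in Algorithm 2 is the score-guided choice of the variable to flip. The sketch below reproduces only that selection logic under stated assumptions: hscore and sscore are dictionaries supplied by the caller, the HPAWS weight update and the tie-breaking used in practice are omitted, and all names are illustrative.

import random

def pick_flip_var(variables, hscore, sscore,
                  falsified_hard, falsified_soft, wp=0.1):
    # Prefer variables that improve the hard cost, then variables that keep
    # the hard cost and improve the soft cost; otherwise diversify by taking
    # a variable from a random falsified clause (hard clauses first).
    # Assumes at least one clause is falsified when no improving variable exists.
    H = [x for x in variables if hscore[x] > 0]
    if H:
        return max(H, key=lambda x: hscore[x])
    S = [x for x in variables if hscore[x] == 0 and sscore[x] > 0]
    if S:
        return max(S, key=lambda x: sscore[x])
    # Hard clause weights would be updated here according to HPAWS.
    clause = random.choice(falsified_hard or falsified_soft)
    candidates = [abs(lit) for lit in clause]
    if random.random() < wp:
        return random.choice(candidates)
    return max(candidates, key=lambda x: sscore[x])

A full implementation would recompute hscore and sscore incrementally after each flip and stop when the cutoff time is reached.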

    Table 2   Number of soft blocking variables per benchmark (maximum, minimum, and mean)

    Benchmark   #Inst.   Maximum   Minimum   Mean
    b14         26       33        20        25
    b15         6        37        28        30
    b17         26       40        21        38
    b20         64       39        22        32
    b21         120      39        21        27
    b22         55       40        22        33

    Algorithm 3 CSV($F$)

    Require: A CNF formula $F$;

    Output: choosedSVQueue, a queue of soft blocking variables;

    while DPLL($F$) has a conflict do

    New CNF formula $F$ $\leftarrow~$ chooseHClausedel(CNF, DelHClause);

    end while

    Vars $\leftarrow~$ DPLL($F$);

    choosedHVQueue $\leftarrow~$ Compare(DelHClause, Vars);

    while choosedHVQueue is not NULL do

    $v$ $\leftarrow~$ choosedHVQueue.pop();

    sv $\leftarrow~$ ChooseSClauseVar($v$);

    choosedSVQueue(sv);

    end while

    return choosedSVQueue;
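
A rough Python sketch of the first half of CSV (Algorithm 3) is given below: hard clauses are removed one at a time until the remainder is satisfiable, and the variables of the removed clauses that the resulting assignment leaves falsified are reported as hard blocking variables. A brute-force check stands in for DPLL, the clause-selection heuristic chooseHClausedel is not reproduced, and the mapping from hard blocking variables to soft clause variables (ChooseSClauseVar) is omitted.

from itertools import product

def brute_force_sat(clauses, num_vars):
    # Tiny stand-in for DPLL: return a satisfying assignment or None.
    for bits in product([False, True], repeat=num_vars):
        a = {i + 1: b for i, b in enumerate(bits)}
        if all(any(a[abs(lit)] == (lit > 0) for lit in c) for c in clauses):
            return a
    return None

def csv_hard_blocking_vars(hard, num_vars):
    # Delete hard clauses until the remainder is satisfiable, then collect
    # the variables of deleted clauses falsified by the satisfying assignment.
    remaining, deleted = list(hard), []
    model = brute_force_sat(remaining, num_vars)
    while model is None:
        deleted.append(remaining.pop())      # clause-selection heuristic omitted
        model = brute_force_sat(remaining, num_vars)
    blocked = set()
    for clause in deleted:
        if not any(model[abs(lit)] == (lit > 0) for lit in clause):
            blocked.update(abs(lit) for lit in clause)
    return model, blocked

# x1 and NOT x1 cannot both hold, so at least one hard clause must be dropped.
print(csv_hard_blocking_vars([[1], [-1], [1, 2]], num_vars=2))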


    Algorithm 4 SCPMS($F$)

    Require: A CNF formula $F$;

    Output: An assignment of variables;

    $Q$ $\leftarrow~$ CSV($F$);

    while $Q$ is not NULL do

    var $\leftarrow~$ $Q$.pop();

    Flip(var);

    end while

    while unpropagated Hvar do

    UnitPropagate(Hvar);

    end while

    while unpropagated Svar do

    UnitPropagate(Svar);

    end while

    while $\exists$ a var unassigned do

    Assign var with a random value from $\{0,1\}$;

    end while

    LocalSearch();
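
The initialization stage of Algorithm 4 can be summarized as in the sketch below, under the reading that Flip assigns each selected soft blocking variable against the polarity of its soft unit clause. The parameters hard_implied and soft_implied stand for the literals that unit propagation would subsequently derive; computing them, and the local search that follows, are not shown, and all names are illustrative.

import random

def scpms_initial_assignment(num_vars, soft_blocking_lits,
                             hard_implied, soft_implied):
    # 1. Flip the soft blocking variables chosen by CSV.
    # 2. Fix the values implied by unit propagation on hard clauses, then soft.
    # 3. Assign every remaining variable at random.
    # The resulting assignment is the starting point of the local search.
    assignment = {}
    for lit in soft_blocking_lits:
        assignment[abs(lit)] = not (lit > 0)       # assign against the soft unit clause
    for lit in hard_implied:
        assignment.setdefault(abs(lit), lit > 0)   # hard unit propagation first
    for lit in soft_implied:
        assignment.setdefault(abs(lit), lit > 0)   # then soft unit propagation
    for v in range(1, num_vars + 1):
        assignment.setdefault(v, random.choice([False, True]))
    return assignment

print(scpms_initial_assignment(5, soft_blocking_lits=[2],
                               hard_implied=[1, -3], soft_implied=[4]))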
