.. _linear-example:

Usage example: Automatic Configuration of the Linear MaxSAT algorithm
----------------------------------------------------------------------

Let's see how to adapt an example function so that it can be automatically configured. In particular, let's look at an OptiLog implementation of the Linear algorithm :cite:p:`le2010sat4j`, a SAT-based model-guided algorithm for Weighted MaxSAT formulas. In this algorithm, the MaxSAT optimization problem is reformulated as a sequence of SAT decision problems. Each SAT instance of the sequence encodes whether there exists an assignment with cost :math:`\leq k`, where the bound is encoded as a Pseudo-Boolean or Cardinality constraint depending on the weights of the soft constraints. SAT instances with :math:`k` less than the optimal cost are unsatisfiable; the others are satisfiable. The objective is to iteratively refine (decrease) the upper bound and to guide the search with the satisfying assignments (models) obtained from the satisfiable SAT instances.

.. code:: python
    :number-lines:

    from optilog.solvers.sat import Glucose41
    from optilog.encoders.pb import IncrementalEncoder


    def linear(
        instance, seed
    ):
        s = Glucose41()
        s.set('seed', seed)
        f = s.load_wcnf(instance)

        B, W, max_var = [], [], f.max_var()
        for w, c in f.soft_clauses:
            max_var += 1
            s.add_clause(c + [max_var])
            B += [max_var]
            W += [w]

        res, ub = True, f.top_weight()

        encoder, max_var, C = IncrementalEncoder.init(B, ub, W, max_var)
        s.add_clauses(C)

        while res is True and ub > 0:
            max_var, C = encoder.extend(ub - 1)
            s.add_clauses(C)
            res = s.solve()
            if res is True:
                ub = f.cost(s.model())
                print("o", ub)

        return ub

With OptiLog, this code can easily be adapted to become configurable through an automatic configurator.

.. code:: python
    :number-lines:

    from optilog.encoders.pb import IncrementalEncoder
    from optilog.tuning import ac, Categorical, CfgCls
    from optilog.solvers.sat import Glucose41


    @ac
    def linear(
        instance, seed,
        init_solver_fn: CfgCls(Glucose41),
        encoding: Categorical('best', 'bdd', 'card') = 'best'
    ):
        s = init_solver_fn(seed=seed)
        f = s.load_wcnf(instance)

        B, W, max_var = [], [], f.max_var()
        for w, c in f.soft_clauses:
            max_var += 1
            s.add_clause(c + [max_var])
            B += [max_var]
            W += [w]

        res, ub = True, f.top_weight()

        encoder, max_var, C = IncrementalEncoder.init(B, ub, W, max_var, encoding)
        s.add_clauses(C)

        while res is True and ub > 0:
            max_var, C = encoder.extend(ub - 1)
            s.add_clauses(C)
            res = s.solve()
            if res is True:
                ub = f.cost(s.model())
                print("o", ub)

        return ub

There are two main configurable aspects: the SAT solver parameters and the Pseudo-Boolean encoder to be used. To configure the SAT solver we use the ``CfgCls`` class, which can automatically instantiate and configure a Configurable object. Some PyiSAT solvers are configurable, like ``Glucose41`` (line 9). For the Pseudo-Boolean encoder, we add a configurable categorical parameter called ``encoding`` (line 10), which is passed to the ``init`` method of ``IncrementalEncoder`` in line 24. Line 33 prints the current suboptimum, which is the criterion that the AC tool will use to improve the performance of the function.
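To give some intuition about what the AC tool will search over, the following minimal sketch (plain Python, not part of the OptiLog API) samples one random configuration from a simplified version of this search space. Only the ``encoding`` domain mirrors the ``Categorical`` declaration above; the solver parameters shown here are hypothetical placeholders, not actual ``Glucose41`` parameters.

.. code:: python

    import random

    # Simplified, illustrative view of the search space explored by an AC tool.
    # Only 'encoding' mirrors the Categorical declaration above; the solver
    # parameters are hypothetical placeholders.
    SEARCH_SPACE = {
        "encoding": ["best", "bdd", "card"],   # categorical domain
        "solver.param_a": (0.0, 1.0),          # hypothetical continuous range
        "solver.param_b": (1, 100),            # hypothetical integer range
    }


    def sample_configuration(rng):
        """Draw one random configuration from the simplified search space."""
        cfg = {}
        for name, domain in SEARCH_SPACE.items():
            if isinstance(domain, list):
                cfg[name] = rng.choice(domain)    # pick a categorical value
            elif all(isinstance(v, int) for v in domain):
                cfg[name] = rng.randint(*domain)  # integer parameter
            else:
                cfg[name] = rng.uniform(*domain)  # continuous parameter
        return cfg


    print(sample_configuration(random.Random(42)))

An AC tool such as GGA explores configurations of this form far more systematically, guided by the quality reported by the entrypoint.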
Then, to generate the scenario for a particular AC tool (like GGA), we can use a ``GGAScenario`` object.

.. code:: python
    :number-lines:

    from optilog.tuning.configurators import GGAScenario
    from optilog.blackbox import ExecutionConstraints, RunSolver
    if __name__ == '__main__':
        configurator = GGAScenario(
            linear, input_data=["inst1.wcnf", "inst2.wcnf", ..., "instN.wcnf"],
            data_kwarg="instance", seed_kwarg="seed",
            constraints=ExecutionConstraints(
                s_wall_time=300, s_real_memory="6G", enforcer=RunSolver()),
            tuner_rt_limit=43200, run_obj="quality", quality_regex=r"^o (\d+)$",
            seed=1, cost_min=0, cost_max=(2 << 64) - 1,  # Max sum of weights for a WCNF instance
        )
        configurator.generate_scenario("./scenario")

The ``linear`` function is passed as the :ref:`entrypoint `. It is also passed inside *global_cfgcalls*, as it is a :ref:`global configurable function `. However, the ``Glucose41`` class is not passed inside *global_cfgcalls*, since in the context of the ``linear`` function it is used as a :ref:`local configurable function `.

We provide the set of *wcnf* instances that will be used to configure the entrypoint (*input_data*, line 5), the names of the *data* and *seed* parameters of the ``linear`` function (*data_kwarg* and *seed_kwarg*, line 6), and the regex used to parse the quality of an execution (*quality_regex*, line 9). Notice that multiple qualities may be printed during a single execution, but only the last one is taken into account, which is the best suboptimum found by the algorithm (see the sketch at the end of this section). Finally, we set *cost_min* and *cost_max* (line 10); *cost_max* is set to the maximum possible sum of weights of a WCNF formula to ensure that a proper penalty is applied in case of a crash or timeout. Notice also the parameters *constraints*, *tuner_rt_limit* and *seed*, which specify the resource limits enforced on each evaluation, the overall runtime budget for the tuner, and the seed of the configuration process. For more information, check GGA's documentation.

After the initialization, we can call the *generate_scenario* method to create all the files required by GGA to configure the ``linear`` function.
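To close the example, the following minimal sketch (plain Python, independent of OptiLog) illustrates how a regex like *quality_regex* extracts the reported suboptima from the output of one execution; the log lines and values are made up.

.. code:: python

    import re

    # Made-up output of one execution of `linear` on some instance.
    sample_output = """\
    c parsing instance
    o 1042
    o 987
    o 981
    """

    quality_regex = re.compile(r"^o (\d+)$", re.MULTILINE)

    # All reported suboptima, in the order they were printed.
    qualities = [int(m.group(1)) for m in quality_regex.finditer(sample_output)]

    # Only the last (and, since the upper bound decreases, best) suboptimum
    # is used to score this configuration.
    print(qualities[-1])  # 981

Since the Linear algorithm only prints strictly decreasing upper bounds, the last match is also the best suboptimum found, which is exactly the value the configurator optimizes.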