Usage example: Automatic Configuration of the Linear MaxSAT algorithm

Let’s see how to adapt an example function so that it can be configured automatically.

In particular, let’s see an OptiLog implementation of the Linear algorithm [LBP10], a SAT-based model-guided algorithm for Weighted MaxSAT formulas. In this algorithm, the MaxSAT optimization problem is reformulated into a sequence of SAT decision problems. Each SAT instance of the sequence encodes whether there exists an assignment with a cost $$\leq k$$, encoded as a Pseudo-Boolean or Cardinality constraint depending on the weights of the soft constraints. SAT instances with a $$k$$ less than the optimal cost are unsatisfiable, the others being satisfiable. The objective is to iteratively refine (decrease) the upper bound and guide the search with satisfying assignments (models) obtained from satisfiable SAT instances.
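
To make the bound-refinement idea concrete, here is a minimal, self-contained sketch of the model-guided loop on a toy weighted instance, with a brute-force oracle standing in for the SAT call (the instance and all names are illustrative, not part of OptiLog):

```python
from itertools import product

# Toy weighted instance over variables 1..2: (weight, clause) pairs.
# Clause [-1, -2] conflicts with [1] and [2], so cost 0 is unreachable.
soft = [(3, [1]), (4, [2]), (2, [-1, -2])]

def cost(assign):
    # Sum of the weights of the falsified soft clauses.
    return sum(w for w, c in soft
               if not any(assign[abs(l)] == (l > 0) for l in c))

def sat_leq(k):
    # Brute-force stand-in for the SAT call: return an assignment
    # ("model") with cost <= k, or None if the query is unsatisfiable.
    for bits in product([False, True], repeat=2):
        assign = {v + 1: b for v, b in enumerate(bits)}
        if cost(assign) <= k:
            return assign
    return None

# Model-guided linear search on the upper bound.
res, ub = True, sum(w for w, _ in soft)  # trivial initial upper bound
while res and ub > 0:
    model = sat_leq(ub - 1)              # is there a strictly better model?
    res = model is not None
    if res:
        ub = cost(model)                 # refine the bound with the model
        print("o", ub)

print("optimum:", ub)  # prints "o 7", "o 3", "o 2", then "optimum: 2"
```

Each satisfiable query tightens the bound to the cost of the model it returns, so the sequence of printed suboptima is strictly decreasing until an unsatisfiable query proves optimality.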

from optilog.formulas import loaders
from optilog.solvers.sat import Glucose41
from optilog.encoders.pb import IncrementalEncoder

def linear(instance, seed):
    s = Glucose41()
    s.set('seed', seed)
    f = loaders.load_wcnf(instance)
    s.add_clauses(f.hard_clauses)

    B, W, max_var = [], [], f.max_var()

    for w, c in f.soft_clauses:
        # relax each soft clause with a fresh auxiliary variable
        max_var += 1
        s.add_clause(c + [max_var])
        B += [max_var]
        W += [w]

    res, ub = True, f.top_weight()
    encoder, max_var, C = IncrementalEncoder.init(B, ub, W, max_var)
    s.add_clauses(C)

    while res is True and ub > 0:
        max_var, C = encoder.extend(ub - 1)
        s.add_clauses(C)
        res = s.solve()
        if res is True:
            ub = f.cost(s.model())
            print("o", ub)

    return ub


This code can be easily adapted with OptiLog to be configurable through an automatic configurator.

from optilog.formulas import loaders
from optilog.encoders.pb import IncrementalEncoder

from optilog.tuning import ac, Categorical, CfgCls
from optilog.solvers.sat import Glucose41

@ac
def linear(
    instance, seed,
    init_solver_fn: CfgCls(Glucose41),
    encoding: Categorical('best', 'adder', 'seqcounter') = 'best'
):
    s = init_solver_fn(seed=seed)
    f = loaders.load_wcnf(instance)
    s.add_clauses(f.hard_clauses)

    B, W, max_var = [], [], f.max_var()

    for w, c in f.soft_clauses:
        # relax each soft clause with a fresh auxiliary variable
        max_var += 1
        s.add_clause(c + [max_var])
        B += [max_var]
        W += [w]

    res, ub = True, f.top_weight()
    encoder, max_var, C = IncrementalEncoder.init(B, ub, W, max_var, encoding)
    s.add_clauses(C)

    while res is True and ub > 0:
        max_var, C = encoder.extend(ub - 1)
        s.add_clauses(C)
        res = s.solve()
        if res is True:
            ub = f.cost(s.model())
            print("o", ub)

    return ub


There are two main configurable aspects: the SAT solver parameters and the Pseudo-Boolean encoding to be used.

To configure the SAT solver we use the CfgCls class, which can automatically instantiate and configure a Configurable object. Some PyiSAT solvers, such as Glucose41, are configurable; here the configured solver is injected through the init_solver_fn parameter.

For the Pseudo-Boolean encoder, we add a configurable categorical parameter called encoding, which is passed to the init method of IncrementalEncoder.

The print call reports the current suboptimum ("o &lt;cost&gt;" lines), which is the criterion the AC tool will use to improve the performance of the function.

Then, to generate the scenario for a particular AC tool (like GGA), we can use a GGAConfigurator object.

from optilog.tuning.configurators import GGAConfigurator
from optilog.blackbox import ExecutionConstraints

configurator = GGAConfigurator(
    linear, global_cfgcalls=[linear], runsolver_path="./runsolver",
    input_data=["inst1.wcnf", "inst2.wcnf", ..., "instN.wcnf"],
    data_kwarg="instance", seed_kwarg="seed",
    constraints=ExecutionConstraints(
        wall_time=300,
        memory='6G'
    ),
    tuner_rt_limit=43200, run_obj="quality",
    quality_regex=r"^o (\d+)$",
    seed=1, cost_min=0, cost_max=(2 << 64) - 1,  # max sum of weights for a WCNF instance
)
configurator.generate_scenario("./scenario")


The linear function is passed as entrypoint. It is also passed inside global_cfgcalls, as it is a global configurable function. However, the Glucose41 class is not passed inside global_cfgcalls, as in the context of the linear function it is used as a local configurable function.

We provide the set of WCNF instances that will be used to configure the entrypoint (input_data), the names of the data and seed parameters of the linear function (data_kwarg and seed_kwarg), and the regex used to parse the quality of an execution (quality_regex). Notice that multiple qualities may be printed during a run, but only the last one is taken into account, which is the best suboptimum the algorithm has found.
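
This last-match behaviour can be reproduced with Python's standard re module; the log below is a made-up example of what an execution of the linear function might print:

```python
import re

# Hypothetical execution log: each satisfiable SAT call prints a new,
# smaller suboptimum on its own "o" line.
log = """c parsing instance
o 42
o 17
o 9
s SATISFIABLE
"""

# Same pattern as quality_regex; re.MULTILINE makes ^/$ match per line.
matches = re.findall(r"^o (\d+)$", log, flags=re.MULTILINE)
quality = int(matches[-1])  # only the last (best) suboptimum counts

print(quality)  # 9
```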

Finally, we set cost_min to 0 and cost_max to the maximum possible sum of weights in a WCNF formula, so that a proper penalty is applied in case of a crash or timeout.
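
As an illustration, a tighter cost_max could be derived from the instance itself. The sketch below sums the soft weights of a toy formula in the classic DIMACS WCNF format, where a clause whose weight equals the header's top weight is hard; this format detail is an assumption of the sketch, not something the scenario above requires:

```python
# Toy formula in the classic DIMACS WCNF format:
# header "p wcnf <vars> <clauses> <top>"; a clause whose first number
# (its weight) equals <top> is hard, any smaller weight marks it soft.
wcnf = """c toy instance
p wcnf 2 3 100
100 1 2 0
3 -1 0
4 -2 0
"""

top, soft_sum = None, 0
for line in wcnf.splitlines():
    if line.startswith("p wcnf"):
        top = int(line.split()[4])
    elif line and not line.startswith("c"):
        w = int(line.split()[0])
        if w < top:
            soft_sum += w  # soft clause: contributes to the worst case

print(soft_sum)  # 7: any penalty cost >= 7 covers a crash or timeout
```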

The constraints, tuner_rt_limit and seed parameters indicate, respectively, the execution limits for each evaluation, the overall time budget for the optimization and the seed to use. For more information, check GGA’s documentation.

After the initialization, we can call the generate_scenario method to create all the files required by GGA to configure the linear function.