Common use of Relevance Filtering Clause in Contracts

Relevance Filtering. The relevance filter heuristics we considered do not work out of the box; their parameters need to be adjusted carefully. The major design decision was how to carry out this fine-tuning. We started with an ad-hoc benchmark containing models from several problem domains and aimed to maximize the number of automatically discharged proof obligations on this benchmark while minimizing the time spent proving. We experimented with different filter configurations, i.e., combinations of heuristics, heuristic parameters, provers (PP, newPP, or ML), and prover timeouts. Finally, the parameters and timeouts were chosen such that

• the number of automatically discharged proof obligations is almost maximal among all considered filter configurations, and
• decreasing the timeouts would significantly decrease the number of automatically discharged proof obligations.

To rebut criticism of overfitting, we tested the final filter configuration on a validation benchmark (based on deployment partners' models) that was chosen independently of the benchmark used for fine-tuning. We observed that, compared to not using relevance filtering, the final filter configuration significantly increases the number of automatically discharged proof obligations on the validation benchmark.
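The selection criteria above can be read as a simple search over filter configurations. The following is a minimal sketch of that idea, assuming a hypothetical run_benchmark function that returns the number of automatically discharged proof obligations for a given configuration; all names (Config, run_benchmark, slack) are illustrative and not part of the project's tooling.

# Hypothetical sketch of the described tuning procedure: enumerate filter
# configurations (heuristic, parameter, prover, timeout) on the tuning benchmark,
# then pick a configuration that nearly maximizes discharged proof obligations
# while preferring the smallest timeout among those near-optimal configurations.
from dataclasses import dataclass
from itertools import product
from typing import Callable, Iterable

@dataclass(frozen=True)
class Config:
    heuristic: str      # relevance filter heuristic
    parameter: float    # heuristic parameter (illustrative single value)
    prover: str         # "PP", "newPP", or "ML"
    timeout_s: int      # prover timeout in seconds

def tune(run_benchmark: Callable[[Config], int],
         heuristics: Iterable[str], parameters: Iterable[float],
         provers: Iterable[str], timeouts: Iterable[int],
         slack: int = 2) -> Config:
    """Return a configuration whose discharge count is within `slack` of the
    best observed count, preferring the smallest prover timeout among them."""
    results = {}
    for h, par, p, to in product(heuristics, parameters, provers, timeouts):
        cfg = Config(h, par, p, to)
        results[cfg] = run_benchmark(cfg)  # discharged proof obligations
    best = max(results.values())
    near_optimal = [c for c, n in results.items() if n >= best - slack]
    # Among near-optimal configurations, take the smallest timeout: decreasing
    # it further would significantly reduce the number of discharged POs.
    return min(near_optimal, key=lambda c: c.timeout_s)

The final configuration returned by such a search would then be re-run once on an independently chosen validation benchmark, as the clause describes, to check that the improvement over no relevance filtering is not an artifact of tuning.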

Appears in 2 contracts

Sources: Grant Agreement, Grant Agreement