Making Hill-Climbing Great Again through Online Relaxation Refinement and Novelty Pruning
DOI: https://doi.org/10.1609/socs.v9i1.18460

Abstract
Delete relaxation is one of the most successful approaches to classical planning as heuristic search. The precision of these heuristics can be improved by taking some delete information into account, in particular through atomic conjunctions in the hCFF heuristic. It has recently been shown that this heuristic is especially effective when these conjunctions are learned online in a hill-climbing search algorithm. In this work, we devise a natural extension to this approach using novelty pruning, a recently developed technique that prunes states based on whether they contain facts not seen before in the search. We evaluate our extension on the IPC benchmarks, where it beats LAMA, Mercury, and Dual-BFWS on many domains.
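As a rough illustration of the novelty criterion mentioned in the abstract, the sketch below implements the simplest (width-1) variant: a state is pruned unless it contains at least one fact not seen in any previously evaluated state. The names (Fact, NoveltyTable) are illustrative assumptions, not the paper's actual implementation or data structures.

```python
from typing import Iterable, List, Set

# A ground fact, e.g. "at(truck1, loc2)"; represented as a string for simplicity.
Fact = str


class NoveltyTable:
    """Width-1 novelty pruning sketch: track facts seen so far in the search."""

    def __init__(self) -> None:
        self.seen_facts: Set[Fact] = set()

    def is_novel(self, state: Iterable[Fact]) -> bool:
        """Return True iff the state contains some fact not seen before."""
        return any(f not in self.seen_facts for f in state)

    def record(self, state: Iterable[Fact]) -> None:
        """Mark all facts of the state as seen."""
        self.seen_facts.update(state)

    def prune(self, state: Iterable[Fact]) -> bool:
        """Return True if the state should be pruned; otherwise record its facts."""
        facts: List[Fact] = list(state)
        if not self.is_novel(facts):
            return True
        self.record(facts)
        return False
```

In a search loop, newly generated states for which prune() returns True would simply be skipped; how this interacts with the online refinement of hCFF conjunctions is the subject of the paper itself and is not captured by this sketch.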
Published: 2021-09-01

Section: Short Papers