On the Compilability and Expressive Power of Propositional Planning Formalisms

The recent approaches of extending the GRAPHPLAN algorithm to handle more expressive planning formalisms raise the question of what the formal meaning of "expressive power" is. We formalize the intuition that expressive power is a measure of how concisely planning domains and plans can be expressed in a particular formalism by introducing the notion of "compilation schemes" between planning formalisms. Using this notion, we analyze the expressiveness of a large family of propositional planning formalisms, ranging from basic STRIPS to a formalism with conditional effects, partial state specifications, and propositional formulae in the preconditions. One of the results is that conditional effects cannot be compiled away if plan size should grow only linearly but can be compiled away if we allow for polynomial growth of the resulting plans. This result confirms that the recently proposed extensions to the GRAPHPLAN algorithm concerning conditional effects are optimal with respect to the "compilability" framework. Another result is that general propositional formulae cannot be compiled into conditional effects if the plan size should be preserved linearly. This implies that allowing general propositional formulae in preconditions and effect conditions adds another level of difficulty in generating a plan.

There appears to be a consensus on how much expressive power is added by a particular language feature. For example, everybody seems to agree that adding negative preconditions does not add very much to the expressive power of basic STRIPS, whereas conditional effects are considered a significant increase in expressive power (Anderson et al., 1998; Gazen & Knoblock, 1997; Kambhampati et al., 1997; Koehler et al., 1997). However, it is unclear how to measure expressive power in a more formal way. Related to this problem is the question of whether "compilation" approaches to extending the expressiveness of a planning formalism are optimal. For example, Gazen and Knoblock (1997) propose a particular method of compiling operators with conditional effects into basic STRIPS operators. This method, however, results in exponentially larger operator sets. While most people (Anderson et al., 1998; Kambhampati et al., 1997; Koehler et al., 1997) agree that we cannot do better than that, nobody has yet proven that a more space-efficient method is impossible.
© 2000 AI Access Foundation and Morgan Kaufmann Publishers. All rights reserved.
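The exponential blow-up of such a compilation can be made concrete with a small sketch. The following Python fragment is an illustration, not the original implementation: the encoding of negation with a "-" prefix, the function names, and the restriction to single-literal effect conditions are simplifying assumptions. It expands one operator with n conditional effects into 2^n unconditional operators, one per subset of effects assumed active:

```python
from itertools import chain, combinations

def neg(lit):
    # Flip the sign of a literal; negation is encoded with a "-" prefix.
    return lit[1:] if lit.startswith("-") else "-" + lit

def powerset(xs):
    xs = list(xs)
    return chain.from_iterable(combinations(xs, r) for r in range(len(xs) + 1))

def compile_conditional_op(pre, cond_effects):
    """Expand one operator whose conditional effects are
    (condition_literal, effect_literal) pairs into unconditional
    operators: one operator per subset of effects assumed active."""
    ops = []
    for active in powerset(cond_effects):
        inactive = [ce for ce in cond_effects if ce not in active]
        new_pre = set(pre)
        new_pre |= {c for c, _ in active}          # active conditions must hold
        new_pre |= {neg(c) for c, _ in inactive}   # blocked conditions must fail
        new_eff = {e for _, e in active}
        ops.append((frozenset(new_pre), frozenset(new_eff)))
    return ops

ops = compile_conditional_op({"p"}, [("c1", "e1"), ("c2", "e2"), ("c3", "e3")])
```

Already with three conditional effects the expansion yields eight unconditional operators, which is the exponential growth in the operator set discussed above.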

NEBEL
In order to address the problem of measuring the relative expressive power of planning formalisms, we start with the intuition that a formalism X is at least as expressive as another formalism Y if planning domains and the corresponding plans in formalism Y can be concisely expressed in formalism X. This, at least, seems to be the underlying intuition when expressive power is discussed in the planning literature. Bäckström (1995) proposed to measure the expressiveness of planning formalisms using his ESP-reductions. These reductions are, roughly speaking, polynomial many-one reductions on planning instances that do not change the plan length. Using this notion, he showed that all of the propositional variants of basic STRIPS not containing conditional effects or arbitrary logical formulae can be considered as expressively equivalent. However, taking our point of view, ESP-reductions are too restrictive for two reasons. Firstly, plans must have identical size, while we might want to allow a moderate growth. Secondly, requiring that the transformation can be computed in polynomial time is overly restrictive. If we ask how concisely something can be expressed, this does not necessarily imply that there exists a polynomial-time transformation. In fact, one formalism might be as expressive as another one, but the mapping between the formalisms might not be computable at all. This, at least, seems to be the usual assumption made when the term expressive power is discussed (Baader, 1990; Cadoli, Donini, Liberatore, & Schaerf, 1996; Erol, Hendler, & Nau, 1996; Gogic, Kautz, Papadimitriou, & Selman, 1995).
Inspired by recent approaches to measure the expressiveness of knowledge representation formalisms (Cadoli et al., 1996;Gogic et al., 1995), we propose to address the questions of how expressive a planning formalism is by using the notion of compiling one planning formalism into another one. A compilation scheme from one planning formalism to another differs from a polynomial many-one reduction in that it is not required that the compilation is carried out in polynomial time. However, the result should be expressible in polynomial space. Furthermore, it is required that the operators of the planning instance can be translated without considering the initial state and the goal. While this restriction might sound unnecessarily restrictive, it turns out that existing practical approaches to compilation (Gazen & Knoblock, 1997) as well as theoretical approaches (Bäckström, 1995) consider only structured transformations where the operators can be transformed independently from the initial state and the goal description. From a technical point of view this restriction guarantees that compilations are non-trivial. If the entire instance could be transformed, a compilation scheme could decide the existence of a plan for the source instance and then generate a small solution-preserving instance in the target formalism, which would lead to the unintuitive conclusion that all planning formalisms have the same expressive power.
As mentioned in the beginning, not only the space taken up by the domain structure is important, but also the space used by the plans. For this reason, we distinguish compilation schemes by whether they preserve plan size exactly, linearly, or polynomially.
Using the notion of compilability, we analyze a wide range of propositional planning formalisms, ranging from basic STRIPS to a planning formalism containing conditional effects, arbitrary boolean formulae, and partial state specifications. As one of the results, we identify two equivalence classes of planning formalisms with respect to polynomial-time compilability preserving plan size exactly. This means that adding a language feature to a formalism without leaving the class does not increase the expressive power and should not affect the principal efficiency of the planning method. However, we also provide results that separate planning formalisms using results from computational complexity theory on circuit complexity and non-uniform complexity classes. Such separation results indicate that adding a particular language feature adds to the expressive power and to the difficulty of integrating the feature into an existing planning algorithm. For example, we prove that conditional effects cannot be compiled away and that boolean formulae cannot be compiled into conditional effects, provided the plans in the target formalism are allowed to grow only linearly.
This answers the question posed in the beginning. The compilation approach proposed by Gazen and Knoblock (1997) cannot be more space efficient, even if we allow for linear growth of the plans in the target formalism. Allowing for polynomial growth of the plans, however, the compilation scheme can be more space efficient. Interestingly, a compilation scheme that allows for polynomially larger plans appears similar to the implementation of conditional effects in the IPP system (Koehler et al., 1997), in Kambhampati and colleagues' (1997) planning system, and in Anderson and colleagues' (1998) planning system.
The rest of the paper is structured as follows. In Section 2, we introduce the range of propositional planning formalisms analyzed in this paper together with general terminology and definitions. Based on that, we introduce the notion of compilability between planning formalisms in Section 3. In Section 4 we present polynomial-time compilation schemes between different formalisms that preserve the plan size exactly, demonstrating that these formalisms are of identical expressiveness. For all of the remaining cases, we prove in Section 5 that there cannot be any compilation scheme preserving plan size linearly, even if there are no bounds on the computational resources of the compilation process. In Section 6 we reconsider the question of identical expressiveness by using compilation schemes that allow for polynomial growth of the plans. Finally, in Section 7 we summarize and discuss the results.

Propositional Planning Formalisms
First, we will define a very general propositional planning formalism, which is almost as expressive as the propositional variant of ADL (Pednault, 1989). This formalism allows for arbitrary boolean formulae as preconditions, conditional effects, and partial state specifications. Subsequently, we will specialize this formalism by imposing different syntactic restrictions.
2. Note that Gazen and Knoblock's (1997) translation scheme also generates planning operators that depend on the initial state and the goal description. However, these operators simply code the initial state and the goal description and do nothing else. For this reason, we can ignore them here.

A General Propositional Planning Formalism
A state is a truth assignment to the atoms; in the following, we also identify a state with the set of atoms that are true in this state. A state specification is a set of literals over the atoms, and an operator consists of a precondition and a set of (possibly conditional) effects.

Example 1 In order to illustrate the various notions, we will use as a running example planning problems connected with the production of camera-ready manuscripts from LaTeX source files, somewhat simplified, of course.

In words, if the precondition of the operator is satisfied in a state s and the active effects are consistent, then s is mapped to the state s' which differs from s in that the atoms of positive active effects are forced to become true and the atoms of negative active effects are forced to become false. If the precondition is not satisfied or the set of active effects is inconsistent, the result of the state-transition function is undefined. In the planning formalism itself, we do not work on states but on state specifications. In general, this can lead to semantic problems. By restricting ourselves to state specifications that are sets of literals, however, the syntactic manipulations of the state specifications can be defined in a way such that they are sound in Lifschitz' (1986) sense. Using this notion, one can easily prove, using induction over the plan length, that any plan for an instance is sound in Lifschitz' (1986) sense, i.e., corresponds to the application of state-transition functions to the initial states.
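The state-transition semantics described here can be sketched in Python. This is only an illustrative model (states as sets of true atoms, negation encoded with a "-" prefix, `None` standing for "undefined"); the atom and operator names in the usage example are invented in the spirit of the LaTeX running example:

```python
def satisfied(cond, state):
    # A literal "a" holds iff atom a is in the state; "-a" holds iff it is not.
    return all((l[1:] not in state) if l.startswith("-") else (l in state)
               for l in cond)

def apply_op(state, pre, cond_effects):
    """State-transition function sketch: cond_effects is a list of
    (condition, effect_literal) pairs; unconditional effects use an
    empty condition.  Returns None where the function is undefined."""
    if not satisfied(pre, state):
        return None
    active = [eff for cond, eff in cond_effects if satisfied(cond, state)]
    adds = {l for l in active if not l.startswith("-")}
    dels = {l[1:] for l in active if l.startswith("-")}
    if adds & dels:              # inconsistent set of active effects
        return None
    return (state - dels) | adds

# Hypothetical operator: running latex on an up-to-date source yields a dvi
# file, unless the bibliography changed (then the dvi is invalidated).
state = frozenset({"source-ok"})
result = apply_op(state, {"source-ok"}, [((), "dvi"), (("bib-changed",), "-dvi")])
```

Note how the two failure modes of the definition (unsatisfied precondition, inconsistent active effects) both map to `None`.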

Proposition 2 Let
be a planning instance and is undefined.

A Family of Propositional Planning Formalisms
The propositional variant of standard STRIPS (Fikes & Nilsson, 1971), which we will also call S in what follows, is a planning formalism that requires complete state specifications, unconditional effects, and propositional atoms as formulae in the precondition lists. Less restrictive planning formalisms can have the following additional features:

Incomplete state specifications (I):
The state specifications need not be complete.

Literals as formulae (L):
The formulae in preconditions and effect conditions can be literals.

Boolean formulae (B):
The formulae in preconditions and effect conditions can be arbitrary boolean formulae.

Conditional effects (C):
The effects of operators can be conditional, i.e., an effect becomes active only if its associated effect condition is satisfied.
These extensions can also be combined, and we will use combinations of letters to refer to such multiple extensions; for instance, S_LB denotes the extension of S by literals and boolean formulae.

Figure 1: Planning formalisms partially ordered by syntactic restrictions

Figure 1 displays the partial order on propositional planning formalisms defined in this way. In the sequel we say that a formalism X is a specialization of a formalism Y if X appears below Y in the diagram depicting the partial order. Comparing this set of planning formalisms with the one Bäckström (1995) analyzed, one notices that, despite small differences in the presentation of the planning formalisms:
» S is the same as common propositional STRIPS (CPS),
» S_L is the same as propositional STRIPS with negative goals (PSN), and
» S_IL is the same as ground TWEAK (GT).
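One simple way to model the specialization order of Figure 1 is to represent each formalism by its set of feature letters. The sketch below is an illustration; the only modeling assumption is that boolean formulae subsume literals as precondition formulae, so a formalism offering B implicitly offers L:

```python
def closure(features):
    """Close a feature set: boolean formulae (B) subsume literals (L)
    as precondition and effect-condition formulae."""
    fs = set(features)
    if "B" in fs:
        fs.add("L")
    return fs

def is_specialization(x, y):
    # X is a specialization of Y iff X offers no feature that Y lacks.
    return closure(x) <= closure(y)
```

For example, basic STRIPS (the empty feature set) is a specialization of every other formalism, and S_L is a specialization of S_B, but S_C is incomparable to any formalism without conditional effects.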

The S_IBC Family
While one would expect that planning in S is much easier than planning in S_IBC, it turns out that this is not the case, provided one takes a computational complexity perspective.
In analyzing the computational complexity of planning in different formalisms, we consider, as usual, the problem of deciding whether there exists a plan for a given instance-the plan existence problem (PLANEX). We will use a prefix referring to the planning formalism if we consider the existence problem in a particular planning formalism.
5. We do not consider planning formalisms identical to the SAS+ formalism (Bäckström & Nebel, 1995), since we do not allow for multi-valued state variables.

Proof. PSPACE-hardness of S-PLANEX follows from a result by Bylander (1994, Corollary 3.2). Membership of S_IBC-PLANEX in PSPACE follows because we could, step by step, guess a sequence of operators, verifying at each step that the operator application leads to a legal follow-up state specification and that the last operator application leads to a state specification that entails the goal specification. For each step, this verification can be carried out in polynomial space, because all the conditions in the definition of operator application can be verified by polynomially many calls to an NP-oracle. Therefore, S_IBC-PLANEX can be decided on a non-deterministic machine in polynomial space, hence it is a member of PSPACE.
From that it follows that the plan existence problem for all formalisms that lie between S and S_IBC in expressiveness, including both formalisms, is PSPACE-complete.
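To make the decision problem concrete, here is a naive explicit-state search for basic STRIPS plan existence. This is only an illustrative sketch: the breadth-first search below may use exponential space in the worst case, whereas the PSPACE membership argument above instead guesses the operator sequence nondeterministically, keeping only one state specification in memory at a time:

```python
from collections import deque

def plan_exists(init, goal, ops):
    """Explicit-state breadth-first search for basic STRIPS plan existence.
    ops are (precondition, add, delete) triples of atom sets; states are
    sets of atoms that are true."""
    start = frozenset(init)
    seen = {start}
    queue = deque([start])
    while queue:
        state = queue.popleft()
        if set(goal) <= state:          # goal atoms all true
            return True
        for pre, add, dele in ops:
            if set(pre) <= state:       # operator applicable
                nxt = frozenset((state - set(dele)) | set(add))
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return False

# Tiny made-up domain: a enables b, b enables c while consuming a.
ops = [({"a"}, {"b"}, set()), ({"b"}, {"c"}, {"a"})]
```

The number of reachable states is exponential in the number of atoms, which is why this simple procedure does not witness PSPACE membership by itself.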

Expressiveness and Compilability between Planning Formalisms
Although there is no difference in the computational complexity between the formalisms in the S_IBC family, there might nevertheless be a difference in how concisely planning domains and plans can be expressed. In order to investigate this question, we introduce the notion of compiling planning formalisms.

Compiling Planning Formalisms
As mentioned in the Introduction, we will consider a planning formalism X as expressive as another formalism Y if planning domains and plans formulated in formalism Y are concisely expressible in X. We formalize this intuition by making use of what we call compilation schemes, which are solution-preserving mappings with polynomially sized results from Y domain structures to X domain structures. While we restrict the size of the result of a compilation scheme, we do not require any bounds on the computational resources for the compilation. In fact, for measuring expressibility, it is irrelevant whether the mapping is polynomial-time computable, exponential-time computable, or even non-recursive. At least, this seems to be the idea when the notion of expressive power is discussed in similar contexts (Baader, 1990; Erol et al., 1996; Gogic et al., 1995; Cadoli et al., 1996). If we want to use such compilation schemes in practice, they should be reasonably efficient, of course. However, if we want to prove that one formalism is strictly more expressive than another one, we have to prove that there is no compilation scheme at all, regardless of how many computational resources such a scheme might use.
So far, compilation schemes restrict only the size of domain structures. However, when measuring expressive power, the size of the generated plans should also play a role. In Bäckström's (1995) ESP-reductions, the plan size must be identical. Similarly, the translation from S_C to S proposed by Gazen and Knoblock (1997) seems to have as an implicit prerequisite that the plan length in the target formalism should be almost the same. When comparing the expressiveness of different planning formalisms, we might, however, be prepared to accept some growth of the plans in the target formalism. For instance, we may accept an additional constant number of operators, or we may even be satisfied if the plan in the target formalism is linearly or polynomially larger. This leads to the schematic picture of compilation schemes displayed in Figure 2. Although Figure 2 gives a good picture of the compilation framework, it is not completely accurate. First of all, a compilation scheme may introduce some auxiliary propositional atoms that are used to control the execution of newly introduced operators. These atoms will most likely need an initial value and may appear in the goal specification of planning instances in the target formalism. We will assume that the compilation scheme takes care of this and adds some literals to the initial state and goal specifications.
Additionally, some translations of the initial state and goal specifications may be necessary. If we want to compile a formalism that permits literals in preconditions and goals to one that requires atoms, some trivial translations are necessary. Similarly, if we want to compile a formalism that permits partial state specifications to a formalism that requires complete state specifications, a translation of the initial state specification is necessary. However, such state-translation functions should be very limited. They should depend only on the set of symbols in the source formalism, they should be "context-independent," i.e., the translation of a literal in a state specification should not depend on the whole specification, and they should be efficiently computable.
While the compilation framework is a theoretical tool to measure expressiveness, it has, of course, practical relevance. Let us assume that we have a reasonably fast planning system for a planning formalism X and we want to add a new feature to X, resulting in formalism Y. If we can come up with an efficient compilation scheme from Y to X, this means we can easily integrate the new feature, either by using the compilation scheme or by modifying the planning algorithm minimally. If no compilation scheme exists, we would probably have problems integrating this feature. Finally, if only computationally expensive compilation schemes exist, we have an interesting situation. In this case, the off-line compilation costs may be high. However, since the compiled domain structure can be used for different initial state and goal specifications, the high off-line costs may be compensated by the efficiency gain resulting from using the X planning algorithm. As it turns out, however, this situation does not arise in analyzing compilability between the S_IBC formalisms. Either we can identify a polynomial-time compilation scheme or we are able to prove that no compilation scheme exists.
6. This means that compilation schemes between planning formalisms are similar to knowledge compilations (Cadoli & Donini, 1997), where the fixed part of a computational problem is the domain structure and the variable part consists of the initial state and goal specifications. The main difference from the knowledge compilation framework is that we also take the (size of the) result into account. In other words, we compile function problems instead of decision problems.

Compilation Schemes
Assume a tuple of functions F = (f_x, f_i, f_g, t_i, t_g), where f_x translates the domain structure, f_i and f_g add literals to the initial state and goal specifications, respectively, and t_i and t_g are the state-translation functions. F is a compilation scheme if
1. there exists a plan for an instance in the source formalism iff there exists a plan for the compiled instance in the target formalism;
2. the state-translation functions t_i and t_g are modular, i.e., their result can be computed element-wise on the literals of a consistent state specification, and they are polynomial-time computable;
3. and the size of the results of f_x, f_i, and f_g is polynomial in the size of the arguments.
Condition (1) states that the function induced by the compilation scheme F is solution-preserving. Condition (2) states requirements on the on-line state-translation functions. The result of these functions should be computable element-wise, provided the state specification is consistent. Considering the fact that these functions depend only on the original set of symbols and the state specification, this requirement does not seem to be very restrictive. Since the state-translation functions are on-line functions, we also require that their result be efficiently computable.
Finally, condition (3) formalizes the idea that F is a compilation. For a compilation it is much more important that the result can be concisely represented, i.e., in polynomial space, than that the compilation process is fast. Nevertheless, we are also interested in efficient compilation schemes. We say that F is a polynomial-time compilation scheme if all of its functions are polynomial-time computable. In addition to the resource requirements on the compilation process, we will distinguish between different compilation schemes according to the effects on the size of the plans solving the instance in the target formalism. If every plan of size n solving an instance in the source formalism can be matched by a plan of size n (respectively, O(n), or polynomial in n) solving the compiled instance in the target formalism, then F is a compilation scheme preserving plan size exactly (respectively, linearly, or polynomially). More generally, we say that a planning formalism X is compilable to formalism Y (in polynomial time, preserving plan size exactly, linearly, or polynomially) if there exists a compilation scheme from X to Y with the appropriate properties.
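The shape of a compilation scheme can be sketched as a tuple of functions. The field and function names below are assumptions for illustration; the essential point the sketch captures is that the operator translation never sees the initial state or the goal, while the state-translation functions work element-wise on literals:

```python
from typing import Callable, NamedTuple

class CompilationScheme(NamedTuple):
    f_x: Callable   # translates the domain structure (operator set)
    f_i: Callable   # literals added to the initial state specification
    f_g: Callable   # literals added to the goal specification
    t_i: Callable   # element-wise translation of initial-state literals
    t_g: Callable   # element-wise translation of goal literals

def compile_instance(scheme, ops, init, goal):
    """Apply a compilation scheme to an instance (ops, init, goal).
    f_x, f_i, f_g depend only on the domain structure, never on
    the initial state or goal; t_i and t_g act literal by literal."""
    new_ops = scheme.f_x(ops)
    new_init = {m for lit in init for m in scheme.t_i(lit)} | scheme.f_i(ops)
    new_goal = {m for lit in goal for m in scheme.t_g(lit)} | scheme.f_g(ops)
    return new_ops, new_init, new_goal

# The trivial scheme for moving upwards in the specialization order:
# operators copied unchanged, nothing added, literals translated to themselves.
identity = CompilationScheme(
    f_x=lambda ops: ops,
    f_i=lambda ops: set(),
    f_g=lambda ops: set(),
    t_i=lambda lit: {lit},
    t_g=lambda lit: {lit},
)
```

The `identity` scheme is exactly the kind of generic scheme that witnesses compilability along the specialization relation.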
As is easy to see, all the notions of compilability introduced above are reflexive and transitive.
7. Although it is hard to imagine a modular state-translation function that is not polynomial time computable, some pathological function could, e.g., output translations that have exponential size in the encoding of the symbols.
Furthermore, it is obvious that when moving upwards in the diagram displayed in Figure 1, there is always a polynomial-time compilation scheme preserving plan size exactly. If π_i denotes the projection to the i-th argument and ∅ the function that always returns the empty set, the generic compilation scheme for moving upwards in the partial order takes the operators over unchanged (a projection), adds nothing to the initial state and goal specifications, and translates every literal to itself.

Compilability Preserving Plan Size Exactly
Proposition 5 leads to the question of whether there exist compilation schemes other than those implied by the specialization relation. Because of Propositions 4 and 5, we do not have to find compilation schemes for every pair of formalisms. It suffices to prove that X is compilable to Y in order to conclude that all formalisms below X are compilable to Y and to all formalisms above Y. A preview of the results of this section is given in Figure 3. We will establish two equivalence classes such that all members of each class are compilable to each other preserving plan size exactly. These two equivalence classes will be called the S_IL class and the S_ILC class, naming them after their respective largest elements.

Planning Formalisms without Conditional Effects and Boolean Formulae
First, we will show that the formalisms analyzed by Bäckström (1995), namely, S_IL, S_L, and S, are polynomial-time compilable into each other preserving plan size exactly. In fact, a fourth formalism can be added to this set, namely, S_I, which lies between S_IL and S. In other words, using the notion of compilability, we get the same equivalence class as with Bäckström's ESP-reductions. A closer look at the proofs in Bäckström's (1995) paper reveals that this is not surprising at all, because the ESP-reductions he used could be reformulated as compilation schemes. Since he used a quite different notation, we will nevertheless prove this claim from first principles.
The key idea in compiling planning formalisms with literals to formalisms that allow for atoms only is to treat an atom and its negation as two different atoms in the new formalism. For this purpose, we introduce for each atom p a new atom p̂ representing its negation. Using the enlarged set of atoms, one can translate state specifications and preconditions easily. In the postconditions we have to make sure that the intended semantics is taken care of, i.e., whenever p is added, p̂ must be deleted, and vice versa. Finally, we have to deal with the problem of partial state specifications. However, this is not a problem when all effects are unconditional and the preconditions contain only atoms. In this case, we can safely assume that all atoms with unknown truth value are false without changing the outcome of the application of an operator. Using this completion, we can transform a partial state specification into a complete specification without changing the outcome, i.e., we get the same plans.
Now we can define the compilation scheme F over the operators translated in this way. The scheme F obviously satisfies conditions (2) and (3), all the functions can be computed in polynomial time, and every plan in the source formalism corresponds step by step to a plan of the compiled instance, i.e., condition (1) on compilation schemes is also satisfied. This means that F is in fact a compilation scheme. Further, since the plan size does not change, the compilation scheme preserves plan size exactly. Finally, because all functions in F can be computed in time polynomial in the size of their arguments, F is a polynomial-time compilation scheme.
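The hat-atom construction can be sketched as follows. The `not_` naming convention and the function names are illustrative assumptions; the sketch covers the operator translation and the completion of partial state specifications for the unconditional-effect case:

```python
def hat(atom):
    # New atom standing for the negation of `atom`; "not_" is an assumed
    # naming convention, not the paper's notation.
    return "not_" + atom

def tr_lit(lit):
    # Translate a literal over the original atoms into an atom over the
    # doubled atom set; negation is encoded with a "-" prefix.
    return hat(lit[1:]) if lit.startswith("-") else lit

def compile_operator(pre, add, dele):
    """Operator with literal preconditions and unconditional add/delete
    effects over atoms -> atom-only operator over the doubled atom set:
    adding p must retract not_p, and deleting p must assert not_p."""
    new_pre = {tr_lit(l) for l in pre}
    new_add = set(add) | {hat(a) for a in dele}
    new_del = set(dele) | {hat(a) for a in add}
    return new_pre, new_add, new_del

def complete(spec, atoms):
    """Complete a partial state specification: every atom whose truth value
    is not known to be true is assumed false (its hat atom becomes true)."""
    return {a if a in spec else hat(a) for a in atoms}
```

Note that the translation is literal-by-literal, so the induced state-translation functions are modular as required by condition (2).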
One view of this result is that, from an expressiveness point of view, it does not matter whether we allow for atoms only or also for literals, and it does not matter whether we have complete or partial state specifications, provided propositional formulae and conditional effects are not allowed.

Planning Formalisms with Conditional Effects but without Boolean Formulae
Interestingly, the view spelled out above generalizes to the case where conditional effects are allowed. Also in this case it does not matter whether only atoms or also literals are allowed and whether we have partial or complete state specifications. In proving this, however, there are two additional complications. Firstly, one must compile conditional effects over partial state specifications into conditional effects over complete state specifications. This is a problem because an effect condition may be neither entailed by nor inconsistent with a partial state specification, in which case it is unknown whether the effect becomes active. In the general case, things are less straightforward still, because an effect literal can be produced by more than one conditional effect and an effect condition can consist of more than one literal. Assuming without loss of generality (using a polynomial transformation) that the effects are all singleton sets, we have to check the following condition: either one of the conditional effects with the same effect literal is activated, i.e., its effect condition is entailed by the partial state specification, or all of the conditional effects with the same effect literal are blocked, i.e., each effect condition contains a literal that is inconsistent with the state specification. If this condition holds, the operator application is well-defined; otherwise the resulting state specification is inconsistent. In order to test for this condition in a formalism with complete states, we introduce a number of new sets of atoms that are pairwise disjoint and disjoint from the original set of atoms, among them a primed copy of each literal. Using these, we can specify a compilation scheme F from S_ILC to S_LC. The scheme obviously satisfies conditions (2), i.e., that the state-translation functions are modular, and (3), i.e., that the compilation functions have polynomially sized results.
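The activated-or-blocked test described above can be sketched directly. Literals are encoded with a "-" prefix, effect literals are assumed to be singletons as in the text, and the function names are illustrative assumptions:

```python
from collections import defaultdict

def neg(lit):
    return lit[1:] if lit.startswith("-") else "-" + lit

def application_well_defined(spec, cond_effects):
    """spec: partial state specification, a consistent set of literals.
    cond_effects: (condition_literals, effect_literal) pairs.
    For each effect literal, either some producing conditional effect
    must be activated (its condition entailed by spec), or all producers
    must be blocked (each condition contains a literal whose negation is
    in spec); otherwise the application is not well-defined."""
    producers = defaultdict(list)
    for cond, eff in cond_effects:
        producers[eff].append(cond)
    for eff, conds in producers.items():
        activated = any(all(l in spec for l in cond) for cond in conds)
        blocked = all(any(neg(l) in spec for l in cond) for cond in conds)
        if not (activated or blocked):
            return False
    return True
```

The interesting case is a condition such as `("r",)` when the specification says nothing about r: the effect is neither activated nor blocked, so the application is not well-defined.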
Further, all the functions can be computed in polynomial time, and every plan of the source instance corresponds to a plan of the compiled instance. If an operator is applied in a state in which its application is not well-defined, the application of any further operator leads to an inconsistent state because of the conditional effects in the test operators, which are part of all postconditions of operators applicable in this state; hence condition (1) is satisfied as well. If we instead use a semantics where the resulting state specification is legal whenever the application of all state-transition functions leads to a theory that can be represented as a set of literals, it seems unlikely that there exists a scheme that preserves plan size polynomially. The reason for this pessimistic conjecture is that under this semantics it appears to be coNP-hard to determine whether the state specification resulting from applying an S_ILC operator is legal.

As a second step in showing that partial state specifications and literals can be compiled away, we show that we can compile S_LC to S_C. The key idea in the proof is the same as in the proof of Theorem 6: we replace each negative literal ¬p by a new atom p̂. In order to detect inconsistencies introduced by conditional effects, we add to each postcondition conditional effects that make such inconsistencies explicit. Further, to check that the last operator in a plan does not introduce any inconsistencies, we force the application of a "checking" operator that contains the same conditional effects. Then we can specify a compilation scheme F from S_LC to S_C; the scheme obviously satisfies conditions (2) and (3).

The Limits of Compilation when Preserving Plan Size Linearly
The interesting question is, of course, whether there are compilation schemes preserving plan size exactly other than those we have identified so far. As it turns out, this is not the case. We will prove that for all pairs of formalisms for which we have not identified a compilation scheme preserving plan size exactly, such a compilation scheme is impossible, even if we allow for a linear increase of the plan size. For some pairs of formalisms we are even able to prove that a polynomial increase of the plan size would not help in establishing a compilation scheme. These results are, however, conditional on an assumption that is slightly stronger than the P ≠ NP assumption. A preview of the results of this section is given in Table 1. An entry marked as a specialization means that there exists a compilation scheme because the first formalism is a specialization of the second one. In all the other cases, we specify the separation and give the number of the theorem establishing this result.

Conditional Effects Cannot be Compiled Away
First of all, we will prove that conditional effects cannot be compiled away. The deeper reason for this is that with conditional effects, one can independently do a number of things in parallel, which is impossible in formalisms without conditional effects. It is obviously possible to come up with a set of exponentially many operators that can do the same thing in one step. However, it is unclear how to do that with fewer than exponentially many operators; in fact, we will show that this is impossible. In order to illustrate this point, let us generalize the above example and start with a set of n atoms. With locality as an additional condition on state-translation functions, we could easily prove that conditional effects cannot be compiled away. Instead of doing so, we will show that it is possible to derive a weaker condition from the definition of compilation schemes that is enough to prove the impossibility result. This weaker condition is quasi-locality of the state-translation functions relative to a given set of atoms, which in turn is based on the notion of universal literals: roughly, a literal is universal for given state-translation functions if it occurs in the translations of all atoms of some infinite set of atoms. Note that such an infinite subset must exist. The reason is that some literal must occur in the translations of infinitely many atoms, because otherwise we could already find an infinite subset satisfying condition (1). Because for a single atom there are only six possible ways to generate this literal, there must exist an infinite subset such that the literal occurs in all of the translations, and relative to this subset it is a universal literal.
If we can pick a subset satisfying the first condition, we can choose from it a finite subset ¦ with any desired cardinality such that the state-translation functions are quasi-local with respect to ¦ and r .
Otherwise we repeat the selection process on the remaining infinite subset until condition (1) is satisfied. This selection process can only be repeated finitely often, because otherwise there would be some atoms whose translation has an infinite result, which is impossible: the state-translation functions are polynomial-time computable and can therefore have only finite results.
This demonstrates that there always exists a set of propositional atoms such that the state-translation functions are quasi-local. However, we might not be able to effectively determine this set.
Using this result, we are finally able to prove the non-existence of compilation schemes that compile conditional effects away while preserving plan size linearly.
From this it follows (again because the state-translation function is modular) that the constructed plan also achieves the translated goal. Since the original instance does not have any plan, there should not be any plan for the compiled instance either. The fact that we have constructed a plan for this instance implies that the scheme cannot be a compilation scheme, which is the desired contradiction.
Using Propositions 4 and 5 as well as Theorem 9, this result can be generalized. It answers the question of whether compilation schemes from the formalism with conditional effects to basic STRIPS that are more space-efficient than the one proposed by Gazen and Knoblock (1997) are possible. Even assuming unbounded computational resources for the compilation process, a more space-efficient compilation scheme is impossible, provided that the compilation should preserve plan size linearly. If we allow polynomially larger plans, then efficient compilation schemes are possible (see Section 6).

Non-Uniform Complexity Classes
In the next section we make use of so-called non-uniform complexity classes, which are defined using advice-taking machines, in order to prove the impossibility of a compilation scheme. An advice-taking Turing machine is a Turing machine with an advice oracle, which is a (not necessarily recursive) function from positive integers to bit strings. On an input of length n, the machine loads the oracle's bit string for n and then continues as usual. Note that the oracle derives its bit string only from the length of the input and not from the contents of the input. An advice is said to be polynomial if the length of the oracle string is polynomially bounded by the instance size. Further, if X is a complexity class defined in terms of resource-bounded machines, e.g., P or NP, then X/poly (also called non-uniform X) is the class of problems that can be decided on machines with the same resource bounds and polynomial advice.
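The defining feature of advice-taking machines is that the advice depends only on the input length, never on the input itself. The following toy sketch illustrates this (the advice table and the unary language are made-up examples, not from the paper):

```python
# Toy illustration of an advice-taking machine: the decision procedure
# receives, besides the input, an advice string that depends ONLY on the
# input's length. With one advice bit per length, even a non-recursive
# unary language becomes decidable this way, which is why P/poly
# contains non-recursive problems.

def decide_with_advice(x, advice):
    """Accept x iff the advice bit for len(x) is '1' and x is all ones."""
    a = advice.get(len(x), "0")    # advice is a function of len(x) only
    return a == "1" and all(c == "1" for c in x)

# Hypothetical advice table: lengths 2 and 3 are "in", length 4 is "out".
ADVICE = {2: "1", 3: "1", 4: "0"}

print(decide_with_advice("11", ADVICE))    # True
print(decide_with_advice("1111", ADVICE))  # False
```

The point of the sketch is that no computation on the contents of the input can recover the advice bit; it is supplied for free, which is what makes non-uniform classes more powerful than their uniform counterparts.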
Because of the advice oracle, the class P/poly appears to be much more powerful than P; it even contains non-recursive problems. However, it seems unlikely that P/poly contains all of NP. In fact, one can prove that this inclusion would imply a collapse of the polynomial hierarchy, the hierarchy of classes obtained from P by iterating nondeterminism relative to oracles. As with other classes, it is unknown whether the inclusions between the levels of this hierarchy are proper. However, it is strongly believed that this is the case, i.e., that the hierarchy is truly infinite.
Based on the firm belief that the polynomial hierarchy is proper, the above-mentioned question of whether NP is contained in a non-uniform class can be answered negatively with some confidence: NP ⊆ coNP/poly implies that the polynomial hierarchy collapses at the third level (Yap, 1983), i.e., Σp3 = Πp3, which again is considered to be very unlikely. We will use this result for proving that for some pairs of formalisms it is very unlikely that one formalism can be compiled into the other one.

9. The superscript p is only used to distinguish these sets from the analogous sets in the Kleene hierarchy.

On the Expressive Power of Partial State Specifications and Boolean Formulae
In all the cases considered so far, operators over partial state specifications could be compiled to operators over complete state specifications, i.e., partial state specifications did not add any expressiveness. This is no longer true, however, if we also allow for arbitrary boolean formulae in preconditions and effect conditions. In this case, we can decide the coNP-complete problem of whether a formula is a tautology by deciding whether a one-step plan exists. Asking, for example, whether an instance with an empty initial state specification and a single operator whose precondition is a formula φ has a plan is equivalent to asking whether φ is a tautology. Let the one-step plan existence problem (1-PLANEX) be the PLANEX problem restricted to plans of size one. From the above it is evident that 1-PLANEX is coNP-hard for the two formalisms that combine boolean formulae with partial state specifications. Let p be some fixed polynomial; then the polynomial-step plan existence problem (p-PLANEX) is the PLANEX problem restricted to plans of length bounded by p(n), where n is the size of the planning instance. As is easy to see, this problem is in NP for all formalisms except the two just mentioned. The reason is that after guessing a sequence of operators and state specifications of polynomial size, one can verify for each step in polynomial time that the precondition is satisfied by the current state specification and that the step produces the next state specification. Since there are only polynomially many steps, the overall verification takes only polynomial time.

One might hope for a compilation scheme that compiles the combination of boolean formulae and partial state specifications away. However, even if we allow for unbounded computational resources of the compilation process, a proof technique first used by Kautz and Selman (1992) can be used to show that such a compilation scheme cannot exist, provided the polynomial hierarchy does not collapse. The initial state for any particular formula of size n is computed from the formula itself, while the compiled domain structure depends only on n and can therefore serve as advice. From the construction, it follows that there exists a one-step plan for the original instance iff the formula is a tautology, and the compiled instance can be computed in polynomial time. Finally, we decide the p-PLANEX problem on the resulting instance. From Proposition 13 we know that this can be done in polynomial time on a nondeterministic Turing machine.
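The step-by-step verification behind the NP membership argument can be sketched as follows. The operator format (precondition, add list, delete list) is our own toy encoding over complete states, not the paper's formal definition:

```python
# Minimal sketch of plan verification: given a guessed operator
# sequence, checking each precondition and computing each successor
# state takes polynomial time per step, so verifying a polynomially
# long plan takes polynomial time overall.

def verify_plan(init, goal, plan):
    state = set(init)
    for pre, add, delete in plan:
        if not pre <= state:             # precondition satisfied?
            return False
        state = (state - delete) | add   # compute the successor state
    return goal <= state                 # goal reached?

ops = [({"a"}, {"b"}, set()),            # a => add b
       ({"b"}, {"c"}, {"a"})]           # b => add c, delete a
print(verify_plan({"a"}, {"c"}, ops))   # True
```

Each iteration touches only polynomially many atoms, which is exactly why guessing a polynomial-length plan and verifying it places p-PLANEX in NP for the formalisms without the boolean-formulae/partial-states combination.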
Because deciding p-PLANEX for the compiled instance is equivalent to deciding plan existence for the original instance, which is in turn equivalent to deciding unsatisfiability of the negated formula, it follows that we can decide a coNP-complete problem on a nondeterministic, polynomial advice-taking Turing machine in polynomial time. From that it follows that coNP ⊆ NP/poly. Using Yap's (1983) result, the claim follows.
Using Proposition 4 and Proposition 5, the above result generalizes to further pairs of formalisms. If we restrict the form of the formulae, however, we may be able to devise compilation schemes from the formalism with boolean formulae and partial state specifications to, e.g., the one with boolean formulae over complete states. Reconsidering the proof of the last theorem, it turns out that it is essential to use the negation of a CNF formula as a precondition. If we restrict ourselves to CNF formulae in preconditions, it seems possible to move from partial to complete state descriptions using ideas similar to the ones used in the proof of Lemma 7.
However, no such compilation scheme will work for the formalism that additionally contains conditional effects. The reason is the consistency condition in the definition of the result function: if this condition is not satisfied, the result of the operator is inconsistent. This condition can easily be employed to reduce unsatisfiability of CNF formulae to one-step plan existence, which enables us to use the same technique as in the proof of the above theorem.

Circuit Complexity
For the next impossibility result we need the notions of boolean circuits and families of circuits. A boolean circuit is a directed, acyclic graph whose vertices, called gates, are labeled with boolean operators. Gates labeled with ¬ have in-degree one, and gates labeled with ∧ or ∨ have in-degree two. All gates except one have at least one outgoing edge. The gate with no outgoing edge is called the output gate. The gates with no incoming edges are called the input gates. The depth of a circuit is the length of the longest path from an input gate to the output gate. The size of a circuit is the number of gates in the circuit.
A word of length n is now interpreted as a value assignment to the n input variables x_1, ..., x_n of a circuit. The word is accepted iff the output gate has value 1 for this word. In order to deal with words of different length, we need one circuit for each possible length. A family of circuits is an infinite sequence C = (c_1, c_2, ...), where c_n has n input variables. The language accepted by such a family of circuits is the set of words w such that w is accepted by the circuit for words of length |w|. Usually, one considers so-called uniform families of circuits, i.e., circuits that can be generated on a Turing machine with a log(n)-space bound. Sometimes, however, non-uniform families are also interesting. For example, the class of languages accepted by non-uniform families of polynomially-sized circuits is just the class P/poly introduced in Section 5.2.
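Circuit acceptance can be made concrete with a small evaluator. The gate encoding below is our own convention (a dict mapping gate names to tuples, listed in topological order), not a standard representation:

```python
# Sketch of circuit acceptance: a circuit is a DAG of gates, a word of
# n bits is an assignment to the n input gates, and the word is
# accepted iff the output gate evaluates to 1 (True).
# Gate encoding: ("in", i), ("not", g), ("and", g1, g2), ("or", g1, g2).

def eval_circuit(gates, output, word):
    val = {}
    for name, gate in gates.items():   # gates are listed in topological order
        if gate[0] == "in":
            val[name] = word[gate[1]] == "1"
        elif gate[0] == "not":
            val[name] = not val[gate[1]]
        elif gate[0] == "and":
            val[name] = val[gate[1]] and val[gate[2]]
        else:                          # "or"
            val[name] = val[gate[1]] or val[gate[2]]
    return val[output]

# A circuit computing x0 AND NOT x1; it accepts exactly the word "10".
C = {"g0": ("in", 0), "g1": ("in", 1),
     "g2": ("not", "g1"), "g3": ("and", "g0", "g2")}
print(eval_circuit(C, "g3", "10"))  # True
```

A family of circuits would supply one such dict per input length; the uniformity condition asks that these dicts be generated by a log-space machine from n.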
Using restrictions on the size and depth of the circuits, we can now define new complexity classes, which in their uniform variants are all subsets of P. One class that is important in the following is NC1, the class of languages accepted by uniform families of circuits with polynomial size and logarithmic depth. Another class that proves to be important for us is defined in terms of non-standard circuits, namely circuits with gates that have unbounded fan-in. Instead of restricting the in-degree of each gate to be at most two, we now allow an unbounded in-degree. The class of languages accepted by families of polynomially sized circuits with unbounded fan-in and constant depth is called AC0.
From the definition, it follows almost immediately that AC0 ⊆ NC1. Moreover, it has been shown that there are some languages in NC1 that are not even in the non-uniform variant of AC0, which implies that AC0 ≠ NC1 (Furst, Saxe, & Sipser, 1984).

Boolean Formulae Cannot be Compiled to Conditional Effects
As we have seen in Section 5.3, boolean formulae are quite expressive if they are used in combination with partial state specifications. However, what if all state specifications are complete? In this case, it seems to be possible to simulate the evaluation of CNF formulae by using conditional effects. In fact, it is possible to compile, for example, the formalism with boolean formulae to the one with conditional effects in polynomial time, preserving plan size linearly, provided all formulae are in conjunctive normal form. Each operator would have to be split into two operators: one that evaluates the clauses of all the formulae in the original operator, and one that combines these evaluations and takes the appropriate actions, e.g., asserting an inconsistency if the precondition is not satisfied. Sequencing of these pairs of operators can be achieved by introducing some extra literals.
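The two-operator idea can be sketched as follows. The encoding is our own toy version, not the paper's construction: negative literals are represented as atoms such as "not_b", clause atoms "c1", "c2" and the atom "ok" are fresh bookkeeping atoms, and each operator is modeled as a set of (condition, added atoms) rules applied in parallel:

```python
# Sketch of evaluating a CNF precondition with conditional effects in
# two steps: the first operator's rules evaluate each clause into a
# fresh atom, the second operator's rule combines the clause atoms.

def apply_conditional(state, rules):
    """Fire all (condition, add) rules whose condition holds, in parallel."""
    add = set()
    for cond, eff in rules:
        if all(lit in state for lit in cond):
            add |= eff
    return state | add

# CNF (a or b) and (not_b or c), evaluated over the state {a, not_b}.
evaluate = [
    ({"a"}, {"c1"}), ({"b"}, {"c1"}),        # clause 1: a or b
    ({"not_b"}, {"c2"}), ({"c"}, {"c2"}),    # clause 2: not_b or c
]
combine = [({"c1", "c2"}, {"ok"})]           # all clauses satisfied?

s = apply_conditional({"a", "not_b"}, evaluate)
s = apply_conditional(s, combine)
print("ok" in s)  # True
```

Note that the number of rules grows only linearly with the number of literals in the CNF formula, which is why this works for CNF but does not obviously extend to arbitrarily nested formulae.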
What can we say about the general case, however? When trying to simulate the evaluation of an arbitrary logical formula using conditional effects, it seems that we need as many operators as the nesting depth of the formula, which means that the resulting plans could not be bounded to be only linearly longer than the original plans.
We will use the results sketched in Section 5.4 to separate the formalism with boolean formulae from the one with conditional effects. In order to do so, let us view domain structures with fixed-size plans as "machines" that accept languages. For words consisting of n bits, let D_n be a domain structure whose atoms are numbered from 1 to n. Then a word w consisting of n bits can be encoded by the set of literals containing the i-th atom positively if the i-th bit of w is 1, and negatively otherwise. We now say that the n-bit word w is accepted with a one-step or k-step plan by D_n iff there exists a one-step or k-step plan, respectively, for the instance consisting of D_n and the initial state encoding w. Similarly to families of circuits, we also define families of domain structures D = (D_1, D_2, ...). The language accepted by such a family with a one-step (or k-step) plan is the set of words accepted using the domain structure D_n for words of length n. Borrowing the notion of uniformity as well, we say that a family of domain structures is uniform if it can be generated by a log(n)-space Turing machine.
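The word encoding can be stated in two lines of code. The atom naming scheme ("p1", "not_p1", ...) is our own convention for illustration:

```python
# Encode an n-bit word as a set of literals: the i-th atom appears
# positively if bit i is 1 and negatively otherwise.

def encode_word(w):
    return {f"p{i + 1}" if b == "1" else f"not_p{i + 1}"
            for i, b in enumerate(w)}

print(sorted(encode_word("101")))  # ['not_p2', 'p1', 'p3']
```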
Papadimitriou has pointed out that the languages accepted by uniform families of polynomially-sized boolean expressions are identical to NC1 (Papadimitriou, 1994, p. 386). As is easy to see, a family of domain structures with boolean formulae is nothing more than a family of boolean expressions, provided we use one-step plans for acceptance.

Proposition 16
The class of languages accepted by uniform families of domain structures with boolean formulae using one-step plan acceptance is identical to NC1.

NEBEL
If we now have a closer look at the power of k-step plan acceptance for families of domain structures with conditional effects, it turns out that it is less powerful than NC1. In order to show that, we will first prove the following lemma, which relates k-step plans with conditional effects to circuits with gates of unbounded fan-in.
Further, using an additional gate of unbounded fan-in, it is checked that no inconsistency was generated when executing the plan.
For each plan step, it must be computed whether the precondition is satisfied and what the results of the conditional effects are. Figure 6 (a) displays the precondition test. If the conjunction of the precondition literals is not true, an inconsistency atom becomes true, which is connected to the checking gate in Figure 5. Without loss of generality (using a polynomial transformation), we assume that all conditional effects have a conjunction of literals as effect condition and a single literal as effect. Whether an effect is activated at a given step is computed by a circuit as displayed in Figure 6 (b). The circuits in Figure 6 (b) and (c) dominate the depth of the circuit necessary to represent one plan step, leading to the conclusion that a plan step can be represented using a circuit of depth 7. Adding the depth of the goal-testing circuit, the claim follows.
The lemma implies that k-step plan acceptance with conditional effects is indeed less powerful than one-step plan acceptance with boolean formulae, which means that a compilation scheme from the formalism with boolean formulae to the one with conditional effects preserving plan size linearly is impossible. Such a scheme would imply that we can accept all languages in NC1 by (possibly non-uniform) AC0 circuits, which is impossible by the result of Furst and colleagues (1984).
Using Propositions 4 and 5 again, we can generalize the above theorem to further pairs of formalisms for which no compilation scheme preserving plan size linearly exists.

Compilability Preserving Plan Size Polynomially
As has been shown in the previous section, only the compilation schemes induced by Propositions 4 and 5 and the ones identified in Section 4 preserve plan size exactly. For all other pairs of formalisms we were able to rule out such compilation schemes, even if we allow linear growth of the resulting plans. Nevertheless, there might still be a chance for compilation schemes preserving plan size polynomially. Having shown that the two formalisms combining boolean formulae with partial state specifications cannot be compiled to the other formalisms even if the plan can grow polynomially, we may still be able to find compilation schemes preserving plan size polynomially between these two formalisms and among the remaining formalisms.
A preview of the results of this section is given in Figure 7. As can be seen, we are able to establish compilation schemes preserving plan size polynomially for all pairs of formalisms for which we have not proved the impossibility of such compilation schemes.

Figure 7: Equivalence classes of planning formalisms created by polynomial-time compilation schemes preserving plan size polynomially. Compilation schemes constructed in this section are indicated by dashed lines.

Compiling Conditional Effects Away for Partial State Specifications
The first compilation scheme we will develop compiles conditional effects away in the presence of partial state specifications. In order to simulate the parallel behavior of conditional effects, we have to break them up into individual operators that are executed sequentially. This means that for each conditional effect of an operator we introduce two new operators: one simulates the successful application of the rule, the other one simulates the "blocking" situation of the rule. At least one of these operators must be executed for each conditional effect in the original operator. This is something we can force by additional literals that are added to control the execution of operators. All in all, this leads to a sequence of operators whose length is bounded by the number of conditional effects in the original operator.
If we want to simulate the parallel behavior by a sequence of unconditional operators, the effects of the unconditional operators should not directly influence the state description; instead, the effects should be deferred until all operators corresponding to the set of conditional effects have been executed. For this reason, we will use a sequence of "copying operators" which copy the activated effects to the state description after all "conditional operators" have been executed. These "copying operators" can also be used to check that the set of activated effects is consistent.
For each original operator, the compilation scheme introduces a number of new operators. The first operator we introduce is one that checks whether the conditional effects of the previous operator have all been executed, no copying is in progress, and the precondition is satisfied. If this is the case, the execution of the conditional effects for this operator is started: this operator enables all the "conditional effect operators." For the activated effects, we introduce operators with the following behavior: if the effect condition is entailed, then the activated positive or negative effect as well as the fact that the rule has been tried is recorded.
Since there is at most one effect literal for each conditional effect, a conditional effect is "blocked" if the negation of the effect condition is entailed by the state specification. For all "blocked conditional effects" we introduce operators that merely record that the rule has been tried. In order to check that all conditional effects have been tried (activating the corresponding effect, or not activating it because the conditional effect is blocked), a further operator is used. This operator enables copying of the activated effects to the state specification, which is achieved with a set of copying operators for each atom. Finally, we need an operator that checks that all possible effects have been copied. This operator also starts the "execution cycle" again by enabling the execution of another "precondition operator." Using these definitions, we can now specify the set of compiled operators. The scheme obviously satisfies conditions (2) and (3) for compilation schemes, and all the functions can be computed in polynomial time. For the formalisms that combine boolean formulae with partial state specifications, in contrast, it appears to be very unlikely that we are able to identify a compilation scheme that preserves plan size polynomially.
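The shape of this construction can be sketched schematically. The operator names and the function below are ours, not the paper's exact definitions; the point is only that the number of generated operators, and hence the length of the simulating operator sequence, grows linearly in the number of conditional effects plus the number of atoms:

```python
# Schematic sketch of unfolding one conditional-effect operator into a
# sequence of unconditional operators: a start operator, one
# "activated" and one "blocked" operator per conditional effect, a
# check operator, one copy operator per atom, and a finish operator.

def compile_operator(name, cond_effects, atoms):
    ops = [f"{name}-start"]                     # precondition check, enable effects
    for i, _ in enumerate(cond_effects):
        ops.append(f"{name}-effect{i}-activated")  # effect condition entailed
        ops.append(f"{name}-effect{i}-blocked")    # negation of condition entailed
    ops.append(f"{name}-all-tried")             # every conditional effect handled
    ops += [f"{name}-copy-{a}" for a in atoms]  # copy activated effects to state
    ops.append(f"{name}-finish")                # restart the execution cycle
    return ops

ops = compile_operator("o", ["e0", "e1"], ["p", "q"])
print(len(ops))  # 9
```

Since each original plan step is simulated by such a polynomially long sequence, the compiled plans grow only polynomially, which is exactly what this class of compilation schemes permits.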

Compiling Conditional Effects Away for Complete State Specifications
The next compilation scheme compiles conditional effects away over complete state specifications, both in the presence and in the absence of boolean formulae. Since we deal with complete state specifications, we do not have to take care of the consistency condition on the resulting states, which is always satisfied for complete states. This makes the compilation scheme somewhat simpler. Since basic STRIPS does not allow for general boolean formulae, the scheme becomes a little bit more difficult in that case. In general, however, the compilation scheme we will specify is very similar to the one given in the proof of Theorem 20 and operates on the same kind of domain structure. This means that we do not assume the effects to be unique for each conditional effect.
In addition, we assume the same set of symbols for the compiled domain structure as in the proof of Theorem 20, and we reuse the operators defined there. Beyond these, a few further operators are needed.

Parallel Execution Models and the Feasibility of Compilation Schemes Preserving Plan Size Polynomially
While compilation schemes that preserve plan size exactly or linearly seem to be of immediate use, a polynomial growth of the plan appears to be of little practical interest. Planning algorithms can roughly be characterized by how many steps they can plan without getting caught by the combinatorial explosion, and practical experience shows that this number is significantly smaller than 100; given that, polynomial growth does not seem to make much sense. If we take GRAPHPLAN (Blum & Furst, 1997), the planning system that motivated our investigation in the first place, into consideration again, it turns out that this system allows for the parallel execution of actions. Although parallel execution might seem to add considerably to the power of the planning system, it does not affect our results at all. If a sequential plan solves a planning instance with n steps, a parallel plan will also need at least n actions. Nevertheless, although the size of a plan (measured in the number of operations) might be the same, the number of time steps may be considerably smaller, which might allow for a more efficient generation of the plan. Looking at the compilation scheme that compiles conditional effects away, it seems that a large number of the generated actions could be executed in parallel, in particular those actions that simulate the conditional effects.
However, the semantics of parallel execution in GRAPHPLAN is quite restrictive: if one action adds or deletes an atom that a second action adds or deletes, or if one action deletes an atom that a second action has in its precondition, then these two actions cannot be executed in parallel. With this restriction, it seems to be impossible to compile conditional effects away while preserving the number of time steps in a plan. However, a compilation scheme that preserves the number of time steps linearly seems to be possible. Instead of such a compilation scheme, the approaches so far have either used an exponential translation (Gazen & Knoblock, 1997) or modified the GRAPHPLAN algorithm in order to handle conditional effects (Anderson et al., 1998; Koehler et al., 1997; Kambhampati et al., 1997). These modifications involve changes in the semantics of parallel execution as well as changes in the search procedure. While all these implementations are compared with the straightforward translation Gazen and Knoblock (1997) used, it would also be interesting to compare them with a compilation scheme based on the ideas spelled out in Theorem 22 as the baseline.

Summary and Discussion
Motivated by the recent approaches to extend the GRAPHPLAN algorithm (Blum & Furst, 1997) to deal with more expressive planning formalisms (Anderson et al., 1998; Gazen & Knoblock, 1997; Kambhampati et al., 1997; Koehler et al., 1997), we asked what the term expressive power could mean in this context. One reasonable intuition seems to be that the term expressive power refers to how concisely domain structures and the corresponding plans can be expressed. Based on this intuition and inspired by recent approaches in the area of knowledge compilation (Gogic et al., 1995; Cadoli et al., 1996; Cadoli & Donini, 1997), we introduced the notion of compilability in order to measure the relative expressiveness of planning formalisms. The basic idea is that a compilation scheme can only transform the domain structure, i.e., the symbol set and the operators, while the initial state and the goal specification are not transformed, modulo some small changes necessary for technical reasons. Further, we distinguish compilation schemes according to whether the plan in the target formalism has the same size (up to an additive constant), a size bounded linearly by the size of the plan in the source formalism, or a size bounded polynomially by the original planning instance and the original plan.
Although the compilability framework appears to be a straightforward and intuitive tool for measuring the expressiveness of planning formalisms, it is possible to come up with alternative measures. Bäckström (1995), for instance, proposed to use ESP-reductions, which are polynomial many-one reductions on planning problems that preserve the plan size exactly. However, requiring that the transformation be polynomial-time computable seems to be overly restrictive. In particular, if we want to prove that one formalism is not as expressive as another one, we should prove that there exists no compilation scheme regardless of how many computational resources the compilation process may need. Furthermore, there appear to be severe technical problems in using Bäckström's (1995) framework for proving negative results. On the other hand, all of the positive results reported by Bäckström are achievable in the compilation framework, because the transformations he used are in fact compilation schemes. Taking all this together, the compilation framework appears to be superior from both an intuitive and a technical point of view.
Another approach to judging the expressiveness of planning formalisms has been proposed by Erol and colleagues (1994, 1996). They measure the expressiveness of planning formalisms according to the set of plans a planning instance can have. While this approach contrasts hierarchical task network planning nicely with STRIPS planning, it does not help us in making distinctions between the formalisms of the family considered in this paper.

The compilability framework is mainly a theoretical tool to measure how concisely domain structures and plans can be expressed. However, it also appears to be a good measure of how difficult planning becomes when a new language feature is added. Polynomial-time compilation schemes that preserve the plan size linearly indicate that it is easy to integrate the feature that is compiled away: one can either use the compilation scheme as is or mimic it by extending the planning algorithm. If only a polynomial-time compilation scheme leading to a polynomial growth of the plan is possible, then this is an indication that adding the new feature most probably requires a significant extension of the planning algorithm. If even a compilation scheme preserving plan size polynomially can be ruled out, then there is most probably a serious problem integrating the new feature.
Using this framework, we analyzed a large family of planning formalisms ranging from basic STRIPS to formalisms with conditional effects, boolean formulae, and incomplete state specifications. The most surprising result of this analysis is that we are able to come up with a complete classification. For each pair of formalisms, we were either able to construct a polynomial-time compilation scheme with the required size bound on the resulting plans or we could prove that compilation schemes are impossible, even if the computational resources for the compilation process are unbounded. In particular, we showed for the formalisms considered in this paper:

» incomplete state specifications and literals in preconditions can be compiled to basic STRIPS preserving plan size exactly,

» incomplete state specifications and literals in preconditions and effect conditions can be compiled away preserving plan size exactly, if we already have conditional effects,

» and there are no other compilation schemes preserving plan size linearly except those implied by the specialization relationship and those described above.
If we allow for polynomial growth of the plans in the target formalism, then all formalisms not containing incomplete state specifications and boolean formulae are compilable to each other. Incomplete state specifications together with boolean formulae, however, seem to add significantly to the expressiveness of a planning formalism, since these cannot be compiled away even when allowing for polynomial growth of the plan and unbounded resources in the compilation process.
It should be noted, however, that some of these results hold only if we use the semantics for conditional effects over partial state specifications as spelled out in Section 2.1. For other semantics, we may get slightly different results concerning the compilability of conditional effects over partial states.
One question one may ask is what happens if we consider formalisms with boolean formulae that are syntactically restricted. As indicated at various places in the paper, restricted formulae, such as CNF or DNF formulae, can sometimes be easily compiled away. However, there are also cases when this is impossible. For example, it can be shown that CNF formulae cannot be compiled to basic STRIPS preserving plan size linearly (Nebel, 1999), which confirms Bäckström's (1995) conjecture that CNF-formulae in preconditions add to the expressive power of basic STRIPS.
Another question is how reasonable our restrictions on a compilation scheme are. In particular, one may want to know whether non-modular state-translation functions could lead to more powerful compilation schemes. First of all, requiring that the state-translation functions be modular seems to be quite weak, considering that a compilation scheme should only be concerned with the domain structure and that the initial state and goal specification should not be transformed at all. Secondly, considering that the state-translation functions do not depend on the operator set, more complicated functions seem to be useless. From a more technical point of view, we need modularity in order to prove that conditional effects and boolean formulae cannot be compiled away preserving plan size linearly. For the conditional effects, modularity or a similar condition seems to be crucial. For the case of boolean formulae, we could weaken the condition to the point of requiring only that the state-translation functions are computable by circuits of constant depth, or something similar. In any case, the additional freedom one gets from non-modular state-translation functions does not seem to be of any help, because these functions do not take the operators into account. Nevertheless, it seems to be an interesting theoretical problem to prove that more powerful state-translation functions do not add to the power of compilation schemes.
Although the paper is mainly theoretical, it was inspired by the recent approaches to extend the GRAPHPLAN algorithm to handle more powerful planning formalisms containing conditional effects. So, what answers can we give to open problems in the field of planning algorithm design? First of all, Gazen and Knoblock's (1997) approach to compiling conditional effects away is optimal if we do not want to allow plan growth by more than a constant factor. Secondly, all of the other approaches (Anderson et al., 1998; Kambhampati et al., 1997; Koehler et al., 1997) that modify the GRAPHPLAN algorithm use a strategy similar to a polynomial-time compilation scheme preserving plan size polynomially. For this reason, these approaches should be compared to a "pure compilation approach" using the ideas from the compilation scheme developed in the proof of Theorem 22 as the baseline. Thirdly, allowing for unrestricted boolean formulae again adds a level of expressivity, because they cannot be compiled away with linear growth of the plan size. In fact, approaches such as the one by Anderson and colleagues (1998) simply expand the formulae to DNF, accepting an exponential blow-up. Again, we cannot do better than that if plan size should be preserved linearly. Fourthly, adding partial state specifications on top of general boolean formulae amounts to an increase of expressivity that is much larger than adding conditional effects or general formulae to basic STRIPS, because in this case there is no way to compile the feature away even if we allow for polynomial plan growth.
Finally, one may wonder how our results apply to planning approaches that are based on translating (bounded) planning problems to propositional logic such as SATPLAN  or BLACKBOX (Kautz & Selman, 1998). Since the entire analysis of the relative expressiveness of planning formalisms uses the assumption that we compile from one planning formalism to another planning formalism, the results do not tell us anything about the size of representations if we switch to another formalism. In particular, it seems possible to find an encoding of (bounded) planning problems with conditional operators in propositional logic which is as concise as an encoding of unconditional operators. The only advice our results give is that such a concise encoding will not be found by first translating conditional actions to unconditional actions and then using the "standard" encoding for unconditional actions (Kautz, McAllester, & Selman, 1996) to generate boolean formulae. However, addressing the problem of determining the conciseness of representation in this context appears to be an interesting and relevant topic for future research.