1 Introduction

Reducibility is one of the most basic notions in complexity theory. It provides a fundamental tool for comparing the computational complexity of different problems. The key idea is to use a program for a device that solves one problem Σ as a subroutine within the computation of another problem Π. If this is possible, Π is said to be reducible to Σ. Reductions make it possible to conclude upper bound results on the computational complexity of problem Π and lower bounds for Σ, if one insists that the program for Π designed around the subroutine for Σ respects certain resource complexity bounds of interest.

In the past, a great variety of different reducibility notions has been investigated in order to get a better understanding of the different computational paradigms and/or resource bounds. Here we only mention polynomial-time Turing reducibility, log-space reducibility, polynomial projection reducibility, and NC^1-reducibility (see, e.g., [Lee90], [BDG88]). This great variety of different reducibility notions is a consequence of the fact that the computational power available to the reduction must not be stronger than the computational power of the complexity class under consideration. Otherwise, the possibility of hiding some essential computations within the reduction is a threat to the relevance of the obtained results. For example, polynomial-time reducibility does not give any insight into the computational complexity of logarithmic-space bounded computations.

The computational power implementable in a reducibility notion for complexity classes defined in terms of very restricted computational models (e.g., eraser Turing machines [KMW88], real-time branching programs [KW87], or ordered binary decision diagrams [Bry92]) becomes extremely limited, since in general, almost all of the resources are consumed already by the programs which are used as subroutines in the reductions. Hence, the traditional approach results in reducibility concepts which enable one to relate merely highly similar problems (e.g., [BW96]). Since complexity classes defined by such restricted models are interesting for the theory (they occur in connection with our limited abilities in proving lower bounds [e.g., KMW88, KW87, Mei89]) and of practical importance (ordered binary decision diagrams are the state-of-the-art data structure for computer-aided circuit design [Bry92, BCMD, Bry95]), it is highly desirable to develop more powerful reducibility concepts.

Here, we consider the case of complexity classes defined by ordered binary decision diagrams (OBDDs), i.e., read-once binary decision diagrams with a fixed variable ordering. We attempt to overcome the difficulties mentioned above by introducing a new reducibility concept that is based on the following idea: a problem Π is reducible to a problem Σ if an OBDD for Π can be constructed from a given OBDD for Σ by applying a sequence of elementary operations (here 'elementary' means 'performable in constant time').
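As a concrete illustration of the model (our own sketch, not taken from the paper; all names are hypothetical), the following minimal Python fragment represents an OBDD as a linked node structure and evaluates it by following a single root-to-sink path, reading each variable at most once and, for a fixed ordering, in that order:

```python
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Node:
    var: int       # index i of the tested variable x_i
    low: "OBDD"    # successor taken when x_i = 0
    high: "OBDD"   # successor taken when x_i = 1

OBDD = Union[Node, bool]  # a sink is simply a Boolean constant

def evaluate(node: OBDD, assignment: dict[int, bool]) -> bool:
    """Follow one root-to-sink path; in an OBDD the variables on
    every path respect the fixed variable ordering."""
    while isinstance(node, Node):
        node = node.high if assignment[node.var] else node.low
    return node

# x_1 XOR x_2 under the natural ordering x_1 < x_2:
x2 = Node(2, False, True)
not_x2 = Node(2, True, False)
xor = Node(1, x2, not_x2)
assert evaluate(xor, {1: True, 2: False}) is True
assert evaluate(xor, {1: True, 2: True}) is False
```

An 'elementary operation' in the sense above would then be a constant-time manipulation of such a structure, e.g., relabelling a single node or redirecting a single edge.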
In contrast to previous reducibility notions, the suggested one is able to reflect the real needs of a reducibility concept in the context of OBDD-based complexity classes: firstly, it allows those problems which are computable with the same amount of OBDD-resources to be reduced to each other, and, secondly, it allows lower and upper bounds to be carried over. Although well-motivated, a reducibility based on sequences of elementary operations is difficult to describe and to handle, since it has to deal with permanently changing OBDDs. For this reason, we develop a formalism which allows a more 'static' description in terms of the so-called OBDD-transformer. We prove that the size of an OBDD which is obtained by the application of a sequence of elementary operations can be estimated in terms of the sizes of the original OBDD and of the corresponding OBDD-transformer. Hence, the formalism gives a solid basis for complexity-theoretic investigations.

2 Notations and Preliminaries

Let X_n denote the set {x_1, x_2, ..., x_n} of Boolean variables. A variable ordering on X_n is a total order on X_n and is described by a permutation π of the index set I_n = {1, ..., n}, i.e., x_i < x_j iff π^{-1}(i) < π^{-1}(j). Throughout the paper, we will work with the extension of an ordering to the constants false and true, which are defined to be maximal (false and true are incomparable with each other). The identity defines the so-called natural ordering; a small sketch of such an ordering follows below. Two orderings (possibly defined on different variable sets) are
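The following small Python sketch (again our own illustration; the names are hypothetical) implements the comparison just defined, x_i < x_j iff π^{-1}(i) < π^{-1}(j), with the two constants placed behind all variables and incomparable with each other:

```python
def make_ordering(pi: list[int]):
    """pi[k-1] = i means x_i is the k-th variable in the ordering,
    so the dictionary below tabulates pi^{-1}."""
    position = {i: k for k, i in enumerate(pi, start=1)}

    def precedes(i, j) -> bool:
        # The constants (modelled by the strings below) are maximal
        # and incomparable with each other.
        if i in ('false', 'true'):
            return False              # a constant follows everything
        if j in ('false', 'true'):
            return True               # every variable precedes a constant
        return position[i] < position[j]

    return precedes

# Natural ordering on {x_1, x_2, x_3}, i.e., pi = identity:
lt = make_ordering([1, 2, 3])
assert lt(1, 3) and not lt(3, 1)
assert lt(2, 'true')                          # constants come last
assert not lt('false', 'true') and not lt('true', 'false')
```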
[1] Randal E. Bryant et al. Efficient implementation of a BDD package. DAC '90, 1991.
[2] Leslie G. Valiant et al. A complexity theory based on Boolean algebra. 22nd Annual Symposium on Foundations of Computer Science (SFCS 1981), 1981.
[3] Christoph Meinel et al. Separating the eraser Turing machine classes L_e, NL_e, co-NL_e and P_e. 1991.
[4] Ingo Wegener et al. Reduction of OBDDs in Linear Time. Inf. Process. Lett., 1993.
[5] Beate Bollig et al. Read-once Projections and Formal Circuit Verification with Binary Decision Diagrams. STACS, 1996.
[6] Carl A. Gunter et al. In Handbook of Theoretical Computer Science, 1990.
[7] Stephan Waack et al. Exponential Lower Bounds for Real-Time Branching Programs. FCT, 1987.
[8] Randal E. Bryant et al. Graph-Based Algorithms for Boolean Function Manipulation. IEEE Transactions on Computers, 1986.