Abstract. In this paper, monotone Boolean functions are studied using harmonic analysis on the cube. The main result is that any monotone Boolean function has most of its power spectrum on its Fourier coefficients of ``degree'' at most O(sqrt(n)) under any product distribution. This is similar to a result of Linial et al. [1993], which showed that AC^0 functions have almost all of their power spectrum on the coefficients of degree at most (log n)^{O(1)} under the uniform distribution. As a consequence of the main result, the following two corollaries are obtained:

- For any epsilon > 0, monotone Boolean functions are PAC learnable with error epsilon under product distributions in time 2^{\tilde{O}((1/epsilon)sqrt(n))}.
- Any monotone Boolean function can be approximated within error epsilon under product distributions by a non-monotone Boolean circuit of size 2^{\tilde{O}((1/epsilon)sqrt(n))} and depth \tilde{O}((1/epsilon)sqrt(n)).

The learning algorithm runs in subexponential time as long as the required error is Omega(1/(sqrt(n) log n)). It is shown that this is tight in the sense that for any subexponential-time algorithm there is a monotone Boolean function that the algorithm cannot approximate with error better than \tilde{O}(1/sqrt(n)). The main result is also applied to other problems in learning and complexity theory. In learning theory, several polynomial-time algorithms for learning some classes of monotone Boolean functions, such as Boolean functions with O(log^2 n / log log n) relevant variables, are presented. In complexity theory, some questions regarding monotone NP-complete problems are addressed.
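The learning corollary rests on the standard low-degree algorithm: estimate all Fourier coefficients up to the degree cutoff from random examples, then predict with the sign of the resulting low-degree polynomial. A minimal sketch under the uniform distribution (the simplest product distribution) follows; the target function `maj`, the degree cutoff, and the sample count are illustrative choices, not taken from the paper:

```python
import itertools
import random

def chi(S, x):
    """Character chi_S(x) = prod_{i in S} x_i over the cube {-1,+1}^n."""
    p = 1
    for i in S:
        p *= x[i]
    return p

def low_degree_learn(f, n, d, m, rng):
    """Estimate every Fourier coefficient of degree <= d from m uniform
    random examples, and return the sign-of-polynomial hypothesis."""
    samples = [tuple(rng.choice((-1, 1)) for _ in range(n)) for _ in range(m)]
    vals = [f(x) for x in samples]
    coeffs = {}
    for k in range(d + 1):
        for S in itertools.combinations(range(n), k):
            # Empirical estimate of E[f(x) chi_S(x)] under the uniform distribution.
            coeffs[S] = sum(v * chi(S, x) for v, x in zip(vals, samples)) / m
    def h(x):
        return 1 if sum(c * chi(S, x) for S, c in coeffs.items()) >= 0 else -1
    return h

# Illustration: 5-bit majority is monotone, so its spectrum is concentrated
# on low degrees; a degree-3 cutoff already suffices here.
def maj(x):
    return 1 if sum(x) > 0 else -1

rng = random.Random(0)
h = low_degree_learn(maj, n=5, d=3, m=4000, rng=rng)
agree = sum(h(x) == maj(x) for x in itertools.product((-1, 1), repeat=5)) / 32
```

The main theorem supplies the guarantee that for monotone targets a cutoff of d = O(sqrt(n)) captures almost all of the power spectrum, which is what bounds both the sample complexity and the 2^{\tilde{O}((1/epsilon)sqrt(n))} running time.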

Categories and Subject Descriptors: F.1.2 [Computation by Abstract Devices]: Modes of Computation --probabilistic computation; F.2.1 [Analysis of Algorithms and Problem Complexity]: Numerical Algorithms and Problems --computation of transforms,computation on polynomials; G.3 [Probability and Statistics] --probabilistic algorithms; I.2.6 [Artificial Intelligence]: Learning --concept learning

General Terms: Algorithms, Theory

Additional Key Words and Phrases: Approximation, circuits, complexity, Fourier transform, functions, harmonic analysis, learning, monotone Boolean functions, monotone circuits

Selected references

- William Aiello and Milena Mihail. Learning the Fourier spectrum of probabilistic lists and trees. In Proceedings of the Second Annual ACM-SIAM Symposium on Discrete Algorithms, pages 291-299, San Francisco, California, 28-30 January 1991.

- Avrim Blum, Merrick Furst, Jeffrey Jackson, Michael Kearns, Yishay Mansour, and Steven Rudich. Weakly learning DNF and characterizing statistical query learning using Fourier analysis. In Proceedings of the Twenty-Sixth Annual ACM Symposium on the Theory of Computing, pages 253-262, Montréal, Québec, Canada, 23-25 May 1994.

- Jehoshua Bruck and Roman Smolensky. Polynomial threshold functions, AC^0 functions, and spectral norms. SIAM Journal on Computing, 21(1):33-42, February 1992.

- Jeffrey Jackson. An efficient membership-query algorithm for learning DNF with respect to the uniform distribution. In 35th Annual Symposium on Foundations of Computer Science, pages 42-53, Santa Fe, New Mexico, 20-22 November 1994. IEEE.

- Jeff Kahn, Gil Kalai, and Nathan Linial. The influence of variables on Boolean functions (extended abstract). In 29th Annual Symposium on Foundations of Computer Science, pages 68-80, White Plains, New York, 24-26 October 1988. IEEE.

- Michael Kearns. Efficient noise-tolerant learning from statistical queries. In Proceedings of the Twenty-Fifth Annual ACM Symposium on the Theory of Computing, pages 392-401, San Diego, California, 16-18 May 1993.

- Michael Kearns and Leslie G. Valiant. Cryptographic limitations on learning Boolean formulae and finite automata. In Proceedings of the Twenty First Annual ACM Symposium on Theory of Computing, pages 433-444, Seattle, Washington, 15-17 May 1989.

- Eyal Kushilevitz and Yishay Mansour. Learning decision trees using the Fourier spectrum (extended abstract). In Proceedings of the Twenty Third Annual ACM Symposium on Theory of Computing, pages 455-464, New Orleans, Louisiana, 6-8 May 1991.

- Eyal Kushilevitz and Yishay Mansour. Learning decision trees using the Fourier spectrum. SIAM Journal on Computing, 22(6):1331-1348, December 1993.

- Nathan Linial, Yishay Mansour, and Noam Nisan. Constant depth circuits, Fourier transform and learnability. Journal of the ACM, 40(3):607-620, July 1993.

- Nathan Linial, Yishay Mansour, and Noam Nisan. Constant depth circuits, Fourier transform, and learnability. In 30th Annual Symposium on Foundations of Computer Science, pages 574-579, Research Triangle Park, North Carolina, 30 October-1 November 1989. IEEE.

- Yishay Mansour. Randomized interpolation and approximation of sparse polynomials. SIAM Journal on Computing, 24(2):357-368, April 1995.

- Leslie G. Valiant. A theory of the learnable. Communications of the ACM, 27(11):1134-1142, November 1984.