Is there a general structure for grammars?

Summary form only given. Linguists have proposed dozens of formalisms for grammars, and now vision is weighing in with its own versions based on its needs. Ulf Grenander has proposed general pattern theory and has used grammar-like graphical parses of "thoughts" in the style of AI. One wants a natural, simple formalism treating all these cases, and I want to pose this as a central problem in modeling intelligence. Pattern theory started in the 1970s with the ideas of Ulf Grenander and his school at Brown. The aim is to analyze, from a statistical point of view, the patterns in all "signals" generated by the world, whether they be images, sounds, written text, DNA or protein strings, spike trains in neurons, time series of prices or weather, etc. Pattern theory proposes that the types of patterns, and the hidden variables needed to describe them, found in one class of signals will often be found in the others, and that their characteristic variability will be similar. The underlying idea is to find classes of stochastic models which can capture all the patterns that we see in nature, so that random samples from these models have the same "look and feel" as samples from the world itself. Then the detection of patterns in noisy and ambiguous samples can be achieved by the use of Bayes' rule, a method that can be described as "analysis by synthesis".
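
To make the last point concrete, here is a minimal sketch of detection by Bayes' rule in the "analysis by synthesis" spirit. It is my own illustration, not taken from the abstract: the candidate patterns (templates), the Gaussian noise model, and all names are assumptions chosen only to show the mechanism of synthesizing each hypothesis and scoring it against a noisy observation.

```python
# Illustrative sketch (assumed example, not from the source): a hidden pattern
# is one of a few templates, the observed signal is that template corrupted by
# Gaussian noise, and Bayes' rule scores each candidate by synthesizing it and
# comparing it to the observation ("analysis by synthesis").
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical generative model: three candidate "patterns" (templates).
templates = {
    "ramp": np.linspace(0.0, 1.0, 50),
    "step": np.concatenate([np.zeros(25), np.ones(25)]),
    "bump": np.exp(-0.5 * ((np.arange(50) - 25) / 5.0) ** 2),
}
prior = {name: 1.0 / len(templates) for name in templates}  # uniform prior
sigma = 0.3                                                  # assumed noise level

# Synthesize a noisy observation from the true hidden pattern.
true_name = "step"
observation = templates[true_name] + sigma * rng.normal(size=50)

# Analysis by synthesis: for each hypothesis, synthesize the clean signal,
# score it with the Gaussian likelihood, and combine with the prior (Bayes).
def log_posterior(name):
    residual = observation - templates[name]
    log_likelihood = -0.5 * np.sum(residual ** 2) / sigma ** 2
    return np.log(prior[name]) + log_likelihood

scores = {name: log_posterior(name) for name in templates}
map_estimate = max(scores, key=scores.get)
print("MAP pattern:", map_estimate)  # typically recovers "step"
```

In this toy setting the hidden variable is just a label, but the same scheme extends to the richer hidden structures the abstract alludes to (parse trees, object parts, etc.), with the synthesis step generating a candidate signal and Bayes' rule weighing it against the data.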