  • Print publication year: 2016
  • Online publication date: March 2016

7 - Designed versus Intrinsic Computation

from Part Two - The Computation of Processes, and Not Computing the Brain


Top-down versus bottom-up design

The design of modern products commonly follows a well-defined path, such as the human-centered approach used by the famous IDEO design consultancy (Kelley, 2001). Design has turned into a process that can be studied, formalized, learned, and improved. As Tom Kelley (2001) explains, IDEO uses a methodology with five basic steps: understand, observe, visualize, evaluate and refine, and implement. Two things are important to notice about this process. First, there is an implicit loop for evaluation and refinement: no product is perfect from the beginning, and most will need to be refined and re-evaluated over and over again. Second, the process is organized in a very top-down manner, i.e., it is a step-wise process that starts with a high-level goal, target, or vision and then gradually works down the hierarchy by specifying, evaluating, and implementing the lower-level (sub)systems. That is the way engineers design cars, airplanes, computers, and whatnot. The most important reason why top-down design has been so successful in the engineering world is divide et impera, or, in more contemporary words, divide and conquer: it allows us to handle complex problems and to treat subsystems as abstract black boxes.

While this generally works well in the engineering world, an entirely different design process can be observed in nature. Nature proceeds in a bottom-up way, by simple trial and error. The process is called evolution and has led to amazing, albeit by no means perfect, 'designs.' As the name suggests, bottom-up design proceeds from the bottom up by piecing subsystems together into something bigger. Often, the functionality of the resulting system is greater than the sum of its parts. In that context, the term emergence (Holland, 1999) comes up frequently. Loosely speaking, emergence means that the interactions of the subsystem components do not necessarily allow us to predict the behavior at the next level. There is an element of surprise (Ronald et al., 1999) involved, which describes that non-predictability nicely.
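The bottom-up, trial-and-error process described above can be sketched in a few lines of code. The following is a minimal illustration, not an implementation from the chapter; the bit-string representation, the fitness function, and the all-ones target are assumptions chosen purely for illustration. A candidate "design" is repeatedly subjected to blind mutations, and a mutation survives only if it does not decrease fitness; no global blueprint or top-down decomposition is ever specified.

```python
import random

random.seed(42)  # fixed seed so the toy run is reproducible

# Hypothetical design goal for illustration: a 16-bit string of all ones.
TARGET = [1] * 16

def fitness(candidate):
    """Number of bits that match the target."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def evolve(steps=2000):
    """Bottom-up trial and error: mutate blindly, keep what works."""
    candidate = [random.randint(0, 1) for _ in TARGET]
    for _ in range(steps):
        mutant = candidate[:]
        i = random.randrange(len(mutant))
        mutant[i] ^= 1  # flip one random bit: the blind "trial"
        if fitness(mutant) >= fitness(candidate):  # the "error" check
            candidate = mutant  # keep the mutation; otherwise discard it
    return candidate

best = evolve()
print(fitness(best), "of", len(TARGET), "target bits matched")
```

Because a mutation that lowers fitness is always discarded, fitness never decreases, and long enough runs reach the full target; yet at no point does the procedure "know" the overall design it is building, which is precisely the contrast with top-down engineering.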

Today, the computing disciplines face difficult challenges in extending CMOS technology to, and beyond, the limits of dimensional scaling. One solution path is to use an innovative combination of novel manufacturing techniques, devices, compute paradigms, and architectures to create new information-processing technology that may be radically different from what we currently know.

References

L. Büsing, B. Schrauwen, and R. Legenstein. Connectivity, dynamics, and memory in reservoir computing with binary and analog neurons. Neural Computation, 22:1272–1311, 2010.
J. P. Crutchfield, W. L. Ditto, and S. Sinha. Introduction to focus issue: intrinsic and designed computation: information processing in dynamical systems – beyond the digital hegemony. Chaos, 20:037101, 2010.
R. A. Freitas Jr. and R. C. Merkle. Kinematic Self-Replicating Machines. Landes Bioscience, 2004.
A. Goudarzi, C. Teuscher, N. Gulbahce, and T. Rohlf. Emergent criticality through adaptive information processing in Boolean networks. arXiv:1104.4141, 2011. In revision.
J. H. Holland. Emergence: From Chaos to Order. Perseus Books, 1999.
ITRS (International Technology Roadmap for Semiconductors), update. Semiconductor Industry Association, 2009.
S. A. Kauffman. Metabolic stability and epigenesis in randomly constructed genetic nets. Journal of Theoretical Biology, 22:437–467, 1969.
S. A. Kauffman. The Origins of Order: Self-Organization and Selection in Evolution. Oxford University Press, 1993.
T. Kelley. The Art of Innovation. Currency Books, 2001.
L. B. Kish. End of Moore's law: thermal (noise) death of integration in micro and nano electronics. Physics Letters A, 305:144–149, 2002.
E. Lazowska, M. Pollack, D. Reed, and J. Wing. Boldly exploring the endless frontier. Computing Research News, 21(1):1, 6, 2009.
W. Maass, T. Natschläger, and H. Markram. Real-time computing without stable states: a new framework for neural computation based on perturbations. Neural Computation, 14(11):2531–2560, 2002.
T. Rohlf, N. Gulbahce, and C. Teuscher. Damage spreading and criticality in finite random dynamical networks. Physical Review Letters, 99(24):248701, 2007.
E. M. A. Ronald, M. Sipper, and M. S. Capcarrere. Design, observation, surprise! A test of emergence. Artificial Life, 5(3):225–239, 1999.
P. T. Saunders, editor. Collected Works of A.M. Turing: Morphogenesis. North-Holland, 1992.
A. Sinha, M. S. Kulkarni, and C. Teuscher. Evolving nanoscale associative memories with memristors. In Proceedings of the 11th International Conference on Nanotechnology (IEEE Nano 2011), pp. 860–864. IEEE, 2011.
D. B. Strukov, G. S. Snider, D. R. Stewart, and R. S. Williams. The missing memristor found. Nature, 453(7191):80–83, 2008.
C. Teuscher. Turing's Connectionism: An Investigation of Neural Network Architectures. Springer-Verlag, 2002.
C. Teuscher. Turing's connectionism. In Alan Turing: Life and Legacy of a Great Thinker, C. Teuscher (ed.), pp. 499–530. Springer-Verlag, 2004.
C. Teuscher, I. Nemenman, and F. J. Alexander (eds.). Novel computing paradigms: Quo vadis? Physica D, 237(9):1157–1316, 2008.
A. M. Turing. Intelligent machinery. In Machine Intelligence, B. Meltzer and D. Michie (eds.), Vol. 5, pp. 3–23. Edinburgh University Press, 1969.
W. Wolf. Modern VLSI Design: Systems on Silicon. Prentice Hall, 2nd edition, 1998.
V. V. Zhirnov and D. J. C. Herr. New frontiers: self-assembly in nanoelectronics. IEEE Computer, 34:34–43, 2001.