
Mike Kestemont

Academic researcher


Quotes


  • “I always thought of myself as a humanities person as a kid, but I liked electronics,” [Steve Jobs] said. “Then I read something that one of my heroes, Edwin Land of Polaroid, said about the importance of people who could stand at the intersection of humanities and sciences, and I decided that’s what I wanted to do.” It was as if he were suggesting themes for his biography (and in this instance, at least, the theme turned out to be valid). The creativity that can occur when a feel for both the humanities and the sciences combine in one strong personality was the topic that most interested me in my biographies of Franklin and Einstein, and I believe that it will be a key to creating innovative economies in the twenty-first century. (W. Isaacson, Steve Jobs, 2011).
  • What we see is that scientists must deal with unexpected findings virtually all the time. One of the most important places that they anticipate the unexpected is in designing experiments. They build many conditions and controls into their experiments. Thus, rather than being the victims of the unexpected, they create opportunities for unexpected events to occur, and once these events do occur, they have specific reasoning strategies for determining which of these events will be a clue to a new discovery. They focus on the method, using local analogies, and only after repeated demonstration of the unexpected event will they switch to the use of new theoretical explanations using more distant analogies and generalizations. Scientists are not passive recipients of the unexpected; rather, they actively create the conditions for discovering the unexpected and have a robust mental toolkit that makes discovery possible (Dunbar, K., & Fugelsang, J. (2005). Causal thinking in science: How scientists and students interpret the unexpected. In M. E. Gorman, R. D. Tweney, D. Gooding & A. Kincannon (Eds.), Scientific and Technical Thinking (pp. 57-79). Mahwah, NJ: Lawrence Erlbaum Associates).
  • And I designed a system I thought fit the problem. I broke everything down into the smallest parts and tried to think of each person as a number in a gigantic equation. […] But it wasn’t working, because people aren’t like numbers. They’re more like letters and those letters want to become stories and dad said that stories need to be shared. I had anticipated a six-minute visit with each person named Black, but there were never just six minutes. Everyone took more time than I had planned for (Extremely Loud and Incredibly Close [screenplay adaptation], 2011).
  • Science has welcomed big data and scaled its methods accordingly. With a huge amount of digital-textual data, we must do the same. Close reading is not only impractical as a means of evidence gathering in the digital library, but big data render it totally inappropriate as a method of studying literary history (Jockers, M., Macroanalysis: Digital Methods and Literary History. University of Illinois Press, 2013).
  • “What I cannot create, I do not understand.” (R. Feynman, on his blackboard at the time of his death in February 1988).
  • “Any sufficiently advanced technology is indistinguishable from magic.” (Arthur C. Clarke).
  • “The best minds of my generation are thinking about how to make people click ads. That sucks.” (J. Hammerbacher).
  • [The Argument from Consciousness] is very well expressed in Professor Jefferson’s Lister Oration for 1949, from which I quote. “Not until a machine can write a sonnet or compose a concerto because of thoughts and emotions felt, and not by the chance fall of symbols, could we agree that machine equals brain, that is, not only write it but know that it had written it. No mechanism could feel (and not merely artificially signal, an easy contrivance) pleasure at its successes, grief when its valves fuse, be warmed by flattery, be made miserable by its mistakes, be charmed by sex, be angry or depressed when it cannot get what it wants.” This argument appears to be a denial of the validity of our test [i.e. the Turing test]. According to the most extreme form of this view the only way by which one could be sure that a machine thinks is to be the machine and to feel oneself thinking (A. Turing, Computing Machinery and Intelligence, 1950).