Namesakes
(a subtopic of History)



 

 
How new words come to be - They travel from abroad and migrate from the lab. Sometimes old words get new meanings; others are simply made up!

- Sharon J. Huntington. The Christian Science Monitor (July 16, 2002). Learn about robot ("Borrowing words from literature"), bug ("Old words put on new meanings"), and much more in this fascinating article.


Ada Programming Language | Bayes Theorem | Boolean algebra & Boolean Logic | Dijkstra's algorithm | Eliza | Markov Model & Markov Chains | Bug | Occam's Razor | Pandemonium | Pareto's Principle & The 80-20 Rule | Pascal's Wager & The Pascalene | Turing Test & Turing Machine | Zipf's Law


Augusta Ada Byron, Lady Lovelace (1815-1852) -> Ada Programming Language

  • Ada Byron, Lady Lovelace, An Analyst and Metaphysician (abstract). Betty Alexandra Toole. IEEE Annals of the History of Computing (Fall 1996) Vol. 18, No. 3; pp. 4-12. "The computer revolution also began with a woman, Augusta Ada Byron, Lady Lovelace, who wrote an article in 1843 that not only gave us descriptive, analytical, contextual, and metaphysical information about the Analytical Engine but also the first 'program.'"
  • Ada Byron, Lady Lovelace. A biography by Dr. Betty Toole. From the Biographies of Women Mathematicians Web Site at Agnes Scott College.
  • Ada Home: The Web Site for Ada. "Since March 1994 this server provides a home to users and potential users of Ada, a modern programming language designed to support sound software engineering principles and practices."
  • "Charles Babbage & Ada Byron (Lady Lovelace) worked on programmable mechanical calculating machines." - Brief History of AI.
  • Web watch - Virtual Ada. By Sean Dodson. The Guardian (October 10, 2002). "Ada was the daughter of the poet Lord Byron, and became Countess of Lovelace. She is often credited with being the first computer programmer, and worked with the engineer Charles Babbage, who developed the idea of the Analytical Engine in 1832-34."
  • August 24, 2003: The curious afterlife of Ada Lovelace. By Victoria James. The Japan Times. "Recent years, a century and a half after her death in November 1852 at the age of 36, have witnessed a fierce (and often mudslinging) battle over Ada Lovelace's reputation. ... Now, just as the fuss is dying down in the United States and Britain, the movie that set it all off has come to Japan. 'Conceiving Ada,' directed by Lynne Hershmann Leeson, a professor at the University of California, Davis...."
    • Author's postscript: "Those intrigued by Ada should read 'The Difference Engine' (1990) by William Gibson and Bruce Sterling. This 'what if' account of Victorian England explores what would have happened if Babbage's engines had met with acceptance in his lifetime, ushering in the computer age a century early. Ada, allowed to cheat her early death, becomes the first prophet of artificial intelligence."


Thomas Bayes (1702 - 1761) -> Bayes Theorem

  • 18th-century theory is new force in computing. By Michael Kanellos. CNET News.com (February 18, 2003). "Thomas Bayes, one of the leading mathematical lights in computing today, differs from most of his colleagues: He has argued that the existence of God can be derived from equations. His most important paper was published by someone else. And he's been dead for 241 years. Yet the 18th-century clergyman's theories on probability have become a major part of the mathematical foundations of application development. Search giant Google and Autonomy, a company that sells information retrieval tools, both employ Bayesian principles to provide likely (but technically never exact) results to data searches. Researchers are also using Bayesian models to determine correlations between specific symptoms and diseases, create personal robots, and develop artificially intelligent devices that 'think' by doing what data and experience tell them to do."
  • A biography by J. J. O'Connor and E. F. Robertson of the School of Mathematics and Statistics, University of St Andrews, Scotland.
  • Bayesian logic, a whatis definition from TechTarget. "Bayes first proposed his theorem in his 1763 work (published two years after his death in 1761), An Essay Towards Solving a Problem in the Doctrine of Chances. Bayes' theorem provided, for the first time, a mathematical method that could be used to calculate, given occurrences in prior trials, the likelihood of a target occurrence in future trials. According to Bayesian logic, the only way to quantify a situation with an uncertain outcome is through determining its probability." (A small worked example of Bayes' rule follows this list.)
  • Bayesian Inference - computer applications. From Wikipedia, the free encyclopedia.
  • What is Bayesian Learning? From Part 3 of the Neural Network FAQ, maintained by Warren S. Sarle
  • See our Uncertainty / Probability and Machine Learning pages
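
To make the definitions above concrete, here is a minimal worked example of Bayes' rule, P(H|E) = P(E|H) P(H) / P(E), applied to a diagnostic-test scenario. All of the numbers are invented for illustration; this is a sketch, not a recipe.

    # Bayes' rule with made-up numbers for a screening test.
    p_disease = 0.01               # prior: 1% of people have the condition
    p_pos_given_disease = 0.95     # test sensitivity
    p_pos_given_healthy = 0.05     # false-positive rate

    # Total probability of a positive result, P(E).
    p_pos = (p_pos_given_disease * p_disease +
             p_pos_given_healthy * (1 - p_disease))

    # Posterior probability of disease given a positive test, P(H|E).
    p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
    print(f"P(disease | positive test) = {p_disease_given_pos:.3f}")   # about 0.161

Even with a fairly accurate test, the low prior keeps the posterior modest - exactly the "likely but never exact" style of reasoning the articles above describe.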


George Boole (1815 - 1864) -> Boolean algebra, Boolean Logic

  • George Boole. By J. J. O'Connor and E. F. Robertson, School of Mathematics and Statistics University of St. Andrews, Scotland. "Boole approached logic in a new way reducing it to a simple algebra, incorporating logic into mathematics. He pointed out the analogy between algebraic symbols and those that represent logical forms. It began the algebra of logic called Boolean algebra which now finds application in computer construction, switching circuits etc."
  • The Isaac Newton of logic - It was 150 years ago that George Boole published his classic The Laws of Thought, in which he outlined concepts that form the underpinnings of the modern high-speed computer. By Siobhan Roberts. The Globe and Mail (March 27, 2004; page F9).
  • So, who was George Boole and why is he famous? Background information for the National Institute for Literacy's lesson guide for Venn Diagrams/Sorting Circles & George Boole! "Boolean logic uses words called 'operators'. There are three main 'operators': the words AND, OR, and NOT." (A short code illustration follows this list.)
  • The Calculus of Logic. By George Boole. Cambridge and Dublin Mathematical Journal Vol. III (1848), pp. 183-98. (Transcribed by D.R. Wilkins, School of Mathematics Trinity College, Dublin.)
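
As a concrete companion to the AND/OR/NOT description above, here is a tiny Python sketch that prints the behavior of the three basic Boolean operators (Python happens to use the same three words):

    # The three basic Boolean operators: AND, OR, NOT.
    for a in (False, True):
        for b in (False, True):
            print(a, "AND", b, "=", a and b, " | ", a, "OR", b, "=", a or b)
    print("NOT True  =", not True)
    print("NOT False =", not False)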

Edsger Wybe Dijkstra (1930 - 2002) -> Dijkstra's algorithm

  • "Dr. Dijkstra is best known for his shortest-path algorithm, a method for finding the most direct route on a graph or map, and for his work as the co-designer of the first version of Algol 60, a programming language that represented one of the first compiler programs that translates human instructions. The shortest-path algorithm, which is now widely used in global positioning systems and travel planning, came to him one morning in 1956 as he sat sipping coffee on the terrace of an Amsterdam cafe. It took him three years to publish the method, which is now known simply as Dijkstra's algorithm. At the time, he said, algorithms were hardly considered a scientific topic." From his obituary in The New York Times, by John Markoff, August 10, 2002 (no fee reg. req'd). (A minimal sketch of the algorithm follows this list.)
  • Route Planning. From the Computational Intelligence Research Laboratory (CIRL) of the University of Oregon. "A common problem studied particularly in robotics and artificial intelligence involves calculating navigational routes from a starting point to a destination point for an entity to pursue through a given space. The process typically involves discretizing the navigational space into a graph of intermediate waypoints linked together through single-step transitions. In typical implementations of systems to solve such problems, waypoints are modeled as nodes of a graph, while transition paths between waypoints become arcs connecting the nodes. Representation of waypoints (states) and transitions (arcs) is typically highly domain dependent. Algorithms such as A*, IDA*, Recursive Best First Search (RBFS), and Dijkstra's Algorithm may be used to search for a solution that fits desired criteria."
  • How Routing Algorithms Work. By Roozbeh Razavi. Howstuffworks. "In [Dijkstra shortest path] algorithm, a router, based on information that has been collected from other routers, builds a graph of the network. This graph shows the location of routers in the network and their links to each other. Every link is labeled with a number called the weight or cost. This number is a function of delay time, average traffic, and sometimes simply the number of hops between nodes. ..."
  • Also see: In Pursuit of Simplicity - the manuscripts of Edsger W. Dijkstra
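
The route-planning and router descriptions above can be summarized in a few lines of code. Below is a minimal, illustrative Python sketch of Dijkstra's algorithm on an invented toy graph (the node names and edge weights are made up; a weight might stand for delay, distance, or hop cost):

    import heapq

    def dijkstra(graph, start):
        """Shortest distance from `start` to every node; `graph` maps node -> {neighbor: weight}."""
        dist = {node: float("inf") for node in graph}
        dist[start] = 0
        queue = [(0, start)]                  # priority queue of (distance, node)
        while queue:
            d, node = heapq.heappop(queue)
            if d > dist[node]:
                continue                      # stale entry; a shorter path was already found
            for neighbor, weight in graph[node].items():
                new_d = d + weight
                if new_d < dist[neighbor]:
                    dist[neighbor] = new_d
                    heapq.heappush(queue, (new_d, neighbor))
        return dist

    roads = {                                 # a toy "map" of waypoints
        "A": {"B": 4, "C": 1},
        "B": {"D": 1},
        "C": {"B": 2, "D": 5},
        "D": {},
    }
    print(dijkstra(roads, "A"))               # {'A': 0, 'B': 3, 'C': 1, 'D': 4}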

Eliza Doolittle -> Eliza, the chatterbot

  • Talk to her - Artificial intelligence vs. human stupidity. By Victoria James. The Japan Times (November 23, 2003). "The earliest chatterbot programs ever written say more about the human condition than they do about the nature of computer intelligence. The first, ELIZA -- or Dr. Eliza, as 'she' was known -- had the persona of a Rogerian psychotherapist. Her successor, perhaps the inspiration for Marvin, the 'paranoid android' of Douglas Adams' anarchic 'The Hitchhiker's Guide to the Galaxy' novels, was named PARRY and was programmed to display the behavioral hallmarks of a paranoid schizophrenic. ... [Joseph] Weizenbaum recognized that Alan Turing's 'Imitation Game' test of computer intelligence required merely that the computer simulate intelligence, so he used some simple semantic tricks to create the desired effect. (It's no coincidence that his program shares the name of Eliza Doolittle, the erstwhile heroine of George Bernard Shaw's 'Pygmalion,' a flower girl trained up to act like a lady in a perfect example of an 'imitation game.') ... In 1994, the term 'chatterbot' was established in the AI lexicon by Michael Mauldin of Carnegie Mellon University, in his account of entering the Loebner contest."
  • Visit our collection of chatterbots . . . and find out more about Alan Turing.

Andrei Andreyevich Markov (1856 - 1922) -> Markov Model, Markov Chains . . .

  • "Markov is particularly remembered for his study of Markov chains, sequences of random variables in which the future variable is determined by the present variable but is independent of the way in which the present state arose from its predecessors." From the biography by J. J. O'Connor and E. F. Robertson of the School of Mathematics and Statistics, University of St Andrews, Scotland.
  • "His name is best known for the concept of the Markov chain, a series of events in which the probability of a given event occurring depends only on the immediately previous event." From his Biography.com bio.
  • "A related technique, called Hidden Markov models, allows probability to anticipate sequences. A speech recognition application, for example, knows that the sound most likely to follow "q" is "u." Along those lines, the software can also calculate the possible utterance of the word Qagga, an extinct zebra." - from 18th-century theory is new force in computing. By Michael Kanellos. CNET News.com (February 18, 2003).
  • Hidden Markov Models Tutorial from the School of Computing, University of Leeds. "[T]he type of system we will consider in this tutorial. * First we will introduce systems which generate probabilistic patterns in time, such as the weather fluctuating between sunny and rainy. * We then look at systems where what we wish to predict is not what we observe - the underlying system is hidden. In the above example, the observed sequence would be the seaweed and the hidden system would be the actual weather. * We then look at some problems that can be solved once the system has been modeled." (A short Markov-chain sketch follows this list.)
  • Also see our pages such as: Natural Language Processing, Speech, Planning & Scheduling, Software (Hidden Markov Models), Uncertainty/Probability
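
To illustrate the "future depends only on the present" property described above, here is a small Python sketch of a two-state weather Markov chain in the spirit of the Leeds tutorial (the transition probabilities are invented):

    import random

    transitions = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},   # P(tomorrow | today = sunny)
        "rainy": {"sunny": 0.4, "rainy": 0.6},   # P(tomorrow | today = rainy)
    }

    def simulate(start, days, seed=0):
        random.seed(seed)
        state, history = start, [start]
        for _ in range(days):
            state = random.choices(list(transitions[state]),
                                   weights=list(transitions[state].values()))[0]
            history.append(state)
        return history

    print(simulate("sunny", 7))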


Moth (1947) -> Bug

  • "One day a computer failure had [Grace Murray] Hopper and her team baffled. Finally they opened the machine - a moth had gotten inside! Hopper taped the offending creature into her log book and noted beside it, 'first actual bug found.' She is credited with the terms 'bug' and 'debug' for computer errors and how to fix them." From PBS' A Science Odyssey: People and Discoveries - Grace Murray Hopper (1906 - 1992).
  • First instance of actual computer bug being found - September 9, 1947. "This day in History" from the Computer History Museum. "At 3:45 p.m., Grace Murray Hopper records the first computer bug in her log book...."
  • See a photo of the first computer bug. From the Smithsonian National Museum of American History's Computer History Collection.
  • Also see Sherri Danis' biography of Grace Murray Hopper, from the "collection of materials relating to the history of computing ... provided courtesy of the Department of Computer Science at Virginia Tech, and ... sponsored in part by a grant from the National Science Foundation."
  • Grace Hopper links from Grace Hopper Celebrations
  • ... and a computer scientist sent us a note telling us about these two resources:
    • word IQ Encyclopedia's definition of Computer bug. Excerpt: "Usage of the term 'bug' to describe inexplicable defects has been a part of engineering jargon for many decades; it may have originally been used in hardware engineering to describe mechanical malfunctions. Problems with radar electronics during World War II were referred to as bugs (or glitches), and there is evidence that the usage dates back much earlier. This mention can be found in a letter from Edison to an associate in 1878:

      'It has been just so in all of my inventions. The first step is an intuition, and comes with a burst, then difficulties arise -- this thing gives out and [it is] then that 'Bugs' -- as such little faults and difficulties are called -- show themselves and months of intense watching, study and labor are requisite before commercial success or failure is certainly reached.'"

    • James S. Huggins' Refrigerator Door: First Computer Bug. Excerpt: "But, let's go way, way back to Shakespeare. In Henry VI, Part III, Act V, Scene II, King Edward says 'So, lie thou there. Die thou; and die our fear; For Warwick was a bug that fear'd us all.'"


William of Ockham (1285 - 1347/49) ->  Occam's Razor

  • "His best-known philosophical contributions are ... and his deployment in theology of the rule of ontological economy, 'entities are not to be multiplied beyond necessity', so frequently and to such effect that it came to be known as Ockham's razor." From his Biography Online bio.
  • Occam's Razor. From Wikipedia, the free encyclopedia. "In its simplest form, Occam's Razor states that one should make no more assumptions than needed. When multiple explanations are available for a phenomenon, the simplest version is preferred."
  • Occam's Razor (Specialized to Decision Trees) in Howard J. Hamilton's Overview of Decision Trees. (A small model-selection sketch follows this list.)
  • see our Razor 'toon
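
In machine learning the razor is often read as "among models that fit the data about equally well, prefer the simplest" - for decision trees, the smallest tree. The sketch below illustrates that reading with invented model sizes and accuracies:

    # Occam's Razor as a model-selection tie-breaker (all numbers invented).
    candidates = [
        {"name": "deep tree",    "nodes": 57, "accuracy": 0.91},
        {"name": "shallow tree", "nodes": 9,  "accuracy": 0.90},
        {"name": "stump",        "nodes": 3,  "accuracy": 0.74},
    ]

    tolerance = 0.02   # treat accuracies within 2 points as "equally good"
    best = max(c["accuracy"] for c in candidates)
    good_enough = [c for c in candidates if best - c["accuracy"] <= tolerance]
    chosen = min(good_enough, key=lambda c: c["nodes"])   # simplest adequate model
    print("Occam's choice:", chosen["name"])              # shallow tree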

Pandemonium, the capital of Hell in John Milton's Paradise Lost  ->  Oliver Selfridge's classic paper, Pandaemonium (a/k/a Pandemonium)

  • "The word Pandemonium can be either upper or lower case. The uncapitalized term names 'a tumult or wild uproar,' while the capitalized version refers to 'the infernal regions, or to the capital of Hell in John Milton's Paradise Lost.' When Milton coined Pandemonium for his epic poem, he combined the Greek pan meaning 'all, or every,' with the Latin daemonium, or 'evil spirit.'" - from Merriam-Webster's Word for the Wise, broadcast of September 5, 2001: "We recently heard from a fellow interested in the story behind the word pandemonium...." You can also listen to it!
  • "Walt Bunch believes the term [demon / daemon] comes from the demons in Oliver Selfridge's paper 'Pandemonium', MIT 1958, which was named after the capital of Hell in Milton's 'Paradise Lost'. Selfridge likened neural cells firing in response to input patterns to the chaos of millions of demons shrieking in Pandemonium." - from the definition of "demon" in FOLDOC.
    • "Demons (parts of programs) are particularly common in AI programs. For example, a knowledge-manipulation program might implement inference rules as demons." - from the definition of "demon" in FOLDOC.
  • "Pandemonium - Model of feature detection developed by Selfridge (1959): originally to recognise Morse code patterns, but later developed by Lindsay and Norman (1972) into a bottom-up theory of letter recognition. Quite apart from its usual meaning, the word 'pandemonium' refers to the dwelling-place of all the demons. Selfridge made use of both meanings of the word in his model, in which neuronsor neural clusters 'shriek' to indicate the presence of particular features of the perceived stimulus." - from the Psybox Online Dictionary's definition of "Pandemonium"
  • Agents: from Pandemonium to ... whither? - Oliver Selfridge. "His pandemonium paper of 1958 is recognized as the beginning of breakthroughs in several fields." - Fifth Annual New Paradigms for Using Computers Workshop, IBM Almaden Research Center.
  • Oliver Selfridge - entry in Wikipedia, the free encyclopedia.
  • " Pandemonium consists of four separate layers: each layer is composed of 'demons' specialized for specific tasks. The bottom layer consists of data or image demons that store and pass on the data. The third layer is composed of computational demons that perform complicated computations on the data and then pass the results up to the next level. The second layer is composed of cognitive demons who weight the evidence from the computational demons and 'shrie' the amount of evidence up to the top layer of the network. The more evidence that is accumulated, the louder the shriek. At the top layer of the network is the decision demon, who simply listens for the loudest 'shriek' from the cognitive demons, and then decides what was presented to the network. - from A Brief History of Connectionism, by David A. Medler.
  • Also see: Agents (including Agent Architecture), Cognitive Science, Vision, Neural Networks and MAS
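
As a toy illustration of the four-layer picture quoted above, here is a short Python sketch in which cognitive demons "shriek" in proportion to the evidence for their letter and a decision demon picks the loudest shriek. The features, templates, and weights are invented for illustration only:

    observed = {"vertical_bar": 1, "bottom_bar": 1, "top_bar": 0, "curve": 0}

    cognitive_demons = {                       # each demon's preferred features
        "L": {"vertical_bar": 1, "bottom_bar": 1},
        "T": {"vertical_bar": 1, "top_bar": 1},
        "O": {"curve": 1},
    }

    def shriek(template, features):
        # Louder shriek when more of the demon's features are present.
        return sum(features.get(f, 0) * w for f, w in template.items())

    decision = max(cognitive_demons,
                   key=lambda d: shriek(cognitive_demons[d], observed))
    print("The decision demon hears:", decision)   # L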

Vilfredo Pareto -> Pareto's Principle -> The 80-20 Rule

  • Vilfredo Pareto, 1848-1923, a biographical sketch with links to major works and resources. From Gonçalo L. Fonseca's The History of Economic Thought Website
  • The 80/20 Rule of Time Management - This technique teaches you to focus on what's really important in your life and your life's work. By Pamela J. Vaccaro, MA. Family Practice Management (September 2000). "Vilfredo Pareto, an Italian economist, 'discovered' this principle in 1897 when he observed that 80 percent of the land in England (and every country he subsequently studied) was owned by 20 percent of the population. Pareto's theory of predictable imbalance has since been applied to almost every aspect of modern life. Given a chance, it can make a difference in yours. Simply put, the 80/20 rule states that the relationship between input and output is rarely, if ever, balanced. When applied to work, it means that approximately 20 percent of your efforts produce 80 percent of the results."
  • "Rosenfeld: Rules don't work so well in IA either; in fact, the 'right' answer to any tricky IA question is 'It depends.' The only IA 'rule"' that comes to mind is the Pareto Principle, a.k.a. the 80/20 rule, which really isn't a rule at all. The way Pareto works in IA is basically that some large number of users will benefit from a small selection of all the possible architectural approaches. So pick the few best ways that give you and your users the most bang for your buck. There are many other variations on Pareto; for example, the few most common searches constitute the vast majority of all searches (and can be addressed by manually developed 'best bet' results). If you don't believe me, just check your search logs." - from Information Architecture Meets Usability, Bruce Stewart interviews Lou Rosenfeld and Steve Krug. O'Reilly Network (May 13, 2003).
  • Also see Zipf's Law
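
A quick numerical illustration of the 80/20 pattern, using invented per-customer sales figures:

    # Do the top 20% of customers account for roughly 80% of sales? (Toy data.)
    sales = [400, 350, 300, 120, 60, 50, 40, 30, 25, 20,
             18, 15, 12, 10, 8, 7, 6, 5, 3, 1]      # 20 customers, largest first

    top_fifth = sales[: len(sales) // 5]             # the top 4 customers
    share = sum(top_fifth) / sum(sales)
    print(f"Top 20% of customers produce {share:.0%} of sales")   # about 79%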

Blaise Pascal (1623 - 1662) -> Pascal's Wager & The Pascalene

  • "'Pascal's Wager' is the name given to an argument due to Blaise Pascal for believing, or for at least taking steps to believe, in God. ... We find in it the extraordinary confluence of several strands in intellectual thought: the justification of theism; probability theory and decision theory, used here for almost the first time in history; pragmatism; voluntarism (the thesis that belief is a matter of the will); and the use of the concept of infinity." - from the Stanford Encyclopedia of Philosophy
  • "1623-62, French scientist and religious philosopher." -from Bartleby.com, with links to other sources.
  • The Pascalene: In 1642 he "created an adding machine with automatic carries from one position to the next. The son of a tax collector, Pascal devised a machine that contained several dials that could be turned with the aid of a stylus." See a photo of this machine when you visit this page from the IEEE Computer Society's timeline of events in computing history. (A tiny carry-propagation sketch follows below.)
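
The "automatic carries" that made the Pascalene notable are the same carries used in ordinary long addition. Here is a tiny, purely illustrative Python sketch of that mechanism:

    def add_with_carries(a_digits, b_digits):
        """Add two numbers given as equal-length digit lists, least significant digit first."""
        result, carry = [], 0
        for a, b in zip(a_digits, b_digits):
            total = a + b + carry
            result.append(total % 10)   # the digit that stays on this dial
            carry = total // 10         # the carry passed to the next dial
        if carry:
            result.append(carry)
        return result

    # 276 + 847 = 1123, with digits stored least-significant-first.
    print(add_with_carries([6, 7, 2], [7, 4, 8]))   # [3, 2, 1, 1]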

Alan Mathison Turing (1912 - 1954) -> Turing Test, Turing Machine

  • 'Father of the computer' honoured. BBC News (June 7, 2004). "The father of the modern computer is being honoured, 50 years after he died in tragic circumstances."
  • Man who cracked computer enigma. Opinion by Andrew Hodges. Edinburgh Evening News / available from Scotsman.com News (June 8, 2004). "In 1944, following the invasion of Normandy that Allied control of the Atlantic allowed, Alan Turing was almost uniquely in possession of three key ideas - his own 1936 concept of the universal machine, the potential speed and reliability of electronic technology and the inefficiency in designing different machines for different logical processes. Combined, these ideas provided the principle, the practical means and the motivation for the modern computer. ... From October 1947, the National Physical Laboratory allowed, or perhaps preferred, that he should spend the academic year at Cambridge. Out of this came a pioneering paper on what would now be called neural nets. ... Though marginalised in practice, he published his theoretical ideas on artificial intelligence in 1950 in a paper which is now one of the most quoted in science. His 'Turing Test' for intelligent machinery now has a long and entertaining history."
  • "While addressing a problem in the arcane field of mathematical logic, he imagined a machine that could mimic human reasoning. Sound familiar?" Read Alan Turing's entry in TIME's 100 Scientists & Thinkers.
  • Alan Turing - Thinking Up Computers. The Cambridge University mathematician laid the foundation for the invention of software. By Andy Reinhardt. BusinessWeek Online (May 10, 2004). ["As part of its anniversary celebration, BusinessWeek is presenting a series of weekly profiles for the greatest innovators of the past 75 years."] "The rarefied world of early 20th-century mathematics seems light years away from today's PCs and virtual-reality video games. Yet it was a 1936 paper by Cambridge University mathematician Alan M. Turing that laid the foundation for the electronic wonders now crowding into every corner of modern life. In a short and eventful life, Turing also played a vital role in World War II by helping crack Germany's secret codes -- only to be persecuted later for his homosexuality. ... Turing invoked the notion of a 'universal machine' that could be given instructions to perform a variety of tasks. Turing spoke of a "machine" only abstractly, as a sequence of steps to be executed. But his realization that the data fed into a system also could function as its directions opened the door to the invention of software. ... Turing didn't live to see the revolution he unleashed. But he left an enormous legacy. In 1950 he proposed a bold measure for machine intelligence: If a person could hold a typed conversation with 'somebody' else, not realizing that a computer was on the other end of the wire, then the machine could be deemed intelligent. Since 1990 an annual contest has sought a computer that can pass this 'Turing Test.'"
  • "In 1924, he published a paper proving that mathematics would always contain statements that could neither be proven nor refuted. As part of his argument, he envisioned a machine that could compute any number. This machine, which included a control unit and a memory, could perform several basic actions: reading, writing or erasing symbols on a tape, and advancing or rewinding the tape. This simple 'Turing machine' served as the model for all later digital computers." From Biography's entry for Alan Turing.
  • "A Turing machine is an abstract representation of a computing device. It consists of a read/write head that scans a (possibly infinite) one-dimensional (bi-directional) tape divided into squares, each of which is inscribed with a 0 or 1. Computation begins with the machine, in a given 'state', scanning a square. It erases what it finds there, prints a 0 or 1, moves to an adjacent square, and goes into a new state." From the "Turing Machine" entry in the Stanford Encyclopedia of Philosophy.
  • Turing Machine. "A theoretical computing machine invented by Alan Turing (1937) to serve as an idealized model for mathematical calculation." - By Eric W. Weisstein, in MathWorld--A Wolfram Web Resource.
  • Alan Turing: a very comprehensive web site maintained by Andrew Hodges. Here's a page from the site about Turing Machines.
  • The First Hacker and his Imaginary Machine. Chapter 3 of the 1985 edition of Howard Rheingold's Tools for Thought (The MIT Press). "The Turing Machine was a hypothetical device Turing invented on the way to settling a critical question about the foundations of mathematics as a formalized means of thinking."
  • For information about the Turing Test, see our page: Turing Test.
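
The tape-and-table description quoted above can be run directly. Below is a minimal, illustrative Python simulator; the transition rules (invert 0s and 1s, then halt at the first blank) are invented for the example:

    def run(tape, rules, state="start", head=0, max_steps=100):
        cells = dict(enumerate(tape))              # sparse tape; blank cells read as " "
        for _ in range(max_steps):
            symbol = cells.get(head, " ")
            if (state, symbol) not in rules:
                break                              # no applicable rule: the machine halts
            write, move, state = rules[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    invert_rules = {                               # (state, symbol) -> (write, move, next state)
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
    }
    print(run("0110", invert_rules))               # 1001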

George Kingsley Zipf (1902-1950) -> Zipf's Law

  • "George Kingsley Zipf was a Harvard linguist who in the 1930s noticed that the distribution of words adhered to a regular statistical pattern. The most common word in English -- 'the' -- appears roughly twice as often in ordinary usage as the second most common word, three times as often as the third most common, ten times as often as the tenth most common, and so on. As an afterthought, Zipf also observed that cities' sizes followed the same sort of pattern, which became known as a Zipf distribution. Oversimplifying a bit, if you rank cities by population, you find that City No. 10 will have roughly a tenth as many residents as City No. 1, City No. 100 a hundredth as many, and so forth. (Actually the relationship isn't quite that clean, but mathematically it is strong nonetheless.) Subsequent observers later noticed that this same Zipfian relationship between size and rank applies to many things: for instance, corporations and firms in a modern economy are Zipf-distributed." -from Seeing Around Corners, by Jonathan Rauch. The Atlantic (April 2002).
  • Tunes create context like language - Maths shows why tonal music is easy listening. By Philip Ball. NATURE Science Update (June 19, 2004). "In both written text and speech, the frequency with which different words are used follows a striking pattern. In the 1930s, American social scientist George Kingsley Zipf discovered that if he ranked words in literary texts according to the number of times they appeared, a word's rank was roughly proportional to the inverse of its frequency. In other words, a graph of one plotted against the other appeared as a straight line. The economist and sociologist Herbert Simon later offered an explanation for this mathematical relationship. He argued that as a text progresses, it creates a meaningful context within which words that have been used already are more likely to appear than other, random words. For example, it is more likely that the rest of this article will contain the word 'music' than the word 'sausage'."
  • References on Zipf's law from Wentian Li of Rockefeller University.
  • His biography from Virtual Laboratories in Probability and Statistics.
  • Zipf's Law, by Dr. Richard S. Wallace of the A.L.I.C.E. AI Foundation. "The Zipf curve is a characteristic of human languages, and many other natural and human phenomena as well. ... The Zipf curve was even known in the 19th century. The economist Pareto also noticed the log-rank property in studies of corporate wealth."
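
A quick way to look for the rank-frequency pattern described above on any sizable English text you have on hand (the filename below is just a placeholder):

    from collections import Counter

    with open("sample_text.txt", encoding="utf-8") as f:     # placeholder filename
        words = f.read().lower().split()

    for rank, (word, freq) in enumerate(Counter(words).most_common(10), start=1):
        # Under Zipf's law, rank * frequency stays roughly constant.
        print(f"{rank:2d}  {word:12s} freq={freq:6d}  rank*freq={rank * freq}")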

More Namesakes:

  • Deep Blue: see our Chess page.
  • Heisenbugs: "For at least three decades now, programmers have joked of 'heisenbugs' -- software errors that surface at seemingly random intervals and whose root causes consistently evade detection. The name is a takeoff on Werner Heisenberg, the German physicist whose famous uncertainty principle posited that no amount of observation or experimentation could pinpoint both the position and momentum of an electron." - excerpt from: Computer, heal thyself - Why should humans have to do all the work? It's high time machines learned how to take care of themselves. By Sam Williams. Salon.com (July 12, 2004; no fee reg. req'd.).
  • I, Robot / iRobot : "After enduring plenty of lean years chasing that elusive vision as a co-founder and chairman of iRobot Corp., [Helen] Greiner can now boast a product that whirs and chirps much like the character she to this day calls her 'personal hero.' The Roomba vacuum cleaner.... Greiner stresses the PackBot's defensive role, but technologies that IRobot and other defense contractors are developing are expected to lead to front-line robots — including unarmed reconnaissance rovers that lead soldiers into buildings and help direct gunfire, and armed and autonomous robots that do the shooting themselves. ... Such prospects have raised ethical concerns, and run counter to a principle -- that robots should not harm humans -- outlined by classic science fiction author Isaac Asimov in his 1950 anthology, 'I, Robot' -- the namesake of Greiner's company." - from Robots Tackle Living Room and Battlefield. By Mark Jewell. Associated Press (May 30, 2005) / available from The Los Angeles Times / also available from HoustonChronicle.com (iRobot co-founder's perseverance pays off).

Related Pages: