Happy 65th birthday, your computer

by Matthew Cobb

The machine you are reading this post on can trace its conceptual ancestry back to the city where I live and work – Manchester. 65 years ago, on 21 June 1948, the world’s first computer that could store a programme in its electronic memory – Random Access Memory or RAM – was turned on in a red brick building. Developed by ‘Freddie’ Williams, Tom Kilburn and Geoff Tootill, the machine stored 1,024 binary digits – 32 words of 32 bits – on a cathode ray tube. This development marked the beginning of programming.
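The programme run that day – written by Tom Kilburn – found the highest proper factor of 2^18 (262,144) by testing candidate divisors downwards, doing each trial division by repeated subtraction because the machine had a subtract instruction but no divide. It arrived at the answer, 131,072, after 52 minutes. As a purely illustrative sketch (modern Python, not the Baby’s own machine code), the calculation looks like this:

```python
# Illustrative sketch of the Baby's first programme (21 June 1948):
# find the highest proper factor of 2**18 by testing divisors downwards,
# with trial division done by repeated subtraction, since the machine
# had a subtract instruction but no divide (or even add).

def divides(n, d):
    """Return True if d divides n, using only subtraction."""
    while n >= d:
        n -= d
    return n == 0

def highest_proper_factor(n):
    candidate = n - 1
    while not divides(n, candidate):
        candidate -= 1
    return candidate

if __name__ == "__main__":
    n = 2 ** 18                      # 262,144
    print(highest_proper_factor(n))  # 131,072; the Baby took 52 minutes
```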

The machine’s official name was ‘the Manchester Small Scale Experimental Machine’; its nickname was ‘The Baby’. Despite its name, it filled a room:

The Small-Scale Experimental Machine, University of Manchester

To commemorate this 65th anniversary, Google has produced this brief video explaining what was involved and why it was such a great leap forward in the history of computing. It includes archive footage and interviews with many of the men who were involved.

Interestingly (but perhaps not surprisingly), this 1961 US Army phylogenetic tree of computers completely ignores The Baby. This (very large) 1975 poster of the history of computers takes a broader view, covering all the different sources and routes to the development of the modern computer, including The Baby.

At the time, people were well aware of the significance of what was going on in Manchester. In 1946, Norbert Wiener, the mercurial genius who was in the process of developing the concepts of cybernetics, visited the UK. After chatting to the geneticist J B S Haldane in London, Wiener made the smoky journey up to Manchester. ‘I found that Manchester was well at the front of the new technique of high-speed automatic computing machines’, he wrote in his autobiography.

On that visit to England, Wiener also met Alan Turing, who had already theorised the idea of a programmable computer as the ‘universal machine’. In 1948, Turing joined the University of Manchester and began to use The Baby and its successors, ultimately using the machines to try to understand patterns of organismal development, through what he termed ‘morphogens’. Some of Turing’s programmes are still intact and were displayed in a recent Turing centenary exhibition about his work on morphogens at the Manchester Museum.
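Turing’s 1952 paper ‘The Chemical Basis of Morphogenesis’ framed pattern formation as chemicals (‘morphogens’) reacting with one another while diffusing at different rates, and he ran some of the earliest numerical experiments of this kind on the Manchester machines. To give a flavour of what such a simulation involves, here is a minimal modern sketch of a reaction-diffusion system of the same family – it uses the later Gray–Scott equations with parameters of my own choosing, not Turing’s own programmes or numbers:

```python
# Minimal 1-D reaction-diffusion sketch (Gray-Scott model), illustrating
# the kind of pattern-forming system Turing studied; parameters are
# illustrative, not taken from Turing's own Manchester programmes.
import numpy as np

N, steps = 200, 20000
Du, Dv, F, k = 0.16, 0.08, 0.035, 0.060   # diffusion rates, feed, kill

u = np.ones(N)                            # "substrate" morphogen
v = np.zeros(N)                           # "activator" morphogen
v[N // 2 - 5: N // 2 + 5] = 0.5           # seed a small perturbation

def laplacian(a):
    """Discrete Laplacian on a ring (periodic boundary)."""
    return np.roll(a, 1) - 2 * a + np.roll(a, -1)

for _ in range(steps):
    uvv = u * v * v
    u += Du * laplacian(u) - uvv + F * (1 - u)
    v += Dv * laplacian(v) + uvv - (F + k) * v

# A crude readout: count contiguous regions where the activator is high.
high = v > 0.2
bands = np.count_nonzero(high & ~np.roll(high, 1))
print(f"activator bands after {steps} steps: {bands}")
```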

In an intriguing ‘what if’, the pioneer of virus genetics, Max Delbrück, came within a gnat’s whisker of joining the University of Manchester at the same time. However, after accepting the offer, he changed his mind and went to Caltech instead. In a parallel universe, the histories of molecular genetics, developmental biology and computing all took a very different turn as Turing and Delbrück interacted in unimaginable ways.

The Baby was soon cannibalised and within 18 months was out of date; nothing remains of the original machine. But to mark the 50th anniversary, a working replica was built; it can be seen at the Museum of Science and Industry (MOSI) in Manchester, which is housed in the world’s first passenger railway station.

If you visit Manchester, be sure to come and see the working copy, and also the blue plaque on the side of the Bridgeford Street building where The Baby was built and eventually turned on:

 

http://farm9.staticflickr.com/8430/7510182174_441f5df0dd_z.jpg

27 Comments

  1. Derek
    Posted June 22, 2013 at 12:18 pm | Permalink

    Yes, the US Army poster is significantly deficient in dealing with the English computers, such as the Elliott series – one of which I trained on at Victoria University in New Zealand in the late 1960s.

    • Jim Jones
      Posted June 22, 2013 at 1:17 pm | Permalink

      Aaah, the Elliott 503 and Algol 68, replaced by the Burroughs B6500 with card reader and chain printer – and COBOL.

      I was offered the Elliott for free if I’d haul it out. I declined. I already had my TRS-80 Model I.

  2. Gordon Hill
    Posted June 22, 2013 at 12:54 pm | Permalink

    One of the more interesting lapses in computer history is the minuscule mention of analog computers and their merging with digital computers as hybrids, without which the space program – which forced the development of higher-speed, lower-power, lighter-weight digital components – would have taken much longer. While low in accuracy – the Milgo 4100 had a component accuracy of only 0.01% – their computing components included real-time and compressed-time integration, which allowed flight simulation models to be developed and refined in days, not months.

    Not that this trumps the work done in Manchester, but I was there in the early sixties and, as Jimmy Durante said, “Everybody wants to get into the act.” It was a computer fest. Thanks for the reminder.

    • Torbjörn Larsson, OM
      Posted June 22, 2013 at 3:16 pm | Permalink

      More generic hybrids are re-entering the market with D-Wave’s quantum annealing machine. We’ll see if they stick around for long. (Given the history you describe, I doubt it.)

    • Stephen P
      Posted June 23, 2013 at 2:55 am | Permalink

      I can understand that the omission is galling for people who actually worked with them. But let’s face it: the usage of analogue computers was small, specialised and brief. The Milgo you mention came out in the mid-1960s. And indeed when I first started reading up on computers in the late sixties and early seventies, practically every article mentioned the existence of both digital and analogue computers. But I entered computing in 1975 and, although I had the fortune to do a huge variety of computing work, I never came across an analogue computer, met anyone who was using an analogue computer or even (AFAICR) met anyone who ever had used an analogue computer. So the omission isn’t that surprising.

  3. Jim Jones
    Posted June 22, 2013 at 1:13 pm | Permalink

    And:

    http://en.wikipedia.org/wiki/Konrad_Zuse

    • Torbjörn Larsson, OM
      Posted June 22, 2013 at 3:29 pm | Permalink

      Indeed, a Turing-complete machine should be equivalent to a stored-program machine. Zuse is always forgotten outside of Europe, I think.

      The next meta-computing achievement was programmable hardware, which has its place too. (You are starting to feel ROI constraints.)

  4. BigBob
    Posted June 22, 2013 at 3:27 pm | Permalink

    Quote
    This development marked the beginning of programming.
    Unquote

    But that was the Jacquard loom surely. And even that had its precursors.

    “The ability to change the pattern of the [Jacquard] loom’s weave by simply changing cards was an important conceptual precursor to the development of computer programming. Specifically, Charles Babbage planned to use cards to store programs in his Analytical engine”.

    From Wikipedia natch’.
    Bob(Big)

    • Michael Fisher
      Posted June 22, 2013 at 4:43 pm | Permalink

      from the Wiki [sorry MC] on the Stored-program computer

      Many early computers, such as the Atanasoff–Berry Computer, were not reprogrammable. They executed a single hardwired program. As there were no program instructions, no program storage was necessary. Other computers, though programmable, stored their programs on punched tape which was physically fed into the machine as needed.

      In 1936 Konrad Zuse also anticipated in two patent applications that machine instructions could be stored in the same storage used for data.

      The University of Manchester’s Small-Scale Experimental Machine [SSEM] is generally recognized as the world’s first electronic computer that ran a stored program — an event that occurred on June 21, 1948. However, the SSEM was not regarded as a full-fledged computer, but rather a proof of concept that was built on to produce the Manchester Mark 1 computer. On May 6, 1949 the EDSAC in Cambridge ran its first program, and due to this event, it is considered “the first complete and fully operational regular electronic digital stored-program computer”.

    • madscientist
      Posted June 22, 2013 at 4:59 pm | Permalink

      Yes – although the Jacquard loom was a programmable device and not at all a computing device. Charles Babbage is a strange footnote in the history of computing – he had many grand ideas but he didn’t get far.

  5. madscientist
    Posted June 22, 2013 at 3:34 pm | Permalink

    For me it’s pretty obvious why the 1961 US Army chart is missing ‘Baby’ and many other computers: the chart shows computers commissioned for the DoD. What’s curious for me is that ‘MANIAC’ is missing from the tree, but maybe that’s because it still employed huge arrays of mechanical relays.

    “The first with storage” isn’t really a big deal – many independent groups at the time were working on electronic storage for computers and there were many solutions implemented in working machines. How many computers actually used the storage technique employed at Manchester?

  6. Grania Spingies
    Posted June 22, 2013 at 4:12 pm | Permalink

    I did not know this.

    Well, cheers to both of them. They made the world a much better place.

  7. Michael Fisher
    Posted June 22, 2013 at 4:49 pm | Permalink

    The British LEO was the first computer used for commercial business applications in the world [according to above link]. Overseen by Oliver Standingford and Raymond Thompson of J. Lyons and Co., and modelled closely on the Cambridge EDSAC, LEO I ran its first business application in 1951.

  8. Posted June 22, 2013 at 5:27 pm | Permalink

    Reblogged this on Sarvodaya and commented:
    Computers have come a very long way. It’s so easy to take these vital and ubiquitous devices for granted. Imagine how much more they will change within another 65 years.

  9. Posted June 22, 2013 at 6:53 pm | Permalink

    The first use of computers for biology may be one of Turing’s simulations of distribution of morphogens, but a more certainly datable candidate is Maurice Wilkes and his EDSAC staff solving a differential equation for the shape of a cline of gene frequencies at the request of R.A. Fisher. Fisher’s paper was published in 1950 in Biometrics and cryptically thanked Wilkes and his people for computing the curve.

    Wilkes wrote a paper in 1975 entitled “How Babbage’s dream came true” that mentioned this computation, which must have occurred within a few months of EDSAC starting operation. Interestingly, the differential equations in Fisher’s and Turing’s problems are fairly similar.

  10. phil
    Posted June 22, 2013 at 7:52 pm | Permalink

    I’m surprised Colossus hasn’t been mentioned.

    “Colossus was the first combining digital, (partially) programmable, and electronic.”

    http://en.wikipedia.org/wiki/Colossus_computer

    • Posted June 23, 2013 at 5:34 am | Permalink

      That’s exactly what I was thinking. I’m confused. I thought Colossus was the first computer and was invented at Bletchley during WWII. Or am I missing something?

      • Posted June 23, 2013 at 5:52 am | Permalink

        It all depends on your definition of ‘computer’. Note the “(partially) programmable” business in the Wiki quote you give. Vannevar Bush was developing similar calculating machines at the same time. The Baby could store a programme in its electronic memory, rather than only doing something it was told to by cards or tape or being hardwired. That’s why it was a step forward.

  12. peterr
    Posted June 23, 2013 at 2:36 pm | Permalink

    Good old Manchester! My grad studies were there, not so long after that, and even if they hadn’t been, I’d like to think that I’d feel as strongly as I do that this achievement deserves a lot of credit.

    However, perhaps ‘provincially’ as a mathematician, ‘the computer was invented’, if such a statement must be made, basically by Alan Turing in the mid-’30s in Britain, and also very nearly by Alonzo Church in the U.S. when he invented the lambda-calculus, completely independently of each other, at almost the same time. They independently solved in the negative a Hilbert problem about finding an automatic method for deciding all of mathematics. But the Turing machine is much more of a direct model for a material computer than is Church’s invention. Others invented notions which captured the ability ‘to compute everything which is computable’, but no one had that idea prior to those two.

    As far as actual concrete machines are concerned, there are all sorts of squabbles about priority for that as a form of ‘first to invent the computer’, and always will be. However, it seems clear to me that the construction of “Baby”, or any other realistic candidate for the first actual hardware, simply would not have happened without using the real mathematical (even philosophical, pace Ben and others here!) ideas. And someone else would quite soon afterwards have produced similar hardware if, say, the Manchester pioneers hadn’t, of course based on the mathematical notion of computability.

    To hammer it home: without the Turing machine or equivalent, no actual computer, period; with it, the hardware would have come into existence quite soon, one way or another. As a joke, for those who insist on empirical evidence for the last statement, you merely need to examine the culture of other somewhat advanced intelligent forms of life in the universe.

    • Dominic
      Posted June 24, 2013 at 2:55 am | Permalink

      These things – inventions, scientific developments – never come out of a vacuum – there is always ‘something in the air’ that means lots of people are working on a similar problem in slightly different ways.

  13. Dominic
    Posted June 24, 2013 at 2:58 am | Permalink

    Prepare yourselves for the Hollywoodisation of Turing – with added sex-appeal –

    http://www.telegraph.co.uk/culture/film/film-news/10101013/Keira-Knightley-may-join-to-Benedict-Cumberbatch-in-Turing-biopic.html

    Pathetic.

    • Posted June 24, 2013 at 9:31 am | Permalink

      I’m not sure this is *necessarily* added. Turing did try to “play straight” for a bit (and he did have female friends), but it didn’t work, needless to say. So I’d wait to see. Also, having anything mainstream about Turing *is* in a way welcome.

  14. Allautin@gmail.com
    Posted June 24, 2013 at 7:06 am | Permalink

    A brief review, in part, of the contributions of John von Neumann and John Mauchly to early computer development can be found at Eugene Garfield’s Comments Page:

    http://www.garfield.library.upenn.edu/essays/v14p032y1991.pdf

    The review also gestures strongly to the tragic endings of these two geniuses, particularly the feeling on Mauchly’s part re: the eclipse of his contribution as he lay dying from his final illness.

  15. Launcher
    Posted June 24, 2013 at 3:12 pm | Permalink

    My (American) computer celebrated its 65th birthday two years ago*. But, it loves to party, so here’s to “The Baby”.

    * http://en.wikipedia.org/wiki/Eniac

  16. Posted June 24, 2013 at 4:08 pm | Permalink

    Reblogged this on Science Technology Engineering Ireland and commented:
    ..Wow, Geek Heaven!! haha… this machine had an effective CPU speed of about 1,000 instructions per second and was the world’s first stored-program computer … by comparison, a modern Core i7 processor in a desktop PC is capable of about 177,730,000,000 instructions per second!!
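    Taking those two quoted figures at face value, the ratio works out to roughly 180 million:

    ```python
    # Back-of-the-envelope comparison using the figures quoted above.
    baby_ips = 1_000                    # approximate instructions per second for the Baby
    core_i7_ips = 177_730_000_000       # figure quoted for a modern desktop Core i7
    print(f"speed-up: about {core_i7_ips / baby_ips:,.0f}x")   # ~177,730,000x
    ```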

