Saturday, August 31, 2019
Othello: Literary Essay on Iago
Humans are born with a natural capacity for good and evil. As an individual develops, he or she is taught to distinguish between the two in order to strengthen a sense of right and wrong. Through Shakespeare's play 'Othello' (Heinemann Advanced Shakespeare, 2000) we are introduced to a meticulously devious character, Iago. Ironically for a man affiliated with the military, a substandard moral compass channels his unreasonable code of conduct. Well educated in human behaviour, and unburdened by ethics, he exploits this skill set to deceive those around him and disguise his true character. Between self-perception, the perceptions of opposing characters and the perception of the audience, Iago emerges as a brilliantly manipulative villain whose two-dimensional nature is concealed beneath sheets of false sympathy, integrity and empathy.

Beyond all else, Iago considers himself a man of true wit and distinction. In Act I he furiously declares, "I know my price" (p. 3). With this statement he admits to Roderigo that, having been overlooked for the position of lieutenant, his dedicated service and efforts have been in vain. Having served at the front lines of war, Iago adamantly deems that his depth of experience makes him a more deserving candidate than young Cassio. Moreover, overconfidence in his abilities leads him to believe he merits superior authority: "I am worth no worse a place" (ibid). Continuing his argument against his non-promotion, Iago finds it unbearable, even shameful, that he has been regarded with such insignificance. A man of his status and self-imposed importance should not be degraded to such a pitiable standard. In losing out on the appointment, Iago's pride and dignity have been considerably impaired. Once a devoted soldier under Othello's command, Iago believed it was his rightful place to be promoted to lieutenant.
When these expectations were not met, the decision brought out the worst in Iago, who interpreted it as a personal offence to his proficiency. Although he had already risen to a respectable rank, Iago could not fathom the injustice: his superior and extensive knowledge had been undermined. In the present era, this form of 'injustice' is experienced frequently in professions where favoured parties lose out on coveted positions to seemingly less experienced rivals. Such candidates feel at a loss, and that their competence has been called into question; in most circumstances these opinions cannot be voiced, as doing so reads as bad character and conduct. Consequently, Iago set out on a conquest for supremacy, uncannily abiding by modern-day author Robert Greene's "Law 3: Conceal your intentions" from 'The 48 Laws of Power' (1998), which details observational tactics profitable in a rise to command.

Within the community of Cyprus, Iago gave the other characters no obvious reason to suspect his cunning. Accordingly, he was constantly referred to as "honest Iago" (p. 41). Othello in particular used the term affectionately: "Honest Iago, my Desdemona must I leave to thee. I prithee let thy wife attend on her" (ibid). Not only does Othello impart trust with ease, he mistakes Iago for a dear friend, one who would never think to harm Desdemona or him. This misplaced faith encourages Iago in his surreptitious endeavours. On several more occasions we witness other characters mistakenly confer with, and about, the 'sincere' Iago. Following a grave night for Cassio, he bids a farewell of "Good night, honest Iago" (p. 99), unaware of Iago's key role in his ruin. Desdemona continues the chain of misplaced trust when she says of Iago's phony grief at Cassio's unfortunate predicament, "O, that's an honest fellow" (p. 113).
An individual's nature can be concealed in various ways within the company he keeps: "Law 12: Use selective honesty and generosity to disarm your victims" (Greene, 1998). In such company Iago manipulates his companions into believing he is faithful. Having grasped human behaviour, he deploys that wisdom with cruelty and brutality; in persuading the population of Cyprus of his sincerity, unbeknown to the other characters, Iago feeds his sense of superiority. Such is his wit that for much of the play his deceit goes undetected, and when it appears a character may unknowingly expose his ulterior motives, he buries the potential realisation beneath artificial concern.

Among the audience, the consensus is that Iago is composed purely of arrogance, jealousy and hostility. Triumphant at having supplied false evidence of Cassio's incompetence, Iago announces, "And what's he then that says I play the villain, when this advice is free I give and honest" (Shakespeare, p. 99). Arrogance radiates from his gloating claim to have given Cassio ingenious and true advice. Jealousy was the incentive for bringing about Cassio's demotion and discredit; moreover, this envy fuelled his ploy of planting the seed of doubt by which Othello was wrongly led to believe that his wife, Desdemona, was unfaithful. Beyond engineering the ruin of the Moor's marriage, Iago displays hostility within his own relationship with Emilia, "To have a foolish wife" (p. 133), consistent with yet another law: "Law 20: Do not commit to anyone" (Greene, 1998). Only to the audience is Iago's evil nature revealed for a significant portion of the play; we are exposed to his ulterior motives, lies and ultimate betrayal: "I follow him to serve my turn upon him" (Shakespeare, p. 5).
Unlike any other character in the play, Iago is not fazed by the level of destruction he causes, nor by whom it harms. In subsequent scenes viewers witness the steady, gradual downfall of the Moor. Unsuspecting characters indirectly aid Iago's cause, particularly his ignorant, praise-seeking wife, Emilia, which further inflates his arrogant belief in his unmatched brilliance. To the audience, each move Iago makes is calculated methodically and only adds to the chaos of the present scene.

Iago possesses an astute, two-dimensional disposition. His objective throughout the play is to ensure the Moor's eventual ruin; he becomes the bane of Othello's existence through conniving, unforgivable means recognised only by the audience, if not by Iago himself. Born with a natural capacity for good and evil, Iago evidently acquires no good, a fact hidden from the other characters within the play. In relation to the present, Iago demonstrates the extremes to which an individual will go to sabotage the happiness and prosperity of an enemy. Persecutors hold a high opinion of themselves, as Iago displays, yet such an attack is pursued through clandestine, surreptitious means. Iago can distinguish between right and wrong, but still he opts for the latter. As Nobel laureate William Golding once said, "We need more humanity, more care, more love. There are those who expect a political system to produce that; and others who expect the love to produce the system." (Nobel Lecture, Dec 7, 1983). Humanity is defenceless against the erroneous nature of individuals like Iago. Repeatedly throughout 'Othello' (Shakespeare, 2000), his master scheme endures unnoticed. As children we are prompted to 'recognise virtue and vice' (Zak, P. J., The Moral Molecule, 2011) to instil a sense of ethics; nonetheless, alongside vindictive figures like Iago, the 'system' (Nobel Lecture, 1983), political or military, regresses on any potential advancement, stunting 'humanity, more care, more love' (ibid). Humanity is susceptible to evil, regardless of encouragement otherwise. As long as rogues exist, an eternal battle will ensue between good and bad, unmistakably demonstrated through Iago's performance.

Bibliography

Novel References
Gray, C. (Series 2000). Othello. Heinemann Advanced Shakespeare. London: Briddles Ltd.

Website References
Golding, W. (Dec 7, 1983). Nobel Lecture. Retrieved Aug 27, 2013, from http://www.nobelprize.org/nobel_prizes/literature/laureates/1983/golding-lecture.html
Keltner, D. (2007-08). The Power Paradox. Retrieved Aug 28, 2013, from http://greatergood.berkeley.edu/article/item/power_paradox
Unknown Author. (No date). The 48 Laws of Power. Retrieved Aug 28, 2013, from http://en.wikipedia.org/wiki/The_48_Laws_of_Power
Zak, P. J. (Feb 10, 2011). The Moral Molecule: Are Humans Good or Evil?. Retrieved Aug 28, 2013, from http://www.psychologytoday.com/blog/the-moral-molecule/201102/are-humans-good-or-evil
Friday, August 30, 2019
History of Digital Computers
The History of Digital Computers

B. RANDELL
Computing Laboratory, University of Newcastle upon Tyne

This account describes the history of the development of digital computers, from the work of Charles Babbage to the earliest electronic stored program computers. It has been prepared for Volume 3 of "l'Histoire Générale des Techniques," and is in the main based on the introductory text written by the author for the book "The Origins of Digital Computers: Selected Papers" (Springer Verlag, 1973).

1. Charles Babbage

The first electronic digital computers were completed in the late 1940s. In most cases their developers were unaware that nearly all the important functional characteristics of these computers had been invented over a hundred years earlier by Charles Babbage. It was in 1821 that the English mathematician Charles Babbage became interested in the possibility of mechanising the computation and printing of mathematical tables. He successfully constructed a small machine, which he called a "difference engine," capable of automatically generating successive values of simple algebraic functions by means of the method of finite differences. This encouraged him to plan a full-scale machine, and to seek financial backing from the British government.
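The method of finite differences on which the difference engine rested can be sketched briefly: a polynomial of degree n has a constant nth difference, so once the initial value and its differences are set up, every further table value follows by addition alone, with no multiplication. The polynomial and starting point below are illustrative choices, not taken from Babbage's own tables.

```python
# Sketch of the method of finite differences: a degree-n polynomial
# has a constant nth difference, so each new table value needs only
# n additions. The example polynomial is illustrative.

def difference_table(f, start, degree):
    """Initial registers: f(start) and its successive finite differences."""
    values = [f(start + i) for i in range(degree + 1)]
    diffs = []
    while values:
        diffs.append(values[0])
        values = [b - a for a, b in zip(values, values[1:])]
    return diffs  # [f(x0), first difference, second difference, ...]

def tabulate(f, start, degree, count):
    """Generate f(start), f(start+1), ... by repeated addition only."""
    regs = difference_table(f, start, degree)
    out = []
    for _ in range(count):
        out.append(regs[0])
        # each register absorbs the next-higher difference
        for i in range(degree):
            regs[i] += regs[i + 1]
    return out

f = lambda x: x * x + x + 41   # Euler's prime-generating polynomial
print(tabulate(f, 0, 2, 5))    # [41, 43, 47, 53, 61]
```

Note that the loop body performs only additions, which is exactly what made the scheme attractive to mechanise: adding mechanisms were far simpler to build than multipliers.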
During the next 12 years both Babbage and the government poured considerable sums of money into the attempt at building his Difference Engine. However the project, which called for the construction of six interlinked adding mechanisms, each capable of adding two multiple-digit decimal numbers, together with an automatic printing mechanism, was considerably beyond the technological capabilities of the era; indeed it has been claimed that the efforts expended on the Difference Engine were more than justified simply by the improvements they generated in mechanical engineering equipment and practice. Although Babbage's plans for a Difference Engine were somewhat premature, the basic scheme was vindicated when in 1843, inspired by their knowledge of his work, George and Edvard Scheutz successfully demonstrated a working prototype difference engine. A final version of this model was completed 10 years later, with financial assistance from the Swedish government. Several other difference engines were constructed in the decades that followed, but such machines never achieved the importance of more conventional calculating machines, and when multi-register accounting machines became available in the 1920s it was found that these could be used essentially as difference engines.

However Babbage's ideas soon progressed far beyond that of a special-purpose calculating machine; in fact almost as soon as he started work on his Difference Engine he became dissatisfied with its limitations. In particular he wished to avoid the need to have the highest order of difference constant, in order to be able to use the machine directly for transcendental as well as algebraic functions. In 1834 Babbage started active work on these matters, and on problems such as division and the need to speed up the part of the addition mechanism which dealt with the assimilation of carry digits.
He developed several very ingenious methods of carry assimilation, but the time savings so obtainable would have been at the cost of a considerable amount of complex machinery. This led Babbage to realise the advantages of having a single centralised arithmetic mechanism, the "mill," separate from the "figure axes," i.e., columns of discs which acted merely as storage locations rather than accumulators.

Babbage's first idea for controlling the sequencing of the various component mechanisms of the engine was to use "barrels," i.e., rotating pegged cylinders of the sort used in musical automata. He first planned to use a set of subsidiary barrels, with over-all control of the machine being specified by a large central barrel with exchangeable pegs. However in June 1836 he took the major step of adopting a punched card mechanism, of the kind found in Jacquard looms, in place of the rather limited and cumbersome central barrel. He did so in the realisation that the "formulae" which specified the computation that the machine was to perform could therefore be of almost unbounded extent, and that it would be a simple matter to change from the use of one formula to another. Normally formula cards, each specifying an arithmetic operation to be performed, were to be read by the Jacquard mechanism in sequence, but Babbage also envisaged means whereby this sequence could be broken and then recommenced at an earlier or later card in the sequence. Moreover he allowed the choice of the next card which was to be used to be influenced by the partial results that the machine had obtained. These provisions allowed him to claim that computations of indefinite complexity could be performed under the control of comparatively small sets of formula cards.
Babbage talked at one time of having a store consisting of no less than 1000 figure axes, each capable of holding a signed 40-digit decimal number, and planned to provide for reading numbers from cards into the store, and for punching or printing the values of numbers held in the store. The movement of numbers between the mill and the store was to be controlled by a sequence of "variable cards," each specifying which particular figure axis was involved. Therefore an arithmetic operation whose operands were to be obtained from the store and whose result was to be returned to the store would be specified by an operation card and several variable cards. He apparently intended these different kinds of control cards to be in separate sequences, read by separate Jacquard mechanisms.

Thus in the space of perhaps 3 years Babbage had arrived at the concept of a general purpose digital computer consisting of a store, arithmetic unit, punched card input and output, and a card-controlled sequencing mechanism that provided iteration and conditional branching. Moreover although he continued to regard the machine, which he later came to call the Analytical Engine, as being principally for the construction of mathematical tables, he had a very clear grasp of the conceptual advances he had made. Basing his claim on the unbounded number of operation and variable cards that could be used to control the machine, the ease with which complicated conditional branches could be built from a sequence of simple ones, and the fact that automatic input and output, and multiple precision arithmetic, were provided, he stated that ". . . it appears that the whole of the conditions which enable a finite machine to make calculations of unlimited extent are fulfilled in the Analytical Engine . . . . I have converted the infinity of space, which was required by the conditions of the problem, into the infinity of time."
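The organisation just described, with a store of numbered figure axes, a mill that performs the arithmetic, operation cards saying what to do, and variable cards naming the axes that supply operands and receive the result, can be modelled in a few lines. The card format and names below are illustrative inventions, not Babbage's notation, and the conditional re-reading of cards is omitted for brevity.

```python
# Toy model of the Analytical Engine's control scheme: a store of
# numbered "figure axes", a mill doing the arithmetic, operation
# cards saying what to do, and variable cards saying which axes
# supply operands and receive the result. Card format and names
# are illustrative, not Babbage's own notation.

def run(op_cards, var_cards, store):
    """Each operation card pairs with a (src1, src2, dest) variable card."""
    for op, (a, b, dest) in zip(op_cards, var_cards):
        if op == "ADD":
            store[dest] = store[a] + store[b]
        elif op == "SUB":
            store[dest] = store[a] - store[b]
        elif op == "MUL":
            store[dest] = store[a] * store[b]
        else:
            raise ValueError(f"unknown operation card {op!r}")
    return store

# compute (v0 + v1) * v2 using three figure axes and a scratch axis
store = {0: 3, 1: 4, 2: 10, 3: 0}
run(["ADD", "MUL"], [(0, 1, 3), (3, 2, 3)], store)
print(store[3])  # 70
```

The point of the separation is visible even in this sketch: the same operation cards can be re-used with different variable cards, which is precisely what made Babbage's "formulae" general.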
Because separate, but associated, sequences of cards were needed to control the Analytical Engine the concept of a program as we know it now does not appear very clearly in contemporary descriptions of the machine. However there is evidence that Babbage had realised the fact that the information punched on the cards which controlled the engine could itself have been manipulated by an automatic machine: for example he suggested the possibility of the Analytical Engine itself being used to assist in the preparation of lengthy sequences of control cards. Indeed in the description of the use of the Analytical Engine written by Lady Lovelace, in collaboration with Babbage, there are passages which would appear to indicate that it had been realised that an Analytical Engine was fully capable of manipulating symbolic as well as arithmetical quantities.

Probably Babbage himself realised that the complete Analytical Engine was impractical to build, but he spent much of the rest of his life designing and redesigning mechanisms for the machine. The realisation of his dream had to await the development of a totally new technology, and an era when the considerable finances and facilities required for an automatic computer would be made available, the need at last being widely enough appreciated. He was a century ahead of his time, for as one of the pioneers of the modern electronic digital computer has written: "Babbage was moving in a world of logical design and system architecture, and was familiar with and had solutions for problems that were not to be discussed in the literature for another 100 years." He died in 1871, leaving an immense collection of engineering drawings and documents, but merely a small portion of the Analytical Engine, consisting of an addition and a printing mechanism, whose assembly was completed by his son, Henry Babbage. This machine and Babbage's engineering drawings are now in the Science Museum, London.

2. Babbage's direct successors

Some years after Babbage's death his son Henry Babbage recommenced work on the construction of a mechanical calculating machine, basing his efforts on the designs his father had made for the Mill of the Analytical Engine. This work was started in 1888 and carried on very intermittently. It was completed only in about 1910 when the Mill, which incorporated a printing mechanism, was demonstrated at a meeting of the Royal Astronomical Society.

By this date however the work of a little-known successor to Charles Babbage, an Irish accountant named Percy Ludgate, was already well advanced. Ludgate started work in 1903 at the age of 20 on an entirely novel scheme for performing arithmetic on decimal numbers. Decimal digits were to be represented by the lateral position of a sliding metal rod, rather than the angular position of a geared disc. The basic operation provided was multiplication, which used a complicated mechanism for calculating the two-digit products resulting from multiplying pairs of decimal digits together. The scheme involved first transforming the digits into a form of logarithm, adding the logarithms together, and then converting the result back into a two-digit sum. This scheme is quite unlike any known to have been used in earlier mechanical calculators, or for that matter since, although there had been several calculating machines constructed that used built-in multiplication tables to obtain two-digit products; the earliest known of these was that invented by Bollee in 1887.
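Ludgate's digit-multiplication scheme can be sketched in code. Each nonzero digit is assigned an index (his "logarithm"), the indices of the two digits are added, and the sum is looked up in an antilogarithm table to recover the two-digit product. The index values used below follow later published accounts of his machine and should be treated as illustrative rather than authoritative; the code verifies for itself that they are consistent. Zero was handled specially by Ludgate and is omitted here.

```python
# Sketch of Ludgate-style multiplication by table lookup. Each
# nonzero digit maps to an index (a "logarithm"); the sum of two
# indices is looked up in an antilog table to give the two-digit
# product. Index values follow later accounts of Ludgate's scheme
# and are illustrative; zero, handled specially, is omitted.

INDEX = {1: 0, 2: 1, 3: 7, 4: 2, 5: 23, 6: 8, 7: 33, 8: 3, 9: 14}

# Build the antilog table, checking that every index sum determines
# a unique product (i.e., that the indices are consistent).
ANTILOG = {}
for a, za in INDEX.items():
    for b, zb in INDEX.items():
        s = za + zb
        assert ANTILOG.setdefault(s, a * b) == a * b

def digit_mul(a, b):
    """Multiply two digits in 1..9 by one addition and one lookup."""
    return ANTILOG[INDEX[a] + INDEX[b]]

print(digit_mul(7, 8), digit_mul(9, 9))  # 56 81
```

The mechanical appeal is that addition of index positions is easy to realise with sliding rods, while the awkward digit-by-digit product reduces to a single table lookup.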
It is in fact difficult to see any advantages to Ludgate's logarithmic scheme, although his form of number representation is reminiscent of that used in various mechanical calculating devices in the following decades. So striking are the differences between Ludgate's and Babbage's ideas for mechanical arithmetic that there is no reason to dispute Ludgate's statement that he did not learn of Babbage's prior work until the later stages of his own. It seems likely that Babbage was the eventual inspiration for Ludgate to investigate the provision of a sequence control mechanism. Here he made an advance over the rather awkward system that Babbage had planned, involving separate sets of operation and variable cards. Instead his machine was to have been controlled by a single perforated paper tape, each row of which represented an instruction consisting of an operation code and four address fields. Control transfers simply involved moving the tape the appropriate number of rows forwards or backwards. Moreover he also envisaged the provision of what we would now call subroutines, represented by sequences of perforations around the circumference of special cylinders; one such cylinder was to be provided for division. The machine was also to be controllable from a keyboard, a byproduct of whose operation would be a perforated tape which could then be used to enable the sequence of manually controlled operations to be repeated automatically. Ludgate estimated that his Analytical Machine would be capable of multiplying two twenty-digit numbers in about 10 seconds, and that, in considerable contrast to Babbage's Analytical Engine, it would be portable.
However there is no evidence that he ever tried to construct the machine, which he apparently worked on alone, in his spare time. He died in 1922, and even if at this time his plans for the Analytical Machine still existed there is now no trace of them, and our knowledge of the machine depends almost entirely on the one description of it that he published.

The next person who is known to have followed in the footsteps of Babbage and to have worked on the problems of designing an analytical engine was Leonardo Torres y Quevedo. Torres was born in the province of Santander in Spain in 1852. Although qualified as a civil engineer he devoted his career to scientific research, and in particular to the design and construction of an astonishing variety of calculating devices and automata. He gained great renown, particularly in France and in Spain, where he became President of the Academy of Sciences of Madrid, and where following his death in 1936 an institute for scientific research was named after him. Torres first worked on analog calculating devices, including equation solvers and integrators. In the early 1900s he built various radio-controlled devices, including a torpedo and a boat which, according to the number of pulses it received, could select between various rudder positions and speeds, and cause a flag to be run up and down a mast. In 1911 he made and successfully demonstrated the first of two chess-playing automata for the end game of king and rook against king. The machine was fully automatic, with electrical sensing of the positions of the pieces on the board and a mechanical arm to move its own pieces. (The second machine was built in 1922, and used magnets underneath the board to move the pieces.) In all this work, he was deliberately exploiting the new facilities that electromechanical techniques offered, and challenging accepted ideas as to the limitations of machines.
He picked on Babbage's Analytical Engine as an important and interesting technical challenge, and in 1914 published a paper incorporating detailed schematic designs for a suitable set of electro-mechanical components. These included devices for storing, comparing and multiplying numbers, and were accompanied by a discussion of what is now called floating point number representation. He demonstrated the use of the devices in a design for a special-purpose program-controlled calculator. The program was to be represented by areas of conductive material placed on the surface of a rotating drum, and incorporated a means for specifying conditional branching. Torres clearly never intended to construct a machine to his design, but 6 years later he built, and successfully demonstrated, a typewriter-controlled calculating machine, primarily to demonstrate that an electromechanical analytical engine was completely feasible. He in fact never did build an analytical engine, although he designed, and in many cases built, various other digital devices including two more calculating machines, an automatic weighing machine, and a machine for playing a game somewhat like the game of Nim. However there seems little reason to doubt that, should the need have been sufficiently pressing, Torres would indeed have built a complete analytical engine. In the event, it was not until the 1939-1945 war that the desirability of large-scale fully automatic calculating machines became so clear that the necessary environment was created for Babbage's concept to become a reality.

Before this occurred there is known to have been at least one further effort at designing an analytical engine. This was by a Frenchman, Louis Couffignal, who was motivated mainly by a desire to reduce the incidence of errors in numerical computations. He was familiar with the work of Babbage and Torres y Quevedo but, in contrast to their designs, proposed to use binary number representation.
The binary digits of stored numbers were to be represented by the lateral position of a set of parallel bars controlled by electro-magnets. The various arithmetic operations were to be performed by relay networks, the whole machine being controlled by perforated tapes. Couffignal apparently had every intention of building this machine, in association with the Logabax Company, but presumably because of the war never did so. However after the war he was in charge of an electronic computer project for the Institut Blaise Pascal, the design study and construction of the machine being in the hands of the Logabax Company.

With Couffignal's pre-war plans, the line of direct succession to Babbage's Analytical Engine seems to have come to an end. Most of the wartime computer projects were apparently carried out in ignorance of the extent to which many of the problems that had to be dealt with had been tackled by Babbage over a century earlier. However in some cases there is clear evidence that knowledge of Babbage's work was an influence on the wartime pioneers, in particular Howard Aiken, originator of the Automatic Sequence Controlled Calculator, and William Phillips, an early proponent of binary calculation, and various other influential people, including Vannevar Bush and L. J. Comrie, were also well aware of his dream.

3. The contribution of the punched card industry

An initially quite separate thread of activity leading to the development of the modern computer originated with the invention of the punched card tabulating system. The capabilities of Herman Hollerith's equipment, first used on a large scale for the 1890 US National Census, were soon extended considerably. The original equipment allowed cards to hold binary information representing the answers to a Census questionnaire.
These cards could be tabulated, one by one, using a machine which sensed the presence of holes in the card electrically and could be wired to count the number of cards processed in which particular holes or combinations of holes had been punched. A device could be attached to such a tabulator which assisted the manual sorting of cards into a number of separate sequences. Within 10 years automatic card handling mechanisms, which greatly increased the speed of machine operation, and addition units, which enabled card tabulators to sum decimal numbers punched on cards, had been provided. The system soon came into widespread use in the accounting departments of various commercial organisations, as well as being used for statistical tabulations in many countries of the world.

After the 1900 US Census relations between Hollerith and the Census Bureau deteriorated, and the Bureau began to manufacture its own equipment for use in the 1910 Census. The person in charge of this work was James Powers, who circumvented Hollerith's patents by producing a mechanical card reading apparatus. He retained the patent rights to his inventions and formed his own company, which eventually merged with Remington Rand in 1927. In 1911 Hollerith sold his own company, the Tabulating Machine Company, which he had formed in 1896, and it was shortly afterwards merged with two other companies to form the Computing-Tabulating-Recording Company. This company, which was under the direction of Thomas J. Watson from 1914, became the International Business Machines Corporation in 1924. During the 1920s and 1930s punched card systems developed steadily, aided no doubt by the stimulus of competition, not only in the USA but also in Britain, where the Hollerith and Powers-based systems continued to be marketed under the names of their original inventors, while in France a third manufacturer, Compagnie Machines Bull, was also active.
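The counting tabulator described above has a very simple logical core: each card is just the set of positions punched in it, and each counter is "wired" (via the plugboard) to a combination of holes and steps whenever every hole in its combination is present. A minimal sketch, with invented counter names and card data:

```python
# Toy model of a Hollerith-style counting tabulator: each card is
# the set of positions punched in it; each counter is plugboard-
# wired to a combination of holes and steps when every hole in its
# combination is present on the card. Names and data are invented
# for illustration.

def tabulate_cards(cards, wiring):
    """wiring: counter name -> set of hole positions that must all be punched."""
    counts = {name: 0 for name in wiring}
    for card in cards:                 # cards are read one by one
        for name, holes in wiring.items():
            if holes <= card:          # all required holes present
                counts[name] += 1
    return counts

cards = [{1, 5}, {1, 9}, {2, 5}, {1, 5, 9}]
wiring = {"hole1": {1}, "hole1_and_5": {1, 5}}
print(tabulate_cards(cards, wiring))  # {'hole1': 3, 'hole1_and_5': 2}
```

Everything beyond this, such as sorting, summing decimal fields, and printing, came as the later extensions the text goes on to describe.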
Unfortunately the people involved in this work did not in general publish technical papers, and their work has received little public recognition. Thus full appreciation of the contribution of IBM development engineers, such as J. W. Bryce, one of the most prolific inventors of his era, will probably have to await an analysis of the patent literature. One inventor whose work has, however, been documented is Gustav Tauschek, a self-taught Viennese engineer, with more than 200 patents in the computing field to his credit. While working for Rheinische Metall- und Maschinenfabrik he designed and built a punched card electromechanical accounting machine. His other patents, many of which were filed whilst he was under contract to IBM during the 1930s, also included a "reading-writing-calculating machine" which used photocells to compare printed input characters with templates held on photographic film, a number storage device using magnetised steel plates, and an electromechanical accounting machine designed for use in small banks, capable of storing the records of up to 10 000 accounts.

By the 1930s printing tabulators were available which worked at approximately 100 cards per minute, and there were sorters which worked at 400 cards per minute. The machines were controlled by fairly intricate plugboards, but arithmetic and logical computations involving sequences of operations of any great complexity were carried out by repeated processing of sets of cards, under the direction of operators. Various attempts were made to supplement the functional capabilities of punched card systems by linking together otherwise independent machines. One such system, the Synchro-Madas machine, incorporated a typewriter/accounting machine, an automatic calculating machine and an automatic card punch. These were linked together so that a single action by the operator sitting at the typewriter/accounting machine would control several operations on the different machines.
One other system involving a set of inter-linked card machines, although very different in concept and scale from the Synchro-Madas machine, is worth mentioning. This is the Remote-control Accounting system which was experimented with in a Pittsburgh department store, also in the mid-1930s. The system involved 250 terminals connected by telephone lines to 20 Powers card punch/tabulators and 15 on-line typewriters. The terminals transmitted data from punched merchandise tags which were used to produce punched sales record cards, later used for customer billing. The typewriter terminals were used for credit authorisation purposes. The intended peak transaction rate was 9000 per hour.

Even during the 1920s punched card systems were used not only for accounting and the compilation of statistics, but also for complex statistical calculations. However the first important scientific application of punched card systems was made by L. J. Comrie in 1929. Comrie was Superintendent of HM Nautical Almanac Office until 1936, and then founded the Scientific Computing Service. He made a speciality of putting commercial computing machinery to scientific use, and introduced Hollerith equipment to the Nautical Almanac Office. His calculations of the future positions of the Moon, which involved the punching of half a million cards, stimulated many other scientists to exploit the possibilities of punched card systems. One such scientist was Wallace J. Eckert, an astronomer at Columbia University, which had already been donated machines for a Statistical Laboratory by IBM in 1929, including the "Statistical Calculator," a specially developed tabulator which was the forerunner of the IBM Type 600 series of multiplying punches, and of the mechanisms used in the Harvard Mark I machine.
With assistance from IBM, in 1934 Eckert set up a scientific computing laboratory in the Columbia Astronomy Department, a laboratory which was later to become the Thomas J. Watson Astronomical Computing Bureau. In order to facilitate the use of his punched card equipment Eckert developed a centralised control mechanism, linked to a numerical tabulator, a summary punch and a multiplying punch, so that a short cycle of different operations could be performed at high speed. The control mechanism, which was based on a stepping switch, enabled many calculations, even some solutions of differential equations, to be performed completely automatically.

The potential of a system of inter-connected punched card machines, controlled by a fully general-purpose sequencing mechanism, and the essential similarity of such a system to Babbage's plans for an Analytical Engine, were discussed in an article published by Vannevar Bush in 1936. Bush was at this time already renowned for his work on the first differential analyser, and during the war held the influential position of Director of the US Office of Scientific Research and Development. In fact an attempt was made to build such a system of inter-connected punched card machines at the Institut für Praktische Mathematik of the Technische Hochschule, Darmstadt, in Germany during the war. The plans called for the inter-connection of a standard Hollerith multiplier and tabulators, and specially constructed divider and function generators, using a punched tape sequence control mechanism. Work was abandoned on the project following a destructive air raid in September 1944.
However, by this stage, in the United States much more ambitious efforts were being made to apply the expertise of punched card equipment designers. The efforts originated in 1937 with a proposal by Howard Aiken of Harvard University that a large-scale scientific calculator be constructed by inter-connecting a set of punched card machines via a master control panel. This would be plugged so as to govern the transmission of numerical operands and the sequencing of arithmetic operations. Through Dr. Shapley, director of the Harvard College Observatory, Aiken became acquainted with Wallace Eckert's punched card installation at Columbia University. These contacts helped Aiken to persuade IBM to undertake the task of developing and building a machine to his basic design. For IBM, J. W. Bryce assigned C. D. Lake, F. E. Hamilton and B. M. Durfee to the task. Aiken later acknowledged these three engineers as co-inventors of the Automatic Sequence Controlled Calculator, or Harvard Mark I as it became known.

The machine was built at the IBM development laboratories at Endicott and was demonstrated there in January 1943 before being shipped to Harvard, where it became operational in May 1944. In August of that year IBM, in the person of Thomas J. Watson, donated the machine to Harvard, where it was used initially for classified work for the US Navy. The design of the Harvard Mark I followed the original proposals by Aiken fairly closely, but it was built using a large number of the major components used in the various types of punched card machines then manufactured, rather than from a set of complete machines themselves. It incorporated 72 "storage counters," each of which served as both a storage location and a complete adding and subtracting machine. Each counter consisted of 24 electromechanical counter wheels and could store a signed 23-digit decimal number.
A special multiply/divide unit, and units for obtaining the value of previously computed functions held on perforated tape and for performing interpolation, were provided, together with input/output equipment such as card readers and punches, and typewriters. The various mechanisms and counter wheels were all driven and synchronised by a single gear-connected mechanical system extending along nearly the entire length of the calculator. A main sequence control mechanism incorporating a punched tape reader governed the operation of the machine. Each horizontal row on the tape had space for three groups of eight holes, known as the A, B and C groups. Together these specified a single instruction of the form "Take the number out of unit A, deliver it to unit B, and start operation C." Somewhat surprisingly, in view of Aiken's knowledge of Babbage's work and writings, no provision was made originally for conditional branching. As it was, such provision was only made later, when a subsidiary sequence control mechanism was built at Harvard and incorporated into the machine.

The Harvard Mark I was a massive machine, over 50 feet long, built on a lavish scale. Being largely mechanical, its speed was somewhat limited (multiplication, for example, took 6 seconds), but it continued in active use at Harvard until 1959. It has an important place in the history of computers, although the long-held belief that it was the world's first operational program-controlled computer was proved to be false once the details of Zuse's wartime work in Germany became known. It marked a major step by IBM towards full involvement in the design of general-purpose computers and, with ENIAC and the Bell Telephone Laboratories Series, represents the starting point of American computer developments. After completion of the Mark I, Aiken and IBM pursued independent paths.
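The instruction form "take the number out of unit A, deliver it to unit B, and start operation C," combined with storage counters that add whatever is delivered to them, can be illustrated with a small sketch. The unit numbering, the ADD/SUB operation names and the decode logic below are assumptions made purely for illustration, not the Mark I's actual tape coding.

```python
# Illustrative simulation of Mark I-style storage counters: delivering
# a number to a counter adds it to (or subtracts it from) the counter's
# contents, since each counter was both a register and an adder.
# Unit numbers and the "ADD"/"SUB" operation codes are hypothetical.

counters = [0] * 72  # 72 storage counters, each holding a signed number

def execute(a, b, op):
    """One instruction: take the number out of unit a, deliver it to
    unit b, and start operation op ("ADD" or "SUB")."""
    value = counters[a]
    if op == "ADD":
        counters[b] += value
    elif op == "SUB":
        counters[b] -= value

counters[1] = 5
counters[2] = 7
execute(1, 2, "ADD")   # counter 2 now holds 12
```

Note that there is no explicit "add" register in such a scheme: transfer and addition are the same event, which is why the real machine could devote one instruction field to each of source, destination and operation.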
Aiken, still distrustful of the reliability of electronic components, moved to electromagnetic relays for the construction of the Harvard Mark II, another paper-tape-sequenced calculator. This machine had an internal store which could hold about 100 decimal floating point numbers. One of the most interesting aspects of the machine was that it could be operated either as a single computer or as two separate ones. The complete system incorporated four of each type of input/output device, namely sequence tape readers, data tape readers and punches, numerical function tape readers and output printers. It also had multiple arithmetic facilities, including two adders and four multipliers (each multiplication taking 0.7 seconds), which could all be used simultaneously. Detailed design of the machine, which was intended for the US Naval Proving Ground, Dahlgren, Virginia, began at Harvard early in 1945, and the machine was completed in 1947. Afterwards Aiken and his colleagues went on to design the Mark III, an electronic computer with magnetic drum storage, completed in 1950, and the Mark IV, which incorporated 200 magnetic core shift registers, completed in 1952.

The designers of IBM's next machine, the Pluggable Sequence Relay Calculator, included two of the Harvard Mark I's design team, namely C. D. Lake and B. M. Durfee, but the machine in fact had more in common with IBM's earlier calculating punches than with the Mark I; like the punches it was controlled using plugboard-specified sequencing, rather than by a sequence control tape of essentially unlimited length.
Its relay construction resulted in its basic operation speed being considerably faster than the Mark I's, although it lacked the Mark I's ease and flexibility of programming, demanding instead the kind of detailed design of parallel subsequencing that one sees nowadays at the microprogramming level of some computers. Great stress was laid by the designers on the efficient use of punched card input/output, and it was claimed that in many cases, where other machines' internal storage capacity proved inadequate, the IBM relay calculators could outperform even the contemporary electronic computers. Several machines were built, the first of which was delivered in December 1944 to the Aberdeen Proving Ground, and two were installed at the Watson Scientific Computing Laboratory that IBM had set up at Columbia University under the directorship of Wallace Eckert.

The Relay Calculator was followed by the giant IBM Selective Sequence Electronic Calculator, a machine which was very much in the tradition of the Mark I. Wallace Eckert was responsible for the logical organisation of the machine, with Frank Hamilton being the chief engineer on the project. The design was a compromise between Eckert's wish, for performance reasons, to use electronic components to the full, and Hamilton's preference for electro-mechanical relays, on grounds of reliability. As a result vacuum tubes were used for the arithmetic unit, the control circuitry and the 8-word high-speed store, relays being used elsewhere. In addition to the 8-word store there was a 150-word random access electro-magnetic store, and storage for 20 000 numbers in the form of punched tapes. Numbers could be read from the electro-magnetic store, or in sequence from the punched tape store, at the speed of the multiplier, i.e., every 20 milliseconds.
The design was started in 1945, and the machine was built in great secrecy at Endicott before being moved to New York City, where it was publicly unveiled at an elaborate dedication ceremony in January 1948. The most important aspect of the SSEC, credited to R. R. Seeber, was that it could perform arithmetic on, and then execute, stored instructions; it was almost certainly the first operational machine with these capabilities. This led to IBM obtaining some very important patents, but the machine as a whole was soon regarded as somewhat anachronistic and was dismantled in 1952. It had however provided IBM with some valuable experience; for example, Hamilton and some of his engineering colleagues went on to design the highly successful IBM 650, and many of the SSEC programmers later became members of the IBM 701 programming group.

Finally, mention should be made of one other machine manufactured by IBM which can be classed as a precursor of the modern electronic digital computer. This was the Card Programmed Calculator, a machine which along with its predecessors now tends to be overshadowed by the SSEC. Like the Pluggable Sequence Relay Calculator, the CPC can trace its origins to the IBM 600 series of multiplying punches. In 1946 IBM announced the Type 603, the first production electronic calculator. The IBM 603, which incorporated 300 valves, was developed from an experimental multiplier designed at Endicott under the direction of R. L. Palmer in 1942. One hundred machines were sold, and then IBM replaced it with the Type 604, a plugboard-controlled electronic calculator which provided conditional branching but, lacking backward jumps, no means of constructing program loops. Deliveries of the 604, which incorporated over 1400 valves, started in 1948, and within the next 10 years over 5000 were installed.
In 1948 a 604 was coupled to a Type 402 accounting machine by the Northrop Aircraft Company, in order to provide the 604 with increased capacity and with printing facilities. This idea was taken up by IBM, and formed the basis of the CPC. Nearly 700 CPCs were built, and this machine played a vital role in providing computing power to many installations in the USA until stored program electronic computers became commercially available on a reasonable scale. In the years that followed the introduction of the CPC, IBM continued to develop its range of electronic calculators and, starting in 1952 with the IBM 701, an electronic computer in the tradition of von Neumann's IAS machine, took its first steps towards achieving its present dominant position amongst electronic computer manufacturers.

4. Konrad Zuse

Konrad Zuse started to work on the development of mechanical aids to calculation as early as 1934, at the age of 24. He was studying civil engineering at the Technische Hochschule, Berlin-Charlottenburg, and sought some means of relief from the tedious calculations that had to be performed. His first idea had been to design special forms to facilitate ordinary manual calculation, but then he decided to try to mechanise the operation. Continuing to use the special layouts that he had designed for his forms, he investigated representing numerical data by means of perforations, and the use of a hand-held sensing device which could communicate the data over an electrical cable to an automatic calculating machine. The idea then arose of using a mechanical register rather than perforated cards, and, realising that the layout was irrelevant, Zuse started to develop a general-purpose mechanical store whose locations were addressed numerically. By 1936 he had the basic design of a floating point binary computer, controlled by a program tape consisting of a sequence of instructions, each of which specified an operation code, two operand addresses and a result address.
Thus, apparently quite independently of earlier work by Babbage and his successors on analytical engines, Zuse had very quickly reached the point of having a design for a general-purpose program-controlled computer, although the idea of conditional branching was lacking. More importantly, even though the various basic ideas that his design incorporated had, it now turns out, been thought of earlier (i.e., binary mechanical arithmetic (Leibniz), program control (Babbage), instruction formats with numerical storage addresses (Ludgate) and floating point number representations (Torres y Quevedo)), Zuse's great achievement was to turn these ideas into reality.

Zuse had considerable trouble finding sponsors willing to finance the building of his machine. Despite his financial difficulties his first machine, the Z1, which was of entirely mechanical construction, was completed in 1938, but it proved unreliable in operation. He then started to construct a second, fixed-point binary, machine which incorporated the 16-word mechanical binary store of the Z1, but was otherwise built from second-hand telephone relays. Although the Z2 computer was completed, it was inadequate for any practical use. However, by this time a colleague, Helmut Schreyer, was already working with Zuse on the problem of producing an electronic version of the Z1. This led to the construction of a small 10-place binary arithmetic unit, with approximately 100 valves, but proposals that Schreyer and Zuse made to the German government for a 1500-valve electronic computer were rejected, and the work was discontinued in 1942. Earlier, in 1939, Zuse had been called up for military service, but managed to get released after about a year, and for the first time received significant government backing for his plans.
This enabled him to build the Z3 computer, a binary machine with a 64-word store, all built out of telephone relays. This computer, since it was operational in 1941, is believed to have been the world's first general-purpose program-controlled computer. It incorporated units for addition, subtraction, multiplication, division and square root, using a floating point number representation with a sign bit, a 7-bit exponent and a 14-bit mantissa. Input was via a manual keyboard and output via a set of lights, in each case with automatic binary/decimal conversion, and the machine was controlled by a perforated tape carrying single address instructions, i.e., instructions specifying one operand and an operation.

In addition to his series of general-purpose computers, Zuse built two special-purpose computers, both used for calculations concerning aircraft wing profiles. The first of these was in use for 2 years at the Henschel Aircraft Works before being destroyed through war damage. Both computers had fixed programs, wired on to rotary switches, and performed calculations involving addition, subtraction and multiplication by constant factors.

Soon after completion of the Z3, the design of an improved version, the Z4, was started. This was mainly electro-mechanical, but incorporated a purely mechanical binary store similar to that which had been used for the Z1 and Z2 machines. The partially completed Z4 was the only one of Zuse's machines to survive the war; indeed it was eventually completed and gave years of successful service at the Technische Hochschule, Zurich. The Z4 was inspected shortly after the war by R. C. Lyndon, whose report on the machine for the US Office of Naval Research was published in 1947. At this stage the Z4 had only manual input and output, and no means of conditional branching, although it was planned to add four tape readers and two tape punches, and facilities for repeating programs and for choosing between alternate subprograms.
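The Z3 word format described above (a sign bit, a 7-bit exponent and a 14-bit mantissa) can be sketched as straightforward bit-field packing. The exponent bias and the raw treatment of the mantissa below are illustrative assumptions; the Z3's actual conventions, such as its handling of the leading mantissa bit, differed in detail.

```python
# Sketch of a Z3-style floating point word: 1 sign bit, 7-bit exponent,
# 14-bit mantissa (22 bits in all). BIAS is an assumed convention so
# that the 7-bit field can represent negative exponents.

BIAS = 63  # hypothetical bias; the Z3's real exponent coding differed

def pack(sign, exponent, mantissa):
    """Pack fields into a 22-bit word: [sign | 7-bit exp | 14-bit mantissa]."""
    assert sign in (0, 1)
    assert 0 <= exponent + BIAS < 2**7
    assert 0 <= mantissa < 2**14
    return (sign << 21) | ((exponent + BIAS) << 14) | mantissa

def unpack(word):
    """Recover (sign, exponent, mantissa) from a packed 22-bit word."""
    sign = (word >> 21) & 1
    exponent = ((word >> 14) & 0x7F) - BIAS
    mantissa = word & 0x3FFF
    return sign, exponent, mantissa
```

The point of the sketch is the bit budget: 22 relays per stored word suffice for a signed floating point number with roughly four decimal digits of precision.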
The machine was housed in the cellar of a farmhouse in the little village of Hopferau in Bavaria, and was not fully operational, but the mechanical store and various arithmetic operations and their automatic sequencing were successfully demonstrated to Lyndon. His report, although it gives a fairly full description of the Z4 (with the exception of the mechanical store, which he was not allowed to examine in detail), made virtually no mention of Zuse's earlier work. Indeed it was many years before any other English language accounts of Zuse's work were published, and Zuse's rightful place in the chronology of computer development became at all widely appreciated.

5. Bell Telephone Laboratories

The potentialities of telephone equipment for the construction of digital calculation devices were not realised for many years. The first automatic telephone exchange, which used the step-by-step or Strowger switch, was installed in 1892. As early as 1906 Molina devised a system for translating the pulses representing the dialled decimal digits into a more convenient number system. Exchanges based mainly on the use of electromechanical relays started to come into use at the turn of the century, the earliest successful centralised automatic exchanges dating from about 1914. However, from the late 1920s various different calculating devices were developed using telephone equipment.

Perhaps the most spectacular of these was the automatic totalisator. Totalisator, or "pari-mutuel," betting became legal on British race courses in July 1929. Development of fully automatic totalisators, consisting of ticket-issuing machines situated in various parts of the race course, a central calculating apparatus, and display boards which indicated the number and total value of bets made on each horse and on the race as a whole, was already well under way. There were several rival systems.
The Hamilton Totalisator and the totalisator produced by the British Automatic Totalisator Company were fully electrical, both as regards the calculations performed and the operation of the display boards, whereas the Lightning Totalisator used electrical impulses from remote ticket machines only to release steel balls which fell through tubes and actuated a mechanical adding apparatus. In January 1930 the Racecourse Betting Control Board demonstrated at Thirsk Racecourse a new standard electric totalisator supplied by British Thomson-Houston, built from Strowger switches. This machine, which was transportable from racecourse to racecourse, could accumulate bets on up to six horses at a maximum rate of 12 000 per minute. The machine had in fact been designed in Baltimore, Maryland, in 1928, but the first complete machine to be used in the USA was installed by the American Totalisator Company at Arlington Park only in 1933. In succeeding years much more sophisticated totalisators, involving hundreds of remote ticket-issuing machines, were used at racecourses all over the USA, and it was not until many years after the advent of the electronic computer that one was used as a replacement for the central calculating apparatus of the totalisator.

One early little-known design for a calculating machine to be built from telephone relays was that of Bernard Weiner in Czechoslovakia in 1923. Weiner, in association with the Vitkovice Iron Works, went on during the 1930s to design a more powerful automatic calculator. He did not survive the war, and nothing is known about the results of this work. Other early work was done by Nicoladze, who in 1928 designed a multiplier based on the principle of Genaille's rods. (These were a non-mechanical aid to multiplication which enabled a person to read off the product of a multi-digit number by a single digit number.)
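Genaille's rods encoded single-digit multiplication so that the user could read the product off directly, the rods' printed figures absorbing the carries. The sketch below reproduces the digit-by-digit arithmetic the rods embody; the function name is ours, and the rods themselves were of course a purely visual lookup device, not a program.

```python
# Multiply a multi-digit number by a single digit, digit by digit with
# explicit carries -- the computation that Genaille's rods resolved by
# table lookup so the user never had to carry mentally.

def genaille_product(number, digit):
    """Return number * digit for a single decimal digit multiplier."""
    assert 0 <= digit <= 9
    result = []
    carry = 0
    for d in reversed([int(c) for c in str(number)]):
        total = d * digit + carry
        result.append(total % 10)   # digit the user reads off this rod
        carry = total // 10         # carry absorbed by the next rod
    if carry:
        result.append(carry)
    return int("".join(str(d) for d in reversed(result)))
```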
Four years later Hamann described not only various different styles of relay-based multiplier, but also a device for solving sets of simultaneous linear equations, and shortly afterwards Weygandt demonstrated a prototype determinant evaluator, capable of dealing with 3 x 3 determinants. Undoubtedly in the years that followed many other digital calculating devices were developed based on telephone relay equipment, particularly during the war for such military applications as ballistics calculations and cryptanalysis; indeed, as mentioned earlier, some of Zuse's machines made extensive use of telephone relays.

It is perhaps a little surprising that it was not until 1937 that Bell Telephone Laboratories investigated the design of calculating devices, although from about 1925 the possibility of using relay circuit techniques for such purposes was well accepted there. However, in 1937 George Stibitz started to experiment with relays, and drew up circuit designs for addition, multiplication and division. At first he concentrated on binary arithmetic, together with automatic decimal-binary and binary-decimal conversion, but later turned his attention to a binary-coded decimal number representation. The project became an official one when, prompted by T. C. Fry, Stibitz started to design a calculator capable of multiplying and dividing complex numbers. It was intended to fill a very practical need, namely to facilitate the solution of problems in the design of filter networks, and it started the very important Bell Telephone Laboratories Series of Relay Computers. In November 1938, S. B.
Williams took over responsibility for the machine's development and, together with Stibitz, refined the design of the calculator, whose construction was started in April and completed in October of 1939. The calculator, which became known as the "Complex Number Computer" (often shortened to "Complex Computer" and, as other calculators were built, the "Model I"), began routine operation in January 1940. Within a short time it was modified so as to provide facilities for the addition and subtraction of complex numbers, and was provided with a second, and then a third, teletype control, situated in remote locations. It remained in daily use at Bell Laboratories until 1949. The Complex Computer was publicly demonstrated for the first time in September 1940 by being operated in its New York City location from a teletypewriter installed in Hanover, New Hampshire, on the occasion of a meeting of the American Mathematical Society, a demonstration that both John Mauchly and Norbert Wiener attended.

During 1939 and 1940 Stibitz started work on the idea of automatic sequencing and on the use of error-detecting codes. These ideas were not pursued actively until, a year or so later, the onset of the war provided a strong stimulus and the necessary financial climate. They then formed the basis of the second of the Bell Laboratories relay calculators, the "Relay Interpolator." This was a special-purpose tape-controlled device, with self-checking arithmetic, designed to solve fire control problems, and was built for the National Defense Research Council, to which Stibitz had been lent by Bell Laboratories. Although mainly used for interpolation, it was also used for a few problems in harmonic analysis, calculation of roots of polynomials and solution of differential equations. It became operational in September 1943, and after the war it was handed over to the US Naval Research Laboratory, where it was in use until 1961.
The Model III relay calculator, the "Ballistic Computer," work on which started in 1942, was a much more complete realisation of Stibitz's early plans for an automatic computer, and although once again intended for fire control problems it was much more versatile than the Model II. It was tape-controlled, and had a ten-register store, a built-in multiplier (designed by E. L. Vibbard), and devices for performing automatic look-up of tables held on perforated paper tape. Perhaps most impressive was the fact that the machine was 100 per cent self-checked. The machine was completed in June 1944, and remained in use until 1958.

The Model IV relay calculator was little different from the Model III, and the series culminated in the Model V, a truly general-purpose program-controlled computer, complete with convenient conditional branching facilities. (The final member of the series, the Model VI, was essentially just a simplified version of the Model V.) Two copies of the Model V were built, the first being delivered in 1946 to the National Advisory Committee on Aeronautics at Langley Field, Virginia, and the second in 1947 to the Ballistics Research Laboratory at Aberdeen, Maryland. With its multiple computing units, the Model V, which used floating point arithmetic, was what we would now call a multiprocessing system, and its "problem tapes" were the forerunners of the early simple batch-processing operating systems. Each of the two computing units comprising a complete system contained 15 storage registers. A single register could hold a floating point number consisting of a sign, a seven-decimal-digit mantissa and a two-digit exponent. Decimal digits were stored in a bi-quinary form, using seven relays, and each register used a total of 62 relays.
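The bi-quinary representation used for the Model V's digits can be sketched as two one-hot relay groups: two "bi" relays selecting 0 or 5, and five "quinary" relays selecting 0 to 4. Exactly one relay in each group is energised for a valid digit, an invariant that makes errors cheap to detect and fits the self-checking character of the series. The list encoding below is our own illustrative representation of the seven relays.

```python
# Bi-quinary coding of a decimal digit on seven relays: a two-relay
# "bi" group (values 0 and 5) and a five-relay "quinary" group
# (values 0..4), each group one-hot.

def to_biquinary(digit):
    """Return (bi, quinary) one-hot relay groups for a decimal digit."""
    assert 0 <= digit <= 9
    bi = [0, 0]        # relays for +0 and +5
    quinary = [0] * 5  # relays for 0, 1, 2, 3, 4
    bi[digit // 5] = 1
    quinary[digit % 5] = 1
    return bi, quinary

def from_biquinary(bi, quinary):
    """Decode the relay groups back to a decimal digit."""
    # A valid code has exactly one closed relay per group; any other
    # pattern indicates a fault -- the basis of the self-checking.
    assert sum(bi) == 1 and sum(quinary) == 1
    return 5 * bi.index(1) + quinary.index(1)
```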
Each unit had independent provision for the addition, subtraction, multiplication and division, and for taking the square root, of floating point numbers, and for printing or punching its results. In addition a large set of tape readers, intended for tapes of input data, tabulated functions and programs, and for the problem tapes which controlled the running of series of separate programs, was shared by the two computing units. These units normally functioned as independent computers, but for large problems would be arranged to work cooperatively. Although somewhat slow in execution, the Model V set new standards for reliability, versatility and ease of switching from one task to another, and in so doing must surely have had an important influence on the designers of the earliest round of general-purpose electronic computers. In later years quite a number of relay calculators were constructed, in both the USA and Europe, even after the first stored program electronic computers became operational, but the importance of their role in the history of computers hardly matches that of the Bell Laboratories Model V and its contemporaries.

6. The advent of electronic computers

The earliest known electronic digital circuit, a "trigger relay," which involved a pair of valves in a circuit with two stable states and was an early form of flip-flop, was described by Eccles and Jordan in 1919. The next development that we know of was the use by Wynn-Williams at the Cavendish Laboratory, Cambridge, of thyratrons in counting circuits, including, in 1932, a "scale-of-two" (binary) counter. By the end of the decade quite a few papers had been published on electronic counters intended for counting impulses from Geiger-Müller tubes used in nuclear physics experiments.
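A "scale-of-two" counter chains two-state stages so that each stage toggles when the one before it completes a full cycle; n stages count input pulses modulo 2^n and hold the running count in binary. A minimal software sketch of that ripple behaviour (the class and method names are ours):

```python
# Software sketch of a scale-of-two (binary ripple) counter built from
# cascaded flip-flop stages, as used for counting Geiger-Muller pulses.

class ScaleOfTwoCounter:
    def __init__(self, stages):
        self.bits = [0] * stages  # bits[0] is the least significant stage

    def pulse(self):
        """Feed one input pulse; toggles ripple up while stages fall to 0."""
        for i in range(len(self.bits)):
            self.bits[i] ^= 1
            if self.bits[i] == 1:  # this stage rose, so no carry onward
                break

    def count(self):
        return sum(bit << i for i, bit in enumerate(self.bits))

c = ScaleOfTwoCounter(4)
for _ in range(11):
    c.pulse()
# c.count() is now 11, i.e. binary 1011 across the four stages
```

Each stage halves the pulse rate presented to the next, which is why such counters let relatively slow recording equipment keep up with fast pulse trains.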
Wynn-Williams' work had a direct influence on the ideas of William Phillips, who apparently in 1935 attempted to patent a binary electronic computing machine. He built a mechanical model, which still exists, of the intended electronic multiplication unit, but no other details are presently known of his planned machine.

The first known attempt to build an electronic digital calculating machine was begun by John V. Atanasoff in the mid-1930s at Iowa State College, where there had been an active interest in statistical applications using punched card equipment since the early 1920s. As an applied mathematician Atanasoff had many problems requiring generalisations of existing methods of approximating solutions of linear operational equations. He first explored the use of analog techniques and, with Lynn Hannum, one of his graduate students, developed the "Laplaciometer," a device for solving Laplace's equation in two dimensions with various boundary conditions. By 1935 the realisation of the sharp limitations of analog computing forced Atanasoff to digital methods. The disadvantages of mechanical techniques, and his knowledge of electronics and of the work of Eccles and Jordan, then led him to consider an electronic approach. He soon found that in these circumstances a base two number system would have great advantages. In 1936-1937 Atanasoff abandoned the Eccles-Jordan approach and conceived a system employing memory and logic circuits, whose details were worked out in 1938. He received a grant from Iowa State in 1939, and was joined by Clifford E. Berry. With Berry's assistance a prototype computing element was built and operating by the autumn of that year. They then undertook the design and construction of a large machine intended for the solution of up to 30 simultaneous linear equations. At the heart of the machine there was a pair of rotating cylinders around the surface of which a set of small electrical condensers was placed.
Each condenser could, by the direction of its charge, represent a binary digit; although the charge would leak away slowly, it was arranged that as the cylinders rotated the charge on each condenser was detected and reinforced at 1 second time intervals, so that information could be stored for as long as required. The condensers were arranged so as to provide two sets of 30 binary words, each consisting of 50 bits, the condensers corresponding to a single word being arranged in a plane perpendicular to the axis of the cylinders. The results of intermediate steps of a computation were to be punched in binary form on cards, for later re-input to the machine. In order that card punching and reading should be fast enough to keep pace with the computation, special devices were designed that made and detected holes in cards by means of electrical sparks. Ordinary input and output was to be via conventional punched cards, with the machine providing automatic binary/decimal conversions. The machine, with binary addition, subtraction and shifting as its basic arithmetic facilities, was designed to solve sets of simultaneous linear equations by the method of successive elimination of unknowns. The electronic part of the computer was operational, but the binary card reader was still unreliable, when in 1942 Atanasoff and Berry left Iowa State for wartime jobs, so that the machine was abandoned, never having seen actual use.

In the late 1930s and early 1940s several groups started to investigate the use of digital electronic circuits as replacements for mechanical or electro-mechanical calculating devices, including several of the American business machine manufacturers such as IBM, whose work was described briefly above.
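The method of successive elimination of unknowns that Atanasoff and Berry's machine was designed to carry out is, in modern terms, Gaussian elimination. A plain sketch of the idea follows, using ordinary floating point with no pivoting and nonzero pivots assumed, where the machine itself worked on 50-bit binary words held on its condenser drums:

```python
# Successive elimination of unknowns: repeatedly combine pairs of
# equations to cancel one variable at a time, then back-substitute.

def solve(a, b):
    """Solve the dense system a x = b by elimination (no pivoting;
    assumes every pivot a[k][k] is nonzero)."""
    n = len(b)
    a = [row[:] for row in a]  # work on copies
    b = b[:]
    for k in range(n):                      # eliminate unknown k
        for i in range(k + 1, n):
            factor = a[i][k] / a[k][k]
            for j in range(k, n):
                a[i][j] -= factor * a[k][j]
            b[i] -= factor * b[k]
    x = [0.0] * n
    for i in reversed(range(n)):            # back-substitution
        s = b[i] - sum(a[i][j] * x[j] for j in range(i + 1, n))
        x[i] = s / a[i][i]
    return x
```

For the 29-step eliminations a 30-equation system requires, the machine's scheme of punching intermediate results onto binary cards for re-input corresponds to writing out the reduced system after each elimination pass.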
The earliest known efforts at applying electronics to a general-purpose program-controlled computer were those undertaken by Schreyer and Zuse, also mentioned earlier. The next development which should be mentioned is the still classified series of electronic cryptanalytic machines that were designed and built in Britain during the war. The machines that are of particular interest with respect to the development of electronic computers are the Colossi, the first of which was operational in late 1943; by the end of the war ten had been installed. Each Colossus incorporated approximately 2000 valves, and processed a punched data tape that was read at a speed of 5000 characters per second. Preset patterns that were to be compared against the input data were generated from stored component patterns. These components were stored in ring registers made of thyratrons and could be manually set by plug-in pins. The Colossi were developed by a team led by M. H. A. Newman. Alan Turing, who had been one of the main people involved in the design of an electro-mechanical predecessor to the Colossi, was apparently not directly associated with the new design, but with others provided the requirements that the machines were to satisfy.

The comparative lack of technical details about the design of these machines makes it unreasonable to attempt more than a preliminary, and somewhat hesitant, assessment of the Colossi with respect to the modern digital computer. It would appear that the arithmetical, as opposed to logical, capabilities were minimal, involving only counting rather than general addition or other operations. They did, however, have a certain amount of electronic storage.
Although fully automatic, even to the extent of producing printed output, they were very much special-purpose machines, but within their field of specialisation the facilities provided by plug-boards and banks of switches afforded a considerable degree of flexibility; in fact several of the people involved in the project have since characterised the machines as being "program-controlled." Their importance as cryptanalytic machines, which must have been immense, can only be inferred from the number of machines that were made and the honours bestowed on various members of the team after the end of the war; their importance with respect to the development of computers, however, was twofold. They demonstrated the practicality of large-scale electronic digital equipment, just as ENIAC did, on an even grander scale, approximately 2 years later. Furthermore, they were also a major source of the designers of some of the first post-war British computers, namely the Manchester machine, the MOSAIC, and the ACE at the National Physical Laboratory.

Fascinating though they are, none of the efforts described so far comes near to matching the importance of the work at the Moore School of Electrical Engineering, University of Pennsylvania, which led to the design of first the ENIAC and then the EDVAC computers. By 1942 the Moore School had, because of the pressures of war, become closely associated with the Ballistic Research Laboratory of the US Army Ordnance Department, and the Moore School's differential analyser was being used to supplement the work of the one at the Ballistic Research Laboratory on the production of ballistic tables. (The two analysers were identical and had been patterned on the original differential analyser invented by Vannevar Bush in 1930.
) One of the people who had worked with the analyser was John Mauchly, then an assistant professor at the Moore School. Mauchly was by this time well aware of what could be done with desk calculating machines and punched card equipment, although he was apparently unaware of the work Aiken was then doing on what became the Harvard Mark I, or of Babbage's efforts 100 years earlier. He did, however, know of the work of Stibitz, and had visited Iowa State in June 1941 in order to see Atanasoff's special-purpose computer. Another person who worked on the Moore School differential analyser, and in fact made important improvements to it by replacing its mechanical amplifiers by partially electronic devices, was J. Presper Eckert, a research associate at the School. Eckert had met Mauchly in 1941, and it was their discussions about the possibility of surmounting the reliability problems of complex electronic devices that laid the groundwork for a memorandum that Mauchly wrote in August 1942. This proposed that an electronic digital computer be constructed for the purpose of solving numerical difference equations of the sort encountered in ballistics problems. Also at the Moore School, acting as a liaison officer for Colonel Paul N. Gillon of the Office of the Chief of Ordnance, was Herman H. Goldstine, who before the war had been assistant professor of mathematics at the University of Michigan. In early 1943 Goldstine and Gillon became interested in the possibility of using an electronic calculating machine for the preparation of firing and bombing tables. By this time Mauchly's 1942 memorandum had been mislaid, and it had to be recreated from his secretary's notes.
The second version of the memorandum, together with more detailed plans drawn up by Mauchly and Eckert, was included in a report dated April 1943, which formed the basis for a contract between the University of Pennsylvania and the US Government to develop an electronic computer. A large team was assembled at the Moore School in order to design and build the computer under the supervision of J. G. Brainerd, with Eckert as chief engineer and Mauchly as principal consultant. As the project progressed its aims broadened, so that the ENIAC, as it became known, turned out to be much more a general-purpose device than had originally been contemplated, and although programs were represented by plugged interconnecting wires, it provided full conditional branching facilities. It was an incredibly ambitious machine, incorporating over 19,000 valves and consuming approximately 200 kilowatts of electric power! (The number of valves largely resulted from their use for high-speed storage, and from the choice of number representation, which can best be described as "unary-coded decimal.") The ENIAC incorporated twenty 10-digit accumulators, which could be used for addition and subtraction and for the temporary storage of numbers, a multiplier, and a combination divider and square-rooter. Addition took 200 microseconds, and multiplication of two 10-digit numbers approximately 3 milliseconds. Storage was provided for approximately 300 numerical constants in function tables, which could be set up by manual switches prior to commencing a computation. Input and output was via punched cards, using standard IBM devices. Early in its career the method of programming the machine was modified so that the program was represented by settings of the function tables, without the need for changing the interconnecting cables.
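"Unary-coded decimal" here means holding each decimal digit as the single active stage of a ten-stage ring counter, with addition performed by trains of pulses. The sketch below models that idea; it is an illustrative model of the representation, not a circuit-level account of ENIAC:

```python
class RingDigit:
    """One decimal digit as a 10-stage ring: exactly one stage is active."""

    def __init__(self, value=0):
        # One flip-flop per decimal value; the index of the single 1
        # is the digit currently held.
        self.stages = [0] * 10
        self.stages[value] = 1

    @property
    def value(self):
        return self.stages.index(1)

    def pulse(self):
        """Advance the ring one step; return 1 on wrap past 9 (a carry)."""
        v = self.value
        self.stages[v] = 0
        self.stages[(v + 1) % 10] = 1
        return 1 if v == 9 else 0

# Adding n to a digit means sending it n pulses; a wrap-around
# produces the carry pulse for the next-higher digit.
d = RingDigit(7)
carry = sum(d.pulse() for _ in range(5))  # add 5 to 7
print(d.value, carry)  # prints 2 1
```

Ten flip-flops per digit, times ten digits per accumulator, times twenty accumulators, goes a long way toward explaining the valve count quoted above.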
Thursday, August 29, 2019
New Developments in Technology Management
The teaching of technology management has a long history in business schools. However, the nature and focus of such curricula have changed in recent years, due to several trends. The rise of a knowledge-based economy has brought greater attention to the management and commercialization of intellectual property (Markman, Siegel, & Wright, 2008). Questions regarding the appropriate business models to foster successful commercialization have been further complicated by the rise of "open-source" innovation (e.g., Linux, open-source software that has captured substantial market share). And new institutions (e.g., incubators and science parks; Phan, Siegel, & Wright, 2005) and new organizational forms (e.g., research joint ventures [RJVs] and technology alliances) have emerged that may also have profound effects on technology management education. Nonprofit institutions, most notably universities and federal laboratories, have become much more aggressive in protecting and exploiting their intellectual property (Siegel & Wright, 2007). Such institutions, especially universities, are also working much more closely with industry and government. These trends, and the growing involvement of government and nongovernmental institutions in innovation and commercialization, have led to growing international recognition of the narrowness of technology management education as it is practiced today. Some business and engineering schools have responded to these developments by designing new courses and curricula related to technological entrepreneurship. Some countries with centralized educational systems (e.g.
, Japan, Singapore, and Ireland) are graduating "bilingual engineers" with capabilities in both technology and business. Yet this trend of marrying technology with management education is still far from mainstream. Another important development stimulating and changing the nature of the demand for technology management education is the rise of knowledge and intellectual property management as a professional field. In many countries, national governments have supported these initiatives by enacting legislation to facilitate public–private research partnerships, technology transfer (through patenting and licensing) from universities to firms (e.g., the Bayh–Dole Act of 1980), and collaborative research. For example, the EU, China, and Singapore have established technology-based venture funds to stimulate the development of technology-based start-up companies. In the United States, the national "public sector venture capital" for technology-based new ventures, the Small Business Innovation Research (SBIR) program, and numerous state-level programs with similar goals (e.g., Ben Franklin Technology Partners in Pennsylvania and the Massachusetts Technology Development Corporation) have propelled technology transfer issues to the forefront of university technology management curricula. Government is also providing subsidies for research joint ventures involving universities and firms (e.g., the Commerce Department's Advanced Technology Program/Technology Innovation Program), shared use of expertise and laboratory facilities (e.g., the National Science Foundation's Engineering Research Centers and Industry–University Cooperative Research Centers), and programs to promote management and entrepreneurship education among scientists and engineers (e.g., the Science Enterprise Challenge in the U.K.).
These and other trends discussed here have led to experimentation and innovation in technology management pedagogy and content, which is the focus of this special issue. For example, it is obvious that the rise in collaborative research and commercialization has important educational implications, since it implies that teamwork has become more important in science and engineering, especially when both innovation and commercialization are involved. This has resulted in the increasingly popular use of real-life team projects as the primary method of delivering discovery-based learning. Our purpose in this special issue is to assess the implications of these trends for technology management curricula in business schools. In spring 2008, we issued an open Call for Papers on the Academy of Management website, the Social Science Research Network, and other venues. We received 38 manuscripts, which were reviewed according to AMLE standards for the Research & Reviews section. Papers were also solicited for the Essays, Dialogues, & Interviews and Exemplary Contribution sections, which were subject to the usual peer-review process. Based on the results, we selected several manuscripts for inclusion, which are summarized in Table 1. The remainder of this essay is organized as follows: First, we describe recent public policy changes that have promoted university–industry partnerships, collaborative research, and technology transfer from universities and federal labs to the private sector. Then, we discuss the educational implications of these trends, drawing on some of the lessons learned from the papers in the special issue. Finally, we outline an agenda for additional research on technology management education.
PUBLIC POLICY INITIATIVES INFLUENCING TECHNOLOGY MANAGEMENT

In recent decades, we have witnessed rapid growth in the incidence of a variety of research partnerships and technology commercialization involving corporations, universities, nonprofit organizations, and government agencies. This growth can be attributed to three policy initiatives:

• Policies promoting the transfer of technology from universities and federal labs to firms
• A large increase in the incidence of public–private partnerships
• Relaxation of antitrust enforcement related to collaborative research

Examples of such technology partnerships are research joint ventures; strategic alliances and networks involving high-technology organizations; industry consortia (e.g., SEMATECH); cooperative research and development agreements (CRADAs) involving federal labs and firms; engineering research centers (ERCs) and industry–university cooperative research centers (IUCRCs) sponsored by the U.S. National Science Foundation; federally funded research and development centers; science parks and high-technology incubators (many of which are located at universities); and licensing and sponsored research agreements involving universities, government laboratories, firms, and university-based start-ups. Table 2 summarizes the key U.S. legislation promoting government–university–industry partnerships, collaborative research, and technology transfer/commercialization. The most important legislation in this regard is the Bayh–Dole Act of 1980, which dramatically changed the rules of the game with respect to the ownership of intellectual property rights to technologies emerging from federal research grants. Bayh–Dole conferred on universities the right to patent and claim the scientific discoveries arising from U.S.
government-funded research, instituted a uniform patent policy across federal agencies, and lifted numerous restrictions on technology licensing.

TABLE 1: Summary of Papers

Barr, Baker, Markham, & Kingon. Research question: how to teach technological entrepreneurship skills that will help bridge the "valley of death" in commercialization of technology (COT) between the creation of a technology and the emergence of a commercial venture. Theory/framework: Van Burg et al.'s (2008) science-based design framework of five factors critical to enhancing science-based start-ups; cognitive theory; theory of planned action. Data/methods: analysis of the development of a COT program for MBA, PhD, and master's students at North Carolina State over a 14-year period. Findings: enactive mastery experiences have to be perceived as authentic and real to have the desired effect; importance of loosely structured hands-on engagement; the program needs to be real, intensive, interdisciplinary, and iterative; need to create temporal checkpoints, decenter business plans, structure large blocks of time, emphasize and balance team diversity, generate technology flow, and beware of idiosyncratic heuristics.

Thursby, Thursby, & Fuller. Research question: what are the benefits and challenges of integrated approaches to graduate education in technological entrepreneurship? Theory/framework: theory of the firm; economic approach to evaluation. Data/methods: ordered logit analysis of program assessment data, including pre- and post-surveys and a control group, relating to an NSF-sponsored integrated program at Georgia Tech and Emory University involving PhD, MBA, and JD students. Findings: significant positive effects of the program on student perceptions of the multidisciplinary capabilities needed to operate in a technological business environment.

Austin, Nolan, & O'Donnell. Research question: how to design a student experience in technology management that addresses the learning cycle more completely, while maintaining very high levels of student engagement. Theory/framework: experiential learning theory. Data/methods: programs at universities in two countries, MNC executives, and an open-enrollment course at a business school; a combination of case-based and traditional lecture-based approaches; a narrative approach based on the monomyth; student course feedback and follow-up one year later. Findings: the approach works at multiple student levels with the same materials, though the emphasis differs across groups; it can be used in both introductory and capstone courses; it acts as a leveler in class, as all can engage with the "story"; open issues concern the integration of supplementary materials, the lack of "closure" in each class, and the use of fictionalized cases.

Verzat, Byrne, & Fayolle. Research question: what teaching methods can be used to create entrepreneurial engineers who have a keen sense of teamwork? Are games an appropriate pedagogical device to meet the specific learning needs of engineering students? Can games help engineering students learn about teamwork? Theory/framework: education science and team process; Kirkpatrick's four-level hierarchy of evaluation. Data/methods: use of team games in a traditional, elitist French teaching context that emphasizes individual learning; evaluation data collected from 111 groups on initial reaction to the game, plus interviews three months later. Findings: the games drew a positive reaction from students despite being an informal departure from the normal formal approach; a real learning outcome in exposing students to the importance of teamwork.

Boni, Weingart, & Evenson. Research question: how to teach the skills of creating disruptive innovations and developing new business opportunities through blending entrepreneurial thought and action, design thinking, and team building. Theory/framework: disruptive innovation, entrepreneurial leadership, design thinking, and team building. Data/methods: a capstone course for MBA (Entrepreneurship in Organizations) and Design master's students at Carnegie Mellon involving team teaching; multidisciplinary teams of designers, technologists, and business student entrepreneurs. Findings: it is important to blend three perspectives for effective commercialization of innovation: (1) entrepreneurial thought and action, (2) design thinking, and (3) team building; a key feature of this project-based course is the collaboration between MBA students and School of Design students, which leads to the development of new business opportunities.

Clarysse, Mosey, & Lambrecht. Research question: what are the implications for developments in technology management education of contemporary challenges such as globalization, open innovation, and the need for corporate renewal (and venturing)? Theory/framework: technology management skills provision. Data/methods: qualitative analysis based on interviews with 10 technology management education demand- and supply-side actors in universities, consultancies, and corporations across Europe. Findings: technology management education is a dynamic field moving from traditional MBA-focused programs towards more entrepreneurial "bootcamps", from a case-study-oriented teaching style towards a mentoring approach, and from an emphasis on general business towards working across disciplines while remaining sensitive to the underlying technologies; a shift from general to specific skills; linkages between business schools and technology schools are an important element of this change.

Hang, Ang, Wong, & Subramanian. Research question: how can management of technology programs and curricula be designed to meet the needs of a small, newly developed Asian country? Theory/framework: action learning as a foundation for curriculum design in technology-intensive technology management programs. Data/methods: qualitative analysis of the transfer of an MSc in Management of Technology from a business school to a school of engineering in Singapore. Findings: courses in IP management, management of industrial R&D, and systems architecture and engineering could only be offered by transfer to the School of Engineering; traditional professional degrees can be enhanced by integrating management of technology programs into the core engineering curriculum; there are advantages to offering part-time courses for those in employment.

Mustar. Research question: how to develop a highly selective technology management course for students in a leading French engineering school, in an institutional and country environment traditionally resistant to the notion of entrepreneurship, that develops their entrepreneurial skills but goes beyond an introductory course on how to start a business; how to combine the acquisition of knowledge with the development of skills; how to develop students' entrepreneurial skills and their ability to take responsibility; how to encourage imagination, creativity, involvement, and risk taking. Data/methods: qualitative analysis of the case of innovation and entrepreneurship at Mines ParisTech, a leading French engineering school. Findings: the need to find a subtle balance between traditional didactic courses, presentations of leading-edge research, workshops and meetings with practitioners, field studies, and involvement in real projects through internships (including outside France); the need for faculty to have close links with industry both domestically and abroad; the importance of concurrent teaching modes.

As a result of this legislation, U.S.
research universities established technology transfer offices to manage and protect their intellectual property. The Stevenson–Wydler Act, enacted in the same year as Bayh–Dole and then extended in 1986, required federal labs to adopt technology transfer as part of their mission and also authorized cooperative research and development agreements (CRADAs) between the labs and private organizations. The National Cooperative Research Act (NCRA) of 1984 and the National Cooperative Research and Production Act (NCRPA) of 1993 promoted collaborative research by eliminating antitrust concerns associated with joint research, even when these projects involved firms in the same industry. The NCRA created a registration process, later expanded by the NCRPA, under which research joint ventures (RJVs) can disclose their research intentions to the Department of Justice.

TABLE 2: Key U.S. Legislation Promoting Government–University–Federal Lab–Industry Partnerships, Collaborative Research, and Technology Transfer/Commercialization

• Bayh–Dole Act of 1980: transferred ownership of intellectual property from federal agencies (which sponsor most basic research) to universities; spurred the growth of university technology transfer offices, which manage university patenting and licensing. Institutions affected: universities; teaching hospitals; firms.
• Stevenson–Wydler Technology Innovation Act of 1980 and Federal Technology Transfer Act of 1986: required federal labs to adopt technology transfer as a part of their mission; authorized cooperative research and development agreements (CRADAs) between federal labs and private organizations. Institutions affected: federal labs; firms.
• Small Business Innovation Development Act of 1982: created the Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs, which require each federal agency to allocate a percentage (now 2.5%) of its research budget to small business research with commercial potential. Institutions affected: universities; small firms; venture capital firms.
• National Cooperative Research Act (NCRA) of 1984 and National Cooperative Research and Production Act (NCRPA) of 1993: actively encouraged the formation of research joint ventures and joint production ventures among U.S. firms. Institutions affected: firms; universities.
• Omnibus Trade and Competitiveness Act of 1988 and America COMPETES Act (2007): the 1988 act established the Advanced Technology Program (ATP), a public–private research program; in 2007, the America COMPETES Act created the successor to ATP, the Technology Innovation Program (TIP). Institutions affected: firms; universities.

The most notable research joint venture established via the NCRA registration process was SEMATECH (SEmiconductor MAnufacturing TECHnology), a not-for-profit research consortium, which provided a pilot manufacturing facility where member companies could improve their semiconductor manufacturing process technologies. Other legislation created two key publicly funded technology programs: (1) the Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs, which require each federal agency to allocate a percentage (now 2.5%) of its research budget to small businesses with commercial promise, and (2) the Advanced Technology Program (ATP), a public–private research program, which funds collaborative research on generic technologies.
In 2007, the America COMPETES Act created the successor to ATP, the Technology Innovation Program (TIP). Universities are actively involved in both programs, working closely with large firms on ATP/TIP research projects, as well as with small companies on SBIR/STTR, sometimes founding these firms. As a result, many technology management curricula in the United States are now infused with a public policy dimension that was previously missing.

Table 3 presents global evidence on key policy changes relating to the legislative and support environment for technology commercialization in five nations: France, Germany, Italy, Singapore, and the United Kingdom. For example, according to Meyer (2008), Austria, Denmark, Finland, Germany, Italy, and Japan have adopted "Bayh–Dole like" legislation, emphasizing a "patent-centered" model of university and national laboratory technology transfer. The United Kingdom and Israel have always had a system of university-owned intellectual property.

TABLE 3: Legislative and Support Environment for Technology Commercialization in France, Germany, Italy, Singapore, and the U.K.

France:
• University ownership of IP arising from national research grants (cf. the Bayh–Dole Act in the U.S.): not relevant, as all IP belongs to universities/public research institutes following the "code intellectuelle de la propriete."
• Other key changes: the 1999 Innovation Act gives academics who are civil servants the possibility to participate as a partner or manager in a new company and to take equity (previously illegal for civil servants); the Act also encourages the creation of new start-up firms by students. A 2002 decree regulates and increases the personal income an academic can receive from IP (50%). In 2005, six "Maisons de l'entrepreneuriat" were created in different universities, aiming to promote the entrepreneurial spirit and mind-set and "sensitization" to new business start-ups or new activities.
• Financial support: in 1999, 11 (pre-)seed capital funds were created to invest in innovative start-ups and take equity (investment in 150 spin-offs in 8 years); creation of the annual National Competition for the creation of technologically innovative start-ups (grants from 45,000 to 450,000 euros), with 12,927 projects presented between 1999 and 2007, of which 1,879 were funded; creation of 29 incubators between 1999 and 2007, which hosted 1,993 projects, giving birth to 1,239 new firms. Between 1999 and 2007 these three schemes benefited 1,760 new firms (taking into account that a company can benefit from different schemes); around 50% are academic spin-offs.

Germany:
• IP ownership and other changes: in 1999, public researchers received the right to own their IP (the opposite of the Bayh–Dole Act), although the university often makes a formal contract on an individual basis to obtain the IP rights; the 2002 Employer Invention Law assigns inventions to the employer, not the professor. Between 2000 and 2006, various laws were restructured to make it easier to commercialize technology from universities, receive part of the royalties as an academic, take equity in start-ups, etc.
• Financial support: 2000, EXIST, a public program that assists spin-offs through pre-seed capital and management support; 2002, EEF-Fund, under which researchers can receive a scholarship to start a spin-off; 2002, 22 TTOs established to take care of IP management.

Italy:
• Financial support: 1999, National Research Commission created, which annually funds about 5–10 proposals for spin-offs, amounting to 30,000 euros on average; 2005, Quantica Fund, the first interuniversity seed capital fund (a form of public–private partnership), is created; 2005, Italian university technology transfer offices have to join together in groups of four and bid for money (100,000 euros/university) to sponsor their day-to-day operations.

Singapore:
• 1963: forms a tripartite macroeconomic structure of industry, labor, and government as the basis for funding innovation and economic development. The right to commercialize IP is assigned to the faculty.
• 2001–2008: national initiative to focus on microelectronics, biotechnology, nanotechnology, materials science, healthcare, and life sciences as part of a national innovation initiative. In 2001 the Economic Development Board was charged with implementing the 5-Year Science and Technology Plan, which includes initiatives to target key technology sectors, attract foreign investment and human capital, and accelerate technological entrepreneurship and technology commercialization; the Agency for Science, Technology and Research (A*STAR) was created to fund and create the infrastructure of industry–university joint research efforts in strategic technology sectors.
• 2005: the government's funding plan is to increase R&D expenditure to 3% of GDP by 2010, from the 2004 R&D expenditure of US$2.5 billion (about 2.25% of GDP). 2007: public sector R&D budgets more than doubled from 2005 to US$13.55 billion, comprising US$5 billion for the National Research Foundation (NRF), US$5.4 billion for the public research institutes housed in A*STAR, US$1.05 billion for academic (university-based) research, and US$2.1 billion for the Economic Development Board (EDB) to promote private sector R&D.

United Kingdom:
• IP ownership: no formal Bayh–Dole Act; in the case of UK public research organizations, the IP is owned by the institution, and the royalties associated with the IP are distributed between the relevant parties on an institutional basis.
• 1970 onward: various schemes to promote collaborative projects between universities and industry, including Knowledge Transfer Networks. 1998–2004: higher education reaches out to business and the community, with funding to establish corporate liaison offices and collaborative projects. 1998: University Challenge Funds (UCFs) granted universities funds to support spin-offs and limited incubation support. 2001 onward: the Higher Education Innovation Fund (HEIF) provides a permanent flow of funding to support and develop universities' capacity to act as drivers of growth in the knowledge economy (various rounds up to 2008). Science Enterprise Challenge funding (1991/2001) to encourage a culture open to the entrepreneurship required for successful knowledge transfer from the science base: teaching entrepreneurship to support the commercialization of science and technology, producing graduates and postgraduates better able to engage in enterprise, and establishing a network of UK universities specializing in the teaching and practice of commercialization and entrepreneurialism in science and technology. 2005: Medici Fellowship Scheme, a pilot providing 50 fellowships over 2 years focused on the commercialization of biomedical research; fellows were required to have significant prior research experience, received local training in the host institution in finance, marketing, IP, and business strategy, and were encouraged to develop links with practitioners; post-pilot, further funding was obtained to extend the remit to include engineering researchers from 2007–2009, and analogous schemes were subsequently created by Research Councils and Regional Development Agencies, from 2007–2009 mainly focused on the life sciences. Regional Development Agencies provide a broad spectrum of assistance to develop more productive links between universities and industry. 2007–2011: the Technology Strategy Board strategic plan envisages investing £1 billion of public funds, plus matched funds from industry, over 2008–2011 in doubling the number of innovation platforms, a strategic review of Knowledge Transfer Networks, doubling the number of Knowledge Transfer Partnerships, developing a strategy to rapidly commercialize new and emerging technologies, and piloting a new Small Business Research Initiative.

Information sources: Clarysse et al. (2007); Mustar & Wright (2009); and Koh & Phan (in press).

An increase in funding for technological entrepreneurship in many countries (see Table 3) has also stimulated greater interaction among firms, universities, and national labs, as well as the rise of intellectual property management curricula and courses at these institutions (for a detailed comparison of France and the U.K., see Mustar & Wright, 2009).

EDUCATIONAL IMPLICATIONS OF THESE TRENDS

The end result of these global trends is an increased emphasis on collaborative research, commercialization of intellectual property, entrepreneurship, venture capital, and research centers dedicated to emerging technologies, such as organic LEDs, nanotechnology, biotechnology, materials science, MEMS, and so on. Such trends have brought new issues and perspectives, propelling the role of education to the forefront of discourse (e.g., the recent AMLE special issue on entrepreneurship education). Conventional technology management and management of innovation curricula have focused largely on understanding the technology and innovation strategies of multinational firms (Nambisan & Willemon, 2003). There has been, until recently, less emphasis on start-up and entrepreneurial technology-based firms. The differences can be significant.
For example, in the traditional curriculum, the role of teamwork, especially linking interdisciplinary teams of agents (scientists, technology managers, and entrepreneurs) and institutions (firms, universities, government agencies), has not been stressed. That is, the individual and institutional levels of analysis have been ignored, such that technology management education curricula have been confined to how organizations respond to technological challenges. The developments in technology management education considered in this special issue can be seen as a response to the challenges leveled at business schools to be relevant to the practice of management (Pfeffer & Fong, 2002, 2004; Starkey, Hatchuel, & Tempest, 2004). At the same time, such programs that reside in business schools, when detached from the engineering and science faculties of their universities, run the risk of treating the technology component as a special case of general management. Our review of the literature and the lessons learned from this special issue suggest that a fully matured technology management program should treat technology with a capital "T" rather than the small one it has been to date. To accomplish this design goal, business schools need to appoint program directors with strong boundary-spanning skills who can link up with technology-based units on and off campus by colocating or partnering with such institutions. We note that the challenge of integration is not easily solved. Over the years, business schools in the United States and United Kingdom have chosen to remain independent from the rest of their universities. This was partially enabled by the largesse of endowments in the 1980s and 1990s pouring in from private foundations and industrialists seeking to establish their names in perpetuity. Clarysse, Mosey, and Lambrecht (this issue) hypothesize that this is not a wise strategy for business schools administering technology management curricula.
The authors conclude that business schools should expand their educational mission to include the education of engineering and science professors and researchers, and the training of postgraduate science and engineering students, since these individuals are more likely to choose an industry- or technology-specific master's degree instead of a traditional MBA. More generally, business schools need to have a stronger connection to schools of engineering and the sciences, and other technology-oriented organizations in the areas of medicine, public health, and pharmacy, as well as science-based business incubators and science parks. While acknowledging Clarysse et al.'s points, we are concerned that each of these institutions has different paradigms, norms, standards, and values, as well as diverse languages and codes. Thus, it may be necessary to develop a shared syntax of boundary objects that include repositories, standardized forms, objects, and models (Carlile, 2002). These communication devices enable individuals in business schools and technology-based schools to learn about their differences and dependencies, as well as jointly to evolve their knowledge bases about how things work "on the other side." Hence, the recruitment and development of boundary spanners (such as program managers, center directors, or interdisciplinary faculty members) who can communicate across schools is important to facilitate such integration (see, e.g., the Medici Scheme, Table 3). Another concern regarding the optimal design of technology management curricula arises in relation to the overall configuration of business schools.
Ambos, Mäkelä, Birkinshaw, and D'Este (2008) have argued that for universities to be effective at technology commercialization, there is a need for ambidexterity in the organizational structures of these traditional research and teaching institutions. Similarly, with respect to technology management education, business schools must make their organizations more porous, for example, through the hiring and promotion of faculty with science and engineering degrees. Such ambidextrous configurations will enable business schools to more tightly bind the traditional business disciplines to science and engineering disciplines. The papers in this special issue challenge the proposition of Suddaby and Greenwood (2001), who asserted that business schools can sustain demand for new managerial knowledge through the education and accreditation of a continuing stream of management students. While it is true that there has been substantial growth in demand for courses in entrepreneurship and innovation in MBA and undergraduate programs, the ability of business schools to deliver these programs beyond an introductory level is open to debate, especially when faculty in such schools traditionally lack exposure to the hard sciences and technology disciplines. A third concern in the design of technology management curricula raised herein is the notion of avoiding polar extremes in content coverage, which are emphasizing theoretically rigorous but highly abstract research, or stressing practical content based on "war stories" and conventional wisdom. Placing too much emphasis on practical experience may have negative consequences, since the mental models that such pedagogies create can quickly become obsolete, particularly in light of the fast-evolving technologies the curricula are supposed to address (Locke & Schöne, 2004). In
other words, practice-oriented technology management curricula may inspire students to become more entrepreneurially oriented, but without the concomitant development of critical thinking skills, such as the ability to assess risks and recognize the inevitable downsides of entrepreneurial activity. Technology management curricula that are light on practice, however, can produce students who may find the challenge of boundary spanning, a key skill for successful technology managers, too great to scale. Van Burg, Romme, Gilsing, and Reymen (2008) have outlined a design-science-based model for the development of academic spin-offs that is grounded in both theory and practice. As noted by Barr, Baker, Markham, and Kingon (this issue), new developments in technology management education stress the importance of active involvement (experiential learning) models that are authentic and real. Many technology management curricula mimic those of entrepreneurship, in that they include a healthy dose of business plan writing, ostensibly as products of courses on commercialization and opportunity search. There is considerable debate over the usefulness of business plans in practice, even though venture capitalists and banks demand them. Indeed, Barr, Baker, Markham, and Kingon (this issue) challenge the effectiveness of teaching the preparation of a business plan. They suggest that it is preferable to deemphasize the writing of a plan, because it tends to restrict creativity and the search for more appropriate solutions. Yet, as a pedagogical tool, we think that business plans, when used appropriately, can be a useful way to focus a student's attention on a comprehensive set of issues that should be considered when commercializing an invention. A shift is taking place from traditional technology management curricula toward more entrepreneurially based courses that require interdisciplinary skills.
As part of this development, there is a need for interdisciplinary team-learning activities to be a central part of curriculum development in technology management education. Team composition needs to be addressed carefully to enable participants to gain full benefits. Thursby, Thursby, and Fuller (this issue) present an interesting example of teams of law, business, science, and engineering students converging to commercialize innovations developed at Emory University and the Georgia Institute of Technology. Developments in technology management education also pose major faculty recruitment challenges. Many business school faculty members do teaching, research, and service (including consulting) that is focused on large corporations. Traditional business school academics typically lack the appropriate context-specific business creation skills that are increasingly demanded as central to technology management education (Wright, Piva, Mosey, & Lockett, 2008). As noted in Barr, Baker, Markham, and Kingon (this issue), the recruitment of adjunct faculty members should be focused on those who can serve as mentors to students. There is also a need to consider the recruitment and training of faculty who can act as boundary spanners. The time-consuming nature of developing interdisciplinary curricula raises a concern about possible conflicts with the promotion-and-tenure process, which also needs to be addressed in recruitment and retention.

AGENDA FOR FURTHER RESEARCH ON TECHNOLOGY EDUCATION

To build on the findings of this special issue, we identify a number of areas for further research. These are summarized in Table 4, where we identify a series of research questions relating to institutional issues, the interaction between education and practice, the advancement of business schools, and evaluation. Universities typically have well-established conventions and practices concerning the management of their activities.
The traditional academic culture of the university (the classic "ivory tower") embodies a system of values that opposes the commercialization of research through company creation. When university administration is decentralized, with no mechanism for integration, links between business schools and technology-oriented units of universities may be weak or informal. This suggests a need for the development and implementation of clear and well-defined strategies, processes, and policies regarding new venture formation and approaches to technology management education that incorporate entrepreneurial activities. Institutional frictions and their impact upon intraorganization knowledge transfer are well known (Szulanski, 1996). These frictions in the interactions between different elements of the university may frustrate the development of interdisciplinary technology management curricula. Transferring personnel across organizational boundaries has been identified as an important mechanism to effect knowledge transfer (Inkpen & Tsang, 2005).

TABLE 4
Research Agenda

Institutional Issues
How do incentive systems for faculty encourage the time-intensive development of effective technology management courses?
What institutional challenges constrain the cross-disciplinary development of technology management education?
What are the resource implications for universities attempting to develop interdisciplinary technology management education?

Interaction Between Education and Practice
How can technology management education processes be transferred to promote the creation and development of spin-offs?
How can universities develop integration processes among technology management education and technology transfer offices, incubators, and science parks?
How can business schools enhance (effective) engagement with leading-edge technological entrepreneurs?

Advancement of Business Schools
How can the necessary specific skills now required for technology management education be developed within business schools?
Do business schools have the requisite career structures for faculty involved in technology management education (e.g., adjunct, nontenure-track faculty)?
What is the role of business school faculty in contributing to the development of technology management education?

Evaluation Issues
How effective are different developments in technology management education?
Is it possible to have a valid control group in the evaluation of technology management education?
From a corporate perspective (since many students are sponsored by companies), how effective are technology management programs?
What are the most appropriate methods for evaluating the effectiveness of technology management education?
What decision-making processes are most effective in promoting interdisciplinary teaching and research, and integration in technology management education (top-down vs. bottom-up)?
Does the development of technology management education represent a need to reevaluate the whole position of business schools within universities, or is there a need for ambidexterity?
What are the roles of different competitors within the segments of the broad technology management space?
What challenges arise in addressing "language barriers" between business school and technology/engineering faculty, and how can they be overcome?
What is the best way to train technology managers who must engage in boundary spanning among industry, the entrepreneurial community, academia, and government?
What challenges arise in integrating research with new developments in technology management education?
Is it possible to build evaluation into the design of technology management education programs, so we can identify "best practices" and benchmark comparable programs?
Universities may need to consider the facilitation of exchanges of staff between schools, or the development of faculty with boundary-spanning skills. Academics may identify more closely with their discipline than with the business school or university, and may seek to marginalize "tribes" from "outside disciplines" (Becher & Trowler, 2001). This concern is especially salient if the objective is to integrate research with new developments in technology management education. Differences in language and goals between business schools and science- and technology-based departments exacerbate these problems. Business schools may also lack credibility with conventional, "pure" scientists, who perceive them as professional schools with little research tradition. This may be a major issue in universities with strong science departments and weak business schools (Wright et al., 2008). However, even this effect is likely to vary between disciplines, as some departments, for example, engineering and medicine, may be closer in the sense of being professional schools than the pure science departments. It may also be important to focus on the role of technology managers within the university. Siegel, Waldman, and Link (2003) found that the key impediment to effective university technology transfer tended to be organizational in nature. In a subsequent field study (Siegel, Waldman, Atwater, & Link, 2004), the authors found that there are deficiencies in the technology transfer office and other areas of the university involved in technology commercialization with respect to marketing skills and entrepreneurial experience. This finding has been confirmed with more systematic data by Markman, Phan, Balkin, and Gianiodis (2004), who explained this result by reporting that universities were not actively recruiting individuals with such skills and experience. Instead, representative institutions appear to be focusing on expertise in patent law and licensing, or technical expertise.
To develop effective curricula, the expertise that business school faculty need to interact with science and technology departments may be discipline specific. Yet the background of business school faculty typically makes it difficult for them to convey sufficiently context-specific material for different groups of technologists. To this end, Siegel and Phan (2005) suggest the creation of formal training programs for university personnel on the issue of technology management. Thursby, Thursby, and Fuller (this issue) report that an integrated graduate program on technological entrepreneurship has a positive impact on student perceptions of the multidisciplinary capabilities needed to operate in a technologically oriented business environment. Taking a page from Souitaris, Zerbinati, and Al-Laham (2007), who drew on the theory of planned behavior to demonstrate that entrepreneurship programs raised risk-taking attitudes and inspired entrepreneurial intention among students, we suggest that technology management curricula can similarly inspire students to think creatively about how they can convert science into commercial ventures, by immersing them in the experience of technology and opportunity evaluation early in the program. Authors of evaluation studies need to find ways of incorporating the measurement of postprogram outcomes, such as new venturing and career trajectories, through more longitudinal studies. More specifically, it would be extremely useful to build evaluation into the design of such programs, so that we can identify "best practices" and benchmark comparable programs, as we do for other types of programs. A critical methodological issue in evaluation concerns whether it is possible to have a viable control group for such a study.
The papers in this special issue represent a number of different institutional contexts worldwide. A final question one can ask, after reading these papers, is whether there are developments that suggest a convergence in program design toward a universal model, or whether we are likely to experience wide variation due to adaptations to local contexts. Locke and Schöne (2004) highlight important differences in the interaction between business schools and industry in Europe compared to those in the United States. They suggest that the relations between business school faculty and other scientists have traditionally been stronger in the United States than in the United Kingdom and France. Further, subjects taught in business schools in France, the United Kingdom, and the United States tend to be close to praxis, and professors have usually had practical experience. By contrast, in Germany management education has always been strongly oriented toward science, with academics having little business experience or contact with industry; this pattern appears to have persisted despite pressure for convergence to an Anglo-Saxon business school model (Muller-Camen & Salzgeber, 2005). Mustar (this issue) and Verzat, Byrne, and Fayolle (this issue) illustrate the challenges of introducing entrepreneurial elements into the traditional approach to technology and engineering training in France. Hang, Ang, Wong, and Subramanian (this issue) argue that there was a need to design a program to meet the needs of a small, newly developed Asian country. In sum, while the elements of technology management curricula appear to be very similar, in part driven by the institutional hegemony of U.S.-based models, there is some indication of local adaptation in pedagogy, delivery mechanisms, and sequencing of content, based on government initiatives, the types of corporations that employ the local graduates of such programs, and the capabilities of the universities delivering them.

REFERENCES

Ambos, T., Mäkelä, K., Birkinshaw, J., & D'Este, P. 2008. When does university research get commercialized? Creating ambidexterity in research institutions. Journal of Management Studies, 45: 1424–1447.

Becher, T., & Trowler, P. R. 2001. Academic tribes and territories. Buckingham: The Society for Research into Higher Education and Open University Press.

Carlile, P. R. 2002. A pragmatic view of knowledge and boundaries: Boundary objects in new product development. Organization Science, 13: 442–455.

Inkpen, A., & Tsang, E. 2005. Social capital, networks and knowledge transfer. Academy of Management Review, 30(1): 146–165.

Koh, W., & Phan, P. In press. The national innovation system in Singapore. In V. K. Narayanan & G. O'Connor (Eds.), Encyclopedia for Technology, Innovation and Management. Blackwell Press: U.K.

Locke, R., & Schöne, K. 2004. The entrepreneurial shift: Americanization in European high-technology management education. Cambridge: Cambridge University Press.

Markman, G., Phan, P., Balkin, D., & Gianiodis, P. 2004. Entrepreneurship from the ivory tower: Do incentive systems matter? Journal of Technology Transfer, 29(3–4): 353–364.

Markman, G., Siegel, D., & Wright, M. 2008. Research and technology commercialization. Journal of Management Studies, 45: 1401–1423.

Meyer, M. 2008. University patenting and IP management approaches in Europe. Brighton: SPRU, University of Sussex.

Muller-Camen, M., & Salzgeber, S. 2005. Changes in academic work and the chair regime: The case of German business administration academics. Organization Studies, 26(2): 271–290.

Mustar, P., & Wright, M. 2009.
Convergence or path dependency in policies to foster the creation of university spin-off firms? A comparison of France and the United Kingdom. Journal of Technology Transfer, forthcoming.

Nambisan, S., & Willemon, D. 2003. A global study of graduate management of technology programmes. Technovation, 23: 949–962.

Pfeffer, J., & Fong, C. T. 2002. The end of business schools? Less success than meets the eye. Academy of Management Learning and Education, 1(1): 78–95.

Pfeffer, J., & Fong, C. T. 2004. The business school "business": Some lessons from the U.S. experience. Journal of Management Studies, 41(8): 1501–1520.

Phan, P., Siegel, D. S., & Wright, M. 2005. Science parks and incubators: Observations, synthesis and future research. Journal of Business Venturing, 20(2): 165–182.

Siegel, D. S., & Phan, P. 2005. Analyzing the effectiveness of university technology transfer: Implications for entrepreneurship education. In G. D. Libecap (Ed.), Advances in the study of entrepreneurship, innovation, and economic growth, volume 16: University entrepreneurship and technology transfer: 1–38. JAI Press: Oxford, UK.

Siegel, D. S., Waldman, D., & Link, A. N. 2003. Assessing the impact of organizational practices on the productivity of university technology transfer offices: An exploratory study. Research Policy, 32(1): 27–48.

Siegel, D. S., Waldman, D., Atwater, L., & Link, A. N. 2004. Toward a model of the effective transfer of scientific knowledge from academicians to practitioners: Qualitative evidence from the commercialization of university technologies. Journal of Engineering and Technology Management, 21(1–2): 115–142.

Siegel, D. S., & Wright, M. 2007. Intellectual property: The assessment. Oxford Review of Economic Policy, 23(4): 529–540.

Souitaris, V., Zerbinati, S., & Al-Laham, A. 2007. Do entrepreneurship programmes raise entrepreneurial intentions of science and engineering students?
The effects of learning, inspiration and resources. Journal of Business Venturing, 22(4): 566–591.

Starkey, K., Hatchuel, A., & Tempest, S. 2004. Rethinking the business school. Journal of Management Studies, 41(8): 1521–1532.

Suddaby, R., & Greenwood, R. 2001. Colonizing knowledge: Commodification as a dynamic of jurisdictional expansion in professional service firms. Human Relations, 54: 933–953.

Szulanski, G. 1996. Exploring internal stickiness: Impediments to the transfer of best practice within the firm. Strategic Management Journal, 17: 27–43.

Van Burg, E., Romme, G. L., Gilsing, V. A., & Reymen, I. M. M. J. 2008. Creating university spin-offs: A science-based design perspective. Journal of Product Innovation Management, 25: 114–128.

Wright, M., Piva, E., Mosey, S., & Lockett, A. 2009. Academic entrepreneurship and the role of business schools. Journal of Technology Transfer.

Phillip Phan is professor and vice dean for Faculty and Research at The Johns Hopkins University Carey Business School. Between 2000 and 2007, he was the Warren H. Bruggeman '46 and Pauline Urban Bruggeman Distinguished Professor of Management at Rensselaer Polytechnic Institute. Phil is associate editor for the Journal of Business Venturing, the Journal of Financial Stability, and the Journal of Technology Transfer. His most recent books are Theoretical Advances in Family Enterprise Research (InfoAge Press); Entrepreneurship and Economic Development in Emerging Regions (Edward Elgar); and Taking Back the Boardroom: Thriving as a Director in the 21st Century (Imperial College Press).

Donald Siegel is dean of the School of Business and professor of management at the University at Albany, SUNY. Don is editor of the Journal of Technology Transfer, and associate editor of the Journal of Business Venturing, Journal of Productivity Analysis, and Academy of Management Learning & Education.
His most recent books are Innovation, Entrepreneurship, and Technological Change (Oxford University Press) and the Handbook of Corporate Social Responsibility (Oxford University Press). He has received grants or fellowships from the Sloan Foundation, National Science Foundation, NBER, American Statistical Association, W. E. Upjohn Institute for Employment Research, and the U.S. Department of Labor. Professor Siegel is a member of the Advisory Committee to the Secretary of Commerce on "Measuring Innovation in the 21st Century Economy."

Mike Wright has been professor of financial studies at Nottingham University Business School since 1989 and director of the Centre for Management Buy-out Research since 1986. He has written over 25 books and more than 250 papers in academic and professional journals on management buy-outs, venture capital, habitual entrepreneurs, corporate governance, and related topics. He served two terms as an editor of Entrepreneurship Theory and Practice (1994–1999) and is currently a consulting editor of the Journal of Management Studies and an associate editor of Strategic Entrepreneurship Journal. Mike is also program chair of the Academy of Management Entrepreneurship Division. His latest books include Academic Entrepreneurship in Europe and Private Equity and Management Buyouts.