Posts Tagged ‘Computers’

On January 22, 1984, the famous “1984” television commercial introducing the Macintosh personal computer ran during the third quarter of the Super Bowl.  Many people think that this is the only time it ever aired.  But it was also run by Chiat/Day, the ad agency that created it, on December 31, 1983, right before the midnight sign-off on KMVT in Twin Falls, Idaho, so that it could qualify for the 1983 advertising awards.  The ad was so successful that it never really needed to run again; the media coverage it received generated a lot of free airtime.  And people are still talking about it 30 years later.

The ad is based on George Orwell’s novel Nineteen Eighty-Four, which introduced the concept of “Big Brother”.  The ad casts IBM as “Big Brother” and the Apple Macintosh as the lone individual challenging a society of people who don’t behave as individuals.  Interestingly, the estate of George Orwell and the television rights holder to the novel considered the commercial a copyright infringement and sent a cease-and-desist letter to Apple and Chiat/Day after the ad ran, which generated even more publicity.

Here’s director Ridley Scott discussing the making of the famous “1984” Macintosh commercial.  [This is excerpted from an Apple promotional video.]

The “1984” ad was shown at the 20th anniversary celebration of the Macintosh in 2004.  There was also an updated version of it created for the iPod launch.  Was it one of the best ads ever?  That’s up for debate.  But as a marketer, I’d give it an award for one of the top 10 product launches ever.

— Carole Gunst

Read Full Post »

 

The National Institute of Standards and Technology (NIST) is one of the nation’s oldest physical science laboratories.  The United States Congress established the agency in 1901 as the National Bureau of Standards (NBS) because, at the time, the U.S. had a second-rate measurement infrastructure that lagged behind the capabilities of other countries.  For some reason, the word “national” was dropped from the name in 1903 and added back in 1934.  In 1988, the agency’s name became the National Institute of Standards and Technology, or NIST.

NIST and High Tech History

According to the NIST website, “Before air conditioning, airplanes, and plastics were invented, and before science was changed forever by Albert Einstein’s special theory of relativity, the National Institute of Standards and Technology (NIST) began laying the technical foundation for the world’s most prosperous nation.  At that time, the United States had few, if any, authoritative national standards for any quantities or products.  It was difficult for Americans to conduct fair transactions or get parts to fit together properly. Construction materials were of uneven quality, and household products were unreliable. Few Americans worked as scientists, because most scientific work was based overseas.”

When World War II began, science and technology rose in importance, and so did NIST, which was drawn into the new field of electronics.  NIST weapons research led to a contractor’s development of printed circuits, which substituted printed wiring, resistors, and coils for the conventional discrete components in electronic devices. This technology contributed to a new field of electronic miniaturization for which the Institute provided useful engineering data and components.

An automated electronic computing project was established at NIST in 1946, about the time that the Electronic Numerical Integrator and Computer (ENIAC), the first all-purpose electronic computer, began operating at the University of Pennsylvania. In 1948, the Air Force financed NIST to design and construct the Standards Eastern Automatic Computer (SEAC).  The computer went into operation in May 1950, using a combination of vacuum tubes and solid-state diode logic.

About the same time, the Standards Western Automatic Computer (SWAC) was built at the Los Angeles office of NIST and was used for research there.  In 1954, a mobile version of SEAC called DYSEAC (it was actually housed in a truck, and might just be the first portable computer) went into operation.  NIST staff members also developed a mathematical algorithm for solving very large systems of linear equations that, nearly 50 years later, would be named one of the top 10 algorithms of the century by a computing trade journal.
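That algorithm is generally identified as the conjugate gradient method of Magnus Hestenes and Eduard Stiefel, developed at the Bureau’s Institute for Numerical Analysis.  As a rough sketch of the idea only, here is a minimal C implementation solving a small symmetric positive-definite system; the 3×3 example data is invented for illustration, and the NBS-era work of course targeted far larger systems:

#include <math.h>
#include <stdio.h>

#define N 3

/* Dot product of two length-N vectors. */
static double dot(const double *u, const double *v) {
    double s = 0.0;
    for (int i = 0; i < N; i++) s += u[i] * v[i];
    return s;
}

/* y = A * x for an N x N matrix. */
static void matvec(const double A[N][N], const double *x, double *y) {
    for (int i = 0; i < N; i++) {
        y[i] = 0.0;
        for (int j = 0; j < N; j++) y[i] += A[i][j] * x[j];
    }
}

int main(void) {
    /* Invented symmetric positive-definite example system A x = b. */
    double A[N][N] = { {4, 1, 0}, {1, 3, 1}, {0, 1, 2} };
    double b[N] = { 1, 2, 3 };
    double x[N] = { 0, 0, 0 };          /* initial guess */
    double r[N], p[N], Ap[N];

    /* r = b - A*x; initial search direction p = r. */
    matvec(A, x, r);
    for (int i = 0; i < N; i++) { r[i] = b[i] - r[i]; p[i] = r[i]; }
    double rr = dot(r, r);

    /* In exact arithmetic, CG converges in at most N steps. */
    for (int k = 0; k < N && sqrt(rr) > 1e-12; k++) {
        matvec(A, p, Ap);
        double alpha = rr / dot(p, Ap);     /* step length */
        for (int i = 0; i < N; i++) { x[i] += alpha * p[i]; r[i] -= alpha * Ap[i]; }
        double rr_new = dot(r, r);
        for (int i = 0; i < N; i++)         /* new conjugate search direction */
            p[i] = r[i] + (rr_new / rr) * p[i];
        rr = rr_new;
    }

    printf("x = (%g, %g, %g)\n", x[0], x[1], x[2]);
    return 0;
}

The method’s appeal for the very large systems the post describes is that it needs only matrix-vector products and converges, in exact arithmetic, in at most N iterations.  (Compile with, e.g., cc cg.c -lm.)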

NIST Today

Today, NIST is part of the U.S. Department of Commerce. Its official mission is “to promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.”

NIST is involved with a wide range of technology areas.

Interested in learning more?  NIST provides many educational activities and is open for tours if you’re in Gaithersburg, MD, or Boulder, CO.

— Carole Gunst

Read Full Post »

An original three-page contract establishing Apple Computer, dated April 1, 1976, will be auctioned at Sotheby’s New York on December 13. The documents, signed by Apple co-founders Steve Jobs, Steve Wozniak, and Ronald Wayne, also include Wayne’s addendum to the original contract, in which he relinquished his 10% interest in the company. Sotheby’s estimates the documents could fetch between $100,000 and $150,000.

When Wayne left the company, he received $800, plus another $1,500 at a later date. Walter Isaacson, who penned the authorized biography of Steve Jobs, suggested that the 10% stake Wayne sold in 1976 would be worth in the neighborhood of $2.6 billion today.

In an article in Crain’s New York Business, Richard Austin, head of books and manuscripts for Sotheby’s New York, stated that the consignor bought the documents in the mid-1990s from a manuscript dealer who had acquired them from Mr. Wayne. The consignor apparently believed that the untimely passing of Mr. Jobs, together with the appearance of Mr. Isaacson’s biography, made this an appropriate time to sell. (Illustration credit: engadget.com)

-Chris Hartman

Read Full Post »

Dennis M. Ritchie (standing) and Ken L. Thompson (seated), inventors of Unix, at Bell Labs in front of a DEC PDP-11 computer, ca. 1970. Courtesy, Computer History Museum.

Dennis M. Ritchie, who made two monumental and lasting contributions to computing, the C programming language and the Unix operating system, died last week, aged seventy. More than any of his other research achievements, these two innovations have had a remarkable and lasting impact on computer science and related disciplines, permanently establishing Ritchie’s prominence in the field. The Computer History Museum, in bestowing its Fellow Award upon Ritchie in 1997, asserted that “both … are foundations of our modern digital world”. For this work, Ritchie and research partner Ken Thompson received the ACM (Association for Computing Machinery) Turing Award and the United States National Medal of Technology, among many other honors. And this past May, Ritchie and Thompson were named Laureates in the category of Information and Communication upon receiving the Japan Prize, an annual award given by the Japan Prize Foundation:  “… to individuals whose original and outstanding achievements are not only scientifically impressive, but have also served to promote peace and prosperity for all mankind”.

The C programming language, a compact shorthand of words, numbers, and punctuation, is still widely used today, and its successor languages, like C++ and Java, borrow heavily from its syntax. Ritchie led the development of C and co-authored (with fellow researcher Brian Kernighan) the definitive early book on the language, The C Programming Language (1978). The book, and the version of C it documents, is still known simply as “K&R” after its authors; a classic in the history of computer science, it has sold millions of copies and been translated into twenty-five languages.
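Fittingly, the book opens with what has become the most famous first program in computing, “hello, world”. Here it is, lightly modernized from the 1978 original so it compiles cleanly under current C standards:

#include <stdio.h>

/* The opening example from K&R, with the explicit return type
   and parameter list that modern C requires. */
int main(void)
{
    printf("hello, world\n");
    return 0;
}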

And the development of Unix left an equally enduring legacy. Its free, “open-source” relative, Linux, is arguably the leading server operating system, and it runs the ten fastest supercomputers in the world. Linux’s open-source model revolutionized the computer industry because of the very nature of open-source code, which may be used, freely modified, and redistributed in both commercial and non-commercial settings. While at Bell Labs’ Computing Sciences Research Center, Ritchie and Thompson wrote and published the Unix Programmer’s Manual in November 1971; it can still be found on Ritchie’s own website.

Dennis M. Ritchie, 2011 recipient of the Japan Prize in the category of Information and Communication. Courtesy, Japan Foundation.

In the course of his Unix research at Bell Labs, Ritchie made extensive use of the Gordon Bell-designed Digital Equipment Corp. PDP-11 computer. In one instance, he demonstrated that Unix could run on more than one type of computer by porting it off the DEC machine. According to the Computer History Museum:

“The success of the Unix operating system is in large part due to its ability to run on a great variety of different types of computers with minimal changes. This was made possible when Unix was re-implemented in C early in its development. Prior to that, Unix, like most other operating systems, was written in the assembly language unique to each type of computer, requiring great effort. Ritchie demonstrated the flexibility that came with implementing Unix in C by porting it from the DEC PDP-11 minicomputer, on which Unix was running at the time, to an Interdata 8/32 computer”.

Later in his career at Bell Labs, Ritchie managed a group that created a Unix-like operating system called Plan 9 (after the Ed Wood “horror” film Plan Nine from Outer Space). Ritchie retired from Bell Labs in 2007.

-Chris Hartman

Read Full Post »

Courtesy, thehackernews.com

There was a time when “hackers” were seen as indispensable, if plodding and exacting, foot soldiers in the arcane world of computer programming. Certainly, many in their own ranks saw themselves that way. Their almost tunnel-visioned fascination with code, debugging, and programming bordered on the obsessive. A previous post I wrote here on Nathan Ensmenger’s book The Computer Boys Take Over included the opinion of one management consultant, Herbert Grosch (himself a former programmer), who referred to them as the “Cosa Nostra” of the computer industry for their ungovernable yet highly intellectual and analytical natures.

Grace Hopper, whom I’ve also written about here on High Tech History, was an early programmer (many of the earliest members of the profession were women) who was devoted to honorable goals. In her case, that meant helping to win World War II at Harvard’s computer lab under the leadership of Howard Aiken, work that proved invaluable to the U.S. naval effort in the field of ballistics. The idea of hacking for illicit or otherwise mischievous objectives would have been unthinkable at the time.

Now fast-forward fifty years, and you have the curious case of Kevin Mitnick, a brilliant yet devious programmer who almost single-handedly reversed the connotation of “hacker” from obscure yet positive to malicious, dangerous, and, at its worst, criminal. He’s now attempting to set the record straight in a new book, Ghost in the Wires, which he co-wrote with technology writer William L. Simon. Mitnick and Simon had collaborated on a previous book, The Art of Deception: Controlling the Human Element of Security (2003), which also has significant bearing on the current one. Mitnick offers several examples of breaching a company’s security through the unwitting assistance of its own personnel, a practice he euphemistically calls “social engineering.” As Mitnick himself claimed, “People, as I had learned at a very young age, are just too trusting.”

But what sets Mitnick apart from more diabolical “hackers” is that he never used the information he acquired for financial or other gain. He repeatedly asserts that he simply did what he did because he could; it was the challenge, rather than the information he ultimately gained access to, that drove him. This is a point of intersection between Mitnick and Apple co-founder Steve Wozniak, who in his youth likewise hacked the local phone company out of an intense curiosity about its switches and circuits. Called “phone phreaking,” this practice involved the manipulation of telephones and related infrastructure, as well as of telephone company employees themselves. Wozniak, who is friendly with Mitnick and has written introductions for both of his books, credits Mitnick with finally getting him, previously reclusive, out on the lecture circuit.

Kevin Mitnick's "Wanted" poster issued by U.S. Marshals, 1992. Flickr.com

Such relatively innocuous stunts eventually led to Mitnick pilfering proprietary code from companies like Sun Microsystems and Novell, as well as eavesdropping on the National Security Agency’s telephone calls. As authorities closed in on him, he went on the run, until he was caught in February 1995 and subsequently imprisoned. (He was released in 2000 and has since formed his own company, Mitnick Security Consulting, LLC, which advises businesses on computer security strategies.)

Mitnick also uses much of his book to debunk some of the more incredible rumors manufactured by authorities about the nefarious extent of his activities, such as his supposed ability to “whistle into a telephone and launch a nuclear missile from NORAD.” He also asserts that he ignored the credit card numbers and other financial information he routinely encountered in his pursuit of code, the hacker’s manna.

Kevin Mitnick. Courtesy, pocketberry.com

But all told, Mitnick, an equally brilliant and cheeky sort, relished invading intricate technology and bending both it and its human element to his will. As one savvy reviewer humorously noted in his appraisal of The Art of Deception: “After Mitnick’s first dozen examples [of security breaches], anyone responsible for organizational security is going to lose the will to live.” Mitnick’s chief defense, as he claims he told the former Wall Street inside trader Ivan Boesky when they were in prison together, was that “I didn’t do it for the money; I did it for the entertainment.” And the record appears to confirm this. For this and other reasons, Ghost in the Wires is a valuable book that computer enthusiasts and historians alike can enjoy, combining humor and insight as it delves into a comparatively innocent period of computer science, one that existed before hacking truly turned malicious and financially motivated.

-Chris Hartman

Read Full Post »

Jean-Claude Halgand, "Surf III," courtesy, Boston Globe

This year marks the fiftieth anniversary of the founding of what came to be known as the New Tendencies movement in computer art. As has been previously noted here at High Tech History, the earliest iterations of computers adopted a monolithic, emotionless, almost Bauhaus-ian severity that emphasized simplicity over complexity, function over form, and utility over creativity. But it would be short-sighted to believe that computers were not capable of great feats of artistry and even humanity.

With regard to the latter of those anthropomorphic attributes, and the powerful human responses they can engender, author and MIT professor Sherry Turkle noted in her recent book, Alone Together:

“My first brush with a computer program that offered companionship was in the mid-1970s. I was among MIT students using Joseph Weizenbaum’s ELIZA, a program that engaged in dialogue in the style of a psychotherapist … Weizenbaum’s students knew that the program did not know or understand; nevertheless, they wanted to chat with it. More than this, they wanted to be alone with it. They wanted to tell it their secrets.”
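ELIZA’s trick was disarmingly simple: scan the user’s input for keywords and answer with a canned or lightly transformed response. Here is a minimal keyword-matching sketch in C, offered purely as an illustration; the rules below are invented for the example, and Weizenbaum’s actual ELIZA used a far richer script of decomposition and reassembly patterns:

#include <ctype.h>
#include <stdio.h>
#include <string.h>

/* Invented example rules; the real ELIZA script was much larger. */
struct rule { const char *keyword; const char *response; };

static const struct rule rules[] = {
    { "mother",  "Tell me more about your family." },
    { "always",  "Can you think of a specific example?" },
    { "i am",    "How long have you been that way?" },
    { "because", "Is that the real reason?" },
    { NULL,      "Please go on." }   /* default when nothing matches */
};

int main(void) {
    char line[256];
    printf("ELIZA> How do you do. Please tell me your problem.\n");
    while (fgets(line, sizeof line, stdin)) {
        /* Lowercase the input so keyword matching is case-insensitive. */
        for (char *p = line; *p; p++) *p = (char)tolower((unsigned char)*p);
        /* Reply using the first rule whose keyword appears in the input. */
        const struct rule *r = rules;
        while (r->keyword && !strstr(line, r->keyword)) r++;
        printf("ELIZA> %s\n", r->response);
    }
    return 0;
}

Even a toy like this hints at why Weizenbaum’s students wanted to confide in the program: the pattern-matched replies feel attentive despite the machine understanding nothing.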

Computers were also capable of creating inventive and absorbing games, such as “Spacewar,” which MIT students devised on Digital Equipment Corporation’s PDP-1 computer. And in what was arguably the first instance of remote interactive gaming, the PDP-1 was used to play a game of “Kalah”: Harlan Anderson, the co-founder of Digital, operated a terminal in California and, through a primitive modem, played against his colleague Alan Kotok, who was seated at an identical computer in Maynard, Massachusetts, where Digital was based.

As in these cases, art was also an area of considerable interest for creatively inclined computer engineers. The so-called “New Tendencies” movement was a short but intense artistic experiment that took place in Yugoslavia fifty years ago but has been influential far beyond that time and place at the intersection of computers and art. Beginning with an exhibition mounted by Matko Mestrovic at the Museum of Contemporary Art in Zagreb, Yugoslavia, in 1961, the New Tendencies movement advocated strongly that the “thinking machine” be adopted as an artistic tool and medium. Pursuing the idea of “art as visual research,” the movement embraced the media of computer-generated graphics, film, and sculpture.

MIT Press' new book on the New Tendencies movement in computer art. Courtesy, MIT Press.

This pioneering work has now been strikingly displayed and chronicled in a new tome published by MIT Press: A Little-Known Story about a Movement, a Magazine, and the Computer’s Arrival in Art: New Tendencies and Bit International, 1961-1973, edited by Margit Rosen. The book includes new essays by Jerko Denegri, Darko Fritz, Margit Rosen, and Peter Weibel; many texts first published in New Tendencies exhibition catalogs and Bit International magazine; and historic documents. With more than 650 black-and-white and color illustrations, the book offers testimony to both the exhibited artworks and the movement’s protagonists, and many of the historic photographs, translations, and documents are published here for the first time. Bit International magazine, the chief chronicler of the phenomenon, drew contributions from computer enthusiasts in the farthest reaches of the western and eastern hemispheres. And after only a few years, images from New Tendencies began to find their way into landmark exhibitions at museums such as the Louvre and the Museum of Modern Art in New York City.

Dushko Petrovich. Courtesy, GregCookLand.com

Though computers are commonplace nowadays, when this movement began in 1961 they were found mostly in university, corporate, and military domains, so for such an innovative and seemingly incongruous use of computer technology to arise was a monumental achievement by any measure. The power of these machines to evoke emotional and other deeply human responses through artistic expression is compelling, wondrous, and dramatic. Writing in the Boston Globe, Dushko Petrovich, a painter and critic who teaches at Boston University, notes: “Peering into the age before computers is already tricky enough, but the New Tendencies art shows us something more disorienting: a time when the computer offered total respite from the political, the commercial, the social, and the everyday.” And MIT Press concludes of its publication on New Tendencies, “Taken together, the images and texts offer the long overdue history of the New Tendencies experiment and its impact on the art of the twentieth century.”

-Chris Hartman

Read Full Post »