Archive for the ‘Gaming’ Category

I just read a fascinating article by Nick Paumgarten in the December 20/27 double issue of The New Yorker magazine about Shigeru Miyamoto, widely considered the world's foremost video game designer and producer. He has worked for Nintendo since 1977 and for no one else. In addition to creating The Legend of Zelda (named for Zelda Fitzgerald, wife of the novelist F. Scott Fitzgerald) and its various offshoots, he also designed Donkey Kong and the Super Mario Bros. series, and led a team in the creation of the Wii gaming system.

Shigeru Miyamoto

Miyamoto is an unassuming 58-year-old with a feel for what gamers of all ages want. In fact, a recent poll of game developers asking who among them was most highly regarded had Miyamoto winning in a landslide. Will Wright, the creator of The Sims and Spore, who came in third in that same poll, praised Miyamoto effusively:

He approaches the games playfully, which seems kind of obvious, but most people don’t. And he approaches things from the players’ point of view, which is part of his magic.


Read Full Post »

Having secured my own iPad the Saturday before Easter, I thought it would be worthwhile to attend a gathering at Boston's Vilna Shul synagogue on April 7th to discuss high-tech platforms and applications: how they shape the way information is delivered to the end user, and how they have influenced our culture generally. The iPad, and its potential for demonstrating the importance of platforms and applications, may be in its infancy, but it already has software engineers and designers excited about this new vehicle for displaying their work. I recall Harlan Anderson mentioning in his memoir, Learn, Earn & Return, that the PDP-1 computer Digital Equipment Corp. created required the active participation of software designers to realize its enormous potential, but that there was much friction between hardware and software engineers. Nowadays, the two camps need each other more than ever, which has necessitated close interaction.

As the introduction on Vilna Shul’s website described the talk, “There was a time when the most complicated platform decision faced by a developer was PC or Mac, and sometimes choosing a flavor of Unix. Now in the age of mobile devices and set-top boxes, a platform is not just a development tool but can serve as a channel for delivering product to the market … The arrival of a new platform, such as the much-anticipated iPad, creates new opportunities, but what factors determine the rate of adoption and the return on investment in building for that platform?”

Serving on the panel to discuss these and other topics were Ravi Mehta, Vice President of Products, Viximo; Brad Rosen, C.E.O., Drync; and Michael Troiano, President, Holland-Mark Digital. Christopher Herot, Chief Product Officer, VSee Lab, moderated, and software designer Doug Levin, as usual, ably introduced and coordinated the event.

The first part of the discussion centered on choosing a platform and what makes one better than another. Mike Troiano, who received his MBA from Harvard Business School and writes a high-tech blog called Miketrap, said that a platform is the foundation on which third parties, and ultimately consumers, can participate. Ravi Mehta, who received his business degree from MIT's Sloan School, added that applications are meant to interact with the hardware: the platform traditionally served as more of a "gateway," but is now a key delivery system for applications. Brad Rosen, also a Sloan School alumnus, whose Drync is an iPhone application for wine enthusiasts, added that he has found the iPhone platform able to deliver a targeted message or product to a specific audience effectively.

Ravi then mentioned that Facebook has taken the initiative in designing a platform that gives developers free rein to create applications that seek out target audiences. Mike added that third-party applications on Facebook have become a business strategy all their own; many in high tech expected the area to be a growth industry, which it has proven to be, creating boundless opportunities for software developers. Brad added that games have in many ways driven this growth. Mike also noted that platform and application makers benefit mutually from collaboration, as opposed to going it alone and dividing up the value.

Brad, whose previous experience has been with the iPhone, and who will now be designing for Android as well, said that the "monetizing" model for both devices is still fluid, but that the goals of platform and application designers are closely aligned, even if the model isn't yet tuned to make developers money. Apple, it was mentioned, seems more conscientious than others, returning 70% of app revenue to developers and keeping a 30% cut for itself. An interesting fact Brad added was that Apple now allegedly has 700 million credit card numbers on file, which makes even the smallest "micro-transaction" a lucrative one.

The next question from Chris Herot to the panel was: when is a platform NOT a platform? Mike observed that this is the case when participation is controlled by a central body rather than growing organically. As mentioned previously, Facebook's success is due in large measure to its spontaneity, unfiltered by corporate or other gatekeepers. Ravi suggested that Twitter, though a successful platform for social interaction, does not present as many opportunities for software applications. Regarding the iPhone, Ravi added that people bought it for its "killer" applications, though the process by which applications became influential was completely spontaneous: Apple didn't originally push applications for the device, but the more they came to dominate the whole mindset of the iPhone, the more opportunities presented themselves for the company and for software engineers. Brad then commented that the quality of an application drives interest, and consequently monetization.

Ravi and the rest of the panel then discussed how Steve Jobs's visionary approach to platforms has produced an inexorable shift in consumers' behavior and in their very interaction with computers. On that note, Chris Herot asked who had recently acquired an iPad, and also asked which platform each panelist would hypothetically design for. Brad said he enjoyed developing for the iPhone and Android, though he added that many designers (particularly in gaming) are moving from the iPhone to the iPad because of its sheer screen size.

In summary, I learned that applications tend to propagate exponentially when their platform is judged a successful vehicle for their display and growth. Apple, in designing the iPhone, may not have conceived of it as the cultural phenomenon it has become as the platform for hundreds of thousands of creative applications, but through the organic interaction between consumers and engineers, the device has become iconic. Where the iPad goes from here follows that same template, and with a larger, high-definition screen, it's not inconceivable that this device is headed for even greater heights.

Read Full Post »

The New York Times reported yesterday (Dec. 7th) that there was a reunion last month of colleagues who pioneered the Stanford Artificial Intelligence Laboratory. They met over two days at the William Gates Computer Center on the Stanford campus.

According to the article’s author, John Markoff, there were other pioneering labs at Stanford, but the A.I. lab received less recognition than its peers:

“One laboratory, Douglas Engelbart’s Augmentation Research Center, became known for the mouse; a second, Xerox’s Palo Alto Research Center, developed the Alto, the first modern personal computer. But the third, the Stanford Artificial Intelligence Laboratory, or SAIL, run by the computer scientist John McCarthy, gained less recognition.”

SAIL was founded by Dr. John McCarthy (who coined the term "artificial intelligence") in 1963, with Les Earnest as its deputy director. McCarthy's initial proposal to the Pentagon's Advanced Research Projects Agency envisioned that building a thinking machine would take about a decade. In 1966, the laboratory took up residence in the foothills of the Santa Cruz Mountains behind Stanford, in an unfinished corporate research facility originally intended for a telecommunications firm.

Markoff continues, “SAIL researchers embarked on an extraordinarily rich set of technical and scientific challenges that are still on the frontiers of computer science, including machine vision and robotic manipulation, as well as language and navigation.”

These alumni distinguished themselves in other innovative ways, with artificial intelligence at the heart of their experimentation. As Markoff notes, "… Raj Reddy and Hans Moravec went on to pioneer speech recognition and robotics at Carnegie Mellon University. Alan Kay brought his Dynabook portable computer concept first to Xerox PARC and later to Apple. Larry Tesler developed the philosophy of simplicity in computer interfaces that would come to define the look and functioning of the screens of modern Apple computers — what is called the graphical user interface, or G.U.I."

John Chowning, a musicologist, referred to SAIL as a "Socratean abode." He was invited to use the laboratory's mainframe late at night when demand was light, and his group went on to pioneer FM synthesis, a technique for creating sounds that transforms the quality, or timbre, of a simple waveform into a more complex sound. (Dr. Chowning discovered the technique at Stanford in 1973, and it was later licensed to Yamaha.)
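The core idea behind FM synthesis is compact enough to sketch in a few lines: one oscillator (the modulator) rapidly varies the phase of another (the carrier), and a "modulation index" controls how complex the resulting timbre is. The snippet below is a minimal illustration of that principle, not Chowning's actual implementation; the frequencies and index values are arbitrary choices for the example.

```python
import math

def fm_sample(t, carrier_hz=220.0, mod_hz=110.0, index=2.0):
    """One sample of a simple FM voice at time t (seconds).

    The modulator sine varies the carrier's phase; the modulation
    index scales that variation, adding sidebands that turn a pure
    tone into a richer, more complex timbre.
    """
    return math.sin(2 * math.pi * carrier_hz * t
                    + index * math.sin(2 * math.pi * mod_hz * t))

# Render one second of audio at an (illustrative) 8 kHz sample rate,
# as floats in [-1, 1] ready to be scaled for a sound card.
rate = 8000
samples = [fm_sample(n / rate) for n in range(rate)]
```

With `index=0` the output is a plain sine wave; raising the index is what moves the sound from a flute-like purity toward brassier, bell-like timbres.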

As has been noted previously in "High Tech History," Spacewar, programmed on a Digital Equipment Corp. PDP-1, was in essence the first video game. At Stanford, Bill Pitts, a protégé of SAIL's Don Knuth (who wrote the definitive texts on computer programming), "… took a version of the Spacewar computer game and turned it into the first coin-operated video game — which was installed in the university's student coffee house — months before Nolan Bushnell did the same with Atari."

In 1980, the lab merged with Stanford's computer science department; it reopened in 2004 and is now enjoying something of a rebirth. Markoff concludes,

“The reunion also gave a hint of what is to come. During an afternoon symposium at the reunion, several of the current SAIL researchers showed a startling video called ‘Chaos,’ taken from the Stanford Autonomous Helicopter project. An exercise in machine learning, the video shows a model helicopter making a remarkable series of maneuvers that would not be possible by a human pilot. The demonstration is particularly striking because the pilot system first learned from a human pilot and then was able to extend those skills.

“But an artificial intelligence? It is still an open question. In 1978, Dr. McCarthy wrote, ‘human-level A.I. might require 1.7 Einsteins, 2 Maxwells, 5 Faradays and .3 Manhattan Projects.’”

Reunion of the S.A.I.L. Laboratory at Stanford University last month

Read Full Post »

I was speaking with a friend last night about an early computer game I played at Hartwick College's computer center around 1981. I had always known it as "Adventure," but knew very little about its origins or background.

Well, after "trolling" the internet for an hour or so this morning, I found a few references to the programmer who created it (William Crowther, then employed by Bolt, Beranek and Newman in Boston) and to the mainframe it ran on (Digital Equipment Corporation's PDP-10). It was a revelation. I played the game for hours, time that probably could have been spent more productively at study.

As I subsequently learned, Crowther was a caving expert who wrote the program, which accepted one- and two-word commands, to navigate through a cave while avoiding being killed by dragons or shaken down by trolls lurking under bridges. Crowther developed the game in 1975 and first released it under the name "Colossal Cave Adventure" on the ARPAnet, predecessor to the Internet, which Bolt, Beranek and Newman had helped develop.
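A one- and two-word command scheme like Crowther's can be captured by a simple verb-noun parser. The sketch below shows the general idea with a toy vocabulary invented for illustration; it is not Crowther's actual code or word list (the original was written in FORTRAN on the PDP-10).

```python
# A toy verb-noun parser in the spirit of Adventure's one- and
# two-word commands. Vocabulary here is invented for illustration.
VERBS = {"go", "take", "drop", "look"}
NOUNS = {"north", "south", "lamp", "keys"}

def parse(command):
    """Split a command into a (verb, noun) pair.

    One-word commands like 'look' yield (verb, None); anything
    outside the known vocabulary yields None (unrecognized).
    """
    words = command.lower().split()
    if not words or words[0] not in VERBS:
        return None
    verb = words[0]
    noun = words[1] if len(words) > 1 else None
    if noun is not None and noun not in NOUNS:
        return None
    return (verb, noun)
```

So `parse("GO NORTH")` yields `("go", "north")`, while nonsense input is simply rejected, which is roughly the experience early players had when the game replied that it didn't understand them.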

It proved immensely popular, and eventually it spread throughout the college computing world.

Scott Adams, another programmer, was a huge fan of the game and one of the few to achieve "Grand Master" status by scoring a perfect 350 points on Crowther's version. In 1978 he started a company, Adventure International, that sought to bring the game to a wider audience, including the new personal-computer market. It closed down in 1985, having lost market share in the gaming industry, presumably because its graphics lagged behind those of other companies at the time.

Here’s a brief video of what “Adventure” looked like to the average “cave dweller.”

Read Full Post »

As it turns out, "Spacewar," which I've previously discussed, was not the only technologically groundbreaking interactive game on Digital Equipment Corporation's PDP-1 computer. Fifty years ago this year, MIT students also programmed an ancient Near Eastern game, "Kalah," for this computer. In its traditional form, the game requires moving stones from one's "pits," as illustrated above.

During the course of the game, stones are removed from each player's pits and cast into the kalahs (or goals), but are never removed from them. The game ends when one player's pits are entirely empty; the other player then casts the stones remaining in his pits into his kalah. At that point, the player with more stones in his kalah wins. The MIT students designed it very much like computer chess: you played against the computer itself.
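The endgame rule described above is simple enough to express directly in code. The sketch below models only that final sweep-and-score step, with each side's pits as a list of stone counts; it is an illustration of the rule, not a reconstruction of the MIT students' PDP-1 program.

```python
def kalah_final_score(pits_a, kalah_a, pits_b, kalah_b):
    """Apply Kalah's endgame rule and return the final kalah totals.

    When one player's pits are all empty, the game ends and the
    other player sweeps the stones remaining in their own pits into
    their kalah. The player with the higher kalah total wins.
    """
    if all(n == 0 for n in pits_a):
        kalah_b += sum(pits_b)     # B sweeps remaining stones
    elif all(n == 0 for n in pits_b):
        kalah_a += sum(pits_a)     # A sweeps remaining stones
    return kalah_a, kalah_b
```

For example, if player A's pits are empty while B still holds 7 stones across his pits, B's kalah grows by 7 before the totals are compared.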

Not long after the game was developed in 1959, it was played remotely on DEC's PDP system: one player, at a terminal in California, played against another sitting at a similar terminal at DEC's headquarters in Maynard. This was the first time an interactive computer game had been played remotely, and the game remained popular for many years afterward. Copies of Digital's original notes on the game are preserved in the archive of the Computer History Museum.

Illustration credit: The Computer History Museum, Mountain View, California.

— Christopher Hartman

Read Full Post »


Last night, while assembling our new Wii, I couldn't help but think for a few moments about some of the games I played in high school and college: Asteroids, Star Wars, Galaga, Tron, and so on. The advances in computer technology in the interim have certainly been extraordinary, but as I plugged in the Wii's cables and prepared to start up the game program, I felt the same exhilaration I'd experienced haunting my college town's video arcade, armed with a pocket full of quarters.

This prompted me to prowl my bookshelves and unearth my well-worn copy of Michael Rubin's wonderful book Droidmaker: George Lucas and the Digital Revolution, which details the origins and development of Lucasfilm's computer graphics division and its forays into film, games, medical graphics, and more. In the late 1970s and into the 1980s, Lucasfilm had a very active gaming division; in Droidmaker's chapter "A Hole in the Desert," Rubin describes in scrupulous detail the free-form gaming environment there and at other companies. For instance, there's a lengthy examination of Atari and the idiosyncrasies of both its founder, Nolan Bushnell, and the company's management style. I'll examine some of those in subsequent posts.

But in the course of re-reading the chapter, one passage in particular caught my attention:

“By 1981, the arcade videogame world was reaching a fever pitch — it was impossible for anyone to imagine it continuing, and yet it grew month after month.

“Atari released Tempest, the original vector display game, with color — a new feature. But all else was eclipsed by a game from Japan, Pac-Man. Distributed in the U.S. by Midway, the pinball giant, Pac-Man became a cultural phenomenon. The yellow character, originally called ‘Puck-Man’ from the Japanese paku-paku, meaning ‘to eat,’ not only dominated in arcades, it was soon found on lunchboxes, board games, and hundreds of other products.”

But cultural phenomenon or not, what impressed me above all else was how the success of video game companies came almost in spite of their "let it be" management styles. For example, Rubin quotes Manny Gerard, a Warner executive (Warner had acquired Atari by this time):

“From $75 million, to $432 million, to a billion-one … it’s very hard to manage growth that fast … especially when the culture you’re starting out with — and I’m only slightly exaggerating — is a bunch of guys smoking dope.”

Michael Rubin’s book is filled with many more highly insightful interview excerpts like this one. I intend to review his book here in the near future. If you’re interested in the development of computer graphics, and the brilliant men and women who were at its forefront, Rubin’s book is an extraordinary window into that world – and an indispensable resource.

— Christopher Hartman

Read Full Post »