Silicon City: Before Steve Jobs, New York was the center of computing – Metro US

As a custodian of the past, the New-York Historical Society’s Stephen Edidin would like to make one thing very clear about modern technology: Computers didn’t start with Steve Jobs.

Nov. 13-April 17
New-York Historical Society
170 Central Park West
$20

The dominance of Apple makes it difficult to escape its shadow, but that’s the goal of Silicon City, the Historical Society’s new exhibit about computing’s New York roots, opening this Friday and sponsored by AT&T. From the “Victorian Internet” of phonographs to IBM’s blockbuster 1964 World’s Fair exhibit that humanized machines, the digital age was set in motion and nurtured on the East Coast long before it retreated to the northern California valley.


“You learn history first and then you innovate, and we hope the show will help in that way,” Edidin says.

As for the future, IBM — which produced the first portable computer in 1975 — is back in the spotlight, having moved its corporate headquarters to 51 Astor Place last year and looking to return to the forefront of the field with its Watson artificial intelligence project. Edidin also points to NYC-based Etsy and the 3-D printing company Shapeways as the new phase of the city’s tech future.

Here’s a preview of the history of computing, BSJ — Before Steve Jobs.

Computing gets personal: The exhibit covers a timeline from the late 1800s to the 1980s, but it doesn’t begin at the beginning. As essential as technology has become — running our lives, connecting us with friends, even standing in for a friend in lines and on long commutes — computers were business and military machines in the popular imagination until the 1964 New York World’s Fair. IBM, which was founded in Endicott, New York, and by then held a monopoly on calculating machines on a scale no other company may ever match, built an egg-shaped theater as the jewel of its pavilion at the fair. Inside, visitors got a 12-minute lesson on how computers and the human brain solve problems in similar ways, stoking popular interest in technology.

The eureka moment: The innovation that would allow the computer to eventually emerge was, of all things, the light bulb. Thomas Edison, known as the Wizard of Menlo Park (which was in New Jersey, but even Andy Murray is English when he’s winning), discovered the effect that would lead to the vacuum tube in 1904; tubes were integral to everything from radio to TVs, telephones and digital computers. They were eventually abandoned for the more stable transistor, which also allowed machines to get smaller — think Walkmen and, eventually, the smartphone.

Before machines could remember: Tech was not always a boys’ club, and women’s contributions get special attention in Silicon City. Ask to see a “computer” before 1960 and you were likely to be introduced to a person — chiefly, a woman. Before computers could do more than process data, a legion of women not only turned the levers but served as a sort of living RAM. In 1935, IBM’s first class of system service professionals was rows and rows of women. In 1954, Grace Hopper invented a programming language that remains in use today.