Best Demos Ever

Just this week a chip that I’ve been working on for the last couple of years finally came alive.   My colleagues put it on a board, loaded up the software, and it ran!   This was what the scene looked like:

[From “Young Frankenstein”.  It was so sad to lose Gene Wilder this week.]

Well, something like that.  It did flash some LEDs and print out some status messages as we huddled around it in a cubicle.  That meant that tens of millions of transistors had to work correctly, as did tens of thousands of lines of code.  It’s amazing that this stuff works at all, much less in the first days after power-up, but I have to admit that it wasn’t that spectacular.

So that got me thinking about demos that really did impress.  What were some demos that really blew people’s socks off?  Here are a few, from oldest to newest:

First Display of the Telephone

Bell model of 1876

Alexander Graham Bell achieved the first transmission of speech in March 1876, but couldn’t interest anyone in it.  He offered it to Western Union, who thought it was a toy.  But his fiancée, Mabel Hubbard, knew what to do.  In June 1876 she secretly bought him a train ticket from Boston to the Centennial Exhibition in Philadelphia.   This was a massive world’s fair, showcasing the latest and greatest in technology.   The typewriter!  The mechanical calculator!  A slice of cable from the Brooklyn Bridge and the arm of the Statue of Liberty!

Mabel packed his bag and took him to the train station without telling him where he was going.  When she handed him the ticket, he protested, but she had already turned away.  Since she was completely deaf (they had met when he was teaching at a Boston school for the deaf), his complaints were of no avail.

He went to Philadelphia and stretched a wire from one end of the Education Hall to the other.  He again got no interest, until the last day of judging.   The emperor of Brazil, Dom Pedro, was on the awards committee and recognized Bell from a tour of schools for the deaf.  He asked Bell what he was doing at this machinery exhibition.  Bell handed him a receiver and then went to the other end of the hall.  He recited Hamlet into the transmitter.  “My God, it speaks!” exclaimed the emperor.   That’s the reaction you want from demo visitors!

The telephone became the hit of the Exhibition, and Bell was on his way.  Thirteen years later he retired to his own supervillain lair, Beinn Bhreagh in Cape Breton, Nova Scotia, to continue his astounding (but sadly not fiendish) experiments.

First Serious Light Bulb

Recreation of Edison’s Lab in Menlo Park

On New Year’s Eve 1879, Thomas Edison invited the public and press to Menlo Park, New Jersey to see his newly developed light bulb.   He and his crew had gotten it working three months earlier, but had only shown it to his investors and select journalists.   This was the big reveal.  He had 25 bulbs strung around the lab buildings and the street outside.  A huge crowd came out in the cold and snow.  Inside the lab itself, he showed a bulb burning while dunked in a glass jar of water.  You can’t do that with a candle, or a kerosene lantern, or a gas mantle, or a carbon arc lamp, or any other form of lighting in the previous history of the world.

The story was, well, electrifying.  It was on front pages around the world.  It was such an iconic moment, such a dividing line in history, that it has provoked a prolonged backlash against Edison.  People say that he wasn’t actually first, or that he stole other people’s ideas, or that he succeeded by brute-force trial and error.  Yet his bulbs were obviously better than others: they lasted longer because of better vacuums, were cheaper to install because their high-resistance filaments ran at a higher voltage and lower current and so needed less copper, and were safer because he also invented fuses, switches, and circuit breakers.   He didn’t invent just a glowing wire; he created the entire system that made incandescent lighting practical.

His pride in it ultimately led to his downfall.  He got so wrapped up in defending his patents, particularly against George Westinghouse, that he got kicked out of his own company.  J. P. Morgan had invested in him to create the Edison General Electric company, but was disgusted by his distraction during the patent wars.  He replaced Edison as CEO with one Charles Coffin, who immediately renamed it General Electric, as it’s known to this day.

That story is nicely told in a new roman à clef, “The Last Days of Night” by Graham Moore.


It follows Westinghouse’s young lawyer, Paul Cravath, in his attempt to fend off the wily and sinister Edison in the late 1880s.  It’s a ripping yarn, but kind of hard on the guy who genuinely was the Wizard of Menlo Park.

The Mother of All Demos

In 1968 Douglas Engelbart presented a demo at an ACM meeting that prefigured most of computing for the next fifty years.  At that time computing was done with punch cards fed into big central machines.   Engelbart showed:

  • A graphical user interface
  • Controlled by a hand pointer that he called a mouse
  • That connected one computer to another
  • With links that you could click on
  • Displaying text that you could edit
  • With files that recorded changes for undoing mistakes

This video is long, but shows more innovation per minute than anything I’ve ever seen.   I’m typing on the consequence of this work right now.   The ultimate value of this is literally in the trillions of dollars.

Steve Jobs Sees Smalltalk on the Xerox Alto

Xerox Alto workstation, 1973

In 1979 Apple was already working on a big upgrade from the Apple II – the Lisa.  This would have a mouse, windows, and a graphics display.  Jobs knew that Xerox was way ahead of them in this style of interface.  They had built a machine like this, the Alto, years earlier.   He wanted to get Xerox’s tech, and get them to invest.   The execs at Xerox PARC knew that they could never build a machine as cheaply as the small, fast PC companies could, and hoped to collaborate.  They ordered the engineers to show the Apple crew everything.

Adele Goldberg, one of the creators of the Smalltalk programming environment, was furious about this.   Xerox was giving away the store before it even knew what it had.   She had to be directly ordered to give the demo.  She and a co-creator, Daniel Ingalls, showed off the huge range of things Smalltalk could do: overlapping windows, education applications, an animation editor.  Michael Hiltzik describes what happened next in his book “Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age” (1999):

At one point Jobs, watching some text scroll up the screen line by line in its normal fashion, remarked, “It would be nice if it moved smoothly, pixel by pixel, like paper.”

With Ingalls at the keyboard, that was like asking a New Orleans Jazz band to play “Limehouse Blues”. He clicked the mouse on a window displaying several lines of Smalltalk code, made a minor edit, and returned to the text.  Presto!  The scrolling was now continuous.

The Apple engineers’ eyes bulged in astonishment.  In any other system the programmer would have had to rewrite the code and recompile a huge block of the program, maybe even all of it.  The process might take hours.  Thanks to its object-oriented modularity, in Smalltalk such a modest change never required the recompiling of more than ten or twenty lines, which could be done in a second or two.

Jobs returned to Apple in a fury.   He had finally seen a system that would let his visions become instant reality.  The plans for Lisa were torn up, and the Macintosh was born.  Xerox got nothing out of it, except an eternal reputation as suckers.  Oddly, Jobs missed an even more important aspect of the Alto – that hundreds of them were wired together with a novel system called Ethernet.  Maybe that was because he cared mainly for the look of things, and not how they worked under the hood.

But it’s that under the hood stuff that my current project is demonstrating.  It won’t make emperors exclaim, or be in headlines around the world, or set the course of IT, or be the founding innovation of the world’s most valuable company, but we’re pretty proud of it.

Posted in Uncategorized | Leave a comment

Airborne LIDAR: Cool or Creepy?

The Guardian recently published a fascinating story, Revealed: Cambodia’s vast medieval cities hidden beneath the jungle, about the discovery of huge city complexes near Angkor Wat.   The temples of Angkor are built of stone, but these structures were roadways and wooden walls, and so were far more perishable.  The Khmer Empire collapsed around 1300 CE, so they’ve been lost under vegetation for some 700 years.   It’s impossible to see them while traipsing through the jungle, but a remarkable new technology, airborne lidar (Light Detection and Ranging), can see them from the air:


Archaeologically-significant topography around the central brick towers of Sambor Prei Kuk. Courtesy McElhanney. Click for site

The paper describing the work is here: “Airborne laser scanning as a method for exploring long-term socio-ecological dynamics in Cambodia”.  It says this is the biggest lidar survey ever for archaeology.

Airborne lidar was developed in the late 90s.  It works by mounting a laser and a highly sensitive infrared photodetector on a helicopter.   The laser shoots out 120,000 pulses per second, and the sensor times how long they take to echo off the ground.  The helicopter flies at about 800 m, so the pulses take about 5 microseconds to travel down and back, which is what limits the pulse rate.  The laser is scanned across the landscape by a mirror cycling at 200 Hz, and you end up getting at least 16 pulses per m².  The timing is accurate enough to distinguish delays down to about 15 cm of height resolution.  The laser’s position is tracked by GPS supplemented with inertial sensors. The result is a 3D map of a landscape.
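Those timing figures hang together nicely; a quick back-of-the-envelope check, using only the numbers above and the speed of light:

```python
# Sanity-check the lidar timing figures quoted in the text.
C = 3.0e8             # speed of light, m/s
altitude = 800.0      # helicopter altitude, m
pulse_rate = 120_000  # pulses per second

round_trip = 2 * altitude / C   # time for a pulse to go down and back
pulse_gap = 1 / pulse_rate      # time between successive pulses

# The round trip (~5.3 us) must fit inside the inter-pulse gap (~8.3 us),
# which is exactly why the altitude limits the pulse rate.
assert round_trip < pulse_gap

# A 15 cm height difference changes the round trip by only ~1 ns,
# so the electronics must time echoes to about a nanosecond.
dt_15cm = 2 * 0.15 / C

print(f"round trip: {round_trip * 1e6:.1f} us")
print(f"pulse gap:  {pulse_gap * 1e6:.1f} us")
print(f"timing resolution needed: {dt_15cm * 1e9:.1f} ns")
```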

Most of the pulses will bounce off the tops of trees, which aren’t that interesting.   Some fraction, though, will get through the leaves and bounce off the ground.  This lets one see the actual ground level through all the shrouding jungle.  They did the scan in March 2015, when the leaf cover was at a minimum. Here’s a cross-section of one strip of their data:

Pre-Angkorian towers of Sambor Prei Kuk among the trees. Courtesy McElhanney

By filtering the pulses by height, one can find the ground elevation over square kilometers at a time.   This actually takes a lot of processing, since huge amounts of data get collected, so the systems come with GPUs.  The result is carefully compared to ground surveys, and to visible-light images taken at the same time as the scan.   Angkor Wat itself is the most important archaeological site in Southeast Asia, and so has been heavily surveyed, so it makes a good reference.
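As a toy illustration of the height-filtering idea (a sketch only, not the survey’s actual processing pipeline), one can grid the returns and keep the lowest hit in each cell as the ground estimate:

```python
# Minimal sketch of separating ground returns from canopy returns:
# grid the point cloud and keep the lowest return per grid cell.
def ground_grid(points, cell=1.0):
    """points: iterable of (x, y, z) lidar returns in metres.
    Returns {(i, j): lowest z seen} for each cell x cell square."""
    lowest = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        if key not in lowest or z < lowest[key]:
            lowest[key] = z
    return lowest

# Toy data: canopy hits near 30 m, a few ground hits near 0 m.
pts = [(0.2, 0.3, 31.0), (0.7, 0.1, 29.5), (0.5, 0.5, 0.4),
       (1.2, 0.4, 30.2), (1.8, 0.9, 0.1)]
print(ground_grid(pts))  # {(0, 0): 0.4, (1, 0): 0.1}
```

A real pipeline does far more (classifying returns, rejecting outliers, interpolating cells with no ground hits), but this is the core trick that lets the terrain emerge from under the trees.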

The project is called the Cambodian Archaeological Lidar Initiative, and is a joint project between institutes in Cambodia and France.  It’s led by Dr. Damian Evans of the University of Sydney.   They did an earlier, smaller run in 2012. The survey itself was done by a Canadian company, McElhanney, which has done this kind of geographic information work all over the world since 1910, largely for forestry and mining interests.  The lidar is a Leica ALS70 (now replaced by the ALS80, which has a faster peak pulse rate).   It’s generally used for city surveys.

Well, finding lost cities is pretty cool.  How can this be creepy?  Hmmm, what else is hidden under forest canopies?   Drug operations.  Whether it’s marijuana farms in the Pacific Northwest or submarine construction in Colombia:

30 m sub seized by the Colombian army in 2011

It won’t be possible to hide things on the land any more.  Not only can lidar see through trees, it can detect a camouflaged tank from its shape.  Operations like the above will have to be done indoors or underground.

And airborne lidar is getting easier to do all the time.  A Lincoln Labs spinoff in Massachusetts, 3DEO, has developed a much more sophisticated sensor than this Leica one using DARPA money.   It can detect a single photon of infrared light using a circuit called an avalanche photodiode, and measure the return time down to a nanosecond.  It can do this for 4K pixels at once, and so can scan at a much higher rate than the Leica.

The size and power of the systems are also constantly improving.  The Leica weighs 90 kg and draws 900 W, and so needs to be on a real plane or helicopter.   Cut these by a factor of 20 and they can go on a drone.    The size reduction will be driven by use for automotive sensing, where lidar is a good complement to radar.

With cheap, fast scanning there are scads of other interesting things one could do.   You could scan all the farms in a state every week to learn how crops are growing.  The depth of the snow pack could be constantly measured over the entire Rockies, and the height of every lake.  You could scan all the streets of a city every morning to find out where obstructions were in order to tell self-driving cars to avoid them.   Count all the cars on a highway or in parking lots!  Count all the sheep on a moor, or elephants on the veldt, or Taliban in Afghanistan!

Everything everywhere can now be tracked.   Sound creepy?  Sure, but it can be cool too.


How Space Science Might Have Gone

But for an accident of history, this is how space science would have been done:

NASA Super-Pressure Balloon launched 5/17/16 from New Zealand, with COSI telescope. Click for blog

This is the launch a few days ago of the Compton Spectrometer and Imager, a soft gamma-ray (0.2 to 10 MeV) telescope designed to look at what’s happening around the galactic black hole and in supernova remnants.  You can follow its track and get views from its camera:

SPB COSI horizon cam, click for update

For a couple of million bucks these super-pressure balloons let you fly close to the edge of space – 33 km (110,000 feet) up, above 99.5% of the atmosphere.    The balloon is meant to stay up for at least 100 days, and will circle the earth several times at the latitude of New Zealand. Super-pressurizing means that there is extra helium in the balloon, so that even when it cools off at night it still has the same volume, and therefore the same lift.  The pressure difference between day and night is very small, only 0.03 psi, but that adds up over the 22 acres of surface.
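To see how that tiny pressure “adds up”, here is the arithmetic (unit conversions only, using the figures in the text):

```python
# How a "very small" 0.03 psi adds up over 22 acres of envelope.
PSI_TO_PA = 6894.76    # pascals per psi
ACRE_TO_M2 = 4046.86   # square metres per acre

dp = 0.03 * PSI_TO_PA       # ~207 Pa day/night pressure difference
area = 22 * ACRE_TO_M2      # ~89,000 m^2 of balloon film
force = dp * area           # total outward force on the envelope

print(f"pressure difference: {dp:.0f} Pa")
print(f"envelope area: {area:,.0f} m^2")
print(f"total force on the film: {force / 1e6:.0f} MN")
```

That works out to roughly 18 meganewtons spread over the film, which is why the envelope design and its load-bearing lines matter so much.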

They’re tricky to make because the extra pressure tends to make the envelope leak and then tear.  They’re made of 20 μm thick polyethylene, the same as Saran wrap, with lines that run from top to bottom to keep the force off the film.   More design info here.   This pumpkin-style super-pressure balloon was invented by Julian Nott, who flew one from Perth on the west coast of Australia to Broken Hill on the east in 1984.  Google is using them for Project Loon, their attempt to build a world-wide wireless network by having thousands of these balloons floating around 20 km up with links to the ground and to each other.  This one was built by NASA’s Columbia Scientific Balloon Facility in Palestine, TX, which is managed by the rocket company Orbital ATK.   They’ve done over a thousand launches, including missions that do great science like the BLAST far infrared telescope and the BOOMERanG cosmic microwave background instrument.  The program costs about $34 million to do a dozen launches a year from Texas, New Mexico, Antarctica, and Sweden.

So how is it that this elegant and cheap method of flight doesn’t dominate space science?  You could launch twenty missions like this for the cost of a single satellite.  Plus you can retrieve your instrument at the end, if the parachute landing isn’t too hard.  Plus you get to visit New Zealand!

Balloons like this could have been built in the 1960s, and in fact small ones were as part of the GHOST program for world-wide weather monitoring.  One of those balloons stayed up for two years, circling the earth dozens of times, and there was an unconfirmed sighting of one 15 years later.  Yet somehow they’re a neglected corner of the field.

I blame Walter Dornberger, the project manager for the V-2.   In 1942, after showing Adolf Hitler a test firing, he convinced Hitler that the rocket should be the top priority of the German economy.  The whole program cost 50% more than the Manhattan Project.  It killed more people in its construction, 12,000, than it did in its use, 9,000.   It so exhausted Germany that it probably shortened the war.

Rocketry would have taken decades longer to develop without that vast war-time investment.   Without the example of the V-2, the US and USSR would never have made the even vaster investments in the ICBM, and that’s what ultimately made orbital rocketry possible.  One successful demo in 1942 by a hard-charging startup CEO to a VC with particularly deep pockets, and the whole direction of science changed.

This isn’t a good way to make technical decisions.  Rocketry has not turned out to be all that technically important if you compare it to, say, plastics or electronics. Those affect daily life vastly more than anything to do with space. Nuclear fission, the other great distortion that came out of WW II, hasn’t turned out all that well either.

Ballooning really does depend on plastics and electronics, but it got short-changed.   Maybe that was inevitable.  There’s something primal about a big rocket launch.   It tickles that instinctual pyromania that hominids have evolved over two million years of handling fire. Maybe the graceful bobbing of a balloon as it lifts off just can’t compete with the glamorous roar and blaze of a rocket.   But now that super-pressure tech is finally working, maybe balloons will take their rightful place in supporting good, cheap, diverse science.


Rebecca Leaf, Engineer Heroine from MIT

I recently came across a striking set of stories about leftist women who attended MIT.  The school has had female graduates longer than any other major US university (their first was in 1873), and they’ve done remarkable things.  It turns out that Vilma Espin, the wife of Cuban President Raúl Castro, did graduate work there in 1955 in chemical engineering.  She then returned to Cuba to join the opposition to Batista,  met Raúl in Mexico, married him in 1959, ran the Federation of Cuban Women after the revolution, and died in 2007.

Then there was Lori Berenson, who attended MIT as a freshman in 1987, then went off to El Salvador to assist rebels there, and then to Peru to aid the Tupac Amaru, a violent communist group.  She was arrested (perhaps falsely) for planning to attack the Peruvian Congress, and just finished a 20-year jail term there and returned to New York.   She was actually in the same MIT living group that I was in, but years after my time, and I don’t think I ever met her.

Well, those aren’t terribly positive stories – the wife of a dictator and someone convicted of terrorism.   Yet I’m happy to relay a much more upbeat story, that of Rebecca Leaf, MIT ’82, and the founder of a non-profit that has brought hydroelectric power and clean water to tens of thousands of people in Nicaragua.

Rebecca Leaf, 1996

She grew up in Winchester, MA, an upscale suburb of Boston.  She came to MIT late, after having worked for some years as a potter. After getting a degree in mechanical engineering, she worked in the early 80s on developing new ceramics for heat exchangers.  This was at just the time when the Reagan administration was setting up the Contras, a group of mercenary terrorists, to fight the left-wing Sandinista regime in Nicaragua.  She visited there in 1984 under the auspices of TecNica, an international aid organization that sent American technical volunteers to Nicaragua and South Africa to train locals and assist with industrial projects.   She fell in love with the country, and moved there.  By 1985 she was working at INE, the National Energy Institute, which oversaw electrification projects throughout that country.

That’s where she met a remarkable figure, Benjamin Linder.   He was a fellow American, and fellow mech E from the University of Washington, graduating in 1983.  He was a lot younger than her.   They shared a house but were not romantically involved.  Linder was a person of light spirit, inspired by the Sandinistas’ goals of aiding the desperately poor, but still fond of juggling, unicycling, and circus clowning.   Children followed him everywhere.

He found a hydroelectric project  that needed finishing in the northern part of the country near the village of El Cua.  It had been started in the late 70s with Swedish backing, but abandoned in the early 80s with the onset of war.  He moved up to El Cua and got it working by November ’85.  It was a small plant, only 100 kW, but it was the only steady power the town had ever seen.

He and Leaf saw an opportunity to build a larger, sturdier plant nearby at the town of San Jose de Bocay.  They worked up plans for it.  On April 28th, 1987, he and a local crew went up to inspect a weir that was being used to measure the water flow.  They were ambushed by Contras.  Linder and two of his companions were shot, then finished with bullets to the head.  His wallet, watch, and camera were stolen.

Leaf was back in Managua.  She was the first there to hear of his death, and it fell upon her to tell his parents in Oregon.  She couldn’t reach them, and had to leave the message with her own mother.

You don’t get to just kill Americans, no matter how dangerous a situation they may be in.  Even Israel discovered this when they ran over Rachel Corrie with a bulldozer in Gaza.  Linder’s family raised a huge stink, saying that CIA-financed thugs had murdered their son.   That led to Congressional hearings, wherein Connie Mack, a Republican congressman from Florida, told Elisabeth Linder to her face that her son had it coming.  It was an astonishing thing to say to a mother.  He was later elected US Senator for two terms, demonstrating once again why Florida is a state to be avoided.

But the Contra war was not popular in any case, and Congress refused to renew their funding the next year.  That led to the embarrassment of the Iran-Contra affair, where a rogue National Security Council unit tried to set up a black ops slush fund to kill people Reagan didn’t like.  When George H. W. Bush was elected 18 months after Linder’s death, he cut the Contras off.   The issue then became moot when the Sandinistas were defeated in a national election in 1990, and the Contras officially laid down arms.

Yet Leaf stayed on, even when all the ideological fervor had died down.  She was living in El Cua in dire poverty.   She could have had a decent life back in Massachusetts, but somehow designing industrial heat exchangers wasn’t as satisfying as bringing light to the poor.  She kept working on the Bocay hydro project.   She set up ATDER-BL, the Association of Rural Development Workers – Benjamin Linder:

ATDER-BL main office and mechanical shop. Click for link

It’s a humble operation, but successful.  The initial money came from Linder’s family, but she has since raised funds from all across the US.  They have a machine shop that can actually manufacture the water turbine parts, and can put up their own power lines, all with local skilled labor.  That even includes ex-Contras!  It took her seven years to finish the plant, because the war frightened most people away, and then the peace made the outside world lose interest.   But by 1994 she finally got the 185 kW Bocay project done:

Bocay Hydropower dam, 1994

Constructing the Bocay dam

They threw the switch on the 7th anniversary of Linder’s death.  There was a big ceremony attended by his family and all of his friends and colleagues.  Then in 1997 Leaf received the Barus Award from the IEEE Society on Social Implications of Technology.

Since then she has gotten 30 other hydro projects built, ranging from a 2 kW dam for a coffee grower to a 930 kW project at El Bote that powers a good chunk of the province.  She has also gotten involved in watershed conservation.  So much of the jungle was being cut down for agriculture that the dams were all silting up because of erosion.  That led to clean water supply projects, something even more important for health than electricity.  In the last couple of years ATDER-BL has also been putting up photovoltaic panels for extra power, which now serve 1700 people.

Leaf in Bocay in 2015

Through all of this she has been remarkably self-effacing.  The two pictures of her in this post are the only ones I could find anywhere on the Internet.   This may be her personal style, but it’s also clear that she believes that this kind of aid work only succeeds when the locals themselves take it to heart.  They built all this themselves, and they’ll be the ones to maintain it.   It can’t be a gift bestowed by white overlords.

Nicaragua is in the middle of a renewable energy boom right now.  The country has lots of hydro and wind resources, and plenty of geothermal power from its active volcanoes.  Over half their power currently comes from renewables, and they hope to hit 90% by 2020.  International firms are dropping in to build huge projects that dwarf what Leaf has accomplished over the last 30 years.   Those aren’t supported by the people, though, and will fail when the next recession means they can’t pay foreign technicians.  That’s what has happened all over the developing world.

So here’s hoping that Leaf’s life-long efforts to get Nicaraguans to bootstrap themselves out of poverty actually stick.   She has done extraordinary good with no resources and in the face of murderous violence.   In Kendall Square MIT maintains an Entrepreneur Walk of Fame.   She’s exactly the sort of person they should be celebrating.


Donations to ATDER-BL are managed by an umbrella organization, Green Empowerment, here under the auspices of the Ben Linder Memorial Fund.

IEEE Spectrum did a typically thorough overview of her work in the April 2015 issue here: How Nicaraguan Villagers Built Their Own Electric Grid by Lucas Laursen.

The New Yorker had an important piece on Leaf and Linder in their Sep 23, 1996 issue: In Search of Ben Linder’s Killers by Paul Berman.  He gets big points for tracking down ex-Contras in the dangerous and bandit-filled northern mountains of Nicaragua, but can hardly hide his distaste for Linder himself.  At one point he actually calls him a goody two-shoes.  Rather incredibly, he ends the piece by claiming that Linder resembles the Contra who may have killed him.   The Contra, Santiago Giron Meza, had later settled down and found Jesus.   How a reformed guerilla resembles a unicycling altruist engineer is mysterious.

Linder’s story is told at length in The Death of Ben Linder by Joan Kruckewitt.  She came to Nicaragua early, and gives a far fuller and more balanced view of him than Berman.


Homo Faber

Even the children of Homo sapiens can’t resist making things.   While walking through the local park, Menotomy Rocks, I came across this:

2016-04-09 17.13.56

It’s been there for years.    Every kid who walks by feels an urge to add a stick.   In mountain passes people feel an urge to add stones to a cairn, but in the woods they add sticks.  It took a couple of kids to move the bigger logs, or a bemused dad.

In a more isolated part of the park, remote enough that teenagers can have a campfire and some beers, I found this:

2016-04-09 16.49.10

“Keep Out” it says, and “Please don’t touch, this took a lot of work to make”

It’s kind of rocky for making out, but the pine needle bedding is soft, and some sleeping bags would help.   In yet another back corner is yet another hut:

2016-04-09 17.16.33 HDR

Leo for scale. Unhappy with standard skateboards, he found some scrap wood and made his own.

Do we have an instinct for making these from our remote ancestors?    Roof them with palm leaves and they would keep off the sun and some rain.   Weave in some horizontal flexible sticks and they would keep out smaller predators like hyenas.  Hominids can make their own dens this way without having to dig them.

But maybe they just tickle our pattern-making sense.   Lean three sticks together and they make an interesting shape.   Pile a lot more around them and they make something that looks vaguely functional.   We’ve been standing upright for millions of years, just so we can grab stuff.  Even as kids, even in a suburban park, we bipeds want to do something with the hands evolution gave us.


“The Wright Brothers” and Thinking Straight


Click for site

There are lots of reasons to love the story of the Wright brothers.  They came from nowhere to solve the great problem of flight, one that had defeated so many others.  They showed straight-up physical courage when flying these dangerous machines, and would do it wearing suits and ties.  They built a secret base in the wilderness of Kitty Hawk for their amazing experiments.   They’re like Jules Verne characters come to life, but without the whole conquer-the-world thing.

Yet one thing that David McCullough’s new biography makes clear is that they succeeded when so many others failed because they had a better approach to invention: systematic and incremental development instead of the flash of genius.

The flash story is how invention is almost always described.   An apple falls on Newton’s head and he wonders if the Moon is falling around the Earth.  Edison thinks “What if you put so much current in a wire that it glowed white-hot, but kept it from burning up by putting a glass envelope around it?”  Tim Berners-Lee gets annoyed that all these connected computers have all these incompatible file formats, and adds a few text markers to them so that a single browser program can display them all.

It’s a good way to tell the story to children, because it makes invention seem like much less work.   Come up with the good idea, patent it, and fame and wealth are yours!

It’s nonsense, of course.  To start with, creative people have ideas all the time.    David E. H. Jones, author of the wonderful old Daedalus columns in Nature and New Scientist, describes the process in his book “The Aha! Moment: A Scientist’s Take On Creativity”.   He calls it the Random Idea Generator, or RIG,  a subconscious process that is constantly throwing up combinations of things.  The trick is not to get ideas; it’s to winnow them.  You have to filter the ideas down to get the ones that are useful, feasible, and doable by you.   Useful means something that a few people want a whole lot, or a lot of people want a little.  Feasible means that it can be done in a reasonable time frame, like a year for a nice idea, and a few years for a great one.   Doable by you means that you have some means of actually making it work.  Patenting ideas that you can’t implement is trolling.

So in the case of the Wrights, it was NOT that they had one great idea that made flight possible.  They had a whole set of them, each related to the problem they faced at the time.  They started by writing to the Smithsonian in 1899 to get any literature on the then-current state of aeronautics.   They then carefully studied what Lilienthal and Chanute had discovered.  They realized that control was as important as mere lift, and studied the flight of birds to come up with their famous wing-warping method.

They knew that Lilienthal himself had been killed while flying a glider, so they had to do unmanned trials first.   Then they needed an open place to practice, one with steady winds, so they wrote to the US Weather Bureau to find the best such place, and so came to Kitty Hawk.  It had long stretches of soft sand to crash into.  Better still, it was away from prying eyes – their only neighbors were a Coast Guard station a couple of miles away.  They needed a place to stay while there, so they built their own cabin and workshop.

They practiced with gliders, and found that the existing equations and tables for lift were wrong.  They built their own wind tunnel to measure it themselves.  They then needed a light, powerful engine, so they had a colleague cast and machine one out of an aluminum block.   The plane then needed a push to get it going, so they devised a rope catapult driven by a falling weight.

This then gets them to one of the most iconic pictures in American history, the one McCullough uses for his book cover:

First powered flight in history.  Orville at controls, Wilbur running along beside.

There they are, cooperating and focused on something ingenious and wonderful.   This is how Americans like to see themselves!   It’s the animating story of every startup.

Once they flew for the first time in December 1903, they gave up on Kitty Hawk.  It was too remote, the biting flies were hideous, and they had almost been swept away in a hurricane.  They returned home to Dayton, Ohio and practiced in a nearby field, constantly improving their machine, until the 1905 Flyer could stay aloft for up to 40 minutes.

Having solved the main technical problems, they then turned to sales.  They pitched it to the US Army, which wasn’t all that interested.  The US military didn’t get concerned about air power until much later, December 7th, 1941, to be exact.  The Wrights were already getting feelers from European officers, though, so they packed up a machine and went to France in 1908.  It was a sensation.  Thousands turned out to see every flight, including royalty.  The Euros had been trying for flight for the previous few years, and were abashed that these outsiders had gotten it first, but also tremendously excited.

In November 1908 the Aéro-Club de France threw a huge banquet in their honor and awarded them gold medals and a $1000 prize.  At the banquet Wilbur gave a particularly gracious and eloquent speech.  Let me copy it here just to show that he was not the stern, taciturn mechanic that he’s usually portrayed as:

For myself and my brother I thank you for the honor you are doing us and for the cordial reception you have tendered us this evening.

If I had been born in your beautiful country and had grown up among you, I could not have expected a warmer welcome than has been given me.  When we did not know each other, we had no confidence in each other; today, when we are acquainted, it is otherwise: we believe each other, and we are friends.  I thank you for this.

In the enthusiasm being shown around me, I see not merely an outburst intended to glorify a person, but a tribute to an idea that has always impassioned mankind.   I sometimes think that the desire to fly after the fashion of birds is an ideal handed down to us by our ancestors who, in their grueling travels across trackless lands in prehistoric times, looked enviously on the birds soaring freely through space, at full speed, above all obstacles, on the infinite highway of the air.

Scarcely ten years ago, all hope of flying had almost been abandoned: even the most convinced had become doubtful, and I confess that, in 1901, I said to my brother Orville that men would not fly for fifty years.  Two years later, we ourselves were making flights.  This demonstration of my inability as a prophet gave me such a shock that I have ever since distrusted myself and have refrained from all prediction – as my friends of the press, especially, well know.   But it is not really necessary to look too far into the future; we see enough already to be certain that it will be magnificent.  Only let us hurry and open the roads.

Once again, I thank you with all my heart, and in thanking you I should like it understood that I am thanking all of France.

That got a standing ovation!   And he had to autograph over two hundred menus.

But Orville didn’t see this triumph.  Two months earlier he had been in a terrible crash while showing the plane to the Army at Fort Myer, Virginia.  A propeller had cracked, which caused so much vibration that a guy wire snapped, wrapped around the propeller, and shattered it, sending the plane straight down into the ground.  It killed his passenger, an Army lieutenant, and broke Orville’s leg and four ribs.  His passenger could have been Theodore Roosevelt!  Their sister Katharine immediately rushed to his side, and nursed him through months of recovery.

They both made it over to Europe by early 1909, with Katharine drawing as much interest as her brothers.  Orville did get back up in the air six months after his accident, and was able to train both French and Italian pilots.  Then they all returned to the US to more acclaim, and a spectacular flight up and down the Hudson and around the Statue of Liberty.  A million New Yorkers came out to see them.

But after that their lives got darker.  Everyone with technical talent got into aviation, and their records were soon surpassed.   Wilbur spent a lot of time defending their patents, and trying to get a proper business set up for building aircraft. That seemed to be where their enormous ingenuity reached its limits – they didn’t have the heart or skills for scaling up their inventions.   It’s a well-known problem among startups to this day, and venture capitalists will often replace the founder as CEO when things get serious.

The strain killed Wilbur.  He was traveling incessantly to raise money and sell product, and it wore him out.   He caught typhoid fever in Boston in 1912, and died six weeks later at age 45.

Orville had neither the interest nor the energy, given his injuries, to expand the company, and sold it in 1915.  By the time of the US entry into World War I in 1917, the Feds were sick of the patent feuding among the inventors, and forced them all to enter a common pool for a payoff of $2 million each.  Orville last piloted a plane in 1918, but was involved with NACA, the predecessor of NASA, almost until his death in 1948 at age 77.

Their ultimate tribute came much later.  When Neil Armstrong stepped on the Moon in 1969, he was carrying a piece of muslin from the 1903 Flyer.

But, yes, they stopped contributing to aviation by about 1910.   They were surpassed, as everyone is in the end, and got too involved with growing their business and maintaining their IP to keep up.  But they showed everyone else the method of success – identify each issue as it comes up and knock it down.  It’s not so much a stream of brilliant insights – it’s insights focused on what prevents progress right now.   Don’t Think Different, Think Straight.



The Singularity Will Be Built Out of the Same Old Stuff

I work in a field, semiconductors, that is the paradigmatic example of the Singularity.  When people talk about technology zooming up the exponential growth curve, they’re talking about what I do.  Since the field began in the early 1960s, there has been more improvement in integrated circuits than in any other technology except magnetic storage.  Transistors have not only gotten cheaper by many orders of magnitude, but also faster, more reliable, and less power-hungry.  Yet in spite of all this, there’s been a lot less change in the underlying ideas than you would think.

I’ve seen the progress part directly.  The first chip I worked on, the V-11 microprocessor at DEC in the early 80s, had 110,000 transistors with a gate length of 3 µm and ran at 5 MHz.   My latest has 130,000,000 transistors with a gate length of 28 nm, and runs at 750 MHz.   That’s not actually a big chip by today’s standards!  Yet it’s a factor of 1200 in density and 150 in raw clock speed.  Overall it’s about 2000 times faster, and 20 times cheaper to boot.
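
Those ratios are straight arithmetic on the two chips’ specs; here’s a quick sanity check, using only the figures quoted above (Python is just a convenient calculator here):

```python
# Back-of-the-envelope check of the scaling ratios between the two chips.
v11_transistors, new_transistors = 110_000, 130_000_000
v11_gate_nm,     new_gate_nm     = 3000, 28      # 3 µm vs. 28 nm
v11_mhz,         new_mhz         = 5, 750

density = new_transistors / v11_transistors      # ~1200x more transistors
shrink  = v11_gate_nm / new_gate_nm              # ~107x linear gate shrink
clock   = new_mhz / v11_mhz                      # 150x in raw clock rate

print(f"density {density:.0f}x, gate shrink {shrink:.0f}x, clock {clock:.0f}x")
```

Note that the density gain outruns the square of the linear shrink, since layout and process improvements packed transistors tighter than the gate length alone would suggest.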

It’s a story of terrific technological success, which naturally leads to vast hype.  People have compared the invention of ICs to that of printing, writing, and even fire.  Some writers – most notably the SF author and CS professor Vernor Vinge, and the inventor Ray Kurzweil – go farther still and claim that they will bring about the transcendence of humanity, what they call the Singularity.   Things will move so far that no one today will be able to comprehend the people (or their uploaded avatars) of that era.   We’ll either be amoebas crushed by battling AIs, or we’ll all be as gods.

Yeah right.  I’ve actually been in this field for over 30 years, and what strikes me  is NOT how unfathomably different it has become, but how constant the underlying technologies have actually remained.     Let me start at the lowest level and work up:

  1. The base material of chips is still silicon, as it was in the early 60s.   People tried other compounds like gallium arsenide and silicon-germanium alloys, but they’re only used for niche products, and fewer and fewer of those.
  2. The base device is the MOSFET transistor, invented in the 1960s.  They look a little different today – they’re built out of etched fins on the chip surface instead of lying flat along it, and use hafnium oxide instead of silicon dioxide for the dielectric, but it’s the same basic device that Noyce and Moore worked on. There are no more bipolar transistors, no tunnel diodes, no magnetic bubble memories, and no exotic non-volatile devices like MRAM.    Maybe the last will happen at some point, but people have been trying new schemes for decades.
  3. The base circuits are the CMOS static combinatorial gate, the 6-transistor SRAM cell, the DRAM capacitive storage bit cell, and the floating gate flash memory cell, all from the 70s.   These account for practically all of the 10^21 (billion trillion) transistors made per year.   Dozens of other circuit styles have been tried, and all have failed because of excess power or poor noise resistance.  I’ve worked in some of these styles, such as boot-strapped NMOS, pre-charged dynamic, and cross-coupled cascode, and they’re all gone.   Few people even design at the circuit level any more, except for those doing standard cell libraries (and most of those are auto-generated) or analog circuits.
  4. The base data types are character, integer, fixed point, and floating point, all known to Mauchly and Wilkes in the late 1940s.  Characters, at least, have expanded from the 5-bit Baudot code to the 8- to 32-bit UTF-8 encoding of Unicode.   There are no logarithmic number systems (which make divide and square root easy), no redundant binary (which avoids carries), and hardly even any support for 128-bit floating point, which we built in the 80s.
  5. The base processor architecture techniques were all discovered by the 1970s.  Pipelining, caching, branch prediction, vector instructions, and out-of-order execution were all known by then.   They used to be only available in supercomputers, and now every widget that goes for more than $30 can afford an ARM Cortex-A9 processor, which uses all of them.   Even the newer multi-core processors use ideas like write-back cache coherence, multi-threading, and distributed cache directories from the 80s.  Neural networks, massive SIMD, and dataflow never made it.
  6. The base processor language is still C from the 70s.  Much code doesn’t even rely on the enhancements in the 1999 update, C99.   People code today in a huge range of languages, but somewhere underneath them is usually an interpreter written in C, and C is in all the libraries and the OS.  It’s about the only compiled language that nearly every machine supports.  That and FORTRAN, which is even older.  The object-oriented enhancements of C++ and Java are from the 80s. Lisp is still considered exotic, and it’s from the 60s.
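
The expansion of character encodings mentioned in item 4 is easy to see in practice.  A minimal sketch (in Python, purely for illustration) of UTF-8’s variable width:

```python
# UTF-8 encodes each Unicode code point in 1 to 4 bytes (8 to 32 bits),
# depending on how far up the Unicode range the character sits.
for ch in ["A", "é", "€", "𝄞"]:   # ASCII, Latin, BMP symbol, beyond the BMP
    n = len(ch.encode("utf-8"))
    print(f"U+{ord(ch):06X}  {n} byte(s) = {n * 8} bits")
```

Plain ASCII still costs one byte per character, which is a big part of why UTF-8 won: old 8-bit text is already valid UTF-8.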

Overall, it’s gratifying that the basic concepts of the field have persisted for so long.  It means that one’s initial education has not become irrelevant.  Learn the fundamental concepts and you can work for a long time in a wide set of sub-fields of the chip world.

But I don’t mean to say that there haven’t been any advances.  There have just been fewer than the breathless promoters of futurity would have you believe.  And I don’t mean that there won’t be significant consequences of further work.   Twenty years from now we may be living in clouds of smart dust that support terabit links to our augmented-reality sensoria.  But it’s very likely that that smart dust will be made of silicon, use standard gates and arithmetic, run code on standard processors, and that that code will eventually depend on C.   Any smart kid of the last 30 years can understand all of it.


