Facebook

Facebook has been in the news recently – that’s an understatement. The recent travails the tech giant has undergone are well chronicled and don’t need repeating here.

But some were pointing out the downside of Facebook’s size some time ago. Here is what Ross Douthat shared almost two years ago in his piece, “Facebook’s Subtle Empire”:

In one story people tell about the news media, we have moved from an era of consolidation and authority to an era of fragmentation and diversity. Once there were three major television networks, and everyone believed what Walter Cronkite handed down from Sinai. Then came cable TV and the talk radio boom, and suddenly people could seek out ideologically congenial sources and tune out the old mass-culture authorities. Then finally the Internet smashed the remaining media monopolies, scattered news readers to the online winds, and opened an age of purely individualized news consumption.

How compelling is this story? It depends on what you see when you look at Facebook.

In one light, Facebook is a powerful force driving fragmentation and nicheification. It gives its users news from countless outlets, tailored to their individual proclivities. It allows those users to be news purveyors in their own right, playing Cronkite every time they share stories with their “friends.” And it offers a platform to anyone, from any background or perspective, looking to build an audience from scratch.

But seen in another light, Facebook represents a new era of media consolidation, a return of centralized authority over how people get their news. From this perspective, Mark Zuckerberg’s empire has become an immensely powerful media organization in its own right, albeit one that effectively subcontracts actual news gathering to other entities (this newspaper included). And its potential influence is amplified by the fact that this Cronkite-esque role is concealed by Facebook’s self-definition as “just” a social hub.

These two competing understandings have collided in the last few weeks, after it was revealed that Facebook’s list of “trending topics” is curated by a group of toiling journalists, not just an impersonal algorithm, and after a former curator alleged that decisions about which stories “trend” are biased against conservative perspectives.

Want to read more?

Da Vinci Today

History’s most creative genius, Leonardo da Vinci, was not superhuman, and following his methods can bring great intellectual rewards to anyone, writes Walter Isaacson. Here’s how he begins his piece about the inventor and innovator:

Around the time that he reached the unnerving milestone of turning 30, Leonardo da Vinci wrote a letter to the ruler of Milan listing the reasons why he should be given a job. In 10 carefully numbered paragraphs, he touted his engineering skills, including his ability to design bridges, waterways, cannons and armored vehicles. Only at the end, as an afterthought, did he add that he was also an artist. “Likewise in painting, I can do everything possible,” he wrote.

Yes, he could. He would go on to create the two most famous paintings in history, the “Mona Lisa” and “The Last Supper.” But in his own mind, he was just as much a man of science and engineering, pursuing studies of anatomy, flying machines, fossils, birds, optics, geology and weaponry. His ability to combine art and science—made iconic by “Vitruvian Man,” his drawing of a perfectly proportioned man (possibly a self-portrait) spread-eagled inside a circle and square—is why so many consider him history’s most creative genius.

Fortunately for us, Leonardo was also a very human genius. He was not the recipient of supernatural intellect in the manner of, for example, Newton or Einstein, whose minds had such unfathomable processing power that we can merely marvel at them. His genius came from being wildly imaginative, quirkily curious and willfully observant. It was a product of his own will and effort, which makes his example more inspiring for us mere mortals and also more possible to emulate.

More than 7,000 pages of Leonardo’s notebooks still exist, and there we find plenty of evidence that he was not superhuman. He made mistakes in arithmetic. He had a deep feel for geometry but was not adroit at using equations to codify nature’s laws. He left many artistic projects unfinished and pages of brilliant treatises unpublished. He was also prone to fantasy, envisioning flying machines that never flew and tanks that never rolled.

Want more? You can read the full piece here.

Laws of Innovation

Most businesses say they are “all about innovation.” We have made innovation a buzzword, but few have really done a deep dive into what innovation means, especially in business.

While such a broad term defies simple explanation – and can mean many things to many people – I found Christopher Mims’s “Laws of Innovation” piece in the Wall Street Journal helpful in bounding the challenge. Here are his “laws.”

Three decades ago, a historian wrote six laws to explain society’s unease with the power and pervasiveness of technology. Though based on historical examples taken from the Cold War, the laws read as a cheat sheet for explaining our era of Facebook, Google, the iPhone and FOMO.

You’ve probably never heard of these principles or their author, Melvin Kranzberg, a professor of the history of technology at Georgia Institute of Technology who died in 1995.

What’s a bigger shame is that most of the innovators today, who are building the services and tools that have upended society, don’t know them, either.

Fortunately, the laws have been passed down by a small group of technologists who say they have profoundly impacted their thinking. The text should serve as a foundation—something like a Hippocratic oath—for all people who build things.

  1. ‘Technology is neither good nor bad; nor is it neutral.’
  2. ‘Invention is the mother of necessity.’
  3. ‘Technology comes in packages, big and small.’
  4. ‘Although technology might be a prime element in many public issues, nontechnical factors take precedence in technology-policy decisions.’
  5. ‘All history is relevant, but the history of technology is the most relevant.’
  6. ‘Technology is a very human activity.’

Want more? You can read the full piece here.

Printing Press and iPhones

If you read only one article this week, read “The Phone Is Smart, but Where’s the Big Idea?” Here’s just a taste:

I used a smartphone GPS to find my way through the cobblestoned maze of Geneva’s Old Town, in search of a handmade machine that changed the world more than any other invention. Near a 13th-century cathedral in this Swiss city on the shores of a lovely lake, I found what I was looking for: a Gutenberg printing press.

“This was the Internet of its day — at least as influential as the iPhone,” said Gabriel de Montmollin, the director of the Museum of the Reformation, toying with the replica of Johann Gutenberg’s great invention. It used to take four monks, laboring in a scriptorium with quills over calfskin, up to a year to produce a single book.

With the advance in movable type in 15th-century Europe, one press could crank out 3,000 pages a day. Before long, average people could travel to places that used to be unknown to them — with maps! Medical information passed more freely and quickly, diminishing the sway of quacks. And you could find your own way to God, or a way out of believing in God, with access to formerly forbidden thoughts.

The printing press offered the prospect that tyrants would never be able to kill a book or suppress an idea. Gutenberg’s brainchild broke the monopoly that clerics had on scripture. And later, stirred by pamphlets from a version of that same press, the American colonies rose up against a king and gave birth to a nation.

Intrigued? You can read the entire article here.

Serial Disruption

I’m a non-tech guy now working in a high-tech organization. Often, my head spins when I see the kinds of technologies that are now changing our lives.

That’s why I was so taken by an article entitled “Disruption Is the New Normal.” Here’s part of what the writer said:

On a trip over the holidays, my wife rolled her eyes when I realized we’d left the Garmin at home and said we’d have to get a GPS for the rental car. She pointed to the Google Maps app on her mobile phone and said: “I bet this works even better.” It did. We benefited from the kind of technological disruption that is great for consumers, but brutal for businesses trying to survive rapid change and perhaps impossible for government regulators trying to keep up.

A generation ago, the Rand McNally atlas was the state of the art in navigation. Then Garmin, TomTom and other innovators developed satellite-based GPS devices. But barely a decade later, Google added constantly updated navigation to its maps and made them easily accessible as an app on mobile phones for the unbeatable price point of zero. The market value of stand-alone GPS makers fell as much as 85%.

This is the radical new normal for business, according to authors Larry Downes and Paul Nunes. “Before the information age, conventional wisdom held that new markets were created from the top down,” they write in their new book, “Big Bang Disruption.” Analog-era business strategies have been disrupted. Business guru Michael Porter once told companies they could get competitive advantage if they picked one strategy among premium pricing, cost savings or focusing on market niches. In the 1990s, Clayton Christensen urged executives to overcome what he called the innovator’s dilemma by moving fast once newcomers entered markets with lower-quality, lower-priced products.

But powerful new technologies like cloud computing and big data allow entrepreneurs to develop products and services that are “simultaneously better, cheaper, and more customized,” Messrs. Downes and Nunes write. “This isn’t disruptive innovation. It’s devastating innovation.”

Intrigued? You can read the entire article here.

Genius?

Last week, I published a blog post about the grinding lifestyle in Silicon Valley. Frightening stuff.

Another thing Silicon Valley brings to mind is the idea of the “lone genius.” The names pop right up: Steve Jobs, Bill Gates, Mark Zuckerberg, and many others.

However, the idea of the “lone genius” is something of an urban legend, especially when it comes to innovation.

Joshua Wolf Shenk challenges the myth in his new book, Powers of Two. Here is an excerpt from the New York Times review of his book:

The pair is a precious unit — private, generative, even holy. We can explore a couple’s inner workings if we have an invitation to do so. Otherwise, we must use any available external means: letters in archives, revealing anecdotes, loose-lipped quips in interviews. In order to understand creativity, we must learn from couples, Joshua Wolf Shenk argues in his new book, “Powers of Two.” Defying the myth of the lone genius, he makes the case that the chemistry of creative pairs — of people, of groups — forms the primary (albeit frequently hidden) structural basis of innovation.

Pairs don’t often let us pry them apart, looking to see who contributed what. John Lennon wrote what would become “Strawberry Fields Forever” and Paul McCartney came up with “Penny Lane” as a rejoinder, yet their music is credited to both of them, written “eyeball to eyeball,” as Lennon put it, or “like mirrors” in McCartney’s view. Neal Brennan and Dave Chappelle have long agreed to keep private who wrote what in their comic sketches.

“People always ask Ulay and me the same questions,” the artist Marina Abramovic told Shenk about her former partner. “ ‘Whose idea was it?’ or ‘How was this done?’ . . . But we never specify. Everything was interrelated and interdependent.” The daughter of Marie and Pierre Curie said that her parents’ work was a fused endeavor. It’s nearly impossible to distinguish their contributions by looking at their laboratory notebooks, where handwriting by each covers the pages. Shenk’s “Powers of Two” is a rare glimpse into the private realms of such duos. He writes with his face “pressed up against the glass” of paired figures from the present and the past — adding the likes of Steve Jobs and Steve Wozniak, Susan B. Anthony and Elizabeth Cady Stanton, and C. S. Lewis and J. R. R. Tolkien to the pairs mentioned above.

Intrigued? You can read the entire article here.

A New Approach to Eating

If you need more evidence of how much Silicon Valley has come to dominate our lives, you need only look at the statistics: In 2016, the top five U.S. companies by market capitalization were all tech companies. That trend continues today.

Now this trend is moving into our kitchens. Amazon has bought Whole Foods. Some didn’t see this coming, but a prescient article from some years ago, “Rethinking Eating,” did. Here’s what the writer suggested:

Having radically changed the way we communicate, do research, buy books, listen to music, hire a car and get a date, Silicon Valley now aims to transform the way we eat. Just as text messages have replaced more lengthy discourse and digital vetting has diminished the slow and awkward evolution of intimacy, tech entrepreneurs hope to get us hooked on more efficient, algorithmically derived food.

Call it Food 2.0.

Following Steve Jobs’s credo that “people don’t know what they want until you show it to them,” a handful of high-tech start-ups are out to revolutionize the food system by engineering “meat” and “eggs” from pulverized plant compounds or cultured snippets of animal tissue. One company imagines doing away with grocery shopping, cooking and even chewing, with a liquid meal made from algae byproducts.

You can read the entire article here.

Innovation on Steroids

Everyone agrees innovation is good – saying it’s not is like arguing against motherhood and apple pie. But what is innovation? How do you know if you “have it”? If you want more, how do you get it?

I always wondered about those questions, until John Michaelson’s article in the Wall Street Journal helped answer them. His thesis – that the financial crisis opened the economy to new forms of growth which are about to start pouring down – is compelling.

To no one’s surprise, much of it has to do with technology. Here is part of what he says:

Each decade for the past 60 years, we have seen a thousand-fold increase in world-wide processing power, bandwidth and storage. At the same time, costs have fallen by a factor of 10,000. Advances in these platforms, in themselves, do not produce innovation. But they facilitate the development and deployment of entirely new applications that take advantage of these advances. Amazing new applications are almost never predictable. They come from human creativity. That is one reason they almost never come from incumbent companies. But once barriers to innovation are lowered, new applications follow.

With that as a teaser, you can read the full article here.

Our Robot Partners

Americans – like most people everywhere – have a conflicted relationship with artificial intelligence, autonomy, and robots. Popular culture has a great deal to do with this.

One of the most iconic films of the last century, Stanley Kubrick’s 2001: A Space Odyssey, had as its central theme the autonomy of robots (the unmanned vehicles of the time). Few who saw the movie can forget the scene where astronauts David Bowman and Frank Poole consider disconnecting the cognitive circuits of HAL (the Heuristically programmed ALgorithmic computer) when he appears to be mistaken in reporting the presence of a fault in the spacecraft’s communications antenna. They attempt to conceal what they are saying, but are unaware that HAL can read their lips. Faced with the prospect of disconnection, HAL decides to kill the astronauts in order to protect and continue its programmed directives. While few today worry that a 21st-century HAL will turn on its masters, the issues involved with fielding increasingly autonomous unmanned systems are complex, challenging, and contentious.

At the next level down from the notion of robots becoming our masters is the issue of these robots taking our jobs. President Obama suggested as much in one of his last addresses as president.

There is vastly more heat than light on this issue. That’s why I found this article, “Learning to Love Our Robot Co-workers,” so revealing. Here is part of what it said:

“The most important frontier for robots is not the work they take from humans but the work they do with humans — which requires learning on both sides.”

Intrigued? You can read the full article here.

Nurturing Innovation

Does the government nurture innovation? Most people are polarized on this issue. But amid the noise, a thoughtful op-ed by Joe Nocera sheds some light where there is mostly heat. Here is part of what he said:

“Manzi’s essential point is that American innovation — the key to our prosperity — has always relied, to some extent, on government support. In the early days of the republic, he writes, Alexander Hamilton proposed government help for the developing manufacturing industries — “the high-tech sector of its day.” Hamilton’s basic insight, he adds, was “that the enormous economic value that innovative industries could offer the nation merited public efforts to enable their success.”

And for most of this country’s history, that premise was embraced. In the early 1800s, West Point was founded “in large part to develop a domestic engineering capability.” In 1843, “Congress allocated the money to build a revolutionary telegraph line from Washington D.C. to Baltimore.” In the late 19th century, the government made investments “in biology and health innovation.” Right up until the 1970s, Manzi writes, the free market was paved “with specific interventions to provide infrastructure and to promote incremental, innovation-led growth.” This was The American System.

And then came the age of computers, which, writes Manzi, caused “the center of gravity” for innovation to shift from large institutions — not just the government, but research-centric firms like Bell Labs — toward “newer, more nimble competitors.” Even here, though, government never completely went away. Much of the early Internet, after all, was either funded or developed by a government agency, the Defense Advanced Research Projects Agency.”

Read more about nurturing innovation.