Frustrated?

Last week, I posted a blog about “snowplow parents,” and earlier this week posted one about keeping a “failure resume.”

Those thoughts were germinating when I read ANOTHER killer-good piece about how feeling frustrated at work can give rise to success. Here is how it began:

In 2000, Pixar was at the top of its game. “Toy Story” was released five years earlier, and it was the first computer-animated blockbuster on the silver screen. Three years later Pixar debuted “A Bug’s Life” to critical acclaim, and 1999’s “Toy Story 2” was the biggest animated hit of the year.

Concerned about resting on their laurels, the studio’s founders, Steve Jobs and Ed Catmull, hired the company’s first outside director, Brad Bird, to shake things up. Mr. Bird’s most recent film, “The Iron Giant,” had flopped financially, and when he pitched his idea for a new movie to Pixar, he was told it would never work: It would take 10 years and cost $500 million to animate.

But Mr. Bird persisted. He recruited a band of disgruntled people inside Pixar — misfits whose ideas had been ignored — to work with him. The resulting movie, “The Incredibles,” won two Oscars and grossed $631 million worldwide, outdoing all of Pixar’s previous successes. (And, for the record, it ended up costing less than $100 million to make.)

We normally avoid frustrated people — we don’t want to get dragged down into a cesspool of complaints and cynicism. We see dissatisfied people as curmudgeons who halt progress, or, worse yet, dementors who suck the joy out of the room. And we have good reason to feel that way: A natural response to frustration is the fight-or-flight response. Disgruntled people often go into “Office Space” mode, choosing to fight by sabotaging the workplace, or flight by doing the bare minimum not to get fired.

But there’s a third reaction to frustration that we’ve overlooked: When we’re dissatisfied, instead of fight or flight, sometimes we invent.

Want more? You can read the full article here.

Military Innovation

Among the buzzwords circulating in the U.S. military over the last decade, “innovation” is likely the most common one we have all encountered.

Countless commands have set up “innovation cells” on their staffs and have sought ways to become more innovative, often borrowing best practices from industry, especially Silicon Valley.

The Department of Defense has created a Defense Innovation Board composed of outside experts who are charged with finding ways to make the DoD more “innovative.”

And just a few years ago, former Secretary of Defense Ashton Carter created the Defense Innovation Unit Experimental, or DIU(X) (now simply DIU), at the old Moffett Field near the heart of Silicon Valley.

All of this is good as far as it goes, but the danger is clear. By establishing innovation cells on major staffs, by having outside experts tell the DoD how to be more innovative, and by designating a large organization as the DoD’s innovation “place,” we may be sending the wrong signal to the rest of the military and civilian professionals: don’t worry about being innovative; we’ve assigned that task to someone else.

The former Pacific Fleet commander, Admiral Scott Swift, was unique among senior commanders in that he deliberately did not establish an innovation cell on the PACFLEET staff. As he shared in his remarks at the 2018 Pacific Command Science and Technology Conference, “I want every one of my sailors to be an innovator.”

As the old saw goes, the guy (or gal) who invented the wheel was an inventor; the person who took four wheels and put them on a wagon was an innovator.

We are taken by innovations and innovators; they help define our future and then make it possible.

Now facing two peer competitors, China and Russia, who want to create a new world order that puts them at the forefront, the U.S. military needs every soldier, sailor, airman and Marine to be an innovator.

One Trillion

The stock market – especially tech – has been down a bit, and it’s easy to forget that not that long ago Apple was valued at one trillion dollars.

There has been a great deal of breathless reporting on this milestone, but much less thoughtful analysis. That’s why I was taken by Jack Nicas’ piece. Here’s how he began:

SAN FRANCISCO — In 1997, Apple was on the ropes. The Silicon Valley pioneer was being decimated by Microsoft and its many partners in the personal-computer market. It had just cut a third of its work force, and it was about 90 days from going broke, Apple’s late co-founder, Steve Jobs, later said.

Recently, Apple became the first publicly traded American company to be worth more than $1 trillion when its shares climbed 3 percent to end the day at $207.39. The gains came two days after the company announced the latest in a series of remarkably profitable quarters.

Apple’s ascent from the brink of bankruptcy to the world’s most valuable public company has been a business tour de force, marked by rapid innovation, a series of smash-hit products and the creation of a sophisticated, globe-spanning supply chain that keeps costs down while producing enormous volumes of cutting-edge devices.

That ascent has also been marked by controversy, tragedy and challenges. Apple’s aggressive use of outside manufacturers in China, for example, has led to criticism that it is taking advantage of poorly paid workers in other countries and robbing Americans of good manufacturing jobs. The company faces numerous questions about how it can continue to grow.

This is just a snippet. Want more? You can read the full article here.

Minecraft Overdrive

Few dispute that games mirror reality and that reality often resembles games. But Minecraft takes this to new levels, and in doing so it is driving innovation. Here’s why:

Since its release seven years ago, Minecraft has become a global sensation, captivating a generation of children. There are over 100 million registered players, and it’s now the third-best-selling video game in history, after Tetris and Wii Sports. In 2014, Microsoft bought Minecraft — and Mojang, the Swedish game studio behind it — for $2.5 billion.

There have been blockbuster games before, of course. But as Jordan’s experience suggests — and as parents peering over their children’s shoulders sense — Minecraft is a different sort of phenomenon.

For one thing, it doesn’t really feel like a game. It’s more like a destination, a technical tool, a cultural scene, or all three put together: a place where kids engineer complex machines, shoot videos of their escapades that they post on YouTube, make art and set up servers, online versions of the game where they can hang out with friends. It’s a world of trial and error and constant discovery, stuffed with byzantine secrets, obscure text commands and hidden recipes. And it runs completely counter to most modern computing trends. Where companies like Apple and Microsoft and Google want our computers to be easy to manipulate — designing point-and-click interfaces under the assumption that it’s best to conceal from the average user how the computer works — Minecraft encourages kids to get under the hood, break things, fix them and turn mooshrooms into random-number generators. It invites them to tinker.

In this way, Minecraft culture is a throwback to the heady early days of the digital age. In the late ’70s and ’80s, the arrival of personal computers like the Commodore 64 gave rise to the first generation of kids fluent in computation. They learned to program in Basic, to write software that they swapped excitedly with their peers. It was a playful renaissance that eerily parallels the embrace of Minecraft by today’s youth. As Ian Bogost, a game designer and professor of media studies at Georgia Tech, puts it, Minecraft may well be this generation’s personal computer.

Want more? You can read the full article here.

Innovative Diplomacy

Apple has long been known as a leader in innovation, and most of us think of that in terms of innovative technology. But in today’s globally connected world, it also means finding innovative ways to ensure that two nations that are often at odds don’t unravel what technology can deliver. Innovative diplomacy is now a must.

A recent piece, “In China Trade War, Apple Worries It Will Be Collateral Damage,” explains how Apple is handling this brave new world.

Apple’s chief executive, Timothy D. Cook, may be the leader of the world’s most valuable public company, but lately he has had to act a lot like the tech industry’s top diplomat.

Last month he visited the Oval Office to warn President Trump that tough talk on China could threaten Apple’s position in the country. In March, at a major summit meeting in Beijing, he called for “calmer heads” to prevail between the world’s two most powerful countries.

In a trade and technology showdown between the United States and China, Apple and Mr. Cook have a lot to lose. With 41 stores and hundreds of millions of iPhones sold in the country, there is arguably no American company in China as successful, as high-profile and with as big a target on its back.

Since he took over Apple from its co-founder Steve Jobs, in 2011, questions about whether Mr. Cook, 57, could recreate the magic that led to the iPod and iPhone have persisted. For Mr. Cook, the analogous breakthrough — and potentially his legacy as the heir to Mr. Jobs — has come not from a gadget, but from a geography: China.

Apple fears “the Chinese-bureaucracy machine is going to kick in,” meaning the Chinese government could cause delays in its supply chain and increase scrutiny of its products under the guise of national-security concerns, according to one person close to the company. Apple has faced such retaliation before, another person said, and Reuters reported Ford vehicles are already facing delays at Chinese ports.

Apple executives and lobbyists in Beijing and Washington, led by Mr. Cook, have been trying to work both sides. They have fostered close ties to the administration of the country’s leader, Xi Jinping, an effort called Red Apple by employees at Apple’s manufacturing partner Foxconn, after the official color of the Chinese Communist Party.

Want more? You can read the full article here.

Efficiency Isn’t Everything

Silicon Valley has given us a lot – and taken away a lot. We seem to be dividing into those who embrace technology and those who fear it.

That’s why I was taken by Gal Beckerman’s recent piece, “Kicking the Geeks Where It Hurts.” The subtitle, “Why Silicon Valley Should Embrace Inefficiency,” says a lot. Here’s how it starts:

Hypocrisy thrives at the Waldorf School of the Peninsula in the heart of Silicon Valley. This is where Google executives send their children to learn how to knit, write with chalk on blackboards, practice new words by playing catch with a beanbag and fractions by cutting up quesadillas and apples. There are no screens — not a single piece of interactive, multimedia, educational content. The kids don’t even take standardized tests.

While Silicon Valley’s raison d’être is making platforms, apps and algorithms to create maximum efficiency in life and work (a “friction-free” world, as Bill Gates once put it), when it comes to their own families (and developing their own businesses, too), the new masters of the universe have a different sense of what it takes to learn and innovate — it’s a slow, indirect process, meandering not running, allowing for failure and serendipity, even boredom.

Back in 1911, the English philosopher Alfred North Whitehead said that “civilization advances by extending the number of important operations which we can perform without thinking about them.” By that metric, Uber and Google and Amazon Prime have given us a whole lot of civilization. And there’s no doubt our lives are better for it. (Ordering Chinese takeout in 30 seconds on an app might not be up there with Shakespeare or the incandescent light bulb, but it’s pretty great.) This unrelenting drive for efficiency has, however, blotted out a few things we all know intuitively but seem to be forgetting.

To create a product or service that is truly efficient often involves a lot of inefficiency — more like learning to knit than pressing a button. Likewise, gadgets built with a single-minded focus on efficiency can often backfire, subverting their purpose. Algorithms designed to dish up the news and information we most prefer end up blinkering us to all but a narrow slice of political and social reality. Our smartphones untether us from the office, saving us energy on travel, but also allow our lives to be interrupted nearly 24 hours a day, chewing up any productive idle time.

Want more? You can read the full article here.

Change the World

As the old saw goes, the guy (or gal) who invented the wheel was an inventor; the person who took four wheels and put them on a wagon was an innovator.

We are taken by innovations and innovators; they help define our future and then make it possible. That’s what drew me to Kerry Hannon’s piece, “The Courage to Change the World.” Here’s how she begins:

Call them what you will: change makers, innovators, thought leaders, visionaries.

In ways large and small, they fight. They disrupt. They take risks. They push boundaries to change the way we see the world, or live in it. Some create new enterprises, while others develop their groundbreaking ideas within an existing one.

From Archimedes to Zeppelin, the accomplishments of great visionaries over the centuries have filled history books. More currently, from Jeff Bezos of Amazon to Mark Zuckerberg of Facebook and Elon Musk of SpaceX and Tesla Motors, they are the objects of endless media fascination — and increasingly intense public scrutiny.

Although centuries stretch between them, experts who have studied the nature of innovators across all areas of expertise largely agree that they have important attributes in common, from innovative thinking to an ability to build trust among those who follow them to utter confidence and a stubborn devotion to their dream.

Want more? You can read the full article here.

Facebook

Facebook has been in the news recently – that’s an understatement. The recent travails the tech giant has undergone are well-chronicled, and don’t need repeating here.

But some were identifying the downside of Facebook’s size some time ago. Here is what Ross Douthat shared almost two years ago in his piece, “Facebook’s Subtle Empire”:

In one story people tell about the news media, we have moved from an era of consolidation and authority to an era of fragmentation and diversity. Once there were three major television networks, and everyone believed what Walter Cronkite handed down from Sinai. Then came cable TV and the talk radio boom, and suddenly people could seek out ideologically congenial sources and tune out the old mass-culture authorities. Then finally the Internet smashed the remaining media monopolies, scattered news readers to the online winds, and opened an age of purely individualized news consumption.

How compelling is this story? It depends on what you see when you look at Facebook.

In one light, Facebook is a powerful force driving fragmentation and nicheification. It gives its users news from countless outlets, tailored to their individual proclivities. It allows those users to be news purveyors in their own right, playing Cronkite every time they share stories with their “friends.” And it offers a platform to anyone, from any background or perspective, looking to build an audience from scratch.

But seen in another light, Facebook represents a new era of media consolidation, a return of centralized authority over how people get their news. From this perspective, Mark Zuckerberg’s empire has become an immensely powerful media organization in its own right, albeit one that effectively subcontracts actual news gathering to other entities (this newspaper included). And its potential influence is amplified by the fact that this Cronkite-esque role is concealed by Facebook’s self-definition as “just” a social hub.

These two competing understandings have collided in the last few weeks, after it was revealed that Facebook’s list of “trending topics” is curated by a group of toiling journalists, not just an impersonal algorithm, and after a former curator alleged that decisions about which stories “trend” are biased against conservative perspectives.

Want to read more? You can read the full article here.

Da Vinci Today

History’s most creative genius, Leonardo da Vinci, was not superhuman, and following his methods can bring great intellectual rewards to anyone, writes Walter Isaacson. Here’s how he begins his piece about the inventor and innovator:

Around the time that he reached the unnerving milestone of turning 30, Leonardo da Vinci wrote a letter to the ruler of Milan listing the reasons why he should be given a job. In 10 carefully numbered paragraphs, he touted his engineering skills, including his ability to design bridges, waterways, cannons and armored vehicles. Only at the end, as an afterthought, did he add that he was also an artist. “Likewise in painting, I can do everything possible,” he wrote.

Yes, he could. He would go on to create the two most famous paintings in history, the “Mona Lisa” and “The Last Supper.” But in his own mind, he was just as much a man of science and engineering, pursuing studies of anatomy, flying machines, fossils, birds, optics, geology and weaponry. His ability to combine art and science—made iconic by “Vitruvian Man,” his drawing of a perfectly proportioned man (possibly a self-portrait) spread-eagled inside a circle and square—is why so many consider him history’s most creative genius.

Fortunately for us, Leonardo was also a very human genius. He was not the recipient of supernatural intellect in the manner of, for example, Newton or Einstein, whose minds had such unfathomable processing power that we can merely marvel at them. His genius came from being wildly imaginative, quirkily curious and willfully observant. It was a product of his own will and effort, which makes his example more inspiring for us mere mortals and also more possible to emulate.

More than 7,000 pages of Leonardo’s notebooks still exist, and there we find plenty of evidence that he was not superhuman. He made mistakes in arithmetic. He had a deep feel for geometry but was not adroit at using equations to codify nature’s laws. He left many artistic projects unfinished and pages of brilliant treatises unpublished. He was also prone to fantasy, envisioning flying machines that never flew and tanks that never rolled.

Want more? You can read the full piece here.

Laws of Innovation

Most businesses are “all about innovation.” We’ve made innovation a buzzword, but few have really done a deep dive into what innovation means, especially in business.

While such a broad term defies simple explanation, and can mean many things to many people, I found Christopher Mims’s “Laws of Innovation” piece in The Wall Street Journal helpful in bounding the challenge. Here are his “laws”:

Three decades ago, a historian wrote six laws to explain society’s unease with the power and pervasiveness of technology. Though based on historical examples taken from the Cold War, the laws read as a cheat sheet for explaining our era of Facebook, Google, the iPhone and FOMO.

You’ve probably never heard of these principles or their author, Melvin Kranzberg, a professor of the history of technology at Georgia Institute of Technology who died in 1995.

What’s a bigger shame is that most of the innovators today, who are building the services and tools that have upended society, don’t know them, either.

Fortunately, the laws have been passed down by a small group of technologists who say the principles have profoundly impacted their thinking. The text should serve as a foundation—something like a Hippocratic oath—for all people who build things.

  1. ‘Technology is neither good nor bad; nor is it neutral.’
  2. ‘Invention is the mother of necessity.’
  3. ‘Technology comes in packages, big and small.’
  4. ‘Although technology might be a prime element in many public issues, nontechnical factors take precedence in technology-policy decisions.’
  5. ‘All history is relevant, but the history of technology is the most relevant.’
  6. ‘Technology is a very human activity.’

Want more? You can read the full piece here.