Think Different

Bookstore (or Amazon warehouse) shelves groan under the weight of books about Silicon Valley, yet these books continue to feed our fascination with the tech industry.

That is why I was drawn to the review of a new book: “What Tech Calls Thinking: An Inquiry Into the Intellectual Bedrock of Silicon Valley.” Here is how it begins:

In 2007, the venture capitalist Marc Andreessen argued in a brassy blog post that markets — not personnel, product or pricing — were the only thing a start-up needed to take flight. Teams, he suggested, were a dime a dozen. Products could be barely functional. He even suggested that the laws of supply and demand, the ones that generate price competition, no longer obtained.

The takeaway was something like If they come, you will build it. To get them to come, a founder needs a magnetic concept. Community, say. Connection. Sharing. Markets coalesced around these hazy notions in 2007 and 2008, with the debuts of Twitter, Airbnb, Waze, Tumblr and Dropbox.

In an erudite new book, “What Tech Calls Thinking,” Adrian Daub, a professor of comparative literature and German studies at Stanford, investigates the concepts in which Silicon Valley is still staked. He argues that the economic upheavals that start there are “made plausible and made to seem inevitable” by these tightly codified marketing strategies he calls “ideals.”

There are so many scintillating aperçus in Daub’s book that I gave up underlining. But I couldn’t let “Disruption is a theodicy of hypercapitalism” pass. Not only does Daub’s point ring true — ennobling destruction and sabotage makes the most brutal forms of capitalism seem like God’s will — but the words themselves sound like one of the verses of a German punk-socialist anthem.

Want more? Here is a link to the NYT article:

https://www.nytimes.com/2020/10/13/books/review/what-tech-calls-thinking-adrian-daub.html

 

Dedication to a Cause

Much ink has been spilled about the future of robots and how they will either help – or hurt – humanity. Some still fear HAL from “2001: A Space Odyssey.”

That is why I was drawn to a recent piece, “A Case for Cooperation Between Machines and Humans.” The subtitle is revealing: “A computer scientist argues that the quest for fully automated robots is misguided, perhaps even dangerous. His decades of warnings are gaining more attention.” Here is how it begins:

The Tesla chief Elon Musk and other big-name Silicon Valley executives have long promised a car that can do all the driving without human assistance.

But Ben Shneiderman, a University of Maryland computer scientist who has for decades warned against blindly automating tasks with computers, thinks fully automated cars and the tech industry’s vision for a robotic future is misguided. Even dangerous. Robots should collaborate with humans, he believes, rather than replace them.

Late last year, Dr. Shneiderman embarked on a crusade to convince the artificial intelligence world that it is heading in the wrong direction. In February, he confronted organizers of an industry conference on “Assured Autonomy” in Phoenix, telling them that even the title of their conference was wrong. Instead of trying to create autonomous robots, he said, designers should focus on a new mantra, designing computerized machines that are “reliable, safe and trustworthy.”

There should be the equivalent of a flight data recorder for every robot, Dr. Shneiderman argued.

It is a warning that’s likely to gain more urgency when the world’s economies eventually emerge from the devastation of the coronavirus pandemic and millions who have lost their jobs try to return to work. A growing number of them will find they are competing with or working side by side with machines.

Want more? You can read the full article here

The Innovation Bible

Clayton M. Christensen, a Harvard professor whose groundbreaking 1997 book, “The Innovator’s Dilemma,” outlined his theories about the impact of what he called “disruptive innovation” on leading companies and catapulted him to superstar status as a management guru, died last month.

“The Innovator’s Dilemma,” which The Economist called one of the six most important business books ever written, was published during the technology boom of the late 1990s. It trumpeted Professor Christensen’s assertion that the factors that help the best companies succeed — listening responsively to customers, investing aggressively in technology products that satisfied customers’ next-generation needs — are the same reasons some of these companies fail.

These corporate giants were so focused on doing the very things that had been taught for generations at the nation’s top business schools, he wrote, that they were blindsided by small, fast-moving, innovative companies that were able to enter markets nimbly with disruptive products and services and grab large chunks of market share. By laying out a blueprint for how executives could identify and respond to these disruptive forces, Professor Christensen, himself an entrepreneur and former management consultant, struck a chord with high-tech corporate leaders.

Want more? You can read the full piece here

Changing the World

Innovation may rank as one of today’s most-used buzzwords. Most would agree that innovation is “good,” something inherently desirable, especially for business.

But that’s where the agreement ends, as most don’t know “what” they want innovation to “do.” As a result, there is a cottage industry of books, articles, seminars, podcasts and more about innovation.

Without giving you a point solution, for me, innovation is about change – often the desire to change the world.

That’s why I was drawn to an article, “The Courage to Change the World.” Here is how it takes on the subject:

Call them what you will: change makers, innovators, thought leaders, visionaries.

In ways large and small, they fight. They disrupt. They take risks. They push boundaries to change the way we see the world, or live in it. Some create new enterprises, while others develop their groundbreaking ideas within an existing one.

From Archimedes to Zeppelin, the accomplishments of great visionaries over the centuries have filled history books. More currently, from Jeff Bezos of Amazon to Mark Zuckerberg of Facebook and Elon Musk of SpaceX and Tesla Motors, they are the objects of endless media fascination — and increasingly intense public scrutiny.

Although centuries stretch between them, experts who have studied the nature of innovators across all areas of expertise largely agree that they have important attributes in common, from innovative thinking to an ability to build trust among those who follow them to utter confidence and a stubborn devotion to their dream.

Want more? You can read the full article here

The Future is Unmanned

One of the most rapidly growing areas of innovative technology adoption involves unmanned systems. The U.S. military’s use of these systems—especially armed unmanned systems—is not only changing the face of modern warfare, but is also altering the process of decision-making in combat operations. These systems are evolving rapidly to deliver enhanced capability to the warfighter and seem poised to deliver the next “revolution in military affairs.” However, there are increasing concerns regarding the degree of autonomy these systems—especially armed ones—should have.

I addressed this issue in an article in the professional journal U.S. Naval Institute Proceedings. Here is how I began:

While unmanned systems increasingly impact all aspects of life, it is their use as military assets that has garnered the most attention, and with that attention, growing concern.

The Department of Defense’s (DoD’s) vision for unmanned systems (UxS) is to integrate them into the joint force for a number of reasons, but especially to reduce the risk to human life, to deliver persistent surveillance over areas of interest, and to provide options to warfighters that derive from the technologies’ ability to operate autonomously. The most recent DoD “Unmanned Systems Integrated Roadmap” noted, “DoD envisions unmanned systems seamlessly operating with manned systems while gradually reducing the degree of human control and decision making required for the unmanned portion of the force structure.”

I’ve attached the full article here

Frustrated?

Last week, I posted a blog about “snowplow parents,” and earlier this week posted one about keeping a “failure resume.”

Those thoughts were germinating when I read ANOTHER killer-good piece about how feeling frustrated at work can give rise to success. Here is how it began:

In 2000, Pixar was at the top of its game. “Toy Story” was released five years earlier, and it was the first computer-animated blockbuster on the silver screen. Three years later Pixar debuted “A Bug’s Life” to critical acclaim, and 1999’s “Toy Story 2” was the biggest animated hit of the year.

Concerned about resting on their laurels, the studio’s founders, Steve Jobs and Ed Catmull, hired the company’s first outside director, Brad Bird, to shake things up. Mr. Bird’s most recent film, “Iron Giant,” had flopped financially, and when he pitched his idea for a new movie to Pixar, he was told it would never work: It would take 10 years and cost $500 million to animate.

But Mr. Bird persisted. He recruited a band of disgruntled people inside Pixar — misfits whose ideas had been ignored — to work with him. The resulting movie, “The Incredibles,” won two Oscars and grossed $631 million worldwide, outdoing all of Pixar’s previous successes. (And, for the record, it ended up costing less than $100 million to make.)

We normally avoid frustrated people — we don’t want to get dragged down into a cesspool of complaints and cynicism. We see dissatisfied people as curmudgeons who halt progress, or, worse yet, dementors who suck the joy out of the room. And we have good reason to feel that way: A natural response to frustration is the fight-or-flight response. Disgruntled people often go into “Office Space” mode, choosing to fight by sabotaging the workplace, or flight by doing the bare minimum not to get fired.

But there’s a third reaction to frustration that we’ve overlooked: When we’re dissatisfied, instead of fight or flight, sometimes we invent.

Want more? You can read the full article here

Military Innovation

Among the buzzwords circulating in the U.S. military, “innovation” is likely the most common one we have all encountered over the last decade.

Countless commands have set up “innovation cells” on their staffs and have sought ways to become more innovative, often seeking best practices from industry, especially Silicon Valley.

The Department of Defense has created a Defense Innovation Board composed of outside experts who are charged with finding ways to make DoD more “innovative.”

And just a few years ago, former Secretary of Defense Ashton Carter created the Defense Innovation Unit Experimental – DIU(X), now DIU – at the old Moffett Field near the heart of Silicon Valley.

All of this is good as far as it goes, but the danger is clear. By establishing innovation cells on major staffs, by having outside experts tell the DoD how to be more innovative, and by establishing a large organization to be the DoD’s innovation “place,” we may be sending the wrong signal to the rest of the military and civilian professionals: don’t worry about being innovative, we’ve assigned that task to someone else.

Former Pacific Fleet Commander Admiral Scott Swift was unique among senior commanders in that he purposefully and deliberately did not establish an innovation cell on the PACFLEET staff. As he shared in his remarks at the 2018 Pacific Command Science and Technology Conference, “I want every one of my sailors to be an innovator.”

As the old saw goes, the guy (or gal) who invented the wheel was an inventor; the person who took four wheels and put them on a wagon was an innovator.

We are taken by innovations and innovators: they help define our future and then make it possible.

From Archimedes to Zeppelin, the accomplishments of great visionaries over the centuries have filled history books. More currently, from Jeff Bezos of Amazon to Mark Zuckerberg of Facebook and Elon Musk of SpaceX and Tesla Motors, they are the objects of endless media fascination — and increasingly intense public scrutiny.

Although centuries stretch between them, experts who have studied the nature of innovators across all areas of expertise largely agree that they have important attributes in common, from innovative thinking to an ability to build trust among those who follow them to utter confidence and a stubborn devotion to their dream.

Now facing two peer competitors – China and Russia – who want to create a new world order that puts them at the forefront, the U.S. military needs every soldier, sailor, airman and marine to be an innovator.

One Trillion

The stock market – especially tech – has been down a bit, and it’s easy to forget that not that long ago Apple was valued at one trillion dollars.

There has been a great deal of breathless reporting on this milestone, but much less thoughtful analysis. That’s why I was taken by Jack Nicas’ piece. Here’s how he began:

SAN FRANCISCO — In 1997, Apple was on the ropes. The Silicon Valley pioneer was being decimated by Microsoft and its many partners in the personal-computer market. It had just cut a third of its work force, and it was about 90 days from going broke, Apple’s late co-founder, Steve Jobs, later said.

Recently, Apple became the first publicly traded American company to be worth more than $1 trillion when its shares climbed 3 percent to end the day at $207.39. The gains came two days after the company announced the latest in a series of remarkably profitable quarters.

Apple’s ascent from the brink of bankruptcy to the world’s most valuable public company has been a business tour de force, marked by rapid innovation, a series of smash-hit products and the creation of a sophisticated, globe-spanning supply chain that keeps costs down while producing enormous volumes of cutting-edge devices.

That ascent has also been marked by controversy, tragedy and challenges. Apple’s aggressive use of outside manufacturers in China, for example, has led to criticism that it is taking advantage of poorly paid workers in other countries and robbing Americans of good manufacturing jobs. The company faces numerous questions about how it can continue to grow.

This is just a snippet. Want more? You can read the full article here.
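
A quick back-of-the-envelope aside of my own (not from Nicas’ piece): the $1 trillion figure is simply the closing share price multiplied by Apple’s shares outstanding. The roughly 4.83 billion share count in the sketch below is an assumed approximation for that period, not a number taken from the article.

    # Rough sanity check of the $1 trillion valuation (a sketch, assuming ~4.83B shares outstanding)
    share_price = 207.39            # closing price cited in the article
    shares_outstanding = 4.83e9     # assumed approximate share count, not from the article
    market_cap = share_price * shares_outstanding
    print(f"Implied market cap: ${market_cap / 1e12:.2f} trillion")  # prints roughly $1.00 trillion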

Minecraft Overdrive

Few dispute that games mirror reality and that reality often resembles games. But the game of Minecraft takes this to new levels. And this is driving innovation. Here’s why:

Since its release seven years ago, Minecraft has become a global sensation, captivating a generation of children. There are over 100 million registered players, and it’s now the third-best-selling video game in history, after Tetris and Wii Sports. In 2014, Microsoft bought Minecraft — and Mojang, the Swedish game studio behind it — for $2.5 billion.

There have been blockbuster games before, of course. But as Jordan’s experience suggests — and as parents peering over their children’s shoulders sense — Minecraft is a different sort of phenomenon.

For one thing, it doesn’t really feel like a game. It’s more like a destination, a technical tool, a cultural scene, or all three put together: a place where kids engineer complex machines, shoot videos of their escapades that they post on YouTube, make art and set up servers, online versions of the game where they can hang out with friends. It’s a world of trial and error and constant discovery, stuffed with byzantine secrets, obscure text commands and hidden recipes. And it runs completely counter to most modern computing trends. Where companies like Apple and Microsoft and Google want our computers to be easy to manipulate — designing point-and-click interfaces under the assumption that it’s best to conceal from the average user how the computer works — Minecraft encourages kids to get under the hood, break things, fix them and turn mooshrooms into random-number generators. It invites them to tinker.

In this way, Minecraft culture is a throwback to the heady early days of the digital age. In the late ’70s and ’80s, the arrival of personal computers like the Commodore 64 gave rise to the first generation of kids fluent in computation. They learned to program in Basic, to write software that they swapped excitedly with their peers. It was a playful renaissance that eerily parallels the embrace of Minecraft by today’s youth. As Ian Bogost, a game designer and professor of media studies at Georgia Tech, puts it, Minecraft may well be this generation’s personal computer.

Want more? You can read the full article here