Failure = Success


Last week, I blogged about “snowplow parents” who keep their children’s futures obstacle-free — even when it means crossing ethical and legal boundaries.

They don’t want their children to fail, and they will wreck themselves, and their children’s futures, in pursuit of this end.

They forget that the objective of parenting is supposed to be to prepare the child for the road, not the road for the child.


That’s why I found Tim Herrera’s latest piece, “Do You Keep a Failure Résumé? Here’s Why You Should Start,” so refreshing, especially the article’s subtitle, “Failure isn’t a roadblock. It’s part of the process.” Here’s how he begins:

A little more than three years ago, I had to put together this presentation at work. It was on a topic I wasn’t very familiar with, but I took it on anyway, figuring I could get up to speed and deliver something useful and productive.

Friends, if you hadn’t guessed yet, I bombed it. I wasn’t prepared enough, I missed a few major points, and I didn’t give myself enough time to complete it. Not my greatest work.

But I have such fond memories of that presentation — O.K., maybe not exactly fond — because it was my first significant screw-up at a new job. It’s still something I look to when I’m in a similar position at work; I know what went wrong then, so I can try to fix those issues now before they become problems.

When things go right, we’re generally pretty good at identifying why they went right — that is, if we even take time to analyze the success at all. Preparation, proper scheduling, smart delegation and so on. If it ain’t broke, don’t fix it. But falling on our face gives us the rare opportunity to find and address the things that went wrong (or, even more broadly, the traits or habits that led us to fail), and it’s an opportunity we should welcome. Want more? You can read the full article here.

Creating Tomorrow’s World


We all want to know what the future will hold. While no one can really know, there is mounting evidence that writers of speculative fiction may have unique insights into the future.

As a writer, I know this because I read a great deal of speculative fiction and it feeds my writing efforts.

That’s why I was drawn to a recent article, “When Sci-Fi Comes True.” Here is a short excerpt. It will make you think:

Maybe because we’re living in a dystopia, it feels as if we’ve become obsessed with prophecy of late.

In “The Dreams Our Stuff Is Made Of: How Science Fiction Conquered the World,” Thomas Disch calls this relay between fiction and reality “creative visualization.” Businesses have started to co-opt it. The designers of the iPhone and the Kindle cite works of science fiction as inspiration. Boeing, Nike, Ford and Intel have hired prototyping, future-casting or world-building ventures for product development. As the author Brian Merchant put it on Medium recently, these companies “do what science fiction has always done — build rich speculative worlds, describe that world’s bounty and perils, and, finally, envision how that future might fall to pieces.” This is “speculative” fiction in the financial sense, too, a new way to gamble on futures.

The irony — or the proof — of this brave new business model is that sci-fi saw it coming. Dystopias have long portrayed artists being drafted into nefarious corporate labor. In “Blade Runner 2049,” for instance, the Wallace Corporation sets a woman the task of crafting memories — not for characters in a novel, but for androids.

It’s a touch self-congratulatory for sci-fi creators to imply that they’re the unacknowledged designers of the world. But they do seem to have a knack for innovation. The genre has predicted satellite communication, army tanks, tablets, submarines, psychotropic pills, bionic limbs, CCTV, electric cars and video calling. You can find dozens more examples of sci-fi-minted gadgetry on the internet, which is itself a prime example of the phenomenon. The word “cyberspace” first appeared in the cyberpunk novel “Neuromancer” (1984), to describe “a consensual hallucination …. A graphic representation of data abstracted from the banks of every computer in the human system.” Its author, William Gibson, is our Nostradamus: His novels have prophesied reality television, viral marketing and nanotechnology. Want more? You can read it here.

Military Innovation


Among the buzzwords circulating in the U.S. military, “innovation” is likely the most common one we have all encountered over the last decade.

Countless commands have set up “innovation cells” on their staffs and have sought ways to become more innovative, often seeking best practices from industry, especially Silicon Valley.

The Department of Defense has created a Defense Innovation Board composed of outside experts charged with finding ways to make DoD more “innovative.”

And just a few years ago, former Secretary of Defense Ashton Carter created the Defense Innovation Unit Experimental – DIU(x), now DIU – at the old Moffett Field near the heart of Silicon Valley.

All of this is good as far as it goes, but the danger is clear: by establishing innovation cells on major staffs, by having outside experts tell the DoD how to be more innovative, and by establishing a large organization to be the DoD’s innovation “place,” we may be sending the wrong signal to the rest of the military and civilian professionals: Don’t worry about being innovative; we’ve assigned that task to someone else.

Former Pacific Fleet Commander Admiral Scott Swift was unique among senior commanders in that he purposefully and deliberately did not establish an innovation cell on the PACFLEET staff. As he shared in his remarks at the 2018 Pacific Command Science and Technology Conference, “I want every one of my sailors to be an innovator.”

As the old saw goes, the guy (or gal) who invented the wheel was an inventor; the person who took four wheels and put them on a wagon was an innovator.

We are taken by innovations and innovators; they help define our future and then make it possible.

From Archimedes to Zeppelin, the accomplishments of great visionaries over the centuries have filled history books. More recently, from Jeff Bezos of Amazon to Mark Zuckerberg of Facebook and Elon Musk of SpaceX and Tesla Motors, they are the objects of endless media fascination — and increasingly intense public scrutiny.

Although centuries stretch between them, experts who have studied the nature of innovators across all areas of expertise largely agree that they have important attributes in common, from innovative thinking to an ability to build trust among those who follow them to utter confidence and a stubborn devotion to their dream.

Now facing two peer competitors – China and Russia – who want to create a new world order that puts them at the forefront, the U.S. military needs every soldier, sailor, airman and Marine to be an innovator.

What Price Successful Kids?


Few issues have dominated the media as much as the recent stories about affluent and famous parents cheating the system to get their kids into elite colleges.

These revelations have sparked outrage and have prompted many to question whether America is still a meritocracy. I wonder myself.

More stories of blatant cheating hit the news every day, and many suggest we may just be seeing the tip of the iceberg. Others point out that it’s not just the rich and famous who will do almost anything to ease the path for their kids.

Easing the path is good as far as it goes: Not allowing kids to play in dangerous neighborhoods, taking advantage of specialists in school if a child is having trouble with reading or other subjects, being thoughtful regarding which friends they pick – those are all okay for most.

But today, the term “helicopter parent” has been replaced by another one, “snowplow parent”: parents who keep their children’s futures obstacle-free, even when it means crossing ethical and legal boundaries.

Here’s how a recent article that addressed the issue began:

Nicole Eisenberg’s older son has wanted to be a star of the stage since he was a toddler, she said. He took voice, dance and drama lessons and attended the renowned Stagedoor Manor summer camp for half a dozen years, but she was anxious that might not be enough to get him into the best performing-arts programs.

So Ms. Eisenberg and others in Bloomfield Hills, Mich., the affluent suburb where she lives, helped him start a charity with friends that raised more than $250,000 over four years.

“The moms — the four or five moms that started it together — we started it, we helped, but we did not do it for them,” Ms. Eisenberg, 49, recalled. “Did we ask for sponsors for them? Yes. Did we ask for money for them? Yes. But they had to do the work.”

She even considered a donation to the college of his choice. “There’s no amount of money we could have paid to have got him in,” Ms. Eisenberg said. “Because, trust me, my father-in-law asked.” (Ms. Eisenberg’s son was admitted to two of the best musical theater programs in the country, she said, along with nine more of the 26 schools he applied to.) Want more? You can read the full article here.

Amazon and NYC

Recently, New York City Mayor Bill de Blasio wrote a powerful op-ed expressing his disappointment (outrage?) that Amazon abandoned its project to build a second headquarters in New York.

But beyond the opinion of one mayor, his piece tells a story of the power technology companies have amassed. Here is how he begins:

The first word I had that Amazon was about to scrap an agreement to bring 25,000 new jobs to New York City came an hour before it broke in the news on Thursday.

The call was brief and there was little explanation for the company’s reversal.

Just days before, I had counseled a senior Amazon executive about how they could win over some of their critics. Meet with organized labor. Start hiring public housing residents. Invest in infrastructure and other community needs. Show you care about fairness and creating opportunity for the working people of Long Island City.

There was a clear path forward. Put simply: If you don’t like a small but vocal group of New Yorkers questioning your company’s intentions or integrity, prove them wrong.

Instead, Amazon proved them right. Just two hours after a meeting with residents and community leaders to move the project forward, the company abruptly canceled it all.

I am a lifelong progressive who sees the problem of growing income and wealth inequality. The agreement we struck with Amazon back in November was a solid foundation. It would have created: at least 25,000 new jobs, including for unionized construction and service workers; partnerships with public colleges; and $27 billion in new tax revenue to fuel priorities from transit to affordable housing — a ninefold return on the taxes the city and state were prepared to forgo to win the headquarters.

The retail giant’s expansion in New York encountered opposition in no small part because of growing frustration with corporate America. For decades, wealth and power have concentrated at the very top. There’s no greater example of this than Amazon’s chief executive, Jeff Bezos — the richest man in the world.

The lesson here is that corporations can’t ignore rising anger over economic inequality anymore. We see that anger roiling Silicon Valley, in the rocks hurled at buses carrying tech workers from San Francisco and Oakland to office parks in the suburbs. We see it in the protests that erupted at Davos last month over the growing monopoly of corporate power. Want more? You can read the full article here.

Creative Juices


Most people who are involved in creative endeavors are always looking for ways to be MORE creative. I know I am.

That’s why I was drawn to a series of New York Times “Better Living Guides” focused specifically on ways to become more creative.

The link below takes you to five different ideas regarding how to ignite more creativity. I was particularly taken by one that suggests, “Take a Nap.” Here’s how it begins:

So many wonderful scientific experiments have been done to show us the value of downtime for creativity, memory and learning. One of my favorites involves what happens inside the brains of rats when they are constantly stimulated.

In the experiment, done at the University of California at San Francisco, brain activity was measured in rats when they have a new experience. The researchers put the rat on a table, and could see that the rat developed a brain signal associated with a new experience. (It turns out, a new experience for a rat, like being set down on a table the rat has never been on before, can be fairly exciting.) Then, the researchers added a twist by dividing the subject rats into two groups. One group of rats was immediately subjected to another new experience (look, new table!), while another was given downtime.

In the rats that got downtime, researchers could see the brain activity move into a part of the brain called the hippocampus, which is associated with learning and memory. That didn’t happen with the rats that had a second new experience. When they were stimulated with a second new experience, they, in effect, had less of an ability to process what they’d already experienced. This is one experiment of many that show how crucial it is to let your brain go lax, to give it time and space. Think of your brain as if it were someone you’re in a relationship with; sometimes, it needs to have some alone time. This is just a snippet. Want more? You can read all five articles here.

American Dream


An enormous amount of ink has been spilled in books, articles, the blogosphere and elsewhere bemoaning the fact that the “American Dream” is over.

Many who measure that dream by income, net worth and high-end material possessions would agree with this thesis.

But they’ve never asked the majority of Americans who say they ARE living that dream. That’s why I was drawn to a recent op-ed, “We’re Still Living the American Dream.” Here’s how Samuel Abrams begins:

I am pleased to report that the American dream is alive and well for an overwhelming majority of Americans.

This claim might sound far-fetched given the cultural climate in the United States today. Especially since President Trump took office, hardly a day goes by without a fresh tale of economic anxiety, political disunity or social struggle. Opportunities to achieve material success and social mobility through hard, honest work — which many people, including me, have assumed to be the core idea of the American dream — appear to be diminishing.

But Americans, it turns out, have something else in mind when they talk about the American dream. And they believe that they are living it.

Last year the American Enterprise Institute and I joined forces with the research center NORC at the University of Chicago and surveyed a nationally representative sample of 2,411 Americans about their attitudes toward community and society. The center is renowned for offering “deep” samples of Americans, not just random ones, so that researchers can be confident that they are reaching Americans in all walks of life: rural, urban, exurban and so on.

What our survey found about the American dream came as a surprise to me. When Americans were asked what makes the American dream a reality, they did not select as essential factors becoming wealthy, owning a home or having a successful career. Instead, 85 percent indicated that “to have freedom of choice in how to live” was essential to achieving the American dream. In addition, 83 percent indicated that “a good family life” was essential. Want more? You can read the full article here.

Who Codes


In his best-selling book, The Innovators, Walter Isaacson traces the roots of the computer revolution to mathematician Ada Lovelace, Lord Byron’s daughter, who lived from 1815 to 1852.

His history moves into the twentieth century and highlights another female computer pioneer, Grace Hopper.

And it is well known that women were some of the earliest coders and software pioneers over a half-century ago.

But where are female coders today? I’d always wondered, until I read Clive Thompson’s recent piece, “The Secret History of Women in Coding.” Here’s how he begins:

As a teenager in Maryland in the 1950s, Mary Allen Wilkes had no plans to become a software pioneer — she dreamed of being a litigator. One day in junior high in 1950, though, her geography teacher surprised her with a comment: “Mary Allen, when you grow up, you should be a computer programmer!” Wilkes had no idea what a programmer was; she wasn’t even sure what a computer was. Relatively few Americans were. The first digital computers had been built barely a decade earlier at universities and in government labs.

By the time she was graduating from Wellesley College in 1959, she knew her legal ambitions were out of reach. Her mentors all told her the same thing: Don’t even bother applying to law school. “They said: ‘Don’t do it. You may not get in. Or if you get in, you may not get out. And if you get out, you won’t get a job,’ ” she recalls. If she lucked out and got hired, it wouldn’t be to argue cases in front of a judge. More likely, she would be a law librarian, a legal secretary, someone processing trusts and estates.

But Wilkes remembered her junior high school teacher’s suggestion. In college, she heard that computers were supposed to be the key to the future. She knew that the Massachusetts Institute of Technology had a few of them. So on the day of her graduation, she had her parents drive her over to M.I.T. and marched into the school’s employment office. “Do you have any jobs for computer programmers?” she asked. They did, and they hired her.

It might seem strange now that they were happy to take on a random applicant with absolutely no experience in computer programming. But in those days, almost nobody had any experience writing code. The discipline did not yet really exist; there were vanishingly few college courses in it, and no majors. (Stanford, for example, didn’t create a computer-science department until 1965.) So instead, institutions that needed programmers just used aptitude tests to evaluate applicants’ ability to think logically. Wilkes happened to have some intellectual preparation: As a philosophy major, she had studied symbolic logic, which can involve creating arguments and inferences by stringing together and/or statements in a way that resembles coding. Want more? You can read it here.

W.E.B!

We lost one of our generation’s great writers last month, W.E.B. Griffin. Many of us who love military stories have tried to emulate his work. None of us could come close.

W. E. B. Griffin, who depicted the swashbuckling lives of soldiers, spies and cops in almost 60 novels, dozens of which became best sellers, died on Feb. 12 at his home in Daphne, Ala. He was 89.

W.E.B. Griffin estimated that he had published more than 150 books, many of which appeared on the best-seller lists of The New York Times, Publishers Weekly and other publications. His output included fiction for young adults and children’s books. Determining the exact number of books he wrote is not so easily done, however: He was a ghostwriter for many, and many others were published under a variety of pseudonyms, including Webb Beech, Edmund O. Scholefield, Allison Mitchell and Blakely St. James.

Even the name W. E. B. Griffin was a pseudonym; his real name was William E. Butterworth III.

His best-known books are under the Griffin name. The first was “The Lieutenants” (1982), which became the first installment in “The Brotherhood of War,” a nine-novel series that followed soldiers in the United States Army from World War II through the Vietnam War. Among his other series were “Badge of Honor,” about the Philadelphia Police Department, and “Clandestine Operations,” about the birth of the Central Intelligence Agency.

His fast-paced novels, rooted in history and chockablock with technical details, combined action, sex and patriotism and had a devoted readership. A profile in The Washington Post in 1997 described Mr. Griffin as “the grizzled griot of the warrior breed” and “the troubadour of the American serviceman.”

Mr. Griffin saw himself in simpler terms. “Basically I’m a storyteller,” he said. “I like to think I’m a competent craftsman, as writers go, but I am wholly devoid of literary ambitions or illusions.”

Happy?


Some of you reading this may be part of America’s professional elite – and you may have the wealth to prove it.

But most of you (us) are not, and somewhere deep inside we may wonder whether, if only we were part of that professional elite, we might be demonstrably happier.

Think again. I had vaguely thought the answer was no, but when I read Charles Duhigg’s excellent piece, “Wealthy, Successful and Miserable,” I got it – big time. Here’s how he begins:

My first, charmed week as a student at Harvard Business School, late in the summer of 2001, felt like a halcyon time for capitalism. AOL Time Warner, Yahoo and Napster were benevolently connecting the world. Enron and WorldCom were bringing innovation to hidebound industries. President George W. Bush — an H.B.S. graduate himself — had promised to deliver progress and prosperity with businesslike efficiency.

The next few years would prove how little we (and Washington and much of corporate America) really understood about the economy and the world. But at the time, for the 895 first-years preparing ourselves for business moguldom, what really excited us was our good luck. A Harvard M.B.A. seemed like a winning lottery ticket, a gilded highway to world-changing influence, fantastic wealth and — if those self-satisfied portraits that lined the hallways were any indication — a lifetime of deeply meaningful work.

So it came as a bit of a shock, when I attended my 15th reunion last summer, to learn how many of my former classmates weren’t overjoyed by their professional lives — in fact, they were miserable. I heard about one fellow alum who had run a large hedge fund until being sued by investors (who also happened to be the fund manager’s relatives). Another person had risen to a senior role inside one of the nation’s most prestigious companies before being savagely pushed out by corporate politics. Another had learned in the maternity ward that her firm was being stolen by a conniving partner. Want more? You can read the full article here.