Military Innovation


Among the buzzwords circulating in the U.S. military, "innovation" is likely the most common one we have all encountered over the last decade.

Countless commands have set up “innovation cells” on their staffs and have sought ways to become more innovative, often seeking best practices from industry, especially Silicon Valley.

The Department of Defense has created a Defense Innovation Board composed of outside experts charged with finding ways to make DoD more "innovative."

And just a few years ago, former Secretary of Defense Ashton Carter created the Defense Innovation Unit Experimental – DIU(x), now simply DIU – at the old Moffett Field near the heart of Silicon Valley.

All of this is good as far as it goes – but the danger is clear: by establishing innovation cells on major staffs, by having outside experts tell the DoD how to be more innovative, and by establishing a large organization to be the DoD's innovation "place," we may be sending the wrong signal to the rest of our military and civilian professionals: don't worry about being innovative, we've assigned that task to someone else.

Former Pacific Fleet Commander Admiral Scott Swift was unique among senior commanders in that he deliberately did not establish an innovation cell on the PACFLEET staff. As he shared in his remarks at the 2018 Pacific Command Science and Technology Conference, "I want every one of my sailors to be an innovator."

As the old saw goes, the guy (or gal) who invented the wheel was an inventor; the person who took four wheels and put them on a wagon was an innovator.

We are taken by innovations and innovators; they help define our future and then make it possible.

From Archimedes to Zeppelin, the accomplishments of great visionaries over the centuries have filled history books. More currently, from Jeff Bezos of Amazon to Mark Zuckerberg of Facebook and Elon Musk of SpaceX and Tesla Motors, they are the objects of endless media fascination — and increasingly intense public scrutiny.

Although centuries stretch between them, experts who have studied the nature of innovators across all areas of expertise largely agree that they have important attributes in common, from innovative thinking to an ability to build trust among those who follow them to utter confidence and a stubborn devotion to their dream.

Now facing two peer competitors – China and Russia – who want to create a new world order that puts them at the forefront, the U.S. military needs every soldier, sailor, airman and Marine to be an innovator.

What Price Successful Kids?


Few issues have dominated the media as much as the recent stories about affluent and famous parents cheating the system to get their kids into elite colleges.

These revelations have sparked outrage and have prompted many to question whether America is still a meritocracy. I wonder myself.

More stories of blatant cheating hit the news every day, and many suggest we may just be seeing the tip of the iceberg. Others point out that it's not just the rich and famous who will do almost anything to ease the path for their kids.

Easing the path is good as far as it goes: Not allowing kids to play in dangerous neighborhoods, taking advantage of specialists in school if a child is having trouble with reading or other subjects, being thoughtful regarding which friends they pick – those are all okay for most.

But today, the term "helicopter parent" has been replaced by another, "snowplow parent": those who keep their children's futures obstacle-free — even when it means crossing ethical and legal boundaries.

Here’s how a recent article that addressed the issue began:

Nicole Eisenberg’s older son has wanted to be a star of the stage since he was a toddler, she said. He took voice, dance and drama lessons and attended the renowned Stagedoor Manor summer camp for half a dozen years, but she was anxious that might not be enough to get him into the best performing-arts programs.

So Ms. Eisenberg and others in Bloomfield Hills, Mich., the affluent suburb where she lives, helped him start a charity with friends that raised more than $250,000 over four years.

“The moms — the four or five moms that started it together — we started it, we helped, but we did not do it for them,” Ms. Eisenberg, 49, recalled. “Did we ask for sponsors for them? Yes. Did we ask for money for them? Yes. But they had to do the work.”

She even considered a donation to the college of his choice. “There’s no amount of money we could have paid to have got him in,” Ms. Eisenberg said. “Because, trust me, my father-in-law asked.” (Ms. Eisenberg’s son was admitted to two of the best musical theater programs in the country, she said, along with nine more of the 26 schools he applied to.)

Want more? You can read the full article here.

Amazon and NYC

Recently, New York City Mayor Bill de Blasio wrote a powerful op-ed expressing his disappointment (outrage?) that Amazon abandoned its project to build a second headquarters in New York.

But beyond the opinion of one mayor, his piece tells a story of the power technology companies have amassed. Here is how he begins:

The first word I had that Amazon was about to scrap an agreement to bring 25,000 new jobs to New York City came an hour before it broke in the news on Thursday.

The call was brief and there was little explanation for the company’s reversal.

Just days before, I had counseled a senior Amazon executive about how they could win over some of their critics. Meet with organized labor. Start hiring public housing residents. Invest in infrastructure and other community needs. Show you care about fairness and creating opportunity for the working people of Long Island City.

There was a clear path forward. Put simply: If you don’t like a small but vocal group of New Yorkers questioning your company’s intentions or integrity, prove them wrong.

Instead, Amazon proved them right. Just two hours after a meeting with residents and community leaders to move the project forward, the company abruptly canceled it all.

I am a lifelong progressive who sees the problem of growing income and wealth inequality. The agreement we struck with Amazon back in November was a solid foundation. It would have created: at least 25,000 new jobs, including for unionized construction and service workers; partnerships with public colleges; and $27 billion in new tax revenue to fuel priorities from transit to affordable housing — a ninefold return on the taxes the city and state were prepared to forgo to win the headquarters.

The retail giant’s expansion in New York encountered opposition in no small part because of growing frustration with corporate America. For decades, wealth and power have concentrated at the very top. There’s no greater example of this than Amazon’s chief executive, Jeff Bezos — the richest man in the world.

The lesson here is that corporations can’t ignore rising anger over economic inequality anymore. We see that anger roiling Silicon Valley, in the rocks hurled at buses carrying tech workers from San Francisco and Oakland to office parks in the suburbs. We see it in the protests that erupted at Davos last month over the growing monopoly of corporate power. Want more? You can read the full article here.

Creative Juices


Most people who are involved in creative endeavors are always looking for ways to be MORE creative. I know I am.

That’s why I was drawn to a series of New York Times “Better Living Guides” focused specifically on ways to become more creative.

The link below takes you to five different ideas regarding how to ignite more creativity. I was particularly taken by one that suggests, “Take a Nap.” Here’s how it begins:

So many wonderful scientific experiments have been done to show us the value of downtime for creativity, memory and learning. One of my favorites involves what happens inside the brains of rats when they are constantly stimulated.

In the experiment, done at the University of California at San Francisco, brain activity was measured in rats when they have a new experience. The researchers put the rat on a table, and could see that the rat developed a brain signal associated with a new experience. (It turns out, a new experience for a rat, like being set down on a table the rat has never been on before, can be fairly exciting.) Then, the researchers added a twist by dividing the subject rats into two groups. One group of rats was immediately subjected to another new experience (look, new table!), while another was given downtime.

In the rats that got downtime, researchers could see the brain activity move into a part of the brain called the hippocampus, which is associated with learning and memory. That didn't happen with the rats that had a second new experience. When they were stimulated with a second new experience, they, in effect, had less of an ability to process what they'd already experienced. This is one experiment of many that show how crucial it is to let your brain go lax, to give it time and space. Think of your brain as if it were someone you're in a relationship with; sometimes, it needs to have some alone time. This is just a snippet. Want more? You can read all five articles here.

American Dream


An enormous amount of ink has been spilled in books, articles, the blogosphere and elsewhere bemoaning the fact that the "American Dream" is over.

Many who measure that dream by income, net worth and high-end material possessions would agree with this thesis.

But they've never asked the majority of Americans who stipulate they ARE living that dream. That's why I was drawn to a recent op-ed, "We're Still Living the American Dream." Here's how Samuel Abrams begins:

I am pleased to report that the American dream is alive and well for an overwhelming majority of Americans.

This claim might sound far-fetched given the cultural climate in the United States today. Especially since President Trump took office, hardly a day goes by without a fresh tale of economic anxiety, political disunity or social struggle. Opportunities to achieve material success and social mobility through hard, honest work — which many people, including me, have assumed to be the core idea of the American dream — appear to be diminishing.

But Americans, it turns out, have something else in mind when they talk about the American dream. And they believe that they are living it.

Last year the American Enterprise Institute and I joined forces with the research center NORC at the University of Chicago and surveyed a nationally representative sample of 2,411 Americans about their attitudes toward community and society. The center is renowned for offering “deep” samples of Americans, not just random ones, so that researchers can be confident that they are reaching Americans in all walks of life: rural, urban, exurban and so on.

What our survey found about the American dream came as a surprise to me. When Americans were asked what makes the American dream a reality, they did not select as essential factors becoming wealthy, owning a home or having a successful career. Instead, 85 percent indicated that "to have freedom of choice in how to live" was essential to achieving the American dream. In addition, 83 percent indicated that "a good family life" was essential. Want more? You can read the full article here.

Who Codes


In his best-selling book, The Innovators, Walter Isaacson traces the roots of the computer revolution to mathematician Ada Lovelace, Lord Byron's daughter, who lived from 1815 to 1852.

His history moves into the twentieth century and highlights another female computer pioneer, Grace Hopper.

And it is well known that women were some of the earliest coders and software pioneers over a half-century ago.

But where are female coders today? I’d always wondered, until I read Clive Thompson’s recent piece, “The Secret History of Women in Coding.” Here’s how he begins:

As a teenager in Maryland in the 1950s, Mary Allen Wilkes had no plans to become a software pioneer — she dreamed of being a litigator. One day in junior high in 1950, though, her geography teacher surprised her with a comment: “Mary Allen, when you grow up, you should be a computer programmer!” Wilkes had no idea what a programmer was; she wasn’t even sure what a computer was. Relatively few Americans were. The first digital computers had been built barely a decade earlier at universities and in government labs.

By the time she was graduating from Wellesley College in 1959, she knew her legal ambitions were out of reach. Her mentors all told her the same thing: Don’t even bother applying to law school. “They said: ‘Don’t do it. You may not get in. Or if you get in, you may not get out. And if you get out, you won’t get a job,’ ” she recalls. If she lucked out and got hired, it wouldn’t be to argue cases in front of a judge. More likely, she would be a law librarian, a legal secretary, someone processing trusts and estates.

But Wilkes remembered her junior high school teacher’s suggestion. In college, she heard that computers were supposed to be the key to the future. She knew that the Massachusetts Institute of Technology had a few of them. So on the day of her graduation, she had her parents drive her over to M.I.T. and marched into the school’s employment office. “Do you have any jobs for computer programmers?” she asked. They did, and they hired her.

It might seem strange now that they were happy to take on a random applicant with absolutely no experience in computer programming. But in those days, almost nobody had any experience writing code. The discipline did not yet really exist; there were vanishingly few college courses in it, and no majors. (Stanford, for example, didn't create a computer-science department until 1965.) So instead, institutions that needed programmers just used aptitude tests to evaluate applicants' ability to think logically. Wilkes happened to have some intellectual preparation: As a philosophy major, she had studied symbolic logic, which can involve creating arguments and inferences by stringing together and/or statements in a way that resembles coding. Want more? You can read it here.

W.E.B!

We lost one of our generation's great writers last month: W.E.B. Griffin. Many of us who love military stories have tried to emulate his work. None of us could come close.

W. E. B. Griffin, who depicted the swashbuckling lives of soldiers, spies and cops in almost 60 novels, dozens of which became best sellers, died on Feb. 12 at his home in Daphne, Ala. He was 89.

W.E.B. Griffin estimated that he had published more than 150 books, many of which appeared on the best-seller lists of The New York Times, Publishers Weekly and other publications. His output included fiction for young adults and children’s books. Determining the exact number of books he wrote is not so easily done, however: He was a ghostwriter for many, and many others were published under a variety of pseudonyms, including Webb Beech, Edmund O. Scholefield, Allison Mitchell and Blakely St. James.

Even the name W. E. B. Griffin was a pseudonym; his real name was William E. Butterworth III.

His best-known books are under the Griffin name. The first was “The Lieutenants” (1982), which became the first installment in “The Brotherhood of War,” a nine-novel series that followed soldiers in the United States Army from World War II through the Vietnam War. Among his other series were “Badge of Honor,” about the Philadelphia Police Department, and “Clandestine Operations,” about the birth of the Central Intelligence Agency.

His fast-paced novels, rooted in history and chockablock with technical details, combined action, sex and patriotism and had a devoted readership. A profile in The Washington Post in 1997 described Mr. Griffin as “the grizzled griot of the warrior breed” and “the troubadour of the American serviceman.”

Mr. Griffin saw himself in simpler terms. “Basically I’m a storyteller,” he said. “I like to think I’m a competent craftsman, as writers go, but I am wholly devoid of literary ambitions or illusions.”

Happy?


Some of you reading this may be part of America’s professional elite – and you may have the wealth to prove it.

But most of you (us) are not, and somewhere deep inside we may wonder whether, if only we were part of that professional elite, we might be demonstrably happier.

Think again. I had vaguely thought the answer was no, but when I read Charles Duhigg's excellent piece, "Wealthy, Successful and Miserable," I got it – big time. Here's how he begins:

My first, charmed week as a student at Harvard Business School, late in the summer of 2001, felt like a halcyon time for capitalism. AOL Time Warner, Yahoo and Napster were benevolently connecting the world. Enron and WorldCom were bringing innovation to hidebound industries. President George W. Bush — an H.B.S. graduate himself — had promised to deliver progress and prosperity with businesslike efficiency.

The next few years would prove how little we (and Washington and much of corporate America) really understood about the economy and the world. But at the time, for the 895 first-years preparing ourselves for business moguldom, what really excited us was our good luck. A Harvard M.B.A. seemed like a winning lottery ticket, a gilded highway to world-changing influence, fantastic wealth and — if those self-satisfied portraits that lined the hallways were any indication — a lifetime of deeply meaningful work.

So it came as a bit of a shock, when I attended my 15th reunion last summer, to learn how many of my former classmates weren't overjoyed by their professional lives — in fact, they were miserable. I heard about one fellow alum who had run a large hedge fund until being sued by investors (who also happened to be the fund manager's relatives). Another person had risen to a senior role inside one of the nation's most prestigious companies before being savagely pushed out by corporate politics. Another had learned in the maternity ward that her firm was being stolen by a conniving partner. Want more? You can read the full article here.

Facebook


It's been about five years since Jim Cramer and Bob Lang coined the acronym "FANG" for the mega-cap, high-growth stocks Facebook, Amazon, Netflix and Google (now Alphabet).

And while it just happens to lead a handy acronym, Facebook is quite possibly the most controversial tech company of all time.

For most, this is due to one person: Facebook's CEO, Mark Zuckerberg. It has been almost a decade since the movie about Facebook and its founder, The Social Network, hit with such force.

We remain fascinated by Facebook and Zuckerberg. We want to learn more, but we want something different. That's why I was drawn in by a book review of "Zucked." Here's how it begins:

The dystopia George Orwell conjured up in “1984” wasn’t a prediction. It was, instead, a reflection. Newspeak, the Ministry of Truth, the Inner Party, the Outer Party — that novel sampled and remixed a reality that Nazi and Soviet totalitarianism had already made apparent. Scary stuff, certainly, but maybe the more frightening dystopia is the one no one warned you about, the one you wake up one morning to realize you’re living inside.

Roger McNamee, an esteemed venture capitalist, would appear to agree. “A dystopian technology future overran our lives before we were ready,” he writes in “Zucked.” Think that sounds like overstatement? Let’s examine the evidence. At its peak the planet’s fourth most valuable company, and arguably its most influential, is controlled almost entirely by a young man with the charisma of a geometry T.A. The totality of this man’s professional life has been running this company, which calls itself “a platform.”

Company, platform — whatever it is, it provides a curious service wherein billions of people fill it with content: baby photos, birthday wishes, concert promotions, psychotic premonitions of Jewish lizard-men. No one is paid by the company for this labor; on the contrary, users are rewarded by being tracked across the web, even when logged out, and consequently strip-mined by a complicated artificial intelligence trained to sort surveilled information into approximately 29,000 predictive data points, which are then made available to advertisers and other third parties, who now know everything that can be known about a person without trepanning her skull. Amazingly, none of this is secret, despite the company's best efforts to keep it so. Somehow, people still use and love this platform. Want more? You can read the full article here.

Unsung Warriors


Much has been written – most of it reasonable, but some of it shrill – regarding women in the U.S. military.

While integration of women into the U.S. military has progressed by leaps and bounds in the last few decades, one place that has remained a male-only bastion has been Special Operations.

Or has it? Four Americans were killed by a suicide bomber in Syria in mid-January. One was U.S. Navy Cryptologic Technician Chief Petty Officer Shannon Kent, who was operating with U.S. Navy SEALs. While not "officially" a SEAL or other Special Operator, she was just as vital to the mission as her male counterparts – and just as vulnerable.

Here is how a recent New York Times article describes how Chief Petty Officer Kent served – and died.

Given who she really was, military officials had little choice in how they described Shannon Kent. They said only that she was a “cryptologic technician,” which anyone might assume meant that her most breakneck work was behind a desk.

In reality, she spent much of her professional life wearing body armor and toting an M4 rifle, a Sig Sauer pistol strapped to her thigh, on operations with Navy SEALs and other elite forces — until a suicide bombing took her life last month in northeastern Syria.

She was, in all but name, part of the military’s top-tier Special Operations forces. Officially a chief petty officer in the Navy, she actually worked closely with the nation’s most secretive intelligence outfit, the National Security Agency, to target leaders of the Islamic State.

The last few years have seen a profound shift in attitudes toward women in combat roles. Since 2016, combat jobs have been open to female service members, and they have been permitted to try out for Special Operations units. More than a dozen have completed the Army’s Ranger school, one of the most challenging in the military. Some have graduated from infantry officer courses, and even command combat units. And in November, a woman completed the Army’s grueling Special Forces Assessment and Selection course, the initial step to becoming a Green Beret.

Yet Chief Kent illustrates an unspoken truth: that for many years women have been doing military jobs as dangerous, secretive and specialized as anything men do. This is just a snippet. Want more? You can read the full article here.