American Dream


An enormous amount of ink has been spilled in books, articles, the blogosphere and elsewhere bemoaning the fact that the “American Dream” is over.

Many who measure that dream by income, net worth and high-end material possessions would agree with this thesis.

But they’ve never asked the majority of Americans who insist they ARE living that dream. That’s why I was drawn to a recent op-ed, “We’re Still Living the American Dream.” Here’s how Samuel Abrams begins:

I am pleased to report that the American dream is alive and well for an overwhelming majority of Americans.

This claim might sound far-fetched given the cultural climate in the United States today. Especially since President Trump took office, hardly a day goes by without a fresh tale of economic anxiety, political disunity or social struggle. Opportunities to achieve material success and social mobility through hard, honest work — which many people, including me, have assumed to be the core idea of the American dream — appear to be diminishing.

But Americans, it turns out, have something else in mind when they talk about the American dream. And they believe that they are living it.

Last year the American Enterprise Institute and I joined forces with the research center NORC at the University of Chicago and surveyed a nationally representative sample of 2,411 Americans about their attitudes toward community and society. The center is renowned for offering “deep” samples of Americans, not just random ones, so that researchers can be confident that they are reaching Americans in all walks of life: rural, urban, exurban and so on.

What our survey found about the American dream came as a surprise to me. When Americans were asked what makes the American dream a reality, they did not select as essential factors becoming wealthy, owning a home or having a successful career. Instead, 85 percent indicated that “to have freedom of choice in how to live” was essential to achieving the American dream. In addition, 83 percent indicated that “a good family life” was essential. Want more? You can read the full article here

Who Codes


In his best-selling book, The Innovators, Walter Isaacson traces the roots of the computer revolution to the mathematician Ada Lovelace, Lord Byron’s daughter, who lived from 1815 to 1852.

His history moves into the twentieth century and highlights another female computer pioneer, Grace Hopper.

And it is well known that women were some of the earliest coders and software pioneers over a half-century ago.

But where are female coders today? I’d always wondered, until I read Clive Thompson’s recent piece, “The Secret History of Women in Coding.” Here’s how he begins:

As a teenager in Maryland in the 1950s, Mary Allen Wilkes had no plans to become a software pioneer — she dreamed of being a litigator. One day in junior high in 1950, though, her geography teacher surprised her with a comment: “Mary Allen, when you grow up, you should be a computer programmer!” Wilkes had no idea what a programmer was; she wasn’t even sure what a computer was. Relatively few Americans were. The first digital computers had been built barely a decade earlier at universities and in government labs.

By the time she was graduating from Wellesley College in 1959, she knew her legal ambitions were out of reach. Her mentors all told her the same thing: Don’t even bother applying to law school. “They said: ‘Don’t do it. You may not get in. Or if you get in, you may not get out. And if you get out, you won’t get a job,’ ” she recalls. If she lucked out and got hired, it wouldn’t be to argue cases in front of a judge. More likely, she would be a law librarian, a legal secretary, someone processing trusts and estates.

But Wilkes remembered her junior high school teacher’s suggestion. In college, she heard that computers were supposed to be the key to the future. She knew that the Massachusetts Institute of Technology had a few of them. So on the day of her graduation, she had her parents drive her over to M.I.T. and marched into the school’s employment office. “Do you have any jobs for computer programmers?” she asked. They did, and they hired her.

It might seem strange now that they were happy to take on a random applicant with absolutely no experience in computer programming. But in those days, almost nobody had any experience writing code. The discipline did not yet really exist; there were vanishingly few college courses in it, and no majors. (Stanford, for example, didn’t create a computer-science department until 1965.) So instead, institutions that needed programmers just used aptitude tests to evaluate applicants’ ability to think logically. Wilkes happened to have some intellectual preparation: As a philosophy major, she had studied symbolic logic, which can involve creating arguments and inferences by stringing together and/or statements in a way that resembles coding. Want more? You can read it here

W.E.B!

We lost one of our generation’s great writers last month, W.E.B. Griffin. Many of us who love military stories tried to emulate his work. None of us could come close.

W. E. B. Griffin, who depicted the swashbuckling lives of soldiers, spies and cops in almost 60 novels, dozens of which became best sellers, died on Feb. 12 at his home in Daphne, Ala. He was 89.

W.E.B. Griffin estimated that he had published more than 150 books, many of which appeared on the best-seller lists of The New York Times, Publishers Weekly and other publications. His output included fiction for young adults and children’s books. Determining the exact number of books he wrote is not so easily done, however: He was a ghostwriter for many, and many others were published under a variety of pseudonyms, including Webb Beech, Edmund O. Scholefield, Allison Mitchell and Blakely St. James.

Even the name W. E. B. Griffin was a pseudonym; his real name was William E. Butterworth III.

His best-known books are under the Griffin name. The first was “The Lieutenants” (1982), which became the first installment in “The Brotherhood of War,” a nine-novel series that followed soldiers in the United States Army from World War II through the Vietnam War. Among his other series were “Badge of Honor,” about the Philadelphia Police Department, and “Clandestine Operations,” about the birth of the Central Intelligence Agency.

His fast-paced novels, rooted in history and chockablock with technical details, combined action, sex and patriotism and had a devoted readership. A profile in The Washington Post in 1997 described Mr. Griffin as “the grizzled griot of the warrior breed” and “the troubadour of the American serviceman.”

Mr. Griffin saw himself in simpler terms. “Basically I’m a storyteller,” he said. “I like to think I’m a competent craftsman, as writers go, but I am wholly devoid of literary ambitions or illusions.”

Happy?


Some of you reading this may be part of America’s professional elite – and you may have the wealth to prove it.

But most of you (us) are not, and somewhere deep inside we may wonder whether, if only we were part of that professional elite, we might be demonstrably happier.

Think again. I had vaguely thought the answer was no, but when I read Charles Duhigg’s excellent piece, “Wealthy, Successful and Miserable,” I got it – big time. Here’s how he begins:

My first, charmed week as a student at Harvard Business School, late in the summer of 2001, felt like a halcyon time for capitalism. AOL Time Warner, Yahoo and Napster were benevolently connecting the world. Enron and WorldCom were bringing innovation to hidebound industries. President George W. Bush — an H.B.S. graduate himself — had promised to deliver progress and prosperity with businesslike efficiency.

The next few years would prove how little we (and Washington and much of corporate America) really understood about the economy and the world. But at the time, for the 895 first-years preparing ourselves for business moguldom, what really excited us was our good luck. A Harvard M.B.A. seemed like a winning lottery ticket, a gilded highway to world-changing influence, fantastic wealth and — if those self-satisfied portraits that lined the hallways were any indication — a lifetime of deeply meaningful work.

So it came as a bit of a shock, when I attended my 15th reunion last summer, to learn how many of my former classmates weren’t overjoyed by their professional lives — in fact, they were miserable. I heard about one fellow alum who had run a large hedge fund until being sued by investors (who also happened to be the fund manager’s relatives). Another person had risen to a senior role inside one of the nation’s most prestigious companies before being savagely pushed out by corporate politics. Another had learned in the maternity ward that her firm was being stolen by a conniving partner. Want more? You can read the full article here

Facebook


It’s been about five years since Jim Cramer and Bob Lang coined the acronym “FANG” for the mega-cap, high-growth stocks Facebook, Amazon, Netflix and Google (now Alphabet).

And while it just happens to lead a handy acronym, Facebook is quite possibly the most controversial tech company of all time.

For most, this is due to one person, Facebook’s CEO, Mark Zuckerberg. It has been almost a decade since the movie about Facebook and its founder, The Social Network, hit with such force.

We remain fascinated by Facebook and Zuckerberg. We want to learn more, but we want something different. That’s why I was drawn in by a book review for “Zucked.” Here’s how it begins:

The dystopia George Orwell conjured up in “1984” wasn’t a prediction. It was, instead, a reflection. Newspeak, the Ministry of Truth, the Inner Party, the Outer Party — that novel sampled and remixed a reality that Nazi and Soviet totalitarianism had already made apparent. Scary stuff, certainly, but maybe the more frightening dystopia is the one no one warned you about, the one you wake up one morning to realize you’re living inside.

Roger McNamee, an esteemed venture capitalist, would appear to agree. “A dystopian technology future overran our lives before we were ready,” he writes in “Zucked.” Think that sounds like overstatement? Let’s examine the evidence. At its peak the planet’s fourth most valuable company, and arguably its most influential, is controlled almost entirely by a young man with the charisma of a geometry T.A. The totality of this man’s professional life has been running this company, which calls itself “a platform.”

Company, platform — whatever it is, it provides a curious service wherein billions of people fill it with content: baby photos, birthday wishes, concert promotions, psychotic premonitions of Jewish lizard-men. No one is paid by the company for this labor; on the contrary, users are rewarded by being tracked across the web, even when logged out, and consequently strip-mined by a complicated artificial intelligence trained to sort surveilled information into approximately 29,000 predictive data points, which are then made available to advertisers and other third parties, who now know everything that can be known about a person without trepanning her skull. Amazingly, none of this is secret, despite the company’s best efforts to keep it so. Somehow, people still use and love this platform. Want more? You can read the full article here

Unsung Warriors


Much has been written – most of it reasonable, but some of it shrill – regarding women in the U.S. military.

While integration of women into the U.S. military has progressed by leaps and bounds in the last few decades, one place that has remained a male-only bastion has been Special Operations.

Or has it? Four Americans were killed by a suicide bomber in Syria in mid-January. One was U.S. Navy Cryptologic Technician Chief Petty Officer Shannon Kent. She was operating with the U.S. Navy SEALs. While not “officially” a SEAL or other Special Operator, she was just as vital to the mission as her male counterparts – and just as vulnerable.

Here is how a recent New York Times article describes how Chief Petty Officer Kent served – and died.

Given who she really was, military officials had little choice in how they described Shannon Kent. They said only that she was a “cryptologic technician,” which anyone might assume meant that her most breakneck work was behind a desk.

In reality, she spent much of her professional life wearing body armor and toting an M4 rifle, a Sig Sauer pistol strapped to her thigh, on operations with Navy SEALs and other elite forces — until a suicide bombing took her life last month in northeastern Syria.

She was, in all but name, part of the military’s top-tier Special Operations forces. Officially a chief petty officer in the Navy, she actually worked closely with the nation’s most secretive intelligence outfit, the National Security Agency, to target leaders of the Islamic State.

The last few years have seen a profound shift in attitudes toward women in combat roles. Since 2016, combat jobs have been open to female service members, and they have been permitted to try out for Special Operations units. More than a dozen have completed the Army’s Ranger school, one of the most challenging in the military. Some have graduated from infantry officer courses, and even command combat units. And in November, a woman completed the Army’s grueling Special Forces Assessment and Selection course, the initial step to becoming a Green Beret.

Yet Chief Kent illustrates an unspoken truth: that for many years women have been doing military jobs as dangerous, secretive and specialized as anything men do. This is just a snippet. Want more? You can read the full article here

Easy Not To Be Rude


Remember when e-mail was novel? Remember Tom Hanks and Meg Ryan in the 1998 movie “You’ve Got Mail”? Recall how excited she was when her computer went “bing.”

We’ve come a long way in the last two-plus decades since that movie. Most of us feel we are drowning in e-mails. The difference is how we deal with it.

That’s why I found Adam Grant’s piece, “No, You Can’t Ignore E-Mail. It’s Rude,” so refreshing – and useful.

Far from being a mere polemic against those who ignore their e-mail, his piece shows how those who can’t (or choose not to) keep up are hurting themselves at work and in life. Here’s how he begins:

I’m really sorry I didn’t say hi, make eye contact or acknowledge your presence in any way when you waved to me in the hallway the other day. It’s nothing personal. I just have too many people trying to greet me these days, and I can’t respond to everyone.

That sounds ridiculous, right? You would never snub a colleague trying to strike up a conversation. Yet when you ignore a personal email, that’s exactly what you’ve done: digital snubbery.

Yes, we’re all overwhelmed with email. One recent survey suggested that the average American’s inbox has 199 unread messages. But volume isn’t an excuse for not replying. Ignoring email is an act of incivility.

“I’m too busy to answer your email” really means “Your email is not a priority for me right now.” That’s a popular justification for neglecting your inbox: It’s full of other people’s priorities. But there’s a growing body of evidence that if you care about being good at your job, your inbox should be a priority.

When researchers compiled a huge database of the digital habits of teams at Microsoft, they found that the clearest warning sign of an ineffective manager was being slow to answer emails. Responding in a timely manner shows that you are conscientious — organized, dependable and hardworking. And that matters. In a comprehensive analysis of people in hundreds of occupations, conscientiousness was the single best personality predictor of job performance. (It turns out that people who are rude online tend to be rude offline, too.) Want more? You can read the full article here

Reading and Writing


While there is a massive amount of good writing advice from multiple sources, one piece is routinely at the top of every “advice” list: “Every good writer is a reader.”

Great advice, but it’s often packaged in ways that don’t always resonate. That’s why I was drawn in by a recent piece, “How to Tap Your Inner Reader.” Here’s how it began:

Studies suggest all kinds of benefits to reading, including increased empathy, stress reduction and memory retention. It can even curb your criminal instincts, according to some researchers, although my family might have their doubts about me. 

But if you’re a reader, you probably love books not because they lower your cholesterol but because they bring you joy. Reading is, ideally, a leisure activity: the kind of thing you can devote an afternoon to while dinner is bubbling in the slow cooker and the cat is curled at your feet and you slouch in an armchair like a teenager (hey, maybe you are a teenager) losing yourself in a world somebody else has imagined into being. Reading a book is a form of communication because you’re communing: The writer speaks, the reader listens, and somewhere along the way you achieve a real intimacy, of a sort. That’s magical. 

But leisure activities require leisure time, and who’s got that? Let’s face it; the afternoon in the armchair probably isn’t happening, even if somebody else takes care of dinner. Finding time to read generally means making time to read, and that means making it a priority. If you can incorporate the gym into your regular routine, you can incorporate quality time with a book too.  Want more? You can read it here

China!


An enormous amount of ink has been spilled regarding China, and especially China’s rise. I have blogged about China frequently on this site, most recently earlier this month, here.

When I saw David Brooks’s recent op-ed, “How China Brings Us Together,” I wasn’t prepared for its subtitle: “An existential threat for the 21st century.”

It got my attention – and it should get yours. Here’s how he begins:

I’ve always thought Americans would come together when we realized that we faced a dangerous foreign foe. And lo and behold, now we have one: China. It’s become increasingly clear that China is a grave economic, technological and intellectual threat to the United States and the world order.

And sure enough, beneath the TV bluster of daily politics, Americans are beginning to join together. Mike Pence and Elizabeth Warren can sound shockingly similar when talking about China’s economic policy. Nancy Pelosi and Republicans sound shockingly similar when they talk about Chinese human rights abuses. Conservative and liberal policy thinkers can sound shockingly similar when they start talking about how to respond to the challenge from China.

For the past few decades, China has appeared to be a net positive force in world affairs. Sure, Beijing violated trade agreements and escalated regional tensions. But the Chinese economic explosion lowered our cost of living and expanded prosperity worldwide.

But a few things have now changed. First, instead of liberalizing, the Chinese regime has become more aggressive and repressive. This is just a snippet. Want more? You can read the full article here

Climate Panic


Climate change! Some have called it an existential threat to humanity. Others have denied its existence.

If one thing is true it’s that the arguments about climate change have become increasingly shrill and that it is increasingly difficult to separate fact from fiction.

That’s why I found David Wallace-Wells’s recent short article on the subject so refreshing. He explains the “why” behind our inability to take on this challenge in especially compelling terms. Here is how he begins:

The age of climate panic is here. Last summer, a heat wave baked the entire Northern Hemisphere, killing dozens from Quebec to Japan. Some of the most destructive wildfires in California history turned more than a million acres to ash, along the way melting the tires and the sneakers of those trying to escape the flames. Pacific hurricanes forced three million people in China to flee and wiped away almost all of Hawaii’s East Island.

We are living today in a world that has warmed by just one degree Celsius (1.8 degrees Fahrenheit) since the late 1800s, when records began on a global scale. We are adding planet-warming carbon dioxide to the atmosphere at a rate faster than at any point in human history since the beginning of industrialization.

In October, the United Nations Intergovernmental Panel on Climate Change released what has become known as its “Doomsday” report — “a deafening, piercing smoke alarm going off in the kitchen,” as one United Nations official described it — detailing climate effects at 1.5 and two degrees Celsius of warming (2.7 and 3.6 degrees Fahrenheit). At the opening of a major United Nations conference two months later, David Attenborough, the mellifluous voice of the BBC’s “Planet Earth” and now an environmental conscience for the English-speaking world, put it even more bleakly: “If we don’t take action,” he said, “the collapse of our civilizations and the extinction of much of the natural world is on the horizon.”

Scientists have felt this way for a while. But they have not often talked like it. For decades, there were few things with a worse reputation than “alarmism” among those studying climate change. This is a bit strange. You don’t typically hear from public health experts about the need for circumspection in describing the risks of carcinogens, for instance. The climatologist James Hansen, who testified before Congress about global warming in 1988, has called the phenomenon “scientific reticence” and chastised his colleagues for it — for editing their own observations so conscientiously that they failed to communicate how dire the threat actually was. Want more? You can read the full article here